Are A/B tests just hype?

Updated: Nov 21

Lately, we have seen contradictory discussions about the usefulness of A/B tests. Many product leads have concluded that their only benefit is making us feel better about doing research, with minimal impact on actual performance.

Are A/B tests useful?

Data from HBR suggests that 80–90% of A/B tests fail to produce statistically significant results.

At the same time, Google, Amazon, Twitter, and other big companies run thousands of A/B tests every year, and their sustained growth is often attributed to this culture of experimentation.

Like many other things, A/B tests can be helpful if you conduct them appropriately.

The logic is simple: you prepare two options, show each to half of your target audience, and then measure which one performed better.

You could test which marketing messages perform better as ads, which call to action converts better on your landing page, or which of two designs for a feature in your app users prefer.
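The split-and-measure logic above boils down to comparing two conversion rates. Here is a minimal sketch of a two-proportion z-test in Python; the function name and the example numbers are illustrative, not from any specific test:

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference)
    p = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal approximation
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# 1,000 visitors per variant: 100 conversions vs. 130 conversions
z, p = ab_test_z(100, 1000, 130, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")  # significant at the 5% level if p < 0.05
```

In practice you would use a ready-made calculator or a stats library, but the underlying comparison is exactly this.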

Common mistakes with A/B tests include:

You don’t have enough volume

You won’t get statistically significant results if your volume of transactions, leads, or audience is very small. If you have fewer than 1,000 transactions per month, one option needs to outperform the other by more than 10% before you can call a winner.
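A rough way to see why low volume kills significance is the standard sample-size formula for comparing two proportions. A sketch, where the function name and the 95%-confidence / 80%-power z-values are my assumptions, not figures from the post:

```python
import math

def visitors_per_variant(base_rate, lift, alpha_z=1.96, power_z=0.84):
    """Rough visitors needed per variant to detect a relative `lift`
    over `base_rate`, at 95% confidence and 80% power
    (normal-approximation sample-size formula)."""
    p1 = base_rate
    p2 = base_rate * (1 + lift)
    # Sum of the two binomial variances
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((alpha_z + power_z) ** 2 * variance / (p2 - p1) ** 2)

# Detecting a 10% relative lift on a 5% baseline conversion rate:
print(visitors_per_variant(0.05, 0.10))
```

On a 5% baseline conversion rate, detecting a 10% relative lift takes tens of thousands of visitors per variant, which is why small monthly volumes rarely produce a significant winner.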

You try to test everything at once

When conducting A/B tests, you must clearly understand what you’re measuring. For example, if you test marketing materials with different colors, pictures, target audiences, and messaging all at once, you won’t understand what worked, even if you find a clear winner.

You test things with insignificant value to the business

Testing for the sake of testing is a waste of time and money. Focus on the pages, buttons, and other materials that sit in your sales funnel and actually affect conversion.

You get the timing wrong

Some parts of the month or week have higher conversions and traffic, so when measuring results, make sure you compare the same periods. Your measurement period also has to be long enough to produce reliable results.

You prepare two options for testing entirely on gut feeling

When you prepare both options on gut feeling alone, the result only shows which of the two was slightly better, not which option is actually best for your target audience.

As with everything else, you must conduct proper research beforehand to get valuable data from A/B tests.

This is one reason big tech companies have a high success rate with A/B tests: continuous testing means they simply know more about their customers. ☺️

Researching as an early-stage startup:

However, if you’re a startup that doesn’t have extensive data about your customers, there are a few research tricks:

✅ See what works for your competitors and adjust it to your company.

✅ If you’re testing marketing messages, see what content your audience consumes, how they talk, and which words they use.

✅ Conduct interviews with your target audience.

✅ If your transactions or traffic are too low for A/B testing, consider driving traffic to your page through paid ads, or use other methods: make changes directly to the existing version and monitor the results, or ask for feedback through UX testing platforms.

It’s easy to make mistakes with A/B testing. Pay attention to what, how, and when you measure to get statistically significant results and avoid false positives. And if it’s too early to use this method, you have to be a little more creative.

Let us know your thoughts on the usefulness of A/B tests, and follow KoFounder for more! 👇

#ProductManagement #ExecuteGrowth #Research #Startups
