12 A/B Testing Mistakes I See All the Time
A/B testing is fun. With so many easy-to-use tools, anyone can—and should—do it. However, there’s more to it than just setting up a test. Tons of companies are wasting their time and money.
If you’re doing it right, you probably have a large list of A/B testing ideas in your pipeline: some good (data-backed or the result of careful analysis), some mediocre, and some you don’t know how to evaluate.
We can’t test everything at once, and we all have a limited amount of traffic.
You need a way to prioritize all these ideas so that the highest-potential ones get tested first. And the stupid stuff should never get tested to begin with.
How do we do that?
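One lightweight way, sketched below, is to score every idea on a few dimensions and work through the backlog in descending order. This is just an ICE-style illustration with made-up ideas and ratings, not a prescription from this article:

```python
# A minimal sketch of ranking a testing backlog with an ICE-style score
# (impact, confidence, ease). All ideas and ratings below are hypothetical.

test_ideas = [
    {"name": "Rewrite pricing page headline", "impact": 8, "confidence": 6, "ease": 9},
    {"name": "Add trust badges to checkout",  "impact": 5, "confidence": 4, "ease": 8},
    {"name": "Redesign the main navigation",  "impact": 7, "confidence": 3, "ease": 2},
]

for idea in test_ideas:
    # Average the three 1-10 ratings into a single priority score.
    idea["score"] = (idea["impact"] + idea["confidence"] + idea["ease"]) / 3

# Test the highest-scoring ideas first; the lowest scorers may never
# be worth the traffic they would consume.
for idea in sorted(test_ideas, key=lambda i: i["score"], reverse=True):
    print(f'{idea["score"]:.1f}  {idea["name"]}')
```

However you score things, the point is the same: spend your limited traffic on the ideas with the best evidence behind them.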
Testing is a critical part of conversion optimization: it’s how we know whether we actually made things better, and by how much. But it’s also just the tip of the iceberg of the full CRO picture. Testing tools are affordable (even free) and increasingly easy to use, so pretty much any idiot can set up and run A/B tests. That’s not where the difficulty lies. The hard part is testing the right things, with the right treatment.
The success of your testing program comes down to two factors: the number of tests you run (volume) and the percentage of tests that produce a win. Together, those two indicate your execution velocity. Add in the average sample size and the impact per successful experiment, and you get an idea of the total business impact.
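To make that arithmetic concrete, here’s a rough back-of-the-envelope sketch. The numbers are entirely hypothetical and the compounding is simplified, but it shows how volume, win rate, and lift per win combine:

```python
# Back-of-the-envelope sketch: how test volume, win rate, and average lift
# per winning test combine into business impact. All figures are hypothetical.

tests_per_year = 50                  # volume your traffic can support
win_rate = 0.25                      # share of tests that produce a winner
avg_lift_per_win = 0.05              # average conversion lift of a winning test
baseline_monthly_revenue = 200_000   # revenue of the pages being tested

wins = tests_per_year * win_rate
# Compounding each win's lift approximates the year-end effect.
compounded_lift = (1 + avg_lift_per_win) ** wins - 1
added_monthly_revenue = baseline_monthly_revenue * compounded_lift

print(f"Winning tests per year: {wins:.0f}")
print(f"Compounded lift: {compounded_lift:.1%}")
print(f"Added monthly revenue by year end: ${added_monthly_revenue:,.0f}")
```

Nudge the win rate or the average lift upward and the compounded result grows much faster than the raw test count alone, which is why what you test matters as much as how much you test.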
So in a nutshell, this is how you succeed:
1. Run as many tests as possible at all times.
2. Win as many of those tests as possible.
3. Get as high an impact (uplift) as possible from each winning test.
Executing point #1 is obvious, but how do you do well on points #2 and #3? That comes down to the most important thing in conversion optimization – the discovery of what matters.