16 Ecommerce A/B Test Ideas Backed by UX Research

Nothing works all the time on all sites. That’s why we test in the first place: to let the data tell us what is actually working.

That said, we have done quite a bit of user experience research on ecommerce sites and have seen some trends in what generates positive experiences from the customer’s perspective.

This post will outline 16 A/B test ideas based on that data.

Keep reading »

How Many Variations Can You Have in an A/B/n Test?

Just when you start to think that A/B testing is fairly straightforward, you run into a new strategic controversy.

This one is polarizing: how many variations should you test against the control?
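
More variations mean more comparisons against the control, and the statistics have to account for that. As a rough illustration of why (my sketch, not the article's analysis), here is how a Bonferroni or Šidák correction tightens the per-comparison significance threshold as the variant count grows:

```python
# Illustrative sketch: how the per-comparison significance threshold
# shrinks as you add variants to an A/B/n test.
# Assumes a familywise error rate (alpha) of 0.05.

alpha = 0.05  # desired familywise false-positive rate

for n_variants in (1, 2, 4, 8):
    bonferroni = alpha / n_variants                  # conservative correction
    sidak = 1 - (1 - alpha) ** (1 / n_variants)      # slightly less conservative
    print(f"{n_variants} variant(s) vs control: "
          f"Bonferroni alpha={bonferroni:.4f}, Sidak alpha={sidak:.4f}")
```

With eight variants, each comparison must clear roughly p < 0.006 instead of p < 0.05, which is one reason the variant count is a strategic decision and not just a creative one.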

Keep reading »

How to Make More Money With Bayesian A/B Test Evaluation

The traditional (and most common) approach to analyzing A/B tests is the t-test, a method from frequentist statistics.

While this method is scientifically valid, it has a major drawback: if you only implement significant results, you will leave a lot of money on the table.
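
To make the contrast concrete, here is a minimal Bayesian evaluation sketch (an illustration with made-up numbers, not the article's code). It models each variant's conversion rate with a Beta posterior and reports the probability that the variant beats the control and the expected lift, the kinds of quantities a Bayesian analysis gives you instead of a p-value:

```python
import numpy as np

# Minimal Bayesian A/B evaluation sketch using Beta-Binomial posteriors.
# The visitor/conversion counts below are made up for illustration.

rng = np.random.default_rng(42)

visitors_a, conversions_a = 10_000, 500   # control
visitors_b, conversions_b = 10_000, 540   # variant

# Beta(1, 1) prior (uniform) updated with observed successes/failures.
post_a = rng.beta(1 + conversions_a, 1 + visitors_a - conversions_a, 100_000)
post_b = rng.beta(1 + conversions_b, 1 + visitors_b - conversions_b, 100_000)

prob_b_beats_a = (post_b > post_a).mean()   # chance the variant is better
expected_lift = (post_b - post_a).mean()    # average difference in rates

print(f"P(B > A) = {prob_b_beats_a:.3f}")
print(f"Expected lift = {expected_lift:.4%}")
```

A result like "B beats A with 90% probability" may never reach 95% significance in a t-test, yet implementing it is often the profitable call, which is the "money on the table" point above.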

Keep reading »

PXL: A Better Way to Prioritize Your A/B Tests

If you’re doing it right, you probably have a large list of A/B testing ideas in your pipeline. Some are good (data-backed or the result of careful analysis), some are mediocre, and some you don’t know how to evaluate.

We can’t test everything at once, and we all have a limited amount of traffic.

You need a way to prioritize all these ideas so that you test the highest-potential ones first. And the stupid stuff should never get tested to begin with.

How do we do that?
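
PXL itself is a specific scoring spreadsheet; the sketch below is only a hypothetical illustration of the general idea, scoring each idea against a checklist of objective questions and sorting by total. The questions and weights here are placeholders, not the actual PXL model:

```python
# Hypothetical prioritization sketch: score each test idea against a
# checklist and sort by total score. The questions below are illustrative
# placeholders, not the actual PXL criteria.

IDEAS = {
    "Rewrite checkout CTA copy": {
        "backed_by_data": 1,       # 1 = yes, 0 = no
        "above_the_fold": 1,
        "easy_to_implement": 1,
    },
    "Redesign product page layout": {
        "backed_by_data": 1,
        "above_the_fold": 1,
        "easy_to_implement": 0,
    },
    "Change footer link color": {
        "backed_by_data": 0,
        "above_the_fold": 0,
        "easy_to_implement": 1,
    },
}

ranked = sorted(IDEAS.items(), key=lambda kv: sum(kv[1].values()), reverse=True)
for name, answers in ranked:
    print(f"score {sum(answers.values())}: {name}")
```

The point of this style of framework is that binary, evidence-based questions replace gut-feel ratings, so the footer-color idea sinks to the bottom on its own.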

Keep reading »

UX Research and A/B Testing

A/B testing is common practice, and it can be a powerful optimization strategy when used properly. We’ve written about it extensively. Plus, the Internet is full of “How We Increased Conversions by 1,000% with 1 Simple Change” style articles.

Unfortunately, there are experimentation flaws associated with A/B testing as well. Understanding those flaws and their implications is key to designing better, smarter A/B test variations.

Keep reading »

10 Statistics Traps in A/B Testing: The Ultimate Guide for Optimizers

Even A/B tests with well-conceived concepts can lead to non-significant results and erroneous interpretations, and this can happen in every phase of testing if incorrect statistical approaches are used.
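
One classic trap is peeking: checking significance after every batch of traffic and stopping at the first p < 0.05. A quick simulation of A/A tests (my sketch of the trap, not the article's code) shows the false-positive rate climbing well above the nominal 5%:

```python
import math
import random

# Sketch: simulate A/A tests (no real difference between groups) where we
# "peek" after every batch of visitors and stop at the first p < 0.05.
# The inflated false-positive rate illustrates the peeking trap.

def two_proportion_p(c1, n1, c2, n2):
    """Two-sided p-value for a two-proportion z-test (normal approximation)."""
    p_pool = (c1 + c2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    if se == 0:
        return 1.0
    z = (c1 / n1 - c2 / n2) / se
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability

random.seed(0)
BASE_RATE, BATCH, LOOKS, SIMS = 0.05, 200, 10, 1_000
false_positives = 0

for _ in range(SIMS):
    c_a = c_b = n = 0
    for _ in range(LOOKS):
        n += BATCH
        c_a += sum(random.random() < BASE_RATE for _ in range(BATCH))
        c_b += sum(random.random() < BASE_RATE for _ in range(BATCH))
        if two_proportion_p(c_a, n, c_b, n) < 0.05:
            false_positives += 1  # "winner" declared where none exists
            break

print(f"False-positive rate with peeking: {false_positives / SIMS:.1%} "
      f"(nominal 5%)")
```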

Keep reading »

How 8 Different A/B Testing Tools Affect Site Speed (Original Study)

Both your visitors and Google prefer a fast site. Increasing site speed has been shown to lift both conversion rates and SERP rankings, each resulting in more money for your business.

You’re doing A/B split testing to improve results. But A/B testing tools may actually slow down your site.

Keep reading »

How To Communicate A/B Test Results To Stakeholders

Data should speak for itself, but it doesn’t. After all, humans are involved, too – and we mess things up.

Keep reading »

What Do You Do With Inconclusive A/B Test Results?

So you ran a test – and you ran it correctly, following A/B testing best practices – and you’ve reached inconclusive results.

What now?

Keep reading »

Validity Threats

You have an A/B testing tool, a well-researched hypothesis and a winning test with 95% confidence. The next step is to declare the winner and push it live, right?

Not so fast.
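
One concrete example of a check worth running before trusting that 95% confidence (my illustration; the article covers its own list of threats) is a sample ratio mismatch test: if traffic that should have split 50/50 didn't, the assignment mechanism is likely broken and the result is suspect. A minimal sketch with made-up counts:

```python
import math

# Sketch: sample ratio mismatch (SRM) check. If the traffic split deviates
# from the intended 50/50 beyond chance, the test assignment is suspect
# no matter how confident the result looks. Counts are made up.

visitors_a = 50_600
visitors_b = 49_400
total = visitors_a + visitors_b
expected = total / 2  # intended 50/50 split

# Chi-square goodness-of-fit test with 1 degree of freedom.
chi2 = ((visitors_a - expected) ** 2 / expected
        + (visitors_b - expected) ** 2 / expected)
p_value = math.erfc(math.sqrt(chi2 / 2))  # tail probability for 1 df

print(f"chi2 = {chi2:.2f}, p = {p_value:.5f}")
if p_value < 0.001:
    print("Likely sample ratio mismatch: investigate before trusting the result.")
```

A 600-visitor imbalance out of 100,000 sounds trivial, but the test flags it as far outside chance, which is exactly why "95% confidence, ship it" is not the whole story.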

Keep reading »