
All Things Data-Driven Marketing

The Hard Life of an Optimizer - Yuan Wright [Video]

Here’s another presentation from CXL Live 2015 (sign up for the 2016 list to get tickets at pre-release prices).

While optimization is fun, it’s also really hard. We’re asking a lot of questions.

Why do users do what they do? Is X actually influencing Y, or is it a mere correlation? The test bombed – but why? Yuan Wright, Director of Analytics at Electronic Arts, will lead you through an open discussion about the challenges we all face – optimizer to optimizer.

Keep reading

Your Test is Only as Good as Your Hypothesis [Video]

CXL Live 2016 is coming up next March (get on the list to get tickets at pre-release prices). We’re going to publish video recordings of the previous event, and here’s the first one.

You run A/B tests – some win, some don’t. Whether your tests actually have a positive impact depends largely on whether you’re testing the right stuff. Testing trivial things that make no difference is by far the biggest reason tests end in “no difference”.

Keep reading

The Discipline Based Testing Methodology

This is the methodology I have developed over 12 years in the industry, working with over 300 organizations. It is also the methodology behind a near-perfect test streak (6 test failures in 5.5 years) – even if most people don’t believe that stat.

Keep reading

ResearchXL

Testing is a critical part of conversion optimization – it’s how we verify that we actually made things better, and by how much – but it’s also just the tip of the iceberg of the full CRO picture. Testing tools are affordable (even free) and increasingly easy to use, so pretty much any idiot can set up and run A/B tests. That’s not where the difficulty lies. The hard part is testing the right things, and having the right treatment.

The success of your testing program is a function of two things: the number of tests run (volume) and the percentage of tests that produce a win. Together they indicate execution velocity. Factor in average sample size and the impact per successful experiment, and you get an idea of total business impact.

So in a nutshell, this is how you succeed:

  1. Run as many tests as possible at all times (every day without a test running on a page/layout is regret by default),
  2. Win as many tests as possible,
  3. Have as high impact (uplift) per successful test as possible.
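How these factors combine can be sketched in a few lines of code. This is a simplified model with hypothetical numbers (test volume, win rate, and uplift are illustrations, not benchmarks), and it assumes winning uplifts compound multiplicatively:

```python
# Rough model of a testing program's output: volume x win rate x uplift.
# All inputs are hypothetical illustrations, not industry benchmarks.

def program_impact(tests_per_year, win_rate, avg_uplift):
    """Estimate wins and cumulative uplift from a testing program."""
    wins = tests_per_year * win_rate
    # Assumption: each winning test's uplift compounds on the last.
    cumulative_uplift = (1 + avg_uplift) ** wins - 1
    return wins, cumulative_uplift

wins, uplift = program_impact(tests_per_year=50, win_rate=0.3, avg_uplift=0.05)
print(f"{wins:.0f} wins, ~{uplift:.0%} cumulative uplift")
```

The point of the sketch is that the levers multiply: doubling either volume or win rate roughly doubles the number of wins, and small per-test uplifts compound into a much larger annual effect.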

Executing point #1 is obvious, but how do you do well on points #2 and #3? This comes down to the most important thing about conversion optimization – the discovery of what matters.

Keep reading

Lies Your Optimization Guru Told You

Before you get out your pitchforks, I want to stress that this article does not represent Peep’s views.

The easiest lies to believe are the ones we want to be true, and nothing speaks to us more than validation of the work we are doing or of what we already believe. Because of this, we become naturally defensive when someone challenges that worldview.

The “truth” is that there is no single state of truth, and that all actions, disciplines, and behaviors can and should be evaluated for growth opportunities. It doesn’t matter whether we are designers, optimizers, product managers, marketers, executives, or engineers – we all come from our own disciplines, and we will naturally defend them to the death when we feel threatened, even in the face of overwhelming evidence.

Keep reading

Lies Your Designer Told You (or Data vs Design)

Designers versus data more than ever deserves its place in the pantheon of great conflicts: the Hatfields vs. McCoys, Android vs. iOS, Social Media Marketing vs. Results, Athens vs. Sparta, the Doctor vs. Daleks, Auburn vs. Alabama, and Fox News vs. reality.

We make this out to be some great collision of disciplines when in fact they are not opposites and they can and should work together.

Keep reading
