
6 Key CRO Lessons I’ve Learned As Editor

As of next week, I, Tommy Walker, will be leaving my post as editor of CXL.

I’ve learned so much in this past year, and I owe so much to this blog. If you’ll allow, I’d like to share the 6 major lessons I’ve learned as the editor of this blog. 

6. “Conversion Optimization” Isn’t About Green Or Red Buttons


People aren’t magically drawn to a button because it’s red, nor are we automatically predisposed to sign up just because you added the word “Free” to your headline. Optimization isn’t about seeing which tests are “better,” but rather about studying your visitors’ behavior and creating designs & copy that engage in a dialogue with your visitor’s inner voice.

It starts with making a good first impression, clearly stating your value proposition, and communicating how you’re different – even if what you sell isn’t unique.

You want to carry the momentum from first click to final conversion by maintaining scent and telling a story of how their world gets better as you guide them through the experience.

This isn’t guesswork, and it’s never really “over.” It’s the combination of hard data, qualitative feedback, and a deep understanding of what persuades your visitors & the different segments therein that provides the insights necessary for an educated test hypothesis.

Questions like “What color converts best?” are a complete waste of time. Instead, you should always be seeking answers to questions such as:

  • Where do most people get stuck in the buying process?
  • What are common traits among our paying customers?
  • What hesitations do our leads have that prevent them from buying?

The smartest thing Peep taught me is to start closest to the money and work backwards from there.

Find out where your site is leaking money, then create a testing plan moving forward. If you aim to understand your visitors’ real motivations & hesitations, you’ll start running tests with substance that can turn into even more insight.

5. There Is No Failure, Only Learning


This was a huge shift in mindset for me. It’s all too easy to get emotionally involved with a test only to find that your challenger made no difference, or worse, decreased conversions.

But there are many reasons for a test not “winning” that aren’t as cut and dried as the challenger page simply being worse. Most of them come down to inexperience & misunderstanding what “failure” really is.

But, if I’m being honest, the real reason my early tests failed was because they were informed by my ego, not by real data.

That’s why most A/B tests fail. It was only once I started getting comfortable with the data that I stopped taking “failure” so personally. So I misinterpreted a bit of feedback, or my colleague’s variation did better on this testing cycle… so what?

What did we learn?

Maybe “Free” wasn’t the thing they cared about. Maybe it was, but the offer still wasn’t entirely clear. Maybe an international celebrity died, and the internet had better things to do while they mourned. Maybe our visitors don’t appreciate Pre-Christmas sales starting in November.

Try not to take things so personally. Do your homework, follow the data, be smart about your tests & see what happens. If it doesn’t work, move on. There are lots of other things to test.

4. Incremental Gains Are Far More Realistic


On a similar note, try not to get discouraged if the only gains you’re seeing are small, like 5%.

Blog coverage of conversion rate optimization skews towards double-, triple- & quadruple-digit gains, but that’s only because it’s sexier to cover.

Much like media coverage tends to overstate war casualties even though they’re statistically in decline, CRO coverage gravitates to huge wins because they make for much more compelling reading. But they’re far from the “norm.”

Grigoriy Kogan talks about “The Problem With A/B Test Success Stories” and how blog coverage creates an unrealistic expectation for companies who are beginning to experiment with split testing.

The trouble, he states, is that when blogs report 30-50x increases, it creates a “survivorship bias” that minimizes the impact of more realistic wins, as well as learning from neutral & failed tests. As a result, companies tend to dismiss 5% gains as insignificant & not worth paying attention to or fully implementing.

But when compounded, a 5% increase in monthly checkout completions, for example, could increase checkouts by 60% or more by the end of the year.
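To make the compounding concrete, here’s a minimal sketch of the arithmetic in Python. The only assumption is how many monthly wins actually stack: ten consecutive 5% wins compound to roughly 63%, a full twelve to roughly 80%.

```python
# Minimal sketch: how repeated monthly lifts compound.
# Assumes each month's 5% win multiplies the previous month's baseline.
monthly_lift = 0.05

for months in (6, 10, 12):
    cumulative = (1 + monthly_lift) ** months - 1
    print(f"{months} consecutive 5% wins -> {cumulative:.0%} cumulative lift")

# Output: 6 -> 34%, 10 -> 63%, 12 -> 80%
```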

What would a 60% increase look like for you?

3. Case Studies Are Often Big Fat Liars


At the very least, they need to be scrutinized with extreme care.

I’ve read a lot of studies that report significant lifts in conversions, but I also know from my own experience that a lift today can be neutral tomorrow.

The case study is only as good as the person managing it.

If they’ve set up their analytics improperly, or are calling wins too soon, they’re reporting “wins” that may not be entirely valid.

I’ve grown extremely skeptical of other people’s success stories as a result. With many of the case studies I’ve published here, I will research the author to see whether, in a broader context, they appear to really know what they’re talking about.

When you’re looking at case studies or behavioral research, there are several things you also need to take into account:

  • The amount of traffic included in the test
  • The segment of traffic being tested
  • The length of the testing period
  • The number of absolute conversions
  • The number of relative conversions
  • The actual impact on revenue
  • The total impact on customer lifetime value

These, among many other variables, play a huge role in whether a test was actually successful. In many cases, case studies report on a false positive: the test appears to be a winner now, but in reality the “win” is a temporary increase & everything returns to a neutral state a month later.
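When a case study does publish raw numbers (visitors & conversions per variation), you can sanity-check the claimed lift yourself before taking it at face value. Here’s a minimal sketch of a two-proportion z-test in Python; the traffic figures plugged in at the bottom are hypothetical, purely for illustration.

```python
from math import sqrt
from statistics import NormalDist

def ab_significance(visitors_a, conv_a, visitors_b, conv_b):
    """Two-proportion z-test: is B's lift over A statistically real?"""
    p_a = conv_a / visitors_a
    p_b = conv_b / visitors_b
    # Pooled rate under the null hypothesis of "no difference"
    pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-tailed
    return p_a, p_b, p_value

# Hypothetical case-study numbers: a reported "25% lift" on thin traffic
p_a, p_b, p_value = ab_significance(1200, 48, 1180, 59)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  p = {p_value:.3f}")
# p comes out around 0.24 -- far above 0.05, so the "win" could be noise
```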

The other thing you need to look out for is case studies that report increases that are obvious. Peep talks about this in “Think About Customer Experience, Not Just Conversion Optimization”:


“If your only goal is to increase the conversion rate, that’s easy. You’re selling laptops for $1000? Well, sell them for $5 each and your conversions will go up 100x times at least, guaranteed. Done, goal achieved! Time for champagne?”

Even in the unlikely event you are able to get all of the test conditions within a case study, you also have to consider that the study is being published for PR reasons, not necessarily to provide you with insight.

That’s not to say you shouldn’t trust anyone, but rather, try not to hang your hat on any one case study or take any tactic as an absolute truth.

When you’re analyzing a case study, look at all of the information being presented to you, and really think critically about whether the lift they’re reporting is real & sustainable.

If the study is older, look to see if newer variations are building on what was found in the case study you’ve just read. There have been plenty of cases where I’ve found newer designs abandon the “learnings” found in the case study. While this could be due to internal politics, it is more likely that in the long term, the case study was actually reporting on a false positive.

As a general rule, I try to look at case studies not to understand “what” worked, but “why” it worked. If I can understand those principles, they can be incorporated into future testing.

2. Only Benchmark Off Your Own Data

What do you call the medical student who graduated at the top & bottom of their class? Doctor.

While it’s tempting to look to industry benchmarks as a way to “check in” on your performance, what you’re looking at is a warped average of the best & the worst lumped together.

“What’s a good conversion rate?” One that is sustainably better than the one that came before it.

Even though Larry Kim is speaking in the context of PPC ads, he raises a good point:


“We recently analyzed thousands of AdWords accounts with a combined $3 billion in annual spend and discovered that some advertisers are converting at rates two or three times the average. Do you want to be average, or do you want your account to perform exponentially better than others in your industry?”

That said, it’s also important that when you’re creating benchmarks, you’re not looking at your own overall averages, but rather at the segments within your traffic.

How do “Big Spenders” use the site? What about mobile users, visitors who come from Facebook, or internal site searchers?

Instead of trying to make improvements to your overall averages, break down the performance of specific segments, and create an A/B testing plan tailored towards those segments.

It’s much easier to keep track of, and can be tackled way more systematically.
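As a rough illustration, here’s a minimal sketch of that kind of segment breakdown in Python with pandas. The export file & column names are made up for the example; swap in whatever your analytics tool actually gives you.

```python
import pandas as pd

# Hypothetical analytics export: one row per session, with a segment
# label and a 0/1 "converted" flag. Names are illustrative only.
sessions = pd.read_csv("sessions_export.csv")  # columns: segment, converted

# Conversion rate and volume per segment, worst performers first
by_segment = (
    sessions.groupby("segment")["converted"]
    .agg(sessions="count", conversions="sum", rate="mean")
    .sort_values("rate")
)
print(by_segment)

# A testing plan would then target the big, underperforming segments
# first, e.g. heavy mobile traffic converting well below site average.
```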

1. Stop Trying To Optimize For Stupid Crap That Doesn’t Matter


I used to think that “optimization” meant getting more people to click the button, or getting them to stay on the page longer, because those were the things that would lead to more sales. But that’s way too superficial and short-sighted to be anywhere near the truth.

The question I am always trying to answer now is “How does this make more money?” and the answer needs to start with real data.

If I notice site searchers convert at higher rates, but the search function isn’t very noticeable, then it makes sense to try to draw more focus to site search.

If I notice that “time on page” is high for converters, by itself that means nothing, and is likely not an accurate reflection of how long they were on the page anyway. If anything, I’d dig into the heatmaps to see where visitors were spending the most time. I might also conduct a bit of real-life user testing to identify any crappy UX issues on the page. I’d look at what pages they were on beforehand, where they went after, and a whole lot of other things, in order to understand why the people on this page were converting.

But if I just tried to keep people on the page longer, or get more people to click the button… what’s the point if it doesn’t lead to more (and better) sales?

Conclusion

So my dear CXL reader, those have been the 6 most important lessons I’ve learned on my journey as an editor here. I encourage you to take a look at everything I’ve linked to, and even if you’ve already read those pieces, read them again.

We’ve worked very hard to put together some of the most comprehensive information on conversion rate optimization online, and I hope you’ve been able to profit from it.

As for me, I’m moving over to Shopify to continue writing about CRO specifically for eCommerce. I love CXL, its readers & the connections I’ve made here. You are such a valuable part of my life now, and I appreciate the time you’ve spent reading, commenting on, and sharing my work here.

Don’t worry though, CXL isn’t going anywhere; Peep and the rest of the team have some really great stuff in the works that you can’t afford to miss.

Until we see each other again,

Tommy


Join the conversation

  1. Congrats on the Shopify gig, Tommy.

    It’s great to know that you’ll be helping out an audience of thousands of ecommerce entrepreneurs over there.

    1. Thanks Hashim,

      I’m really sad to be leaving CXL, but really excited for what is coming next :-)

  2. Being able to work with you on posts for the CXL blog has been amazing. You’ve taught me so much, not only by editing my posts but also through everything you’ve written. Good luck with the new gig!

    1. Jen it has always been a pleasure working with you, and I look forward to doing it again!

  3. Good luck Tommy & great work this past year. Great to see this post since too often someone leaves and you never know what happened. Credit to Peep.

      Yeah, I wanted to make sure you still knew where to find me. I will still be writing about conversion at Shopify, and will still try to do the occasional guest post here too. And yes, much credit to Peep, he took a chance on me and thank goodness it worked out! Much respect for him!

  3. Sorry to see you go Tommy. You’ve been an inspiration for me, along with Peep, to be a CRO guy and write/blog about it. Now I have my community here in Turkey, mostly thanks to your academic-style posts and knowledge.

    Thanks again, and good luck!

      Aw, thanks Enes, I’m glad I’ve been able to help. Thank you for spending the time with my work :-)

  5. The little time I spent under you was very educational for me. I really learnt a lot of stuff. See you over at Shopify.

    1. Thanks George, and it was a pleasure working with you as well. Feel free to hit me up in the upcoming weeks :-)

  6. Looking forward to seeing your next step in the journey.

  7. It was great working with you, Tommy. Good luck for the Shopify gig. :) Hope to work with you again very soon.

    1. Thanks Smriti! It’s been a pleasure working together, and I hope we can do it again soon :-D

  8. Many thanks Tommy! I have read all your pieces here and you have opened my eyes many times. Learned a lot from you. Good luck with your new ventures. I work in the same area but in another market, much fun happening. Thank you once again for your excellent work!

  9. Great post! I was looking for basic information about CRO and I landed on this blog.
    I really loved #1 :)
    Very useful, thanks!

  10. Your last point really hit home for me. Thanks and good luck at Shopify.

