CXL Live 2016 was a smashing success. The speakers were amazing, the resort phenomenal, and the networking world-class.
Here’s what Peep had to say about the event:
We had some really great feedback this year…
— Luiz Centenaro (@LuizCent) April 1, 2016
Day 1: People, Brains and Psychology
Peep Laja: Optimizing Optimization
- The biggest challenges facing optimizers today are:
- Lack of know-how. People don’t have the skills to do optimization and don’t know where to learn it or have the time.
- Lack of time
- Lack of resources
- What’s holding us back in optimization? Lack of experience is a big one. 20% of optimizers have been working in the industry for less than one year. 50% less than 3 years.
- If you rely on tactics you find online, you won’t be able to prioritize tests. You’ll be wasting time and traffic. And most of those articles are written by professional bloggers, not optimizers.
- Our original research found that form engagement increases when you use an arrow as a visual cue (instead of other elements like human faces, regardless of which direction they face). Worth testing.
- There’s a lack of empirical research in optimization that is both rigorous and actionable. We’re opening CXL Institute to fill that gap.
Bart Schutz: Persuasive Journey Mapping
- Dual Process Theory – the brain is made up of two “systems.” System 1 is fast, automatic, subconscious, and emotional. System 2 is deliberate, rational, and conscious.
- What we think is a rational decision is often an emotional one, so we rationalize after the fact (post-purchase rationalization).
- Humans do “mental accounting,” a subjective coding and categorization of economic outcomes. So you can use framing to shake this up. Example: “Do you know what a good night’s sleep costs you? The average person sleeps 8 hours a night and a minimum of 10 years on this bed. For a perfect night of sleep on this bed you only pay €0.82/month”.
- The 3 Levels of CRO: 1) Customer Journey 2) Hypothesis 3) Testing.
- People hate popups, but they increase conversions. Why? Perhaps because they require effort (to X). Effort tells your brain you like something because you’re willing to put it in.
Nir Eyal: How to Build Habit-Forming Products
- The Hook Model is designed to connect the user’s problem to your solution with enough frequency to create a habit. The model is made up of 4 parts: trigger, action, variable reward, and investment.
- There are two types of triggers: external (information for what to do next is within the trigger), and internal (information for what to do next is informed through an association in the user’s memory).
- Negative emotions are powerful internal triggers (e.g. people who are depressed check email more, when we feel lonely we use Facebook). Do you know your customers’ internal triggers?
- 6 Factors that can increase motivation and action:
- Seeking Pleasure
- Avoiding Pain
- Seeking Hope
- Avoiding Fear
- Seeking Acceptance
- Avoiding Rejection
- 6 Factors that can decrease motivation and action (BJ Fogg’s elements of simplicity):
- Time
- Money
- Physical Effort
- Brain Cycles
- Social Deviance
- Non-Routine
Roger Dooley: Neuro-persuasion – Brain-based Strategies for Online Marketers
- Is psychology BS? A 2015 study that tried to replicate the results from 98 social science experiments was able to do so for only 39. Priming experiments in particular have come under fire. But, a brand new paper from Harvard claims the 2015 study was itself faulty. No doubt this won’t be settled soon, but regardless of the outcome it’s important to test any new strategies.
- Roughly 5% of the brain is conscious, while 95% runs non-consciously. Market to the 95%, too!
- Dooley developed the Persuasion Slide model to help market to the whole brain. It has 4 steps:
- Nudge (get their attention)
- Gravity (customer’s initial motivation)
- Angle (motivation you provide – both conscious and non-conscious)
- Friction (difficulty – both real and perceived)
- On your landing page, if it’s not motivation, it’s probably friction.
- Seemingly insignificant things can make huge differences in the effectiveness of your page. E.g. people that read instructions in an easy-to-read font completed the tasks in almost half the time as those who read the instruction in a barely-legible font.
Brian Cugelman: Motivational Chemistry and Susceptibility to Digital Persuasion
- Motivation = Emotion. How can we trigger our users to act in a certain way by triggering emotions?
- To trigger Dopamine (pleasure, curiosity), be novel, offer surprises. Offer more, better, bigger. Hold back the full story for later (mystery).
- How can you reduce stress (cortisol)? Simplify, reassure that the goal will be met, human contact, humor, entertainment, reducing cognitive load, etc. Example is humor used on 404 pages.
- To trigger Oxytocin, establish a relationship with your users. They interact with your site almost in the way they interact with other humans. Example: Unsubscription popups.
- To trigger Serotonin, use social comparison. People compare themselves to others, and make evaluations. Example: gamification, leaderboards, etc.
Talia Wolf: How to Create Landing Pages That Address the Emotional Needs of Customers
- Emotional targeting is a four step process, which takes 30-60 days:
- Emotional Competitor Analysis (grade according to message, color, image, and emotion triggers instead of features and product)
- Emotional SWOT
- Emotional Content Strategy (using emotional triggers to target your audience)
- Color can influence emotions, but you can go overboard and it can backfire. Different colors have different meanings.
- Emotion works best when it is triggered at the right time. For example, showing an email capture popup after 5 idle minutes made people more likely to convert.
- Gaining knowledge is the goal. Learn as much as you can about your audience.
- Reality sucks. As marketers, we sell dreams. It’s your job to identify your audience’s dreams and then address that on your landing page.
Angie Schottmuller: Social Proof Power Plays
- Consumers are persuaded 12X more by “others” than you, the marketer. When we’re unsure, we look to others. Social proof is the glue.
- Social proof isn’t just reviews. There are many forms. Social proof is evidence of/from others (people like us) that reduces fear or uncertainty.
- The 6 formats of social proof:
- Sum it – How many?
- Score it – Out of how many? % satisfaction, rankings, etc.
- Say it – Reviews, expert Q&A, audio clips / videos.
- Sign it – Who said it? Their authority helps.
- Show it – Symbols. Star ratings, checks, thumbs up, logos, etc.
- Shine it – Approvals, badges, custom awards, BBB, etc.
- The quality of the social proof trumps the quantity of social proof.
- C-R-A-V-E-N-S model is a scorecard for social proof (Rate each of the factors on a 5 point scale):
- Credible (believable, authentic, trustworthy, authoritative, “ethos”)
- Relevant (pertinent, germane, applicable, meaningful, important)
- Attractive (emotional trigger, value-added, satisfying, “pathos”)
- Visual (pictured, drawn, mapped, graphed, viewable)
- Enumerated (quantified, scored, ranked, calculated, “logos”)
- Nearby (close, proximate, or near uncertainty/anxiety points)
- Specific (distinct, descriptive, named, detailed, precise)
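The scorecard above is easy to operationalize. Here’s a minimal sketch: the factor names come from the talk, while the scoring helper and the sample ratings are our own illustration.

```python
# Sketch of the C-R-A-V-E-N-S scorecard: rate each factor 1-5, sum to
# compare social proof candidates. Factor names are from the talk; the
# helper function and sample ratings below are hypothetical.

FACTORS = ["credible", "relevant", "attractive", "visual",
           "enumerated", "nearby", "specific"]

def cravens_score(ratings):
    """Sum 1-5 ratings across the seven C-R-A-V-E-N-S factors."""
    missing = [f for f in FACTORS if f not in ratings]
    if missing:
        raise ValueError(f"missing factors: {missing}")
    if any(not 1 <= ratings[f] <= 5 for f in FACTORS):
        raise ValueError("each rating must be between 1 and 5")
    return sum(ratings[f] for f in FACTORS)

# A plain text testimonial: credible and specific, but not visual or quantified.
testimonial = {"credible": 4, "relevant": 5, "attractive": 3,
               "visual": 2, "enumerated": 1, "nearby": 4, "specific": 5}
print(cravens_score(testimonial))  # 24 out of a possible 35
```

Scoring several candidates this way makes it obvious which piece of social proof to lead with, and which factors (here: visual, enumerated) to strengthen.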
Joanna Wiebe, Jen Havice and Joel Klettke present: Conversion Copywriting Panel
- Surveys are a goldmine of insight when doing copy research. They’re easy to execute and relatively quick if you have a list. You can run surveys by email and also on-site (tools like Qualaroo or HotJar).
- Tips for email surveys:
- Always survey your most recent (last 6 months) customers.
- Segment your lists. It makes data gathering easier.
- SurveyMonkey, Typeform, etc. can help.
- Open-ended questions are really important. You want to hear real results in their own words, which you can then use in your copy.
- Tips for on-site surveys:
- Timed popups, exit intent popups, etc.
- Multiple choice works best. Difficult to get open-ended answers this way.
- Why are they leaving? What didn’t they get from your site?
- User testing is also a goldmine of information. Have customers walk through tasks on your site and discover what they’re bothered by and the words they use.
- Message mining is the most affordable research method. Basically, you can scour the web for reviews, testimonials and unbiased discussions to steal for your copy. Look to forums, comparison sites, social chatter (set alerts), and advanced Google searches for your brand + competitors’ brands etc.
Day 2: Optimization and Testing Strategies + Processes
Claire Vo: CRO Metrics for Performance and Insight
- Testing Capacity = How many tests can you run per year? (52 / test duration in weeks) x (# of simultaneously testable pages/funnels)
- Testing Quantity – Make sure you’re not wasting traffic (use it or lose it).
- Testing Coverage – What % of testable days are you running a test? This will help to discover the waste of traffic you have in your testing program. Don’t waste traffic.
- Stop evaluating the quality of your program on a test-by-test basis. It makes you look good until you look bad. You should also track:
- Am I running effective tests? Win rate. What % of tests run win / lose / are inconclusive over time?
- Am I running tests effectively? Expected value of your testing program
- How can you measure the effectiveness of your testing program over time? How do you know if you’re getting better?
- Track quantity by measuring velocity (based on capacity calculation) and coverage (goal = 100%)
- Track quality by win rate, average lift, expected value
- Set goals and measure everything over time.
Ton Wesseling: How to Utilize Your Test Capacity?
- The ROAR model…
- Risk. The first phase. You have to take lots and lots of risks. If you have only 0–10 conversions per month, you don’t have enough data to run A/B tests.
- Optimization. You can begin running tests. Below 1,000 conversions per month you don’t have the statistical power to test (with 80% power, you need a 15% effect to detect anything).
- Automation. You build processes. To automate, you need 10,000+ conversions per month (with 80% power, a 5% effect becomes detectable).
- Rethink. When growth is declining or slowing.
- How much impact do you need? Use a calculator: http://ondi.me/size. Be sure to consider power. If you have 1,000 conversions, you can run about 20 tests per year.
- On average, 1 out of 3 tests will be a winner – at ~20 tests per year, that’s a winner only every 7 to 8 weeks.
- If you can only report a win every 2 months, you don’t have a good program.
- Less than 1K? Don’t test. Get more traffic and conversions.
- What to test when you have just over 1,000? Big changes w/ big results + Big design changes.
- What to test when you have 10,000? Smaller changes w/ smaller results.
- Celebrate failures. If you only celebrate wins, people become afraid to take testing risks
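The power thresholds behind the ROAR model can be sanity-checked with the standard two-proportion sample-size formula. This is a stdlib-only sketch (Ton’s own calculator at http://ondi.me/size does a similar computation); the function name and example rates are our own.

```python
# Visitors needed per variant to detect a relative lift with a two-sided
# z-test. Standard two-proportion formula; example numbers are hypothetical.

from statistics import NormalDist

def sample_size_per_variant(baseline_rate, relative_lift,
                            alpha=0.05, power=0.8):
    """Approximate n per variant for detecting a relative conversion lift."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance
    z_beta = NormalDist().inv_cdf(power)            # desired power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return (z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2

# At a 5% baseline conversion rate:
print(round(sample_size_per_variant(0.05, 0.15)))  # 15% lift: roughly 14k/variant
print(round(sample_size_per_variant(0.05, 0.05)))  # 5% lift: ~9x the traffic
```

This is why the thresholds above matter: a low-traffic site can only afford to test big, bold changes, while small iterations require an order of magnitude more conversions.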
Ronny Kohavi: A/B Testing Pitfalls – Getting Numbers is Easy; Getting Numbers You Can Trust is Hard
- Ronny’s team finishes about 300 experiment treatments per week, mostly on Bing and MSN, but also on Office, OneNote, etc. There is no single Bing: because a user is exposed to over 15 concurrent experiments, they see one of roughly 30 billion possible variants.
- What is a valuable experiment?
- The absolute value of delta between your expected outcome and your actual outcome.
- If you thought something was going to win and you win, you haven’t learned much.
- If you thought it was going to win and it loses, then it’s valuable because you learned something.
- If you thought it was meh and it was a breakthrough, it’s highly valuable.
- Underlines improve clickthrough on Bing for both algorithmic results and ads (so more revenue) and improve time to successful click (better UX). Still, because of modern web design trends, most sites have done away with underlines despite data saying users click less and take more time to click. Data doesn’t always guide decisions.
- 6 A/B testing pitfalls:
- Misinterpreting P-values.
- Expecting breakthroughs.
- Not checking for sample ratio mismatch (SRM).
- Wrong Success Metric (OEC).
- Combining data when treatment percent varies with time.
- Getting the stats wrong (run A/A tests to catch this).
- Revenue is usually a good metric to use to evaluate effectiveness of test.
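Pitfall #3 above, sample ratio mismatch, is cheap to check with a chi-square goodness-of-fit test. Here’s a stdlib-only sketch for a 50/50 split; the 0.001 alpha is a commonly used SRM threshold, not something stated in the talk.

```python
# Sample ratio mismatch (SRM) check for a 50/50 A/B split. Hypothetical
# helper; the counts in the examples are made up for illustration.

import math

def srm_p_value(control_n, variant_n):
    """Chi-square test (df=1) that observed counts match a 50/50 split."""
    expected = (control_n + variant_n) / 2
    stat = ((control_n - expected) ** 2 + (variant_n - expected) ** 2) / expected
    # Chi-square survival function for df=1: P(X > x) = erfc(sqrt(x/2))
    return math.erfc(math.sqrt(stat / 2))

print(srm_p_value(50000, 49900) < 0.001)  # False: tiny imbalance, no SRM
print(srm_p_value(50000, 48500) < 0.001)  # True: investigate before trusting results
```

If the p-value is below the threshold, something upstream (redirects, bot filtering, broken randomization) is skewing assignment, and the test’s results can’t be trusted regardless of the lift it reports.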
Pauline Marol & Josephine Foucher: What to Test Next – Prioritizing Your Tests
- Prioritization makes your tests impactful and meaningful. There are many frameworks, including:
- PIE Framework
- Bryan Eisenberg’s rules
- Monetate model
- Points model
- Every model boils down to balancing value (reach, lift, and strategic fit) and effort (creative, dev, and coordination).
- Use KPIs to communicate testing performance. They recommend:
- Testing velocity
- Win rate
- Conversion lift
- Benchmark your prioritization rules against your KPIs (correlation between value score and reality), and if your rules’ efficiency is not where you want it to be, revisit your rules.
- The prioritization process should be iterative and transparent:
- Share and educate.
- Build knowledge.
- Do your research, build prioritization rules, keep improving them.
Alex Harris: You Can’t Make This Stuff Up
- Moderated user testing is a goldmine of insight for conversion research, especially if you recruit from your ideal customer base. Their answers will give you better insight than the professional testers and biased opinions you get using remote testing software.
- Three things not to do in moderated user tests:
- Lead the participant.
- Interrupt or intervene at the wrong time.
- Teach or train rather than observe.
- Quantify your findings (~3 out of 6 said X).
- You can check their facial expressions as well for qualitative insight. What is causing them stress or anxiety?
- In order to understand what’s going on in your customers’ minds, you need to talk to them.
Marie Polli: When, Why and How to Do Innovative Testing
- Innovative testing is risky but can also bring huge wins. Examples of innovative testing include:
- Navigation changes
- Radical redesign
- New functionalities, features.
- Combination of multiple elements. You’re adding, changing or removing multiple elements.
- When to choose innovative testing:
- When an iteration won’t suffice. If the basics are off, you need to do innovative testing.
- When your testing potential is small. You’ve tested every element on the site already and reached the point of diminishing returns.
- When you don’t have much traffic. Even an uplift of 10–15% is difficult to measure.
- Innovative testing is still a better option than a radical redesign. “We want you to change our whole site.” You can do that with innovative testing. First, you change the layout: 61% uplift. Then the client wants to change the navigation: 7% decrease. In a radical redesign, that 7% decrease would’ve been masked by the 61% uplift – you wouldn’t have learned as much.
- Make sure your testing process is solid before running innovative tests. Four key questions to ask:
- Is the test based on a data-driven hypothesis?
- Does the question exactly match the hypothesis?
- Has the test been set up correctly with all the goals attached?
- Did you perform quality assurance?
- Perform qualitative research. Innovative testing isn’t the best option all the time, but it’s worth trying.
Justin Rondeau: “Best Practices” or “Common Practices” – Which Is It?
- Lower conversion rates are caused by:
- Poor offer construction
- Poor articulation
- Poor offer/audience match
- Common practices aren’t always bad. It’s not about reinvention, it’s about meeting expectations – and common practices meet expectation. Stop hatin’.
- If common practices are a way to articulate better, then they are useful to the optimization strategy.
- Common practices help:
- Teams with limited dev resources.
- Companies without a testing culture or filled with office politics.
- Teams with low traffic.
- “I’m all for innovation.” That’s not step one. Start with the basics, then go to innovation.
Viljo Vabrit: Using Urgency to Boost E-commerce Conversions
- Urgency makes people behave rashly in response to emotions.
- Example urgency tactics:
- Deal of the day.
- 1 left in stock.
- 9 minutes to add it to cart and choose next-day shipping.
- If you do it now, you get it gift wrapped and delivered for free.
- The last potato was purchased 30 minutes from Austin.
- There are 34 people looking at this potato right now.
- Checkout or the offer will expire.
- Urgency can decrease and also increase friction. People can feel like they’re being forced to take part in a race they don’t care about. Urgency does not equal persuasion.
- If you have a bad offer, it doesn’t matter how much urgency you add.
- How to make it work:
- Have an offer that aligns with what the user needs.
- Demonstrate you have a solution for pain points.
- Establish a unique sales position (the specific benefit to the customer that leads to the desired outcome).
Jen Havice: From WTF to Hell Yes – How to Come Up With Copy That Persuades
- There’s a disconnect between what you’re saying and what your customers want to hear. Before you start writing your copy, take some time to listen to your customers.
- How do consumers really feel about emails from businesses? Half said they receive irrelevant content on a daily basis – and many report emails as spam simply because they find them irrelevant.
- In another study, consumers were asked what they’d be willing to give up to get relevant content on their favorite websites: 25% said chocolate for a month, 21% their mobile phone for a day, 13% sex for a month.
- It boils down to one word: value. Our prospects and customers are literally telling us: “I can’t see the value in what you’re selling.”
- How can you convey value with your copy?
- Researching: Do the qualitative kind that answers “why”.
- Online review mining.
- Surveys by email or on-site popup.
Morgan Brown: Growth Hacking BS – Fixing Marketing One Truth at a Time
- Everyone is very excited about growth hacking, but they’re focused on “silver bullet” tactics. No great company was ever built on the back of a listicle.
- Growth hacking came from Silicon Valley, where people do not trust marketers. “Entrepreneurs thought marketers were going to buy them Super Bowl ads.”
- Growth Hacking in reality:
- It extends across the entire company.
- It’s really just product engineering + optimization.
- High tempo experimentation isn’t about one team in a silo. It’s about creating an entire organization designed to facilitate rapid experimentation and growth.
- Growth Process:
- Ideate: Unbridled ideation.
- Prioritize: Focus prioritization.
- Test: Rapid testing.
- Analyze: Learning.
- Hold a weekly growth meeting. Meet regularly – how would you feel if finance only “met when needed”? Act like a proper team!
Joel Harvey: Master Mobile – What We’ve Learned from Hundreds of Mobile Tests
- Mobile optimization best practices don’t exist. Mobile users convert differently.
- It’s your job to understand what your top ten most important combinations are (device/operating system/browser).
- How many times do your mobile users switch from landscape to portrait?
- Roughly 10% of people do.
- Why does it matter? Responsive designs often don’t respond well in both orientations.
- Test on 3G and 4G connections. They’re not always connected to WiFi.
- What to Test First:
- Headlines & CTAs
- Welcome mat
Stephen Pavlovich: SaaS Optimization – Effective Metrics, Process and Hacks
- Model: Goal > KPIs > Data > Insight > Strategy > Testing. But most people want to jump straight to the end – testing – and skip everything that comes before it.
- Figuring out your goal is fundamental. You need to work through everything in order:
- Goals: Why aren’t people converting? What’s stopping them?
- KPIs: Options – Always test through to active trials (SaaS)
- Strategy & Testing: The a-ha moment is the tipping point for retention.
- Free trials do not pay the bills. Don’t convert to trial before the user is sold on the product.
- Fix churn before acquisition – unchecked churn will flatline growth.
- The a-ha moment is the tipping point for retention.
- Deliver the a-ha moment ASAP – even before the user signs up
- Increasing profits through the psychology of pricing – if there are two bottles of wine (one $5 and one $10), you can increase profit by offering a bottle much higher in price (say $25) to anchor the price point making the middle one seem cheaper.
John Ekman: The Grand Unified Theory of Conversion Optimization
- There are a lot of different models and processes and tools, etc. Explaining them all and wrapping them all together is difficult. When do you apply what?
- “Essentially, all models are wrong, but some are useful.” – George E.P. Box
- In addition to the hypothesis, you need data. You create the hypothesis using data. The third thing you need is experiments. It’s a continuous loop.
- Those three create the wheel. You need the spokes that push it forward. There are three: behavior analysis, planning and prioritization, and research and analytics.
- This entire model is very close to the Lean Startup model. Too many people use it as a concept, not a concrete model. What? Data. Why? Behavior. What Instead? Hypothesis.
Day 3 – Analytics, Personalization and Retention
Rand Fishkin: Fight Back Against Back
- Google moved away from basic algorithms. Now they’re using machine learning algorithms. Machines learn to emulate the good and bad results.
- Deep learning means the machine comes up with a classification itself.
- Algorithms are building algorithms. Machines determine the rankings themselves.
- Google says it doesn’t know exactly which signals the machine-learning system uses, but leverages it heavily. It’s reportedly the 3rd most important ranking signal, and its power and influence are growing.
- A search result is good when: people don’t click back, they don’t go to page two, they don’t revise their search, top result gets the most clicks, etc.
- Engagement is becoming the web’s universal quality metric. It’s how search suggestions work, Chrome autocomplete, social networks’ trending topics, suggested accounts to follow, and what’s important in Gmail – things they predict will hook you on their service.
- Sites and brands earn an engagement reputation and it determines visibility.
- Quantity of Likes, Shares, etc. / Quantity of Posts = Engagement Reputation
- Simple, but powerful.
- Every product wants engagement / addiction. That formula is how they’ll get it.
- Whenever someone clicks “back,” they’re sucking that engagement out.
- Understand and serve all of your visitors’ intents.
- We can’t let the filter take care of it anymore (like we did in 2012).
- You can’t ask who is my customer? Need to ask who are all of the searchers? What are their needs? Serve as many as possible. If you’re not serving the searcher, Google won’t rank you. You don’t only need to deliver value to your buyers.
Annie Cushing: How to Give Your Data an Annual Checkup
- Your testing is only as good as your data. There are many issues with analytics configurations. Many setups are broken.
- Common mistake: Mistagged mediums.
- You have three parameters: medium, source, campaign name. You have to get medium right for sure. If you mess up medium, almost all of your reports will be off. Top publishers don’t even know the visits they’ve tagged aren’t showing up in their reports.
- Another mistake: Fractured views.
- When you set up a unique view, everything outside that bubble is still going on. It’s just invisible to that view.
- “Let’s create a view for this subdomain.” But if someone can easily move between subdomains, everything they do on the other subdomains is invisible. You’ll see (not set) values.
- Bots come in like a wrecking ball – you don’t want them in your data. Go to Admin > View Settings and opt into bot filtering. Do this for every view.
- Use custom dimensions to better understand your audience.
Karl Wirth: How to Boost Conversions with 1:1 Personalization
- If you talk to everybody, you’re not talking to anybody. Analytics, design, A/B testing, and usability testing are all popular and are not going away, but they still give you the same thing: the best experience for the average of all of us.
- Personalize when visitors first land for better acquisition. Personalize based on:
- Visitor’s location
- Referring source
- Relevant email capture. Once you know them, see what they’re interested in, etc. “If you give me your email, we’ll give you a discount on what you’re viewing.”
- Make every page of the site a landing page. Take the offer they skipped over and remind them of it (not annoyingly). “This is what you came for. Remember? Are you ready?”
- Targeted upselling – build an on-going relationship with people. The longer someone is on your site or logged in, the more people expect that you know them and won’t spam them or serve them a one-size-fits-all solution.
Brian Balfour: Optimizing Retention – The Silent Killer and King of Growth
- Retention is probably the most under-discussed part of CRO. If you have poor retention, nothing else matters – especially acquisition. If you’re not retaining users, growth (WAUs) will stall. If you are, growth will continue up and to the right.
- Why retention matters?
- Increased LTV –> you can afford a higher CPA
- Increased virality –> lower eCPA
- Increased upgrade rates –> shorter payback period
- Every improvement to retention improves other areas of your business.
- How do you improve retention? Cohorts. Segment the curve as much as you can with Mixpanel, Amplitude etc.
- Segments tell a different story if the retention curves look different. Once you segment it by user source, you might find one is bringing down your average.
- What part of retention should you optimize? There are four:
- New user experience (D1, W1, M1). Need to get users to experience the core value as quickly as possible.
- Mid-term retention. Once they experience the core value, we need to get users to create habits around core value.
- Long-term retention. Get users to experience this core value as often as possible over a long period of time.
- Resurrection. Get dormant users to reform their opinion and come back.
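The cohort segmentation Balfour recommends can be sketched in a few lines: bucket users by signup week, then compute the share still active N weeks later. This is a stdlib-only illustration with made-up data; in practice tools like Mixpanel or Amplitude do this for you.

```python
# Retention curves by signup-week cohort. The event rows and helper
# function are hypothetical, for illustration only.

from collections import defaultdict

# (user_id, signup_week, week_active) rows -- made-up sample data
events = [
    ("a", 0, 0), ("a", 0, 1), ("a", 0, 2),
    ("b", 0, 0), ("b", 0, 1),
    ("c", 0, 0),
    ("d", 1, 1), ("d", 1, 2),
    ("e", 1, 1),
]

def retention_curves(events):
    """Map each signup-week cohort to {weeks_since_signup: share_active}."""
    cohort_users = defaultdict(set)
    active = defaultdict(set)  # (cohort, weeks since signup) -> active users
    for user, signup_week, week in events:
        cohort_users[signup_week].add(user)
        active[(signup_week, week - signup_week)].add(user)
    return {
        cohort: {
            offset: len(active[(cohort, offset)]) / len(users)
            for (c, offset) in sorted(active) if c == cohort
        }
        for cohort, users in cohort_users.items()
    }

curves = retention_curves(events)
print(curves[0])  # cohort 0 retains 100% at week 0, ~67% at week 1, ~33% at week 2
```

Comparing the curves per cohort (or per acquisition source) is exactly how you find the segment that’s dragging down your average.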
Lea Pica: How To Present Your Testing Results to Get Results
- Why do bad things happen to good data?
- Understanding what makes brains work is the key.
- You have 8 seconds to capture their attention.
- Repetition commits information to their long-term memory.
- They don’t care about what you do all day, they want to know how to move the business forward.
- Three pillars of presentation enlightenment:
- Be your audience
- Use your presentation tools wisely.
- Maximize absorption
- Tips to understanding audience:
- Who’s attending the meeting? C-Suite? Tactical marketers? Don’t talk over people’s heads.
- What’s the takeaway? How can you write the insights like a BuzzFeed headline?
- Insights are more valuable than observation.
- Tips to using presentation tools:
- Death by PowerPoint? It’s not really a PowerPoint problem, more of a people problem.
- Stop using bullet points. No one likes them but you. Bullet points expose all info at once and our brains only like to process one idea at a time.
- Don’t put three ideas into one slide. Separate them into three slides.
- Don’t use clip art
- Tips to maximize absorption:
- Pie charts are the actual worst. They make your audience do more work to understand them, which means they don’t pay attention to you. Bar charts are better, but can be just as badly abused.
- Kill 3D.
- Reduce visual noise.
- Data labelling.
- Uniform color. Or strategic coloring to emphasize key learnings.
CXL Live 2016 was awesome. Next year will be even better! Make sure you sign up for our email list to get all the info and early discounts.