For a web analytics analyst or a data-driven marketer, these are words to live by: “Without data, you’re just another person with an opinion.”
Optimization isn’t about educated guesses and hunches, no matter how many years you’ve been in the industry. It’s about doing the research, asking the right questions, digging for clues in problem areas, paying attention to the signs when they appear, and running smart A/B tests.
Web analytics analysis is a big part of that. It's what separates an optimizer from just another person with an opinion.
A good conversationalist knows that asking closed-ended questions is no way to make real friends. Similarly, in marketing research, there are good survey questions and bad ones.
A/B testing is common practice, and it can be a powerful optimization strategy when used properly. We've written about it extensively. Plus, the Internet is full of "How We Increased Conversions by 1,000% with 1 Simple Change" style articles.
Unfortunately, there are experimentation flaws associated with A/B testing as well. Understanding those flaws and their implications is key to designing better, smarter A/B test variations.
As an optimizer, you might think that user interviews fall outside your role. Or perhaps you see them as a "nice to have" on the qualitative conversion research checklist. Worse, you might not be asking good questions at all because you're rolling with an "I'll just wing it" mindset.
User interviews are more complex and important than most optimizers realize.
You spend most days analyzing and interpreting numbers, right? You’re constantly sifting through Google Analytics dashboards, Formisimo reports, Mixpanel data – the list is endless.
When you spend so much time focusing on the numbers, it's easy to forget about the people generating those numbers.
That’s where qualitative conversion research comes into play. At least, that’s where it should come into play.