Customer journey maps: a step-by-step guide to mapping the user journey

“How do we get our customers to do what we want them to do?”
Digital marketers get asked this question all the time. But it’s the wrong question.
One of my favorite UX quotes comes from Chikezie Ejiasi, Head of Studio and Design Systems at Google.
He wrote: “Life is conversational. Web design should be the same way. On the web, you’re talking to someone you’ve probably never met—so it’s important to be clear and precise. Thus, well-structured navigation and content organization goes hand in hand with having a good conversation.”
A/B split testing is a new name for an old technique, controlled experimentation: traffic is split 50/50 between a control and a variation.
Yet for all the content out there about it, people still test the wrong things and run A/B tests incorrectly.
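To make that concrete, here's a minimal sketch of how a deterministic 50/50 split might work: hash each visitor's ID so the same person always lands in the same group. (The function name and IDs are illustrative, not from any particular testing tool.)

```python
import hashlib

def assign_variant(user_id: str, experiment: str) -> str:
    """Deterministically assign a visitor to control or variation.

    Hashing the user ID together with the experiment name yields a
    stable 50/50 split: the same visitor always sees the same version.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # map the hash onto 0-99
    return "control" if bucket < 50 else "variation"

print(assign_variant("visitor-123", "headline-test"))
```

Hashing (rather than flipping a coin on every page load) matters: a returning visitor who bounced between versions would contaminate your data.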
In digital analytics, it’s all about asking the right questions.
Sure, in the right context, you can probably get by doing what Avinash Kaushik refers to as “data puking,” but you won’t excel as an analyst or marketer.
You'll also consistently fall short of delivering real business value to your company.
You’ve read about color psychology, System 1 and System 2 thinking, emotional persuasion, etc. I know you have because it’s everywhere. It’s on Forbes, Entrepreneur, Inc., HelpScout, HubSpot… you name it. Hell, we’ve covered some of these topics ourselves.
Why? Well, because many psychological triggers do, in fact, work.
But there’s another side to using psychology online that almost no one is talking about: backfiring.
Psychology isn’t a magic formula that can be applied to optimization seamlessly in all scenarios, despite what many self-identified experts are preaching today.
How you design a survey or a form will affect the answers you get. This includes the language you use, the order of the questions, and, of course, the survey scale: the default values and ranges you use.
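As a toy illustration, consider two encodings of the same question; the wording, ranges, and defaults below are hypothetical, but they show the levers at play:

```python
# Two encodings of the same survey question. The range you offer and
# any preselected default both shape the answers you'll get back.
likert_5 = {
    "question": "How easy was the checkout process?",
    "scale": list(range(1, 6)),   # 1-5: coarse, with a visible midpoint
    "default": 3,                 # a preselected default anchors responses
}
numeric_11 = {
    "question": "How easy was the checkout process?",
    "scale": list(range(0, 11)),  # 0-10: finer-grained
    "default": None,              # no anchor; respondents must commit
}
```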
According to Amplitude, product analytics “show you who your users are, what they want, and how to keep them.”
I remember the first time a client told me how they increased their product's sign-up rate by 22% while reducing their marketing costs. The secret to their success?
They simply used their analytics data to make informed decisions.
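The analysis behind a decision like that doesn't have to be elaborate. Here's a hedged sketch of computing a sign-up rate per acquisition channel from a raw event log with pandas; the table and column names are made up for illustration:

```python
import pandas as pd

# Hypothetical event log: one row per user action.
events = pd.DataFrame({
    "user_id": [1, 1, 2, 3, 3, 4],
    "channel": ["ads", "ads", "organic", "ads", "ads", "organic"],
    "event":   ["visit", "signup", "visit", "visit", "signup", "visit"],
})

visits = events[events["event"] == "visit"].groupby("channel")["user_id"].nunique()
signups = events[events["event"] == "signup"].groupby("channel")["user_id"].nunique()

# Sign-up rate per channel; channels with no sign-ups get 0.
print((signups / visits).fillna(0))
```

A report like this tells you which channel converts, not just which one drives traffic, and that's the difference between data puking and an informed decision.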
Nowadays nearly every online shop utilizes some sort of product recommendation engine. It’s no wonder—these systems, if set up and configured properly, can significantly boost revenues, CTRs, conversion rates, and other important metrics.
Moreover, they can have considerable positive effects on the user experience as well.
This translates into metrics that are harder to measure but nonetheless essential to online businesses, such as customer satisfaction and retention.
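To ground the idea, here's a minimal item-based collaborative filtering sketch; the interaction matrix is made up, and real recommendation engines (matrix factorization, deep models) are considerably more sophisticated:

```python
import numpy as np

# Hypothetical user-item matrix: rows = users, columns = products,
# 1 = interaction (e.g. a purchase), 0 = no interaction.
ratings = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 0, 1, 1],
])

# Cosine similarity between item columns.
norms = np.linalg.norm(ratings, axis=0)
sim = (ratings.T @ ratings) / np.outer(norms, norms)

# Score items for user 0 by similarity to what they already bought,
# then mask out items they've interacted with.
user = ratings[0]
scores = sim @ user
scores[user > 0] = -np.inf
print("recommend item", int(np.argmax(scores)))  # item 2 for this data
```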
A good user experience equals more money. But how do we measure user experience? How do we know if it’s getting better or worse?
When you first start doing conversion optimization, you think that the biggest hurdles are technical: running an A/B test the right way, collecting data correctly, QA'ing tests.
These things are all important, of course. But the solutions are fairly straightforward, and when you reach a certain level of experience and skill, they tend to be a given.
No, the biggest obstacle to a testing program – even a mature program – tends to be human error and cognitive bias.
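One practical guard against both is to commit to the statistical test before you peek at results, rather than eyeballing a dashboard until it says what you want. A minimal sketch of a two-proportion z-test on made-up conversion counts:

```python
from math import sqrt
from statistics import NormalDist

# Made-up results: conversions / visitors for control (A) and variation (B).
conv_a, n_a = 180, 2000
conv_b, n_b = 220, 2000

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test

print(f"lift: {p_b - p_a:.3%}, z = {z:.2f}, p = {p_value:.3f}")
```

If p comes in below the threshold you set in advance, the lift is unlikely to be noise; if not, no amount of wishful squinting at the chart changes that.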