
We Quantified the UX of 5 Bike Websites. Here’s What We Learned. [Original Research]

Your design team likely thinks your website is number one compared to your competitors, but a quantified UX benchmark might tell you differently.

We all have our opinions on what good design looks like, but quantifying it and comparing it to competitors really shows where you stand. Once you know that, you can take action based on the insights.

This article outlines a UX benchmark study we conducted in partnership with Jeff Sauro and his team over at MeasuringU. We studied five road bike websites. We learned a lot in doing so, and you’ll certainly find some instant takeaways from our insights.

Even if you're not in the bike industry, you can still derive some UX guidelines from these sites. And, later, we'll outline how you can conduct a similar study.

Note: CXL Institute recently published a comprehensive report on eCommerce UX guidelines using the same methodologies as outlined in this post. Check it out here.

How Do We Do UX Benchmarking? Background and Methodology

“Benchmarking” can be an ambiguous word. We know that it means to evaluate something by comparison to the standard, but by what metrics are we evaluating it?

A UX benchmark without a rigorous and standardized process could quickly devolve into something tenuously linked to performance and overrun by opinions.

So we worked with Jeff Sauro and his team over at MeasuringU to leverage the SUPR-Q methodology, supplementing this standardized survey with an additional component that measures the clarity of a website's message (e.g., its value proposition).

The SUPR-Q judges a website on whether it is usable, credible, and visually appealing. It has 8 questions (including 1 NPS-style question), each scored on a 1–5 scale. The trust questions vary based on whether or not the site is commerce-oriented; the individual questions appear alongside each dimension in the results below.

Scoring is pretty straightforward: average the responses for the first 7 questions, then add half of the score for the Likelihood to Recommend question. This is your SUPR-Q score, which can then be compared to the industry benchmarks.
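As a worked example, that scoring rule can be sketched in a few lines of Python. The function name is ours, and we assume the Likelihood to Recommend item uses the standard 0–10 NPS scale:

```python
def supr_q_score(item_responses, likelihood_to_recommend):
    """Raw SUPR-Q score: mean of the first 7 items (1-5 scale)
    plus half of the Likelihood to Recommend score (assumed 0-10 scale)."""
    assert len(item_responses) == 7
    return sum(item_responses) / len(item_responses) + likelihood_to_recommend / 2

# A respondent who answers 4 on every item and 8 on the NPS question:
print(supr_q_score([4] * 7, 8))  # → 8.0
```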

Now, ours (we're calling it the Conversion-Focused UX, or CUX, model) differs slightly in that we want to be as conversion-focused as possible. So we added a clarity question to the user testing process as well as to the survey.

A total of 102 people completed this user testing + SUPR-Q study. We then calculated the metrics and benchmarked the sites against each other using both the SUPR-Q site database and our own database of scored sites that includes the additional clarity component.
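The benchmarking step boils down to a percentile rank: what share of sites in the database a given site's score beats. A simplified illustration (not MeasuringU's exact method):

```python
def percentile_rank(score, database_scores):
    """Percentage of database sites whose score is below this site's score."""
    below = sum(1 for s in database_scores if s < score)
    return 100.0 * below / len(database_scores)

# A site scoring 7.5 against a (hypothetical) database of five scored sites:
print(percentile_rank(7.5, [6.0, 7.0, 7.2, 8.0, 8.5]))  # → 60.0
```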

Finally, we took qualitative feedback from the user tests to bring you additional guidelines and insights – things you can possibly bring to your own website.

Specific Study Design for Bike Websites

So you know the general methodology we use for the study. Now I’ll quickly outline the specifics of this study (feel free to jump down to the results if you’re bored by the specifics).

Here are the websites we studied: Trek, Kona, Giant, Specialized, and Felt.

We had 102 curated participants complete a task on each website: search for a particular item and add it to the cart. Then they answered the SUPR-Q survey and our added clarity question.

Results

Here are the benchmarking results for each corresponding UX dimension…

Appearance

Percentile rank scores compared to 100+ scored sites in the database. Appearance Survey Questions: 1. I found the website to be attractive, 2. The website has a clean and simple presentation. 90% confidence interval around the mean is indicated.

Clarity

Percentile rank scores compared to 100+ scored sites in the database. Clarity Survey Question: 1. I clearly understand why I should buy from this website instead of its competitors. 90% confidence interval around the mean is indicated.

Note: Clarity is not a dimension on the original SUPR-Q; rather, we added it to reflect a crucial part of conversion-focused design.

Credibility

Percentile rank scores compared to 100+ scored sites in the database. Credibility Survey Questions: 1. I feel comfortable purchasing from this website, 2. I feel confident conducting business with this website. 90% confidence interval around the mean is indicated.

Loyalty

Percentile rank scores compared to 100+ scored sites in the database. Loyalty Survey Questions: 1. How likely are you to recommend this website to a friend or colleague? 2. I will likely visit this website in the future. 90% confidence interval around the mean is indicated.

Usability

Percentile rank scores compared to 100+ scored sites in the database. Usability Survey Questions: 1. This website is easy to use, 2. It is easy to navigate within the website. 90% confidence interval around the mean is indicated.

SUPR-Q

Percentile rank scores compared to 100+ scored sites in the database. 90% confidence interval around the mean is indicated.
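The 90% confidence intervals mentioned in these captions can be approximated as in this sketch, using the normal approximation with z = 1.645 (a simplification; MeasuringU's actual calculations may differ):

```python
import math
import statistics

def ci90(scores):
    """90% confidence interval around the mean (normal approximation, z = 1.645)."""
    mean = statistics.mean(scores)
    sem = statistics.stdev(scores) / math.sqrt(len(scores))
    margin = 1.645 * sem
    return mean - margin, mean + margin

# Hypothetical item scores from five respondents:
low, high = ci90([4, 5, 3, 4, 4])
print(round(low, 2), round(high, 2))  # ≈ 3.48 4.52
```

With only five responses the interval is wide; the study's 102 participants shrink the margin considerably, which is why the intervals in the charts are narrow enough to separate the sites.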

Takeaways: 10 Things We Learned

Now, unless you work at Trek, Kona, Giant, Specialized, or Felt, you might be wondering – what do I do with that information?

Well, we sifted through our hundreds of user tests and compared them with our benchmarked data and preferences questions to tease out some patterns, and even “best practices,” if you will.

Here are ten takeaways from our study:

1. Invest in high-quality photography. This applies to graphics, product pictures, and all other images on your site. Some of the most abundant responses during user testing were comments like these about the Trek website: “the photos were well done,” “the graphics were awesome,” and “nice, large, clear product pictures.” If you’re going to invest in one thing, it should be professional images.

2. Aim for a clean and simple layout. The phrase “clean and simple” comes straight from user testing. These two words were overwhelmingly the most frequently shared pieces of feedback. People love a minimal site layout. When facing a cluttered design, comments like, “the layout was too busy for my taste” (Felt website) and “the website was crowded, making navigation a little more difficult” (Specialized website) surfaced.

3. When in doubt, follow conventional design placement. This “lesson” addresses the power of typical design and the danger of atypical design. Conventionally designed websites create intuitive flow for users. Instead of wondering where to go next, users are already there. It’s the difference between going somewhere you’ve been to a thousand times and going somewhere for the first time. We saw many comments like this one about the Trek site: “I liked how easy it was to use. I knew right where to go in order to get where I needed to be.”

On the other hand, unconventional web designs threw users off course. Some feedback we received on Specialized's atypical website design: "Really awkward presentation, sideways item classifications … very difficult to find what I was looking for," and "It was very awkward to navigate the links, it seemed more focused on looking ultra-modern than being useful."

4. The navigation menu should be noticeable and placed at the top of the homepage. This falls under the category of conventional design (#3 above). However, it was mentioned so often during user testing that it has earned an entire recommendation of its own. A big navigation bar at the top of the Trek and Felt home pages was called "simple," "easy," "great," "good," and "not hidden behind a bunch of junk." It's also worth adding that users often complained about hidden menus ("The dropdown menu was annoying" – from a user on the Specialized site).

5. Bigger is (usually) better. Images, copy, and buttons are made to be seen, and moreover, understood. People said they liked the "nice, large product pictures" on the Felt site, and had qualms about "small menu buttons" and the font being "way too small" on the Specialized site. If an element is too small to be understood, it's hurting your site – and ultimately your wallet.

6. Optimize product categories. Another common issue dealt with users' inability to find the product they were looking for (we tasked them with purchasing a jersey) – specifically, determining which category the product lived in. When the categories were aptly named, we received comments like this one about the Felt website: "The website was incredibly professional, and the best part was it was categorized so well, it only took me a few seconds to find what I was looking for. Definitely recommend this website!"

It didn't always go this well, unfortunately. We saw lots of comments like this one about the Specialized website: "There were lots of menu layers to navigate. It wasn't straightforward, I had to do some experimental clicks cause I was looking for 'Jersey' and the menu didn't have that word."

Card sorting (and reverse card sorting) is a good technique for determining logical product categories and organization.

7. Page load should be as fast as possible. The one issue we saw across every site, even the most "usable" ones, was loading time. Whether it was links, images, or an entire web page, people had a problem with waiting. "The website was very, very slow to load compared to others" (Giant website) basically translates to "I would abandon this site for its competition." Check out this post to optimize your site speed.

8. Minimize the number of clicks users have to make to complete their goal. Regardless of the objective (reading an article, making a purchase, contacting customer service), users want to accomplish it quickly and effortlessly. When the experience went poorly, we saw many comments like, “it took a ton of clicks to navigate anywhere,” and “there were too many clicks to get what I wanted” (Specialized). When it went well, we saw feedback like, “It took only a few clicks to get where I needed to go” (Felt).

Conduct your own user testing, ask a friend or family member to navigate your site, or simply analyze the site yourself. Consider each click that’s made, whether the action was necessary, and if there’s any way to get rid of the click without taking away from the usability of your site. Repeat this process, following all major paths users take on your site (analyze your clicks for people who are making a purchase, for people who want to return a product, for people who have a question, etc.).

9. Make sure your search is actually functional. First: provide a search bar. While this seems like common sense in 2016, users of 2 of the 5 sites tested complained that no search bar was offered.

Next, take advantage of your site search data. This information will tell you what people call products (i.e. what you should name them), which products people are buying most often, and so much more. Lastly, if a product is unavailable for some reason, don’t show users a mysteriously blank results page. Communicate the situation with them. Check out our piece on internal site search optimization.

Specialized site: "I also wasn't able to search more specifically and the search feature started to break it seemed after the first try."
Kona site: “there is no search bar… I had to click through multiple pages to find the jersey.”

10. Ensure your site is secure. Upon entering the Giant website, the majority of participants were greeted with “an SSL certificate error stating [their] connection was not private.”

This (rightfully) warranted alarmed comments from participants.

Google Help provides useful information for securing a website.

Conclusion

We quantified the UX of five bike websites and learned a lot. First, we can see which sites are doing better in comparison to their competitors (something you can quantify with similar methodologies – feel free to reach out to us with questions if you’re interested in conducting your own).

And second, we have a series of "guidelines" based on the qualitative data. These are things you can add to your testing backlog to potentially increase conversions.


