Sites that don’t work don’t convert.
That’s why optimizers conduct quality assurance on sites, landing pages, test treatments, email campaigns, you name it… to make sure they work the way they’re supposed to.
While it’s common knowledge that quality assurance is something you should do, not enough optimizers complete it properly. If they did, there wouldn’t be so many sites that just plain don’t work.
So, What Exactly Is Quality Assurance?
Here’s how Lucian Adrian Stroie of R/GA explains quality assurance…
For optimizers, quality assurance is about ensuring the quality of a site, landing page, test treatment, email campaign, etc. before visitors have time to suffer through friction and other “conversion killers”.
When conducting quality assurance, frequently referred to as QA, you’ll work closely with the development team to ensure everything looks and works correctly on all relevant devices and on all relevant browsers.
Ideally, you conduct quality assurance before launching the tested item, but it’s never too late to start.
You’ll find a lot of low-hanging fruit if you’ve never conducted quality assurance on your site. Quality assurance for a test treatment, however, must always happen prior to launch, or you’ll distort your data.
After all, if you don’t conduct proper quality assurance, you might declare the best treatment you could’ve possibly created a loser due to a technical problem.
User Testing vs. Quality Assurance
There is some uncertainty around whether quality assurance is just user testing. Often, you’ll see them lumped together and the terms used interchangeably.
There is a difference: user testing focuses on how the user actually experiences the site, while quality assurance focuses on the site itself and how it stacks up against the developer’s intentions.
User testing is…
- Examining how real people perceive and use your site / software.
- Exploring points of visitor / user misunderstanding, unexpected visitor reactions, points of friction, etc.
- About understanding how the visitor experiences the site and how that’s different from the developer’s intention.
Quality assurance is…
- Examining the site itself.
- Looking for bugs, glitches, errors, broken links, points of friction, etc.
- About creating a faster, cleaner, better site that works the way the developer intended.
They complement one another well, but are two very different things.
You might also consider combining the qualitative user testing data described above with quantitative survey data to create a UX baseline. A UX baseline serves two purposes…
- To better understand the effect of your design changes.
- To better understand how your site’s UX compares to your competitors’.
According to Ben Labay from CXL Institute, you should create a UX baseline during the conversion research phase…
To get the quantitative survey data, you must…
- Develop task / study objectives and success criteria.
- Collect data via an unmoderated remote user testing survey.
- Turn the survey results into a score for each dimension of site quality.
Using your baseline, you can measure how UX is improving (or declining) over time, meaning you can measure the effects of your treatments.
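To make the scoring step concrete, here is a minimal sketch of turning raw survey responses into per-dimension baseline scores. The dimension names, the 1–5 Likert scale, and the 0–100 normalization are illustrative assumptions, not a fixed methodology:

```python
# Minimal sketch: turning raw survey responses into per-dimension
# UX baseline scores. Dimensions and the 1-5 scale are assumptions.
from statistics import mean

# Each response rates several dimensions of site quality on a 1-5 scale.
responses = [
    {"clarity": 4, "trust": 3, "ease_of_use": 5},
    {"clarity": 5, "trust": 4, "ease_of_use": 4},
    {"clarity": 3, "trust": 2, "ease_of_use": 4},
]

def baseline_scores(responses):
    """Average each dimension and normalize to a 0-100 score."""
    dimensions = responses[0].keys()
    return {
        dim: round(mean(r[dim] for r in responses) / 5 * 100)
        for dim in dimensions
    }

scores = baseline_scores(responses)
print(scores)  # {'clarity': 80, 'trust': 60, 'ease_of_use': 87}
```

Re-running the same survey after a redesign and comparing the two score sets is what makes this a baseline rather than a one-off measurement.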
UX vs. QA
Given that user testing and quality assurance are complementary, it shouldn’t be surprising that the same can be said of user experience and quality assurance. Jakob Nielsen of the Nielsen Norman Group explains…
The more interaction between UX and QA, the better. Anna Schmunk of Dave Ramsey suggests that they should be “BFFs”…
UX (and the entire development team) and QA teams should be working together from the beginning. This allows QA to better understand the UX team’s intentions, making it easier for them to assess quality and spot bugs. While most people think of QA helping UX, it really can be a two-way street.
Why Is Quality Assurance Important?
Maybe before you push something live, you’re checking it on your desktop, on your phone, in Safari, and in Chrome to make sure everything looks and works right.
At CXL Live, Marie Polli, a senior conversion strategist at CXL, explained how extensive proper quality assurance is and how risky it can be not to perform it correctly…
Marie was talking about innovative testing, which is riskier by nature, but launching something that doesn’t work properly is always risky. And what works on one browser might not work on another, as Ian Newman of Box UK explains…
If you don’t check all of the browsers (and all of their versions) as well as all of the devices (and all of their operating system versions), you won’t know whether something is broken. And while you might consider Browser XYZ and Device ABC irrelevant and outdated, I assure you someone has tried to visit your site using them.
Conducting proper quality assurance means minimizing two issues…
- Sites that don’t work hurt trust and credibility.
- Sites that don’t work cause frustration.
If you’re interested in being perceived as trustworthy and minimizing visitor frustration (and you should be), then you will want to perform proper quality assurance. As Jakob explains, your site will be tested no matter what…
For best results, quality assurance should be conducted on…
- Your landing pages.
- Your entire site.
- Your A/B test treatments.
- Your email campaigns, including transactional.
What to Look For While Conducting Quality Assurance
According to Usability First, the first step is to create a set of guidelines…
Your company should come up with its own set of guidelines to follow while conducting quality assurance testing. The guidelines should address editorial, graphics, and coding conventions. After the site has been built, it should be put through a rigorous post-production process. Finally, there should be a provision for user feedback, which can influence the ongoing maintenance of the site.
For best results, have the QA team work with the UX team to develop these guidelines. For example, here are some categories that you might find in a set of guidelines (plus links to CXL articles where you can learn more, where applicable)…
- Error messages.
- Copy and content.
- Image quality / performance.
- Font style and size.
- Site security.
- Online forms.
- Email delivery (emails are sent as expected).
- Bugs and crashes. (Visitors might blame themselves or develop superstitions about what will and will not result in a bug / crash.)
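Some of these guideline checks can be automated. As one hedged example, here is a sketch of a broken-link audit using only Python’s standard library; the sample HTML is illustrative, and a real check would run against rendered pages and also verify that each URL actually resolves:

```python
# Sketch of one automatable guideline check: flagging links that go
# nowhere. Empty hrefs and bare "#" anchors are dead ends for visitors.
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.suspect_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href in ("", "#"):
                self.suspect_links.append(href)

sample_html = (
    '<a href="/pricing">Pricing</a>'
    '<a href="#">Learn more</a>'
    '<a href="">Sign up</a>'
)
auditor = LinkAuditor()
auditor.feed(sample_html)
print(auditor.suspect_links)  # ['#', '']
```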
Going Through Your Funnels
If you haven’t conducted quality assurance before, this is the perfect place to start. It just makes sense to start by conducting quality assurance on the user experiences that are the most profitable, right?
If you don’t go through your funnels step-by-step to ensure quality, you are neglecting to plug the biggest, most expensive leaks.
Worse, you could waste thousands of dollars sending paid traffic to a landing page, only to find out that people aren’t converting because of a problem on the checkout page.
To make sure the elements of your funnel are technically sound and issue-free, try writing scenario-based use cases: tasks you assign yourself. For example, “submit lead gen form for eBook 1” or “make a purchase by searching Google for keyword X”. Think of it as user testing, but you’re the user.
As Catriona Shedd of SalesforceIQ mentions, you should also create scenarios where you fail to perform the task…
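One lightweight way to keep both the success and failure scenarios organized is to write them as data. The sketch below is a hypothetical structure (the step names and expected outcomes are made up); in practice each step would drive a browser or an HTTP client rather than a simulated outcome:

```python
# Sketch: funnel scenarios as data, including a deliberate failure case.
scenarios = [
    {
        "name": "submit lead gen form for eBook 1",
        "steps": ["open landing page", "fill form", "submit"],
        "expect": "thank-you page",
    },
    {
        "name": "submit lead gen form with invalid email",
        "steps": ["open landing page", "fill form with bad email", "submit"],
        "expect": "inline validation error",  # failure should fail gracefully
    },
]

def run_scenario(scenario, observed_outcome):
    """Compare what actually happened against the scenario's expectation."""
    return {
        "name": scenario["name"],
        "passed": observed_outcome == scenario["expect"],
    }

# Simulated outcomes; real runs would come from a browser session.
results = [
    run_scenario(scenarios[0], "thank-you page"),
    run_scenario(scenarios[1], "server error page"),  # no graceful failure: a bug
]
print([r["name"] for r in results if not r["passed"]])
```

The second scenario is the important one: it only passes when the site fails the way it is supposed to.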
Finally, don’t forget to conduct quality assurance on your analytics. As you’re going through your funnel and looking for issues, check to ensure your activity is being tracked properly in Google Analytics. We’ve written an entire article on conducting a Google Analytics health check, which you can use to help you with this step.
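A coarse first pass at analytics quality assurance can also be scripted. This sketch only checks that each key funnel page contains a Google Analytics tag; the page HTML is simulated here, and a real check would fetch live pages and, ideally, inspect the hits actually being sent rather than just the snippet:

```python
# Sketch: flag funnel pages that are missing a Google Analytics tag.
# Marker strings cover the common gtag / analytics.js / GTM snippets.
TRACKING_MARKERS = ("gtag(", "ga(", "googletagmanager.com")

def has_tracking(html):
    return any(marker in html for marker in TRACKING_MARKERS)

# Simulated page HTML keyed by path; real code would fetch these.
pages = {
    "/landing": "<script>gtag('config', 'G-XXXX');</script>",
    "/checkout": "<div>No analytics snippet here</div>",
}
untracked = [path for path, html in pages.items() if not has_tracking(html)]
print(untracked)  # ['/checkout']
```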
Cross-Browser and Cross-Device Testing
Next, let’s focus on making sure your site (or treatment or email or…) displays properly on all browsers and devices. Note that you conduct cross-testing for two reasons…
- To discover bugs and errors. (You’re trying to break something.)
- To verify user experience. (You’re making sure the experience is as the developers intended.)
Google Analytics can help you find problem areas by looking at how your current visitors browse your site.
First, navigate to Audience > Technology > Browser & OS. Then, switch from the default “Data” view to the “Comparison” view. You can compare based on any metric (e.g. conversion rate, revenue), but you’ll see below that I’ve chosen bounce rate…
Note that you shouldn’t be comparing browsers to one another. Instead, compare browser versions within the same browser family. So, click on the browser to see browser versions…
Now you’re looking at a prioritized list of browser versions to focus your energy on for the biggest ROI.
Then, you can go to Audience > Mobile > Devices and use the same “Comparison” view…
Again, compare within the same device family. So, you can immediately see that Samsung SM-8312E is a problem area, which is a good place to start exploring.
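The logic behind that “Comparison” view is simple enough to reproduce offline when you want a prioritized worklist. In this sketch, each browser version’s bounce rate is compared to its family average, and versions well above average float to the top; the numbers are made up for illustration:

```python
# Sketch: prioritize browser versions whose bounce rate exceeds the
# family average, mirroring GA's "Comparison" view. Data is illustrative.
from statistics import mean

chrome_versions = {
    "Chrome 120": {"sessions": 5000, "bounce_rate": 0.42},
    "Chrome 119": {"sessions": 1200, "bounce_rate": 0.45},
    "Chrome 103": {"sessions": 300, "bounce_rate": 0.81},
}

family_avg = mean(v["bounce_rate"] for v in chrome_versions.values())

def priorities(versions, avg):
    """Versions sorted by how far their bounce rate exceeds the family average."""
    flagged = {
        name: round(stats["bounce_rate"] - avg, 2)
        for name, stats in versions.items()
        if stats["bounce_rate"] > avg
    }
    return sorted(flagged.items(), key=lambda kv: kv[1], reverse=True)

print(priorities(chrome_versions, family_avg))  # [('Chrome 103', 0.25)]
```

You could weight the gap by session count so that a slightly elevated bounce rate on a high-traffic version outranks a huge gap on a version almost nobody uses.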
You can also use custom reports to keep an eye on the data more efficiently. Here’s a quick example, but you can customize the report however you like, depending on your core metrics. Here’s what the device tab of the report might look like…
And here’s what the browser tab might look like…
Now, if you’re conducting quality assurance pre-launch, the typical advice is to start with the most popular browsers and devices and then work your way down to the less popular ones. However, this isn’t your only option.
Chris Ashton of the BBC explains how he and his team do cross-testing to save their sanity…
Here’s how that looks visually…
In step one, you’ll focus on the browser-agnostic issues. In step two, you’ll find many, many more issues by checking your most problematic browsers, which will make your site more resilient in your less problematic browsers.
Once you’re confident in the first two steps, you can move on to the other browsers knowing you’ve likely already fixed the majority of the issues.
You can also use tools, like BrowserStack…
…which allow you to test your site instantly on different browsers and devices. There are a few different testing methods to choose from, depending on the tool you choose…
- Live testing. An interactive lab where you can run live tests in hundreds of browser and OS combinations.
- Automated screenshots. Check your site’s design across multiple browsers using screenshots.
- Local testing. Test your site while it’s still in development, behind a firewall or on your desktop.
- Actual device testing. Test your site on actual browsers, not using emulators for even more accurate results.
- Selenium cloud testing. Automate browser tests across thousands of browsers for faster debugging and a pre-supplied infrastructure and Selenium grid.
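For the Selenium route, the usual starting point is a capability matrix: every browser/OS combination you intend to cover. The sketch below builds such a matrix with plain Python; the capability keys mirror the common BrowserStack-style format but are assumptions, and in real use each entry would be passed to `selenium.webdriver.Remote` against your provider’s hub URL:

```python
# Sketch: build a browser/OS capability matrix for grid-based testing.
# Combinations and capability keys are illustrative assumptions.
from itertools import product

browsers = [("Chrome", "120"), ("Safari", "17")]
platforms = [("Windows", "11"), ("OS X", "Sonoma")]

def build_matrix(browsers, platforms):
    matrix = []
    for (name, version), (os_name, os_version) in product(browsers, platforms):
        if name == "Safari" and os_name != "OS X":
            continue  # Safari only ships on macOS
        matrix.append({
            "browserName": name,
            "browserVersion": version,
            "os": os_name,
            "osVersion": os_version,
        })
    return matrix

matrix = build_matrix(browsers, platforms)
print(len(matrix))  # 3: Chrome on both platforms, Safari on OS X only
```

Generating the matrix in code, rather than maintaining it by hand, makes it easy to regenerate whenever your Google Analytics data changes which versions matter.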
So, for example, cross-testing tools allow you to browse your site through the lens of a Samsung mobile device…
and then easily switch over to an iPad…
Given there are so many different browsers, browser versions, devices and operating systems, tools are a definite asset.
A Special Note on Mobile Quality Assurance
As we’ve written about before, don’t set up a responsive design and call that mobile optimization. Just because you have a responsive design does not mean your site displays and works properly on all browsers and devices. You still need to conduct quality assurance.
Due to the nature of mobile, you also need to consider some mobile-specific factors. Remember, a quality desktop experience will typically look quite different from a quality mobile experience.
Talia Wolf of GetUplift.co offers some advice and identifies some common problem areas…
Of course, the list goes on and on. When you switch to mobile quality assurance, switch your mindset and adjust your definition of quality, too.
Check your site, landing pages, test treatments, and email campaigns for bugs and errors. Seems like pretty straightforward advice, right?
Yet there are still many sites that just don’t work.
Here’s how you can start plugging leaks and improving the quality of your site…
- Your UX and QA teams should work together closely so that they can both perform their jobs better, resulting in a better performing site.
- Conduct quality assurance on your analytics. Is activity being properly reported in Google Analytics?
- Create a set of guidelines with the help of the development team before you begin quality assurance.
- If you haven’t conducted quality assurance before, start by going through your funnels. Use scenario-based use cases to complete tasks and fail to complete tasks, both of which will shed light on issues.
- Use Google Analytics to help you find browsers and devices that aren’t performing as well as others.
- Quality assurance is time-consuming, so use cross-testing tools or modified strategies, like the BBC’s 3-phase approach, to get a bigger ROI faster.