You know how, when you search for something on Google, you sometimes see review stars next to a search result?
Does it work to attract more clicks?
Inspired by our study Which Types of Social Proof Work Best?, we set out to quantify review stars as a way to increase click-through rates (CTR) in search engine results pages.
What kind of improvement in CTR can we get from including review stars in search engine results, if any? What does that mean for application in your business? We attempt to answer these questions with hard data in this CXL Institute study.
Our research was performed in collaboration with Nitin Manhar Dhamelia from Belron® International, an automotive glass replacement and repair group.
Results summary

This study showed that review stars in search engine results significantly improve click-through rates, by as much as 35%. Our researchers conducted a custom research study to identify differences in click-through rate (or "CTR") among variations of organic search results pages for two markets: one in Belron®'s most Western market, and another in a large European market. In both markets, review stars on the search engine results page (or "SERP") produced a significantly higher click-through rate: a 35% traffic uplift in the Western market, and a 13% traffic uplift in the European market. Some statistics of note:
- In the Western market, users took less time deciding whether or not to click an organic search result that included review stars.
- The organic Area of Interest (or "AOI") in treatment 1, which included review stars, scored higher on both the percentage of users clicking and the number of clicks.
- In the surveys for both the Western and European brands, the first organic SERP result with review stars had a significantly higher click-through rate.
In this study, we compared two different search engine results page (SERP) treatments for two different markets, which we’ll call “the Western market” and “the European market” (made-up terms for the study). Two markets were used to replicate the study design in a different country and language.
The Google search terms used for these market tests were generic terms, to ensure that the test wasn't undermined by any brand-term influences. We used a combination of eye tracking and a 'click-thru' survey for maximum certainty of results. The survey used essentially the same setup as the eye tracking: a user followed a task, then evaluated and interacted with a SERP.
- the eye tracking helped provide a baseline of how the inclusion of review stars influenced how visitors visually processed the SERP content,
- the survey allowed us to quantify click-through rate (CTR) with sample sizes large enough to determine statistically significant differences in CTR among treatments.
A CXL Institute lab eye-tracking process and survey analysis were used for the Western market analysis, and just a survey analysis was used for the European market.
Study Question: Do review stars in SERPs drive higher click-through rates?
Data Collection Methods and Operations
Our researchers conducted a custom research study to identify differences in CTR among variations of organic search results pages for two markets.
- 2 target markets X 2 variations = 4 treatment ‘stimuli’
- 2 Target markets: 1 Western market & 1 European market
- Variation differences are inclusion/exclusion of review stars on target SERPs for each market
- Eye-tracking and custom click-thru surveys were used to collect CTR data among variations
The metrics we analyzed for eye tracking on the Western market include time to first fixation, average time fixating, percent clicking, number of clicks, average time to first click, and average number of visits.
Data from the eye-tracking analysis was used to evaluate the approximate difference in CTR between treatments (with & without review stars) to determine sample size for the survey analyses.
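To illustrate how an eye-tracking pilot can size a follow-up survey, here is a minimal sketch using the standard sample-size formula for a two-sided two-proportion test. The pilot CTRs below are hypothetical placeholders (the study does not publish its raw CTR figures), and the function name is ours:

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_two_proportions(p1, p2, alpha=0.05, power=0.80):
    """Participants needed per treatment to detect the difference
    between CTRs p1 and p2 with a two-sided two-proportion test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    p_bar = (p1 + p2) / 2                          # pooled proportion
    n = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
         + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2 / (p1 - p2) ** 2
    return ceil(n)

# Hypothetical pilot CTRs: 30% without stars vs. 40% with stars.
print(sample_size_two_proportions(0.30, 0.40))
```

Note how quickly the required sample grows as the expected CTR difference shrinks, which is why the eye-tracking estimate of the effect size mattered before fielding the survey.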
Results are displayed in heat maps to visually compare results, as well as a summary statistics table to compare numerical data.
For the eye tracking heat map study, we used about 100 people to compare each of the Western market brand SERP variations, for a total of around 200 people.
For both markets, the click-thru survey included approximately 500 participants to validate the results; this sample size was determined beforehand from the results of the eye-tracking analysis.
Of the roughly 200 people in the eye-tracking study, around 59% produced results of sufficient quality to include in our data analysis, which gave us about 60 participants for Treatment 1 and 58 for Treatment 2.
Industry standards (via NN/g and Tobii) suggest a sample of 30 people for valid heatmaps.
Here is an example of the Western brand survey. The European survey used a translated version provided by Belron®.
Scenario given to study participants:
You live in North America and are in need of windshield repair. You are on Google searching for a company to call.
View the following search engine results page, consider the options you see, and click on the link that you would choose, considering only the information you have on the page.
Post Task Questions:
- What made you choose the link that you did?
- What company brands have you used for windshield repair/replace in the past?
- What’s important to you when searching for windshield repair?
The survey results provided us with click-through data for each treatment. For the first organic search result, we looked for differences in CTR using an N-1 two-proportion test.
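The N-1 two-proportion test is the pooled two-proportion z-test with the statistic scaled by √((N−1)/N), a correction recommended for comparative 2×2 tables. A minimal sketch, using hypothetical click counts since the study's raw counts aren't published:

```python
from math import sqrt
from statistics import NormalDist

def n1_two_proportion_test(clicks_a, n_a, clicks_b, n_b):
    """N-1 two-proportion test: a pooled two-proportion z-test with the
    z statistic multiplied by sqrt((N - 1) / N)."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    total = n_a + n_b
    p_pool = (clicks_a + clicks_b) / total
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se * sqrt((total - 1) / total)
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    return z, p_value

# Hypothetical counts: 200/500 clicks with stars vs. 150/500 without.
z, p = n1_two_proportion_test(200, 500, 150, 500)
print(round(z, 2), round(p, 4))
```

With counts like these, the test returns a p-value well below 0.05, which is the kind of evidence behind the CTR differences reported above.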
Other Key Info (like treatment variations)
For the Western market SERP, we started by showing a user one of two variations of the search engine results page: one where the result's meta description included review stars for the page, and one where it did not.
To paint a picture, here is an example of the brand treatment variations for the Western market SERP (images blurred on purpose):
The Belron® eye-tracking test for the Western market monitored the places on the SERP where users' eyes rested longest and associated these fixations with predetermined 'areas of interest' (AOIs) that correspond to individual search results.
Here is an example of a SERP with AOIs indicated. (The starred options in the image below are AOIs that include reviews):
In looking at the Belron® eye tracking heat map, you can see that the user’s eye is drawn to the upper left-hand corner of the SERP. Without review stars, the user showed a higher percentage of fixations further down the page on average.
Once we figured out where people were looking, we also analyzed how long it took them to click and where they ended up clicking.
These findings are limited to the individual Western market and European market studied.
While it might be tempting to extrapolate these findings to other markets, each company should run its own experiments with review stars in SERP results, as each company's and market's user base differs.
Review stars definitely work to drive attention. This study showed that they can boost CTR by as much as 35%, and in both surveys there was a >97% probability that the treatment with review stars had a significantly higher CTR than the treatment without.
As Google's search results pages get busier in general, and it becomes harder and harder to rank #1, review stars are a great tactic for increasing your CTR.
They invite attention and clicks because they are a form of social proof, causing the user to focus on them more seriously for longer amounts of time.