The reviews that are displayed in the Local Search Pack are an important signal to potential customers, but do they have an impact on those search rankings? Participants in the Moz Local Search Ranking Factors survey rated reviews as responsible for 15.44% of overall ranking signal.
SEO practitioners clearly believe that reviews play a role in ranking. In this post I'd like to go beyond the survey and try to quantify the impact of reviews on local search.
Some clarifications before diving into the numbers.
- The rankings refer to the Local Search Pack. The local pack appears separately from organic results and features a map. Most often appearing with three initial locations, it can be expanded to show more local listings.
- Proximity to the searcher is known to be an important factor for local pack results. For this experiment, my search location is outside the region my keywords are focused on.
In this experiment I'm focused on one city and several business categories. For each category, I have data-mined the local pack results and produced a plot from that data. On each plot I'll display a line of best fit, as well as a PPMCC score for that category.
The Pearson product-moment correlation coefficient (PPMCC) is a score ranging from -1 to 1 that indicates the strength and direction of a linear relationship between two variables -- a PPMCC of zero means there is no linear relationship.
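For readers who want to reproduce this kind of analysis, here's a minimal sketch of the calculation using NumPy. The rank and rating arrays below are made-up example values for illustration, not my actual category data:

```python
import numpy as np

# Hypothetical example data: local pack position (1 = top result)
# and each listing's average star rating.
ranks = np.arange(1, 11, dtype=float)
ratings = np.array([4.9, 4.8, 4.8, 4.6, 4.7, 4.4, 4.5, 4.2, 4.3, 4.0])

# Pearson product-moment correlation coefficient (PPMCC).
r = np.corrcoef(ranks, ratings)[0, 1]

# Least-squares line of best fit: rating ~ slope * rank + intercept.
slope, intercept = np.polyfit(ranks, ratings, 1)

print(f"PPMCC: {r:.2f}")
print(f"best fit: rating ~ {slope:.3f} * rank + {intercept:.2f}")
```

Because worse (higher-numbered) positions pair with lower ratings in this toy data, both `r` and `slope` come out negative, mirroring the negative PPMCC scores in the plots below.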
Let's see how reviews factor into ranking for each category.
From this plot we can see, for example, that the top 40 restaurants all have an average rating of at least 4.4. It also seems possible to rate highly and still lag far behind in ranking. The PPMCC of -0.33 indicates that, on the whole, sitting further down the results is correlated with a drop in average rating.
My interpretation is that having a high rating does not guarantee anything, but a lower rating (here roughly less than 4.3) seems to guarantee a position on page three or lower. Review volume is high in the restaurant category, so unlike other categories, none of the first 160 appear with zero reviews.
Interestingly, restaurant reviews show a decent range of ratings. Other categories that we'll look at next appear to show more grade inflation.
Hair salons display behavior similar to restaurants, but with a slightly stronger PPMCC. As I mentioned earlier, most categories show some grade inflation, and hair salons are no exception.
There are quite a few businesses with perfect 5.0 ratings. Just like with restaurants, however, that doesn't guarantee much in terms of ranking. The trend seems to be that lower review averages correlate with sinking rankings, but the opposite does not seem to be true.
Although there is a similar PPMCC of -0.32 here, the situation seems quite different from the above categories. For one thing, there are many listings with no reviews at all. In fact, that's what seems to drive the PPMCC here. The distribution is bi-modal -- lawyers largely either have 5 stars, or no reviews at all.
I can only speculate at what's behind this situation, but it would make an interesting future post to delve into the data. Lawyer searches are much higher value than restaurants or hair salons, so the incentive to manipulate results is much greater.
The story here is similar to lawyers, but without the glut of no-review listings. There is still a correlation between review average and ranking, albeit fainter than in other categories. Having at least a ~4.8 average seems to be table stakes in the dentistry category.
Delving into how these businesses are achieving such high ratings might yield interesting results. Are they using review-gating services? Encouraging clients to review with incentives? Or more cynically, are some of these reviews fake? I have a hard time believing that the averages are naturally this high, especially with so many seeing perfect 5.0 ratings.
Lastly, when searching for plumbing services, we see almost no correlation between reviews and ranking. As with lawyers, there are a large number of listings with no reviews. Here, however, those no-review listings appear to rank higher, effectively driving the PPMCC to zero.
What's the takeaway here?
After researching a handful of business categories, I feel confident in saying that there is a correlation between reviews and ranking. This seems especially true in categories with less intense keyword competition. High-dollar categories, for whatever reason, appear to contain more noise than signal when it comes to reviews. Review patterns also vary by category, from fairly natural ranges to distinctly bimodal distributions.
Higher margin businesses have more incentive to influence their reviews, and a whole host of services exist to meet that need. Simply asking customers to leave a review is a straightforward technique. Many tactics, however, fall outside of the Google TOS, including offering vouchers or gift cards in return for reviews. One can only assume that many businesses are doing just that and running the risk of an account suspension.
Blackhat techniques, such as paying for fake reviews, might also be responsible for some of the anomalous-looking data we've seen in this post. I might delve deeper into this subject in a future post.
In general, reviews do seem to play an important role in local search, but it's hard to distinguish correlation from causation. Reviews may be a ranking factor, or they could be the result of the increased traffic that higher listings receive.
Regardless, reviews are an important signal to consumers, whether they play a direct role in search ranking or not.