Customer reviews on websites like Amazon, Tripadvisor and Yelp.com can have a big impact on business, according to research by Professors Michael Anderson and Jeremy Magruder, published in the September 2012 issue of the Economic Journal.
Their analysis of the ‘crowd-sourced’ ratings of over 300 San Francisco restaurants on Yelp.com finds that a restaurant that improves its rating by half a star – on a scale of 1 to 5 – is much more likely to be full at peak dining times. This increase in business happens without any change in prices or the quality of food and service, confirming that it is the reviews that bring in the new customers.
Online review websites are recognised as a convenient source of consumer information, but until now, there has been little evidence of their effects on corporate profits. The findings of this study demonstrate that – although social media sites and forums may not generate the financial returns for which investors yearn – they play an increasingly important role in how consumers judge the quality of goods and services.
The researchers note that while restaurants with strong reviews on Yelp.com do better business than poorly reviewed restaurants, establishing cause and effect is difficult. After all, restaurants that get good reviews are those that appeal to consumers and they would probably do well even in the absence of any reviews.
Measuring the causal effect of good reviews requires differences in restaurant ratings that are unrelated to differences in restaurant quality. The study solves this problem by exploiting the way in which Yelp.com displays a restaurant’s average rating.
Yelp.com aggregates customer reviews of local businesses and prominently displays each business’s average rating when presenting search results. When Yelp.com computes a business’s average rating (which ranges from 1 to 5 stars), it rounds the result to the nearest half-star.
Two restaurants that have similar average ratings can thus appear to be of very different quality to online viewers. For example, a restaurant with an average rating of 3.74 displays a 3.5-star average rating while a restaurant with an average rating of 3.76 displays a 4-star average rating.
This allows comparisons between restaurants that have different displayed ratings (for example, 4 stars versus 3.5 stars) but nearly identical quality (for example, a 3.76-star average versus a 3.74-star average). Differences in customer flows between such restaurants can therefore be attributed to the ratings themselves rather than differences in the quality of food or service.
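The half-star rounding at the heart of the research design can be sketched in a few lines of Python. This is a hypothetical illustration, not Yelp.com’s actual code; in particular, how ties at exact quarter-star averages are broken is an assumption here.

```python
def displayed_rating(average: float) -> float:
    """Round a raw average rating to the nearest half-star.

    Doubling the rating, rounding to the nearest whole number, and
    halving again snaps the value onto the half-star grid
    1.0, 1.5, 2.0, ..., 5.0.
    """
    return round(average * 2) / 2

# The two restaurants from the example above: nearly identical
# underlying quality, but visibly different displayed ratings.
print(displayed_rating(3.74))  # 3.5
print(displayed_rating(3.76))  # 4.0
```

Restaurants just either side of such a rounding threshold are effectively identical in quality, which is what lets the study attribute any difference in customer flows to the displayed rating itself.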
The study collects review data and daily reservation availability for 328 restaurants in San Francisco. Analysing these data, it finds that moving from 3 stars to 3.5 stars increases a restaurant’s chance of selling out during prime dining times from 13% to 34%. Moving from 3.5 stars to 4 stars increases the chance of selling out during prime dining times by another 19 percentage points. These changes occur even though restaurant quality is held constant.
Consumer-generated reviews now appear on sites like Toptable.com, Yelp.com, Tripadvisor.com, Amazon.com, Netflix.com and more. This study demonstrates that these reviews have become a salient factor in consumer decisions.
While recent stock market events suggest that social media sites and forums may not generate the financial returns that investors hoped for, the study’s findings imply that crowd-sourced reviews play an important role in the economy by empowering consumers to make better judgements about the quality of goods and services.
‘Learning from the Crowd: Regression Discontinuity Estimates of the Effects of an Online Review Database’ by Michael Anderson and Jeremy Magruder is published in the September 2012 issue of the Economic Journal. This article is available free.
University of California, Berkeley | +1-510-642-7628 | firstname.lastname@example.org
University of California, Berkeley