Confirmed: Crowdsourced reviews are better than professional critics

A study by the Harvard Business School, published in April 2012 via its online portal HBS Working Knowledge, confirms that "Expert ratings are correlated with Amazon ratings, suggesting that experts and consumers tend to agree in aggregate about the quality of a book. However, there are systematic differences between these sets of reviews." In layman's terms, that just means the average Joe commenting on Amazon rates books about as well as the experts do. It's something I've suspected all along: that crowdsourced reviews from Amazon, TripAdvisor, Goodreads and IMDb are just as good as the 'professional' reviews you see in magazines and paid adverts.

The study, which can be downloaded in its entirety here, is a really great read. It's not too technical, but it's not overly simplistic either, so be prepared to dust off some of your high school statistics for this one.

Overall though, the study's conclusions suggest:

  • The data suggest that media outlets do not simply seek to isolate high-quality books, but also to find books that are a good fit for their readers. This is a potential advantage for professional critics, one that cannot be easily replicated by consumer reviews.
  • Expert ratings are correlated with Amazon ratings, suggesting that experts and consumers tend to agree in aggregate about the quality of a book. However, there are systematic differences between these sets of reviews.
  • Relative to consumer reviews, professional critics are less favorable to first-time authors. This suggests that one potential advantage of consumer reviews is that they are quicker to identify new and unknown books.
  • Relative to consumer reviews, professional critics are more favorable to authors who have garnered other attention in the press (as measured by number of media mentions outside of the review) and who have won book prizes.

The 4 key points seem to suggest:

1) Media outlets and paid reviewers usually go out of their way to cater to the specific needs of their readers and customers. So while a reviewer at ABC magazine and Amazon customers may both rate a book 4 stars, ABC magazine would be able to point you to the 4-star books you might actually prefer, rather than handing you every 4-star book. Not all 4-star books appeal to everyone; part of the star rating is about appealing to the section of society most likely to read the book. I think it just means reviews are not the same as recommendations, and paid reviewers who review for specific interest groups are able to recommend better than crowdsourced data can.

2) Expert ratings and crowdsourced ratings seem to correlate, meaning experts and 'regular' people agree on which books are good and which aren't. Of course this is unsurprising, since experts spend most of their time trying to figure out what the general public wants to read, so you could argue that this point is actually in favor of professional reviewers: it means they did a good job predicting what the crowd wants. It also could mean that instead of paying an expert reviewer, it could work just as well to check Amazon reviews. A catch-22, isn't it?

3) Professional reviewers seem to favor well-established authors over newcomers, while consumer reviews usually cut new authors some slack. Quite interesting; this might suggest consumer reviews act as an equalizer of sorts, giving newcomers some much-needed attention that the professional reviewers don't seem eager to give.

4) In the same vein, professional critics rate authors more favorably once they've won book prizes or garnered other press attention, which reinforces the same bias toward established names that consumer reviews don't seem to share.

Overall though, it's becoming increasingly clear that the line between professional reviewers and consumer reviews is beginning to blur, particularly when you have a host of pro-am review blogs out there.

However, as with all things crowdsourced, we need to keep in mind that one professional review is much better than one consumer review. It's only when we aggregate hundreds or thousands of consumer reviews that they begin to correlate. The variation among consumer reviews is much higher than the variation among professionals, which is another way of saying that professionals tend to agree on which books are good and which aren't, while any single member of the general population is far less reliable. Only in aggregate do the reviews of the general population correlate with the aggregated reviews of professionals.
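To make that aggregation effect concrete, here's a minimal Python sketch, with entirely made-up numbers rather than the study's data: each book gets a 'true' quality, critics rate it with low noise, individual consumers with high noise, and we check how well a single consumer review versus an aggregated crowd tracks the critics.

```python
import random

random.seed(42)

# Toy model of the aggregation argument above (hypothetical numbers):
# every book has a "true" quality; critics rate with low noise,
# individual consumers with high noise.
books = [random.uniform(2.0, 5.0) for _ in range(50)]  # true quality per book

def avg_rating(true_quality, n_reviews, noise):
    """Average of n noisy star ratings around the true quality, clamped to 1-5."""
    ratings = [min(5.0, max(1.0, random.gauss(true_quality, noise)))
               for _ in range(n_reviews)]
    return sum(ratings) / len(ratings)

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

critics = [avg_rating(q, n_reviews=3, noise=0.3) for q in books]   # few, consistent
single = [avg_rating(q, n_reviews=1, noise=1.5) for q in books]    # one noisy review
crowd = [avg_rating(q, n_reviews=500, noise=1.5) for q in books]   # aggregated crowd

print("one consumer review vs critics:", round(pearson(single, critics), 2))
print("aggregated crowd vs critics:  ", round(pearson(crowd, critics), 2))
```

With these made-up numbers, a single consumer review correlates only weakly with the critics, while the 500-review aggregate correlates almost perfectly: the wisdom of the crowd shows up only in aggregate, exactly as above.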

Some time back I recommended that Maxis use the Goodreads API for the reviews on their ebuuk platform. It appears they didn't take my recommendation seriously, and their ebuuk platform doesn't look like it's going anywhere anyway (although their new cloud storage 'loker' is pretty good). Goodreads would have provided a good data platform for Maxis to leverage user reviews that their new ebook platform would take years to gather organically, and which, according to this study, would be just as good as hiring professional reviewers. So why not? Reviews are a great way to get people to buy stuff, particularly since I never buy anything online without first reading a review of the retailer and the product.
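For what it's worth, pulling crowd ratings out of Goodreads wouldn't have been much work. Here's a rough sketch against the Goodreads review_counts endpoint; the API key is a placeholder you'd register for, and the exact response fields are my assumption from the API documentation, so treat this as illustrative rather than production code.

```python
import requests

# Placeholder: you would register for a Goodreads developer key.
GOODREADS_KEY = "your-api-key"

def crowd_rating(isbn):
    """Fetch (average rating, number of ratings) for a book by ISBN
    from the Goodreads review_counts endpoint."""
    resp = requests.get(
        "https://www.goodreads.com/book/review_counts.json",
        params={"isbns": isbn, "key": GOODREADS_KEY},
        timeout=10,
    )
    resp.raise_for_status()
    book = resp.json()["books"][0]
    return float(book["average_rating"]), int(book["work_ratings_count"])

avg, count = crowd_rating("9780316769488")  # example ISBN
print(f"{avg} stars across {count} ratings")
```

A few lines like that, and an ebook storefront gets years' worth of aggregated reviews on day one.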

That brings us to another topic, big data, but that's for another day.

Thanks to HBS Working Knowledge for the article.
