Thanks to the folks at Digital Photography Review (dpreview.com) for flagging a study of deceptive online reviews by researchers at Northwestern University and MIT (“New study investigates online reviews – makes surprising discoveries”).
Deceptive reviews were identified in two ways. First, some reviewers claimed to have previously purchased the product, but the online retailer’s database showed this was not actually the case. A second group of deceptive reviews was identified through several linguistic characteristics (you can check out the full paper for details).
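To make the first of those checks concrete, here is a minimal sketch (my own, not the researchers’ code) of flagging reviews whose author claims a purchase that the retailer’s transaction records don’t contain. The field names and sample data are invented for illustration.

```python
# Toy illustration of the "no purchase record" check described above.
# All field names and data are invented; a real retailer would query
# its order database instead of using an in-memory set.

reviews = [
    {"reviewer_id": "u1", "product_id": "cam-100", "claims_purchase": True, "stars": 1},
    {"reviewer_id": "u2", "product_id": "cam-100", "claims_purchase": True, "stars": 5},
]

# (reviewer_id, product_id) pairs taken from the retailer's transaction records
purchases = {("u2", "cam-100")}

def flag_unverified(reviews, purchases):
    """Return reviews claiming a purchase that has no matching transaction."""
    return [
        r for r in reviews
        if r["claims_purchase"]
        and (r["reviewer_id"], r["product_id"]) not in purchases
    ]

for r in flag_unverified(reviews, purchases):
    print(f"unverified claim: {r['reviewer_id']} on {r['product_id']} ({r['stars']} stars)")
```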
What they found is that ‘approximately 5% of the product reviews are written by customers for whom there is no record they have purchased the item. These reviews are significantly more negative on average than the other 95% of the reviews for which there is a record that the customer previously purchased the item’.
It’s no surprise that positive reviews lead to higher sales, but according to the study, negative reviews drag sales down far more than positive reviews lift them.
In simple terms, if you’re considering buying something that has ten five-star ratings and a single one-star rating, that single negative review could make you move the cursor away from ‘buy now’. And if that review was dishonest, then you, and the online retailer, just lost out.
The reviews identified as deceptive were much more likely to be negative; there wasn’t an equivalent number of positive but deceptive reviews.
Very few customers write reviews at all. As the paper puts it: ‘They are written by less than 2% of the firm’s customers, while reviews without prior transactions are written by just 6% of all reviewers. This suggests that reviews without prior transactions are contributed by just the tail of the tail of all customers.

‘We also show that customers in this extreme tail are not representative of other customers. They purchase many more items, they are more likely to buy at a discount, they are more likely to return items, and are much more likely to purchase new or niche items. Unfortunately they are also influential. Their low ratings reduce demand for the products that they review and this loss of demand is evident for the next 12 months.’
This is interesting in its own right, but it also raises the question of whether blog comments could be analyzed for ‘deception’ in the same way, such as comments written simply to be provocative or disruptive rather than to reflect the commenters’ actual views.
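I haven’t seen such a study, but the simplest possible starting point would look something like the toy sketch below, which scores comments on a few crude surface features. The features and weights are invented for the example; they are not the linguistic markers used in the paper.

```python
# Toy comment scorer using a few crude surface features sometimes
# associated with low-effort or inflammatory text: heavy exclamation,
# all-caps "shouting", and very short length. Features and weights
# are invented for illustration only.

def suspicion_score(comment: str) -> float:
    words = comment.split()
    if not words:
        return 0.0
    exclaim = comment.count("!") / len(words)
    shouting = sum(1 for w in words if len(w) > 2 and w.isupper()) / len(words)
    brevity = 1.0 if len(words) < 5 else 0.0
    return exclaim + shouting + brevity

comments = [
    "WORST camera EVER!!! Do not buy!!!",
    "I used this body for a year of event work; autofocus held up well in low light.",
]

for c in comments:
    print(f"{suspicion_score(c):.2f}  {c}")
```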
If you run across any of those studies, let me know.
In the meantime, the full research paper referenced by dpreview.com can be found here.