Before you run out and buy that toaster oven on sale based on a glowing user review, or book a night at a side-street hotel because John Smith gave it a five-star writeup on some review site, consider for a moment that you could be reading a bogus account. Positive reviews are a hot commodity, and Cornell researchers are working on a formula to automatically detect the fake ones.
Before we dive into Cornell's custom algorithm, let's set things up. The New York Times ran an article about the "arms race of sorts" that exists as companies chase five-star reviews, and some are willing to pay for them.
"I will pay for positive feedback on TripAdvisor," reads one post from the Digital Point forum quoted by the NYT. It's also easy to find individuals actively seeking out companies willing to pay for positive reviews; they post their rates on forums and Craigslist.
NYT tells the story of Sandra Parker, a freelance writer who claims a review factory hired her to write Amazon reviews at a rate of $10 per post.
"We were not asked to provide a five-star review, but would be asked to turn down an assignment if we could not give one," Parker explains.
It's a shady business, and getting back to Cornell, university researchers are developing software that can spot bogus writeups. It works by looking for strong and slight deceptive indicators. For example, in breaking down a fake hotel review, the program points out the greater use of first-person singular, direct mention of where the reviewer stayed, the use of an exclamation point, high adverb use, and other clues that, taken together, mean there's a strong likelihood the review isn't real.
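To make those indicators concrete, here is a toy sketch of how such surface cues could be counted. This is not Cornell's actual system (theirs is a trained classifier, and its exact features and weights aren't given here); the indicator list, threshold, and function names below are invented purely for illustration.

```python
import re

# Hypothetical first-person singular pronouns to count; Cornell's real
# feature set is not specified in the article.
FIRST_PERSON = {"i", "me", "my", "myself"}

def deception_signals(review: str) -> dict:
    """Count simple surface cues of the kind described above."""
    words = re.findall(r"[a-z']+", review.lower())
    return {
        "first_person": sum(w in FIRST_PERSON for w in words),
        "exclamations": review.count("!"),
        # Crude adverb heuristic: words ending in "-ly".
        "adverbs": sum(w.endswith("ly") and len(w) > 3 for w in words),
    }

def looks_suspicious(review: str, threshold: int = 5) -> bool:
    """Flag a review when the combined cue count crosses a made-up threshold."""
    return sum(deception_signals(review).values()) >= threshold

fake = "My husband and I absolutely loved it! I will definitely return!"
print(deception_signals(fake))   # {'first_person': 3, 'exclamations': 2, 'adverbs': 2}
print(looks_suspicious(fake))    # True
```

A real detector would learn which cues matter and how much from labeled examples rather than rely on hand-picked counts like these, but the sketch shows why a cluster of such signals, taken together, can raise suspicion.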
Unfortunately, this sort of thing does go on; you just don't hear about it very often. But every once in a while, a high-profile scandal makes the headlines, like when a Belkin employee was caught soliciting positive reviews for pay back in 2009. Belkin acknowledged that one of its employees was paying for positive Amazon reviews and said it was an isolated incident.