
Google Algorithm Discriminates Against Bad Reviews

j_col writes "According to the official Google blog, Google has altered its PageRank algorithm so that links from bad reviews no longer pass ranking credit to the websites of the online retailers being reviewed. The change follows a recent New York Times article describing one woman's experience of being harassed by an online retailer she found via Google. The specific changes to the algorithm are, of course, a guarded secret. So, given that these changes are already live, how do we know how the algorithm tells a bad review from a good one, and whether innocent online retailers will be wrongly punished with downgraded rankings?"
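The Google blog post does not say how, or even whether, the algorithm classifies review sentiment, so the submitter's question stands. Purely as an illustration of what "telling a bad review from a good one" could involve, here is a naive lexicon-based sentiment check in Python; the word lists, threshold, and function name are invented for this sketch and have no connection to Google's undisclosed method.

    # A deliberately naive, hypothetical sentiment check, meant only to
    # illustrate what distinguishing a bad review from a good one might
    # involve. It is not Google's method, which is undisclosed.
    NEGATIVE = {"scam", "fraud", "harassed", "terrible", "refund", "threatened"}
    POSITIVE = {"great", "fast", "helpful", "recommend", "excellent"}

    def looks_negative(review_text, threshold=1):
        """Return True if negative words outnumber positive ones by at least `threshold`."""
        words = [w.strip(".,!?").lower() for w in review_text.split()]
        negatives = sum(w in NEGATIVE for w in words)
        positives = sum(w in POSITIVE for w in words)
        return negatives - positives >= threshold

    print(looks_negative("They threatened me when I asked for a refund. Total scam!"))  # True
    print(looks_negative("Fast shipping and helpful support, highly recommend."))       # False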
  • by Animats ( 122034 ) on Thursday December 02, 2010 @03:50PM (#34421692) Homepage

    This is the fundamental problem with "crowdsourcing" reviews. Where the number of reviewers is large compared to the number of items being reviewed, as with movies, it works fine. Where the ratio is small, it doesn't. It's far too easy to game the system. There are automated tools for that. [wikipedia.org]

    This problem has become worse since the October 27th change to Google, when Google Places/Maps results were merged into web search. This made "local" results much more prominent. Look at the first screen of Google search results for a local product or service. Most of what you see are Google Places results, maps, or ads. The organic results are so far down they don't matter.

    As a result, the "black hat" SEO companies are now aggressively targeting Google's Places and Maps system. "Convert Offline" is quite open about this in their article "Dominating Google Maps: The Most Effective Spam Ever And What You Can Learn From It" [convertoffline.com]. In some ways, Google Places is more vulnerable to attack than organic search. The number of web mentions of a local business tends to be small, so the amount of phony material that has to be generated to make a business look good is also small. Each mention therefore carries a lot of weight.

    Google might lose this battle. Craigslist did. Back in 2008, Cory Doctorow wrote about "Spammers discuss breaking Craigslist verification system" [boingboing.net]. It's become much worse [techdirt.com] since then. Personals were the first to go, and are now over 90% spam. Then Computer Services and Self Employment fell to the spammers. Jobs and Real Estate are under attack. Along the way, Gmail became a spam haven [google.com], especially after Jiffy Gmail Email Creator [cnet.com] became widely used.

    The fundamental design assumption of Google is that important stuff has lots of links to it. That's not a valid assumption in local search.
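The comment's closing point, that PageRank-style scoring assumes importance shows up as inbound links, is easy to see in a toy example. The sketch below is a bare-bones power-iteration PageRank over a made-up five-page link graph (the damping factor and page names are invented, and this is nothing like Google's production system): with so few links in a "local" graph, two fabricated review pages are enough to push a spammy shop above an honestly cited one.

    # Toy power-iteration PageRank over a made-up link graph, illustrating
    # why "important stuff has lots of links to it" breaks down when the
    # graph is tiny: a couple of fabricated pages dominate the scores.
    def pagerank(links, damping=0.85, iterations=50):
        """links maps each page to the list of pages it links to."""
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
            for page, outlinks in links.items():
                targets = outlinks or pages      # dangling pages share rank evenly
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
            rank = new_rank
        return rank

    graph = {
        "real-directory": ["honest-shop"],   # one genuine citation
        "fake-review-1": ["spammy-shop"],    # two fabricated "reviews"
        "fake-review-2": ["spammy-shop"],
        "honest-shop": [],
        "spammy-shop": [],
    }
    print(pagerank(graph))   # spammy-shop ends up ranked above honest-shop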

"More software projects have gone awry for lack of calendar time than for all other causes combined." -- Fred Brooks, Jr., _The Mythical Man Month_

Working...