Google Warns About Search-Spammer Site Hacking

Al writes "The head of Google's Web-spam-fighting team, Matt Cutts, warned last week that spammers are hacking more and more poorly secured websites in order to 'game' search-engine results. At a conference on information retrieval, held in Boston, Cutts also discussed how Google deals with the growing problem of search spam. 'I've talked to some spammers who have large databases of websites with security holes,' Cutts said. 'You definitely see more Web pages getting linked from hacked sites these days. The trend has been going on for at least a year or so, and I do believe we'll see more of this [...] As operating systems become more secure and users become savvier in protecting their home machines, I would expect the hacking to shift to poorly secured Web servers.' Garth Bruen, creator of the Knujon software that keeps track of reported search spam, added that some campaigns involve creating up to 10,000 unique domain names."
This discussion has been archived. No new comments can be posted.

  • by vintagepc ( 1388833 ) on Thursday July 30, 2009 @11:06AM (#28882501) Journal
    I don't know about you, but something else that REALLY annoys me is pages that contain lists of words just so they come up on many searches... with no actual content. Or sites like "Buy *search term* at low prices" and they don't even sell what you're looking for. What's being done about those?
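The keyword-stuffed pages the parent describes can often be caught by a crude density check; a minimal sketch (the threshold, tokenization, and sample strings are all made-up assumptions, not anything Google has described):

```python
from collections import Counter

def keyword_stuffing_score(text, top_n=5):
    """Crude spam heuristic: the fraction of all words taken up by
    the top_n most repeated words. A score near 1.0 suggests a page
    that is mostly a repeated word list rather than real content."""
    words = [w.lower() for w in text.split() if w.isalpha()]
    if not words:
        return 0.0
    counts = Counter(words)
    top = sum(c for _, c in counts.most_common(top_n))
    return top / len(words)

spammy = "cheap pills cheap pills buy cheap pills now " * 20
normal = "Google warned last week that spammers are hacking poorly secured websites"
```

Real prose spreads its vocabulary far more evenly than a stuffed page, so the two sample strings land at opposite ends of the scale.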
  • by ParticleGirl ( 197721 ) <{moc.liamg} {ta} {lriGelcitraPtodhsalS}> on Thursday July 30, 2009 @11:16AM (#28882621) Journal

    I found this pretty interesting: "Authentication [across the Web] would be really nice," says Tunkelang. "The anonymity of the Internet, as valuable as it is, is also the source of many of these ills." Having to register an e-mail before you can comment on a blog is a step in this direction, he says, as is Twitter's recent addition of a "verified" label next to profiles it has authenticated.

    The idea of universal authentication [gnucitizen.org] has been tossed around for a while. I feel like the biggest drawback is privacy (we'd have to trust some universal authentication system to hold onto some identifier even if posting anonymously) and the biggest obstacle is the need for universal participation. It's kind of too late to make an opt-in system. But I've liked the idea ever since early sci-fi interwebs (read: Ender's Game) had SOME kind of authentication.

  • by truthsearch ( 249536 ) on Thursday July 30, 2009 @11:28AM (#28882799) Homepage Journal

    Authentication would of course help for properly secured web sites. But many sites have content injected nefariously. One common method is to break into shared hosting servers via FTP or SSH and place JavaScript or HTML at the bottom of every HTML file.
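The injection truthsearch describes leaves a recognizable fingerprint: markup appended after the document has already closed. A minimal detection sketch, assuming the injected code lands after the final </html> tag (one common pattern, not the only one):

```python
import re
from pathlib import Path

def find_appended_scripts(root):
    """Flag .html files that contain a <script> tag *after* the
    closing </html> tag -- a common signature of mass FTP/SSH
    injection into every file on a shared host."""
    suspects = []
    for path in Path(root).rglob("*.html"):
        html = path.read_text(errors="ignore")
        end = html.lower().rfind("</html>")
        if end != -1 and re.search(r"<script\b", html[end:], re.I):
            suspects.append(str(path))
    return suspects
```

A site owner could run this over a backup of the document root; anything it flags deserves a manual look, since legitimate pages rarely emit scripts after the closing tag.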

  • by spyrochaete ( 707033 ) on Thursday July 30, 2009 @11:40AM (#28882947) Homepage Journal

    If your website's front page has a PageRank score of 3/10 or higher, it is a prime candidate for hijacking. Google gives extra clout to hyperlinks from sites with a high PageRank (aka "link juice"), so it's easier for a malicious party to hijack a small number of high-ranking sites than a large number of low-ranking ones. The higher your PageRank, the greater your risk.
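PageRank itself is just repeated redistribution of rank along links; a toy power-iteration sketch (the 0.85 damping factor matches the original PageRank formulation, but the four-page graph is invented) shows why a link from a well-linked page carries more "juice" than one from an obscure page:

```python
def pagerank(links, damping=0.85, iters=50):
    """Toy power-iteration PageRank. `links` maps each page to the
    pages it links to; every page must appear as a key."""
    n = len(links)
    rank = {p: 1.0 / n for p in links}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in links}
        for page, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for p in new:
                    new[p] += damping * rank[page] / n
            else:
                for out in outs:
                    new[out] += damping * rank[page] / len(outs)
        rank = new
    return rank

# "a" is linked by two pages, "c" and "d" by none: a hijacked link
# planted on "a" passes far more rank than one planted on "d".
graph = {"a": ["b"], "b": ["a"], "c": ["a"], "d": ["b"]}
```

Ranks always sum to 1, so hijacking a high-ranking page is strictly more valuable per link than hijacking an obscure one, which is exactly the incentive spyrochaete describes.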

  • Re:Easy to spot? (Score:3, Insightful)

    by Shadow-isoHunt ( 1014539 ) on Thursday July 30, 2009 @12:02PM (#28883273) Homepage
    That doesn't work, because you can't possibly determine whether they're legitimate links or not (if the linking is done properly). For example, how do you differentiate between something that starts as a result of an independently reported news event (or a slashdotting...) and something that starts as the result of hacking? If you want to waste the cycles, you can start mapping the event to find its potential point of origin to see if it's a news site or something, but it's still going to hurt the little guys.
  • by D-Cypell ( 446534 ) on Thursday July 30, 2009 @12:16PM (#28883433)

    While I don't know for absolute certain, I *strongly* suspect that that data is collected and operated on. Most of the big sites are about so-called 'collective intelligence': collecting information about person A so that you can have a better idea of what you want to be providing to person B. This goes into which links are clicked, at which times of the day, how long people spend on a site or page, etc. To have a function that is as incredibly explicit as 'This is crap, don't show me it again', and to *not* use that to refine future page generations, would be deeply stupid, and stupid is one thing the guys at Google ain't.

  • by ex0a ( 1199351 ) on Thursday July 30, 2009 @01:21PM (#28884465)

    CustomizeGoogle is a Firefox plugin (which hasn't been updated for 3.5 yet) that lets you ignore domains.

    From the CustomizeGoogle page [mozilla.org], the reported supported version goes up to 3.6a1pre, for anyone reading this who was put off by the parent's note and didn't check the addon page. This addon is really handy.
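What CustomizeGoogle does in the browser amounts to filtering results against a personal blocklist; a minimal sketch (the result-dict shape and example domains are hypothetical, not the addon's actual code):

```python
from urllib.parse import urlparse

BLOCKED = {"spam-farm.example", "cheap-pills.example"}  # user's ignore list

def filter_results(results, blocked=BLOCKED):
    """Drop search results whose host, or any parent domain of it,
    is on the blocklist. `results` are dicts with a 'url' key --
    a made-up shape for this sketch."""
    kept = []
    for r in results:
        host = urlparse(r["url"]).hostname or ""
        parts = host.split(".")
        # "www.spam-farm.example" also matches "spam-farm.example"
        ancestors = {".".join(parts[i:]) for i in range(len(parts))}
        if not ancestors & blocked:
            kept.append(r)
    return kept
```

Checking every parent domain means blocking a domain also hides all of its subdomains, which matters when spammers spin up thousands of hostnames under a few registered domains.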
