
Google News Found Guilty of Copyright Violation

Posted by CmdrTaco
from the de-index-belgium-in-retaliation dept.
schmiddy writes "A court in Brussels, Belgium, has just found Google guilty of violating copyright law with its Google News aggregator. According to the ruling, Google News' links and brief summaries of news sources violate copyright law. Google will be forced to pay $32,600 for each day it displayed the links of the plaintiffs. Although Google plans to appeal, this ruling could have chilling effects on fair use rights on the web in the rest of Europe as well if other countries follow suit."

Comments Filter:
  • by Xonstantine (947614) on Wednesday February 14, 2007 @10:22AM (#18011090)
    Maybe Google should just delink the sites altogether, so the offended media organizations can watch their traffic plummet to zero?
  • by xxxJonBoyxxx (565205) on Wednesday February 14, 2007 @10:27AM (#18011150)
    I'm not sure how much aggregation Google news does, but I'd think if they're copying in less than 10% or so of the story and providing a link to the original they'd be safely in the "fair use" arena.

    I suspect this has more to do with newspapers getting annoyed that people are starting to type "[MyCity] news" into Google more often than looking up their local newspaper's web site. The newspapers would also like to restrict access to their "archives" (which they regard as a pay-to-see resource).

  • hmm (Score:3, Insightful)

    by TinBromide (921574) on Wednesday February 14, 2007 @10:29AM (#18011186)
    Sounds like they're biting the hand that feeds them. There was a rush of articles a while back where web analysts were blaming Google for being a sort of web vampire/leech, sucking the blood out of websites without providing anything back. Those claims have quieted down because businesses realized that when they changed their model to accommodate the search-centric interweb, times were good.

    You leave google, google leaves you. Buh-bye, thank-you for flying the interweb air, we hope you enjoyed your time on interweb and also hope to see you again soon.
  • IP Rights. (Score:5, Insightful)

    by nurb432 (527695) on Wednesday February 14, 2007 @10:31AM (#18011218) Homepage Journal
    Are going to destroy the world as we know it (well, that and the lawyers).

    It's more insidious than any terrorist group or rogue nation.
  • by Anonymous Coward on Wednesday February 14, 2007 @10:38AM (#18011304)
    Yes. Yes they do. Socialism has always been "big business OMG EVIL CAPITALISTS STABBITYSTABSTAB DIE DIE DIE" regardless of the particular merits or vices of the business in question. Except when the Big Business in question is The Government. Then it's all okay.
  • by gravesb (967413) on Wednesday February 14, 2007 @10:40AM (#18011336) Homepage
    This reminds me of when France was going to force Apple to open iTunes, and Apple said fine, we'll leave. Or when the EU took on Microsoft. Once companies get to be a certain size, it's really difficult for countries to control them, especially when the controls end up hurting their own corporate citizens, as in this case. When Google stops linking to their newspapers, the newspapers will feel the pain, not Google, especially since all of Google's competitors will have to play by the same rules and can't provide unique content.

    If the governments were right in these cases, and could take the moral high ground, then they might stand a chance of winning. However, by continuing to fight huge tech companies in areas where they can't win, they stand to lose the power to fight when it really matters.

    Also, in each case, there were other ways of dealing with the problem. Don't like MS bundling? Move the government to Linux, save money, and encourage your population to do the same. Don't like iTunes and the way FairPlay is locked down? Start a competitor, or encourage the labels to end their love affair with DRM. Don't like Google linking to news stories? Update your robots.txt to prevent caching and keep Google from indexing your site to begin with. Of course, they know they can't do that. They want to come up in Google searches, but not have Google index their content as well. Would you like to have that cake you just ate, anyone?
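As an aside on the robots.txt opt-out mentioned above: the exclusion file is just plain text served from the site root. A minimal sketch, with illustrative paths (`User-agent` and `Disallow` are the standard directives of the robots exclusion convention):

```
# robots.txt, served from the site root

# Keep Google's crawler out of the paid archives
User-agent: Googlebot
Disallow: /archives/

# All other crawlers: no restrictions
User-agent: *
Disallow:
```

To allow indexing but forbid the cached copy, a site can instead serve a `<meta name="robots" content="noarchive">` tag on each page, which major crawlers honor.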
  • by GryMor (88799) on Wednesday February 14, 2007 @10:50AM (#18011436)
    No, but I've seen a lot of users go to www.cnn.com by means of entering www.cnn.com in google's search box.
  • by Tony Hoyle (11698) <tmh@nodomain.org> on Wednesday February 14, 2007 @10:58AM (#18011530) Homepage
    Fair use is a US concept. The 10% figure, if it exists, is probably a US thing as well. In the UK it's 5%, and only a single article. In Belgium it's probably something different again.

    Google News is unashamedly breaking copyright; there's no argument there. The real question is why anyone would sue over something that's driving hits to their pages and generating ad revenue.
  • by malsdavis (542216) on Wednesday February 14, 2007 @11:11AM (#18011678)
    This would work both ways, though. People only use Google as their address bar because they are pretty certain the website will come up. National newspapers and other mainstream media websites are normally among the highest-traffic websites (in terms of unique hits) for any given country; therefore, by not linking to the media websites, Google would also be doing itself quite a lot of harm.

    If people typed searches like 'www.nytimes.com', 'www.cnn.com', or 'www.bbc.co.uk' into Google and it didn't mention the respective websites, then a lot of people would probably start switching their homepage away from Google.

    I therefore doubt Google will consider de-listing mainstream newspaper websites. It would put Google at an immense commercial disadvantage relative to its rivals!
  • robots.txt (Score:4, Insightful)

    by GuyverDH (232921) on Wednesday February 14, 2007 @11:12AM (#18011704)
    If they don't want to be scanned by Google, create the file.
    If they do want to be scanned (and therefore indexed as well as cached), then don't.

    Although I, for one, would prefer that we had to *create* the file, with entries like:
    Scan=Yes
    Index=Yes
    Cache=No

    If no robots.txt file is found, then do nothing for the site.
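For contrast with the opt-in scheme proposed above, the existing opt-out model can be exercised directly from Python's standard library. A small sketch (the site URL, paths, and the `/archives/` rule are made up for illustration):

```python
# Sketch of how a compliant crawler consults robots.txt under the existing
# opt-out model, using Python's stdlib urllib.robotparser.
from urllib.robotparser import RobotFileParser

# The rules a publisher would serve at http://example.com/robots.txt
rules = [
    "User-agent: *",
    "Disallow: /archives/",   # opt the pay-to-see archives out of crawling
]

parser = RobotFileParser()
parser.parse(rules)

# The crawler asks before fetching each URL
print(parser.can_fetch("Googlebot", "http://example.com/archives/story.html"))  # False
print(parser.can_fetch("Googlebot", "http://example.com/news/story.html"))      # True
```

Note the default when no robots.txt exists at all: everything is fetchable, which is exactly the assumption the parent would like to see inverted.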
  • by Saint V Flux (915378) on Wednesday February 14, 2007 @11:25AM (#18011890)
    You might want to learn to read - he merely used France as the country in which he'd exploit his idea. He never said it was Belgium, or even referenced TFA. I'm guessing you're one of the 95% of modern liberals who can't follow basic logic, right? :)
  • Re:Fair Use? (Score:2, Insightful)

    by smartr (1035324) on Wednesday February 14, 2007 @11:34AM (#18012016)
    Automatic machine processes don't obey copyright law. Is your browser violating copyright law? Is each of the servers passing all the information between you and the content provider, copying the information without permission? Could Belgium sue AT&T? It seems pretty clear AT&T is distributing their information without permission for profit.
  • by walt-sjc (145127) on Wednesday February 14, 2007 @11:41AM (#18012092)
    I therefore doubt Google will consider de-listing mainstream newspaper websites. It would put Google at an immense commercial disadvantage relative to its rivals!

    Yes, but if these rulings stand (through the appeal process), you can bet that EVERY news aggregator / search engine will ALSO have to remove content / links to the pages, so there is no competitive disadvantage.

    Without news aggregators, there will be no way for major media sites to attract NEW customers / readers, and non-ahole media sites will end up with larger readerships.

    The "cache" issue is about sites that want Google to index their articles but want readers to pay for the content. In essence, they want "free" advertising / marketing via Google. I say, delist the cheapskate bastards.
  • by Anonymous Coward on Wednesday February 14, 2007 @12:16PM (#18012588)
    If I have a news article on my site, I want people to come to my site to read it. Why? Simple: I've got advertising up there. When people don't come to my site, I lose money. And if they're going to Google instead of my site for the story, then I do blame Google. Personally, I've got my eye on Sentinel from http://www.blogwerx.com/ - they were at the Demo Conference this year, and I'm looking forward to catching me some sploggers!
  • by Anonymous Coward on Wednesday February 14, 2007 @12:22PM (#18012668)
    How about lazy Google policies?

    Maybe Google should be kind enough to ask permission, instead of assuming that anyone who doesn't go out of their way to satisfy Google's policy (by creating a robots.txt) is an open target?

    This is like crossing a stranger's property... in all human decency it's normal to ask before crossing, not to cross the land and then bitch at the owner that if he didn't want people crossing he should have put up fences.

    It's this lack of common sense and common courtesy that is making society into the shithole it is today.
  • by Steeltoe (98226) on Wednesday February 14, 2007 @06:49PM (#18017566) Homepage
    I'm surprised at this ruling. It seems RSS/Atom feeds have just been made illegal in Belgium, or am I missing something?

    The courts should not rule on issues they have no understanding of. They should include younger people for technology-related rulings.

    It doesn't even fit this particular scenario. Google News is almost unreadable already; the snippets it cuts from each news source are just a few words, and most often not even complete sentences. It is more of a free advertisement for the news agencies, because to get the story, or get any meaning out of it, you need to click the link. Such short snippets should be ruled fair use, and Google News would really need longer ones to be actually readable, but IANAL or a judge for that matter, so who can fathom the reasoning behind it?

    Of course, there will always be rulings that go against common sense, but today they get more light and fame, so there are really more checks and balances now than, say, 100 years back.

    Maybe Google should just stop the feeds for the agencies that are suing; when they see their traffic fall, they will beg to be listed on Google again. I remember something similar happening over linking to news a year or so ago.
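The few-words-long snippets described above can be made concrete with a toy sketch. The function name, the 10-word cutoff, and the sample text are purely illustrative; this is not how Google News actually builds its excerpts:

```python
# Toy sketch: truncate a story to its first few words, the kind of
# fragment the comment argues is too short to substitute for the article.
def snippet(text, max_words=10):
    words = text.split()
    if len(words) <= max_words:
        return text
    return " ".join(words[:max_words]) + "..."

story = ("A court in Brussels has found Google guilty of violating "
         "copyright law with its Google News aggregator.")
print(snippet(story))
# -> A court in Brussels has found Google guilty of violating...
```

Note that the truncated output is rarely a complete sentence, which is exactly the commenter's point: to get any meaning, the reader still has to click through.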
