German Government Wants Google To Pay For the Right To Link To News Sites 186

First time accepted submitter presroi writes "Al Jazeera is reporting on the current state of plans by the German government to amend the national copyright law. The so-called 'Leistungsschutzrecht' (an ancillary or 'neighboring' right) for publishers would give press publishers the right to demand financial compensation when a company such as Google links to their websites. Since the New York Times reported on this issue in March of this year, two draft bills have been released by the Minister of Justice, triggering strong criticism from across the German political spectrum as well as from companies and activist bloggers. (Full disclosure: I am quoted by Al Jazeera in this article.)"

  • Say what? (Score:5, Insightful)

    by miffo.swe ( 547642 ) <daniel...hedblom@@@gmail...com> on Tuesday August 21, 2012 @02:08AM (#41065035) Homepage Journal

    If Google has to pay to index their sites, the news sites are the ones missing out. Unless Google is forced to index them and also forced to pay, but that would in essence be a tax on a single company.

    • Re: (Score:3, Insightful)

      by Culture20 ( 968837 )
      The news sites are miffed because search engines preview their pages in the search results, and the users just skim the results instead of clicking the links.
      • Re:Say what? (Score:5, Insightful)

        by Anonymous Coward on Tuesday August 21, 2012 @02:14AM (#41065071)

        They can ask Google to not index them.

        captcha: retard

        • by Anonymous Coward on Tuesday August 21, 2012 @05:12AM (#41065845)

          They can ask Google to not index them.

          If only we had some way of doing that automatically, per site...
          I propose a file named "robots.txt" to be placed in an HTTP server's root,
          containing a parsable description of what web crawlers are and aren't allowed to access.

          It's not like we have anything like this right now... right?
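
          (For reference, a minimal sketch of what such a file could look like if a site wanted to keep out only Google's crawlers while staying visible to everyone else. "Googlebot" and "Googlebot-News" are assumed here as the user-agent tokens for Google's web and news crawlers, and example.com is a placeholder:)

          # robots.txt, served from the site root, e.g. http://example.com/robots.txt
          # Keep Google's general and news crawlers out entirely.
          User-agent: Googlebot
          Disallow: /

          User-agent: Googlebot-News
          Disallow: /

          # Every other crawler: no restrictions (an empty Disallow allows everything).
          User-agent: *
          Disallow: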

          • Stop that, you're going to make me spill my morning coffee.

          • How in the world is that going to compensate them for the lost sales of their print version?
      • Re:Say what? (Score:5, Insightful)

        by NettiWelho ( 1147351 ) on Tuesday August 21, 2012 @02:15AM (#41065075)

        and the users just skim the results instead of clicking the links.

        Yes, because I didn't find anything of interest during the skimming.

      • The news sites are miffed because search engines preview their pages in the search results, and the users just skim the results instead of clicking the links.

        ...and the reason they can't make the preview interesting enough for me to want to click it is...?

    • Re:Say what? (Score:5, Insightful)

      by bickerdyke ( 670000 ) on Tuesday August 21, 2012 @02:42AM (#41065225)

      Unless Google is forced to index them and also forced to pay, but that would in essence be a tax on a single company.

      Yep, that's what they want.

      If those sites just wanted Google to stop indexing their pages, a robots.txt would be enough.

      Honi soit qui mal y pense.

      • If those sites just wanted Google to stop indexing their pages, a robots.txt would be enough.

        They don't want them to stop, they want them to pay.
        This will never pass, of course. Google and others would simply stop indexing them, making them nearly invisible. Who wants that?

        • Money is only half of what those publishers want. The other half is forcing Google to list their pages using anti-trust laws. That may be possible, as Google is big enough to be a de-facto monopoly (that's OK under EU law, as it was achieved without unfair means; it's using that position to actively suppress competition that is not allowed. De-listing other sites may be seen as such an unfair attack, as it is not far-fetched to see Google News as a direct competitor to other news sites such as newspapers).

          • That doesn't necessarily stop Google. They could still list the sites as required by the courts, just give them a weight of .00000001, meaning they're on the last page (except for very specific searches) and only listed as a bare link, no text, so very few people would click on them anyway.

            • by bickerdyke ( 670000 ) on Tuesday August 21, 2012 @07:13AM (#41066515)

              Courts aren't stupid... They would still recognize that as an equally unfair action against competitors.

              • Someone has to be listed at the top, someone else has to be listed at the bottom.

                Forcing a top result to the bottom might be unfair, but they are complaining that they don't want their results shown for free.

              • by sabri ( 584428 ) *

                Courts aren't stupid... They would still recognize that as an equally unfair action against competitors.

                I disagree with you here. Courts are stupid, at least most of them. If you look at courts around the world, you will see that most senior judges (the ones handling the appeals cases, which are the most important ones) are well past their 40s and have no affinity whatsoever for technology, let alone the complex internal workings of The Big Evil Internet.

                Want proof? Look at the various rulings that prohibit linking [slashdot.org] for example [afterdawn.com].

                Not to mention the hostility they hold towards internet providers, favoring t

            • Actually, Google could do what they asked and just give a bare link back to the site in the listing. That is, the site appears in the search listing as www.deutschebag.com with no more information.
        • Re:Say what? (Score:4, Interesting)

          by bluefoxlucid ( 723572 ) on Tuesday August 21, 2012 @07:35AM (#41066633) Homepage Journal

          Exactly. They want Google to pay them for providing the service of indexing their sites. I want construction workers to pay me to build a deck on my house. Funny thing is, the construction workers want ME to pay THEM. Crazy, isn't it?

          Google will just de-list them and, since nobody remembers bookmarks or URLs, Al-Jazeera and CBSNews will be swiftly forgotten and sent to the hell of bankruptcy. Google can't de-list, say, MSN, since Microsoft is a competitor and Google has a monopoly; however, if MSN demands Google pay $1 per search result, Google can refuse to pay and then comply with the copyright demands by de-listing them. Further, Google could then refuse to ever re-list MSN again, and a court would have to order Google to list them, but that would be extremely difficult because MSN initiated the "take me off your list" call, so how are we going to accuse Google of abusing its monopoly now? What's your argument? That Google is being abusive by declining to take advantage of a special offer to utilize a competitor's product for free?

      • If those sites just wanted Google to stop indexing their pages, a robots.txt would be enough.

        Normally, slashdot readers are all for opt-in as compared to opt-out. Why is it different here?

        • by ColaMan ( 37550 ) on Tuesday August 21, 2012 @05:22AM (#41065887) Journal

          Because operating a webserver is basically opting-in to being part of the World Wide Web.

        • by bickerdyke ( 670000 ) on Tuesday August 21, 2012 @05:31AM (#41065943)

          Because one can safely assume that being listed in Google's index is what website operators want. The existence of all those black-, white-, grey-, and donkey-hat SEOs supports that assumption.

          But I partly agree: if someone were to re-invent the internet and write the specifications from scratch, opt-in should be the norm. But once again, THAT'S NOT THE POINT here!

          Google offered the publishers who are pushing for that law to ignore their pages, so they wouldn't even have to opt out, but the ensuing outcry of "Google threatens to unlist us!!!!" was even louder than the earlier one of "Google indexes our pages without paying compensation."

          This is NOT about indexing or being found by Google News. Everybody wants to be indexed by Google!

          They simply want money!

          • by AmiMoJo ( 196126 ) on Tuesday August 21, 2012 @07:11AM (#41066491) Homepage Journal

            TFA doesn't explain the situation... at all.

            Newspapers seem to think that Google wants their content to create services like Google News and to enhance its search results, from which it derives profit. The problem is that Google's attitude is that web sites have to make their own money from visitors, which makes sense for search, but Google News essentially creates a kind of "digital newspaper" from other people's content.

            Because Google integrates news stories into its search results, the line between the two is now blurred. Personally, I think Google is right here, and while some profit sharing would be nice, doing so would set a dangerous precedent.

            • I doubt that Google News copies whole news stories... No one would ever have thought that that should be covered by fair use. The original draft of this law specifically mentioned headlines and isolated sentences.

            • Google might pull the entire text of news stories for indexing purposes, but they don't show it. If I'm searching for articles on a topic and enter my search term, Google News will show me links to the news articles in question with the title and perhaps some contextual text (the sentence surrounding it). They don't give me the entire article. If I want that, I need to click on over to the website to read it.

            • Except it's complete BS. I use Google News all the time -- and I click through on the articles that are interesting to me. What it means is that I may not read all my morning's news on any one news site, which is annoying to them, but too bad.

              Also, it means all those news stories they pay for from AP and other news aggregators are worthless because I'm not going to read the exact same story from two different newspapers, but I might read the same story multiple times if they have different writers and angl

            • by tlhIngan ( 30335 )

              The problem is that Google's attitude is that web sites have to make their own money from visitors

              And Google offers a ton of ad subsidiaries for that exact purpose, from AdMob for mobile to DoubleClick for your popover/popunder/Flash ad needs, and probably dozens of other ad networks they also own.

              Google is your friend - just subscribe to one of Google's many ad companies and you'll get paid.

              Seems like it's working fine for me.

        • Because publishing something to the Internet makes it public by default.

          If you don't want your information to be public, you hide it behind passwords and login forms. Once it's public, there's no reason to stop people from indexing it.

        • by Rakarra ( 112805 )

          Because the world wide web cannot operate as an opt-in service, and opting out is not something you have to register with the operators. I.e., a website owner doesn't have to go to some page on Google, Bing, AltaVista, Wolfram, Yahoo, and whatever else to opt out, a setting which would be conveniently 'forgotten' when those sites revised their ToSes, or terminated your account because you didn't log in, or anything along those lines. Instead, those sites ask you what your preferences are, each time. You are

    • by jeti ( 105266 ) on Tuesday August 21, 2012 @03:10AM (#41065375)

      Congratulations. You identified the "???" before the "Profit!".

    • by swilver ( 617741 )

      Perhaps Google should start asking for money to be included in search results; seems only fair.

    • He gets it in one. We call it "breaking the internet," and it's what most governments attempt to do.

  • Wrong reaction (Score:2, Redundant)

    by c0lo ( 1497653 )
    TFA (well, the last one) says:

    the new conservative-liberal German Government that was elected in late 2009 declared: “Press Publishers shall not be discriminated against other disseminators of copyright protected works [e.g. film or music producers]. Therefore we aim for the introduction of a neighbouring right for press publishers to increase the protection of press publications on the Internet.”

    First... a weird thing: are press publishers in the same league as copyright-protected works? I know that a US court allowed FauxNews to serve "creative fiction" as news, but I thought this would be the exception rather than the norm.

    Second... now, I know that's a fool's hope, but I cannot stop myself from wishing that the discrimination (... which is a wrong thing, right?...) would have been resolved by lowering the rights of the film or music producers instead of increa

  • Misleading summary (Score:5, Insightful)

    by Anonymous Coward on Tuesday August 21, 2012 @02:26AM (#41065141)

    The proposed law has nothing to do with linking to news sites at all. The point is that the publishers are to be compensated if anyone takes parts of an article, or the full text, and displays them somewhere else. There is not even much debate about the intention itself; I think it's only fair if you reprint significant parts of an article (and thereby deprive the original author of advertising revenue or subscription fees), but what constitutes a "significant part" of a news article? For example, Google News usually shows the first few sentences under the link; is that a significant part? In my opinion it's not, but that is what the discussion is about.

    In the original draft, even single sentences would have been regarded as "significant parts", but that would then also mean that you cannot quote from any news article anymore in any other publication, which would have significant negative side effects. So, what happens now is what happens in every democracy: someone drafts a bill, other people criticize it, and we have no clue yet what is going to happen in the end.

    • by Hazel Bergeron ( 2015538 ) on Tuesday August 21, 2012 @02:35AM (#41065183) Journal

      So, what happens now is what happens in every democracy: someone drafts a bill, other people criticize it, and we have no clue yet what is going to happen in the end.

      Perhaps your democracy is not old enough to be operating optimally. In Westminster, it works like this:

      1) One or more big businesses lobby government;

      2) Government produces draft legislation to benefit these businesses, but including all sorts of bullshit in it too;

      3) There is a "debate" in which the government "concedes" to removing all the bullshit that no-one was expecting to be included anyway;

      4) The bill passes.

    • by kill-1 ( 36256 )

      In the original draft, even single sentences would have been regarded as "significant parts", but that would then also mean that you cannot quote from any news article anymore in any other publication, which would have significant negative side effects.

      You could still quote articles, but the quote would have to be embedded in another non-trivial work. Aggregating news has never counted as quoting in the sense of German copyright law.

    • It seems like the simple solution (from Google's perspective) is still effectively what many posters are saying. If sites insist that Google pay them to include snippets in the search results, then Google should simply omit the snippets for those search results. Given the way people rely on snippets to give them an idea about whether or not the link is to what they're looking for, the result will be almost the same as if Google simply didn't link to them. Further, since Google, like all search engines, u
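
      (Worth noting, as a sketch only: a page can already ask a search engine to keep it in the index but drop the text snippet, via the robots meta tag. The nosnippet and noarchive directives below are the ones Google documents for its crawler; support by other engines is an assumption here.)

      <!-- In the <head> of each article page: stay indexed, but ask crawlers
           not to show a text snippet or a cached copy in the results. -->
      <meta name="robots" content="nosnippet, noarchive">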

    • by Kirth ( 183 )

      ... yes, and the worst part of it would be that "publishers" could suddenly claim new rights to public-domain content.

      This is a declaration of war against the public, public domain and the public good.

    • Google News usually shows the first few sentences under the link, is that a significant part? In my opinion it's not, but that is what the discussion is about.

      It really depends on the news source. Traditionally, newspaper articles, especially those written for syndication, are written in a quite annoying style in which the most important bit of information comes in the first paragraph, then each subsequent paragraph adds less and less relevant detail, until you reach the absolutely useless stuff that's there only to fill space. It is done this way so that any number of newspapers can buy the article and make it fit their wildly varying available space, as the

    • IP is going away. This is just the death throes. IP is a way to control your personal property. For everything content-based, it means that someone prevents you from having 0s and 1s in specific sequences on a computer you own.

  • Is it too much to ask that you know what the fuck you're talking about before drafting or considering a piece of legislation that affects said fucking whatever?
    • by interkin3tic ( 1469267 ) on Tuesday August 21, 2012 @02:39AM (#41065207)
      I'm sure they knew exactly as much about it as the lobbyists for the publishers thought they should know.
    • Yes. Requiring that the people in charge know what they are talking about would limit who could be put in charge to intelligent and rational people, which is called elitism, and that's not socially acceptable. So it is indeed too much to ask.

      However, it is not too much to ask, in my opinion, that the drafters of any bills be held personally responsible for them. If, say, this bill passes and company A is suddenly pressing Google for X millions in compensation, Google should be able to sue the person drafting th

      • by Kjella ( 173770 )

        prompting a legal review by the highest legal court in the country. If the bill/law is found to be bad

        I'm sorry, but how could a court possibly decide whether a law is good or bad? That would make the Supreme Court into some kind of unelected super-parliament of nine, deciding on their own whim which laws to keep and which not. Yes, they can say whether a law is constitutional or not, but they don't form any opinion on whether it's good or bad; Congress has passed both the law and the constitution, and the Supreme Court only makes sure they're consistent. Despite all the flaws in the US election system, putting the democratic

        • First off, you assume America here. I don't, maybe because I'm not American, maybe because the article is about Germany, maybe because the world is bigger than your little pond. However, repealing laws is indeed a normal part of the duties of the legal system in a country, or perhaps of the government in some cases. Either way, it's an investigation that is handled at the highest levels of the legal system, since it's a decision that affects all lower parts of the system.

          And yes, putting a democratically el

      • by Duhavid ( 677874 )

        That would only shift the payments to the lawmaker and the reviewer.

        Get corporate money out of politics, then you have a real fix.

        • Well, the people who push the bills are the lawmakers, so it would cure that. As for the reviewers, that's supposed to be a court of law like I said. If those accept bribes then it's a much more obvious problem. A politician can accept a lot of "campaigning contributions" and so on with little trouble, but a judge who suddenly received large payments of any kind from any people other than their employer - the courts - would be easy to bring under question. So if your country actually has a trouble with brib

  • There is robots.txt

    If you don't want Google to link to you, update your robots.txt. It is that simple! Those that do will be indexed, those that don't won't, and it is business as usual, or lack thereof.

  • by Anonymous Coward

    First of all: the so-called "Leistungsschutzrecht" has already been cut back to become a "Lex Google", meaning it will (currently) only apply to Google, making it open to litigation (laws must not be tailored to one specific offender).

    The whole thing is a farce. It's been a concerted effort by German media companies to bully others into paying compensation. Consequently, since the initiators are media companies, you won't find much criticism in the media.

    If you care to read some more about it, use Google.

  • by MidnightBrewer ( 97195 ) on Tuesday August 21, 2012 @04:23AM (#41065635)

    This is one of the most incomprehensible post summaries I've ever seen on Slashdot; it could have used a little TLC in the way of explanation.

    So basically the German publishers are demanding that the current copyright law be amended to make any quote from an article, even the headline, subject to a copyright licensing fee. Under current law, the headline and opening sentences of an article are in the public domain. Linking itself is free; it's the snippet quoting that Google and other sites like to do that would cost money. However, it would have disastrous consequences for blogging and online journalism as a whole, not to mention search engines, as pretty much any web page that quotes a German article would be liable to pay a fee.

    Reading the second article, it would appear that the second draft of the bill has already gotten to the point of compromise where nobody would be happy with the eventual outcome, including the publishers, so it will most likely stall or be shelved permanently. At this point, it's almost more a bullet dodged than actual news. Kudos on posting an article in which you're quoted, though.

    On a side note, the original German term seems much less ambiguous than the British English "neighbouring rights" or American English "related rights". "Leistungsschutzrecht" literally means "right to protection of effort".

    • Slight correction: the whole of the article, including the first paragraph and headline, is still under copyright protection and not in the public domain, but it may be used by others under fair use (quoting in general is considered fair use).

      Something else that gets lost in the translation of "Leistungsschutzrecht" is that we're not talking about the authors' rights, as the news and newspaper sites usually aren't the authors. What comes closest to the proposed law is the "sweat of the brow" construct, as we're talkin

      • Regarding the copyright vs. public domain: really? I was quoting from the Al Jazeera article, so perhaps the article is wrong.

        • "Public Domain" in a strict sense means waiving all your rights to something. It becomes a common good. But here ownership of the parts of the articles stays with the author, even if it may be used by others for free.

  • bad translation (Score:5, Informative)

    by Tom ( 822 ) on Tuesday August 21, 2012 @04:26AM (#41065651) Homepage Journal

    "Leistungsschutzrecht" has nothing to do with neighbours. The three words it is made off are Leistung which translates as "achievement, effort, performance", Schutz = "protection" and Recht = "right, law".

    It plainly and simply intends to protect the efforts of the newspapers. And it is highly controversial within Germany. Basically, our news and printing industry is what your movie and music industries are: strong lobby organisations buying special rights for themselves.

  • From the article:

    The latest draft amendment proposes far less than what some German publishers sought from the beginning. Throughout the last three years that a neighbouring right has been under consideration in public hearings, the publishers have insisted that the use of its material for any commercial gain - both in the online and offline spheres - should be reflected with some recompense to them. "The example that was given at the hearing was: a bank employee reads his morning newspaper online and sees something about the steel industry, and then advises his clients to invest in certain markets," says Mathias Schindler of Wikimedia Deutschland, who has attended the hearings. "The publishers argued that the bank consultant was only able to advise his clients because of the journalistic work in the published article. So that means the publisher deserves a fair share of any money made from that scenario. This was the proposal from the start."

  • by Lumpy ( 12016 ) on Tuesday August 21, 2012 @05:52AM (#41066035) Homepage

    Google needs to delist ALL German websites. Let's see the German internet economy collapse overnight.

    I'm thinking that their government is made up of idiots and morons that have no clue how anything really works.

    Although we do have a senator that thinks women secrete something when they get raped to prevent pregnancy, so we have our share of complete idiots as well.

    • Google needs to delist ALL German websites. Let's see the German internet economy collapse overnight.

      I'm thinking that their government is made up of idiots and morons that have no clue how anything really works.

      Although we do have a senator that thinks women secrete something when they get raped to prevent pregnancy, so we have our share of complete idiots as well.

      It'd be a good opportunity for another search engine provider to step in and fill the gap, but it's not clear that the business model works when the search engine has to pay to display a link.

    • Although we do have a senator that thinks women secrete something when they get raped to prevent pregnancy...

      He's from the future. In 2250AD women have been genetically engineered to do exactly that.

      • He's from the future. In 2250AD women have been genetically engineered to do exactly that.

        Makes sense, since by 2213 soldiers will be retroengineered to rape everything that moves in order to breed the enemy out.

    • Although we do have a senator that thinks women secrete something when they get raped to prevent pregnancy, so we have our share of complete idiots as well.

      He's only a congressman. I'm sure they'd never let anyone that batshit crazy be a senator, right?

  • Everyone tries really, really hard to be #1 in Google's listings and to be linked by other sites; these guys are ass-backwards if they think people should actually pay THEM to put what are essentially free ads on their pages. If I were Google, I would completely remove all links to the sites that don't want to be linked, and let them die in the abyssal depths of Internet oblivion where nobody knows they exist. What a bunch of retards.
  • by devent ( 1627873 ) on Tuesday August 21, 2012 @07:41AM (#41066651) Homepage

    That is so typical of content "producers" and copyright holders: the "we want to eat our cake and have it, too" syndrome. They want the extra traffic generated by the news aggregators and search engines, but they also want a share of the money the news aggregators and search engines are generating by offering a useful service.

    If they do not want the news aggregators and search engines to use their content, they could just use the robots.txt file to opt out of indexing. But of course then they would not get the extra traffic. So they choose the next "logical" step: get the benefit from the news aggregators and search engines, but complain loudly and weep so they get an extra piece of the money.

    The inter-trade organizations VDZ and BDZV could also just exclude Google or any other news aggregators they don't like and either a) create their own search engine/news aggregator or b) negotiate an agreement with Google.

    But of course weeping and crying is not only easier; with a new law they can extend their rights indefinitely. Right now the discussion is about the topics and automatically extracted excerpts being protected for one year. In 5 years they will push for a protection of 5 years, and sooner or later it will be "aligned" with German copyright law, and topics and automatically extracted excerpts will be protected for 70 years.

    from http://www.heise.de/newsticker/meldung/Google-Leistungsschutzrecht-beispielloser-Eingriff-ins-Netz-1671227.html [heise.de]

    "Presseverlage im Online-Bereich mit anderen Werkmittlern gleichzustellen" und fordern die Bundesregierung auf, nicht "halbherzig" zu handeln.

    Meaning that they want the same copyright protection for topics and excerpts that they have for the article itself, i.e., 70 years after the death of the author.

  • I think I just discovered a new business model (if the German plan goes through, anyway): Make content (or buy it from someone else, like the AP or Reuters), get it indexed by Google for several years, then "suddenly realize" that Google is indexing your pages in a way that generates income via pageviews/ads, then sue Google for back-royalties for all the years they "unfairly" linked to said content. Brilliant, if I do say so myself.
  • by PPH ( 736903 ) on Tuesday August 21, 2012 @09:41AM (#41067975)
    1) De-list all web sites requesting payment.
    2) Wait until their traffic dries up.
    3) Web site owners request being re-listed.
    4) Google presents them with their price list:
      4a) $0 if we found you and listed you for free initially
      4b) ${big_bucks} to re-list, plus:
        4b.1) An annual % based on click-through traffic.
    5) ?????
    6) Profit!
  • robots.txt (Score:4, Interesting)

    by WaffleMonster ( 969671 ) on Tuesday August 21, 2012 @10:08AM (#41068303)

    Any news site that does not want a search engine linking to it needs no legislation. All they need to do is create a file called robots.txt in the root folder of their site with the following content:

    User-agent: *
    Disallow: /

    This will ensure said news site is never seen by anyone. The choice is yours and under your full control.

  • If something like this becomes law in Germany, I really, really hope that Google doesn't cave on this one. It seems like Google has caved on stuff lately. If Google caves and pays, the floodgates would open and every country in the world would start demanding fees from Google for this and that.
