

Google Decided Against Offering Publishers Options In AI Search
An anonymous reader quotes a report from Bloomberg: While using website data to build a version of Google Search topped with artificial-intelligence-generated answers, an Alphabet executive acknowledged in an internal document that there was an alternative way to do things: they could ask web publishers for permission, or let them directly opt out of being included. But giving publishers a choice would make training AI models for search too complicated, the company concluded in the document, which was unearthed in the company's search antitrust trial.
It said Google had a "hard red line" and would require all publishers who wanted their content to show up on the search page to also allow that content to be used to feed AI features. Instead of giving options, Google decided to "silently update," with "no public announcement" about how it was using publishers' data, according to the document, written by Chetna Bindra, a product management executive at Google Search. "Do what we say, say what we do, but carefully." "It's a little bit damning," said Paul Bannister, the chief strategy officer at Raptive, which represents online creators. "It pretty clearly shows that they knew there was a range of options and they pretty much chose the most conservative, most protective of them -- the option that didn't give publishers any controls at all."
For its part, Google said in a statement to Bloomberg: "Publishers have always controlled how their content is made available to Google as AI models have been built into Search for many years, helping surface relevant sites and driving traffic to them. This document is an early-stage list of options in an evolving space and doesn't reflect feasibility or actual decisions." The company added that it continually updates its product documentation for Search online.
Pull the other one, it's got bells on (Score:2)
Re:Pull the other one, it's got bells on (Score:5, Informative)
Not quite. It IS their service, and while they can do what they want, what they can do is constrained by laws. One of those laws says they can't just leverage a monopoly to force things that are outside of that monopoly. Google Search is pretty much a monopoly, with its only real rival, Bing, being more or less an unwanted side effect of choosing Edge over Chrome (Google has about 89% share, with the remainder split between Bing, Yandex, and the likes of DuckDuckGo. It's a monopoly in every sense that counts).
So yeah, while Google CAN do this, they do so at risk of running afoul of antitrust laws and potentially also copyright laws. (Burying a small-print Latin clause giving themselves the right to do this isn't necessarily going to fly with a judge if the judge thinks the publisher was not given a fair choice to agree or not, *especially* if it was just unilaterally added to a EULA without notifying the publisher clearly enough that the rules had changed.)
Re: (Score:2)
The AI Overview does not involve training. It's using an LLM to summarize what is linked on the results page. It's basically the result of a query like "Summarize the following content: [search results page]". The AI on the search page is merely post-processing of the results, and may not even need an additional web request to the site, depending on the Google cache.
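In other words, roughly this kind of pipeline. A minimal sketch only, under the summarization assumption described above; `call_llm` is a hypothetical stand-in for whatever hosted model endpoint is actually used, not Google's real implementation:

```python
# Sketch (assumptions, not Google's actual pipeline) of the "post-processing"
# idea above: the AI answer is produced by prompting an LLM with the snippets
# already retrieved for the results page, not by training a model on the sites.

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for whatever hosted LLM endpoint is used."""
    return f"[summary generated from a {len(prompt)}-character prompt]"

def ai_overview(query: str, result_snippets: list[str]) -> str:
    # Stitch the cached snippets from the results page into one prompt.
    sources = "\n\n".join(
        f"Source {i + 1}:\n{snippet}" for i, snippet in enumerate(result_snippets)
    )
    prompt = f"Summarize the following content to answer '{query}':\n\n{sources}"
    return call_llm(prompt)

if __name__ == "__main__":
    snippets = [
        "Publishers can steer crawlers with robots.txt directives.",
        "AI Overviews appear above the traditional list of links.",
    ]
    print(ai_overview("how do AI search summaries work", snippets))
```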
Translation (Score:3)
'Once again, we decided to check the box marked "be evil" - for consistency's sake as much as anything.'
Re: (Score:2)
Or another possible interpretation:
1. The legal fiction called "intellectual property" as it exists today stands firmly in the way of any innovation and
2. Unlike you, a very large company will always find a way to sidestep this legal roadblock to your detriment
Re: (Score:2)
True, but actually, how would anyone ever pay publishers for their work, or even let publishers have a choice of some sort? I can't see it ever being feasible.
Sure, you could do a deal with Penguin or someone and pay *them* for *their* works, but what about the little publishers in different countries? You'd never find them all, and you'd never be able to do deals with them all. You'd also be inundated with scammers pretending to be publishers trying to get in on the action. So all you'd end up doing is enrich
Re: (Score:2)
Come to think of it, that almost sounds like a Futurama quote.
At Google True == False (Score:2, Insightful)
Don't be Evil => Profit is Good => Evil makes Profit => Evil is Good => True == False
Note: a new riff on an old theme. Wall Street has always been eager to wreck things and people for more profit.
Move along. Nothing to see here.
"too complicated" (Score:1)
The real use case of AI has always been "let us continue to make arbitrary decisions without the consequences affecting us".
Anything else is incidental, and the fact that it doesn't actually fulfill this use case is... well, telling. The whole thing is a failing gambit, and the actual uses are orthogonal to the way 'business leaders' imagine the world works. If they push it any harder, they'll only further demonstrate how out of touch they are with the way real people actually have to live their lives.
Re: "too complicated" (Score:1)
robots.txt
ai.txt
it's not complicated AT ALL
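For what it's worth, the plumbing for that kind of opt-out already exists. A small sketch using Python's standard robotparser: "Google-Extended" is the token Google documents for AI-related opt-outs, "ai.txt" remains only a proposed convention, and the example.com URL is a placeholder:

```python
# Sketch of a robots.txt-style opt-out check, assuming the publisher
# disallows the AI-training agent while leaving normal indexing alone.
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: Google-Extended
Disallow: /

User-agent: *
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT)

# Ordinary search crawling is still permitted...
print(rp.can_fetch("Googlebot", "https://example.com/article"))        # expect True
# ...but the AI-training agent is told to stay out.
print(rp.can_fetch("Google-Extended", "https://example.com/article"))  # expect False
```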