'How Google's Ad Business Funds Disinformation Around the World' (propublica.org) 206

Today ProPublica published "the largest-ever analysis of Google's ad practices on non-English-language websites," saying its report shows Google "is funneling revenue to some of the web's most prolific purveyors of false information in Europe, Latin America and Africa," and "reveals how the tech giant makes disinformation profitable...."

The company has publicly committed to fighting disinformation around the world, but a ProPublica analysis, the first ever conducted at this scale, documented how Google's sprawling automated digital ad operation placed ads from major brands on global websites that spread false claims on such topics as vaccines, COVID-19, climate change and elections.... The resulting ad revenue is potentially worth millions of dollars to the people and groups running these and other unreliable sites — while also making money for Google.

Platforms such as Facebook have faced stark criticism for failures to crack down on disinformation spread by people and governments on their platforms around the world. But Google hasn't faced the same scrutiny for how its roughly $200 billion in annual ad sales provides essential funding for non-English-language websites that misinform and harm the public. Google's publicly announced policies bar the placement of ads on content that makes unreliable or harmful claims on a range of issues, including health, climate, elections and democracy. Yet the investigation found Google regularly places ads, including those from major brands, on articles that appear to violate its own policy.

ProPublica's examination showed that ads from Google are more likely to appear on misleading articles and websites that are in languages other than English, and that Google profits from advertising that appears next to false stories on subjects not explicitly addressed in its policy, including crime, politics, and such conspiracy theories as chemtrails. A former Google leader who worked on trust and safety issues acknowledged that the company focuses heavily on English-language enforcement and is weaker across other languages and smaller markets....

The former Google leader suggests Google focuses on English-language problems partly because the company is sensitive to bad PR and the possibility of regulatory scrutiny (and because English-language markets have the biggest impact).

Google is spending more money to patrol non-English content, a spokesperson told ProPublica, touting the company's "extensive measures to tackle misinformation... In 2021, we removed ads from more than 1.7 billion publisher pages and 63,000 sites globally. We know that our work is not done, and we will continue to invest in our enforcement systems to better detect unreliable claims and protect users around the world."

But in some cases Google's ads appeared on false online articles published years ago, the article points out, "suggesting that the company's failure to block ads on content that appears to violate its rules is a long-standing and ongoing problem... [T]he investigation shows that as one arm of Google helps support fact-checkers, its core ad business provides critical revenue that ensures the publication of falsehoods remains profitable."
Comments Filter:
  • by VeryFluffyBunny ( 5037285 ) on Sunday October 30, 2022 @03:52AM (#63009701)
    ...of the cliff or an ambulance at the bottom? Google, Facebook, Twitter, et al. claim that it's better to put an ambulance at the bottom. They're publishing ads on unvetted content & then saying, "Oopsie! That's false, offensive, harmful & possibly illegal content. Oh well, we'd better unpublish it then." after the harm has been done.

    Until they're held legally responsible for what they're doing, they won't stop. It's far too profitable to spread lies & misinformation, as Joseph Pulitzer & William Randolph Hearst discovered & made their fortunes in the late 19th & early 20th centuries. It's time to rein Google et al. in like we did with that "yellow journalism" a century ago.
    • by Opportunist ( 166417 ) on Sunday October 30, 2022 @04:14AM (#63009715)

      A good read in this context is "The Lost Honour of Katharina Blum" [wikipedia.org], which takes exactly that as its subject: an innocent person gets discredited and slandered by a tabloid to the point where she eventually shoots one of its journalists.

      And the internet media today are pretty much what the unfettered yellow press was: a sensationalist slandering machine.

      • by VeryFluffyBunny ( 5037285 ) on Sunday October 30, 2022 @06:26AM (#63009817)
        Well, that's why I think we need a strong fence at the top. The companies should be held liable for whatever they publish, regardless of whether they're publishing it verbatim, e.g. comments on social media or from some idiot's blog or podcast, or running it through some NLP algorithm that adds a message saying it isn't actually true & provides a link to more info on the topic. If they want to do actual journalism, i.e. reporting/publishing what people have done, said, & written, then they should be held to journalistic standards, & if someone's lying or misleading, they should report that as the story rather than regurgitate the words & deeds verbatim. That's the fucking point of journalism & why we train people for years & pay them to do it. Don't let these ad agencies distract us with the medium, i.e. online social media: reporting is reporting & publishing is publishing. If they're lying, prosecute them for it until it hurts enough that they stop.

        P.S. I do realise that a lot of what I've said also applies to newspapers & TV stations, & yes, it should. Many newsrooms have sacked their fact-checkers in the race to make money out of publishing click-bait. It's a race to the bottom & we pay for it with our public discourse & democracies while they make out like bandits. In the light of this, it's easy to understand why some countries choose to restrict or block many social media companies. It's not always entirely about authoritarian control.
        • by alvinrod ( 889928 ) on Sunday October 30, 2022 @11:28AM (#63010231)
          Sounds great until it's subverted by whoever is in power to establish what the truth is and to punish everyone else who dares to say otherwise. The price of freedom is that you get a few cranks who get to say whatever they want. I wouldn't require Google or anyone else to do business with them, but you're incredibly foolish to think that the very tools you create to safeguard the truth or democracy won't be subverted by the worst sort of people.

          We already have historical examples of this happening. The Weimar Republic had such laws in place and shut down many Nazi publications in the 1920s; some prominent Nazi party members even received prison sentences for violating those laws. Unfortunately, the laws did nothing to prevent those ideas from spreading, and worse yet gave the Nazis an easy claim that they were being persecuted, and that it must obviously have been for the very reasons they were claiming. When they eventually gained enough political power, the same laws were used to prevent criticism of the Nazi party and to shut down opposition to their message. Soon the only truth you were allowed to utter was the one they had decided was true.

          The best defense against bad ideas is an educated public that is free to argue against them. It's hard to claim any kind of victimhood when no one stops you from speaking your piece and everyone else is just as free to speak theirs and point out how ridiculous your ideas might be. Preventing bad ideas from being shared in public doesn't stop them from spreading; it only leaves them to fester in the dark. For anyone who truly thinks Trump and his ilk (or substitute your own equivalent) are so awful and must be prevented from speaking, you really should ask yourself whether you'd be comfortable with them holding power with the kind of tools in place that let someone decide what kind of discourse is no longer free. As many have discovered throughout history, the guillotine they helped erect will gladly take their head as well.
          • by VeryFluffyBunny ( 5037285 ) on Sunday October 30, 2022 @12:02PM (#63010305)
            Are you arguing that the USA's slander laws & media regulation have undermined US democracy & turned it into a fascist state? Although to be fair, the FCC does seem more concerned with Janet Jackson's nipples than with lies & misinformation. Mass media in the USA is far from free. All I'm arguing for is that the same laws be applied to internet broadcasters as to TV, radio, & newspapers.
            • Are you arguing that the USA's slander laws & media regulation have undermined US democracy & turned it into a fascist state?

              Uhh I don't know how to tell you this, buddy...

              Though to be clear, slander and libel laws have nothing to do with it.

          • by Geoffrey.landis ( 926948 ) on Sunday October 30, 2022 @12:50PM (#63010423) Homepage

            Sounds great until it's subverted by whoever is in power to establish what the truth is and to punish everyone else who dares to say otherwise.

            Yes, that is indeed the problem.

            The price of freedom is that you get a few cranks who get to say whatever they want.

            Unfortunately, what we have found is that it's not just "a few cranks." The army of evil trolls spreading disinformation for lolz is bad enough, but the worse problem is deliberate lies being spread by organizations and countries to advance political goals and spread hate and schism.

            We have discovered that this is not a theoretical problem, it is a very real problem and very destructive.

            I wouldn't require Google or anyone else to do business with them, but you're incredibly foolish to think that the very tools you create to safeguard the truth or democracy won't be subverted by the worst sort of people.

            This is a hard problem. Allowing disinformation to be spread unchecked is destructive. Stopping it, however, is liable to be misused.

            How do we solve this?

          • Far better to have some garbage than to allow private companies to decide what we read, for profit.

            On Google, search for

            origin of covid tuntable

            There is only one site on the internet with those words, and you will not find it on Google. You may or may not agree with the content, but to censor it is odious.

            https://www.originofcovid.org/ [originofcovid.org]

    • Until they're held legally responsible for what they're doing, they won't stop.

      It's this. Market pressures mean that companies tend towards the most profitable thing. Even if Google did stop this, someone else would step in and do worse. We need to recognize that "speech" by companies is not the same as speech by people. If someone is doing their job and being paid to speak in a certain way, without complete personal freedom to say what they want without penalty, then that is not "free speech".

      • This is piss-poor reasoning.

        What if a person was doing it, not Google?
        You cannot draw some kind of magical distinction there.

        The private sector must have the right to censor on its property, just as people must have the right to speak while on their property (including public property).
        You cannot regulate the right to censor without abridging Freedom of the Press.

        Attempts to do otherwise are either misguided, or a direct attempt at putting speech into the control of the Government.
        If you can tell a
        • "This company has performed an illegal action and will be terminated."

      • It's this. Market pressures mean that companies tend towards the most profitable thing.

        This is a real problem. The product that Google sells is access to eyeballs. What keeps people's attention glued to the screen is outrage, and so the algorithms for what Google shows people are crafted to favor outrage.

        Unfortunately, the people who craft disinformation know that, and have evolved to craft lies that maximize outrage.

    • by znrt ( 2424692 )

      Until they're held legally responsible for what they're doing, they won't stop. It's far too profitable to spread lies & misinformation

      indeed but, otoh, the people spreading the misinformation are known yet aren't held responsible in any way. misinformation is as old as humanity, it's ubiquitous and can be both grossly coarse and very subtle ... this is a very tricky subject, only the most blatant offenses can be tackled, there are lots and lots of actors involved and it simply can't be blamed on google alone.

      the only real solution to this imo is raising the general level of education and basic skills in critical thinking. for som

      • I believe in free speech & I see no crime in lying to reporters. It's the publishers' responsibility, nay, duty to make every reasonable effort to ensure that what they publish is factually correct, i.e. old media need to give fact checkers their old jobs back & new media need to do some serious fact checker hiring. Media companies have disproportionate influence & power & so it's only reasonable that they are held to transparent & democratic standards.
        • by znrt ( 2424692 )

          I believe in free speech & I see no crime in lying to reporters. It's the publishers' responsibility, nay, duty to make every reasonable effort to ensure that what they publish is factually correct

          duty. what is a publisher? because that now encompasses everything from google to individual tiktok users just the same as the bbc or fox or the ny times. "the journalist's oath" has never been much more than a mix of good intentions, self-righteousness, excuses and propaganda spin; implying that it had any substantial effect on the truthfulness of the emitted narratives is really innocent thinking, but today it is completely moot and diluted in a world of universal access. i say the only long term solution is maki

    • The same should go for ads.
      Users should be able to flag ads. LinkedIn actually lets you do that. You'll still get ads, just different ones, but you get rid of the ones that are problematic.

      • That's still after the fact. There's no such thing as "unpublishing." When a media company commits an offence, it can't be undone; it has to be investigated & prosecuted by rule of law with transparent & democratic oversight.
  • Some proof (Score:5, Interesting)

    by DustPuppySnr ( 899790 ) on Sunday October 30, 2022 @04:08AM (#63009711)
    For those who haven't seen it. https://www.youtube.com/watch?... [youtube.com] A bit long, but it definitely shows, based on the account in question, how Google doesn't follow its own policies.
    • by MrL0G1C ( 867445 )

      +1, the amount of lies and deceptive statements from Youtube here, just so they could continue to profit from dangerous lies, is mind-boggling. What Youtube has done has likely contributed to the loss of lives from Covid and also (IANAL) looks treasonous to me.

      • It's arguably not treason unless it's willful aid to the enemy. Not doing enough diligence to determine whether someone is the enemy is a grey area.

        OTOH when someone gets paid to disseminate a piece of content, I would argue that's no longer merely something someone else posted on their service, but something they are willfully disseminating, so Section 230 should not apply. I propose that the payment should determine the nature of the interaction. This idea will of course be wildly unpopular among those fo

  • by Arnonyrnous Covvard ( 7286638 ) on Sunday October 30, 2022 @04:15AM (#63009717)
    Google Adsense created an expectation of getting paid for having a web page. Adsense created a strong incentive for "SEO", which is the PR name for web spam. Google made it so that browsing without Javascript is impossible, just to protect their ad business. It's amazing how much damage you are allowed to do unimpeded if you just give people a free email account and a second-guessing search engine.
    • You can't blame Google for capitalism, and SEO was a thing before Google even existed; it just wasn't broadly known like it is now. Google is actually responsible for plenty, so let's hold them accountable for the things they actually do (like accepting money to perpetrate fraud).

    • God forbid people don't give you something for free anymore. Honestly, did you think the Internet would stay a set of volunteer nerds posting stuff on Geocities forever? Back in the days when a Slashdot post would knock you off the internet?

      You know, half the time those servers didn't suffer under bandwidth constraints; they were voluntarily and rapidly shut down by the operators, who didn't want the resulting bill from their service provider.

  • by nospam007 ( 722110 ) * on Sunday October 30, 2022 @05:34AM (#63009773)

    That's why we block them.

    • by evanh ( 627108 )

      The tracking is the root problem, not the ads. All commercial tracking of people needs to stop.

      • The tracking is the root problem, not the ads. All commercial tracking of people needs to stop.

        I agree with your latter sentence, but not your former one. Both things are problems, in different ways. That they are combined doesn't change that.

    • Indeed. Why risk it? Block them all. Pi-hole is your friend.

      https://pi-hole.net/ [pi-hole.net]

      Best,

  • get in the way of profits, isn't that right, Google, Facebook, Amazon, etc.?

  • Google's data is like a tool, like a hammer, that can be used for many purposes.
    Someone's misinformation is another person's strongly held belief. Who is to decide what is true or false? Each of us individually must

    • by splutty ( 43475 )

      Belief is by definition not fact. If it were fact, there would be no reason to 'believe' in it.

  • If you want to be lazy, "stuff I don't like" actually has one less syllable than "disinformation" ...
  • Yoohoo... People have been speaking freely for years, I've noticed.

    Seriously, is it a right for your garbage to be delivered to everybody?
  • Youtube is full of false advertising, a lot of it medical, like the ad that proclaims you can stop taking your diabetes medication.
