How Facebook and Google Actually Fund the Creation of Misinformation (technologyreview.com) 196

MIT's Technology Review shares data from a Facebook-run tool called CrowdTangle. It shows that by 2018 in the nation of Myanmar (population: 53 million), "All the engagement had instead gone to fake news and clickbait websites."

"In a country where Facebook is synonymous with the internet, the low-grade content overwhelmed other information sources." [T]he sheer volume of fake news and clickbait acted like fuel on the flames of already dangerously high ethnic and religious tensions. It shifted public opinion and escalated the conflict, which ultimately led to the death of 10,000 Rohingya, by conservative estimates, and the displacement of 700,000 more. In 2018, a United Nations investigation determined that the violence against the Rohingya constituted a genocide and that Facebook had played a "determining role" in the atrocities. Months later, Facebook admitted it hadn't done enough "to help prevent our platform from being used to foment division and incite offline violence." Over the last few weeks, the revelations from the Facebook Papers, a collection of internal documents provided to Congress and a consortium of news organizations by whistleblower Frances Haugen, have reaffirmed what civil society groups have been saying for years: Facebook's algorithmic amplification of inflammatory content, combined with its failure to prioritize content moderation outside the US and Europe, has fueled the spread of hate speech and misinformation, dangerously destabilizing countries around the world.

But there's a crucial piece missing from the story. Facebook isn't just amplifying misinformation.

The company is also funding it.

An MIT Technology Review investigation, based on expert interviews, data analyses, and documents that were not included in the Facebook Papers, has found that Facebook and Google are paying millions of ad dollars to bankroll clickbait actors, fueling the deterioration of information ecosystems around the world.

Facebook pays them for permission to open their content within Facebook's app (where Facebook controls the advertising) rather than having users click through to the publisher's own website, reports Technology Review: Early on, Facebook performed little quality control on the types of publishers joining the program. The platform's design also didn't sufficiently penalize users for posting identical content across Facebook pages — in fact, it rewarded the behavior. Posting the same article on multiple pages could as much as double the number of users who clicked on it and generated ad revenue. Clickbait farms around the world seized on this flaw as a strategy — one they still use today...

Clickbait actors cropped up in Myanmar overnight. With the right recipe for producing engaging and evocative content, they could generate thousands of U.S. dollars a month in ad revenue, or 10 times the average monthly salary — paid to them directly by Facebook. An internal company document, first reported by MIT Technology Review in October, shows that Facebook was aware of the problem as early as 2019... At one point, as many as 60% of the domains enrolled in Instant Articles were using the spammy writing tactics employed by clickbait farms, the report said...
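
To make the incentive concrete, here is a rough, hypothetical sketch in Python. The payout rate, traffic numbers, and function names below are illustrative assumptions, not figures from the article or from Facebook; the only claim taken from the reporting is that duplicating an article across pages could roughly double its engagement.

    # Toy model of the pay-per-engagement incentive described above.
    # All numbers are invented for illustration only.

    def monthly_revenue(articles_per_day, clicks_per_article, pages_posted_to,
                        revenue_per_click, crosspost_boost=1.0):
        """Rough monthly payout to a publisher paid per click on in-app articles."""
        daily_clicks = (articles_per_day * clicks_per_article
                        * pages_posted_to * crosspost_boost)
        return daily_clicks * revenue_per_click * 30  # ~30 days per month

    # One page, modest traffic.
    single_page = monthly_revenue(articles_per_day=10, clicks_per_article=500,
                                  pages_posted_to=1, revenue_per_click=0.002)

    # The same articles duplicated across 10 pages; per the article, duplication
    # itself could roughly double engagement on top of the extra reach.
    farm = monthly_revenue(articles_per_day=10, clicks_per_article=500,
                           pages_posted_to=10, revenue_per_click=0.002,
                           crosspost_boost=2.0)

    print(f"single page:  ${single_page:,.0f}/month")   # ~$300
    print(f"cross-posted: ${farm:,.0f}/month")          # ~$6,000

When a few thousand dollars a month is ten times the local average wage, arithmetic like this is enough to explain why the farms proliferated.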

75% of users who were exposed to clickbait content from farms run in Macedonia and Kosovo had never followed any of the pages. Facebook's content-recommendation system had instead pushed it into their news feeds.
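
The statistic above can be pictured with an equally rough sketch (again hypothetical, not Facebook's actual ranking code): a feed ranked purely on predicted engagement has no reason to prefer pages a user follows over pages they have never heard of.

    # Hypothetical engagement-driven feed ranking. The scoring is made up;
    # the point is only that "followed by this user" carries no weight,
    # so high-engagement clickbait from unfollowed pages can dominate.

    from dataclasses import dataclass

    @dataclass
    class Post:
        page: str
        followed_by_user: bool
        predicted_engagement: float  # expected clicks/reactions/shares

    def rank_feed(candidates, top_n=3):
        return sorted(candidates, key=lambda p: p.predicted_engagement,
                      reverse=True)[:top_n]

    candidates = [
        Post("local newspaper",        True,  0.8),
        Post("friend's photo album",   True,  1.2),
        Post("clickbait farm page #1", False, 4.7),
        Post("clickbait farm page #2", False, 3.9),
    ]

    for post in rank_feed(candidates):
        print(post.page, "(followed)" if post.followed_by_user else "(never followed)")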

Technology Review notes that Facebook now pays billions of dollars to the publishers in its program. It's a long and detailed article, which ultimately concludes that the problem "is now happening on a global scale." Thousands of clickbait operations have sprung up, primarily in countries where Facebook's payouts provide a larger and steadier source of income than other forms of available work. Some are teams of people while others are individuals, abetted by cheap automated tools that help them create and distribute articles at mass scale...

Google is also culpable. Its AdSense program fueled the Macedonia- and Kosovo-based farms that targeted American audiences in the lead-up to the 2016 presidential election. And it's AdSense that is incentivizing new clickbait actors on YouTube to post outrageous content and viral misinformation.

Reached for comment, a Facebook spokesperson told Technology Review that they'd misunderstood the issue. And the spokesperson also said "we've invested in building new expert-driven and scalable solutions to these complex issues for many years, and will continue doing so."

Google's spokesperson confirmed that the examples in the article violated its policies and removed the content, adding "We work hard to protect viewers from clickbait or misleading content across our platforms and have invested heavily in systems that are designed to elevate authoritative information."
This discussion has been archived. No new comments can be posted.

  • by Black Parrot ( 19622 ) on Sunday November 21, 2021 @11:48PM (#62009147)

    This should be a fun discussion.

    • by shanen ( 462549 )

      I wonder if the discussion would have produced more humor if your FP had been visibly modded Funny? But it was a weak attempt at humor and the Subject was vacuous, too.

  • by putnamca ( 5596549 ) on Sunday November 21, 2021 @11:52PM (#62009151)
    They don't. They never will. Why is this on Slashdot? This seems like an article that codifies what everybody already knows: all of these big tech companies will lie, and gaslight, eternally. They will perpetually say: "we're looking into it." They aren't. They are using the protestations to their practices as ways to "fine-tune" their practices. This is the definition of insanity, at this point.
    • Misinformation is clickbait; clickbait is profit.

      • And to your point... "SuperCunte", "fahrbot-bot", and other commenters here are obviously trolls, bots, or anti-Western shills trying to hijack the discussion (again) and steer it toward divisive and pointless banter. Ignore the noise and discuss the merits or flaws of the article and issue.
      • Comment removed based on user account deletion
    • by skam240 ( 789197 )

      Oh good, someone complaining about why an article was posted on Slashdot. Now I can read it assured that the article truly belongs.

  • I don't use Facebook, never have. Originally, it was just that I wasn't interested in that kind of thing, not because I thought it was 'evil' or 'bad'. But, over the years, mostly what I've read about it has been negative. So this is not meant as a rhetorical question. I'm really curious, what was the attraction of Facebook in the first place? And, does anybody benefit from it? (Besides advertisers that is.)

    • Same here, never had it. I see people staring at their phones all the time scrolling through that pig trough and wonder what the hell is so compelling. Hell, I don't understand how people can stare at such a small screen for so long. I've never had a "smart" phone either so I guess I'm missing that compelling reason as well.
    • what was the attraction of Facebook in the first place

      Market share.

      It is essentially one-stop shopping: follow your favorite celeb, make contact with your high school crush, and keep up with family and friends.

      Hell, several businesses have it as their main portal on the web if they can't be bothered to maintain a website.

      Anyone could use any of the smaller alternatives, but as they don't have the reach, they are of limited value.

      And as it is one-stop shopping, mounting an effective competitor is a Herculean effort.

    • Dedicated groups. With forums dead or dying, there are two large generic community platforms left: Discord and Facebook.
      Everybody and their dog moves to one of those two platforms, and I admire those who still resist doing so.
      And yes, I benefit from both, from a knowledge point of view. It takes some skill and time spent to sift through the shit, but gems are there, and they helped me numerous times.

      • I've found these old fashioned things called 'books' tend to be full of useful, on-topic, coherently & cohesively organised, factually verified information. No need to virtually wade & sift through pages of idiotic brain farts to find the few 'gems' that may or may not be there. Books are far cheaper, in terms of my time, & more of a pleasure to read.
        • I've found these old fashioned things called 'books' tend to be full of useful, on-topic, coherently & cohesively organised, factually verified information.
          But not with information that people actually need right now.

          E.g. I can't wait till someone writes a book about current Thailand Immigration laws and procedures regarding "Thailand Pass" and Covid.

          Perhaps a historian will write a short paragraph about this in 30 years. And still then: I would need to have a way to find the book. Being in a face book gro

        • Ah, yes, because those "books" (which have not yet been written, and likely never will) would certainly answer ALL my questions, such as:
          "Has anyone beta-tested the Phaetus Rapido hotend, and if so, what is your opinion on how it behaves on a CoreXY printer?"

          Don't be obtuse.
          I refer to books when they can give me an answer, and I refer to other information means when books don't help.

    • by nagora ( 177841 )

      I don't use Facebook, never have. Originally, it was just that I wasn't interested in that kind of thing, not because I thought it was 'evil' or 'bad'. But, over the years, mostly what I've read about it has been negative. So this is not meant as a rhetorical question. I'm really curious, what was the attraction of Facebook in the first place? And, does anybody benefit from it? (Besides advertisers that is.)

      Facebook is fundamentally an instant website CMS. Anyone can have what to 99.99% of people is a website up and running for their business, club, political party, family newsletter, hobby or racist rantings about how the Jews are taking over something/everything in basically no time at all.

      The vast majority of people on FB use it for totally uninteresting (to other people) things and never get involved in the stuff that hits the headlines.

      • Not a good argument because you then have to endlessly moderate & curate the pages to save them from being overwhelmed by bots & nutjobs. Set up a WordPress instance once, turn off commenting, update or add to it as desired, & get on with your actual job/business. Additional bonus, visitors don't need a Facebook account to view your pages & it's open to search engines.
        • by nagora ( 177841 )

          Not a good argument because you then have to endlessly moderate & curate the pages to save them from being overwhelmed by bots & nutjobs. Set up a WordPress instance once, turn off commenting, update or add to it as desired, & get on with your actual job/business. Additional bonus, visitors don't need a Facebook account to view your pages & it's open to search engines.

          I'm not really saying it's a great argument, but it's the perception among most people. Want a website? Sign up to Facebook. Tell your friends; who cares about search engines?

        • And why would I do all this when I can simply use Facebook? Facepalm.

          Where would I host my WordPress page? You see, it already starts with the simplest things.

          As soon as my WordPress site is up, sooner or later someone will mention it on Facebook anyway. So why not just start there?

    • I'm really curious, what was the attraction of Facebook in the first place?

      A place to keep in touch with people you know.

      And, does anybody benefit from it?

      Can't speak for anyone else, but I benefit slightly from the alleged design purpose and also from groups which put me in contact with other people doing things I'm doing. For example, RV conversion, or repair of specific models of automobile.

    • I had it for about a year at one point as it was the easiest way to keep track of a band I followed. I literally did nothing else with it. But based on how miserable the people I know are who spend countless hours on it, I've never seen any real benefit to it. I deleted my account when that band folded up shop and haven't been back.

      Granted, I know my shadow profile still exists and my "delete" was really just a flag set in the database. My understanding is even if you delete your account, you can still

    • I'm really curious, what was the attraction of Facebook in the first place?
      1) Most places in Asia had no internet till roughly the year 2000.
      When the internet popped up, smartphones popped up, Facebook as an app popped up, *AND* no one already had email, because no one knew anything about the "standard internet." So Facebook became the ersatz for email; people would not ask you for your email address or give you theirs. They would ask, "What is your Facebook?" Plenty of people would have more than one Facebook name.

  • I think you mean 'Meta'. i.e. Meta-information.

  • by bb_matt ( 5705262 ) on Monday November 22, 2021 @12:45AM (#62009215)

    I'm not for one minute suggesting the practices of Facebook or Google when it comes to the models/algorithms employed are blameless, but it's useful to step back in history here.

    In the case of Myanmar, a quick look at Wikipedia will demonstrate a problem that has existed for a long time before Facebook came along: https://en.wikipedia.org/wiki/... [wikipedia.org]

    Sure, Facebook makes it ridiculously easy to spread misinformation, but it's not like they are the first platform to assist or rather facilitate the spread.
    We can look to printed media as well - a news stand on every corner, big bold headlines, accessible to anyone who can read and spreadable by anyone who can communicate. Then there's radio - reaching into every home, onto every street, via dirt cheap transistor radio sets.

    Like I said, I'm not saying big tech, specifically Facebook, are in any way blameless here.

    The question is, how much of an impact are they really having over what would likely have happened anyway, through other forms of misinformation?

    Take a look at Nazi Germany in the 1930s - and how easy it was for the powers that be to spread a message of hatred, via print, via radio, word of mouth, through fear.

    Facebook etc. can absolutely amplify that voice, and there is the very different monetary "value" associated with that amplification.
    They are an amplification of what already exists in terms of spreading misinformation - e.g. print and radio.
    You could argue the same forces employing these techniques are simply switching to another way of spreading what they want people to believe - misinformation is as old as conflict itself.

    It's an interesting and alarming trend, but I do think it needs far deeper analysis before jumping to conclusions like "Yeah, Facebook caused that conflict."
    You can also say "Yeah, that newspaper, The Daily Rag, caused that conflict" or "Radio Dingbat, 702FM, caused that conflict."

    • You're a moron. Just because Burma had civil wars doesn't mean that Facebook didn't incite riots and homicidal violence recently.

      If you want to learn specifically what they incited and when, you can read about it:

      https://www.goodreads.com/book... [goodreads.com]

      • by jabuzz ( 182671 )

        The question is would it have happened without Facebook? I would point to the breakup of Yugoslavia and the ensuing violence and genocide that took place. There was no internet in Yugoslavia in 1991 let alone Facebook.
        I would also point to Rwanda in 1994. Nobody needed Facebook or any other sort of social media to provoke something far far worse than anything that has taken place in Myanmar.
        Given the long history of issues in Myanmar then I think it would be a reasonable conclusion that while Facebook facil

        • The question is would it have happened without Facebook?

          Probably. But we should punish those who get themselves involved rather than playing mind games about alternate time lines. Facebook's behavior is not one we should excuse so easily, let alone encourage from other tech companies.

          The best possible policy is that these American companies not get involved. Responsible social media shouldn't broadcast false and inflammatory material to algorithmically-determined groups that eat that shit up. Facebook, on the other hand, doesn't feel any need to behave responsibly.

    • by AmiMoJo ( 196126 )

      The difference between Facebook and 1930s Nazi Germany is that Facebook makes that power available to anyone.

      In the 30s it cost a lot of money to print a newspaper or operate a radio station. You had to be very wealthy or in a position of power, like the government, to be heard. On Facebook anyone can build up an audience for free.

      What used to be the fringe and largely ignored because, despite it all, the publishers and broadcasters had some minimal standards, now gains traction on Facebook.

    • by XXongo ( 3986865 )

      I'm not for one minute suggesting the practices of Facebook or Google when it comes to the models/algorithms employed are blameless, but it's useful to step back in history here. In the case of Myanmar, a quick look at Wikipedia will demonstrate a problem that has existed for a long time before Facebook came along: https://en.wikipedia.org/wiki/... [wikipedia.org]

      So, you just said that a simple google search would have told them that Myanmar was flammable, and that they decided that pouring gasoline on the situation and then tossing in lit matches would be lucrative, because making the situation worse would generate outrage and ad revenue.

      Sure, Facebook makes it ridiculously easy to spread misinformation, but it's not like they are the first platform to assist or rather facilitate the spread.

      You didn't read the article. This isn't an article about "making it ridiculously easy to spread misinformation." This is an article about paying people to generate misinformation because misinformation generates ad revenue.

  • by I am Jack's username ( 528712 ) on Monday November 22, 2021 @03:15AM (#62009433)

    "According to the latest CNN/USA Today/Gallup Poll, conducted Oct. 3-6, 2002, 53% of Americans say they favor invading Iraq with U.S. ground troops in an attempt to remove Saddam Hussein from power." https://news.gallup.com/poll/6... [gallup.com]

    The likes of the New York Times supported the war criminals Rumsfeld, Cheney, W., Powell, and the voters in the USA, UK, Australia, Poland, Netherlands, Italy, and Spain in invading Iraq and killing more than a million civilians in a war of aggression so obviously based on lies that it resulted in the largest anti-war protests in history: in 600 cities on 15 February 2003 - before the war even started https://en.wikipedia.org/wiki/... [wikipedia.org]

    • It's interesting that in your revisionist re-telling of the run-up to the Iraq war you do not include Fox News as playing a role in manipulating public opinion, nor do you mention the Republican party's ownership of initiating the Iraq and Afghanistan wars.
  • Cable news channels (Score:5, Informative)

    by cowdung ( 702933 ) on Monday November 22, 2021 @03:17AM (#62009437)

    Cable news channels long ago shifted "news" into a perverse "entertainment" medium.

    What used to be the confines of "shock jocks" and sleazy talk show hosts is now mainstream. Every night we have people on "news" channels that fill the airwaves with their opinions about things. News has taken a back seat.

    News used to be boring. News should be boring.

    Today what passes for news is simply designed to piss you off. By getting your emotional response they get ratings.

    And it doesn't matter what side of the political spectrum you're on. Both sides do it.

    So I always find it weird that those same "news" sources now somehow blame social media for "promoting" the stuff that they create and promote.

    Maybe they don't like the role social media has taken from them.

    • News used to be boring. News should be boring. Today what passes for news is simply designed to piss you off.

      This. Look at the news coverage of the Kyle Rittenhouse trial. Responsible journalists would have restricted themselves to fact, and to factual reporting about the trial itself. None of the MSM did that. Instead, they deliberately reported falsehoods designed to rile people up. Illegal weapon. Crossed state lines. And many other "facts" that they knew to be false, but that served the purpose of generating clicks and outrage.

      it doesn't matter what side of the political spectrum you're on. Both sides do it.

      Also absolutely true. Try to find neutral, factual journalism. It almost doesn't exi

      • "... they deliberately reported falsehoods designed to rile people up. Illegal weapon. Crossed state lines."

        It sounds salacious, yeah, but those are both facts. The prosecution tried to charge him with the first thing (a minor in possession of a prohibited weapon), until it turned out that the weapon was classified as a long gun by state law, and so they had to drop the charge. He did cross state lines, but it's not illegal to cross state lines. The BBC reported the same thing. The news is not lying in r
    • Re: (Score:2, Insightful)

      by drinkypoo ( 153816 )

      Maybe they don't like the role social media has taken from them.

      Taken from them? They abdicated. Now the print media is crying about subscribership and the TV media (except for Faux news, which makes its money telling conservacucks what they want to hear) is crying about viewership. But they stopped doing news and started doing entertainment and I don't want that, so why would I reward them for it?

    • "Cable news channels long ago shifted "news" into a perverse "entertainment" medium." - Thank you! I'm old enough to remember Walter Cronkite delivering the news. Yes, boring, but on point, informative, and professional.
  • by memory_register ( 6248354 ) on Monday November 22, 2021 @07:12AM (#62009725)
    The only winning move is not to play.
  • The summary uses lots of inflammatory text, but ultimately makes no sense. So I looked to the article for understanding.

    But there's a crucial piece missing from the story. Facebook isn't just amplifying misinformation.
    The company is also funding it. [technologyreview.com]

    The linked Technology Review article says this:
    1) The publishers of the misinformation run web sites. Those web sites serve ads. Ergo, advertisers bankroll the misinformation sites. The article does not blame Facebook in this case, and does not claim that Facebook is bankrolling the site.
    2) But some web sites are small and slow, so Facebook offers to host the content. Now the ads are

    • by unimind ( 743130 )
      Sure. I'm sure everyone knows that FB makes money off of ads. But when that revenue system gets hijacked by bad actors to propagate fake news, and even pays them to do it, and FB does virtually nothing to stop it, there's a massive problem, and FB bears substantial responsibility for facilitating it. I'm pretty sure that's the point of the article.
      • by MobyDisk ( 75490 )

        I'm pretty sure that's the point of the article.

        But that isn't what the article is talking about. The article says that everything you just described is okay, but only if the web site is hosted somewhere else. And as soon as the IP address hosting the site is inside Facebook, suddenly it's a problem.

    • The summary uses lots of inflammatory text, but ultimately makes no sense. So I looked to the article for understanding.

      But there's a crucial piece missing from the story. Facebook isn't just amplifying misinformation. The company is also funding it. [technologyreview.com]

      The linked Technology Review article says this: 1) The publishers of the misinformation run web sites. Those web sites serve ads. Ergo, advertisers bankroll the misinformation sites.

      Correct. Facebook makes money from the advertisers, and pays the people creating misinformation to create more misinformation and post it to Facebook, where they can make money from it.

      The article does not blame Facebook in this case, and does not claim that Facebook is bankrolling the site. 2) But some web sites are small and slow, so Facebook offers to host the content. Now the ads are served by Facebook, and Facebook does a revenue-sharing agreement with the misinformation publisher: the publisher gets 30% of the ad revenue, and Facebook gets 60%. Ergo, Facebook is bankrolling the misinformation sites.

      Yep! You said it: Facebook is bankrolling the misinformation sites. And in so doing, creating many more such sites, since they pay money that is huge compared to the average wage in the country.

      The rest of your post simply says well, you think it's fine that Facebook (and Google) does that. OK, your opinion is noted.

      • by MobyDisk ( 75490 )

        Facebook is bankrolling the misinformation sites.

        No, it's the opposite of bankrolling the sites - Facebook is *charging* the sites. The article explains that sites make *less* money under this scheme. Facebook is taking money from them, not paying them. Facebook is essentially charging those companies to be a caching provider. The article makes it seem like caching data is what makes this evil, but it is entirely okay so long as there is no caching going on.

        The article's headline is just there to capitalize on Facebook hate. I hate Facebook too, but

    • by MobyDisk ( 75490 )

      There still seems to be confusion on this, so let me restate this a bit differently. Facebook offers the ability to cache the site content, and in exchange takes a piece of the ad revenue. Offering caching services isn't evil. There are lots of things to hate about Facebook, but their caching server isn't one of them.
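
    A back-of-the-envelope comparison may help reconcile the two readings in this thread. The 30% publisher share is the split quoted upthread; the CPMs and traffic figures are invented purely for illustration.

      # Hypothetical comparison of self-hosted ads vs. an Instant Articles-style
      # revenue share. Only the 30% publisher share comes from this thread;
      # every other number is made up.

      def self_hosted_revenue(pageviews, cpm):
          """Publisher serves its own ads and keeps all the revenue."""
          return pageviews / 1000 * cpm

      def instant_articles_revenue(pageviews, cpm, publisher_share=0.30):
          """Facebook serves the ads in-app and pays the publisher a share."""
          return pageviews / 1000 * cpm * publisher_share

      views = 1_000_000
      print("self-hosted:                 ", self_hosted_revenue(views, cpm=1.50))        # 1500.0
      print("instant articles, same reach:", instant_articles_revenue(views, cpm=2.00))   # 600.0
      print("instant articles, 5x reach:  ", instant_articles_revenue(views * 5, cpm=2.00))  # 3000.0

    With equal traffic the publisher earns less per view inside Instant Articles (the "Facebook is charging them" reading), but if in-app distribution multiplies reach enough, the absolute payout, paid directly by Facebook, is what funds the farms (the article's reading).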

  • while (profitable) {
        c = selectContent();
        if (user.pantiesInABunchOver(c)) {
          user.giveThemMoreOfTheSame(c);
        } else {
          c = selectDiffContentRandomly();
        }
    }
