Facebook Social Networks The Internet News

Facebook Data Reveal the Devastating Real-World Harms Caused By the Spread of Misinformation (theconversation.com) 168

An anonymous reader quotes a report from The Conversation: Twenty-one years after Facebook's launch, Australia's top 25 news outlets now have a combined 27.6 million followers on the platform. They rely on Facebook's reach more than ever, posting far more stories there than in the past. With access to Meta's Content Library (Meta is the owner of Facebook), our big data study analysed more than three million posts from 25 Australian news publishers. We wanted to understand how content is distributed, how audiences engage with news topics, and the nature of misinformation spread. The study enabled us to track de-identified Facebook comments and take a closer look at examples of how misinformation spreads. These included cases about election integrity, the environment (floods) and health misinformation such as hydroxychloroquine promotion during the COVID pandemic. The data reveal misinformation's real-world impact: it isn't just a digital issue, it's linked to poor health outcomes, falling public trust, and significant societal harm. [...]

Our study has lessons for public figures and institutions. They, especially politicians, must lead in curbing misinformation, as their misleading statements are quickly amplified by the public. Social media and mainstream media also play an important role in limiting the circulation of misinformation. As Australians increasingly rely on social media for news, mainstream media can provide credible information and counter misinformation through their online story posts. Digital platforms can also curb algorithmic spread and remove dangerous content that leads to real-world harms. The study offers evidence of a change over time in audiences' news consumption patterns. Whether this is due to news avoidance or changes in algorithmic promotion is unclear. But it is clear that from 2016 to 2024, online audiences increasingly engaged with arts, lifestyle and celebrity news over politics, leading media outlets to prioritize posting stories that entertain rather than inform. This shift may pose a challenge to mitigating misinformation with hard news facts. Finally, the study shows that fact-checking, while valuable, is not a silver bullet. Combating misinformation requires a multi-pronged approach, including counter-messaging by trusted civic leaders, media and digital literacy campaigns, and public restraint in sharing unverified content.


Comments Filter:
  • What 'trusted civic leaders'?

    And that, ladies and gentlemen, boys and girls, is the problem; we don't believe what our leaders say.

    It was fascinating at my church when Covid started: the highly respected medical student was asked the questions which were being posed on the media. He came up with the same answers; obviously his answers were more acceptable than what the talking heads on the TV were saying.

    And then, later, we discovered that the WHO had been spreading misinformation about aerial transmission.

    • by DesScorp ( 410532 ) on Thursday September 25, 2025 @10:37AM (#65682648) Journal

      What 'trusted civic leaders'?

      The ones that the writers of the piece liked.

There's a continuing narrative that we hate journalists because of "malign influence" or some bullshit. That people are rubes, easily manipulated, and if political leaders would just take charge and prevent the "wrong" media, people would love journalists, politicians, etc. But people came by their distrust honestly. For much of my adult life, journalists have talked down to their audience (when it was a mass audience, anyway, before it fractured into pieces). I still remember Peter Jennings, in 1994, sourly lecturing viewers on how they voted the wrong way. The tone was very much "Just what do you people think you're doing, anyway?"

      Trust in these institutions is gone, and probably will never return in my lifetime. And it's entirely the fault of the people in those institutions. No one else.

      • by AmiMoJo ( 196126 )

        They mean the people that were traditionally trusted. Political leaders, religious leaders, celebrity scientists, newsreaders, that sort of thing.

        Yes they did lie or make mistakes, and yes not everyone trusted them. But many people would watch the evening news and if their country's leader came on and told them something, or a famous doctor said something, many of them would trust it.

Now we have the other extreme where much of the population automatically assumes that anything people like that say is a lie a

      • A European politician was quoted recently as saying: 'We know what we need to do. What we don't know is how to do it and still get re-elected.'

For decades politicians have got elected by telling lies. Because the range of politicians being elected was acceptable to the elites, there wasn't a problem. Then Trump comes along and mobilises - with lies - the population to demand change that the elites aren't comfortable with, and they've got all sulky; Clinton's 'deplorables' comment captures the real attitude.

    • by AmiMoJo ( 196126 )

      And then, later, we discovered that the WHO had been spreading misinformation about aerial transmission...

      That's an example of COVID misinformation.

      The WHO used the advice available to it from experts at the time. The WHO is a very conservative institution, in terms of only giving advice when there is a solid body of evidence to back it up, and wide consensus.

      So they started by giving the standard advice for respiratory diseases that had proven to be true and effective before. And it wasn't exactly untrue for COVID, it just got refined as understanding of the disease improved. Surgical masks did work to reduce

      • https://www.independent.co.uk/... [independent.co.uk]

        and

        https://www.scientificamerican... [scientificamerican.com]

        here's an attempt to explain the mistake

        https://www.sciencedaily.com/r... [sciencedaily.com]

        • by AmiMoJo ( 196126 )

A mistake is not a lie. And your first link has a quote where she says "But at the same time, we were not forcefully saying: 'This is an airborne virus.' I regret that we didn't do this much, much earlier."

          • True, of course, but not relevant to this debate as the issue is about TRUST and the WHO's mistake means that it has, like a politician who deliberately lies, shown itself to be untrustworthy. THIS IS A DISASTER, of course, as it allows the deluded to ignore what the WHO says.

            The 'logic' appears to be:

            I don't want to believe this inconvenient truth
            A politician agrees with me
            The media point to 'trusted organisations' who tell me it's true
            The politician points to when the 'trusted organisations' got it wrong
            I

            • by AmiMoJo ( 196126 )

              It's only a disaster if you think that making a minor and largely inconsequential mistake in a high pressure situation where the supply of solid information is limited makes them "untrustworthy". Given they have not made mistakes with 99% of the advice they give, you would be pretty stupid to deliberately do the opposite of everything they suggest.

  • Facebook reporting on the danger of misinformation is like cancer reporting on the threat from cancer...

  • Gatekeeping (Score:5, Insightful)

    by RobinH ( 124750 ) on Thursday September 25, 2025 @10:18AM (#65682608) Homepage

    What we're talking about here is gatekeeping. Back a few decades ago, gatekeeping was the standard way things were done in the media simply because the technology created the gates: there were only so many channels, and setting up a channel was expensive. The FCC *did* regulate what you could say on broadcast TV (the 7 dirty words) and even instituted a "fairness doctrine" that forced channels to give equal airtime to both sides. Coupled with strong journalism ethics, only stories that were vetted by fact checking could even make it on the news, and you would hear the opposing point of view simply by watching the same channel.

    There were pros and cons of this system. It did prevent the spread of misinformation, but it also suppressed minority viewpoints. Flat earthers simply didn't have a platform back then, and were just the butt of jokes, but the same could be said of gay people.

When the internet first arrived, it was still full of gatekeeping. Only people who were technologically savvy could set up a website or even a blog, so the content online during the web 1.0 phase tended to be more enlightened than what you see now. I was there. There was a lot of optimism in the '90s that worldwide access to the internet was going to lift humanity out of ignorance.

    Web 2.0, or social media as we call it now, destroyed that dream by handing everyone a megaphone. Yes, it destroyed all the gates and the fences along with it. It gave access to groups who didn't have access before (a good example is the Arab Spring protests). But it flooded us all with clickbait and content designed to make us angry, suspend rational thought, and share it. It also didn't take long for governments to realize the opportunity and flood social media with one-sided narratives they wanted shared widely, which traditional media never would have published.

    Social media has a fundamental problem: it spreads misinformation much more easily than evidence-based facts.

    Gatekeeping has a fundamental problem: we don't trust the gatekeepers. But it drastically reduces misinformation spread.

But it's not like we can put the genie back in the bottle. We've democratized content generation. Anyone can make a YouTube or Facebook video from their phone, and it can spread far and wide long before it can be fact checked. And once it's fact checked, the truth doesn't have the sharing power of the original narrative.

    Would introducing gatekeepers to social media fix the problem? I don't think so. It's a problem of scale. There's just too much content to fact check. And the only ones who could do it at scale are the social media platforms themselves and only with automated solutions, which aren't going to be very accurate. Worse, we don't want to give social media companies the power to be the arbiters of truth. At least under the old system, it was small enough that the arbiters of truth were a profession of journalists with codes of ethics, and journalism awards, and a reasonably independent news room that was separate from the editorial office.

    So we're stuck with this. The future is a no holds barred competition of opinion manipulation. The tools that support this reality (the internet and web 2.0) are now augmented by even more powerful mass content generation in the form of generative AI. The worst part is, those of us who even want to fix it are in the minority. It's just too powerful of a tool, and both sides find it too valuable to give up.

    And it doesn't really matter if you work hard to be a critical thinker, and you're suspicious of all new information. Even if you do, you're in the minority, and you're at the mercy of a majority that's happy to swallow whatever new narrative fires them up. So all you can do is sit quietly, hoping the eye of Sauron doesn't suddenly turn on you, or whatever group you happen to identify with.

    • by AmiMoJo ( 196126 )

      It was worse. Gay people didn't just lack a voice, they were actively demonized by the TV media of the day.

      I don't think it's even a lack of trust in gatekeepers exactly, it's more that people have just decided they can believe their preferred version of reality is true. That gives them great licence to do whatever they like, because it can never be stupid or outright evil in their imaginary world.

      • Sure, but in a world where you can't trust any source, you have to rely on your own perceptions and experience, to determine what makes sense.

        I was a technology analyst for a small securities firm for a few years. I met with small tech businesses looking for financing. It was interesting work, but I wasn't battle hardened yet. I heard a lot of big talk. One day I asked my boss, the guy who takes you into a public stock market listing, "how do you know what makes sense?"

        He said, "if it doesn't make sense to
    • The future is this: there are still vetted, high quality news sources, and there always will be, but people need to pay a subscription fee to get access. I'm talking about stuff like the WSJ or the Economist.

The Venn diagram between "public news source" and "high quality, vetted, transparent and largely unbiased" is approaching zero. NPR and BBC are still arguably good news sources, although conservatives will point out that they lean decidedly left while claiming to be center. I wouldn't bet much on th
      • by RobinH ( 124750 )
        I've been using Straight Arrow News recently, which is free, and the entire point of it is to try to create unbiased news content. So far I like it. But how much does it matter if a few of us seek out unbiased news, if the vast majority of people are flooded with clickbait content that tells them whatever they want to hear, with no fact checking?
      • I like the BBC a lot. Most of their articles that aren't in the culture section are pretty good about just giving you the facts without injecting all the bullshit. Once you get over to the cultural section of the BBC, it's definitely left leaning.

        I have not really looked into NPR so much as every time the algo points me to their site, they want me to "listen" to an audio file. I'd much prefer to read my news. Since I haven't really explored their website, maybe I'm just missing the transcripts or the writte

    • Social media has a fundamental problem: it spreads misinformation much more easily than evidence-based facts.

      Gatekeeping has a fundamental problem: we don't trust the gatekeepers. But it drastically reduces misinformation spread.

      The only nit to pick here is that while gatekeepers might reduce misinformation spread, it is more about who's paying the gatekeepers and how misinformation is defined. It's more accurate to say that gatekeepers drastically reduce misinformation spread based on how misinformation is defined for them by their bosses, whether those bosses be a government entity, an interested corporation, or some other special interest group.

    • Web 2.0, or social media as we call it now, destroyed that dream by handing everyone a megaphone.

      But it did not do that. Yes, anyone can post, but not anyone can have their content seen. If it doesn't meet the approval of the platform's owners, statistically nobody sees it.

      • by RobinH ( 124750 )
        All you're saying is that Meta and Alphabet are already the gatekeepers. I would agree, and there was an admission that this was going on with Twitter, but I would still argue that while it's fairly easy for them to promote or suppress certain narratives, it's nearly impossible for them to moderate everything, so misinformation they just don't care about (ivermectin, etc.) just gets a pass.
    • They may very well put the genie back in the bottle when they eventually kill 230. Then, user generated content will die overnight because the liability for that content will make it unprofitable.

Sure, the dark web exists and you could always hang out there, but the public Internet will eventually be locked down and for commercial and government use only. I don't like it but I'd be shocked if it doesn't eventually happen.

  • by v1 ( 525388 ) on Thursday September 25, 2025 @10:50AM (#65682682) Homepage Journal

    The classic counter-argument is you can't falsely cry Fire! in a crowded theatre. It's not freedom of speech, it's intentionally causing public harm. (whether or not you benefit from the outcome, but that just makes it all the more deplorable if you do)

    Politicians have always been keen to intertwine "freedom of speech" with "freedom from consequences". They're afraid of laws that could prevent them from lying when they want to, which for a politician, can be quite often. So they'll wrap Freedom from Consequences in a Freedom of Speech blanket to gather support and protect their interests.

    Unfortunately, lately we've had a larger than average number of powerful politicians shoveling misinformation like it's snow in Chicago, while attacking the press for calling them out on their lies. They don't want to put controls on misinformation because they're benefiting from it. And part of that is keeping their voter base entertained, so it's a problem of positive feedback. More misinformation gets them more votes, and more votes keeps them in office, to block laws and spread more misinformation.

It's gotten to the point where the politicians don't care if they're caught red-handed in lies. They just complain about the press and public being 'woke' and how unfair it is that they're getting 'canceled'. Somehow it ends up getting them more support instead of less.

    I'd love to be a part of fixing this, but I just don't see how to break the cycle. I've got a vote but it's not helping, and it's deeply frustrating!

Freedom of Speech, at least in the way it's enshrined in the US Constitution, merely states that the government cannot punish you for your words. You won't be imprisoned for your speech.

      Corporations and individual people don't have to abide by that restriction placed on the government. We have the right to associate with who we want and if someone is busy spouting a bunch of stuff we disagree with, we don't have to associate with them. That's why your work can fire you if you are constantly expressing sp

  • If you define misinformation as not paying attention to the mainstream media, then yes, it increased a lot. However, what really happens is that people tune out of mainstream media and listen to all kinds of news sources, some of which are factually wrong, and then make up their own mind. This is demonstrably superior to single sourcing your views to a mainstream media outlet, but it is also more effort.
When people get their "News" from Rachel Maddow or Tucker Carlson, they aren't getting news, they are getting political opinions about current events. Huge difference, but so many people don't seem to grasp that anymore. Instead of news, we have entertainment that caters to different ideologies.

  • "They, especially politicians, must lead in curbing misinformation..."

    Good luck with that.

It's not surprising that when there are a lot of misinformed claims (or worse, self-serving lies) circulating around, some people will end up misinformed and because of that, make bad decisions that lead to bad consequences. It's worthwhile to try to correlate the misinformed claims and the bad consequences, so that when trying to figure out what to do about misinformed claims (or self-serving lies), the "stakes" will be able to be kept in mind: i.e. these things lead to serious real-world consequences. This
Gee, haven't you fascists swallowed enough horse dewormer?

    And then there's censorship... such as Sinclair REFUSING TO SHOW Jimmy Kimmel. You don't want to call that censorship?

  • People are stupid.

    More at 11.

  • The proper counter to misinformation is true information, along with the tools to tell them apart objectively (experiments I can perform to tell which is true and which is false, without depending on any authority figures).

    Censorship is not the solution, censorship is the problem. I have to trust not only the censor's motive, but also the censor's competence. I can't objectively verify either of them.
