Facebook Exec Blames Society for COVID Misinformation (axios.com) 149

Longtime Facebook veteran Andrew Bosworth insists that political and COVID-19 misinformation are societal problems rather than issues that have been magnified by social networks. From a report: Critics say Facebook and other social networks have played a significant role in vaccine hesitancy and the spread of political misinformation. "Individual humans are the ones who choose to believe or not believe a thing. They are the ones who choose to share or not share a thing," Bosworth said in an interview with "Axios on HBO." "I don't feel comfortable at all saying they don't have a voice because I don't like what they said." Bosworth has been leading Facebook's hardware efforts, including those in virtual and augmented reality. Next year he will become CTO for Meta, Facebook's parent company. Asked whether vaccine hesitancy would be the same with or without social media, Bosworth defended Facebook's role in combatting COVID, noting that the company ran one of the largest information campaigns in the world to spread authoritative information.

  • by ITRambo ( 1467509 ) on Monday December 13, 2021 @10:41AM (#62075381)
    Whatever post trend brings in the most ad money gets Facebook's support. They even recognized that some posts were foreign government sockpuppet missives and did nothing, since their objective is to gather personal data and "give" it to advertisers (partners) that spend enough money to obtain the data. They're slimy and lie continuously about their part in dividing the world's peoples.
    • Exactly . . . just because a door isn't locked, doesn't mean you should go in, take everything and sell it off.

      Billion dollar companies get to do just about anything they want . . . they CAN'T however shift blame for the way they made their billions and expect us to swallow it.
    • Exactly and this is why Social Media is a cancer that needs to be heavily regulated, especially micro-targeting and keywording.

      I generally dislike regulation, but Social Media companies have shown a wanton disregard for both facts AND human decency/rights... as long as they earn a profit f**king over society.
  • In other news,

    Society Execs Blame Facebook for COVID Misinformation

  • Same old same old (Score:4, Insightful)

    by Zontar The Mindless ( 9002 ) <plasticfish@info.gmail@com> on Monday December 13, 2021 @10:43AM (#62075387) Homepage

    $executive at $company blames $problem on everyone other than $company.

    Pictures at 11.

  • Yes, but also yes (Score:5, Interesting)

    by aj50 ( 789101 ) on Monday December 13, 2021 @10:45AM (#62075399)

    Sure, I can buy that misinformation is a societal problem. People like to repeat what they want to believe.

    However, social networks would appear to have massively magnified the problem. Firstly by optimising for engagement which leads to promoting controversial content and secondly by normalising the "sharing" of a post to broadcast it to all of your contacts.
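
    A toy sketch of the dynamic being described here, assuming a feed that scores posts purely on predicted engagement (the field names and weights below are invented for illustration, not Facebook's actual ranking):

    # Hypothetical sketch: a feed ranked purely on predicted engagement.
    # Field names and weights are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        likes: int
        shares: int
        comments: int
        outrage: float  # 0.0-1.0, how inflammatory the post reads

    def predicted_engagement(post: Post) -> float:
        # Shares and comments weigh more than likes; outrage multiplies reach.
        base = post.likes + 3 * post.comments + 5 * post.shares
        return base * (1.0 + post.outrage)

    def rank_feed(posts: list[Post]) -> list[Post]:
        # Accuracy never enters the objective; only engagement does.
        return sorted(posts, key=predicted_engagement, reverse=True)

    feed = rank_feed([
        Post("Calm, accurate explainer", likes=120, shares=4, comments=10, outrage=0.1),
        Post("Outrageous false claim", likes=80, shares=40, comments=90, outrage=0.9),
    ])
    print([p.text for p in feed])  # the false claim takes the top slot

    Under an objective like that, the controversial post wins the top slot even though it gets fewer likes, which is the "promoting controversial content" part in a nutshell.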

    • by evanh ( 627108 )

      The dude is a top Facebook exec - Of course he's lying. That's the job description - To deflect from the truth, even for regular operations. You lie so much you believe your own bullshit. Which is also why regulations are a never-ending evolution.

    • We've had wackos who believe in all sorts of crazy crap since probably before we became Homo sapiens.

      Progress in humankind has been about moderating and measuring all the information we get exposed to, where good ideas and truthful information get promoted and spread, while bad information and lies get relegated to a small group.

      Social media in essence gave your Crazy Uncle, whom no one really believes, his own prime-time show, in which all the other like-minded crazy uncles can gravitate and grow the n

    • Re: (Score:2, Insightful)

      by Luckyo ( 1726890 )

      CDC also amplified misinformation. Examples: vaccines offer sterilizing immunity so you should get vaccinated so you don't infect those with compromised immune systems. Sanitizing hands with hand sanitizer helps against COVID.

      Because the CDC, just like Facebook, is part of society. Did you have a point?

      • You're spreading misinformation right now. The CDC never said any of the current vaccines offered sterilizing immunity. They said they reduced, but did not eliminate, the chance of getting infected at all. Which is completely true, fuckheads like you lying about it and misleading with phrases like "doesn't prevent", which makes people think "doesn't do anything", notwithstanding. Sanitizing hands *does* help with covid, just not as much, since fomite-based transmission is *less common* but not zero. It's not mis
        • by Luckyo ( 1726890 )

          The claim, you lying scumbag, was that once vaccines are in, infections will be curbed and people will be able to continue normal lives.

          I.e. sterilizing immunity was promised. Now, the propaganda spreaders such as yourself will try to twist words until their meaning is whatever they want them to be, and pretend that either this claim wasn't made, or that this claim just didn't mean sterilizing immunity, it meant something that is equivalent to it of course, but it wasn't sterilizing immunity. So I'm totally

      • You either don't understand what "misinformation" means, or you are trolling. Given your history here, I'm going with the latter.

        KGFY, HAND.

        • by Luckyo ( 1726890 )

          I know that you think that "misinformation" is the information you don't like, regardless of merit given your history here.

          Did you have a point other than to whine about it?

  • PEOPLE kill people.

    By throwing bullets at each other.

  • by TheNameOfNick ( 7286618 ) on Monday December 13, 2021 @10:50AM (#62075423)

    Like the information coming from the inside of Facebook doesn't prove that Facebook is about one thing only: "engagement". Facebook shows people what's most likely to keep them on the site and what keeps them coming back. You need an audience to show ads to. If you keep showing impressionable people the most egregious lies to sell more ads, YOU are the fucking problem, asshole.

  • Willful ignorance (Score:4, Insightful)

    by Xylantiel ( 177496 ) on Monday December 13, 2021 @10:51AM (#62075433)
    This guy is either so insulated from how Facebook's ranking algorithms work that he is clueless, or he wants to pretend to be. Facebook knows perfectly well that its algorithms promote bad information heavily, just like everyone else does. I only have "test" Facebook accounts that don't follow anything and I still get groups promoted via email that are obviously just venues for promoting bad information. The fact is that their platform is getting abused and they have a vested interest in looking the other way because they make money off of it. If they didn't make money, okay, maybe it is someone else's fault, but they are spending the money they make off bad information to push their platform to more people - which is why the people distributing bad information use it. Then, while knowing that, they want to pretend to be the victim and that everyone should ignore their conflict of interest.
  • by fermion ( 181285 ) on Monday December 13, 2021 @10:51AM (#62075437) Homepage Journal
    After Clinton was elected, Fox News was created to promote conservative rhetoric, not all of it accurate, in response to Clinton's success at fiscal management. The network gave credence to such conspiracies as the Clinton Body Count, which among other things tormented the family of Vince Foster until SCOTUS effectively put an end to it around 2000. The network also gave credence to the 10 Benghazi investigations, which found no misconduct by the Obama administration and were coincidentally ended immediately after the 2016 US election.

    None of this, nor the pizzagate conspiracy, is the responsibility of Fox News. They merely report what is on the minds of the people. We choose to accept it as fact or not. Same thing with social media. Everyone has an opinion. We choose to accept it as fact or not. We choose to accept repetition or reputation as a proxy for truth. No one makes us. We can critically interpret all input and decide for ourselves. We choose to be like a brittle AI and create false positives by limiting the scope of our data.

    • There is a difference between giving an opposing viewpoint and spreading misinformation.

      I am going on the assertion that Fox News is operating like a news media company and not an entertainment company [foxnews.com], and that what they are reporting is true based on the facts they know at the time. So with targeted moderating you can show stories that put your side in the positive spotlight while painting your opponent as the bad guy. While this is considered propaganda, at least it is based on truth, where a savvy news consumer

      • by fermion ( 181285 )
        People believe, and they believe they have proof, that COVID is harmless and the vaccine will kill you, as much as they believe that they have proof that Hillary personally killed Vince Foster and collaborated with the enemy to lead an attack on a U.S. embassy. There is no misinformation on Facebook, or at least no more than on Fox News. The people on Facebook are desperately trying to protect us from the lies of the Biden administration and the CDC. Otherwise they will succeed in the Great Replacement.
  • Facebook's algorithms are specifically designed for manipulation and to maximize social conflict. Of course there would be side-effects of this. https://www.forbes.com/sites/k... [forbes.com]

  • FB: "It's not our fault we studied people and made effective algorithms that connect people with this they feel strongly about."

    People: "Uhm really? Isn't that just finding emotional triggers for lab rats?"

    FB: "We didn't make people believe misinformation. It's not our fault we indirectly profit from views and ad traffic to the extent we prioritise it."

    People: "Err'you sure? If a large enough group believes something some people follow blindly."

    FB: "Our platform is being exploited by nefarious indiv
  • Neither Facebook nor any other big tech company should be the arbiter of truth. Yes, sure, flat earthers are annoying, and COVID misinformation is deadly dangerous. However, it is ultimately the human sharing that information who is responsible and who should be held accountable.

    Blaming our societal issues on Facebook is a cop out.

    • by Bert64 ( 520050 )

      Blame not just the human who spreads the (mis)information, but also the one who chooses to believe it.
      Doesn't matter how loud someone shouts if no one believes them.

      Education is the real solution, but that would result in a population that is much harder for anyone (especially the mass media and government) to manipulate, so they ignore the obvious solution.

  • Three things can be true at the same time:

    Facebook has deep pockets and can be shaken down quite profitably by grandstanding politicians eager to show the proles a head on a pike.

    Facebook lets the crazies find each other and amplifies the sort of dysfunction that would fizzle if confined to a single workplace or classroom or group of IRL friends.

    The freedom of association is sacrosanct in any society that has even a hope of being healthy or getting healthy and the answer to bad speech must be good speech.

    There

  • "A person is smart. People are dumb, panicky dangerous animals and you know it." -Agent K

    Facebook has removed the reliance we have had on critical thinking, with people now just assuming that since X number of people on their timeline believe something, it must be true.

  • The claim that "political and COVID-19 misinformation are societal problems rather than issues that have been magnified by social networks" avoids the reality that it can be both. These societal 'problems' obviously have been magnified by the social networks, and, IMHO, deliberately so.

    And to claim that Andrew Bosworth, for one, might be too 'insulated' from the workings of Facebook, or that anyone else deeply involved in these social networks is somehow unaware of or 'insulated' from the real workings of these net

    • Someone who actually understands the difference between the verbs lose and loose, and can use the latter correctly in a sentence!

      I tip my hat to you, sir.

  • the points about individual responsibility, differing viewpoints and democracy are spot on

    although it does seem a bit of a stretch to say fb is motivated by these items more than they are being used as a shield for their opaque for-profit practices

    and the interviewer -- who I'm assuming calls themselves a journalist -- is so agenda-laden and biased that any claim of non-editorializing is beyond belief

    I do wish -- as an individual trying to make an informed decision -- that more transparency about how fb

  • "I don't feel comfortable at all saying they don't have a voice because I don't like what they said."

    Yep. That's it in a nutshell.

    Sorry, but freedom is uncomfortable. You have to let other people have it too.

    It won't always be you running things. Censorship that can be used for your views can also be used against your views. You have to take the bad with the good.

  • Drug & alcohol addiction is a societal problem too, and that shit's highly regulated or illegal.
  • "Individual humans are the ones who choose to believe or not believe a thing. They are the ones who choose to share or not share a thing,"

    Yes. But I'm afraid "Men in Black" was right when they talked about the difference between a sensible individual person and what becomes of that person when they are part of a larger group.

  • It's a requirement to be a Facebook executive. Suck up to Zuck's twisted version of reality or quit the company.

  • It's not their fault for accepting crap and spreading it everywhere, it's people's fault for making crap.

    Yes, blame mankind for not being 100% perfect, rather than yourself for not acknowledging the imperfection and designing the technology around the known imperfections so as to minimize the problem.

    Facebook is the Roomba blaming the incontinent dog rather than taking responsibility for what the Roomba did.

    • No, but it is their fault for promoting crap, which they don't have to do. But they make money the more angry people get, because angry people click more and they can serve up more ads, so they incentivize that behavior to make money, and that is a problem.

    • by GuB-42 ( 2483988 )

      And how do you "design the technology around the known imperfections"? Letting people communicate freely and fighting misinformation are opposite goals.
      You described the extreme "free" situation, and I like your analogy with the Roomba. But on the other side you have nothing, because the best way to fight misinformation is to not have information in the first place, like a Roomba that only goes where it is clean.

      For a good example of the other extreme, just look at a corporate environment. Nothing is authen

      • There are lots of things they could have done, but it would have cost them money. They care about engagement and nothing else. It's no different than being willing to sell anything, even drugs, kidneys, or slaves.

        One thing that Facebook could do is to group by issue rather than belief. That is, if someone cares about abortion, show them exactly as much pro-choice as pro-life. IT DOES NOT MATTER that they will hate half of what you show them.

        Another thing they could do is to avoid all politics and news
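
        A rough sketch of the "group by issue rather than belief" suggestion above, assuming posts carry issue and stance labels (the labels and the strict interleave are just one way to read it):

        # Hypothetical sketch: balance a feed per issue rather than per belief.
        # The issue/stance labels are invented for illustration.
        from itertools import chain, zip_longest

        def balanced_feed(posts, issue, stance_a, stance_b):
            side_a = [p for p in posts if p["issue"] == issue and p["stance"] == stance_a]
            side_b = [p for p in posts if p["issue"] == issue and p["stance"] == stance_b]
            # Interleave the two stances so neither side dominates the feed,
            # regardless of which one the reader would rather see.
            mixed = chain.from_iterable(zip_longest(side_a, side_b))
            return [p for p in mixed if p is not None]

        posts = [
            {"issue": "abortion", "stance": "pro-choice", "text": "..."},
            {"issue": "abortion", "stance": "pro-life", "text": "..."},
        ]
        print(balanced_feed(posts, "abortion", "pro-choice", "pro-life"))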

  • by thegarbz ( 1787294 ) on Monday December 13, 2021 @11:37AM (#62075619)

    Society has always had misinformation and stupidity. What is new however is an algorithmic billboard that puts this misinformation front and center because it is the best way to get "engagement".

    You can't blame society for a problem you specifically designed your product to amplify. You're in control of your algorithm. You have the power to shadow ban / bury misinformation on your platform. Except you do the opposite.

    For the clicks.
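
    Burying flagged content, as mentioned above, can be a small change on paper: a penalty applied on top of whatever score the ranker already produces (the "flagged" field and the 0.1 penalty here are made up):

    # Hypothetical sketch: demote posts that fact-checkers have flagged.
    # The "flagged" field and the 0.1 penalty are invented for illustration.
    def bury_flagged(ranked_posts):
        def adjusted(post):
            penalty = 0.1 if post.get("flagged") else 1.0
            return post["engagement_score"] * penalty
        return sorted(ranked_posts, key=adjusted, reverse=True)

    feed = bury_flagged([
        {"text": "Flagged rumour", "engagement_score": 900, "flagged": True},
        {"text": "Boring but true", "engagement_score": 200},
    ])
    print([p["text"] for p in feed])  # the flagged rumour drops to the bottom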

  • we just provide a gun/drug/cigarette/platform - it is always the end user!
  • "Society" or "what is posted on your network"

    Which of these two items do you have total control of?
  • I'd agree that it's FIRST a societal problem, however that doesn't exonerate social networks from their mechanisms that DO exacerbate social, political, cultural, etc divisions.

    To be clear: this is ULTIMATELY a social problem.
    Let's assume, for example, that American society in the first half of the 20th century was largely homogeneous (it wasn't, not even a bit, but the NARRATIVE wasn't about the divisions). This meme of society very clearly left out minorities of all sorts - racial, cultural, sexual, etc. - in p

    • It's a social problem in that one corporation seems to be responsible for fomenting a lot of conflict, violence, misinformation & social division. Regardless of HOW they're doing it, they need to STOP. If they can't, then it's up to law enforcement & the judiciary to make them stop. You know, public safety & all that.
      • Yeah but we haven't been talking at all about Twitter. ...Oh wait, no, I get it. You're all-aboard with the meme that somehow Facebook is the sole (or even primary) guilty party for everything that makes you sad.

        IMO this is all just an ongoing weaksauce followthrough continuing to hound FB for 'failing to understand those that were good and right discarded so-called objectivity to fight the orange tyrant emperor to the bitter end'. Even ITALY can see it: (https://www.theamericanconservative.com/dreher/ita

        • Ay ay ay, you had to go & make it into an American polarised partisan issue. The worst Facebook offences have been in other countries like Myanmar, Bangladesh, & India. We're talking mass killings & genocide, not some stupid twats in Native American costumes having a riot in a govt building. There's another 6.5 billion people in the world besides 'Muricans.
          • Honestly, that's a fair criticism. You're right, I disregarded the shit that FB has pulled in the 3rd world and I shouldn't have.

  • Facebook cannot comprehend moderation and will try to derail any attempt at pointing the finger at them, because it will hit them in the pocketbook if they do have to step up moderation or modify their money-making algorithms.

    I say regulate Facebook until it hurts.

  • I missed the bit where they told us about Andrew Bosworth's expertise in the social & political sciences. What were they again? Why should we listen to his opinion?
  • "You're holding it wrong"

  • /sarcasm Yep, it is all the fault of society. Damn humans spoiling our beautiful platform.

    Facebook deserves a break. Clearly this is a failure of the education system.
    Facebook should sue the government for compensation for the reputation damage.

    It is hard to blame society when you are a non-negligible part of it. When you bring together a lot of people, you need to organize them well. That includes making sure people behave, which goes surprisingly well for 90% of the people.

    Noobs.
  • "Individual humans are the ones who choose to believe or not believe a thing. They are the ones who choose to share or not share a thing,"

    Whose algorithm chose what 'thing' to show them, again?

"I'm a mean green mother from outer space" -- Audrey II, The Little Shop of Horrors

Working...