
WSJ: Facebook's 2018 Algorithm Change 'Rewarded Outrage'. Zuck Resisted Fixes (livemint.com)

This week the Wall Street Journal reported that a 2018 algorithm change at Facebook "rewarded outrage," according to Facebook's own internal memos. But the Journal says the memos showed "that CEO Mark Zuckerberg resisted proposed fixes," and that the memos "offer an unparalleled look at how much Facebook knows about the flaws in its platform and how it often lacks the will or the ability to address them."

In the fall of 2018, Jonah Peretti, chief executive of online publisher BuzzFeed, emailed a top official at Facebook Inc. The most divisive content that publishers produced was going viral on the platform, he said, creating an incentive to produce more of it... Mr. Peretti blamed a major overhaul Facebook had given to its News Feed algorithm earlier that year to boost "meaningful social interactions," or MSI, between friends and family, according to internal Facebook documents reviewed by The Wall Street Journal that quote the email...

Facebook's chief executive, Mark Zuckerberg, said the aim of the algorithm change was to strengthen bonds between users and to improve their well-being. Facebook would encourage people to interact more with friends and family and spend less time passively consuming professionally produced content, which research suggested was harmful to their mental health. Within the company, though, staffers warned the change was having the opposite effect, the documents show. It was making Facebook's platform an angrier place. Company researchers discovered that publishers and political parties were reorienting their posts toward outrage and sensationalism. That tactic produced high levels of comments and reactions that translated into success on Facebook. "Our approach has had unhealthy side effects on important slices of public content, such as politics and news," wrote a team of data scientists, flagging Mr. Peretti's complaints, in a memo reviewed by the Journal... They concluded that the new algorithm's heavy weighting of reshared material in its News Feed made the angry voices louder. "Misinformation, toxicity, and violent content are inordinately prevalent among reshares," researchers noted in internal memos.

Some political parties in Europe told Facebook the algorithm had made them shift their policy positions so they resonated more on the platform, according to the documents. "Many parties, including those that have shifted to the negative, worry about the long term effects on democracy," read one internal Facebook report, which didn't name specific parties...

Mr. Zuckerberg resisted some of the proposed fixes, the documents show, because he was worried they might hurt the company's other objective — making users engage more with Facebook.
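To make the mechanism the excerpt describes concrete, here is a minimal sketch of interaction-weighted feed ranking. The weights and field names are invented for illustration; Facebook never published its actual MSI formula.

```python
# Toy illustration of the dynamic the memos describe: a feed ranker that
# weights "meaningful social interactions" (comments, reshares) far above
# passive consumption. All weights and field names are invented.
from dataclasses import dataclass

@dataclass
class Post:
    likes: int
    comments: int
    reshares: int

def msi_score(post: Post) -> float:
    # Hypothetical weights: comments and reshares dominate, so posts that
    # provoke arguments and angry resharing rise to the top of the feed.
    return 1.0 * post.likes + 15.0 * post.comments + 30.0 * post.reshares

calm = Post(likes=500, comments=10, reshares=5)        # well-liked, low friction
divisive = Post(likes=100, comments=400, reshares=250) # argument bait

assert msi_score(divisive) > msi_score(calm)  # the divisive post wins the feed
```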

  • Sometimes the best way to maintain a relationship is a healthy dose of willful blindness

    • by Anonymous Coward on Saturday September 18, 2021 @11:12AM (#61808071)

Zuckerberg is one of the few people in the world who, if he died right now, would leave the world a better place. He shares this accolade with a select handful of other historical figures, like Hitler and Stalin.

      I'm sure his parents must be very proud.

I'm not so sure. There is likely a long line of people just as bad who would take his place, especially since Facebook already exists and the profitability of operating it in an evil manner has already been demonstrated.
    • What was that (the parent) supposed to be about? Care to clarify? (If so, I suggest starting by restoring and explaining your original Subject, but right now I'm dismissing it as another case of FP disease. ("The first rule of FP club is that complete thoughts are not allowed in FP club." (But I (obviously) prefer complete thoughts.)))

My Subjective question is derived from the book The Big Nine by Amy Webb. Facebook figures as one of the six American AI companies, but I can't even read the paywalled story.

      • by shanen ( 462549 )

With regard to the gutless wonder (who might wonder how the AC comment was called to my attention), the book says quite a bit of substance about the Chinese social credit system. Quite different from how MEPR (Multidimensional Earned Personal Reputation) "should" work. But the funny part is that personal profiling and evaluation systems as bad as or worse than the Chinese social credit system (or my strangest fantasies of MEPR) already exist among the six American AI leaders. So far the American abuses seem...

        • by BranMan ( 29917 )

          Hi Shanen,
Just a small point to add - I have not read Deep Thinking, but every time I've heard "agnostic" used in relation to AI technologies, it's been about the implementation: an AI is "agnostic" if you can implement it on a number of different computer architectures - that is, it doesn't depend on specific high-level instructions or specialized hardware.
          Might not be at all what you are talking about, but I just thought I'd throw it out there.

          • by shanen ( 462549 )

            In the context it was pretty clear that he meant morally neutral in the sense that the same (AI) technologies can be used for good or bad purposes. Your usage would make more sense in terms of implementing similar chess-playing or chess-analysis programs on different hardware, but I'm pretty sure he didn't use "agnostic" in that sense. He did talk about how the algorithmic progress has allowed for much more powerful chess software to be implemented without the special purpose hardware.

On the WeChat topic, I...

  • by iamnotx0r ( 7683968 ) on Saturday September 18, 2021 @10:50AM (#61807997)
    Outrage is how they make money.
    • Good thing Slashdot has none of that "outrage" [slashdot.org].

One way to increase ad views is for a forum to plant outrageous responses that get people to click and respond, each yielding a paid ad impression.

        • That, combined with thread creation, "makes a market" for blabbering hotheads to respond to and generate ad revenue.

I wouldn't be surprised if I've earned $50,000 for this website over the years, given my ability to generate outrage downmods.

Yes, the operative term is 'reward', as that defines business. The way to stop Facebook from propagating outrage is to make it unrewarding. True in life, too.
  • Hmm (Score:5, Insightful)

    by Chameleon Man ( 1304729 ) on Saturday September 18, 2021 @10:55AM (#61808009)
    Public discourse on how we should regulate social media is so misguided because not enough people understand the inner workings of these algorithms. The government needs to regulate, or straight up ban, personalization algorithms in social media. These algorithms are constantly A/B testing and going with the results that drive more engagement from the user. Monetizing outrage is exploiting a vulnerability of human nature for profit. No different than the opioid crisis.
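To illustrate the comment's point, here is a minimal sketch of an engagement-driven A/B test. The variant names and click rates are invented for illustration; nothing here reflects Facebook's actual experimentation system.

```python
# Toy version of the loop the comment describes: run an A/B test, keep
# whichever feed variant drives more engagement, repeat. Hypothetical
# variant names and click rates throughout.
import random

def simulate_engagement(variant: str, users: int) -> int:
    """Count simulated clicks; assume the outrage-heavy feed engages more."""
    click_rate = 0.30 if variant == "outrage_heavy" else 0.18
    return sum(random.random() < click_rate for _ in range(users))

def ab_test(a: str, b: str, users_per_arm: int = 10_000) -> str:
    # Raw engagement is the only decision criterion - nothing here measures
    # whether that engagement is healthy.
    if simulate_engagement(a, users_per_arm) >= simulate_engagement(b, users_per_arm):
        return a
    return b

print("Shipped variant:", ab_test("neutral_feed", "outrage_heavy"))
# Almost always prints 'outrage_heavy'.
```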
The problem with A/B testing is that there is a presumption that a "best" interface exists and can be found by sufficient poking at each of the dimensions and parameters of the interface. The contradictory reality is that "best" is a case-by-case thing, and your best interface would differ from mine, just as your thinking differs from mine.

But a dominant and profit-driven corporate cancer like Facebook (or the google (or Apple?)) sees the UI as a cost to be minimized, which means that the "best" solution...

    • Re:Hmm (Score:5, Funny)

      by null etc. ( 524767 ) on Saturday September 18, 2021 @02:53PM (#61808601)

      Public discourse on how we should regulate social media is so misguided because not enough people understand the inner workings of these algorithms.

      Public discourse on pretty much anything these days is misguided, because not enough people understand anything about anything.

    • by AmiMoJo ( 196126 )

Interestingly, that's exactly what the Chinese government is proposing to do. For example, users will be able to see exactly why the algorithm showed them something and correct it if they think it was wrong. They will also be able to turn it off and get the generic view.
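As a rough sketch of what such user-facing controls might look like in practice - all field names are hypothetical, and no real API is implied:

```python
# Hypothetical shape of an "explainable recommendation": every item carries
# human-readable reasons, the user can flag a wrong inference, and
# personalization can be switched off entirely for a generic view.
from dataclasses import dataclass

@dataclass
class Recommendation:
    item_id: str
    reasons: list[str]       # e.g. ["you follow this page", "friends reshared it"]
    contested: bool = False  # user says the inference behind it was wrong

@dataclass
class UserSettings:
    personalization_enabled: bool = True  # False = generic, non-personalized view

def explain(rec: Recommendation) -> str:
    return "Shown because: " + "; ".join(rec.reasons)

rec = Recommendation("post123", ["you follow this page", "friends reshared it"])
print(explain(rec))
settings = UserSettings(personalization_enabled=False)  # user opts out entirely
```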

    • by LKM ( 227954 )
Yeah, at this point, a ban on personal recommendation algorithms seems like the most feasible and reasonable choice. Any time you let an algorithm recommend content and reward that algorithm for engagement, you'll end up with an algorithm that optimizes for outrage, because outrage is the easiest way to get people to engage.

If we just go back to the point where people see the things they actually subscribed to (i.e. stuff their actual friends posted on Facebook), 99% of this problem will just go away.
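A minimal sketch of that subscription-only alternative, using toy types rather than any real platform's code:

```python
# No engagement ranker at all: just posts from accounts the user explicitly
# subscribed to, newest first. There is no score to game, so outrage confers
# no ranking advantage.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: float  # Unix time
    text: str

def subscribed_feed(all_posts: list[Post], subscriptions: set[str]) -> list[Post]:
    mine = [p for p in all_posts if p.author in subscriptions]
    return sorted(mine, key=lambda p: p.timestamp, reverse=True)
```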
  • by sinij ( 911942 ) on Saturday September 18, 2021 @11:01AM (#61808031)
What social media is doing to society is similar to what manufacturers dumping toxic waste into rivers did to it before the EPA stepped in.
    • by ytene ( 4376651 ) on Saturday September 18, 2021 @02:11PM (#61808523)
      To borrow and extend your analogy...

When someone dumps toxic waste into a river, that toxicity becomes visible to everyone downstream of the dump point. It's either directly visible (you can take a water sample and test it) or indirectly visible (you can observe how the plants and animals in the river are negatively impacted). What social media in general and Facebook in particular have done is add toxicity to the tap water in your home - individual people are being poisoned by a bespoke mix of poison that is unique to them. Because their poison is unique, it's harder to spot and harder to see the impact.

When someone dumps toxic waste into a river, their primary objective is to get rid of the toxic waste, which they are dumping because it is cheaper to do so. Their motive is not to addict the river to their toxicity. When social media in general and Facebook in particular track people around the web, bombard them with psychologically profiled advertisements, send them images they know will have a high chance of body-shaming them, and generate "flash floods" of likes or outrage, their motive is to permanently alter the recipients of their toxicity. They know that exposure to their toxicity will bring about permanent change in people - change for the worse. They know that they can sprinkle some dopamine-inducing feedback loops into the process and then get other users to push those buttons [think "likes" or retweets or replies to Slashdot posts] in their attempt to "addict" their users.

      Zuck is the Sackler Family of social media. He knows that what he is personally doing is f##king up literally millions of lives.

      And he won't change.

      And he doesn't care.


Ask yourself how many teenagers have committed suicide because of body-shame issues and insecurity brought on by Instagram and Facebook. Whatever you *think* the number is, it's likely to be higher. We now know that Facebook has a clear picture of this - and sits back and does nothing.

You know what the worst part of that is? They don't hold back because they think acting would be to admit liability; they hold back because their chosen course of action is the one that makes them the most money.

      The only reason they "get away with it" seems to be that they are the only party close enough to the evidence of the consequences of their actions - and they hide any proof they may identify.
      • by sinij ( 911942 ) on Saturday September 18, 2021 @04:45PM (#61808845)
Body shaming is a tiny fragment of this much bigger problem. The very fabric of our social contract is fraying. Have you noticed how divisive most issues have become - from politics, to gender identity, to even what and how children are taught in schools? This is because algorithm-driven outrage radicalizes and polarizes people. The world is on fire because the quasi-AI at Facebook figured out this is the best way to maximize profits.
        • by ytene ( 4376651 )
          Exactly.

We’re told that 80% of communication is non-verbal, which is to say that it is composed of gesture, expression, tone, posture, and so on. What seems to be less well understood is that when we lose that 80% of context and texture by absorbing content digitally, the one human emotion that suffers the most, that is degraded the most, is empathy.

If I wanted to spread toxicity and hatred online, it isn’t hard. I could have replied to your post with swear words, insults, extremist propaganda...
          • by sinij ( 911942 )
It didn't just happen to become this way. Social media engagement algorithms actively polarize and radicalize people. All of them - Facebook, Twitter, YouTube, Instagram, etc. - actively encourage bad behavior in a medium that readily enables bad behavior. What's more, people don't keep it contained to social media; they take the crazy offline.

This is fundamentally dangerous. Society requires a high degree of cooperation; once something - anything - sufficiently degrades that cooperation, it is no longer coherent...
            • by ytene ( 4376651 )
              What would interest me would be for some form of investigation - which I suspect would have to come from Congress - in the style of "what did they know and when did they know it?"

              In your comment you write that "Social media algorithms actively polarize and radicalize people". Based on the evidence we see, I would be willing to accept that assertion.

But you quite rightly take us to the heart of a more important and much darker question:

Is it the case that the Social Media Companies set out to maximize...
              • by sinij ( 911942 )
I think it is very dangerous to focus on users, even if you can clearly show that such users are knowingly posting inflammatory content. This creates a situation where social media is a gatekeeper of what is considered acceptable discourse. I wouldn't trust such powers to Mother Teresa, let alone profit-driven, power-hungry technocrats.

The only path to a solution must involve systemic changes - we need to a) accept that inflammatory content exists, and b) alter the system to make such content less impactful. This would inv...
                • by ytene ( 4376651 )
I agree - but that wasn’t my observation. In fact, if you look at sites like Twitter, they do have a user-focused approach to tolerating inflammatory content: if you are a high-profile user such as a media star or politician, you can go a lot further before you get sanctioned, temporarily blocked, or banned.

Maybe there is a case for democratization of the process - publish a very clear set of rules covering what is and is not allowed, which you revise as and when needed, and then apply those...
                  • by sinij ( 911942 )
I personally see Twitter as a textbook example of wrong behavior that actively causes harm. If anything, recent actions by Twitter have made me support ineffective but punitive measures against social media. In my view, Twitter is openly meddling in elections in the US. Even if you agree with their rationale, you still have to acknowledge that nobody voted for Jack Dorsey, and he and his fellow technocrats should not have this degree of influence on politics.
        • by Corbets ( 169101 )

          Only in the USA, really.

I mean, don’t get me wrong, we in Europe have our divisive issues. But in my 15 years here, I’ve never seen anything like the polarization I saw in the States 30 years ago (and it’s gotten worse since).

          • by LKM ( 227954 )
I'm pretty sure antivaxxers have now recreated a large portion of America's division in Europe. Traditionally, many European countries have had much higher self-reported trust in their fellow citizens than the US. This is important for societies that work well, because trust that others aren't going to abuse public systems or your personal property is what allows things like healthcare to work, and what provides safety without extreme police interference.

I have a hunch that Covid-19 might have fundamentally...
  • by Anonymous Coward

    But the Journal says the memos showed "that CEO Mark Zuckerberg resisted proposed fixes," and that the memos "offer an unparalleled look at how much Facebook knows about the flaws in its platform and how it often lacks the will or the ability to address them."

    "flaws"? That was the feature.

  • by Anonymous Coward

    Mr. Zuckerberg resisted some of the proposed fixes, the documents show, because he was worried they might hurt the company's other objective — making users engage more with Facebook.

    Engagement is not the true objective, making money is. Mini-Mark is all about the money.

  • News Flash (Score:4, Funny)

    by sunderland56 ( 621843 ) on Saturday September 18, 2021 @11:11AM (#61808067)

    Wait, you're saying Zuckerberg is a jerk? Wow. Never heard that before.

  • I wasn't aware Facebook ever had meaningful social interaction beyond liking cat photos or other clickbait. Seriously, not even being sarcastic here...

    • by dcw3 ( 649211 )

      "Seriously, not even being sarcastic here..."

Seriously, you can't figure out that millions of people have reconnected with old friends through FB? FB Marketplace is also a decent venue to sell stuff... I did so when I moved two years ago, posting items on that, Craigslist, and Nextdoor... FB sales worked much better than the others.

      Don't get me wrong...I hate Zuck and his ilk.

it doesn't make money for Facebook.
  • But outrage is worth billions to Facebook.
  • and immediately thought it seemed like the battery cages they shove chickens into.

    "Platform" sounds so much more flattering than farm.
  • by aerogems ( 339274 ) on Saturday September 18, 2021 @02:32PM (#61808565)

I'm sure it will be called socialism or some other boogeyman term by some, but it's well past time that some regulations be imposed on the likes of Facebook, Twitter, and their snowflake-haven counterparts on the right, along with the 24-hour cable news networks. The companies have gotten to the point where their actions, inactions, or inability to take all the necessary actions can change the results of political races, let foreign actors like Russia sow discord, and cause other things that I think everyone can agree are bad for society. Clearly we can't rely on the companies themselves to take the proper actions, so it's up to governments to force the issue.

We need to impose a strict obligation on everyone at the top of these companies that they must make decisions that are in the best interest of society. Not the shareholders, not the executives: society. Give it some teeth too, like strict personal liability for everyone in the C-Suite at the company. If a "unite the right" type rally is organized on Facebook, for example, and people end up being injured or killed, the entire Facebook C-Suite can be charged as accessories unless they can show that Facebook not only has policies requiring immediate action to shut down any advocacy of violent protest, but that those policies are consistently followed by the rank and file. If they're not, it's not the low-level employee who gets fired; it's the C-Suite executives who get charged as accessories and potentially serve time in jail.

    Cable news networks, be it CNN, MSNBC, Fox News, Newsmax, or any of the others... Not only will the C-Suite be held accountable, but so will on-air talent. They need to make sure that their opinion shows are very clearly labeled as such. Maybe starting with giant disclaimers at the start, end, and after every commercial break and segment. Some of which the host of the show must say aloud.

It's downright pathetic that we even need to be having this conversation in the first place. Especially with something like covid, an equal-opportunity killer, companies like Twitter and Facebook should want to shut down the spread of misinformation. Having your customer base die off, literally, is not good for business. Neither is being associated with the spread of misinformation that resulted in a number of unnecessary deaths. Clearly the executives at these companies are only concerned about the next quarter's projections and don't give two fucks about what might happen 6-12 months from now, so we need to force the issue.

People like to bring up 1A free-speech issues in response to arguments like this, but when you look at the damage caused by speech prohibited under the existing exceptions, I haven't seen anyone able to argue that the damage from these networks isn't greater than that from some, if not all, of those exceptions. It is indeed time for some very narrowly crafted limits on large-scale commercial/financially incentivized speech. Surely if the damage of puritans being offended by consenting adults puking on each other during r...
    • Give it some teeth too, like strict personal liability for everyone in the C-Suite at the company.

      That sort of liability would mean the swift end of social media. Not saying that's a bad thing...

      Given the money at play, I think a Surgeon General's warning in the fine print is a more likely outcome.

    • by sinij ( 911942 )

      companies like Twitter and Facebook should want to shut down the spread of misinformation.

This is absolutely not the right approach (and it is what they are actually trying to do). Identifying misinformation is extremely labor-intensive; it is prone to biases, agendas, and influence peddling; and it is actually and demonstrably harmful to progress. Using a computing analogy, this is a search-on-unsorted-data type of problem. It will also not solve the problem, as radicalization into "approved" areas (e.g., defund the police) will keep happening.

The issue was, is, and will remain engagement algorithms resulting...
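Rough arithmetic behind that "search on unsorted data" analogy, with all numbers invented for illustration:

```python
# Moderation must examine every item with an expensive human constant
# factor, whereas changing the ranking algorithm is a single systemic fix.
# Hypothetical volume and review-cost figures only.
posts_per_day = 1_000_000_000               # hypothetical content volume
seconds_per_human_review = 30               # hypothetical cost per item
total_seconds = posts_per_day * seconds_per_human_review
reviewer_days = total_seconds / (8 * 3600)  # 8-hour workdays
print(f"{reviewer_days:,.0f} reviewer-days needed per day of content")
# ~1,000,000 full-time reviewers just to glance at each post once.
```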
