WSJ: Facebook's 2018 Algorithm Change 'Rewarded Outrage'. Zuck Resisted Fixes (livemint.com)
This week the Wall Street Journal reported that a 2018 algorithm change at Facebook "rewarded outrage," according to Facebook's own internal memos. But the Journal says the memos showed "that CEO Mark Zuckerberg resisted proposed fixes," and that the memos "offer an unparalleled look at how much Facebook knows about the flaws in its platform and how it often lacks the will or the ability to address them."
In the fall of 2018, Jonah Peretti, chief executive of online publisher BuzzFeed, emailed a top official at Facebook Inc. The most divisive content that publishers produced was going viral on the platform, he said, creating an incentive to produce more of it... Mr. Peretti blamed a major overhaul Facebook had given to its News Feed algorithm earlier that year to boost "meaningful social interactions," or MSI, between friends and family, according to internal Facebook documents reviewed by The Wall Street Journal that quote the email...
Facebook's chief executive, Mark Zuckerberg, said the aim of the algorithm change was to strengthen bonds between users and to improve their well-being. Facebook would encourage people to interact more with friends and family and spend less time passively consuming professionally produced content, which research suggested was harmful to their mental health. Within the company, though, staffers warned the change was having the opposite effect, the documents show. It was making Facebook's platform an angrier place. Company researchers discovered that publishers and political parties were reorienting their posts toward outrage and sensationalism. That tactic produced high levels of comments and reactions that translated into success on Facebook. "Our approach has had unhealthy side effects on important slices of public content, such as politics and news," wrote a team of data scientists, flagging Mr. Peretti's complaints, in a memo reviewed by the Journal... They concluded that the new algorithm's heavy weighting of reshared material in its News Feed made the angry voices louder. "Misinformation, toxicity, and violent content are inordinately prevalent among reshares," researchers noted in internal memos.
Some political parties in Europe told Facebook the algorithm had made them shift their policy positions so they resonated more on the platform, according to the documents. "Many parties, including those that have shifted to the negative, worry about the long term effects on democracy," read one internal Facebook report, which didn't name specific parties...
Mr. Zuckerberg resisted some of the proposed fixes, the documents show, because he was worried they might hurt the company's other objective — making users engage more with Facebook.
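The excerpt describes the mechanism only at a high level: reshares, comments, and reactions were weighted heavily relative to passive consumption. Below is a minimal sketch of how any weighted score of that shape produces the dynamic the researchers describe. The function name and every weight are invented for illustration; the Journal's excerpt does not give Facebook's actual formula.

```python
# Illustrative sketch of an engagement-weighted ranking score, NOT
# Facebook's actual MSI formula. The weights are invented to show the
# structural problem: if reshares and comments count far more than
# passive signals, content that provokes arguments and re-sharing
# outranks everything else.

def msi_style_score(post):
    WEIGHTS = {          # hypothetical point values
        "likes": 1,
        "reactions": 5,  # an anger react counts the same as a love react
        "reshares": 30,  # heavy reshare weighting amplifies virality
        "comments": 30,  # heated arguments look like "meaningful" interaction
    }
    return sum(WEIGHTS[signal] * post.get(signal, 0) for signal in WEIGHTS)

calm_post = {"likes": 500, "reactions": 20, "reshares": 5, "comments": 10}
outrage_post = {"likes": 50, "reactions": 300, "reshares": 200, "comments": 400}

print(msi_style_score(calm_post))     # 1050
print(msi_style_score(outrage_post))  # 19550 -- the outrage bait wins the feed
```

Under any weighting of this shape, a post that provokes arguments and resharing beats one that is merely liked, which is exactly the incentive Mr. Peretti's email describes publishers responding to.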
Familiarity breeds contempt (Score:2)
Sometimes the best way to maintain a relationship is a healthy dose of willful blindness
Re:Familiarity breeds contempt (Score:5, Insightful)
Zuckerberg is one of the few people in the world whose death, right now, would leave the world a better place. He shares this accolade with a select handful of other historical figures like Hitler and Stalin.
I'm sure his parents must be very proud.
Re: (Score:2)
But is WeChat worse than Facebook? (Score:1)
What was that (the parent) supposed to be about? Care to clarify? (If so, I suggest starting by restoring and explaining your original Subject, but right now I'm dismissing it as another case of FP disease. ("The first rule of FP club is that complete thoughts are not allowed in FP club." (But I (obviously) prefer complete thoughts.)))
My Subjective question is derived from the book The Big Nine by Amy Webb. Facebook figures as one of the six American AI companies, but I can't even read the paywalled story.
Re: (Score:2)
With regard to the gutless wonder (who might wonder how the AC comment was called to my attention), the book says quite a bit of substance about the Chinese social credit system. Quite different from how MEPR (Multidimensional Earned Personal Reputation) "should" work. But the funny part is that personal profiling and evaluation systems as bad as or worse than the Chinese social credit system (or my strangest fantasies of MEPR) already exist among the six American AI leaders. So far the American abuses seem
Re: (Score:2)
Hi Shanen,
Just a small point to add - I have not read Deep Thinking, but every time I've heard "agnostic" used in relation to AI technologies it's talking about the implementation: it's "agnostic" if you can implement the AI on a number of different computer architectures - that is, it doesn't depend on specific high-level instructions or specialized hardware.
Might not be at all what you are talking about, but I just thought I'd throw it out there.
Re: (Score:2)
In the context it was pretty clear that he meant morally neutral in the sense that the same (AI) technologies can be used for good or bad purposes. Your usage would make more sense in terms of implementing similar chess-playing or chess-analysis programs on different hardware, but I'm pretty sure he didn't use "agnostic" in that sense. He did talk about how the algorithmic progress has allowed for much more powerful chess software to be implemented without the special purpose hardware.
On the WeChat topic, I
Re: (Score:2)
Re: (Score:2)
Re: (Score:1)
Lacks will? Lolz, they wanted this. (Score:4, Insightful)
Re: (Score:1)
Good thing Slashdot has none of that "outrage" [slashdot.org].
Re: (Score:2)
One way to increase ad views is for a forum to plant outrageous responses that get people to click and reply, with each click and reply yielding another ad-laden page load to get paid for.
Re: (Score:2)
That, combined with thread creation, "makes a market" for blabbering hotheads to respond to and generate ad revenue.
I wouldn't be surprised if I've earned $50,000 for this web site over the years, given my ability to generate outrage downmods.
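Taken at face value, that dollar figure implies a lot of traffic. A back-of-envelope check with invented numbers (actual Slashdot ad rates are not in evidence):

```python
# Back-of-envelope check of the commenter's claim; both figures below
# are assumptions, not real ad-market data.

revenue_goal = 50_000   # dollars, the commenter's figure
cpm = 2.00              # hypothetical dollars per 1,000 ad impressions

impressions_needed = revenue_goal / cpm * 1_000
print(f"{impressions_needed:,.0f} ad impressions")  # 25,000,000
```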
Re: (Score:2)
Hmm (Score:5, Insightful)
There is no perfect interface (Score:2)
The problem with A/B testing is the presumption that a "best" interface exists and can be found by sufficient poking at each of the dimensions and parameters of the interface. The contradictory reality is that "best" is a case-by-case thing: your best interface would differ from mine, just as your thinking differs from mine.
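That point can be made concrete with a toy example (all numbers invented): when preferences are split, the single shipped "winner" scores worse in aggregate than letting each user keep the interface they prefer.

```python
# Toy numbers, invented for illustration: half the users prefer
# interface A, half prefer B. "Engagement" is 0.9/0.8 on the preferred
# UI and 0.4 when forced onto the other one.

users = [("A", 0.9), ("A", 0.8), ("B", 0.9), ("B", 0.8)]
OTHER_UI_ENGAGEMENT = 0.4

def mean_engagement(shipped_ui):
    """Average engagement if everyone is forced onto one variant."""
    return sum(e if pref == shipped_ui else OTHER_UI_ENGAGEMENT
               for pref, e in users) / len(users)

winner = max("AB", key=mean_engagement)
print(winner, mean_engagement(winner))        # 0.625 for either variant

# Per-user choice beats the best single variant: (0.9+0.8+0.9+0.8)/4
print(sum(e for _, e in users) / len(users))  # 0.85
```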
But a dominant and profit-driven corporate cancer like Facebook (or the google (or Apple?)) sees the UI as a cost to be minimized, which means that the "best" solution
Re:Hmm (Score:5, Funny)
Public discourse on pretty much anything these days is misguided, because not enough people understand anything about anything.
Re: (Score:2)
Interestingly that's exactly what the Chinese government is proposing to do. For example, users will be able to see exactly why the algorithm showed them something, and correct it if they think it was wrong. They will also be able to turn it off and get the generic view.
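A rough sketch of what those two features might look like (all names invented; this is one reading of the proposal, not any real platform's API): an inspectable, user-correctable signal behind each ranking decision, plus an opt-out to an unranked generic view.

```python
# Hypothetical sketch of the transparency features described above.

inferred_interests = {"politics": 0.9, "cooking": 0.1}  # model's guess about one user

def explain(topic):
    # "Why am I seeing this?" surfaced to the user.
    return f"Shown because your inferred interest in '{topic}' is {inferred_interests[topic]:.1f}"

def feed(posts, use_algorithm=True):
    if not use_algorithm:  # opt-out: generic reverse-chronological view
        return sorted(posts, key=lambda p: p["time"], reverse=True)
    return sorted(posts, key=lambda p: inferred_interests.get(p["topic"], 0), reverse=True)

print(explain("politics"))
inferred_interests["politics"] = 0.2  # the user corrects a wrong signal
print(explain("politics"))

posts = [{"topic": "politics", "time": 1}, {"topic": "cooking", "time": 2}]
print(feed(posts, use_algorithm=False))  # generic view ignores the profile entirely
```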
Re: (Score:1)
If we just go back to the point where people see the things they actually subscribed to (i.e. stuff their actual friends posted on Facebook), 99% of this problem will just go away.
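For what it's worth, the feed being asked for here is trivial to express; the hard part is that it removes the ranking lever entirely. A minimal sketch, with illustrative names only:

```python
# Sketch of a subscriptions-only feed: posts from accounts the user
# explicitly follows, in reverse-chronological order, no engagement
# ranking of any kind.

def subscribed_feed(posts, subscriptions):
    """Filter to explicit subscriptions; sort by recency, nothing else."""
    return sorted(
        (p for p in posts if p["author"] in subscriptions),
        key=lambda p: p["timestamp"],
        reverse=True,
    )

posts = [
    {"author": "alice", "timestamp": 100, "text": "vacation photos"},
    {"author": "viral_page", "timestamp": 200, "text": "outrage bait"},
    {"author": "bob", "timestamp": 150, "text": "new job!"},
]
print(subscribed_feed(posts, subscriptions={"alice", "bob"}))
# -> bob's post, then alice's; the unsubscribed viral page never appears
```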
Social river is on fire (Score:5, Insightful)
No, it's *way* worse than that (Score:5, Insightful)
When someone dumps toxic waste into a river, that toxicity becomes visible to everyone downstream of the dump point. It's either directly visible: you can take a water sample and test it; or indirectly visible: you can observe the change to the plants and animals in the river and see how they are negatively impacted. What social media in general and Facebook in particular have done is add toxicity to the tap water in your home - individual people are being poisoned by a bespoke mix of poison that is unique to them. Because their poison is unique, it's harder to spot and harder to see the impact.
When someone dumps toxic waste into a river, their primary objective is to get rid of the toxic waste, which they are dumping because it is cheaper to do so. Their motive is not to addict the river to their toxicity. When social media in general and Facebook in particular track people around the web, bombard them with psychologically profiled advertisements, send them images they know have a high chance of body-shaming them, and generate "flash-floods" of "likes" or "outrage", their motive is to permanently alter the recipients of their toxicity. They know that exposure to their toxicity will bring about permanent change in people - change for the worse. They know that they can sprinkle some dopamine-inducing feedback loops into the process and then get other users to push those buttons [think "likes" or re-tweets or replies to slashdot posts] in their attempt to "addict" their users.
Zuck is the Sackler Family of social media. He knows that what he is personally doing is f##king up literally millions of lives.
And he won't change.
And he doesn't care.
Ask yourself how many teenagers have committed suicide thanks to body-shame issues and insecurity brought on by Instagram and Facebook. Whatever you *think* the number is, it's likely to be higher. We now know that Facebook has a clear picture of this - and sits back and does nothing.
You know what the worst part of that is? They fail to act not because they think acting would be to admit liability, but because their chosen course of action is the one that makes them the most money.
The only reason they "get away with it" seems to be that they are the only party close enough to the evidence of the consequences of their actions - and they hide any proof they may identify.
Re:No, it's *way* worse than that (Score:4, Insightful)
Re: (Score:2)
We’re told that 80% of communication is non-verbal, which is to say that it is composed of gesture, expression, tone, posture and so on. What seems to be less well understood is that when we lose that 80% of context and texture by absorbing content digitally, the one human emotion that suffers the most, is degraded the most, is empathy.
If I wanted to spread toxicity and hatred on line, it isn’t hard. I could have replied to your post with swear words, insults, extremist propaganda o
Re: (Score:2)
This is fundamentally dangerous. Society requires a high degree of cooperation; once something, anything, sufficiently degrades that cooperation and it is no longer coheren
Re: (Score:2)
In your comment you write that "Social media algorithms actively polarize and radicalize people". Based on the evidence we see, I would be willing to accept that assertion.
But you quite rightly take us to the heart of a more important and much darker question:-
Is it the case that the Social Media Companies set out to maxim
Re: (Score:2)
The only path to solution must involve systemic changes - we need to a) accept that inflammatory content exists b) alter the system to make such content less impactful. This would inv
Re: (Score:2)
Maybe there is a case for democratization of the process - publish a very clear set of rules covering what is/is not allowed - which you revise as and when needed - and then apply those
Re: (Score:2)
Re: (Score:2)
Only in the USA, really.
I mean, don’t get me wrong, we in Europe have our divisive issues. But in my 15 years here, I’ve never seen anything like the polarization I saw in the states 30 years ago (and it’s gotten worse since).
Re: (Score:1)
I have a hunch that Covid-19 might have fundam
"flaws"? (Score:1)
But the Journal says the memos showed "that CEO Mark Zuckerberg resisted proposed fixes," and that the memos "offer an unparalleled look at how much Facebook knows about the flaws in its platform and how it often lacks the will or the ability to address them."
"flaws"? That was the feature.
Follow the money.... (Score:2, Insightful)
Mr. Zuckerberg resisted some of the proposed fixes, the documents show, because he was worried they might hurt the company's other objective — making users engage more with Facebook.
Engagement is not the true objective, making money is. Mini-Mark is all about the money.
Re:Follow the money.... (Score:2)
Yes. What Zuck and Sandberg did is no different from what Purdue Pharma knowingly did in promoting the opioid crisis.
Both parties should be hanged from trees.
Re: (Score:2)
Hang people who post outrageous things so you angrily post in response, generating ad revenue?
Keep reading that until you understand.
Re: (Score:2)
Re: (Score:2)
News Flash (Score:4, Funny)
Wait, you're saying Zuckerberg is a jerk? Wow. Never heard that before.
MSI? Really? (Score:2)
I wasn't aware Facebook ever had meaningful social interaction beyond liking cat photos or other clickbait. Seriously, not even being sarcastic here...
Re: (Score:2)
"Seriously, not even being sarcastic here..."
Seriously, you can't figure out that millions of people have reconnected with old friends through FB? FB Marketplace is also a decent venue to sell stuff... I did so when I moved two years ago, posting items on that, Craigslist, and Nextdoor... FB sales worked much better than the others.
Don't get me wrong...I hate Zuck and his ilk.
It's only a 'flaw' if (Score:2)
My Pillow Guy (Score:2)
Took one look at Facebook over half a decade ago (Score:1)
"Platform" sounds so much more flattering than farm.
Something needs to change (Score:3, Interesting)
I'm sure it will be called socialism or some other boogeyman term by some, but it's well past time that some regulations be imposed on the likes of Facebook, Twitter, and their snowflake-haven counterparts on the right, along with the 24-hour cable news networks. The companies have gotten to the point where their actions, or inactions, or inability to take all the necessary actions, can result in changing the results of political races, allowing foreign actors like Russia to sow discord, and other things that I think everyone can agree are bad for society. Clearly we can't rely on the companies themselves to take the proper actions, so it's up to governments to force the issue.
We need to impose a strict obligation on everyone at the top of these companies that they must make decisions that are in the best interest of society. Not the shareholders, not the executives, society. Give it some teeth too, like strict personal liability for everyone in the C-Suite at the company. If a "unite the right" type rally is organized on Facebook, for example, and people end up being injured or killed, the entire Facebook C-Suite can be charged as accessories unless they can show Facebook not only has policies in place requiring that immediate action is taken to shut down any sort of advocating for a violent protest, but that those policies are consistently followed by the rank and file. If they're not, it's not the low level employee who gets fired, it's the C-Suite executives who get charged as accessories and potentially serve time in jail.
Cable news networks, be it CNN, MSNBC, Fox News, Newsmax, or any of the others... Not only will the C-Suite be held accountable, but so will on-air talent. They need to make sure that their opinion shows are very clearly labeled as such. Maybe starting with giant disclaimers at the start, end, and after every commercial break and segment. Some of which the host of the show must say aloud.
It's downright pathetic that we even need to be having this conversation in the first place. Especially with things like covid, where it's an equal opportunity killer, companies like Twitter and Facebook should want to shut down the spread of misinformation. Having your customer base die off, literally, is not good for business. Neither is being associated with the spread of misinformation that resulted in a number of unnecessary deaths. Clearly the executives at these companies are only concerned about the next quarter's projections and don't give two fucks about what might happen 6-12 months from now, so we need to force the issue.
Re: (Score:2)
Re: (Score:1)
Give it some teeth too, like strict personal liability for everyone in the C-Suite at the company.
That sort of liability would mean the swift end of social media. Not saying that's a bad thing...
Given the money at play, I think a Surgeon General's warning in the fine print is a more likely outcome.
Re: (Score:2)
companies like Twitter and Facebook should want to shut down the spread of misinformation.
This is absolutely not the right approach (and it is what they are actually trying to do). Identifying misinformation is extremely labor-intensive; it is prone to biases, agendas, and influence peddling, and it is actually and demonstrably harmful to progress. Using a computing analogy: this is a search-on-unsorted-data type of problem. It will also not solve the problem, as radicalization into "approved" areas (e.g., defund the police) will keep happening.
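The search analogy can be made literal. A sketch of the complexity argument only, not of any platform's actual pipeline: misinformation offers no index to exploit, so review cost grows with every post published, whereas indexed problems answer queries in logarithmic time.

```python
# Sketch of the "search on unsorted data" analogy. The functions and
# predicate here are invented for illustration.

import bisect

def flag_misinformation(posts, looks_false):
    # Unsorted-data problem: every single post must be reviewed by a
    # human or a model, so cost is O(n) in the size of the firehose.
    return [p for p in posts if looks_false(p)]

def indexed_lookup(sorted_keys, key):
    # Contrast: membership in a sorted index is O(log n) per query.
    i = bisect.bisect_left(sorted_keys, key)
    return i < len(sorted_keys) and sorted_keys[i] == key

posts = ["claim A", "claim B", "claim C"] * 1_000
print(len(flag_misinformation(posts, looks_false=lambda p: p.endswith("B"))))  # 1000
print(indexed_lookup(sorted(posts), "claim B"))                                # True
```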
The issue was, is, and will remain engagement algorithms resulti