What Else Do the Leaked 'Facebook Papers' Show? (msn.com)
The documents leaked to U.S. regulators by a Facebook whistleblower "reveal that the social media giant has privately and meticulously tracked real-world harms exacerbated by its platforms," reports the Washington Post.
Yet it also reports that at the same time Facebook "ignored warnings from its employees about the risks of their design decisions and exposed vulnerable communities around the world to a cocktail of dangerous content."
The whistleblower also argued that due to Mark Zuckerberg's "unique degree of control" over Facebook, he is ultimately personally responsible for what the Post describes as "a litany of societal harms caused by the company's relentless pursuit of growth." Zuckerberg testified last year before Congress that the company removes 94 percent of the hate speech it finds before a human reports it. But in internal documents, researchers estimated that the company was removing less than 5 percent of all hate speech on Facebook...
For all Facebook's troubles in North America, its problems with hate speech and misinformation are dramatically worse in the developing world. Documents show that Facebook has meticulously studied its approach abroad, and is well aware that weaker moderation in non-English-speaking countries leaves the platform vulnerable to abuse by bad actors and authoritarian regimes. According to one 2020 summary, the vast majority of its efforts against misinformation — 84 percent — went toward the United States, the documents show, with just 16 percent going to the "Rest of World," including India, France and Italy...
Facebook chooses maximum engagement over user safety. Zuckerberg has said the company does not design its products to persuade people to spend more time on them. But dozens of documents suggest the opposite. The company exhaustively studies potential policy changes for their effects on user engagement and other factors key to corporate profits.
Amid this push for user attention, Facebook abandoned or delayed initiatives to reduce misinformation and radicalization... Starting in 2017, Facebook's algorithm gave emoji reactions like "angry" five times the weight of "likes," boosting these posts in its users' feeds. The theory was simple: Posts that prompted lots of reaction emoji tended to keep users more engaged, and keeping users engaged was the key to Facebook's business. The company's data scientists eventually confirmed that the "angry" reaction, along with "wow" and "haha," occurred more frequently on "toxic" content and misinformation. Last year, when Facebook finally set the weight on the angry reaction to zero, users began to get less misinformation, less "disturbing" content and less "graphic violence," company data scientists found.
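The weighting scheme described in the reporting can be sketched roughly like this. This is a hypothetical illustration only: the weight table, reaction names, and scoring function are assumptions inferred from the article, not Facebook's actual ranking code.

```python
# Hypothetical sketch of reaction-weighted engagement scoring.
# Per the reporting: from 2017, emoji reactions counted five times
# as much as a "like"; the "angry" weight was later set to zero.
REACTION_WEIGHTS_2017 = {
    "like": 1,
    "angry": 5,
    "wow": 5,
    "haha": 5,
}

# After the change described in the reporting: "angry" zeroed out.
REACTION_WEIGHTS_LATER = {**REACTION_WEIGHTS_2017, "angry": 0}

def engagement_score(reactions: dict, weights: dict) -> int:
    """Sum each reaction count multiplied by its assumed weight."""
    return sum(weights.get(name, 0) * count
               for name, count in reactions.items())

post = {"like": 100, "angry": 40}
print(engagement_score(post, REACTION_WEIGHTS_2017))   # 100*1 + 40*5 = 300
print(engagement_score(post, REACTION_WEIGHTS_LATER))  # 100*1 + 40*0 = 100
```

Under this sketch, a post drawing heavy "angry" reactions outranks a post with the same number of plain likes, which is the dynamic the data scientists reportedly flagged.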
The Post also contacted a Facebook spokeswoman for a response. The spokeswoman denied that Zuckerberg "makes decisions that cause harm" and also dismissed the findings as being "based on selected documents that are mischaracterized and devoid of any context..."
Responding to the spread of specific pieces of misinformation on Facebook, the spokeswoman went so far as to claim that at Facebook, "We have no commercial or moral incentive to do anything other than give the maximum number of people as much of a positive experience as possible."
She added that the company is "constantly making difficult decisions."
Facebook wants regulation (Score:5, Insightful)
Facebook and Google want regulation because they are the only companies with the resources to handle it. It will stifle any potential competition at a reasonable cost to themselves. The administration gets censorship powers by forcing these companies to censor for it, which is illegal, but no one with enough money has brought a case yet.
Re: (Score:2)
There's a very simple solution to that problem, which is also a good test for whether this is being done in good faith or with collaboration from Facebook and Twitter. The regulations should only apply to social networks above a minimum annual revenue, say $100M. This would allow small competitors to grow, but once they get rich enough to afford compliance, they would be regulated in the same way. I'd suspect this is exactly what will happen anyway.
If you really believe what you say then calling and w
Re: (Score:2)
If they wanted to be regulated, all they would have had to do is go to Congress and ask. Liberals would have jumped at the chance to be able to claim they regulated the evil social media companies, and conservatives would have made sure the entire thing was toothless by letting them just adapt their current business practices to be the new regulations. Thus ensuring the entire exercise was basically a meaningless waste of time, except for the social media companies, which get to stifle competition. As lon
Re: Facebook wants regulation (Score:1)
Re: (Score:1)
Re: (Score:3)
Facebook and Google want regulation because they are the only companies with the resources to handle it.
Citation needed showing competitors like Twitter and Yahoo can't handle a specific regulation that Google and Facebook can.
Re: (Score:2)
Twitter and Yahoo, sure. What about Mastodon, Matrix and Signal? The UK government has been talking about banning anonymous accounts, using the recent murder of an MP, completely unrelated to anonymous accounts, as an excuse.
Re: (Score:2)
Social media thought it had clarified its position back in the late 1990s, but that only held fo
Re: (Score:2)
This whole "affair" is being carefully stage managed to deliver the regulation Facebook and the administration want.
You do understand the extent to which the sort of regulation we are talking about here would kill off The Company Formerly Known as Facebook's ability to squeeze content for maximum eyeballs, right? I mean, you *do* understand how 'making money' works, correct?
Facebook and Google want regulation because they are the only companies with the resources to handle it. It will stifle any potential competition at a reasonable cost to themselves. The administration gets censorship powers by forcing these companies to censor for it
You don't get some automatic prize for killing off competition. You still have to sell something to someone for, you know, money. TCFKaF sells access to users, access they achieve by exploiting the content we're talking about censoring. The govern
Again, who cares? (Score:5, Insightful)
In all seriousness, who cares? Social media is proving to be a failed experiment. If we all left tomorrow this problem would be solved; we are mad at Zuck, but forget we are complicit. As they say in the movie WarGames: the only winning move is not to play.
Re: (Score:2)
There's nothing fundamentally bad about social media per se; it's great that you can see what's up with your old college roommate or share pictures of your kids' piano recital with your family. This is why people join social media sites. Social media is also a way for businesses to connect with their customers and artists to connect with their audiences; these are also positive things.
The problem is that the way you make the most profit with a social media platform is monetizing anger and resentment.
Re: (Score:2)
Oddly Slashdot has proven that social media isn't a "failed experiment". Some of the posters might be, but not the idea as a whole.
Re: (Score:2)
Have you actually participated in Slashdot comments???
People get methodically downvoted out of sight for "wrongthink". That is not adult discussion, that is goose-stepping.
How on earth anyone could think Slashdot is how it should be is just beyond me...
Re: (Score:1)
Re: (Score:3)
Re: (Score:2)
Apparently, you can't handle the truth when slapped in the face with it, and instead have to resort to childish insults.
Re: Again, who cares? (Score:2)
Re: (Score:2)
While I think the moderation system on /. is good, Kokuyo is correct. /. used to be good at listening to diverse opinions, but in the last 5 years it has become an amplifier of groupthink. Just look at any topic on COVID. No room for nuanced debate.
Instead of rationally debated counterpoints, it has devolved into name-calling and woke thought, even by low user IDs. I think it's a sign of the echo-chamber effect infecting previously reasonable people.
Re:Again, who cares? (Score:5, Insightful)
Well, I'd agree with you if the toxic effects of social media were confined to people who seek them out. But they're not. The problem is especially dire in the non-English speaking world, where there have been multiple mob murders instigated by misinformation spread through social media.
Re: (Score:2)
Re: (Score:2)
You mean like bars and hookah lounges?
Re: Again, who cares? (Score:2)
Re: (Score:2)
I, for one, care about the revelation that Zuckertwit is the love child of Bill Gates and a belligerent carnivorous space alien!
You asked for it (Score:3)
Re: (Score:2)
Sounds like perjury (Score:3)
If Zuckerberg testified under oath to one thing, and internal documents are saying something completely different, that certainly sounds like a slam dunk perjury charge. It was a false statement, the person making the false statement knew it was false at the time, and there was a clear intent to deceive or mislead. Charge him and throw his ass in prison like they should anyone who knowingly lies under oath. Even if Zuck was just briefed by some underlings and didn't have personal knowledge at the time, he's the CEO, so the buck stops with him. Every action taken by every employee is ultimately his responsibility.
Re: (Score:3)
Re: (Score:2)
Here's ten of them that beg to differ.
https://www.forbes.com/sites/n... [forbes.com]
Re: (Score:2)
Re: (Score:2)
Okay, okay. These three are in though.
Bernie Madoff
Allen Stanford
Raj Rajaratnam
" . . . . ." is the sound . . . (Score:1)
. . . Of all the people who care.
FB, or whatever it's called now, is very rapidly becoming irrelevant.
Most of my friends - almost all of whom joined FB in the early days - have either closed their account or basically have stopped posting.
My children (28+) are reporting the same.
FB is dead. Sell, sell, sell !
is the sound (Score:3)
You have over 28 children? So many you lost count?
Re:"" is the sound (Score:3)
dang (Score:2)
dang, bought too soon
FUD still getting submitted to /.
price must not be low enough yet
Maybe I'm Too Cynical (Score:4, Interesting)
Please use the correct name (Score:3)
Facebook, the cancer, was yesterday. Today we have arrived at MetaStasis.
People are a problem (Score:2)
At the end of the day, the *main* thing Facebook does is simply this: It allows people to talk to each other. We've been talking to each other since the human race began, but we've never had the ability to monitor these communications on a vast scale or to perform large-scale numerical analyses of what is being said (e.g., "Hate speech was up 23% in February, and by the way, the definition of hate speech is currently [X]!")
It turns out that when you gain the ability to eavesdrop on half of the human race,
Re: (Score:1)
At the end of the day, the *main* thing Facebook does is simply this: It allows people to talk to each other. We've been talking to each other since the human race began, but we've never had the ability to monitor these communications on a vast scale or to perform large-scale numerical analyses of what is being said (e.g., "Hate speech was up 23% in February, and by the way, the definition of hate speech is currently [X]!")
Nope, that's not what Facebook does at all. Their most important role is to put together people who would never normally interact into conversations they would never normally have. Having done that, they emphasise the "highest engagement" items, which actually means "most inflammatory". There is a fundamental difference between your telephone, which just lets you talk to other people, and Facebook, which exercises editorial control over what parts of your speech get to other people and vice versa.
Re: (Score:2)
Wha? Nah, it's how you use the damn site. My wife is part of lots of music and craft groups, as well as friends and family groups. She sometimes watches live streaming music performances done impromptu by some of the artists she follows.
Sure, it advertises at her, and I'm sure if she cared about politics she would find divisive stuff, but more or less the platform works very well for her. She gets to stay in touch with all her friends and it keeps her up to date on all her musicians and when they will be p
Re: (Score:2)
Their most important role is to put together people who would never normally interact into conversations they would never normally have.
That's what the Internet in general does. (Really, it's what *any* new communication technology does). Specifically, it allows people with niche interests or niche opinions to find each other. This was a core function of the Internet long before Facebook came along.
As sarren1901 points out, what you choose to do with this ability is entirely up to you. You can use it to find people who are interested in Galician folk music, or medieval calligraphy... or you can use it to find white supremacists and member
Re: (Score:2)
Their most important role is to put together people who would never normally interact into conversations they would never normally have.
Have you ever even been on FB? I'm no fan of Zuck, in fact I'd love to see him put away. However, your comment is flat out false. In the many years I've been on FB (since joining to monitor my teen's actions (she's 30 now)), I've accumulated many old friends and distant family. And yet, I've rarely, if ever, had discussions with people I wouldn't have talked to otherwise. Some of my FB "friends", post inflammatory things, and I simply move them into a status where I'm not seeing their posts, but remain
Liberal bias (Score:2)
Re: (Score:2)
Better hope he doesn't get it. Imagine this: a network he controls where he can directly talk to his followers. He'd likely develop a basic app for the site, and he could direct them to do all sorts of questionable and likely downright illegal stuff.
He may just settle for making lots of money off of his followers, but that's if we are lucky. Much worse could happen.
Re: (Score:2)
Fixed that for you... (Score:3)
"Responding to the spread of specific pieces of misinformation on Facebook, the spokeswoman went as far to acknowledge that at Facebook, "We have no commercial or moral incentive to do anything other than give our top execs the maximum profit and as much money as possible."
Seems someone was messing with the typewriter again.
If anyone possibly thinks that the money isn't the only concern MetaFace has their either in on the con or willfully ignorant. Mind you, that revelation hasn't cost any corporation any money ever so...
Re: Fixed that for you... (Score:2)
*They're. Fuck you autocorrect.
Re: (Score:2)
*They're. Fuck you autocorrect.
ROFL...seems that autocorrect FTFY!
Delete... (Score:1)
Zuckerberg and Murdoch (Score:2)
The two most powerful, and some of the most dangerous, people on the planet.
Zuckerberg makes money (and gains power) by connecting people, and he'll keep doing that by whatever means available, consequences be damned. Mass graves have been filled with the victims of genocides spurred on using, and enhanced by, Facebook.
Rupert Murdoch has found that it's easy to turn people into loyal customers of your media ecosystem by radicalizing them into fascists. His son Lachlan who is taking control of the empire see
94 vs 5 (Score:2)
Zuckerberg testified last year before Congress that the company removes 94 percent of the hate speech it finds before a human reports it. But in internal documents, researchers estimated that the company was removing less than 5 percent of all hate speech on Facebook.
Fuck Zuck.