Facebook Relents, Releases Report that Makes Them Look Bad (msn.com) 54
"Transparency is an important part of everything we do at Facebook," reads the first line of a first-quarter Content Transparency Report which Facebook later decided not to share with the public.
They've now changed their mind, and released that report. The Hill summarizes its findings: Facebook said that an article about a doctor who passed away two weeks after getting a coronavirus vaccine was the [#1] top-performing link on the social media platform in the U.S. from January to March, according to a report released Saturday... [The Washington Post adds that this article "was promoted on Facebook by several anti-vaccine groups".] According to Facebook's report, the article was viewed over 53 million times...
In addition, a website pushing coronavirus misinformation was one of the top 20 most visited sites on the platform, according to The Washington Post.
Specifically, the Post calls that top-20 site "a right-wing anti-China publication that has promoted the violent QAnon conspiracy theories and misleading claims of voter fraud related to the 2020 election."
Facebook had considered sharing the 100 most popular items in their newsfeed, the Post adds, but "The problem was that they feared what they might find..." The disclosure reflects the challenge of being open with the public at a time when the social network is being attacked by the White House as well as experts for fomenting the spread of health misinformation. Previously, the company had only shared how much covid-related misinformation it had removed, and has been careful not to acknowledge up to this point what role it played in disseminating material that misled the public about the virus and the vaccine. For months, executives have debated releasing both this report and other information, according to a person familiar with the company's thinking. In those debates, the conversations revolved around whether releasing certain data points was likely to help or hurt the company's already-battered public image. In numerous instances, the company held back on investigating information that appeared negative, the person said...
Facebook's leadership has long felt that skepticism about any subject, including vaccines, should not be censored in a society that allows robust public debate... The challenge is that certain factual stories that might cast doubt on vaccines are often promoted and skewed by people and groups that are opposed to them. The result is that factual information can become part of an ideological campaign. Facebook has been slow to remove or block some of the leading anti-vaccine figures that spread such ideas.
Some observations about Facebook's report:
- It only covers public content in the News Feed — so presumably it fails to account for any misinformation that's shared only with a group's members.
- The report acknowledges that nearly 20% of posts in a News Feed come from a Group the user has joined.
- More than 1 in every 17 content views in the News Feed is recommended by Facebook's algorithms.
Sadly ... (Score:5, Insightful)
Re: (Score:3, Informative)
Have you been living under a rock for the last 5 years? This sort of misinformation almost destroyed American democracy, and might still do so. It's not healthy for a democracy.
I don't have the time or desire to go point by point but let's have a look here, because this shit gets repeated over and over and it's a perfect example of it:
My favorite stupid line of the year is "follow the science", usually from people who swore they wouldn't take "Trump's vaccine" if he won re-election, on nationally televised debates
This never happened. You're talking about Harris & Biden; here's one of the quotes from Harris' VP debate:
I will say that I would not trust Donald Trump. And it would have to be a credible source of information that talks about the efficacy and the reliability of whatever he's talking about. I will not take his word for it. He wants us to inject bleach. I -- no, I will not take his word.
There are other statements but they're all pretty similar. There are conc
Re: (Score:1)
Re: (Score:2)
Humans are obviously unable to make good decisions for themselves, so we need to identify some of these humans to make the decisions for all the other ones.
Because putting people in charge who can't make good decisions is a good idea.
Profit.
Re: (Score:3)
Any other opinion is a step towards fascism by definition.
It's stupid to deny the very real and acute problem of the open public debate being hijacked by fascists that make their arguments with lies.
Re: (Score:2, Flamebait)
Then you either let them speak their bullshit and expose their lies for what they are. Or you openly draw a clear line in the sand, personally decide who is a fascist and who is not, and only allow into the open public space those you think are trustworthy... and that is fascism.
The problem is that everyone, from your dumbest grass-eating idiot to the brightest genius, has a natural disposition towards trusting what they FEEL is right. Facts are quite often secondary to personal biases, even among the most r
Re: (Score:3)
Then you either let them speak their bullshit and expose their lies for what they are.
Unfortunately that doesn't work a lot of the time. People are often irrational, doubly so when they've convinced themselves of a lie.
and that is fascism
If you let the lies take over the public debate and politics it also leads to fascism.
The problem is that everyone, from your dumbest grass-eating idiot to the brightest genius, has a natural disposition towards trusting what they FEEL is right.
That's right. And this behavior leads to people convincing themselves of lies, as I mention above.
Now, combine that with the real attacks of pushing convenient lies into this open discussion and you have a real shitstorm brewing.
The only way towards the truth is through open debate and disagreements.
You're confusing humans with creatures that are overall susceptible
Re: Sadly ... (Score:1)
Re: (Score:2)
You are not an American in spirit.
If you mean that I don't believe in absolutism then you're right.
There is a reason the first amendment came first.
Yeah, it's to protect you from corrupt governments, you dumbass.
That is BY DEFINITION authoritarian and fascist.
So you think that for instance outlawing racism in public discourse is fascist?
You know that most dictators came to power because they hijacked public debate and convinced enough people of their lies to be able to take control, right?
Re: (Score:2)
Yes, outlawing racism in public discourse is fascist. If we outlawed racism, we'd have to outlaw the entire Democrat party, whose history of slavery, Jim Crow, the KKK, redlining, the Tuskegee syphilis experiments, Japanese internment, and other ridiculously racist policies, personalities, and philosophies, would surely find them deserving of being outlawed.
What's the difference between convincing people to gain control, and outlawing people to gain control?
Re: (Score:2)
What's the difference between convincing people to gain control, and outlawing people to gain control?
The word 'convincing' has taken on a different meaning since the development of psychological and commercial warfare on the internet.
You can literally buy masses of voices that appear to agree with you. The fair and open debate you want is severely disturbed at its base.
It's shocking to me that you post on slashdot but have apparently no concept of all the social hacking tricks that are being used to skew the public debate.
Re: (Score:2)
Your insight is compelling. Because no social hacking trick would include outlawing speech unwanted by powerful people.
Re: (Score:2)
The problem is that Facebook makes the devil, as depicted in said movies, look like a good guy. That's amazing.
Care to show us where a for-profit organisation decided to do an internal review in an attempt to argue that they were being a good corporate citizen, discovered something alarmingly negative, and decided, fuck it, we're going to show this to the public?
Facebook isn't extraordinarily evil here. Hell, they aren't even more evil than most people. Do you go into a job interview and tell your prospective employer that you have a drinking problem and a pornography addiction?
Generally neither people nor corporations
Re: (Score:2)
Facebook isn't extraordinarily evil here. Hell, they aren't even more evil than most people. Do you go into a job interview and tell your prospective employer that you have a drinking problem and a pornography addiction?
Generally, neither people nor corporations put any effort into pointing out their negative traits. The expectation that Facebook would is just unrealistic.
For sensitive jobs I'm required to disclose criminal history or go through medical evaluation and notify if any new issues come up. Because people's lives can be in danger.
But it's not even close to the same thing. Facebook can literally get people genocided, governments overturned, or deadly diseases spread through their choices of which posts to promote.
Re: (Score:2)
Facebook can literally get people genocided
LOL. I've heard of hyperbole, but I think we'll need to invent new words for your post. Megabole? Ludicrousbole? Plaidbole?
But none of your "feelings" addressed my point: again, can you point to where you voluntarily and unforced decided to publicly criticise yourself for no gain?
Re: (Score:2)
I agree with your point.
Guess it is time to force them.
Lie and hide unpopular truths as a company with that much power, and the C-levels get blue-collar prison time.
Re: (Score:2)
OK, now look up Facebook's role in causing the Rohingya crisis in Burma, the destruction of their democratic organizations, and their current situation.
They're much worse than a movie devil, they're like a biblical devil in the breadth and scale of their destruction.
Facebook is in a no-win situation (Score:3, Interesting)
It's kind of heroic that they released this information. Yes, it looks bad. Could they have corrected it? Maybe. There's plenty of room for criticism from outside the decision-making core. But there seems to be a mountain of data to sort through, and whether they are capable of doing that is debatable.
I think they are motivated to keep a clean, responsible arena for people to gather in. I think that their enemies suspect they are unduly influenced by politics or social pressures or profit motives that have convinced them to promote certain viewpoints. I'm certain that profit motives encourage them to be as open and neutral as possible.
Take that from a non-user antagonist. I do all I can to discourage the use of Facebook. I am concerned that humans are no longer capable of standing up as individuals and now see themselves only in the dim light of others' opinions.
Re: (Score:2)
What might help would be to ban posting of URLs. Yes, all URLs. That would remove linking to misinformation sites. Yes, it would also remove linking to legitimate or innocuous sites, but I can live with that. People can still share personal updates and photos (which was supposed to be the point of Facebook, right?).
Re: (Score:3)
It's not their fault... (Score:2)
Oh wait, that part is their fault.
Re: (Score:2)
Well, is it?
Is it a computer's fault for being hacked through a vulnerability?
People keep going on about free will and all, but in the end a lot of folks can be manipulated by pressing the right buttons.
Re: (Score:2)
Re: (Score:2)
I think I misunderstood your post. I thought you were arguing that it's the users' fault they get addicted.
But I suppose you mean that it's Facebook's fault, which I completely agree with.
Re: (Score:2)
Re: (Score:1)
As usual, (Score:4, Funny)
Incorrect title.
"Privacy Rapist Relents, Releases Report that Makes Them Look Bad"
There, FTFY.
Transaprency (Score:5, Funny)
The funniest thing I've read today so far.
Re: Transaprency (Score:1)
Re: (Score:3)
Transparency. I do not think it means what you think it means.
Transparent: allowing light to pass through so that objects behind can be distinctly seen.
Glass is transparent but you can still see it. Lots of things are transparent but you can still see them. The word that means you can't see it is... invisible. Don't people understand words any more?
Re: (Score:2)
Re: (Score:3)
In my opinion, the report isn't actually all that bad. From reading it, you really wouldn't know there was much "shady content" there. Part of that is also due to how badly formatted it is ... some very obvious, very hasty screengrabs from an Excel spreadsheet. If "Transparency is an important part of everything we do at Facebook" they should spend another hour or
Re: (Score:2)
Why? They're so transparent you can see they are such dicks/cunts.
Re: (Score:2)
It's true, though. Making your data transparent to them is important all day, every day.
drink from the firehose (Score:3)
This (fucking) story has been up for days. Why is it here now? Lazy eds? On /.!?
Are there not enough things being posted? Are not enough users checking the firehose?
I <3 how this site dies in extreme slow motion. I'll be here till peak entropy.
Re: (Score:2)
This (fucking) story has been up for days. Why is it here now?
Extraordinarily fast reaction by the editors, from the looks of things.
Facebook should be banned (Score:1)
Re: Facebook should be banned (Score:2)
By whom? And the more pertinent follow-up question: under what legal authority?
Which part makes them look bad? (Score:1)
Letting people speak?
Or letting people come to the wrong (or in some cases, "wrong") conclusions?
They just realized it will make them look good (Score:2)
At least in the eyes of their key demographics: The gullible stupids, i.e. the idiots that actually believe that bullshit and will now think Facebook is the last bastion of truth.
Everyone else has left their fields long ago.
Re: (Score:2)
Re: (Score:2)
More like coughs and sprayers, I'd say.
"misinformation" (Score:3)
Last year the Wuhan Lab Leak theory was considered "misinformation" and Facebook blocked people from even discussing it. Now, it's considered to be a very possible scenario. Right before the election, the Hunter Biden Laptop story was considered "Russian disinformation" and Facebook blocked people from even discussing it. Now we know it's real and that Hunter has actually lost another laptop.
Which raises the question: who at Facebook is deciding what is "misinformation" and "disinformation"? Do they have a large group of independent researchers traveling around the world to investigate these claims? Or are they taking their marching orders from a specific political party?
Re: (Score:2)
Anything is possibly true. So it's not a great argument for allowing people to spread hysterical guesses to half the planet.
What a bad article/summary (Score:1)