Key People Are Leaving Facebook and Torching the Company In Departure Notes (buzzfeednews.com)
An anonymous reader quotes a report from BuzzFeed News: On Wednesday, a Facebook data scientist departed the social networking company after a two-year stint, leaving a farewell note for their colleagues to ponder. As part of a team focused on "Violence and Incitement," they had dealt with some of the worst content on Facebook, and they were proud of their work at the company. Despite this, they said Facebook was simply not doing enough. "With so many internal forces propping up the production of hateful and violent content, the task of stopping hate and violence on Facebook starts to feel even more sisyphean than it already is," the employee wrote in their "badge post," a traditional farewell note for any departing Facebook employee. "It also makes it embarrassing to work here."
Using internal Facebook data and projections to support their points, the data scientist said in their post that roughly 1 of every 1,000 pieces of content -- or 5 million of the 5 billion pieces of content posted to the social network daily -- violates the company's rules on hate speech. More stunning, they estimated using the company's own figures that, even with artificial intelligence and third-party moderators, the company was "deleting less than 5% of all of the hate speech posted to Facebook." (After this article was published, Facebook VP of integrity Guy Rosen disputed the calculation, saying it "incorrectly compares views and content." The employee addressed this in their post and said it did not change the conclusion.)
The sentiments expressed in the badge post are hardly new. Since May, a number of Facebook employees have quit, saying they were ashamed of the impact the company was having on the world or worried that the company's inaction in moderating hate and misinformation had led to political interference, division, and bloodshed. Another employee was fired for documenting instances of preferential treatment of influential conservative pages that repeatedly spread false information. But in just the past few weeks, at least four people involved in critical integrity work related to reducing violence and incitement, crafting policy to reduce hate speech, and tracking content that breaks Facebook's rules have left the company. In farewell posts obtained by BuzzFeed News, each person expressed concerns about the company's approach to handling US political content and hate speech, and called out Facebook leadership for its unwillingness to be more proactive about reducing hate, incitement, and false content. In the wake of the 2020 US Election, Facebook's "election integrity" team, which was charged with "helping to protect the democratic process" and reducing "the spread of viral misinformation and fake accounts," was recently disbanded as a stand-alone unit. Company leadership also reportedly shot down a proposal from the company's integrity teams to throttle the distribution of false and misleading election content from prominent political accounts, like President Donald Trump's.
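The figures quoted above are easy to sanity-check. The following is a minimal back-of-envelope sketch, assuming only the article's own numbers (roughly 5 billion pieces of content per day, a 1-in-1,000 hate-speech rate, and "less than 5%" of that hate speech deleted); note that Rosen's objection is that Facebook's published prevalence metric counts views rather than pieces of content, a distinction this toy calculation does not attempt to model.

# Back-of-envelope check of the figures quoted in the article.
# All inputs are the article's own numbers, not Facebook-reported metrics.

daily_posts = 5_000_000_000     # ~5 billion pieces of content posted per day (per the article)
hate_speech_rate = 1 / 1_000    # ~1 in every 1,000 pieces violates hate-speech rules
deletion_rate = 0.05            # "deleting less than 5%" -- treated here as an upper bound

daily_hate_speech = daily_posts * hate_speech_rate
daily_deleted = daily_hate_speech * deletion_rate

print(f"Hate speech posted per day:  {daily_hate_speech:,.0f}")                   # 5,000,000
print(f"Deleted per day (at most):   {daily_deleted:,.0f}")                       # 250,000
print(f"Left up per day (at least):  {daily_hate_speech - daily_deleted:,.0f}")   # 4,750,000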
"The rats leave the sinking ship." (Score:3)
Yeah, because it's sinking!
Smart animals, those rats!
Re: (Score:1)
Their post also argued that Facebook’s “very apparent interest in propping up actors who are fanning the flames of the very fire we are trying to put out” makes it impossible for people to do their jobs. “How is an outsider supposed to believe that we care whatsoever about getting rid of hate speech when it is clear to anyone that we are propping it up?”
Content that is controversial and inflammatory attracts more viewers, and that means more profits. Facebook will NEVER seriously do anything about the problem. This is a company run by the greediest of the greedy, the most corrupt of the corrupt.
If you wrote a novel and had a character follow the exact behavior of Mark Zuckerberg, you would be accused of anti-Semitism.
Re: (Score:1)
Indeed! Hip-hip-hooray for these censors burning out and leaving the hot mess that is Facebook.
Re:deleting less than 5% of all of the "hate" spee (Score:5, Insightful)
I believe in an offensive approach to Democracy and Freedom.
A democracy should not be tolerant towards parties that intend to abolish the democracy, and should make those parties illegal.
A country with freedom of religion should not be tolerant of religions that intend to abolish the freedom of religion, and should make those religions illegal.
A country with freedom of speech should not be tolerant of those speaking in favor of abolishing that freedom, and should forbid that kind of speech.
A free and enlightened society should protect facts and truths and make it illegal to spread demonstrably wrong claims and disinformation.
Taking a passive approach to defending democracy and freedoms means giving power and options to those elements who wish to suppress it, and over the long term, they might succeed.
Re: deleting less than 5% of all of the "hate" spe (Score:3)
The rub is how does one practice intolerance while preaching tolerance? Where does the line get drawn and then later moved? The government should do nothing. Its citizens should be the first line of defense. We are, after all, the militia. A militia does not always require use of lethal force. Be creative, without violating certain laws. Leave the government out of it. Once you let that genie out of the bottle it becomes comfortable with its new living space.
Re: deleting less than 5% of all of the "hate" spe (Score:5, Insightful)
The rub is how does one practice intolerance while preaching tolerance? Where does the line get drawn and then later moved? The government should do nothing. Its citizens should be the first line of defense. We are, after all, the militia. A militia does not always require use of lethal force. Be creative, without violating certain laws. Leave the government out of it. Once you let that genie out of the bottle it becomes comfortable with its new living space.
You are broaching the concept that democracies should always have the right to democratically vote to end the democracy. Because at some level, this whole thing transcends the theoretical. Ultimate tolerance means utter submission to those who demand ultimate intolerance.
We have seen exactly that, when a group has attempted to end the Republic by demanding that millions of votes be discarded because discarding them would mean that the people intolerant of democracy would then "win".
Re: (Score:3)
>A country with freedom of speech should not be tolerant of those speaking in favor of abolishing that freedom and forbid that kind of speech.
>A free and enlightened society should protect facts and truths and make it illegal to spread demonstrably wrong claims and disinformation.
These two points are mutually exclusive, because something that is "demonstrably wrong" can become "demonstrably correct" across just a single event. See: scientific method.
Paradox of tolerance (Score:2)
https://en.wikipedia.org/wiki/... [wikipedia.org]
From the other side, the enemies of America's principles are applying jujitsu against us. It's a lose-lose game where Putin thinks his minor losses (paid by the suffering of the Russian people) are fine as long as America and Europe are losing more bigly. "He whose name need not be mentioned" was just a great puppet for the purpose. Actually not as smart as a puppet. More like a passive fulcrum? Most recent book I've read on the topic is The Road to Unfreedom by Timothy Snyder.
Re: deleting less than 5% of all of the "hate" spe (Score:1)
Re:deleting less than 5% of all of the "hate" spee (Score:5, Insightful)
The failure of censorship is always a good thing, something to celebrate. A toast!
Listening to people pretend, in the name of anti-censorship, that Facebook isn't an unmitigated dumpster fire full of verifiable lies and misinformation is starting to get really tiring. Any corporation profiting on manufacturing divisiveness in our discourse is indefensible. It's one thing to be a platform where everyone has a voice. It's quite another to mold that platform into something that amplifies lies and rhetoric. I guess the real problem isn't the people posting all the shit, it's the dumbasses who read it, consume it, then echo it again.
Here's a hint: You can support Facebook being rm -rf'd and still support free speech and anti-censorship positions. Facebook is cancer. Fuck cancer. Informed discourse, mutual respect, manners, and decorum are still things. For now.
Re: (Score:2)
I guess the real problem isn't the people posting all the shit, it's the dumbasses who read it, consume it, then echo it again.
In other words, "America", or a mid-sized double-digit percentage of it. I hope I'm wrong and it's mostly trolls feeding bots, but ....
Re: deleting less than 5% of all of the "hate" spe (Score:2)
Physicists and behavioral scientists have long argued that free will doesn't exist. Our choices are heavily biased by our environment.
Re: (Score:2)
Physicists and behavioral scientists have long argued that free will doesn't exist. Our choices are heavily biased by our environment.
I'm all about science. But this is an area that reminds me of politicians educating people on physics. If free will did not exist, we could write a history of the future that would be 100 percent accurate.
It's like the psychic hotline lady who knew I was going to call her. It's all figured out.
Re: (Score:2)
I would first like to preface what I am going to say by saying that I think free will does exist.
Now, what makes you so certain that the future cannot be predicted with relatively 100% accuracy? Even if free will exists, this does not mean that, relative to the lifetime of humanity, we cannot predict the future with rather certain accuracy. We just have to assume free will seldom manifests and that when it manifests its impact is relatively negated by other forces. At this point I would like to submit that the term "prophecy" is often misinterpreted as a magical ability when the reality is that a "prophet" is a person who understands human nature to a high degree, such that the outcomes for societies can be determined for decades, centuries, or longer.
Re: (Score:2)
I would first like to preface what I am going to say by saying that I think free will does exist.
Now, what makes you so certain that the future cannot be predicted with relatively 100% accuracy? Even if free will exists, this does not mean that, relative to the lifetime of humanity, we cannot predict the future with rather certain accuracy.
I believe that when certain actions have already been taken, outcomes can be fairly well predicted. But that isn't a lack of free will, where no one will deviate from a preset response to anything. That's just human nature: interactions tend to take a certain course.
The nonexistence of free will demands that no deviation ever take place, that everything is fixed since conception. All of your interactions and decisions are already in the books, so to speak.
We just have to assume free will seldom manifests and that when it manifests its impact is relatively negated by other forces. At this point I would like to submit that the term "prophecy" is often misinterpreted as a magical ability when the reality is that a "prophet" is a person who understands human nature to a high degree, such that the outcomes for societies can be determined for decades, centuries, or longer.
A person can have a pretty predictable r
Re: (Score:2)
In any case, the people in charge of censorship are not the most valuable players on the team.
Re: (Score:2)
Re: (Score:3)
I think there's an unfortunate reality that religions and other civic membership organizations that shape people's belief systems and set the parameters for acceptable behavior don't just exist because of historical accidents, but because most people are low effort thinkers and mostly just parrot whatever they've been exposed to that seems believable.
Over the course of history, these organizations have persisted because they help hold the fabric of society together and largely keep out the worst impulses.
Re: deleting less than 5% of all of the "hate" spe (Score:1)
Whenever a FB Recruiter Contacts Me... (Score:1)
I send them a link of the most recent outrage about Facebook's practices. Usually I can find one from that very day.
Re:Whenever a FB Recruiter Contacts Me... (Score:5, Insightful)
2003 - 2018: 15 years of the Mark Zuckerberg Apology Tour
November 2003
After creating Facemash, a Harvard 'hot-or-not' site.
“This is not how I meant for things to go and I apologize for any harm done as a result of my neglect.”
September 2006
After introducing News Feed, which exposed updates to friends in one central place.
“We really messed this one up. ... We did a bad job of explaining what the new features were and an even worse job of giving you control of them.”
December 2007
After launching Beacon, which opted-in everyone to sharing with advertisers what they were doing in outside websites and apps.
“We simply did a bad job with this release, and I apologize for it. People need to be able to explicitly choose what they share.”
February 2009
After unveiling new terms of service that angered users.
“Over the past couple of days, we received a lot of questions and comments. Based on this feedback, we have decided to return to our previous terms of use while we resolve the issues.”
May 2010
After reporters found a privacy loophole allowing advertisers to access user identification.
“Sometimes we move too fast. We will add privacy controls that are much simpler to use. We will also give you an easy way to turn off all third-party services.”
November 2011
After Facebook reached a consent decree with the Federal Trade Commission for deceiving consumers about privacy.
“I’m the first to admit that we’ve made a bunch of mistakes. Facebook has always been committed to being transparent about the information you have stored with us — and we have led the internet in building tools to give people the ability to see and control what they share.”
July 2014
After an academic paper exposed that Facebook conducted psychological tests on nearly 700,000 users without their knowledge. (Apology by Facebook COO Sheryl Sandberg)
“It was poorly communicated. And for that communication we apologize. We never meant to upset you.”
December 2016
After criticism of the role of Facebook in spreading fake news about political candidates.
“I think of Facebook as a technology company, but I recognize we have a greater responsibility than just building technology that information flows through. Today we’re making it easier to report hoaxes.”
April 2017
After a Cleveland man posted a video of himself killing 74-year-old Robert Godwin.
“Our hearts go out to the family and friends of Robert Godwin Sr., and we have a lot of work — and we will keep doing all we can to prevent tragedies like this from happening.”
September 2017
While revealing a nine-step plan to stop nations from using Facebook to interfere in one another’s elections, noting that the amount of “problematic content” found so far is “relatively small.”
“I care deeply about the democratic process and protecting its integrity. It is a new challenge for internet communities to deal with nation states attempting to subvert elections. But if that’s what we must do, we are committed to rising to the occasion.”
September 2017
After continued criticism about the role of Facebook in Russian manipulation of the 2016 election.
“For the ways my work has been used to divide rather than to bring us together, I ask for forgiveness and I will work to do better. ”
January 2018
Announcing his personal challenge for the year is to fix Facebook.
“ We won’t prevent all mistakes or abuse, but we currently make too many errors enforcing our policies and preventing misuse of our tools. This will be a serious year of self-improvement and I’m looking forward to learning from working to fix our issues together. ”
March 2018
After details emerged about Cambridge Analytica taking user data.
“We have a responsi
Re: (Score:3)
Nice. No mod points left, else you would get one. I feel better knowing it was all just a mistake. You never see a 'won't happen again' though.
Data mining never looked so good. And apologetic.
"Violence", "Incitement" and "False Content" (Score:2, Insightful)
Re: (Score:1)
Once upon a time... (Score:3, Insightful)
Not very long ago, it was considered newsworthy to report on a presidential candidate's son meeting with foreign representatives with even a spurious appearance of impropriety.
Not very long ago, it was considered newsworthy to report on leaked emails documenting such meetings.
Not very long ago, it would have been major news for the CEO of a company to testify before the Senate, and for a Senator to reveal the CEO's testimony as a lie while the CEO was still on the floor of the Senate.
Companies like Facebook and Twitter helped change those norms about what can be reported, and now they are paying the price for it. I have no sympathy for them or the workers who carried out those policies. They are digital equivalents of jack-booted, masked enforcers.
Re: (Score:2, Insightful)
Can you imagine how fast DC insiders would have leaked the news if one of Trump's kids got paid millions upon millions of dollars by firms linked to Russia and China for such vague, uncheckable things as "consulting" and "legal services" -- or $10M/year for "introductions alone" -- and then not paid taxes on substantial chunks of that income?
Re: (Score:2)
Can you imagine how fast DC insiders would have leaked the news if one of Trump's kids got paid millions upon millions of dollars by firms linked to Russia and China for such vague, uncheckable things as "consulting" and "legal services" -- or $10M/year for "introductions alone" -- and then not paid taxes on substantial chunks of that income?
Apparently it takes at least 4 years.
Re: Once upon a time... (Score:1)
Re: (Score:1)
Not very long ago, it would have been major news for the CEO of a company to testify before the Senate, and for a Senator to reveal the CEO's testimony as a lie while the CEO was still on the floor of the Senate.
Wait, when did that one happen?
Re:Once upon a time... (Score:4, Informative)
Jack Dorsey wrongly testified that "anyone can tweet" the first NY Post story about Hunter Biden's laptop. People noticed he was lying, and Ted Cruz called him out on it: https://nypost.com/2020/10/28/... [nypost.com]
Re: Once upon a time... (Score:2)
Not very long ago, it was considered newsworthy to report on a presidential candidate's son meeting with foreign representatives with even a spurious appearance of impropriety.
It used to be weird when a president had porn stars under NDAs, but times change apparently.
Re: Once upon a time... (Score:2)
It's only the NDA part which is new.
Previously, they'd die very young under unusual circumstances.
Re: (Score:2)
It was the other way around.
Yes, we forgot about Bill Clinton, Kennedy, FDR, and of course "Johnson" administrations. And the many more before them...
https://www.history.com/news/p... [history.com]
The Bushes, Reagan, and Obama were exceptions to the rule.
Re: (Score:1)
Facebook blocked links to the October Surprise as spam. I couldn't share links to it, but my liberal-in-good-algorithmic-standing friends could. My guess is that suspect-conservatives get a higher spam-baseline score, or something.
I'm sure snarky slashdotters can invent some hypothetical way to blame the victim and claim that because everything might be OK, surely it is. Obviously their "algorithms" are secret, but why is that "obvious"? It is not okay at all.
Although it was Twitter who banned the NY Post f
Re: (Score:1)
Did they offer you money to turn your account off until after the election? They offered me up to $250. I told them - no thanks.
Best of luck (Score:2, Insightful)
Re: (Score:2)
Humans can be both violent and also generous and kind. I would say most interactions today are civil, but the bad ones are the ones that stand out and that we remember. Even on Facebook only 0.1% of posts violate the standards. Violence is sometimes necessary; sometimes you do need to use physical means to get what you want. In general I see humans becoming less violent (with ups and downs). At least where I live you are much more likely to kill yourself than someone else.
Can we do better? Of course we can, bu
Human nature (Score:5, Interesting)
Attempting to filter speech based on content is only putting a bandage on the problem. It does not change the way the people who posted the hate speech (however that is defined at the moment) think or how they feel. Whether or not Social Media Platform X - be it Facebook or Twitter or whatever - can mute the speech of that person is pretty much irrelevant. They will continue to hold those viewpoints, and they can easily migrate to some other platform or communication channels that don't censor them.
Back in the 80s I ran a dial-up BBS with a couple friends. We had someone dial in and post some nasty things in one of the discussion forums. After one of those calls (you could see the person "live" on your local computer running the BBS) we did a good old *69 after the modem session terminated. A quick look through the phone book (yes, in those days, in our small town, it was totally feasible to "reverse lookup" a phone number by scanning through the listings!), and we found who it was. It was a fellow high school kid that was extremely shy and backwards. We confronted him and his parents, and the problem stopped. It was at that time I learned how the (apparent) anonymity, or at least the impersonal nature, of online social networking (although no-one called it that at the time) emboldened people to speak their minds more freely, and otherwise behave in ways they would never do in person or when they felt accountable.
Now we have billions of people online and communicating through words with people they have never met, and will never meet. In many ways, those interactions would be construed as totally inconsequential by some people. I think that fundamentally people are not well equipped to handle the social aspects of networking in that way: neither the scale nor the impersonal nature of interfacing with other people like that. Many of our social traits are wired into our human nature. Our ability to form complex social interactions is what allows humanity to accomplish the things it has. However, never in our history was that social networking done at the volume and scale and in the indirect manner that it is now, and I think that some people simply cannot handle it.
I won't even try to suggest a solution, because I'm not sure if there is one. Maybe, given enough generations of people communicating through social media, humanity will improve at socializing in that way. Or maybe the youngest generation, who are growing up with this technology from day one, will handle it better than those of us it was thrown upon later in life. People at Facebook are frustrated because... they see people acting awful, which unfortunately is a part of human nature, except these employees are expected to deal with this funneled and filtered mass of nastiness from the baser element being thrown at them. Facebook didn't create this problem, they only amplified it.
Re: (Score:1, Flamebait)
It does not change the way the people who posted the hate speech (however that is defined at the moment) think or how they feel.
It isn't about changing the person posting the hate speech. It's about trying to prevent that hate speech being seen as accepted by impressionable people. People have a propensity to believe what they're told. Couple that with the fact that people have the propensity to defend what they believe, even/especially against fact, and social media becomes an easy spawning ground.
Re: (Score:1)
Replace "hate" in this post, with "anti-establishment", and you'll understand that this Facebook problem is even bigger than many people imagine. Half of the people end up being convinced to abuse the other half, in either case.
Re: Human nature (Score:1)
Re: (Score:2)
Hate is in the eye of the beholder, isn't it. Therein lies the problem.
That's both true and false. It's based upon consensus, and yes it can vary. There is a lot of grey area, but there are things that exist at the extremes as well.
Re: (Score:2)
Hate is in the eye of the beholder, isn't it. Therein lies the problem.
That's both true and false. It's based upon consensus, and yes it can vary. There is a lot of grey area, but there are things that exist at the extremes as well.
Criminal threats intended to terrorize innocent people should be reported and the perpetrators caught. "Hate" directed at evil-doers should not be censored.
Nowadays the evil-doers are sitting on top and doing the exact opposite. They are trying every way to establish their Ministry of Truth everywhere in the world.
Re: (Score:3)
Your post sounds a lot like this [xkcd.com]. A rough rephrasing of your post would be, "Everyone is stupid and can't be fixed, except me and thee, and I'm not so sure about thee."
A better idea is for YouTube to start posting commercials explaining logical fallacies in a humorous, understandable way. The more people understand them, the better, and logical fallacies are something most people can understand.
Re: (Score:2)
+1, humor works. But only for people who have a sense of humor. That may cut out 20-40% of the population, though.
Re: (Score:2)
Re: (Score:2)
But you can improve.
Re: (Score:2)
Re: (Score:2)
I won't even try to suggest a solution, because I'm not sure if there is one.
I like the idea of (instead of censoring) YouTube posting a commercial explaining a logical fallacy in a humorous, understandable way. News media these days is like a wall of textual logical fallacies.
Re: (Score:2)
Re: (Score:3)
There is a trap in the "do not learn facts!" approach to education. Failure to learn math or physics or chemistry leaves the world of engineering and science unavailable, and they rely very heavily on a basis in facts. Basic facts like "frozen water is 0 degrees centigrade, boiling water is 100 degrees, and humans are typically 37 degrees" gives a framework to understand cooking and ordinary home medicine. It's vital to have those facts available to function in a modern civilization.
May I suggest that toda
Re: (Score:2)
Re: (Score:2)
It's not just a few trolls these days, it's industrialized manipulation of the population.
The reason there is so much of this stuff on Facebook is because interested parties flood the site with it, using networks of accounts to disseminate it and ensure it gets as much exposure as possible. There is a lot of time and money invested in it.
Facebook has basically become a propaganda platform for extremist groups who previously didn't have the resources to get this much attention.
Re: (Score:1)
we did a good old *69 after the modem session terminated. [...] We confronted him and his parents, and the problem stopped.
"No speech allowed without the possibility of time-unlimited, cross-forum retaliation" is not the solution to the civility problem. It's a form of social proof.
It's also a way of erasing "shy" people so you don't have to deal with them.
Why is your (in my opinion) abuse of *69 a solution to a "problem," but "doxing" is so wrong that it's against the AUP of all the tech oligarch sites? What's the difference in the two scenarios? Is it just the social standing of the person being attacked? Is it your confid
Re: (Score:2)
Yet nobody complained about too MUCH censorship? (Score:2)
Key People Are Leaving Facebook and Torching the Company In Departure Notes
[Examples of those leaving complaining about allegedly inadequate censorship of "hate speech" and "misinformation" that "led to political interference, division, and bloodshed," along with alleged "preferential treatment of influential conservative pages that repeatedly spread false information."]
Yet (apparently) nobody leaving is complaining about too MUCH censorship.
Why am I not surprised?
Re: (Score:2)
If the people who are leaving are the ones in charge of censorship at Facebook, it is no great loss to the company.
5% of 1 in 1,000 (Score:3)
1 in 1,000 = 0.1%
5% of 0.1% = 0.005%
I like those odds.
Anecdotally, it's not hard to tell the cries of "Censorship!" are just whining. Simply look at any of these social media sites and observe the mountainous piles of shit that get shoveled into your lap. But it's good to have their internal numbers to cite as well.
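To spell out that arithmetic, here is a quick sketch using only the summary's figures (1 in 1,000 posts violating the hate-speech rules, and less than 5% of those removed):

# Quick check of the percentages above, using only the summary's figures.
violation_rate = 0.001   # 1 in 1,000 posts violates the hate-speech rules (0.1%)
removal_rate = 0.05      # "less than 5%" of that hate speech gets deleted

removed_share = violation_rate * removal_rate           # share of ALL posts removed as hate speech
surviving_share = violation_rate * (1 - removal_rate)   # share of ALL posts that is surviving hate speech

print(f"Removed hate speech:   {removed_share:.3%} of all posts")     # 0.005%
print(f"Surviving hate speech: {surviving_share:.3%} of all posts")   # 0.095%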
"hate speech"... (Score:2)
I'm willing to bet a vast majority of that 5% of "hate speech" is really just speech that they disapprove of. Opinions that aren't "approved of" and truth that they don't like. Truth like The Laptop From Hell and truth about the "peaceful protests" and the like.
I would like to know what was their actual role (Score:2)
Facebook leadership knows they can't fix this (Score:3)
We only have their proclamation of doing the Lord's Work here, but there are two examples from the TX lawsuit that show the "disinformation" narrative about the election is gaslighting:
1. TX showed that even PA admits it officially claimed to have printed 2.7M mail-in ballots while its actual count was 3.1M. Its own state legislature is not satisfied with the fig leaf of an explanation from the executive on where the extra 400k originated.
2. MI had to admit they have no idea where 174K votes that cannot be tied back to registered voters came from.
Don't believe me? Go dig up the actual lawsuit and don't waste time asking for a link or media citation because TX officially charged these things and got back responses that should be damning to people who believe the election was fair and air tight.
Yet social media companies call any assertion that the election was rigged or riddled with fraud disinformation.
Facebook's leadership knows the truth, which is that they are being expected to gaslight or censor users across many topics. It's bad for business. It cannot be automated, and Facebook isn't particularly fit to judge the truth of any claim. Their fact-checkers suck. They have known conflicts of interest [thefederalist.com] or just regurgitate Official Stories from the very politicians and institutions being criticized. Facebook's leadership knows this won't end well, and they have a mob asking them to use their platform to attack a huge part of their user base for political advantage.
And when you get to hate like the rioting it caused in Myanmar, what do you expect? Many diverse countries like Myanmar are constant powder kegs with one ethnic group always looking to stir up stuff against another. We've even had this problem here in the past ("oh lawdy, I heard an uppity negro whistled at a white girl, get the boys together it's time to burn some crosses" or today "I heard a cop shot a black guy, let's riot and attack the police instead of finding out if it was a good shooting")
Re: (Score:3)
don't waste time asking for a link or media citation
Oh please, linking to https://www.supremecourt.gov/s... [supremecourt.gov] isn't hard.
Re: (Score:2)
My experience with Facebook is also horrible when it comes to recruiting, with the recruiter promising for two weeks to call for preparation, and ending up calling the night before the onsite interview. Then last-minute changes in interviewers without any notice, interviewers being super young and inexperienced, interviewers not showing (!), and finally the recruiter promising to call afterward to go through the results because "we like to do that," and then basically disappearing without ever calling. Worst experience in maybe 30+ interviews I've had.
Had the same experience. I told the 4th interviewer that I was done and ready to leave (as in, I'm no longer interested). They asked if I had any questions, to which I replied, "Do you seriously expect someone to want to work here after this interview experience?" Only then did they get the hint. Seriously, it was like being interviewed by a series of brick walls. I've never seen people who were worse at reading social cues and body language.
Stopping hate and violence (Score:1)
"Key People" leaving? (Score:1)
These are "Key People"? They're probably the most disposable and readily available jobs being offered by the company.
Then again this is from Buzzfeed. They should never be quoted as a source of "news".
Kudos on a conscience (Score:1)
PRECISELY what constitutes this "hate speech"? (Score:3)
The Facebook censorship criteria should be easily available for public review to inform the discussion.
promoting lies as truth is not free speech (Score:2)
The press/media (including Facebook) are huge contributors to the problem. Under some misguided and misdirected desire to support "free speech" and a need to be seen as fair to both sides, they too often present everything as having equal weight and value. Censorship sucks, and, much as we need to honour, cherish and protect free speech, at some point there needs to be consequences when what is said/written/broadcast has no factual basis.
Media should brand third-party information with labels: support
Re: (Score:1)
Surely the phone company needs to listen to my phone calls and let me know if someone is pushing "misinformation."
Good! (Score:1)
So, they weren't happy with just draconian censorship, they want complete control. Bye-bye.
I gotta believe (Score:2)
that any time now FB is going to be the target of massive DDOS attacks, serious hacking to reveal lots of internal secrets, and doxing of the C-levels.
Normally I don't support that kind of thing, but Facebook needs to either get a conscience or get gone, and it seems they need some very pointed prodding in one or both of those directions.
FB is yet another prime example of how allowing executives to use the fiction of corporate personhood as a shield against personal responsibility is quite simply evil. Corp
5% Hate Speech? Or, Just Opposing Viewpoints (Score:1)
To those people... (Score:1)
A better approach (Score:1)
Sticks and stones will break my bones, but words will never hurt me.