Key People Are Leaving Facebook and Torching the Company In Departure Notes (buzzfeednews.com) 104

An anonymous reader quotes a report from BuzzFeed News: On Wednesday, a Facebook data scientist departed the social networking company after a two-year stint, leaving a farewell note for their colleagues to ponder. As part of a team focused on "Violence and Incitement," they had dealt with some of the worst content on Facebook, and they were proud of their work at the company. Despite this, they said Facebook was simply not doing enough. "With so many internal forces propping up the production of hateful and violent content, the task of stopping hate and violence on Facebook starts to feel even more sisyphean than it already is," the employee wrote in their "badge post," a traditional farewell note for any departing Facebook employee. "It also makes it embarrassing to work here."

Using internal Facebook data and projections to support their points, the data scientist said in their post that roughly 1 of every 1,000 pieces of content -- or 5 million of the 5 billion pieces of content posted to the social network daily -- violates the company's rules on hate speech. More stunning, they estimated using the company's own figures that, even with artificial intelligence and third-party moderators, the company was "deleting less than 5% of all of the hate speech posted to Facebook." (After this article was published, Facebook VP of integrity Guy Rosen disputed the calculation, saying it "incorrectly compares views and content." The employee addressed this in their post and said it did not change the conclusion.)

The sentiments expressed in the badge post are hardly new. Since May, a number of Facebook employees have quit, saying they were ashamed of the impact the company was having on the world or worried that the company's inaction in moderating hate and misinformation had led to political interference, division, and bloodshed. Another employee was fired for documenting instances of preferential treatment of influential conservative pages that repeatedly spread false information. But in just the past few weeks, at least four people involved in critical integrity work related to reducing violence and incitement, crafting policy to reduce hate speech, and tracking content that breaks Facebook's rules have left the company. In farewell posts obtained by BuzzFeed News, each person expressed concerns about the company's approach to handling US political content and hate speech, and called out Facebook leadership for its unwillingness to be more proactive about reducing hate, incitement, and false content.
In the wake of the 2020 US Election, Facebook's "election integrity" team, which was charged with "helping to protect the democratic process" and reducing "the spread of viral misinformation and fake accounts," was recently disbanded as a stand-alone unit. Company leadership also reportedly shot down a proposal from the company's integrity teams to throttle the distribution of false and misleading election content from prominent political accounts, like President Donald Trump's.
This discussion has been archived. No new comments can be posted.

  • by BAReFO0t ( 6240524 ) on Friday December 11, 2020 @08:04PM (#60821206)

    Yeah, because it's sinking!

    Smart animals, those rats!

    • by Anonymous Coward

      Their post also argued that Facebook’s “very apparent interest in propping up actors who are fanning the flames of the very fire we are trying to put out” makes it impossible for people to do their jobs. “How is an outsider supposed to believe that we care whatsoever about getting rid of hate speech when it is clear to anyone that we are propping it up?”

      Content that is controversial and inflammatory attracts more viewers, and that means more profits. Facebook will NEVER seriously do anything about the problem. This is a company run by the greediest of the greedy, the most corrupt of the corrupt.

      If you wrote a novel and had a character follow the exact behavior of Mark Zuckerberg, you would be accused of antisemitism.

  • I send them a link of the most recent outrage about Facebook's practices. Usually I can find one from that very day.

    • by Anonymous Coward on Friday December 11, 2020 @09:34PM (#60821416)

      2003 - 2018: 15 years of the Mark Zuckerberg Apology Tour

      November 2003
      After creating Facemash, a Harvard 'hot-or-not' site.
      “This is not how I meant for things to go and I apologize for any harm done as a result of my neglect.”

      September 2006
      After introducing News Feed, which exposed updates to friends in one central place.
      “We really messed this one up. ... We did a bad job of explaining what the new features were and an even worse job of giving you control of them.”

      December 2007
      After launching Beacon, which opted-in everyone to sharing with advertisers what they were doing in outside websites and apps.
      “We simply did a bad job with this release, and I apologize for it. People need to be able to explicitly choose what they share.”

      February 2009
      After unveiling new terms of service that angered users.
      “Over the past couple of days, we received a lot of questions and comments. Based on this feedback, we have decided to return to our previous terms of use while we resolve the issues.”

      May 2010
      After reporters found a privacy loophole allowing advertisers to access user identification.
      “Sometimes we move too fast. We will add privacy controls that are much simpler to use. We will also give you an easy way to turn off all third-party services.”

      November 2011
      After Facebook reached a consent decree with the Federal Trade Commission for deceiving consumers about privacy.
      “I’m the first to admit that we’ve made a bunch of mistakes. Facebook has always been committed to being transparent about the information you have stored with us — and we have led the internet in building tools to give people the ability to see and control what they share.”

      July 2014
      After an academic paper exposed that Facebook conducted psychological tests on nearly 700,000 users without their knowledge. (Apology by Facebook COO Sheryl Sandberg)
      “It was poorly communicated. And for that communication we apologize. We never meant to upset you.”

      December 2016
      After criticism of the role of Facebook in spreading fake news about political candidates.
      “I think of Facebook as a technology company, but I recognize we have a greater responsibility than just building technology that information flows through. Today we’re making it easier to report hoaxes.”

      April 2017
      After a Cleveland man posted a video of himself killing 74-year-old Robert Godwin.
      “Our hearts go out to the family and friends of Robert Godwin Sr., and we have a lot of work — and we will keep doing all we can to prevent tragedies like this from happening.”

      September 2017
      While revealing a nine-step plan to stop nations from using Facebook to interfere in one another’s elections, noting that the amount of “problematic content” found so far is “relatively small.”
      “I care deeply about the democratic process and protecting its integrity. It is a new challenge for internet communities to deal with nation states attempting to subvert elections. But if that’s what we must do, we are committed to rising to the occasion.”

      September 2017
      After continued criticism about the role of Facebook in Russian manipulation of the 2016 election.
      “For the ways my work has been used to divide rather than to bring us together, I ask for forgiveness and I will work to do better. ”

      January 2018
      Announcing his personal challenge for the year is to fix Facebook.
      “ We won’t prevent all mistakes or abuse, but we currently make too many errors enforcing our policies and preventing misuse of our tools. This will be a serious year of self-improvement and I’m looking forward to learning from working to fix our issues together. ”

      March 2018
      After details emerged about Cambridge Analytica taking user data.
      “We have a responsi

      • Nice. No mod points left, else you would get one. I feel better knowing it was all just a mistake. You never see a 'won't happen again' though.

        Data mining never looked so good. And apologetic.

  • by Anonymous Coward
    Given their definitions of "Violence", "Incitement" and "False Content" more should quit or be fired. Not because they did not do enough, but because what they did was unfair and often utterly biased. And if one is such a political zealot they cannot see the latter they have no business evaluating content for abuse. Unless we want to admit Facebook is not an open content system and that it really is only for content of an approved political orientation and advocacy.
    • Maybe that's the whole point of the "article." It's Facebook's opening response to any accusation of censorship from the right.
  • by Entrope ( 68843 ) on Friday December 11, 2020 @08:24PM (#60821256) Homepage

    Not very long ago, it was considered newsworthy to report on a presidential candidate's son meeting with foreign representatives with even a spurious appearance of impropriety.

    Not very long ago, it was considered newsworthy to report on leaked emails documenting such meetings.

    Not very long ago, it would have been major news for the CEO of a company to testify before the Senate, and for a Senator to reveal the CEO's testimony as a lie while the CEO was still on the floor of the Senate.

    Companies like Facebook and Twitter helped change those norms about what can be reported, and now they are paying the price for it. I have no sympathy for them or the workers who carried out those policies. They are digital equivalents of jack-booted, masked enforcers.

    • Not very long ago, it would have been major news for the CEO of a company to testify before the Senate, and for a Senator to reveal the CEO's testimony as a lie while the CEO was still on the floor of the Senate.

      Wait, when did that one happen?

    • Not very long ago, it was considered newsworthy to report on a presidential candidate's son meeting with foreign representatives with even a spurious appearance of impropriety.

      It used to be weird when a president had porn stars under NDAs, but times change apparently.

    • by carton ( 105671 )

      Facebook blocked links to the October Surprise as spam. I couldn't share links to it, but my liberal-in-good-algorithmic-standing friends could. My guess is that suspect-conservatives get a higher spam-baseline score, or something.

      I'm sure snarky slashdotters can invent some hypothetical way to blame the victim and claim everything might be ok therefore surely it is. Obviously their "algorithms" are secret, but why is that "obvious"? It is not okay at all.

      Although it was Twitter who banned the NY Post f

      • by ebvwfbw ( 864834 )

        Did they offer you money to turn your account off until after the election? They offered me up to $250. I told them - no thanks.

  • Best of luck (Score:2, Insightful)

    by alvinrod ( 889928 )
    I'm sure at their next job they'll be able to overcome all of human history and the evolutionary pressure that led to humans being violent and inciting violence. Sorry, but the deck was stacked from the start and this was a game they had no hope of winning. If they're going to tackle something like this they should start smaller. See if they can get the next door neighbor's annoying dog to stop barking first, and maybe then they can start getting humans to act civil.
    • Humans can be both violent and also generous and kind. I would say most interactions today are civil, but the bad ones are the ones that stand out and that we remember. Even on Facebook only 0.1% of posts violate the standards. Violence is sometimes necessary; sometimes you do need to use physical means to get what you want. In general I see humans becoming less violent (with ups and downs). At least where I live you are much more likely to kill yourself than someone else.

      Can we do better? Of course we can, bu

  • Human nature (Score:5, Interesting)

    by Dan East ( 318230 ) on Friday December 11, 2020 @08:42PM (#60821286) Journal

    Attempting to filter speech based on content is only putting a bandage on the problem. It does not change the way the people who posted the hate speech (however that is defined at the moment) think or how they feel. Whether or not Social Media Platform X - be it Facebook or Twitter or whatever - can mute the speech of that person is pretty much irrelevant. They will continue to hold those viewpoints, and they can easily migrate to some other platform or communication channels that don't censor them.

    Back in the 80s I ran a dial-up BBS with a couple friends. We had someone dial in and post some nasty things in one of the discussion forums. After one of those calls (you could see the person "live" on your local computer running the BBS) we did a good old *69 after the modem session terminated. A quick look through the phone book (yes, in those days, in our small town, it was totally feasible to "reverse lookup" a phone number by scanning through the listings!), and we found who it was. It was a fellow high school kid that was extremely shy and backwards. We confronted him and his parents, and the problem stopped. It was at that time I learned how the (apparent) anonymity, or at least the impersonal nature, of online social networking (although no-one called it that at the time) emboldened people to speak their minds more freely, and otherwise behave in ways they would never do in person or when they felt accountable.

    Now we have billions of people online and communicating through words with people they have never met, and will never meet. In many ways, those interactions would be construed as totally inconsequential by some people. I think that fundamentally people are not well equipped to handle the social aspects of networking in that way. Neither the scale, nor the impersonal nature of interfacing with other people in that way. Many of our social traits are wired into our human nature. Our ability to form complex social interactions is what allows humanity to accomplish the things it has. However, never in our history was that social networking done at the volume and scale and indirect manner that it is now, and I think that some people simply cannot handle it.

    I won't even try to suggest a solution, because I'm not sure if there is one. Maybe, given enough generations of people communicating through social media, humanity will improve at socializing in that way. Or maybe the youngest generation, who are growing up with this technology from day one, will handle it better than those of us it was thrown upon later in life. People at Facebook are frustrated because... they see people acting awful, which unfortunately is a part of human nature, except these employees are expected to deal with this funneled and filtered mass of nastiness from the baser element being thrown at them. Facebook didn't create this problem, they only amplified it.

    • Re: (Score:1, Flamebait)

      by Xenx ( 2211586 )

      It does not change the way the people who posted the hate speech (however that is defined at the moment) think or how they feel.

      It isn't about changing the person posting the hate speech. It's about trying to prevent that hate speech being seen as accepted by impressionable people. People have a propensity to believe what they're told. Couple that with the fact that people have the propensity to defend what they believe, even/especially against fact, and social media becomes an easy spawning ground.

      • by Anonymous Coward

        Replace "hate" in this post, with "anti-establishment", and you'll understand that this Facebook problem is even bigger than many people imagine. Half of the people end up being convinced to abuse the other half, in either case.

      • Hate is in the eye of the beholder, isn't it. Therein lies the problem. In any case, these social media companies have taken on an impossible task. The harder they try to moderate the harder they're going to fail or feel like they've failed. Case in point these employees. They should just go wild west, hang section 230 on their walls as a daily affirmation, and call it a day.
        • by Xenx ( 2211586 )

          Hate is in the eye of the beholder, isn't it. Therein lies the problem.

          That's both true and false. It's based upon consensus, and yes it can vary. There is a lot of grey area, but there are things that exist at the extremes as well.

          • Hate is in the eye of the beholder, isn't it. Therein lies the problem.

            That's both true and false. It's based upon consensus, and yes it can vary. There is a lot of grey area, but there are things that exist at the extremes as well.

            Criminal threats meant to terrorize innocent people should be reported and prosecuted. "Hate" directed at evil-doers should not be censored.

            Nowadays the evil-doers are sitting on top and doing the exact opposite. They are trying every way to establish their Ministry of Truth everywhere in the world.

      • Your post sounds a lot like this [xkcd.com]. A rough rephrasing of your post would be, "Everyone is stupid and can't be fixed, except me and thee, and I'm not so sure about thee."

        A better idea is for youtube to start posting commercials explaining logical fallacies in a humorous, understandable way. The more people understand them, the better, and logical fallacies are something most people can understand.

    • I won't even try to suggest a solution, because I'm not sure if there is one.

      I like the idea of (instead of censoring) Youtube posting a commercial explaining a logical fallacy in a humorous, understandable way. News media these days is like a wall of textual logical fallacies.

    • Better education is a key to a lot of this. Stop filling kids' heads with easy to test rote facts and start teaching them to think. This whole anti-education, anti-elitist thing we've got going has made America ripe for what's happening. Social media changed the game and now any random flake with a catchy conspiracy can get thousands of people to hear and believe. Until we once again arm people with the mental skills to filter out the crap, we're stuck in this mess.
      • There is a trap in the "do not learn facts!" approach to education. Failure to learn math or physics or chemistry leaves the world of engineering and science unavailable, and they rely very heavily on a basis in facts. Basic facts like "frozen water is 0 degrees centigrade, boiling water is 100 degrees, and humans are typically 37 degrees" gives a framework to understand cooking and ordinary home medicine. It's vital to have those facts available to function in a modern civilization.

        May I suggest that toda

        • I didn't mean to imply I think there are no useful facts. And providing a robust math and science education would do wonders in itself to help people understand how the world works (and doesn't work). But teaching only what is required to pass multiple choice tests is not sufficient to arm a person for life in this modern world.
    • by AmiMoJo ( 196126 )

      It's not just a few trolls these days, it's industrialized manipulation of the population.

      The reason there is so much of this stuff on Facebook is that interested parties flood the site with it, using networks of accounts to disseminate it and ensure it gets as much exposure as possible. There is a lot of time and money invested in it.

      Facebook has basically become a propaganda platform for extremist groups who previously didn't have the resources to get this much attention.

    • by carton ( 105671 )

      we did a good old *69 after the modem session terminated. [...] We confronted him and his parents, and the problem stopped.

      Allowing no speech without the possibility of time-unlimited, cross-forum retaliation is not the solution to the civility problem. It's a form of social proof.

      It's also a way of erasing "shy" people so you don't have to deal with them.

      Why is your, in my opinion, abuse, of *69 a solution to a "problem," but "doxing" is so wrong that it's against the AUP of all the tech oligarch sites? What's the difference in the two scenarios? Is it just the social standing of the person being attacked? Is it your confid

  • Key People Are Leaving Facebook and Torching the Company In Departure Notes

    [Examples of those leaving complaining about allegedly inadequate censorship of "hate speech" and "misinformation" that "led to political interference, division, and bloodshed," along with alleged "preferential treatment of influential conservative pages that repeatedly spread false information."]

    Yet (apparently) nobody leaving is complaining about too MUCH censorship.

    Why am I not surprised?

    • If the people who are leaving are the ones in charge of censorship at Facebook, it is no great loss to the company.

  • by sound+vision ( 884283 ) on Friday December 11, 2020 @10:49PM (#60821604) Journal

    1 in 1,000 = 0.1%
    5% of 0.1% = 0.005%

    I like those odds.

    Anecdotally, it's not hard to tell the cries of "Censorship!" are just whining. Simply look at any of these social media sites and observe the mountainous piles of shit that get shoveled into your lap. But it's good to have their internal numbers to cite as well.
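
    The parent's percentages can be sanity-checked against the figures quoted in the summary. A quick back-of-the-envelope sketch (all three inputs - 5 billion daily posts, a 1-in-1,000 hate-speech rate, a 5% removal rate - are assumptions taken from the article, not from any Facebook source):

```python
# Rough arithmetic on the figures cited in the article summary.
# All three inputs are assumptions quoted there, not measured data.
daily_posts = 5_000_000_000   # pieces of content posted per day
hate_rate = 1 / 1000          # ~0.1% said to violate hate-speech rules
removal_rate = 0.05           # "deleting less than 5%" of that

hate_posts = daily_posts * hate_rate      # ~5 million per day
removed = hate_posts * removal_rate       # ~250,000 per day
remaining = hate_posts - removed          # ~4.75 million per day

print(f"hate posts/day: {hate_posts:,.0f}")   # 5,000,000
print(f"removed/day:    {removed:,.0f}")      # 250,000
print(f"still up/day:   {remaining:,.0f}")    # 4,750,000
```

    So the share of all content that is both hate speech and removed works out to 5% x 0.1% = 0.005% of posts, while roughly 0.095% of posts stays up.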

  • ... is free speech. Those employees are mad that they can't control what other people think.

    I'm willing to bet a vast majority of that 5% of "hate speech" is really just speech that they disapprove of. Opinions that aren't "approved of" and truth that they don't like. Truth like The Laptop From Hell and truth about the "peaceful protests" and the like.
  • I am quite curious to know whether these people are actual developers, or whether they just belong to the non-technical workforce that Facebook accrued - after the actual job of creating Facebook was mostly done - and that serves only social and PR purposes.
  • Another employee was fired for documenting instances of preferential treatment of influential conservative pages that repeatedly spread false information

    We only have their proclamation of doing the Lord's Work here, but there are two examples from the TX lawsuit that show the "disinformation" narrative about the election is gaslighting:

    1. TX showed that even PA admits that they officially claim they printed 2.7M mail-in ballots and their actual count was 3.1M. Their own state legislature is not satisfied with the fig leaf of an explanation from the executive on where the extra 400k originated.
    2. MI had to admit they have no idea where 174K votes that cannot be tied back to registered voters came from.

    Don't believe me? Go dig up the actual lawsuit and don't waste time asking for a link or media citation because TX officially charged these things and got back responses that should be damning to people who believe the election was fair and air tight.

    Yet social media companies call any assertion that the election was rigged or riddled with fraud disinformation.

    Facebook's leadership knows the truth, which is that they are being expected to gaslight or censor users across many topics. It's bad for business. It cannot be automated, and Facebook isn't particularly fit to judge the truth of any claim. Their fact-checkers suck. They have known conflicts of interest [thefederalist.com] or just regurgitate Official Stories from the very politicians and institutions being criticized. Facebook's leadership knows this won't end well, and they have a mob asking them to use their platform to attack a huge part of their user base for political advantage.

    And when you get to hate like the rioting it caused in Myanmar, what do you expect? Many diverse countries like Myanmar are constant powder kegs with one ethnic group always looking to stir up stuff against another. We've even had this problem here in the past ("oh lawdy, I heard an uppity negro whistled at a white girl, get the boys together it's time to burn some crosses" or today "I heard a cop shot a black guy, let's riot and attack the police instead of finding out if it was a good shooting")

  • Stopping Hate is just like The War on Terror or the War on Drugs - impossible to do, and basically just a scam.
  • These are "Key People"? Theirs are probably the most disposable and most readily refilled jobs the company offers.

    Then again this is from Buzzfeed. They should never be quoted as a source of "news".

  • Genuinely impressed (and pleased) that people are jumping ship because of the impact Facebook has on society. I genuinely don't know how people work there or at Twitter. Both platforms are cancer.
  • by couchslug ( 175151 ) on Saturday December 12, 2020 @09:20AM (#60822528)

    The Facebook censorship criteria should be easily available for public review to inform the discussion.

  • The press/media (including Facebook) are huge contributors to the problem. Under some misguided and misdirected desire to support "free speech" and a need to be seen as fair to both sides, they too often present everything as having equal weight and value. Censorship sucks, and, much as we need to honour, cherish and protect free speech, at some point there need to be consequences when what is said/written/broadcast has no factual basis.

    Media should brand third-party information with labels: support

    • Surely the phone company needs to listen to my phone calls and let me know if someone is pushing "misinformation."

  • So, they weren't happy with just draconian censorship, they want complete control. Bye-bye.

  • that any time now FB is going to be the target of massive DDOS attacks, serious hacking to reveal lots of internal secrets, and doxing of the C-levels.

    Normally I don't support that kind of thing, but Facebook needs to either get a conscience or get gone, and it seems they need some very pointed prodding in one or both of those directions.

    FB is yet another prime example of how allowing executives to use the fiction of corporate personhood as a shield against personal responsibility is quite simply evil. Corp

  • I respect their convictions, but I would prefer to err by allowing free speech. It is easier to feel and say that someone else should not have any means of expressing their opinions, but it is quite another matter when one's own speech is at issue. Is there actually even 5% hate speech on Facebook? Or, is it more likely that there are simply a lot of opposing viewpoints?
  • To those people The Ten Commandments is hate speech.
  • Wouldn't it be much easier, and better, to just teach people to fully grok the following ancient wisdom?

    Sticks and stones will break my bones, but words will never hurt me.
