'Blistering' Note Reveals Secret Travails of Facebook's Content Moderators (sfgate.com) 65

A Facebook content moderator (contracted through Accenture) quit their position in Austin, Texas — but also left a critical internal note which was later leaked by a senior tech reporter at BuzzFeed who described it as "blistering."

SFGate also calls it "a harrowing account of what it's like to work as a Facebook content moderator." The message describes content moderation as a job that takes a significant toll mentally and physically and has led some coworkers to go on psychiatric medication for the first time or self-medicate with alcohol and drugs... "Content analysts are paid to look at the worst of humanity for eight hours a day..." The employee in question allegedly acknowledges that Facebook has made improvements to its wellness program, but still calls it inadequate, stating that managers view their employees' brains "as machines," rather than taking into account the consequences of workplace stress.
But the note also points out that "Those who spend the most time in the queues have the least input as to policy... It can take months for issues to be addressed, if they are addressed at all... Content analysts should be able to communicate directly with those responsible for designing policy... The fact that content analysts are hired by outside agencies makes these things impossible. There are no established avenues for communication with Facebook full-time employees, and we can face penalties if we attempt to contact them."
The last line of the note offers this benediction for Facebook: "I hope you figure out a way to stop constantly starting PR fires and traumatize people en masse."


  • This just in (Score:4, Interesting)

    by mcnster ( 2043720 ) on Sunday April 18, 2021 @11:48AM (#61286736)

    Ok, we've got guns in the post office, guns in schools, guns on military bases, guns in church, guns in the magazine offices, guns at outdoor festivals...

    How long until we have guns in datacenters?

    "Oh, he went 'Facebook', poor boy."

    :-)

    • I actually like your angle on FP. Kind of hyperbolic, but an interesting spin. Hate the Subject, however. What is it with all the vacuous Subjects on FPs these days?

      My initial reaction to the story was along similar lines, but wondering what it would be like to have such a job. And why? In my salad days I worked a lot of strange gigs, but this sounds worse than any I ever had.

      But as usual, I want to think in terms of solution approaches. First approximation to a solution would be to look at the so-called mo

      • You don't have to wonder, you could probably talk to the people who gave up a similar contract to provide the same services [theverge.com] to Facebook.
        • by rtb61 ( 674572 )

          Can't stand the heat, get out of the kitchen. Stop using crap employment services to cheat employees with designed-to-go-bankrupt employers.

          Content moderation: hook them up to a machine, test their reactions as they moderate content, and do not employ those who cannot handle it. Also, do not moderate the entire piece of content; once it fails, it fails. Stop it fast, hit a big old delete-and-pass-to-the-authorities button, and replace it with a calming view for some minutes (yes, a big red button they can mash, it's ps

    • I see no problem with people having guns in data centers.
    • No, you don't have guns on military bases, they're pretty strict about that.

  • The abyss stares back.

    Unsurprising. That's why peer moderation of one form or another is generally better for all concerned. Short of anything blatantly illegal (a small category that includes direct calls to violence, posting copyrighted information sans permission, and similarly proscribed content under existing American laws), all should be permitted.

    • Re: (Score:2, Insightful)

      Oh for fuck's sake.. that is just about the most brain-dead, naïve thing anyone could possibly say about this subject!
      You want a snapshot of what Facebook and Twitter would be like under your 'moderation policy'? Go visit 4chan for a few weeks. Their 'moderation staff' is all unpaid volunteers, there's orders of magnitude (phrase used properly in this case, really it is) too FEW of them -- and they really don't seem to give a damn that much so long as, as you say, it's not something 'blatantly illegal
      • by RightwingNutjob ( 1302813 ) on Sunday April 18, 2021 @12:23PM (#61286840)

        Posted on wrong thread below for some reason:

        You realize that slashdot has peer moderation, and aside from some ascii not-sees and Tesla fanbois, the quality of comments (if not stories) has been mostly constant over the 15-plus years that I've been reading, at least.

        8chan is 8chan because that's how it markets itself. Slashdot doesn't sell itself that way, doesn't attract that crowd, and you get what you get.

        I don't subscribe to this notion of wanting everything sanitized. If someone's a whackaloon, it's best to know it. If I don't like strip shows and cigar smoke and gambling, I don't go to the part of town where the strip joints are and I don't frequent casinos; I don't demand they all be banned and closed and never mentioned in the newspaper.

          • Relatively speaking, this place is obscure at best compared to mainstream 'social media' like Facebook or Twitter. Your 'peer moderation' shit won't scale up; it would become an echo chamber where the supporters of bad actors and violent assholes would use their moderation ability to drown out anyone who doesn't agree with them -- much like what happens in this place, with 'users' (a fraction of which are probably paid shills) having multiple accounts for purposes of farming moderation points they then use to furth
          • Neither you nor I have any solid evidence of this playing out at scale because the big socials never tried it. They went from wild west to armies of censors and moderators with nothing in between.

            What we do have is slashdot and slashdot style systems that work well enough and principles about letting people speak their minds that align more with peer moderation (as both the Platonic Ideal and the actual implementations) than with paid censors and moderators.

            In New Zealand, there's a government official know

        • by mobby_6kl ( 668092 ) on Sunday April 18, 2021 @01:45PM (#61287070)

          Posted on wrong thread below for some reason:

          You realize that slashdot has peer moderation, and aside from some ascii not-sees and Tesla fanbois, the quality of comments (if not stories) has been mostly constant over the 15-plus years that I've been reading, at least.

          8chan is 8chan because that's how it markets itself. Slashdot doesn't sell itself that way, doesn't attract that crowd, and you get what you get.

          Why not just look at Reddit, which has a much more general normie audience rather than a subsection of old nerds? The peer moderation there is a disaster. Like the most obvious GNAA-type trolls would get modded down but beyond that it's a mess.

          You get modded down until the comment is invisible for a post that doesn't agree with the current state of the hive mind. Saying the most obvious shit possible ("hey guys aren't child rapists bad!?") gets you +thousands of karma points and gold rewards and what not. Yet you still have these localized communities full of complete nutjobs like the former thedonald where they can run wild with no adult supervision.

          Theoretically of course you'd want the communities to moderate themselves. Your suburban moms can ban swearing in their groups or whatever they want. But this runs into a problem when you get a community of dangerous weirdos who truly believe pizza restaurants are fronts for pedo operations and what not.
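
          The vote-threshold behaviour described above (comments sinking below a score cutoff and dropping out of view) can be sketched in a few lines of Python; the point values, clamp range, and default threshold here are illustrative assumptions, not Reddit's or Slashdot's actual rules.

from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    score: int = 0  # net result of peer up/down moderations

def moderate(comment: Comment, delta: int) -> None:
    """Apply one peer moderation (+1 or -1), clamped to an assumed -1..5 range."""
    comment.score = max(-1, min(5, comment.score + delta))

def visible(comments, threshold=0):
    """Readers see only comments at or above their chosen score threshold."""
    return [c for c in comments if c.score >= threshold]

thread = [Comment("insightful take", 3), Comment("obvious troll")]
moderate(thread[1], -1)                   # peers mod the troll down
print([c.text for c in visible(thread)])  # ['insightful take']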

          • That happens here too. I find there's a cycle of about a month where people pile on and then it takes a couple weeks of karma-bait posts (pithy and snarky, with snark directed at some generic bogeyman) to get back. And then you start telling it like you see it and the mod trolls pile on again.

            As for the pizza loons... well, they'll always be there and they'll always find a way to connect. And pedo island was always a wild conspiracy theory before it was suddenly a documented fact, the *denial* of which, or *

            • You can't tech your way out of a human problem. And bad behavior irl is a human problem, not a tech problem.

              But bad behavior online is a tech problem. If your friend is being an idiot at a party by talking about nazi conspiracies or something, you can tell them to shut the fuck up and stop being embarrassing and public shaming might work. Online they'll just go to the next group over which is full of the same morons who'll just self-reinforce their idiocy.

              • by MobyDisk ( 75490 )

                Bad news: your friend just goes to the next group over too. He just doesn't invite you to that party so you don't know...

          • by PPH ( 736903 )

            Why not just look at Reddit, which has a much more general normie audience rather than a subsection of old nerds?

            Not so much 'old nerds' but nerds in general. Nerds tend to be more educated (in hard sciences) than those of the general public (normies). So much less of the hive mind group. Not that we don't get some of them. But the push-back here tends to come from the less gullible.

        • by nagora ( 177841 )

          Posted on wrong thread below for some reason:

          You realize that slashdot has peer moderation,

          You realise that you can't post photos of "amusing" car accidents on Slashdot so no one has to look at them to moderate?

    • That's why peer moderation of one form or another is generally better for all concerned.

      Tell you what, go spend some time in the comments section of Zerohedge and come back and let us know if you still feel this way.

      Short of anything blatantly illegal (a small category that includes direct calls to violence, posting copyrighted information sans permission, and similarly proscribed content under existing American laws), all should be permitted.

      Since it's a private company, I'm pretty sure that content modera

      • Re: (Score:2, Interesting)

        by shanen ( 462549 )

        Wow! You surprised me. Or not so much. I recognize the handle as an often thoughtful and sometimes humorous one. (Related to an old series of one-panel cartoons? (My memory must be playing tricks on me? I remember it as before the human pope of a similar name, but websearch comes up completely dry? Not a comic in the lot. (My apologies. Can you believe that I was confusing your handle with Pope Alien? https://web.archive.org/web/20... [archive.org] from the naughts.)))

        But along the lines of your solution approach, I thin

    • Re: (Score:2, Insightful)

      by geekmux ( 1040042 )

      The abyss stares back.

      Unsurprising. That's why peer moderation of one form or another is generally better for all concerned.

      I'm sorry, but did you just call for moderation to be handed over...to the Karen's of the world?

      For FUCKS sake.

      • by Anonymous Coward

        What is so difficult about the apostrophe? You don't pluralize with an apostrophe, so "Karens".

        The possessive of a noun is indicated by an apostrophe followed by the letter s. Fuck's sake. For the sake of fucks.

        Clear?

        • Technically, “fuck’s sake” is the sake of a single fuck, “fucks’ sake” is the sake of multiple fucks, and “fucks sake” is a rice wine fetish.

          My previous dissertation on whether or not it’s a plural and the theological implications of this can be found here [slashdot.org].

      • I'm pretty sure that "peer" and "Karen" are only synonymous to the Karens.

  • by AppXprt ( 6146386 ) on Sunday April 18, 2021 @11:59AM (#61286764)
    Remember what happens when you train an AI neural network on negatively biased data? It develops a predominantly negative interpretation of everything. This is essentially the same thing. What is the brain if not the ultimate neural network? Training the brain on constant negative material will obviously negatively affect the individual. Now, the same thing applies to multiple occupations; take the police for instance. They are increasingly met with hostility, so they are increasingly hostile and are now killing people left and right. The brain gets trained on what it experiences in the real world; it's called learning. Probably the root cause of "Everything is a nail to a hammer." I often wonder about the "nails'" perspective, especially when it pertains to those being killed by police. We as a society are at fault for developing an increasingly hostile environment.
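
    As a toy illustration of that training-bias point, here is a minimal Python sketch (the data and the scoring rule are made up for the example, not any real moderation or ML pipeline): a naive word-count classifier fed mostly negative examples leans negative even on neutral input.

from collections import Counter

def train(examples):
    """examples: list of (text, label) pairs; returns per-label word counts."""
    counts = {"neg": Counter(), "pos": Counter()}
    for text, label in examples:
        counts[label].update(text.lower().split())
    return counts

def score(counts, text):
    """Crude score: how strongly the input's words overlap each label's training words."""
    words = text.lower().split()
    return {label: sum(c[w] for w in words) for label, c in counts.items()}

# Heavily skewed training set: nine negative examples for every positive one.
training = [("awful terrible day at work", "neg")] * 9 + \
           [("nice pleasant day at work", "pos")]

model = train(training)
print(score(model, "day at work"))  # neutral input, yet it skews negative: {'neg': 27, 'pos': 3}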
  • Wait, which is it? Facebook should review everything we post to keep us all safe, or ... they should stop making their poor employees look at all that?
  • You'd think he would consider it a good use of his time to understand all aspects and users of his system(s). In 15 years or more of Facebook, he should have spent at least a few days now and then sitting with the moderators, considering how much press this aspect gets. This just shows he's more of a sociopath than a leader.
    • by fermion ( 181285 )
      The complication is that in these low-skill jobs there is no need to manage the well-being of the employee, just pay enough and have an efficient hiring process. Combine this with the fact that there is no real quality control, just legal compliance with a process that can be used to prevent lawsuits, and you have an absolutely dreadful situation for workers.

      This is just as it was in The Jungle: bad jobs that injure the workers, which the workers tolerate because they believe that working is a sacrament.

  • by sound+vision ( 884283 ) on Sunday April 18, 2021 @01:12PM (#61286978) Journal

    Jobs like this should really have 50% of their time on the clock be breaks. Have a masseuse on the premises like they do at the main office.

    To make up for the drop in productivity, have every programmer, marketer, and executive working for Facebook spend 10 or 15 minutes a day moderating. It'd be a reality check for a lot of them, to see that what they're selling doesn't really look like this [youtube.com].

    • Have a masseuse on the premises like they do at the main office.

      Get a happy ending to make you even more relaxed?

    • Jobs like this should really have 50% of their time on the clock be breaks.

      I doubt that would happen due to the drop in productivity, and I'm not sure it would really help things much since your productive time is still spent with toxic content.

      Rather, a version of this shows up in one of the links:
      5) This one is in my mind the most important: Reduce the amount of time that people spend in safety queues. All workers should have the option to move to another queue every six months. Further, they shouldn't spend all of their work time in these queues. They should be trained on somet

    • Jobs like this should really have 50% of their time on the clock be breaks. Have a masseuse on the premises like they do at the main office.

      Bollocks. "Moderation" is a cost impacting the bottom line, upsetting the advertisers, and getting in the way of profit margins. The absolute minimum possible will be done about it, at the absolutely lowest possible cost. Which precludes using actual Facebook staff (other dumb-fuck social media are available) in favour of lowest-bid contractor companies, in the lowest-possible

  • Three more dead in a mass shooting in Austin [cnn.com]. Perhaps this person was the shooter who finally snapped after looking at all that crap, or was gunned down with the other two as a way to cover up for them being killed because of their posting.

  • The real problem is people aren't prepared before going into a career. It's pretty obviously by design. Lower-class people are there to work; people with degrees go on fancy lunches, draw theories on whiteboards, and discuss ROI. It's right in your face: you're an insider or you're not. Your actual output or technical skill doesn't matter, because the people in the "upper class" are writing their own metrics to put themselves on top of the totem pole, and the log parsers and API duct tape technicians

  • Really? (Score:1, Funny)

    by dmitch33 ( 6254132 )
    "Content analysts are paid to look at the worst of humanity for eight hours a day." So, they have to read their friends and family's comments?
  • There've been a lot of articles of this type, the 'oh those poor moderators having to work on these awful things' type.
    I agree their jobs are awful. What these ignore, or even obfuscate, is that
    - part of these awful posts are of the 'subculture' type: groups of people exchanging stuff amongst themselves. If you don't like it don't go there. There will be the occasional accidental encounter but not that many. You don't need to clean it all up, mostly the ones that want to spread out.
    - a lot of the posts curr

  • A series of tweets with screenshots attached? =/

  • It's called janitor work on the other sites. And they do it for (almost) free, the joke goes.

    Yes, cleaning up megacorp poops is a mess. And, no, you can't speak to the manager. The job can't get better, because the job is inherently shitty. Dare I say even lower skill than an actual janitor.

  • They should sub out that sort of thing to 4chan. End of problem.
