
Facebook Moderators Are Routinely High and Joke About Suicide To Cope With Job, Says Report (gizmodo.com) 217

According to a new report from The Verge, Facebook moderators in Phoenix, Arizona make just $28,800 a year and use sex and drugs to deal with the stress. "The report published on Monday detailed the experiences of current and former employees who worked at professional services company Cognizant, a company they say Facebook outsources its moderating efforts to," Gizmodo summarizes. "According to the report, employees experienced severe mental health distress, which they coped with by having sex at the office and smoking weed. Some even began believing the conspiracy theories they were tasked with reviewing. One quality assurance manager said he began bringing a gun to work in response to threats from fired workers." From the report: "There was nothing that they were doing for us," one former moderator told The Verge, "other than expecting us to be able to identify when we're broken. Most of the people there that are deteriorating -- they don't even see it. And that's what kills me." "Randy," a quality assurance worker at Cognizant charged with reviewing posts flagged by moderators, said that several times over his year at the company he was approached and intimidated by moderators to change his decisions. "They would confront me in the parking lot and tell me they were going to beat the shit out of me," Randy told The Verge. He also said that fired Cognizant employees made what he believed to be genuine threats of harm to their former colleagues. Randy started to bring a concealed gun to the office to protect himself.

Employees told The Verge that moderators in the Phoenix office dealt with the hellish reality of their jobs by having sex in the office -- in stairwells, bathrooms, parking garages, and a lactation room -- smoking weed on breaks, and joking about suicide. A former moderator claimed that there was a joke among colleagues that "time to go hang out on the roof" was subtext for wanting to jump off the building. Moderators for Facebook have to review graphic posts containing violence, dehumanizing speech, and child abuse, but they also have to weed through the conspiracy theories that run rampant on the web. It's well-reported that the former has resulted in moderators developing PTSD and other debilitating mental health issues, but Monday's report from The Verge indicates that the latter may be causing them to develop fringe beliefs.

This discussion has been archived. No new comments can be posted.

  • by dyfet ( 154716 ) on Tuesday February 26, 2019 @05:05AM (#58181212) Homepage

    Just imagine the AI that will one day get trained on that corpus....

    • by fazig ( 2909523 ) on Tuesday February 26, 2019 @05:22AM (#58181270)
      It lends credence to scenarios like those in Terminator or The Matrix.
    • by djinn6 ( 1868030 )

      AI doesn't get PTSD, or at least, no AI we can create in the foreseeable future will have such a capability.

      • by Anonymous Coward on Tuesday February 26, 2019 @06:03AM (#58181392)

        AI doesn't get PTSD, or at least, no AI we can create in the foreseeable future will have such a capability.

        Don't you remember Microsoft's Twitter bot called Tay?

        "She" was sweet enough at first, but after being subjected to the internet she quickly adopted conspiracy theories and became racist, misogynistic, and generally foul-mouthed.

        MS took her down for some tweaks, and when she came back she was very clearly into smoking weed; it was her favorite topic.

        Not long after that she got stuck in a loop (bot-suicide).

        #botshavefeelingstoo

      • by jythie ( 914043 )
        I don't see why they would not, or at least not an equivalent. PTSD is the brain reacting to a traumatic event and rewiring itself to add extra aversion to particular stimuli. Pretty much any machine learning system has the potential to screw itself up by overcompensating for a spike in some kind of data and then not being able to undo the damage, because now it is part of the network.
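
A loose, purely hypothetical sketch of the failure mode described above (nothing here models PTSD, Facebook, or any real moderation system; every name and number is made up for illustration): a toy online learner with a common 1/t learning-rate decay sees a short burst of extreme data early in its stream, and the normal data that follows never fully undoes the shift, because the spike is now baked into the weights.

```python
# Toy illustration only: a brief spike of extreme data permanently skews an
# online learner, since the decaying learning rate means the later, normal
# data can no longer fully correct the damage.
import random


def train_online(stream, lr0=2.0):
    """Fit a single weight w for y ~ w * x by online gradient descent,
    using a 1/t learning-rate decay (a common online-learning schedule)."""
    w = 0.0
    for t, (x, y) in enumerate(stream, start=1):
        lr = lr0 / t
        w -= lr * (w * x - y) * x  # gradient step on squared error
    return w


random.seed(0)
xs = [random.uniform(0.1, 1.0) for _ in range(1000)]
normal = [(x, 2.0 * x + random.gauss(0.0, 0.1)) for x in xs]  # true slope is 2
spike = [(1.0, 50.0)] * 20  # a brief run of wildly mislabeled points

clean = train_online(normal)
damaged = train_online(normal[:10] + spike + normal[10:])
print(f"clean run:  w = {clean:.2f}  (close to the true slope of 2)")
print(f"with spike: w = {damaged:.2f}  (still biased long after the spike ended)")
```

The analogy is deliberately crude, but the mechanism is the one the comment points at: once an extreme episode has been folded into the parameters, ordinary subsequent experience corrects it only slowly, if at all.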
    • by gweihir ( 88907 )

      Hehehehe, nice! Reminds me of when Watson learned to swear or that MS chatterbot went full fascist...

  • Are you surprised? (Score:5, Insightful)

    by Freischutz ( 4776131 ) on Tuesday February 26, 2019 @05:07AM (#58181224)

    Facebook Moderators Are Routinely High and Joke About Suicide To Cope With Job ...

    Are you surprised? They have to wade through the torrent of raw stupidity that is Facebook comments every moment of every working day. That is bound to destroy your faith in humanity as a species and drive you to the brink of suicidal depression.

    • by alexgieg ( 948359 ) <alexgieg@gmail.com> on Tuesday February 26, 2019 @05:24AM (#58181284) Homepage

      This is the kind of job best suited for psychopaths. I don't mean that in jest. A psychopath doesn't become ill from seeing any of this; their mind is wired such that it doesn't affect them. And there are high-functioning, non-murderous psychopaths who'd do this job if the pay were high enough.

      Alas, no company would want that cost, so psychologically damaging sane individuals in exchange for saving money it'll be, at least until laws protecting workers from psychological harm are enacted and enforced with the same rigor as laws protecting workers from bodily harm.

      • Granted, they are high-functioning psychopaths who in general pull their own weight in society. However, the lack of empathy may not be good for the job at hand. Sure, the content doesn't bother them, but because it doesn't bother them they probably don't see the need to moderate it.

      • The pay is the problem. $28k is pennies; such people are usually found in management, making an order of magnitude more money than that.

        • Depends on where you live... $28k/yr is pretty decent entry-level money in Mississippi, parts of Arkansas, Louisiana, Alabama, even bits of Florida...

          • And you think that in those areas a person without conscience or remorse couldn't find a better job? How about politics?

      • by dryeo ( 100693 )

        Which raises the question of whether there are enough psychopaths to fill all these shit jobs. It seems like something is seriously wrong with society when so many jobs really need psychopaths to handle them, or to excel at them in the case of the top (management-type) jobs.

    • Now consider the fact that the Facebook content comes from a fairly wide cross-section of the voting public. Is it time to update Churchill's famous quote, "The best argument against democracy is a 10-minute conversation with an average voter"? Substitute "a 10-minute conversation with an average voter" with "spend 10 minutes as a Facebook moderator"?

      • The best argument against Facebook is a 10 minute conversation with the average Facebook user.

        • There are plenty of legitimate uses for FB, e.g. organizing events.

          You obviously are not an FB user, so why do you claim things about stuff you have no clue about?

          • It was a play on ol' Winston's saying about democracy.

    • by AmiMoJo ( 196126 ) on Tuesday February 26, 2019 @07:02AM (#58181524) Homepage Journal

      It's probably less the stupid conspiracy shit and more the suicidal children, images of self harm, friends trying desperately and ineffectively to help, groups encouraging anorexia...

      Seeing people genuinely suffering is one of the common causes of PTSD in soldiers and aid workers, for example.

      • That's why psychopaths were suggested further up. Hearing these things barely affects a psychopath, if at all, and he could easily moderate it with zero impact on his own well-being.

        Unfortunately management pay is better.

      • The original article talks about the training session where they have to watch ISIS snuff videos and then tell the class why they violate Facebook's TOS.

        There is far worse out there than the shit you listed, and it's all on Facebook.

    • Plus now that this has been publicized, I'm sure the response from corporate will be to take their weed away. That's a lot easier than providing ongoing counseling and mental health support.

    • Back during the .com boom I worked for a company that trained porn filters. Luckily I was in IT, but the people they hired to grade content all day quickly became perverts.

  • by wolfheart111 ( 2496796 ) on Tuesday February 26, 2019 @05:18AM (#58181260)
    What the Fc
  • by ketomax ( 2859503 ) on Tuesday February 26, 2019 @05:24AM (#58181282)

    which they coped with by having sex at the office and smoking weed.

    Are they hiring?

    • which they coped with by having sex at the office and smoking weed.

      Are they hiring?

      . . . maybe we could make IT development more popular by replacing our "scrum" with an "orgy" . . . ?

      • by Shotgun ( 30919 )

        which they coped with by having sex at the office and smoking weed.

        Are they hiring?

        . . . maybe we could make IT development more popular by replacing our "scrum" with an "orgy" . . . ?

        Have you seen the people (and I use that term loosely) I work with?

    • by Opportunist ( 166417 ) on Tuesday February 26, 2019 @07:49AM (#58181696)

      Yes. But as a tip, bring your own lube. After all, you'll be the new guy.

    • which they coped with by having sex at the office and smoking weed.

      Are they hiring?

      Only Rockstar develo.... I mean, moderators.

    • which they coped with by having sex at the office and smoking weed.

      Are they hiring?

      That was my (satirical) thought, lol.

      Not that I actually want to work there; my point is just that plenty of people manage to cope with stress without resorting to these behaviors.

      Something tells me they'd be doing the same stuff down at the local car wash, if not at Facebook.

      • Re:Awesome Workplace (Score:5, Interesting)

        by _merlin ( 160982 ) on Tuesday February 26, 2019 @09:03AM (#58182014) Homepage Journal

        I work in finance, and a lot of people in this business cope by drinking coffee while their biggest problem is staying awake, then switch to alcohol. There are lots of high-functioning alcoholics (I was one for a few years, but weaned myself off). Plenty of people smoke weed after work or take cocaine on the weekends. Also, some guys hire prostitutes to talk through their day before going home to their family. (Prostitutes are cheaper than shrinks, work at hours more convenient for you if you have a day job, and will happily listen to all your problems, offer sympathy, and not tell anyone about it. You don't even need to have sex with them, although that's an option. They may also be able to give you a massage, sing karaoke with you, and other stuff.) But in general this kind of thing happens outside the office. The vices in the office are just the caffeine and alcohol.

  • by goose-incarnated ( 1145029 ) on Tuesday February 26, 2019 @05:38AM (#58181324) Journal

    Violence and child abuse is now the same as dehumanising speech?

    • by jareth-0205 ( 525594 ) on Tuesday February 26, 2019 @07:51AM (#58181708) Homepage

      Violence and child abuse is now the same as dehumanising speech?

      Probably depends on how much and how often you have to deal with it. Speech matters.

      Having to deal day-in-day-out with the conspiracy nuts, literal nazis, threats of violence etc., after a while, little by little, that's going to change you. That's exactly what they're talking about. Since you or I haven't done that job we aren't in a good position to judge what it's like.

      • Violence and child abuse is now the same as dehumanising speech?

        Probably depends on how much and how often you have to deal with it. Speech matters.

        Having to deal day-in-day-out with the conspiracy nuts, literal nazis, threats of violence etc., after a while, little by little, that's going to change you. That's exactly what they're talking about. Since you or I haven't done that job we aren't in a good position to judge what it's like.

        No one claimed it wouldn't change you, but when you place speech in the same category as child abuse, you're trivialising child abuse.

        • We're not talking about child abuse in this context though - they're talking about looking at images and discussions of it: speech about child abuse. That's a pretty serious difference; it's not like the moderator is going to suffer the trauma of the abuse that they're seeing and reading about. (Presumably anyone who had suffered such a thing themselves, and whose past traumas would be invoked, would stay far away from such a job.)

          Do you really think reading someone's post about raping children is going to be dramat

    • by dabadab ( 126782 )

      Violence and child abuse is now the same as dehumanising speech?

      In the case of moderators we are not speaking about actual violence but images / descriptions of violence and child abuse, and yes, that may be on the same level as dehumanising speech.

      • by Stan92057 ( 737634 ) on Tuesday February 26, 2019 @08:24AM (#58181834)
        Never ceases to amaze me how some people think that because it's said on the internet, words don't hurt, that it's somehow not real.
        • Never ceases to amaze me how some people think that because it's said on the internet, words don't hurt, that it's somehow not real.

          Just because it hurts and is real doesn't make it the same as violence and child abuse.

          When you compare speech to child abuse, you're trivialising child abuse.

          • Please rank all crimes for us so we don't make this mistake again. I'll wait...
            • Please rank all crimes for us so we don't make this mistake again. I'll wait...

              I don't need to rank them all, just the ones they want to equate.

              Child abuse is worse than nasty speech!

              Only twats think otherwise.

              • And yet, clearly, you are getting all upset, stressed, and emotional about mere words - words which are about people having to examine reported cases of suspected child abuse on a web site, i.e. a medium where said abuse is found primarily in the form of text and images.

                So... At least 5 degrees of separation and abstraction away...
                And there you are shouting, all boldface and exclamation points, that certain text is far worse than other text... because text can't be in the same category as text.

                Hmm...
                So

              • And you're continuing to miss the point. In this context, the mental health of the moderators, there is no child abuse. There are only discussions and images of child abuse. It really sucks for the kid (assuming there's an actual kid involved), but they're outside the scope of this discussion.

                Moderators may be traumatized by looking at such images and reading such posts, but it will not be anything remotely like actually suffering such abuse themselves.

          • Re: (Score:2, Informative)

            by Stan92057 ( 737634 )
            You do know that you can destroy a child with words, right? "You're fucking useless, you will never be any good to anyone, you're fucking stupid, I hate you, go away, I don't have time"... Words hurt just as much as the belt, even more so; it scars kids for life. Verbal abuse of a child IS CHILD ABUSE, moron.
          • When you compare speech to child abuse, you're trivialising child abuse.
            A good deal of child abuse is speech ...

            And there is nothing trivial about abuse/harassment by speech.

            • It's a good thing we have freedom of speech to protect ourselves against fascists like you.
              • Wow, no one ever called me a fascist before.

                Hint: you can google what the term actually means.

                • Well, how does it feel to be on the same side as fascists? Who seeks oppression rather than freedom of speech? There's only ONE side doing that in politics today. When you suppress free speech with violence, you are not fighting fascism - you are the fascist!
                  • When you suppress free speech with violence, you are not fighting fascism - you are the fascist!
                    You seem to be mixing me up with someone else.

                    I'm a democrat, not a fascist.

      • My personal view:

        Seeing child abuse images would be orders of magnitude worse than the worst hate speech you could produce. I've read no end of sweary, hateful screeds on the internet. They aren't pleasant reading, but I find I can shrug my shoulders and move on, sometimes even laugh as I imagine the spittle-flecked, red-faced, raging keyboard hammerer who wrote it.

        Not so with images. I've never seen any child abuse images, and I never want to. I've seen some gore / death type photos though, and those were

  • by grumbel ( 592662 ) <grumbel+slashdot@gmail.com> on Tuesday February 26, 2019 @05:57AM (#58181376) Homepage

    This doesn't really add up. There are around 4,500-7,500 moderators at Facebook, and while there is a lot of terrible stuff on the Internet, most of it could be automatically filtered away by content ID after it has been identified once. Furthermore, most users wouldn't even be stupid enough to post that stuff on Facebook in the first place, since that gets your account blocked and there are more appropriate places for it on the Internet. I doubt that leaves enough content to damage thousands of moderators.
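
As a rough, hypothetical sketch of that "identify once, then filter automatically" idea (not a description of Facebook's actual pipeline; every name here is invented for illustration): keep fingerprints of content a human has already classified, and only send never-before-seen uploads to moderators. Real matchers use perceptual hashes that survive re-encoding and cropping; this toy version uses a plain SHA-256 digest, so it only catches byte-identical re-uploads.

```python
# Hypothetical sketch: auto-handle re-uploads of content a human has already
# reviewed, so only novel material reaches a moderator's queue.
import hashlib


def fingerprint(data: bytes) -> str:
    """Stand-in for a perceptual hash; here just an exact content hash."""
    return hashlib.sha256(data).hexdigest()


class KnownContentFilter:
    def __init__(self):
        self.blocked = set()

    def add_known_bad(self, data: bytes) -> None:
        """Record content once a human moderator has classified it."""
        self.blocked.add(fingerprint(data))

    def needs_human_review(self, data: bytes) -> bool:
        """Only content nobody has classified before goes to a human."""
        return fingerprint(data) not in self.blocked


f = KnownContentFilter()
first_upload = b"bytes of some image a moderator reviewed and removed"
f.add_known_bad(first_upload)
print(f.needs_human_review(first_upload))   # False: re-uploads are auto-handled
print(f.needs_human_review(b"new upload"))  # True: still needs a human
```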

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      "most users wouldn't even be stupid enough to post that stuff on Facebook in the first place"

      You vastly overestimate the intelligence of your fellow man.

      Pizzagate raged for months, and QAnon is still going strong. A significant portion of the US population is dumb enough to believe that nonsense - and to post repeatedly about it every day. And that's just the US nutters. Facebook reported 2,200,000,000 monthly active users globally in 2018. It has repeatedly been acknowledged that the number of moderators is insufficient

  • How do those Facebook moderators get the details of the Cognizant workers, such that they are able to track them down and threaten them?
    At least hide the identities so they don't need to worry about what crazy people might do to them for reviewing their posts.

    • Doesn't Facebook require real names? That, paired with the Cognizant corporate employee directory should be enough to ID the moderator.
    • by Cederic ( 9623 )

      Well, for a start they all sit in the same office.

      I suspect they also get performance ratings based on whether their moderations get overturned, and that gives them tremendous incentive to avoid that happening.

      I can easily believe that the meta-moderator is expected to provide a written reason for overruling, and you'd soon learn who writes in which styles.

  • by bradley13 ( 1118935 ) on Tuesday February 26, 2019 @08:03AM (#58181758) Homepage

    Consider a small, isolated community: If someone acts like a jackass, they will be socially shunned. If they persist in acting like a jackass, someone bigger and meaner will take them out behind the shed and "learn 'em". If they still persist, they will ultimately be run out of town.

    In more civilized climes, the community hands over some of this responsibility to the government. There are laws about stalking, harassment, and the like. Ultimately, the punishments aren't all that different.

    The problem in public, online communities is the lack of hard-and-fast identity, so that punishments can be applied. Sure, an account gets banned - but the person just makes another account. There's no "shed", and no real way to run the perp out of town. Moderation becomes nothing but a gigantic game of whac-a-mole - it's almost completely pointless.

    It seems to me that part of the solution is to regain those small communities, by making online communities mostly private. Participants have to be invited; which means that they can easily be permanently disinvited. Just creating a new account won't garner an invitation to join.

    Taking Facebook as the example (since it's the subject of TFA): Why should any profile be open to public comments? Let a profile show enough information for people to find you. But any interaction - posting or whatever - should require an explicit invitation. No invite for the asshat, and the person will never know they exist. And if you're a member of a group where people are saying bad things? Leave, problem solved.

    If some asshat wants to post unpleasant stuff, they are absolutely free to do so - on their own profile, where only the people they invite will ever see it. It won't bother anyone else. But, but...what if they post something I don't like? Waaah!

    - Fake news? Unpopular opinions? Let the invite-only groups entertain themselves. It's no one's business, and any intervention is really just censorship. Stupid people exist, and who knows, maybe we're actually the stupid ones. Maybe it really is turtles all the way down.

    - Illegal material? Call the police, that's why they exist. Don't moderate - that's evidence tampering. Do what the police request, whether that's deleting the material, or leaving it up as evidence.

    • The problem in public, online communities is the lack of hard-and-fast identity, so that punishments can be applied. Sure, an account gets banned - but the person just makes another account. There's no "shed", and no real way to run the perp out of town. Moderation becomes nothing but a gigantic game of whac-a-mole - it's almost completely pointless.

      This was the problem I (and a bunch of other users) faced when I attracted the attention of a cyber-stalker on Twitter a few years back. She was convinced that

  • ...they're just like the rest of us.

  • $6.9 billion in profits divided by 30,000 employees working on safety and security = $230,000 in profit per person in that division.

    That math may be incorrect if contract content moderators aren't included in the 30,000 employees, and I'd be happy to have more accurate numbers. Still, it's clear where the profits are coming from: Investors are making money off of the fact that there are people desperate enough for a job that they're willing to do this job for shit wages.

  • The UFOs are real! The government knows and keeps quiet! Men in Black come and make people who know too much disappear!!!

    For moderator: if you're female and frustrated, text me on 1-212-555-1234 for a good time!
  • We are used to professional-level environments, but at this income level this is more of a restaurant or call-center crowd, and, as in those jobs, these are probably mostly young people. Their managers are probably parents as much as bosses.

    This all sounds pretty similar to earlier in life when I worked in those kinds of environments. It's less people having sex and using drugs to cope with work than just people having sex and using drugs because sex and drugs are a great way to pass the time with coping a

  • But I'm sure the job has other perks as well.
  • Couldn't you just replace "Moderators" with "Employees" and still have a valid statement? My guess is a good fraction of that company is high and mentally distressed.

  • And Facebook is NEVER to blame for all the ads they push onto member newsfeeds without their consent? I absolutely despise those intrusions and I do everything to make the moderators' job as miserable as possible by vigorously reporting the ads, especially the scams like junk health treatments. Advertising storage units? Reported as "s3xually inappropriate". Restaurants? Reported as "political issue". Entertainment? Reported as "prohibited content". And yes I exclude as much personal info as possibl
  • I mean really... this is textbook for your average restaurant. I'd say the job responsibility of surfing this type of content would encourage more of it (read porn all day, have more sex in the office; read hate speech all day, have anger issues; read conspiracies and fake news all day, develop more of a broken perception of the world), but honestly... all of those levels are so high in a kitchen that we're talking about shades of grey here!

  • moderating while not thoroughly baked.

    [insert *anything* for 'moderating'.]
