
Facebook Is Not Protecting Content Moderators From Mental Trauma, Lawsuit Claims (reuters.com) 210

A former Facebook contract employee has filed a lawsuit, alleging that content moderators who face mental trauma after reviewing distressing images on the platform are not being properly protected by the social networking company. Reuters reports: Facebook moderators under contract are "bombarded" with "thousands of videos, images and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder," the lawsuit said. "Facebook is ignoring its duty to provide a safe workplace and instead creating a revolving door of contractors who are irreparably traumatized by what they witnessed on the job," Korey Nelson, a lawyer for former Facebook contract employee Selena Scola, said in a statement on Monday. Facebook in the past has said all of its content reviewers have access to mental health resources, including trained professionals onsite for both individual and group counseling, and they receive full health care benefits. More than 7,500 content reviewers work for Facebook, including full-time employees and contractors. Facebook's director of corporate communications, Bertie Thomson, said in response to the allegations: "We take the support of our content moderators incredibly seriously, [...] ensuring that every person reviewing Facebook content is offered psychological support and wellness resources."
This discussion has been archived. No new comments can be posted.


  • The new America. (Score:1, Interesting)

    by Anonymous Coward

    A country of weaklings. If you don't think you can handle that shit (and I'm sure it is horrible) don't take the fucking job. Butch the fuck up.

    • You can't unsee. (Score:5, Insightful)

      by Anonymous Coward on Tuesday September 25, 2018 @02:37AM (#57371940)

      I've read that for the police officers who work combating child porn, there's stuff you can't unsee.

People who have to deal with that to keep the rest of us from seeing it should have reasonable resources and therapy available to deal with it. I'm not talking carte blanche, but something serious if they need it. Just because you can find someone to work a job without that support doesn't mean it's okay to mess people up for doing their job. You can find people to work a sawmill even if you don't give them health insurance for when the saw takes off a hand, but it's still not okay.

      • by mysidia ( 191772 )

        I've read that for the police officers who work combating child porn, there's stuff you can't unsee.

The way I see it, Facebook should give moderators a "This content is illegal/shocking/excessively grotesque" checkbox and partner with law enforcement to try to track down the source of the image and hold the uploader responsible.

Also, if an end user uploads a violating picture or knowingly sends a direct link to a violating picture (for example, if the picture appears in the thumbnail), then that person
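A minimal sketch of that flag-and-refer idea in Python, assuming a hypothetical moderation backend; FlagReason, ModeratorFlag, remove_content, and handle_flag are illustrative names, not anything Facebook actually exposes:

```python
from dataclasses import dataclass
from enum import Enum


class FlagReason(Enum):
    ILLEGAL = "illegal"
    SHOCKING = "shocking"
    EXCESSIVELY_GROTESQUE = "excessively_grotesque"


@dataclass
class ModeratorFlag:
    content_id: str
    uploader_id: str
    reason: FlagReason
    notes: str = ""


def remove_content(content_id: str) -> None:
    # Placeholder for the real takedown path.
    print(f"removed {content_id}")


def handle_flag(flag: ModeratorFlag, law_enforcement_queue: list) -> None:
    """Take down flagged content; queue illegal material for referral so
    investigators can try to trace the source and hold the uploader responsible."""
    remove_content(flag.content_id)
    if flag.reason is FlagReason.ILLEGAL:
        law_enforcement_queue.append({
            "content_id": flag.content_id,
            "uploader_id": flag.uploader_id,
            "notes": flag.notes,
        })
```

The point is just the split: the takedown happens regardless, while only items flagged as illegal go into a referral queue for law enforcement to chase down the source.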

    • by Calydor ( 739835 )

      So if a construction job says they have a team of medics on the scene in case you get injured, and then you fall from a three story scaffolding and break your arm only to be told to suck it up or GTFO, that's okay in your world because you should have been able to handle breaking your arm?

      • Re: (Score:1, Insightful)

        by cvdwl ( 642180 )

This is more like working construction and suing because your muscles are sore at the end of the day. If you aren't physically, mentally, or emotionally capable of handling a job, don't take it! I'm sensitive and north of 50; neither content moderation nor construction is in my future career path. I can live with that.

        If the scaffolding was defective, you have a case; if you fall off well-constructed scaffolding with all safety protocols in place, you HAVE screwed up. Yeah, workman's comp should cover

      • No - this would be more like the medic refusing to help you because they have already seen to many traumatic pictures browsing facebook while on the way to the accident.

        • I'm traumatized because I keep seeing posts from a moron who doesn't know the difference between "to" and "too", and he keeps trying to sound intelligent but keeps blasting his stupidity to the world instead!
      • Comment removed based on user account deletion
      • So if a construction job says they have a team of medics on the scene in case you get injured, and then you fall from a three story scaffolding and break your arm only to be told to suck it up or GTFO, that's okay in your world because you should have been able to handle breaking your arm?

I think there is a BIG difference between physically breaking bones (or worse) ... vs. being offended by 'naughty pictures'.

    • by Gravis Zero ( 934156 ) on Tuesday September 25, 2018 @03:01AM (#57371982)

      A country of weaklings. If you don't think you can handle that shit (and I'm sure it is horrible) don't take the fucking job. Butch the fuck up.

And what if you do think you can handle it but end up with PTSD? Yes, it can cause PTSD. Also, how does your perspective align with soldiers? Are you going to tell the ones who saw their friends blown to pieces that they should "Butch the fuck up" when they're having a flashback?

I for one would love to draft all the ACs like you into being content moderators until you squeal at the very sight of a webpage loading.

You're seriously comparing someone sitting in their comfy office moderating videos and pictures to someone who is five thousand miles from home and family, half blinded and deafened by war, witnessing first hand their own friends literally getting blown apart while their own life is in peril? Because that's what it essentially reads like.

      • I am not really trying to argue with you here. I just want to clarify a point:

Also, how does your perspective align with soldiers? Are you going to tell the ones who saw their friends blown to pieces that they should "Butch the fuck up" when they're having a flashback?

Long story short, I walked into the movie Saving Private Ryan without any clue whatsoever about its content. The theater was full and there was only one seat available, front row center.

        I sat down, saw some old guy at Arlington, and thought to myself, "Fuck. Why did I come to see this movie? This is going to be another "feel good" boring story."

        So the old guy fades out and the next 20 minutes or so were just fucking intense. To

    • This is why you can't get a date.

    • A country of weaklings. If you don't think you can handle that shit (and I'm sure it is horrible) don't take the fucking job. Butch the fuck up.

      This is nothing but attention-whoring and looking for a fat lawsuit payday from deep-pocketed FB.

      I'm laughing at the fact that FB is reaping some of what it's sown in promoting Leftist-snowflake thinking and attitudes. "You're either an Oppressor or one of the Oppressed!"

      I knew this shit was coming. Leftists, given sufficient time, always end up eating their own. Just look at how the MSM has changed how they treat the Clintons now, the kid-gloves have (at least somewhat) come off.

      "Hey FB workers! FB should

      • Again I have to ask: what has being left or right to do with that?
        Why is "lefty" an insult in the USA?
        Or more precisely: why do you insult people who have a "left attitude" in politics?
Or even more precisely, why is everything you don't agree with "left"??? Is that the new newspeak, as in "sinister" meaning "left", and you are too dumb to use the proper term, so you replace "sinister" with "lefty" to make a point?

        No offense, just wondering why people always use the term lefty ... you are against nuclear power,

        • Re: (Score:2, Insightful)

          by BlueStrat ( 756137 )

          Again I have to ask: what has being left or right to do with that?
          Why is "lefty" an insult in the USA?
          Or more precisely: why do you insult people who have a "left attitude" in politics?

Because Leftists are basically Marxists with various flavors of authoritarianism and tyranny (e.g. socialism, communism). Marxism in its various forms has killed more of its own citizens than disease, starvation, or wars between foreign nations.

          All forms of Marxism are authoritarian by their very nature, as it forces individuals to act in the best interests of the collective and the State, not necessarily in their own best interests. That is evil.

Marxism in its various forms is actually and literally wor

          • and people on the right are not authoritarian at all (or should I ignore the Religious fundamentalist moral police)
            • and people on the right are not authoritarian at all (or should I ignore the Religious fundamentalist moral police)

              Yep, you killed the hell out of that strawman.

              I have just as much of a problem with the religious-fundamentalist moral police as I do with Leftist cultural/PC moral police.

              I'm a small-"L" libertarian. I'm forcing my ideology on you *right NOW* because I'm leaving you the hell alone. Horrors!

              Strat

      • As usual you are spewing some phenomenally stupid shit. I'm pretty sure most people are not of the opinion "just watch the child pedo and tough the fuck up." I do however understand why you don't see why most people find the task objectionable.
        • As usual you are spewing some phenomenally stupid shit.

          Which you seem unable to refute with facts.

          I'm pretty sure most people are not of the opinion "just watch the child pedo and tough the fuck up."

No, they're of the opinion: "Why did you take the job then? Find another if you don't like it. It's not like you're unskilled or you wouldn't have been hired there in the first place. Stop being a snowflake."

          I do however understand

This statement is not borne out by your comments.

          Strat

  • by Mal-2 ( 675116 ) on Tuesday September 25, 2018 @02:29AM (#57371924) Homepage Journal

    It looks like it's time to call in a relief crew and let them get a rest. Call in the B Team -- or the /b/ team, rather.

    You get all the best talent when they do it for free [knowyourmeme.com].

  • So they now need content monitors for the content monitors? Until the content monitors' content monitors sue for their own content monitors. I foresee a problem here. At least until we train our nascent AI overlords by feeding them an endless stream of bestiality and beheading videos, then all our problems will be solved!
I guess those photos of a 30-year-old professional model's nipples must be really traumatizing. That's why the model is suspended for 30 days, because of the horror inflicted on the poor moderators.

  • by wierd_w ( 1375923 ) on Tuesday September 25, 2018 @03:15AM (#57372012)

    Given that it is impossible to do what this disgruntled worker demands (which is to pre-filter the offensive content, that THEY are hired to filter!!), what I see happening instead is the addition of new job requirements...

The applicant for this position must have demonstrated a complete lack of empathy or emotional reaction to offensive media, and must be able to endure hours of review in the detection, flagging, and removal of such media. Items to which the applicant must have dulled reactions include, but are not limited to: deep-fake pornography, child abuse (including sexual exploitation of children), animal abuse (including sexual exploitation of animals), and offensive political rhetoric.

    Applicants are expected to work overtime as instructed by supervisors to meet platform quality standards at corporate mandated deadlines, including weekdays and holidays, as required.

    So, Mr Smith. I assume that you have reviewed and agreed to our initial screening waiver while we administer the 4CHAN-Reddit test battery to determine your candidacy for this position-- are you ready to proceed?

Excellent! This equipment will measure your emotional responses to the images and other content in this test battery, which have been selected at random from some of the most infamous places on the internet, and which represent a sampling of the worst kind of content humans are able to produce. The test will last 30 minutes, after which we will review your data and inform you if you have made our candidate list.

    (Begin horror scene from A Clockwork Orange)

    -----

    You know, that kind of thing.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      Given that it is impossible to do what this disgruntled worker demands (which is to pre-filter the offensive content, that THEY are hired to filter!!), ...

Where are you getting this from? It sounds like the fundamental complaint is that Facebook is not providing adequate resources to reasonably deal with the psychological harm from the job of filtering offensive content, and is specifically using contractors to avoid any long-term worker compensation costs for counseling. Or is your point that it's not viable f

    • by AmiMoJo ( 196126 ) on Tuesday September 25, 2018 @06:43AM (#57372366) Homepage Journal

      There are ways to limit the harm that this sort of work can do. Military orgs around the world have been studying it for decades, to try to prevent their soldiers getting PTSD and becoming ineffective. They have also been studying how to make it worse, as a tactic to use against the enemy.

One example would be limiting exposure. Rather than doing this for 8 hours a day, 5 days a week, they might only be assigned half an hour a day, with the option to continue for up to, say, two hours if they feel they are okay to do that. The limited exposure and granting of some control over the process really helps psychologically.

Of course the problem for Facebook is that they don't have enough staff already, and reducing everyone from 37.5 hours/week to 2.5-10 hours will mean they have to hire a huge number more, and either make them part time or find them other work to do in the meantime.
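A rough sketch of that exposure-limiting idea, with the 30-minute baseline and two-hour opt-in cap taken from the comment above; the class and field names are made up for illustration:

```python
from dataclasses import dataclass

BASE_MINUTES = 30    # default assigned review time per day
MAX_MINUTES = 120    # hard cap even when the reviewer opts in for more


@dataclass
class ReviewerDay:
    minutes_reviewed: int = 0
    opted_in_for_more: bool = False

    def may_continue(self) -> bool:
        """True while the reviewer is still under today's exposure limit."""
        limit = MAX_MINUTES if self.opted_in_for_more else BASE_MINUTES
        return self.minutes_reviewed < limit

    def log_session(self, minutes: int) -> None:
        """Record another block of review time."""
        self.minutes_reviewed += minutes
```

The opt-in flag is the part that matters psychologically: the reviewer, not the queue, decides whether to go past the baseline.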

Oh, I had no problem working a 40h shift to review content.
I would simply approve every picture ... who am I to censor what a brave citizen wants to post on his own timeline? /sarcasm

      • by mysidia ( 191772 )

One example would be limiting exposure. Rather than doing this for 8 hours a day, 5 days a week, they might only be assigned half an hour a day

        What they should do is give their reviewers a numerical scale for "Grotesqueness of the Image/Content":

        0=Benign/Keep, 1=Questionable, 2=Patently Offensive, 3=Obscene or Policy Violation (DELETE), 4=Extremely Obscene, 5=Grotesque, Repulsive, Shocking

And carefully monitor their employees' reactions to the images --- both by counting the number of 4s or 5s
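A minimal sketch of that 0-5 scale plus the monitoring side; the 20-items-per-day threshold is an assumption for illustration, not anything from the article:

```python
from collections import Counter
from enum import IntEnum


class Severity(IntEnum):
    BENIGN = 0              # keep
    QUESTIONABLE = 1
    PATENTLY_OFFENSIVE = 2
    POLICY_VIOLATION = 3    # obscene or policy violation -> delete
    EXTREMELY_OBSCENE = 4
    GROTESQUE = 5           # repulsive / shocking


SEVERE_PER_DAY = 20  # assumed cut-off; the real number would come from clinicians


def needs_check_in(todays_ratings: list[Severity]) -> bool:
    """Count how many 4s and 5s a reviewer rated today and flag them
    for a wellness check-in once the assumed threshold is exceeded."""
    tally = Counter(todays_ratings)
    severe = tally[Severity.EXTREMELY_OBSCENE] + tally[Severity.GROTESQUE]
    return severe > SEVERE_PER_DAY
```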

Maybe have everyone in the company moderate for 5 or 10 minutes a day. Spread the burnout evenly across the available employees if you don't want to hire more. Otherwise - problems. Problems that many companies have. Facebook isn't unique there.
  • Should somebody else watch all videos and moderate them for the moderators? Apparently, logical thinking is not available to the ones complaining here...

Also, I very much doubt that much illegal content gets uploaded to Facebook, where it should be pretty easy to identify who did it. People getting traumatized by legal content, on the other hand, should not agree to do this job in the first place.

Also, I very much doubt that much illegal content gets uploaded to Facebook, where it should be pretty easy to identify who did it.

      You'd be surprised...

I'm really wondering who sees the content. Don't you usually just see stuff from your friends? Who is befriending the people posting the illegal content? I've never seen anything like the content in question on Facebook. But I don't usually befriend random people I don't know. If somebody started sharing that kind of content, I would promptly block them.

        • by swb ( 14022 ) on Tuesday September 25, 2018 @09:50AM (#57373090)

          I think there's a whole weird world of Facebook that ordinary people who have friend lists that mostly mirror their real lives never see.

My guess is it's made up of people making low-end money pushing scams and social-media-as-a-career, various swaths of low-income populations, bored and lonely shut-ins who will friend/like anything and have zero privacy settings, and then the truly weird and crazy bottom end of the population.

          Plus, it's an international system. You can participate in high weirdness outside your geography.

          I've been in lots of bars, but I've never seen a bar fight, gang rape or other type of horrible thing in a bar. I think it mostly just means I don't associate with those kinds of people or go to those kinds of bars, not that they don't exist.

Yeah, he would be surprised.

In Germany we had a case where four boys, about 15 or 16, raped a 14-year-old girl.

Surprisingly, the girl went straight to the police!
More surprisingly, the boys thought "we just had sex with a slut" and uploaded it to Facebook.

        They got them just a few hours after the incident.

        Luckily, the police got them first ...

        • by gweihir ( 88907 )

          And how often does that happen? Enough to give thousands of moderators a lot of incidents of that nature? No. It does not. This is so rare that it makes the national news.

Of course it is rare.
I posted that only to support the parent's point: there is no limit to human stupidity.

Ok, but while this occurrence happens rarely, it can be downloaded and reposted a thousand times by various people, for various reasons. Then it's like a cancer which moderators fight to eradicate.
Then you have car accidents captured on Facebook Live, security cam footage of horrible scenes being uploaded, and so on. Imagine LiveLeak-style content posted on Facebook a thousand times over and having to be removed.

I worked as a field cameraman between 1999 and 2000, for a local TV station in a town of 50,000 people.

    • I was going to comment the same thing. Apparently they want content moderators to filter the uploaded content before the content moderators filter the uploaded content.
    • Re: (Score:2, Interesting)

      by Anonymous Coward

      Should somebody else watch all videos and moderate them for the moderators? Apparently, logical thinking is not available to the ones complaining here...

That's called the users, numbnuts. Moderators nominally only get involved after users report "illegal content". Ergo, these moderators are specifically watching content that has a substantial probability of being illegal.

Also, I very much doubt that much illegal content gets uploaded to Facebook, where it should be pretty easy to identify who did it. People

  • by Opportunist ( 166417 ) on Tuesday September 25, 2018 @04:03AM (#57372086)

    I'm asking for a friend...

  • by Anonymous Coward

When was the last time we learned something good about the people who run Facebook, Twitter, Reddit, or Amazon?

  • by ruddk ( 5153113 ) on Tuesday September 25, 2018 @04:10AM (#57372108)

Facebook: who is it good for, except the owner?
What could go wrong when someone who seems to be anti-social creates a "social" website? :D

  • by Rande ( 255599 ) on Tuesday September 25, 2018 @04:12AM (#57372114) Homepage

There are some people who must enjoy looking at this stuff (otherwise it wouldn't get posted), so why not just hire them?

    Psychopaths need jobs too you know.

I thought they had those content moderators mostly in the Philippines.
    source: Vice: The Companies Cleaning the Deepest, Darkest Parts of Social Media [vice.com]
  • I can believe it (Score:5, Insightful)

    by Millennium ( 2451 ) on Tuesday September 25, 2018 @05:54AM (#57372274)

    Law enforcement officers who work on child-pornography cases have their own specialized therapists. There's even a name for the stuff they face: Secondary Traumatic Stress Disorder (STSD), brought on by repeatedly witnessing events that traumatize people.

    It wouldn't surprise me at all if many Facebook mods needed this same kind of treatment, given the stuff they have to deal with. Hell; some Slashdot mods could probably use it.

You've got to be kidding... I mean, you only use moderators who are cleared (psychologically) as being able to watch those streams. You don't just put anybody on moderating content.
But if it pays better than my current (development) job, where can I sign up? Still looking for content that really traumatizes me; still haven't found it. (PS: that doesn't mean I enjoy the content, it just means I'm not traumatized by it.)

Maybe just inform people before they upload videos that all videos are subject to review and that illegal content will be submitted to law enforcement, after which Facebook will do everything in its power to help prosecute you. Sure, that will reduce the number of people caught doing bad things, but that's not Facebook's problem. They're under no obligation to police people.
  • This would be an ideal job for an expert system. Not the silly Google Assistant (or whatever it's called today), Alexa or Siri, which are good for grins and giggles, and little more. An expert system able to go over all those images and automatically discard the immense majority of the filth would be invaluable.
    • by gweihir ( 88907 )

      Also quite infeasible, since expert systems cannot do this. You would need strong AI, but that does not exist and may well never exist.

  • Buy job ads on rotten.com.
