
More Than 140 Kenya Facebook Moderators Diagnosed With Severe PTSD (theguardian.com) 56

An anonymous reader quotes a report from The Guardian: More than 140 Facebook content moderators have been diagnosed with severe post-traumatic stress disorder caused by exposure to graphic social media content including murders, suicides, child sexual abuse and terrorism. The moderators worked eight- to 10-hour days at a facility in Kenya for a company contracted by the social media firm and were found to have PTSD, generalized anxiety disorder (GAD) and major depressive disorder (MDD) by Dr Ian Kanyanya, the head of mental health services at Kenyatta National Hospital in Nairobi. The mass diagnoses have been made as part of a lawsuit being brought against Facebook's parent company, Meta, and Samasource Kenya, an outsourcing company that carried out content moderation for Meta using workers from across Africa.

The images and videos, including necrophilia, bestiality and self-harm, caused some moderators to faint, vomit, scream and run away from their desks, the filings allege. The case is shedding light on the human cost of the boom in social media use in recent years, which has required more and more moderation, often in some of the poorest parts of the world, to protect users from the worst material that some people post.
The lawsuit claims that at least 40 moderators experienced substance misuse, marital breakdowns, and disconnection from their families, while some feared being hunted by terrorist groups they monitored. Despite being paid eight times less than their U.S. counterparts, moderators worked under intense surveillance in harsh, warehouse-like conditions.
Comments Filter:
  • get a union!

  • by quonset ( 4839537 ) on Tuesday December 24, 2024 @08:43PM (#65037551)

    All social media companies have the same issue. The people trying to keep this stuff off are burnt out and can't cope with the horrid things humans do.

    Until humans are gone, this issue will remain.

    • by ClickOnThis ( 137803 ) on Tuesday December 24, 2024 @10:33PM (#65037705) Journal

      Until humans are gone, this issue will remain.

      Gone from where, though? From the face of the Earth, or just from the moderation facility?

      I hope you meant the latter. Maybe use AI instead?

    • Hell is other people.

      I looked up the quote and it's from a French philosopher, Jean-Paul Sartre.

    • by AmiMoJo ( 196126 )

      It very much depends on the moderation policy. If people get banned long before reaching the point of posting this extreme stuff, very few will register an account just to post it, knowing that a ban isn't far behind.

      Facebook tries to allow as much as possible, so has huge amounts of this stuff pushing at the fringes, existing in an environment where very slightly milder stuff is tolerated and the boundary is unclear.

  • ... there is a greater-than-zero number of moderators who do NOT have PTSD after more than 5 minutes of this toxic waste.

    • But shouldn't they sue the subcontracting outfit? Seems like an easy fb win. But...politics.
      • by suman28 ( 558822 )
        What would that do? The job needs to be done. Eventually, it may be done by "AI," but for now it is a job. The people posting should be the ones held accountable. Suing the contractor, sub-contractor, or Facebook does not eliminate bestiality, rape, murder, suicide and all the other crap.
        • My point was more that Meta has no apparent liability here. How could they?
        • by alexgieg ( 948359 ) <alexgieg@gmail.com> on Wednesday December 25, 2024 @09:51AM (#65038291)

          There's one subgroup of humans extremely well-suited to this kind of moderation: psychopaths. They understand what causes normal people to break, and can pinpoint it with accuracy, while not being affected by it themselves. Moderating social media posts, if well paid, would be a very easy entry-level job for them, if not a full-on career. And they'd become a net positive to the world, their abilities turned toward good and recognized as such, rather than the much more usual opposite.

          Sociopaths might fill that role too, if more moderators were needed than there are psychopaths, but in their case it'd definitely be just entry level, as sociopaths tend to want to move up the ladder quickly and wouldn't stick around for long in what would be, for them, an easy but boring job.

    • ... there is a greater-than-zero number of moderators who do NOT have PTSD after more than 5 minutes of this toxic waste.

      And how are these same moderators after 5 hours/days/weeks/months/years?

      It seems to be the prolonged exposure that is causing the problems the moderator farms are seeing.

      Mind you, I'm not sure I would want to endure even 5 minutes of the kind of sick shit these moderators must see.

  • by Anonymous Coward

    "Eight times less than their US counterparts" is still more than I get for moderating /. when those points hit the account. And have you ever scrolled this site with the filter turned down?! /s

  • by votsalo ( 5723036 ) on Tuesday December 24, 2024 @10:11PM (#65037673)
    What about the billions of regular Facebook users? Don't they suffer from Social Media Disorder (SMD)? Or is it Social Media Addiction (SMA)?
  • Your life is already pretty fucked up.
  • This job seems perfect for one of the touted Facebook AI's

  • Having sex with a dead goat - better live stream it to Facebook. I mean - wtf?!
