Facebook Moderators Say Company Is Asking Them To 'Risk Our Lives' (engadget.com)

In an open letter published Wednesday, a group of Facebook moderators say the company is putting them and their families at risk by asking them to go back to work in the midst of the pandemic. Engadget reports: The content reviewers say that while workers with a doctor's note can be excused from going to the office, those with high-risk family members don't get the same opportunity. "In several offices, multiple COVID cases have occurred on the floor," the letter states. "Workers have asked Facebook leadership, and the leadership of your outsourcing firms like Accenture and CPL, to take urgent steps to protect us and value our work. You refused. We are publishing this letter because we are left with no choice."

According to the letter-writers, the reason Facebook is pushing moderators to go back to the office is because the company's AI-based moderation is "years away" from being truly effective: "Without informing the public, Facebook undertook a massive live experiment in heavily automated content moderation. Management told moderators that we should no longer see certain varieties of toxic content coming up in the review tool from which we work -- such as graphic violence or child abuse, for example. The AI wasn't up to the job. Important speech got swept into the maw of the Facebook filter -- and risky content, like self-harm, stayed up. The lesson is clear. Facebook's algorithms are years away from achieving the necessary level of sophistication to moderate content automatically."

The letter also raises several issues that predate the coronavirus pandemic, like the lack of mental healthcare for moderators as well as their status as contractors rather than full-time employees. Among the moderators' demands from Facebook and the contracted companies that employ them: hazard pay, more flexibility to work from home, and access to better mental healthcare.
"We appreciate the valuable work content reviewers do and we prioritize their health and safety," a Facebook spokesperson said. "While we believe in having an open internal dialogue, these discussions need to be honest. The majority of these 15,000 global content reviewers have been working from home and will continue to do so for the duration of the pandemic. All of them have access to health care and confidential wellbeing resources from their first day of employment, and Facebook has exceeded health guidance on keeping facilities safe for any in-office work."

Comments Filter:
  • I am grabbing some popcorn
  • There will be some good breeding stock from this.

  • "and Facebook has exceeded health guidance on keeping facilities safe for any in-office work" There is no need for any in-office work. Every other tech company can handle 100% remote work, but not facebook? Give me a break. "You have previously said content moderation cannot be performed remotely for security reasons." Other companies are working around these issues. Facebook should too.
    • Re:Nonsense (Score:5, Insightful)

      by malkavian ( 9512 ) on Wednesday November 18, 2020 @10:08PM (#60741348)

      I would guess that the work-from-office requirement is based on access to illegal materials and having a safe-harbour arrangement with law enforcement.
      For example, if you're at work, and you're provably at a work computer that only accesses Facebook, and links from there, then any and all content on the computer becomes evidence for pointing a legal finger at people who post illegal content (from criminal behaviour through to snuff and kiddie porn).
      If said computer becomes riddled with illegal imagery, then there is a very good trail that guarantees nothing came in from outside that room, and nothing leaves that device.

      Once you start taking that home, you lose the ability to secure a terminal in that fashion. If the home computer used for telecommuting becomes infested, or a work laptop on the home wireless becomes compromised, there is very little recourse should that device become host to material that couldn't be proven to have come from an FB connection, opening the employee up to all kinds of charges.

      And there's almost certainly no way to police the distribution of any content that is uncovered in the course of the job. Working in a secured environment is just that, and it's there to protect the worker as much as anything. Other companies working around security issues, I suspect, don't have the legal requirements for accessing illegal content that FB does with its mods.

      Those are just the hurdles I can immediately think of. Without knowing the exact legal agreements with law enforcement, none of us really knows the true picture.

      As for going in? I work direct support to the front line in a hospital, so I've been on site nearly every day since the start of the pandemic, and I've seen colleagues get sick from it. I appreciate how hard it is, but like everyone else I work with and around, we know we've got a job to do, and to make sure there are jobs afterwards we have to keep the wheels on. (Actually, in my case, I have a job to do so patients have a good shot at life afterwards, but hey, that's the reason I joined healthcare.)

      • by Anonymous Coward

        I suspect the problem isn't so much illegal content (it's relatively easy to provide a locked down appliance incapable of locally storing anything), but preventing exfiltration of sensitive/protected information.

      • I spent years in call centers. Most likely they are accessing their tools over an RDP connection, even when they're in the office. It's done this way so you can turn the computers they use into dumb terminals, making it easy and cheap for local IT to replace them. It also makes it much easier to lock down their environment.

        Finally, companies do like to have the flexibility to do WFH if they find the hit in productivity from not having a manager standing over them is less than the cost of having a physical
    • by mark-t ( 151149 )

      I work for a tech company right now, and for the moment we have embraced working from home. Overall, I like it quite a bit (the commute is awesome), but there are definitely advantages to us all being in one place at once, and I have regrettably noticed that working from home is having a less than ideal effect on my overall productivity. Speaking for myself, there are more distractions at home than at work, so I am having to work longer hours to compensate. Some are simply unable to find enough isolation

  • by roc97007 ( 608802 ) on Wednesday November 18, 2020 @08:36PM (#60741102) Journal

    Am I missing something? Why can't the content reviewers VPN from home?

    Like the rest of us are doing?

    • Their management wants to be able to oversee them in person, likely so they can squeeze a little more productivity out of them.
      • That way they can breathe (SARS-CoV-2) down their necks.

      • by gweihir ( 88907 )

        Ah, yes, good old dysfunctional "managers" that cannot actually supervise anybody but need people present to give themselves the illusion of being able to.

      • Their management wants to be able to oversee them in person, likely so they can squeeze a little more productivity out of them.

        Hm. Good point. One might say, if a manager must have their direct reports physically present, what exactly is the manager's job again?

          • These aren't Facebook employees; they're contractors working for a 3rd party. Basically, this is a call center environment. I've worked in this kind of place before. FB will pay the vendors like shit, and they in turn pay the employees like shit. This means the employees don't make enough money to live on, so their lives are a constant mess. This in turn means they've got a wide variety of personal problems which reduce their productivity.

          Mix in some very, very high turnover (you're reading posts about guys wan
          • I wasn't aware that this was a call center environment. You raise a very good point. Call centers are a nasty environment.

      • by AmiMoJo ( 196126 )

        Also so the management can say they are taking steps to prevent PTSD. It's a big problem for social media moderators, particularly on Facebook, which seems to get the worst of it.

        If they are in the office the managers can say they put up motivational posters and there is a therapist they can go talk to on their lunch break. That way if they get PTSD it's their fault!

      • How hard is it to put metrics in place saying they have to moderate x items/day? This seems like a super easy job to manage remotely (a rough sketch of such a metric follows).
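
        A minimal sketch of that kind of throughput check, assuming a hypothetical log of (moderator, date) review events; the identifiers and quota below are illustrative, not anything from Facebook's actual tooling:

            # Hedged sketch: items reviewed per moderator per day, from a
            # hypothetical event log. All names and numbers are made up.
            from collections import Counter
            from datetime import date

            # Assumed input: one (moderator_id, review_date) tuple per reviewed item.
            events = [
                ("mod_a", date(2020, 11, 18)),
                ("mod_a", date(2020, 11, 18)),
                ("mod_b", date(2020, 11, 18)),
            ]

            DAILY_QUOTA = 2  # the "x items/day" target, chosen arbitrarily

            counts = Counter(events)
            for (mod, day), n in sorted(counts.items()):
                status = "meets quota" if n >= DAILY_QUOTA else "below quota"
                print(f"{mod} {day}: {n} items ({status})")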

  • Facebook doesn't like people working from home.

    I've been approached by their recruiters multiple times, but they insist on relocation to areas where their offices are located. They recently baited me with "work from home" job opportunities, but that's only for the duration of COVID-19; I would have been required to relocate in April 2021 anyway.

    Facebook is crazy stupid to work for.

  • Facebook and Twitter might initiate our next war.
    • Facebook and Twitter are all but guaranteed to initiate our next war.

      Fixed that for you. Seriously, they are doing their dead-level best to appear to take content moderation seriously while doing as little of it as possible, what with it mostly just interfering with their whole "passive yellow journalism" business model. "Remember the Maine!" takes on a whole new gravity in a world with nuclear weapons and drones equipped with precision missiles.

  • Using one false negative and one false positive as proof that the AI doesn't work or is not up to the task is meaningless. The real question is what percentage of moderated content is incorrectly classified; that would tell you whether or not AI is up to the job (a rough sketch of that comparison follows this thread). Unfortunately this information is missing from the letter. If the bar is 100% correct 100% of the time, AI will never be up to the job, and neither will any of the human moderators. If AI is close to the human error rate, it provid

    • by malkavian ( 9512 )

      Given that a high proportion of the things that get moderated are subjective, I suspect there's never going to be a suitably objective metric to measure by.
      This would also probably mean that any AI that deviated from the bias of the person/group implementing it would be marked unfit, even if it had correctly learned a completely objective and measurable metric from the data.
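
      For what it's worth, a rough sketch of the error-rate comparison described above, assuming labeled ground truth exists (which, per the parent, is exactly what's contestable for subjective content); the sample data is invented for illustration:

          # Hedged sketch: compare an AI moderator's error rate against a human
          # baseline on the same labeled sample. Ground-truth labels are assumed
          # to exist, which is the contested premise for subjective content.
          def error_rate(preds, truth):
              return sum(p != t for p, t in zip(preds, truth)) / len(truth)

          truth  = [1, 0, 0, 1, 1, 0, 1, 0]  # 1 = violating content (made-up sample)
          ai     = [1, 0, 1, 1, 0, 0, 1, 0]  # one false positive, one false negative
          humans = [1, 0, 0, 1, 1, 0, 0, 0]  # one false negative

          false_pos = sum(p == 1 and t == 0 for p, t in zip(ai, truth))
          false_neg = sum(p == 0 and t == 1 for p, t in zip(ai, truth))
          print(f"AI: {false_pos} FP, {false_neg} FN, "
                f"error rate {error_rate(ai, truth):.1%} "
                f"vs human {error_rate(humans, truth):.1%}")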

  • The Internet is a dirty street corner with shiny buildings inviting you in for *free* beer and pretzels.
  • Self-absorbed morons. They sure are pretending to be all touchy-feely and such. Must be hard when you are so narcissistic.
    Facebook Moderators Say Company Is Asking Them To 'Risk Our Lives'
  • Hey, have you heard the good news about masks? They work! That's what the press says, at least.

    • What they say is that wearing masks [correctly] works for reducing viral load, and thus risk. They don't say that masks are 100% effective at stopping Covid-19. Only trolls and dumbasses claim that they're saying that, and only the latter believe it.

  • by lusid1 ( 759898 ) on Thursday November 19, 2020 @02:08AM (#60741736)

    Moderate content remotely? That's hilarious. And sad, but mostly hilarious.

  • Have these people not heard of remote desktop? You can do this with Windows; granted, it's probably a bit expensive, but I'm sure you can do a BYOL model and leverage AWS WorkSpaces with Duo two-factor sign-in. Issue out dumb Chromebooks and you're good to go. You can pretty much work from anywhere...
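
    For what it's worth, provisioning a WorkSpaces desktop per moderator is roughly one API call; a minimal boto3 sketch with placeholder directory/user/bundle IDs (Duo MFA would be wired up separately through the directory's RADIUS settings, not in this call):

        import boto3

        # Hedged sketch of the suggestion above: one locked-down virtual
        # desktop per moderator via AWS WorkSpaces. All IDs are placeholders.
        client = boto3.client("workspaces", region_name="us-west-2")

        response = client.create_workspaces(
            Workspaces=[{
                "DirectoryId": "d-0123456789",        # placeholder directory
                "UserName": "moderator01",            # placeholder directory user
                "BundleId": "wsb-0123456789",         # placeholder desktop bundle
                "UserVolumeEncryptionEnabled": True,  # keep data off the endpoint
                "RootVolumeEncryptionEnabled": True,
            }]
        )
        for failed in response.get("FailedRequests", []):
            print("failed:", failed.get("ErrorMessage"))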

  • Why can't the work be done from home?

    • Probably because Facebook is trying to hide both the kind of content being posted to their platform, and the kind of censorship* of content they are engaging in.

      * Of course it's censorship; it's just not illegal censorship. It isn't censorship only when the government does it; it's censorship whenever someone in charge does it to someone else.

  • Life isn't fair, the world owes you nothing. If they don't want to do it, they can quit.
  • I for one am very glad to see facebook standing up to the greater danger that is uncomfortable words and opinions. Without their leagues of moderators and compelling work schedules, we might actually have to make moral judgements on our own.
