Social Networks

Behind TikTok's Boom: A Legion of Traumatized, $10-A-Day Content Moderators (time.com)

Time magazine teamed up with the Bureau of Investigative Journalism, a London-based non-profit newsroom, on an investigation revealing that "horrific" videos "are part and parcel of everyday work for TikTok moderators in Colombia." The moderators told the Bureau of Investigative Journalism about widespread occupational trauma and inadequate psychological support, demanding or impossible performance targets, punitive salary deductions and extensive surveillance. Their attempts to unionize to secure better conditions have been repeatedly opposed. TikTok's rapid growth in Latin America — it has an estimated 100 million users in the region — has led to the hiring of hundreds of moderators in Colombia to fight a never-ending battle against disturbing content. They work six days a week on day and night shifts, with some paid as little as 1.2 million pesos ($254) a month, compared to around $2,900 for content moderators based in the U.S....

The nine moderators could only speak anonymously for fear they might lose their jobs, or undermine their future employment prospects.... The TikTok moderation system described by these moderators is built on exacting performance targets. If workers do not get through a huge number of videos, or return late from a break, they can lose out on a monthly bonus worth up to a quarter of their salary. It is easy to lose out on the much-needed extra cash. Álvaro, a current TikTok moderator, has a target of 900 videos per day, with about 15 seconds to view each video. He works from 6am to 3pm, with two hours of break time, and his base salary is 1.2 million pesos ($254) a month, only slightly higher than Colombia's minimum salary.... He once received a disciplinary notice known internally as an "action form" for only managing to watch 700 videos in a shift, which was considered "work avoidance". Once a worker has an action form, he says, they cannot receive a bonus that month....

Outsourcing moderation to countries in the global south like Colombia works for businesses because it is cheap, and workers are poorly protected.... For now... TikTok's low-paid moderators will keep working to their grueling targets, sifting through some of the internet's most nightmarish content.

The moderators interviewed all had "contractor" status with Paris-based Teleperformance, which last year reported €557 million ($620 million) in profit on €7.1 billion ($8.1 billion) in revenue. In fact, Teleperformance has more than 7,000 content moderators globally, according to stats from Market Research Future, and the moderators interviewed said that besides TikTok, Teleperformance also provided content moderators to Meta, Discord, and Microsoft.
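
A quick back-of-envelope check of the workload figures quoted above. The shift length, break time, daily target and viewing time are taken from the article; the per-video time budget is derived, and the sketch below is only illustrative arithmetic, not anything from TikTok or Teleperformance.

    # Back-of-envelope check of the quoted moderation workload (figures from the article).
    shift_hours = 9                  # 6am to 3pm
    break_hours = 2
    videos_per_day = 900
    viewing_seconds_per_video = 15   # "about 15 seconds to view each video"

    working_seconds = (shift_hours - break_hours) * 3600
    budget_per_video = working_seconds / videos_per_day   # total time available per video

    print(f"{budget_per_video:.0f} s per video")           # ~28 s
    print(f"{budget_per_video - viewing_seconds_per_video:.0f} s left to decide, label or escalate each one")

In other words, hitting the 900-video target leaves roughly 13 seconds of non-viewing time per video across a seven-hour working day.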
Comments:
  • It was called "A Clockwork Orange".

  • by Anonymous Coward

    This sounds like torture. Are we sure they aren't being forced to do this work on threat of violence?

    Could you imagine watching hundreds of video clips for 7+ hours a day, every day? But even better, TIK-TOK clips. That's a punishment worse than death.

    I went to tik-tok once and after watching half a dozen videos I wanted to stick hot pokers in my eyes. Stupidest content on the planet. The human race is doomed.

    • Could you imagine watching hundreds of video clips for 7+ hours a day, every day?

      Plenty of first world social-media-addicted dumbasses happily do that for free.

    • TikTok has excellent algorithms based on how long you view videos. I downloaded it and it begins with the Jerry Springer, 'reality teevee', shock jock crap that so many love. But by randomly showing you vids, it quickly finds your areas of interest and refines what it shows you until you get hooked. In my case it was bass guitar and blues videos by really good musicians as well as hacks just slightly better than me. It is quite addicting because TikTok floods one vid after another and it's easy to get hooked.
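
      A minimal sketch of the explore/exploit loop being described here, in Python: start by sampling categories at random, then weight future picks by accumulated watch time. The categories, exploration rate and numbers are invented for illustration; this is not TikTok's actual recommender.

        import random

        # Toy watch-time-driven feed: not TikTok's algorithm, just the explore/exploit
        # idea described in the comment above. Categories and rates are made up.
        categories = ["shock_tv", "dance", "bass_guitar", "blues", "cooking"]
        watch_time = {c: 1.0 for c in categories}   # start uniform ("randomly showing you vids")
        EXPLORE = 0.2                               # keep some random picks so new interests can surface

        def next_video_category():
            if random.random() < EXPLORE:
                return random.choice(categories)
            total = sum(watch_time.values())
            weights = [watch_time[c] / total for c in categories]
            return random.choices(categories, weights=weights, k=1)[0]

        def record_view(category, seconds_watched):
            # The only signal is dwell time: the longer you watch, the more of it you get.
            watch_time[category] += seconds_watched

        # Simulate a viewer who lingers on music videos and skips everything else.
        for _ in range(500):
            c = next_video_category()
            record_view(c, 30 if c in ("bass_guitar", "blues") else 2)

        print(sorted(watch_time.items(), key=lambda kv: -kv[1]))  # music categories dominate

      Even this crude version converges on the viewer's interests within a few hundred views, which is roughly the "it quickly finds your areas of interest" effect described above.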
    • I guess if I had to do the job, the quality of my moderating would suck. Unless something really jumps out at me as bad, it would be "it's OK", click, "it's OK", click, "it's OK", click...

  • but luckily the TikTok moderators only have to watch 15-second videos.
    Imagine being a Facebook moderator having to watch hour-long videos on all kinds of nonsense like bird watching.

    • That's why they should have Slashdot-style moderation, where the site users themselves upmod or downmod content, so that bad content quickly gets buried and good content gets promoted. This has been working great for Slashdot for decades: I literally can't remember the last time I saw a GNAA membership offer, a penisbird or a swastika on here.

      • "Any sufficiently optimistic statement is indistinguishable from sarcasm."

        Slashdot's system leaves the bad content available. Illegal stuff needs to be purged, not buried.

        Also, Slashdot's system results in good, well-written and relevant content modded troll for political reasons.

        • Re: (Score:3, Insightful)

          by Powercntrl ( 458442 )

          Also, Slashdot's system results in good, well-written and relevant content modded troll for political reasons.

          To be fair, Slashdot at least doesn't let a post fall below -1. On some other forums (Red*cough*dit), a sufficiently unpopular post can burn an incredible amount of the poster's karma, usually resulting in the poster self-censoring by deleting the post to prevent further karma loss.

        • by AmiMoJo ( 196126 )

          When you consider how relatively small Slashdot is compared to sites like Facebook and TikTok, and yet how much effort trolls and shills put into farming mod points and pushing their political/commercial agenda... The scale of the problem for bigger sites would be insurmountable.

          Correction, *is* insurmountable. None of them seem to be able to stop it.

      • by Powercntrl ( 458442 ) on Sunday October 23, 2022 @10:37PM (#62992487) Homepage

        I literally can't remember the last time I saw a GNAA membership offer, a penisbird or a swastika on here.

        Then you're probably also missing out on posts that ran afoul of the groupthink, and anons who actually made a good point but didn't get modded up because the crop of moderators du jour were too busy blowing their moderator points censoring opposing political views.

        Originally, Slashdot's moderation system did take into account that some people would moderate in a biased fashion, and had meta-moderation so moderators with an axe to grind would eventually be removed from the moderation pool. This site has changed hands so many times though, that part of Slashcode is likely broken.

        Ultimately, you're still left with the problem that only some people are really qualified to be good moderators, and you're not paying them for their efforts. I suspect that as time has gone on, there simply aren't as many people willing to do a good job of cleaning up Slashdot's trash for free.
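
        A minimal sketch of the two-layer scheme described above, in Python: ordinary up/down moderation clamps post scores rather than deleting anything, and a meta-moderation pass strips mod eligibility from accounts whose moderations keep getting overturned. The score bounds, threshold and names are assumptions for illustration, not Slashcode's actual rules.

          from collections import defaultdict

          # Toy two-layer moderation; bounds and thresholds are illustrative assumptions.
          SCORE_MIN, SCORE_MAX = -1, 5
          META_DISAGREE_LIMIT = 0.5     # lose eligibility if over half your mods are overturned

          post_scores = defaultdict(lambda: 1)       # post_id -> score; new posts start at 1
          meta_votes = defaultdict(lambda: [0, 0])   # moderator -> [upheld, overturned]

          def moderate(moderator, post_id, delta):
              """Layer 1: a user spends a mod point to push a post up or down (clamped, never deleted)."""
              post_scores[post_id] = max(SCORE_MIN, min(SCORE_MAX, post_scores[post_id] + delta))
              return (moderator, post_id, delta)     # handed to meta-moderation for review

          def metamoderate(mod_record, fair):
              """Layer 2: another user reviews one specific moderation and marks it fair or unfair."""
              moderator, _post_id, _delta = mod_record
              meta_votes[moderator][0 if fair else 1] += 1

          def eligible_for_mod_points(moderator):
              upheld, overturned = meta_votes[moderator]
              total = upheld + overturned
              return total == 0 or overturned / total <= META_DISAGREE_LIMIT

          # A biased moderator down-mods an on-topic post and gets meta-moderated out of the pool.
          record = moderate("axe_grinder", post_id=42, delta=-1)
          metamoderate(record, fair=False)
          metamoderate(record, fair=False)
          print(post_scores[42], eligible_for_mod_points("axe_grinder"))   # 0 False

        The catch is that the meta layer only helps if honest reviewers outnumber the axe-grinders, which is the scaling problem other comments in this thread point to.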

        • It still "works" but the same shit goes on in metamoderation. Just try moderating political topics, you'll find you start getting mod points *a lot* less often, no matter how reasonable your mods were.

          Part of it is because they changed the metamod system to be garbage... it used to ask if you agreed with the specific mod, now it just asks you 'good post or bad post'... sometimes it matters what the actual mod was. Like I wouldn't hit someone with a bad metamod for thinking some things were funny that I'd
        • by sinij ( 911942 )
          Agreed, Slashdot's moderation system is not holding up here, but at least it is one of the rare few places where politics can be discussed and an echo chamber is not enforced. You can't say the same about Reddit, where political subreddits are universally dogmatic echo chambers. The difference? On /. moderation is decentralized (i.e., there is no single admin running things), while on Reddit there is. As such, despite its shortcomings, I think this moderation system works better on a community level.

          What's lacking is anti-brigade measures.
        • by AmiMoJo ( 196126 )

          CmdrTaco has spoken about this on Twitter. Still hoping he does a memoir. Anyway, apparently there is some mechanism by which the site can perma-ban accounts from getting mod points. He mentioned auto-generated fake posts that were either obvious trolls or not trolls, and if the user moderated them the wrong way their account never got more mod points.

          I'm not sure how it related to meta-moderation though. There is an obvious flaw in the scheme as well - users can just create new accounts.

        • I browse at -1 specifically because a decent amount of on-topic stuff gets modded down because of groupthink. Other times, the person really is an idiot, but the comments before and after the idiot make more sense when you also read what the idiot wrote.

          With that said, I'm sort of surprised I haven't seen some of the prior spam. Figured people got bored and quit after so long. Like GNAA I assumed just got bored or died or whatever.

      • That's why they should have Slashdot-style moderation, where the site users themselves upmod or downmod content, so that bad content quickly gets buried and good content gets promoted. This has been working great for Slashdot for decades: I literally can't remember the last time I saw a GNAA membership offer, a penisbird or a swastika on here.

        Spammers are still around; here's one I flagged and downmodded most recently: https://slashdot.org/~Antonsax... [slashdot.org], and what happened here I don't know, but I also downmodded https://hardware.slashdot.org/... [slashdot.org]. But with that said, I pretty much concur.

    • That's the huge difference, on Tiktok all you've got is a neverending stream of videos of doughy white chicks twerking and lipsyncing while on Fecebook you've got to sit through (shudder) Fecebook content.
  • Because the media want you to believe American-based content-creation sites are safe havens
  • Brutal (Score:3, Interesting)

    by SirSlud ( 67381 ) on Sunday October 23, 2022 @10:44PM (#62992497) Homepage

    Jesus Christ that would absolutely wreck your brain, can't even imagine the brutal shit they have to watch, and in tiktok lengths no less. That absolutely has to fuck you up.

    • Jesus Christ that would absolutely wreck your brain, can't even imagine the brutal shit they have to watch, and in tiktok lengths no less. That absolutely has to fuck you up.

      I'd figure the robotic voice and terrible background music would be what destroys your sanity first. Especially that "oh no" song.

    • by Tablizer ( 95088 )

      I can't believe there are that many sinister videos created. An occasional cat in the dryer maybe, but most are just dancing or taking on mostly harmless acrobatic challenges. Are there really that many morons? What is the vid rejection rate?

      • Are there really that many morons?

        You've never dealt with humans before, have you?

      • Are there really that many morons? What is the vid rejection rate?

        We don't know, but TikTok has over one-billion users worldwide.

        Even if only 0.001% of them upload CSAM, torture or violence videos daily, that's 10,000 videos per day.
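
        The arithmetic behind that estimate, with both inputs being the commenter's assumptions rather than TikTok's numbers:

          users = 1_000_000_000              # "over one-billion users worldwide"
          bad_uploader_share = 0.001 / 100   # 0.001% expressed as a fraction

          print(int(users * bad_uploader_share))   # 10000 such videos per day, at one upload each per day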

    • The worst thing I can imagine is that on TikTok, a fad takes off and everyone and their dog (quite literally in this case) does the same kind of video. Imagine watching the same bullshit video from a million people, all doing the exact same silly moves. It's not even interesting the first dozen times; now repeat that another thousand times.

    • Re:Brutal (Score:4, Interesting)

      by AmiMoJo ( 196126 ) on Monday October 24, 2022 @04:13AM (#62992801) Homepage Journal

      Facebook has the same problem, and I expect most social networks do.

      In Facebook's case, not only did moderators get traumatised, the company also lacked moderators familiar with the languages and events in certain parts of the world, which allowed Facebook to be used to organize genocide.

      It's an unsolved, maybe unsolvable problem.

    • Yep, and imagine we only see what they approve. Who wants to see a "best of banned TikTok"?

  • Just let people post the videos they want and let other people watch the videos they want.

  • by geekmux ( 1040042 ) on Monday October 24, 2022 @02:35AM (#62992697)

    ...besides TikTok, Teleperformance also provided content moderators to Meta, Discord, and Microsoft.

    Well, if that is what they want to call "moderation", at least they've explained the consistency between major platforms. Same company behind all of them.

    How utterly convenient for those who wish to control or manipulate a narrative across the masses of most major countries. When you own social media moderators, you control the winds of content.

  • Wait, you mean "AI" and "algorithms" don't do all this shitty work for us, so humans don't have to? But I was told that AI and algorithms were solving all the world's problems right now!

  • Still don't get Twitter or Facebook, and now I don't get this either. Guess I'm just lucky like that.

    Fair play to their devotees: What do they do? Like really, what do they do that makes them worthwhile?
    • Facebook actually makes sense to me. A number of people used to send out 'newsletters' once a year or so about events in their lives to their extended family & friends who they weren't in regular contact with. Facebook made this kind of thing a whole lot easier and added useful features (along with, inevitably, some terrible ones). The big bummer is that this didn't turn into an open standard with multiple competitors you could cross-post between, but instead became a single monopoly increasingly tai
  • Yeah, $10 a day seems low to us, but it's more than a lot of people in Colombia will earn at a legal job. So there's no problem with the wage they get; stop acting like it's awful how little they're paid to do their job.
    And I'm still wondering whether they're also using all those moderators to train an AI, so that in the near future they can put the moderators out of a job. I wonder how those moderators will like it when they have to find another job in a country where jobs are already hard to find.

  • by weeboo0104 ( 644849 ) on Monday October 24, 2022 @07:55AM (#62993097) Journal

    You mean all the horrible TikTok videos I see online are the ones that made it PAST the moderators?

    • Nope. It means if enough people complain about a video, it will get reviewed by people in the lowest-wage nation that speaks the dominant language in the region the video is posted from.

      Also... if they're horrible, maybe stop apping like an apping app apper?

  • Let's suppose that, to post a video, a poster had to be verified with an ID such as a driver's license. The ID would be kept private between the company and the individual, unless law enforcement had a subpoena.

    This would eliminate bots, and help law enforcement track down the criminals behind the posts.
