Google Assistant's AI Is Actually Humans In 'A White-Collar Sweatshop', Complain Workers (theguardian.com) 125

This week the Guardian ran an exposé on Google Assistant (Google's version of Alexa or Siri):

"Interpreting a spoken request isn't magic, rather it has taken a team of underpaid, subcontracted linguists to make the technology possible." "It's smoke and mirrors if anything," said a current Google employee who, as with the others quoted in this story, spoke on condition of anonymity because they were not authorized to speak to the press. "Artificial intelligence is not that artificial; it's human beings that are doing the work." The Google employee works on Pygmalion, the team responsible for producing linguistic data sets that make the Assistant work. And although he is employed directly by Google, most of his Pygmalion co-workers are subcontracted temps who have for years been routinely pressured to work unpaid overtime, according to seven current and former members of the team.

These employees, some of whom spoke to the Guardian because they said efforts to raise concerns internally were ignored, alleged that the unpaid work was a symptom of the workplace culture put in place by the executive who founded Pygmalion. That executive was fired by Google in March following an internal investigation. But current and former employees also identified Google's broad reliance on approximately 100,000 temps, vendors and contractors (known at Google as TVCs) for large amounts of the company's work as a culprit.

Google does not directly employ the workers who collect or create the data required for much of its technology, be they the drivers who capture photos for Google Maps' Street View, the content moderators training YouTube's filters to catch prohibited material, or the scanners flipping pages to upload the contents of libraries into Google Books. Having these two tiers of workers -- highly paid full-time Googlers and often low-wage and precarious workers contracted through staffing firms -- is "corrosive", "highly problematic", and "permissive of exploitation", the employees said.

"It's like a white-collar sweatshop," said one current Google employee. "If it's not illegal, it's definitely exploitative. It's to the point where I don't use the Google Assistant, because I know how it's made, and I can't support it."

  • by 93 Escort Wagon ( 326346 ) on Saturday June 01, 2019 @03:40PM (#58692146)

    ”Google does not directly employ the workers who collect or create the data required for much of its technology, be they the drivers who capture photos for Google Maps' Street View, the content moderators training YouTube's filters to catch prohibited material, or the scanners flipping pages to upload the contents of libraries into Google Books. Having these two tiers of workers -- highly paid full-time Googlers and often low-wage and precarious workers contracted through staffing firms -- is "corrosive", "highly problematic", and "permissive of exploitation", the employees said.”

    Google appears to be taking yet another page from the bad-old-days Microsoft playbook...

    • by alvinrod ( 889928 ) on Saturday June 01, 2019 @04:10PM (#58692292)
      I don't quite get how the workers think things would be better if they were directly employed by Google. They still wouldn't get paid any better, because the work is not high-skill labor, and they'd still be laid off once whatever short-term task they were assigned is finished.
      • by Anonymous Coward

        Yes, but their contracting agency wouldn't take a cut. So if Google pays the same, they'd have more money.

        • Yes, but their contracting agency wouldn't take a cut.

          Most contractors don't work for an agency. My company uses contractors, and all of them have a direct contract. No agency, and no one taking a cut.

          They interview similarly to a candidate employee. If they demonstrate they can do the job, we sign them up. They work for a (renewable) fixed term, get no benefits and no severance, but are paid a higher rate.

          • I'm not sure you can extrapolate that to "most". My personal experience leads me to speculate that the larger the company, the more likely they are to require subcontractors to use a specific staffing firm. This was true for Microsoft and Amazon, at least. I suspect it's easier for them to simply deal with a single company as a middleman rather than managing each contract specifically.

          • by ceoyoyo ( 59147 )

            Not sure what it's like in the US, but here as a contractor I can write off all kinds of stuff. For the same expenditure by the contracting organization, I make a lot more.

        • by mindwhip ( 894744 ) on Saturday June 01, 2019 @05:25PM (#58692602)

          No, because if Google employed them directly, other overheads would come into play, including the staff administration that Google would have to handle itself (at greater cost than the agency, which can do it 'in bulk' across a larger pool of employees covering multiple companies), along with costs for things like healthcare. Not to mention Google would lose the flexibility to easily adjust staffing levels as workload varies.

          It's actually likely that if Google employed these workers directly, there would be fewer jobs and wages would be lower.

          • No, 'in bulk' is not the real reason. Google is already 'in bulk'.

            Google is unwilling to treat its own resources as poorly as the temp agency, but wants to benefit monetarily from the arrangement - that is the real reason.

            The staffing/temp agency model is ripe for disruption - by independent AI agents negotiating gigs in the best interests of their 'masters'.

  • Their cameras have monkeys inside chiseling out the picture

  • Will actually be piloted by an unemployed taxi driver watching multiple screens at once?

    • Close to that. The current approach to 'AI' is highly flawed and deeply inadequate; 'deep learning algorithms' aren't sufficient, and the so-called 'AI' has no capacity for actual cognition (your dog or cat or hamster has more ability to 'think'), so assuming it doesn't screw up so bad that it causes an accident, it'll pull over to the side of the road and 'phone home' so a remote human operator can take over. So much for 'self driving', right? Yet Google and Waymo and whoever else will claim it's 'self dri
      • Weird.

        "The current approach to 'AI' is highly flawed and deeply inadequate; 'deep learning algorithms' aren't sufficient..." Inadequate and (in)sufficient for what? I certainly hope they're insufficient for creating a true conscious AI, else we're in trouble. But they are probably sufficient for a lot of useful tasks, ranging from analyzing astronomical data to flipping burgers, and yes--maybe--driving cars.

        "they'll keep exaggerating" You are perhaps a prophet? Self driving cars will *never* work? Fra

    • Will actually be piloted by an unemployed taxi driver watching multiple screens at once?

      Wait, wait, I read that book!

      The Diamond Age by Neal Stephenson

      You can also hire a live virtual companion for the trip.

  • by 110010001000 ( 697113 ) on Saturday June 01, 2019 @03:59PM (#58692230) Homepage Journal

    The 2010s will be known as the decade when the AI myth was hyped and exposed for what it is: a venture capital ploy. The only "AI" that works is data analysis, and that has been around since the beginning of computing. Oh, and autonomous driving? Not going to happen either, but companies are happy to take VC money for it.

    • by Tablizer ( 95088 )

      The AI bubble poppage could be ugly. The upside is that Bay Area real estate will be affordable again. Actually, this is the second AI bubble; there was a smaller one in the late 1980s.

      The 2010s will be known as the decade when the AI myth was hyped and exposed for what it is

      There are only 7 months left. So if you are going to expose AI as a sham before the end of the decade, you need to hurry up.

      • It has already been exposed. IBM learned this the hard way when it tried to sell its "Watson" AI, which never worked. Once the next recession hits, all that VC money will dry up.

      • There are only 7 months left. So if you are going to expose AI as a sham before the end of the decade, you need to hurry up.

        There are 19 months left.

        3rd millennium = 2001 to 3000
        21st century = 2001 to 2100
        202nd decade = 2011 to 2020

        • by Kjella ( 173770 )

          There are 19 months left.
          21st century = 2001 to 2100
          202nd decade = 2011 to 2020

          Not when he said the 2010s. Everybody accepts that the 20s, 30s, 40s, 50s, 60s, 70s, 80s and 90s are 1920-1929, 1930-1939, 1940-1949, 1950-1959, 1960-1969, 1970-1979, 1980-1989 and 1990-1999. In UK English 2000-2009 is the noughties. I don't think 2010-2019 ever got a decent nickname but I'm pretty sure we'll return to the 20s meaning 2020-2029 next year. Same way 1900s != the 20th century. You can say this is wrong and whatever.... but all the big parties were on y2k, not 2001.
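          For what it's worth, the two counting conventions being argued over in this subthread can be sketched in a few lines of Python (the function names are mine, purely for illustration):

```python
def ordinal_decade(year):
    """Ordinal counting: the 1st decade is years 1-10, so 2011-2020 is the 202nd."""
    return (year + 9) // 10

def colloquial_decade(year):
    """Colloquial counting: 'the 2010s' simply means the years 2010-2019."""
    return (year // 10) * 10

# Under the ordinal scheme, 2020 is still part of the 202nd decade...
print(ordinal_decade(2020))     # 202
print(ordinal_decade(2021))     # 203
# ...while colloquially, 2019 is the last year of 'the 2010s'.
print(colloquial_decade(2019))  # 2010
print(colloquial_decade(2020))  # 2020
```

          Both posters are consistent within their own convention; they just disagree about which convention "the 2010s" refers to.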

    • You and I don't get along, but I do agree with everything you just said, and I've been saying all the above for a long time now. When the truth comes out they'll be bankrupt. I just hope a bunch of people don't have to die just to expose the truth.
      • Impossible. I get along with everyone and everyone likes me.

      • When the truth comes out they'll be bankrupt.

        You're deluded. Waymo is part of Alphabet/Google. Those guys have enough money to eat 10 Waymos without making a dent in their balance sheet.

        I just hope a bunch of people don't have to die just to expose the truth.

        Yeah because no one dies on the roads right now...

      • Comment removed based on user account deletion
    • autonomous driving? Not going to happen either but companies are happy to take VC money for it.

      Why not? It seems clear to me that it could work, and already sort of works, but the algorithms are not good enough yet. Are you saying that algorithms can't be improved?

      • It's not that the algorithms aren't good enough; it's that they emulate the output of a real driver without emulating the cognitive capabilities and thought process of one.

        It's bound to fail, as there will be cases where the AI causes accidents/deaths that would be extremely unlikely with a real person behind the wheel. And those cases will continue.

    • You and Rick Schuman (who posted above) apparently believe you have the gift of prophecy. Autonomous driving is *never* going to happen? Do you by chance also believe (along with Lord Kelvin) that heavier-than-air flying machines will never happen?

    • For the more immediate concerns, an AI capable of human-level cognition is not needed. Deep learning will suffice for robotic machine vision and self-driving cars.

  • If the AI actually worked then the job wouldn't exist at all and you wouldn't be able to complain about how bad it is because you'd be out of work entirely.
  • People are complaining that they don't like their job.

    If your time is all that valuable, find someone willing to pay your price for it.

    Or keep the job you have.

  • Google offers a service. How much they automate, or not, is their business.
  • There's nothing wrong with hating your job, and not finding another, and instead starting your own. There are plenty of jobs that require zero skill up-front, that pay really really well in terms of calendar time.

    Alas, these people all want free money, with zero risk. Taking zero risk means you get treated like someone who takes zero risks.

  • Wasn't too long ago when we last heard this. Last November was the Google Walkout. While the press focused on the #MeToo angle and the toxic workplace issue, Google TVCs made up a substantial number of protesters, and tried to call media attention to their plight. TVCs even followed up the walkout with this open letter on Medium [medium.com], which garnered some attention [cnbc.com]. The Guardian had a good focus [theguardian.com] on the situation; they also referred to it as a "Two-Tier Workforce."

    But I understand why people would forget. The digital interwebs are a fickle place, and what falls into our Facebook feeds and Google searches is easily forgotten.

    • There have been signs calling for unionization in the Mountain View train station.
  • Computers used as tools to aid humans who produce and consume make more sense than most of the B.S. being passed off as AI.

  • that these allegedly "self-driving" cars are really beaming their cameras back to some sweatshop full of overstressed "assistants" ?
    • that these allegedly "self-driving" cars are really beaming their cameras back to some sweatshop full of overstressed "assistants" ?

      There are actually food delivery trolleys, bringing meals to elderly people, that are remote-controlled by someone in a country with lower wages. Much cheaper than a car plus a delivery driver. It's like a video game, except you get points for _not_ hitting anyone.

  • REQUIRED to work unpaid overtime?

    That's a complaint to the Labor Commissioner. If I get fired, I yelp like a scalded dog, kick up a stink on social media and file another complaint for retaliatory behavior.

    No job is worth so much that I won't report my employer for criminal behavior.

  • That wording kind of makes it sound like humans are listening to the "OK/Hey Google" and personally responding.
  • When I read this story, why do I get the image in my head of Charlton Heston shouting "Soylent Green is People!!!!"
  • I get that the folks leaking this information out to The Guardian are frustrated with what they see going on around them. But it sounds like the lead of the whole project was canned a few months ago, so Google actually did take action to correct the root of the problem. You can't expect an entire division of a company to do a complete change practically overnight, once you fire the guy who set up a whole structure for it.

    Additionally though? When all you need is a whole lot of data collection or input, that

  • The working conditions described (such as unpaid overtime) are terrible, if true, and should be dealt with. But that has nothing to do with AI.

    AI does not mean the technology is unaided by humans. Any worthwhile AI will require a LOT of training...by humans. Humans themselves undergo training for their entire lives. Why would AI be any different, other than not being nearly as adept as humans?

    Just because humans are involved, doesn't make it "not" AI...though that term is misused a lot by marketers these days.
