Google Privacy

Google Contractor Pays Parents $50 To Scan Their Children's Faces (404media.co) 46

Google is collecting the eyelid shape and skin tone of children via parent submitted videos, according to a project description online reviewed by 404 Media. From the report: Canadian tech conglomerate TELUS, which says it is working on Google's behalf, is offering parents $50 to film their children wearing various props such as hats or sunglasses as part of the project, the description adds. The project shows the methods some companies are using to build machine learning, artificial intelligence, or facial recognition datasets and products. Rather than scraping already existing images or analyzing previously collected material, TELUS, and by extension Google, is asking the public to contribute directly and get paid in return. Google told 404 Media the collection was part of the company's efforts to verify users' age.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Usually they just steal people's work by having them identify fire hydrants, bicycles or crosswalks to access the free internet...

    On the other hand, I know another class of creeps who collect photos of children and they tend to end up in isolation in jail. Shame on the parents who sell photos of their children to creepy Google if they don't have a dire need for the money.

  • sell me your children I will pay $50 CAD!

  • I hope, in the fullness of time, that some of these kids sue their parents and get a painful amount of money out of them for whoring them out when they're too young to protest.

    In coming years, when it's far too late, people will come to realize just what they're giving up when they surrender their privacy to people with exponentially more power than they have.

    • This is pretty tame compared to the child beauty pageants that parents groom their own children for in the USA.
    • by classiclantern ( 2737961 ) on Friday January 05, 2024 @12:30PM (#64134341)
      I don't share your optimism. We live in a dystopian world now because everyone lives in a fantasy bubble of justice, fairness, privacy, health, democracy, and super-heroes. It's clear to me that absolutely no one knows what is really going on. Here on Slashdot, just like everywhere else, the mere suggestion that things are not the way they appear is roundly cancelled. When was the last time a victim received justice? Who is truly healthy when we are all dying? Was there ever a fully honest election? If there is no God then there is no plan, only chaos.
      • In addition, people embrace the dystopia. "Oh well, I can't stop using credit cards because they track you anyway." Er, not if you go into a store and use cash and nothing but cash. At least until facial recognition is that good. But every news story about this horror is accompanied by 'civil libertarians are concerned.' Well, golly, gee, I feel so much better.
      • Victims receive justice every day.
    • It's already too late: people have been paying for years for the data they supplied to Big Data long ago. And now that AI has swallowed it all and will readily regurgitate it with the right prompt, people will soon pay dearly, and at an accelerating rate, for years of ignorance or sloppiness.

      Me, I got my first realization that the future would become very dystopian when Scott McNealy inadvertently spilled the beans in 1999 [wired.com].

      Not many people believed him or even paid attention to what he said back then. But I did. The reality of what he said hit me like a freight train, and I've been extremely paranoid about my private data ever since.

      • With you, my friend. I've probably been less effective than you, but I've definitely tried to limit the extent to which I've hemorrhaged personal data over the years.

        It's already too late: people have been paying for years for the data they supplied to Big Data long ago. And now that AI has swallowed it all and will readily regurgitate it with the right prompt, people will soon pay dearly, and at an accelerating rate, for years of ignorance or sloppiness.

        Me, I got my first realization that the future would become very dystopian when Scott McNealy inadvertently spilled the beans in 1999 [wired.com].

        Not many people believed him or even paid attention to what he said back then. But I did. The reality of what he said hit me like a freight train, and I've been extremely paranoid about my private data ever since.

        I've been called a crazy conspiracy theorist for decades. But here we are today: my attack surface is much narrower than most because I supplied as little data to Big Tech as I could for 25 years, and what data I did have to supply was usually poisoned. So who's crazy now, eh?

        1) Talk about privacy and minimizing invasive data-attack surfaces.
        2) As proof, post link to article on website requiring credit-card subscription account.
        3) ???
        4) Prophet!

      • Privacy isn't dead, but it's on Life Support and fading fast.

    • by skam240 ( 789197 )

      By your standards we'd never have child actors or regular clothing models. Seems pretty over the top to me.

      At least Google is doing this with parental consent rather than just scraping shit off the internet.

    • I hope, in the fullness of time, that some of these kids sue their parents and get a painful amount of money out of them for whoring them out when they're too young to protest.

      In coming years, when it's far too late, people will come to realize just what they're giving up when they surrender their privacy to people with exponentially more power than they have.

      In the relatively near future, I have my suspicions that privacy will cease to be a concept that anyone cares about. Once you get past GenX and head into GenZ and beyond, you'll be dealing with people raised in an environment of fear, clamoring for the illusion of security theater to make them feel safer. They don't care about privacy today if someone can breathlessly whisper, "Think of the children," with tears in their eyes. Why would they care about privacy tomorrow?

      • In the relatively near future, I have my suspicions that privacy will cease to be a concept that anyone cares about. Once you get past GenX and head into GenZ and beyond, you'll be dealing with people raised in an environment of fear, clamoring for the illusion of security theater to make them feel safer. They don't care about privacy today if someone can breathlessly whisper, "Think of the children," with tears in their eyes. Why would they care about privacy tomorrow?

        What one generation tolerates....

        ...

  • TELUS writes that the purpose is to “capture a broad cross-section of participants targeting various combinations of demographics, with the goal of ensuring that our customer's services, and derived products, are equally representative of a diverse set of end-users.”

    Really? Because it sounds more like you are trying to train AI to recognize and work around physical methods of thwarting facial recognition.

    Cue the "think of the children" plea.

    • TELUS writes that the purpose is to “capture a broad cross-section of participants targeting various combinations of demographics, with the goal of ensuring that our customer's services, and derived products, are equally representative of a diverse set of end-users.”

      Really? Because it sounds more like you are trying to train AI to recognize and work around physical methods of thwarting facial recognition.

      Cue the "think of the children" plea.

      But they are thinking of the children ... just not in the same way that normies think of children.

  • Google is collecting the eyelid shape and skin tone

    Sounds more like racial recognition.

  • Fight fire with fire (Score:3, Interesting)

    by Rick Schumann ( 4662797 ) on Friday January 05, 2024 @01:40PM (#64134513) Journal
    Use a generative AI to produce pictures of children the way Google is asking for, get $50.
  • At least adults can consent.
