Google AI

Google Executive Warns of Face ID Bias (bbc.com)

Facial recognition technology does not yet have "the diversity it needs" and has "inherent biases," a top Google executive has warned. From a report: The remarks, from the firm's director of cloud computing, Diane Greene, came after rival Amazon's software wrongly identified 28 members of Congress, disproportionately people of colour, as police suspects. Google, which has not opened its facial recognition technology to public use, was working on gathering vast amounts of data to improve its reliability, Ms Greene said. However, she declined to discuss the company's controversial work with the military. "Bad things happen when I talk about Maven," Ms Greene said, referring to a soon-to-be-abandoned project with the US military to develop artificial intelligence technology for drones. After considerable employee pressure, including resignations, Google said it would not renew its contract with the Pentagon when it lapses some time in 2019. The firm has not commented on the deal since, beyond releasing a set of "AI principles" stating that it would not use artificial intelligence or machine learning to create weapons.
  • Wrong? (Score:2, Insightful)

    Amazon's software wrongly identified 28 members of Congress [...] as police suspects

    "There is no native criminal class except Congress." -- Mark Twain

  • How noble of them (Score:1, Insightful)

    by Anonymous Coward
    But they'll happily allow it to be used to subjugate and oppress civilians.
    If those employees were so concerned about rights and liberties, they'd have blocked the tech altogether.
  • by SuperKendall ( 25149 ) on Friday July 27, 2018 @10:38AM (#57019092)

    The technology behind FaceID has no bias. It works really well - if given the right training data. Now, it could easily be that the training data you are feeding it is biased in some way, but that is why extensive testing of the resulting recognition engine you have built is key, so you can go back and correct the training data...

    Because trained neural networks are something of a black box, it's sometimes hard to say what kind of bias you may have built in. The Amazon system recognizing a set of politicians as criminals might be down to the lighting in their pictures being a lot like mug-shot lighting! (A sketch of the kind of per-group test that would catch this follows below.)

    Or who knows, maybe it's latched onto specific micro-expressions of criminals and the politicians it identified really are criminals, we just don't know it yet... :-)
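    The "extensive testing" the parent describes can be made concrete: audit the matcher's error rates per demographic group before deployment. A minimal sketch in Python (all data is synthetic, and the noise levels are assumptions chosen to illustrate the failure mode, not measurements of any real system):

      import numpy as np

      # Synthetic audit set: one row per face comparison made by a matcher.
      rng = np.random.default_rng(0)
      n = 10_000
      group = rng.choice(["A", "B"], size=n)   # demographic label, used only for auditing
      same_person = rng.random(n) < 0.5        # ground truth for each pair
      # Assume the matcher is noisier on group B (e.g. thinner training coverage).
      noise = np.where(group == "B", 0.30, 0.15)
      score = np.clip(same_person + rng.normal(0, noise, n), 0, 1)

      THRESHOLD = 0.5   # one global decision threshold (arbitrary on this toy scale)

      for g in ("A", "B"):
          nonmatches = (group == g) & ~same_person          # pairs that should NOT match
          fpr = np.mean(score[nonmatches] >= THRESHOLD)     # fraction wrongly "identified"
          print(f"group {g}: false-positive rate = {fpr:.2%}")

    A single global threshold that looks fine in aggregate can hide a large per-group gap, which is consistent with what the congressional test surfaced.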

    • Bias? Probably only got those with leg monitors already attached.
    • by Anonymous Coward

      Using unbiased training data doesn't necessarily lead to an unbiased system. If your task is to classify thumbnail pictures of cats, dogs, coins, and fake coins, your system will have trouble distinguishing the last two even if you were given 1,000 training pictures of each class.
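      The point is easy to demonstrate with a toy classifier: keep the training data perfectly balanced and the errors still pile up on the classes that genuinely look alike. A sketch (synthetic 1-D "appearance" features; the class means are made-up numbers chosen so that coin and fake coin overlap):

        import numpy as np

        rng = np.random.default_rng(1)
        # 1,000 balanced training samples per class; cats and dogs are well
        # separated, coins and fake coins sit almost on top of each other.
        means = {"cat": 0.0, "dog": 3.0, "coin": 6.0, "fake coin": 6.3}
        centroids = {c: rng.normal(m, 1.0, 1000).mean() for c, m in means.items()}

        def classify(x):
            return min(centroids, key=lambda c: abs(x - centroids[c]))

        for c, m in means.items():
            test = rng.normal(m, 1.0, 1000)   # fresh, equally balanced test data
            acc = np.mean([classify(x) == c for x in test])
            print(f"{c:>9}: accuracy = {acc:.1%}")
        # cat and dog stay high; coin and fake coin are confused with each
        # other roughly half the time despite identical class balance.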

      • Burning mod points here, but... The NN might indeed be unbiased. So what? All practical systems do some preprocessing to cut the data rate down to something reasonable rather than feeding in raw pixels at random orientations - you've all seen the pics of faces with polygons drawn over them, right?
        Guess where that comes from, and how it was tuned? Decisions about how to reduce the input data create bias.
        What helps tell, say, white faces apart, and white from black (to vastly oversimplify, not trying to exclude any race, etc.)...
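        One way to see the grandparent's point in miniature (a deliberately crude "data reduction" step, not any real face pipeline): uniform quantization of linear sensor values wipes out a fixed-percentage contrast difference on a dark face while preserving the same difference on a light one.

          import numpy as np

          def quantize(x, bits=4):
              """Uniformly quantize linear sensor values in [0, 1] to 2**bits levels."""
              levels = 2 ** bits - 1
              return np.round(np.clip(x, 0.0, 1.0) * levels) / levels

          # A facial feature ~10% brighter than the surrounding skin: the same
          # multiplicative reflectance difference regardless of skin tone.
          for name, base in (("dark skin", 0.10), ("light skin", 0.80)):
              skin, feature = quantize(base), quantize(base * 1.10)
              print(f"{name}: skin -> {skin:.3f}, feature -> {feature:.3f}, "
                    f"still distinguishable: {skin != feature}")

        The bit depth and the 10% figure are illustrative assumptions; the point is only that a preprocessing decision made before any learning happens already determines which contrasts survive.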
    • The technology behind FaceID has no bias. It works really well - if given the right training data.

      Given that there clearly isn't a set of "right training data" available, that sounds like a hasty conclusion without evidence. Your argument is circular. You say the technology has no bias but proving that it has no bias requires feeding it an unbiased data set, which hasn't happened. So neither of us knows whether there is an inherent bias built into the system or not. Maybe there is and maybe there isn't, but you don't have the data to say either way.

      Furthermore, it's more complicated than just the training data...

      • Given that there clearly isn't a set of "right training data" available

        Apple got this right, but you are correct that, generally, there is not a "right training set" for facial recognition.

        There cannot be one, because it all depends on what you are trying to do: what is right for one purpose would be wrong for another.

        Your argument is circular. You say the technology has no bias but proving that it has no bias requires feeding it an unbiased data set

        That reflects a total lack of understanding of neural networks...

    • by sjames ( 1099 )

      Are you sure it doesn't have problems caused by a different contrast between skin tone and background, for example? Perhaps the camera's automatic exposure and white balance adjustments are losing detail?

  • by Solandri ( 704621 ) on Friday July 27, 2018 @10:52AM (#57019188)
    The problem is the amount of light the camera sensors receive. Darker faces reflect less light, so the camera sensor gets less data to work with, making algorithms based on that data less accurate at identifying darker faces (the sketch after this comment puts rough numbers on it).

    This presents an obvious solution. To further the goal of eliminating racial bias, we need to turn off all the lights. That means all light bulbs need to be banned, and existing ones destroyed. NASA should launch a huge unfurling disk to block out the sun and leave the planet in perpetual darkness. Newborns should have their eyes surgically removed upon birth (they won't suffer because they won't know what they're missing). Only then can we be free of the evil racial bias being promulgated by light.
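    Satire aside, the parent's physical claim is quantifiable: photon arrival is a Poisson process, so a pixel's shot-noise signal-to-noise ratio grows with the square root of the photons it collects; collect a quarter of the light and you get half the SNR. A back-of-the-envelope sketch (the photon count and reflectance values are illustrative assumptions, not measurements):

      import math

      reference_photons = 10_000   # assumed photons/pixel for a 100%-reflective patch

      for name, reflectance in (("lighter skin", 0.60), ("darker skin", 0.15)):
          photons = reference_photons * reflectance
          snr = math.sqrt(photons)   # Poisson shot noise: SNR = sqrt(N)
          print(f"{name}: ~{photons:.0f} photons, shot-noise SNR ~= {snr:.0f}")
      # 4x less light -> 2x lower SNR: the recognizer starts from a noisier
      # image before any algorithmic choice has been made.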
    • by PPH ( 736903 )

      At least get rid of outdoor lighting, so that when you encounter someone at night, everyone is equal.

    • The solution is just to wait it out. Give our species another 10,000 years and we might just evolve past racial bias -- assuming we don't destroy the planet one way or another before then, or cause our own extinction-level event.
    • Oh sure, then we'll be doomed when the Triffids invade.

    • by AmiMoJo ( 196126 )

      Just use IR cameras, or properly light areas where you are using the tech.

      Or just don't use facial recognition, that's better for everyone.

      • by btroy ( 4122663 )
        Agreed - it would mean that cameras collect both visible and infrared pictures as signatures.
  • Our species has not evolved past bullshit like racial bias, so the shitty excuse for AI we have, which has zero capability to think for itself, only reflects who we are, like a perfect mirror: humans are racist, therefore the shitty AI is racist too. And there's no fixing it; any attempt to make it unbiased will just be scoffed at and devalued by the assholes among our species who have embraced racism - they'll dismiss it as 'liberal bias'.
  • Another step toward a tyrannical, Big Brother-like society, and all preceded, as usual, by the excuse of fighting crime. Google, please stick your Face ID system you know where.
  • Physics is racist (Score:5, Insightful)

    by argStyopa ( 232550 ) on Friday July 27, 2018 @12:15PM (#57019702) Journal

    It's harder to see the contours of a dark-colored shape (i.e., a face) than a light-colored one.

    Seriously, people, how are we going to get around that?

  • by SmaryJerry ( 2759091 ) on Friday July 27, 2018 @01:46PM (#57020288)
    They are reporting this as if every result is binary: a match or not a match. In reality there are associated confidence levels, and a "match" is likely anything above, say, 95% confidence. Less light means the data in a photo is more uniform, with less distinguishing detail. What they really need to do is apply a different confidence threshold to faces that reflect less light - maybe even run a completely separate facial recognition algorithm, so the accuracy of the better data doesn't muddy the confidence levels of the worse data.
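    The per-condition operating point the parent describes is straightforward to calibrate: instead of one global cutoff, pick a separate score threshold for each capture condition so that each hits the same target false-positive rate. A sketch (scores are synthetic; the "well-lit"/"low-light" split is assumed to be known at calibration time):

      import numpy as np

      def threshold_for_fpr(nonmatch_scores, target_fpr=0.001):
          """Cutoff above which only target_fpr of known non-matches fall."""
          return np.quantile(nonmatch_scores, 1.0 - target_fpr)

      rng = np.random.default_rng(3)
      # Matcher scores for pairs known NOT to match, split by capture condition;
      # the low-light pool is noisier, so its non-match scores spread wider.
      nonmatch_scores = {
          "well-lit":  np.clip(rng.normal(0.2, 0.10, 50_000), 0, 1),
          "low-light": np.clip(rng.normal(0.2, 0.20, 50_000), 0, 1),
      }

      for condition, scores in nonmatch_scores.items():
          t = threshold_for_fpr(scores)
          print(f"{condition}: threshold {t:.3f} gives a 0.1% false-positive rate")
      # A single threshold tuned on the well-lit pool would pass far more
      # false positives on low-light images.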
