AI Technology

Facial Recognition Is Accurate, if You're a White Guy (nytimes.com) 284

Facial recognition technology is improving by leaps and bounds. Some commercial software can now tell the gender of a person in a photograph. When the person in the photo is a white man, the software is right 99 percent of the time. But the darker the skin, the more errors arise -- up to nearly 35 percent for images of darker skinned women, the New York Times reported, citing a new study. From the report: These disparate results, calculated by Joy Buolamwini, a researcher at the M.I.T. Media Lab, show how some of the biases in the real world can seep into artificial intelligence, the computer systems that inform facial recognition. In modern artificial intelligence, data rules. A.I. software is only as smart as the data used to train it. If there are many more white men than black women in the system, it will be worse at identifying the black women. One widely used facial-recognition data set was estimated to be more than 75 percent male and more than 80 percent white, according to another research study.
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Sunday February 11, 2018 @10:08AM (#56103923)

    White people's faces reflecting more light is problematic.

  • by mpercy ( 1085347 ) on Sunday February 11, 2018 @10:12AM (#56103937)

    Always knew that machines were bigots.

  • So the government will be less likely to know where I am.

  • Here we go again (Score:4, Interesting)

    by Jody Bruchon ( 3404363 ) on Sunday February 11, 2018 @10:39AM (#56104017)
    This reminds me of that ridiculous article (and accompanying video) [youtube.com] saying that color film was biased towards white people. [vox.com] Around 3:30 in the video they have white and black people stand in front of a face-following camera and it doesn't work for the black people. Everyone acts like this is some sort of Harry Potter wizardry against the black man keeping him down when it's vastly simpler than that.

    For progressively darker skin, progressively higher light on that skin is required to reveal its contours. The fundamental problem is that white and light-skinned brown people have their normal skin color shades in the midtones when a scene is properly exposed while darker-skinned brown and black people are closer to shadows. To expose properly for facial recognition of dark brown or black skin, you have to overexpose the midtones to bring up the shadows. Since people rarely take photos on purpose that are exposed for the shadows while blowing everything else out, it should be fairly obvious that facial recognition (and early ISO 32 color film and small-sensor cameras like webcams and phone cameras) will have a very hard time with dark skin. Sure, it could be a lack of data in some instances, but it's far more likely to be the fact that the skin absorbs more light and photographs are generally exposed too low to reveal enough detail for the machines to analyze.

    If you think this is "racist" you're saying that the nature of light itself is racist. I don't feel like I should have to explain why that position is really stupid.
    • by MrMr ( 219533 )
      Automatic exposure is normally calibrated for 18% gray (an average outdoor scene). That means if you make a portrait of a skin tone darker or lighter than that, you need to think about what you are metering and compensate. https://en.wikipedia.org/wiki/... [wikipedia.org]
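      The 18% calibration mentioned above translates into a simple rule of thumb: when the meter reads only the subject, the compensation in stops is the base-2 logarithm of the subject's reflectance relative to middle gray. A minimal sketch; the reflectance figures are illustrative assumptions, not measured skin values:

```python
import math

def compensation_stops(subject_reflectance, meter_gray=0.18):
    """Stops of exposure compensation needed so a spot-metered subject
    renders at its true tone instead of being pulled to 18% middle gray."""
    return math.log2(subject_reflectance / meter_gray)

compensation_stops(0.36)  # +1 stop: light subject, meter would underexpose it
compensation_stops(0.09)  # -1 stop: dark subject, meter would overexpose it
```

      A subject twice as reflective as middle gray needs exactly one stop more exposure than metered, and one half as reflective needs one stop less, which is why "set it and forget it" metering struggles at both extremes.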
    • Re:Here we go again (Score:4, Informative)

      by AmiMoJo ( 196126 ) on Sunday February 11, 2018 @11:17AM (#56104187) Homepage Journal

      The point you missed is that set lighting for white people has to be carefully designed and set up. One of the reasons colour film took so long to become practical was the difficulty of getting skin tones right.

      If you look at early colour film, the skin tones of white people are pretty good, but other colours are way off: oversaturated in places, washed out in others. It was a design decision.

      Vox is correctly pointing out that film from the era was not designed for dark skin, and that made things hard for non-white actors. Similar to how, when sound came in, a lot of actors lost work because they had thick accents.

      Note that the Vox article does not contain the word "racist" or even "race". It's pertinent because we are now seeing more black actors on screen and Hollywood finally figured out how to light them properly.

      • I'm looking at early color film and the dark-skinned people shown look pretty darn good. [youtube.com] The more I look for images of dark skin on early color film the more I find evidence that goes against the assertion that early film was specifically made for "white" skin. Kodachrome seems to make everyone look pretty good. [youtube.com]
      • There are lots of different kinds of film, even from the same manufacturer. Different kinds of film are better at capturing different colors. There's long been film which was better for photographing whites, and film which was better for photographing people with dark skin.

      • set lighting for white people has to be carefully designed and set up. One of the reasons colour film took so long to become practical was the difficulty of getting [white] skin tones right.

        I assume you've got a reputable source for this assertion, right? I mean, surely you didn't just pull it out of your ass (or the ass of a gender-studies course*).

        Pertaining to older movies, which ones are designed to be slanted in their color representations? I watched Spartacus, for example, which came out in 1960, and while the cast is certainly overwhelmingly white, there's a notable scene [youtube.com] with Kirk Douglas and Woody Strode which hardly appears to be slanted against Woody in the color balance/saturation.

    • by zmooc ( 33175 )

      I suggest not explaining this in terms of exposure etc., because that will simply trigger a discussion about (early) photography technology development being racist.

      It's much simpler than that: the darker something is, the less light it reflects, the less information is present in its appearance, and the harder it is to recognize. This will always be the case, and developments in photography technology will never solve it; they will alleviate the problem at best. If it is a problem, that is, because I think facial
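      The reflectance argument can be made semi-quantitative: under shot noise, a pixel's signal-to-noise ratio scales with the square root of the photons it collects, and the photon count scales with the surface's reflectance. A rough sketch; the reflectance values are illustrative assumptions, not measurements:

```python
import math

def pixel_snr(incident_photons, reflectance):
    """SNR of a pixel dominated by shot noise: signal N over noise sqrt(N)."""
    collected = incident_photons * reflectance
    return collected / math.sqrt(collected)

# Halving reflectance costs a factor of sqrt(2) in SNR, so at equal
# exposure a darker surface carries measurably less recoverable detail.
snr_light = pixel_snr(10_000, 0.35)
snr_dark = pixel_snr(10_000, 0.08)
```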

    • I agree that a significant part of this is a physics problem. It would be possible to test whether or not this is algorithmic by training a recognizer on a high percentage of dark skinned people and seeing what its performance was like on light skinned people.

      A lot of modern cameras / cell phones have live face detection features. A photo setting that set exposure for faces would help this. People might not use it much though - if you have a dark skinned person in a scene, many people may still prefer th
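      The data-quantity side of the experiment proposed above can be probed with a toy simulation: estimate identity "templates" from many training photos for one group and from few for another, then compare match accuracy. A minimal sketch with numpy; every number here (dimensions, noise levels, sample sizes) is an illustrative assumption, not a model of real faces:

```python
import numpy as np

rng = np.random.default_rng(42)
DIM, N_IDS, TEST_PHOTOS = 16, 30, 20
TRAIN_NOISE, TEST_NOISE = 2.0, 0.5

# Each identity is a random point; "photos" are noisy observations of it.
true_ids = rng.normal(size=(N_IDS, DIM))

def make_templates(n_train):
    """Estimate each identity's template by averaging n_train noisy photos."""
    photos = true_ids[:, None, :] + rng.normal(
        scale=TRAIN_NOISE, size=(N_IDS, n_train, DIM))
    return photos.mean(axis=1)

def match_accuracy(templates):
    """Fraction of test photos matched to the correct identity template."""
    correct = 0
    for i in range(N_IDS):
        for _ in range(TEST_PHOTOS):
            probe = true_ids[i] + rng.normal(scale=TEST_NOISE, size=DIM)
            nearest = np.argmin(np.linalg.norm(templates - probe, axis=1))
            correct += int(nearest == i)
    return correct / (N_IDS * TEST_PHOTOS)

acc_well_sampled = match_accuracy(make_templates(100))  # lots of training data
acc_undersampled = match_accuracy(make_templates(2))    # scarce training data
```

      The undersampled group's templates are noisier, so its match rate drops even though the matching algorithm is identical for both groups, which is the training-data effect the summary describes, separated from any optics.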

    • by HiThere ( 15173 )

      Sorry, but that's not the problem.

      In the summary the problem was stated to be the frequency of images in the training set data.

      The technical details you specify may be correct, but they are irrelevant to this particular problem. And they wouldn't explain the problem with recognizing women in any case.

      • Well I guess you better get off your ass and implement a better algorithm, then. After all, light reflectivity isn't a problem for you.

        Also, what is the problem, in the summary, with recognizing women vs. men? This statement?

        more errors arise -- up to nearly 35 percent for images of darker skinned women

        Oh, wait... it's darker skinned women, not just women in general, that the summary references. Perhaps the technical detail referenced by the parent isn't irrelevant after all?

        No, I'm sure you're right, and will roll out that politically correct recognition algorithm of yours in no time

  • by epine ( 68316 ) on Sunday February 11, 2018 @01:01PM (#56104551)

    Someone needs to test whether humans, also, decline in speed or accuracy of facial recognition when dealing with darker shades of skin colour.

    I know for certain that I have more trouble reading facial emotion from black people than from white people. The naive response is that I live in a city that's 95% white. But I've never been able to convince myself that this is the correct explanation. I simply feel like I have less visual data than I would otherwise at the same point in the cognitive process.

    Suppose I lived on a troop deployment in Afghanistan, and 90% of the people around me wore camo all the time. Would I actually become better at recognizing camo than civilian gear? That is, after all, the converse implication of the naive hypothesis.

    There are populations in Brazil that experience the entire range of skin tones on a daily basis. These populations could be tested for recognition rate/accuracy for lighter and darker test cases.

    I highly suspect that darker skin tone has a detectable coefficient of identity camouflage, also in human cognition.

    • I know for certain that I have more trouble reading facial emotion from black people than white people. The naive response is that I live in a city that's 95% white.

      The more likely reason is that you grew up in an environment that was 95% white.

      It's well-established (many studies) that people are better at recognizing faces similar to those they grew up looking at. Just like with machine learning, human brains trained on white faces are better at distinguishing white faces, and human brains trained on black faces are better at distinguishing black faces.

      I highly suspect that darker skin tone has a detectable coefficient of identity camouflage, also in human cognition.

      That would not explain why Africans who grow up without seeing white faces think all white people look alike, but c

    • by AmiMoJo ( 196126 )

      Recognising emotion is different from recognising identity. It's heavily dependent on culture. It took me a while to learn to recognise Chinese and Japanese emotions from people's faces, because they are different from British ones. I guess differently shaped faces probably had an influence too.

      But I don't think skin colour alone was much of a factor, which is what screws up these facial recognition systems.

  • In our increasingly Orwellian society, I would be quite happy to have facial recognition technology be less effective on my skin tone (fair).
  • by quietwalker ( 969769 ) <pdughi@gmail.com> on Sunday February 11, 2018 @01:15PM (#56104623)

    Facial morphology refers to the various traits and features in a face. For example, the distance between the eyes, or the eye slant, or cheek gaunt or whatever.

    'White' people have the broadest range of diversity, in part because, aside from skin color, there are a lot of differences. Certain Asian populations, like the Han Chinese, have some of the least diversity (google for the iPhone face recognition matching two Chinese co-workers).

    If you pick 20 key features as your unique code, and each of those key features has 20-30 distinct possible values, you can rely on reasonable uniqueness, even when some of those values have inter-relationships. When the diversity goes down, and 10 out of the 20 are not unique, and when the range of values those have is between 3 and 5, well, you'll have a lot more trouble differentiating people.

    In fact, studies show that within a given ethnic group, real people perform facial recognition using only a few features, but those features are always the traits that show the most variation. When you apply that same algorithm to another ethnicity, it doesn't work so well. You get racist-seeming phrases like "They all look alike to me," when really the issue is that your specialized detection algorithm was never meant to deal with their differences. ... and every group has this blindness. The one thing that's amusing is that because whites tend to have a large variety, they're the easiest to uniquely identify regardless of your personal/cultural/ethnic technique. So you can say things like "I can tell all you white people apart; you're racist for not being able to identify ME!" and think you're on the moral and ethical high road, when in fact the situation is just different from the other side.
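    The uniqueness argument above is essentially a birthday problem. A toy calculation in that spirit; the feature and value counts are illustrative assumptions (real facial features are correlated, so true code spaces are smaller than these products suggest):

```python
# Birthday-problem view of facial uniqueness: F independent features with
# V distinguishable values each give V**F possible "codes"; the chance
# that two people share a code grows quickly as the code space shrinks.
def collision_probability(n_people, n_codes):
    """P(at least two of n_people share a code), codes drawn uniformly."""
    p_all_unique = 1.0
    for k in range(n_people):
        p_all_unique *= (n_codes - k) / n_codes
    return 1.0 - p_all_unique

rich_space = 25 ** 20  # 20 features with ~25 usable values each
poor_space = 4 ** 8    # only 8 effective features with ~4 values each

p_rich = collision_probability(1_000, rich_space)  # effectively zero
p_poor = collision_probability(1_000, poor_space)  # near certainty
```

    With the rich code space, a thousand people essentially never collide; with the collapsed one, two "look-alikes" are all but guaranteed, which is the "they all look alike" failure mode stated as arithmetic.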

  • Rather than:

    some of the biases in the real world can seep into artificial intelligence, the computer systems that inform facial recognition

    Maybe the issue is lighting? Why does a simple thing such as AI to identify gender from a facial camera have to be an example of latent racism? As if programmers subtly, unconsciously, monkeyed with the algorithm to only work for white faces.

  • by tezbobobo ( 879983 ) on Sunday February 11, 2018 @06:35PM (#56105683) Homepage Journal

    Why are we calling this bias? White males have the widest range of unique identifying characteristics:

    * Beards

    * Moustaches

    * More tonal contrast

    * Difference in eye and hair colour

  • Apple's FaceID uses infrared depth perception, where light contrast isn't an issue.

    Apologies for the inflammatory title https://www.gizmodo.com.au/201... [gizmodo.com.au]
    They also went to the effort of testing it out on various ethnicities as well, so the AI didn't overly focus on areas that are different for one group but similar in another.

    I'm not saying that Face ID is "racist" or anything like that. I'm just happy there's a technical solution that solves this problem.

  • Even computers think that all black people look alike!
  • I'm having trouble with the fact that it's so accurate at identifying men but not women. Gender politics aside, how does this work?
