
Self-Driving Cars May Hit People With Darker Skin More Often, Study Finds (futurism.com) 237

According to a new paper from the Georgia Institute of Technology, autonomous cars could disproportionately endanger pedestrians with darker skin, a troubling sign of how AI can inadvertently reproduce prejudices from the wider world. Futurism reports: [In the paper, the researchers] detail their investigation of eight AI models used in state-of-the-art object detection systems. These are the systems that allow autonomous vehicles to recognize road signs, pedestrians, and other objects. They tested these models using images of pedestrians divided into two categories based on their score on the Fitzpatrick scale, which is commonly used to classify human skin color. According to the researchers' paper, the models exhibited "uniformly poorer performance" when confronted with pedestrians with the three darkest shades on the scale. On average, the models' accuracy decreased by 5 percent when examining the group containing images of pedestrians with darker skin tones, even when the researchers accounted for variables such as whether the photo was taken during the day or at night. Thankfully, the researchers were able to figure out what was needed to avoid a future of biased self-driving cars: start including more images of dark-skinned pedestrians in the data sets the systems train on and place more weight on accurately detecting those images.
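The study's headline number is a disaggregated metric: detection accuracy computed separately per skin-tone group instead of one aggregate score. A minimal sketch of that kind of evaluation (the group labels and pass/fail records below are illustrative placeholders, not the paper's data):

```python
# Sketch: disaggregated detection accuracy across Fitzpatrick skin-tone groups.
# "LS" = lighter skin (Fitzpatrick 1-3), "DS" = darker skin (Fitzpatrick 4-6).
# The records below are made-up numbers mirroring the reported ~5-point gap.

def group_accuracy(records):
    """records: list of (group, detected) pairs, where detected is True
    if the pedestrian was correctly found. Returns accuracy per group."""
    totals, hits = {}, {}
    for group, detected in records:
        totals[group] = totals.get(group, 0) + 1
        hits[group] = hits.get(group, 0) + int(detected)
    return {g: hits[g] / totals[g] for g in totals}

records = [("LS", True)] * 95 + [("LS", False)] * 5 \
        + [("DS", True)] * 90 + [("DS", False)] * 10
print(group_accuracy(records))  # → {'LS': 0.95, 'DS': 0.9}
```

An aggregate accuracy over the same records would read 92.5% and hide the gap entirely, which is why per-group breakdowns matter for this kind of audit.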
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Wrong (Score:5, Interesting)

    by SuperKendall ( 25149 ) on Wednesday March 06, 2019 @07:42PM (#58228334)

    This is not true at all, it's based on false assumptions.

    First of all, most self driving cars will end up using LIDAR. Skin color, not an issue.

    Secondly, even cars with cameras do a lot of image transformations such that color is usually discarded. Your skin color is irrelevant to a recognizer looking for human forms.

    In fact you could argue that during the day, darker skin is an advantage because against a blue sky it's more noticeable than really pale skin which could look like clouds... #GingerLivesMatter.

    • it's based on false assumptions. ... Skin color, not an issue.

      So *that*'s what it is! I knew something was missing from this exercise [mit.edu].

    • by ffkom ( 3519199 ) on Wednesday March 06, 2019 @07:50PM (#58228382)
      So the anorexic may be hit by self-driving cars more often. On the other hand, fugitives from jail in their vertically striped uniforms may be in grave danger near crosswalk signs. While the hypertensive will survive more often, their red faces being interpreted as red traffic lights.

      So much prejudice to consider!
    • Does anyone know why infrared detectors are not used to locate people and animals by self driving cars?

      • Because oddly enough, there are other things which radiate heat at night. In places where it's warm to hot all day, the road itself would radiate large amounts of heat at night. The same with car engines and car exhausts, both of which move.

        Manhole covers also give off heat so the system would see this big spot and come to a stop in the middle of the road unless it had been specifically programmed to ignore such things, which then presents a whole new set of problems.

    • Re:Wrong (Score:5, Insightful)

      by dgatwood ( 11270 ) on Wednesday March 06, 2019 @08:21PM (#58228602) Homepage Journal

      This is not true at all, it's based on false assumptions.

      First of all, most self driving cars will end up using LIDAR. Skin color, not an issue.

      Secondly, even cars with cameras do a lot of image transformations such that color is usually discarded. Your skin color is irrelevant to a recognizer looking for human forms.

      In fact you could argue that during the day, darker skin is an advantage because against a blue sky it's more noticeable than really pale skin which could look like clouds... #GingerLivesMatter.

      Yes and no. Image recognition tends to be more sensitive to texture than to shape, and darker skin results in less contrast, which means less ability to see things like facial features that otherwise might identify the object as a human.

      You are correct that object detection should not be a meaningful part of your strategy for avoiding hitting things. Rather, object detection is for doing things like traffic light detection, road sign reading, and determining where nearby cars are located so that you can calculate when to change lanes, whether you need to accelerate while doing so, etc.

      Similarly, object detection should not be used for verifying that nothing is beside you, behind you, or in front of you. Those additional sanity checks are what RADAR, LIDAR, and SONAR are for.

      Moreover, even if we assume that image recognition is used for that purpose, parallax differences between cameras should tell you that there is something in front of you. No matter how dark your skin is, if the car thinks that you're part of the road, the software is doing something very wrong, and it's the procedural part of the code base that is failing, not the image recognition part. After all, if dark skin is indistinguishable from the road, so are grey or black automobiles.

      But — and this is a big but — detecting people near the road is often useful in terms of avoiding unexpected interactions later by slowing down, changing lanes, etc. And detecting gestures of police officers or other personnel directing traffic also needs to work regardless of their skin color. So it is important to ensure that training data doesn't show racial bias. The same is true for gender bias, attire bias, and any number of other things that could cause confusion for machine vision.

      What bugs me about this article is not that the premise is wrong, because it isn't necessarily, but rather that it appears to be entirely built upon a giant tower of hypotheticals, such as the training data being inadequate, the computer vision being used for critical behavior rather than LIDAR or other tech, etc., none of which are necessarily going to happen in the real world, and all of which are readily avoidable by just not cutting corners in development.

      Basically, it's like saying that a new nuclear reactor could seriously screw up the world if you forget to connect it to a water supply. My response is, "Yeah, no kidding."
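The mitigation the story summary describes (more dark-skinned samples in the training set, more weight on detecting them) is ordinary loss reweighting. A minimal sketch of per-sample weighting, with made-up group labels and weight values chosen only for illustration:

```python
# Sketch: up-weighting an under-represented group when averaging training loss.
# Group names ("LS"/"DS") and the weight values are illustrative assumptions.

def weighted_loss(losses, groups, weights):
    """losses: per-sample loss values; groups: group label per sample;
    weights: dict mapping group -> multiplier. Returns the weighted mean,
    so samples from up-weighted groups pull harder on the gradient."""
    total = sum(l * weights[g] for l, g in zip(losses, groups))
    norm = sum(weights[g] for g in groups)
    return total / norm

losses = [0.2, 0.8, 0.3, 0.9]
groups = ["LS", "DS", "LS", "DS"]
weights = {"LS": 1.0, "DS": 2.0}   # emphasize the under-represented group
print(weighted_loss(losses, groups, weights))
```

The alternative with the same effect is oversampling: duplicate (or sample with higher probability) the under-represented images so the optimizer simply sees them more often.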

      • Image recognition tends to be more sensitive to texture than to shape

        Not the kind used in autonomous driving, which is a lot more concerned about the shape of people than textures, since they could be wearing anything.

        darker skin results in less contrast, which means less ability to see things like facial features

        Facial features are like 1/1000000 as important as just knowing "that is a human," which is looking at a whole body shape. Mostly a car camera would not have enough resolution to perceive facial features anyway.

        • >>And detecting gestures of police officers or other personnel directing traffic also needs to work regardless of their skin color.

          >Exactly, so SKIN COLOR DOES NOT MATTER.

          My god you're desperate. Skin colour must not be a factor, therefore it isn't a factor. Checkmate liberals!

          • by AmiMoJo ( 196126 )

            Thanks, that post was a TL;DR rant but you managed to find the humour in it. Gave me a good chuckle.

        • skin color DOES NOT MATTER ONE BIT for that task, especially as the cameras are probably very IR sensitive.

          Yes, CCD and CMOS cameras are sensitive in the IR -- near IR, not far. In fact, most color cameras have an IR cut filter in front of the sensor just to prevent false color renditions, and many B/W cameras have them, too. Near IR is close to visible, and is not generated in any significant amount by something that isn't also radiating visible light, except for those deliberate near IR radiators. People aren't.

          Far IR is what comes from thermal emitters, like warm bodies, and unless you have a camera specifically designed for far IR, you won't pick them up.
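The near-IR/far-IR distinction above is easy to make concrete with Wien's displacement law: a body at skin temperature peaks deep in the thermal infrared, far beyond where silicon sensors respond. A quick back-of-the-envelope check (the silicon cutoff figure is the commonly cited approximate value):

```python
# Sketch: why warm bodies don't register on ordinary CCD/CMOS sensors.
# Wien's displacement law gives a blackbody's peak emission wavelength.

WIEN_B = 2.898e-3  # m*K, Wien's displacement constant

def peak_wavelength_um(temp_kelvin):
    """Peak emission wavelength in micrometers for a blackbody at temp_kelvin."""
    return WIEN_B / temp_kelvin * 1e6

body = peak_wavelength_um(310)   # human skin temperature, ~310 K
print(round(body, 1))            # ~9.3 um, deep in the far/thermal IR

SILICON_CUTOFF_UM = 1.1  # silicon photodiodes stop responding past ~1.1 um
print(body > SILICON_CUTOFF_UM)  # True: invisible to a normal camera
```

So a pedestrian's body heat sits roughly an order of magnitude past the sensor's cutoff, which is why thermal detection needs dedicated far-IR hardware such as microbolometers.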

      The anti-collision system pretty much has to use LIDAR. It's the only current technology with spatial accuracy and reliability sufficient for a self-driving application.

        The issue with using a camera system for anti-collision is that it doesn't work in many edge cases, as Tesla is experiencing.

        The combination of the two systems does work well, and can cover off many edge cases where the Lidar or the camera system by themselves is inadequate.

        While it is easily documented that a camera system re

      Actually the Subaru Eyesight system relies entirely on cameras and does active cruise control today. I own one so I know. It's mostly good at classifying what things are obstacles. It still gets confused every once in a while by a tennis ball I hung in my garage to tell me how far forward to pull in. Mostly it ignores it but sometimes it has a conniption fit. Also, since I have the 2016 version, it's generally terrible at pedestrian detection because the cameras are focused far out in front of the car. In
    • This is not true at all, it's based on false assumptions.

      First of all, most self driving cars will end up using LIDAR. Skin color, not an issue.

      Secondly, even cars with cameras do a lot of image transformations such that color is usually discarded. Your skin color is irrelevant to a recognizer looking for human forms.

      In fact you could argue that during the day, darker skin is an advantage because against a blue sky it's more noticeable than really pale skin which could look like clouds... #GingerLivesMatter.

      On the one hand there is SuperKendall with a totally unsupported but very authoritative set of assertions that he pulled out of his posterior. On the other hand there is a bunch of scientists at the Georgia Institute of Technology.... hmmmm .... whom to believe ???? .... I'm gonna go with Georgia Tech.

    • Comment removed based on user account deletion
    • Your skin color is irrelevant to a recognizer looking for human forms.
      It is not.

      When it is dark, white skin is easier to see than black skin. For your eyes, as well as for a camera.

      do a lot of image transformations such that color is usually disposed of.
      Making a colour picture into a black and white picture still leaves the black people black and the white people white.
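The point is easy to verify: the standard luminance conversion used to grayscale an image preserves the dark/light distinction rather than erasing it. A minimal sketch using the common ITU-R BT.601 luma weights (the RGB triples are illustrative values, not measured skin tones):

```python
# Sketch: converting RGB to grayscale keeps dark pixels dark.
# Uses the common ITU-R BT.601 luma coefficients (sum to 1.0).

def luma(r, g, b):
    """Perceptual brightness of an RGB pixel, 0 (black) to 255 (white)."""
    return 0.299 * r + 0.587 * g + 0.114 * b

light_skin = (230, 200, 180)  # illustrative RGB values
dark_skin = (90, 60, 45)

print(luma(*light_skin) > luma(*dark_skin))  # True: the contrast gap survives
```

Dropping the color channels removes hue, not brightness, so any contrast-against-background difference between light and dark skin carries straight through to the grayscale input the recognizer sees.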

      • by Bert64 ( 520050 )

        How much skin is actually being shown on average? Most people will be wearing clothes, and in colder areas will be covering most of their skin with them.

        That said, differentiating any object from the background depends on contrast, if the object you're trying to identify is dark and so is the background then recognising it becomes harder. The colour of clothes is important too.

        Based on practical experience, I've often encountered dark-skinned people wearing dark clothes at night, which can make them much harder to see.

    • All I can say is "Whoo hoo!" Mow them down, cut the welfare!

      / just kidding
      // really, just kidding. And why do I have to put explicit break HTML tags in this day and age.
      /// Can we train them to mow down lazy /. HTML coders instead?
      • And why do I have to put explicit break HTML tags in this day and age.

        You should be able to change that in your settings.

    • by Hylandr ( 813770 )

      It's not racism, it's contrast. Pure and simple. Cameras are at work because LIDAR can be jammed.

    • by AmiMoJo ( 196126 )

      Lidar isn't a given. Tesla is trying to do self driving with only cameras and a front facing radar, for example.

  • Comment removed based on user account deletion
    • "may" and "finds" don't belong together. You can't promote a 'maybe' to a 'definitely' in the same sentence.

      That's standard practice in the humanities, especially the grievance majors. All things are explained by an "ism" in those circles.

    • by mentil ( 1748130 )

      Equivocation may be combinable in sentence with definitive statement, amateur grammatist finds.

  • by Anonymous Coward

    If the glaring sun is behind you at sunrise, you'll be hit too. The laws of physics are not selective.

  • by Anonymous Coward

    This is officially the most snowflake story I have ever seen on SlashDot. Are you serious? Good grief, you kids all need to be spanked. Also: 'self-driving' cars hit any people they hit because *the tech doesn't work*. And it never will.

    • by mark-t ( 151149 )

      To be fair, "the tech doesn't work" applies to the human driven cars that hit people too... and it never will either, because people will always make mistakes.

      The best we can hope for in self-driving cars is to reduce the number of people that get hit to be low enough that when someone gets hit by a car at all, it's so outside the norm that it becomes real news.

    • by Cederic ( 9623 )

      What the fuck is snowflake about this? Image recognition systems have a discernible, measurable flaw that impinges on their ability to support required safety levels. That's not snowflake, that's technology and something to explore and address.

      But sure, I'll go for the spanking.

    • This is officially the most snowflake story I have ever seen on SlashDot. Are you serious? Good grief, you kids all need to be spanked.

      This is the most slashdot ever answer. World throws up results you don't like? Just beat people until they start denying reality. Problem solved!

  • by RyanFenton ( 230700 ) on Wednesday March 06, 2019 @07:53PM (#58228408)

    I looked at the actual article, and the article it references - and they're all short tabloid blurbs without any link to the full paper.

    Nothing obvious showing up on Georgia Institute of Technology's websites.

    Like with most reports on early reporting on scientific studies, it helps to see what the actual text says - reporters have a tendency to, well, sensationalize findings to meet their own needs.

    Ryan Fenton

  • by Anonymous Coward

    How about people wearing all black? Are ninjas safe? Will stage workers get run over on the way to their cars after the show?

    How much skin was showing in the images? Were these streakers or people wearing blaze orange hunting parkas? Can just a face cause this issue if their hands are in the pockets of their parka?

    Shouldn't these cars be avoiding things in the road in general? Say deer, pets, moose, etc?

    • Ninjas are never safe, and they never expected to be safe.

      I recommend leaping to safety, or at least throwing shuriken at the grill so that they can identify and return your body.

    • What about North Carolina politicians doing Michael Jackson impressions?
      We need more training data!
      • What about North Carolina politicians doing Michael Jackson impressions?

        What about Michael Jackson himself? I mean, before he died but after cosmetic procedures.

  • by msauve ( 701917 ) on Wednesday March 06, 2019 @08:04PM (#58228482)
    Unless the study is done in winter, in Scandinavia, that is.
  • Flipflipflipflipflipflipflipflipflipflipflipflipflipflipflip

    Automotive AI!

  • by epine ( 68316 ) on Wednesday March 06, 2019 @08:06PM (#58228496)

    There aren't many black people where I live, and when I do encounter a black person, especially a very dark person, it is definitely more difficult at first to accurately read facial expressions.

    This is probably a combination of my environment, my long relationship with my keyboard in a dark room, and a side order of actual physics (optics).

    • There aren't many black people where I live,

      So there's an argument you may want to prioritize light skinned pedestrians since there's more of them... but that's probably controversial.

      and when I do encounter a black person, especially a very dark person, it is definitely more difficult at first to accurately read facial expressions.

      This is probably a combination of my environment, my long relationship with my keyboard in a dark room, and a side order of actual physics (optics).

      It's also not really relevant.

      The person detection systems in use are relying more on general body form than facial features. More likely they just don't have as many training samples in their data sets.

      • by aevan ( 903814 )
        You just agreed with him though: he listed a lack of training samples as a cause for his inability. i.e. not enough encounters with darker skinned people
    • by AmiMoJo ( 196126 )

      Yep, it's a well understood issue. It's why it's important to have dark skinned people represented on TV and in movies - it helps everyone get used to it.

      Now watch the push-back against an easy, simple solution.

  • Before you die, you see the bling.

  • We all know why.... (Score:4, Interesting)

    by argStyopa ( 232550 ) on Wednesday March 06, 2019 @08:12PM (#58228536) Journal

    ....because physics is racis.

    • Re: (Score:2, Insightful)

      by AmiMoJo ( 196126 )

      Why is it always racism with you? You are obsessed.

      It's hard to have a conversation about improving tech when people go around screaming racism at everything. Please stop.

      • Why is it always racism with you?

        It's because he doesn't understand what the word "racist" actually means. This is quite common over here. As far as many people understand, "racist" is just an insult word thrown around by liberals, that just means "you're a bad person". There's some vague understanding that it often crops up around matters of race but that's about as far as it goes.

        • by AmiMoJo ( 196126 )

          I think they are just primed to launch into their anti-SJW diatribe any time anything to do with race or skin colour comes up. Maybe it's deliberate, maybe it's some kind of programmed Pavlovian response. Either way someone is pushing that narrative.

  • Nothing would be said if the finding were that people wearing dark colored clothes and a hoodie are more difficult to detect. And why wouldn't they just train these systems with more dark-skinned people?

  • Pixel counting (Score:4, Insightful)

    by Z80a ( 971949 ) on Wednesday March 06, 2019 @08:25PM (#58228622)

    I think the clothing counts a lot more than the skin, given they cover most of the body of people.
    Which means if you're a goth, self-driving cars are most likely to hit you.

    • Re:Pixel counting (Score:4, Insightful)

      by Tablizer ( 95088 ) on Wednesday March 06, 2019 @08:49PM (#58228750) Journal

      2 or 3 times I've come very close to accidentally flattening pedestrians at night wearing dark clothes and having a dark complexion/tan. They just blended into the background. Regardless of your skin color, please DON'T walk around at night wearing dark clothes. Leave ninja-ing to ninjas.

  • I Hate Black People (Score:2, Interesting)

    by Anonymous Coward

    Apparently I hate black people because I almost ran one down last night. He was wearing black, standing on the highway, and was waving his hands around. The only thing you could see of him before the headlights hit him was the tiny cell phone light in his hand.

    Guy ran out of gas and was too poor to pay for a tow truck. Yeah I drove him to a gas station and wasn't murdered, nor did I kill him. But the internet says I hate black people since I've never almost run over a white guy. Anyone want to volunteer

    • by mentil ( 1748130 )

      You'll just have to intentionally nearly run over a white person, to make yourself an equal-opportunity near-vehicular-manslaughterer. Maybe an Asian, too, just to be safe.

    • by Cederic ( 9623 )

      If it helps, you also failed to run me over last night too, so I think you can claim to be an equal opportunity accident avoider.

  • by Oligonicella ( 659917 ) on Wednesday March 06, 2019 @08:45PM (#58228726)

    a troubling sign of how AI can inadvertently reproduce prejudices from the wider world

    • You ask the right question. Somebody is fishing for clicks and/or cannot understand the idea of contrast.

      And, yes, everything where I am now is covered with snow and ice.

    • a troubling sign of how AI can inadvertently reproduce prejudices from the wider world

      Well the sun is prejudiced against light skinned people because it gives them more sun burns /s

    • by Trip6 ( 1184883 )
      The only logical response to this horrid article.
    • The demand for racism far exceeds the supply. It's made-up clickbait
  • In my part of the world most people wear clothing. It doesn't matter what your skin color is when only 4% of your surface area is skin.

    Unfortunately, most of those people wear dark clothing at night. Children and adults, male and female, pedestrians and bicyclists. Even fire engine red is almost indistinguishable from black at night. So, these people are at risk from motorists already. Self-driving cars are obviously not a concern of these people.

    • by Bert64 ( 520050 )

      If 4% is a light color and stands out from the background that's still a bit better than 0%, which gives a slight advantage to light skinned people.

      But a bigger advantage can be had by wearing light colored and/or reflective clothes. If you're walking around at night wearing dark clothes in a poorly lit area you're less likely to be seen which is generally not to your advantage unless you're planning to do something illegal.

  • How about making self driving cars that don't hit any people at all? If a person jumps in front of the car and gets hit, then the car was going too fast for that environment. Isn't this how it works with non self driving cars? In most situations I am aware of, if I hit a pedestrian, it was my fault. In the few situations I am not at fault, like jaywalking from behind a completely obstructed vantage point or a small child running under the car from a completely obstructed vantage point, I cannot imagine
  • I'd hope so!
  • I only had a quick glance through the paper so not sure if it's addressed, but: what is the normal everyday rate of human drivers hitting people with darker skin? How does that compare to self-driving cars?

    I nearly hit a dark-skinned cyclist just a couple days ago, about 3 seconds after he was nearly hit by another car. Wearing almost all black and riding at night with no lights. He was nearly completely invisible and it was obvious the other car only saw him at the last second, just like I did - in fact on

  • Self-Driving Cars May Hit People With Darker Skin More Often, Study Recommends.

  • https://en.m.wikipedia.org/wik... [wikipedia.org] is very high

  • Comment removed based on user account deletion
  • I just wish the self-driving advocates would decide whether we want it to be better than a human or not. So many "but a human does it too" comments are way off base. In self driving you have the opportunity to be better than a human, why would you not take every opportunity to eliminate every flaw?
  • I'll bet they will hit people who wear all black/navy blue with their hoods up more often too!

    Heck, I'll bet humans hit them more too. You know, Scene Contrast. I can't tell you how much I hate the NY "we wear all dark clothes" thing on rainy nights. Add in jaywalking, and I can't tell you how close I've come at times. It is why I added "Black retro-reflective" stripes to one of my black jackets, and one of my new jackets is safety yellow with DOT level 3 striping. Sometimes I'm required to be roadside

  • Thinking that accidentally being more likely to hit people who are harder to detect on camera is a carry-over of real-world prejudice shows how people have lost perspective on what prejudice really is.

    Not everything that disproportionately impacts some racial marker is racist, only a deliberate effort to target by race does that.

    Automated processes and algorithms aren't racist.

  • "Thankfully, the researchers were able to figure out what was needed to avoid a future of biased self-driving cars: start including more images of dark-skinned pedestrians in the data sets the systems train on and place more weight on accurately detecting those images."

    Another possible solution would've been to randomly hit people with light skin color that the AI recognized with a small probability, so that it evens out.

  • Man, I wish that show hadn't been cancelled. The episode with the drinking fountains was just too predictive.

    https://vimeo.com/29017688 [vimeo.com]
