Self-Driving Cars May Hit People With Darker Skin More Often, Study Finds (futurism.com)
According to a new paper from the Georgia Institute of Technology, autonomous cars could disproportionately endanger pedestrians with darker skin, a troubling sign of how AI can inadvertently reproduce prejudices from the wider world. Futurism reports: [In the paper, the researchers] detail their investigation of eight AI models used in state-of-the-art object detection systems. These are the systems that allow autonomous vehicles to recognize road signs, pedestrians, and other objects. They tested these models using images of pedestrians divided into two categories based on their score on the Fitzpatrick scale, which is commonly used to classify human skin color. According to the researchers' paper, the models exhibited "uniformly poorer performance" when confronted with pedestrians with the three darkest shades on the scale. On average, the models' accuracy decreased by 5 percent when examining the group containing images of pedestrians with darker skin tones, even when the researchers accounted for variables such as whether the photo was taken during the day or at night. Thankfully, the researchers were able to figure out what was needed to avoid a future of biased self-driving cars: start including more images of dark-skinned pedestrians in the data sets the systems train on and place more weight on accurately detecting those images.
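To make the comparison concrete, here is a minimal, purely illustrative sketch of the kind of per-group evaluation the paper describes: split the annotated pedestrians by Fitzpatrick group and compare detection recall. The data layout, the LS/DS group names, and the numbers below are assumptions for illustration, not the researchers' actual code or results.

```python
from dataclasses import dataclass

@dataclass
class Annotation:
    fitzpatrick: int   # 1-6 on the Fitzpatrick scale
    detected: bool     # did the detector produce a matching box above threshold?

def recall_by_group(annotations):
    """Detection recall for lighter (types 1-3) vs. darker (types 4-6) skin tones."""
    groups = {"LS (1-3)": [], "DS (4-6)": []}
    for a in annotations:
        key = "LS (1-3)" if a.fitzpatrick <= 3 else "DS (4-6)"
        groups[key].append(a.detected)
    return {name: sum(hits) / len(hits) for name, hits in groups.items() if hits}

# Toy numbers showing a gap similar to the ~5 points quoted in the summary.
sample = ([Annotation(2, True)] * 93 + [Annotation(2, False)] * 7
          + [Annotation(5, True)] * 88 + [Annotation(5, False)] * 12)
print(recall_by_group(sample))   # {'LS (1-3)': 0.93, 'DS (4-6)': 0.88}
```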
Wrong (Score:5, Interesting)
This is not true at all, it's based on false assumptions.
First of all, most self driving cars will end up using LIDAR. Skin color, not an issue.
Secondly, even cars with cameras do a lot of image transformations such that color is usually discarded. Your skin color is irrelevant to a recognizer looking for human forms.
In fact you could argue that during the day, darker skin is an advantage because against a blue sky it's more noticeable than really pale skin which could look like clouds... #GingerLivesMatter.
Re: (Score:2)
it's based on false assumptions. ... Skin color, not an issue.
So *that*'s what it is! I knew something was missing from this exercise [mit.edu].
But LIDAR scans will miss skinny people more often (Score:5, Funny)
So much prejudice to consider!
Humans can make the same mistakes. (Score:2)
Re: (Score:2)
Does anyone know why infrared detectors are not used to locate people and animals by self driving cars?
Re: (Score:3)
Because oddly enough, there are other things which radiate heat at night. In places where it's warm to hot all day, the road itself would radiate large amounts of heat at night. The same with car engines and car exhausts, both of which move.
Manhole covers also give off heat so the system would see this big spot and come to a stop in the middle of the road unless it had been specifically programmed to ignore such things, which then presents a whole new set of problems.
Re: (Score:2)
So does the sun, cars, hot rooftops, ...
Re:Wrong (Score:5, Insightful)
This is not true at all, it's based on false assumptions.
First of all, most self driving cars will end up using LIDAR. Skin color, not an issue.
Secondly, even cars with cameras do a lot of image transformations such that color is usually discarded. Your skin color is irrelevant to a recognizer looking for human forms.
In fact you could argue that during the day, darker skin is an advantage because against a blue sky it's more noticeable than really pale skin which could look like clouds... #GingerLivesMatter.
Yes and no. Image recognition tends to be more sensitive to texture than to shape, and darker skin results in less contrast, which means less ability to see things like facial features that otherwise might identify the object as a human.
You are correct that object detection should not be a meaningful part of your strategy for avoiding hitting things. Rather, object detection is for doing things like traffic light detection, road sign reading, and determining where nearby cars are located so that you can calculate when to change lanes, whether you need to accelerate while doing so, etc.
Similarly, object detection should not be used for verifying that nothing is beside you, behind you, or in front of you. Those additional sanity checks are what RADAR, LIDAR, and SONAR are for.
Moreover, even if we assume that image recognition is used for that purpose, parallax differences between cameras should tell you that there is something in front of you. No matter how dark your skin is, if the car thinks that you're part of the road, the software is doing something very wrong, and it's the procedural part of the code base that is failing, not the image recognition part. After all, if dark skin is indistinguishable from the road, so are grey or black automobiles.
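To make the parallax point concrete, here is a minimal sketch, assuming a calibrated, rectified stereo pair saved as grayscale images; the file names, block-matcher parameters, and disparity cutoff are made up, and a real system would convert disparity to metric depth using the camera baseline and focal length.

```python
# Toy stereo-parallax check: anything with large disparity is close to the
# camera, regardless of its colour. Assumes rectified left/right grayscale frames.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point -> pixels

close_mask = disparity > 30.0   # larger disparity = closer object
print("pixels flagged as close:", int(close_mask.sum()))
```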
But — and this is a big but — detecting people near the road is often useful in terms of avoiding unexpected interactions later by slowing down, changing lanes, etc. And detecting gestures of police officers or other personnel directing traffic also needs to work regardless of their skin color. So it is important to ensure that training data doesn't show racial bias. The same is true for gender bias, attire bias, and any number of other things that could cause confusion for machine vision.
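For what it's worth, here is a rough sketch of one way to act on that during training, assuming a PyTorch-style pipeline with a per-example skin-tone group label; the dataset layout, names, and proportions are invented for illustration, not any vendor's actual pipeline. The idea is simply to oversample the under-represented group so each group contributes roughly equally per epoch.

```python
# Illustrative oversampling of an under-represented group in the training data.
from collections import Counter
import torch
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

# Toy data: features, labels, and a per-example group label (0 = lighter, 1 = darker).
features = torch.randn(1000, 16)
labels = torch.randint(0, 2, (1000,))
groups = torch.cat([torch.zeros(850, dtype=torch.long),
                    torch.ones(150, dtype=torch.long)])

counts = Counter(groups.tolist())
weights = torch.tensor([1.0 / counts[g.item()] for g in groups])  # rarer group -> higher weight

sampler = WeightedRandomSampler(weights, num_samples=len(weights), replacement=True)
loader = DataLoader(TensorDataset(features, labels, groups), batch_size=32, sampler=sampler)

batch = next(iter(loader))
print("darker-skin fraction in one batch:", batch[2].float().mean().item())  # ~0.5 on average
```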
What bugs me about this article is not that the premise is wrong, because it isn't necessarily, but rather that it appears to be entirely built upon a giant tower of hypotheticals, such as the training data being inadequate, the computer vision being used for critical behavior rather than LIDAR or other tech, etc., none of which are necessarily going to happen in the real world, and all of which are readily avoidable by just not cutting corners in development.
Basically, it's like saying that a new nuclear reactor could seriously screw up the world if you forget to connect it to a water supply. My response is, "Yeah, no kidding."
Not quite right as well... (Score:3)
Image recognition tends to be more sensitive to texture than to shape
Not the kind used in autonomous driving, which is lots, lots more concerned about the shape of people than textures, since they could be wearing anything.
darker skin results in less contrast, which means less ability to see things like facial features
Facial features are like 1/1000000 as important as just knowing "that is a human" which is looking at a whole body shape. Mostly a car camera would not have enough resolution to perceive faci
Re: (Score:2)
>>And detecting gestures of police officers or other personnel directing traffic also needs to work regardless of their skin color.
>Exactly, so SKIN COLOR DOES NOT MATTER.
My god you're desperate. Skin colour must not be a factor, therefore it isn't a factor. Checkmate liberals!
Re: (Score:2)
Thanks, that post was a TL;DR rant but you managed to find the humour in it. Gave me a good chuckle.
Re: (Score:2)
skin color DOES NOT MATTER ONE BIT for that task, especially as the cameras are probably very IR sensitive.
Yes, CCD and CMOS cameras are sensitive in the IR -- near IR, not far. In fact, most color cameras have an IR-cut filter in front of the sensor just to prevent false color renditions, and many B/W cameras have them too. Near IR is close to visible and is not generated in any significant amount by anything that isn't also radiating visible light, except for deliberate near-IR emitters. People aren't.
Far IR is what comes from thermal emitters, like warm bodies, and unless you have a camera specifi
Re: (Score:2)
The anti-collision system pretty much has to use LIDAR. It's the only current technology with spatial accuracy and reliability sufficient for a self-driving application.
The issue with using a camera system for anti-collision is that it doesn't work in many edge cases, as Tesla is experiencing.
The combination of the two systems does work well, and can cover off many edge cases where the LIDAR or the camera system by itself is inadequate.
While it is easily documented that a camera system re
Re: (Score:2)
How do you explain Subaru's EyeSight system which relies on stereo cameras? There's no LIDAR or sonar.
Re: (Score:2)
Re: (Score:2)
Dark people may have "less contrast", and that doesn't matter.
It is not necessary to identify anything as "humans". A self-driving car should not hit cattle, dogs, brick walls or garbage cans either. Any obstacle is bad. A rock the size of a cat could wreck the car.
Yeah, that was the whole point of this bit:
Technically speaking, there are circumstances where it is better to hit a sufficiently small object than to swerve to avoid it. For example, if a squirrel runs out across the road, you're probably better off hitting it than the car in the next lane. Same goes for a plastic bag blowing in the wind.
IIRC, there are existing techniques for crudely gauging the mass of an objec
Re: (Score:2)
This is not true at all, it's based on false assumptions.
First of all, most self driving cars will end up using LIDAR. Skin color, not an issue.
Secondly, even cars with cameras do a lot of image transformations such that color is usually discarded. Your skin color is irrelevant to a recognizer looking for human forms.
In fact you could argue that during the day, darker skin is an advantage because against a blue sky it's more noticeable than really pale skin which could look like clouds... #GingerLivesMatter.
On the one hand there is SuperKendall with a totally unsupported but very authoritative set of assertions that he pulled out of his posterior. On the other hand there is a bunch of scientists at the Georgia Institute of Technology.... hmmmm .... whom to believe ???? .... I'm gonna go with Georgia Tech.
Re: (Score:2)
Appeal to authority denied. Also, you are a n!gger for making such a logically fallacious argument.
Oh my, I appear to have hit a nerve.
Re: (Score:2)
Re: (Score:2)
Your skin color is irrelevant to a recognizer looking for human forms.
It is not.
When it is dark, white skin is easier to see than black skin. For your eyes, as well as for a camera.
do a lot of image transformations such that color is usually disposed of.
Making a colour picture into a black and white picture still leaves the black people black and the white people white.
Re: (Score:2)
How much skin is actually being shown on average? Most people will be wearing clothes, and in colder areas will be covering most of their skin with them.
That said, differentiating any object from the background depends on contrast; if the object you're trying to identify is dark and so is the background, then recognising it becomes harder. The colour of clothes is important too.
Based on practical experience, i've often encountered dark skinned people wearing dark clothes at night which can make them much har
Being an oppressed white male (Score:2)
/ just kidding
Re: (Score:2)
And why do I have to put explicit break HTML tags in this day and age?
You should be able to change that in your settings.
Re: (Score:2)
It's not racism, it's contrast. Pure and simple. Cameras are at work because LIDAR can be jammed.
Re: (Score:2)
Lidar isn't a given. Tesla is trying to do self driving with only cameras and a front facing radar, for example.
Re: (Score:2)
LIDAR is a patent protected technology
Correction. Someone's implementation of LIDAR may be patent protected. But the technology itself has been around since the 1960s and is well beyond patent coverage.
Re: (Score:2, Funny)
Oh no no no. The AI was trained correctly but then let its prejudice take over and decided for itself that it liked the idea of running over darker skinned people because it's racist.
Re: (Score:2)
You're also a fucking idiot. Skin colour has no fucking correlation to using a defined crossing.
Culture does; Chinese people seem incapable of avoiding them. British people treat them as convenient but entirely optional.
Re: (Score:2)
Unprofessional photographer here. Not just focus, but correct automatic exposure too.
But I buy cameras designed and made in Japan, so I won't blame Western civilisation.
Re: (Score:2)
Another good reason to learn how to shoot manual mode.
It's NOT just for pros.
After you've had a decent camera a month or so, take it off auto and spend a weekend shooting manual, or at least in shutter or aperture priority mode.
Re: (Score:2)
No shit, Sherlock. Although that doesn't help when the sun's behind them, the sky's already clipping and the vegetation is horribly bright, yet their face is still lacking definition.
But hey, the man lying on a beach in Papua New Guinea with a pig asleep across his back came out just fine, so maybe I'm just getting lucky.
Re: (Score:2)
Well, in some conditions, you need a bit more help equipment-wise.
Set your exposure for the background, and use speedlights (or portable strobes, which are now available at reasonable cost) to light your subject's face, etc...
Re: (Score:2)
Yes, I always carry speed lights and portable strobes halfway around the world and across a Yam garden, over the pig-proof fence and into the rain forest.
I mean, who wouldn't.
Re: (Score:2)
"may" and "finds" don't belong together. You can't promote a 'maybe' to a 'definitely' in the same sentence.
That's standard practice in the humanities, especially the grievance majors. All things are explained by an "ism" in those circles.
Re: (Score:2)
Equivocation may be combinable in sentence with definitive statement, amateur grammatist finds.
Low contrast is not a vendetta (Score:1)
If the glaring sun is behind you at sunrise, you'll be hit too. The laws of physics are not selective.
Okay, we have a winner (Score:1)
This is officially the most snowflake story I have ever seen on SlashDot. Are you serious? Good grief, you kids all need to be spanked. Also: 'self-driving' cars hit any people they hit because *the tech doesn't work*. And it never will.
Re: (Score:2)
To be fair, "the tech doesn't work" applies to the human driven cars that hit people too... and it never will either, because people will always make mistakes.
The best we can hope for in self-driving cars is to reduce the number of people that get hit to be low enough that when someone gets hit by a car at all, it's so outside the norm that it becomes real news.
Re: (Score:2)
What the fuck is snowflake about this? Image recognition systems have a discernible, measurable flaw that impinges on their ability to support required safety levels. That's not snowflake, that's technology, and something to explore and address.
But sure, I'll go for the spanking.
Re: (Score:3)
This is officially the most snowflake story I have ever seen on SlashDot. Are you serious? Good grief, you kids all need to be spanked.
This is the most slashdot ever answer. World throws up results you don't like? Just beat people until they start denying reality. Problem solved!
Anyone have text of the actual study? (Score:3)
I looked at the actual article, and the article it references - and they're all short tabloid blurbs without any link to the full paper.
Nothing obvious showing up on Georgia Institute of Technology's websites.
As with most early reporting on scientific studies, it helps to see what the actual text says - reporters have a tendency to, well, sensationalize findings to meet their own needs.
Ryan Fenton
Re:Anyone have text of the actual study? (Score:4, Informative)
Nevermind:
https://arxiv.org/pdf/1902.110... [arxiv.org]
Ryan Fenton
Re: (Score:3)
Thanks! That quickly clarifies that this research is about the machine learning datasets used to train AI-based optical image classifiers.
Re: (Score:2)
I looked at the actual article, and the article it references - and they're all short tabloid blurbs without any link to the full paper.
Nothing obvious showing up on Georgia Institute of Technology's websites.
As with most early reporting on scientific studies, it helps to see what the actual text says - reporters have a tendency to, well, sensationalize findings to meet their own needs.
Ryan Fenton
I believe the reporters are trying to cite the study from Veridian Dynamics (2009):
https://www.youtube.com/watch?... [youtube.com]
Darker people or darker things in general? (Score:1)
How about people wearing all black? Are ninjas safe? Will stage workers get run over on the way to their cars after the show?
How much skin was showing in the images? Were these streakers or people wearing blaze orange hunting parkas? Can just a face cause this issue if their hands are in the pockets of their parka?
Shouldn't these cars be avoiding things in the road in general? Say deer, pets, moose, etc?
Re: (Score:2)
Ninjas are never safe, and they never expected to be safe.
I recommend leaping to safety, or at least throwing shuriken at the grill so that they can identify and return your body.
Re: (Score:2)
We need more training data!
Re: (Score:2)
What about North Carolina politicians doing Michael Jackson impressions?
What about Michael Jackson himself? I mean, before he died but after cosmetic procedures.
Well... (Score:3)
Let's see what's racist this week! (Score:1)
Flipflipflipflipflipflipflipflipflipflipflipflipflipflipflip
Automotive AI!
social relativism tofu burger, hold the physics (Score:4, Insightful)
There aren't many black people where I live, and when I do encounter a black person, especially a very dark person, it is definitely more difficult at first to accurately read facial expressions.
This is probably a combination of my environment, my long relationship with my keyboard in a dark room, and a side order of actual physics (optics).
Re: (Score:2)
There aren't many black people where I live,
So there's an argument that you may want to prioritize light-skinned pedestrians since there are more of them... but that's probably controversial.
and when I do encounter a black person, especially a very dark person, it is definitely more difficult at first to accurately read facial expressions.
This is probably a combination of my environment, my long relationship with my keyboard in a dark room, and a side order of actual physics (optics).
It's also not really relevant.
The person detection systems in use rely more on general body form than on facial features. More likely the data sets just don't contain as many training samples of darker-skinned pedestrians.
Re: (Score:1)
Yep, it's a well understood issue. It's why it's important to have dark skinned people represented on TV and in movies - it helps everyone get used to it.
Now watch the push-back against an easy, simple solution.
Autonomous lowrider hops on its own owner (Score:1)
Before you die, you see the bling.
We all know why.... (Score:4, Interesting)
....because physics is racist.
Re: (Score:2, Insightful)
Why is it always racism with you? You are obsessed.
It's hard to have a conversation about improving tech when people go around screaming racism at everything. Please stop.
Re: (Score:2)
Why is it always racism with you?
It's because he doesn't understand what the word "racist" actually means. This is quite common over here. As far as many people understand, "racist" is just an insult word thrown around by liberals that means "you're a bad person". There's some vague understanding that it often crops up around matters of race, but that's about as far as it goes.
Re: (Score:2)
I think they are just primed to launch into their anti-SJW diatribe any time anything to do with race or skin colour comes up. Maybe it's deliberate, maybe it's some kind of programmed Pavlovian response. Either way someone is pushing that narrative.
Re: (Score:2)
Actually that whooshing sound was my mocking of Styopa, who seems to assume that "SJWs" are always making it about racism, but in fact he is the only one making that claim. He's outraged at imaginary outrage, and the only one doing the thing he is complaining about other people doing.
Re: (Score:2)
You have a faulty memory.
Yet (Score:2)
Nothing would be said if the finding were that people wearing dark-colored clothes and a hoodie are more difficult to detect. And why wouldn't they just train these systems with all dark-skinned people?
Pixel counting (Score:4, Insightful)
I think the clothing counts a lot more than the skin, given they cover most of the body of people.
Which means if you're a goth, self-driving cars are more likely to hit you.
Re:Pixel counting (Score:4, Insightful)
2 or 3 times I've come very close to accidentally flattening pedestrians at night wearing dark clothes and having a dark complexion/tan. They just blended into the background. Regardless of your skin color, please DON'T walk around at night wearing dark clothes. Leave ninja-ing to ninjas.
Re: (Score:2)
As someone who has walked around at night, usually wearing jeans and a dark coat, and without remotely a dark complexion: cars have headlights so nearly blinding that it's frequently painful to look at them. If that's not enough to see a person in the dark, then I'm not sure what is.
I recommend one of these: https://www.amazon.com/kwmobil... [amazon.com]. Maybe a set of these, too: https://www.amazon.com/dp/B018... [amazon.com]. And some reflectors to make sure those bright headlights are bounced back, too.
Seriously, light yourself up for safety if you're out on the road at night.
Re: (Score:2)
Just don't set the vest to strobe or I'll aim for you.
Not even on purpose, it draws my eye and my attention and my steering instinctively follows. Although even if I miss you I'll want to turn and have a second go, I fucking hate strobe lights when I'm driving.
Re: (Score:2)
If that's not enough to see a person in the dark, then I'm not sure what is.
It's enough to illuminate a person in the dark, but there's still the challenge of differentiating them from the background while controlling a vehicle, looking for other obstructions, coping with weather and/or a dirty windscreen, checking satnav, and bouncing up and down in time to whatever music is playing.
Which is why I drive slower when I have poor visibility and pedestrians are likely, but wearing dark clothing still isn't going to help you.
Re: (Score:2)
In a way.
At least they can celebrate being like a vampire, not showing up on the self-driving car's detection.
I Hate Black People (Score:2, Interesting)
Apparently I hate black people because I almost ran one down last night. He was wearing black, standing on the highway, and was waving his hands around. The only thing you could see of him before the headlights hit him was the tiny cell phone light in his hand.
Guy ran out of gas and was too poor to pay for a tow truck. Yeah I drove him to a gas station and wasn't murdered, nor did I kill him. But the internet says I hate black people since I've never almost run over a white guy. Anyone one to volunteer
Re: (Score:3)
You'll just have to intentionally nearly run over a white person, to make yourself an equal-opportunity near-vehicular-manslaughterer. Maybe an Asian, too, just to be safe.
Re: (Score:2)
If it helps, you also failed to run me over last night too, so I think you can claim to be an equal opportunity accident avoider.
What fucking moron wrote this? (Score:5, Insightful)
Re: (Score:2)
You ask the right question. Somebody who is fishing for clicks and/or cannot understand the idea of contrast.
And, yes, everything where I am now is covered with snow and ice.
Re: (Score:2)
Well, the sun is prejudiced against light-skinned people because it gives them more sunburns. /s
How do the automated systems compare (Score:1)
to human drivers?
NAKED people with darker skin? (Score:2)
In my part of the world most people wear clothing. It doesn't matter what your skin color is when only 4% of your surface area is skin.
Unfortunately, most of those people wear dark clothing at night. Children and adults, male and female, pedestrians and bicyclists. Even fire engine red is almost indistinguishable from black at night. So, these people are at risk from motorists already. Self-driving cars are obviously not a concern of these people.
Re: (Score:2)
If 4% is a light color and stands out from the background, that's still a bit better than 0%, which gives a slight advantage to light-skinned people.
But a bigger advantage can be had by wearing light colored and/or reflective clothes. If you're walking around at night wearing dark clothes in a poorly lit area you're less likely to be seen which is generally not to your advantage unless you're planning to do something illegal.
Déjà vu (Score:2)
https://www.imdb.com/title/tt1... [imdb.com]
Self driving cars may hit people (Score:1)
lol (Score:1)
How about humans? (Score:2)
I only had a quick glance through the paper so not sure if it's addressed, but: what is the normal everyday rate of human drivers hitting people with darker skin? How does that compare to self-driving cars?
I nearly hit a dark-skinned cyclist just a couple days ago, about 3 seconds after he was nearly hit by another car. Wearing almost all black and riding at night with no lights. He was nearly completely invisible and it was obvious the other car only saw him at the last second, just like I did - in fact on
Could have been worse. (Score:2)
Self-Driving Cars May Hit People With Darker Skin More Often, Study Recommends.
I think (Score:1)
https://en.m.wikipedia.org/wik... [wikipedia.org] is very high
flawed (Score:2)
And in further news... (Score:2)
I'll bet they will hit people who wear all black/navy blue with their hoods up more often too!
Heck, I'll bet humans hit them more too. You know, Scene Contrast. I can't tell you how much I hate the NY "we wear all dark clothes" thing on rainy nights. Add in jaywalking, and I can't tell you how close I've come at times. It is why I added "Black retro-reflective" stripes to one of my black jackets, and one of my new jackets is safety yellow with DOT level 3 striping. Sometimes I'm required to be roadside
Prejudice? (Score:2)
Thinking that accidentally being more likely to hit people who are harder to detect on camera is a carry-over of real-world prejudice shows how far people have lost perspective on what prejudice really is.
Not everything that disproportionately impacts some racial marker is racist; only a deliberate effort to target by race is.
Automated processes and algorithms aren't racist.
That's some weapons grade clickbait there (Score:2)
Another possible solution to avoid bias ... (Score:2)
"Thankfully, the researchers were able to figure out what was needed to avoid a future of biased self-driving cars: start including more images of dark-skinned pedestrians in the data sets the systems train on and place more weight on accurately detecting those images."
Another possible solution would've been to have the AI randomly hit, with some small probability, light-skinned people it did recognize, so that it evens out.
Better Off Ted (Score:2)
Man, I wish that show hadn't been cancelled. The episode with the drinking fountains was just too predictive.
https://vimeo.com/29017688 [vimeo.com]
Re: (Score:2)
Re: (Score:2)
Or if they're also wearing dark clothes... The face is not typically covered, so light-skinned people still have light faces at night even if the rest of their body is covered by dark clothing.
How do these systems perform when identifying people wearing dark coverings such as motorcycle helmets, veils or balaclavas etc?
Re: (Score:2)
Re: (Score:2)
Self-Driving Cars May Hit People With Darker Skin More Often
When I first read that, I wasn't sure if it was a statistic, or the latest executive order from the president.
When I first read it, I immediately thought of a Stupid Kid Joke(tm) learned way back in that weird era known as the 1970s.
Not giving the set up, just the punch line...
"No, but I got him with the gas can."