Facial Recognition Is Accurate, if You're a White Guy (nytimes.com)
Facial recognition technology is improving by leaps and bounds. Some commercial software can now tell the gender of a person in a photograph. When the person in the photo is a white man, the software is right 99 percent of the time. But the darker the skin, the more errors arise -- up to nearly 35 percent for images of darker skinned women, the New York Times reported, citing a new study. From the report: These disparate results, calculated by Joy Buolamwini, a researcher at the M.I.T. Media Lab, show how some of the biases in the real world can seep into artificial intelligence, the computer systems that inform facial recognition. In modern artificial intelligence, data rules. A.I. software is only as smart as the data used to train it. If there are many more white men than black women in the system, it will be worse at identifying the black women. One widely used facial-recognition data set was estimated to be more than 75 percent male and more than 80 percent white, according to another research study.
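The training-data point above is easy to demonstrate in miniature. Below is a minimal sketch (synthetic data, not the study's method, assuming NumPy and scikit-learn): two groups get slightly shifted feature distributions, the classifier sees 90% group A and 10% group B during training, and the error ends up concentrated in the under-represented group. The group names, the feature shift, and the 90/10 split are illustrative assumptions.

```python
# Minimal sketch: an imbalanced training set concentrates error in the rare group.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n_per_class, shift):
    # Two classes (say, "male"/"female") per group; the group-dependent shift
    # means a boundary learned mostly from group A transfers poorly to group B.
    X0 = rng.normal(loc=[0.0 + shift, 0.0], scale=1.0, size=(n_per_class, 2))
    X1 = rng.normal(loc=[3.0 + shift, 2.0], scale=1.0, size=(n_per_class, 2))
    return np.vstack([X0, X1]), np.array([0] * n_per_class + [1] * n_per_class)

# Training set: 90% group A, 10% group B.
Xa, ya = make_group(4500, shift=0.0)
Xb, yb = make_group(500, shift=4.0)
clf = LogisticRegression(max_iter=1000).fit(np.vstack([Xa, Xb]),
                                            np.concatenate([ya, yb]))

# Balanced held-out sets per group: the minority group's error is far higher.
for name, shift in [("group A", 0.0), ("group B", 4.0)]:
    X_test, y_test = make_group(2000, shift)
    print(name, "error rate:", round(1 - clf.score(X_test, y_test), 3))
```

In typical runs the under-represented group's error rate is many times the majority group's, which is the qualitative pattern the summary describes.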
The laws of physics are discriminatory (Score:5, Insightful)
White people's faces reflecting more light is problematic.
Stupid racist algorithms (Score:3)
Always knew that machines were bigots.
Need to get a deep tan (Score:2)
So the government will be less likely to know where I am.
Here we go again (Score:4, Interesting)
For progressively darker skin, progressively higher light on that skin is required to reveal its contours. The fundamental problem is that white and light-skinned brown people have their normal skin color shades in the midtones when a scene is properly exposed while darker-skinned brown and black people are closer to shadows. To expose properly for facial recognition of dark brown or black skin, you have to overexpose the midtones to bring up the shadows. Since people rarely take photos on purpose that are exposed for the shadows while blowing everything else out, it should be fairly obvious that facial recognition (and early ISO 32 color film and small-sensor cameras like webcams and phone cameras) will have a very hard time with dark skin. Sure, it could be a lack of data in some instances, but it's far more likely to be the fact that the skin absorbs more light and photographs are generally exposed too low to reveal enough detail for the machines to analyze.
If you think this is "racist" you're saying that the nature of light itself is racist. I don't feel like I should have to explain why that position is really stupid.
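For what it's worth, the exposure argument can be put in rough numbers. In the sketch below (the reflectance values are made up for illustration), the same ±10% local contrast and the same midtone exposure are applied to a bright and a dark surface, then quantized to 8 bits the way a sensor pipeline or JPEG would; the dark surface ends up occupying far fewer distinct code values, i.e., less recorded detail for any algorithm to work with.

```python
# Rough sketch: with one exposure chosen for the midtones, low-reflectance
# surfaces use far fewer of the available 8-bit code values.
import numpy as np

rng = np.random.default_rng(1)
detail = rng.uniform(0.9, 1.1, size=100_000)   # same +/-10% local contrast for both

def captured_levels(mean_reflectance, exposure=1.0):
    signal = np.clip(mean_reflectance * detail * exposure, 0.0, 1.0)
    codes = np.round(signal * 255).astype(np.uint8)   # 8-bit quantization
    return len(np.unique(codes))

print("light skin (reflectance ~0.6):", captured_levels(0.6), "distinct code values")
print("dark skin  (reflectance ~0.1):", captured_levels(0.1), "distinct code values")
```

Raising exposure for the dark surface recovers those missing levels, which is exactly the "expose for the shadows and blow out the rest" trade-off described above.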
Re: (Score:3)
Re:Here we go again (Score:4, Informative)
The point you missed is that set lighting for white people has to be carefully designed and set up. One of the reasons colour film took so long to become practical was the difficulty of getting skin tones right.
If you look at early colour film the skin tones of white people are pretty good, but other colours are way off. Oversaturated in places, washed out in others. It was a design decision.
Vox is correctly pointing out that film from the era was not designed for dark skin, and that made it hard for non-white actors. Similar to how when sound came in a lot of actors lost work because they had thick accents.
Note that the Vox article does not contain the word "racist" or even "race". It's pertinent because we are now seeing more black actors on screen and Hollywood finally figured out how to light them properly.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
There are lots of different kinds of film, even from the same manufacturer. Different kinds of film are better at capturing different colors. There's long been film which was better for photographing whites, and film which was better for photographing people with dark skin.
Re: (Score:2)
set lighting for white people has to be carefully designed and set up. One of the reasons colour film took so long to become practical was the difficulty of getting [white] skin tones right.
I assume you've got a reputable source for this assertion, right? I mean, surely you didn't just pull it out of your ass (or the ass of a gender-studies course*).
Pertaining to older movies, which ones are designed to be slanted in their color representations? I watched Spartacus, for example, which came out in 1960, and while the cast is certainly overwhelmingly white, there's a notable scene [youtube.com] with Kirk Douglas and Woody Strode which hardly appears to be slanted against Woody in the color balance/saturation.
Re: (Score:2)
Nice. Make a bullshit assertion. Someone rebuts your bullshit with factual counterpoints. Cry "Triggerreed! Guilltay! Snawflarke!"
By the way, where did you see a dig at women? The use of a gender studies (not even women's studies) example, illustrating how your ilk are eager to assign bigoted motivations -- without any evidence -- to genuine technical challenges, simply because such challenges may not impact everyone identically? Hmm, that seems very related to the topic being discussed... almost as if it's
Re: (Score:2)
I suggest not explaining this in terms of exposure etc. because that will simply trigger a discussion on (early) photography technology development being racist.
It's much simpler than that: the darker something is, the less light it reflects, the less information is present in its appearance, the harder it is to recognize. This will always be the case and developments in photography technology will never solve it; they will alleviate the problem at best. If it is a problem, that is, because I think facial
Re: (Score:2)
I agree that a significant part of this is a physics problem. It would be possible to test whether or not this is algorithmic by training a recognizer on a high percentage of dark skinned people and seeing what its performance was like on light skinned people.
A lot of modern cameras / cell phones have live face detection features. A photo setting that sets exposure for faces would help this. People might not use it much though - if you have a dark skinned person in a scene, many people may still prefer th
Re: (Score:3)
Sorry, but that's not the problem.
In the summary the problem was stated to be the frequency of images in the training set data.
The technical details you specify may be correct, but they are irrelevant to this particular problem. And they wouldn't explain the problem with recognizing women in any case.
Re: (Score:2)
Well I guess you better get off your ass and implement a better algorithm, then. After all, light reflectivity isn't a problem for you.
Also, what is the problem, in the summary, with recognizing women vs. men? This statement?
more errors arise -- up to nearly 35 percent for images of darker skinned women
Oh, wait... it's darker skinned women, not just women in general, that the summary references. Perhaps the technical detail referenced by the parent isn't irrelevant after all?
No, I'm sure you're right, and will roll out that politically correct recognition algorithm of yours in no time
Re: (Score:2)
Re: (Score:2)
The fact that we call a portion of the electro-magnetic spectrum "light" is obviously racist.
Re: (Score:2)
Re: (Score:2)
Ha! I remember a Dave Barry column where he referred to lightbulbs as "darksuckers".
Re: (Score:2)
The problem is not the nature of color and light. The problem is the sample data used to train this "artificial intelligence". If the trainer doesn't think to include non-white people, the technology will suck at differentiating non-white people.
That you yourself argue that it is inherently more difficult to distinguish non-white people compounds both the error and bias. Not against non-white people per se, but simply in reflection of the view that white people matter most.
Sigh. You start off so well in the first paragraph, and then you jump to unfounded conclusions. Feeding the facial recognition routines data that reflect the overall demographics is the unbiased approach. Feeding it 50% pictures of dark skinned people in a region with 20% dark skinned people would be biased, and precisely what you argue against.
Yes, it will likely have problems with albinos, people with argyria or rickets, those who use the same dermatologist as Trump, full beard aficionados, and many others.
Re: (Score:2)
Better Off Ted: "Racial Sensitivity"
http://www.imdb.com/title/tt13... [imdb.com]
Re: (Score:2)
They (Vox) also spend a lot of time picking on Kodachrome film from Kodak which has a notoriously narrow dynamic range (about 6 stops compared to modern color film's 13 stops) and which was only available up to ISO 16 until 1961. [wikipedia.org] At ISO 16 you have to expose around 6-7 times longer for proper exposure compared to the "sunny outdoors film" of the 1990s which was IS
also in human cognition (Score:3)
Someone needs to test whether humans, also, decline in speed or accuracy of facial recognition when dealing with darker shades of skin colour.
I know for certain that I have more trouble reading facial emotion from black people than white people. The naive response is that I live in a city that's 95% white. But I've never been able to convince myself that this is the correct explanation. I simply feel like I have less visual data than I would otherwise at the same point in the cognitive process.
Suppose I lived in a troop deployment in Afghanistan, and 90% of the people around me wore camo all the time. Would I actually become better at recognizing people in camo than people in civilian gear? But this is, indeed, the converse implication of the naive hypothesis.
There are populations in Brazil that experience the entire range of skin tones on a daily basis. These populations could be tested for recognition rate/accuracy for lighter and darker test cases.
I highly suspect that darker skin tone has a detectable coefficient of identity camouflage, also in human cognition.
Re: (Score:3)
I know for certain that I have more trouble reading facial emotion from black people than white people. The naive response is that I live in a city that's 95% white.
The more likely reason is that you grew up in an environment that was 95% white.
It's well-established (many studies) that people are better at recognizing faces similar to those they grew up looking at. Just like with machine learning, human brains trained on white faces are better at distinguishing white faces, and human brains trained on black faces are better at distinguishing black faces.
I highly suspect that darker skin tone has a detectable coefficient of identity camouflage, also in human cognition.
That would not explain why Africans who grow up without seeing white faces think all white people look alike, but c
Re: (Score:2)
Recognising emotion is different to recognising identity. It's heavily dependent on culture. It took me a while to learn to recognise Chinese and Japanese emotions from people's faces, because they are different to British ones. I guess different shape faces probably had an influence too.
But I don't think skin colour alone was much of a factor, which is what screws up these facial recognition systems.
Why is this a bad thing? (Score:2)
Facial Morphology is an issue too (Score:4, Interesting)
Facial morphology refers to the various traits and features in a face. For example, the distance between the eyes, or the eye slant, or cheek gaunt or whatever.
'White' people have the broadest range of diversity, in part because aside from the skin color, there are a lot of differences. Certain Asians, like the Han Chinese, have some of the least diversity (google for iphone face recognition matching two Chinese co-workers).
If you pick 20 key features as your unique code, and each of those key features has 20-30 distinct possible values, you can rely on reasonable uniqueness, even when some of those values have inter-relationships. When the diversity goes down, and 10 out of the 20 are not unique, and when the range of values those have is between 3 and 5, well, you'll have a lot more trouble differentiating people.
In fact, studies show that among a given ethnic group, actual real life people perform facial recognition on only a few features, but those features are always those traits that show the most variation. When you apply that same algorithm to another ethnicity, it doesn't work so well. You get racist-seeming phrases like, "They all look alike to me," when really the issue is that your specialized detection algorithm was never meant to deal with their differences... and every group has this blindness. The one thing that's amusing is that because whites tend to have a large variety, they're the easiest to uniquely identify regardless of your personal/cultural/ethnic technique. So, you can say things like "I can tell all you white people apart, you're racist for not being able to identify ME!" and think you're on the moral and ethical high road, when in fact, the situation is different from the other side.
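The uniqueness argument above can be made concrete with a birthday-style estimate. The sketch below treats each face as a code of independent features and asks how many accidental look-alike pairs you would expect in a large population; the feature counts and value ranges are the hypothetical numbers from the comment, not measured ones.

```python
# Back-of-the-envelope: expected number of colliding "face codes" in a population.
from math import comb

def expected_collisions(population, n_features, values_per_feature):
    code_space = values_per_feature ** n_features   # number of possible face codes
    return comb(population, 2) / code_space         # expected colliding pairs

pop = 10_000_000
print("20 features, 25 values each:", expected_collisions(pop, 20, 25))
print("10 features,  4 values each:", expected_collisions(pop, 10, 4))
```

With 20 features at 20-30 values each, accidental matches are essentially impossible; with 10 weak features at 3-5 values each, the same population contains tens of millions of colliding pairs.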
Perhaps... (Score:2)
Rather than:
some of the biases in the real world can seep into artificial intelligence, the computer systems that inform facial recognition
Maybe the issue is lighting? Why does a simple thing such as AI to identify gender from a facial camera have to be an example of latent racism? As if programmers subtly, unconsciously, monkeyed with the algorithm to only work for white faces.
Robot eye for the white guy (Score:2)
Bias (Score:3)
Why are we calling this bias? White males have the most range of unique identifying characteristics:
* Beards
* Moustaches
* More tonal contrast
* Difference in eye and hair colour
FaceID works here (Score:2)
Apple's FaceID uses infrared depth perception, where light contrast isn't an issue.
Apologies for the inflammatory title https://www.gizmodo.com.au/201... [gizmodo.com.au]
They also went to the effort of testing it out on various ethnicities as well, so the AI didn't overly focus on areas that are different for one group but similar in another.
I'm not saying that Face ID is "racist" or anything like that. I'm just happy there's a technical solution that solves this problem.
See, it's not just us... (Score:2, Funny)
Also it's sexist (Score:2)
I'm having trouble with the fact that it's so accurate at identifying men but not women. Gender politics aside, how does this work?
Re: Facial recognition (Score:5, Informative)
Darker colors provide less contrast. Less contrast means features are more difficult to make out.
Combine that with the typically horrendous lighting video cams face and you have a situation where recognition fails.
Re: Facial recognition (Score:5, Funny)
What's the difference between porn and erotica?
Lighting.
--
BMO
Re: Facial recognition (Score:5, Interesting)
Most of the time these cameras would work just fine if they adjusted the exposure properly. They have auto exposure but it's tuned either for general photography or white skin.
The fix isn't that complex, and some kind of calibration could be done when setting up face recognition.
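One plausible form that calibration could take is metering off the detected face box instead of the whole frame. A minimal offline sketch, assuming OpenCV is installed; the file names and the 110/255 luminance target are arbitrary choices, not anyone's product behaviour:

```python
# Sketch: meter exposure from the detected face region rather than the full frame.
import cv2
import numpy as np

frame = cv2.imread("frame.jpg")                  # hypothetical captured frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

if len(faces) > 0:
    x, y, w, h = faces[0]
    face_mean = gray[y:y + h, x:x + w].mean()    # average luminance of the face box
    gain = 110.0 / max(face_mean, 1.0)           # push the face toward mid-gray
    adjusted = np.clip(frame.astype(np.float32) * gain, 0, 255).astype(np.uint8)
    cv2.imwrite("frame_adjusted.jpg", adjusted)
```

In a real camera the gain would feed back into exposure time or sensor gain before capture rather than multiplying pixels after the fact, but the metering logic is the same.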
Re: (Score:2)
They have auto exposure but it's tuned either for general photography or white skin
You mean that they overexpose black skin? This means that the correct setting would produce darker results, with even less contrast.
Re: (Score:2)
Underexposed, not overexposed. The sensors are not getting enough light. You would have to increase the amount of light via shutter speed or aperture. But if you tuned it for darker faces, white faces would be washed out and the problem would be reversed.
Re: (Score:2)
If the cameras have their exposure calibrated for a white face (middle gray), but they are shown a black face, they will automatically increase light to make the face look middle gray again, and the face will be overexposed.
Re: (Score:3)
Sorry, I misunderstood your previous statement. What you just said makes sense. If I understand what you are saying, attempting to compensate by overexposing will cause the face to look unnatural.
You're right, they shouldn't overexpose as a solution since the point was to identify people from a normal picture. As opposed to using overexposed pictures that are focusing on image recognition. Unless it was an HDR+ picture, it would look really bad. Even an HDR+ picture would have issues unless people wanted p
Re: Facial recognition (Score:4, Insightful)
Strange you would make this about social justice, and not just an interesting technology problem. TFA doesn't mention racism or anything... It's almost like you are some kind of social justice obsessed warrior who has to bring it into every conversation.
Have to agree with you about the sock puppets though.
Use infrared (Score:3)
Then there is no discrimination. Kinect infrared for example does a great job leveling the playing field.
Re: (Score:3)
Are you sure? My guess is that melanin radiates in the infrared when exposed to light, so I would guess you've just altered the problem.
Better to improve the dynamic sensitivity of the CCDs.
OTOH, the real problem, as mentioned in the summary, was the set of training data. It learned to handle the most commonly appearing faces in the training data well, and didn't do as well on those things that were rare in the training data set, which happened to be both non-Caucasian faces and female faces.
Re: (Score:2)
Yes, I use Kinect in nightclubs (custom software) and it sees the faces of all races and all major genders, to quote Dave Barry, equally well.
Agree re the training set, I think if they retrained it on IR faces it would be much more robust since it would not depend on lighting.
Re: (Score:3, Interesting)
I don't really think that is true, but it is true you need different lighting to pick up the nuances in a black face. Movies and TV shows for years have lit black actors in ways that wash out their faces by using rules of thumb that work for white faces. It's only in the last few years that we're seeing black actors properly lit.
Re: (Score:3)
That's beside the point in this context, because with most facial recognition cameras, you don't control the light. Most of the time, it's going to be daylight, and sometimes street lights or store lights.
That darker objects reflect fewer photons is always going to be the case. That's not racial bias.
Fitness trackers with optical heart readers have a similar problem, where the light does not penetrate as well for darker skin. Some auto-adjust, with the unfortunate side effect of shorter battery life for darker-skinned users.
Re: (Score:3)
Actually it is on point with regard to testing.
Anyhow, what you're saying is that black faces have less luminance -- well sure. That says nothing about contrast. The idea that there's less detail there to be seen is a result of looking at poorly lit photographs.
The upshot is that if the problem is in the sensor's ability to handle a certain luminance range, by that theory the algorithms should perform better on black faces in very brightly lit conditions.
Re: (Score:2)
>Darker colors provide less contrast. Less contrast means features are more difficult to make out
That's all there is to this. Let's go home. Seriously.
Re: (Score:2, Informative)
There's more than enough contrast for a convolutional neural net to work with. You could probably turn it all into 4 bit grayscale before training and still get excellent results.
The explanation isn't what you propose, but unbalanced training sets. Failing any face is equally bad for the training algorithm. Whether its errors are equally divided among all subgroups, or concentrated in one of them, is equally good for the algorithm. Since it has more data on whites, it can profit more from focusing on features that separate white faces well.
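If the imbalance explanation is right, the textbook mitigation is to stop letting errors on the rare group count for less: weight each training example inversely to its group's frequency. A small sketch with scikit-learn (the data and group labels are synthetic placeholders, just to show the mechanism):

```python
# Sketch: group-balanced sample weights so rare-group errors cost as much as
# common-group errors during training.
import numpy as np
from sklearn.linear_model import LogisticRegression

def group_balanced_weights(groups):
    groups = np.asarray(groups)
    counts = {g: int(np.sum(groups == g)) for g in np.unique(groups)}
    return np.array([len(groups) / (len(counts) * counts[g]) for g in groups])

rng = np.random.default_rng(2)
X = rng.normal(size=(1000, 5))                  # placeholder "face features"
y = (X[:, 0] > 0).astype(int)                   # placeholder labels
groups = np.array(["A"] * 900 + ["B"] * 100)    # e.g. a skin-type annotation

clf = LogisticRegression()
clf.fit(X, y, sample_weight=group_balanced_weights(groups))
```

This only rebalances the objective; it can't conjure up variation the under-represented group's images never contained, so collecting more diverse data still matters.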
Re: (Score:2)
Darker colors provide less contrast. Less contrast means features are more difficult to make out.
Combine that with the typically horrendous lighting video cams face and you have a situation where recognition fails.
This is precisely why people say diversity in IT is a good idea. Developers primarily develop for themselves, so if you are missing a large selection of the population in your company then there is a blindness there to those people's needs.
Re: Facial recognition (Score:4, Interesting)
Actually, light skinned people have more variations in facial shape than do other groups. Second is darker skinned people. Orientals have the fewest.
OTOH, in any particular area, the people from outside the area are likely to have more variation, because they have a wider variety of ancestors.
That said, this *is* a bit strange, because the greatest genetic variation is among the population native to Africa. (Note I'm not even including the Australian aborigines, who are part of the facial variation of the darker skinned people.) So one would expect the largest variation among the darker skinned people.
Additionally the homogeneity of the orientals is probably due to their long period of civilization. This is a guess, but it's a reasonable one. So there was a longer period of undisturbed gene flow among groups. Even so there are distinct sub-groups, just not as many as among other categories.
A problem with this analysis is that I didn't include the population of the Indian sub-continent, as I couldn't figure out in which group to place them. They have darker skin colors, but have facial features that more closely align with the lighter skinned peoples. This is readily explained by historical analysis, but it does make categorization difficult.
Re: (Score:2)
Sorry, this is from multiple different sources, not one, and different pieces were in different places. The scientific stuff was from places like Science News, and I only read the printed edition, so I don't have links to ANY of it.
And no, orientals isn't a technical term, but I wanted to lump Melanesians, Polynesians, Chinese, Mongols, Korean, Japanese, and various south-east Asians together without including Australian aborigines or their close relatives. This isn't a closely related group, but it's a g
Re: (Score:2)
It's not the preferred nomenclature!
Re: Facial recognition (Score:4, Informative)
The Summary said it was because they were underrepresented in the training data set.
That's what you should first assume when an AI system fails at some particular kind of categorization, so it should hardly be surprising.
Re: Racist, or accurate? (Score:5, Insightful)
Have you ever watched some war movie where all the actors are relatively unknown and are sporting a buzz cut?
Can't tell one white guy from another, especially if they have similar builds.
And when they later use camo face paint...forget it. All you can rely on is their voices
Re: (Score:2)
Face recognition doesn't care about hair style. It doesn't work like a human does; it measures face geometry, like the distance between the eyes.
Camouflage makeup works only if you design it to make it hard for the face recognition to see features like the eyes and mouth. Basic war paint has mixed results, especially with cameras that are IR sensitive for night vision.
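To illustrate the geometry point, here is a toy descriptor built from made-up landmark coordinates, using distance ratios so it is scale-invariant. Hairstyle doesn't change these ratios, but anything that hides the landmarks themselves obviously does.

```python
# Toy "face geometry" descriptor from hypothetical landmark coordinates.
import numpy as np

landmarks = {                       # made-up pixel coordinates for one face
    "left_eye": (120, 150), "right_eye": (200, 150),
    "nose_tip": (160, 205), "mouth_center": (160, 250),
}

def dist(a, b):
    return float(np.hypot(a[0] - b[0], a[1] - b[1]))

eye_span = dist(landmarks["left_eye"], landmarks["right_eye"])
descriptor = {
    "nose_to_left_eye / eye_span": dist(landmarks["nose_tip"], landmarks["left_eye"]) / eye_span,
    "mouth_to_nose / eye_span": dist(landmarks["mouth_center"], landmarks["nose_tip"]) / eye_span,
}
print(descriptor)   # scale-invariant ratios that a matcher would compare
```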
Re: (Score:2)
Face recognition doesn't care about hair style. It doesn't work like a human does; it measures face geometry, like the distance between the eyes.
Yes, and no. While eye distance is a factor, it's not the only factor.
Some of the facial geometry can and sometimes is obscured by hair style, whether it's hair obscuring the slope of the forehead, a lock obscuring the brow ridge, a moustache obscuring the mouth, or a full beard obscuring the entire jaw. For some, the eyes and the nose are almost all the data you have, while for others there are far more data points.
Re: (Score:2)
So, when the white guy struggles to differentiate black people with an "you all look the same to me" statement, it's automatically construed as racist and derogatory.
It is? News to me, mostly because you're inventing outrage.
Even the Grauniad doesn't say it's racist.
https://www.theguardian.com/sc... [theguardian.com]
Re: (Score:2)
I have seen several black men make "we all look alike" jokes this year. I guarantee you it's a thing.
er...?
Basically if you spend your time mostly hanging round with people from one race, people from other races will be kind of similar looking.
On the other hand, it's crass to say "all X look alike" in seriousness because it's not true. All X might look alike to a particular person, but not to many many other people. Do people still say such things in seriousness? I've not heard something like that personally.
Re:Racist, or accurate? (Score:4, Informative)
Racist? Not really - just that one learns to identify people by looking at people. So Africans generally think Asians and "whites"* look all the same, Asians generally think "whites" and Africans look the same etc.
Nothing strange or racist about that.
Then add the fact that many people have problems identifying others from facial features alone and we get a cultural aspect of this "look alike" thing.
Re: (Score:3)
Actually that is literally the definition of racism.
I just looked it up and it's literally not the definition of racism.
Re: Racist, or accurate? (Score:2)
rac·ism
/ˈrāˌsizəm/
noun
prejudice, discrimination, or antagonism directed against someone of a different race based on the belief that one's own race is superior.
"a program to combat racism"
synonyms: racial discrimination, racialism, racial prejudice, xenophobia, chauvinism, bigotry, casteism
"Aborigines are the main victims of racism in Australia"
the belief that all members of each race possess characteristics or abilities specific to that race, especially so as to distinguish it as inferior or superior to another race or races.
Re: (Score:3, Funny)
It's official, I can no longer differentiate between alt-righters and SJWs, both just say "we're the victim and everybody else are raping/genociding/whatever us"
Re: Bias (Score:2)
I'm glad it's not just me.
Re: (Score:2)
Really? I think the terms are best described as "someone I don't agree with" and "someone I don't agree with that have some conservative views" - at least as used by idiots on the Internet.
Re: (Score:2)
Wait, who is who?
Re: (Score:2)
Horseshoe theory. They're two sides of the same racist coin.
The only thing they diverge on is who is the ubermensch and who is the untermensch.
Re: (Score:2)
Re: (Score:2)
This will all even out over time. There are a finite number of facial types by race.
If that's the case, then facial recognition will soon be useless.
Re: I am half white, half hispanic.. so... (Score:2)
It means there's a 50 percent chance you'll kill the AI in a drive-by before it has a chance to ID you.
And wtf is "half white, half hispanic"?? You mean you're 100% white and you speak Spanish?
Re: (Score:2, Insightful)
Re: (Score:2)
Re:So it will be no good (Score:4, Funny)
Trump's a white guy isn't he?
Orange is the new white?
Re: (Score:2, Interesting)
Are there sources? We could provide sources for you all day.
https://www.alternet.org/civil... [alternet.org]
http://abcnews.go.com/Politics... [go.com]
https://www.ussc.gov/research/... [ussc.gov]
Re: (Score:2)
You've never been to Texas, have you?
Re: (Score:2)
Kentucky
Re: So it will be no good (Score:2)
In many parts of rural West Virginia people leave their house doors unlocked at night. Because there's no crime. Despite the crushing poverty.
Re: (Score:2)
Statistically, they have better lives, and lead a more valuable and productive existence. Millennia of archeological proof.
People in Africa had at least 50000 years of head start to make their lives more productive. They have plenty of natural resources too. It was their own choice to continue with tribal warfare until present day.
white countries have been rewarded with the best, most-coveted civilisations
They started with nothing, and made those themselves.
Re: (Score:2, Interesting)
Yes, a story listing examples of how white people can point guns at cops and live and black people cannot.
Yes, a piece from the "politics" news section that includes citations to peer-reviewed studies.
The Commission's study was most certainly peer-reviewed.
https://www.albany.edu/scj/doc... [albany.edu]
Re: So it will be no good (Score:3, Insightful)
Yes, a story listing examples of how white people can point guns at cops and live and black people cannot.
lol. If you're stupid enough to think that a cop with a gun pointed at him gives a shit about the colour of the hand holding it, you can't possibly expect to be taken seriously in these kinds of discussions.
Re: (Score:3, Insightful)
Of course he gives a shit. There are centuries of examples in the United States to police giving white people the benefit of the doubt when black people would have been put under the jailhouse.
I can't believe you're stupid enough to think that kind of long-standing prejudice just suddenly went away in the past 20 years.
Re: So it will be no good (Score:2)
Of course he gives a shit.
"Oh hello Mr. Fellow White Citizen. May I inquire as to why you are pointing that firearm at my head? I'm sure you mean no harm, but you understand that I must ask."
lmao. You have some incredibly entertaining delusions, buds.
Re: So it will be no good (Score:2)
I suggest you walk your honkey ass down the street of your posh little town and find the nearest heavily armed paramilitary thug ("cop"). Then point a firearm at him. Please be sure to have someone film your encounter.
Most likely the law enforcer and his gangmates will blow you away in a hail of lead. There's a smaller possibility he may beat the living shit out of you, then lock you in a cage at a torture camp for a few decades.
Re: (Score:2)
https://www.usnews.com/news/na... [usnews.com]
Re: So it will be no good (Score:2)
Wow, that's crazy. Who would have thought that a demographic which is insanely more likely to murder others is also way more likely to be killed by police.
It's almost like there's something connecting those two statistics ...
Re: (Score:2)
Yes, but not in the direction you think.
Re: (Score:2)
Hey man, don't look at me. I'm Sicilian, I get bumps when I shave and I carry the gene for the Mediterranean version of sickle cell. I clap on the 2 and 4 beats and have a terrific se
Re: So it will be no good (Score:2)
Yes, but not in the direction you think.
Ah, but of course. He dindu nuffin. He was a good boy. Da po po made him rob the store and shoot the clerk. If only they wasn't to ray-cist he might have been a rocket surgeon!
Re: (Score:2)
Gosh, you're an idiot.
Re: So it will be no good (Score:2)
Gosh, you're an idiot.
That's adorable coming from the dipshit who thinks that cops are quite happy to be shot by white criminals, and that police shootings somehow magically make blacks more likely to be killers.
Re: (Score:2)
Rich white psychopaths always have the option of becoming CEOs.
Re: (Score:2)
"All the white men are wealthy and privileged"
LOL
You missed the "in societies where" that qualified the bit you quoted. I'm not claiming that any such exist (nor am I claiming that they don't), just that in such a society a face recognition algorithm that only works on white guys wouldn't be much use in catching criminals.
Re:Even 99 percent is nearly worthless (Score:4, Informative)
Also known as the base rate fallacy. If you're looking for a needle in a haystack, an algorithm which correctly distinguishes them 99% of the time is useless.
Re: (Score:3)
No, it will find "needles" all over the place, laced through the haystack. Just one of those tens of thousands of "needles" will actually be a needle.
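Working the arithmetic with hedged assumptions (a 99% true-positive rate and a 1% false-positive rate; the quoted scenario doesn't state its exact rates):

```python
# Base-rate arithmetic for "30 targets in a city of 1 million".
population = 1_000_000
targets = 30
tpr, fpr = 0.99, 0.01

true_hits = targets * tpr                       # ~30 genuine matches flagged
false_alarms = (population - targets) * fpr     # ~10,000 innocent people flagged
print("false alarms per true hit:", round(false_alarms / true_hits))   # ~337
```

The ratio scales linearly with the false-positive rate, so the 3000:1 figure quoted below corresponds to roughly a 10% false-positive rate; either way the false alarms dominate the genuine hits.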
Re: (Score:2)
If I am trying to catch 30 terrorists in a city of 1 million I am going to get ~100,000 false positives. My false positives will outnumber my valid hits over 3000:1.
No, you won't, unless you insist on submitting pictures of infants and senior citizens in your search for a dark-skinned middle eastern terrorist (you know, like in the movies).
In a city of 1 million, half of them are the wrong gender, and given an equal spread of the population aged between 0 and 75 years, when looking for a 35 year-old male, you can likely exclude 75 percent of all men as too old or too young. So now we're down to 1/8th of the city, only 125K possible matches BEFORE we factor in ethnicity - when
Re: (Score:2)
That means if you are looking for one terrorist in a train station that services 300,000 a day your false positives are going to outnumber your actual hit 3000:1.
What about false negatives, a failure to match correctly?
You know that 99% probability resets with each new face - you aren't guaranteed a positive for every 100 comparisons, right?
Re: (Score:3)
The tabby takes great pictures
What sort of camera have you trained that cat to use when he's taking those pictures?
Re: (Score:2)
https://usercontent.irccloud-c... [irccloud-cdn.com]
Re: Colorimetric algorithm rather than geometric? (Score:2)
Most algorithms I've seen convert to black and white before further classification. Color is mostly useless and quite expensive (4x as much data) to computers that use geometric features to classify pictures.
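A quick sketch of that preprocessing step, assuming OpenCV and an arbitrary local file name. For 8-bit-per-channel color the saving is about 3x (4x if an alpha channel is stored alongside the color planes):

```python
# Drop color before geometric feature extraction: same structure, a fraction of the data.
import cv2

img = cv2.imread("face.jpg")                     # BGR, 3 bytes per pixel
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)     # 1 byte per pixel

print("color bytes:", img.nbytes, "gray bytes:", gray.nbytes)
# downstream detectors/recognizers typically operate on `gray`
```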
Geometry is detected by shadows (dark) (Score:2)
The geometry is found by distinguishing between recessed areas, which are dark, and relatively high areas, which are brighter. When the entire face is dark, it's difficult to distinguish shadowed areas, and therefore geometry.