Face-Scanning Loses by a Nose in Palm Beach
Rio writes: "A story from myCFnow.com reports that Palm Beach International Airport officials said
face-scanning technology will not become part of their airport's security system."
Looks like
the ACLU was right.
Tested against a database of just 15 employees, the technology gave false negatives -- failed to recognize the test subjects -- more than 50% of the time. A spokesperson said, "There's room for improvement." The Pentagon said
the same thing
in February. The false-positive rate matters even more -- the story doesn't mention it, but Bruce Schneier argues that even if it were just 0.1%,
it'd be useless.
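Schneier's base-rate point can be sanity-checked with a back-of-the-envelope calculation. The passenger volume and terrorist frequency below are made-up assumptions for illustration; only the 0.1% false-positive and 50% false-negative figures come from the story:

```python
# Rough sketch of the base-rate argument. The traffic and threat numbers
# are invented assumptions, not real airport figures.
daily_passengers = 100_000      # assumed traffic through one large airport
terrorists_per_day = 0.01       # assume one actual terrorist every 100 days
false_positive_rate = 0.001     # the hypothetical 0.1% from the summary
false_negative_rate = 0.5       # the measured 50% miss rate

false_alarms = daily_passengers * false_positive_rate       # innocents flagged
true_hits = terrorists_per_day * (1 - false_negative_rate)  # terrorists caught

print(f"False alarms per day: {false_alarms:.0f}")
print(f"Expected true hits per day: {true_hits:.4f}")
print(f"Odds a given alarm is real: {true_hits / (true_hits + false_alarms):.6f}")
```

Under these assumptions the system raises about a hundred false alarms a day against one real hit every two hundred days, so essentially every alarm the guards see is false.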
It's 0.1%, not 0.01% (Score:2, Insightful)
If a person is mistaken once for a terrorist (or a "normal" criminal), don't you think other recognition points will make the same mistake? What do you do then? Budget a few extra hours every time you fly? Get an official letter stating "I'm not a terrorist"? If a simple letter can get you through, terrorists will get one too.
Unpopular View (Score:5, Insightful)
Incidentally, by this reasoning, it is in fact the false negatives that are more important. False positives can presumably be discarded by humans providing closer scrutiny. False negatives in this scenario, however, present a major difficulty.
Face scanning technology isn't innately evil. Like everything else, if we use it wisely, it can help. If we use it irresponsibly, it can hurt. No surprises there.
-db
False positives okay (Score:2, Insightful)
It is the false negatives that are truly scary. If a known terrorist sympathizer can board a plane without setting off any signals then it is clearly a useless product.
Luckily, humans have the ability to fuzzily predict terrorist-like behavior (now that everyone's on high alert, that is).
Re:False positives okay (Score:3, Insightful)
False positives are as bad as, if not worse than, false negatives.
If we want to make this technology work... (Score:2, Insightful)
I, for one, am pretty much 99.99% correct when it comes to positively recognizing the people around me whom I see often. People I haven't seen in a few years, I have more trouble identifying. Why? Because people's faces change. Facial hair, glasses (or the removal of them), makeup, etc. can throw a lot of people off. Can this technology compensate for that?
I personally think that these cameras need to look at people the way we do: with two eyes. What do we get when we look at the world with two eyes? Depth perception. We see objects in three dimensions because we see them from two angles at once. If facial recognition computers could take in two separate data streams, say from two cameras a foot apart, it would be possible to create a three-dimensional image of a person's face. And though it would require more computing power, it is much easier to make a positive match using three-dimensional data than two-dimensional data. Ever seen a perfect frontal-view photograph of a person's face? Can you tell how long their nose is when you're looking at it? Isn't the length of a person's nose a significant facial feature? (Oh, I know, if you see a person from the side you see that, but these cameras are only ever getting one angle, so they're always throwing out a lot of data. If you see a person's face from the side, you are not seeing how wide their face is, and so on.)
Re:False Positives are OK (Score:2, Insightful)
Given the choice of a false positive in a bookshop and one at the airport I know which I would want to avoid.
Re:Broken promise ring (Score:3, Insightful)
Atta (the scary-looking ringleader) had previously been arrested in Israel for being a terrorist. He was released as part of Bill Clinton's Mideast "peace" initiative, but was still on various US gov't lists of terrorists.
If the INS weren't totally useless, and if the FBI, FTC, etc. shared information, the hijackers would have been deported when they were caught being here illegally, driving with expired licenses, failing to show up for court, or buying airline tickets.
Tom Daschle and the Democrats want to blame George Bush because the FBI and CIA, in hindsight, had the information to see this coming.
The real tragedy is that they, and thousands of others, were here illegally, and we did nothing.
Re:Unpopular View (Score:5, Insightful)
I don't necessarily understand the objections to face scanning technology. [...] Like everything else, if we use it wisely, it can help. If we use it irresponsibly, it can hurt.
You just hit the nail on the head there; most people who don't like this technology don't like it because (they believe) it will be used irresponsibly, eventually if not immediately. Power corrupts, as the old saying goes, and people are unfortunately easily corruptible. Ordinarily I wouldn't be quite so pessimistic, but given all the hoopla over the "War on Terrorism", I'm inclined to side with the Slashdot popular view.
(Note to moderators: Yes, I do realize that there are many points of view represented on Slashdot, thankyouverymuch.)
False positives, false negatives, and wasting time (Score:5, Insightful)
As noted, there can be no "get past ID check free" letter or ID card, since those would immediately be forged. And with a 50% false negative rate (missing a suspect 50% of the time), the system seems hardly worth using.
I have not traveled by air since returning from Europe on September 19 (delayed from Sept. 12).
In the past, I would have flown between the San Francisco Bay Area and the Los Angeles area (a 1-hour flight, using any of the airports on either end), but now it's actually likely to be faster to drive (around six hours each way), after including all the "waiting in line time," the increased flight delays, and of course the time to get into and out of the airports (park here, rent a car there).
To be fair, of course, a system with a 50% false negative rate is presumably able to detect "known suspects" 50% of the time, which is almost certainly much better than human beings will ever do. Of course, the tests are probably being conducted under very favorable conditions, with an extremely small sample of "suspects." And of course, if the false-positives were equally distributed, we'd all be willing to suffer a one-in-a-thousand delay, if it actually had any meaningful benefit. (But we know that the false-positives won't be equally distributed, they will mostly affect persons in certain ethnic groups or with beards, etc., and while that means I'm less likely to be inconvenienced, I can't tolerate a system that punishes people for their skin color or ethnic background.)
What's scary, to me, is that we are giving up so much (in many little bits and pieces) for so little benefit. On Saturday, I discovered that I couldn't use the restrooms in the BART (train) stations again, because they were closed to prevent someone from planting a bomb in them. Okay, so I had to hold it for an hour until I got home, big deal. And armed troops in the airports, and on bridges, okay, I can live with that one thing. And I can't drop off my express mail without handing it to a postal clerk now.
But ding, ding, ding, we add up all the little "show-off" gimmicks and what we face is a huge impact that provides pretty much zero actual benefit. All the gimmicks combined might provide about 1% or 10% improved safety, at a much greater cost.
While I was stuck in London during the week after September 11, I worried that things would change for the worse, not because of things that terrorists did, but because of the things we would do out of panic and fear and prejudice and idiocy. Things are nowhere near my worst fears, but I think things are very bad, and ultimately I believe that the terrorists have already "won" by causing most Americans to change multiple aspects of our "way of life."
The Ultimate System (Score:2, Insightful)
A security guard is sitting in front of a computer next to the x-ray machine ready for a positive match.
If you look nothing like the person (different race or something like that), you would be let through to the gate without ever knowing the system had flagged you.
If it might be a good match, you get stopped. The operator already has some information about the suspect in front of him and does a quick on-the-spot check. One thing criminals are notorious for is tattoos. If the passenger doesn't have them (or signs of removal surgery), let them go. If the passenger is a very close match, do a more thorough examination.
Every night there can be an audit of the matches to make sure the security personnel are doing their job. The system seems very effective to me.
The system by Visionics looks at 80 different facial characteristics. The configuration used by the airport needed only 14 matching characteristics to produce a positive. That appears to be a software setting, and it could probably be lowered to produce more positives. Even if they are false positives, the system I mentioned above would handle them.
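The "14 of 80 characteristics" rule is easy to picture as a simple threshold test. This is a toy sketch, not Visionics' actual algorithm: the feature representation (a list of 80 numbers) and the tolerance are invented, and only the threshold logic mirrors the description above.

```python
# Toy version of a "14 of 80 characteristics" matching rule. The feature
# encoding and tolerance are invented; only the threshold idea is real.
def is_match(probe, reference, threshold=14, tolerance=0.05):
    """Flag a positive when at least `threshold` features agree."""
    agreeing = sum(1 for p, r in zip(probe, reference)
                   if abs(p - r) <= tolerance)
    return agreeing >= threshold

# Lowering the threshold turns a miss into a (possibly false) positive.
probe = [0.50] * 80
reference = [0.50] * 13 + [0.90] * 67   # only 13 of 80 features agree
print(is_match(probe, reference, threshold=14))  # False
print(is_match(probe, reference, threshold=13))  # True
```

The same face pair is a miss at threshold 14 and a hit at threshold 13, which is exactly the software knob the comment describes: more positives, including more false ones, at the turn of a dial.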
Look-alikes? (Score:5, Insightful)
Let's say, some time in the future, they get the face-scanning technology to work right. 0.000001% false-positive rate. And it's implemented all over the US.
Let's also say that, among the 250 million people in the United States, one or more people have facial structures similar enough to a terrorist's that they would trigger those scanners. In fact, they'd trigger every scanner they were surveilled by. And let's say that person were you.
What would you do?
You couldn't go to an airport. You couldn't go to a major public attraction. You probably couldn't go to a public place without fear of some alarm going off and people waving automatic weapons in your face. Would you cower at home? Would you wear a bag over your head? Would you sue the US government? How would you cope?
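This scenario isn't far-fetched even under the parent's own generous numbers; a quick calculation (using the 0.000001% rate and 250 million population from the comment above) shows it is essentially guaranteed to happen to someone:

```python
# The parent's hypothetical: a 0.000001% per-person false-positive rate
# applied across a US population of 250 million.
fp_rate = 0.000001 / 100        # 0.000001% expressed as a probability: 1e-8
population = 250_000_000
expected_lookalikes = population * fp_rate

print(expected_lookalikes)      # a couple of people who trip every scanner
```

Even at one in a hundred million, the expected number of permanent "look-alikes" is about 2.5 people nationwide, each of whom would live the nightmare described above.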
No such thing as a cure-all (Score:2, Insightful)
Employing facial recognition is just one thing we can do - granted, we need to get the technology to work better, but we need to realize that it's multiple systems working together that is going to stop terrorists, not one or two "miracle systems."
99.99% accurate?? (Score:3, Insightful)
First, as you state, that 99.99% accuracy rate only applies to a group of people you meet regularly; this probably covers a few hundred faces at most, and a significant part of your total memory and processing capability is devoted to recognizing and updating your memory of those faces (check out a brain map for how much of our cortex is dedicated to face recognition). Even duplicating that feat (i.e., identifying a small group of faces) would be a major undertaking for a computer system.
Second, that 99.99% isn't nearly as impressive as it sounds, because it represents the positive rate, i.e. the chance that you will correctly identify an individual in the target population. That corresponds to a false negative rate of 0.01% -- you're saying that once in ten thousand times, you'll actually fail to recognize somebody you see on a regular basis. Not too encouraging, that.
Third, that figure says absolutely nothing about the false positive rate, which I suspect is much higher. In other words, how often do you see somebody that you think you recognize, but can't quite remember exactly? From my own experience, I would say that number is as high as one in a hundred. Our own built-in face recognition system is simply designed that way -- to generate a large number of "near misses".
So, the bottom line is: even the supposedly high accuracy of human facial recognition isn't accurate enough, and undoubtedly doesn't scale very well.
False positives (Score:5, Insightful)
As for false negatives, even 50% is better than nothing as long as the false-positive rate is much, MUCH lower. Imagine catching 50% of the hijackers on September 11 before they boarded the planes. A lot of red flags could have gone up, flights could have been delayed, and the rest of the passengers could have been more carefully scrutinized. No, this is not the solution to any problem. And no, it should carry no more legal weight than a lie detector does. It's a guide. It tells us where we might need to concentrate more of our efforts.
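The "catching 50%" intuition compounds quickly across a group. Assuming each person is missed independently (an idealization) and using the 19 hijackers of September 11 as the group size:

```python
# Probability of flagging at least one member of a group when each is
# independently missed 50% of the time. Independence is an assumption;
# 19 is the number of September 11 hijackers.
miss_rate = 0.5
group_size = 19
p_at_least_one = 1 - miss_rate ** group_size

print(f"{p_at_least_one:.6f}")
```

Even with a coin-flip miss rate per person, the chance of flagging at least one of nineteen is about 99.9998% -- which is the sense in which a high false-negative rate can still be "better than nothing", provided the false alarms don't drown the signal.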
As far as threats to privacy go, this makes sense in an airport, but it does not make sense out on the street. People go into an airport expecting to be searched, questioned, carded, etc. They do not have the same expectation while walking down the street. So unless the cops are currently chasing someone, lose him, and you bear a striking resemblance to him, they shouldn't bother you at all.
-Restil
Re: Unpopular View (Score:3, Insightful)
> To be sure, I don't want computers flagging people to be arrested. But computers sift through enormous amounts of information, making them ideal for a first pass. If they are used to flag people to be scrutinized by humans, I don't have any objections.
Are those humans going to be highly-trained well-paid experts like those who work airport security?
The basic expectation is that the human 'supervisors' will adopt a strategy of either (a) waving through everyone the computer identifies, because they're tired of taking the heat for false positives, or (b) calling the cops every time the computer identifies someone, so they won't have to take the heat if a terrorist does get through. (Interesting problem, that. I would guess that we would see a lot of variety in individual behavior early on, after which it would settle into a state where all 'supervisors' behave the same. Presumably that would be state (a), except during 'alerts' and for relatively short periods after real incidents.)
The only workable position between those extremes is to get it right almost every time, i.e. to have a real expert (or team of experts) looking over the computer's shoulder. And I seriously doubt that society is going to be willing to pay for that.
What Bothers Me... (Score:3, Insightful)
I'm in that database because when I was an 18 year old high school senior I committed the high crime of having had consensual sex with my girlfriend, who was a year and a half younger than I was. It's bad enough to get charged with a felony for consensual sex with a partner who's within 2 years of your own age, but now maybe I'll get harassed when I go to national monuments or big events because of hits in facial recognition software. In theory the facial recognition technology will only be hooked into a partial database of certain types of people. In practice, I doubt they'll be very selective.
What if you got arrested as a teenager for having a small amount of marijuana? What if you were accused of assault over a minor altercation? What about any number of minor infractions that would still have landed your face in the FBI database? My guess is that, as the technology gets better and more discriminating in the field, the portion of the FBI database used will widen until the full database gets scanned.
So, it's not just false-positives that are a worry, but positives against people with very minor infractions that have still landed them in the FBI database. Should you get shaken down by some overzealous dweeb who thinks you're dealing drugs because 10 years ago you got caught with your personal stash of green? And what of the potential for abuse of sensitive personal data?
Now that this particular can of worms has been opened under the excuse of 9/11, it's only going to get bigger and more invasive. First they'll assure us the database they're using only has "violent" criminals in it. Then it'll only be felons. Next it's the whole FBI database, including all the pictures of people whose parents were stupid enough to fingerprint and photograph their children and submit a packet voluntarily "to protect your children in case of abduction," and DMV databases as well.
Is it just me, or is it getting kind of Orwellian in here?
Turn up false positives, false negatives decline (Score:3, Insightful)
If you took this technology, tuned it to match too many faces, and then had someone manually double-check each potential match, you would have a kick-ass system.
Like all powerful technology, its use must be ethical.
Avoiding Responsibility (Score:3, Insightful)
Without human supervision, there will be too many false positives for the average person to stand for. Without *diligent* human supervision, the false negatives will slip through too easily.
Not that I'm necessarily being critical of the security employees. It is only human nature. How many security checks and stops did you happily (or at least understandingly) endure in the months after September 11th that you grouse about now? Keeping security personnel at top alert all the time is the problem they should be working on. That and getting the INS to do their job.
Re:False positives (Score:2, Insightful)
I sold you and you sold me
There lie they and here lie we
Under the spreading chestnut tree
Osama is winning, and we are letting him (Score:3, Insightful)
By reacting the way we are in the U.S., Osama bin Laden is getting exactly what he was aiming for. He wanted to destroy the American way of life, and by stripping away freedoms and civil rights the way we are, he is achieving his goal. There is no longer any need for him to act. We have met the enemy, and it is U.S.