Face-Scanning Loses by a Nose in Palm Beach
Rio writes: "A story from myCFnow.com reports that Palm Beach International Airport officials said face-scanning technology will not become part of their airport's security system." Looks like the ACLU was right. Checking a database of 15 employees, the technology gave false negatives -- failed to recognize the test subjects -- over 50% of the time. A spokesperson said, "There's room for improvement." The Pentagon said the same thing in February. The false-positive rate is more important -- it isn't mentioned, but even if it were just 0.1%, Bruce Schneier argues, it'd be useless.
Try telling the Aussies that. (Score:4, Informative)
Airport face identification isn't practical? Try telling the Australian Government that. [news.com.au] They are trialling a hybrid face-recognition/biometric passport system that sends shivers up my spine.
Re:only 15 employees? (Score:4, Informative)
If it can't identify 1 of 15, then what chance has it got of finding 1 person out of millions?
Human oversight (Score:2, Informative)
It's exactly the system casinos have successfully deployed to keep known "cheaters" out. The face-scanning technology merely provides POSSIBLE matches; the actual decision on further investigation rests with a human operator...
This seems perfectly reasonable to me from a technology standpoint. I'll argue the ethics of this technology some other time :)
Re:slashdotters dont need to worry (Score:2, Informative)
Right now, in your own eyes, you are not a criminal. But what keeps you that way? What if the government decides something you do is a criminal offense? Perhaps they'll decide that Slashdot, as a part of Hax0r culture (I wouldn't call it that, but the people in power in this country are stupid enough to do so), must be outlawed, and its users are all 'terrorists.' Of course, fifty years ago we'd all have been 'communists,' but times change, and so do the ways of making subversives sound like the enemy.
You see, anything is potentially a crime. Leaving my house, attending class, writing papers, playing water polo, jacking off ten hours a day -- these are things that take up most of my time. The fact is, there is nothing guaranteeing that these are not crimes. If using drugs is a crime, if someone who feeds a non-violent subversive activist is a 'terrorist' now, any of these activities could become criminal.
In the majority of the United States, it is still legal to fire someone for quite simply being gay. There is no amendment to protect from this, there is no federal law. And it will be this way for a long time, most likely. In fact, some of the anti-discrimination laws that keep this from being true everywhere are being repealed. What's to say that you aren't a criminal in such an unjust nation?
We are not the land of the free, don't buy that. You aren't safe. Unless you work for the government in a high ranking office (as in you were either elected or appointed), or have a LOT of money, you can be screwed at any time.
Slashdotters need to worry. Fight surveillance! Fight for your freedom, no matter the cost.
No, it's 99.99% Read Cryptogram (Score:4, Informative)
Bruce talks about 99.99 percent accuracy, so there's 0.01% left, not the 0.1% the story says right now.
No, sorry, just read Bruce's Cryptogram [counterpane.com]
Suppose this magically effective face-recognition software is 99.99 percent accurate. That is, if someone is a terrorist, there is a 99.99 percent chance that the software indicates "terrorist," and if someone is not a terrorist, there is a 99.99 percent chance that the software indicates "non-terrorist." Assume that one in ten million flyers, on average, is a terrorist. Is the software any good?
No. The software will generate 1000 false alarms for every one real terrorist. And every false alarm still means that all the security people go through all of their security procedures. Because the population of non-terrorists is so much larger than the number of terrorists, the test is useless. This result is counterintuitive and surprising, but it is correct. The false alarms in this kind of system render it mostly useless. It's "The Boy Who Cried Wolf" increased 1000-fold.
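Schneier's arithmetic is easy to verify in a few lines. A minimal sketch (the 99.99% accuracy and one-in-ten-million base rate are the figures from the quote above; the variable names are mine):

```python
# Base-rate check of the face-recognition example from Crypto-Gram.
accuracy = 0.9999            # assumed for both sensitivity and specificity
base_rate = 1 / 10_000_000   # one terrorist per ten million flyers
flyers = 10_000_000

terrorists = flyers * base_rate                        # 1 real terrorist
true_alarms = terrorists * accuracy                    # ~1 caught
false_alarms = (flyers - terrorists) * (1 - accuracy)  # ~1000 innocents flagged

print(f"{false_alarms:.0f} false alarms per {true_alarms:.4f} real terrorist")
```

Roughly 1000 false alarms for each genuine hit, just as the quote says -- the huge population of non-terrorists swamps the tiny error rate.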
Not just FP,FN, but Base Rates! (Score:2, Informative)
As you may know, Bayes Theorem (actually a statement of fact in probability theory) says:
Post-test odds = Likelihood Ratio * Pre-test odds
(Where the likelihood ratio for a positive test is the sensitivity/(1-specificity), or TP rate / FP rate)
If your pre-test odds of being a terrorist are very low (and when you consider how many terrorists fly compared to how many non-terrorists fly, they must be exceedingly low), you're going to need a very, very powerful ("highly specific" in medical terms) test if you want to reliably determine that a given person ought to be treated with greater care.
On the other hand, if they were planning to spend a lot of time and money screening people anyway, and they could improve their sensitivity (TP rate), facial recognition might be a (statistically) sound approach to screening *out* suspects. That is, once you pass a face-detection screen that has a high TP rate, you don't need to be subjected to as much extra screening; but if you fail the face-detection screen, it's not really diagnostic.
Normally, you could use my diagnostic test calculator [uic.edu] to fool around with numbers yourself and see what the impact would be, but it appears to be down until I can get to the server (dratted dist upgrade!)
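In the meantime, the odds-form update above can be sketched directly. This is just Bayes' theorem in odds form; the 99%/99% sensitivity and specificity figures and the one-in-a-million prior are made up for illustration:

```python
# Odds-form Bayes update: post-test odds = likelihood ratio * pre-test odds.
def post_test_prob(pre_test_prob, sensitivity, specificity):
    """Probability of the condition after a positive test result."""
    lr_positive = sensitivity / (1 - specificity)  # TP rate / FP rate
    pre_odds = pre_test_prob / (1 - pre_test_prob)
    post_odds = lr_positive * pre_odds
    return post_odds / (1 + post_odds)             # odds back to probability

# Illustrative numbers: a 99%-sensitive, 99%-specific scanner (LR+ = 99)
# applied to a one-in-a-million pre-test probability.
p = post_test_prob(1e-6, 0.99, 0.99)
print(f"{p:.6f}")  # still only about a 1-in-10,000 chance after a "hit"
```

Even a seemingly strong test barely moves the needle when the prior is that low -- which is the whole point about base rates.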
Re:What Bothers Me... (Score:3, Informative)
So, with that in mind - is keeping blacklists (or greylists, really) of people a good idea at all? We like to pretend that they keep us 'safer' - but I bet the sixty-year-old gay man (prosecuted under one of those 'unenforced' state sodomy laws) who's driven out of his neighborhood with cries of 'think of the children!' isn't feeling any safer as a result of the existence of these lists.