Technology

Face-Scanning Loses by a Nose in Palm Beach

Rio writes: "A story from myCFnow.com reports that Palm Beach International Airport officials said face-scanning technology will not become part of their airport's security system." Looks like the ACLU was right. Tested against a database of just 15 employees, the technology gave false negatives -- failed to recognize the test subjects -- more than 50% of the time. A spokesperson said, "There's room for improvement." The Pentagon said the same thing in February. The false-positive rate is more important -- the story doesn't mention it, but Bruce Schneier argues that even a rate as low as 0.1% would make the system useless.
  • by serps ( 517783 ) on Monday May 27, 2002 @12:08AM (#3589482) Homepage

    Airport face identification isn't practical? Try telling the Australian Government that. [news.com.au] They are trialling a hybrid face-recognition/biometric passport system that sends shivers up my spine.

  • by Triskaidekaphobia ( 580254 ) on Monday May 27, 2002 @12:10AM (#3589487)
    If it is a small sample, then a high false-negative rate is even worse.

    If it can't identify 1 of 15, then what chance has it got of finding 1 person out of millions?
  • Human oversight (Score:2, Informative)

    by enjo13 ( 444114 ) on Monday May 27, 2002 @12:11AM (#3589491) Homepage
    I think a 0.01% false-positive rate would be perfectly acceptable. I have not seen one proposal for a face-scanning system that has not also included human oversight.

    It's exactly the system casinos have successfully deployed to keep known "cheaters" out. The face-scanning technology merely provides POSSIBLE matches; the actual decision on further investigation rests with a human operator...

    This seems perfectly reasonable to me from a technology standpoint -- I'll argue the ethics of this technology some other time :) (some rough numbers on that false-alarm rate are sketched below)
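
    For a quick feel for that claim, here is a minimal sketch under loudly labeled assumptions: the daily passenger volume is my guess for illustration, not a figure from the story; the 0.01% rate is the one from the post above.

        # Rough count of daily false alarms a human operator would review.
        passengers_per_day = 10_000    # assumed volume, illustration only
        false_positive_rate = 0.0001   # the 0.01% figure from the post above

        alerts = passengers_per_day * false_positive_rate
        print(f"~{alerts:.0f} false alarm(s) per day to review")

    An absolute count like that looks manageable; the catch, as the Cryptogram excerpt further down points out, is the ratio of false alarms to real hits, not the daily total.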
  • by God Takeru ( 409424 ) on Monday May 27, 2002 @12:36AM (#3589577) Homepage
    I don't know if you're being sarcastic or not, but if you're serious, you're wrong.

    Right now, in your own eyes, you are not a criminal. But what keeps you that way? What if the government decides something you do is a criminal offense? Perhaps they'll decide that Slashdot, as a part of Hax0r culture (I wouldn't call it that, but the people in power in this country are stupid enough to do so), must be outlawed, and its users are all 'terrorists.' Of course, fifty years ago we'd all be 'communists,' but times change, and so does the way the idea of a subversive is made to sound like the enemy.

    You see, anything is potentially a crime. Leaving my house, attending class, writing papers, playing water polo, jacking off ten hours a day -- these are things that take up most of my time. The fact is, there is no guarantee that these will not become crimes. If using drugs is a crime, if someone who feeds a non-violent subversive activist is a 'terrorist' now, then any of these activities could become criminal.

    In the majority of the United States, it is still legal to fire someone simply for being gay. There is no constitutional amendment to protect against this, and there is no federal law. It will most likely stay this way for a long time. In fact, some of the anti-discrimination laws that keep this from being true everywhere are being repealed. Who's to say that you aren't a criminal in such an unjust nation?

    We are not the land of the free -- don't buy that. You aren't safe. Unless you work for the government in a high-ranking office (as in you were either elected or appointed), or have a LOT of money, you can be screwed at any time.

    Slashdotters need to worry. Fight surveillance! Fight for your freedom, no matter the cost.
  • by AftanGustur ( 7715 ) on Monday May 27, 2002 @03:30AM (#3589918) Homepage


    "Bruce talks about 99.9%, so there's 0.1% left, not 0.01% as the story says right now."

    No, sorry, just read Bruce's Cryptogram [counterpane.com]


    Suppose this magically effective face-recognition software is 99.99 percent accurate. That is, if someone is a terrorist, there is a 99.99 percent chance that the software indicates "terrorist," and if someone is not a terrorist, there is a 99.99 percent chance that the software indicates "non-terrorist." Assume that one in ten million flyers, on average, is a terrorist. Is the software any good?

    No. The software will generate 1000 false alarms for every one real terrorist. And every false alarm still means that all the security people go through all of their security procedures. Because the population of non-terrorists is so much larger than the number of terrorists, the test is useless. This result is counterintuitive and surprising, but it is correct. The false alarms in this kind of system render it mostly useless. It's "The Boy Who Cried Wolf" increased 1000-fold.
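
    Checking the quoted arithmetic takes a few lines. Below is a minimal sketch -- the 99.99% accuracy and one-in-ten-million base rate are the figures assumed in the quote; the variable names are mine:

        # Back-of-the-envelope check of the Cryptogram figures quoted above.
        flyers = 10_000_000              # population assumed in the quote
        accuracy = 0.9999                # 99.99% for both classes
        terrorists = 1                   # one in ten million flyers

        true_positives = terrorists * accuracy                    # ~1
        false_positives = (flyers - terrorists) * (1 - accuracy)  # ~1000

        print(f"false alarms per real terrorist: "
              f"{false_positives / true_positives:.0f}")

    Run as written, it prints roughly 1000 -- the "Boy Who Cried Wolf" factor Bruce describes.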

  • by alansz ( 142137 ) on Monday May 27, 2002 @08:17AM (#3590345) Homepage
    This problem is exactly analogous to the proposal that went around Chicago some years back to test all to-be-married couples for HIV. Surprise, surprise, the base rate of HIV among to-be-married couples was quite low. More false positives than true positives. Lots of wasted time, money, and stress on re-screening.

    As you may know, Bayes' Theorem (actually a statement of fact in probability theory) says:

    Post-test odds = Likelihood Ratio * Pre-test odds

    (Where the likelihood ratio for a positive test is the sensitivity/(1-specificity), or TP rate / FP rate)

    If your pre-test odds of being a terrorist are very low (and when you consider how many terrorists fly compared to how many non-terrorists fly, they must be exceedingly low), you're going to need a very, very powerful ("highly specific" in medical terms) test if you want to reliably determine that a given person ought to be treated with greater care.

    On the other hand, if they were planning to spend a lot of time and money screening people anyway, and they could improve their sensitivity (TP rate), facial recognition might be a (statistically) sound approach to screening *out* suspects. That is, once you pass a face-detection screen that has a high TP rate, you don't need to be subjected to as much extra screening; but if you fail the face-detection screen, it's not really diagnostic.

    Normally, you could use my diagnostic test calculator [uic.edu] to fool around with the numbers yourself and see what the impact would be, but it appears to be down until I can get to the server (dratted dist upgrade!). A quick stand-in is sketched below.
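
    While the calculator is down, here is a minimal stand-in sketch using the odds form of Bayes' Theorem exactly as stated above; the sample inputs are illustrative, not measurements from any real system:

        # Post-test odds = likelihood ratio * pre-test odds,
        # where LR+ = sensitivity / (1 - specificity) = TP rate / FP rate.
        def post_test_probability(pre_test_prob, sensitivity, specificity):
            pre_odds = pre_test_prob / (1 - pre_test_prob)
            lr_positive = sensitivity / (1 - specificity)
            post_odds = lr_positive * pre_odds
            return post_odds / (1 + post_odds)

        # Illustrative inputs: one flyer in ten million is a terrorist,
        # and the scanner is 99.99% sensitive and 99.99% specific.
        p = post_test_probability(1e-7, 0.9999, 0.9999)
        print(f"P(terrorist | positive match) = {p:.6f}")  # ~0.001

    That works out to about one real terrorist per thousand positive matches -- the same 1000-to-1 false-alarm ratio as in the Cryptogram excerpt above.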

  • by david.johns ( 466417 ) <kallisti@morpho . d ar.net> on Monday May 27, 2002 @02:29PM (#3591453) Homepage
    Actually, over on plastic [plastic.com], we had a recent discussion about being on sex offender lists. The point was made that sex offender lists often include people who flashed someone 40 years ago and got caught, even though these lists didn't exist back then. Now they're branded with the big 'A' (or is it 'P' these days?) wherever they go. People think they're pedophiles when they might just have been sleeping with their underage girlfriend or boyfriend. Great.

    So, with that in mind -- is keeping blacklists (or greylists, really) of people a good idea at all? We like to pretend that they keep us 'safer' -- but I bet the sixty-year-old gay man (prosecuted under one of those 'unenforced' state sodomy laws) who's driven out of his neighborhood with cries of 'think of the children!' isn't feeling any safer as a result of the existence of these lists.
