Face-Scanning Loses by a Nose in Palm Beach

Rio writes: "A story from myCFnow.com reports that Palm Beach International Airport officials said face-scanning technology will not become part of their airport's security system." Looks like the ACLU was right. Checking a database of 15 employees, the technology gave false-negatives -- failed to recognize the test subjects -- over 50% of the time. A spokesperson said, "There's room for improvement." The Pentagon said the same thing in February. The false-positive rate is more important -- it isn't mentioned, but even if it were just 0.1%, Bruce Schneier argues, it'd be useless.
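
Schneier's objection is a base-rate argument, and it's easy to check with back-of-the-envelope arithmetic. A minimal sketch (the passenger volume and watch-list count below are assumed figures, not from the story):

```python
# Base-rate sketch: why even a 0.1% false-positive rate swamps screeners.
# Assumed figures (not from the story): 25 million passengers per year,
# of whom 100 are genuine watch-list matches.
passengers = 25_000_000
real_suspects = 100

false_positive_rate = 0.001   # 0.1%, the hypothetical rate above
false_negative_rate = 0.50    # the >50% miss rate measured in the trial

false_alarms = (passengers - real_suspects) * false_positive_rate
true_hits = real_suspects * (1 - false_negative_rate)

print(f"False alarms per year: {false_alarms:,.0f}")   # ~25,000
print(f"Real suspects caught:  {true_hits:,.0f}")      # 50
print(f"Odds that an alarm is real: 1 in {false_alarms / true_hits:,.0f}")
```

With these assumptions an alarm is real about one time in five hundred, which is Schneier's point: the usefulness of the system is dominated by the rarity of actual suspects, not by the headline accuracy figure.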
  • only 15 employees? (Score:2, Insightful)

    by Atrax ( 249401 ) on Monday May 27, 2002 @12:07AM (#3589479) Homepage Journal
    A bit of a small sample, don't you agree? And how was it composed...?
  • by I Want GNU! ( 556631 ) on Monday May 27, 2002 @12:14AM (#3589500) Homepage
    Not using faulty technology is a great idea! Now all they need to do is repeal the law that takes away school and library aid for not using filtering technology, since the filters don't publish open lists of blocked sites, often block sites they shouldn't, and fail to block sites they should!
  • by Papineau ( 527159 ) on Monday May 27, 2002 @12:16AM (#3589504) Homepage
    Bruce talks about 99.9%, so there's 0.1% left, not 0.01% as the story says right now.

    If a person is mistaken once for a terrorist (or a "normal" criminal), don't you think other recognition points will make the same mistake? What do you do then? Plan a few extra hours each time you take a plane? Get an official letter stating "I'm not a terrorist"? If a simple letter can get you through, terrorists will get some.
  • Unpopular View (Score:5, Insightful)

    by deebaine ( 218719 ) on Monday May 27, 2002 @12:19AM (#3589517) Journal
    I don't necessarily understand the objections to face scanning technology. To be sure, I don't want computers flagging people to be arrested. But computers sift through enormous amounts of information, making them ideal for a first pass. If they are used to flag people to be scrutinized by humans, I don't have any objections. In fact, if a computer can flag 20 of the hundreds of thousands of faces so that human experts can give a closer look, so much the better.

    Incidentally, by this reasoning, it is in fact the false negatives that are more important. False positives can presumably be discarded by humans providing closer scrutiny. False negatives in this scenario, however, present a major difficulty.

    Face scanning technology isn't innately evil. Like everything else, if we use it wisely, it can help. If we use it irresponsibly, it can hurt. No surprises there.

    -db
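
The two-stage triage described in this comment (computer first pass, human second pass) can be worked through with a little arithmetic. A minimal sketch, with all volumes and rates assumed for illustration:

```python
# Two-stage screening sketch: computer first pass, human second pass.
# All figures are illustrative assumptions, not measured values.
faces_per_day = 100_000       # faces scanned at a busy terminal
computer_fp_rate = 0.001      # computer flags 0.1% of innocent faces
computer_fn_rate = 0.50       # computer misses half of real suspects
suspects_per_day = 1          # assume one genuine watch-list face per day

flagged_innocent = faces_per_day * computer_fp_rate       # ~100 reviews/day
flagged_real = suspects_per_day * (1 - computer_fn_rate)  # 0.5

print(f"Faces humans must review per day: {flagged_innocent + flagged_real:.0f}")
print(f"Real suspects reaching human review: {flagged_real:.1f}")
# Humans can discard the ~100 false flags, but no amount of human
# scrutiny recovers the suspects the computer never flagged; that is
# why false negatives dominate in this design.
```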
  • by ObviousGuy ( 578567 ) <ObviousGuy@hotmail.com> on Monday May 27, 2002 @12:22AM (#3589525) Homepage Journal
    False positives are fine, though a failure rate of 50% is clearly way too high. False positives mean that in those cases a suspect was actually identified correctly.

    It is the false negatives that are truly scary. If a known terrorist sympathizer can board a plane without setting off any signals then it is clearly a useless product.

    Luckily, humans have the ability to fuzzily predict terrorist-like behavior (now that everyone's on high alert, that is).
  • by ivan256 ( 17499 ) on Monday May 27, 2002 @12:26AM (#3589543)
    Let's see if you think a false positive is ok when the guy with the rubber gloves is up your ass to the elbow looking for explosives.

    False positives are as bad as, if not worse than, false negatives.
  • Re:Unpopular View (Score:3, Insightful)

    by h0rus ( 451357 ) on Monday May 27, 2002 @12:27AM (#3589549)
    Yes, and the reason for this tension is we expect these methods to be misused.
  • by Indras ( 515472 ) on Monday May 27, 2002 @12:32AM (#3589565)
    We need to take a minute to figure out why it doesn't work. Or maybe, instead of that, look at recognition that does work.

    I, for one, am pretty much 99.99% correct when it comes to making positive recognition of those people around me that I see often. People I haven't seen in a few years, I have more trouble identifying. Why? Because people's faces change. Facial hair, glasses (or removal of them), makeup, etc. can throw a lot of people off. Can this technology compensate for that?

    I personally think these cameras need to look at people the way we do: with two eyes. What do we get when we look at the world with two eyes? Depth perception. We can see objects in three dimensions because we see them from two angles at once. If facial-recognition computers could take in two separate data streams, like two cameras a foot apart, it would be possible to create a three-dimensional image of a person's face. And though it would require more computing power, it is much easier to make a positive match using three-dimensional data than two-dimensional data. Ever seen a perfect frontal-view photograph of a person's face? Can you tell how long their nose is when you're looking at it? Isn't the length of a person's nose a significant facial feature? (Yes, if you see a person from the side you can see that, but these cameras only ever get one angle, so they're always throwing out a lot of data. If you see a person's face from the side, you aren't seeing how wide their face is, and so on.)
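
The two-camera idea is standard stereo triangulation: depth falls out of the disparity between the two views. A minimal sketch of the usual pinhole-camera relation (the focal length, baseline, and disparities are made-up illustrative values):

```python
# Stereo-depth sketch: two cameras a fixed distance apart can recover
# depth, e.g. how far the nose tip protrudes. Standard pinhole relation:
#     depth = focal_length * baseline / disparity
# All numbers are illustrative assumptions, not real camera specs.

focal_length_px = 800.0   # focal length, in pixels
baseline_m = 0.30         # cameras roughly a foot apart, per the comment

def depth_from_disparity(disparity_px: float) -> float:
    """Depth in meters of a point seen with the given pixel disparity."""
    return focal_length_px * baseline_m / disparity_px

# A nose tip nearer the cameras shows a larger disparity than the cheeks:
nose_depth = depth_from_disparity(120.0)    # 2.000 m
cheek_depth = depth_from_disparity(119.0)   # 2.017 m
print(f"Nose protrudes about {(cheek_depth - nose_depth) * 100:.1f} cm")
```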
  • by Triskaidekaphobia ( 580254 ) on Monday May 27, 2002 @12:34AM (#3589572)
    Clerks in bookshops don't have machine guns and don't have the authority to arrest and strip search you.

    Given the choice of a false positive in a bookshop and one at the airport I know which I would want to avoid.
  • by larry bagina ( 561269 ) on Monday May 27, 2002 @12:49AM (#3589607) Journal
    The September 11 terrorists weren't "sleepers".

    Atta (the scary-looking ringleader) had previously been arrested in Israel for being a terrorist. He was released as part of Bill Clinton's mideast "peace" initiative, but was still on various US gov't lists of terrorists.

    If the INS weren't totally useless, and if the FBI, FTC, etc. shared information, they would have been deported when they were caught being here illegally, driving with an expired license, failing to show up for court, or buying airline tickets.

    Tom Daschle and the Democrats want to blame George Bush because the FBI and CIA, in hindsight, had the information to see this coming.

    The real tragedy is that they, and thousands of others, were here illegally, and we did nothing.

  • Re:Unpopular View (Score:5, Insightful)

    by achurch ( 201270 ) on Monday May 27, 2002 @12:53AM (#3589618) Homepage

    I don't necessarily understand the objections to face scanning technology. [...] Like everything else, if we use it wisely, it can help. If we use it irresponsibly, it can hurt.

    You just hit the nail on the head there; most people who don't like this technology don't like it because (they believe) it will be used irresponsibly, eventually if not immediately. Power corrupts, as the old saying goes, and people are unfortunately easily corruptible. Ordinarily I wouldn't be quite so pessimistic, but given all the hoopla over the "War on Terrorism", I'm inclined to side with the Slashdot popular view.

    (Note to moderators: Yes, I do realize that there are many points of view represented on Slashdot, thankyouverymuch.)

  • The notion that someone will repeatedly be "identified" as matching a particular face, is a very real concern for travelers. Already, we find that Americans with brown skin and beards, and especially persons who "look" Muslim, are hassled every time they enter an airport and often miss their flights. Non-citizens who "appear Muslim" should probably just give up on any idea of flying in the next few years.

    As noted, there can be no "get past ID check free" letter or ID card, since those would immediately be forged. And with a 50% false negative rate (missing a suspect 50% of the time), the system seems hardly worth using.

    I have not traveled by air since returning from Europe on September 19 (delayed from Sept. 12).

    In the past, I would have flown between the San Francisco Bay Area and the Los Angeles area (a 1-hour flight, using any of the airports on either end), but now it's actually likely to be faster to drive (around six hours each way), after including all the "waiting in line time," the increased flight delays, and of course the time to get into and out of the airports (park here, rent a car there).

    To be fair, of course, a system with a 50% false negative rate is presumably able to detect "known suspects" 50% of the time, which is almost certainly much better than human beings will ever do. Of course, the tests are probably being conducted under very favorable conditions, with an extremely small sample of "suspects." And of course, if the false-positives were equally distributed, we'd all be willing to suffer a one-in-a-thousand delay, if it actually had any meaningful benefit. (But we know that the false-positives won't be equally distributed, they will mostly affect persons in certain ethnic groups or with beards, etc., and while that means I'm less likely to be inconvenienced, I can't tolerate a system that punishes people for their skin color or ethnic background.)

    What's scary, to me, is that we are giving up so much (in many little bits and pieces) for so little benefit. On Saturday, I discovered that I couldn't use the restrooms in the BART (train) stations again, because they were closed to prevent someone from planting a bomb in them. Okay, so I had to hold it for an hour until I got home, big deal. And armed troops in the airports, and on bridges, okay, I can live with that one thing. And I can't drop off my express mail without handing it to a postal clerk now.

    But ding, ding, ding, we add up all the little "show-off" gimmicks and what we face is a huge impact that provides pretty much zero actual benefit. All the gimmicks combined might provide about 1% or 10% improved safety, at a much greater cost.

    While I was stuck in London during the week after September 11, I worried that things would change for the worse, not because of things that terrorists did, but because of the things we would do out of panic and fear and prejudice and idiocy. Things are nowhere near my worst fears, but I think things are very bad, and ultimately I believe that the terrorists have already "won" by causing most Americans to change multiple aspects of our "way of life."

  • by MikeD83 ( 529104 ) on Monday May 27, 2002 @01:01AM (#3589641)
    At the metal detector a passenger's picture is taken. It is then compared to the database of known criminals.
    A security guard sits in front of a computer next to the x-ray machine, ready for a positive match.
    If you look nothing like the person (different race or something like that), you would be let through to the gate and never even know you were flagged.
    If it may be a good match, you get stopped. The operator already has some information about the criminal in front of him and can do a quick on-the-spot check. One thing criminals are notorious for is tattoos. If the passenger doesn't have them (or signs of removal surgery), let them go. If the passenger is a very close match, do a more thorough examination.
    Every night there can be an audit of the matches to make sure the security personnel are doing their job. The system seems very effective to me.

    The system by Visionics looks at 80 different facial characteristics. The configuration used by the airport only needed 14 matches to produce a positive. This appears to be a software setting and could probably be lowered to produce more positives. Even if they are false positives, the system I mentioned above would do the job.
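
Lowering the required number of matching features trades false negatives for false positives. A minimal sketch of that tradeoff; only the "80 features, 14 required" figures come from the article, and the per-feature agreement rates are simulated assumptions:

```python
# Threshold-tradeoff sketch: with 80 scored facial features, requiring
# fewer feature matches flags more people (more false positives) but
# misses fewer real matches (fewer false negatives). Simulated data.
import random

random.seed(42)
FEATURES = 80

def matching_features(true_match: bool) -> int:
    """Simulate how many of the 80 scored features agree for one comparison."""
    p = 0.6 if true_match else 0.1   # assumed per-feature agreement rates
    return sum(random.random() < p for _ in range(FEATURES))

impostors = [matching_features(False) for _ in range(10_000)]
genuine = [matching_features(True) for _ in range(10_000)]

for threshold in (14, 25, 40):
    fp = sum(s >= threshold for s in impostors) / len(impostors)
    fn = sum(s < threshold for s in genuine) / len(genuine)
    print(f"threshold {threshold:2d}: false-positive {fp:.1%}, false-negative {fn:.1%}")
```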
  • Re:Unpopular View (Score:4, Insightful)

    by TheCage ( 309525 ) on Monday May 27, 2002 @01:08AM (#3589662)
    But what about the fact that it doesn't solve the problem? This is presumably to stop terrorism, yet a significant number of terrorists are not part of any database, which is required for something like this to work. Seems just like a waste of money to me.
  • Look-alikes? (Score:5, Insightful)

    by TheSHAD0W ( 258774 ) on Monday May 27, 2002 @01:40AM (#3589738) Homepage
    Here's one for you: What would you do if you looked like a terrorist?

    Let's say, some time in the future, they get the face-scanning technology to work right. 0.000001% false-positive rate. And it's implemented all over the US.

    Let's also say that, among the 250 million people in the United States, one or more people had facial structures similar enough to terrorists' that they would trigger those scanners. In fact, they'd trigger every scanner that person was surveiled by. And let's say that person were you.

    What would you do?

    You couldn't go to an airport. You couldn't go to a major public attraction. You probably couldn't go to a public place without fear of some alarm going off and people waving automatic weapons in your face. Would you cower at home? Would you wear a bag over your head? Would you sue the US government? How would you cope?
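
The arithmetic behind this scenario is short. Taking the comment's own figures (a 0.000001% systematic false-positive rate across 250 million people):

```python
# How many people would a "0.000001%" systematic false-positive rate
# condemn to tripping every scanner? Figures from the comment itself.
population = 250_000_000
systematic_fp_rate = 0.000001 / 100   # 0.000001% as a fraction

chronically_flagged = population * systematic_fp_rate
print(f"People who look like a terrorist to every scanner: ~{chronically_flagged:.1f}")
# About 2.5 people nationwide: a vanishingly small rate, but for those
# few individuals the questions above are not hypothetical at all.
```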
  • by jonman_d ( 465049 ) <nemilar.optonline@net> on Monday May 27, 2002 @01:41AM (#3589742) Homepage Journal
    I think the problem with security these days is that many people are looking for a one-solution-fixes-all type of thing. People need to realize that (and geeks know this, we do it on our computers ;) there are, and should be, multiple layers of security.

    Employing facial recognition is just one thing we can do - granted, we need to get the technology to work better, but we need to realize that it's multiple systems working together that is going to stop terrorists, not one or two "miracle systems."
  • 99.99% accurate?? (Score:3, Insightful)

    by FaithAndReason ( 112179 ) on Monday May 27, 2002 @01:48AM (#3589758)
    I suspect that your own accuracy rate is not nearly as high as you believe it is.

    First, as you state, that 99.99% accuracy rate only applies to a group of people you meet regularly; this probably includes perhaps a few hundred people, and a significant part of your total memory and processing capability is devoted to recognizing and updating your memory of those faces (check out a brain map for how much of our cortex is dedicated to face recognition). Even duplicating that feat (i.e. identifying a small group of faces) would be a major undertaking for a computer system.

    Second, that 99.99% isn't nearly as impressive as it sounds, because it represents the positive rate, i.e. the chance that you will correctly identify an individual in the target population. That corresponds to a false negative rate of 0.01% -- you're saying that once in ten thousand times, you'll actually fail to recognize somebody you see on a regular basis. Not too encouraging, that.

    Third, that figure says absolutely nothing about the false positive rate, which I suspect is much higher. In other words, how often do you see somebody that you think you recognize, but can't quite remember exactly? From my own experience, I would say that number is as high as one in a hundred. Our own built-in face recognition system is simply designed that way -- to generate a large number of "near misses".

    So, the bottom line is: even the supposedly high accuracy of human facial recognition isn't accurate enough, and undoubtedly doesn't scale very well.
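
Putting the comment's three rates side by side makes the point concrete. A minimal sketch, with the daily encounter counts assumed for illustration:

```python
# Sketch: the comment's estimated human recognition rates applied to an
# assumed day of encounters. Encounter counts are illustrative assumptions.
known_faces_seen = 50          # familiar people encountered today (assumed)
strangers_seen = 1_000         # strangers passed on the street (assumed)

false_negative_rate = 0.0001   # misses 1 in 10,000 familiar faces
false_positive_rate = 0.01     # "think I recognize" 1 stranger in 100

missed_friends = known_faces_seen * false_negative_rate
mistaken_strangers = strangers_seen * false_positive_rate
print(f"Familiar faces missed today:  ~{missed_friends:.3f}")
print(f"Strangers 'recognized' today: ~{mistaken_strangers:.0f}")
# The false-positive side produces thousands of times more errors than
# the false-negative side, even for a recognizer as good as a human.
```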
  • False positives (Score:5, Insightful)

    by Restil ( 31903 ) on Monday May 27, 2002 @01:48AM (#3589759) Homepage
    We're not talking about using this technology to make courtroom identifications. We're using it to notify security that you MIGHT have someone in front of you who is of less than reputable character. This doesn't mean you immediately cuff him and throw him in jail, but if he tries to walk through a screener checkpoint it MIGHT be a good idea to do a little better check than a simple wand wave. In the meantime, someone can be checking the pictures to see if that person's face actually matches the match the computer made. With a 0.1% false positive rate, you could have a couple of paid employees just looking at matching pictures to see if there's really cause for concern or not. At the rate people go through screening checkpoints now, they'll get a "match" about once every 10 minutes or so; your mileage may vary at larger airports, it's all a matter of scale (a sketch after this comment works through the numbers).

    As for false negatives, even 50% is better than nothing, as long as the false positive rate is much, MUCH lower. Imagine catching 50% of the hijackers on September 11 before they boarded the planes. A lot of red flags could have gone up, flights could have been delayed, and the rest of the passengers could have been more carefully scrutinized. No, this is not the solution to any problem. And no, it should not carry legal weight any more than a lie detector does. It's a guide. It tells us where we might need to concentrate more of our efforts.

    As far as threats to privacy go, this makes sense in an airport, but it does not make sense out on the street. People go into an airport expecting to be searched, questioned, carded, etc. They do not have the same expectation while walking down the street. So unless the cops are chasing someone, lose him, and you bear a striking resemblance, they shouldn't bother you at all.

    -Restil
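
The once-every-10-minutes figure follows directly from checkpoint throughput. A minimal sketch (the airport-wide throughput is an assumed number):

```python
# Alarm-rate sketch: how often screeners see a face-match alert.
# The throughput figure is an assumption, not from the article.
passengers_per_hour = 6_000     # assumed airport-wide screening throughput
false_positive_rate = 0.001     # 0.1%, as discussed above

alarms_per_hour = passengers_per_hour * false_positive_rate
print(f"False alarms per hour: {alarms_per_hour:.0f}")        # 6
print(f"Minutes between alarms: {60 / alarms_per_hour:.0f}")  # 10
# One alarm every ten minutes at this volume, matching the comment's
# estimate; a couple of employees comparing photos could plausibly keep up.
```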
  • Re: Unpopular View (Score:3, Insightful)

    by Black Parrot ( 19622 ) on Monday May 27, 2002 @02:05AM (#3589795)


    > To be sure, I don't want computers flagging people to be arrested. But computers sift through enormous amounts of information, making them ideal for a first pass. If they are used to flag people to be scrutinized by humans, I don't have any objections.

    Are those humans going to be highly-trained well-paid experts like those who work airport security?

    The basic expectation is that the human 'supervisors' will adopt a strategy of either (a) waving through everyone that the computer identifies, because they're tired of taking the heat for false positives, or else (b) calling for the cops every time the computer identifies someone, so they won't have to take the heat if a terrorist does get through. (Interesting problem, that. I would guess that we would see a lot of variety in individual behavior early on, after which it would settle into a state where all 'supervisors' behave the same. Presumably that would be state (a), except during 'alerts' and for relatively short periods after real incidents.)

    The only optimal position between those extremes is to get it right almost every time, i.e. to have a real expert (or team of experts) looking over the computer's shoulder. And I seriously doubt that society is going to be willing to pay for that.

  • What Bothers Me... (Score:3, Insightful)

    by Chasing Amy ( 450778 ) <asdfijoaisdf@askdfjpasodf.com> on Monday May 27, 2002 @02:08AM (#3589799) Homepage
    What bothers me isn't just the "false positives," but the plain positives as well. Most of these things, like the ones being deployed in NYC at "possible terrorist targets," are using parts of the FBI database for their facial recognition capabilities. Well, any American who's ever even been accused of a felony in recent years, even if he was never convicted, is in the FBI database.

    I'm in that database because when I was an 18 year old high school senior I committed the high crime of having had consensual sex with my girlfriend, who was a year and a half younger than I was. It's bad enough to get charged with a felony for consensual sex with a partner who's within 2 years of your own age, but now maybe I'll get harassed when I go to national monuments or big events because of hits in facial recognition software. In theory the facial recognition technology will only be hooked into a partial database of certain types of people. In practice, I doubt they'll be very selective.

    What if you got arrested as a teenager for having a small amount of marijuana? What if you were accused of assault for a minor altercation? What about any number of minor infractions which would still have landed your face in the FBI database? My guess is that, as the technology gets better and more discriminating in the field, the parts of the FBI database used will get wider until the full database gets scanned.

    So, it's not just false-positives that are a worry, but positives against people with very minor infractions that have still landed them in the FBI database. Should you get shaken down by some overzealous dweeb who thinks you're dealing drugs because 10 years ago you got caught with your personal stash of green? And what of the potential for abuse of sensitive personal data?

    Now that this particular can of worms has been opened under the excuse of 9/11, it's only going to get bigger and more invasive. First they'll assure us the database they're using only has "violent" criminals in it. Then it'll be only felons. Next it's the whole FBI database, including all the pictures of people whose parents were stupid enough to fingerprint and photograph their children and submit a packet voluntarily "to protect your children in case of abduction," and DMV databases as well.

    Is it just me, or is it getting kind of Orwellian in here?
  • by Jeppe Salvesen ( 101622 ) on Monday May 27, 2002 @03:37AM (#3589935)
    Where were you guys in stats class?

    If you took this technology, made it match on too many faces, and then had someone manually double-check the potential matches, you would have a kick-ass system.

    Like all powerful technology, its use must be ethical.
  • by UberOogie ( 464002 ) on Monday May 27, 2002 @04:12AM (#3589986)
    This is an excellent point on why there will never be a technical cure-all for this problem, especially now that airport security is federalized. The backbone of government employment is avoiding responsibility for bad things. The level of "urgency" (for lack of a better word) cannot and will not be kept up by the people in charge of security, which will render any technical solution useless.

    Without human supervision, there will be too many false positives for the average person to stand for. Without *diligent* human supervision, the false negatives will slip through too easily.

    Not that I'm necessarily being critical of the security employees. It is only human nature. How many security checks and stops did you happily (or at least understandingly) endure in the months after September 11th that you grouse about now? Keeping security personnel at top alert all the time is the problem they should be working on. That and getting the INS to do their job.

  • Re:False positives (Score:2, Insightful)

    by Llywelyn ( 531070 ) on Monday May 27, 2002 @05:08AM (#3590080) Homepage
    Under the spreading chestnut tree
    I sold you and you sold me
    There lie they and here lie we
    Under the spreading chestnut tree
  • by Zero__Kelvin ( 151819 ) on Monday May 27, 2002 @09:28AM (#3590475) Homepage


    By reacting the way we are in the U.S., Osama Bin Laden is getting exactly what he was aiming for. He wanted to destroy the American way of life, and by removing the freedom and civil rights the way we are, he is achieving his goal. There is no longer any need for him to act. We have met the enemy, and it is U.S.
