Face-Scanning Loses by a Nose in Palm Beach 232
Rio writes: "A story from myCFnow.com reports that Palm Beach International Airport officials said
face-scanning technology will not become part of their airport's security system."
Looks like
the ACLU was right.
Checking a database of 15 employees, the technology gave false negatives -- failed to recognize the test subjects -- over 50% of the time. A spokesperson said, "There's room for improvement." The Pentagon said
the same thing
in February. The false-positive rate is more important -- it isn't mentioned, but even if it were just 0.1%, Bruce Schneier argues,
it'd be useless.
That explains alot... (Score:4, Funny)
only 15 employees? (Score:2, Insightful)
Re:only 15 employees? (Score:4, Informative)
If it can't identify 1 of 15, then what chance has it got of finding 1 person out of millions?
Re:only 15 employees? (Score:1)
Try telling the Aussies that. (Score:4, Informative)
Airport face identification isn't practical? Try telling the Australian Government that. [news.com.au] They are trialling a hybrid face-recognition/biometric passport system that sends shivers up my spine.
what does this do (Score:4, Interesting)
If anything, it should be a call for all Americans to protest this kind of thing (should you disagree with it).
oh, but i want funding! (Score:1)
Re:oh, but i want funding! (Score:1)
Re:oh, but i want funding! (Score:1)
Human oversight (Score:2, Informative)
It's exactly the system casinos have successfully deployed to keep known "cheaters" out. The face-scanning technology merely provides POSSIBLE matches; the actual decision on further investigation rests with a human operator...
This seems perfectly reasonable to me from a technology standpoint, I'll argue the ethics of this technology some other time:)
Re:Human oversight (Score:2)
Re:Human oversight (Score:2)
Yep, we need a national ID card. We really do...
sPh
good idea...now extend this (Score:3, Insightful)
Re:good idea...now extend this (Score:1)
(I don't live in Multnomah County, but do pass through it every day on my way from home in Clackamas County to work in Washington County. Yes, my commute sucks.)
More info here. [epic.org]
Re:good idea...now extend this (Score:2)
False positive rate (Score:3, Interesting)
(Two of the false positives even got the sex of the suspect wrong)
Since they state that it was the first days, perhaps it just needed tuning?
It's 0.1%, not 0.01% (Score:2, Insightful)
If a person is mistaken once for a terrorist (or a "normal" criminal), don't you think other recognition points will make the same mistake? What do you do then? Plan a few extra hours each time you take the plane? Get an official letter stating "I'm not a terrorist"? If a simple letter can get you through, terrorists will get some.
Re:It's 0.1%, not 0.01% (Score:2)
What can you do? Curl up and die. (Score:2)
What can you do? How about:
In other words, just give up any chance of ever living without fear again.
I sincerely hope you're just being a troll, because if facial recognition were ever to be widely implemented, the above would be a way of life for tens of thousands of perfectly law-abiding citizens in this country, or wherever else it was implemented.
If you really don't think it matters, I'll tell you what: send me a couple of photographs of yourself, in the classic mug shot poses, and within a week I'll have you in that wonderful little FBI database with nice little TERRORIST notes all over your file (all it takes is unsubstantiated rumors these days.) Then we'll see how much you enjoy traveling...
Re:What can you do? Curl up and die. (Score:2)
No, it's 99.99% Read Cryptogram (Score:4, Informative)
Bruce talks about 99.9%, so there's 0.1% left, not 0.01% as the story says right now.
No, sorry, just read Bruce's Cryptogram [counterpane.com]
Suppose this magically effective face-recognition software is 99.99 percent accurate. That is, if someone is a terrorist, there is a 99.99 percent chance that the software indicates "terrorist," and if someone is not a terrorist, there is a 99.99 percent chance that the software indicates "non-terrorist." Assume that one in ten million flyers, on average, is a terrorist. Is the software any good?
No. The software will generate 1000 false alarms for every one real terrorist. And every false alarm still means that all the security people go through all of their security procedures. Because the population of non-terrorists is so much larger than the number of terrorists, the test is useless. This result is counterintuitive and surprising, but it is correct. The false alarms in this kind of system render it mostly useless. It's "The Boy Who Cried Wolf" increased 1000-fold.
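Schneier's arithmetic is easy to reproduce. Here's a minimal sketch; the figures come straight from the quote above (the one-terrorist-per-ten-million-flyers rate is his assumption, not data):

```python
# Base-rate arithmetic from Schneier's example: 99.99% accuracy applied
# both ways (terrorist flagged as terrorist, non-terrorist cleared),
# with 1 terrorist per 10 million flyers assumed.

flyers = 10_000_000
terrorists = 1                      # one in ten million, per the quote
accuracy = 0.9999
false_positive_rate = 1 - accuracy  # 0.01% of innocents flagged

true_alarms = terrorists * accuracy
false_alarms = (flyers - terrorists) * false_positive_rate

print(f"false alarms per real terrorist: {false_alarms / true_alarms:.0f}")
# roughly 1000 false alarms for every genuine hit
```

The counterintuitive part is that accuracy never enters the final ratio in your favor: the innocent population is seven orders of magnitude larger than the target population, so even a tiny per-person error rate swamps the one true positive.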
Re:No, it's 99.99% Read Cryptogram (Score:2)
What a sensible system would do would be notify another system which is better at identifying faces - we call it a human being. Then they can check, and where appropriate take further steps. Which may or may not involve having you arrested and executed without trial.
What it means is that instead of having to check ten million flyers, the security people have to check 1000, which is far more feasible. I'd argue that false positives are a lot less harmful than false negatives in such systems, provided positive is treated as "take a closer look" rather than "terminate with extreme prejudice"...
Broken promise ring (Score:2)
Re:Broken promise ring (Score:3, Insightful)
Atta (the scary-looking ringleader) had previously been arrested in Israel for being a terrorist. He was released as part of Bill Clinton's mideast "peace" initiative, but was still on various US gov't lists of terrorists.
If the INS weren't totally useless, and if the FBI, FTC, etc. shared information, they would have been deported when they were caught being here illegally, driving with an expired license, failing to show up for court, or buying airline tickets.
Tom Daschle and the Democrats want to blame George Bush because the FBI and CIA, in hindsight, had the information to see this coming.
The real tragedy is that they, and thousands of others, were here illegally, and we did nothing.
Re:Broken promise ring (Score:2)
But the true sleepers involved in September's attacks as well as people we have no idea about are the sort that will pass through the cracks of any sort of database system. Joe Terrorist moves to the US or just gets a visa to live here and goes about his business and never gets so much as a speeding ticket. Then one day an e-mail turns up talking about enlarging his penis and has a particular picture on it and the appropriate code words. He then builds a bomb and blows somebody up. Other than a group of telepaths or time traveling cops how are you going to screen people coming into the US to see if they are a terrorist deep down? Facial recognition is just going to show that Joe Terrorist has no criminal record to speak of and pays his taxes. It isn't going to tell you his backpack has twenty pounds of home made explosives.
Re:Broken promise ring (Score:2)
1) If you travel with a false identity you HAVE A PROBLEM
2) If you are not registered, you HAVE A PROBLEM
3) Whoever you are and whatever identity you are using, they can trace all the locations you've traveled. You are the same guy everywhere. Your passport is your body.
I guess I'll be investing in those biomask companies soon
Re:Broken promise ring (Score:2)
I could only see this system as an aid in specific cases.
Re:Broken promise ring (Score:2)
The more methods you have to detect a person, the better. Imagine that a terrorist may have to use a mask, alter his fingerprints, forge a fake ID, use special contact lenses, change his height, etc. It would be a nightmare for this guy to travel undetected. OK, if you don't yet know he is a terrorist, it doesn't help you detect him. But if you suspect he might be into something, you can trace his movements in the past (better than relying on a faked ID alone) and in the present/future.
It can't be bad, as long as it's not the only means of security. So I think we may agree...
Re:Broken promise ring (Score:2)
That doesn't stop face recognition from having potential in areas where some people *are* in the database, including not only terrorism but also missing persons, wanted (non-terrorism) criminals, football hooliganism etc.
Unpopular View (Score:5, Insightful)
Incidentally, by this reasoning, it is in fact the false negatives that are more important. False positives can presumably be discarded by humans providing closer scrutiny. False negatives in this scenario, however, present a major difficulty.
Face scanning technology isn't innately evil. Like everything else, if we use it wisely, it can help. If we use it irresponsibly, it can hurt. No surprises there.
-db
Re:Unpopular View (Score:3, Insightful)
Re:Unpopular View (Score:5, Insightful)
I don't necessarily understand the objections to face scanning technology. [...] Like everything else, if we use it wisely, it can help. If we use it irresponsibly, it can hurt.
You just hit the nail on the head there; most people who don't like this technology don't like it because (they believe) it will be used irresponsibly, eventually if not immediately. Power corrupts, as the old saying goes, and people are unfortunately easily corruptible. Ordinarily I wouldn't be quite so pessimistic, but given all the hoopla over the "War on Terrorism", I'm inclined to side with the Slashdot popular view.
(Note to moderators: Yes, I do realize that there are many points of view represented on Slashdot, thankyouverymuch.)
Re:Unpopular View (Score:1)
The main reason I don't like facial scanning is quite simple. I view it as a slippery slope -- we start scanning for a few "bad guys" now, and what happens a few years down the road when it becomes feasible to scan everyone to make sure they're not doing something "wrong"? If we give our government the power to watch us all the time, we've given up the ability, guaranteed to us in the Constitution, to think and speak freely. If you've never read 1984 you really need to. The descriptions of the lengths to which the man in the book went to avoid being observed will drive you nuts -- and make you really think about where this is going. Orwell was off by a few years -- but it wouldn't surprise me if it turns out he was only wrong by about 20-25 years.
Re:Unpopular View (Score:4, Insightful)
Re: Unpopular View (Score:3, Insightful)
> To be sure, I don't want computers flagging people to be arrested. But computers sift through enormous amounts of information, making them ideal for a first pass. If they are used to flag people to be scrutinized by humans, I don't have any objections.
Are those humans going to be highly-trained well-paid experts like those who work airport security?
The basic expectation is that the human 'supervisors' will adopt a strategy of either (a) waving through everyone the computer flags, because they're tired of taking the heat for false positives, or (b) calling the cops every time the computer flags someone, so they won't have to take the heat if a terrorist does get through. (Interesting problem, that. I would guess we would see a lot of variety in individual behavior early on, after which it would settle into a state where all 'supervisors' behave the same. Presumably that would be state (a), except during 'alerts' and for relatively short periods after real incidents.)
The only optimal position between those extremes is to get it right almost every time, i.e., to have a real expert (or team of experts) looking over the computer's shoulder. And I seriously doubt that society is going to be willing to pay for that.
Avoiding Responsibility (Score:3, Insightful)
Without human supervision, there will be too many false positives for the average person to stand for. Without *diligent* human supervision, the false negatives will slip through too easily.
Not that I'm necessarily being critical of the security employees. It is only human nature. How many security checks and stops did you happily (or at least understandingly) endure in the months after September 11th that you grouse about now? Keeping security personnel at top alert all the time is the problem they should be working on. That and getting the INS to do their job.
Society will pay, Bushment won't (Score:3, Interesting)
If you start to think about it, wouldn't you say that the Bush administration should be thankful for the 9/11 attacks? Now Bush can do what he does best: show strong leadership. We all remember his campaign speeches, right?
However, what kind of strong leadership has he shown? He has reconfirmed his alliance with Pakistan, a country run by a general who took power in a military coup, under the banner of "protecting freedom". He needed to do this in order to punish the Taliban.
Now his poor judgement may very well be biting him in the ass. Pakistan has long offered support for the resistance movement in India-controlled Kashmir. How this support has manifested itself in real life is a matter of debate. However, India does not think Pakistan has done enough to crack down on the separatists in Kashmir after the attack on the Indian parliament in December. Consider it comparable to a band of terrorists attempting to storm Capitol Hill, and then having the nation the terrorists came from refuse to stop supporting those same forces.
What else goes on in Pakistan? Every once in a while, you'll see small or large reports about how parts of the Pakistani intelligence service are sympathetic to Al Qaeda and the Taliban. Wonder how Mullah Omar got away? He travelled with a pile of money, paying off warlords that the USA trusted for free passage.
Rather than effectively fighting terrorism abroad, your government seems to favor disclosing every non-specific, non-corroborated terrorist threat, complete with security checkpoints that close down this or that because of a suspicious package.
It's looking bleak, folks. Any good conspiracy theorist (or reader of 1984 by G. Orwell) will tell you that keeping people afraid is a good way of controlling their ability to think rationally.
Oh, and would you like to know what I believe to be the ultimate terrorist strike? Trigger a landslide off the continental shelf along the Californian coast. According to Discovery Channel, the ground shows signs of previous landslides. One or more large-scale landslides could trigger a huge tsunami that could wipe out portions of the coastal areas along the Californian coast. What materials are required? Honestly, I don't know, but I'm guessing a few recreational boats with primitive depth charges or timed mines would have a pretty good chance of triggering something if they had a good geological report.
I hope I didn't make any Californians piss their pants. I'm just speculating. And I hope I won't have any government agency knocking on my door tonight.
Then again, the most effective portion of the WTC attack might be the fallout. America is marginalizing itself, giving the rest of us ever fewer reasons to really like the American government. (I like Americans, btw).
Re:Unpopular View (Score:2)
I do. If some dick attacks me on Saturday night and is clocked by a security camera, then he's spotted in the Mall the next day I want the police to know about it.
On the other hand - when the UK becomes a complete police state (in about 6 months at the current rate) I DON'T want to have to cover my face going into a 'subversive' bookshop for fear of being arrested and questioned about my support for 'the way of Tony'.
Ah, dilemmas -- where would we in the rich West be without them?
Face Recognition or Feed People
Daddy or Chips
Daddy or Chips
Daddy or Chips
Chips!
False sense of security! (Score:2)
Oops, what you (and many others, apparently) seriously fail to see is that all these face scanners can produce is a false sense of security. If every airport used such a device, it would be pretty damn easy for any terrorist or other criminal to modify their face enough that the scanner would fail to recognize them (a false negative).
Re:Unpopular View (Score:2)
False positives okay (Score:2, Insightful)
It is the false negatives that are truly scary. If a known terrorist sympathizer can board a plane without setting off any signals then it is clearly a useless product.
Luckily, humans have the ability to fuzzily predict terrorist-like behavior (now that everyone's on high alert, that is).
Re:False positives okay (Score:3, Insightful)
False positives are as bad as, if not worse than, false negatives.
Re:False positives okay (Score:1)
Re:False positives okay (Score:1)
Re:False positives okay (Score:1)
Re:False positives okay (Score:2)
More perspective: That's ~610,000 people held for questioning due to false positives each year. Almost 2,000 a day. If you had to question 2,000 people a day, and you knew that out of those 2,000 people probably none of them were terrorists, how long would it take before you started doing a sloppy job? Talk about thankless work, and enormous expense.
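The arithmetic behind those figures is easy to check. A quick sketch, assuming an annual US passenger volume of roughly 610 million (my assumption, chosen to match the parent's total; the 0.1% false-positive rate is the one discussed in the story) -- the per-day figure actually comes out closer to 1,700 than 2,000:

```python
# Back-of-the-envelope check of the false-positive volume. The passenger
# count is an assumed figure; only the 0.1% rate comes from the story.

passengers_per_year = 610_000_000   # assumed annual US enplanements
false_positive_rate = 0.001         # 0.1%, from the story

flagged_per_year = passengers_per_year * false_positive_rate
flagged_per_day = flagged_per_year / 365

print(f"{flagged_per_year:,.0f} flagged per year, about {flagged_per_day:,.0f} per day")
```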
False Positives are OK (Score:1, Interesting)
Re:False Positives are OK (Score:2, Insightful)
Given the choice of a false positive in a bookshop and one at the airport I know which I would want to avoid.
Re:False Positives are OK (Score:4, Interesting)
So I cram the pants and half my groceries into my backpack, the other half in plastic bags. I leave. The alarm goes off. It occurs to me that the pants must have a security tag that I didn't remove. I glance around, and nobody even looks my direction. I proceed to leave the building.
Then I remember that I've forgotten to buy a bus pass. I go back in. The alarm goes off. I head over to the customer service counter, and shell out $56 for a little card that will enable me to get to/from work for the next month. I leave again, and the alarm goes off. I wait a few minutes for the bus, and go home.
I completely forget about the security tag until I'm wearing the pants and am on my way to catch the bus to work. I've gotten about a block when I hear a noise as I'm walking. Sure enough, there it is. I run home, try unsuccessfully to get it off, give up, change pants, and run to catch the bus. I arrive at work 15 minutes late. When I get home I finish mutilating the tag. Tough little buggers.
So anyway, the moral of the story is that those little tags are absolutely worthless if store security is asleep at the wheel.
If we want to make this technology work... (Score:2, Insightful)
I, for one, am pretty much 99.99% correct when it comes to making positive recognition of those people around me that I see often. People I haven't seen in a few years, I have more trouble identifying. Why? Because people's faces change. Facial hair, glasses (or removal of them), makeup, etc. can throw a lot of people off. Can this technology compensate for that?
I personally think that these cameras need to look at people the way we do, with two eyes. What do we get when we look at the world with two eyes? Depth perception. We can see objects in three dimensions, because we see them from two angles at once. If facial recognition computers were able to take in two separate data streams, like two cameras a foot apart, it would be possible to create a three-dimensional image of that person's face. And though it would require more computing power, it is much easier to make a positive match using three-dimensional data as opposed to two. Ever seen a perfect frontal view photograph of a person's face? Can you tell how long their nose is when you're looking at it? Isn't the length of a person's nose a significant facial feature? (Oh, and I know, if you see a person from the side, you see that, but these cameras are always only getting one angle, so they're always throwing out a lot of data. If you see a person's face from the side, you are not seeing how wide their face is, and so on.)
99.99% accurate?? (Score:3, Insightful)
First, as you state, that 99.99% accuracy rate only applies to a group of people you meet regularly; this probably includes a few hundred people, and a significant part of your total memory and processing capability is devoted to recognizing and updating your memory of those faces (check out a brian map for how much of our cortex is dedicated to face recognition.) Even duplicating that feat (i.e. identifying a small group of faces) would be a major undertaking for a computer system.
Second, that 99.99% isn't nearly as impressive as it sounds, because it represents the positive rate, i.e. the chance that you will correctly identify an individual in the target population. That corresponds to a false negative rate of 0.01% -- you're saying that once in ten thousand times, you'll actually fail to recognize somebody you see on a regular basis. Not too encouraging, that.
Third, that figure says absolutely nothing about the false positive rate, which I suspect is much higher. In other words, how often do you see somebody that you think you recognize, but can't quite remember exactly? From my own experience, I would say that number is as high as one in a hundred. Our own built-in face recognition system is simply designed that way -- to generate a large number of "near misses".
So, the bottom line is: even the supposedly high accuracy of human facial recognition isn't accurate enough, and undoubtedly doesn't scale very well.
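The parent's point -- that the false-positive rate dominates once strangers vastly outnumber the faces you know -- can be made concrete with a rough Bayes-style sketch. The hit and false-alarm rates below are the parent's own guesses; the population sizes are illustrative assumptions of mine:

```python
# Rough precision estimate for human face recognition, using the
# parent comment's guessed rates (99.99% hits, ~1-in-100 false alarms)
# and assumed population sizes.

known_faces = 200        # people you recognize reliably (assumed)
strangers = 100_000      # strangers you might encounter (assumed)
hit_rate = 0.9999        # parent's "99.99% correct" figure
false_alarm_rate = 0.01  # parent's "one in a hundred" near-misses

true_hits = known_faces * hit_rate
false_hits = strangers * false_alarm_rate

precision = true_hits / (true_hits + false_hits)
print(f"chance a 'recognition' is actually correct: {precision:.1%}")
```

With these numbers, fewer than one in five "recognitions" is real -- the same base-rate effect Schneier describes, operating inside our own heads.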
Re:99.99% accurate?? (Score:2, Funny)
check out a brian map for how much of our cortex is dedicated to face recognition
How much is used for transposing of letters? :)
Re: brain = abstraction (Score:2)
If you don't believe me, try to draw a portrait of a close friend with pencil and paper. You'll find out you can't, or that it doesn't correspond to the real look. It's NOT that you can't draw (you could perfectly copy it if you had a B&W photograph). The thing is that you really abstract the look and only store tiny bits of angles, distances, colors, patterns, movements, and facial expressions.
You don't even know WHAT you are storing in the first place. Perception and pattern-matching are very complex things, and far different from what one might guess.
Re:If we want to make this technology work... (Score:2)
Unfortunately, this apparently simple statement is not as true as it would seem to you, a human being equipped with staggeringly immense computational power and a brain specially equipped for this very task.
In vision, there are two problems (at least). One is the usual problem of creating algorithms that can recognize things. The other is the staggering amount of data these algorithms must cope with.
Many common vision applications (by which I mean not necessarily face recognition) involve taking the picture, which may start life at any resolution you please, sampling it down to 32x32 or 64x64 (if you're willing to stretch), dropping down to 4 or 6 bits color, and proceeding to do the analysis on this exponentially smaller sample size.
Facial recognition algorithms do not always (often?) do this, but the problem of dealing with immense amounts of data does not go away. It simply exists in different ways. You're still trying to get invariant data (recognizing "bob #2423" no matter what Bob is doing to fool the camera) out of a domain that has 2^(100*100*24) possible images (for a 100x100 full-color RGB image; keep going up if you want something larger than 100x100, which is barely ID-photo sized.)
Throwing more data at the problem does not necessarily get you ahead. You must always throw out the vast majority of it anyhow to get any real work done.
(Also, you may be surprised; depth perception in humans is an interesting field of study. Less of it comes from your eyes than you may think; most of it comes from image processing. Your binocular vision has effectively no discrimination past six feet or something like that; I'd have to look up the exact number, but it's shorter than most people would think.)
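The data-reduction step described in that comment can be put in numbers. A tiny sketch using the comment's own figures (100x100 24-bit RGB raw, subsampled to 32x32 at 6-bit color):

```python
# How much data a typical subsampling step throws away, using the
# figures from the comment above.

raw_bits = 100 * 100 * 24    # 100x100 full-color RGB image
small_bits = 32 * 32 * 6     # 32x32 image at 6-bit color

print(f"raw domain:   2^{raw_bits} possible images")
print(f"downsampled:  2^{small_bits} possible images")
print(f"fraction of raw data kept: {small_bits / raw_bits:.1%}")
```

Even the subsampled version still has an astronomically large domain; the point is that the algorithm only ever works with a few percent of the raw pixels.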
Re:If we want to make this technology work... (Score:2)
I probably ought to clarify that. In this domain the computer can indeed get a good depth-perception shot if it wants. My point is that even humans make less use of this data than you might think, even at close range. Giving it to a computer adds new problems (handling that data), which may or may not be helpful anytime soon.
"Some people, when confronted with a problem, think ``I know, I'll use regular expressions.'' Now they have two problems." - jwz [jwz.org]. It's similar to this, I think. Merely throwing more data at a vision problem often adds to the problem list more than it takes away, at our present state of knowledge.
(Of course all of this is moot anyhow, because the math says even a human being isn't accurate enough to function as a facial recognition system anyhow. Computers aren't going to solve the problem. Nothing ever will. The math says it's impossible.)
Re: If we want to make this technology work... (Score:2)
> I, for one, am pretty much 99.99% correct when it comes to making positive recognition of those people around me that I see often.
I certainly am not an expert in these matters, but based on half a lifetime's self-observation, I'm pretty sure that your recognition of your fellow humans is based on subtleties of appearance and mannerisms rather than on some hyper-analytical form-matching mechanism.
I know that on several occasions I have been in a grocery store or somewhere and caught a former schoolmate out of the corner of my eye, recognizing him or her instantly. But as I approach to say 'hi' I get a better look and suddenly think that I have mis-recognized a stranger instead of correctly recognizing a former associate. It's only on the third or fourth look that I decide for sure that I should go ahead and say 'hi'.
Also notice the frequent situation where half your friends think Little Joey looks like Mom and the other half think he looks like Dad. I hypothesize that that's because some are looking at (say) the shape of his nose and others are looking at (say) the shape of his eyes. I.e., humans apparently recognize people on a fairly arbitrary subset of subtle cues rather than matching a remembered 3-D 'mask' to their faces.
As in so many other fields of AI, the technology that's on the market today falls far, far short of the basic abilities that humans -- and animals -- take so much for granted.
I wonder what today's best technology could actually deliver. If you set a threshold of (say) a maximum of 0.1% false positives, what are the chances of actually recognizing someone in your criminal/terrorist database if they are actively trying not to be recognized? I suspect the performance is going to be pretty dismal.
About Object and Face Recognition (Score:2)
He starts by describing basic object recognition, then theorizes on how face recognition both builds on those basics and yet differs from the recognition of, seemingly, all other types of objects.
Troubling (Score:1)
Besides the infringement on civil liberties, what was troubling to me about the scanners is the reduction of a face to a mathematical sequence... meaning, quite literally, that we're just another number. How depressing.
I got only one thing to say .... (Score:1)
IN YOUR FACE!
heh.
Grousing... (Score:2, Offtopic)
Gee, only beat this submission by about a month.
False positives, false negatives, and wasting time (Score:5, Insightful)
As noted, there can be no "get past ID check free" letter or ID card, since those would immediately be forged. And with a 50% false negative rate (missing a suspect 50% of the time), the system seems hardly worth using.
I have not traveled by air since returning from Europe on September 19 (delayed from Sept. 12).
In the past, I would have flown between the San Francisco Bay Area and the Los Angeles area (a 1-hour flight, using any of the airports on either end), but now it's actually likely to be faster to drive (around six hours each way), after including all the "waiting in line time," the increased flight delays, and of course the time to get into and out of the airports (park here, rent a car there).
To be fair, of course, a system with a 50% false negative rate is presumably able to detect "known suspects" 50% of the time, which is almost certainly much better than human beings will ever do. Of course, the tests are probably being conducted under very favorable conditions, with an extremely small sample of "suspects." And of course, if the false-positives were equally distributed, we'd all be willing to suffer a one-in-a-thousand delay, if it actually had any meaningful benefit. (But we know that the false-positives won't be equally distributed, they will mostly affect persons in certain ethnic groups or with beards, etc., and while that means I'm less likely to be inconvenienced, I can't tolerate a system that punishes people for their skin color or ethnic background.)
What's scary, to me, is that we are giving up so much (in many little bits and pieces) for so little benefit. On Saturday, I discovered that I couldn't use the restrooms in the BART (train) stations again, because they were closed to prevent someone from planting a bomb in them. Okay, so I had to hold it for an hour until I got home, big deal. And armed troops in the airports, and on bridges, okay, I can live with that one thing. And I can't drop off my express mail without handing it to a postal clerk now.
But ding, ding, ding, we add up all the little "show-off" gimmicks and what we face is a huge impact that provides pretty much zero actual benefit. All the gimmicks combined might provide about 1% or 10% improved safety, at a much greater cost.
While I was stuck in London during the week after September 11, I worried that things would change for the worse, not because of things that terrorists did, but because of the things we would do out of panic and fear and prejudice and idiocy. Things are nowhere near my worst fears, but I think things are very bad, and ultimately I believe that the terrorists have already "won" by causing most Americans to change multiple aspects of our "way of life."
Re:False positives, false negatives, and wasting t (Score:2)
While I was stuck in London during the week after September 11, I worried that things would change for the worse, not because of things that terrorists did, but because of the things we would do out of panic and fear and prejudice and idiocy.
You say this, but further up your post you said this...
I have not traveled by air since returning from Europe on September 19 (delayed from Sept. 12).
Your reaction is one of "panic and fear and prejudice and idiocy". Having travelled extensively, both in and outside of US airspace, I can say the security on internal US flights is still worse than on internal flights in Europe.
So you've let a bunch of terrorists stop you flying, that's the reaction they wanted, why are you giving in...?
Al.
Re:False positives, false negatives, and wasting t (Score:2)
In the past, I would have flown between the San Francisco Bay Area and the Los Angeles area (a 1-hour flight, using any of the airports on either end), but now it's actually likely to be faster to drive (around six hours each way), after including all the "waiting in line time," the increased flight delays, and of course the time to get into and out of the airports (park here, rent a car there).
I've flown a couple of times since Sept. 11, and the only noticeable slowdown I've experienced is the removal of curbside check-in. The LAX security checkpoint is faster now than before; more security goons + more metal detectors = better throughput.
Oh, and btw, when the check-in person asks you if you packed your own luggage and watched it at all times, "I didn't bring any luggage" is not the answer they want to hear... I'd try "mu" next time, but I think they'd be even less amused.
--
Benjamin Coates
Re:Who watches...? (Score:2)
--
Benjamin
The Ultimate System (Score:2, Insightful)
A security guard is sitting in front of a computer next to the x-ray machine ready for a positive match.
If you look nothing like the person (different race or something like that), you would be let through to the gate and not even know you were positively identified.
If it may be a good match, you get stopped. The operator already has some information about the criminal in front of him and will do a quick on-the-spot check. One thing that criminals are notorious for is tattoos. If the passenger doesn't have them (or signs of removal surgery), let them go. If the passenger is a very close match, do a more thorough examination.
Every night there can be an audit of the matches to make sure the security personnel are doing their job. The system seems very effective to me.
The system by Visionics looks at 80 different facial characteristics. The configuration used by the airport only needed 14 matches to produce a positive. It seems this is a setting in software and could probably be lowered to produce more positives. Even if they are false positives, the system I mentioned above would do the job.
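The threshold trade-off described above can be sketched as a toy model. Only the 80-characteristic and 14-match figures come from the post; the feature representation, tolerance, and matching logic below are made up for illustration and bear no relation to the actual Visionics algorithm:

```python
import random

NUM_FEATURES = 80   # facial characteristics compared (figure from the post)
THRESHOLD = 14      # matches required for a "positive" (figure from the post)

def count_matches(probe, reference, tolerance=0.02):
    """Count how many individual features agree within a tolerance."""
    return sum(1 for p, r in zip(probe, reference) if abs(p - r) <= tolerance)

def is_positive(probe, reference, threshold=THRESHOLD):
    """Declare a match when enough individual features agree."""
    return count_matches(probe, reference) >= threshold

random.seed(0)
reference = [random.random() for _ in range(NUM_FEATURES)]
# A noisy capture of the same face: each feature perturbed slightly.
same_face = [f + random.uniform(-0.01, 0.01) for f in reference]
# A different person: features drawn independently.
stranger = [random.random() for _ in range(NUM_FEATURES)]

print(is_positive(same_face, reference))  # True: all features agree within tolerance
print(is_positive(stranger, reference))   # almost certainly False: few agree by chance
```

Lowering `THRESHOLD` toward 1 makes the stranger more likely to trip the system, which is exactly the false-positive/false-negative dial the post is talking about.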
Why on earth (Score:2)
Look-alikes? (Score:5, Insightful)
Let's say, some time in the future, they get the face-scanning technology to work right. 0.000001% false-positive rate. And it's implemented all over the US.
Let's also say that, among the 250 million people in the United States, one or more people had facial structures similar enough to terrorists' that they would trigger those scanners. In fact, they'd trigger every scanner that person was surveilled by. And let's say that person were you.
What would you do?
You couldn't go to an airport. You couldn't go to a major public attraction. You probably couldn't go to a public place without fear of some alarm going off and people waving automatic weapons in your face. Would you cower at home? Would you wear a bag over your head? Would you sue the US government? How would you cope?
Everybody runs. (Score:2, Funny)
Re:Look-alikes? (Score:2)
Re:Look-alikes? (Score:2)
No such thing as a cure-all (Score:2, Insightful)
Employing facial recognition is just one thing we can do - granted, we need to get the technology to work better, but we need to realize that it's multiple systems working together that is going to stop terrorists, not one or two "miracle systems."
False positives (Score:5, Insightful)
As for false negatives, even 50% is better than nothing, as long as the false-positive rate is much, MUCH lower. Imagine catching 50% of the hijackers on September 11 before they boarded the planes. A lot of red flags could have gone up, flights could have been delayed, and the rest of the passengers could have been more carefully scrutinized. No, this is not the solution to any problem. And no, it should not be used legally any more than a lie detector can be. It's a guide. It tells us where we might need to concentrate more of our efforts.
As far as threats to privacy go, this makes sense in an airport, but it does not make sense out on the street. People go into an airport expecting to be searched, questioned, carded, etc. They do not have the same expectation while walking down the street. So unless the cops are chasing someone, lose him, and you bear a striking resemblance, they shouldn't bother you at all.
-Restil
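The parent's point that a 50% miss rate is still useful can be made concrete with a quick back-of-the-envelope calculation. The 50% figure is the article's observed miss rate and 19 is the widely reported hijacker count; the independence assumption is mine, purely for illustration:

```python
# Probability of flagging at least one member of a group when each
# individual is (assumed independently) recognized only 50% of the time.
detection_rate = 0.5   # per-person hit rate: 50% false negatives, per the article
group_size = 19        # hijackers on September 11 (reported figure)

p_miss_everyone = (1 - detection_rate) ** group_size
p_flag_at_least_one = 1 - p_miss_everyone

print(f"P(miss all {group_size}): {p_miss_everyone:.7f}")
print(f"P(flag at least one): {p_flag_at_least_one:.7f}")
```

Even a coin-flip detector almost never misses an entire group of 19, which is why the parent argues the false-positive rate, not the false-negative rate, is the real problem.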
Re:False positives (Score:2)
IMHO, for the kind of face recognition an airport needs, the system shouldn't have to be 100% correct. As it isn't autonomous, but requires human confirmation to arrest someone, it's a tool for security. It's like an electronic wanted poster.
Of course, I'm not saying a face-recognising system was viable on this occasion. I'm sure the authorities did well not to implement it. Yet I'm not so sure that it couldn't be an improvement in security without sacrificing any extra privacy.
Re:False positives (Score:2)
One small problem ... how do you get those faces into the database to be checked against? Some of the most "recent" photos of terrorists may be >10 years old ...
So ... you have old photos of ~<500 known terrorists ... against ~>220 Million "good guys" ... you can see that you'll have so many false positives compared to real positives. (NOTE: numbers pulled out of my @$$ ... this is an example)
One thing that was (and still is) really irritating about the whole "we-need-better-security" mentality after 9/11 is a fundamental problem.
That problem is ... until you get to a "1984" society, there is no absolute security. The only security is a false sense of one.
Suppose we HAD scrutinized passenger lists more ... then what? Any potential terrorists would know that, and would use something the drug dealers use ... mules (people who are paid to bring drugs across the border and have little/no background).
Now don't get me wrong here ... I think what happened was a tragedy, and I hope it doesn't happen again. However, given the openness of our society, I doubt that anything substantial will change in the long run.
Just recently (mid-May) I flew to BWI (Baltimore/Washington), and the "security" was about the same as when I flew to Vegas a couple of years ago. In fact, during the Vegas trip, my carry-ons were inspected ... not this latest trip though.
It's a balance between being secure and appearing to be secure.
Re:False positives (Score:2, Insightful)
I sold you and you sold me
There lie they and here lie we
Under the spreading chestnut tree
In this case, false positives are better... (Score:2)
The false negatives just make an already porous system even more so, because whatever face-recognition system gets put in place would in all probability be relied on to make sure it at least didn't miss anyone. If these systems get put in place, we'll be less secure, 'cause the guards won't be on as high an alert, thinking the cameras will do it all for them.
Figures... (Score:4, Funny)
What's next? (Score:2)
IMO, face scanning is the single most worthless biometric in existence--not that I'd advocate any others. If entrepreneurs want to do something useful to increase security, they ought to improve devices which sniff for high explosives so I don't have to take off my frigg'n shoes every other time I fly.
Turn up false positive, false negative declines (Score:3, Insightful)
If you took this technology, made it match on too many faces and then had someone manually double-check the potential match, you would have a kick-ass system.
Like all powerful technology, its use must be ethical.
Not just FP,FN, but Base Rates! (Score:2, Informative)
As you may know, Bayes Theorem (actually a statement of fact in probability theory) says:
Post-test odds = Likelihood Ratio * Pre-test odds
(Where the likelihood ratio for a positive test is the sensitivity/(1-specificity), or TP rate / FP rate)
If your pre-test odds of being a terrorist are very low (and when you consider how many terrorists fly compared to how many non-terrorists fly, they must be exceedingly low), you're going to need a very, very powerful ("highly specific" in medical terms) test if you want to reliably determine that a given person ought to be treated with greater care.
On the other hand, if they were planning to spend a lot of time and money screening people anyway, and they could improve their sensitivity (TP rate), facial recognition might be a (statistically) sound approach to screening *out* suspects. That is, once you pass a face-detection screen that has a high TP rate, you don't need to be subjected to as much extra screening; but if you fail the face-detection screen, it's not really diagnostic.
Normally, you could use my diagnostic test calculator [uic.edu] to fool around with numbers yourself and see what the impact would be, but it appears to be down until I can get to the server (dratted dist upgrade!).
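The same computation can be sketched in a few lines while the calculator is down. The formulas are the ones stated above; the prevalence, sensitivity, and specificity numbers are purely illustrative, not from any real deployment:

```python
def post_test_odds(pre_test_odds, sensitivity, specificity):
    """Bayes' theorem in odds form: post = LR+ * pre,
    where LR+ = sensitivity / (1 - specificity)."""
    likelihood_ratio = sensitivity / (1 - specificity)
    return pre_test_odds * likelihood_ratio

def odds_to_probability(odds):
    """Convert odds back to a probability: p = odds / (1 + odds)."""
    return odds / (1 + odds)

# Illustrative numbers only: suppose 1 flyer in a million is a terrorist,
# and the scanner has 50% sensitivity and 99.9% specificity.
pre_odds = 1 / 999_999   # odds form of a 1-in-a-million prevalence
post_odds = post_test_odds(pre_odds, sensitivity=0.5, specificity=0.999)

print(f"LR+: {0.5 / (1 - 0.999):.0f}")
print(f"P(terrorist | positive match): {odds_to_probability(post_odds):.6f}")
```

Even with a likelihood ratio of 500, the post-test probability comes out around 0.05%: the pre-test odds are so low that a positive match is still almost certainly a false alarm, which is the point about needing a very, very highly specific test.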
Designed by terrorists for terrorists?? (Score:2)
Of course, the criminals will try to look different. And they will succeed. This system is based on corrupted principles; it is actually only good for recognising people who have no reason to change their face when entering the plane. It will recognise your mom, your dad, the girl next door, but it will NEVER recognise the terrorist.
It will only cause extra hassle and an added false sense of security.
Easy to fool? (Score:2)
This all assumes that the terrorists will not try to fool the system. If a face recognition system was implemented at a given place, don't you think the terrorist would try to fool that system in some way with some kind of "fake faces"?
I assume that fingerprint readers should be much easier to make than this technology, correct? The fact is that those can be *very easily* fooled too! Read the latest Crypto-Gram newsletter [counterpane.com] for a story about how easy it really is -- so easy it's almost scary.
How easy, then, will this be to fool?
Hey, this technology is great (Score:3, Interesting)
Because when they finally get it working right, with a really high degree of accuracy, then it'll positively identify me, and I'll be allowed to exercise my rights to have and bear arms on an airline for the purpose of forming a well ordered militia. Surely this situation exemplifies the purpose of the second amendment; an armed populace defending itself from attack.
What's that you say? That this won't happen? That security will still be something performed by bored and disinterested employees on the ground, not by the people under direct threat? That all this technology will do is remove rights and further entrench the mentality that We, the People must be protected by a tiny minority of largely unanswerable and self-appointed professionals?
Sometimes I wonder why we bother even pretending that the Constitution still applies. If anyone can think of a more relevant application for the Second Amendment short of a full scale invasion, I'd like to hear it.
Re:Hey, this technology is great (Score:2)
The second amendment apparently also gives you the right to remain ignorant. Any guns brought on board would need to be specially designed for that purpose, so as not to create holes in the aircraft when YOU MISS your target (or completely penetrate it).
Not only that, but the main argument against arming people on board planes is that it makes it just a WEEEEE bit easier for a terrorist to bring arms on the plane or steal them from guys like you.
I would not be as concerned with fake positives.. (Score:2)
-- Tim
Osama is winning, and we are letting him (Score:3, Insightful)
By reacting the way we are, the U.S. is giving Osama Bin Laden exactly what he was aiming for. He wanted to destroy the American way of life, and by removing freedom and civil rights the way we are, he is achieving his goal. There is no longer any need for him to act. We have met the enemy, and it is U.S.
Direct quote (Score:2)
It's quite ironic how the terrorists have won the "war on terror" the moment the US government started it.
Equipment NOT rejected due to inaccuracy (Score:2)
The decision by PBIA to not use the equipment had nothing to do with the accuracy of the equipment. Here [gopbi.com] is a less sensationalized story from the local newspaper which states "PBIA's decision to remove the equipment and not buy it reflects the federal government's takeover of airport security". The article mentions that the tests of the equipment were solicited right after the 9/11 incident, prior to the federal government announcing it would be addressing airport security. So the inaccuracy is not the reason this technology didn't end up in the airport.
maru
This AC unwittingly presents a good point (Score:1)
The recognition required to notice that this article is false involved a mere 5% (approx.) of the article. That is the same issue faced by the developers of the technology: humans have similar faces. How does one draw on the 5% that is different?
Re:useless (Score:1)
Jaime's Penis? Jaime Gumb? As in Silence of the Lambs?
Re:slashdotters dont need to worry (Score:1)
This means you Randall "merlyn" Schwartz.
Re:slashdotters dont need to worry (Score:2, Informative)
Right now, in your own eyes, you are not a criminal. But what keeps you that way? What if the government decides something you do is a criminal offense? Perhaps they'll decide that Slashdot, as a part of Hax0r culture (I wouldn't call it that, but the people in power in this country are stupid enough to do so), must be outlawed, and its users are all 'terrorists.' Of course, fifty years ago we'd all be 'communists,' but times change, and so do the ways the idea of a subversive is made to sound like the enemy.
You see, anything is potentially a crime. Leaving my house, attending class, writing papers, playing water polo, jacking off ten hours a day- these are things that take up most of my time. The fact is, no one is to say that these are not crimes. If using drugs is a crime, if someone who feeds a non-violent subversive activist is a 'terrorist' now, any of these activities could become criminal.
In the majority of the United States, it is still legal to fire someone for quite simply being gay. There is no amendment to protect from this, there is no federal law. And it will be this way for a long time, most likely. In fact, some of the anti-discrimination laws that keep this from being true everywhere are being repealed. What's to say that you aren't a criminal in such an unjust nation?
We are not the land of the free, don't buy that. You aren't safe. Unless you work for the government in a high ranking office (as in you were either elected or appointed), or have a LOT of money, you can be screwed at any time.
Slashdotters need to worry. Fight surveillance! Fight for your freedom, no matter the cost.
Re:slashdotters dont need to worry (Score:1)
Having the government force people to conform to your (and my) opinion that gay people are not bad is just another example of the denial of personal freedom. If I start a business with my own money, then why should the government make special cases about who I can and cannot fire at will?
Re:slashdotters dont need to worry (Score:1)
Re:slashdotters dont need to worry (Score:2)
Even that isn't enough to protect you. Former Vice President Dan Quayle was stopped at airport security for the more thorough check.
I can't find it now, it was a Slate article, the author recounting his experience being spot checked at the airport. He looked over and saw another passenger (Dan Quayle) being spot checked and piped up "only in America Mr. Vice President".
What Bothers Me... (Score:3, Insightful)
I'm in that database because when I was an 18 year old high school senior I committed the high crime of having had consensual sex with my girlfriend, who was a year and a half younger than I was. It's bad enough to get charged with a felony for consensual sex with a partner who's within 2 years of your own age, but now maybe I'll get harassed when I go to national monuments or big events because of hits in facial recognition software. In theory the facial recognition technology will only be hooked into a partial database of certain types of people. In practice, I doubt they'll be very selective.
What if you got arrested as a teenager for having a small amount of marijuana? What if you were accused of assault for a minor altercation? What about any number of minor infractions which would still have landed your face in the FBI database? My guess is, as the technology gets better and more discriminating in the field, the parts of the FBI database used will get wider until the full database gets scanned.
So, it's not just false-positives that are a worry, but positives against people with very minor infractions that have still landed them in the FBI database. Should you get shaken down by some overzealous dweeb who thinks you're dealing drugs because 10 years ago you got caught with your personal stash of green? And what of the potential for abuse of sensitive personal data?
Now that this particular can of worms has been opened under the excuse of 9/11, it's only going to get bigger and more invasive. First they'll assure us the database they're using only has "violent" criminals in it. Then it'll only be felons. Next it's the whole FBI database, including all the pictures of people whose parents were stupid enough to fingerprint and photograph their children and submit a packet voluntarily "to protect your children in case of abduction," and DMV databases as well.
Is it just me, or is it getting kind of Orwellian in here?
Re:What Bothers Me... (Score:2)
Kind of a random curiosity, but what state was this in? Most US states recognize 16 as the age of "sexual majority." (All Canadian provinces are at 14.) In my state of Ohio, it's 16, but if the younger person is between 12 and 16 and the older one is within 4 years of their age, then the crime is a misdemeanor, not a felony. I presumed that other states had the same progressive system.
Re:What Bothers Me... (Score:3, Informative)
So, with that in mind - is keeping blacklists (or greylists, really) of people a good idea at all? We like to pretend that they keep us 'safer' - but I bet the sixty-year-old gay man (prosecuted under one of those 'unenforced' state sodomy laws) who's driven out of his neighborhood with cries of 'think of the children!' isn't feeling any safer as a result of the existence of these lists.
Memorial Day? (Score:1)
So what the hell makes it a big party/movie weekend? If that's all it is to most people, I suggest you scrap it. If all you see it as is a holiday then it shouldn't be.
US citizens are always going on about how they "fought for their freedom," etc., but it sure doesn't look like you respect those who actually *did* the fighting.
To put this on-topic
Re:Don't abandon biometrics, focus on better syste (Score:2)
--
Benjamin Coates