Security Technology

Ears Might Be Better Than Fingerprints For ID

An anonymous reader writes "A new study says that the outer ear could be a better unique identification mark in human beings than fingerprints. 'When you're born your ear is fully formed. The lobe descends a little, but overall it stays the same. It's a great way to identify people,' said Mark Nixon, a computer scientist at the University of Southampton and leader of the research. Nixon and his team presented a paper at the IEEE Fourth International Conference on Biometrics and, using an algorithm, identified people with 99.6 per cent accuracy." An anonymous reader adds a link to Wired's story on the same conference presentation, which adds this skeptical note: "'I have seen no scientific proof that the ear doesn't change significantly over time. People tend to believe notions like these, and they are repeated over time,' said Anil Jain, a computer scientist at Michigan State University who was not involved in the study. 'Fingerprinting has a history of 100 years showing that it works, unless you destroy your fingerprints or work in an industry that gives you calluses.'"
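
For readers wondering what a figure like 99.6 per cent means in practice, here is a minimal, purely illustrative sketch of how rank-1 identification accuracy is typically measured: enroll one "gallery" template per subject, take a later "probe" capture, and count how often the nearest gallery entry is the right person. The random vectors below are a stand-in for whatever ear descriptor the paper actually uses; nothing here reproduces Nixon's algorithm.

    # Toy rank-1 identification-accuracy sketch (illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)
    n_subjects, dim, noise = 250, 64, 0.05

    # One enrolled "gallery" template per subject (hypothetical feature vectors).
    gallery = rng.normal(size=(n_subjects, dim))

    # A later "probe" capture of each subject: same features plus sensor noise.
    probes = gallery + noise * rng.normal(size=(n_subjects, dim))

    # Nearest-neighbour matching: assign each probe to the closest gallery entry.
    dists = np.linalg.norm(probes[:, None, :] - gallery[None, :, :], axis=-1)
    predicted = dists.argmin(axis=1)

    rank1_accuracy = (predicted == np.arange(n_subjects)).mean()
    print(f"rank-1 identification accuracy: {rank1_accuracy:.1%}")

The interesting question for any biometric is how that accuracy holds up as the gallery grows and as capture conditions (pose, lighting, hair over the ear) degrade, which is where the skepticism quoted above comes in.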
  • by OSPolicy ( 1154923 ) on Sunday November 14, 2010 @01:57PM (#34223564) Homepage

    "Fingerprinting has a history of 100 years showing that it works."

    Fingerprinting has a history of well over 100 years, but what we see is that it works as long as it is not seriously challenged. In its only major rigorous challenge, the 50K x 50K test, substantial problems emerged.

    Keep in mind that fingerprints are never admitted into evidence, never used for identification, never even examined. Never. A finger touches a surface and it leaves a partial copy. An investigator finds it and puts powder (matrix) on it, which creates a visible picture of the copy. It is often not possible to get a good photo of the copy, so someone uses tape or other gear to get an image of the picture of the copy. Then someone photographs the tape containing the image of the picture of the copy. Then a print of the photograph of the tape of the image of the picture of the copy is created. If there are no more steps, which would be unusual, that print is what is actually used for evidence or analysis. Scientifically-minded readers will have already tallied up at least a partial list of the errors introduced at each step of the process.

    And what sort of analysis is done? The best lab in the country, the FBI, uses an analysis process taught by a high school grad who washed out of college after two years. Obviously, other labs do not enjoy such high standards. What standards do they use, you may ask? None. There are no required national standards for fingerprint analysts. There are guidelines that suggest that a high school diploma should be required, but the advisory guidelines bind no one.

    But at least they use a rigorous process with well-defined standards?

    "The International Association for Identification assembled in its 58th annual conference... based on a three-year study by its Standardization Committee, hereby states that no valid basis exists at this time for requiring that a predetermined minimum of friction ridge [fingerprint] characteristcs must be present in two impressions in order to establish positive identification."

    So no, there are no standards, which is a good thing because the relevant international body has determined that there is "no valid basis" for establishing one.

    So now they say that they can get better results by looking at someone's ears? Hm... Well, the good news is that they're probably right. The bad news is that they've got a long way to go before they can say that it's any great accomplishment.

  • by Luckyo ( 1726890 ) on Sunday November 14, 2010 @02:07PM (#34223628)

    Yes. The basis behind fingerprints is that as long as the regenerating tissue at the bottom of the skin layer remains alive, it will eventually regenerate the same prints. However, when damage extends to the deepest layers of the skin, the fingerprints are altered permanently. This can happen via:

    1. Physical trauma: when the damage extends below the regenerative layer of the skin, your fingerprints end up permanently altered.
    2. Skin grafting: for example, after heavy burns to your hands that require the skin to be replaced entirely. This will change your fingerprints.

    I suspect that the trauma that took your fingerprints off was surface trauma of some sort that only removed your prints temporarily, as the regenerative layer of the skin remained alive.

  • by wernst ( 536414 ) on Sunday November 14, 2010 @03:00PM (#34224050) Homepage

    I was born with ears that stuck out worse than Prince Charles's. I was teased about them all through school.

    In college I had my ears "tucked," which basically made them lay flat against my head. I had generous grandparents.

    Anyway, the point is that to do this (the following is not for the queasy), they slice open your ear, take out the cartilage (which is what forms all the unique bumps and curves of your ear), manually reshape it, stick it back in, and then sew you up.

    Not only did my ears finally not stick out, but they looked totally different than they did before: none of the curves matched, and even my earlobes are a different shape (the bottoms are trimmed a bit and then stitched back to the head).

    This is not terribly expensive surgery, and while a bit painful, if I were a criminal trying to beat a set of "earprints" somehow left at the scene of a crime, I'd have it done in a second.

  • by Anonymous Coward on Sunday November 14, 2010 @03:13PM (#34224172)

    Good point, but check your math: 900,000,000 × 0.4% = 3,600,000. Same conclusion, though.
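
    Spelled out in plainer form (assuming, as the parent thread seems to, that 900,000,000 is a population size and 0.4% is the error rate implied by 99.6 per cent accuracy):

        population = 900_000_000
        error_rate = 0.004  # 0.4%, i.e. 1 - 99.6% accuracy
        print(f"{int(population * error_rate):,}")  # 3,600,000 misidentifications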
