Meta's Smart Glasses Repurposed For Covert Facial Recognition (404media.co) 47

Two Harvard students have developed smart glasses with facial recognition capabilities, sparking debate over privacy and surveillance. The project, dubbed I-XRAY, uses Meta's Ray-Ban smart glasses coupled with facial recognition software to identify strangers and retrieve personal information about them. AnhPhu Nguyen and Caine Ardayfio, the creators, tested the technology on unsuspecting individuals in public spaces. The glasses scan faces, match them against online databases, and display personal details on a smartphone within seconds. The students claim their project aims to raise awareness about potential privacy risks.
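
A rough, purely illustrative sketch of the pipeline the summary describes (grab a frame from the glasses' camera, detect faces, hand each face to a lookup step, and push the result to a paired phone) might look like the Python below. Every function, filename, and endpoint here is a hypothetical placeholder, not the students' actual code.

    # Hypothetical I-XRAY-style pipeline sketch; placeholders only.
    import cv2  # OpenCV: frame capture and face detection

    FACE_CASCADE = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def lookup_identity(face_jpeg_bytes):
        """Placeholder for the reverse-image / people-search step."""
        return {"name": "unknown", "sources": []}

    def notify_phone(result):
        """Placeholder for pushing details to a paired smartphone app."""
        print(result)

    cap = cv2.VideoCapture(0)  # stand-in for the glasses' video feed
    ok, frame = cap.read()
    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in FACE_CASCADE.detectMultiScale(gray, 1.1, 5):
            ok_jpg, buf = cv2.imencode(".jpg", frame[y:y + h, x:x + w])
            if ok_jpg:
                notify_phone(lookup_identity(buf.tobytes()))
    cap.release()

In practice the heavy lifting is the lookup step; the comments below discuss the existing reverse-image and people-search services such a step would lean on.
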
Comments Filter:
  • "The students claim their project aims to raise awareness about potential privacy risks."

    uh-huh. so close, let me fix it:

    "The students hope their project raises their hirability and starting salary when they apply to companies involved in privacy risks."

    hahahahahaha

    • Sounds like we need another set of students to develop a "Scramble Suit" [wikipedia.org] to defeat all the facial and other recognition systems coming on line....
      • as with many social problems, the technical solutions exist but are not considered acceptable (despite being guaranteed by the second amendment of the us constitution).

        in this case, an airburst EMP would be effective.

    • Maybe. But lots of college/university kids are bright, knowledgeable, and so enthusiastic about their personal projects that they're childishly naïve about almost everything else.

      • sure, maybe. or we can consider a quote from another famous harvard student about his tech project: "They 'trust' me. Dumb fucks."

    • by GoTeam ( 5042081 )
      Or the students hope to get paid by government agencies who want to use their tech.
  • by Turkinolith ( 7180598 ) on Wednesday October 02, 2024 @10:40AM (#64834033)
    I'm honestly surprised we're just getting to this point. To be fair, when discussing augmented glasses, contacts, or eye replacements, facial recognition and information overlays have always been part of that. (Pop culture reference: Shadowrun)
    • Re:Just now? (Score:4, Insightful)

      by jenningsthecat ( 1525947 ) on Wednesday October 02, 2024 @10:57AM (#64834079)

      I'm honestly surprised we're just getting to this point.

      It's possible that we're NOT just getting to this point. Sometimes there's a pretty big gap between what's available to / known about by the public, and what already exists in secret.

      The technology to do this less surreptitiously has been known about publicly since the Glasshole era, if not before then. I wouldn't be surprised if law enforcement agencies and/or spooks have had, for perhaps years, nearly-undetectable devices that do covert identification.

      • by leptons ( 891340 )
        >since the Glasshole era, if not before then

        A good friend of mine worked for a FAANG and was developing their machine vision systems about 20 years ago. One of the products they worked on allowed cell phones to take a photo and the servers would figure out what the object was. This could also be used to identify people. The team put their foot down that their tech should not be allowed to be used in this way. So the tech capability has been there for about 20 years, but unfortunately it has finally bec
      • I'm honestly surprised we're just getting to this point.

        It's possible that we're NOT just getting to this point. Sometimes there's a pretty big gap between what's available to / known about by the public, and what already exists in secret.

        The technology to do this less surreptitiously has been known about publicly since the Glasshole era, if not before then. I wouldn't be surprised if law enforcement agencies and/or spooks have had, for perhaps years, nearly-undetectable devices that do covert identification.

        What do you mean by "less surreptitiously"? Most people are not going to notice someone with a regular smartphone taking pictures or recording. You can walk around with a camera hanging from a necklace and nobody will bat an eye; hell, you can wander around wearing spandex bike pants and a helmet with a massive freaking GoPro camera and you'll hardly get a second glance. All the people having a panic attack about "secret recording" need to take a few deep breaths. Putting a camera on a pair of AR glasses i

    • by doug141 ( 863552 )

      Years ago stories leaked that there was a smartphone app, and not a public one, that would do the same thing. It was so secret that journalists had to quote sources who were shown the app by a braggart.

  • In the US, you can be recorded unless you're in an area normally considered "private", i.e., your home, a bathroom, private property, etc. When a gov't does it, the behavior can be restricted by laws and court rulings, but a private citizen can record your activities when you're in a public area. If linking that with software and an Internet connection bothers you, change the laws.

    In an era of cellphones and augmented vision technology, is anybody surprised that the linkage of that tech

    • Except for the fact that I, as a person being recorded, am also (unknowingly) having my data sucked into Meta's gaping maw without any consent, implied or otherwise. What are Meta's defined policies regarding their rights to, and usage of, images, recordings, and other data collected through their devices?

      • You mean, like when you are recorded in public and information about you goes viral on Reddit or YouTube?

        You do not get to own information about you pursuant to the first amendment, but you can keep people from rummaging through your secrets, either because they trespassed on your IRL or digital property, or because they violated some contractual obligation (like your doctor gossiping about you); this is called the "intrusion into seclusion" or the "disclosure of *private* facts" theory o

    • Because things can be both legal and bad. You have to be a total weirdo to think otherwise.

  • Don't they have better and more noble things to do than take a piece of dystopian technology from a disgusting big-data company and create more dystopia?

  • For someone like me who has a terrible time remembering faces and names, augmented reality with facial recognition could replace the defective & underperforming 'wetware' that I have. Like a hearing aid for those with ear problems, or glasses for those with retina or lens problems. These things are socially acceptable, probably because there are accepted norms for their use.

    Feels like if the use case is to compensate for a disability, that's ok; but if you're using something that enriches a third party, o

    • If it's in an environment where you are expected to know people, like at a company where everyone wears name / photo badges, or with people who you've met and been told their name, then this seems fine to me. It would probably be polite to ask people if it's OK when you first meet them.

      It's linking it to vast online databases that seems creepy.

      I'm curious, would you say you have aphantasia [wikipedia.org]? I do, and I also have a terrible time remembering people's faces.

  • This is the feature that makes smart glasses worth using -- giving you seamless information on who you are talking to -- it's just that all the tech makers have been too cowardly to actually enable it.

    And no bullshit about how this harms privacy. Your privacy is invaded as much or more by the tons of cameras in stores, ATMs, etc. recording and saving your image. This just makes that fact salient.

    I agree it's important to make sure the devices alert when they are storing recordings but facial recognition is u

  • Does anyone remember when nerds were not cool but anti-social, in the sense of anti-social personality disorder, not social anxiety? It's swinging back that way: this fear-based need for predictability and control projected on a whole population, needing absolute surveillance at all times. Thus the guy shows up at the party with the surveillance glasses that report on everyone there. No.
    Still, it's bold for MZ to shoot for something to come after phones; no one else is doing it, and if he is wrong, what is right? Ju

    • Thus the guy shows up at the party, with the surveillance glasses that report on everyone there.

      ... Followed by the guy getting his ass kicked by the other attendees, as soon as they figure out what he's doing.

    • by PPH ( 736903 )

      True nerds just want to be left alone. In exchange for leaving everyone else alone.

      These glasses would be more likely to be used by the "social influencer" class. The people who like to organize big parties, pretend that they know "absolutely everyone, darling", name-drop, and use all of these skills to do their social climbing. In other words, someone who is famous for being famous.

  • to identify strangers and retrieve personal information about them.

    This is exactly what happens at the beginning of the book "The Light of Other Days". One of the characters arrives at a party and uses her glasses to identify people and bring up personal information on them.
  • Meta Glasses let you take pictures

    They made an automation to take a picture and look it up in an online reverse image search

    THAT IS IT

    This is not terminator or robocop. It's just stupid.

  • Any smartphone sticking out of your shirt pocket can be programmed to do this. It can buzz if it recognizes someone relevant, or otherwise just record times and places, or whatever else.

    • >> Any smartphone

      Very true, and even just a Raspberry Pi with a cheap camera unobtrusively mounted on your clothing can do it. The glasses are handy because they can display the recognition results in real time.
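
      For scale, a minimal sketch of the "buzz if it recognizes someone relevant" idea from these two comments, assuming the open-source face_recognition library (which wraps dlib) and a cheap USB or Pi camera; the reference photo and the alert hook are hypothetical stand-ins:

        # Sketch only: alert whenever a face matching a local reference photo appears.
        import cv2
        import face_recognition

        # Hypothetical watchlist: one reference photo per person of interest.
        known = face_recognition.face_encodings(
            face_recognition.load_image_file("known_person.jpg"))
        names = ["Known person"]

        def alert(name):
            print(f"match: {name}")  # on a Pi this could drive a buzzer over GPIO

        cap = cv2.VideoCapture(0)  # cheap USB or Pi camera
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            for enc in face_recognition.face_encodings(rgb):
                matches = face_recognition.compare_faces(known, enc, tolerance=0.6)
                if any(matches):
                    alert(names[matches.index(True)])
        cap.release()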

  • by Arrogant-Bastard ( 141720 ) on Wednesday October 02, 2024 @12:08PM (#64834413)
    There's no way this should have or could have passed review by an IRB, because it involves human subjects who haven't consented and haven't even been asked if they would grant consent. Most research in the US is subject to the Federal Policy for the Protection of Human Subjects ('Common Rule') [hhs.gov], but even if that doesn't apply, all educational and research institutions have codes of conduct that align with that and with the Belmont Report [hhs.gov].

    This isn't a question of whether what they did was legal or not; it's a question of whether it's ethical -- and it's not. And I strongly suspect that the researchers knew this, which is why they didn't even bother trying to acquire consent.

    Harvard needs to investigate this, and that investigation needs to include the faculty and staff involved, because somehow they failed to teach these students some of the fundamental principles of ethical research and they didn't shut this project down. And they should probably prepare themselves for the possibility that litigation on behalf of the victims is coming.
    • by bandi13 ( 579298 )
      What difference does it make whether it passed an IRB? They could have implemented it at home outside of research. Just because they're students doesn't mean they're required to go through any sort of research review board. This technology is coming; the question is what we as a society are going to do about it. Though to be fair, it is already in existence at airports and many places in public.
    • "highly unethical"!?

      You and I know that's what Meta covertly does behind the scenes. Because that is exactly what their track record shows.

      These kids just made it obvious to anyone turning a blind eye to these AR goggles.

      Umad? Got any stake in Meta?

    • because it involves human subjects who haven't consented and haven't even been asked if they would grant consent.

      Nobody needs to notify you or ask for consent to use an image taken in public. And to be technically accurate the people are not the subject, it is the captured image which is the subject.

      it's a question of whether it's ethical

      There's nothing unethical about looking at a picture of some random person, and then trying to gather more information about them. Just because you added the phrase "using a computer" or whatever panic-inducing buzzword is being tossed around doesn't change that.

    • So you are worried about a couple of students who performed an experiment to prove corporate abuse. There is NOTHING unethical here. There are no victims. It was not an experimental study. It's a simple proof of concept. I have to assume that you have some sort of agenda to demonize this.

    • It's illegal in Texas, too, without consent.

      All that basically means though is that the database can't be scraped from the web. A "collective" of companies that you provide consent to could together come up with a list and probably share it. The only difference would be that those not in the database would show up as unidentified when detecting the face region.

      Law enforcement on the other hand can tie these to their massive databases that you are forced to consent to and use them at will.
  • Here's a link to a non-paywalled version: https://archive.ph/e7H9S#selec... [archive.ph]

    New York Times reporter Kashmir Hill detailed how both Facebook and Google had the technology to use facial recognition in combination with a camera feed, but declined to release it. As Hill mentions, Google’s chairman Eric Schmidt said more than ten years ago that Google “built that technology, and we withheld it.”

    • by CAIMLAS ( 41445 )

      I was so disappointed when I bought those in 2nd grade from a Boys' Life magazine and they did absolutely nothing to help me see the forbidden fruits.

      • by Tablizer ( 95088 )

        Second grade? That's a little young to be that horny, but everyone matures at a different rate. I had enough problems as a kid, I didn't need libido distractions also.

  • This really just seems like an API call to existing services like Clearview AI and PimEyes with AR glasses. Which isn't nothing - otherwise I guess you have to snap a photo on your phone and upload it to a service to get that info.
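
    In other words, the novel part is mostly glue. A hedged sketch of that glue, posting a snapshot to a purely hypothetical face-search HTTP endpoint (not Clearview's or PimEyes' actual APIs, which aren't documented here), could be as small as this:

      # Sketch of the "snap a photo and upload it to a service" glue.
      # The endpoint URL, auth header, and response fields are hypothetical.
      import requests

      def search_face(image_path, api_key):
          with open(image_path, "rb") as f:
              resp = requests.post(
                  "https://face-search.example.invalid/v1/search",  # placeholder
                  headers={"Authorization": f"Bearer {api_key}"},
                  files={"image": f},
                  timeout=15,
              )
          resp.raise_for_status()
          return resp.json().get("matches", [])  # assumed response shape

      for match in search_face("snapshot.jpg", api_key="HYPOTHETICAL-KEY"):
          print(match)
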
  • They say "repurposed" as if this isn't one of the primary 'killer apps' for this kind of thing. To wit -

    - location identification
    - map/direction overlays (requires location identification)
    - social enhancement via facial recognition -> identification (and now, with the use of AI - "John recently increased his social media presence on X and his wife left him in January based on a Facebook post")
    - industrial manual overlays (e.g., to help technicians identify things like faults or order of operations and such)

  • I heard a phrase "foley file" but can't find its origins now. It referred to a diplomat's assistant who would quietly give brief data just before the diplomat would shake hands.

    This trope has been used repeatedly in various sci-fi works since then. Star Trek, Robocop, Oath of Fealty, The Expanse, Gattaca, all have characters who have discreet scans to summarize people upon introduction.

  • I've got a good chunk of "face blindness"; I forget the current medical/scientific name for it.
    It would be really great to have a heads up display pop up the name of whomever I'm looking at, and whatever reminder note I chose.
    You have no idea how often I'm talking to someone that obviously knows me, but I have no idea who they are. It really sucks!

    The nickname of "face blindness" is a bit misleading. You can see faces without a problem, but you tend to have no idea who they are, the two things just don't li
  • Now that my poor memory for names has gotten to be a real barrier to social interaction I would love a set of these that just knew my friends and family and could whisper the names in my ear.
  • The glasses scan faces, match them against online databases.

    I need to know how/where these databases are being built.
