Meta Builds Tool To Stop the Spread of 'Revenge Porn' (nbcnews.com) 94

Facebook's parent company, Meta, has worked with the U.K.-based nonprofit Revenge Porn Helpline to build a tool that lets people prevent their intimate images from being uploaded to Facebook, Instagram and other participating platforms without their consent. From a report: The tool, which builds on a pilot program Facebook started in Australia in 2017, launched Thursday. It allows people who are worried that their intimate photos or videos have been or could be shared online, for example by disgruntled ex-partners, to submit the images to a central, global website called StopNCII.org, which stands for "Stop Non-Consensual Intimate Images."

"It's a massive step forward," said Sophie Mortimer, the helpline's manager. "The key for me is about putting this control over content back into the hands of people directly affected by this issue so they are not just left at the whims of a perpetrator threatening to share it." Karuna Nain, Meta's director of global safety policy, said the company had shifted its approach to use an independent website to make it easier for other companies to use the system and to reduce the burden on the victims of image-based abuse to report content to "each and every platform." During the submission process, StopNCII.org gets consent and asks people to confirm that they are in an image. People can select material on their devices, including manipulated images, that depict them nude or nearly nude. The photos or the videos will then be converted into unique digital fingerprints known as "hashes," which will be passed on to participating companies, starting with Facebook and Instagram.

  • by Joce640k ( 829181 ) on Thursday December 02, 2021 @12:36PM (#62040183) Homepage

    Cue a bunch of websites that reflect/warp/crop the images so the fingerprints don't work any more.

    • by Mal-2 ( 675116 )

      TinEye seems to be able to deal with this.

    • Re:Next up (Score:5, Funny)

      by fph il quozientatore ( 971015 ) on Thursday December 02, 2021 @01:47PM (#62040561)
      I'll grab a bucket of popcorn and watch the arms race then. I guess in 10 years my captchas will be "select all images with boobs".
    • If the amount of doctoring required makes the subject unrecognizable, that's a win.
      • I"m wondering what they #hash off of....still images captured from the video, or the whole video itself?

        I was thinking: if someone really wanted to invest in it, perhaps they could just get pics of the person and use deepfake software, which is becoming more freely available and usable, to churn out an almost never-ending stream of revenge pr0n, with different situations, different acts and locations, etc.

        Wouldn't it be hard for this system they're talking about to track that?

      • Comment removed based on user account deletion
    • And it doesn't tend to work that well against PhotoDNA, which is likely what will power the core of this service. PhotoDNA is definitely not insurmountable, but you'd have to warp the heck out of the picture, like applying effects filters, in order to really get around it.
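
      For a concrete sense of the weakness, a quick check in Python (using the open-source imagehash library as a stand-in; PhotoDNA itself is proprietary and more robust):

          from PIL import Image
          import imagehash

          img = Image.open("photo.jpg")
          h_orig = imagehash.phash(img)
          h_mirror = imagehash.phash(img.transpose(Image.Transpose.FLIP_LEFT_RIGHT))

          # A plain pHash is not flip-invariant, so this distance is usually
          # large; a service has to hash mirrored/rotated variants as well.
          print(h_orig - h_mirror)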

    • by Anonymous Coward

      Back when I was a teen, before home Internet was a thing, I used to switch to the scrambled adult channel when I was alone, where a boob might be visible for a fraction of a second.

      But today, when porn is a quick search away, why would anyone deal with a heavily altered picture when they have access to more porn than they can safely handle?

      I'm sure some people will get off on seeing someone they know, but if it's that doctored up, what's going to be the point?

    • They claim that they can identify deep fakes, so it sounds like they have a failsafe of:

      IF FacialRecognition($Image) == $Person && IsNSFW($Image)
      THEN BLOCK()

      Also, image hashing of distorted, warped, or reflected images is getting really good. There are lots of strategies, like breaking the image into small blocks and comparing something like an FFT decomposition rather than matching pixels, plus numerous other far more sophisticated approaches.
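
      Roughly the DCT flavor of that idea, sketched in Python (a simplified pHash; production systems like PhotoDNA are far more elaborate):

          import numpy as np
          from PIL import Image
          from scipy.fftpack import dct

          def dct_hash(path, hash_size=8):
              # Drop color, scale, and fine detail before hashing.
              img = Image.open(path).convert("L").resize((32, 32))
              pixels = np.asarray(img, dtype=np.float64)
              # 2-D DCT; the low-frequency corner captures coarse structure.
              freq = dct(dct(pixels, axis=0, norm="ortho"), axis=1, norm="ortho")
              low = freq[:hash_size, :hash_size]
              # One bit per coefficient: above or below the median.
              return (low > np.median(low)).flatten()

          # Hamming distance between two such bit vectors measures similarity.
          def hash_distance(a, b):
              return int(np.count_nonzero(a != b))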

    • You submit your images to them and they will ensure they won't get posted. What could possibly go wrong? I'm guessing their database of images will be dumped in only two years.
  • Wait, what!? (Score:5, Insightful)

    by elgholm ( 2746939 ) on Thursday December 02, 2021 @12:37PM (#62040189)

    So, to stop my sexually explicit images from being shared all over the Internet, I'm supposed to upload them to a website?

    Yeah... That'll be a no.

    This needs to be an app, or an installable program on your computer, so you can make the digital fingerprints locally.

    • Re:Wait, what!? (Score:4, Insightful)

      by rjune ( 123157 ) on Thursday December 02, 2021 @01:47PM (#62040559)

      You're uploading them not just to any website, but to Facebook. Yeah, this is going to be great! They take such good care of personal data.

    • Re:Wait, what!? (Score:5, Informative)

      by CODiNE ( 27417 ) on Thursday December 02, 2021 @02:31PM (#62040751) Homepage

      What you suggest is what they're doing.

      StopNCII.org ... will not have access to or store copies of the original images. Instead, they will be converted to hashes in users' browsers, and StopNCII.org will get only the hashed copies.

      • by khchung ( 462899 )

        What you suggest is what they're doing.

        StopNCII.org ... will not have access to or store copies of the original images. Instead, they will be converted to hashes in users' browsers, and StopNCII.org will get only the hashed copies.

        And how would you know the browser was not, in parallel, sending another copy of the original to another destination?

        Do you trust FB not to do something so nefarious?

          • Because conspiracy theories are pointless, and with data privacy laws these days having actual teeth in many parts of the world, the idea that a company would be so mind-bogglingly stupid as to do that is quite far-fetched.

          Facebook is actually quite reliable. They have always done what they said. What they have said has been terrifying, but that's no reason to believe they suddenly would not do what they say.

    • Holy Heck...I came here to say the same thing.

      What could possibly go wrong when you have a website whose sole purpose is to collect intimate/explicit images from the whole world? At least when Target gets breached, all I lose is a credit card number. When this place gets hacked, it's going to be a mess.

      Supposedly it doesn't store any copies of those images but only generates a hash... until we find out that someone was sloppy in the code and, oops, it does store a JPEG in the cache or something.

    • by saider ( 177166 )

      Easy solution: how about just not taking sexually explicit images in the first place? It won't stop the secret filming, but I'm guessing 80-90% of the pics and vids are consensual.

      People really should reconsider a relationship if someone says "hey, let's make a video on *my* phone".

    • Re:Wait, what!? (Score:4, Interesting)

      by Anubis IV ( 1279820 ) on Thursday December 02, 2021 @04:07PM (#62041141)

      So, to stop my sexually explicit images from being shared all over the Internet, I'm supposed to upload them to a website?

      Yeah... That'll be a no.

      This needs to be an app, or an installable program on your computer, so you can make the digital fingerprints locally.

      The whole idea is a complete non-starter, regardless of which way you go.

      Put the fingerprinting tool online? We have no guarantees images will be properly protected. Most of us have heard stories about TSA agents taking cell phone pics of scanner images of well-endowed people, "Geek Squad" types scouring computers for nude pics, hospital workers looking up the medical records of celebrities in the hospital's care, or other instances of people abusing access.

      Give us a local fingerprinting tool? The services have no way of verifying that the images are valid. Trolls or people with a bone to pick could generate fingerprints for every frame of every movie, every picture their ex has ever posted to Facebook, or the logos for every company they dislike, effectively erasing their targets from Facebook.

      The CSAM (read: child porn) databases operated by organizations like the National Center for Missing and Exploited Children "work" because the images are already in the wild when they are collected, meaning the damage is already done and the images are verifiable. Unfortunately, all they can do is work to prevent additional harm after harm has already been done.

  • So.... (Score:5, Insightful)

    by Joce640k ( 829181 ) on Thursday December 02, 2021 @12:38PM (#62040193) Homepage

    Can I submit ordinary photos of myself and disappear from the internet, or are the images "reviewed" by the people who work there?

    • Re:So.... (Score:5, Funny)

      by ewibble ( 1655195 ) on Thursday December 02, 2021 @01:06PM (#62040325)

      What I think we can do is download advertising from companies you don't like and upload it to this site, so every one of their ads on Facebook automatically gets blocked.

      My daughter reports every post she gets that doesn't contain animals; this is similar, except nobody can actually examine the images because of privacy concerns.

  • by Stolovaya ( 1019922 ) <skingiii.gmail@com> on Thursday December 02, 2021 @12:38PM (#62040197)

    Until StopNCII.org gets hacked and then a treasure trove of revenge porn is released.

    But it's not like I have any better idea.

    • Uploading only a fingerprint of the image, not the complete image, seems like a better solution.

      • Might be a little more niche, but still services the 'metadata' fetish adequately.
      • Re: (Score:3, Insightful)

        by ewibble ( 1655195 )

        How about not letting someone take images of you in the first place? Of course, that only works if the image was taken with your consent; but then, you need to have the image for this tool to work, so you probably gave consent. And sure, once you find an image of you, you can upload it, but then what's stopping someone from uploading every image to the site?

        Or you could just not care; what's wrong with your body? I think revenge porn is about the other person hurting you; don't give them the satisfaction. There is plenty of porn

        • Tell that to people who lost their job as a primary school teacher because of a leaked naughty pic.
          • Tell that to the teenager thrown out of their family because of a naughty pic -- happens regularly to non-straight kids, less frequently but still to straight kids.

        • Hey, Rip Van Winkle, not sure where you've been, but there's a new technology called deepfakes that can generate very credible images and videos of naked people by placing the face of the victim on the body of a porn actress/actor. No need to take real photos of the victim, or to blame them. The summary mentions the term
      • Per the article, that is what they do: "StopNCII.org ... will not have access to or store copies of the original images. Instead, they will be converted to hashes in users' browsers, and StopNCII.org will get only the hashed copies."

        • Yeah, we are trusting that the conversion happens *on their side*. The photo should never leave the uploader's hard drive.

          • It says in my quote that it happens on the user's side, not their side.

            • Ah. I didn't interpret "in the user's browser" as necessarily being on the user's computer. I thought that was just the tool used for uploading. I see the point. You're correct, my mistake.

    • What, as they say, could possibly go wrong...
    • What about just basic facial recognition using a non-pornographic photo, or even a driver's license photo? This would also allow you to block videos that you don't have a copy of or might not know exist. The only two problems I see with this are that you need buy-in from a bunch of sketchy porn sites, and that you have to have some way to deal with false positives.

    • by JackieBrown ( 987087 ) on Thursday December 02, 2021 @01:20PM (#62040413)

      The article says the image is not stored or even received.

      StopNCII.org ... will not have access to or store copies of the original images. Instead, they will be converted to hashes in users' browsers, and StopNCII.org will get only the hashed copies.

    • by GoRK ( 10018 )

      It doesn't need to be hacked. It just needs to have a bunch of "non-NCII" images uploaded to it to the extent that its ability to properly function is completely ruined. Given that we have GANs that can generate limitless quantities of such material, it's a sure bet that this will occur.

  • If you don't want your nudes flying all over the internet, just send them to this central repo of nudes owned by a company you can trust with your intimate data. xD Yeah, right...
  • Here's an idea. Not sure it will work, but it might be a bit easier to implement.

    Don't let your partner take nudes of you and don't take nudes of yourself.

    Or is that too simple an idea?

    • by dysmal ( 3361085 )

      Think of the starving venture capitalists who will be unable to afford their cocaine habit if they can't profit off of this technology because of your low-tech solution, you insensitive clod!

    • by Somervillain ( 4719341 ) on Thursday December 02, 2021 @01:27PM (#62040459)

      Here's an idea. Not sure it will work, but it might be a bit easier to implement.

      Don't let your partner take nudes of you and don't take nudes of yourself.

      Or is that too simple an idea?

      Too simple? Not sure... it's definitely a stupid one. If you're asexual, cool, you do you. Many of us like to be sexually aroused. We like to sexually arouse our partners and especially to receive sexually explicit images from them. Most assume their husbands won't share the images. Also, many of these images were shared without anyone's consent... a simple device or cloud-provider leak, not necessarily a disgruntled partner.

      People are going to take dirty pics. It's a lot of fun. It's stupid and judgmental to shame people for doing so, which you're definitely doing.

      On a similar note, I hate weed. I don't like getting high. However, some do. It's their right. I don't want them getting fucking poisoned by the dispensary. Just because someone engages in activity that doesn't do it for me, doesn't mean I am indifferent to harm that befalls them. Same thing with gun owners. I don't enjoy shooting guns enough any more to own one. It's not for me. Regardless, I don't wish harm on responsible gun owners exercising their 2nd amendment rights.

      Yup, never taking pics decreases the chances of them getting shared. It doesn't eliminate threats from hidden cameras, peeping toms and such. However, by your logic, the AIDS crisis could have been stopped if people just never had sex. It's dumb. People are going to fuck. People are going to get high. People are going to shoot guns for fun. People are going to take naked pics. It may not be what you're into, but you don't need to be so judgmental. It's kind of a dick comment. Slut shaming is a shitty thing to do. Why do you want to discourage women from sharing pics of their bodies?

      • It's more of an extension of the old saying, "don't post anything on the internet that you don't want there forever".

        So, extending that: don't risk yourself by letting compromising images be taken of you, because sure as shit there is ALWAYS going to be a risk that they will end up on the internet.

        You don't eliminate all risk of course, but that's the same with your AIDS analogy.

        Of course people will have sex and do drugs, but you can lower the chance of catching or passing on AIDS by not sharing needle

      • by saider ( 177166 )

        The fun of dirty pics has to be balanced against the quality of your character judgement.

        "What is the chance that this person will act spitefully if I were to break up and have a little fun with someone else?"

        Sadly, many people lack good judgement in this area and think their current significant other would *never* do anything like that. They often reach this conclusion after the third date or so.

        Age-old advice I learned a long time ago - "Do not record anything you would not be willing to show in court wit

        • The fun of dirty pics has to be balanced against the quality of your character judgement.

          "What is the chance that this person will act spitefully if I were to break up and have a little fun with someone else?"

          Sadly, many people lack good judgement in this area and think their current significant other would *never* do anything like that. They often reach this conclusion after the third date or so.

          Age-old advice I learned a long time ago - "Do not record anything you would not be willing to show in court with grandma by your side"

          Take a step back and consider that sharing intimate pics without consent is a CRIME. You're telling crime victims they should have listened to your grandma, roughly speaking. You're indirectly slut-shaming them. You're shaming them for being sexual with people who committed crimes against them and for not anticipating that their romantic partners would be criminals... or just get their iCloud account hacked... many revenge porn victims had their images shared without consent from either party.

          While I avoid walking

      • Slut shaming is a shitty thing to do. Why do you want to discourage women from sharing pics of their bodies?

        Yes, how dare you point out that actions have consequences, including risks!

        We want something, and therefore it is good and can have no bad consequences, ever! Don't you understand that?

        • Slut shaming is a shitty thing to do. Why do you want to discourage women from sharing pics of their bodies?

          Yes, how dare you point out that actions have consequences, including risks!

          We want something, and therefore it is good and can have no bad consequences, ever! Don't you understand that?

          You know, revenge porn is a crime. You're blaming the victim. Do you blame mugging victims? If someone gets mugged walking down the street, is it their fault? As you said, "actions have consequences." Long ago, some dude I'd never seen before in my life punched me in the face on the subway, then tried to steal my wallet. I guess that's my fault for... what? Riding the subway? Carrying a wallet?

          Sharing pics without consent is a crime. It's illegal and shitty. You're being shitty by blaming th

    • Or is that too simple an idea?

      Yep, way too simple. You're ignoring the entire concept of human nature. Incidentally, you really shouldn't let these humans know we are among them. Try to be more careful in the future, smooth wombat.

  • This approach only works if you, a potential victim, have the nudes in question. A proper block on images which are not age-appropriate would be automatic and would not require users to go uploading things or generating fingerprints derived from their intimate pics.

    Last I checked, Facebook and Instagram are 13+ services, so why not block anything beyond what the company would consider age appropriate in the first place?
  • Of all the things Facebook finds problematic, thank the heavens they are investing in figuring out how to find and delete revenge porn.

    I mean, I totally get it. It's hard to stabilize world governments after a 12-year run at destabilizing all governments on planet Earth. Focus on one issue at a time and work from there. Build a roadmap and make sure you don't get ahead of yourself. Invest in study groups so you can build out a plan for the next decade once you figure out what works for you.

    If it weren't

  • If it looks for an exact hash, that is dumb. If it looks for a near hash or uses biometric validation, that may trigger false positives. Still, I think the latter is better. Maybe couple that with notifying the victim or some delegated entity?
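
    Near-hash matching would look something like this (a toy sketch; the threshold and the imagehash library are assumptions, not anything StopNCII has published):

        from PIL import Image
        import imagehash

        # Looser threshold survives more edits but risks more false positives.
        MAX_DISTANCE = 10

        def matches_blocklist(path, blocklist):
            h = imagehash.phash(Image.open(path))
            # Flag the upload if it is "near" any blocklisted fingerprint.
            return any(h - known <= MAX_DISTANCE for known in blocklist)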

  • All well and good until the inevitable security breach or sale of submitted photos.

  • You have to upload porn of yourself as a basis for heuristic comparison.
  • by Mal-2 ( 675116 ) on Thursday December 02, 2021 @01:08PM (#62040341) Homepage Journal

    I'll grab popcorn and wait for people to submit thousands or millions of innocuous photos, causing a whole lot of innocent people to suddenly have to defend themselves, or at the very least, demand an explanation why their vacation photos got blocked because someone filed a picture of a nude sculpture as revenge porn, and it's in the background of their photos.

  • Uploading the images you do not want shared to a central repository seems like something that no one in their right mind would do. It also implies that all parties have a copy of the image they can upload. FB has decent facial recognition and AI experience. How feasible would it be for them to detect potentially explicit images, match the faces to their user base, and ask those users if they are OK with the image being uploaded by a third party?
  • You upload your dirty, dirty pictures to my server and if I find them on Facebook, I'll have them taken down.

    And hey, it's free!

  • This is like when Philip Morris changed their name to Altria or something. Why let these ghouls rebrand themselves? We're largely powerless to stop them, but we can at least try to make them own their own bad behavior.
  • What's to stop a bunch of anti-porn church ladies from uploading every actual porn image they can find, and how's this new thing going to tell the difference between revenge porn and ordinary porn?

    • Nah, the AI has gotten MUCH better. Those kinds of changes won't do it. You have to render the image almost unrecognizable and by that time you won't be able to recognize the person in the image anyway.
      • Then this increases the chances of false positives. It's highly doubtful that their "AI" is sophisticated enough to analyze an image, compare it to another that a human would judge to be the same, and make a 100% correct call every time.

        I wouldn't be at all surprised if it does no more than spot sampling, to save on processing in a system that is receiving thousands of still image files every *second* (never mind videos). This guarantees that unrelated images will "collide" with what's in the database.

  • The person who takes a photo owns that photo. Unless there is a contract beforehand about photo distribution, the model does not have any say. If someone takes a selfie, they own the photo. If they let their significant other take the photo and they don't have some kind of contract written out, that person can LEGALLY distribute those photos in just about any way they choose. This doesn't apply to something like a candid photo, because if you are in your own home you have an expectation of privacy. However if someone
    • by DarkOx ( 621550 )

      Twitter's plan might be stupid, but there is no reason it would be illegal. Twitter isn't preventing YOU from otherwise distributing a photo you took; THEY are just saying they won't distribute it if any of the subjects of the photo ask them not to.

      What twitter chooses to publish or not publish on THEIR site is THEIR business.

      This IS the current legal status, at least on the surface, as long as CDA 230 lives. It does not matter what moralizing or policy arguments anyone wants to make about who the publishers are,

    • by Shag ( 3737 )

        The person who takes a photo owns that photo. Unless there is a contract beforehand about photo distribution, the model does not have any say.

      If you're talking candid street photography, then probably. Otherwise... I think there have been quite a few court cases precisely because people assumed that; your view of things isn't exactly settled law.

  • 1) Submit your images.
    2) StopNCII asks people to confirm that they are in an image.

    Yay! No more unsourceable porn.

  • So, the company that recently had a data breach with over a billion (with a B) records/users compromised wants women to upload their pics so they can create a hash of them to be used to scan the web...

    I think they are going to be the target of the next major hack...

    #Fappening2022

  • Man, this is great. I can't think of a single thing that could possibly go wrong with this plan. Not one!

    This is a great story to read on the first day I ever connected to the internet or interacted with a computer of any kind.

  • What happens when the photos get leaked from StopNCII.org?
  • The point is that you do not need to upload the entire object, just the hash.

    They want the pic to be sure you are not lying about it being revenge porn. But they do not need to do that. Once the pic is found, THEN they can confirm it is nudity, not before.

  • I've never seen a dude publicly share pictures of messages he received in private.

    I've seen plenty of women do that though, usually to mock or shame the person in question.

    And yet it's women pretending they're the victims.

  • We just need to make sure it's you so we can let the FBI^Wanti porn system know where to send the MIB^W^W^W^W^Wthat the image should not be displayed on our webpages.
