Meta Builds Tool To Stop the Spread of 'Revenge Porn' (nbcnews.com) 94
Facebook's parent company, Meta, has worked with the U.K.-based nonprofit Revenge Porn Helpline to build a tool that lets people prevent their intimate images from being uploaded to Facebook, Instagram and other participating platforms without their consent. From a report: The tool, which builds on a pilot program Facebook started in Australia in 2017, launched Thursday. It allows people who are worried that their intimate photos or videos have been or could be shared online, for example by disgruntled ex-partners, to submit the images to a central, global website called StopNCII.org, which stands for "Stop Non-Consensual Intimate Images."
"It's a massive step forward," said Sophie Mortimer, the helpline's manager. "The key for me is about putting this control over content back into the hands of people directly affected by this issue so they are not just left at the whims of a perpetrator threatening to share it." Karuna Nain, Meta's director of global safety policy, said the company had shifted its approach to use an independent website to make it easier for other companies to use the system and to reduce the burden on the victims of image-based abuse to report content to "each and every platform." During the submission process, StopNCII.org gets consent and asks people to confirm that they are in an image. People can select material on their devices, including manipulated images, that depict them nude or nearly nude. The photos or the videos will then be converted into unique digital fingerprints known as "hashes," which will be passed on to participating companies, starting with Facebook and Instagram.
Next up (Score:3)
Cue a bunch of websites that reflect/warp/crop the images so the fingerprints don't work any more.
Re: (Score:2)
Tineye seems to be able to deal with this.
Re:Next up (Score:5, Funny)
Re: (Score:2)
Re: (Score:2)
I was thinking: if someone really wanted to invest in it, they could just get pics of the person and use deepfake software, which is becoming more freely usable and available, to churn out an almost never-ending stream of revenge pr0n, with different situations, different acts and locations, etc.
Wouldn't it be hard for the system they're talking about to track that?
Re: (Score:2)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Re: (Score:1)
Pedophiles have tried that... (Score:3)
And it doesn't tend to work that well against PhotoDNA, which is likely what will power the core of this service. PhotoDNA is definitely not insurmountable, but you'd have to warp the heck out of the picture, like applying effects filters, in order to really get around it.
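A toy difference-hash (dHash) sketch in Python illustrates why: a perceptual hash encodes coarse structure rather than exact pixels, so a uniform edit like brightening leaves it unchanged, while a reflection (as proposed above) flips every bit of this naive version. This is an illustrative stand-in, not the actual PhotoDNA algorithm, which is proprietary and far more robust.

```python
def dhash(pixels):
    """pixels: 8 rows of 9 grayscale values.
    Returns a 64-bit hash: one bit per adjacent-pixel comparison."""
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")

# A fake "image": a smooth horizontal gradient.
original = [[col * 10 for col in range(9)] for _ in range(8)]
brightened = [[p + 37 for p in row] for row in original]  # pixel ordering unchanged
mirrored = [row[::-1] for row in original]                # pixel ordering reversed

assert hamming(dhash(original), dhash(brightened)) == 0   # survives brightening
assert hamming(dhash(original), dhash(mirrored)) == 64    # reflection defeats it
```

Real systems compare against a distance threshold and hash multiple orientations as well, which is why simple mirroring alone is not a reliable bypass.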
Re: (Score:1)
Back when I was a teen, before home Internet was a thing, when I was alone I used to switch to the scrambled adult channel, on which a boob might be visible for a fraction of a second.
But today, when porn is a quick search away, why would anyone deal with a heavily altered picture when they have access to more porn than they can safely deal with?
While I am sure some people will get off on seeing someone they know, if it is so doctored up, what is going to be the point?
Re: (Score:2)
They claim that they can identify deepfakes, so it sounds like they have a failsafe along the lines of:
IF FacialRecognition($Image) == $Person AND IsNSFW($Image)
THEN Block($Image)
Also, image hashing of distorted, warped, or reflected images is getting really good. There are lots of strategies, like breaking the image into small blocks and looking at something like an FFT decomposition rather than matching pixels, plus numerous other far more sophisticated approaches.
Re: (Score:1)
Re: (Score:1)
Wait, what!? (Score:5, Insightful)
So, to stop my sexually explicit images from being shared all over the Internet, I'm supposed to upload them to a website?
Yeah... That'll be a no.
This needs to be an app, or an installable program on your computer, so you can make the digital fingerprints locally.
Re:Wait, what!? (Score:4, Insightful)
You're uploading them not just to any website, but to Facebook. Yeah, this is going to be great! They take such good care of personal data.
Re:Wait, what!? (Score:5, Informative)
What you suggest is what they're doing.
Re: (Score:3)
What you suggest is what they're doing.
And how would you know the browser was not, in parallel, sending another copy of the original to another destination?
Do you trust FB not to do something so nefarious?
Re: (Score:2)
Because conspiracy theories are pointless, and with data privacy laws these days having actual teeth in many parts of the world, the idea that a company would be so mind-bogglingly stupid as to do that is quite far-fetched.
Facebook is actually quite reliable. They have always done what they said. What they have said has been terrifying, but that's no reason to believe they suddenly would not do what they say.
Re: (Score:2)
Holy Heck...I came here to say the same thing.
What could possibly go wrong when you have a website whose sole purpose is to collect intimate/explicit images from the whole world? At least when Target gets breached, all I lose is a credit card number. When this place gets hacked, it's going to be a mess.
Supposedly it doesn't store any copies of those images but only generates a hash...until we find out that someone was sloppy in the code and oops, it does store a JPEG in a cache or something.
Re: (Score:3)
Easy solution: how about just not taking sexually explicit images in the first place? It won't stop the secret filming, but I'm guessing 80-90% of the pics and vids are consensual.
People really should reconsider a relationship if someone says "hey, let's make a video on *my* phone".
Re:Wait, what!? (Score:4, Interesting)
So, to stop my sexually explicit images from being shared all over the Internet, I'm supposed to upload them to a website?
Yeah... That'll be a no.
This needs to be an app, or an installable program on your computer, so you can make the digital fingerprints locally.
The whole idea is a complete non-starter, regardless of which way you go.
Put the fingerprinting tool online? We have no guarantees images will be properly protected. Most of us have heard stories about TSA agents taking cell phone pics of scanner images of well endowed people, "Geek Squad" types scouring computers for nude pics, hospital workers looking up the medical records of celebrities in the hospital's care, or other instances of people abusing access.
Give us a local fingerprinting tool? The services have no way of verifying that the images are valid. Trolls or people with a bone to pick could generate fingerprints for every frame of every movie, every picture their ex has ever posted to Facebook, or the logos for every company they dislike, effectively erasing their targets from Facebook.
The CSAM (read: child porn) databases operated by organizations like the National Center for Missing and Exploited Children "work" because the images are already in the wild when they are collected, meaning the damage is already done and the images are verifiable. Unfortunately, all they can do is work to prevent additional harm after harm has already been done.
So.... (Score:5, Insightful)
Can I submit ordinary photos of myself and disappear from the internet or are the images "revised" by the people who work there?
Re:So.... (Score:5, Funny)
What I think we can do is download advertising from companies you don't like and upload it to this site; then every matching ad on Facebook automatically gets blocked.
My daughter reports every post she gets that doesn't contain animals; this is similar, except nobody can actually examine the images because of privacy concerns.
Re: (Score:2)
Sounds like a good idea... (Score:5, Insightful)
Until StopNCII.org gets hacked and then a treasure trove of revenge porn is released.
But not like I have any better idea.
Re: (Score:2)
Uploading only a fingerprint of the image, not the complete image, seems like a better solution.
Re: (Score:2)
Re: Sounds like a good idea... (Score:3)
Re: (Score:3, Insightful)
How about not letting someone take images of you in the first place? Of course, that only works if the image was taken with your consent. But you need to have the image for this system to work, so you probably gave consent. And once you do find an image of yourself and upload it, what's stopping someone from uploading every image to the site?
Or you could just not care. What's wrong with your body? I think revenge porn is about the other person hurting you; don't give them the satisfaction. There is plenty of porn out there.
Re: (Score:2)
Re: (Score:2)
Tell that to the teenager thrown out of their family because of a naughty pic -- happens regularly to non-straight kids, less frequently but still to straight kids.
Re: Sounds like a good idea... (Score:1)
Re: (Score:2)
Per the article, that is what they do: "StopNCII.org .. will not have access to or store copies of the original images. Instead, they will be converted to hashes in users’ browsers, and StopNCII.org will get only the hashed copies."
Re: (Score:2)
Yeah, we are trusting that the conversion happens *on their side*. The photo should never leave the uploader's hard drive.
Re: (Score:2)
It says in my quote that it happens on the user's side not their side.
Re: (Score:2)
Ah. I didn't interpret "in the user's browser" as necessarily being on the user's computer. I thought that was just the tool used for uploading. I see the point. You're correct, my mistake.
Re: (Score:2)
Re: (Score:2)
What about just basic facial recognition using a non-pornographic photo or even a driver's license photo? This would also let you block videos that you don't have a copy of or might not know exist. The only two problems I see are that you need buy-in from a bunch of sketchy porn sites, and you have to have some way to deal with false positives.
Re:Sounds like a good idea... (Score:5, Informative)
The article says the image is not stored or even received.
StopNCII.org .. will not have access to or store copies of the original images. Instead, they will be converted to hashes in users’ browsers, and StopNCII.org will get only the hashed copies.
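For what it's worth, the property the quote describes can be sketched in a few lines: the fingerprint is computed on the user's device, and only a fixed-size digest ever leaves it. StopNCII reportedly uses a perceptual hash in the browser; the cryptographic SHA-256 below is just a stand-in to illustrate the one-way, fixed-size upload.

```python
import hashlib

def local_fingerprint(image_bytes: bytes) -> str:
    """Runs on the user's device; only this hex digest is submitted."""
    return hashlib.sha256(image_bytes).hexdigest()

digest = local_fingerprint(b"\x89PNG fake image bytes")
assert len(digest) == 64                          # fixed size, whatever the input
assert digest != local_fingerprint(b"different")  # distinct images, distinct digests
```

A real deployment would use a perceptual hash so re-encoded copies still match; an exact cryptographic hash like this would miss them.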
Re: (Score:2)
Plus 1
Re: (Score:3)
So, so surprised they anticipated the obvious objection and avoided it!
Re: (Score:2)
It doesn't need to be hacked. It just needs to have a bunch of "non-NCII" images uploaded to it to the extent that its ability to properly function is completely ruined. Given that we have GANs that can generate limitless quantities of such material, it's a sure bet that this will occur.
Yup sounds legit (Score:2)
Or. . . (Score:1)
Here's an idea. Not sure it will work, but it might be a bit easier to implement.
Don't let your partner take nudes of you and don't take nudes of yourself.
Or is that too simple an idea?
Re: (Score:2)
Think of the starving venture capitalists who will be unable to afford their cocaine habit if they can't profit off of this technology because of your low tech solution you insensitive clod!
Re: Or. . . (Score:2)
Some revenge porn is taken via hidden camera, when the person is not aware.
So your solution to AIDS was to stop having sex? (Score:5, Insightful)
Here's an idea. Not sure it will work, but it might be a bit easier to implement.
Don't let your partner take nudes of you and don't take nudes of yourself.
Or is that too simple an idea?
Too simple? Not sure...it's definitely a stupid one. If you're asexual, cool, you do you. Many of us like to be sexually aroused. We like to sexually arouse our partners and especially to receive sexually explicit images from them. Most assume their partners won't share the images. Also, many images were shared without anyone's consent...a simple device or cloud-provider leak, not necessarily a disgruntled partner.
People are going to take dirty pics. It's a lot of fun. It's stupid and judgmental to shame people for doing so, which you're definitely doing.
On a similar note, I hate weed. I don't like getting high. However, some do. It's their right. I don't want them getting fucking poisoned by the dispensary. Just because someone engages in activity that doesn't do it for me, doesn't mean I am indifferent to harm that befalls them. Same thing with gun owners. I don't enjoy shooting guns enough any more to own one. It's not for me. Regardless, I don't wish harm on responsible gun owners exercising their 2nd amendment rights.
Yup, never taking pics decreases the chances of them getting shared. It doesn't eliminate threats from hidden cameras, peeping toms and such. However, by your logic, the AIDS crisis could have been stopped if people just never had sex. It's dumb. People are going to fuck. People are going to get high. People are going to shoot guns for fun. People are going to take naked pics. It may not be what you're into, but you don't need to be so judgmental. It's kind of a dick comment. Slut shaming is a shitty thing to do. Why do you want to discourage women from sharing pics of their bodies?
Re: (Score:2)
So, extending that, don't risk yourself by letting compromising images to be taken of you, because sure as shit there is ALWAYS going to be a risk that it will end up on the internet.
You don't eliminate all risk of course, but that's the same with your AIDS analogy.
Of course people will have sex and do drugs, but you can lower the chance of catching or passing on AIDS by not sharing needles.
Re: (Score:2)
The fun of dirty pics has to be balanced against the quality of your character judgement.
"What is the chance that this person will act spitefully if I were to break up and have a little fun with someone else?"
Sadly, many people lack good judgement in this area and think their current significant other would *never* do anything like that. They often reach this conclusion after the third date or so.
Age-old advice I learned a long time ago - "Do not record anything you would not be willing to show in court with grandma by your side"
You're still slut shaming crime victims (Score:2)
The fun of dirty pics has to be balanced against the quality of your character judgement.
"What is the chance that this person will act spitefully if I were to break up and have a little fun with someone else?"
Sadly, many people lack good judgement in this area and think their current significant other would *never* do anything like that. They often reach this conclusion after the third date or so.
Age-old advice I learned a long time ago - "Do not record anything you would not be willing to show in court with grandma by your side"
Take a step back and consider that sharing intimate pics without consent is a CRIME. You're telling crime victims they should have listened to your grandma, roughly speaking. You're indirectly slut shaming them. You're shaming them for being sexual with people who commit crimes against them and not anticipating their romantic partners would be criminals...or just get their iCloud account hacked...many revenge porn victims had their images shared without consent from either party.
While I avoid walking
Re: (Score:2)
Slut shaming is a shitty thing to do. Why do you want to discourage women from sharing pics of their bodies?
Yes, how dare you point out that actions have consequences, including risks!
We want something, and therefore it is good and can have no bad consequences, ever! Don't you understand that?
Do you apply the same logic to getting carjacked? (Score:2)
Slut shaming is a shitty thing to do. Why do you want to discourage women from sharing pics of their bodies?
Yes, how dare you point out that actions have consequences, including risks!
We want something, and therefore it is good and can have no bad consequences, ever! Don't you understand that?
You know, revenge porn is a crime. You're blaming the victim. Do you blame mugging victims? If someone gets mugged walking down the street, is it their fault? As you said, "actions have consequences." Long ago, some dude I'd never seen before in my life punched me in the face on the subway, then tried to steal my wallet. I guess that's my fault for what...riding the subway? Carrying a wallet?
Sharing pics without consent is a crime. It's illegal and shitty. You're being shitty by blaming the victim.
Re: (Score:2)
Or is that too simple an idea?
Yep, way too simple. You're ignoring the entire concept of human nature. Incidentally you really shouldn't let these humans know we are among them. Try to be more careful in the future smooth wombat.
Why not just block explicit imagery? (Score:2)
Last I checked, Facebook and Instagram are 13+ services, so why not block anything beyond what the company would consider age appropriate in the first place?
Revenge Porn Ruins Society! (Score:2)
Of all the things Facebook finds problematic, thank the heavens they are investing in figuring out how to find and delete revenge porn.
I mean, I totally get it. It's hard to stabilize world governments after a 12 year run at destabilizing all governments on planet earth. Focus on one issue at a time and work from there. Build a roadmap and make sure you don't get ahead of yourself. Invest in study groups so you can build a plan out for the next decade once you figure out what works for you.
If it weren't
Hash how (Score:2)
If it looks for an exact hash, that is dumb. If it looks for a near hash or biometric validation, that may trigger false positives. Still, I think the latter is better. Maybe couple it with notifying the victim or some delegated entity?
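A near-hash lookup of the kind described can be sketched as Hamming-distance matching. The 10-of-64-bit threshold below is an illustrative assumption; raising it catches more edited copies at the cost of more false positives, and exact matching is just the threshold=0 special case.

```python
def hamming(a: int, b: int) -> int:
    """Count of differing bits between two perceptual hashes."""
    return bin(a ^ b).count("1")

def near_match(h1: int, h2: int, threshold: int = 10) -> bool:
    """Flag a hash as a match if it is within `threshold` bits."""
    return hamming(h1, h2) <= threshold

known     = 0b1111000011110000
exact     = known               # identical re-upload
recoded   = known ^ 0b11        # two bits flipped by re-encoding
unrelated = known ^ 0xFFFF      # every bit differs

assert near_match(known, exact)          # exact matching catches this too
assert near_match(known, recoded)        # only near-matching catches this
assert not near_match(known, unrelated)  # a distinct image is not flagged
```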
All well and good u. . . . (Score:2)
All well and good until the inevitable security breach or sale of submitted photos.
There's just one catch... (Score:2)
Because this will never be abused... (Score:4, Interesting)
I'll grab popcorn and wait for people to submit thousands or millions of innocuous photos, causing a whole lot of innocent people to suddenly have to defend themselves, or at the very least, demand an explanation why their vacation photos got blocked because someone filed a picture of a nude sculpture as revenge porn, and it's in the background of their photos.
Facial recognition? (Score:2)
Has you revenge porn been STOLEN on the web? (Score:2)
Upload here to check!
I have an idea how to prevent that (Score:2)
You upload your dirty, dirty pictures to my server and if I find them on Facebook, I'll have them taken down.
And hey, it's free!
Can we just call them Facebook (Score:2)
Re: Can we just call them Facebook (Score:2)
This is the old name game switcheroo to try to throw the public off track.
Seems to work well because they are still using this tired old tactic.
I thought FB, IG, etc.. banned all porn because they don't want a megachurch pastor to freak out over little Johnny seeing a nipple.
Re: Can we just call them Facebook (Score:2)
^Ironic because these pastors give little Johnny a book which is the most hardcore porn ever written.
The "revenge" part might be hard for AI (Score:2)
What's to stop a bunch of anti-porn church ladies from uploading every actual porn image they can find, and how's this new thing going to tell the difference between revenge porn and ordinary porn?
Re: The "revenge" part might be hard for AI (Score:2)
Revenge anti porn
Re: The "revenge" part might be hard for AI (Score:2)
Re: The "revenge" part might be hard for AI (Score:2)
A bit of tilting here, some thin lines there (Score:2)
There, that oughta do it!
Re: (Score:2)
Re: A bit of tilting here, some thin lines there (Score:2)
Then this increases the chances of false positives. It's highly doubtful that their 'AI' is sophisticated enough to analyze an image, compare it to another that a human would consider the same, and make a 100% correct call every time.
I wouldn't be at all surprised if it does no more than spot sampling, to save on processing in a system that is receiving thousands of still image files every *second* (never mind videos). This guarantees that unrelated images will 'collide' with what's in the database.
The thing is (Score:1)
Re: (Score:2)
Twitter's plan might be stupid, but there is no reason it would be illegal. Twitter isn't preventing YOU from otherwise distributing a photo you took; THEY are just saying they won't distribute it if any of the subjects of the photo ask them not to.
What Twitter chooses to publish or not publish on THEIR site is THEIR business.
This IS the current legal status, at least at the surface, as long as CDA-230 lives. It does not matter what moralizing or policy arguments anyone wants to make about who the publishers are,
Re: (Score:1)
The person who takes a photo owns that photo. Unless there is a contract beforehand about photo distribution the model does not have any say.
If you're talking candid street photography, then probably. Otherwise... I think there have been quite a few court cases from people thinking that; your view of things isn't exactly settled law.
So the process is ... (Score:2)
1) Submit your images.
2) StopNCII asks people to confirm that they are in an image.
Yay! No more unsourceable porn.
Secure?? (Score:1)
So, the company that recently had a data breach with over a billion (with a B) records/users compromised wants women to upload their pics so it can create a hash of them to be used to scan the web...
I think we are going to see the target of the next major hack be them...
#Fappening2022
Great Idea! (Score:2)
Man, this is great. I can't think of a single thing that could possibly go wrong with this plan. Not one!
This is a great story to read on the first day I ever connected to the internet or interacted with a computer of any kind.
serious security risk (Score:1)
Whole point of a hash... (Score:2)
is that you do not need to upload the entire object, just the hash.
They want the pic to be sure you are not lying about it being revenge porn. But they do not need to do that. Once the pic is found, THEN they can confirm it is nudity, not before.
Funny how it's women who complain when they are th (Score:2)
I've never seen a dude publicly share pictures of messages he received in private.
I've seen plenty of women do that though, usually to mock or shame the person in question.
And yet it's women pretending they're the victims.
Confirm your face (Score:2)
We just need to make sure it's you so we can let the FBI^Wanti porn system know where to send the MIB^W^W^W^W^Wthat the image should not be displayed on our webpages.