Facebook To Fight Revenge Porn by Letting Potential Victims Upload Nudes in Advance (bleepingcomputer.com) 370
Catalin Cimpanu, writing for BleepingComputer: Facebook is testing new technology designed to help victims of revenge porn. It works on a database of file hashes, a cryptographic signature computed for each file. Facebook says that once an abuser tries to upload an image marked as "revenge porn" in its database, its system will block the upload. This will work for images shared on the main Facebook service, and also for images shared privately via Messenger, Facebook's IM app. The weird thing is that in order to build a database of "revenge porn" file hashes, Facebook will rely on potential victims uploading a copy of the nude photo in advance. This process involves the victim sending a copy of the nude photo to their own account via Facebook Messenger, which means uploading a copy of the nude photo to Facebook Messenger, the very act the victim is trying to prevent. The victim can then report the photo to Facebook, which will create a hash of the image that the social network will use to block further uploads of the same photo.
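Conceptually, the blocking side reduces to a set-membership check at upload time. A minimal sketch, assuming SHA-256 for simplicity (the database contents and function names are illustrative, and comments below argue a perceptual hash is what would actually be needed):

    import hashlib

    # Hypothetical database of hashes reported as revenge porn.
    blocked_hashes = set()

    def file_hash(path):
        """SHA-256 digest of the file's bytes."""
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def allow_upload(path):
        """Reject the upload if its hash is already in the database."""
        return file_hash(path) not in blocked_hashes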
This is already available (Score:5, Funny)
Re:This is already available (Score:5, Funny)
I already have a service that handles this, just send me the pic and I'll handle it....
I would take you up on this offer. But I would not want to be responsible for your blindness.
Re:This is already available (Score:5, Funny)
A different breed of revenge porn, eh? Where the subject is not the victim?
Re: (Score:3)
My apologies, I didn't realize we'd broadened the scope of relevant media to include art.
Re:This is already available (Score:5, Insightful)
I would also take you up on that offer. But could you please explain to me, first, how you deal with the things that Facebook clearly does not:
- how do you avoid charges of moving and storing child porn if the user is underage?
- how do you make sure that minor changes to the original picture do not produce completely different signatures?
- how do you make sure that none of your employees have access to the originals?
- how do you make sure people upload only pictures in which they are the subject?
- how do you make sure that the mechanism is not used to suppress legitimate pictures?
- etc, etc, etc.
What could possibly go wrong?!
Re: (Score:3)
- how do you avoid charges of moving and storing child porn if the user is underage?
By computing the hashes client-side and transmitting and storing only the hashes, obviously.
- how do you make sure that minor changes to the original picture do not produce completely different signatures?
Wavelet transform. Compute hashes in wavelet space.
- how do you make sure that none of your employees have access to the originals?
By computing the hashes client-side and storing only the hashes, obviously.
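For the curious, here is a minimal sketch of what a client-side perceptual hash could look like. It uses a simple 8x8 "average hash" rather than the wavelet-space hash suggested above, and assumes the Pillow imaging library is installed; only the resulting 64-bit value would ever leave the device:

    from PIL import Image

    def average_hash(path, size=8):
        """Perceptual hash: shrink, grayscale, threshold against the mean.
        Recompression or single-pixel edits barely change the result."""
        img = Image.open(path).convert("L").resize((size, size), Image.LANCZOS)
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = "".join("1" if p > mean else "0" for p in pixels)
        return int(bits, 2)  # a 64-bit integer; this is all that gets uploaded

Near-duplicate images then match when the Hamming distance between their hashes is small.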
Re: (Score:3)
Re:This is already available (Score:5, Insightful)
Don't fucking let someone take pictures or video of you naked and/or having sex!!!!
Sheesh....when did something like common sense about not letting someone take pics of you in compromising situations go out the fucking door?
Re:This is already available (Score:5, Funny)
Re:This is already available (Score:5, Funny)
Well, I thought it was really just that easy until I realized that the reason nobody had ever taken a naked picture of me was because I was ugly.
Don't feel so down on yourself. I've taken plenty of naked pictures of you.
Re:This is already available (Score:5, Funny)
Re:This is already available (Score:5, Funny)
Rodney Dangerfield
Re: (Score:2)
This is no more or less embarrassing than someone getting a tattoo of $current_lover_name. Today we share digitally that which we did physically.
Re: (Score:2)
This is no more or less embarrassing than someone getting a tattoo of $current_lover_name. Today we share digitally that which we did physically.
Don't be so down on yourself. I've taken plenty of nude photos of you through your bedroom window.
Re: (Score:2)
Re: (Score:2)
Re:This is already available (Score:5, Informative)
Also, in the UK, Facebook could be charged with possession of child pornography, and the teen uploading the photo with distribution.
Re: (Score:3)
Well in this particular case since they're expecting you to upload your nudies that implies you took a selfie or some such... in which case I'd like to add an addendum to GP:
and for $DEITY's sake, don't take nudies and send them to someone!
Also I have a tangential question (sorta reverse revenge porn):
1) Minor takes nude selfie and sends it to target (say hated step parent).
2) Reports target for being in possession of CP.
3) now what?
Re: (Score:3, Insightful)
Also I have a tangential question (sorta reverse revenge porn):
1) Minor takes nude selfie and sends it to target (say hated step parent).
2) Reports target for being in possession of CP.
3) now what?
How it works in the US:
3) Police break down the door and drag target to jail
4) Police find evidence on phone
5) Depending on how rich/connected/white target is
a) not: target gets charged with possession of CP and goes to jail. Target is put on list of sex offenders.
b) very: target's lawyer points out the minor sent it, target is innocent. Minor is sent to juvie for distributing CP. Minor is put on list of sex offenders.
c) somewhat: (a) and (b)
Re:This is already available (Score:5, Insightful)
Yup, it will work 100% of the time... except the times when it doesn't, like when someone has hidden a camera in a bathroom, hotel room, or their bedroom...
Or they accidentally upload the photo to their feed instead of the protection service.
Re: (Score:3)
And there have been incidents of people upset about being video recorded who couldn't do anything about it until they realized audio was recorded as well.
Re: This is already available (Score:2)
It's a generation thing.
The new generation is not only completely superficial, but also clueless about privacy in technology.
Re: (Score:3)
Have to retire soon, this is getting too stupid, everywhere you look.
Re: (Score:2)
You left out 'self-absorbed' and 'narcissistic'.
Re: This is already available (Score:5, Insightful)
Re: (Score:3, Interesting)
I have a QUICK solution to all this, works 100%.
Don't fucking let someone take pictures or video of you naked and/or having sex!!!!
Sheesh....when did something like common sense about not letting someone take pics of you in compromising situations go out the fucking door?
Yes, I suppose that's a reasonable solution, if you never want to receive sexy pictures or video from a significant other. Since most people would like to receive such things, blaming the victim and discouraging the practice would seem to run counter to most folks' interests. But not yours [wikipedia.org], I guess.
Re: (Score:3)
That's a pretty big assumption there! I get along fine without sending nudy pics to my wife, and vice versa.
The rest of us appreciate them, though, so pass on our thanks.
Re:This is already available (Score:5, Insightful)
Compromising position? Just do what I do, do not consider them compromising.
WTF is wrong with people? Showing war movies or action movies where people get blown away is OK, but if you were to show a married couple having sex to create a child it would be considered "dirty".
We live in a death culture.
Re:This is already available (Score:4, Interesting)
Problem isn't the victim considering them compromising. The problem is the victim's family, friends, coworkers, boss, etc. considering them compromising.
Really though, it's a generational thing to some extent. By the time the children of the millennials are in their 40s or 50s and running the world, so many of them will have nudes, stupid social media posts, etc. out in the world that it's necessarily going to be a non-issue; for example, employers won't be able to find any employees who fit their "internet purity" conditions.
It's only a problem right now because the generation doing the hiring never really had to deal with these kinds of things, while the generation looking to be hired doesn't really care that much, because everyone they know does it. It's the intersection of those two worlds where everything hits the fan... well, in a generalized sense, of course; there will always be exceptions.
Re:This is already available (Score:4, Insightful)
Then don't look.
Re:This is already available (Score:5, Funny)
Re:This is already available (Score:5, Funny)
"We've noticed your hash is kinda small. Would you be interested in some hash-enlargement pills ?"
Re: (Score:3)
#hashshaming
Re:This is already available (Score:5, Funny)
Right? There's NO WAY this could ever go wrong. Nobody from Facebook will ever see them. Nobody will ever hack Facebook and steal them, and Facebook will never sell them to plastic surgeons under a marketing plan for people that need a little "nip" here and there. </sarcasm>
Re: (Score:2)
Seriously, they could release a "restrict pic" app that you could use to generate the hashes locally and then upload just the hash, but...
All the revenger has to do is change some metadata, or a pixel, and BANG! new hash.
Making this all relatively pointless.
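The grandparent is right about plain cryptographic hashes: change one byte and the digest is unrecognizable. A quick illustration (the byte strings are placeholders):

    import hashlib

    original = b"...jpeg bytes of the photo..."  # placeholder content
    tampered = original[:-1] + b"\x00"           # flip one trailing byte

    print(hashlib.sha256(original).hexdigest())
    print(hashlib.sha256(tampered).hexdigest())
    # The two digests share nothing in common, so an exact-match
    # database is evaded by any trivial edit to the file.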
Re: (Score:3)
It's probably an image signature, not literally a hash. But yeah, if the real intent was as stated, they'd release an app that uploads just the signature. This plan can only end in lulz.
Re:This is already available (Score:5, Funny)
This reminds me of that website where you could enter your credit card number to check if it was leaked to the internet....
What could possibly go wrong? (Score:5, Insightful)
Re:What could possibly go wrong? (Score:4, Interesting)
They should allow the potential victim to upload the hash, and not the image.
Re: (Score:2)
Comment removed (Score:5, Informative)
Re: (Score:2)
The user would need to install their own perceptual hash tool, because a cryptographic hash would be trivially easy to defeat: just flip, add, or remove a single bit in the file and the hash no longer matches.
Re: (Score:3)
They should allow the potential victim to upload the hash, and not the image.
THIS!
However... Everybody knows that you can alter the hash of an image in any number of ways, including simply converting it to another image format or scaling it.
You also know Facebook won't make this happen. The whole idea was to drive a new website with free content.
Re: What could possibly go wrong? (Score:2)
It's not that kind of hash, you doofus.
It's a signature computed by machine-learning computer vision, containing the weights associated with each feature deemed relevant for that application.
Re: What could possibly go wrong? (Score:4, Insightful)
Re: What could possibly go wrong? (Score:5, Insightful)
The point is that facebook would need to store more than just a hash to accomplish their goal -- they need ways to deal with the image being scaled, rotated, run through a filter, etc. In other words... they need to keep a likeness of the original image.
There are simple image signature algorithms that are stable across all those transformations (unless the filter is extreme), but will still make random collisions unlikely. Old technology at this point.
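As a rough illustration of that stability, a "difference hash" compares adjacent pixels on a tiny grayscale grid and tends to survive rescaling and recompression; matching is then a small Hamming distance rather than equality. A sketch, assuming Pillow and illustrative file names:

    from PIL import Image

    def dhash(path, size=8):
        """Difference hash: one bit per adjacent-pixel comparison."""
        img = Image.open(path).convert("L").resize((size + 1, size), Image.LANCZOS)
        px = list(img.getdata())
        bits = [px[r * (size + 1) + c] > px[r * (size + 1) + c + 1]
                for r in range(size) for c in range(size)]
        return sum(b << i for i, b in enumerate(bits))

    def hamming(a, b):
        return bin(a ^ b).count("1")

    # A rescaled copy usually lands within a few bits of the original,
    # while an unrelated image differs in roughly half of the 64 bits.
    same = hamming(dhash("original.jpg"), dhash("rescaled.jpg")) <= 10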
Re: What could possibly go wrong? (Score:3)
Re: (Score:2)
But do you really trust Facebook that much?
Nobody trusts FB. At least nobody should. I don't see any motivation for them to store the image, so I'd like to think they wouldn't, but they do make a habit of collecting everything they can get their mitts on.
If these hashes work the same way as the hashes I'm familiar with, circumvention will require the sophistication to make a minor alteration to a single pixel.
Re:What could possibly go wrong? (Score:4, Informative)
Image hashes typically work in a way such that you can find an image even if minor alterations have been made, such as if you re-compress it, change formats, alter a single pixel, etc. From what I understand, it often involves analysis of the color histogram for initial searches, plus a tiny thumbnail for direct comparison, which would generally be too small to recognize a specific person. This lets you do "fuzzy" matching, unlike a hash like CRC32 or SHA-1, which can only find exact matches (a sketch of the histogram idea follows below).
I agree that this has all sorts of psychological barriers. "Hey, I'm worried about revenge porn, so I'm going to upload all my nude pics I shared with my ex-boyfriend to Facebook for analysis. You know, Facebook, the company that scans all my personal data for profit."
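A sketch of the histogram half of that idea, assuming Pillow; real systems combine several features, but it shows how "fuzzy" similarity differs from exact matching:

    from PIL import Image

    def rgb_histogram(path, size=(64, 64)):
        """Normalized RGB histogram of a downscaled copy of the image."""
        img = Image.open(path).convert("RGB").resize(size)
        hist = img.histogram()  # 768 counts: 256 bins per RGB channel
        total = sum(hist)
        return [h / total for h in hist]

    def intersection(h1, h2):
        """1.0 for identical histograms, near 0 for unrelated ones."""
        return sum(min(a, b) for a, b in zip(h1, h2))

    similar = intersection(rgb_histogram("a.jpg"), rgb_histogram("b.jpg")) > 0.9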
Re: (Score:3)
The link you provided describes a method for finding differences between two images. Using those two images. It says nothing about comparing images based on a hash.
Re:What could possibly go wrong? (Score:4, Interesting)
I know they "claim" they will not keep the pictures, but only a hash of the image. But do you really trust Facebook that much?
Won't take long before the police will pay Facebook to ID the corpses they find.
"Detective Hathaway, run this birthmark that looks like a camel through facebook and see who has a camel shaped birthmark on their arse"
Re: (Score:2)
Re: (Score:3)
Cute (Score:2)
Cute... Facebook pretending they don't have nude photos or a naked composite of everyone already.
Re: (Score:2)
But do you really trust Facebook that much?
If I'm being logical about it, maybe?
Name one PR problem that could cause people to leave Facebook? Evidently showing propaganda for hostile governments trying to destroy us from within didn't generate much heat. But if there's one thing Americans get upset about, it's boobs being shown.
I'd imagine FB is smart enough to realize they might actually get in trouble for letting nude pics they were trusted with slip.
My opinion might be different if anyone at all WANTED to see my nude ass...
Re: (Score:2)
And Facebook won't 'accidentally' use the nude image as a picture when sending out 'potential friend' notices to other people after datamining your shadow profile...
Re: (Score:2)
But do you really trust Facebook that much?
I do. But I wouldn't use their service anyway, because matching hashes is utterly frigging useless. Mind you, this is from Facebook, whose copyright infringement detection system can be defeated by altering the speed of the clip by 1%.
Re: What could possibly go wrong? (Score:2, Informative)
Because English speakers know that "male" pronouns convey no information about gender. English is gender neutral unless female pronouns are used. So, when you want to make neutral statements you don't use female pronouns.
Re: (Score:3)
My penis is a thing of beauty. Everybody should see it!
Not everyone can afford an electron microscope
April Fools Day on Slashdot? (Score:2)
Why not a Porn version of Wikipedia? (Score:5, Funny)
What's wrong with putting all the nudes of every person on Facebook in a database?
What could go Equifax?
PSA (Score:2)
Labia shape hash! (Score:2)
You all laughed at me when I started building my labia shape hash algorithm, modded me funny.
Now you see how serious this issue is.
Ladies, send in your labia prints. Otherwise there can be no guarantee you'll be notified.
Next: Unlock your phone with the new 'snail trails' app.
Why not compute hash locally? (Score:5, Insightful)
The public reaction to this is understandably muted and put off. Why upload nude photos to Facebook, indeed? The claim is that they will compute a hash of the image and store that to prevent future uploads.
If that is really the case, then why not compute the hash locally on the user's machine and upload only the hash? Surely that can be done in a reasonable amount of time on essentially all modern hardware, from cell phones to desktops.
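A minimal sketch of that flow, hashing locally and uploading only the digest. The endpoint URL and JSON shape are invented for illustration (this is not a real Facebook API), and a perceptual hash would be preferable to SHA-256 in practice so trivial edits can't evade the match:

    import hashlib
    import requests  # third-party HTTP library: pip install requests

    def report_photo(path):
        """Hash locally; only the digest ever leaves the machine."""
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        resp = requests.post(
            "https://example.invalid/report-hash",  # hypothetical endpoint
            json={"hash": digest},
            timeout=10,
        )
        resp.raise_for_status()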
Re: (Score:2)
I foresee them analysing the images and following relationship status to advertise tattoo removal.
Re: (Score:2)
Hashing program is named pkzip.
Re: (Score:2)
I'm curious what kind of algorithm they are using for this.
Traditional hashing will hash the entirety of the image. A simple workaround of resizing or cropping the image before uploading will get around it.
Unless they mean fingerprinting. Fingerprinting != Hashing.
Re: (Score:3, Interesting)
We just saw an article related to this. The hash is something like Microsoft's image identifier hash they acquired when they bought... ???...
It basically works like this:
- The image is resized to a standard size (1020x768, I think)
- Converted to black and white
- Edge detection applied
At this point, a trained AI is supposed to be quite accurate at identifying matching photos. This works even if you resize the photo or color-adjust it.
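The first three steps are easy to reproduce with Pillow (the 1020x768 size is only the parent's recollection, not a confirmed spec); the trained-model matching that follows is where the real work lives:

    from PIL import Image, ImageFilter

    SIZE = (1020, 768)  # the standard size as recalled above; unconfirmed

    def preprocess(path):
        """Resize, grayscale, edge-detect: the normalization steps described above."""
        img = Image.open(path).resize(SIZE).convert("L")
        return img.filter(ImageFilter.FIND_EDGES)

    # The resulting edge map is what a trained model would compare against
    # known images; that matching step is not reproduced here.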
Seems like there will be gaping holes to be discovered in this method, though.
Re: (Score:2)
Re:Why not compute hash locally? (Score:5, Insightful)
If that is really the case, when why not compute the hash locally on the user's machine, and upload only the hash?
Cool. I hate CNN's fake news. I'm going to write a script that takes every image from every CNN story and uploads the hashes. Sharing of CNN stories on Facebook is going to be shut down.
s/CNN/whatever you hate/
The obvious corollary here is that Facebook needs not just the hashes but also the original image, so they can determine whether it's a real nude photo. Algorithms can do that pretty well, so Facebook may be able to arrange that no human ever needs to see the image... but there's no way for the uploader to be certain that's what they're doing.
Also, the "hash" probably needs to be something a bit more image-focused than, say, SHA256. Otherwise any trivial modification of the image would change the hash. So it's got to be something that survives scaling, cropping, resolution changes, watermarking, etc. Which means that if the exact algorithm leaks, people can reverse engineer it to figure out how to work around it. That's another reason they need to do the hashing on their end.
Re: (Score:2)
Better yet, write a script that pulls every image from the white house/oval office feed and uploads it as a porn hash. Now we're solving some problems!
Simpler solution (Score:5, Insightful)
If you don't want your nudes to end up on the internet, don't send them to other people.
Re: (Score:3)
If you don't want your nudes to end up on the internet, don't send them to other people.
Better yet... don't take them in the first place.
Re: (Score:2)
Implying that the type of people who consider using revenge porn are the kind of considerate and level-headed thinkers that wouldn't ever take photos without someone's permission in the first place.
Re:Simpler solution (Score:5, Informative)
Oh. Another idiot. What about the ones others took, without you being aware?
If someone else took the picture and the victim is unaware, the target probably doesn't have a copy to upload preemptively.
Re: (Score:3)
Re: (Score:3)
Pick better friends and don't get shitfaced, blacked-out drunk and/or high when you go out. Don't expect other people to be sympathetic when your own poor choices come to bite you in that ass you felt the internet just had to see.
Meh... Facebook can probably generate a fairly accurate nude photo of anyone based upon their skin tone, and various clothed body photos already taken. If they have enough body shots they can probably have an algorithm generate your body shape under those clothes. They can guess at the colour of your nipples and genitalia by looking at the pigmentation of your lips. (nipple colour supposedly similar to lip colour). Pubic hair colour can be guessed at by hair colour- they can't tell if you shave or not,
Re: (Score:3)
What about the ones others took, without you being aware?
Oh. Another idiot. You don't have the hash of those.
Re: (Score:2)
Or you know, infatuated with the idea of personal responsibility. I know I know, it's anathema to a well ordered, safe society... But people being held accountable for their decisions (even if it's negligence on their part, or results in being misled) is still alive and well in this country.
Maybe in another 20 years we'll finally solve this age old problem.
Reprehensible (Score:2)
Forcing users to upload highly sensitive pics to make sure others won't post them.
There HAS to be a better way.... like: how about analyzing an image and computing the hash on a client device, and uploading just the hash + analysis data? Or at the very least.... mask any personally identifying info inside the image before uploading.
Re: (Score:2)
Re: (Score:3)
That signature may be the funniest thing I've ever read on Slashdot.
I'm assuming, of course, that Distiraptor is a vector and Timeraptor is a scalar.
Re: (Score:2)
I *STILL* wouldn't trust Facebook with that. They might say they're doing a local hash, but whoops, we sent the entire image. (Or they say: "We uploaded the image to verify the hashing algo on the client -- we immediately delete it. Honest!")
It's pretty fucking simple. A company with the singular purpose of hoovering up personal information about you (a company so invasive, so creepy, that in the absence of direct data from you, it will INFER information about you and yours), and then sell that information to advertisers.
The proper way to do this (Score:2)
Pills? (Score:2)
I'm terrible, I'm sorry. I couldn't help myself, the article made it too easy.
Is Mark Zuckerberg your friend? (Score:2)
Blocking uploads (Score:2)
" . . . its system will block the upload process."
Given that revenge porn is a crime in an increasing number of places, shouldn't it include "and notify the police of the attempt"? Does it even notify the user of the attempt?
What are the terms of service on these uploads? Do they include the clause that says "and we can change these TOS any time we want, to anything we want, and there's nothing you can do about it"?
Why do you need to send the image? (Score:2)
Re: (Score:2)
Requires Manual Review of Images (Score:2, Insightful)
This won't work because someone from Facebook would need to look at the images to determine if a request is legit, which, as the article says, is EXACTLY the thing the victim wants to avoid.
If nobody looks at the image, or, as some have suggested, the hash is computed client side (so nobody would be able to look at the image) it would be ripe for abuse. I could easily file takedowns for any pictures I want.
As a side note, someone also mentioned hashes won't work since they can be foiled by simple image manipulation.
Easy fix (Score:3)
If nobody looks at the image, or, as some have suggested, the hash is computed client side (so nobody would be able to look at the image) it would be ripe for abuse.
There is a very easy fix for this - the first time the hash matches the takedown requires human approval. This way someone only looks at the image if the image is already uploaded for people to look at and you can't abuse the system by filing takedowns for random pictures. This would even reduce Facebook's work because instead of checking every upload they only have to check ones which match.
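A sketch of that flow; the statuses, names, and queue are invented for illustration:

    # Reported hashes start unverified; the first live match routes the
    # uploaded image to a human reviewer instead of auto-blocking it.
    reported = {}       # hash -> "unverified" or "confirmed"
    review_queue = []   # stand-in for a real moderation queue

    def on_upload(image_hash, image):
        status = reported.get(image_hash)
        if status == "confirmed":
            return "blocked"            # a human already verified this hash
        if status == "unverified":
            review_queue.append(image)  # first match: ask a human
            return "pending"
        return "allowed"                # hash not in the database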
So who's idea was this, exactly? (Score:2)
SubjectIsSubject (Score:2)
Re: (Score:2)
This doesn't make sense. (Score:4, Insightful)
First off, is there really a problem with revenge porn on Facebook? If there is, it would seem that the easiest solution for Facebook is to block all porn. I've never seen nudes on Facebook; I always assumed it would be against Facebook policy, as Facebook is mostly a PG-13 kind of place.
Second, I would think that facial recognition would be the correct solution. Let someone upload a picture of their face and Facebook can make sure that that particular face doesn't appear in nudes. An unidentified nude without a face, even if someone says "this is so-and-so," is pretty harmless; if you can't see the face, you could pretty much say it is anyone.
Lastly, Google just came out with facial recognition for dogs, so presumably you could also use that same technology for tattoos, or specific body parts too.
But again, I would think revenge porn would be primarily a problem on other services, not Facebook.
Re: (Score:3)
The prudishness endemic in American society is not universal. Thankfully, I live in a society that does not tie itself in hypocritical knots because a nipple was shown on TV.
Re:Yeah, about that (Score:5, Informative)
There are much better methods of hashing images than stupidly taking a file checksum, such as this one here:
https://pippy360.github.io/tra... [github.io]
This algorithm does not care about affine transformations applied to the image, so it can be scaled, rotated, or skewed and still be a match.
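The linked page describes its own algorithm; as a general illustration of transformation-robust matching, here is a sketch using OpenCV's ORB keypoints instead (not the pippy360 method itself), which still matches an image against a rotated copy of itself:

    import cv2  # pip install opencv-python

    img = cv2.imread("photo.jpg", cv2.IMREAD_GRAYSCALE)
    rotated = cv2.rotate(img, cv2.ROTATE_90_CLOCKWISE)

    # Detect keypoints and compute binary descriptors for both versions.
    orb = cv2.ORB_create()
    _, desc_a = orb.detectAndCompute(img, None)
    _, desc_b = orb.detectAndCompute(rotated, None)

    # Brute-force Hamming matching; many strong matches => same image.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    good = [m for m in matcher.match(desc_a, desc_b) if m.distance < 40]
    print(f"{len(good)} strong keypoint matches")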
Re: (Score:3)
The first step of hashing, per the linked article, involves finding keypoints in the image that are still detected as keypoints even in an affine transformed copy of an image. How is this done? Does it involve scale-invariant feature transform (SIFT) or some other feature detection means subject to a United States patent?
Re:Yeah, about that (Score:4, Informative)
There are image similarity algorithms out there that do not care about the absolute hash of the file, and can detect the same image cropped, scaled, or rotated just fine.
Here is one such algorithm:
https://pippy360.github.io/tra... [github.io]
Re: (Score:3)
Like audio fingerprinting, this isn't an exact cryptographic hash; as with a song fingerprint, it only takes a small segment of the image for recognition.
Of course the real abuse would be to take a picture of the Eiffel tower and watch everyone's uploads fail for a few hours.