Facebook Social Networks Technology

Facebook To Fight Revenge Porn by Letting Potential Victims Upload Nudes in Advance (bleepingcomputer.com) 370

Catalin Cimpanu, writing for BleepingComputer: Facebook is testing new technology designed to help victims of revenge porn. It works off a database of file hashes, a fingerprint computed for each file. Facebook says that once an abuser tries to upload an image marked as "revenge porn" in its database, its system will block the upload process. This works not only for images shared on the main Facebook service, but also for images shared privately via Messenger, Facebook's IM app. The weird thing is that in order to build a database of "revenge porn" file hashes, Facebook will rely on potential victims uploading a copy of the nude photo in advance. This process involves the victim sending a copy of the nude photo to their own account via Facebook Messenger. That means uploading a copy of the nude photo to Facebook Messenger, the very same act the victim is trying to prevent. The victim can then report the photo to Facebook, which will create a hash of the image that the social network will use to block further uploads of the same photo.


  • by pnutjam ( 523990 ) <slashdot@borowicz. o r g> on Tuesday November 07, 2017 @03:21PM (#55507863) Homepage Journal
    I already have a service that handles this, just send me the pic and I'll handle it....
    • by Major_Disorder ( 5019363 ) on Tuesday November 07, 2017 @03:27PM (#55507933)

      I already have a service that handles this, just send me the pic and I'll handle it....

      I would take you up on this offer. But I would not want to be responsible for your blindness.

      • by AdaStarks ( 2634757 ) on Tuesday November 07, 2017 @04:55PM (#55508805)

        A different breed of revenge porn, eh? Where the subject is not the victim?

      • by Tuidjy ( 321055 ) on Tuesday November 07, 2017 @09:08PM (#55510877)

        I would also take you up on that offer. But could you please explain to me, first, how you deal with the things that Facebook clearly does not:
        - how do you avoid charges of moving and storing child porn if the user is underage?
        - how do you make sure that minor changes to the original picture do not produce completely different signatures?
        - how do you make sure that none of your employees have access to the originals?
        - how do you make sure people upload only pictures in which they are the subject?
        - how do you make sure that the mechanism is not used to suppress legitimate pictures?
        - etc, etc, etc.

        What could possibly go wrong?!

        • - how do you avoid charges of moving and storing child porn if the user is underage?

          By computing the hashes client-side and transmitting and storing only the hashes, obviously.

          - how do you make sure that minor changes to the original picture do not produce completely different signatures?

          Wavelet transform. Compute hashes in wavelet space.

          - how do you make sure that none of your employees have access to the originals?

          By computing the hashes client-side and storing only the hashes, obviously.
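          A minimal sketch of what that client-side, wavelet-space hashing could look like, assuming the third-party Pillow and imagehash packages (an illustration only; Facebook has not published its pipeline, and the filename is a placeholder). imagehash.whash computes a wavelet-based perceptual hash, so only the short hex digest would ever need to leave the device.

          # Client-side perceptual hashing sketch (pip install Pillow imagehash).
          # Only the hex digest below would be transmitted; the image stays local.
          from PIL import Image
          import imagehash

          def local_fingerprint(path: str) -> str:
              """Return a wavelet-space perceptual hash of an image file."""
              with Image.open(path) as img:
                  return str(imagehash.whash(img))

          if __name__ == "__main__":
              print(local_fingerprint("photo.jpg"))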

    • I bet you will.
      • by cayenne8 ( 626475 ) on Tuesday November 07, 2017 @03:35PM (#55508029) Homepage Journal
        I have a QUICK solution to all this, works 100%.

        Don't fucking let someone take pictures or video of you naked and/or having sex!!!!

        Sheesh....when did something like common sense about not letting someone take pics of you in compromising situations go out the fucking door?

        • by computational super ( 740265 ) on Tuesday November 07, 2017 @03:52PM (#55508209)
          Well, I thought it was really just that easy until I realized that the reason nobody had ever taken a naked picture of me was because I was ugly.
        • This is no more or less embarrassing than someone getting a tattoo of $current_lover_name. Today we share digitally that which we did physically.

          • This is no more or less embarrassing than someone getting a tattoo of $current_lover_name. Today we share digitally that which we did physically.

            Don't be so down on yourself. I've taken plenty of nude photos of you through your bedroom window.

        • by CodeHog ( 666724 )
          Yup, 100% of the time it will work. Except the times when it doesn't, like when someone has hidden a camera in a bathroom or hotel room or their bedroom...
          • by lq_x_pl ( 822011 )
            In this case, the potential victim does not have access to the photo to pre-emptively upload. Doesn't apply to the context of this discussion. :-/
          • Well, in this particular case, since they're expecting you to upload your nudies, that implies you took a selfie or some such... in which case I'd like to add an addendum to GP:

            and for $DEITY's sake, don't take nudies and send them to someone!

            Also I have a tangential question (sorta reverse revenge porn):
            1) Minor takes nude selfie and sends it to target (say hated step parent).
            2) Reports target for being in possession of CP.
            3) now what?

            • Re: (Score:3, Insightful)

              by Anonymous Coward

              Also I have a tangential question (sorta reverse revenge porn):
              1) Minor takes nude selfie and sends it to target (say hated step parent).
              2) Reports target for being in possession of CP.
              3) now what?

              How it works in the US:
              3) Police break down the door and drag target to jail
              4) Police find evidence on the phone
              5) Depending on how rich/connected/white target is
              a) not: target gets charged with possession of CP and goes to jail. Target is put on list of sex offenders.
              b) very: target's lawyer points out the minor sent it, target is innocent. Minor is sent to juvie for distributing CP. Minor is put on list of sex offenders.
              c) somewhat: (a) and (b)

          • by Oswald McWeany ( 2428506 ) on Tuesday November 07, 2017 @04:15PM (#55508437)

            Yup, 100% of the time it will work. Except the times when it doesn't, like when someone has hidden a camera in a bathroom or hotel room or their bedroom...

            Or they accidentally upload the photo to their feed instead of the protection service.

        • It's a generational thing.
          The new generation is not only completely superficial, but also clueless about privacy in technology.

        • Re: (Score:3, Interesting)

          by Theaetetus ( 590071 )

          I have a QUICK solution to all this, works 100%.

          Don't fucking let someone take pictures or video of you naked and/or having sex!!!!

          Sheesh....when did something like common sense about not letting someone take pics of you in compromising situations go out the fucking door?

          Yes, I suppose that's a reasonable solution, if you never want to receive sexy pictures or video from a significant other. Since most people would like to receive such, blaming the victim and discouraging the practice would seem to run counter to most folks' interests. But not yours [wikipedia.org], I guess.

        • by plopez ( 54068 ) on Tuesday November 07, 2017 @05:27PM (#55509079) Journal

          Compromising position? Just do what I do, do not consider them compromising.

          WTF is wrong with people? Showing war movies or action movies where people get blown away is OK, but if you were to show a married couple having sex to create a child it would be considered "dirty".

          We live in a death culture.

          • by Altrag ( 195300 ) on Tuesday November 07, 2017 @08:05PM (#55510511)

            The problem isn't the victim considering them compromising. The problem is the victim's family, friends, coworkers, boss, etc. considering them compromising.

            Really though, it's a generational thing to some extent. By the time the children of the millennials are in their 40s or 50s and running the world, so many of them will have nudes, stupid social media posts, etc. out in the world that it will necessarily become a non-issue, or, for example, employers won't be able to find any employees that fit their "internet purity" conditions.

            It's only a problem right now because the generation doing the hiring never really had to deal with these kinds of things, while the generation looking to be hired doesn't really care that much, because everyone they know does it. It's the intersection of those two worlds where everything hits the fan... well, in a generalized sense of course -- there will always be exceptions, obviously.

    • by computational super ( 740265 ) on Tuesday November 07, 2017 @03:50PM (#55508187)
      I swear I'm only storing the hash code. Honest.
    • by Anonymous Coward on Tuesday November 07, 2017 @04:22PM (#55508519)

      This reminds me of that website where you could enter your credit card number to check if it had been leaked to the internet....

  • by Major_Disorder ( 5019363 ) on Tuesday November 07, 2017 @03:23PM (#55507875)
    I know they "claim" they will not keep the pictures, but only a hash of the image. But do you really trust Facebook that much?
    • by Anonymous Coward on Tuesday November 07, 2017 @03:28PM (#55507949)

      They should allow the potential victim to upload the hash, and not the image.

      • This is what I was expecting the summary to say, since what they're actually doing is ridiculous. However, the potential for abuse in either approach suggests this just won't work. Either you're posting nudes (which, regardless of what Facebook says, doesn't seem smart), or someone will find a way to insert every possible hash (or a large chunk of random ones) into their system and it will flag every possible image...
      • The user would need to install their own perceptual hash tool, because a cryptographic hash would be trivially easy to defeat: just flip, add, or remove a single bit in the file and the hash no longer matches.
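        As a quick illustration (a sketch, not anything Facebook has published; "photo.jpg" is a placeholder filename): flipping a single bit of the file produces a completely different SHA-256 digest, so a trivially modified re-upload would sail past an exact-match filter.

        # Sketch: a one-bit change defeats exact-match (cryptographic) hashing.
        import hashlib

        original = open("photo.jpg", "rb").read()
        tampered = bytearray(original)
        tampered[-1] ^= 0x01  # flip one bit in the last byte

        print(hashlib.sha256(original).hexdigest())
        print(hashlib.sha256(bytes(tampered)).hexdigest())  # entirely different digest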

      • They should allow the potential victim to upload the hash, and not the image.

        THIS!

        However... everybody knows that you can alter the hash of an image in any number of ways, including simply converting it to another format or scaling it.

        You also know Facebook won't make this happen. The whole idea was to drive a new website with free content.

    • by gnick ( 1211984 )

      But do you really trust Facebook that much?

      Nobody trusts FB. At least nobody should. I don't see any motivation for them to store the image, so I'd like to think they wouldn't, but they do make a habit of collecting everything they can get their mitts on.

      If these hashes work the same way as the hashes I'm familiar with, circumvention will require the sophistication to make a minor alteration to a single pixel.

      • by Dutch Gun ( 899105 ) on Tuesday November 07, 2017 @04:10PM (#55508397)

        Image hashes are typically designed so that you can find an image even if minor alterations have been made, such as re-compressing it, changing formats, altering a single pixel, etc. From what I understand, it often involves an analysis of the color histogram for initial searches, plus a tiny thumbnail for direct comparison, which would generally be too small to recognize a specific person. This lets you do "fuzzy" matching, unlike a hash like CRC32 or SHA-1, which can only find exact matches (a rough sketch of this kind of matching follows below).

        I agree that this has all sorts of psychological barriers. "Hey, I'm worried about revenge porn, so I'm going to upload all my nude pics I shared with my ex-boyfriend to Facebook for analysis. You know, Facebook, the company that scans all my personal data for profit."
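        A minimal sketch of the "tiny thumbnail" idea, assuming only the Pillow library (this illustrates the general technique, not Facebook's actual algorithm, and the filenames are placeholders): shrink the image to an 8x8 grayscale grid, threshold each pixel against the mean to get a 64-bit fingerprint, and compare fingerprints by Hamming distance.

        # Sketch of "fuzzy" image matching via an 8x8 average hash (assumes Pillow).
        # Small edits (re-compression, single-pixel tweaks) flip few bits, so the
        # Hamming distance stays small, unlike CRC32/SHA-1 exact matching.
        from PIL import Image

        def average_hash(path: str) -> int:
            with Image.open(path) as img:
                pixels = list(img.convert("L").resize((8, 8), Image.LANCZOS).getdata())
            mean = sum(pixels) / len(pixels)
            bits = 0
            for p in pixels:
                bits = (bits << 1) | (1 if p >= mean else 0)
            return bits

        def hamming(a: int, b: int) -> int:
            return bin(a ^ b).count("1")

        # Treat images within ~10 differing bits (out of 64) as the same picture.
        if hamming(average_hash("original.jpg"), average_hash("reupload.jpg")) <= 10:
            print("probable match")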

    • by Oswald McWeany ( 2428506 ) on Tuesday November 07, 2017 @03:45PM (#55508149)

      I know they "claim" they will not keep the pictures, but only a hash of the image. But do you really trust Facebook that much?

      Won't take long before the police will pay Facebook to ID the corpses they find.

      "Detective Hathaway, run this birthmark that looks like a camel through facebook and see who has a camel shaped birthmark on their arse"

    • Comment removed based on user account deletion
    • Actually I distrust Facebook so much that I'm pretty sure they already have a few gigabytes of naked pictures of everybody, so uploading another one won't make much difference.
    • Cute... Facebook pretending they don't have nude photos or a naked composite of everyone already.

    • But do you really trust Facebook that much?

      If I'm being logical about it, maybe?

      Name one PR problem that could cause people to leave Facebook? Evidently showing propaganda for hostile governments trying to destroy us from within didn't generate much heat. But if there's one thing Americans get upset about, it's boobs being shown.

      I'd imagine FB is smart enough to realize they might actually get in trouble for letting nude pics they were trusted with slip.

      My opinion might be different if anyone at all WANTED to see my nude ass...

    • And Facebook won't 'accidentally' use the nude image as a picture when sending out 'potential friend' notices to other people after datamining your shadow profile...

    • But do you really trust Facebook that much?

      I do. But I wouldn't use their service anyway, because matching hashes is utterly frigging useless. Mind you, this is from Facebook, whose copyright infringement detection system can be defeated by altering the speed of a clip by 1%.

  • April Fools Day on Slashdot?
  • by Hylandr ( 813770 ) on Tuesday November 07, 2017 @03:24PM (#55507909)

    What's wrong with putting nudes of every person on Facebook into a database?

    What could go Equifax?

  • This reminds me of the humorous PSA where some teenage boys are offering free mammograms...
  • You all laughed at me when I started building my labia shape hash algorithm, modded me funny.

    Now you see how serious this issue is.

    Ladies, send in your labia prints. Otherwise there can be no guarantee you'll be notified.

    Next: Unlock your phone with the new 'snail trails' app.

  • by pz ( 113803 ) on Tuesday November 07, 2017 @03:28PM (#55507937) Journal

    The public reaction to this has understandably been somewhat muted and put off. Why upload nude photos to Facebook, indeed? The claim is that they will compute a hash of the image and store that to prevent future uploads.

    If that is really the case, then why not compute the hash locally on the user's machine and upload only the hash? Surely that can be done in a reasonable amount of time on essentially all modern hardware, from cell phones to desktops.

    • Indeed - that'd be the only sane way to do it if you cared at all about privacy.
      I foresee them analysing the images and following relationship status to advertise tattoo removal.
    • Hashing program is named pkzip.

    • I'm curious what kind of algorithm they are using for this.

      Traditional hashing hashes the entirety of the image. A simple workaround of resizing or cropping the image before uploading will defeat it.

      Unless they mean fingerprinting. Fingerprinting != Hashing.

      • Re: (Score:3, Interesting)

        by Anonymous Coward

        We just saw an article related to this. The hash is something like Microsoft's image identifier hash, which they acquired when they bought... ???...
        It basically works like this:
        - the image is resized to a standard size (1020x768, I think)
        - converted to black and white
        - edge detection is applied
        - at this point a trained AI is supposed to be quite accurate at identifying matching photos
        This works even if you resize the photo or color-adjust it (a rough sketch of those preprocessing steps follows below).

        Seems like there will be gaping holes to be discovered in this method though.
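        For what it's worth, the preprocessing front end described above (resize, grayscale, edge detection) can be sketched with Pillow alone; the matching stage, whatever the "trained AI" actually is, is omitted, and the fixed size and filenames here are placeholders.

        # Sketch of the preprocessing steps described above (assumes Pillow).
        from PIL import Image, ImageFilter

        def normalize(path: str) -> Image.Image:
            with Image.open(path) as img:
                img = img.resize((1024, 768))   # canonical size; exact dimensions are a guess
                img = img.convert("L")          # grayscale ("black and white")
                return img.filter(ImageFilter.FIND_EDGES)

        normalize("photo.jpg").save("photo_edges.png")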

    • Comment removed based on user account deletion
    • by swillden ( 191260 ) <shawn-ds@willden.org> on Tuesday November 07, 2017 @07:08PM (#55510095) Journal

      If that is really the case, then why not compute the hash locally on the user's machine and upload only the hash?

      Cool. I hate CNN's fake news. I'm going to write a script that takes every image from every CNN story and uploads the hashes. Sharing of CNN stories on Facebook is going to be shut down.

      s/CNN/whatever you hate/

      The obvious corollary here is that Facebook needs not just the hashes but also the original image, so they can determine whether it's a real nude photo. Algorithms can do that pretty well, so Facebook may be able to arrange that no human ever needs to see the image... but there's no way for the uploader to be certain that's what they're doing.

      Also, the "hash" probably needs to be something a bit more image-focused than, say, SHA256. Otherwise any trivial modification of the image would change the hash. So it's got to be something that survives scaling, cropping, resolution changes, watermarking, etc. Which means that if the exact algorithm leaks, people can reverse engineer it to figure out how to work around it. That's another reason they need to do the hashing on their end.

  • Simpler solution (Score:5, Insightful)

    by religionofpeas ( 4511805 ) on Tuesday November 07, 2017 @03:31PM (#55507961)

    If you don't want your nudes to end up on the internet, don't send them to other people.

    • If you don't want your nudes to end up on the internet, don't send them to other people.

      Better yet... don't take them in the first place.

    • Implying that the type of people who consider using revenge porn are the kind of considerate, level-headed thinkers who would never take photos without someone's permission in the first place.

  • Forcing users to upload highly sensitive pics to make sure others won't post them.
    There HAS to be a better way... like: how about analyzing the image and computing the hash on the client device, then uploading just the hash + analysis data? Or, at the very least, masking any personally identifying info inside the image before uploading.

    • Comment removed based on user account deletion
    • I *STILL* wouldn't trust Facebook with that. They might say they're doing a local hash, but whoops, we sent the entire image. (Or they say: "We uploaded the image to verify the hashing algo on the client -- we immediately delete it. Honest!")

      It's pretty fucking simple. A company with the singular purpose of hoovering up personal information about you (a company so invasive, so creepy, that in the absence of direct data from you it will INFER information about you and yours), which then sells that information to advertisers.

  • So the proper way to do this is a one-way transfer between servers, and then disconnect each server from the internet when it's full. Then send requests over a highly restricted local network for comparisons. How they'll actually probably do it is on live, public-facing servers, and just try to permissions-protect them or something stupid like that.
  • Imagine the enlargement pill ads when they start figuring out who would actually benefit!

    I'm terrible, I'm sorry. I couldn't help myself, the article made it too easy.
  • He wants to see you neked.
  • " . . . its system will block the upload process."

    Given that revenge porn is a crime in an increasing number of places, shouldn't be include "and notify the police of the attempt"? Does it even notify the user of the attempt?

    What are the terms of service on these uploads? Do they include the clause that says "and we can change these TOS any time we want, to anything we want, and there's nothing you can do about it"?

  • Simply create a utility that lets a user open an image, calculate the hash, and send it to Facebook. They have enough machine-learning ability to look at something and tell if it is a hot dog or not a hot dog. Or are they going to rely on human beings to review every photo and validate that it isn't, say, a cat photo?
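    A hypothetical sketch of such a utility, assuming Pillow, imagehash, and requests (the submission endpoint and payload fields are invented for illustration, since Facebook has not published an API for this):

    # Hypothetical "hash locally, submit only the hash" utility.
    # The endpoint URL and payload fields are invented; assumes
    # Pillow, imagehash, and requests are installed.
    import sys

    import imagehash
    import requests
    from PIL import Image

    SUBMIT_URL = "https://example.invalid/report-hash"    # placeholder endpoint

    def main(path: str) -> None:
        with Image.open(path) as img:
            digest = str(imagehash.phash(img))             # perceptual hash only
        resp = requests.post(SUBMIT_URL, json={"hash": digest}, timeout=10)
        print("submitted", digest, "->", resp.status_code)

    if __name__ == "__main__":
        main(sys.argv[1])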
  • This won't work because someone from Facebook would need to look at the images to determine if a request is legit, which, as the article says, is EXACTLY the thing the victim wants to avoid.

    If nobody looks at the image, or, as some have suggested, the hash is computed client side (so nobody would be able to look at the image) it would be ripe for abuse. I could easily file takedowns for any pictures I want.

    As a side note, someone also mentioned that hashes won't work since they can be foiled by simple image manipulation.

    • If nobody looks at the image, or, as some have suggested, the hash is computed client side (so nobody would be able to look at the image) it would be ripe for abuse.

      There is a very easy fix for this: the first time the hash matches, the takedown requires human approval. That way, someone only looks at the image if it has already been uploaded for people to look at, and you can't abuse the system by filing takedowns for random pictures. This would even reduce Facebook's work, because instead of checking every upload they only have to check the ones that match.
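      In rough terms, that workflow looks like the sketch below (all names here are hypothetical; it only illustrates the "match first, human review once, then block" flow):

      # Hypothetical sketch of "block only after a human confirms the first match".
      def handle_upload(image, reports, review_queue, perceptual_hash, distance, threshold=10):
          h = perceptual_hash(image)
          for report in reports:
              if distance(h, report.hash) > threshold:
                  continue
              if report.confirmed:
                  return "blocked"              # a reviewer already confirmed this report
              # First hit: the image is now on the site anyway, so a reviewer can
              # confirm the report without the victim ever re-uploading anything.
              review_queue.add(image, report)
              return "pending review"
          return "allowed"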

  • I'm personally suspicious of anyone who asks me for my private data -- all the more so when the first thing out of their mouth is that they only need it to protect me. And the thought that promptly entered my mind upon reading this particular blurb is that perhaps somebody deep within the confines of Facebook HQ is positively drooling at the prospect of all those uploaded nudes that will soon be coming his way...
  • Awesome, because the day that leaks will be the beginning of the end for Facebook. They say they won't save them, but I don't trust them, not even a little, to keep their word.
  • Comment removed based on user account deletion
  • by Wycliffe ( 116160 ) on Tuesday November 07, 2017 @04:30PM (#55508581) Homepage

    First off, is there really a problem with revenge porn on Facebook? If there is, it would seem that the easiest solution for Facebook is to block all porn. I've never seen nudes on Facebook; I always assumed that would be against Facebook policy, as Facebook is mostly a PG-13 kind of place.

    Second, I would think that facial recognition would be the correct solution. Let someone upload a picture of their face, and Facebook can make sure that that particular face doesn't appear in nudes. An unidentified nude without a face is pretty harmless, even if someone says "this is so-and-so," because if you can't see the face it could be pretty much anyone.

    Lastly, Google just came out with facial recognition for dogs, so presumably you could also use that same technology for tattoos or specific body parts.

    But again, I would think revenge porn is primarily a problem on other services, not Facebook.
