Google, Facebook and Twitter To Block Child Abuse Images Using Shared "Hash Lists"

An anonymous reader writes: Facebook, Google, and Twitter are teaming up with the UK's Internet Watch Foundation (IWF) to share hash lists of blocked indecent images. The move is intended to ensure that a picture pulled from one site can't show up again elsewhere. The BBC reports: "Online security specialists welcomed the move as a positive step, but said it would not block content on the 'darknet' — a network with restricted access — where abusers often posted images."
  • This probably isn't a bad idea even though it won't stop the perverts. It greatly lessens the chance someone will come across something they didn't want to see.
    • Well ... this is great if no two things can have the same hash.

      But as soon as it starts blocking my picture of my dinner as kiddie porn, having Facebook and Twitter block it becomes fairly meaningless.

      I mean, are people using Google, Twitter, and Facebook for this stuff?

      • by digsbo ( 1292334 ) on Monday August 10, 2015 @02:41PM (#50287263)
        Image recognition is straightforward enough today to quickly find almost identical matches and generate the new hash. TinEye is really good for this kind of thing [tineye.com], and I'm sure Google's image match also works sufficiently well to keep an updated list of all the one-offs. Pretty easy to update it just like AdBlock or an SSL cert blacklist.
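
        For the curious, the "difference hash" (dHash) family gives the flavor of how these perceptual matches work. Here is a minimal sketch, assuming Pillow is installed - illustrative only, not PhotoDNA or whatever TinEye and the IWF actually run:

        ```python
        # Minimal difference-hash (dHash) sketch: one bit per horizontal
        # brightness gradient. Illustrative only, not any vendor's real algorithm.
        # Requires Pillow: pip install Pillow
        from PIL import Image

        def dhash(img, hash_size=8):
            """Return a 64-bit perceptual hash of a PIL image."""
            # Shrinking to a tiny grayscale grid discards exactly the detail
            # that recompression, mild resizing, and small edits perturb.
            small = img.convert("L").resize((hash_size + 1, hash_size))
            px = list(small.getdata())
            bits = 0
            for row in range(hash_size):
                for col in range(hash_size):
                    left = px[row * (hash_size + 1) + col]
                    right = px[row * (hash_size + 1) + col + 1]
                    bits = (bits << 1) | (1 if left > right else 0)
            return bits

        def hamming(a, b):
            """Bits that differ between two hashes; small means 'probably the same'."""
            return bin(a ^ b).count("1")
        ```

        Near-duplicates land within a few bits of each other, which is what makes keeping the list updated with the one-offs tractable.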
        • by AmiMoJo ( 196126 )

          False positives will be hell. It's bad enough that content might be blocked or de-indexed, but imagine if perfectly innocent photos were tagged as child pornography. It's also worth noting that the IWF is not regulated or overseen in any meaningful way, and it once broke Wikipedia with its overzealous blocking of an album cover that had been on sale in the UK for decades.

          I'm just astounded that this is even worth doing. How dumb do you have to be to post child pornography to Twitter or Facebook?

          • by digsbo ( 1292334 )

            How dumb do you have to be to post child pornography to Twitter or Facebook?

            Think of the 17 yo girl who sends her boob pic to her boyfriend. This protects her against her own bad judgment, or against the bad judgment of her boyfriend, or the many, many "friends" who will see and re-share that pic, which to the casual observer could just as easily be an 18yo.

            • by TWX ( 665546 )
              Until other sites mirror it and add a watermark, or change the resolution, or crop the image, or color-shift the image to be artsy...
          • by TWX ( 665546 )
            I'm guessing that the album was from Blind Faith...

            I've always wondered how society will react over time as perspectives shift, and things that were previously acceptable become taboo.
          • I, for one, will be generating trillions of copies of the Virgin Killer cover art, to cover as much of the hash space as possible.
        • Possession of child pornography is illegal in itself.

          Given that, how would they have the original image to match against?
          • by digsbo ( 1292334 )
            Law enforcement agencies are allowed to hold this stuff for training and evidence. If it's for enforcement purposes, they can do it.
          • by BitterOak ( 537666 ) on Monday August 10, 2015 @03:49PM (#50287817)

            Possession of child pornography is illegal in itself. Given that, how would they have the original image to match against?

            Almost every child pornography possession statute that I've seen has an exception for law enforcement activities. For example, a jury examining photos in a jury room wouldn't be guilty of possession if those photos are evidence presented at trial.

        • by SuricouRaven ( 1897204 ) on Monday August 10, 2015 @03:40PM (#50287743)

          No. I use a similar algorithm to deduplicate my obscenely large stash of furry pornography*. It works, but there's a problem.

          Let's say that the chance of two unrelated images matching is, say, one in a million. Great. That sounds amazing - and it is; that's ridiculously optimistic for phash alone, but we can assume they have something better involving composite hashes.

          Now feed into that a sizable database of child abuse imagery - say, ten thousand images. And a copy of the Facebook photo library for one day, which is 350 million photos. Yes, that's Facebook's claim; do not underestimate the number of compulsive photographers. That's 3,500,000,000,000 comparisons, and at your optimistic one-in-a-million error rate, 3,500,000 false positives to investigate every day.

          It can be done, but it's going to need a bit more than just perceptual hash comparisons.
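
          The arithmetic, for anyone who wants to poke at it (every number below is an illustrative assumption carried over from above, not a measured rate):

          ```python
          # Back-of-the-envelope false-positive estimate. All inputs are
          # assumptions from the text above, not measured rates.
          known_bad = 10_000            # hypothetical abuse-image database size
          daily_photos = 350_000_000    # Facebook's claimed daily photo uploads
          fp_rate = 1e-6                # optimistic per-comparison false-match chance

          comparisons = known_bad * daily_photos    # 3.5e12 pairwise comparisons
          false_positives = comparisons * fp_rate   # ~3.5 million per day
          print(f"{comparisons:.1e} comparisons, ~{false_positives:,.0f} false positives/day")
          ```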

          *Thus posting as AC.

          • by SuricouRaven ( 1897204 ) on Monday August 10, 2015 @03:41PM (#50287749)

            Got too caught up in checking the math and forgot to tick the box. Bah. Well, no one cares anyway.

          • by digsbo ( 1292334 )
            A couple of simple methods to avoid false positives: include a few other generated or inherent values (image size, a few pixel colors, etc.). The hash lookup does five-nines of the sorting; a quick set of cheap comparisons like those then narrows it to one or two images, which can be compared using image recognition. Do the cheap comparisons first, and only run the more expensive false-positive checks on the positive matches.
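
            A runnable toy of that cascade - the "hashes" and the expensive check are deliberately crude stand-ins, not real image recognition:

            ```python
            # Toy tiered filter: hash lookup first, cheap attribute checks second,
            # expensive verification last. Every component is a stand-in.
            from collections import defaultdict

            def cheap_attrs(img):
                # Inherent cheap values: length plus a couple of byte positions.
                return (len(img), img[0], img[-1])

            def expensive_verify(a, b):
                return a == b  # stand-in for a real image-recognition comparison

            class TieredFilter:
                def __init__(self):
                    self.index = defaultdict(list)  # hash -> known-bad candidates

                def add_known_bad(self, img):
                    self.index[hash(img)].append(img)

                def matches(self, img):
                    candidates = self.index.get(hash(img), [])           # five-nines cut
                    candidates = [c for c in candidates
                                  if cheap_attrs(c) == cheap_attrs(img)]  # cheap narrowing
                    return any(expensive_verify(img, c) for c in candidates)  # rare, costly
            ```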
          • by blueg3 ( 192743 )

            Let's say that the chance of two unrelated images matching is, say, one in a million. Great. That sounds amazing - and it is; that's ridiculously optimistic for phash alone, but we can assume they have something better involving composite hashes.

            For the reasons you outline, a hash with a one-in-a-million collision rate (about one in 2^20) is worthless for this purpose, and for many others. Maybe that's an accurate rate for phash; it's a fuzzy hashing algorithm. Typically, all of these law enforcement applications use MD5 or SHA-1, whose per-pair collision rates are on the order of one in 2^128 to one in 2^160 (not counting manufactured hash collisions).

            Now feed into that a sizable database of child abuse imagery - say, ten thousand images. And a copy of the Facebook photo library for one day, which is 350 million photos. Yes, that's Facebook's claim; do not underestimate the number of compulsive photographers. That's 3,500,000,000,000 comparisons, and at your optimistic one-in-a-million error rate, 3,500,000 false positives to investigate every day.

            It can be done, but it's going to need a bit more than just perceptual hash comparisons.

            The numbers are, of course, much different when your hash has a collision rate that's many, many orders of magnitude lower.
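
            For scale, the birthday bound shows why (a sketch assuming an ideal 128-bit hash and no manufactured collisions):

            ```python
            # Rough birthday bound: the chance of ANY accidental collision among
            # n random 128-bit hashes is about n^2 / 2^129 (ideal hash, no attacks).
            n = 350_000_000 * 365        # a full year of Facebook-scale uploads
            p = n * n / 2.0**129         # ~2.4e-17: effectively never
            print(f"~{p:.1e} chance of even one accidental collision in a year")
            ```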

            • PhotoDNA is a perceptual hash. Its functionality is similar to phash - it might be a bit better, but I wouldn't expect it to be much better. Neither is going to come close to one in a million: I picked a number that was intentionally optimistic.

              Cryptographic hashes would eliminate the false-positive problem, but they are trivial to alter. Beyond trivial: it happens frequently without anyone even intending to do so.

              You don't need to look at collisions within the Facebook set - only those involving an image from each set.

          • *Thus posting as AC.

            Oops?


            Get some help.

        • by blueg3 ( 192743 )

          They almost exclusively use binary hashes (MD5, SHA-1).

          • by digsbo ( 1292334 )
            Who? TinEye does not almost exclusively use binary hashes. I've tested their stuff. It deals with flipping and overlap/underlap, and it's even good at picking things up from different angles and at partial image matching.
            • by blueg3 ( 192743 )

              IWF, the organization named in the summary that is providing "hash lists" to Google, Facebook, and Twitter.

      • But as soon as it starts blocking my picture of my dinner as kiddie porn, having Facebook and Twitter block it becomes fairly meaningless.

        The question you need to ask yourself is "Why am I uploading pictures of my dinner?" Seriously. People do that all the time, and I have to wonder why. Does anyone really enjoy seeing other people's dinner? What's next? People posting pictures of their poop?

      • by blueg3 ( 192743 )

        Well ... this is great if no two things can have the same hash.

        With even the worst of the acceptable cryptographic hashes, it is essentially true that no two things can have the same hash*.

        * Barring manufactured collisions, which are best avoided but may or may not be a problem depending on your application.

    • by cdrudge ( 68377 )

      I see stuff I don't want to see all the time. That doesn't mean that other people don't have a right to post it. I can't recall a time where I ever came across an illegal image on Facebook or Twitter. Against TOS/AUP, sure. Trashy or tasteless? Definitely. But not illegal.

      And such a policy would be easily circumvented. Flip the image horizontally. Crop it. Change the resolution slightly. Add more JPEG. Write a meme on it. Change the color balance slightly. It might stop the exact same image from spreading, but not much more.

      • Of course none of this will work.

        But it gives the appearance of doing something, even if it doesn't have a hope in hell of doing anything.

        I'm sure this is considered largely a PR move to show you're tackling the issue. But if anybody believes this will have any impact, they're kidding themselves.

      • They'll be using a perceptual hash. I recall Microsoft has one that's already in use in law enforcement for this purpose.

        If it's a perceptual hash, changing resolution will achieve nothing. Nor will color balance, or jpeg compression. Meme might. Cropping or flipping certainly will though, at least for the phash algorithm I'm familiar with.
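
        A quick way to see the difference, assuming the dhash()/hamming() sketch posted earlier in this thread and a hypothetical local photo.jpg:

        ```python
        # Basic perceptual hashes shrug off resizing but not mirroring. Assumes
        # the dhash()/hamming() sketch from earlier; photo.jpg is hypothetical.
        from PIL import Image

        original = Image.open("photo.jpg")
        resized = original.resize((original.width // 2, original.height // 2))
        mirrored = original.transpose(Image.Transpose.FLIP_LEFT_RIGHT)

        print(hamming(dhash(original), dhash(resized)))   # typically a few bits of 64
        print(hamming(dhash(original), dhash(mirrored)))  # typically ~half the bits
        ```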

    • by lgw ( 121541 ) on Monday August 10, 2015 @02:57PM (#50287411) Journal

      This probably isn't a bad idea even though it won't stop the perverts. It greatly lessens the chance someone will come across something they didn't want to see.

      When they cam for the perverts, I said nothing, for I was not a pervert?

      This exact technology will allow governments to exercise very powerful censorship across the internet (or at least the part of the internet most people see). Want all pictures from that protest rally to vanish? Just twist the arm of any of these companies into adding a few hashes, or just slip them into a list the FBI no doubt routinely provides, and, just like that, down the memory hole. Plus, as you say, this won't stop the perverts. The only thing this actually accomplishes is empowering the totalitarian state.

      We've seen a couple of stories here on /. already where IP blocklists were abused by governments to slip in websites of opposing political parties. It's a bit hard to believe this won't be abused similarly.

      • It's a bit hard to believe this won't be abused similarly.

        Unfortunately you're right. History shows that any and all technology has been used repressively, and this won't be different. The problem is not technology but human nature.

        I'm all for preventing abuse of children or for that matter abuse of anyone. But if someone thinks they have a simple answer that isn't a two-edged sword, they're dreaming.

      • by digsbo ( 1292334 )

        The only thing this actually accomplishes is empowering the totalitarian state.

        Since I'm usually on your side of that argument, I will respond by saying that there are already plenty of filtering/sifting technologies in place. I think you're getting to the point in your argument where "anything that can be used by the totalitarian state is a bad thing", which almost sounds like insisting we don't build or implement anything new that's useful to deal with large amounts of information.

      • by jrumney ( 197329 )

        When they cam for the perverts, I said nothing, for I was not a pervert?

        If I catch my daughter camming for the perverts, I certainly am not saying nothing.

    • by Cito ( 1725214 )

      But... But...But...

      How are folks supposed to find r@ygold, hussyfan, kingvid, pthc, babyshivid ???!?!?!? :P

  • by Anonymous Coward

    Why is the darknet Google's responsibility, again?

  • This isn't too different from our approaches to spam emails. But are these services actually used to share those kinds of images? I wonder who curates the list of hashes, and how long before someone starts adding pictures of stuff they don't like to the list.

    • Hit the nail on the head.

      Who watches the watchers?

      What accountability will there be?

    • The lists are never reviewed, I'm certain.
      • We have to take it on trust that the lists are accurate and contain no false entries, because there is no independent confirmation and, given the contents of the lists, just possessing a copy without proper authorisation from law enforcement is a crime in most of the world.

        • Actually, the lists contain a type of image hash that contains no actual image data. You would need the original images to compare.
    • by cdrudge ( 68377 )

      From the wording of the article, I would imagine Internet Watch Foundation (IWF) would be the party that is responsible for curating/publishing the list of hashes. Here in the states, the National Center for Missing and Exploited Children, FBI, and Justice Department maintain a similar database [fbi.gov].

    • because we take enforcement action {X} against {Y}, then enforcement action {W} against {Z} is inevitable

      try these on:

      "we can't legalize marijuana, because then we have to legalize methamphetamine and heroin"

      "we can't legalize gay marriage, because then we have to legalize marrying the dead and marrying animals"

      do you see the problem? good, then know yourself: the slippery slope argument is failure, appeal to emotion, fear

      the slippery slope only works if you are dealing with people who never actually think

  • While it is an interesting concept, it is doomed to fail as a simple single pixel edit or hidden attribute edit will change the file's hash.
    • by Anonymous Coward

      While it is an interesting concept, it is doomed to fail as a simple single pixel edit or hidden attribute edit will change the file's hash.

      It's not a hash in the sense of MD5/SHA etc. that hashes the file contents at the byte level. It's a perceptual image hash, a well-studied technique.

      An image will have the same hash even if a few pixels change or if it's, e.g., rotated/mirrored - and resized, of course.

      • To which I give you Gmask...
      • A few pixels changing, yes. Resistance to the others depends on the algorithm. Flipping or rotating defeats the basic form of the phash algorithm. I expect they'll be using something more complicated, probably a composite hash incorporating several functions.
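
        One plausible way to build such a composite - an assumption on my part, not a description of any deployed system - is to index the hash of every flip and rotation of each known-bad image, so a mirrored repost still matches. Assuming the dhash() sketch from earlier in the thread:

        ```python
        # Hypothetical hardening against flips/rotations: store the perceptual
        # hash of each orientation of a known-bad image. Assumes dhash() above.
        from PIL import Image

        ORIENTATIONS = [
            Image.Transpose.FLIP_LEFT_RIGHT, Image.Transpose.FLIP_TOP_BOTTOM,
            Image.Transpose.ROTATE_90, Image.Transpose.ROTATE_180,
            Image.Transpose.ROTATE_270,
        ]

        def orientation_hashes(img):
            """Hashes of the image plus five flipped/rotated variants."""
            hashes = {dhash(img)}
            for t in ORIENTATIONS:
                hashes.add(dhash(img.transpose(t)))
            return hashes
        ```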

        • Of course the hash won't be bulletproof, but if nothing else, the picture will degrade with every kludge to get it to fail the hashing. But if the hash list is kept among the filterers, how do you know that you need to kludge it? And even if you do, how do you know what's enough to cheat the filter?
          • I wouldn't even bother.

            I'd stick it inside an encrypted rar or 7z file.

            You'd have to be a real idiot to upload child abuse images in the clear.

            • Most people don't understand encryption. That doesn't make them idiots -- just ill-informed.
              • If you're dealing with the type of data that major police agencies are actively hunting for, you'd have to be an idiot not to inform yourself on how to get away with it.

      • by blueg3 ( 192743 )

        It's not a hash in the sense of MD5/SHA etc. that hashes the file contents at the byte level.

        It's MD5, SHA1, and PhotoDNA hashes.

        The standard in most law enforcement forensic applications is MD5 / SHA1, despite the obvious limitations.

        Sadly, it still is reasonably common to encounter byte-identical images that are on the relatively small "known-bad image" hash lists.

    • by Dwedit ( 232252 )

      Fortunately, people are idiots and often don't alter the files in any way before re-sharing them. Sometimes even the EXIF tags are intact.

    • by gweihir ( 88907 )

      There are "robust" image hashes out there, so that is not the real problem here. The real problem is the huge opportunity to do censorship this way. And anybody even trying to find out whether this is real or a complete abuse of the law will see some illegal pixels and will have searched for them, and hence can be easily removed to jail for a long, long time. Basically, doing it this way removes any legal possibility for ordinary citizens to evaluate what is going on and complain about abuses (except for ve

  • I'm surprised it took this long. Google must have such a hash list already - better yet, an MD5 list - built by the human reviewers of its webcrawled image search, so that flagged images won't show up in customer searches.

    The real question is who keeps a database of pictures to review the list itself. Police? Google? Any normal prosecutor would happily prosecute Google for it just to add a notch to their belt (of asshole behavior).

  • Whenever you start itching to censor content you don't like, just keep in mind that some countries consider pictures of women not in a burka to be illegal pornography.

  • It should have said they can't censor things on the darknet, not won't.

    If "won't" is correct, are they running the darknet sites in question?

  • by Anonymous Coward on Monday August 10, 2015 @03:02PM (#50287455)

    Hate crime (also known as bias-motivated crime) is a usually violent (lock 'em up, kidnapping, sex-offender lists, etc.), prejudice-motivated crime that occurs when a perpetrator (social justice warriors) targets a victim because of his or her perceived membership in a certain social group (paedophilia).

    This fight against paedophiles is really just unjustified homophobia in disguise, a form of racism, etc. The Internet Watch Foundation is not attacking child abuse; they are attacking paedophiles. The very article describes this as a "fight against paedophiles". There is no reason to think the vast majority of paedophiles are harming kids. This is why they're attacking child pornography: it's an easy target that won't go away and that they can't possibly eliminate.

    It shares so many similarities with the war on drugs. If there is one group of people you can attack and generally get agreement on, this is it. It's too small, unorganizable, spread out, etc., and nobody would dare defend it out of fear for their lives. This gives the social justice warriors ample room to do what they want. They are really nothing but a misguided group of racists spreading hatred and fear. You wouldn't attack homosexuals because some are sexually abusing little children. It's no different with paedophiles. The entire war on paedophilia is identical to the war on drugs. It's utterly illogical. The idea that porn leads to sexual abuse was disproved long ago. It's just like violent video games leading to violence in the real world. The reality is that studies have shown the exact opposite to be true.

    This is doing nothing other than implementing a system of censorship and giving people the perception that something is being done to stop child abuse. It's not. The article even says they're not able to stop the spread. It's not even illegal to be a paedophile, and yet they have no aversion to expressing their hatred for this group. They accuse an entire group of wrongdoing when there is zero evidence of that. There isn't any means to even produce such evidence, because all the studies that back up paedophiles being bad were done on an imprisoned population, which wouldn't represent paedophiles as a whole. It only represents violent paedophiles. It's no different than doing a study on homosexuals after locking up homosexuals. There is going to be a disproportionate number of violent homosexuals in custody.

    They're not attacking people who abuse children. They're attacking a group of people who are hated for no logical reason. It's no different than attacking homosexuals for what a minority have done.

    Go watch the 1950's Anti-Homosexual PSA - Boys Beware to see exactly what I'm talking about:
    https://www.youtube.com/watch?v=17u01_sWjRE

    They imply all homosexuals are evildoers who want to harm little children. It's utter nonsense.

  • from the /. summary:

    ...but said it would not block content on the 'darknet' — a network with restricted access — where abusers often posted images."

    Well, that is one way to defeat the blocking. Or you could flip just one bit in the entire image, and that would change the hash. Pick an LSB anywhere and nobody will notice that it is a different image - assuming it's a hash of the entire image and not a subsample, and assuming that image compression does not swallow your LSB flip. Even without those assumptions, there will be many trivial and, to the eye, undetectable transformations that would defeat the hash.

    But something tells me...
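
    For illustration, the byte-level version of that avalanche effect - sha256 stands in for whatever exact hash a block list might use, and photo.jpg is hypothetical:

    ```python
    # Flip one low-order bit anywhere in the file and a byte-level
    # cryptographic hash changes completely. photo.jpg is hypothetical.
    import hashlib

    data = bytearray(open("photo.jpg", "rb").read())
    print(hashlib.sha256(data).hexdigest())

    data[1000] ^= 0x01                       # flip a single least-significant bit
    print(hashlib.sha256(data).hexdigest())  # unrecognizably different
    ```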

  • Blocking pictures of child abuse is like sweeping them under the carpet: we don't see them, but pedophiles can still download them. I fail to see how this prevents further abuse of the children in the pictures. And just to be clear: we are talking about adult men, sometimes elderly men, having sex with toddlers. Penetrative sex, that is. And yes, that too.

  • Worked at AT&T and T-Mobile and they ran the hash list along with a virus scan on emails, mms messages and photo albums. I'm sure other image hosting services run the same checks. If any photo popped, you had to notify this third party company who acted with the cops. If a cop had a warrant, you would call the telcom to have an engineer to drop a dvd in the admin server, run a collect script that zips everything up, and then have the police department show up and pick up the dvd outta the dvd-rom drive

  • What utter moron of a child abuser would upload their pictures to facebook?

    They might not be the brightest of criminals, but seriously... they'd have to be pretty dumb to do that.

    • by Greyfox ( 87712 )
      This isn't about that. This is about the governments and their corporate cronies (or the corporations and their government cronies) ramping up their efforts against the darknets. As DMCA enforcement gets more draconian, more people are starting to turn to the likes of Tor to get their Game of Thrones videos. The corporations are out ahead of it for a change. Look for innocuous legislation to be introduced in 2016, perhaps as a rider to a funding bill, that will quietly make running darknet nodes illegal.
    • by gweihir ( 88907 )

      Right on the mark. And that is why the stated goals are rather obvious lies.

  • by Gliscameria ( 2759171 ) on Monday August 10, 2015 @03:58PM (#50287877)
    The media really makes it seem like nothing happens on the darknet other than child porn and terrorism. It's fitting that they really push this at a time when a more usable darknet should be very attractive to most people. Your IP address, cookies, device ID, browser ID, OS ID, and various logins are all being cross-referenced. Your professional work 'searches' are going to be put in a pile with the rest of 'it'.
  • I have trouble poking technical holes here since, fundamentally, the idea of using a hash table is somewhat sound; it's used all the time for UUIDs, so there's plenty of uniqueness, right? I guess maybe we can rule out collisions for the most part... hell, maybe pair the hash with a file size?

    If we are talking about such tried-and-true technology, and not some recently invented "photo hash" whose uniqueness I wouldn't have any faith in...

    but then the implications of just having such a system mean things can be injected into the list...

    • Not to mention that if they're just storing hashes, a tiny change will generate a completely different hash. Change one pixel by a single bit and you have a different hash. Change the resolution, change the metadata, crop out a row or column, etc., etc.; a service that serves up the images could do this automatically every time it displays an image.

      There's lots of ways around this, including only sharing content with trusted people via sneakernet if it comes to that.

      It sounds spiffy to the average person...

  • Who are these people who get paid to watch child pornography all day? (I.e., the people who classify the images.)

    There are some experts in this area who would be willing to offer their services for free to help Google out.
