
Google Spots Explicit Images of a Child In Man's Email, Tips Off Police

Posted by samzenpus
from the do-not-pass-go dept.
mrspoonsi writes with this story about a tip Google sent to police after scanning a user's email. A Houston man has been arrested after Google sent a tip to the National Center for Missing and Exploited Children saying the man had explicit images of a child in his email, according to Houston police. The man was a registered sex offender, convicted of sexually assaulting a child in 1994, reports Tim Wetzel at KHOU Channel 11 News in Houston. "He was keeping it inside of his email. I can't see that information, I can't see that photo, but Google can," Detective David Nettles of the Houston Metro Internet Crimes Against Children Taskforce told Channel 11. After Google reportedly tipped off the National Center for Missing and Exploited Children, the Center alerted police, who used the information to get a warrant.
  • Others?? (Score:5, Interesting)

    by wisnoskij (1206448) on Sunday August 03, 2014 @11:22PM (#47596657) Homepage
    How does Google do this for just one person? If they suddenly started scanning images for this, you'd think they would uncover a few thousand people at a time. Are we supposed to believe that they specifically targeted him, or that he is the only person ever to send naked pictures of children through Gmail?
  • Good riddance (Score:5, Interesting)

    by penguinoid (724646) <spambait001@yahoo.com> on Sunday August 03, 2014 @11:25PM (#47596675) Homepage Journal

    Both to the pedophile and to the illusion of privacy people had when using Gmail.

    (They have an obligation to report child porn if they find it, but they don't have an obligation to look. My suspicion is Google is not happy about what happened.)

  • by hjf (703092) on Sunday August 03, 2014 @11:39PM (#47596749) Homepage

    Not checksumming or hashing. It's called "feature extraction". I know about it. I made a video about a small program I wrote, based on OpenCV, that can identify a picture I show it through my webcam among 20,000 pictures stored on my computer. It's the only video on my YouTube channel that actually has views.
    Anyway, once you have the features, you can analyze an image and see if it contains any part of any of the images in your database. It doesn't matter if it's slightly blurred, partially covered, or rotated, and it doesn't matter if it takes up the whole screen or just a fraction. In my demo I show how my program recognizes Magic: The Gathering cards in my hand (which is much more difficult than recognizing poker cards).
    Oh, and it does this at several matches per second on a Core 2 Duo class machine.
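    The idea behind matching local features rather than whole files can be sketched in a few lines. This is emphatically not OpenCV's real pipeline (which uses keypoint descriptors such as ORB or SIFT plus nearest-neighbour matching); it's a toy, pure-Python illustration of why matching many small local features tolerates cropping and partial occlusion, with all names invented for the example:

    ```python
    def tile_features(img, size=2):
        """Toy 'feature extraction': collect every size x size pixel tile
        from a 2D image given as a list of rows of grayscale values."""
        feats = set()
        for r in range(len(img) - size + 1):
            for c in range(len(img[0]) - size + 1):
                feats.add(tuple(tuple(img[rr][c:c + size]) for rr in range(r, r + size)))
        return feats

    def contains_fragment(scene, target, threshold=0.5):
        """Report a match if enough of the target's local tiles appear
        anywhere in the scene, even when the target is partially covered
        or the scene is mostly something else."""
        target_tiles = tile_features(target)
        return len(target_tiles & tile_features(scene)) / len(target_tiles) >= threshold
    ```

    Because the decision is a vote over many small local patterns, covering part of the target or embedding it in a larger scene still leaves enough matching tiles to cross the threshold, which is the same intuition that makes real descriptor matching robust.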

  • by felixrising (1135205) on Sunday August 03, 2014 @11:54PM (#47596825)
    I highly doubt this is as nefarious as it seems on the surface. Chances are Google applies hashing to each image that passes through their servers in order to reduce duplication of stored files. Some files may have been flagged before as being child porn, and they set up some alerting for when newly emailed images match a pre-existing hash... no worse than an AV signature match. Note: I'm just guessing here, but there is no way Google has a team of people sitting there scanning every single email. It's all automated, and we have already given express permission for Google to do some content analysis on our emails; that is, after all, how they target advertising at us and turn a profit... Gmail isn't free!
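    The guess above amounts to comparing each attachment's digest against a set of digests of previously flagged files. Here is a minimal sketch of that idea, with a made-up blocklist; note that an exact cryptographic digest breaks if even one byte of the file changes, which is why real matching systems are reported to use robust perceptual hashes instead:

    ```python
    import hashlib

    # Hypothetical blocklist: SHA-256 digests of files flagged in the past.
    # The contents here are invented for the example.
    BLOCKLIST = {
        hashlib.sha256(b"previously-flagged-file-bytes").hexdigest(),
    }

    def is_flagged(attachment: bytes) -> bool:
        """Exact-match check: does this attachment's digest appear in the
        blocklist? Any single-byte change defeats it."""
        return hashlib.sha256(attachment).hexdigest() in BLOCKLIST
    ```

    The fragility is the point: flipping one byte produces an unrelated digest, so exact hashing only catches verbatim copies of known files.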
  • by JThundley (631154) on Sunday August 03, 2014 @11:56PM (#47596833) Homepage

    Were they really snooping around this guy's email for no reason or do they check your attachments against a list of hashes of known child porn?

  • Re:This is chilling (Score:4, Interesting)

    by NoKaOi (1415755) on Monday August 04, 2014 @12:06AM (#47596895)

    Hmm, I don't know. This is the first time I've heard of something like this from Google, so it could have been just an inquiry into a random technical problem, a Google employee suspicious of their neighbor, a Google employee who got a tip-off from his best friend, or anything, really.

    All of those scenarios just go to show that, contrary to what Google has claimed in the past, their employees can and do view emails even without a court order.

  • by sumdumass (711423) on Monday August 04, 2014 @12:50AM (#47597095) Journal

    Google messing up the evidence chain doesn't have to be about the 4th amendment and the police.

    It could go to the legitimacy of the evidence altogether. What assurances can be offered that the photos were not planted by an employee of Google who has a beef with pedophiles? After all, Google did happen to look in this man's private email (which people think is as private as snail mail, even though they gave Google access, knowingly or unknowingly), find a picture, and alert the proper people to make sure something came of it. Even if it was discovered automatically by software, the question of how it got there still arises.

  • by joe_frisch (1366229) on Monday August 04, 2014 @01:56AM (#47597311)

    Which seems like a great way to catch the minor offenders who are trading old pictures, but not the really serious offenders who are producing NEW child porn. One could even argue that it creates a market for new child porn that doesn't have known signatures.

    I wonder if child porn is the only type of material that is checked against a known database?

  • by GNious (953874) on Monday August 04, 2014 @02:19AM (#47597399)

    It is in the ToS, which at least one party (the account owner) has agreed to.

    We can try all we want to compare this to 1984 and whatnot, but if we explicitly allow a company to rummage through our email, we have no basis for complaining when it happens.

    (Note: I can think of at least one country where this part of the ToS would be invalid)

  • Re:Brain surgery? (Score:4, Interesting)

    by SuricouRaven (1897204) on Monday August 04, 2014 @02:41AM (#47597471)

    Because there is no sexual area of the brain. It's a distributed function. You'd have to cut out so much brain they'd end up comatose or dead.

    You can try to suppress the sex drive hormonally, or even by castration. It's still not reliable. There's too much of a psychological element involved: even if you remove the hormones, that doesn't mean they won't still want to look.

  • by AmiMoJo (196126) * <mojo@@@world3...net> on Monday August 04, 2014 @03:33AM (#47597637) Homepage

    We should not be happy that Google is perusing the content of our E-mail with anything but automated tools.

    It is an automated tool. They look for hashes of known illegal images.

    That in itself is worrying because recipients can't control what email appears in their inbox. There are sites out there that offer illegal imagery for download specifically for sending to victims to get them in trouble, or for posting to forums as a kind of trolling.

  • by jeIIomizer (3670945) on Monday August 04, 2014 @06:46AM (#47598267)

    What I see is a claim that the demand has a direct relation with THOSE(in the movie/picture) children being abused.

    Demand forces no one to do anything. The fault lies with those who rape. It's like how, if someone falsely screamed "fire" in a crowded theater and people panicked and harmed others in the panic, the ones at fault would be the people who panicked and caused the harm, not the speaker. Our legal system obviously doesn't see it this way, but I disagree with the legal system.

    What about live theatre? Wouldn't making that illegal be censorship too?

    You are mistaking the action with outlawing the result. It's perfectly valid to break it up if real people are getting hurt, but unless they're taking down videos or images, that isn't happening.

    What we are talking about now is censoring images/videos after they've been created, not live theater.

    You're so intoxicated with your "no censorship" dogma that you failed to sense that people buying these movies/pictures are paying money to pedophiles to rape children.

    Nope. I did not fail to consider that; it's just irrelevant to me. Go after the rapists, and stop trying to harass people who merely view or buy the content.

    Also, you mean "rapists" or "child molesters"; not "pedophiles." Pedophiles simply have a sexual attraction to prepubescent children. They are not necessarily rapists, and do not necessarily even view child porn.

    You're an idiot.

    Nope. Just someone who despises censorship.

  • by Raenex (947668) on Monday August 04, 2014 @06:52AM (#47598293)

    If someone is a child molester, I would think it highly likely that they suffer from a mental illness, and need our help.

    How do you propose to "help" them? I believe there is no effective way to "help" such people beyond castration.

    And the whole "mental illness" angle seeks to remove personal responsibility from the equation. Why not cave in to your worst impulses? You just suffer from a mental illness, and it's up to society to "help" you.

  • by AmiMoJo (196126) * <mojo@@@world3...net> on Monday August 04, 2014 @07:50AM (#47598489) Homepage

    It's obvious why these things happen; it's just that society is powerless to do anything constructive about it.

    There are basically two types of people who rape children. Some are just normal, otherwise healthy people who have a natural attraction to pubescent children below the age of consent. Like it or not, human beings are driven to breed well below the age of consent; it's just our genetic make-up. Most adults understand why acting on this is a bad thing and restrain themselves, and those who don't are incorrectly labelled paedophiles (paedophile refers to someone attracted to pre-pubescent children). It doesn't matter if they actually harmed anyone; the mere fact that they were unable to repress their natural urges and looked at a child with feelings of lust is enough.

    The other group are those with a mental illness who are attracted to pre-pubescent children. They have mental health problems that need to be addressed. Unfortunately, society makes it very difficult for them to get treatment because of the extreme stigma attached to their condition. The media tends to paint paedophiles as monsters, so extremely disgusting that people with that illness do not want to associate themselves with that image in the early stages, when treatment would be most effective and could prevent any actual crimes from taking place.

    Obviously we need to protect children and punish criminals, but the way we go about it now we actually create an environment where people can't get treatment before they become criminals.

  • by arth1 (260657) on Monday August 04, 2014 @08:44AM (#47598753) Homepage Journal

    In other words, you have sympathy for all people except those who go to church.

    Trust me, I have plenty of sympathy for them. The greater the delusion and the greater the damage it causes, the more tragic it is, and the more sympathy I have. Those who molest the minds of not just one or two, but entire generations of children, are truly those I feel the most sorry for. Just as a child molester might believe in and justify their actions with "child love", these sad individuals believe in "god love" and that it justifies crippling a child's mind (and often body) the way they themselves were crippled.
    It is tragic, and I would do anything to offer them help so this can stop.
    Again, I think the best thing we can do is ask ourselves "why" - find the root cause for why people turn into monsters. What happens in people's brains, and how can we offer (not force, but offer) our assistance?

  • by LWATCDR (28044) on Monday August 04, 2014 @09:01AM (#47598871) Homepage Journal

    "Do they really want Google telling the government who owns guns, who visits anti-government websites, what they say in their hangouts about campaigning against the president, etc.?"
    None of those things are illegal.
    How do you feel about making contributions public knowledge? You could get drummed out of a job at a liberal tech company just for supporting a law that passed a public vote but that they didn't like!

  • by smellsofbikes (890263) on Monday August 04, 2014 @11:28AM (#47599941) Journal

    That means only the most incompetent pedos aren't already randomly tweaking their JPEGs; the smart ones are doing it in the EXIF section, so it won't even change the picture.

    The smart implementations probably hash the image payload excluding EXIF, for exactly that reason - maybe downsample and reduce the colorspace too, so trivial tweaks won't have that effect any more.

    This isn't definitive research, but in the early days of G+, some friends posted a lot of porn to see how quickly Google caught and deleted the pictures. What they found was that Google's algorithms, once trained with a picture, could find that picture even if it had been resized, flipped along the vertical axis, or cropped. (One was cropped to the point where it was no longer technically porn, since it was just a person's face, and it still disappeared.)
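    The first idea from the comments above, hashing the image payload while excluding EXIF, can be sketched by skipping a JPEG's APPn metadata segments (EXIF lives in APP1) before digesting. This is only an illustration with synthetic byte strings, not how any real system is known to work; naive byte scanning like this can also misfire on marker-like bytes inside compressed data, which a real implementation would avoid by walking the segment structure properly:

    ```python
    import hashlib

    def payload_hash(jpeg: bytes) -> str:
        """Digest a JPEG byte string while skipping APPn metadata segments,
        so edits to EXIF alone leave the hash unchanged."""
        out = bytearray()
        i = 0
        while i < len(jpeg):
            # APPn segment: 0xFF 0xEn marker, then a big-endian length
            # that counts itself but not the two marker bytes.
            if i + 3 < len(jpeg) and jpeg[i] == 0xFF and 0xE0 <= jpeg[i + 1] <= 0xEF:
                seg_len = int.from_bytes(jpeg[i + 2:i + 4], "big")
                i += 2 + seg_len  # skip marker plus the whole segment
            else:
                out.append(jpeg[i])
                i += 1
        return hashlib.sha256(bytes(out)).hexdigest()
    ```

    With this, two files that differ only inside the APP1 segment hash identically, while any change to the actual image data still produces a different digest. Downsampling and reducing the colorspace before hashing, as suggested above, would go further and also absorb small pixel-level tweaks.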
