
Microsoft Tip Leads To Child Porn Arrest In Pennsylvania 353

Posted by timothy
from the looking-looking-everywhere dept.
Shades of the recent arrest based on child porn images flagged by Google in an email: mrspoonsi writes that a tip-off from Microsoft has led to the arrest of a man in Pennsylvania who has been charged with receiving and sharing child abuse images. Microsoft flagged the matter after discovering that an image involving a young girl had allegedly been saved to the man's OneDrive cloud storage account. According to court documents, the man was subsequently detected trying to send two illegal pictures via one of Microsoft's live.com email accounts. Police arrested him on 31 July.
  • by mrspoonsi (2955715) on Wednesday August 06, 2014 @05:05PM (#47617263)
    Dropbox? Apple iCloud?
    • by will_die (586523)
      Dropbox's usage license already allows them to do something like this, so chances are they do.
    • by EvilJoker (192907)

      I'm more concerned about where the scans extend from here. It would be relatively trivial to include "scene release" pirated content in a similar hash group, and report it accordingly.

      Even worse would be if Dropbox, Google Drive, etc. started scanning OUTSIDE of their directories, or adding new ones without asking. The only thing really stopping this is a matter of volume - hashing that many files would slow down the system too much, and uploading the hashes would take too long. Neither of these is insurmountable.

      • by Jane Q. Public (1010737) on Wednesday August 06, 2014 @06:43PM (#47618159)

        I'm more concerned about where the scans extend from here. It would be relatively trivial to include "scene release" pirated content in a similar hash group, and report it accordingly.

        I think the real point is that any of these companies could have done this at any time. It isn't so much a matter of "Look! they did something great!" (and they did)... it's more a matter of: look at the shitty privacy intrusion they've committed on hundreds of thousands, if not millions, of people, in order to accomplish that one great thing.

        Freedom has a cost. And part of that cost is that some people will get hurt that otherwise might not have been hurt. But it's a cost worth paying, because otherwise millions more pay far more, even if it's only a little bit every day. Eventually that turns into a lot every day. That's not paranoia, that's history. Over and over and over again.

        • Freedom has a cost. And part of that cost is that some people will get hurt that otherwise might not have been hurt. But it's a cost worth paying, because otherwise millions more pay far more, even if it's only a little bit every day. Eventually that turns into a lot every day. That's not paranoia, that's history. Over and over and over again.

          Easier to say when you're not the child getting molested. Which isn't to say you're wrong, but all too often the "FREEEEDOM!!!" crowd misses the very real costs that hurt very real (and very helpless) people. It's not as simple as all freedom all the time; we really do need a healthy balance.

          • by Ol Olsoc (1175323) on Wednesday August 06, 2014 @07:56PM (#47618567)

            Easier to say when you're not the child getting molested. Which isn't to say you're wrong, but all too often the "FREEEEDOM!!!" crowd misses the very real costs that hurt very real (and very helpless) people. It's not as simple as all freedom all the time; we really do need a healthy balance.

            But your problem, sir, and it is a really big one, is this: who gets to decide that balance?

            I worked with a guy who once said, "I don't care if they come into my bedroom and fuck my wife, as long as they keep the country secure." He was willing to give up any semblance of freedom for his "security".

            For you see, there are people here, just as intense as the Freedom people you think are missing the point, who would have every aspect of your life intruded upon, with mandatory searches and no privacy whatsoever.

            That's their idea of a healthy balance.

            But seriously, storing anything in "the cloud" means that it will be looked at. Whoever the fucker is, good for him. There is such a thing as criminal stupidity.

            • Well, it's a good thing we have a whole system of government balanced on the idea of not doing whatever one guy says. Seriously, the number of people who call for government protections to shield them from the government, which they then say they are powerless to influence, is ridiculous.

              If it is noteworthy that we can in fact influence policy, then it is also worth noting that there is no obvious slippery slope between hash-matching child abuse images sent unencrypted over these services and prosecuting people for other kinds of content.

      • hashing that many files would slow down the system too much

        Nope, because they already hash them. It is very common for several people, sometimes dozens or hundreds of people, to store or email the same image, zip file, etc. So to cut down on storage and transmission costs, the images are already hashed, and only stored once.
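The deduplication the parent describes can be sketched in a few lines of Python. This is a toy model (the class and names are invented for illustration, not any provider's actual pipeline): each unique blob is stored once under its SHA-256 digest, so a second upload of the same bytes costs nothing extra - and the same digest table is exactly what a blocklist lookup could be run against.

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: each unique blob is kept once,
    keyed by its SHA-256 digest; re-uploads of identical bytes are free."""

    def __init__(self):
        self.blobs = {}    # digest -> bytes, one entry per unique content
        self.uploads = 0   # logical uploads accepted

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        self.uploads += 1
        # Only store the bytes the first time this digest is seen.
        self.blobs.setdefault(digest, data)
        return digest

store = DedupStore()
a = store.put(b"cat picture")
b = store.put(b"cat picture")   # same content, second user
c = store.put(b"dog picture")
assert a == b and a != c
assert store.uploads == 3 and len(store.blobs) == 2  # duplicates stored once
```

Since every file already gets a digest on the way in, comparing those digests against a list of known-bad values adds essentially no extra hashing work.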

  • The demo 1x1 pixel has the color 0xF0B8A0 and contains child pr0n down-sampled.

    What, you thought numbers being illegal was nonsense too?

  • by Mal-2 (675116) on Wednesday August 06, 2014 @05:10PM (#47617291) Homepage Journal

    Sweet Jesus, if you're going to send things in the clear, you have no idea who might be able to lay eyes on it. This goes for storing things locally -- people have been busted for stored files when they take a machine in for repair as well.

    When in doubt, encrypt. When not in doubt, get in doubt.

    • by AmiMoJo (196126) *

      We don't know the nature of these "child porn" files. Might have been fully clothed "jailbait" that the user didn't think was actually illegal, but which the police take a dim view of, for example. These days the police will claim "child porn" in their press releases when it's really more like teens sexting or something.

  • Seriously, I must not be normal. This is clearly "for the children", there's really nothing morally disputable about this particular case. So, why can't I see it as progress? Why am I worried that it was automatically spotted?

    I need to get my s... straight. Think of the children. Think of the children. The system is good for me. The good guys have nothing to worry about.

    • No you do not (Score:3, Insightful)

      by thieh (3654731)
      This is what studying ethics/morality feels like. And this isn't exactly progress, unless you count "progressing to a police state". Many things in life are conflicts between various fields of interest, and it is up to the philosophers/activists/lawyers/judges/lobbyists/legislature to figure them out.
      • by akozakie (633875)

        Nah, not "to a police state". The difficult part of observing a slippery slope is the admission that "yup, we're mostly there". Otherwise you'll just lie to yourself until you're not just at the end of the slope, but anchored, settled, and playing solitaire with a full family there.

        That's the problem with highly emotional subjects (like pedophilia). You need to consciously limit your emotional response to them; otherwise you will accept as a "lesser evil" things that are really a bigger problem.

        Scale is what matters.

  • Why aren't these guys encrypting their stuff? I would imagine extra care would be taken if they think what they are doing could be morally objectionable... And then it hit me that the NSA works like that too. Always blow on the morally objectionable stuff.
    • because most criminals are stupid...and thank god they are. The authorities are inept enough on their own.
    • Why aren't these guys encrypting their stuff?

      Presumably, the ones that encrypt don't get caught and therefore don't make the news.

    • by amiga3D (567632)

      If you encrypt then they know you're hiding something.

  • by Sonny Yatsen (603655) * on Wednesday August 06, 2014 @05:19PM (#47617357) Journal

    I don't understand the surprise people are experiencing from the revelation that Google and Microsoft scan the stuff you upload to their cloud storage systems.

    You are literally giving them a copy of your files, and generally speaking, you also agreed to allow them to scan your stuff. Google Drive's terms of service explicitly state that your stuff will be scanned:

    "Our automated systems analyze your content (including emails) to provide you personally relevant product features, such as customized search results, tailored advertising, and spam and malware detection. This analysis occurs as the content is sent, received, and when it is stored. "

    Why would anyone reasonably think that their stuff is somehow private when it's in the cloud?

    • You are correct that automated scanning combined with reporting to the government is to be expected in today's political climate. However, you would be incorrect if you asserted that the founding fathers expected the asymmetry where the populace could not similarly examine Lois Lerner's e-mails.
    • Re: (Score:3, Insightful)

      by thieh (3654731)
      The problem usually comes down to the fact that "personally relevant product features, such as customized search results, tailored advertising, and spam and malware detection" didn't include "days in court" or "jail time" in their catalog of "personally relevant product features".
      • Yes, but the ToS also generally states that you won't misuse their services. For instance, Google Drive's ToS states:

        "You may use our Services only as permitted by law, including applicable export and re-export control laws and regulations."

        Using Google Drive for child porn obviously violates this clause, and once that happens, you are at the mercy of the cloud provider on the basis of your having agreed to the Terms of Service.

    • by RobinH (124750) on Wednesday August 06, 2014 @05:34PM (#47617507) Homepage
      My significant other deals with teenagers all the time in schools, and it's amazing how many of them get irate when parents/teachers/police start to question them about stuff they posted on Facebook. The content usually comes to light because one of their "friends" has shown the authorities the content, or in some cases the teen actually friends the teacher/police officer. Their typical response is, "that's my private Facebook page!"
    • Of course they're scanning it. I would have assumed that they're scanning it for viruses/malware, for the sake of deduplication, and to provide indexing so that I can search it. It's been very public that Google also scans your email in order to serve ads, with the assurance (whether it comforts you or not) that this is all done by machines and Google employees don't see your email.

      However, searching email for the sake of reporting illegal activity to law enforcement is a bit concerning. It seems easy to imagine the scope expanding from here.

      • To me, the only thing here that makes the slope a lot less slippery is that they're reportedly doing purely automated scans, comparing against a database of illegal images, as opposed to open-ended heuristics attempting to detect anything suspicious.

        That's a distinction without a difference.

    • by turp182 (1020263)

      And in turn the cloud services are storing very illegal images. It's just due diligence if you ask me.

      I wonder how much staff they have to review this sort of thing (it would be a terrible job if you ask me, like watching the toilets in Southland Tales - which was awesome when combined with the comic book).

    • by houghi (78078)

      Why would anyone reasonably think that their stuff is somehow private when it's in the cloud?

      We live in a time where you can ask: why would anyone reasonably think that their stuff is somehow private?

    • by akozakie (633875)

      This.

      I really do not understand it. How can people (hell, it seems like MOST people) not see that using anything like a storage facility removes any expectation of privacy unless there are clear regulations about that in the agreement? In the real, physical world this is so obvious. Earlier you could count on laziness and scalability problems, but hey - automation!

      I believe that the problem is we simply have not had the time yet to adapt to the thought that, over less than a single generation, processing went from slow and expensive to fast, cheap, and automatic.

  • This is the sound of the panoptic, dystopian police state coming. Good luck everyone!

  • by iamacat (583406) on Wednesday August 06, 2014 @05:22PM (#47617389)

    We all know what kind of files they will scan for next. Because MPAA/RIAA are way more important than children!

    • by mi (197448)

      Because MPAA/RIAA are way more important than children!

      A remarkably stupid statement. And with an exclamation mark too!

      Google and others are doing this for two reasons:

      1. Genuine and sincere disapproval of child pornography, which remains one of the very few things that are still considered wrong by (almost) everybody;
      2. Fear of bad publicity, which would surely ensue when a CP-ring is discovered by other means later and the mail providers get asked uncomfortable questions over why they tolerated it despite having the means to detect it.
      • Google and Microsoft are going through your private data because of power and control. They also receive Government incentives to do so, so gain cash as a side effect. Any claim of altruism is either a delusional fantasy or sock puppetry.

        Just as I said about Google doing the same thing the other day, Microsoft is WRONG to do this. They are not a law enforcement agency and have no right to read through customer data of their own accord. With a warrant from a court, different story, because that is the legal process.

        • by mi (197448)

          They are not a Law enforcement agency and have no right to read through customer data on their own accord.

          Are you aware of a law prohibiting it? I am not, but IANAL... Without such a law, their access to the subscribers' email is controlled only by their own Terms of Service. Which are, of course, subject to change at their discretion.

  • by BradMajors (995624) on Wednesday August 06, 2014 @05:26PM (#47617417)

    In order to successfully perform these matches, Microsoft likely has one of the world's largest collection of child porn.

    • by tlhIngan (30335)

      In order to successfully perform these matches, Microsoft likely has one of the world's largest collection of child porn.

      Actually, no.

      They get a big list of file hashes from the National Center for Exploited Children or something, and it's implemented as part of the file scan. All that happens is they check file hashes and if it matches, then they do more in-depth analysis (is it an image file? etc).

      Which begs the question on the general stupidity, since hashes are so trivially easy to change and it's extremely easy to obfuscate (just zip it up with a password).

      • by godel_56 (1287256) on Wednesday August 06, 2014 @06:06PM (#47617849)

        In order to successfully perform these matches, Microsoft likely has one of the world's largest collection of child porn.

        Actually, no.

        They get a big list of file hashes from the National Center for Exploited Children or something, and it's implemented as part of the file scan. All that happens is they check file hashes and if it matches, then they do more in-depth analysis (is it an image file? etc).

        Which begs the question on the general stupidity since hashes are so trivially easy to change and it's extremely easy to obfuscate (just zip it up with a password).

        People are lazy. Even ones who really know that what they do isn't really appreciated by the general population and really ought to try to cover their tracks... and don't.

        Nope, from TFA: they process the image to derive a signature which can survive things like resizing, changing resolution, etc. It's not just a simple hash.
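A minimal sketch of what such a resize-tolerant signature could look like, using the well-known "average hash" idea (an illustration of the general technique, not Microsoft's actual PhotoDNA algorithm): shrink the image to an 8x8 grid by block-averaging, then record which cells are brighter than the overall mean. A nearest-neighbour upscale of the same picture yields the same 64-bit signature, whereas a cryptographic hash of the two files would differ completely.

```python
def average_hash(img, size=8):
    """Toy perceptual ('average') hash: shrink the image to size x size by
    block-averaging, then mark each cell brighter than the overall mean.
    Small changes (resizing, recompression) barely perturb the result."""
    h, w = len(img), len(img[0])
    cells = []
    for r in range(size):
        for c in range(size):
            # average the pixels falling into this grid cell
            r0, r1 = r * h // size, (r + 1) * h // size
            c0, c1 = c * w // size, (c + 1) * w // size
            block = [img[y][x] for y in range(r0, r1) for x in range(c0, c1)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return [1 if v > mean else 0 for v in cells]

def hamming(a, b):
    """Number of positions where two signatures disagree."""
    return sum(x != y for x, y in zip(a, b))

# a 16x16 synthetic grayscale image: bright diagonal band on a dark background
img = [[255 if abs(x - y) < 3 else 10 for x in range(16)] for y in range(16)]
# the "same" picture resized to 32x32 by nearest-neighbour upscaling
big = [[img[y // 2][x // 2] for x in range(32)] for y in range(32)]

h1, h2 = average_hash(img), average_hash(big)
assert hamming(h1, h2) <= 2   # perceptual signatures stay close after resizing
```

Real systems use far more robust transforms, but the principle is the same: match on what the picture looks like, not on its exact bytes.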

    • by DrXym (126579)
      Not necessarily. The FBI could have supplied Google & Microsoft with a long list of MD5/SHA-1 hashes for abuse images which they obtained in raids or forums, and these providers have programmed their systems to raise a flag whenever they get a hit. Then a human might go in to confirm the match, and from there it's just a matter of informing the police. It may well be that there are other ways of "fingerprinting" an image that are more resilient than a hash code and still useful enough for matching pictures against known material.
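The flag-on-hit flow described above can be sketched like this (the blocklist contents and file names are made up for the example; a real system would route hits to human review rather than act automatically):

```python
import hashlib

# Hypothetical blocklist of SHA-1 digests supplied by an outside agency.
# The sample value is fabricated for this sketch.
KNOWN_BAD = {
    hashlib.sha1(b"contraband-sample").hexdigest(),
}

def scan(files):
    """Return names of uploaded files whose digest appears in the blocklist.
    In a real pipeline, hits would be queued for human confirmation."""
    hits = []
    for name, data in files.items():
        if hashlib.sha1(data).hexdigest() in KNOWN_BAD:
            hits.append(name)
    return hits

uploads = {"holiday.jpg": b"beach photo", "x.jpg": b"contraband-sample"}
assert scan(uploads) == ["x.jpg"]
```

Note that this only catches byte-identical copies; that is exactly why the comment above speculates about more resilient fingerprinting.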
  • by GrumpySteen (1250194) on Wednesday August 06, 2014 @05:29PM (#47617451)

    If only I had a large enough collection of tinfoil hats to sell to all the posters freaking out over this.

  • by jeIIomizer (3670945) on Wednesday August 06, 2014 @05:30PM (#47617465)

    This cloud crap is just trash. At least use encryption (not theirs) or something.

    Plus, they caught someone with images that shouldn't be illegal to have to begin with. When is an actual rapist going to be arrested?

    • by Russ1642 (1087959)

      Just like with drugs they go after the users, then they turn on the dealers, and then they turn on the producers. Gotta work your way up.

      • Just like with drugs they go after the users

        That works so well!

        Gotta work your way up.

        Going after people looking at/sharing images is morally wrong, so it isn't even a viable option. And that's a non sequitur, anyway.

      • by Carnildo (712617)

        You know how they work their way up with drugs? By offering reduced charges/reduced sentences for providing evidence. For example, a drug user will be offered probation/dropped charges for ratting out his dealer, who in turn will have a "possession with intent to distribute" reduced to mere possession for saying who his supplier is, and so on up the line until they find someone big enough to go all-out against.

        The police can't do that with CP. There are no lesser versions of possession, and dropping charges in exchange for cooperation isn't an option.

  • by wiredlogic (135348) on Wednesday August 06, 2014 @05:57PM (#47617771)

    One of these days a hash collision will happen on an innocuous file and the jackboots will ruin someone's life over it.

    • No. Because

      1) These are not simple hashes, for more read the article. They seem to be able to distinguish images in the same way Shazam can recognise music while ignoring bar or vehicle background noise.

      2) Once flagged, the image would be reviewed before further action is taken. A false positive would not make the news.

      Now it's your turn. How will a hash collision lead to the jackboots ruining someone's life?

  • by MindPrison (864299) on Wednesday August 06, 2014 @06:12PM (#47617899) Journal
    ...was actually much more interesting to read than the actual news. Where to start... let's see now:

    - We have a member here who thinks pedophilia is a disease and thinks pedophilia equals abusing children:
    He/she is one of the numerous clueless people out there who have NO idea whether this is actually a disease or just like homosexuality. Arguing with such a person is completely futile, but they'll always be in numbers. It's kind of like voting for stupid. (Yes, that was a H2G2 reference.)
    - We also have several members here who think pedophiles should be arrested and behind bars just for being pedophiles, never mind whether they committed any crimes.
    - We've got the usual anonymous coward zealots who think that if you don't have anything to hide, there is nothing to worry about.
    Wanna bet who's next on tomorrow's "sick" list? It can't possibly be you, can it?
    - We've got the next predictable bunch who immediately attack someone who defends the freedom of the individual, and call them pedophiles, because they can't POSSIBLY be normal or straight if they defend pedophiles, now can they?
    (Who exactly defended whom now?) Never mind the actual facts, just as long as you get YOUR hidden agenda across.
    - And then we have those who think that images of kids being exploited are okay, just as long as you bust the perps behind the images, and not the users.
    (And who are the users now again? Sick pedophiles, or nasty voyeuristic perverts who want to get a kick out of something unthinkable and illegal? And where do we draw the line? Naked kids? Kids posing sexually - and how do you define that? Family photos available to all?) Imagine the number of YouTube and ImageShack users you'd have to arrest, or at least suspect. Who do you trust today?

    I'll let you in on a little secret of mine: for years I've been working undercover together with a police agent who is a close friend of mine to uncover several secret child-abuse rings in various countries - trust me when I say... this is the WORST JOB IN THE WORLD. I got into it because some family members of mine were abused, and I thought I'd use my skills for something good. Over time I learned that although we DID get a lot of these rings busted, we also ruined several families' lives and destroyed childhoods, because the law and common sense don't mix at all.

    Everyone sees red when it comes to child abuse, and rightly so - but it is important... no... VITAL for progress that we somehow keep our heads above water here and try to think rationally. It is NOT rational to point a finger at everyone who wants anonymity as a suspect of anything, it is NOT rational to call every pedophile a CHILD ABUSER, and it is NOT rational to think that if your opinion differs from the stupid masses, you are in LEAGUE with ANYONE who happens to NOT fit your OPINION today (e.g. those who want to HELP PEDOPHILES are NOT necessarily pedophiles themselves, but a lot of the angry mob, especially in here, seem to think so).

    I get upset by this because I think of Alan Turing, who was just recently pardoned by the British for the grave injustice brought upon him just for having a sexual preference he might not even have ANY control over (we're not talking urges and restraint here, we're talking sexual PREFERENCES).

    I do NOT want a society that becomes totalitarian, where every deviant of nature becomes a freak to be hung, burned, and ridiculed just for being different. I see YOUR mind as a private thing, just like your diary. What you THINK of or FANTASIZE about is YOUR BUSINESS ONLY, and NO ONE ELSE's.
    And there is nothing that gets me fired up more than someone using child abuse in every shape and form, fantasy or drawn, real or not, to excuse severe abuse of human rights and to pry into our daily lives with the law in hand... with a lot of supporters who mean well... but really have NO CLUE of the REAL danger they're actually putting themselves in by supporting this ludicrous development.

    Wake up and smell the coffee, people!
    • Re: (Score:2, Interesting)

      by Anonymous Coward

      I live in the UK, which is a country that is genocidal* (deny it as much as you want, people, it won't stop being true) when it comes to people who're accused of being pedophiles, hebephiles, child pornographers or child molesters.

      It doesn't matter if the person is innocent or not; once they're accused, they will wind up being murdered. Example: http://www.telegraph.co.uk/news/uknews/crime/10409326/Man-accused-of-being-paedophile-and-murdered-for-photographing-garden-vandals.html

      Unlike most people, I can ac

  • What about pictures of one's baby or young kid nude - is it illegal to send such images to the kid's grandparents, for example? It can be very hard to tell from just a picture whether abuse took place or not. Sometimes it's clear, and I hope the victims get all the help they need. But in other cases, would these automated systems mark the images as "child abuse" and get the parents in trouble? The topic is so heated that even slight suspicions can lead to big problems.

    • by gweihir (88907)

      Better not do it. Even if you are acquitted a few years later, your life will be ruined.

  • So now that these companies fully admit to scanning content and analyzing the results with the express purpose of policing their users, does this mean they are now liable if they let *any* illegal content through?

    Also, today it's child porn, but what about tomorrow? 'Hate' discussions? Hacking discussions? Warez? Political dissent?

    The easy answer is encrypting before you use the services, but this is a much larger issue with long-term ramifications...

  • If you store anything on a cloud server, it will be looked at. Whether it is for kiddie porn, possible IP infringement, passwords, or just useful business information for competitors, it will be looked at.
