Microsoft Tip Leads To Child Porn Arrest In Pennsylvania 353
Shades of the recent arrest based on child porn images flagged by Google in an email, mrspoonsi writes: A tip-off from Microsoft has led to the arrest of a man in Pennsylvania who has been charged with receiving and sharing child abuse images. It flagged the matter after discovering that an image involving a young girl had allegedly been saved to the man's OneDrive cloud storage account. According to court documents, the man was subsequently detected trying to send two illegal pictures via one of Microsoft's live.com email accounts. Police arrested him on 31 July.
Which company is next in line? (Score:5, Insightful)
Re: (Score:2)
Re: (Score:3)
I'm more concerned about where the scans extend from here. It would be relatively trivial to include "scene release" pirated content in a similar hash group, and report it accordingly.
Even worse would be if Dropbox, Google Drive, etc. started scanning OUTSIDE of their directories, or adding new ones without asking. The only thing really stopping this is a matter of volume - hashing that many files would slow down the system too much, and uploading the hashes would take too long. Neither of these is insurmountable.
Re:Which company is next in line? (Score:5, Insightful)
I'm more concerned about where the scans extend from here. It would be relatively trivial to include "scene release" pirated content in a similar hash group, and report it accordingly.
I think the real point is that any of these companies could have done this at any time. It isn't so much a matter of "Look! they did something great!" (and they did)... it's more a matter of: look at the shitty privacy intrusion they've committed on hundreds of thousands, if not millions, of people, in order to accomplish that one great thing.
Freedom has a cost. And part of that cost is that some people will get hurt that otherwise might not have been hurt. But it's a cost worth paying, because otherwise millions more pay far more, even if it's only a little bit every day. Eventually that turns into a lot every day. That's not paranoia, that's history. Over and over and over again.
Re: (Score:2)
Freedom has a cost. And part of that cost is that some people will get hurt that otherwise might not have been hurt. But it's a cost worth paying, because otherwise millions more pay far more, even if it's only a little bit every day. Eventually that turns into a lot every day. That's not paranoia, that's history. Over and over and over again.
Easier to say when you're not the child getting molested. Which isn't to say you're wrong, but all too often the "FREEEEDOM!!!" crowd misses the very real costs that hurt very real (and very helpless) people. It's not as simple as all freedom all the time, we really do need a healthy balance.
Re:Which company is next in line? (Score:4, Insightful)
Easier to say when you're not the child getting molested. Which isn't to say you're wrong, but all too often the "FREEEEDOM!!!" crowd misses the very real costs that hurt very real (and very helpless) people. It's not as simple as all freedom all the time, we really do need a healthy balance.
But your problem sir, and it is a really big one, is that who gets to decide that balance?
I worked with a guy who once said. "I don't care if they come into my bedroom and fuck my wife, as long as they keep the country secure". He was willing to give up any semblance of freedom for his "security".
For you see, there are people here, just as intense as the Freedom people you think are missing the point, who would have every aspect of your life intruded upon, mandatory searches, and no privacy whatsoever.
That's their idea of a healthy balance.
But seriously, storing anything in "the cloud" means that it will be looked at. Whoever the fucker is, good for him. There is such a thing as criminal stupidity.
Re: (Score:2)
Well it's a good thing we have a whole system of government balanced on the idea of not doing whatever one guy says. Seriously, the number of people who call for government protections of something to protect them from the government, which they then say they are powerless to influence, is ridiculous.
If it is noteworthy that we can in fact influence policy, then it is also worth noting that there is no obvious slippery slope between hash matching child abuse images sent over services unencrypted, and then prose
Re: (Score:2)
hashing that many files would slow down the system too much
Nope, because they already hash them. It is very common for several people, sometimes dozens or hundreds of people, to store or email the same image, zip file, etc. So to cut down on storage and transmission costs, the images are already hashed, and only stored once.
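The parent's point can be sketched in a few lines. This is an illustrative toy (the class and names are invented, not any provider's actual code), but it shows why deduplication means the hashes already exist for free:

```python
import hashlib

# Illustrative sketch of content-addressed storage: each blob is keyed by
# its hash, so duplicate uploads cost nothing and the hash is a byproduct.
class DedupStore:
    def __init__(self):
        self.blobs = {}  # hash -> bytes; each unique blob stored once

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        if digest not in self.blobs:   # only the first upload costs storage
            self.blobs[digest] = data
        return digest

store = DedupStore()
h1 = store.put(b"same cat picture")
h2 = store.put(b"same cat picture")    # second upload: no new storage used
assert h1 == h2 and len(store.blobs) == 1
```

Since the digest is computed on every upload anyway, comparing it against a list of known-bad hashes adds almost no extra work.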
Re:Which company is next in line? (Score:5, Interesting)
Which company is next in line?
What makes you think they have not been parallel-processed?
Microsoft's terms and conditions for its US users explicitly state that it has the right to deploy "automated technologies to detect child pornography or abusive behaviour that might harm the system, our customers, or others".
Now, is it my imagination or does that description cover something like: "Our employees have free access to everyone's files so eventually all pics get viewed and tagged. Because think of the children. Terrorism, fire, brimstone and death!".
TFA says it requires a 'fingerprint', i.e. already having whatever they're looking for archived...
Re:Which company is next in line? (Score:5, Informative)
NCMEC has the collection of actual illegal pictures. They have government permission to have them.
Everyone else (Microsoft, Google, Facebook, etc) just has the list of hash values. Totally legal for them to have.
This system has been public knowledge for at least 3 years. Just google NCMEC and follow the links!
And (since someone always complains) yes, the people running this know what a hash collision is. They are experts with hash functions and image processing.
Re: (Score:3)
NCMEC has the collection of actual illegal pictures. They have government permission to have them.
Everyone else (Microsoft, Google, Facebook, etc) just has the list of hash values. Totally legal for them to have.
This system has been public knowledge for at least 3 years. Just google NCMEC and follow the links!
And (since someone always complains) yes, the people running this know what a hash collision is. They are experts with hash functions and image processing.
Let me give a bit of detail to this. NCMEC has a collection of actual illegal pictures, as does the FBI. This, in turn, can be turned into hash/size tuples... which makes it very, very easy to automate searching for content without (1) needing direct human observation of anything but the content that matches a signature, (2) requiring much work on behalf of Google/Microsoft/Apple, or (3) actually giving pictures of child pornography to the provider. Essentially, it's trivial to repurpose technologies intended
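A hedged sketch of that tuple matching (the flagged entry below is just the SHA-256 of the four bytes "test", standing in for a clearinghouse record; nothing about the real list is implied):

```python
import hashlib

# Hypothetical signature check: the provider holds only (sha256, size)
# tuples supplied by a clearinghouse, never the images themselves.
FLAGGED = {
    ("9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08", 4),
}

def is_flagged(data: bytes) -> bool:
    # A file matches only if both its hash and its size line up.
    return (hashlib.sha256(data).hexdigest(), len(data)) in FLAGGED

assert is_flagged(b"test")        # matches the stand-in entry
assert not is_flagged(b"tests")   # any change to the bytes misses
```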
Re: (Score:3, Informative)
They don't hash the raw file itself they construct a specialised hash based on the image content. It breaks the image up into chunks, analyses those chunks and generates a hash from that analysis. The intent being to make it resilient to cropping, scaling and colour changes. http://en.wikipedia.org/wiki/PhotoDNA
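PhotoDNA itself is proprietary, but the chunk-and-analyse idea can be illustrated with the much simpler "average hash" family of perceptual hashes. Everything below is an invented toy operating on a pretend 8x8 grayscale grid, not PhotoDNA:

```python
# Toy perceptual hash: each cell of a downscaled grayscale grid becomes one
# bit (above or below the mean), so small global edits change few or no bits,
# unlike a cryptographic hash where any edit scrambles everything.
def average_hash(pixels):
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p >= mean else '0' for p in flat)

def hamming(a, b):
    # Number of differing bits between two hash strings.
    return sum(x != y for x, y in zip(a, b))

img = [[10 * r + c for c in range(8)] for r in range(8)]   # fake 8x8 image
brighter = [[p + 5 for p in row] for row in img]           # uniform edit
h1, h2 = average_hash(img), average_hash(brighter)
assert hamming(h1, h2) == 0   # brightening shifts the mean too: hash survives
```

Real systems add resilience to cropping and rescaling on top of this basic idea, but the principle - hash the content's structure, not its bytes - is the same.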
Quick, sue /. as these numbers are illegal ! (Score:2)
The demo 1x1 pixel has the color 0xF0B8A0 and contains child pr0n down-sampled.
What, you thought numbers being illegal was nonsense too?
Re: (Score:2)
Re: (Score:2)
Now that you have posted it here, they will be coming for you! And I think I have to secure-erase a browser-cache....
In the clear? SRSLY? (Score:5, Insightful)
Sweet Jesus, if you're going to send things in the clear, you have no idea who might be able to lay eyes on it. This goes for storing things locally -- people have been busted for stored files when they take a machine in for repair as well.
When in doubt, encrypt. When not in doubt, get in doubt.
Re: (Score:2)
We don't know the nature of these "child porn" files. Might have been fully clothed "jailbait" that the user didn't think was actually illegal, but which the police take a dim view of, for example. These days the police will claim "child porn" in their press releases when it's really more like teens sexting or something.
Re: (Score:3)
Nope, just severely allergic to stupidity. Whether I agree with the law (some parts I do, some I don't), or indulge in that sort of material myself (which I don't) are both irrelevant -- if content you are distributing is likely to cause authorities to intervene if it is noticed, then encrypt that shit. Simple as that. If you are in the habit of moving such content, it's even better to get in the habit of encrypting EVERYTHING so as to obfuscate what is worth attacking and what is not.
Re: (Score:2)
Files generally are not encrypted on a one-off basis; instead they are saved within an encrypted "container". This encrypted container can hold other arbitrary files and will likely use a unique seed value (an IV) to start encryption, both of which ensure that you will not be able to find a reproducible file hash for bad images. What you are describing is essentially a known-plaintext attack, and resisting it is well understood in cryptography.
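A hedged illustration of why a unique IV defeats ciphertext matching. The toy stream cipher below (keystream = SHA-256(key‖IV‖counter)) is for demonstration only - real containers use vetted ciphers like AES - but the property shown, same plaintext yielding different ciphertext on every run, is the same:

```python
import hashlib
import os

# Toy stream cipher for illustration only; do NOT use for real encryption.
def encrypt(key: bytes, plaintext: bytes) -> bytes:
    iv = os.urandom(16)                 # fresh random IV per container
    stream = b""
    counter = 0
    while len(stream) < len(plaintext):
        block = hashlib.sha256(key + iv + counter.to_bytes(8, "big")).digest()
        stream += block
        counter += 1
    ct = bytes(p ^ s for p, s in zip(plaintext, stream))
    return iv + ct                      # IV travels with the ciphertext

key = b"secret key"
c1 = encrypt(key, b"identical file contents")
c2 = encrypt(key, b"identical file contents")
# Same plaintext, same key, different ciphertexts =>
# no reproducible ciphertext hash for a scanner to match against.
assert c1 != c2
assert hashlib.sha256(c1).digest() != hashlib.sha256(c2).digest()
```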
I need therapy (Score:2)
Seriously, I must not be normal. This is clearly "for the children", there's really nothing morally disputable about this particular case. So, why can't I see it as progress? Why am I worried that it was automatically spotted?
I need to get my s... straight. Think of the children. Think of the children. The system is good for me. The good guys have nothing to worry about.
No you do not (Score:3, Insightful)
Re: (Score:2)
Nah, not "to a police state". The difficult part of observing a slippery slope is the admission that "yup, we're mostly there". Otherwise you'll just lie to yourself until you're not just at the end of the slope, but anchored, settled and playing solitaire with a full family there.
That's the problem with highly emotional subjects (like pedophilia). You need to consciously limit your emotional response to them, otherwise you will accept as "lesser evil" things that are really a bigger problem.
Scale is what ma
I was wondering... (Score:2)
Re: (Score:3)
Re: (Score:2)
Presumably, the ones that encrypt don't get caught and therefore don't make the news.
Re: (Score:2)
If you encrypt then they know you're hiding something.
Why wouldn't you think they are scanning? (Score:4, Insightful)
I don't understand the surprise people are experiencing from the revelation that Google and Microsoft scans the stuff you upload to their cloud storage systems.
You are literally giving them a copy of your files, and generally speaking, you also agreed to allow them to scan your stuff. Google Drive's terms of service explicitly state that your stuff will be scanned:
"Our automated systems analyze your content (including emails) to provide you personally relevant product features, such as customized search results, tailored advertising, and spam and malware detection. This analysis occurs as the content is sent, received, and when it is stored. "
Why would anyone reasonably think that their stuff is somehow private when it's in the cloud?
Re: (Score:2)
Re: (Score:3, Insightful)
Re: (Score:2)
Yes, but the ToS also generally states that you won't misuse their services. For instance, Google Drive's ToS states:
"You may use our Services only as permitted by law, including applicable export and re-export control laws and regulations."
Using Google Drive for child porn obviously violates this clause, and once that happens, you are at the mercy of the cloud provider on the basis of your having agreed to the Terms of Service.
Re:Why wouldn't you think they are scanning? (Score:5, Interesting)
Re: (Score:2)
Of course they're scanning it. I would have assumed that they're scanning it for viruses/malware, for the sake of deduplication, and to provide indexing so that I can search it. It's been very public that Google also scans your email in order to serve ads, with the assurance (whether it comforts you or not) that this is all done by machines and Google employees don't see your email.
However, searching email for the sake of reporting illegal activity to law enforcement is a bit concerning. It seems easy t
Re: (Score:2)
That's a distinction without a difference.
Re: (Score:2)
And in turn the cloud services are storing very illegal images. It's just due diligence if you ask me.
I wonder how much staff they have to review this sort of thing (it would be a terrible job if you ask me, like watching the toilets in Southland Tales - which was awesome when combined with the comic book).
Re: (Score:2)
Re: (Score:2)
This.
I really do not understand it. How can people (hell, it seems like MOST people) not see that using anything like a storage facility removes any expectation of privacy unless there are clear regulations about that in the agreement? In the real, physical world this is so obvious. Earlier you could count on laziness and scalability problems, but hey - automation!
I believe that the problem is we simply did not have the time yet to adapt to the thought that over less than a single generation processing went fro
Drip. Drip. Drip. (Score:2)
This is the sound of the panoptic, dystopian police state coming. Good luck everyone!
Hold on to your DVD backups (Score:5, Insightful)
We all know what kind of files they will scan for next. Because MPAA/RIAA are way more important than children!
Re: (Score:2)
A remarkably stupid statement. And with an exclamation mark too!
Google and others are doing this for two reasons
*cough* Bullshit *cough* (Score:2)
Google and Microsoft are going through your private data because of power and control. They also receive Government incentives to do so, and thus gain cash as a side effect. Any claim of altruism is either a delusional fantasy or sock puppetry.
Just as I said about Google doing the same thing the other day, Microsoft is WRONG to do this. They are not a law enforcement agency and have no right to read through customer data of their own accord. With a warrant from a court, different story, because that is the le
Re: (Score:2)
Are you aware of a law prohibiting it? I am not, but IANAL... Without such a law, their access to the subscribers' email is controlled only by their own Terms of Service. Which are, of course, subject to change at their discretion.
Microsoft's child porn collection (Score:5, Funny)
In order to successfully perform these matches, Microsoft likely has one of the world's largest collection of child porn.
Re: (Score:2)
Actually, no.
They get a big list of file hashes from the National Center for Exploited Children or something, and it's implemented as part of the file scan. All that happens is they check file hashes and if it matches, then they do more in-depth analysis (is it an image file? etc).
Which raises the question of the general stupidity, since hashes are so trivially easy to change and it's extremely easy to obfuscate (just zip it up with a password).
Re:Microsoft's child porn collection (Score:4, Informative)
Actually, no.
They get a big list of file hashes from the National Center for Exploited Children or something, and it's implemented as part of the file scan. All that happens is they check file hashes and if it matches, then they do more in-depth analysis (is it an image file? etc).
Which raises the question of the general stupidity, since hashes are so trivially easy to change and it's extremely easy to obfuscate (just zip it up with a password).
People are lazy. Even ones who really know that what they do isn't really appreciated by the general population and really ought to try to cover their tracks... and don't.
Nope, from the TFA they process the image to derive a signature which can survive things like resizing, changing resolution etc. It's not just a simple hash.
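The parent is right that a plain cryptographic hash would be useless for this: flipping even one bit of the input scrambles the digest completely (the avalanche effect), which is exactly why content-based signatures like PhotoDNA exist. A quick demonstration:

```python
import hashlib

# Avalanche effect: a one-bit change to the input flips roughly half of the
# 256 output bits of SHA-256, so a trivially re-encoded image never matches
# an exact-hash blocklist.
a = b"some image bytes"
b = bytearray(a)
b[0] ^= 1                               # flip a single bit of the input
h1 = hashlib.sha256(a).digest()
h2 = hashlib.sha256(bytes(b)).digest()
diff_bits = sum(bin(x ^ y).count("1") for x, y in zip(h1, h2))
assert diff_bits > 64                   # far more than a "small" change
```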
Re: (Score:2)
Some details here: http://en.wikipedia.org/wiki/P... [wikipedia.org]
Re: (Score:2)
I could make a fortune (Score:5, Funny)
If only I had a large enough collection of tinfoil hats to sell to all the posters freaking out over this.
Re: (Score:2)
It only takes a minute to make your own, besides, the ones you're selling are probably bogus ones developed by the NSA with backdoors for mind reading.
foil "how to" video... Re:I could make a fortune (Score:2)
foil "how to" video [youtube.com]
If only I had a large enough collection of tinfoil hats to sell to all the posters freaking out over this.
Why do people even use this garbage? (Score:3)
This cloud crap is just trash. At least use encryption (not theirs) or something.
Plus, they caught someone with images that shouldn't be illegal to have to begin with. When is an actual rapist going to be arrested?
Re: (Score:2)
Just like with drugs they go after the users, then they turn on the dealers, and then they turn on the producers. Gotta work your way up.
Re: (Score:2)
Just like with drugs they go after the users
That works so well!
Gotta work your way up.
Going after people looking at/sharing images is morally wrong, so it isn't even a viable option. And that's a non sequitur, anyway.
Re: (Score:2)
You know how they work their way up with drugs? By offering reduced charges/reduced sentences for providing evidence. For example, a drug user will be offered probation/dropped charges for ratting out his dealer, who in turn will have a "possession with intent to distribute" reduced to mere possession for saying who his supplier is, and so on up the line until they find someone big enough to go all-out against.
The police can't do that with CP. There are no lesser versions of possession, and dropping char
Re: (Score:3)
Does not compute. How is having copies of something in any way, shape, or form similar to receiving stolen goods? Someone loses their goods when they're stolen; that's the point.
This is just government censorship, and pointless government censorship at that; that's something everyone should oppose. None of your useless analogies will convince me otherwise.
Re: (Score:2)
We punish people for receiving stolen goods, because doing so reduces the demand for stolen goods, and reduces the appeal of stealing them in the first place.
So that was your point? I care about it because the stolen goods are actually someone else's property. People's actions are their own; if you steal something, that's your fault. If you rape someone, that's your fault. It doesn't matter if you think someone else wanted you to.
Also, I value fundamental freedoms over safety, so even if I bought into your point, I would still oppose you.
2. Yes, it's government censorship. Yes, the right to free speech isn't absolute - incitement to riot, libel, etc.
It should be. I oppose all government censorship.
3. Since nothing will convince you otherwise, there's no point in continuing this discussion - your beliefs are an article of faith, like those of a New Earth Creationist.
So you're saying that all of your beliefs could be changed through a mere discussion?
Re: (Score:2)
[Citation needed].
Otherwise, for all you know, it could increase the demand because they have to go make it themselves instead of just downloading stuff that already exists.
Hash collision in 3 2 1 ... (Score:3)
One of these days a hash collision will happen on an innocuous file and the jackboots will ruin someone's life over it.
Re: (Score:2)
No. Because
1) These are not simple hashes; for more, read the article. They seem to be able to distinguish images in the same way Shazam can recognise music while ignoring bar or vehicle background noise.
2) Once flagged, the image would be reviewed before further action is taken. A false positive would not make the news.
Now it's your turn. How will a hash collision lead to the jackboots ruining someone's life?
Re: (Score:2)
It won't fall on any programmer's shoulders. All it requires is a prosecutor, judge, and jury who don't understand technology. None are in short supply.
Oh wow, the commenters in here... (Score:5, Insightful)
- We have a member here who thinks Pedophilia is a disease and thinks Pedophilia equals abusing children:
He/she is one of the numerous clueless people out there who have NO idea whether this is actually a disease or something more like Homosexuality. Arguing with such a person is completely futile, but they'll always be in numbers. It's kind of like voting for stupid. (Yes, that was a H2G2 reference.)
- We also have several members here who think Pedophiles should be arrested and behind bars just for being Pedophiles, never mind whether they committed any crimes.
- We've got the usual anonymous coward zealots who think that if you don't have anything to hide, there is nothing to worry about.
Wanna bet who's next on tomorrow's "sick" list? It can't possibly be you, can it?
- We've got the next predictable bunch who immediately attack anyone who defends the freedom of the individual, and call them Pedophiles, because they can't POSSIBLY be normal or straight if they defend Pedophiles, now can they?
(Who exactly defended who now?) Never mind the actual facts, just as long as you get YOUR hidden agenda across.
- And then we have those who think that images of kids being exploited are okay, just as long as you bust the perps behind the images, and not the users.
(And who are the users now again? Sick Pedophiles, or nasty voyeuristic perverts who want to get a kick out of something unthinkable and illegal?) And where do we draw the line? Naked kids? Kids posing sexually (and how do you define that)? Family photos available to all? Imagine the number of youtube and imageshack users you'd have to arrest or at least suspect. Who do you trust today?
I'll let you in on a little secret of mine: for years I've been working undercover together with a police agent who is a close friend of mine to uncover several secret child-abuse rings in various countries - trust me when I say... this is the WORST JOB IN THE WORLD. I got into it because some family members of mine were abused, and I thought I'd use my skills for something good. Over time I learned that although we DID get a lot of these rings busted, we also ruined several families' lives and destroyed childhoods, because the law and common sense don't mix at all.
Everyone sees red when it comes to Child Abuse, and rightly so - but it is important... no... VITAL for progress that we somehow keep our heads above water here and try to think rationally. It is NOT rational to point a finger at everyone who wants anonymity as a suspect of anything, it is NOT rational to call every Pedophile a CHILD ABUSER, and it is NOT rational to think that anyone whose opinion differs from the stupid masses is in LEAGUE with whoever happens to NOT fit your OPINION today (e.g. those who want to HELP PEDOPHILES are NOT necessarily Pedophiles themselves, but a lot of the angry mob, especially in here, seem to think so).
I get upset by this, because I think of Mr. Alan Turing, who was just recently pardoned by the British for the grave injustice brought upon him just for having a sexual preference he might not even have ANY control over (we're not talking urges and restraint here, we're talking sexual PREFERENCES).
I do NOT want a society that becomes totalitarian where every deviant of nature becomes a freak to be hung, burned and ridiculed for just being different. I see YOUR mind as a private thing, just like your diary as a private thing. What you THINK of or FANTASIZE of is YOUR BUSINESS ONLY, and NO ONE ELSE.
And there is nothing that gets me fired up more than someone using child abuse in every shape and form - fantasy or drawn, real or not - to excuse severe abuse of human rights and to pry into our daily lives with the law in hand... with a lot of supporters who mean well... but really have NO CLUE of the REAL danger they're actually putting themselves in by supporting this ludicrous development.
Wake up and smell the coffee, people!
Re: (Score:2, Interesting)
I live in the UK, which is a country that is genocidal* (deny it as much as you want, people, it won't stop being true) when it comes to people who're accused of being pedophiles, hebephiles, child pornographers or child molesters.
It doesn't matter if the person is innocent or not; once they're accused, they will wind up being murdered. Example: http://www.telegraph.co.uk/news/uknews/crime/10409326/Man-accused-of-being-paedophile-and-murdered-for-photographing-garden-vandals.html
Unlike most people, I can ac
Nude kids (Score:2)
What about pictures of one's baby or young kid nude - is it illegal to send such images to the kid's grandparents, for example? It can be very hard to tell from just a picture whether abuse took place or not. Sometimes it's clear, and I hope the victims get all the help they need. But in other cases, would these automated systems mark the images as "child abuse" and get the parents in trouble? The topic is so heated that even slight suspicions can lead to big problems.
Re: (Score:2)
Better not do it. Even if you are acquitted a few years later, your life will be ruined.
liability? (Score:2)
So now that these companies fully admit to scanning content and analyzing the results with the express purpose of 'policing' their users, does this mean they are now liable if they let *any* illegal content through?
Also, today it's child porn, but what about tomorrow? 'Hate' discussions? Hacking discussions? Warez? Political dissent?
Easy answer is encrypting before you use the services, but this is a much larger issue with long term ramifications...
Oh,for chrissake (Score:2)
Re: (Score:2)
My parents have pictures of me that would probably have sent them to jail if they took them today
fuuuuuuck, mine were god damned MOVIES!
Re: (Score:3)
Microsoft (or Google) getting a hit on a flagged image (or on image processing) means that they turn over the results of that hit to LEO.
If LEO works to arrest you based on that information, then you're subject to the justice system like any other suspected criminal.
You can argue that the justice system might have an axe to grind against pedos, and you're probably right, but they're still afforded due process.
Witch hunts describe looking for things that aren't there - you know, witches. Sick fucks with pi
Re: (Score:2)
There is a reason that the red scare is described as a witch hunt, so saying that the term "witch hunt" can only be used for imaginary things is simply false.
Re: (Score:2)
I always thought of "witch hunt" not as referring to the actual pursuit of those engaged in witchcraft (which is not imaginary, btw, only the idea that it works is imaginary, IMHO), but rather as referring to the drive to utterly crucify the subject of the hunt. BURN THEM, do not treat them as a human being, subject them to cruel and unusual punishment.
YMMV.
Re: (Score:2)
This would be like a self-storage site that had a drug sniffing dog.
Re: (Score:2)
Possessing images that authorities subjectively deem "disgusting" is illegal there? The US has "I know it when I see it." The fact that such censorship (based on opinions about subjective matters, no less) is allowed in any 'free' countries shows that free countries don't truly exist.
Re: (Score:2)
Re: (Score:2, Insightful)
Re: (Score:3, Insightful)
No, and that is a ludicrous analogy.
It's more like: if I go into someone else's home, leave my stuff there, and a legally-dubious thing happens to be in my stuff in their house, I should not expect them to simply let it go (considering that a lot of legally-dubious things have clauses about "conspiracy" and "required to report").
Re: (Score:2, Insightful)
Your comparison is perfect, assuming you want people searching through your stuff for legally dubious things. The big issue is that this searching could be expanded to catch other, less harmful files. What if they were searching for generic pornography, leaked government documents, or "backups" of programs/media? Surely that isn't something you'd want.
Re:Trust the Computer. The Computer is your friend (Score:5, Insightful)
Hey, pedophiles have serious mental issues and deserve a special place in prison .
A pedophile is nothing more than a person who is sexually attracted to prepubescent children. Not all pedophiles rape or even look at child porn, and not all child rapists are even necessarily pedophiles.
Also, why do they need a special place in prison? Why not 'normal' rapists, or murderers? Do they also get special places in prison? If not, then why single out this group? Because mentions of 'the children' cause your irrational brain to malfunction?
that you report the existence of any of this kind of stuff because of the harm that is caused to the children.
Voodoo is not real. Voodoo does not exist. Images will not harm people like voodoo dolls. Any 'harm' is caused by their own reaction, assuming that they even see it. But if the mere thought that an image of themselves could be out there is enough to make themselves emotionally unstable, then there is nothing that can be done for them, because censorship is - in practice - futile.
So in this case, it IS for the children and it's hard to argue with the logic.
No, it's easy, and that's because there is no logic; just a strong desire for more and more government control over what information is accessible to people.
Re:Trust the Computer. The Computer is your friend (Score:5, Insightful)
The harm is in the production of the images in the first place, not in the viewing of them. The viewing supports the production. Or the production supports the viewing. I am not sure, given that I do not operate in those circles. From what I have read about it, the consensus seems to be that most kiddie porn is produced by family members abusing their younger relatives.
It can probably be argued that the people making the images would continue to make them even if they did not have an audience to share them with. Even so, there is still some social value in discouraging people from consuming the images. If people are interested in the images, that is a form of social acceptance for those who make the images.
It is bad enough that people have these demons that they struggle with. It is terrible that they abuse those who are too young to protect themselves and in most cases, do not even realize how wrong the activities are. The last thing that we need as a society is to encourage others to consume the evidence of that abuse.
Re: (Score:3, Insightful)
The harm is in the production of the images in the first place
I agree 100%.
The viewing supports the production.
People's actions are their own. If the rapists rape, then it is their fault for raping, whether or not they're doing it for a profit or because they want others to see the videos or images. Going after people who merely look at the content is blaming them for other people's actions, and I don't condone that.
But even if that were true, I'm 100% opposed to government censorship, even if it keeps people 'safe.' So no such arguments will work on me.
The last thing that we need as a society is to encourage others to consume the evidence of that abuse.
The last thing we need is censorship.
Re: (Score:2)
I am willing to agree that blanket censorship is a bad thing.
How can you be opposed to the censorship of child pornography? Please avoid the slippery slope argument. That one has been played out.
Re: (Score:2)
How can you be opposed to the censorship of child pornography?
Because I am opposed to government censorship. I do not believe that censoring something merely because people don't like it, or because you think it influences others in 'bad' ways (in this case, people say it influences people to make more) is okay. It's too subjective for me. But again, I oppose all censorship.
Re: (Score:2)
Let's use an extreme example here. Someone rapes your mom and takes pictures and posts them on the internet. Would you be opposed to allowing your mother to issue a DMCA take down notice? That is censorship.
Following your logic as I think you are laying it out, you would have to be opposed to that too. After all, rape is bad and we are not necessarily condoning rape. We are simply looking at images of something that has already happened. We are not profiting from them. The rapist is not profiting fro
Re: (Score:2, Insightful)
Except we don't arrest people for photos of any other type of crime scene. Those rape photos couldn't be stopped. How would the DMCA even apply?
Re: (Score:2)
The victim is already victimized and will not be un-victimized.
The victim continues to be victimised for as long as the images are public, because they will continue to be confronted with the images (causing them to relive the trauma) and because anyone they meet has potentially seen the images and will treat them differently because of it.
Re:Trust the Computer. The Computer is your friend (Score:4, Insightful)
I agree. And this is why I posed the question to the OP. He is against "any" censorship. I was curious if that also applies to censorship of negative things that happen to someone close to him who he presumably loves and cares for.
It is one thing to try to portray kiddie porn as "just pictures". It is another thing entirely when they are "just pictures" of your child, or your niece.
This is going to be a bit too metaphysical for this audience, but there truly is "good" and "evil" energy in the world. I do not mean in the Christian sense of heaven and hell. I mean real evil. Real, emotional and mental sickness that should have no place in a civilized society. Yet at the same time, an evil that is inevitable given the reality that the universe must be balanced, and that every action must have an equal and opposite reaction. Evil that is the polar opposite of love and compassion and caring.
Re: (Score:2)
This has nothing to do with influencing people. This has to do with the real children who are harmed making them. It is noteworthy that the Supreme Court ruled recently that depictions of child pornography which are simulations - i.e. computer generated images - are not illegal to have.
Re: (Score:3)
The problem is that once you have a censorship system in place, it will see a huge expansion of its mission pretty soon. Whenever a "web block-list for child porn" leaks, the few people brave enough to verify it find that many entries, often the majority, do not concern child porn at all. That is absolutely no surprise to any student of history or human nature.
So, as soon as you allow some censorship, you basically will have it all a few years later. And that is exactly what is being done here: Justify i
Re: (Score:2)
No, distributing media through private channels only has a negative impact on Hollywood -- in all other venues it's a net benefit to the producers.
Re: (Score:3)
The last thing we need is censorship.
No, we need "censorship". There are just some things which legally must be limited in any society, even a free one like ours. You cannot legally incite riots, yell "Fire!" in a crowded theater and stuff like that.
Re: (Score:2)
You know, there is an exception to everything. You can't legally scream "FIRE" in a crowded public place as a joke. That's because the possible harm outweighs the harm of censorship. Even though you obviously disagree, the public, for its own protection, has decreed child porn images illegal. This censorship is held to be in the best interest of society. My own political leanings border on anarchy but in this I have to support censorship as well. To allow publishing of this kind of material is worse t
Re: (Score:2, Insightful)
The viewing supports the production. Or the production supports the viewing. I am not sure,
Well, let me clear it up for you, since it's a pretty simple one-way cause and effect: Production supports viewing. Viewing, in and of itself, does exactly nothing to support anything else.
Purchasing? That could support production. Page views on a site that runs ads? That could support production. Pulling from a site that keeps a record of the number of downloads, such that the uploader gets some kind of gratification
Re: (Score:2)
Actually, there is good reason to believe that most (practically almost all) child abuse does not get documented and hence does not get published on the Internet. The harm to children is when they get abused; pictures are immaterial to that. Yet the "authorities" focus entirely on this tiny fraction of abuse that gets documented and seem to be much, much more intent on suppressing the documentation than on finding out who got abused and who did it in order to stop it. And they seem to be doing basically
Re: (Score:2)
God dammit, you must be on the board of every PTA throughout the country. That kind of reasoning is trotted out for every single mommism in existence. Pot as a gateway drug, video games and violence et al.
Unless you're an exquisite troll and that went right over my head, in which case -- congrats.
Re: (Score:2)
Have you ever considered that all the discussions in the PTA meetings might have at least SOME truth? After all, a lot of people seem to think the same way.
There I go trolling myself...
Re: (Score:2)
That may be true, but the opposite may also be true: if you keep talking about it and saying how bad it is, that may encourage people to do it as well.
Whatever you do, don't push that button. You may not have even thought about pushing that button, but the moment someone tells you not to, you want to.
I don't know how much each effect influences us. I would love to see some studies of reverse psychology on adults. I know it works very well on children.
Re: (Score:2)
"You can't be attracted to or have sex with 14 year-olds" WTF.
First, plenty of teenagers are attracted to people their own age; just because you were never attracted to anyone nor was anyone attracted to you during your junior high years does not mean it does not happen.
Second, plenty of people are attracted to kids that age. I just read about a case in my local, small town newspaper.
Third, there have been plenty of 14 year-olds getting pregnant so we either have a helluva lot of proof that the story of Jes
Re:Trust the Computer. The Computer is your friend (Score:5, Insightful)
Re: (Score:2, Interesting)
I'd put the distribution folks someplace in between "locked away for life to be Bubba's princess" and "go see a shrink", but I'm not going to argue with your approach. In any case, this stuff needs to be dealt with in a pretty harsh way, and we need to get those who are disposed to harm children off the streets. To me that means we need to be taking a close look at the dude (or dudette) caught downloading this stuff, because if you are getting your jollies from it, it's a short distance from looking to doi
Re: (Score:2, Insightful)
And while you're at it, don't forget to drag in all the people who view 'rape porn' and 'snuff porn', and of course all the people who play violent video games.
Because, naturally, if they get their jollies from viewing and playing this stuff, it's only a short distance to doing it. And before you know it, half the adult population will be taking pictures of themselves raping/killing/bashing.
Re: (Score:3)
It is actually not for the children. It is a crusade against pixels. The harm to the children has already been done, often decades earlier. As for people who pay for this stuff, it seems there are almost none. Scanning like this is also quite cheap and produces a lot of "perpetrators" who may never have touched a child. It can easily be adapted to scan for other stuff, like political opinions and the like, once such a system is in place.
There is also the little problem that much of this material does not act
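The "scanning is cheap" point above is worth making concrete. A minimal sketch of the lookup step might look like the following Python, assuming a hypothetical set of flagged hex digests; note that real systems reportedly use perceptual hashes (e.g. PhotoDNA) rather than plain cryptographic hashes, so this only illustrates why per-upload matching costs almost nothing and why the same mechanism could match any content category.

```python
import hashlib

# Hypothetical flagged-content list (hex digests). In a real deployment
# this set could hold millions of entries; membership tests stay O(1).
FLAGGED_HASHES = {
    # sha256 of the bytes b"foo", used here purely as a stand-in entry
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def is_flagged(data: bytes) -> bool:
    """Hash the uploaded bytes and check membership in the flagged set."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in FLAGGED_HASHES

print(is_flagged(b"foo"))  # True: this digest is in the set
print(is_flagged(b"bar"))  # False: unknown content passes through
```

The cost per file is one hash computation plus one set lookup, which is why extending the flagged set to other categories of content requires no new infrastructure at all.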
Re: (Score:2)
For example, it could be a Lisa Simpson drawing. That can already get you put away for decades.