Cleanfeed Canada - What Would It Accomplish?
Bennett Haselton has another article on offer for us today, this time looking at the implications of a Canadian initiative to protect children online. Bennett writes: "Cybertip.ca, a Canadian clearinghouse for providing information to law enforcement about online child luring and child pornography, has announced that a group of major ISPs will begin blocking access to URLs on Cybertip's list of known child pornography sites. A Cybertip spokesperson says that the list fluctuates between 500 and 800 sites at any given time." Read on for the rest of his analysis.
The system is named after a similar filtering system used by service provider BT in the UK. It is also reminiscent of a law passed in Pennsylvania in 2002 requiring ISPs to block URLs on a list of known child pornography sites; that law was struck down in 2004 on First Amendment grounds. Although child pornography is of course not protected by the First Amendment, the law was struck down partly because the ISPs were blocking entire servers and IP address ranges, so that hundreds of thousands of non-child-pornography sites were also being blocked.
Under the Cleanfeed system as planned, representatives from Sasktel, Bell Canada, and Telus say that only exact URLs will be filtered, not other sites hosted at the same IP address. (Conventional Internet filtering programs sold to parents and schools have made the same claim, only to turn out to be filtering sites by IP address after all, so we'll have to wait until the filtering is implemented before we know for sure.) The other difference, of course, is that the Cleanfeed system is not a law, so there's nothing to "strike down" in court. Cybertip did acknowledge that this means customers can get around the filtering for now by switching to a non-participating service provider, although they are encouraging more providers to sign up. Cybertip declined to say whether any providers had simply refused to participate. But of course it's much easier than that to get around the filter, since filter-circumvention sites like Anonymouse and StupidCensorship will not be blocked.
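The distinction between exact-URL filtering and IP-address blocking is worth making concrete, since it's exactly what sank the Pennsylvania law. Here's a minimal sketch in Python; the hostnames, URLs, and addresses are all invented for illustration, and real ISP filters work at the network layer rather than with lookup tables like these:

```python
# Sketch: exact-URL filtering vs. IP-address blocking on shared hosting.
# All hostnames, URLs, and IPs below are hypothetical.

# Many unrelated sites commonly share one IP address (virtual hosting),
# which is why IP-level blocking overblocks.
dns_table = {
    "bad-site.example": "203.0.113.7",
    "family-photos.example": "203.0.113.7",
    "knitting-club.example": "203.0.113.7",
}

url_blocklist = {"http://bad-site.example/abuse.html"}
ip_blocklist = {"203.0.113.7"}

def blocked_by_url(url: str) -> bool:
    """Exact-URL match: only the listed page itself is filtered."""
    return url in url_blocklist

def blocked_by_ip(host: str) -> bool:
    """IP match: every site hosted on the shared server is filtered."""
    return dns_table[host] in ip_blocklist

# Exact-URL filtering blocks only the offending page...
print(blocked_by_url("http://bad-site.example/abuse.html"))  # True
print(blocked_by_url("http://knitting-club.example/"))       # False

# ...while blocking the shared IP takes the co-hosted sites down too.
print(blocked_by_ip("knitting-club.example"))                # True
print(blocked_by_ip("family-photos.example"))                # True
```

The catch, as noted above, is that exact-URL filtering requires the ISP to inspect full requests (typically via a transparent proxy) rather than just drop packets to an address, which is why filters that claim URL precision sometimes fall back to IP blocking in practice.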
So, if it's that easy to circumvent, does it do any good? Even respected Canadian academic and columnist Michael Geist, hardly a friend of censorship in other forms, has spoken out in favor of the plan. I'm going to go out on a limb and say that it doesn't accomplish anything meaningful, and may set a horrible precedent that could make it much easier to block other content in the future.
First of all, it seems obvious that it won't stop anyone who is deliberately looking for child porn. Empirically there's no way to tell -- we don't know whether systems like Cleanfeed in the UK have prevented people from accessing child pornography on purpose. Even if the providers are counting the number of blocked accesses to known child porn sites, nobody knows what people have been looking at instead through proxy sites like Anonymouse. All we can do is ask, logically, whether it is likely to work. Purely logical arguments are frustrating when there is no empirical data to act as a referee, but let's face it: users are not going to self-report on their success at finding child pornography, and there's no way to see what users are accessing through encrypted circumvention sites. Logic is all we have.
So, consider people who are deliberately looking for child pornography. Such people are likely to be resourceful to begin with (since real child porn -- remember, non-sexual pictures of naked children do not count -- is vastly less common than regular porn; Cybertip claims after all that they "only" have about 800 sites on their list, compared to millions of regular porn sites). Virtually all such people would be aware of circumvention sites like Anonymouse, or of peer-to-peer networks, which Cybertip says they have no plans to block. So nothing is blocked from people who want to get around the filter.
The only scenario where the filters could make a difference is the case where someone accidentally accesses a child porn site. Now when I first read the Cybertip press release announcing that the filter would aim to stop "accidental" exposure to child porn, I thought that was just a tactfully sarcastic way of referring to the people who get caught accessing child porn and claim it was just a mistake. But Cybertip.ca claims they've received over 10,000 reports since January 2005 from people who accessed child porn by accident. Even though that only works out to about 15 per day, I have to concede in those cases it almost certainly was a bona fide mistake, for the simple reason that nobody would voluntarily report accessing a child pornography URL that they visited on purpose. But even so, there's the question: What have you accomplished by blocking accidental exposure?
I would argue that the harm done by child pornography is to the minors coerced into the production of it, not to the people who view it. (This, by the way, corresponds with current U.S. jurisprudence; the U.S. Supreme Court ruled in 2002 that a law banning fake child porn was unconstitutional, even when the viewer can't tell the difference.) Obviously you prevent the most damage by stopping child porn at the production stage, but if it's too late for that, you can try to stop people from obtaining it willfully. This lowers the demand and decreases the incentive for people to produce more in the future.
But how would it lower demand if you block people from accessing it accidentally? If those people weren't going to proceed to buy or download more pictures anyway, then they're not fueling the demand. You can block them from accessing the pictures, but the pictures are still out there, and the people who really are fueling the demand can still access them.
So it seems that by blocking someone from accidentally viewing child porn, all you've really accomplished is to avoid offending their sensibilities. Now I don't mean that mockingly, I'm certainly not disagreeing with anyone whose sensibilities are offended by child porn. But there are lots of graphic pictures on the Internet that could offend someone's sensibilities, which are outside of Cleanfeed's mandate. Consider a photo of a 16-year-old having sex, versus a photo of an adult woman fellating a horse; even though the former is illegal to possess and the latter isn't, I think most people would be more grossed out by the second one. (I would even argue that there was more harm to the participants in the making of the second one, and in this case the law's priorities are a bit screwed up. Poor horse!)
So, why block 1% of the content that would offend someone's sensibilities, when 99% of the content that would still offend that person would still be out there? The fact that the 1% is illegal doesn't answer the question; even if it's illegal, you don't have to block it, so what have you accomplished if you do?
Possibly law enforcement is sick of people using the "I accidentally clicked on it" excuse when they get caught accessing child pornography, and wants to remove that as a defense. But couldn't someone just as easily claim that they "accidentally" accessed child pornography through a circumvention site like Anonymouse? They could claim that they thought they were accessing a regular porn site, they were using a circumventor to protect their privacy, and they didn't know that the site carried child porn and didn't find out until they'd already accessed it. So it doesn't seem like the filtering would remove the "accidental" defense.
So, I don't think the filtering accomplishes much at all, but it could set a very bad precedent once the filters are in place. Once Internet users have accepted the precedent that ISPs should block content that is "probably" illegal, what's to stop organizations and lawmakers from demanding that ISPs block access to overseas sites that violate copyright, for example, as the RIAA did in 2002? The technical means will already be in place, and more importantly, people will have gotten used to the idea that legally "questionable" content should be blocked. And with lobbyists claiming that 90% of content on peer-to-peer networks violates copyright laws, wouldn't it follow logically to block peer-to-peer traffic as well?
In a legislative climate where lawmakers have proposed everything from jail time for p2p developers to letting the RIAA hack people's PCs for distributing copyrighted files, we should resist any kind of content-based blocking that would let them get their foot in the door. That includes even well-intentioned efforts like Cleanfeed.
Lame. (Score:2, Insightful)
Why not a safe White List? (Score:1, Insightful)
- simple to implement
- no software install on user end.
Why not? Because protecting children is not the point!
Censorship is what they are after.
It won't work! (Score:3, Insightful)
Here is more about it: http://www.pkblogs.com/thegallopingbeaver/2006/03/canadian-software-will-breakdown-great.html [pkblogs.com]
Best of all, that very tool is now open source!
Simple answer (Score:3, Insightful)
But even so, there's the question: What have you accomplished by blocking accidental exposure?
Well for one, you're potentially protecting yourself from false accusations of accessing child porn, when you legitimately accessed it by accident by clicking on some link where you didn't know what would come up.
That is assuming of course that the agencies won't be using this proxy and filter list to charge people who are blocked with *attempting* to access the material.
What freedoms are you giving up? (Score:3, Insightful)
What freedoms are those, again? If I don't want to sell you Internet access to a given site, well, that's my right under free market principles. If you don't like it, find another provider. If I want to simultaneously limit my corporate liability and improve my public relations by actively preventing people from committing a crime (deliberately accessing illegal content), well, that's my right.
If you want to set up your own ISP in Canada without those restrictions, go ahead. If you want to set up an ISP that only shows web-pages about cats, or muffins, or religion, or science, or whatever, go ahead... it's not illegal.
I think you're confusing hype with loss of freedoms. The entire "child pornography" topic is usually just hype -- because if governments and citizens really cared about child abuse, they'd spend more time finding better ways to monitor and prevent child abuse by parents and relatives.
In the vast majority of cases, the abuser is someone the child knows well, and the abuse is not recorded in any way. Given this truth, why all the fuss over the recordings (the "child porn"), and where is the outrage over the real issue, the child *abuse*? The misplaced focus is depressing...
Infrastructure... (Score:5, Insightful)
Child Porn is just a MacGuffin, a universally despised act that is easy enough to strike up paranoia about. Unlike Terrorism, or Drugs, or Global Warming, or other issues, it has universal political support for legislation dealing with the problem.
Here is how the system is going to be expanded in Canada:
1. Block Child Porn Sites (after all, only a filthy disgusting pedophile would be against blocking child porn sites).
2. Block "Hate" Sites (after all, only a filthy disgusting Nazi would be against blocking hate sites).
3. Block Political "Advertising" (After all, we don't want people with lots of money advertising on the internet, and corrupting our democracy!)
4. Block Dangerous Information (after all, why does someone really need to know how to build a gun, or a bomb, or manufacture drugs)
5. Block Sites that Compete Unfairly (after all, Google has a monopoly on search engines! Canadians shouldn't use an American monopoly, they should use a Canadian search engine, run by the CBC!)
6. Block Sites that Exploit Women (after all, we don't want women to be exploited... that is why we need to ban the Miss Universe pageant website!)
7. Block 'Bad' News Sites (after all, Fox News or Al Jazeera are highly biased news channels... they could confuse the minds of Canadians with their one-sided programs).
And so on, and so forth. Once the infrastructure is in place, it costs NOTHING to expand the list of blocked sites - and it is always easy enough to come up with some sort of reasonable argument why certain sites should be blocked. Once this system is in place and works well, every political party will be screaming to have something they don't like banned - and without any real Libertarian minority in Canada, the only argument will be over what things should be banned.
Yes but... (Score:2, Insightful)
Re:What freedoms are you giving up? (Score:2, Insightful)
Re:Um, distraction, maybe (Score:4, Insightful)
If you (or your wife, or your child) are forced to be photographed nude or engaged in sexual activity to which you have not consented, are you not victimized every time those photographs are seen or distributed? Are you really arguing that there's no victim here?
Re:Um, distraction, maybe (Score:3, Insightful)
Well, the State (insert your favorite government here) is in the business of trying to tell people what to think. Despite the protections on speech and expression in the United States, the government (and certain religious persuasions) would prefer it if you didn't think about these things at all and don't wish to be subjected to the actualizations of the thoughts of others.
Now, I see child porn as morally reprehensible. Frankly, I think you have to be somewhat depraved to enjoy the thought of sexual contact with pre-teens (not to mention [though I will] the dead or animals). But as long as the thought is in your head, and does not lead to overt acts that can be said to be contrary to the social welfare and the welfare of individuals, what you do in your own head is your own business.
Thought has to be the last bastion of privacy we have available. It's the testing ground, where we let loose the demons that plague us regularly and where we can do it in a controlled fashion. I tend to think it's healthier to work out our aggression and rage in the relatively harmless environment of our mind than to let them out into the daylight. Trying to eliminate even the thought of something bad or wrong is a futile gesture at best.
Re:Yes but... (Score:3, Insightful)
You'll be giving up your children's freedom, too. Is that the choice you want to make? Do you want your children growing up in a world where the government is what teaches them good from bad, instead of you? Once we acknowledge that it is appropriate for the government to tell us what is OK to look at and what isn't, we've given away the very rights and freedoms which make us unique, and which make us, well, 'free'.
I recognize that this means that sometimes bad people do bad things, but taking away the rights of all future generations in order to stop rare, individual actions is not the appropriate response. I realize it's easy to knee-jerk and respond by doing the first thing that comes to mind, but 'wisdom' is so-called for a reason - it involves reflection and rational thought, not knee-jerk reactions.
Re:Yes but... (Score:3, Insightful)
What I am talking about is that bills like this one are the sort that grant permission to the government to tell us what is ok and what isn't. It is THAT which I find unacceptable.
Well, then bring on more of the same. Since it's already this way, it MUST be ok. Hmm, I think you'll find large segments of the population disagreeing with you.
You've got it backwards. The culture itself thought it was alright, so the government did too. Likewise, WE, moral beings, believe stealing to be wrong - I don't know about you, but that's why I don't steal. You're saying you'd steal if the government said it was ok, but you really thought it was wrong?
*coughs* *points to display name*.
Balance doesn't always swing the way you hope it will, unfortunately.