Google and Microsoft To Block Child-Abuse Search Terms

mrspoonsi writes "Leading search engine companies Google and Microsoft have agreed on measures to make it harder to find child abuse images online. As many as 100,000 search terms will now return no results that find illegal material, and will instead trigger warnings that child abuse imagery is illegal. The Google chairman said he hired a 200-strong team to work out a solution over the last three months. Google's previous set of measures, which displayed a warning to people attempting to search for illegal material, caused a 20 percent drop in illicit activity."
This discussion has been archived. No new comments can be posted.


  • I imagine that this will work until the child abusers find a way around it.
    • by Joining Yet Again ( 2992179 ) on Monday November 18, 2013 @08:59AM (#45453777)

      If anything, it's something to INCREASE abuse, by:

      1) Redirecting resources away from finding abusers;

      2) Giving the impression that "something is being done already" so resources don't need to be reviewed;

      3) Misidentifying abuse as something which is caused by the availability of images of abuse, when in fact almost all child sex abuse occurs within families or at the hands of trusted acquaintances, for various complex reasons which require careful analysis rather than knee-jerk political reactions.

      • by Pi1grim ( 1956208 ) on Monday November 18, 2013 @09:13AM (#45453887)

        Exactly. Unfortunately this is the tactic of sweeping the dirt under the rug - shutting your eyes and pretending it's not happening. I don't understand it: no one in their right mind thinks that hiding criminal activity reports will stop crime, yet people are sure that if we remove all child abuse pictures from the internet, the problem will solve itself.

      • by ArsenneLupin ( 766289 ) on Monday November 18, 2013 @09:37AM (#45454097)
        Protecting the children is not the point of this. It's done to give the powers that be just another arrow in their quiver to crush the little man if he ever dares to fight against one of their corrupt construction projects, or if he ever dares to do his job too well researching who planted bombs against utility poles in the eighties. At least, that's what it is used for here in Luxembourg.
        • by Xest ( 935314 ) on Monday November 18, 2013 @10:01AM (#45454377)

          Actually, I think it's done for no other reason than to shut Claire Perry and The Daily Mail with their "Stop online porn" campaign the fuck up - yes, that's a real thing.

          Since she was elected this is the only issue she's focussed on. If I were Dave Cameron I'd be pretty sick of hearing her harp on about things she doesn't understand by now too, and would probably do something useless and pointless just to get her off my back.

          Not saying it'll work of course, and not defending it, but I can understand why someone would cave in to a multi-year barrage of whining from that silly cow.

          Now we just need her to suffer the same fate as Jacqui Smith, the last MP who was as whiny and clueless as Claire Perry - Smith got caught charging her husband's porn to her expenses. Karma - it's great.

          • by AmiMoJo ( 196126 ) *

            Wait, I checked the Daily Mail website and it seems to be peddling porn... Even child [dailymail.co.uk] pornography [dailymail.co.uk]. Are they trying to get themselves banned?!

        • Re: (Score:2, Insightful)

          by Anonymous Coward

          Protecting the children is not the point of this.

          (In the US) quite a bit of effort has gone into banning art drawings that appear to depict underage characters, or actors who merely look under 18 (regardless of whether they actually are). So I think it is fair to say that the actual children are long forgotten in this crusade.

      • 4) If pedophiles are prevented from getting this stuff virtually, online, they might turn to doing it themselves and actually molest children.

        I have no idea how plausible that hypothesis is, but it might give some of those knee-jerk political reactions a second thought.
        • Re: (Score:3, Interesting)

          by JackieBrown ( 987087 )

          I think the opposite is probably true. I know that watching women in pornographic videos increases my visualizing women in day-to-day interactions in similar roles.

          One of the best things for my marriage was when we decided to quit watching these types of videos. It moved the focus of sex back to love instead of a sport.

          • by Joining Yet Again ( 2992179 ) on Monday November 18, 2013 @09:57AM (#45454321)

            You understand the difference between "visualising" and "raping", yes? Watching porn did not make you a rapist?

          • It doesn't sound like the porn was the problem. Your reaction to it wasn't typical.
          • Re: (Score:2, Insightful)

            by Anonymous Coward
            A healthy mind, even with tendencies toward socially unaccepted thoughts or occasional actions, is still a healthy mind. It is much different than a mind that doesn't work correctly in the first place. Rape (toward anyone) and abusing children and/or child porn don't stem from a healthy mind. The healthy mind will look back and say, "this is wrong, I have to stop this." The unhealthy mind will continue to harbor fantasies and then eventually act on them. That's not saying they couldn't harbor fantasie
          • by ShanghaiBill ( 739463 ) on Monday November 18, 2013 @12:23PM (#45455703)

            I think the opposite is probably true.

            There is no evidence to support your belief. There have been many instances where the availability of pornography in a society changed, either by legal changes or technology (such as the spread of Internet porn). These changes are correlated with a decline in sexual violence. Here is an overview of the evidence [psychologytoday.com].

            I have known several guys that watched porn compulsively. They all had no relationships with women. The porn was a replacement for actual sex. I don't know if the same is true for pedophiles, but it seems to me that child porn is as likely to reduce molestation as it is to cause it. It seems reasonable to me to ban any porn depicting an actual child, but banning porn using adults posing as children, or animation, should not be done without clear evidence that it is harmful.

      • I see the point you're trying to make, but changing a couple of search sites' reactions to certain search terms entered into them really isn't going to have a detrimental effect. All it's going to do is prevent abusers and non-abusing consumers of the content from getting to the product, which is a good thing and needed to happen anyway. People making incorrect judgements on this sort of thing were misinformed regardless, and anyone who bases their directing of resources based on what google and microsoft
      • by bluefoxlucid ( 723572 ) on Monday November 18, 2013 @12:34PM (#45455811) Homepage Journal

        The issue is massively complex.

        We like the feel-good measures. We "rescued 380 children" last week by finding people associated with a nudist site that had pics of naked kids. The news articles collectively indicate that about 14 children in India were "identified" (not rescued), and that a bunch of teachers and such were removed from schools. In general, the conclusion by the online community is that 380 children were under the purview of teachers who might be into kiddy porn, and so "we rescued 380 children!" In other words: no actual children who were being abused have ceased being abused.

        The actual act of censoring child pornography is highly disturbing in itself. If we're assuming that people who have an internal thought and interest in children sexually are a threat, and thus making child pornography illegal, then we have two problems. The first problem is we're trying to punish thought-crime: child pornography isn't illegal because it's harmful, but rather because we want to punish people for having these thoughts we find personally disturbing. The second problem is we're completely incapable of pursuing enforcement against persons who we've deemed dangerous (for their thoughts), until they take some kind of action.

        That second problem is exacerbated by one questionable hypothesis: with the pornography outlet now blocked, and as risky if not riskier than actual contact, will these people turn to abusing children instead? If they're trying to find satiation and weighing risk, it's obvious that your Internet can be invisibly monitored (and thus is extremely risky), while you can at least manipulate and control children if you can get them to keep secrets (thus the spread of information is slow, if not controllable--and it's absolutely more controllable than the monitoring of your Internet activity). By that calculus it's much "better" to have actual sex with children than to search for child pornography at this point: it's safer.

        The above hypothesis is questionable for two reasons. First: we know that exposure to pornography and other visual effects provides comfort. People start looking at perverse stuff online, then they start watching gay porn, they move to bath houses and start experimenting with homosexuality... it happens, it's a common pattern, and a lot of straight men (and women) have experimented with homosexuality or bondage or whatnot by the cycle of introduction (initial thought or suggestion), curiosity, exposure, and then action. Thus we have another questionable hypothesis: that watching child pornography may acclimate a person to action, leading to actual child sex interactions.

        Another problem: action may come in different forms. Wired ran an article about online sex roleplay services, including everything from vanilla stuff to furry MUCKs (hilarity ensued: apparently a lot of not-furries got on furry sex mucks and were culture shocked). Common sexual exploration includes everything from furry fandom to group sex to, yes, underage roleplay. There are also real-world analogues of this: people actually roleplay scenarios, everything from teacher-student (college) to maids to rape play, up to and including finding young (18-20) and/or young-looking girls who can dress up as even younger girls. Schoolgirl roleplay is common; I've even known a number of girls who, in a nutshell, had the body of a thirteen year old when they were 25-ish--they could dress enough to look young-20s, but if you threw one in jeans and a t-shirt and tennis shoes you would swear she's got to be 12, *maybe* 13. That means there are many perfectly legal ways to act on these fantasies directly.

        So we have a complicated net of censorship, inaction, thoughtcrime, opposing psychological theories on whether outlets help or lead to bigger crimes, and outlets that are physical but provide a harmless mechanism of action. We could also get into some social considerations like the abridged rights of minors and the philosophical concern of this whole age-of-majority thing: apparently minors don't

        • by SuricouRaven ( 1897204 ) on Monday November 18, 2013 @01:26PM (#45456255)

          Take a look at the 2012 CEOP report and you can see some of that feel-good in their dubious statistics.
          http://ceop.police.uk/Documents/ceopdocs/CEOP_TACSEA2013_240613%20FINAL.pdf [police.uk]
          For example, they claim to have identified 70,000 new 'IIOC' files. Except on closer reading, duplicate detections of the same image count more than once, so that figure may be several times higher than it really should be. And of those, 75% are on their 'least serious' scale, a level which includes things you'd find in the family photo album. And one-fifth of them were classified as 'self generated' - most of which are likely young people taking a naked picture for their boyfriend who then sends it to the wider internet.

          My favorite part:
          'The commercial distribution of IIOC on the open internet is estimated to account for a very small percentage of the transactions taking place. This low level is likely to be a result of the large volume of IIOC in free circulation.'
          Yes... piracy is killing commercial child abuse!

          "Schoolgirl roleplay is common"

          Of course it is. For the majority of people, school was the time of sexual awakening and exploration. That's going to leave an impression, so it's no surprise many people want to re-live it.

    • by gweihir ( 88907 ) on Monday November 18, 2013 @09:13AM (#45453879)

      Actually it does not do anything about child abuse. It just hides the problem. People that look at such images are a minor side-issue. The real issue is people that abuse children, and even there those that document their crimes in images or video seem to be a small minority.

      I think this is designed (like so many efforts by law enforcement) to give the appearance of doing something really valuable, while it is likely rather meaningless in reality and may even be counter-productive. If this effort went into preventing children from being harmed in the first place, it might actually accomplish something. Instead they go for an easy, but fake, win.

      • I don't imagine it will do anything about child abuse, but if I wanted to look for pictures of it all day long, there are now 200 more jobs that have been created where I can do so without fear of being caught.
      • Re: (Score:3, Insightful)

        by Gavagai80 ( 1275204 )
        People that look at such images are the ones who make the crime profitable. Without the profit, the crime decreases -- obviously not as much as in the case of say drugs, since a lot of abusers do it for their own enjoyment, but there are still plenty who do it for profit who can be put out of business.
        • by gweihir ( 88907 ) on Monday November 18, 2013 @10:47AM (#45454889)

          That is a red herring. Those that _sell_ this stuff can easily be identified and shut down by a very classical police technique called "follow the money". And that, again, has the added benefit that it may actually help some of those getting abused. Drying up the public business will just drive them underground (remember Prohibition?) and do nothing to help any abuse victim at all.

          • Actually, the CEOP 2012 report states that commercial distribution is almost non-existent.
            http://ceop.police.uk/Documents/ceopdocs/CEOP_TACSEA2013_240613%20FINAL.pdf [police.uk]

            It's all people swapping collections and images with friends.

      • by msobkow ( 48369 )

        Sure it does something. Blocking all the slang terms will make it harder to find child porn.

        Unfortunately it will also make it a lot harder to find perfectly innocent items like "chicken", too.
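
        Something as crude as word matching is all it takes to over-block. A toy sketch of the problem (entirely hypothetical - nobody outside Google or Microsoft knows what matching rules they actually use):

            # Hypothetical word-level blocklist, for illustration only.
            # "chicken" stands in for slang terms that double as innocent words.
            BLOCKED_TERMS = {"chicken"}

            def is_blocked(query: str) -> bool:
                """Flag a query if any word in it appears on the blocklist."""
                return any(word in BLOCKED_TERMS for word in query.lower().split())

            print(is_blocked("roast chicken recipes"))  # True - an innocent query gets caught
            print(is_blocked("holiday photos"))         # False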

    • by methano ( 519830 )
      It seems there is no problem so bad that coders don't think it can be solved by writing more code.
      • by N1AK ( 864906 )
        Neither Google nor Microsoft wanted to do this. They did it after they were threatened. If they hadn't done this, the British government intended to pass legislation requiring them to do something (which could, and likely would, have been even worse from a censorship perspective). I doubt either company has any expectation that this will 'solve' the issue of child pornography online.
    • Actually all I hear is "we're bowing to political pressure" and "using either Google or Bing is a mistake" (and not just for child abuse).

  • by Joining Yet Again ( 2992179 ) on Monday November 18, 2013 @08:53AM (#45453723)

    Please search for and compile the list of 100,000 terms.

    Which will inevitably all:
    - Have double meanings;
    - Be likely to be used by victims of abuse who are looking for help;
    - Be useful for legitimate research;
    - Be searched for by people looking for news or discussion on censorship;
    - End up with a lot of political hot topics thrown in.

    Thanks!

    • by AmiMoJo ( 196126 ) * on Monday November 18, 2013 @09:01AM (#45453791) Homepage Journal

      For example, there is a popular French singer who does a song called "Lolita", presumably after the novel. For that matter the novel itself is perfectly legitimate.

      Anyway, what kind of idiot googles for child pornography? Really, how many users are that dumb?

      • by Thanshin ( 1188877 ) on Monday November 18, 2013 @09:13AM (#45453889)

        Really, how many users are that dumb?

        The answer to that question should be clear* to anyone who uses the word "users" and has over one month of professional experience.

        *: In this context, the word "clear" is to be interpreted as "painfully obvious. Crystalline as one of the axioms on which the universe stands; bright as the one truth all other truths are to be measured against."

      • by gweihir ( 88907 ) on Monday November 18, 2013 @09:21AM (#45453959)

        I think nobody will be that dumb. But on the other side, people may be dumb enough to think that some searches were for such material. That all this has very little to do with children actually being abused seems to escape them as well, because most of the messed up people that abuse children will not document it and the few that do will not put that material online where Google can find it.

        This is designed to give the appearance of "doing something" about child abuse, while really accomplishing nothing. It might be a test-run for a censorship list though.

      • Anyway, what kind of idiot googles for child pornography. Really, how many users are that dumb?

        Obligatory You Must Be New Here.

    • It seems that the filter is less aimed at general filtering of all websites & more towards just those that host the illicit images. The idea I got was that using these terms to search for images would return no results, but a general web search would still have results.
      And since this is done by a set of companies, one would hope that politics would not come into play in how the list of terms is managed. But in this day & age, I highly doubt it.
    • It didn't say that 100k terms returned no results at all. It said that 100k terms returned no child abuse results.

      It looks like what they're doing is removing the sites from their index, not really screwing around with the algorithms (which is why it's possible for Bing and Google to share their work here).
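
      Roughly, the difference between blanking a query outright and de-listing specific pages would look something like this (a toy index with invented names, nothing like either company's real pipeline) - with de-listing, the same query still returns whatever legitimate pages remain:

          # Illustrative sketch only; the terms, sites and data structures are made up.
          INDEX = {
              "lolita": ["nabokov-novel.example", "french-song.example", "flagged-site.example"],
          }
          FLAGGED_QUERIES = {"lolita"}
          FLAGGED_SITES = {"flagged-site.example"}

          def search_query_blocking(query):
              # Strategy 1: the whole query returns nothing at all.
              return [] if query in FLAGGED_QUERIES else INDEX.get(query, [])

          def search_index_removal(query):
              # Strategy 2: only the flagged pages are dropped; legitimate results survive.
              return [s for s in INDEX.get(query, []) if s not in FLAGGED_SITES]

          print(search_query_blocking("lolita"))  # []
          print(search_index_removal("lolita"))   # ['nabokov-novel.example', 'french-song.example']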

      As such, none of the things you mentioned are particularly relevant, because none of them would be removed. In fact, by removing child porn from the results, they would be promoted. You could argue tha

      • It didn't say that 100k terms returned no results at all. It said that 100k terms returned no child abuse results.

        Wait... Google has EVER returned results with pictures of child sex abuse?!

        I suppose I'm lucky in that for the past 15 years I've never accidentally entered the wrong terms, because I've never seen anything I'd regard as an image of child abuse.

        Given this, my concern is what new things they are doing - particularly (see above) re manipulation of text results.

    • by Chrisq ( 894406 ) on Monday November 18, 2013 @09:21AM (#45453953)

      Please search for and compile the list of 100,000 terms.

      Which will inevitably all:
      - Have double meanings;
      - Be likely to be used by victims of abuse who are looking for help;
      - Be useful for legitimate research;
      - Be searched for by people looking for news or discussion on censorship;
      - End up with a lot of political hot topics thrown in.

      Thanks!

      Very true ..... for example, I was thinking of searching for how this technology works, but to do so would mean searching for dodgy things like "child abuse image filter"

      • "child abuse image filter"

        Oh my! You were probably... trying to bypass it. Pedo!

        Your name vill also go on ze list.

        (To think we now do in earnest what we once saw as so wrong that we mocked it...)

    • by Xest ( 935314 )

      Non-Brits won't be able to help. According to the article I read this morning, this is a global thing. Microsoft and Google are going to censor these terms right across the globe.

    • Cue shift in pedo code words. "Anyone know where I can find a farm stand with underripe melons and bananas?" "Looking for a late model used car, less than 13 years old. Must have tiny headlights." "Need small pizza, smothered in sauce, no sausage."

      Tom Lehrer said it best: "When correctly viewed, everything is lewd!"

  • Well, (Score:4, Funny)

    by Zanadou ( 1043400 ) on Monday November 18, 2013 @08:58AM (#45453771)
    I guess children will have to search for abuse some other way, then.
  • Just the Start? (Score:5, Interesting)

    by mrspoonsi ( 2955715 ) on Monday November 18, 2013 @08:59AM (#45453781)
    Fair enough, child abuse is universally against the law (unless there are a few countries without such laws on their statute books), but by the same token murder is illegal the whole world over, and I do not see Google bringing up an "Illegal search" page if you were to type "how to murder someone" - perhaps it will do one day...

    Yesterday I was not allowed to take a single photograph of my daughter who was in a dance competition, to quote "in case it ends up on the internet". This memory (dance competition) will be lost now, because it was not recorded. There was even an announcement: make sure all phones and iPads are kept in your pocket / bag. Something seems very wrong with this endless search for the boogeyman.
    • Re:Just the Start? (Score:4, Interesting)

      by rioki ( 1328185 ) on Monday November 18, 2013 @09:05AM (#45453831) Homepage

      How fitting, the current quote:

      Do you guys know what you're doing, or are you just hacking?

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      They did that at my niece's dance competition, but guess what? They had photographs and recordings that I could buy at some ridiculous price.

    • by Chrisq ( 894406 )

      Fair enough, child abuse is universally against the law (unless there are a few countries without such laws on their statue)

      Read: most countries with sharia-based law [rawa.org]

      • by N1AK ( 864906 )
        Obvious troll is obvious.

        The country with the 5th highest level of child brides is overwhelmingly Christian (80%). Two of the top 4 include a considerable Christian population which participates in child marriage. Child marriage is obviously a big issue in some Islamic countries, but only the ill-informed or racist wouldn't be aware that child marriage has been common in many religions, and is often present because it was common in those countries prior to their conversion to the world's major religions.
    • by gweihir ( 88907 )

      That sounds like people there are in hysterics and have lost all rationality. Even if it ends up on the Internet, so what?

      • Exactly. A picture of a child dancing in a leotard is not child abuse, except when it is found on a pedophile's computer; then it is classed as such. That creates the problem, because it stigmatizes normal images of children - and yes, I class a photo of a child wearing a swimming costume or dance costume as normal. Should I feel odd taking a photo of my child on a beach? A mother would not, but as a man I am open to suspicion.
    • Re:Just the Start? (Score:4, Interesting)

      by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Monday November 18, 2013 @09:33AM (#45454063) Homepage Journal

      Yesterday I was not allowed to take a single photograph of my daughter who was in a dance competition, to quote "in case it ends up on the internet". This memory (dance competition) will be lost now, because it was not recorded.

      Are you keeping a scrapbook? One fun thing to do would be to put a MEMORY REDACTED card in it for every event you're not permitted to photograph for some bullshit reason. Hopefully in 40 years you'll be permitted to look at it and shake your head.

    • by havana9 ( 101033 )
      There were 'official' photographers? I remember that at similar events the presence of a professional photographer implied that unofficial photographers weren't allowed, and one had to buy the official image. If I know in advance that there's a ban on phones, I'll buy a roll of black and white 35 mm film and take my old '70s mechanical camera and its giant and noisy flash lamp. Just for trolling.
      • Yes there were, outside the dance hall at the entrance, taking photos against a white background. I can do that at home (stood next to a white wall); instead I would have liked a photo of the actual dancing, in the event, during the moment.
    • Re:Just the Start? (Score:5, Insightful)

      by Xest ( 935314 ) on Monday November 18, 2013 @10:12AM (#45454511)

      So join or put a question to the PTA demanding the school answer why on earth it's preventing parents from saving memorable moments of their children's upbringing.

      If no one questions it, this shit will keep propagating. I'd wager you're not the only parent pissed off about this, and given that the school wouldn't exist without the parents and their kids, it needs to be stamped out.

    • by RedBear ( 207369 )

      Yesterday I was not allowed to take a single photograph of my daughter who was in a dance competition, to quote "in case it ends up on the internet". This memory (dance competition) will be lost now, because it was not recorded. There was even an announcement, make sure all Phones and iPads are kept in your pocket / bag, something seems very wrong with this endless search for the boogeyman.

      That. Is. Certifiably. Insane.

      I believe there is a step coming up shortly in this descent into madness where we will all be forced to pluck out our eyes, cut out our tongues, puncture our eardrums, surgically remove our genitalia and chop off our hands.

      You know, to make the world safe.

      For the children.

  • by Hartree ( 191324 ) on Monday November 18, 2013 @09:10AM (#45453859)

    You could try to get a secret court order - one Google wasn't allowed to talk about - that made them add noted child pornography search terms like "Edward Snowden" to the list.

  • Depressing job (Score:3, Insightful)

    by dubdays ( 410710 ) on Monday November 18, 2013 @09:16AM (#45453909)
    I'm not going to comment on whether or not this is a good idea. However, I will say that 200 Google employees had to code and test this. That has to be one of the shittiest jobs I can think of. Maybe it could be rewarding in some way, but damn.
    • I think a shittier job would be doing computer forensics. You end up having to see this stuff as well as go testify about it in court. It would become part of your life, inescapable. I'd given some thought to going into forensics, but the thought of that deterred me; I don't think I could hack it. I've heard it said that there's a great personal reward in locking up the pervs, but it seems to me it would come at a great personal price. I wonder what the suicide rate is in the profession?
  • How is keyword blocking going to help abuse victims find recovery resources? I thought most kiddie-pr0n was on the darknet.

    Far more innocents will be hurt than the intended targets.

  • Where
        http://www.simpsoncrazy.com/content/pictures/family/HomerStranglesBart1.gif
    is blocked, while
        http://www.manowar-collection.de/Manowar1984Poster.jpg
    is considered safe.
  • If they can do this, why not block anti-American speech while we're at it? I mean, those people are terrorists right?

  • The people who traffic in this kind of material aren't searching for it on the open internet; they're using TOR networks and hidden FTP sites. The real solution is good investigative work, but that requires resources and effort.
  • They're baiting the MPAA/RIAA by doing this. They're going to get sued by every agency that doesn't want something found, be it torrents or unpopular political views. Slippery slope ready for action.

  • A vastly better idea would be to allow these search terms through, monitor which images / sites were subsequently clicked on, and then provide this information along with IP logs to the relevant law enforcement agencies. In other words, let these freaks hang themselves with their own rope. The most likely consequence of banning these terms is that child porn will be driven underground, into Tor servers and so forth, where it is far more difficult to monitor.
    • This is probably about the casual pedo who hasn't really gotten into it deeply and is exploring to see what's out there. A scary message can make them stop exploring by triggering guilt and fear. Most of them aren't technical enough for the underground.
      • by DrXym ( 126579 )
        They aren't technical enough yet. Blocking search terms just incentivizes people to search for alternate sources of material.
  • Sex Crime!! Sex Crime!!

  • Wow, two for two [slashdot.org] today, eh?

    I predict that this will be about as successful as all other attempts to censor information have been. But don't let that stop you. At least you look like you're "doing something", just like the fool politicians in the other story, right?

  • by Walterk ( 124748 ) <<slashdot> <at> <dublet.org>> on Monday November 18, 2013 @10:09AM (#45454471) Homepage Journal

    This is likely to be hugely ineffectual, as the actual numbers point to a rather different typical abuser:

    In the United States, approximately 15% to 25% of women and 5% to 15% of men were sexually abused when they were children.[33][34][35][36][37] Most sexual abuse offenders are acquainted with their victims; approximately 30% are relatives of the child, most often brothers, fathers, mothers, uncles or cousins; around 60% are other acquaintances such as friends of the family, babysitters, or neighbours; strangers are the offenders in approximately 10% of child sexual abuse cases.[33] In over one-third of cases, the perpetrator is also a minor.[38]

    From: Wikipedia [wikipedia.org]

    So what is this actually supposed to accomplish apart from censorship? What sort of "unsavoury" things are in this list of 100k search terms that are not even illegal? Snowden perhaps?

    • Those stats don't say anything about how many of the acquaintances are motivated at least partially by profit from selling photos/videos. There's little reason to think they'd be any more or less likely than strangers to have that motive.
    • by N1AK ( 864906 )
      But politicians have to, and want to, be seen doing something. They can't invade the family/friend unit without doing something unpopular, so they make a lot of noise, and show a lot of activity, regarding the small fraction that is left. I'm not going to spell out the social demographics, but the chances of abuse in a household vary by orders of magnitude based on certain factors (I won't spell them out because I don't want this to become a debate about those factors). Offering more help in protecting childre
  • by jcochran ( 309950 ) on Monday November 18, 2013 @10:32AM (#45454721)

    Hmm... Seems to me that if google and bing have enough content indexed so as to be able to identify content as matching those 100,000 prohibited searches, then they ought to be able to automatically notify authorities about those web sites holding said prohibited content. Who can then take the appropriate legal actions. Which would target those who are making such content available, not the mere sick individuals seeking such content.

    I've got to wonder why such wasn't mentioned.
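
    If they can already match indexed pages against the flagged terms, the reporting side is barely any extra work - a toy sketch (invented names and data, not anybody's real system):

        # Purely hypothetical: walk the index and report flagged pages instead of merely hiding them.
        FLAGGED_TERMS = {"some-flagged-term"}

        def report_flagged_sites(index, notify):
            for term, sites in index.items():
                if term in FLAGGED_TERMS:
                    for site in sites:
                        notify(term, site)  # e.g. forward the URL to the relevant agency

        report_flagged_sites({"some-flagged-term": ["example-host.example"]},
                             lambda term, site: print("report:", term, site))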

  • by fredrated ( 639554 ) on Monday November 18, 2013 @02:55PM (#45457065) Journal

    I didn't know there were that many words in the English language. Are any left to search with?
