Wikipedia Adds No Follow to Links

netbuzz writes "In an attempt to thwart spammers and search-engine optimization mischief, Wikipedia has begun tagging all external links on its site "nofollow", which renders those links invisible to search engines. Whether this is a good thing, a bad thing, or simply unavoidable has become a matter of much debate." This topic has come up before and the community voted to remove nofollow back in 2005. This new round of nofollow comes as a directive from Wikia President, Jimbo Wales.
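For readers unfamiliar with the mechanism: "nofollow" is just a value of the HTML rel attribute on anchor tags, and a wiki engine can bolt it onto external links at render time. The sketch below is only a toy illustration of that rewriting step (the function name and regex are ours, not MediaWiki's actual implementation):

```python
import re

def add_nofollow(html: str) -> str:
    """Add rel="nofollow" to every anchor whose href is an absolute
    http/https URL (i.e. an external link, in this toy model)."""
    def tag(match: re.Match) -> str:
        attrs = match.group(1)
        if "rel=" in attrs:  # already carries a rel attribute; leave as-is
            return match.group(0)
        return '<a %s rel="nofollow">' % attrs
    # Internal links like href="/wiki/Foo" don't match and stay untouched.
    return re.sub(r'<a\s+(href="https?://[^>]*)>', tag, html)

print(add_nofollow('<p>See <a href="http://example.com/">this site</a>.</p>'))
# <p>See <a href="http://example.com/" rel="nofollow">this site</a>.</p>
```

Real MediaWiki does this inside its parser while building the anchor tags rather than by post-processing HTML with a regex; the pattern here is only adequate for the simple markup shown.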
This discussion has been archived. No new comments can be posted.

  • From TFA:

    Although the no-follow move is certainly understandable from a spam-fighting perspective, it turns Wikipedia into something of a black hole on the Net. It sucks up vast quantities of link energy but never releases any.

    The situation is a classic tragedy of the commons []: does the interest of maleficent spammers outweigh Wikipedia's rôle as a semantic mediator between alien but related nodes?

    Should Wikipedia transition to leaf from cut-point, it may have significant and unforeseen effects on internet-topology.

    • by spun ( 1352 ) <> on Monday January 22, 2007 @03:46PM (#17714322) Journal
      Read the wiki article you link to. The tragedy of the commons only applies to unmanaged resources. Wikipedia is a communally managed resource, so the analogy is less than apt. Your speculation regarding the impact of a no-follow wiki on the rest of the Internet is interesting, though.

      I bring up the point about the Tragedy of the Commons because the parable has been used as an excuse to privatize communally managed resources, when such resources do not fall prey to the Tragedy. Reasoning such as yours could be used to justify the 'privatization' of wikipedia, turning it into an experts-only publication where the public has no input. This would be as bad a misapplication of the lessons of the Tragedy parable as it is when governments and industry collude to privatize such things as water cooperatives, which are public but managed resources and not vulnerable to the Tragedy at all.

      • Re: (Score:2, Insightful)

        I think you have a problem with the concept. The tragedy of the commons happened when common ground was abused because no one stakeholder managing their produce had a stake in the integrity of the common land.

        That applies just as much to Wikipedia as well. No editor or group of editors has a stake in the integrity of Wikipedia when anyone connected to the Internet can undo, vandalize or otherwise screw up what they have written. Still less do they have a stake in the maintenance of encyclopedia standards si
        • by spun ( 1352 ) <> on Monday January 22, 2007 @04:20PM (#17714756) Journal
          Why don't you read the original essay? The tragedy of the commons happened when a common ground was abused because no effective method of management was in place to ensure the integrity of the common land. That method of management could be a single stakeholder, or a communal system of management. The original essay was very clear in regards to the fact that there is more than one way to effectively manage a resource.

          Now, we could argue all day as to whether the system of management wikipedia has in place is effective or not, but we cannot deny that it has such a system. Imagine, would there be a tragedy of the commons if everyone felt free to simply kill all the cows of the offenders? If there was, it would certainly be a different tragedy. That is akin to the management system of wikipedia. No overgrazing, because any one person can nuke every single cow on the planet, and any other person can resurrect every dead cow on the planet.

          An experts-only publication would not be a bad idea. Why don't you start one up and tell me when you get, say, 1/1,000 the number of articles wikipedia has, or 1/10,000 the readers. But don't do it to wikipedia, start your own. Wikipedia already has a system that works well enough. Sorry if you don't like it, but in this free market of ideas, enough people find it useful, as is, to make it one of the most popular sites on the Internet.
          • uncommonly tragic? (Score:3, Insightful)

            by PopeRatzo ( 965947 ) *
            I think Wikipedia's decision to "no-follow" their links is quite reasonable. The Internet has seen enough of the manipulators and astroturfers who try their best to distract us, and it shows the worthiness of the Wiki leadership that they'd take this step.

            The notion that the Internet is going to organically solve such problems smacks of the magic "free-market" economics that are supposed to make the world a paradise, but end up tilting the field in favor of the most powerful. There is no magic that's goi
            • Re: (Score:3, Insightful)

              by ShieldW0lf ( 601553 )
              Personally, I think the idea of a system that says "we're going to link to this resource, and when you're figuring out the topological/popularity metrics of the interweb, we want you to count it and give us and them 'points'; but when we link to that resource, we want you to pretend that we didn't link to it and not give them any points" is retarded.

              Spam and whatnot may be a problem, but this is not the solution. This is just dumb.

              Here's an idea:

              If any site has "no-follow" links on it, that means that not only ar
          • I am exhausted. The first 5 comments were all +5 Interesting or Insightful. I didn't get a troll, link spam, or random offtopic rant to catch my breath on.

            Somebody needs to ban these people. If this trend continues, then /. might get a reputation as a place for on-topic discussion about technology.
        • "Oh the horror. Just imagine if Wikipedia was written only by people who knew what they were talking about. Terrors like that keep me up at night."

 is trying to write such an encyclopedia. It's a small project, but it's pretty active already. It'll be interesting to see how it goes - there's got to be more than one way to do this, after all. See if it interests you.

        • by danpsmith ( 922127 ) on Monday January 22, 2007 @04:59PM (#17715158)
          Oh the horror. Just imagine if Wikipedia was written only by people who knew what they were talking about. Terrors like that keep me up at night.

          I'm sick and tired of this particular beef with wikipedia. Just because you can't quote wikipedia in your thesis for your doctorate doesn't mean it's useless. If you want reliable source material, look elsewhere; if you want an exorbitant quantity of information, Wikipedia has that. It's the quick and dirty resource for people who might just need to know a few things about a subject without having to fact check and such. That's what it should be treated as. The fact that non-experts are allowed to edit entries is what made it grow to be the resource it is today.

          If some of the information is inaccurate, so what? It's not like heart surgeons are looking up how to conduct an operation on Wikipedia. People need to stop beating on its potential for inaccuracy and instead see it as what it is, a great resource for learning about topics or at least a starting point given no other resources. The Internet as a whole tends to have a large amount of inaccurate information, but that doesn't make the Internet useless. The quantity of information largely and fully outweighs the risk of inaccuracy. Everything has inaccuracies anyway, and Wikipedia's usefulness makes any mistakes it has well worth the benefit of having it versus not having it. It's a mighty powerful resource, and I'm tired of hearing it bashed just because some random vandal could and sometimes does screw up a few entries (even though they are usually fixed in a pretty timely manner). It's an online resource, take it for what it is and quit bitching about how one entry out of 10,000 is inaccurate, and just be thankful you have the 10,000 entries. Or better yet, just don't use it if you find it offensive.

          • If you want reliable source material, look elsewhere; if you want an exorbitant quantity of information, Wikipedia has that. It's the quick and dirty resource for people who might just need to know a few things about a subject without having to fact check and such. That's what it should be treated as. The fact that non-experts are allowed to edit entries is what made it grow to be the resource it is today.

            Maybe we should be calling it the "Hitchhiker's Guide to the Galaxy"????
        • Just as a clarification, in addition to what was already said: the tragedy of the commons is _not_ a generic wildcard for any tragedy in any kind of communal resource.

          The essay is on a very specific scenario: over-utilization of an unmanaged resource.

          The original example was this: you have an unmanaged piece of grassland, where all the villagers can bring their cows to graze. For each of the individual farmers, adding one more cow means more profits. Unfortunately the same applies to everyone, so everyone w
      • by jfengel ( 409917 )
        What's the difference between "communally-managed" and "unmanaged"? That is, what's the difference between Wikipedia being communally-managed and the classic field-of-sheep commons? The latter also has community opprobrium to try to keep your usage fair.

        On Wiki you can actually go so far as to remove resource usages you don't find appropriate, but its success so far seems to owe to insufficient value to the trolls and spammers. If somebody were really intent on "overgrazing" wikipedia, automated troll-bots would have no difficulty spewing crap all over it faster than the community could work to revert it. I'll be honest, I'm surprised I haven't seen more of it already.
        • by David Gerard ( 12369 ) <slashdot.davidgerard@co@uk> on Monday January 22, 2007 @04:20PM (#17714746) Homepage
          "If somebody were really intent on "overgrazing" wikipedia, automated troll-bots would have no difficulty spewing crap all over it faster than the community could work to revert it. I'll be honest, I'm surprised I haven't seen more if it already."

          You will be utterly unsurprised to know this happens already ...

          In general, any obvious objection to the idea of a wiki encyclopedia already happens and is already dealt with day to day. We have a ridiculous array of spambots and vandalbots already attacking Wikipedia and trying to turn it to their use, never mind our work trying to write an encyclopedia. So we have an EQUALLY ridiculous array of antivandalbots to deal with these things as needed. Our immune system is quite frightening to contemplate at times ...
          • This is why I don't understand Jimbo's decision. The current system works. When I go to Wikipedia, I am almost always surprised by the *lack* of commercially motivated worthless links. There may be some links that were added by bots, but Wikipedia *always* edits out the ones that aren't genuinely useful. As far as I can tell, there is no problem to solve. Maybe it takes a lot of work to filter all the spam links, but that work is successful.

            I actually am not too worried about this though. I think Goog
          • by jfengel ( 409917 )
            I'm curious to know more about this. Having spent some time on Slashdot, I've seen that there are an awful lot of people with an awful lot of time on their hands, looking for nothing more than to spew filth at somebody whose reaction they'll never see. I just bumped into one in a different thread a few minutes ago. It was anonymous and therefore left at 0, but I still ran into it drilling down into a question I found interesting.

            Wiki removes things a bit more thoroughly, but I know that the trolls are out t
        • by spun ( 1352 )
          Well, another responder answered your question quite handily already, but I will add this: Unmanaged means I have no effective way of dealing with your overgrazing or wikipedia abuse. Communally managed means we can, as a community, keep you from abusing the resource. If common grazing land were like wikipedia, the answer would be nuking all your sheep from orbit and posting a lock and a sign on the commons stating "you must be at least THIS------> Respectable before you can graze your sheep on these com
          • by jfengel ( 409917 )
            the answer would be nuking all your sheep from orbit and posting a lock and a sign on the commons stating "you must be at least THIS------> Respectable before you can graze your sheep on these commons."

            Are you saying that the equivalent happens on wikipedia already? I was under the impression that it still supported anonymous editing.
            • by spun ( 1352 )
              Plenty of controversial articles are locked against anonymous/new-account edits. What happens is two or more editors start nuking each other's sheep, one or more of them complains, a wiki official comes by and resurrects the favored sheep, and then posts a lock on the gate.
      • Re: (Score:3, Interesting)

        by frankie ( 91710 )

        The current problem with Wikipedia is more of an offshoot from Tragedy of the Commons. In the grand tradition of Slashdot analogy-stretching:

        • Wikipedia is the field
        • long-time users are the (semi-enlightened, self-regulating) farmers
        • HOWEVER, thousands of new farmers have arrived in town, with more every day
        • AND it turns out that at least half of them are actually human-shaped insects a la Mimic [] trying to devour the field AND the cows

        In all seriousness, Wikipedia has simply outgrown its youthful innoce

        • Re: (Score:2, Insightful)

          Peer-reviewed anarchy breaks down after a sufficient quantity of greedy scumbags show up.

          Very true, and that's true of any democracy - that is, one where each individual within it has exactly the same amount of power. The only variable is the amount of time it takes to break, and the reason is not some inherent flaw in the system of government (or the abstract idea of individual freedom that it provides) - it's simply due to the fact that there are always a bunch of scummy assholes out there who will be

    • by rossifer ( 581396 ) on Monday January 22, 2007 @04:02PM (#17714536) Journal
      Should Wikipedia transition to leaf from cut-point, it may have significant and unforeseen effects on internet-topology.
      Wikipedia will remain a node-cluster in the larger web. The only difference is that for Google ranking, they no longer contribute to the ranking of outside websites. This will not stop people from putting relevant external links on Wikipedia pages, it just reduces the benefit to the linked site.

      In my experience as a forum webmaster, there is simply no other choice. Any place where the unverified public can put up links, spammers will put up links to their crap, which do more than just use your resources for their ends. If Google notices that your site seems to have become a spammer link-farm, your entire site will very likely be removed from Google, with all of the bad mojo that entails. So, on any page where the unverified public can put up links, those links must be "nofollow", or else...

      Personally, I'm astonished that Wikipedia hasn't done this from the beginning.

      • by David Gerard ( 12369 ) <slashdot.davidgerard@co@uk> on Monday January 22, 2007 @04:22PM (#17714784) Homepage
        "Personally, I'm astonished that Wikipedia hasn't done this from the beginning."

        All the Wikipedias other than English have had this in place already. It's just that the flood of spammers has been so bad on English Wikipedia we've finally had to put it on there too.

      • Re: (Score:2, Insightful)

        by cheater512 ( 783349 )
        No. They still contribute Pagerank to other websites.

        The pagerank just leaks out from other places. MediaWiki's main site is a good example.
        Also the other language wikis don't have nofollow, so they will get a massive boost.

        I'd really hate to be at google at the moment. Search results will be doing really funny things in the next month or so.
        • Re: (Score:3, Informative)

          I'd really hate to be at google at the moment. Search results will be doing really funny things in the next month or so.

          This is why I feel that Google needs to provide multiple indexing algorithms, where a user can decide how pages are ranked in their search results. This would make things a bit more complicated for Google, but even more complicated for the people trying to target deficiencies in the algorithm. The idea being that if there are multiple algorithms, it is hard to know which one to target.
          • I guess the answer would be, attack all of them. The spammer problem seems to me to be the antithesis of the 'whack-a-mole' online pirate. They'll always be there. Maybe I'll put up with the evil spammer if it means the continued good of free stuff.
    • by Lazerf4rt ( 969888 ) on Monday January 22, 2007 @04:27PM (#17714836)
      ...outweigh Wikipedia's rôle as a semantic mediator between alien but related nodes?

      False premise. Wikipedia is not a "semantic mediator between alien but related nodes". Wikipedia is just a free encyclopedia.

      The only reason why an external link should be placed in Wikipedia is because that external link is already significant in some way. Wikipedia does not exist to make those external links any more significant than they already are. It seems to me that is the essential point of the Wikipedia policy, Wikipedia is not a soapbox [].

      So, since there is no such "tragedy of the commons", Wikipedia is free to tag their links "nofollow" if they want to. If it raises Wikipedia's search results over the external links in Google, good for them. That's the way it should be. These bloggers who nitpick about Google PageRanks 24/7 strike me as a bunch of whiners, frankly.

  • by XorNand ( 517466 ) * on Monday January 22, 2007 @03:31PM (#17714112)

    "nofollow" only exists because Larry Page and Sergey Brin had a (at the time) brillant idea of ranking webpages according to how many sites linked back to it... and now that method of determining relevance is broken. Prior to this innovation, most search engines relied upon META tags... which also eventually broke. Google is where it is today because they recognized that the web had evolved past META tags (and other techniques of self-describing content).

    My point is that the Internet as a whole shouldn't be tripping over itself because Google's invention too is now obsolete. The "nofollow" attribute is just an ugly hack created to accommodate the frequently-gamed PageRank algorithm. We should instead find new ways to determine relevance. Hey, if your idea is good enough, you might even find yourself a billionaire someday too. Who knows, maybe the next wave will also wash away all those god-forsaken AdSense landing pages and domain squatters (oh please, oh please, oh please...).

    • by RAMMS+EIN ( 578166 ) on Monday January 22, 2007 @03:42PM (#17714242) Homepage Journal
      ``Google is where it is today because they recognized that the web had evolved past META tags (and other techniques of self-describing content).''

      More like meta tags never worked. Much better to judge the content of a page by...looking at the content. Only a fraction of pages included meta tags, anyway.
    • by Dan Farina ( 711066 ) on Monday January 22, 2007 @03:42PM (#17714244)
      Actually this sort of flow model was well documented in IR, AI, and mathematics research long before Google. While credit should be delivered for implementing this scheme in a world of already-entrenched search engines, it falls into the category of age-old computer science. This same scheme is also used to compute the steady-state probabilities of Markov chains -- a technique at least 30 or 40 years old.

      In a nutshell: the eigenvalues of the adjacency matrix.
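For the curious, the link-flow model the parent comments describe can be sketched as power iteration on the damped link graph. This toy (the names and the 0.85 damping factor are the conventional textbook choices, not Google's production system) also shows why a nofollow'd link simply never contributes rank: it just isn't in the graph.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank by power iteration. `links` maps each page to the
    list of pages it links to; a nofollow'd link would simply be left
    out of that list, so it passes no rank to its target."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Everyone starts with the "teleport" share.
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if outgoing:
                # A page splits its damped rank evenly among its links.
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new[target] += share
            else:
                # A dangling page spreads its rank over every page.
                for p in pages:
                    new[p] += damping * rank[page] / n
        rank = new
    return rank

# Three pages: two link only to "wiki", which links back to both.
ranks = pagerank({"wiki": ["a", "b"], "a": ["wiki"], "b": ["wiki"]})
```

With everything pointing at "wiki", it ends up with roughly twice the rank of either leaf, which is exactly the "link energy" concentration the black-hole comment above is worried about.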
    • Re: (Score:3, Interesting)

      by garcia ( 6573 )
      In addition to what you have mentioned above, Wikipedia should not be given the weight it is in Google rankings, period. My Wikipedia user page should not show up as a top five return for a Google search of my name. It shouldn't show up at all simply because it's not as important as the other information out there on me.

      The only reason the Wikipedia user entry exists is because Google does rank the pages *very* highly. Bleh.

    • by jfengel ( 409917 )
      Is the hack really all that ugly? It actually strikes me as rather elegant: rather than looking at something tangential to the page itself, like a META tag, it's looking at something fundamental about the nature of the web. The notion that a page donates some of its importance to other pages seems quite elegant to me, and the NOFOLLOW tag is a simple extension of that notion: "Even though I'm linking to this page, for whatever reason I don't consider it important."

      Open user-editable web sites like Wiki shou
    • by Kjella ( 173770 )
      We should instead find new ways to determine relevance.

      We've tried letting webpages describing their relevance. (Meta tags)
      We've tried letting others describe a webpage's relevance. (PageRank)

      Short of spiritual divination and feng shui, how many other models could there be?

      There's of course the "expert" model, but it has plenty of issues with bias, shills, and not least of all cost and scope; it's just not feasible to review even a fraction of a fraction of the sites Google reviews daily. If you let everyone be
      • by Baricom ( 763970 )
        I've always wondered if grammatical scoring would be helpful. This is not beyond the capabilities of today's computers, because I'm not proposing understanding the content - only whether the grammatical syntax seems reasonable. SP4m d1ffernt b1cause looks differnt, and it shouldn't be too hard for a search engine to notice obvious flaws in language. This has the added benefit of forcing spammers to use proper grammar, which should then be easier to understand. A simple comparison to other pages can chec
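The grandparent's grammatical-scoring idea can be illustrated with an intentionally crude check. This toy function (our own invention, not any real spam filter) just measures how often digits are mixed into words, the "SP4m" effect mentioned above:

```python
import re

def leetness(text: str) -> float:
    """Fraction of whitespace-separated words that mix digits in with
    letters (e.g. 'SP4m'), a crude proxy for obfuscated spam spelling."""
    words = re.findall(r"\S+", text)
    if not words:
        return 0.0
    mixed = sum(
        1 for w in words
        if re.search(r"[A-Za-z]", w) and re.search(r"\d", w)
    )
    return mixed / len(words)

leetness("SP4m d1ffernt b1cause looks differnt")  # 3 of 5 words -> 0.6
leetness("Buy cheap watches today")               # 0.0
```

A real grammatical scorer would of course need to parse syntax rather than count digit-words, but even this illustrates the trade-off the comment raises: it forces spammers toward ordinary spelling, which other filters can then read.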
      • by XorNand ( 517466 ) *
        Bah... there's always a way to build a better mousetrap. In fact, I personally am working on one that I feel is quite good. And there are experts in natural language processing and statistics that I'm sure could innovate circles around me in this area. In a few years I'm sure web search will be a whole new ballgame (again).
    • Re: (Score:2, Informative)

      by VENONA ( 902751 )
      Actually, the nofollow concept predates Google's rel attribute: a "nofollow" robots META directive has existed since the mid-'90s, while Google launched in 1998. Its original intent really was "don't follow these links", not the "don't credit this link in ranking" that Google and some other engines mutated it into, which is what turned it into the ugly hack that you described it as.

      I don't really subscribe to the Google==Good viewpoint commonly seen on Slashdot. I'm not saying Google==Evil, just that very little in this world is an unalloyed good, and that very much applies to Go
  • by dreddnott ( 555950 ) <> on Monday January 22, 2007 @03:35PM (#17714170) Homepage
    While I don't necessarily disagree with the reinstatement of Wikipedia's nofollow policy, I do have to say one thing: Jimbo Wales is a tool.

    Yesterday, after reading and noting glaring inconsistencies in the Wikipedia articles and talk pages for Wikipedia [], Larry Sanger [], and Jimbo Wales [], as well as Jimbo Wales' user page [], I lost a bit of respect for Wikipedia and a lot more for one of its cofounders. I can't believe he's trying to manipulate his encyclopedia project this way!
    • by AlexMax2742 ( 602517 ) on Monday January 22, 2007 @04:23PM (#17714798)
      You are a fool if you think that the stupidity stops there: when Wikipedia gives sysop privileges to batshit insane people like this guy [], and he somehow managed to keep said privileges for as long as he did (the only reason he lost them is because he picked a fight with another abusive admin), you know that there is something fundamentally wrong with Wikipedia.

      Now if only someone can unprotect this article []...
      • Yeah, MONGO is quite a character. I was going to contribute to the various articles on the September 11th World Trade Center attacks last year, and while reading the talk pages I realised that nothing productive was going to happen while MONGO was an admin. The articles are much better now (and no, I'm not one of those whacked 911truth guys either).

        I didn't really see Seabhcan as an abusive administrator, but maybe that's just the Irish in me.

        Encyclopedia Dramatica, well, I'm not 100% sure that it needs to h
    • Jimbo Wales has tools: his little minions, who are trying hard to whitewash history by making him the only founder of Wikipedia. Just try changing founder to co-founder on his user page and watch them swarm all over your ass. He "suggests" and his little minions scurry, as if doing a favor in Jimbo's eyes will make them more Powerful and Important.
      • Flaming rhetoric aside, I tend to agree with your sentiments, although I was very impressed with how the Wikipedia editors dealt with Jimbo on his article's talk page [].

        My favourite entries:
        "Co-founder" is simply false, and we have reliable sources which report that I have called it, on the record, in the press, "preposterous". That is definitive as to it being controversial, and therefore if you want Wikipedia to take a stand on it, you want Wikipedia to push a particular point of view.--Jimbo Wales 17:12,
  • by chris_eineke ( 634570 ) on Monday January 22, 2007 @03:40PM (#17714226) Homepage Journal
    How does the rel="nofollow" attribute render links invisible to search engines? It's up to the search engines to ignore or to regard them.

    If you don't want search engines to follow links on your website(s), you could rely on them to give you a proper agent string so that you can serve pages that don't include hyperlinks. But that's ugly nonetheless.
    • If you don't want search engines to follow links on your website(s), you could rely on them to give you a proper agent string so that you can serve pages that don't include hyperlinks. But that's ugly nonetheless.

      If a search engine detects you're serving significantly different content to its robot than you are to the rest of the web (e.g. by comparing the contents served to a different IP with a web browser user agent string) it will probably erase your entire site from its index.
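The cloaking check described above can be sketched as a set comparison between the page a crawler was served and the page a browser was served. The function and threshold below are hypothetical, meant only to illustrate the idea, not how any real engine actually detects cloaking:

```python
import re

def looks_cloaked(bot_html: str, browser_html: str, threshold: float = 0.5) -> bool:
    """Crude cloaking check: if more than `threshold` of the links a
    browser was served are missing from the copy served to the crawler,
    flag the site as serving 'significantly different content'."""
    bot_links = set(re.findall(r'href="([^"]+)"', bot_html))
    browser_links = set(re.findall(r'href="([^"]+)"', browser_html))
    if not browser_links:
        return False
    missing = browser_links - bot_links
    return len(missing) / len(browser_links) > threshold
```

Serving identical pages to both never trips it; stripping most of the links from only the crawler's copy, as the parent comment proposed, is exactly what it flags.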

      • Thanks. You explained the 'ugly' part of the solution. I did not think of that. What would you say 'significantly different content' means in this context? Should a search engine penalize web sites that strip out links (and only that)?
  • pointless (Score:5, Insightful)

    by nuzak ( 959558 ) on Monday January 22, 2007 @03:44PM (#17714294) Journal
    Nofollow doesn't work if you just put the URL directly in the text, and Google will treat them more or less as links (to the site at least, though possibly not the path).

    The way to fix this is with stable versions -- you don't let search engines see unstable versions at all. But having looked at the craptastic mediawiki codebase, I can sympathize with them not wanting to bother with adding such a major feature.
  • How would it be unavoidable? They could have avoided it by...simply not doing it...couldn't they?
  • by fyoder ( 857358 ) on Monday January 22, 2007 @03:45PM (#17714314) Homepage Journal

    This won't solve the problem, since humans may still follow the links, so it's still worthwhile for spammers to have links in Wikipedia. Even if it doesn't up their pagerank, Wikipedia can still serve them as a spam delivery system.

    However, it helps Google by not upping spammers' pagerank. And less noise in the search results is good for the users of Google.

    • There are plenty of better ways to game Google than Wikipedia links. The entire SEO industry is designed to increase your pagerank on given keywords, and if you have enough money, they will produce results. You can pay your way to a #1 google ranking relatively easily and inexpensively (well, inexpensive for a corporate marketing department at least.)

      This just probably will slow the crapflood of googlebombing links on Wikipedia, which take editors' resources to find, remove and keep removing. Most of the 'n
  • by Anonymous Coward
    Will Wikipedia face the same fate as the Open Directory Project, where marketers have spammed the site to render it useless? Check out the ZDNet post... []
  • by MBraynard ( 653724 ) on Monday January 22, 2007 @03:48PM (#17714338) Journal
    How about creating a new Google-style Ranking system that only ranks sites based on the number of no-follow links heading towards them?
  • Not invisible (Score:4, Interesting)

    by truthsearch ( 249536 ) on Monday January 22, 2007 @03:52PM (#17714404) Homepage Journal
    which renders those links invisible to search engines.

    Uh, not really. The big search engines choose to not follow those links.

    Using nofollow reduces the incentive for spammers, but in this case it will hurt search engines. Google wants to provide the most worthy links at the top of search results. Being linked from wikipedia is supposed to denote reliable sources or very relevant information. Therefore Google is slightly more accurate for having those links to follow in wikipedia. The nofollow will make search engines slightly less useful.
    • Being linked from wikipedia is supposed to denote reliable sources or very relevant information.

      Is it? We all know that in practice, the only thing having your link in a Wikipedia article actually means in the real world is the last person to edit the article either thought it belonged, or didn't happen to look into it. It's the ill-advised prestige people seem to attach to a Wikipedia-linked site that will keep it worth it for the spammers to keep spamming, regardless of the nofollow tags.

    • You're totally right about the reduction in incentive for spammers, and that it's a somewhat odd choice for Wikipedia to make.

      Why not let search providers be responsible for their own results? It is ultimately their choice how they let links from domains influence their results, nofollow or otherwise. This is like an admission that the community can't handle the spam and is surrendering; and that won't work anyway.

      Some search engines give extra weight to wikipedia links. results star
  • Jimbo is the president of Wikia and the founder of Wikipedia. These are separate and distinct roles.

    Speaking as a Wikipedia press volunteer, it's a goddamn nightmare keeping them separate in press perception. Because Jimbo is Mr Wikipedia, even though Wikia is COMPLETELY UNASSOCIATED with Wikipedia, the press keeps conflating the two.

    I ask that Slashdot not perpetuate this. Jimbo asked this as the founder of Wikipedia and the Final Authority on English Wikipedia, and Brion (the technical lead and Final Authority on MediaWiki) switched it on.

    May I say also that we've been watching the spamming shitbags^W^WSEO experts bitch and whine about it, and it's deeply reassured us this was absolutely the right decision. We would ask Google to penalise links from Wikipedia, except the SEO experts^W^Wspamming shitbags would just try to fuck up each other's ranking by spamming their competitors.

    To the spammers: I commend to you the wisdom of Saint Bill Hicks: "If you're a marketer, just kill yourself. Seriously."

    • by nuzak ( 959558 )
      While the general "spamming shitbags" sentiment is spot-on (I put the hurt on spamming shitbags for a living), I gotta ask: is this how you conduct yourself as a press volunteer? No, probably not ... but if you think wikipedia links should be penalized in Google, then this seems to me a tacit admission that Wikipedia is basically a cesspool as goes content. You gotta wonder how much that colors the rest of the attitude.

      You sound burned out, but hey you probably lasted longer than I did.

      I did find Wikipedi
      • We're not 'reliable' and we don't claim to be. This is important: we don't save the reader the trouble of having to think when reading.

        Most of the complaints that 'Wikipedia isn't reliable' appear to be complaints that we haven't saved them the trouble of thinking. I have to say: too bad. It's useful or it wouldn't be a top 10 site. But it's just written by people. Keep your wits about you as you would reading any website. We work to keep it useful, but if you see something that strikes you as odd, check the references and check the history and check the talk page.

        Wikipedia does not save the reader from having to think.

        • Wikipedia does not save the reader from having to think.

          I deeply appreciate Wikipedia's usefulness, but this makes it sound as though Wikipedia's sporadic unreliability is a feature, not a bug.
          • Re: (Score:3, Insightful)

            by brian0918 ( 638904 )
            Except that it's generally a good idea for individuals to retain their ability to think in all situations.
          • It's just how it is, by its nature. When people say "Wikipedia is not reliable", they seem to mean "I have to think, waaah."

            There are all sorts of ideas on how to abstract a "reliable" subset of Wikipedia. Someone just has to bother, really.

            • Re: (Score:3, Informative)

              by Raindance ( 680694 ) *
              It's just how it is, by its nature. When people say "Wikipedia is not reliable", they seem to mean "I have to think, waaah."

              Some people misunderstand what Wikipedia is, definitely. But I think we differ on the importance of reliability: I see an unreliable source as not merely 'requiring people to think' but potentially deeply messing up someone's understanding of a topic. Once the brain learns something incorrect or biased, it often takes effort and attention to unlearn it.

              There are all sorts of ideas on h
    • by fyoder ( 857358 )

      We would ask Google to penalise links from Wikipedia...

      Hey, no problemo. Simply remove the ability to link to external sites altogether. As someone who has a couple of links from Wikipedia to my content, I know were I to be penalized for them, I would remove them very quickly, as would others. So why not just eliminate them completely in the first place thus saving time and aggravation for all parties?

    • Your post is contradictory. You imply that Wales is nothing to do with Wikipedia in your first sentence, then state that he's the "Final Authority on English Wikipedia". You can't have it both ways.

      Regardless, probably the main reason the press and everyone else is "confusing" Wales' role with Wikipedia is entirely due to the man himself. As a successful self-publicist he frequently wades in with his two cents worth on Wikipedia. Here, on /. too, in person.

      You may say he's separate, but I, for one, d
  • by victim ( 30647 ) on Monday January 22, 2007 @03:58PM (#17714482)
    This should be considered a step in an evolving policy. The next step should be for old links (ones that have survived many edits over time), as well as links added or edited by known and trusted editors, to omit the nofollow tag. Then Wikipedia can continue to serve as an interpreter of the WWW.
    • That would be the ideal way of doing it.

      MediaWiki needs developers. If someone can write something to do this, cleanly enough that it passes the developers' exacting code standards (when you run a top-10 website on PHP and MySQL, you need to know what you're doing), please contribute!

    • Re: (Score:3, Interesting)

      by Kelson ( 129150 ) *

      The next step should be that old links, ones that have survived many edits and time as well as links added or edited by known and trusted editors should omit the no-follow tag.

      I like this idea. nofollow is more useful for the unmaintained or rarely-maintained site. If you're going to leave the site alone for a month and come back, you probably want to avoid rewarding the comment/wiki spammers who drop by in the meantime. On the other hand, once you verify the site, it's worth helping the site out a bi

    • On further reflection, this would be a means for wikipedia to communicate to search engines and browsers the trust level of a link. A more general solution would be to introduce link signing. Allow people to create a "linker id" and a private linker key. They could then sign links with their id and a signature.

      The search engines are then free to decide who they trust and how much. Link spammers should be obvious by making huge numbers of links to the same content. People who make consistently good links can b
      • If you can code this, that would be marvellous.

        (At the moment, the thing MediaWiki most lacks is good coders - people who can do database programming to a MySQL database in PHP, efficiently enough to run a top-10 website which is nonprofit and hence broke by definition. CODERS WANTED!)
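The link-signing scheme sketched above might look roughly like this (a toy sketch, not actual MediaWiki code; the attribute names, "linker id" format, and the use of HMAC are all invented for illustration):

```python
import hashlib
import hmac

def sign_link(url: str, linker_id: str, private_key: bytes) -> str:
    # Bind the linker id to the URL and sign the pair. HMAC-SHA256 is
    # used here only for brevity: a real scheme would want asymmetric
    # signatures, so search engines could verify without the private key.
    message = f"{linker_id}|{url}".encode()
    signature = hmac.new(private_key, message, hashlib.sha256).hexdigest()
    return f'<a href="{url}" data-linker="{linker_id}" data-sig="{signature}">{url}</a>'

def verify_link(url: str, linker_id: str, signature: str, private_key: bytes) -> bool:
    # A search engine that trusts this linker id recomputes the signature
    # and compares in constant time before counting the link.
    message = f"{linker_id}|{url}".encode()
    expected = hmac.new(private_key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)
```

The engines would then be free to weight links by how much they trust each signer, as the parent suggests; spammers signing huge numbers of links to the same content would be easy to spot.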

  • by jesterzog ( 189797 ) on Monday January 22, 2007 @04:04PM (#17714560) Homepage Journal

    I don't think this will do much to stop Wikipedia link spamming for several reasons:

    • Many spam links on Wikipedia aren't commercially motivated spam, but just people who've naively put external links in articles without properly understanding or caring about the editing policy. They're not thinking so much about search engines as about pointing people to their website (or their favourite website) because they think it's more important than it probably is. If it's a relatively obscure article, it might stay there for months or longer before someone goes through and reviews the links.

    • Wikipedia is only one of the websites that publishes Wikipedia content. There are lots of other sources that clone it, precisely as they're allowed to under the licence, and re-publish it. They usually add advertising to the content, or use it to lure people to some other form of revenue. These sites are easy to find by picking a phrase from Wikipedia and keying it in to a search engine like Google, and I doubt they'll add the nofollow attribute to their reproductions of the content.

      Wikipedia is probably treated as a more important source of links by search engines, but whatever's published on Wikipedia will be re-published in many other places within the weeks that it takes for the new content to be crawled and to propagate. And links on any Wikipedia articles will propagate too, of course.

    • Even if you ignore search engines, having external links from a well written Wikipedia article that gets referenced and read a lot is probably going to generate at least some traffic to a website. Wikipedia articles are often a good place to find good external sources, probably because they get audited and the crappy ones get removed from time to time. This is exactly what provides motivation for spammers to try and get their links added, though.

    Good on them for trying something, but I don't think it'll stop spammers very much.

  • by Bananenrepublik ( 49759 ) on Monday January 22, 2007 @04:04PM (#17714572)
    If this is of benefit to the search engine operators, then it should be simple enough for the search engine operators to follow or not follow external links from wikipedia, with or without NOFOLLOW. Wikipedia has a high enough profile that search engines already treat it differently from Average John's Incredibly Boring Blog, and they will know if it is of benefit for them to follow those links, without wikipedia putting some policy in place.
  • by Distan ( 122159 ) on Monday January 22, 2007 @04:09PM (#17714616)
    I think the article noted that the last time this came up for a vote by the community, the community voted it down. I think it also notes that this is something Jimbo Wales dictated, and not something that went through the normal community approval process.


    Why would Wales simply dictate this change be made?

    Because Wikipedia is a source of high-quality links. Editors have increasingly been making sure to put high-quality references in articles, mainly as links to other web sites. A single Wikipedia article can often contain links to the best websites related to that subject.

    So ask yourself why would Wales want to make those links private, and no longer harvested by Google.

    Is it that hard to figure out?

    If you still don't know, then ask yourself what business Wales has announced that he wants to pursue with his new for profit company, Wikia?

    Search Engines.

    In the words of Paul Harvey, now you know the REST of the story.
    • by Animats ( 122034 ) on Monday January 22, 2007 @04:23PM (#17714794) Homepage

      Wales' behavior may be an issue for Wikipedia. If the same person is involved with a profit-making venture and a nonprofit in the same area, the tax status of the nonprofit becomes questionable. When a US nonprofit files their tax return, they have to list any officers or directors involved with profit-making ventures in the same field.

      The IRS is concerned because if you have a nonprofit and a for-profit organization under the same management, it's often possible to structure things so that the for-profit corporation shows a phony tax loss.

    • by Bluephonic ( 682158 ) on Monday January 22, 2007 @04:32PM (#17714888) Homepage
      Sorry, that doesn't make sense. As other people have mentioned, nofollow is not a magic incantation that search engine crawlers have no choice but to obey. Google can do whatever it wants with any link (they could choose to completely ignore the nofollow attribute when it's on wikipedia pages, for example).
  • SleepyHappyDoc added italics to his Slashdot post today. An unnamed source decried this move as "unnecessary" and strongly implied that this action was not noteworthy. Film at eleven.
  • by adnonsense ( 826530 ) on Monday January 22, 2007 @04:36PM (#17714942) Homepage Journal

    ... is here []; they seem to be concerned about a "search engine optimization world championship".

    Personally I think we can all do our bit and stop linking to Wikipedia so much, because Google is starting to give the impression that Wikipedia is the fount of all knowledge - to the detriment of pages which contain better information but which don't happen to have WP's massive net presence.

  • Overkill (Score:3, Insightful)

    by ivan256 ( 17499 ) on Monday January 22, 2007 @04:48PM (#17715054)
    Wouldn't a better approach be to figure out the average longevity of a spam link on the site, and tag links with 'nofollow' for slightly longer than that period of time? After that they can remove the 'nofollow' because, presumably, if it was spam the link would have been removed already.
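That age-based approach might be sketched like this (a minimal illustration, not actual MediaWiki code; the 45-day window, function name, and parameters are all assumed):

```python
from datetime import datetime, timedelta

# Assumed threshold: slightly longer than the observed average lifetime
# of a spam link before editors catch and remove it. The 45-day figure
# is made up for illustration.
SPAM_SURVIVAL_WINDOW = timedelta(days=45)

def render_external_link(url: str, added_at: datetime, now: datetime) -> str:
    # Tag the link rel="nofollow" only while it is younger than the
    # window; a link that editors have left in place longer than spam
    # usually survives is presumed legitimate and gets followed.
    if now - added_at < SPAM_SURVIVAL_WINDOW:
        return f'<a rel="nofollow" href="{url}">{url}</a>'
    return f'<a href="{url}">{url}</a>'
```

This would keep the spam-fighting benefit for fresh edits while letting long-surviving links pass PageRank as before.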
  • by mi ( 197448 ) <> on Monday January 22, 2007 @04:48PM (#17715060) Homepage Journal

    For example, auto-add the "nofollow" only to the links added in recent edits (for some definition of recent). Once a particular link was part of the page long enough (and survived other people's edits), it can be followed by the search engines...

    I, for one, contributed a number of wild-life pictures to Wikipedia, but am also selling them in my own shop []. I don't think it is unfair for me to expect links to my shop from the contributed images to be followed...

  • I put nofollow on my blog some time ago; there is a simple Turing test to post comments, and it explains to people that links in the comments will not be counted by search engines like Google.

    They still try to comment spam, and not simply spam where they hope people will click on the links. They're just pretty thick, and never stop doing something once they've heard it was useful.

    So it will be several years before the spammers back off due to nofollow.

    Nofollow, in effect says, "This link was not app
  • IMHO this is part of what's wrong with Wikipedia. They claim to be open to all and to have a community, deciding many things by consensus.

    Except when Jimbo, or another well-known admin overrules everyone else.

    They've even sneakily formalized this policy in renaming Votes for deletion to Articles for deletion, suggesting that while a discussion can take place about an article's fate, it can generally be ignored if an admin (typically the one placing it up for deletion) disagrees.

    There's some interesting information over at WikiTruth [] about this (like everything else, taken with a grain of salt; there's some obvious bias there).

    Anyway, I personally believe this is a bad thing for the overall health of the internet. Wikipedia is a huge site. Making it irrelevant to search engines will probably affect Google quite a lot, and give a *huge* boost to whoever figures out how to get around the nofollow restriction.
