
Berners-Lee Launches New W3 Foundation

robertsonadams tips us to the launch of the World Wide Web Foundation, seeded with $5M from the Knight Foundation. From the announcement: "Sir Tim Berners-Lee, inventor of the World Wide Web, unveils the World Wide Web Foundation. It aims to advance One Web that is free and open, to expand its capability and robustness, and to extend its benefits to all people on the planet." The new foundation's site should soon have video of Berners-Lee's speech at the kickoff event. The foundation hopes to raise $50M–$100M and will issue grants in three program areas: Web science and research, Web technology and practice, and the Web for society. Initial plans will be disclosed early next year.
This discussion has been archived. No new comments can be posted.

  • Quick! (Score:5, Funny)

    by BitterOldGUy ( 1330491 ) on Monday September 15, 2008 @07:11AM (#25007995)
    Let's find all the jokes that have been posted about the Web being forked and post, "See! We told you so! Funny mod my ass!"
  • by gogita21 ( 1183387 ) on Monday September 15, 2008 @07:14AM (#25008019)
    One Web to rule them all...
    • From his speech:

      Back in 1989 I was a programmer at CERN, the high energy physics research center near Geneva. At that time, one huge accelerator, the LEP, had been completed, and work was just starting up on the new Large Hadron Collider (the LHC). Coincidentally, the LHC was turned on just a few days ago. Right now, there will be a lot of pressure as the results of many years of work are put to the test. But in 1989, there was a slight lull in the pressure between completion of the LEP and the start of work on the new LHC. It was during that lull that my boss, Mike Sendall, allowed me to work on a side project -- a global hypertext system I called the World Wide Web. It took me a couple of months to put together the technology, to design HTTP and HTML and URLs, and build the first browser and server.

      But the technical design was only part of the work. There was an important social side of the design. The Web does not just connect machines, it connects people. When a link is made, it is a person who makes the link. When a link is followed, it is a person who decides to follow it. Understanding and accounting for the social side of the Web was, and remains, a vital part of encouraging its growth. For example, it took 18 months for my colleague Robert Cailliau and me to persuade the CERN directors not to charge royalties for use of the Web. Had we failed, the Web would not be here today.

      Later on, in the early 1990s, a new threat arose when competing browser developers sought to divide the Web into incompatible islands. I was approached from all sides by people wanting to work together to preserve "One Web." The whole point of a hypertext link is that it can potentially link to anything out there. One Web is far more interesting and valuable than many small ones.

      • Back in 1989 I was a programmer at CERN, ...

        'Con-' is a Latin prefix carrying the sense of "with, together, or jointly", and since we're all with him, does that make us 'ConCERN'ed?

    • by rk87 ( 622509 ) <chris.r.walton@NOSpam.gmail.com> on Monday September 15, 2008 @07:33AM (#25008225) Journal

      Bastard, ya beat me to it!

      Three Interwebs for the AT&T-kings under Bell Labs,
      Seven for the GNU-lords in their halls of source,
      Nine for Mortal n00bs doomed to pwnage,
      One for the Borg on his dark throne
      In the Land of Internet where the /b/tards lie.
      One Web to connect() them all, One Web to gethostbyname() them,
      One Web to bring them all and in the darkness bind() them
      In the Land of Internet where the /b/tards lie.

    • Oh Jesus.</randal>
  • I've been using the WWW for years now.
    • Re: (Score:2, Funny)

      by linhares ( 1241614 )
      The news is that this dude says he did the www, not Al Gore.
      • Al Gore never made that claim.

        He played a key role supporting legislation that helped construct the networks within the US, and eventually opened them to commercial traffic. For this, he took credit for having helped create the internet, just as Eisenhower helped create the Interstate Highway system. He's perfectly justified in that claim, too.

      • The news is that this dude says he did the www, not Al Gore.

        If by "This Dude" you mean Tim Berners-Lee, then it's not at all news.

        Al Gore built the internet (in that he's responsible for legislation encouraging it being built), while Tim Berners-Lee invented the World Wide Web.

        Should anyone be unfamiliar with that distinction, it is discussed to some satisfaction at http://webopedia.internet.com/DidYouKnow/Internet/2002/Web_vs_Internet.asp [internet.com] and a quick google search for, say, "internet vs. www" should give you more information.

        Also, Al Gore's legislation encouraging

    • Re: (Score:3, Informative)

      by Lenneth ( 1363547 )
      Its aim is to improve the web, not recreate it. :)
    • by Anonymous Brave Guy ( 457657 ) on Monday September 15, 2008 @07:39AM (#25008289)

      They're not claiming to recreate the web or anything like that. Rather, Berners-Lee has expressed concern about some of the trends in the way the WWW is working, and is now doing something about it. One example cited in the media today is the difficulty in distinguishing between rumours and content from reputable sources, since there is no robust mechanism for indicating the authenticity or credibility of a web site. This has led to fears of the LHC sucking the world into a black hole or, more seriously, to parents being misinformed about the dangers of MMR vaccine and making health decisions that are not in their child's best interests because of the bad information.

      I would suggest that this is a more general problem rather than anything specific to the web, and I don't believe it can ever be solved in all cases because there can never be an ultimate authority on all things, nor should there be. But an effort to provide a framework where realistically credible groups can be seen to endorse the content on certain sites as respectable has to be a step in the right direction: sometimes there's no substitute for seeing a qualified, experienced professional, but if I'm looking for general information on-line, I'd rather know that the professional-looking site I'm reading has been vetted by expert medical, legal, financial, technical or other eyes, as appropriate, rather than just being designed by someone with a good eye but containing content that is misleading or outright dangerous.

      • They're not claiming to recreate the web or anything like that. Rather, Berners-Lee has expressed concern about some of the trends in the way the WWW is working, and is now doing something about it. One example cited in the media today is the difficulty in distinguishing between rumours and content from reputable sources, since there is no robust mechanism for indicating the authenticity or credibility of a web site. This has led to fears of the LHC sucking the world into a black hole or, more seriously, to parents being misinformed about the dangers of MMR vaccine and making health decisions that are not in their child's best interests because of the bad information.

        I agree this is a nice idea, but making it part of the infrastructure of the web is a bit alarming, and seems almost impossible, especially without creating an "ultimate authority", as you put it, with the power to hand out endorsements.

        • by Anonymous Brave Guy ( 457657 ) on Monday September 15, 2008 @08:39AM (#25009109)

          I think the trick is not to try to create an authority to give endorsements itself — which we seem to agree is doomed before it starts — but rather to begin with a mechanism by which a web site can claim one or more endorsements by named parties on specified dates, those endorsements and dates can be verified in real time, and there is a mechanism for immediate revocation by the endorsing party if a more recent check makes continued endorsement inappropriate.

          As various recent discussions on SSL have considered, we are already part way there, but at the moment all you can do is prove you have a secure connection to a certain on-line resource, without knowing who is behind that resource in real life. This is already a significant problem for industries such as banking, but any structured identification protocols developed to help there could just as well be used to show that, for example, a site describing first aid procedures was verified and endorsed by the Red Cross within the last three months.

          The overheads of getting real people to check sites before giving an endorsement might be prohibitive, and I'm not sure you'd get that many endorsements relative to the number of sites that might deserve them if there was time to check them all. But starting from that basis, we could move to more of a web-of-trust system. As Google proved with their Page Rank algorithm, even a relatively simple idea along these lines can be remarkably effective as a starting point.

          Of course, Google's story also tells us that sooner or later, people will learn how to game such a simple system, and that is an as-yet unsolved problem. But that doesn't mean it can't be solved, particularly if we're talking about a new organisation with some real world resources that could get real people to investigate the credentials of major nodes in your trust network as a starting point. Community-driven web sites like Wikipedia have shown us another possible tool we could use: it's also a system that can be gamed, but usually not for long without someone noticing, and for the most part the information supplied is good.

          There is a lot of potential complexity here, and there are many ways things could go wrong. I doubt any system will ever be perfect. However, it's not as if this is a win/lose scenario: just improving the signal/noise ratio is a benefit to everyone affected, and we could certainly do better than we do today.
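
          As a rough sketch of what such a check could look like (none of this is an existing standard: the endorsements.json location, the verify_url field, and the record format here are all hypothetical), a client might do something like this in Python:

            import json
            import urllib.request
            from datetime import datetime, timedelta

            MAX_AGE = timedelta(days=90)  # e.g. "verified within the last three months"

            def fetch_json(url):
                with urllib.request.urlopen(url) as resp:
                    return json.load(resp)

            def current_endorsements(site_url):
                """Return the claimed endorsements for a site that still check out."""
                # Hypothetical location: the site merely *claims* these endorsements.
                claims = fetch_json(site_url + "/endorsements.json")
                valid = []
                for claim in claims:
                    # Ask the endorsing party directly, so a revoked endorsement
                    # vanishes as soon as the endorser pulls it.
                    record = fetch_json(claim["verify_url"])
                    if record.get("revoked"):
                        continue
                    # Assumes naive UTC timestamps such as "2008-09-15T12:00:00".
                    endorsed_on = datetime.fromisoformat(record["date"])
                    if datetime.utcnow() - endorsed_on > MAX_AGE:
                        continue
                    valid.append((claim["endorser"], endorsed_on))
                return valid

          A real scheme would also need the verification responses to be signed so they cannot be spoofed, which is where the SSL/identity work mentioned above would come in.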

          • [from the article]

            The Web Foundation will identify benefits of the Web for these communities, and issues of access to (and availability of) relevant, usable, and useful content. The foundation will do so through support of ongoing and new efforts to develop critical services related to better health care, nutrition, education, and emergency relief.

            Sounds more like a shiny, one-world government. Will the world be better off when the WWW, in collaboration with the WWWF supplant the governments of small and/or

          • Re: (Score:3, Interesting)

            by coryking ( 104614 ) *

            We need a way to tie the methods we use to identify ourselves in the real world to our online world. In the real world, we have drivers licenses, bank cards, passports, birth certificates, business licensing ... you name it. No need to re-invent the wheel with crazy schemes that try to avoid using our real life "proof of existence".

            Somehow, a protocol stack needs to be created to let us take these real world things and get our exciting protocols to "verify" them. For example, some sites will text message
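
            One common building block for that kind of verification step is an out-of-band one-time code: the site sends a short code to a phone number already tied to a real-world identity and checks that the user can echo it back. A minimal Python sketch, with the delivery step stubbed out and the storage simplified to an in-memory dict, purely for illustration:

              import hmac
              import secrets
              import time

              CODE_TTL = 300   # seconds a code stays valid
              _pending = {}    # phone number -> (code, expiry); a real site would persist this

              def send_code(phone_number):
                  """Generate a one-time code and deliver it out of band (stubbed here)."""
                  code = f"{secrets.randbelow(1_000_000):06d}"   # e.g. "042917"
                  _pending[phone_number] = (code, time.time() + CODE_TTL)
                  print(f"(pretend SMS to {phone_number}): your code is {code}")

              def verify_code(phone_number, submitted):
                  """Check the echoed code: single use, expiring, constant-time compare."""
                  code, expiry = _pending.pop(phone_number, (None, 0))
                  if code is None or time.time() > expiry:
                      return False
                  return hmac.compare_digest(code, submitted)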

          • but rather to begin with a mechanism by which a web site can claim one or more endorsements by named parties

            You mean like a site purporting that the earth is six thousand years old can be endorsed by, say, Drudge? LOL... quis custodiet ipsos custodes?

            • No-one watches the "watchers". An endorsement is worth only as much as the credibility of the endorsing party. A site with endorsements from multiple reputable sources is probably a good source of information. If certain sites choose to endorse certain other sites that some people would find unhelpful, then those people will start to give less credibility to the endorsing site's other recommendations. It's not like Drudge is some sort of authority on anything in real life, so I don't see why just that singl

              • Re: (Score:3, Insightful)

                To some people, Drudge might be a heck of an endorsement, at least to the ones who want him to be. Different groups place greater weight on the credibility of some sources than of others. For example, the people who like Drudge likely won't think much of the New York Times, a source many people would trust. I don't think the endorsements thing would work for this reason. There are enough people out there who will support anything. If it did anything at all, it would act to label and easily
      • I would suggest that this is a more general problem rather than anything specific to the web

        Humanity bug #1: Idiocy has a majority market share.

  • Controversial or not, the pr0n industry has driven web technology forward in many areas for years. If I had $5M to throw into the advancement of web development, I'd buy $5M worth of pr0n. ^^

     

  • by Morgaine ( 4316 ) on Monday September 15, 2008 @07:30AM (#25008193)

    I suggest that Tim use his influence and backing from the new foundation to fight this latest China-inspired UN move to provide IP traceback and strip anonymity [slashdot.org] across the net.

    His WWW would never have blossomed the way it did under such Big Brother conditions, and we'd all be a lot poorer for it. The control freaks just don't understand the benefit of emergent systems, and that freedom has a price. Sure, we suffer a few annoyances and some real crimes, but it's still infinitely better than everybody living in a police state.

    • Maybe, maybe not (Score:5, Informative)

      by Anonymous Brave Guy ( 457657 ) on Monday September 15, 2008 @07:49AM (#25008421)

      Hopefully the WWWF will take a rather more balanced view than that expressed in the parent post. I have some faith that it will: Berners-Lee has always struck me as both a smart guy and someone who genuinely wants to do the right thing. It is interesting that considering issues such as privacy and security is explicitly mentioned in the WWWF concept paper (available on their web site), but that Berners-Lee also told the BBC he was concerned about the need to separate rumour from reliable information on the web. Whether or not on-line anonymity should be possible is pretty fundamental to these issues.

      • The way I see it, there isn't anything that would inherently prevent anonymous data from being "vouched for" in the web of trust. The source doesn't have to be identifiable for a piece of information to be valid and verified. However, the identity (and consequent authority) of the verifier is critical for a recipient of the data to be able to make an informed judgment about its veracity.

        With a system like this, the power of anonymous open communication that the web offers would be increased, not decreased.
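
        That split (anonymous author, identified verifier) is essentially a detached signature: the verifier signs the content with a key publicly tied to their identity, and any reader can check the signature without ever learning who wrote the content. A minimal sketch, using Ed25519 from the third-party cryptography package purely as an example primitive:

          # pip install cryptography
          from cryptography.exceptions import InvalidSignature
          from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

          content = b"First aid: for minor burns, cool the burn under running water."

          # The verifier (say, a medical charity) signs the content with a key that
          # is publicly associated with *them*, not with whoever wrote the content.
          verifier_key = Ed25519PrivateKey.generate()
          signature = verifier_key.sign(content)

          # Any reader who trusts the verifier's public key can check the endorsement
          # without the author ever being identified.
          public_key = verifier_key.public_key()
          try:
              public_key.verify(signature, content)
              print("content vouched for by the verifier")
          except InvalidSignature:
              print("signature does not match; do not trust this copy")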

      • Is there really a question whether online anonymity should be possible? I would assume that you would have a firmer stance, Anonymous Brave Guy.

        I suppose I'm among those who would say that you can't have free speech without the option of anonymity, and without free speech the Internet becomes another controlled broadcast system for the powerful to use. That's the last thing we need.

        At the same time, I'm not against some sort of identity verification scheme. Real-world analog: I carry a driver's license

        • Is there really a question whether online anonymity should be possible? I would assume that you would have a firmer stance, Anonymous Brave Guy.

          I think there certainly is a question of it, and my stance is certainly against absolute anonymity.

          I'm aware of the amusing irony, of course, but I see no real conflict. Sure, I occasionally choose to goof off on Slashdot under a pseudonym. These days, I don't use my real name very much on-line for privacy reasons. I don't believe people should be compelled to give their identity to any old person reading their comments on-line, and I have grave reservations about the data protection and privacy implication

          • if you have a government that does not respect and protect the people's right to discuss that government freely and without fear of reprisal, then the government in that country is broken, and it is already time to move on to the next of the four boxes and remove that government from power.

            How do you go about removing it from power if you aren't permitted to talk to your fellow citizens about wanting to remove it from power? That's exactly why freedom of speech is so important.

            • by Dog-Cow ( 21281 )

              That's completely circular. Ultimately, we have freedom of speech because the Government hasn't (yet) taken it away. If free speech is taken away in terms of law, it is necessary to break the law and correct the situation in any way possible. If free speech is removed through force, then it does not matter whether you ever had the law on your side.

              • Oh, right, so I guess we shouldn't have any law permitting free speech, because if the government is good then they won't restrict speech in bad ways, but if it turns evil they'll just take it away anyway? Hey, maybe we shouldn't have laws, since it's safe to assume that either the police will be good, in which case we don't need laws, or the police will be bad, in which case laws won't help us! That will make everything better.

    • Re: (Score:3, Insightful)

      by Zeinfeld ( 263942 )
      His WWW would never have blossomed the way it did under such Big Brother conditions, and we'd all be a lot poorer for it.

      It is hard to know where to start here.

      Back in the 1990s, the use of cryptography was subject to a whole raft of restrictions. The fight between advocates of cryptography and Louis Freeh's FBI is known as the crypto wars, and some folk like Phil Zimmermann were harassed in the same way that the FBI harassed Charlie Chaplin and other opponents of Hoover.

      Fortunately there were also folk

      • by Raenex ( 947668 )

        I was working with the Clinton-Gore '92 online campaign right at the start of the Web and later with the White House. They saw the opportunity to disintermediate the mainstream press, what W. has called 'the filter'.

        How sad then that the Clinton administration was behind the Clipper chip.

        • How sad then that the Clinton administration was behind the Clipper chip.

          It was a civil service measure that they had been waiting to spring on a new administration. Clipper was sold to the administration as being entirely uncontroversial.

          What that meant was that when the opposition appeared there was no base of support for the proposal. In normal circumstances it would have died quickly.

          The reason it did not was Louis Freeh, the single worst Clinton appointee. He was ignorant, incompetent and disloya

  • by pieterh ( 196118 ) on Monday September 15, 2008 @07:36AM (#25008253) Homepage

    1. Stop the moves in Europe to lock down the Internet and install filters at every ISP, which are being pushed by the music, movie, and TV industries in cahoots with the telecoms giants that now control most of the ISP landscape.

    2. Bring the Internet to Africa. For crying out loud, enough of the extortion already. Africans need cheap communications to escape their geographic and historic prison [devilswiki.com], and while GSM was a plausible attempt, it's being strangled by the telcos.

    3. Invest in new platforms for free and open digital standards [digistan.org]. These are the basis of the Internet and they are being strangled by firms like Microsoft which want to see their own technologies dominate.

    • by jc42 ( 318812 ) on Monday September 15, 2008 @10:30AM (#25011019) Homepage Journal

      1. Stop the moves ... to lock-down the Internet and install filters at every ISP, ...
      2. Bring the Internet to Africa. ...
      3. Invest in new platforms for free and open digital standards. These are the basis of the Internet and they are being strangled by firms like Microsoft which want to see their own technologies dominate.

      All true, but note that this is nothing new. Much of the early history of the Internet was based on exactly these problems. The original US DoD funding for ARPAnet was openly aimed at fighting a growing problem in the military: They were using more and more electronic comm devices, but hardly any two pieces of equipment from different manufacturers could communicate sensibly. The corporate world everywhere wanted customers to buy only from them, and official standards didn't help much. Manufacturers found ways to "enhance" the standards in ways that were incompatible with competitors' equipment.

      The solution was to build a new sort of "network" layer that ran on top of all the vendors' incompatible equipment. The new network would encode the data into a binary form that wasn't supposed to be understood by the lower-level equipment. The lower-level stuff was used simply to transport the bits, which at every interface would be translated into whatever form the next equipment could transfer correctly. At the final destination, whatever form the bits arrived in would be translated back into whatever form the last chunk of hardware wanted.

      Initially this was expensive. It involved a lot of separate computers (the IMPs) interposed all over the place. With time, as people understood how to do the job right and solid-state circuitry became smaller, the work could be moved into circuit cards and then chips that did the same job. Now the Internet part of a gadget is small and cheap.

      But the entire point was to admit that the companies that supply the hardware and connectivity would always be trying to sabotage any standards, and force customers to buy only their own equipment by blocking communication with competitors' equipment. The fact that ISPs and telcos are doing the same today should come as no surprise. They always have done this, and they always will. The question isn't whether we can prevent this; we can't. The question is whether we can get our bits delivered through a network built of unreliable components. The fact that some components are actively trying to block traffic, for whatever reasons, is simply a fact of life. For the network to work, it has to work despite failures (accidental or intentional) on the part of the low-level comm equipment.

      One obvious approach is to consider IPv4 as just a vendor-supplied network, and solve the ISPs' sabotage the same way we have for decades: Build another network layer above it that takes into account its failures.

      Of course, we're well on our way to doing this. One part is known as "https". Another part is known by the name "mesh". Bittorrent implements another part. And others are under development. With time, we can make the corporate world's sabotage irrelevant with the same approach we have been using since the 1960s, when the ARPAnet started up. We encode the data into forms that they can't decode, and when they drop or damage too many packets, we route around them.
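
      As a toy illustration of that last point (encode the data so the carriers see only opaque bits, and route around a path that drops them), here is a minimal Python sketch; the relay URLs are hypothetical and the third-party cryptography package stands in for whatever encoding a real overlay would use:

        import urllib.request
        from cryptography.fernet import Fernet  # pip install cryptography

        # Hypothetical overlay nodes willing to forward opaque blobs for us.
        RELAYS = [
            "https://relay-a.example.net/forward",
            "https://relay-b.example.net/forward",
            "https://relay-c.example.net/forward",
        ]

        def send_via_overlay(payload, key):
            """Encrypt the payload, then try each relay until one accepts it."""
            ciphertext = Fernet(key).encrypt(payload)  # lower layers see only noise
            for relay in RELAYS:
                try:
                    req = urllib.request.Request(relay, data=ciphertext, method="POST")
                    with urllib.request.urlopen(req, timeout=5):
                        return True    # delivered; no need to try the other paths
                except OSError:
                    continue           # this path is blocked or broken: route around it
            return False               # every path failed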

  • by tripmine ( 1160123 ) on Monday September 15, 2008 @07:37AM (#25008261)
    W3F?!?!?
    • W Three F.

      WTF!

      I suppose I should go read TFA, but what's wrong with the W3C?

      • The W3C defines the standards used on the 'Net. From what I've read about the proposed W3F, its purpose will be to encourage trends that increase the freedoms and openness of the Net, a role the W3C doesn't fulfill.
        • Is there a reason the W3C doesn't do that? I can't remember a W3C spec being closed. Given that, compliance with W3C specs certainly helps encourage openness.

          • While adhering to specs improves compatibility among products, it doesn't necessarily promote openness. For instance, both Ford and Chevrolet make cars that run on petrol, have pedals in the same places, have steering wheels that turn the front wheels, etc. However, the designs of those cars are 'closed source', as it were.

            I know, I know, it's not a great analogy, but it is cars.

            • I'm not using "openness" to mean implementation and design. I'm talking about the fundamental requirements to build a competing product.

              If I wanted to build a competing car, I wouldn't have to pay patent royalties to anyone to put the pedals in the same places, or to make it run on petrol. However, if I wanted to make a competing web browser, I would have to make it deal with an entirely closed source Flash -- there currently isn't even an option to make a deal with Adobe to get the specs I would need to re

  • I bet he's kicking himself for not making HTTP encrypted by default.....

    • Why?

      HTTP was designed as a more advanced version of Gopher, and was primarily intended to serve as a vehicle for distributing (and linking between) scientific information.

      There was absolutely no need for it to be secure. Encrypting all traffic back in those days would have also created a huge (and unnecessary) CPU overhead.

    • Do you realise how slow and expensive encryption is?

  • Comment removed (Score:4, Interesting)

    by account_deleted ( 4530225 ) on Monday September 15, 2008 @07:42AM (#25008321)
    Comment removed based on user account deletion
    • Re: (Score:3, Funny)

      by Anonymous Coward

      The "ready in 2022" for HTML5 is not the date of HTML5 spec completion; it's the date when it'll be supported in Internet Explorer.

    • Re: (Score:1, Informative)

      by Anonymous Coward

      The 2022 date is effectively meaningless. That is when every single minute edge case in the specification has been proven to be worked out, and implemented in an exactly identical manner by at least two browsers. (Hint: at present there isn't a single W3C specification which has reached this point.)

      What matters to all of us is when we can start using features from HTML 5. This is roughly from Last Call, which is hoped to be reached next year.

      See also these posts by Anne van Kesteren: Re: 2022 [annevankesteren.nl] and Meaning of [annevankesteren.nl]

    • Re: (Score:3, Insightful)

      Irony for the day: the sort of misinterpretation in the parent post and the rumours that result are exactly the sort of things Berners-Lee was concerned about in a recent interview. (See also: Slashdot article summaries. ;-))

  • by Anonymous Coward on Monday September 15, 2008 @07:47AM (#25008385)

    Wow, the Knight Foundation is footing the bill; I'm sure we'll see Michael and KITT running the show... seriously, having KITT run the web would be awesome... Knight Rider rules

  • Since the W3C already exists, and is already headed by TBL, and is already designed to improve the Web, why not give it more funding and extend its remit?

  • Does my NAT firewall, with my private IP# addresses behind it, make me the enemy of Sir Berners-Lee's "One and Open" Web? Will his W3 Foundation give me a C Block?

  • by emag ( 4640 ) <slashdot@gur s k i .org> on Monday September 15, 2008 @08:33AM (#25009007) Homepage

    Does this mean the W3F will be releasing a Web KITT?

  • by debrain ( 29228 ) on Monday September 15, 2008 @08:42AM (#25009175) Journal

    ... are often legal. The technology will evolve because there are market incentives (not to mention curious and innovative actors). All legal actors, in contrast, will oppose this evolution because it operates against predictability. What is the point of evolving technology if the typical beneficial uses are undermined legally?

    See (in no particular order) the disputes over: RIAA, MPAA, GPL, Copyright, Patents, Trademarks, Domain names, etc. I would argue that the most beneficial contribution to the world wide web would be researching, educating about, and giving effect to laws that promote internet technology, as opposed to undermining it.

    • Re: (Score:2, Insightful)

      by jmcwork ( 564008 )
      I think this post is on the right track. Since the WWW is essentially a user interface and data distribution system (a truly amazing one) on top of a worldwide network infrastructure, and since most of the issues are content- or access-related, it seems that you would run into most of the same problems without the WWW layer in place. I could use archie to find an ftp site with music files on it, then use ftp and mget to pull over some subset of those files with a single command. Same problems with RIAA, copy
  • Did he get a cool car out of the deal too?

  • Inventors (Score:1, Troll)

    by pak9rabid ( 1011935 )

    Sir Tim Berners-Lee, inventor of the World Wide Web...

    Sounds like he and Al Gore have something in common

  • Network Neutrality? (Score:3, Interesting)

    by MobyDisk ( 75490 ) on Monday September 15, 2008 @09:15AM (#25009635) Homepage

    Since Tim Berners-Lee supports network neutrality [cnet.com], I wonder if this foundation will assist in that cause.

  • by davide marney ( 231845 ) on Monday September 15, 2008 @09:38AM (#25010085) Journal

    From http://www.webfoundation.org/programs/ [webfoundation.org]

    The Foundation will launch with three projects:
    Web Science and Research, Web Technology and Practice, and Web for Society.

    The output of the projects will be:

    - Studies
    - Basic research
    - Thought leadership
    - Curricula
    - Conferences, workshops, etc.
    - Support for organizations developing Web standards
    - Support for organizations using the web to solve social problems
    - Training materials, guidelines, etc.

    • It will be interesting to see how all this good work will be funded, or rather, by whom. Google is a no-brainer, but who else? The Web for Society program sounds like a natural target for funding from technology-supporting charities like the Bill & Melinda Gates Foundation.
  • "Sir Tim Berners-Lee, inventor of the World Wide Web

    This claim of "inventing" the WWW is rather dubious and is perpetuated mostly by those anti-Americans who would like to diminish America's contribution to the Internet (wonder of the world).

    What Tim wrote at CERN was more like a Wiki — a hyper-text interface to a single database. He invented neither the hyper-text itself, nor the database, of course. He did work on Mosaic [wikipedia.org], but was neither the only nor the main person there — and the project

    • by NickFortune ( 613926 ) on Monday September 15, 2008 @10:20AM (#25010867) Homepage Journal

      His contribution was, no doubt, huge, but the inventor he was not -- considering the existence of all the prior works, including Gopher, there was nothing in his work, that was not "obvious to someone skilled in the art".

      Umm... your wikipedia link points at an entry in the talk page for Sir Tim's wikipedia entry. It cheerfully conflates Internet and Web in order to try and make a case. There are some fairly robust rebuttals there as well.

      Citing such poor quality references weakens your argument rather than strengthening it.

      • Re: (Score:1, Troll)

        by mi ( 197448 )

        Citing such poor quality references weakens your argument rather than strengthening it.

        My intention was to show that I'm not alone in my opinion. The follow-up counter-arguments aren't robust at all, in my opinion, and are broken apart easily. For example, Gopher is, clearly, "prior art", and so are other, even earlier systems.

        The main point is, if Tim were to file a patent based on the claims made on his behalf, and try to enforce it to, say, collect royalties, the entire "community" would've been u

          My intention was to show that I'm not alone in my opinion

          But the source you cite appears to have been written by an anonymous nutter. If that's the best support you can muster, I'm afraid you may have something of a mountain to climb.

          The main point is, if Tim were to file a patent based on the claims made on his behalf, and try to enforce it to, say, collect royalties, the entire "community" would've been up in arms.

          But he isn't trying to file a patent, so, you know, so what? If Tsar Nicholas the Seco

          • by mi ( 197448 )

            have been written by an anonymous nutter.

            Khmm... I wonder, what made him a "nutter" in your opinion. The disagreement with you or the desire to remain anonymous?

            But he isn't trying to file a patent [...]

            If he were, he would've failed — and that's the test that we should be employing in determining whether anyone is an inventor of something. The ability to pass the patent's requirements may not be sufficient to deserve the honor (there are some ridiculous patents awarded), but it ought to be a requi

            • Khmm... I wonder, what made him a "nutter" in your opinion. The disagreement with you or the desire to remain anonymous?

              Well, there's the madly over-the-top accusation that heads it all off for a start. "Fraud of the Century", forsooth. I mean, even if the point had merit, and I'm far from convinced of that, I think we could find numerous cases of actual fraud where actual harm was done. Then there's the fact the author keeps trying to prove that Sir Tim is unworthy of any accolade by reference to the fa

              • by mi ( 197448 )

                I'm not sure that's a good test. I mean, yes, there were other hypertext interfaces [...]

                Uhm, so — you doubt the test's validity in one sentence, and then try to satisfy it in the next? Weird...

                Tim's protocols and markup language were the ones that became widely adopted.

                So? Did RIM ("Blackberry") invent "wireless email"?

                It's generally accepted that the web's success is down to the accessibility of the technology rather than the basic ideas behind it.

                That's NCSA's Mosaic, of which Sir Tim was only one

                • you doubt the test's validity in one sentence, and then try to satisfy it in the next? Weird...

                  I concede that other hypertext implementations came first, and you think I'm trying to claim patent rights for Sir Tim? That is weird.

                  So? Did RIM ("Blackberry") invent "wireless email"?

                  And if anyone was hailing Tim as the Father of Hypertext, then this would be a relevant point.

                  That's NCSA's Mosaic, of which Sir Tim was only one of the contributors -- and Netscape...

                  According to the wikipedia talk page th

  • I heard Berners-Lee will be facing off with The Undertaker in a three round cage match for control of the HTML standard.
  • Gotta love his WWW! But...

    TBL was either crucified or ignored (justly in either case) by the programming masses for his Semantic Web (SW) initiative and now has ascended into the electronic cloud, whence he will issue e-mail missives anew.

    With any luck this will, like the SW, quietly fade and die. Surprising that some fool tossed so much cash into a TBL initiative. There must have been some Ecstasy in the drinking water when that one went down.

  • Knight Foundation?

    Does that mean Berners-Lee will suddenly sprout 1980s hair and a cool leather jacket, and then drive around in a talking, bulletproof car with lasers and stuff?

  • You can get over 2/3 of the way through the FAQ [webfoundation.org] without seeing the word "synergies".
