On This Day 25 Years Ago, the Web Became Public Domain (popularmechanics.com)

On April 30, 1993, CERN -- the European Organization for Nuclear Research -- announced that it was putting a piece of software developed by one of its researchers, Tim Berners-Lee, into the public domain. That software was a "global computer networked information system" called the World Wide Web, and CERN's decision meant that anyone, anywhere, could run a website and do anything with it. From a report: While the proto-internet dates back to the 1960s, the World Wide Web as we know it had been invented four years earlier, in 1989, by CERN employee Tim Berners-Lee. The internet at that point was growing in popularity among academic circles but still had limited mainstream utility. Scientists Robert Kahn and Vinton Cerf had developed the Transmission Control Protocol and Internet Protocol (TCP/IP), which allowed for easier transfer of information. But there was the fundamental problem of how to organize all that information.

In the late 80s, Berners-Lee suggested a web-like system of management, tied together by a series of what he called hyperlinks. In a proposal, Berners-Lee asked CERN management to "imagine, then, the references in this document all being associated with the network address of the thing to which they referred, so that while reading this document you could skip to them with a click of the mouse."

Four years later, the project was still growing. In January 1993, the first major web browser, known as Mosaic, was released by the National Center for Supercomputing Applications at the University of Illinois Urbana-Champaign. While there was a free version of Mosaic, for-profit software companies purchased nonexclusive licenses to sell and support it. Licensing Mosaic at the time cost $100,000 plus $5 each for any number of copies.

  • by Anonymous Coward

    I remember spending hours trying to get Mosaic to compile under SLS Linux. Ah, the X driver issues, the horror of it all. But after a week I managed to get it up and running.

    • Mosaic installed and ran great under Win 3.11.

      • by TWX ( 665546 )

        Was gonna say, my first experience with the World Wide Web was when a social club I was a junior member of met at a university student union for a change, and one of the members, a student in the honors college, took us over to the honors college computer lab to show it to us.

        I think it was on a Windows for Workgroups 3.11 platform, but they were working on getting it going on a Sun machine that they had in the lab. CDE biatches!

      • worked *much* better under SunOS at work and NextStep at home

    • Re: (Score:1, Troll)

      I'm old enough to remember the last few remaining Gopher sites. CERN did not invent the 'web'.

      I wish CERN would stop claiming credit for things they didn't do ('web'), and stick to claiming discovery of things that don't exist ('higgs').
      • Yeah, I'm old enough to remember the day when somebody could invent a superior solution, and put out a new client, and everybody would be like, "cool I'll go download that app and use that instead." Can you imagine trying to move people off of http/html today?

      • For all intents and purposes Mosaic DID invent the web as the world knows it today. Links, images and text on a single page = Mosaic.

      • by Anonymous Coward

        Gopher was about midway between http and ftp. The web could never have been built on top of gopher. Sure, it delivered data, and browsers could visit gopher sites by their URL, but it was too rigidly structured and too text-oriented to be successful in the long run.

        If the web is like a library, gopher was more like a book of the month club.

      • by Anonymous Coward

        'Web' is an abbreviation of WorldWideWeb. CERN did not invent the Web; Tim Berners-Lee and Robert Cailliau invented it by publishing the WorldWideWeb proposal [w3.org]. Note that it says "HyperText" right in the title. Gopher was an Internet service, but it was not hypertext and not part of the Web.

    • by k6mfw ( 1182893 )

      And soon it was the Information Superhighway. The local engineering society had a speaker talking about this new thing; I have the handouts someplace (there was no PDF download). I remember many people asking how a business could make money using such a thing. I was thinking it would be cool to simply click a link instead of navigating groups on my CompuServe (I saved my Model 500 phone so I could still use the acoustic coupler).

      There were other talks too, like one about a Coke machine that was connected to the Information Superhighway.

    • by PPH ( 736903 ) on Monday April 30, 2018 @06:30PM (#56533269)

      By 1994, Mosaic would build pretty easily on a Slackware distribution. I had that, NCSA httpd and a few other goodies installed on a Dell PC that Boeing was good enough to drop on my desk a few months before the Windows install crew could get around with their box-o-floppies. By the time they did stop by, I just told them "Never mind."

      Engineering was responsible for providing documents to the factory floor. This was done with a convoluted combination of an index database (accessible via 3270 terminals), some notes scribbled on a piece of paper, and then a manual search for a file on some server share, along with all the possible fat-finger errors imaginable. So one day I was goofing around with httpd and managed to get a read-only link to the mainframe to select datasets applicable to a particular plane, and then format a link to the document server. Two clicks and you're done. I showed it to my boss, who showed it to some factory managers. Boss came back from the meeting and said, "The shop wants this web thingy in production in two weeks." So we got an actual server (Sun), built the pieces, and made the schedule. The factory loved it. Boeing Computer Services* hated it. They figured that this kind of development could have earned them a few tens of millions of dollars and a fully staffed program for a year or two.

      *One of their IT guys asked me how I (the sole maintainer of the entire web system) managed to build the HTML index pages from a database dump to keep the web data up to date. They didn't understand dynamic pages back then.
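
      To make that footnote concrete: a dynamic page just runs a small program per request instead of serving pre-generated HTML. Below is a minimal sketch in that spirit; the database path, schema, and document-server URL are all hypothetical, not the actual Boeing system.

```python
#!/usr/bin/env python3
# Hypothetical sketch of a dynamic index page: instead of rebuilding
# static HTML from a database dump, a CGI script queries the index
# database on every request and formats the links on the fly.
import os
import sqlite3
import urllib.parse

DB_PATH = "/var/lib/docindex/datasets.db"  # hypothetical index database

def main():
    params = urllib.parse.parse_qs(os.environ.get("QUERY_STRING", ""))
    plane = params.get("plane", ["unknown"])[0]

    # CGI response: one header line, a blank line, then the body.
    print("Content-Type: text/html\r\n\r\n", end="")
    print("<H1>Documents for plane %s</H1><UL>" % plane)

    con = sqlite3.connect(DB_PATH)
    rows = con.execute(
        "SELECT doc_id, title FROM datasets WHERE plane = ?", (plane,)
    )
    for doc_id, title in rows:
        # Each row becomes a direct link into the document server:
        # click one picks the plane, click two opens the document.
        print('<LI><A HREF="http://docserver/%s">%s</A></LI>' % (doc_id, title))
    print("</UL>")

if __name__ == "__main__":
    main()
```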

  • I've been using this wonderful technology called "computer" for a while. I recently came up with a neat application for it, and I'm going to call it "calculator" -- they might as well be synonyms, since most people will never use any other applications on it.
  • by greenwow ( 3635575 ) on Monday April 30, 2018 @03:17PM (#56531789)

    We licensed it to test our new web site, which I don't think any customers even used until a couple of years later. The site was pretty crappy, since I learned HTML from viewing the source on other sites, and it took me a lot of time, so that was a huge waste of money.

    • That's another thing they invented: users not giving a shit about licenses.

  • by IGnatius T Foobar ( 4328 ) on Monday April 30, 2018 @03:21PM (#56531823) Homepage Journal
    Those were the good old days, when there were lots of different web sites and if you didn't like one, you could go to another. Many of us remember "the Internet interprets censorship as damage and routes around it." Somehow we got to a point where Facebook and Google interpret the free Internet as damage and route around it.
    • Back in the late 90s, people would take their interests or hobbies, put them on a website, and then promptly add feel-good widgets like web counters and guestbooks to boost their egos a bit and feel like they had an audience, rather than just throwing information into the void.

      That was the seed of social media right there. Web counters evolved into web rings, communities, user scores, and eventually huge public forums like Twitter and Facebook. A user's own ego is actively played by these systems to encourage more use.

    • by mikael ( 484 )

      During 9/11 and major Internet outages across the world, we discovered that the long-distance carriers weren't using mesh networks but had consolidated their infrastructure into spanning-tree networks to save money. Then customers realized that even having multiple providers wasn't enough, as the carriers had virtualized their capacity through subleasing of fibre-optic conduits, bundles, and even virtual networks. Facebook and Google had to deal with so much data sloshing around their pipes, they needed colocation facilities.

  • I was such a Pollyanna back then, with my Utopian vision of how people were going to use the Internet; to share and exchange new ideas and information and further human understanding; to have intellectual exchanges with people on the other side of the world I'd never met... :P
    • by TWX ( 665546 )

      I still remember Eternal September, when Usenet as it had been basically died. It certainly wasn't perfect, but it was a beacon of light compared to the cesspool it became.

    • by N!k0N ( 883435 )

      [...] with my Utopian vision of how people were going to use the Internet; to share and exchange new ideas and information and further human understanding [...]

      And then you stumbled on your first goatse link.

    • by PPH ( 736903 )

      Nope. Even then, it was pretty clear that its primary application would be to replace teletype ASCII porn.

  • Imagine a headline saying “The 25th anniversary of the end of net neutrality”?
  • by Martin S. ( 98249 ) on Monday April 30, 2018 @03:47PM (#56531995) Journal

    The irony is that this is being reported on a paywalled site that doesn't work when you're using a hosts file to block ads.

  • by Solandri ( 704621 ) on Monday April 30, 2018 @04:25PM (#56532321)
    Back then domain names were $100 for 2 years. Hosting was an additional $10/mo or so, if you weren't fortunate enough to be at a school or work someplace which let you set up your own web server (I had one for myself, and another for my dog - IPv4 addresses were plentiful back then too).

    Those costs are what drove people to "free" web services like GeoCities, MySpace, and eventually Facebook. You can justify the cost to set up your own domain and website if it's going to be a business venture or a major part of your online profile. But for the vast majority of people, it wasn't worth it. Which is what allowed the personal-info-harvesting vultures to swoop in and take over the web as we know it today. In some ways I think it was actually better before the web, when simply having an account on an Internet service automatically gave you a finger profile [wikipedia.org] you could fill out however you wanted at no additional cost.

    Nowadays most ISPs also give you some free web space (and an email address) with your account. But it's too little, too late.
    • In many ways, things are even more accessible now. I know that idea goes against the zeitgeist, but hear me out. I purchased my domain name from gandi.net for a reasonable sum (in the tens of dollars), and I run my personal site from my home cable connection (Comcast). I do this by using afraid.org for dynamic DNS (free) so that I don't have to pay for a business-class connection with a static IP. That's fine, because I'm running a personal site, not a business. Furthermore, all of the software I use is free.
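
      For anyone curious, the moving part in that setup is just a script that re-publishes your current public IP to the DNS provider whenever it changes. A minimal sketch follows; the update endpoint, token, and IP-lookup host are all placeholders, not afraid.org's actual API, so check your provider's docs for the real URL format.

```python
#!/usr/bin/env python3
# Hedged sketch of a dynamic-DNS updater for a home connection.
# The URLs and token below are placeholders (example.com), not a real
# provider's API.
import time
import urllib.request

UPDATE_URL = "https://dyndns.example.com/update?token=YOUR_TOKEN"  # placeholder

def current_ip():
    # Ask an external "what's my IP" service (placeholder host).
    with urllib.request.urlopen("https://ip.example.com") as resp:
        return resp.read().decode().strip()

last = None
while True:
    ip = current_ip()
    if ip != last:  # only hit the provider when the address changes
        urllib.request.urlopen(UPDATE_URL)
        last = ip
    time.sleep(300)  # re-check every five minutes
```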
  • What was a great idea has been perverted into something foul and cancerous. Get in your time machine and take it back. Dialup BBSs were bad enough, but at least they were local.
  • by Trailer Trash ( 60756 ) on Monday April 30, 2018 @04:51PM (#56532535) Homepage

    I was working at Indiana University at the time. In the fall of 1993 I found the World Wide Web. I got xmosaic working on my Unix desktop and also installed NCSA httpd. I downloaded the HTML specification and got to work implementing web pages. At the time, I had pages that took 30 seconds to generate.

    Our department (basically IT for the university) was smack dab in the middle of moving our information services from the VAX/VMS cluster to the newfangled gopher service.

    https://en.wikipedia.org/wiki/... [wikipedia.org]

    Along with a few other folks I did a "stop the presses" and convinced them to abandon that project and go straight to the WWW. It took a lot of convincing since they were so invested in getting gopher up and running. Plus, the text-only information was a pretty easy direct-map to gopher.

    The browsers were really primitive at the time. No stylesheets, but we had inline images and could set background colors.
    Not only that, the development was mainly done on the X11 platform. The Windows and Mac browsers were always lagging in features.

    Fill-out forms were a new thing, and the sub ordering application was the standard demonstration for that. Before fill-out forms, there was the "isindex" tag, which would show a search box; its contents would be added as the query part of the URL when you hit enter.
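
    A minimal sketch of that isindex round trip as a CGI script (the markup is period-appropriate; the script path is hypothetical): the first request has no query string, so the script emits a page containing ISINDEX; submitting the search box re-requests the same URL with the typed words appended after a "?", plus-separated.

```python
#!/usr/bin/env python3
# Sketch of the pre-forms ISINDEX mechanism described above.
# Request 1: no query string -> emit <ISINDEX>, which the browser
# rendered as a search box. Request 2: the same URL with the typed
# keywords appended as the query string, e.g. /cgi-bin/search?boeing+747
import os
import urllib.parse

query = os.environ.get("QUERY_STRING", "")
print("Content-Type: text/html\r\n\r\n", end="")
if not query:
    print("<TITLE>Document search</TITLE><ISINDEX>")
else:
    # ISINDEX submissions arrive plus-separated and URL-encoded.
    terms = [urllib.parse.unquote(t) for t in query.split("+")]
    print("<H1>Results for: %s</H1>" % ", ".join(terms))
```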

    There were no cookies and thus no real way to keep state. When I left the university, we were working on a way to store session information in files on the back end. The idea was basically what PHP was doing back around 1999: give the user an MD5 hash or something that was used as the query portion of the URL. Every URL had to go through CGI, and the script would look up the file containing the session information, read it in, and possibly write it out. We wanted to move some of the administrative actions, such as students setting up accounts, to the web. Since fill-out forms weren't really available on the Mac and Windows platforms, we were looking at using "isindex" to get all information to the backend.
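
    That scheme is easy to sketch. The following is a hypothetical reconstruction, not the actual university code: an opaque token rides in the query string of every link, and on the back end it keys a file of session data.

```python
#!/usr/bin/env python3
# Hypothetical sketch of pre-cookie session state: the server issues an
# opaque token, every link routes back through CGI with the token as
# the query string, and the token names a session file on the back end.
import hashlib
import json
import os
import time

SESSION_DIR = "/tmp/sessions"  # hypothetical location

def new_token():
    # An MD5 hash of time and pid stood in for a session id back then;
    # today you would reach for secrets.token_hex() instead.
    seed = "%f:%d" % (time.time(), os.getpid())
    return hashlib.md5(seed.encode()).hexdigest()

def load_session(token):
    with open(os.path.join(SESSION_DIR, token)) as f:
        return json.load(f)

def save_session(token, data):
    os.makedirs(SESSION_DIR, exist_ok=True)
    with open(os.path.join(SESSION_DIR, token), "w") as f:
        json.dump(data, f)

token = os.environ.get("QUERY_STRING") or new_token()
if not token.isalnum():  # don't let a crafted token escape SESSION_DIR
    token = new_token()
try:
    session = load_session(token)
except FileNotFoundError:
    session = {}
session["hits"] = session.get("hits", 0) + 1
save_session(token, session)

print("Content-Type: text/html\r\n\r\n", end="")
# Every link must carry the token, or the state is lost on the next click.
print('<A HREF="/cgi-bin/app?%s">Continue</A> (request #%d)' % (token, session["hits"]))
```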

    It's amazing how far we've come in 25 years. I started doing heavy web development in 1999 and even then it was amazing how far it had advanced.

  • by Anonymous Coward
    Seriously, even back then, Berners-Lee will tell you that the WWW was NOT a one-man project.
    Hypertext came from a number of people such as Bush and Nelson, along with Apple.
    HTML was simply an implementation of IBM's GML.
    CERN httpd was a straightforward implementation of FTP (thank god for that).

    Probably the one thing that Berners-Lee's group really did was put all of this together and then open-source it, while companies were getting pieces and parts but then trying to figure out how to control and profit.
  • In the late 80s, Berners-Lee suggested a web-like system of management, tied together by a series of what he called hyperlinks.

    That term, along with "hypertext," several others, and much of the vision of a web of interconnected online documents, had been coined and promoted by Ted Nelson many years before. See the first edition of _Computer Lib/Dream Machines_ [google.com] (1974) for an early treatment of the ideas, and _Literary Machines_ for a more developed one.

    His (ill-fated) Project Xanadu was, by that time, funded and working round-the-clock to try to put together a world electronic library, with substantially more functionality (including an attempt to provide an acceptable emulation of, or substitute for, everything you can do with paper publications), when Tim's work hit the net and created the World Wide Web.

    Xanadu deviated from the WWW design in a number of ways, including:
      - "fine-grained links" (where a link end points to a particular range of a target, rather than a whole page or a point within it {unless you want that to be the target}),
      - bi-directional links (you can inquire what links are inbound to where you are reading and follow them backward)
      - avoidance of the "Library of Alexandria" / broken-link problem by distributed database techniques.
    and I could go on.

    But they had bitten off a BIG problem, and the WWW, with its non-proprietary servers filled an immediate need with an immediately usable solution.

    • Oops. Pointed the link to the search, not the document.

      See the first edition of _Computer Lib/Dream Machines_ [linkedbyair.net] (1974) for an early treatment of the ideas, and _Literary Machines_ for a more developed one.

      (Note that the printed version of the book was two half-books, like an Ace Double, which you flipped over to read one vs. the other. In the above PDF link you read the _Dream Machines_ half - which is where you find the hypertext stuff - by starting at the first page and flipping pages forward, and the _Computer Lib_ half by flipping the book over and starting from the other end.)

  • It was so much better back then, so much less absolute crap.

    The internet was filled with just information and mostly intelligent people; the masses had no fucking clue.

    It's been a wild ride seeing the internet develop. If only we could take modern hardware and network speeds back to the days of better content and contributors.
