On This Day 25 Years Ago, the Web Became Public Domain (popularmechanics.com) 87
On April 30, 1993, CERN -- the European Organization for Nuclear Research -- announced that it was putting a piece of software developed by one of its researchers, Tim Berners-Lee, into the public domain. That software was a "global computer networked information system" called the World Wide Web, and CERN's decision meant that anyone, anywhere, could run a website and do anything with it. From a report: While the proto-internet dates back to the 1960s, the World Wide Web as we know it had been invented four years earlier, in 1989, by CERN employee Tim Berners-Lee. The internet at that point was growing in popularity among academic circles but still had limited mainstream utility. Scientists Robert Kahn and Vinton Cerf had developed Transmission Control Protocol and Internet Protocol (TCP/IP), which allowed for easier transfer of information. But there was the fundamental problem of how to organize all that information.
In the late 80s, Berners-Lee suggested a web-like system of management, tied together by a series of what he called hyperlinks. In a proposal, Berners-Lee asked CERN management to "imagine, then, the references in this document all being associated with the network address of the thing to which they referred, so that while reading this document you could skip to them with a click of the mouse."
Four years later, the project was still growing. In January 1993, the first major web browser, known as Mosaic, was released by the National Center for Supercomputing Applications at the University of Illinois Urbana-Champaign. While there was a free version of Mosaic, for-profit software companies purchased nonexclusive licenses to sell and support it. Licensing Mosaic at the time cost $100,000 plus $5 each for any number of copies.
I remember this day. (Score:1)
I remember spending hours trying to get Mosaic to compile under SLS Linux. Ah, the X driver issues, the horror of it all, but after a week I managed to get it up and running.
Re: (Score:2)
Mosaic installed and ran great under Win 3.11.
Re: (Score:2)
Was gonna say, my first experience with the World Wide Web was when a social club I was a junior member of met at a university student union for a change, and one of the members of the club, a student in the honors college, took us over to the honors college computer lab to show it to us.
I think it was on a Windows for Workgroups 3.11 platform, but they were working on getting it going on a Sun machine that they had in the lab. CDE biatches!
Re: (Score:2)
I'm not going to stand up for Win 3.1x here, but anyone suggesting that Linux was usable by J. Random User in 1993 for almost anything, much less that their girlfriend insisted they install it for her, has either got their dates wrong, had one heck of a girlfriend, or is making crap up. Getting Linux running in 1993 on arbitrary hardware was a royal pain in the ass if you wanted anything other than a shell.
Re: (Score:1)
Or, she was a fellow CS major, wasn't your average user, and was far more interested in being able to do the things she needed to than using Windows. She bought the machine, I bit my tongue about Windows ... and she laste
Re: (Score:2)
In '93 I was at MIT, and knew people in the MIT X Consortium. This was before it left MIT to become X Consortium, Inc. Motif and the Common Desktop Environment were still more than a year in the future.
Getting X Windows to run on Linux was an ordeal, even for the best X wizards in the world, even on hardware handpicked for that purpose. Installing it on off-the-shelf PCs, just because someone asked you? Bullshit. This was a time when an IBM PS/2 was still considered serviceable hardware, and anyone who wanted to
Re: (Score:3)
You were lucky to get PCs running at 60MHz back then. It wasn't until 1994 that there were 90MHz PCs running Doom.
Re: (Score:2)
Yes. The TCP/IP stack was added in the Windows for Workgroups multimedia extensions available as an additional install on the drivers disk for a CD drive (if I recall correctly). I vividly remember installing it on a Win 3.1 laptop and dialing-up with a 14.4 modem. Changed my life.
Re: (Score:2)
and dialing-up with a 14.4 modem. Changed my life.
Correction: It will change your life. When that page finishes loading.
Re: (Score:2)
Without 100 tracking cookies and JavaScript bloat, pages loaded about as fast as they do now.
Fun story: the first image I ever downloaded from the "internet" was a 50kb b&w satellite weather image in a lynx browser on a 2600 baud internal modem. The serial mouse I had connected was on the same IRQ as the modem, however, so the modem would freeze when the mouse wasn't active but then resume when the mouse was moving. So I sat there for what seemed like an hour moving the mouse in circles until the pic
Re: (Score:3)
Ah yes. And the code name for the WFWG TCP/IP stack was "Wolverine".
Re: (Score:3)
WINDOWS machines on the Internet in 1993? I didn't think NetBEUI was routable.
IIRC, you needed Winsock and win32s in order to get Mosaic running on Win 3.1x, but it was certainly possible.
Re: (Score:2)
"you needed Winsock and win32s"
Yep that was it and what I meant by TCP/IP stack in my posts.
Re: (Score:3)
When the first ISPs for home users became available, they gave out Trumpet Winsock applications to use to read email and Usenet, and do FTP, gopher, traceroute, ping, and whois.
Re: (Score:2)
Worked *much* better under SunOS at work and NeXTSTEP at home.
Re: (Score:1, Troll)
I wish CERN would stop claiming credit for things they didn't do ('web'), and stick to claiming discovery of things that don't exist ('higgs').
Re: I remember this day. (Score:2)
Yeah, I'm old enough to remember the day when somebody could invent a superior solution, and put out a new client, and everybody would be like, "cool I'll go download that app and use that instead." Can you imagine trying to move people off of http/html today?
Re: (Score:2)
For all intents and purposes Mosaic DID invent the web as the world knows it today. Links, images and text on a single page = Mosaic.
Re: (Score:1)
Gopher was about midway between http and ftp. The web could never have been built on top of gopher. Sure, it delivered data, and browsers could visit gopher sites by their URL, but it was too rigidly structured and too text-oriented to be successful in the long run.
If the web is like a library, gopher was more like a book of the month club.
Re: (Score:1)
'Web' is an abbreviation of WorldWideWeb. CERN did not invent the Web; Tim Berners-Lee and Robert Cailliau invented it by publishing the WorldWideWeb proposal [w3.org]. Note that it says "HyperText" right in the title. Gopher was an Internet service, but it was not hypertext and not part of the Web.
Re: (Score:2)
Re: (Score:2)
And soon it was the Information Superhighway. The local engineering society had a speaker talking about this new thing; I have the handouts someplace (there was no PDF download). I remember many people asking how a business could make money using such a thing. I was thinking it would be cool to simply click a link instead of navigating groups on my CompuServe (I saved my Model 500 phone so I could still use the acoustic coupler).
There were other talks about a Coke machine that was connected to the Information Super
Re:I remember this day. (Score:4, Interesting)
By 1994, Mosaic would build pretty easily on a Slackware distribution. I had that, NCSA httpd and a few other goodies installed on a Dell PC that Boeing was good enough to drop on my desk a few months before the Windows install crew could get around with their box-o-floppies. By the time they did stop by, I just told them "Never mind."
Engineering was responsible for providing documents to the factory floor, which was done with a convoluted combination of an index database (accessible via 3270 terminals), some notes scribbled on a piece of paper, and then a manual search for a file on some server share. Along with all the possible fat-finger errors imaginable. So one day I was goofing around with httpd and managed to get a read-only link to the mainframe to select datasets applicable to a particular plane, and then format a link to the document server. Two clicks and you're done. I showed it to my boss, who showed it to some factory managers. Boss came back from the meeting and said, "The shop wants this web thingy in production in two weeks." So we got an actual server (Sun), built the pieces, and made the schedule. Factory loved it. Boeing Computer Services* hated it. They figured that this kind of development could have earned them a few tens of millions of dollars and a fully staffed program for a year or two.
*One of their IT guys asked me how I (the sole maintainer of the entire web system) managed to build the HTML index pages from a database dump to keep the web data up to date. They didn't understand dynamic pages back then.
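The idea behind dynamic pages can be sketched in a few lines of modern Python (the dataset IDs, titles, and paths below are made up for illustration, not the actual Boeing data): instead of rebuilding static HTML index files from a database dump, the index page is rendered fresh from the current rows on every request.

```python
import html

# Hypothetical rows as they might come from the index database:
# (dataset id, document title, path on the document server).
ROWS = [
    ("D-100", "Wing assembly drawing", "/docs/wing-assembly.pdf"),
    ("D-101", "Fastener spec", "/docs/fastener-spec.pdf"),
]

def index_page(rows):
    """Render the index as HTML per request - nothing is pre-built."""
    items = "\n".join(
        '<li><a href="{0}">{1} ({2})</a></li>'.format(
            html.escape(path, quote=True), html.escape(title), html.escape(ds)
        )
        for ds, title, path in rows
    )
    return "<html><body><ul>\n{0}\n</ul></body></html>".format(items)

print(index_page(ROWS))
```

Serve that from a CGI script (or any handler) and the "index pages" are always up to date with the database, which is exactly what made the dump-and-rebuild question moot.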
In other news... (Score:2, Funny)
Re: (Score:1)
You mean like TFA? No thanks, it's leaving out all sorts of details and gives Tim "DRM for the web is a good thing" Berners-Lee more credit than he's due.
I had had no idea my company spent that much! (Score:3, Interesting)
We licensed it to test our new web site that I don't think any customers even used until a couple of years later. The site was pretty crappy since I learned HTML from viewing the source on other sites, and it took me a lot of time so that was a huge waste of money.
Re: (Score:2)
LOL, that's what we did to get access to Mentor Graphics for a circuit board layout program before we switched to Tango Pro that would run on a PC. Apollo computers were very expensive at the time. They got a little cheaper after HP bought them, but by then it was basically a dead-end platform not worth investing in.
Re: (Score:2)
That's another thing they invented: users not giving a shit about licenses.
Re: (Score:1)
https://en.wikipedia.org/wiki/University_of_Illinois_at_Urbana%E2%80%93Champaign
Dummy.
Remember when there were lots of web sites? (Score:5, Insightful)
Facebook / Twitter: Ego Strokes for Regular Folks (Score:1)
Back in the late 90s people would take their interests or hobbies, put them on a website, and then promptly add feel-good widgets like web counters and guestbooks to boost their egos a bit and feel like they had an audience rather than just throwing information into the void.
That was the seed of social media right there. Web counters evolved into web rings, communities, user scores, and eventually huge public forums like Twitter and Facebook. A user's own ego is actively played by these systems to encourage more u
Re: (Score:2)
During 9/11 and major Internet outages across the world, we discovered that the long-distance carriers weren't using mesh networks but had consolidated their infrastructure into spanning tree networks to save money. Then customers realized that even having multiple providers wasn't enough as they had virtualized their capacity through subleasing of fibre optic conduits, bundles and even virtual networks. Facebook and Google had to deal with so much data sloshing around their pipes, they needed colocation fa
Ah memories (Score:2)
Re: (Score:3)
I still remember Eternal September, when Usenet as it had been basically died. It certainly wasn't perfect, but it was a beacon of light compared to the cesspool that it became.
Re: (Score:3)
[...] with my Utopian vision of how people were going to use the Internet; to share and exchange new ideas and information and further human understanding [...]
And then you stumbled on your first goatse link.
Re: (Score:2)
Nope. Even then, it was pretty clear that its primary application would be to replace teletype ASCII porn.
Fight for the internet/web’s future. (Score:1)
The Irony (Score:3)
The irony is that this is being reported on a paywalled site that doesn't work when you use a hosts file to block ads.
Impediment was the domain name (Score:4, Insightful)
Those costs are what drove people to "free" web services like GeoCities, MySpace, and eventually Facebook. You can justify the cost to set up your own domain and website if it's going to be a business venture or a major part of your online profile. But for the vast majority of people, it wasn't worth it. Which is what allowed the personal-info-harvesting vultures to swoop in and take over the web as we know it today. In some ways I think it was actually better before the web, when simply having an account on an Internet service automatically gave you a finger profile [wikipedia.org] you could fill out however you wanted at no additional cost.
Nowadays most ISPs also give you some free web space (and an email address) with your account. But it's too little, too late.
Re: (Score:2)
Not even email - that gets outsourced to Microsoft Webmail.
Re: (Score:2)
Go back 25 years and make it not happen (Score:3)
Quick trip back in time (Score:5, Interesting)
I was working at Indiana University at the time. In the fall of 1993 I found the world wide web. I got xmosaic working on my unix desktop and also installed NCSA httpd. I downloaded the HTML specification and got to work implementing web pages. At the time, I had pages that took 30 seconds to generate.
Our department (basically IT for the university) was smack dab in the middle of moving our information services from the VAX/VMS cluster to the newfangled gopher service.
https://en.wikipedia.org/wiki/... [wikipedia.org]
Along with a few other folks I did a "stop the presses" and convinced them to abandon that project and go straight to the WWW. It took a lot of convincing since they were so invested in getting gopher up and running. Plus, the text-only information was a pretty easy direct-map to gopher.
The browsers were really primitive at the time. No stylesheets, but we had inline images and could set background colors.
Not only that, the development was mainly done on the X11 platform. The Windows and Mac browsers were always lagging in features.
Fill-out forms were a new thing, and the sub ordering application was the standard demonstration for that. Before fill-out forms, there was the "isindex" tag which would show a search box, the contents of which would be added as the "query" part of the url when you hit enter.
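A rough sketch of what the browser did with an "isindex" search box, assuming the standard behavior of the day: the typed keywords were URL-encoded (spaces becoming plus signs) and appended as the query part of the URL, with no form fields at all. The URL below is made up.

```python
from urllib.parse import quote_plus, unquote_plus, urlsplit

def isindex_submit(base_url, typed_text):
    """Mimic an ISINDEX submission: keywords become the URL's query part."""
    return base_url + "?" + quote_plus(typed_text)

def isindex_parse(url):
    """Server side: recover the typed keywords from the query string."""
    return unquote_plus(urlsplit(url).query)

url = isindex_submit("http://example.org/search", "student accounts")
print(url)                # http://example.org/search?student+accounts
print(isindex_parse(url)) # student accounts
```

Since that query part was the only channel back to the server, anything the CGI script needed had to be packed into it, which is what made isindex such a blunt but workable tool.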
There were no cookies and thus no real way to keep state. When I quit the university, we were working on a way to store session information in files on the back end. The idea was basically what PHP was doing back around 1999 - give the user an MD5 hash or something that was used as the query portion of the url. Every url had to go through CGI, and the script would look up the file containing the session information and read it in, and possibly write it out. We wanted to move some of the administrative actions - such as students setting up accounts - to the web. Since fill-out forms weren't really available on the Mac and Windows platforms, we were looking at using "isindex" to get all information to the backend.
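The session scheme described above can be sketched roughly like this, assuming a file per session keyed by an opaque token (MD5, as the comment suggests); the directory layout and field names are hypothetical. Each "request" carries only the token, and the script reads the state file in, possibly updating it.

```python
import hashlib
import json
import os
import tempfile
import time

SESSION_DIR = tempfile.mkdtemp()  # stand-in for the back-end session directory

def new_session():
    """Mint an opaque token and create an empty session file keyed by it."""
    token = hashlib.md5(os.urandom(16) + str(time.time()).encode()).hexdigest()
    save_session(token, {})
    return token

def save_session(token, state):
    """Write session state out, as the CGI script would after each request."""
    with open(os.path.join(SESSION_DIR, token), "w") as f:
        json.dump(state, f)

def load_session(token):
    """Read session state back in, knowing only the token from the URL."""
    with open(os.path.join(SESSION_DIR, token)) as f:
        return json.load(f)

# One "request" stores a value; a later one, given only the token, gets it back.
tok = new_session()
state = load_session(tok)
state["student"] = "jdoe"
save_session(tok, state)
print(load_session(tok))  # {'student': 'jdoe'}
```

This is essentially the file-backed session model PHP popularized a few years later, minus cookies: the token rides in the query string instead.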
It's amazing how far we've come in 25 years. I started doing heavy web development in 1999 and even then it was amazing how far it had advanced.
I hope that Berners-Lee reads this trash (Score:1)
Hypertext came from a number of people such as Bush and Nelson, along with Apple.
HTML was simply an implementation of IBM's GML.
CERN httpd was a straightforward implementation of FTP (thank god for that).
Probably the one thing that Berners-Lee's group really did was put all of this together and then open-source it, while companies were getting pieces/parts but then trying to figure out how to control and profit.
"What he called" was coined long before. (Score:3)
In the late 80s, Berners-Lee suggested a web-like system of management, tied together by a series of what he called hyperlinks.
Which term, along with "Hypertext", several others, and much of the vision of a web of interconnected online documents, had been coined and promoted by Ted Nelson many years before. See the first edition of _Computer Lib/Dream Machines_ [google.com] (1974) for an early treatment of the ideas, and _Literary Machines_ for a more developed one.
His (ill-fated) Project Xanadu was, by that time, funded and working round-the-clock to try to put together a world electronic library - with substantially more functionality (including an attempt to provide an acceptable emulation of or substitute for everything you can do with paper publications), when Tim's work hit the net and created the World Wide Web.
Xanadu deviated from the WWW design in a number of ways, including:
- "fine-grained links" (where a link end points to a particular range of a target, rather than a whole page or a point within it {unless you want that to be the target}),
- bi-directional links (you can inquire what links are inbound to where you are reading and follow them backward)
- avoidance of the "Library of Alexandria" / broken link problem by distributed database techniques.
and I could go on.
But they had bitten off a BIG problem, and the WWW, with its non-proprietary servers filled an immediate need with an immediately usable solution.
Corrected link. (Score:2)
Oops. Pointed the link to the search, not the document.
See the first edition of _Computer Lib/Dream Machines_ [linkedbyair.net] (1974) for an early treatment of the ideas, and _Literary Machines_ for a more developed one.
(Note that the printed version of the book was two half-books, like an Ace Double, which you flipped over to read one vs. the other. In the above PDF link you read the _Dream Machines_ half - which is where you find the Hypertext stuff - by starting at the first page and flipping pages forward, the _Comp
Yup I remember the good ol days (Score:1)
It was so much better back then, so much less absolute crap.
The internet was filled with just information and mostly intelligent people; the masses had no fucking clue.
It's been a wild ride seeing the internet develop. If only we could take modern hardware and network speeds back to the better content and contributors of those days.