How Moore's Law Saved Us From the Gopher Web
Urchin writes "In the early 1990s, the World Wide Web was a power-hungry monster unpopular with network administrators, says Robert Topolski, chief technologist of the Open Technology Initiative. They preferred the sleek text-only Gopher protocol. Had they been able to use data-filtering technology to prioritize Gopher traffic, Topolski thinks, the World Wide Web might not have survived. But it took computers another decade or so to become powerful enough to give administrators that option, and by that time the Web was already enormously popular." My geek imagination is now all atwitter imagining an alternate Gopher-driven universe.
You kids and your fancy gopher (Score:5, Funny)
I'm pressing ESC twice to access this damn BBS.
Re: (Score:2)
And More Laws will destroy it (Score:5, Funny)
Uh, no. (Score:5, Insightful)
Even if the Web had been stunted by throttling, the demand for multimedia content would have eventually driven the rise of the Web or at least a super-Gopher.
Re:Uh, no. (Score:5, Insightful)
Well, multimedia is an orthogonal concern, really. If anything, in the early days gopher was more convenient for multimedia than the web.
The thing about the web, the defining characteristic from the point of view of providers of information, was HTML. And HTML was a pain. It still is, but since we assume it's necessary we don't think of it as pain. Back in the day, it was much easier to dump all your stuff into gopher, including your multimedia files, than it was to write a whole bunch of new HTML from scratch.
HTML was also pretty far from what people eventually wanted the web to do, which was to be an app platform. A lot of fancy architectin' has gone on to get it where it is today, and people are still screwing around with stuff like Flash.
The thing about the PITA of HTML is that it forced people to redo so much of their content into a uniform format, and what's more, a format that could be spidered by robots. That's the secret sauce. Yeah, it's nice that people can follow hyperlinks, but it was the ability to deal with basically one kind of data (marked-up docs with hyperlinks in them) that really made the web powerful.
Another thing: while early HTML wasn't very much like what people wanted for their documents, and despite abortive early attempts to add things like fonts (not to mention our beloved blink tag), HTML's SGML roots gave it architectural flexibility. It needed that flexibility so that the missing 99% of what people really wanted could be added later without turning it into a hopeless mess.
Re:Uh, no. (Score:5, Insightful)
The thing about the PITA of HTML is that it forced people to redo so much of their content into a uniform format, and what's more, a format that could be spidered by robots. That's the secret sauce. Yeah, it's nice that people can follow hyperlinks, but it was the ability to deal with basically one kind of data (marked-up docs with hyperlinks in them) that really made the web powerful.
The thing about HTML is that it really didn't force anyone to do anything. A web server will serve plain text files just fine, and as long as everyone's MIME types are good, your browser will display them. Another non-secret secret of the web is that it doesn't require an HTTP server: you can serve a web site just fine (albeit a little slowly and without dynamic content) via FTP. Finally, I took a bunch of drinking-game content and put up a drinking-game website by just writing a CGI (this was back in the early nineties) to write a header, insert a PRE tag, include the text file, insert a closing PRE tag, and write a footer. Careful examination of this description will reveal that I did not actually have to do anything to my text files. In addition, text files can be spidered just fine. HTML renders down to text when done correctly (or it doesn't spider), and text is already text.
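A minimal sketch of that kind of wrapper, in Python rather than the shell or Perl typical of the era (the file name here is a made-up placeholder):

```python
#!/usr/bin/env python3
# Minimal sketch of the wrap-a-text-file-in-PRE trick described above.
# Assumes a plain CGI environment; the content file name is hypothetical.
import html
import sys

TEXT_FILE = "drinking_games.txt"  # hypothetical content file

def main():
    # CGI header, then a blank line to separate it from the body.
    sys.stdout.write("Content-Type: text/html\r\n\r\n")
    sys.stdout.write("<html><body>\n<h1>Drinking Games</h1>\n<pre>\n")
    with open(TEXT_FILE, encoding="utf-8") as f:
        # Escape <, >, & so the plain text cannot be mistaken for markup.
        sys.stdout.write(html.escape(f.read()))
    sys.stdout.write("</pre>\n</body></html>\n")

if __name__ == "__main__":
    main()
```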
Got any other erroneous information to share?
Re: (Score:2)
A web site made up of text files would be pretty dysfunctional.
Re: (Score:2)
Re: (Score:2)
A web site made up of text files would be pretty dysfunctional.
In other words, a typical web site?
Re: (Score:2)
Actual experience with FTP vs. Gopher vs. WWW (Score:5, Insightful)
At first, the journal served papers through anonymous FTP.
Then, I crafted a Gopher structure to make browsing easier.
As soon as HTML/HTTP came along, I created the HTML version of the journal. It was much more maintainable than the Gopher version, because the hyperlinks decoupled the document structure from the file-system tree structure just enough. In a few years, I stopped maintaining the Gopher version, because it required an order of magnitude more work than the HTML, and readers all preferred the HTML anyway.
Adding pictures and such is rather trivial for the data architecture, although demanding for the network implementation. With a more maintainable structure, Gopher would have added those extras too. It was the hyperlinks that made HTML work better.
HTML also has some serious maintenance problems, but they appear later when the archive gets large, and they can be addressed with things like PHP compiling and content management systems.
From another point of view: Gopher essentially made file trees visible over the network (which is what I thought I wanted at first). HTML/HTTP provides a crude network database model distributed over the network.
Future advances in data architecture (as opposed to the types of data within that architecture) will have to do with other database models, and with other sorts of commitments between distributed servers, and with looser coupling between data ownership and server ownership. E.g., a way to provide reasonable assurance of future access to a particular data item (access includes being able to find it, not just its existence), without depending on a particular server at a particular registered domain name (the Wayback machine ameliorates the problem, but doesn't solve it).
Re: (Score:2)
So did FTP servers. Gopher was basically like an FTP service with a tad more metadata.
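For anyone who never saw it, that "tad more metadata" was a one-line-per-item menu format: an item-type character, a display string, a selector, a host, and a port, separated by tabs. A small illustrative sketch (the host and selectors are made up):

```python
# Sketch of the Gopher menu format: each item is one tab-separated line
# ending in CRLF, and a menu ends with a lone "." line.
# Type '0' = text file, '1' = submenu, '9' = binary file.
# The host and selectors below are hypothetical.
def menu_item(item_type, display, selector, host="gopher.example.org", port=70):
    return f"{item_type}{display}\t{selector}\t{host}\t{port}\r\n"

menu = (
    menu_item("1", "Papers by year", "/papers") +
    menu_item("0", "About this archive", "/about.txt") +
    menu_item("9", "issue-01.tar.Z (binary)", "/files/issue-01.tar.Z") +
    ".\r\n"
)
print(menu, end="")
```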
Re: (Score:3, Insightful)
> This is the same sense in which 'apt-get' is a convenient way to get software.

Compared to hunting something down and then dealing with the crap and ads that are likely to infest any Windows download site, "apt-get" is HIGHLY convenient. Saying "get me some program" and letting "get me some" sort out all of the relevant details is a big improvement over how most people install software these days.

"Modern software" that needs to be installed by a shiny happy GUI installer is actually a step backwards in many re
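The "letting it sort out the details" part is essentially dependency resolution. A toy sketch of the idea, with a made-up dependency map; real apt obviously also handles versions, conflicts, downloads, and verification:

```python
# Toy dependency resolution: given a package, compute the full install set.
# The package names and dependency graph here are hypothetical.
DEPS = {
    "someprogram": ["libfoo", "libbar"],
    "libfoo": ["libc"],
    "libbar": ["libc"],
    "libc": [],
}

def install_closure(pkg, seen=None):
    seen = set() if seen is None else seen
    if pkg not in seen:
        seen.add(pkg)
        for dep in DEPS.get(pkg, []):
            install_closure(dep, seen)
    return seen

print(sorted(install_closure("someprogram")))
# ['libbar', 'libc', 'libfoo', 'someprogram']
```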
Re: (Score:3, Funny)
...or at least a super-Gopher.
I, for one, welcome our new underground overlords.
Sorry, the thought of super-Gophers scares me more than cloned dogs, or Africanized bees, or cloned dogs with Africanized bees in their mouths so when they bark they shoot bees at you.
Re: (Score:2)
I, for one, welcome our new underground overlords. Sorry, the thought of super-Gophers scares me more than cloned dogs, or Africanized bees, or cloned dogs with Africanized bees in their mouths so when they bark they shoot bees at you.
Best string of Simpsons references. Ever.
Re: (Score:2)
Actually, the Web would have been nicer too if the images were optional links to follow instead of being shown inline by default. The early web was very inefficient, and you could spend a lot of time staring at a picture slowly being drawn when all you really wanted to see was the text.
Gopher was also much more organized in many ways. The free-form HTML resulted in most sites being a jumble. On the other hand, the free-form nature also spurred a lot of experimentation and growth.
Even today I find most we
Le Super Gopher (Score:2)
...or at least a super-Gopher.
I think I saw that guy in Caddyshack...
Multimedia was inevitable (Score:5, Insightful)
Even if Gopher had dominated due to filtering (a premise I don't agree with), multimedia capabilities would have eventually been added to the protocol out of demand. We'd have the same web we have today.
Re:Multimedia was inevitable (Score:4, Interesting)
Eventually, maybe, but exposure drives demand; if it had stalled long enough for, say, cable and phone companies to deliver substantial non-free interactive multimedia outside the context of the web first, it's very likely that nothing socially like the current web would have existed anytime near now, even if many of the individual features that are important about the web were available in one form or another on some networked electronic system that was widely available elsewhere.
Re: (Score:3, Insightful)
Eventually, maybe, but exposure drives demand; if it had stalled long enough for, say, cable and phone companies to deliver substantial non-free interactive multimedia outside the context of the web first, it's very likely that nothing socially like the current web would have existed anytime near now, even if many of the individual features that are important about the web were available in one form or another on some networked electronic system that was widely available elsewhere.
You have plenty of proprietary-network examples: CompuServe, GEnie, Prodigy, Sierra Network, AOL. Some were certainly more multimedia than others. But the common issue is that they were all their own digital islands. That worked well for decades, until the Internet consumed public consciousness (and AOL launched the September that never ended [wikipedia.org]).
The power of the 'web isn't in multimedia delivery. That's not to say it isn't important. But there is a more fundamental feature: ubiquity. For all the features
Re:Multimedia was inevitable (Score:4, Informative)
Ubiquity, a feature of the internet, was a consequence of multimedia, a feature of the web -- almost anyone could get access to the internet for many years before the web was popular (I remember first looking into local ISP options in ~1991). Comparatively few people did until the web was popular, because there was no appeal to most people. The internet, which had been around for quite some time, became omnipresent because it offered something which rapidly drew wide interest, and that was the multimedia offered by the web.
Re:Multimedia was inevitable (Score:5, Interesting)
The internet, which had been around for quite some time, became omnipresent because it offered something which rapidly drew wide interest, and that was the multimedia offered by the web.
Not at all. Email was the killer app. And that wasn't multi-media.
I remember trying to get an Internet connection in '91. It wasn't to be had where I was; I had to "borrow" a link from the local university. I got involved with an outfit opening up an ISP in the area. And while firing up Netscape got folks really happy, it was email that got the subscriptions. Folks wanted to be able to email their kids off at college. We were in a military town with a base that was on a constant deployment schedule (myself included). Military families bought subscriptions as soon as they realized email was (almost) instant compared to the two weeks it took for snail mail to make it across the pond and into sandland.
Now, to be sure, for me the 'web was a killer app as well. I remember being all giddy over clicking a link that had a .au in its URL (and not paying LD charges). This was the realization of Clarke's 2010. And then I was pulling up images of all manner of content - from magazines to hobbies to... well... other interests.
But all of this would be window dressing if it weren't for the fact that I can email anyone no matter what service provider they use. And when I want to bring up Megacorp Hobby's web page to order supplies for a project I read about on some enthusiast's private underwater basket weaving fan site... I don't have to worry about the provider then either.
The underpinning of all this is ubiquity. I had a lot of these features during the years I used CompuServe, et al. And services like Sierra Network were pushing the graphics/multimedia angle. But none of them hooked me up with a fan site in Australia.
Re: (Score:2)
You know - you're right. There was all that fear about MSN eating AOL's lunch; the unfair advantage MS would have putting a MSN shortcut on the desktop. And then there was the Internet.
Re: (Score:2)
I agree, I don't think gopher would have become popular as just a text protocol. The internet might have stagnated waiting for multimedia to be shoehorned into gopher. People seem to like the pictures, movies, motion and all the other bling, preventing all that probably would have stunted the internet's popularity. If people didn't really want the pictures, they could just stick with lynx, and that really doesn't seem to be retaining any traction.
Re: (Score:3, Insightful)
Probably. We already had things like CompuServe, Prodigy, BBSes, FidoNet, email, Minitel, etc., but it wasn't until Joe Sixpack could see photos, play music, and click with a mouse that it took off in the market. The command line, memorizing keyboard commands, etc. are a real barrier to entry. A lot of FOSS people don't understand that.
It's equally, if not more, likely that someone would have just invented something web-like and leapfrogged over gopher, like TBL did at CERN.
Not to mention PCs having multimedia capabilities was
Gopher was great (Score:3, Interesting)
If Gopher had won, we would have had more of a focus on content than presentation. I hardly think this is a bad thing.
Re: (Score:3, Insightful)
I'm not sure that's right.
The thing is, it's possible to architecturally separate presentation from content from metadata in HTML. Furthermore, people do care about presentation. Who are we to say they shouldn't? The problem is confusing the two.
Here's what I see wrong with the puritanical belief that outlawing presentation hanky-panky will keep the flock virtuously focused on content: people will cheat. When they think they can get away with it, they'll enthusiastically engage in all manner of abominat
Re:Gopher was great (Score:4, Interesting)
Who ever said graphics on a web page are not content? Who ever said that the beautiful, graphics-intensive web pages of today are not a form of art? Is text the only form of content? No! Is telling artists that they can only use a pencil and are not allowed to use any colours at all in their work a reasonable limitation? No. Using colour, paint and so on gives you more capability, which allows you to create even more exquisite content. The greater graphics capability of Flash, and hopefully soon of open-spec web equivalents, allows one to portray and create art not possible with text.
Re:Gopher was great (Score:4, Informative)
Oh Archie [wikipedia.org] and Veronica [wikipedia.org], how I miss thee.
Gophers? (Score:2, Funny)
Gophers!! (Score:2, Informative)
No, not little, brown, furry rodents...
Massive, golden, beautiful beasts!! [gophersports.com]
As most of you know (or maybe don't know), it's called gopher because it was developed at the University of Minnesota... http://en.wikipedia.org/wiki/Gopher_(protocol) [wikipedia.org]
I loved Gopher (Score:5, Funny)
That's what's really stopping me from getting an iPhone: I can't access badger-net.
Re:I loved Gopher (Score:5, Informative)
Badgerbadgerbadger.com is not connected with the creator of the flash movie, it is just some guy trying to profiteer over the meme. Stick with linking to the original authors, not the leeches.
Re: (Score:3, Informative)
Badgerbadgerbadger.com is not connected with the creator of the flash movie, it is just some guy trying to profiteer over the meme. Stick with linking to the original authors, not the leeches.
I did a quick search for the badger flash vid before posting, and just took the first link I could find. I thought that was the original site at first. I hadn't seen the flash video in years so I didn't know the original URL.
The original is Here [weebls-stuff.com]
Re: (Score:3, Informative)
Badgerbadgerbadger.com is not connected with the creator of the flash movie, it is just some guy trying to profiteer over the meme. Stick with linking to the original authors, not the leeches.
Except that badgerbadgerbadger.com's little flash movie has a link in the bottom right-hand corner of it, pointing to www.weebls-stuff.com [weebls-stuff.com] - the aforementioned original author.
Re: (Score:2)
When history looks back at the first generation to extensively use the World Wide Web, it will sigh a collective "WTF?"
Irritation (Score:5, Insightful)
People think that if Person X hadn't been around, we might not have Technology Y. First, this is based on the idea that somehow Person X has some unique ability and only Person X can create Technology Y. Hate to break it to you, but you're not special. Neither is Person X. Second, the reason we have Technology Y is because we needed it. If those needs haven't gone away, then the pressure to fill that void remains -- and somebody else will come along and fill it eventually. Now, you're right that maybe Betamax might have beaten VHS if not for a disturbance in the force, or it would have been HD-DVD instead of Blu-ray, or whatever... but we'd still have high-density optical media. Gopher would have died simply because it didn't meet the needs of the population. Maybe it wouldn't be HTTP that replaced it five or ten years later, but something like it would have been created.
Re:Irritation (Score:5, Interesting)
Hate to break it to you, but you're not special. Neither is Person X.
That is a crock of shit. I mean, I may not be special, but certain persons who helped shape the future of science (including computing) are. There is no denying the "specialness" of people like Nikola Tesla or Albert Einstein. Why, then, should you deny the specialness of someone who is arguably less special than they are, but more special than you are? Simple jealousy? History is chock-full of examples of people whose unique way of thinking changed the shape of our world, the canonical example being Newton. He saw things in a way that others did not, and he advanced science dramatically. Maybe Tim Berners-Lee is no Einstein or Newton or Tesla, but he is certainly an individual with unique thought and influence.
In any case, the argument here is actually that if we hadn't had the processing power to do multimedia, we would have had a dramatic population increase in gopherspace rather than exponential growth of the WWW. The only part of the argument that is stupid is that people were already serving images over gopher; you needed an external viewer, of course. But sooner or later, someone would have come up with a multimedia markup extension for gopher, and then gopher would have been the WWW, just with a different protocol.
Re: (Score:2)
Re: (Score:2)
There is no denying the "specialness" of people like Nikola Tesla or Albert Einstein. Why, then, should you deny the specialness of someone who is arguably less special than they are, but more special than you are? Simple jealousy?
No. Success depends on a lot more than just a person's innate "specialness". Or I can be more blunt: They just happened to be in the right place, at the right time, and had what was needed. There have been hundreds of failed Nikola Teslas -- I could manufacture him on an assembly line and sprinkle copies throughout society and I'd be unlikely to reproduce what the original did, simply because the environmental factors would be lacking. I'm sorry, because I know everyone wants to feel special and unique, tha
Re: (Score:3, Insightful)
Sure, so what? That it depends on more than a person's "specialness" does not refute that "specialness" matters.
Yes, and to be equally blunt, hundreds of millions or billions of other people were around at the right time, very many of them at the same right place or one equally right; what was key is the "had what was needed"
Re: (Score:3, Insightful)
And yet Leibniz invented calculus too, independently and at about the same time. Methinks you need a better example.
Re: (Score:2)
From what I understand, they invented two different formulations of calculus. Also, there was some reason to suspect that Leibniz may have gotten the idea from Newton, and they had quite a fight about it for many years.
Re: (Score:3, Informative)
I would also point out a famous quote of Newton in this context: "If I have seen further it is only by standing on the shoulders of Giants."
The history of mathematics is actually full of examples of parallel discoveries and rediscoveries. Mathematics is not the only science
Re: (Score:2)
Since everything in mathematics can be derived from the base definitions isn't everything bound to be rediscovered given enough time and effort?
I know I've rediscovered tons of mathematics by accident since I needed it for something only to later discover someone else had already done it more formally a few hundred years ago.
Re: (Score:2)
You could also argue that when confronted with a problem whose solution is unique, anybody who solves it will necessarily obtain the same solution. However, that still leaves the question of why independent minds try to solve essentially the same problem, *before* the problem becomes generally recognized or famous.
Gauss, Lobachevsky and Bolyai inve
Re:Irritation (Score:5, Insightful)
The thing is, when the precursors to a new technology get developed, the new technology becomes apparent to growing numbers of people until someone develops it.
I remember experimenting with GUIs in my local Atari Computer Club back in 1981 (using light pens instead of mice... light pens were easier to fabricate). People nowadays like to think that Bill Gates invented the Graphical User Interface. Slightly more savvy individuals think it was one of the Steves (Jobs/Woz) who did it. Marginally less clueless folks might think it was Xerox, or IBM, or whoever. The fact is that as soon as cheap consumer-grade computers hit the market (mid-to-late 1970s), GUI-controlled operating systems became inevitable. If there was a gang of people in my little backwater town working on the issue, there must have been thousands upon thousands of people experimenting with GUI controls nationwide.
Finally, compared to the technologies upon which it relies (with regard to the Internet), HTML, and by extension the Web, is trivial. The only HTML tag that really matters is the link anchor, and even that had precedents. How long would it really have taken for people to start including the gopher addresses of referenced documents in the documents they posted on gopher? How long before gopher browsers that could retrieve and display documents encoded in standard formats became available? Anyone who remembers using the gopher browser that shipped with early versions of OS/2 knows that gopher could have done the duty of http in its absence... particularly as a standard document format would eventually have developed to ease spider indexing.
Really, folks, there were a lot of us working on this stuff back then. What we have now is a crude compromise (with Flash cancer), but that we would have a graphically navigable network of documents spanning the globe was never in doubt.
Re: (Score:3, Interesting)
And yet Leibniz invented calculus too, independently and at about the same time. Methinks you need a better example.
Or you need to learn about Newton's theory of gravitation, which fits the GP's point much better:
"He saw things in a way that others did not, and he advanced science dramatically."
Calculus is math, which is admittedly a large part of science... but I believe the GP's point was that figuring out that things fall down because "down" is relative to the large, relatively stationary object we stand on was probably completely inconsistent with the then-current accepted "truths". Think different, ya dig?
Re: (Score:2)
If Newton hadn't thought of it, Bernoulli or Maxwell or somebody would have instead.
Re: (Score:2)
Unfortunately, both of your examples could easily have been replaced by someone else around the same time. Both Tesla and Einstein, for all their work, which I appreciate, were not the only people doing what they did. They did not do all of that work by themselves; they built on work done before them.
I'm sure there is something somewhere that was unique to a particular person that came up with the idea, but rarely is it ever the person credited with it, and I doubt anyone short of the fir
Re: (Score:2)
Actually, if you look closer at history you'll notice that most things were invented by several people around the same time but only one of them got credit. While the inventors were all above the norm, they weren't unique.
Re: (Score:2)
The premise that "some people are geniuses, and this special-ness is a necessary requirement for certain theories to be discovered" isn't quite as self-evident as you make it out to be.
In Pratchett's "The Science of Discworld 3: Darwin's Watch", there is a chapter that deals specifically with this issue. The author argues that there is a "time" when the mass of previous evidence and research will result in the theory being "created". See this guy's summary:
"They also show how it's not enough to have an idea
Re: (Score:2)
Sometimes, but sometimes not. Given a set of tools and a problem to solve, I'm sure there will be duplication in the different ways that people use the tools to solve the problem. But every so often you do get one person who thinks a little differently and does it in a unique way.
After they have done it in their unique way, it seems obvious that things can be done that way. But they had to think it up first. So sometimes people are special and things would be different if they had
Re: (Score:2)
There are two concepts here. The first is the uniqueness of the individual. The second is the idea whose time has come.
I agree that history shows that there are certain ideas whose time has come. There are examples of certain revolutionary changes being worked on from different angles. And with hindsight, we can see that the change was only a matter of time.
But saying anyone could have brought about these changes sounds an awful lot like armchair quarterbacking; "yeah - I could have done that." There
Re: (Score:2, Funny)
You say that like they're different things (Score:2)
I remember Gopher. Used it a bit when I first got online. The WWW was to Gopher as Web 2.0 is to the WWW. Really. The web was a natural progression of improvement from Gopher. It just wasn't called Gopher 2.0, much like Windows 95 wasn't called Windows 4.0. It was a new version, and somebody thought it would be good to give it a new name.
Nothing has changed (Score:3, Insightful)
"In the early 1990s, the World Wide Web was a power-hungry monster unpopular with network administrators"
As I write this, Firefox is using 300MB of RAM and 100% of one core, so not much has changed since then.
it would be the same (Score:2)
the only difference users would see would be that the text of a page would load first and URLs would all start with gopher:// [gopher]
Re:it would be the same (Score:4, Interesting)
the only difference users would see would be that the text of a page would load first and URLs would all start with gopher:// [gopher]
The only reason the text of a page doesn't load first today is that web browsers are badly behaved. Firefox will often refuse to render a page until it gets all the content. That's not the most aggravating thing about it, though; if a connection is reset, Firefox now shows you a page saying that it was reset, instead of the page content that it DID successfully manage to download. I don't know who's responsible for this "feature", but it's fucking stupid. It made the web mostly unusable when I was on a modem, because I'd be happily reading a page, some ad would fail to load, and then Firefox would tell me the page failed to load. Whoever made that decision should definitely be asked to justify it, or asked to fuck off immediately.
Re:it would be the same (Score:4, Insightful)
I understand your complaint. But to give rendering-engine developers some credit: if you really understood the complexities of rendering HTML properly, you'd understand why they stopped trying to do partial rendering a long time ago. It's just not worth the effort at this point.
Can it be done? Of course. Is it worth it? Meh. Considering most of the Internet is pretty reliable, the number of times partial rendering would help doesn't really justify diverting effort from other, more important aspects of rendering.
Re: (Score:3, Informative)
I understand your complaint. But to give rendering-engine developers some credit: if you really understood the complexities of rendering HTML properly, you'd understand why they stopped trying to do partial rendering a long time ago. It's just not worth the effort at this point.
No, you DON'T understand my complaint. Firefox will have rendered the page, right? Then a SINGLE PAGE ELEMENT fails to load, or perhaps the site fails to close the TCP connection properly, and now Firefox says "The connection was reset while the page was loading" and the page I was just successfully reading disappears.
If you read and understood my comment this time, now you understand my complaint.
Serious doubt about tftp (Score:2)
I don't see tftp as ever having been an important part of the internet over long-haul connections. Tftp would have been what it was intended to be and is: a very straightforward protocol that can be implemented with an incredibly tiny footprint and little risk of getting it wrong. Notably:
-TCP is *much* better at reliable communication without penalty. TFTP is intentionally dumb: send a block, ack a block, send a block, ack a block. Again, easy to write, horrible performance. In TCP we have adjusting win
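To make the lock-step point concrete, here is a rough sketch of a TFTP read in Python: no timeouts, retransmission, or error handling, so a simplification of RFC 1350 rather than a usable client, and the host/filename arguments are placeholders:

```python
import socket
import struct

# Rough sketch of TFTP's lock-step read: request a file, then for every
# DATA block (at most 512 bytes of payload) send one ACK and wait again.
def tftp_read(host, filename, port=69):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # RRQ packet: opcode 1, filename, NUL, mode "octet", NUL
    rrq = struct.pack("!H", 1) + filename.encode() + b"\0" + b"octet\0"
    sock.sendto(rrq, (host, port))

    data = b""
    while True:
        packet, server = sock.recvfrom(4 + 512)           # 2-byte opcode + 2-byte block# + payload
        opcode, block = struct.unpack("!HH", packet[:4])
        if opcode != 3:                                    # 3 = DATA
            raise RuntimeError(f"unexpected opcode {opcode}")
        data += packet[4:]
        sock.sendto(struct.pack("!HH", 4, block), server)  # 4 = ACK; one ACK per block
        if len(packet) - 4 < 512:                          # short block ends the transfer
            return data
```

Each round trip moves at most 512 bytes, which is exactly the "horrible performance" the parent describes; TCP instead keeps a whole window of unacknowledged data in flight.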
Bring the Gopher (Score:3)
Every time I have to sit through a bunch of crappy Flash or out-of-control JavaScript, I find myself wishing I could get a decent gopher feed.
I don't get it. (Score:2)
Early 90's computers...486DX2? Pentium 90? That's enough to route much more traffic than any of the nodes at that time could even conceive of, and can do QoS to boot.
Doesn't Gopher run on port 70, making it easy to prioritize over port 80 traffic? It would seem (although I could be completely wrong) that the biggest holdback wasn't hardware, just that QoS hadn't really been brought to fruition in time.
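To illustrate how cheap port-based prioritization is in principle, a toy sketch that drains a queue of (dst_port, payload) pairs with port 70 ahead of port 80. Real QoS lives in routers and kernels, not application code, so this only shows the idea:

```python
import heapq

# Toy priority queue: lower priority number = sent first.
# Port 70 (Gopher) is favoured over port 80 (HTTP); everything else goes last.
PRIORITY = {70: 0, 80: 1}

def drain(packets):
    """packets: iterable of (dst_port, payload) tuples."""
    heap = []
    for seq, (port, payload) in enumerate(packets):
        # seq keeps ordering stable among packets of equal priority
        heapq.heappush(heap, (PRIORITY.get(port, 2), seq, port, payload))
    while heap:
        _, _, port, payload = heapq.heappop(heap)
        print(f"sending to port {port}: {payload!r}")

drain([(80, b"GET / HTTP/1.0"), (70, b"/about.txt"), (25, b"MAIL FROM:...")])
```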
I doubt it (Score:2)
I doubt gopher would have met the needs of the internet as well as the web does, or that it would have been sufficient. The combination of HTTP and HTML has proven enormously successful. Gopher would have needed some major work to become as flexible as HTML. The web would probably have replaced Gopher in any case; its design is more practical and better thought out than gopher's.
Saved us? (Score:2)
Some of us don't like where the bloated, CPU- and bandwidth-wasting internet is heading. A world where gopher survived and flourished doesn't sound all that bad.
Re: (Score:2)
Net Neutrality (Score:2)
So in other words, Net Neutrality saved the web?
I predicted the failure of Mosaic... (Score:4, Interesting)
after seeing it on a secretary's desktop at NASA in the early 90s. My comment was very close to "Yeah, but I can already get all that with gopher; I don't think it will take off." Now, in my defense, just six months later I predicted that in a few years you would see panel trucks with web addresses instead of 800 numbers. The couple of people I told that to looked at me like _I_ was crazy. Damn, I wish I had put my retirement savings behind that thought.
Until mosaic, gopher was far better (Score:2)
In 1993, I heard about this "world wide web" thing and tried it. I think it was "www", some really awful command-line tool. Gopher worked far better at the time and was much easier to use. Then Mosaic came out and changed the world. I think you could have done a multimedia gopher along the same lines, though; it maybe wouldn't have been quite as flexible, but as far as bandwidth consumption goes, media is media...
Re: (Score:2)
Licensing (Score:5, Interesting)
It's surprising that no one here on Slashdot has pointed out that a major difference between HTML and Gopher was that gopher servers had to get a licence [wikipedia.org] from the University of Minnesota, while http servers could be run without a licence.
Free open software with free open standards is what got the web going.
Re: (Score:3, Informative)
Only if you wanted to use their code. It was very easy to write a simple Gopher server. I'm not saying that NCSA making their http server code freely available didn't help in the adoption of http/html; it certainly was a strong factor. But the U of M gopher server wasn't the only one out there.
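To back up the "very easy to write" claim, a minimal sketch of a Gopher server in Python: read one selector line, then send back either a menu or a file. No item types beyond text/menu, no error handling, no protection against ".." selectors; the document root, hostname, and port are placeholders:

```python
import socketserver
from pathlib import Path

DOCROOT = Path("/srv/gopher")          # hypothetical document root
HOSTNAME = "gopher.example.org"        # hostname advertised in menu lines (placeholder)
PORT = 7070                            # real Gopher uses port 70, which needs root

class GopherHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # The client sends a single selector line terminated by CRLF.
        selector = self.rfile.readline().strip().decode("latin-1")
        target = (DOCROOT / selector.lstrip("/")) if selector else DOCROOT
        if target.is_dir():
            # Serve a menu: one tab-separated line per entry, "." to finish.
            for entry in sorted(target.iterdir()):
                itemtype = "1" if entry.is_dir() else "0"
                sel = str(entry.relative_to(DOCROOT))
                line = f"{itemtype}{entry.name}\t{sel}\t{HOSTNAME}\t{PORT}\r\n"
                self.wfile.write(line.encode("latin-1"))
            self.wfile.write(b".\r\n")
        elif target.is_file():
            self.wfile.write(target.read_bytes())

if __name__ == "__main__":
    with socketserver.TCPServer(("0.0.0.0", PORT), GopherHandler) as srv:
        srv.serve_forever()
```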
If you're nostalgic for Gopher... (Score:2)
...and have access to a pre-OS X MacOS, TurboGopher VR is a must-see. The screenshot at the bottom of this page (http://www.tidbits.com/iskm/iskm3html/pt4/ch24/ch24.html [tidbits.com]) doesn't do it justice.
Suffice to say, it is probably the only Gopher client that will ever have a key mapped to a "jump" action that is interpreted literally.
3rd grade misunderstanding of protocols (Score:3, Insightful)
The difference between HTTP and Gopher has NOTHING WHATSOEVER to do with the ability to serve multimedia content, nor with bandwidth. HTTP, or really HTML, just allows more diverse linking patterns than Gopher's hierarchical format. But there's nothing non-graphical or content-specific about gopher. I RAN graphical Gopher clients perfectly happily (well, including early Web browsers that supported the protocol).
Re:lol whut? (Score:5, Insightful)
Yes, it was: people went nuts with images on their pages. I even remember one early commentator saying that text-only web pages were actually *better* for people on 14.4k baud modems.
Re:lol whut? (Score:4, Informative)
If you want to see an old style yet tasteful web page, visit my vintage 2000 Open Slate Project site. [openslate.net] It features a "3D" background, another fad that faded. No Flash. I do need to spend more time updating that site.
Re:lol whut? (Score:4, Informative)
Re: (Score:2)
Yes, but that was already about 1999. Moore's Law was already pretty mature by then and we were well past the need for gopher.
I'm not saying everything you mentioned isn't execrable, though. It was (except the dancing baby. I liked that (that last part should be said in my Cleveland voice.))
Comment removed (Score:4, Informative)
Re: (Score:3, Insightful)
Re: (Score:3, Informative)
Yes it was- people went nuts with images on their pages.
Oh, how history repeats itself: first they went nuts with images. Then it was animated images, and about the same time, flashing text. Then it was Flash animations. Now it's XML. Try comparing the complex Gmail view with the plain HTML mode sometime. There's no reason whatsoever that it should use more bandwidth to send an email (once the system has loaded) via that interface than through the simple HTML view; in fact, it should take less. Nope! It takes more.
Re:lol whut? (Score:5, Insightful)
Isn't there a reason, though? Presumably, using javascript/ajax, you don't need to send/receive as much information (i.e., reload the ENTIRE page) at a time. With plain HTML, you would have to receive a copy of the entire page again... ?
I see no reason why it should take less in normal HTML. Any explanations why you think so?
Re: (Score:3, Informative)
I see no reason why it should take less in normal HTML. Any explanations why you think so?
You're reading it wrong. GP said:
There's no reason whatsoever that it should use more bandwidth to send an email (once the system has loaded) via that interface than through the simple HTML view; in fact, it should take less.
Therefore, you are making the same argument as the GP, but with less reading comprehension.
Re:lol whut? (Score:5, Interesting)
'I even remember one early commentator saying that text-only web pages were actually *better* for people on 14.4k baud modems.'
As I recall (Get Off My Lawn, etc.), if you were on a slow connection the web pretty much became a text-only medium initially. I used Lynx rather a lot back then (for speed), while Mosaic tended to be a rather frustrating experience. One of the cool new features that got everyone excited about one of the early versions of Netscape was its ability to show you the text (and of course the active clickable links to other pages) without having to wait for every single image on the page to load (assuming you had image loading turned on at all). Suddenly the web started to look like a usable medium rather than an over-ambitious experiment crippled by slow networks and unresponsive software.
Re: (Score:3, Insightful)
Sounds good to me; gopher don't do Flash, right?
Also, with lower bandwidth requirements, hosting would be cheaper, so banners wouldn't have been a necessity to support a website.
Re: (Score:3, Insightful)
They are.
I still browse with images turned off if I am on a slow GSM connection.
Re: (Score:2, Funny)
We only pray that whoever invented the tag was executed by firing squad.
Re: (Score:2)
Are you kidding? The web was a huge pain. It was slow on old 14.4k SLIP connections, and it was an outright shitty protocol to download with (and still is, really). By comparison, Gopher was quick. I remember discovering Project Gutenberg via my first ISP's bookmark page. I was running OS/2 Warp, which came with a functional gopher client.
I sometimes think the Web was the worst thing to happen to the Internet. Maybe, without it, it would have taken a few more years to become the Big Thing, but imag
Re:lol whut? (Score:5, Funny)
Perhaps, you worthless piece of crap, you should have read my post. But being that you're too fucking stupid to probably even breathe without your genetically-diseased mother popping her head through the door shouting "Asswipe, inhale!", I guess I can forgive you your cretinism and illiteracy.
But please, quit trying to hump your dog. She's a he, and has been dead for a couple of years. I know, with your puny mind, it's hard to fathom necrophilic bestiality being wrong, but somewhere in the slack-jawed, low-browed head of yours there must be some small glimmer of morality.
Re:lol whut? (Score:4, Insightful)
You got a way with words, MightyMartian. I'll give you that. Not many words, but the ones you have you use to great effect.
Re:lol whut? (Score:5, Informative)
Back in the mid-1990s, the most economical internet connection for small companies was a 64Kb ISDN link billed by the kilobyte, with a local university as the ISP. As most conference announcements were broadcast on USENET, and the store-and-forward service was so slow, it was fairly common to have the conference and then receive the invitation three days later. If you wanted to download a file, ftp was so likely to fail due to ISDN congestion that you would be forced to use a uuencode-by-email service: you emailed a message with the ftp path you wanted to the server, and it would download the file, chop it up, and uuencode it back to you in lots of little pieces.
Otherwise, home users had the choice of a 14.4 kilobaud modem; some ISPs, like Demon Internet, built their own DOS-window-based application to manage email/USENET postings. You could download the headers first, then pick out which full postings you wanted to download. Even then, with a PC you were still cramped for space with 40/80 Megabyte hard disk drives. One high-resolution image from SGI could take up more disk space than you had on your PC.
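For anyone who never had to use one, the chop-and-uuencode step those servers performed looks roughly like this (a simplified sketch; the real services also numbered the parts and wrapped them in mail headers, and the part size here is arbitrary):

```python
import binascii

# Sketch of chopping a file into mail-sized pieces and uuencoding each one.
PART_SIZE = 32 * 1024  # arbitrary; real services used whatever fit the mail gateway

def uuencode(data, name):
    lines = ["begin 644 %s" % name]
    for i in range(0, len(data), 45):  # classic uuencode works 45 bytes per line
        lines.append(binascii.b2a_uu(data[i:i + 45]).decode("ascii").rstrip("\n"))
    lines += ["`", "end"]
    return "\n".join(lines)

def split_and_encode(path):
    data = open(path, "rb").read()
    parts = [data[i:i + PART_SIZE] for i in range(0, len(data), PART_SIZE)]
    # Each element would have been mailed back as a separate message.
    return [uuencode(chunk, "%s.part%03d" % (path, n + 1))
            for n, chunk in enumerate(parts)]
```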
Re:lol whut? (Score:5, Interesting)
The uudecode-by-email services were useful when my company had a UUCP connection via a nearby City Council (which was the only provider offering such a service to businesses at the time). We didn't have enough bandwidth for the binary newsgroups (9600 modem, IIRC), so that was the only way to get files without waiting for them to be shipped on CDROM, at which point we could go to the one machine that had a CDROM reader and copy them off. ISDN wasn't available at our exchange, so our first permanent connection was a 9600 leased line to the local University. At that point, direct ftp became a more attractive option than uudecode - which often took days due to throttling by the ftp-by-mail systems to control load - and we quickly learnt about ftp reget and passive transfers to local servers to save international bandwidth costs (we were billed $4/MB for international traffic, but local traffic was free).
One of the things I downloaded was a graphical gopher program for OS/2. That was great: a much better interface than FTP, even than the graphical FTP clients that had started to appear by then, and it supported linking to images and binaries. But without support for ftp's reget, I didn't see the use for anything other than text. One of my co-workers showed me this great new program he'd downloaded for the FreeBSD box that was serving as our internet gateway: lynx. Comparing it to the graphical gopher client, I found it unusable - links were scattered throughout the text instead of in a nice menu at the end like gopher - and I predicted that this new http protocol would quietly die out along with the other little-known new protocols of the time. A few months later, someone downloaded a beta version of Mosaic. Now the web started to look worthwhile, and within weeks the co-worker who had introduced me to lynx and I came in on a weekend to replace the FreeBSD box with a new 486 running Slackware and the CERN web server.
Re: (Score:2)
Re: (Score:2)
I bought my first PC around 1989 (3000 pounds for a 20MHz 386 Dell System 310 with a 20 Megabyte hard drive). Got a second, 40 Megabyte hard disk drive two years later. Added several new graphics cards until Intel introduced the VESA bus to kill off graphics accelerators. By the mid-1990s, the PCs were 486s with 100+ Megabyte hard disk drives, but modems were still 14.4K.
Re: (Score:2)
I had an 80mb drive in '95. Granted, it was old and later in the year I was able to upgrade to a full gigabyte (which wasn't cheap).
Re: (Score:2)
There were 1 Gigabyte drives and larger for servers, but those were on the order of tens of thousands of dollars/pounds. In 1991, a Sun 3/180 or a mainframe would have a couple of Gigabytes, but for a desktop system (Dell 310 era), a 1 Gigabyte drive would cost more than the PC itself.
Re: (Score:2)
It would be paradise for the BBS geek with his VGA monitor and 14K modem.
Re: (Score:2)
TradeWars 2002, woot!
Re: (Score:2)
The 'Web is a mass of hypertext page servers that operate ON TOP of the Internet. It's called the World Wide Web because the pages have links that refer to other sites... a web of links.
DNS just translates names like "slashdot.org" into IP addresses a little bit like a phone directory.