How Moore's Law Saved Us From the Gopher Web

Urchin writes "In the early 1990s, the World Wide Web was a power-hungry monster unpopular with network administrators, says Robert Topolski, chief technologist of the Open Technology Initiative. They preferred the sleek, text-only Gopher protocol. Had they been able to use data-filtering technology to prioritize Gopher traffic, Topolski thinks, the World Wide Web might not have survived. But it took computers another decade or so to become powerful enough to give administrators that option, and by that time the Web was already enormously popular." My geek imagination is now all atwitter imagining an alternate, gopher-driven universe.
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Friday March 13, 2009 @05:30PM (#27187393)

    I'm pressing ESC twice to access this damn BBS.

  • by relikx ( 1266746 ) on Friday March 13, 2009 @05:30PM (#27187397)
    or try their hardest at least.
  • Uh, no. (Score:5, Insightful)

    by AKAImBatman ( 238306 ) * <<moc.liamg> <ta> <namtabmiaka>> on Friday March 13, 2009 @05:30PM (#27187399) Homepage Journal

    Even if the Web had been stunted by throttling, the demand for multimedia content would have eventually driven the rise of the Web or at least a super-Gopher.

    • Re:Uh, no. (Score:5, Insightful)

      by hey! ( 33014 ) on Friday March 13, 2009 @05:56PM (#27187713) Homepage Journal

      Well, multimedia is an orthogonal concern, really. If anything, in the early days gopher was more convenient for multimedia than the web.

      The thing about the web, the defining characteristic from the point of view of providers of information, was HTML. And HTML was a pain. It still is, but since we assume it's necessary we don't think of it as a pain. Back in the day, it was much easier to dump all your stuff into gopher, including your multimedia files, than it was to write a whole new bunch of HTML from scratch.

      HTML was also pretty far from what people eventually wanted the web to do, which was to be an app platform. A lot of fancy architectin' has gone on to get it where it is today, and people are still screwing around with stuff like Flash.

      The thing about the PITA of HTML is that it forced people to redo so much of their content into a uniform format, and what's more, a format that could be spidered by robots. That's the secret sauce. Yeah, it's nice that people can follow hyperlinks, but it's the ability to deal with basically one kind of data (marked-up docs with hyperlinks in them) that really made the web powerful.

      Another thing was that while early HTML wasn't very much like what people wanted for their documents, and despite abortive early attempts to add things like fonts (not to mention our beloved blink tag), HTML's SGML roots gave it architectural flexibility. It needed that flexibility so that the missing 99% of what people really wanted could be added later without turning it into a hopeless mess.

      • Re:Uh, no. (Score:5, Insightful)

        by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Friday March 13, 2009 @06:15PM (#27187925) Homepage Journal

        The thing about the PITA of HTML is that it forced people to redo so much of their content into a uniform format, and what's more, a format that could be spidered by robots. That's the secret sauce. Yeah, it's nice that people can follow hyperlinks, but it's the ability to deal with basically one kind of data (marked-up docs with hyperlinks in them) that really made the web powerful.

        The thing about HTML is that it really didn't force anyone to do anything. A web server will serve plain text files just fine, and so long as everyone's MIME types are good, your browser will display them. Another non-secret secret of the web is that it doesn't require an HTTP server. You can serve a web site just fine (albeit a little slowly and without dynamic content) via FTP. Finally, I took a bunch of drinking game content and put up a drinking game website by just writing a CGI (this was back in the early nineties) to write a header, insert a PRE tag, include the text file, insert a closing PRE tag, and write a footer (a rough sketch of that kind of wrapper appears after this comment). Careful examination of this description will reveal that I did not actually have to do anything to my text files. In addition, text files can be spidered just fine. HTML renders down to text when done correctly (or it doesn't spider), and text is already text.

        Got any other erroneous information to share?
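
        What follows is a rough sketch, in modern Python rather than whatever the original early-nineties CGI was written in, of the kind of wrapper described above: emit an HTML shell around an untouched text file. The file name and page title are made up for illustration.

        #!/usr/bin/env python3
        # Sketch of a "wrap a text file in PRE tags" CGI. The file name and title
        # are hypothetical; escaping is a modern nicety, the original approach
        # just included the file verbatim.
        import html
        from pathlib import Path

        TEXT_FILE = Path("drinking_games.txt")  # hypothetical content file

        def main():
            body = TEXT_FILE.read_text(encoding="utf-8", errors="replace")
            print("Content-Type: text/html")  # CGI response header
            print()                           # blank line ends the headers
            print("<html><head><title>Drinking Games</title></head><body>")
            print("<pre>")
            print(html.escape(body))          # the text file itself, otherwise unmodified
            print("</pre>")
            print("</body></html>")

        if __name__ == "__main__":
            main()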

        • by hey! ( 33014 )

          A web site made up of text files would be pretty dysfunctional.

          • Ever used GameFAQs? All of their written FAQs are text files. They just use PHP to shove them into pre tags so they can put some links at the top. A few years ago they didn't even do that, though, and just linked directly to the txt files.
          • A web site made up of text files would be pretty dysfunctional.

            In other words, a typical web site?

        • by ODBOL ( 197239 ) on Friday March 13, 2009 @07:49PM (#27188909) Homepage
          I founded one of the early online journals before the invention of HTML/HTTP. It's the Chicago Journal of Theoretical Computer Science, providing articles in copy-edited LaTeX source, as well as precompiled PS and PDF.

          At first, the journal served papers through anonymous FTP.

          Then, I crafted a Gopher structure to make browsing easier.

          As soon as HTML/HTTP came along, I created the HTML version of the journal. It was much more maintainable than the Gopher version, because the hyperlinks decoupled the document structure from the file-system tree structure just enough. In a few years, I stopped maintaining the Gopher version, because it required an order of magnitude more work than the HTML, and readers all preferred the HTML anyway.

          Adding pictures and stuff is rather trivial for the data architecture, although demanding for the network implementation. With a more maintainable structure, Gopher would have added the extras. It was the hyperlinks that made HTML work better.

          HTML also has some serious maintenance problems, but they appear later when the archive gets large, and they can be addressed with things like PHP compiling and content management systems.

          From another point of view: Gopher essentially made file trees visible over the network (which is what I thought I wanted at first), while HTML/HTTP provides a crude network database model distributed over the network. (A sketch of the Gopher menu mechanics appears below.)

          Future advances in data architecture (as opposed to the types of data within that architecture) will have to do with other database models, and with other sorts of commitments between distributed servers, and with looser coupling between data ownership and server ownership. E.g., a way to provide reasonable assurance of future access to a particular data item (access includes being able to find it, not just its existence), without depending on a particular server at a particular registered domain name (the Wayback machine ameliorates the problem, but doesn't solve it).
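
          As a minimal sketch of the menu mechanics mentioned above (assuming the plain RFC 1436 protocol and a hypothetical host name): the client sends a selector, and the server answers with tab-separated menu lines, essentially a directory listing exposed over the network.

          # Minimal Gopher menu fetch, sketched from RFC 1436. The host below is
          # hypothetical; error handling and non-menu item types are ignored.
          import socket

          def fetch_menu(host, selector="", port=70):
              with socket.create_connection((host, port), timeout=10) as sock:
                  sock.sendall(selector.encode("ascii") + b"\r\n")
                  data = b""
                  while chunk := sock.recv(4096):
                      data += chunk
              items = []
              for line in data.decode("latin-1").splitlines():
                  if line in (".", ""):  # a lone "." terminates the menu
                      continue
                  item_type, rest = line[0], line[1:]
                  fields = rest.split("\t")
                  if len(fields) >= 4:  # display string, selector, host, port
                      items.append((item_type, *fields[:4]))
              return items

          for item in fetch_menu("gopher.example.org"):
              print(item)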
    • Re: (Score:3, Funny)

      by craash420 ( 884493 )

      ...or at least a super-Gopher.

      I, for one, welcome our new underground overlords.

      Sorry, the thought of super-Gophers scares me more than cloned dogs, or Africanized bees, or cloned dogs with Africanized bees in their mouths so when they bark they shoot bees at you.

      • I, for one, welcome our new underground overlords. Sorry, the thought of super-Gophers scares me more than cloned dogs, or Africanized bees, or cloned dogs with Africanized bees in their mouths so when they bark they shoot bees at you.

        Best string of Simpsons references. Ever.

    • Actually, the Web would have been nicer too if the images were optional links to follow instead of being shown inline by default. The early web was very inefficient, and you could spend a lot of time staring at a picture slowly being drawn when all you really wanted to see was the text.

      Gopher was also much more organized in many ways. The free-form HTML resulted in most sites being a jumble. On the other hand, the free-form nature also spurred a lot of experimentation and growth.

      Even today I find most we

    • ...or at least a super-Gopher.

      I think I saw that guy in Caddyshack...

  • by bonch ( 38532 ) on Friday March 13, 2009 @05:33PM (#27187443)

    Even if Gopher had dominated due to filtering (a premise I don't agree with), multimedia capabilities would have eventually been added to the protocol out of demand. We'd have the same web we have today.

    • by DragonWriter ( 970822 ) on Friday March 13, 2009 @05:38PM (#27187515)

      Even if Gopher had dominated due to filtering (a premise I don't agree with), multimedia capabilities would have eventually been added to the protocol out of demand. We'd have the same web we have today.

      Eventually, maybe, but exposure drives demand; if it had stalled long enough for, say, cable and phone companies to deliver substantial non-free interactive multimedia outside the context of the web first, it's very likely that nothing socially like the current web would have existed any time near now, even if many of the individual features that are important about the web were available in one form or another on some networked electronic system that was widely available elsewhere.

      • Re: (Score:3, Insightful)

        by _Sprocket_ ( 42527 )

        Eventually, maybe, but exposure drives demand; if it had stalled long enough for, say, cable and phone companies to deliver substantial non-free interactive multimedia outside the context of the web first, it's very likely that nothing socially like the current web would have existed any time near now, even if many of the individual features that are important about the web were available in one form or another on some networked electronic system that was widely available elsewhere.

        You have plenty of proprietary network examples: CompuServe, GEnie, Prodigy, Sierra Network, AOL. Some are certainly more multi-media than others. But the common issue is that they were all their own digital islands. That worked well for decades, until the Internet consumed public consciousness (and AOL launched the September that never ended [wikipedia.org]).

        The power of the 'web isn't in multi-media delivery. That's not to say it isn't important. But there is a more fundamental feature: ubiquity. For all the features

        • by DragonWriter ( 970822 ) on Friday March 13, 2009 @06:36PM (#27188205)

          The power of the 'web isn't in multi-media delivery. That's not to say it isn't important. But there is a more fundamental feature: ubiquity.

          Ubiquity, a feature of the internet, was a consequence of multimedia, a feature of the web -- almost anyone could get access to the internet for many years before the web was popular (I remember first looking into local ISP options in ~1991). Comparatively few people did until the web was popular, because there was no appeal to most people. The internet, which had been around for quite some time, became omnipresent because it offered something which rapidly drew wide interest, and that was the multimedia offered by the web.

          • by _Sprocket_ ( 42527 ) on Friday March 13, 2009 @06:57PM (#27188437)

            The internet, which had been around for quite some time, became omnipresent because it offered something which rapidly drew wide interest, and that was the multimedia offered by the web.

            Not at all. Email was the killer app. And that wasn't multi-media.

            I remember trying to get an Internet connection in '91. It wasn't to be had where I was. I had to "borrow" a link from the local university. I got involved with an outfit opening up an ISP in the area. And while firing up Netscape got folks really happy, it was email that got the subscriptions. Folks wanted to be able to email their kids off at college. We were in a military town with a base that was on a constant deployment schedule (myself included). Military families bought subscriptions as soon as they realized email was (almost) instant compared to the two weeks it took for snail mail to make it across the pond and into sandland.

            Now, to be sure, for me... the 'web was a killer app as well. I remember being all giddy over clicking a link that had a .au in its URL (and not paying LD charges). This was the realization of Clarke's 2010. And then I was pulling up images of all manner of content - from magazines to hobbies to... well... other interests.

            But all of this would be window dressing if it wasn't for the fact that I can email anyone no matter what service provider they use. And when I want to bring up Megacorp Hobby's web page to order supplies to do a project I read about on some enthusiast's private underwater basket weaving fan site... I don't have to worry about the provider then either.

            The underpinning of all this is ubiquity. I had a lot of these features during the years I used CompuServe et al. And services like Sierra Network were pushing the graphics/multi-media angle. But none of them hooked me up with a fan site in Australia.

    • I agree; I don't think gopher would have become popular as just a text protocol. The internet might have stagnated waiting for multimedia to be shoehorned into gopher. People seem to like the pictures, movies, motion and all the other bling, and preventing all that probably would have stunted the internet's popularity. If people didn't really want the pictures, they could just stick with Lynx, and that really doesn't seem to be retaining any traction.

    • Re: (Score:3, Insightful)

      by gad_zuki! ( 70830 )

      Probably. We already had things like CompuServe, Prodigy, BBSes, FidoNet, email, Minitel, etc., but it wasn't until Joe Sixpack could see photos, play music, and click with a mouse that it took off in the market. The command line, memorizing keyboard commands, etc. is a real barrier to entry. A lot of FOSS people don't understand that.

      It's equally likely, if not more so, that someone would have just invented something web-like and leapfrogged over gopher, like TBL did at CERN.

      Not to mention PCs having multimedia capabilities was

  • Gopher was great (Score:3, Interesting)

    by ta bu shi da yu ( 687699 ) on Friday March 13, 2009 @05:36PM (#27187491) Homepage

    If Gopher had won, we would have had more of a focus on content than on presentation. I hardly think this is a bad thing.

    • Re: (Score:3, Insightful)

      by hey! ( 33014 )

      I'm not sure that's right.

      The thing is, it's possible to architecturally separate presentation from content from metadata in HTML. Furthermore, people do care about presentation. Who are we to say they shouldn't? The problem is confusing the two.

      Here's what I see wrong with the puritanical belief that outlawing presentation hanky-panky will keep the flock virtuously focused on content: people will cheat. When they think they can get away with it, they'll enthusiastically engage in all manner of abominat

    • Re:Gopher was great (Score:4, Interesting)

      by Eravnrekaree ( 467752 ) on Friday March 13, 2009 @06:24PM (#27188059)

      Who ever said graphics on a web page are not content? Who ever said that the beautiful, graphics-intensive web pages of today are not a form of art? Is text the only form of content? No! Is telling an artist they can only use a pencil and are not allowed to use any colours at all in their work a reasonable limitation on an artist? No. Using colour, paint and so on gives you more capability, which allows you to create even more exquisite content. The greater graphics capability of Flash, and hopefully soon of open-spec web environment equivalents, allows one to portray and create art not possible with text.

  • Oh, not the little, brown, furry rodents.
  • by alienunknown ( 1279178 ) on Friday March 13, 2009 @05:41PM (#27187567)
    But I really prefer Badger [badgerbadgerbadger.com] over Gopher.

    That's what is really stopping me from getting an iPhone, because I can't access badger-net.

    • Re:I loved Gopher (Score:5, Informative)

      by Dwedit ( 232252 ) on Friday March 13, 2009 @06:04PM (#27187813) Homepage

      Badgerbadgerbadger.com is not connected with the creator of the flash movie; it is just some guy trying to profit off the meme. Stick with linking to the original authors, not the leeches.

      • Re: (Score:3, Informative)

        Badgerbadgerbadger.com is not connected with the creator of the flash movie; it is just some guy trying to profit off the meme. Stick with linking to the original authors, not the leeches.

        I did a quick search for the badger flash vid before posting, and just took the first link I could find. I thought that was the original site at first. I hadn't seen the flash video in years so I didn't know the original URL.

        The original is Here [weebls-stuff.com]

      • Re: (Score:3, Informative)

        by znerk ( 1162519 )

        Badgerbadgerbadger.com is not connected with the creator of the flash movie; it is just some guy trying to profit off the meme. Stick with linking to the original authors, not the leeches.

        Except that badgerbadgerbadger.com's little flash movie has a link in the bottom right-hand corner of it, pointing to www.weebls-stuff.com [weebls-stuff.com] - the aforementioned original author.

    • When history looks back at the first generation to extensively use the World Wide Web, it will sigh a collective "WTF?"

  • Irritation (Score:5, Insightful)

    by girlintraining ( 1395911 ) on Friday March 13, 2009 @05:45PM (#27187591)

    People think that if Person X hadn't been around we might not have Technology Y. Okay, this is based on the idea that somehow Person X has some unique ability and only Person X can create Technology Y. Hate to break it to you, but you're not special. Neither is Person X. Second, the reason we have Technology Y is that we needed it. If those needs haven't gone away, then the pressure to fill that void remains -- and somebody else will come along and fill it eventually. Now you're right that maybe Betamax might have beaten VHS if not for a disturbance in the force, or it would have been HD DVD instead of Blu-ray, or whatever... but we'd still have high-density optical media. Gopher would have died simply because it didn't meet the needs of the population. Maybe it wouldn't be HTTP that replaced it five or ten years later, but something like it would have been created.

    • Re:Irritation (Score:5, Interesting)

      by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Friday March 13, 2009 @05:57PM (#27187725) Homepage Journal

      Hate to break it to you, but you're not special. Neither is Person X.

      That is a crock of shit. I mean, I may not be special, but certain persons who helped shape the future of science (including computing) are. There is no denying the "specialness" of people like Nikola Tesla or Albert Einstein. Why, then, should you deny the specialness of someone who is arguably less special than they are, but more special than you are? Simple jealousy? History is chock-full of examples of people whose unique way of thinking changed the shape of our world, the canonical example being Newton. He saw things in a way that others did not, and he advanced science dramatically. Maybe Tim Berners-Lee is no Einstein or Newton or Tesla, but he is certainly an individual with unique thought and influence.

      In any case, the argument here is actually that if we didn't have the processing power to do multimedia, we would have had a dramatic population increase in gopherspace rather than exponential growth of the WWW. The only part of the argument that is stupid is that people were already serving images over gopher; you needed an external viewer, of course. But sooner or later, someone would have come up with a multimedia markup extension for gopher, and then gopher would have been the WWW, just with a different protocol.

      • I think he meant "special" in terms of "no X, no Y, ever." Unique as in completely unique with no equal anywhere in the past or future (or present, hehe). Hence his usage of "unique" ...
      • There is no denying the "specialness" of people like Nikola Tesla or Albert Einstein. Why, then, should you deny the specialness of someone who is arguably less special than they are, but more special than you are? Simple jealousy?

        No. Success depends on a lot more than just a person's innate "specialness". Or I can be more blunt: They just happened to be in the right place, at the right time, and had what was needed. There have been hundreds of failed Nikola Teslas -- I could manufacture him on an assembly line and sprinkle copies throughout society and I'd be unlikely to reproduce what the original did, simply because the environmental factors would be lacking. I'm sorry, because I know everyone wants to feel special and unique, tha

        • Re: (Score:3, Insightful)

          Success depends on a lot more than just a person's innate "specialness".

          Sure, so what? That it depends on more than a person's "specialness" does not refute that "specialness" matters.

          Or I can be more blunt: They just happened to be in the right place, at the right time, and had what was needed.

          Yes, and to be equally blunt, hundreds of millions or billions of other people were around at the right time, very many of them at the same right place or one equally right; what was key is the "had what was needed"

      • Re: (Score:3, Insightful)

        by mrchaotica ( 681592 ) *

        History is chock-full of examples of people whose unique way of thinking changed the shape of our world, the canonical example being Newton. He saw things in a way that others did not, and he advanced science dramatically.

        And yet Leibniz invented calculus too, independently and at about the same time. Methinks you need a better example.

        • And yet Leibniz invented calculus too, independently and at about the same time. Methinks you need a better example.

          From what I understand, they invented two different types of calculus. Also, there was some reason to suspect that Leibniz may have gotten the idea from Newton, and they had quite a fight about it for many years.

          • Re: (Score:3, Informative)

            The differences between the fluxional (Newton) and differential (Leibniz) calculus are minor, and they both depend on two geometric ideas which go back to at least Archimedes (method of exhaustion) and Descartes (analytic geometry).

            I would also point out a famous quote of Newton in this context: "If I have seen further it is only by standing on the shoulders of Giants."

            The history of mathematics is actually full of examples of parallel discoveries and rediscoveries. Mathematics is not the only science

            • Since everything in mathematics can be derived from the base definitions, isn't everything bound to be rediscovered given enough time and effort?

              I know I've rediscovered tons of mathematics by accident since I needed it for something only to later discover someone else had already done it more formally a few hundred years ago.

              • Don't forget that definitions are subject to change over time, and are often the result of formalizing an idea which has already proven its worth in the past.

                You could also argue that when confronted with a problem whose solution is unique, anybody who solves it will necessarily obtain the same solution. However, that still leaves the question of why independent minds try to solve essentially the same problem, *before* the problem becomes generally recognized or famous.

                Gauss, Lobachevsky and Bolyai inve

          • Re:Irritation (Score:5, Insightful)

            by Geezle2 ( 541502 ) on Friday March 13, 2009 @08:55PM (#27189415)
            Newton/Leibniz and calculus is actually one of the very best arguments for why the individual is unimportant in the larger historical scheme of things. The same technological synchronicity can be seen in a number of inventions and developments, such as atomic energy/bombs, airplanes, telephone, computers, etc.

            The thing is, when the precursors to a new technology get developed, the new technology becomes apparent to growing numbers of people until someone develops it.

            I remember experimenting with GUIs in my local Atari Computer Club back in 1981 (using light pens instead of mice... light pens were easier to fabricate). People nowadays like to think that Bill Gates invented the Graphical User Interface. Slightly more savvy individuals think it was one of the Steves (Jobs/Woz) who did it. Marginally less clueless folks might think it was Xerox, or IBM, or whoever. The fact is that as soon as cheap consumer-grade computers hit the market (mid-to-late 1970s), GUI-controlled operating systems became inevitable. If there was a gang of people in my little backwater town working on the issue, there must have been thousands upon thousands of people experimenting with GUI controls nationwide.

            Finally, compared to the technologies upon which it relies (with regard to the Internet), HTML, and by extension the Web, is trivial. The ONLY tag in HTML that really matters is the link anchor, and this itself had precedents. How long would it really have taken for people to start including the gopher addresses of referenced documents in the documents they posted on gopher? How long before gopher browsers that could retrieve and display documents encoded in standard formats became available? Anyone who remembers using the gopher browser that shipped with early versions of OS/2 knows that gopher could have done the duty of http, given its absence... particularly as a standard document format would eventually have developed to ease spider indexing.

            Really, folks, there were a lot of us working on this stuff back then. What we have now is a crude compromise (with Flash cancer), but that we would have a graphically navigable network of documents spanning the globe was never in doubt.

        • Re: (Score:3, Interesting)

          by znerk ( 1162519 )

          And yet Leibniz invented calculus too, independently and at about the same time. Methinks you need a better example.

          Or you need to learn about Newton's theory of gravitation, which fits the GP's point much better:
          "He saw things in a way that others did not, and he advanced science dramatically."

          Calculus is math, which is admittedly a large part of science... but I believe the GP's point was that figuring out that things fall down because "down" is relative to the large, relatively stationary object we stand on was probably completely inconsistent with the then-current accepted "truths". Think different, ya dig?

          • Or you need to learn about Newton's theory of gravitation, which fits the GP's point much better

            If Newton hadn't thought of it, Bernoulli or Maxwell or somebody would have instead.

      • Unfortunately, both of your examples could easily have been replaced by someone else, around the same time they did it. Both Tesla and Einstein, for all their work, which I appreciate, were not the only people doing what they did. They did not do all of that work by themselves; they built on work done before them.

        I'm sure there is something somewhere that was unique to a particular person that came up with the idea, but rarely is it ever the person credited with it, and I doubt anyone short of the fir

      • Actually, if you look closer at history you'll notice that most things were invented by several people around the same time but only one of them got credit. While the inventors were all above the norm, they weren't unique.

      • The premise that "some people are geniuses, and this special-ness is a necessary requirement for certain theories to be discovered" isn't quite as self-evident as you make it out to be.

        In Pratchett's "The Science of Discworld 3: Darwin's Watch", there is a chapter that deals specifically with this issue. The author argues that there is a "time" when the mass of previous evidence and research will result in the theory being "created". See this guy's summary:

        "They also show how it's not enough to have an idea

    • but you're not special.

      Sometimes, but sometimes not. Given a set of tools and a problem to solve, I'm sure there will be duplication in the different ways that they use the tools to solve the problem. But every so often you do get one person who thinks a little differently and does it in a unique way.

      After they have done it in their unique way, it seems obvious that things can be done that way. But they had to think it up first. So sometimes people are special and things would be different if they had

    • There are two concepts here. The first is the uniqueness of the individual. The second is the idea whose time has come.

      I agree that history shows that there are certain ideas whose time has come. There are examples of certain revolutionary changes being worked on from different angles. And with hindsight, we can see that the change was only a matter of time.

      But saying anyone could have brought about these changes sounds an awful lot like armchair quarterbacking; "yeah - I could have done that." There

    • Re: (Score:2, Funny)

      by wzzzzrd ( 886091 )
      No. If I hadn't scratched my arse at that one time in the 90s, the default color of hyperlinks would be green.
  • I remember Gopher. Used it a bit, when I first got online. The WWW was to Gopher as Web 2.0 is to the WWW. Really. The web was a natural progression of improvement from Gopher. It just wasn't called Gopher 2.0, much like Windows 95 wasn't called Windows 4.0. It was a new version, and somebody thought it would be good to give it a new name.

  • by Anonymous Coward on Friday March 13, 2009 @05:53PM (#27187679)

    "In the early 1990s, the World Wide Web was a power-hungry monster unpopular with network administrators"

    As I write this, Firefox is using 300 MB of RAM and 100% of one core, so not much has changed since then.

  • If gopher had won, we would have ended up extending it to be able to embed references to FTP- or TFTP-hosted files, and TFTP would have been an important part of the internet instead of a rarely used protocol.

    The only difference users would see would be that the text of a page would load first and URLs would all start with gopher:// [gopher]
    • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Friday March 13, 2009 @06:18PM (#27187975) Homepage Journal

      The only difference users would see would be that the text of a page would load first and URLs would all start with gopher:// [gopher]

      The only reason the text of a page doesn't load first today is that web browsers are badly behaved. Firefox will often refuse to render a page until it gets all the content. That's not the most aggravating thing about it, though; if a connection is reset, then Firefox now shows you a page saying that it was reset, instead of the page content that it DID successfully manage to download. I don't know who's responsible for this "feature" but it's fucking stupid. It made the web mostly unusable when I was on a modem, because I'd be happily reading a page, some ad would fail to load, and then Firefox would tell me the page had failed to load. Whoever made that decision should definitely be asked to justify it, or asked to fuck off immediately.

      • by BitZtream ( 692029 ) on Friday March 13, 2009 @08:56PM (#27189427)

        I understand your complaint. But to give rendering engine developers some credit, if you really understood the complexities of rendering HTML properly, you'd understand why they stopped trying to do partial rendering a long time ago; it's just not worth the effort at this point.

        Can it be done? Of course. Is it worth it? Meh. Considering most of the Internet is pretty reliable, the number of times partial rendering would help doesn't really justify diverting that effort from other, more important aspects of rendering.

        • Re: (Score:3, Informative)

          by drinkypoo ( 153816 )

          I understand your complaint. But to give rendering engine developers some credit, if you really understood the complexities of rendering html properly, you'd understand why they stopped trying to do partial rendering a long time ago, its just not worth the effort at this point.

          No, you DON'T understand my complaint. Firefox will have rendered the page, right? Then a SINGLE PAGE ELEMENT fails to load, or perhaps the site fails to close the TCP connection properly, and now Firefox says "The connection was reset while the page was loading" and the page I was just successfully reading disappears.

          If you read and understood my comment this time, now you understand my complaint.

    • I don't see TFTP as ever having been an important part of the internet over long-haul connections. TFTP would have been what it was intended to be and is: a very straightforward protocol that can be implemented with an incredibly tiny footprint and little risk of getting it wrong. Notably:
      -TCP is *much* better at reliable communication without penalty. TFTP is intentionally dumb: send a block, ack a block, send a block, ack a block (see the sketch below). Again, easy to write, horrible performance. In TCP we have adjusting win
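
      As a minimal sketch of that lockstep pattern (assuming plain RFC 1350, with a hypothetical host and file name; retries, errors and block-number wraparound are ignored), one DATA block arrives and one ACK goes back before the server will send the next block:

      # Lockstep TFTP read (RFC 1350), sketched to show the block/ack pattern.
      # Host and file name are hypothetical; no retransmission or error handling.
      import socket
      import struct

      def tftp_get(host, filename, port=69):
          sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
          sock.settimeout(5)
          # RRQ packet: opcode 1, filename, NUL, mode, NUL
          sock.sendto(struct.pack("!H", 1) + filename.encode() + b"\0octet\0",
                      (host, port))
          data, expected = b"", 1
          while True:
              packet, addr = sock.recvfrom(4 + 512)
              opcode, block = struct.unpack("!HH", packet[:4])
              if opcode != 3 or block != expected:  # 3 = DATA
                  break
              data += packet[4:]
              # One ACK per 512-byte block: this is the lockstep described above.
              sock.sendto(struct.pack("!HH", 4, block), addr)  # 4 = ACK
              if len(packet) - 4 < 512:  # a short block ends the transfer
                  break
              expected += 1
          return data

      print(len(tftp_get("tftp.example.org", "boot.img")))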

  • by LaminatorX ( 410794 ) <sabotage&praecantator,com> on Friday March 13, 2009 @06:03PM (#27187789) Homepage

    Every time I have to sit through a bunch of crappy Flash or out-of-control JavaScript, I find myself wishing I could get a decent gopher feed.

  • Early-'90s computers... a 486DX2? A Pentium 90? That's enough to route much more traffic than any of the nodes at that time could even conceive of, and can do QoS to boot.

    Doesn't Gopher run on port 70, making it easy to prioritize over port 80 traffic? It would seem (although I could be completely wrong) that the biggest holdback wasn't hardware, just that QoS hadn't really been brought to fruition in time.
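
    As a toy illustration of the port-based prioritization idea above (this is a thought-experiment sketch of strict priority queueing, not how any real router implements QoS; the packets are made up), traffic to port 70 is always dequeued before traffic to port 80:

    # Toy strict-priority scheduler: port 70 (Gopher) packets always leave the
    # queue before port 80 (HTTP) packets. Purely illustrative.
    from collections import deque

    class PriorityByPort:
        def __init__(self, preferred_port=70):
            self.preferred_port = preferred_port
            self.fast, self.slow = deque(), deque()

        def enqueue(self, packet):
            # packet is a (dst_port, payload) tuple in this toy model
            (self.fast if packet[0] == self.preferred_port else self.slow).append(packet)

        def dequeue(self):
            if self.fast:
                return self.fast.popleft()
            return self.slow.popleft() if self.slow else None

    q = PriorityByPort()
    q.enqueue((80, b"GET / HTTP/1.0"))
    q.enqueue((70, b"/\r\n"))
    print(q.dequeue())  # the Gopher request comes out first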

  • I doubt gopher would have met the needs of the internet as well as the web has, or that it would have been sufficient. The combination of HTTP and HTML has proven enormously successful. Gopher would have needed some major work to make it as flexible as HTML. The web would probably have replaced Gopher in any case; the design of the web is more practical and better thought out than gopher's.

  • Some of us don't like where the bloated, CPU- and bandwidth-wasting internet is heading. A world where gopher survived and flourished doesn't sound all that bad.

  • Had they been able to use data-filtering technology to prioritize Gopher traffic, Topolski thinks, the World Wide Web might not have survived.

    So in other words, Net Neutrality saved the web?

  • by Overzeetop ( 214511 ) on Friday March 13, 2009 @06:54PM (#27188411) Journal

    after seeing it on a secretary's desktop at NASA in the early '90s. My comment was very close to "Yeah, but I can already get all that with gopher; I don't think it will take off." Now, in my defense, just six months later I predicted that in a few years you would see panel trucks with web addresses instead of 800 numbers. The couple of people I told that to looked at me like _I_ was crazy. Damn, I wish I had put my retirement savings behind that thought.

  • In 1993, I heard about this "world wide web" thing and tried it. I think it was "www", some really awful command-line tool. Gopher worked far better at the time and was much easier to use. Then Mosaic came out and changed the world. I think you could have done a multimedia gopher along the same lines, though; it maybe wouldn't have been quite as flexible, but as far as bandwidth consumption goes, media is media...

    • by jrumney ( 197329 )
      I recall the same - except I'm pretty sure it was lynx that I tried first. At the time, gopher was still new for me, so the cool thing about it wasn't so much the content (there was still more of that available on ftp) as the fact that you could start at University of Minnesota and spend hours just following the links in the menus to other gopher sites - browsing. With lynx, you had to find the links buried in some text that you weren't really that interested in anyway, whereas with gopher they were always
  • Licensing (Score:5, Interesting)

    by sien ( 35268 ) on Friday March 13, 2009 @08:47PM (#27189375) Homepage

    It's surprising that no one here on Slashdot has pointed out that a major difference between HTML and gopher was that gopher servers had to get a licence [wikipedia.org] from the University of Minnesota, while HTTP servers could be set up without one.

    Free open software with free open standards is what got the web going.

    • Re: (Score:3, Informative)

      by tricorn ( 199664 )

      Only if you wanted to use their code. It was very easy to write a simple Gopher server. I'm not saying that NCSA making their HTTP server code freely available didn't help in the adoption of HTTP/HTML; it certainly was a strong factor, but the U of M gopher server wasn't the only one out there.

  • ...and have access to a pre-OS X MacOS, TurboGopher VR is a must-see. The screenshot at the bottom of this page (http://www.tidbits.com/iskm/iskm3html/pt4/ch24/ch24.html [tidbits.com]) doesn't do it justice.

    Suffice it to say, it is probably the only Gopher client that will ever have a key mapped to a "jump" action that is interpreted literally.

  • by Lulu of the Lotus-Ea ( 3441 ) <mertz@gnosis.cx> on Friday March 13, 2009 @10:29PM (#27189873) Homepage

    The difference between HTTP and Gopher has NOTHING WHATSOEVER to do with the ability to serve multimedia content, nor with bandwidth. HTTP, or really HTML, just allows more diverse linking patterns than Gopher's hierarchical format. But there's nothing non-graphical or content-specific about gopher. I RAN graphical Gopher clients perfectly happily (well, including early Web browsers that supported that protocol).
