The Internet / Technology

Time To Take the Internet Seriously

santosh maharshi passes along an article on Edge by David Gelernter, the man who (according to the introduction) predicted the Web and first described cloud computing; he's also a Unabomber survivor. Gelernter makes 35 predictions and assertions, some brilliant, some dubious. "6. We know that the Internet creates 'information overload,' a problem with two parts: increasing number of information sources and increasing information flow per source. The first part is harder: it's more difficult to understand five people speaking simultaneously than one person talking fast — especially if you can tell the one person to stop temporarily, or go back and repeat. Integrating multiple information sources is crucial to solving information overload. Blogs and other anthology-sites integrate information from many sources. But we won't be able to solve the overload problem until each Internet user can choose for himself what sources to integrate, and can add to this mix the most important source of all: his own personal information — his email and other messages, reminders and documents of all sorts. To accomplish this, we merely need to turn the whole Cybersphere on its side, so that time instead of space is the main axis. ... 14. The structure called a cyberstream or lifestream is better suited to the Internet than a conventional website because it shows information-in-motion, a rushing flow of fresh information instead of a stagnant pool."
  • Serious (Score:5, Funny)

    by shird ( 566377 ) on Sunday March 07, 2010 @10:41PM (#31397016) Homepage Journal

    As we all know, the Internet is serious business.
    http://drunkenachura.files.wordpress.com/2009/07/internet-serious-business.jpg [wordpress.com]

    • Re:Serious (Score:5, Funny)

      by Tynin ( 634655 ) on Sunday March 07, 2010 @10:51PM (#31397096)
      I think the issue is David Gelernter failed to predict how most of the Internet communities talk to each other [imageshack.us]. Not to mention it would require a massive restructuring of the Internet, but given the latest whispers of what ACTA will bring us [temp-e.net], I guess it is more likely than not.
      • Re: (Score:3, Insightful)

        Comment removed based on user account deletion
        • Re: (Score:2, Funny)

          >I've never heard of him before

          You have not been paying attention, and this reflects badly on you.

        • by epine ( 68316 )

          Considering this piece reads like the sleep-talking of a singularitarian

          He runs in those circles:
          Kurzweil Debates Gelernter at MIT [robots.net]
          I'm shocked at how many smart people have a deep intuition that computation can't underlie consciousness, when we have so many formal results (from complexity theory) showing that the limits of computation are inscrutable.

          Users Are Not Reactionary After All [edge.org]
          I thought I would find a soul-mate in Gelernter, since I believe strongly in aggregating *my own* data, but in truth I don't get much out of his ideas. This is what I wrote to myself when I first read that piece:

          Edge question 2010: he made the absurd statement that 99.9% of the technocrats involved in creating the internet will be displaced when the system evolves to operate in a top-down mode. This is extremely insulting, because it implies the technocrats have created the system in the image of their personal limitations, and denies the possibility that we've chosen to work at this level because that's where the action is. If we'd started top down, the internet would never have made it off the ground.

          M

        • Re: (Score:3, Insightful)

          by Gerzel ( 240421 )

          Indeed. Like any good charlatan fortune-teller, the man keeps to the vague and puts many things in the form of questions, so he can claim to have predicted them whichever way they turn out.

          Some of the "Predictions" are really just calls for what he wants in the computing world. One Internet interface? Cloud computing ruling all? He strikes me as the type who can't see the computer world beyond windows or purely business needs. He even sorta looks like Dilbert's pointy hair boss to go along with the spew.

      • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Monday March 08, 2010 @12:02AM (#31397490)

        Anyone can make a prediction. I'll make a prediction right now that one day we'll have a man on Mars.

        The problem is how ACCURATE is the prediction. And his predictions are pretty useless. They're filled with current buzzwords and have no falsifiable content. Take prediction #5:

        5. Consider Web search, for example. Modern search engines combine the functions of libraries and business directories on a global scale, in a flash: a lightning bolt of brilliant engineering. These search engines are indispensable -- just like word processors. But they solve an easy problem. It has always been harder to find the right person than the right fact. Human experience and expertise are the most valuable resources on the Internet -- if we could find them. Using a search engine to find (or be found by) the right person is a harder, more subtle problem than ordinary Internet search. Small pieces of the problem have been attacked; in the future we will solve this hard problem in general, instead of being satisfied with windfalls and the lowest-hanging fruit on the technology tree.

        WTF? I'm not going into whether a search engine is an "easy problem". Everything is easy once it has been done by someone else.

        But why does he believe that finding PEOPLE is an issue? This is the INTERNET. You can find published information ABOUT people. But PEOPLE are not abstracted and defined on the Internet.

        And yes, in the "future" this "problem" will be "solved". When, how, where and by whom is skipped. So this "prediction" cannot be falsified. Therefore, it can never be shown to be wrong.

        That article is crap.

        • Re: (Score:3, Interesting)

          by EdIII ( 1114411 ) *

          But why does he believe that finding PEOPLE is an issue? This is the INTERNET. You can find published information ABOUT people. But PEOPLE are not abstracted and defined on the Internet.

          I think you missed his point entirely, which is spot freaking on.

          What he is referring to, IMHO, is the Fucking Google Effect. Somebody enters some search criteria into Google, quite often some sort of error code, and voila! The first page contains entirely correct answers.

          There is credibility given to answers in a way tha

          • Re: (Score:3, Interesting)

            by dzfoo ( 772245 )

            >> What he is referring to, IMHO, is the Fucking Google Effect.

            I believe that this is precisely the problem: that his "predictions" are so vaguely described that they can mean anything to anybody, and thus can never actually be falsified. Kind of like a garden-variety translation of Nostradamus' quatrains: somewhere, someone will twist their interpretation until it fits into some sort of reality.

            And that's not "predicting the future". To paraphrase Toy Story character Woody, that's just "guessin

        • Yes, exactly. People are not abstracted, not defined on the Internet, not searchable. There may be 3 experts in the world capable of answering your question, and not a single webpage even approaching it. It would be extremely difficult to find these people and ask them your question (having them willing to answer it is an entirely different matter.) And to filter out all the questions that are better suited for their less competent colleagues. This is a problem that needs to be solved and Internet may becom

          • And to filter out all the questions that are better suited for their less competent colleagues.

            More importantly, how to filter out the answers from those less competent people. I don't really want to wade through the IT equivalent of chiropractors claiming to be able to cure asthma [upenn.edu].

            There's simply too much incentive to try to fool such search engines about your abilities for this to be useful.

        • The other respondents hit the nail on the head: Gelernter's point is about finding and querying expertise rather than written information--the difference between reading a book by Donald Knuth and being able to ask him a question directly.

          However, that problem has a solution on the Internet already: communities of Interest. Dating back well before Web search engines, newsgroups allowed people to find each other according to subjects of interest and to share expertise. Web forums like Slashdot continue that

        • But why does he believe that finding PEOPLE is an issue? This is the INTERNET. You can find published information ABOUT people. But PEOPLE are not abstracted and defined on the Internet.

          Speak for yourself, human.

      • We form tribes of relatives, neighbors, classmates, co-workers, etc. Anthropologists/sociologists have discovered there is an optimal tribe size of around a hundred (plus-minus fifty) before it becomes unwieldy, bifurcates or dissolves. The internet allows us to construct social and commercial "tribes" from across the planet.
  • Seriously, (Score:4, Funny)

    by miracle69 ( 34841 ) on Sunday March 07, 2010 @10:44PM (#31397038)

    Where are we going to take it?

    And did Al Gore give us a curfew?

  • by account_deleted ( 4530225 ) on Sunday March 07, 2010 @10:53PM (#31397112)
    Comment removed based on user account deletion
  • Let's come up with something to replace HTTP/JavaScript/Flash/what-have-you. It's a huge waste, but even worse, a distortion.

    We have the technology. We can do better than this.

    x86 assembly, bogus sessions, they do not have to be fate.

    Right? Right?

    • by jo42 ( 227475 ) on Sunday March 07, 2010 @11:45PM (#31397416) Homepage

      replace HTTP/JavaScript/Flash/what-have-you

      Every time I do "web development", I feel like I'm duct taping popsicle sticks together to build a house and then throwing in a bit of mud to seal the holes. Even after 10+ years everything still feels like a really bad hack/kludge/bodge.

      • by AuMatar ( 183847 ) on Monday March 08, 2010 @02:50AM (#31398242)

        Because it is. You have a sessionless protocol trying to do sessions, amusingly enough written on top of a connection-based protocol (so you have a session built in: the TCP connection). You have a text markup language based on the idea of the client choosing how to display data being used to display pixel-perfect displays. You have a language they had so much faith in that they named it after another popular language in hopes people would confuse the two. And that language has no built-in method for transferring data to/from the server or doing RPCs, so the whole AJAX hack is thrown in on top to do that. There's nothing about the whole stack that's well designed for modern uses. But it's universal, so we're stuck with it unless Mozilla and MS work together to push out something new.
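        A bare-bones sketch of those two layers, assuming a hypothetical /api/add endpoint: the "session" is just a cookie replayed over stateless HTTP, and the "RPC" is a hand-rolled fetch round trip.

```typescript
// Sketch only: "/api/add" is a hypothetical endpoint, not a real API.
// HTTP itself is stateless; the "session" is a cookie the server hands back
// and the browser dutifully replays on every single request.
async function callServer(a: number, b: number): Promise<number> {
  const response = await fetch("/api/add", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    credentials: "include",          // replay the session cookie each time
    body: JSON.stringify({ a, b }),  // hand-rolled "RPC" arguments
  });
  if (!response.ok) throw new Error(`RPC failed: ${response.status}`);
  const { result } = (await response.json()) as { result: number };
  return result;                     // hand-rolled "RPC" return value
}
```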

        • You have a sessionless protocol trying to do sessions. Amusingly enough written on top of a connection based protocol (so you have a session built in- the TCP connection)

          I'd love to know what you propose would do better and still scale to tens of thousands of page requests per second, and can deal with malicious network nodes and nodes dropping off the network without notice. You do realize that TCP is also doing sessions on top of a sessionless protocol, right? Is TCP poorly designed?

          You have a text markup language based on the idea of the client choosing how to display data being used to display pixel perfect displays.

          If the web was designed to be pixel-perfect, browsers would be as messy as Win32, trying to maintain backward compatibility with all sorts of different displays. Either that, or everything

          • I should add:
            • CSS is a bit inflexible, and its attribute names are a bit arbitrary (CSS selectors are good, though). That would have been a valid complaint.
            • Making things look good (or work at all) in MSIE is a real pain. That's also a valid complaint.
            • Most properly-designed web apps are stateless, except for authentication and maybe a language preference. Everything else should be in your server-side data model, or in your request. If you're writing a web application that keeps a lot of session-speci
          • by macshit ( 157376 )

            Yeah, the name "JavaScript" was stupid. So what?

            The silly thing is that, however horrible the name javascript is, the "standard name" is worse... "ecmascript"... sounds like a skin disease!

            "Say, you'd better see a doctor about that ecmascript!"

          • by AuMatar ( 183847 ) on Monday March 08, 2010 @12:30PM (#31402420)

            I'd love to know what you propose would do better and still scale to tens of thousands of page requests per second, and can deal with malicious network nodes and nodes dropping off the network without notice. You do realize that TCP is also doing sessions on top of a sessionless protocol, right? Is TCP poorly designed?

            Doing sessions on top of sessionless on top of sessioned is poorly designed. That's the current situation: interactive apps written over HTTP on top of TCP. HTTP is a good file transfer protocol, but it doesn't fit the modern usage of many webpages; it's being shoehorned in because everyone uses browsers and that's the only way they communicate. It's past time for a new protocol at the HTTP layer, made for web applications, that can co-exist alongside it.

            If the web was designed to be pixel-perfect, browsers would be as messy as Win32, trying to maintain backward compatibility with all sorts of different displays. Either that, or everything would be monochrome at low resolution.

            But that's what every damn web designer wants, and what they struggle to achieve with HTML, CSS, and Flash. From frame hell to the equivalent in CSS, they design assuming that it should be pixel perfect. It's time to educate them or give them what they want; the current hacks they use to try to make it so are a huge waste of time and money.

            The only language I can think of that has arbitrary functions like RPC built-in is PHP. If you think PHP is the epitome of language design, then we have nothing more to discuss. Most good languages separate the language itself from the standard library.

            You don't know many languages then. A good language for the web would recognize that it's client-server, and provide built-in support for automated data transfer and for calling functions on the server. Instead we have the steaming pile which is Javascript, a bad language to begin with, married to the utter hack that is AJAX.

            Sure, on the whole, it's not the best that we could do, but if you think nothing about it is well-designed, well, what would you propose? Flash?

            The web wasn't designed for applications. Start over. A new transfer protocol based on sessions. A new display format based on SVG or similar technology, with access to all common widget types (menus, sliders, combo boxes, list boxes, other things that the current web can't do well or at all). Scrap JS and use a well-designed language, one that's tier-aware. And make browsers able to use this format or the original HTTP/HTML stuff, as was always intended; that's why URLs start with http://.

            It'd be a year or two to work it all out, during which we'd continue with what we have now. The end result would be a huge increase in productivity and ease of use, since we wouldn't have to wedge around broken protocols and throw in hideous hacks.
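            To sketch what a session-oriented, tier-aware RPC of that sort might look like (the endpoint URL, wire format, and method names below are invented purely for illustration, using a WebSocket as the persistent session):

```typescript
// Illustration only: the URL and message shapes are made up.
// One persistent connection is the session; calls are just tagged messages.
type RpcRequest = { id: number; method: string; args: unknown[] };
type RpcResponse = { id: number; result?: unknown; error?: string };

class RemoteTier {
  private nextId = 1;
  private pending = new Map<number, (r: RpcResponse) => void>();
  private socket: WebSocket;

  constructor(url: string) {
    this.socket = new WebSocket(url);
    this.socket.onmessage = (ev) => {
      const res: RpcResponse = JSON.parse(ev.data);
      this.pending.get(res.id)?.(res);
      this.pending.delete(res.id);
    };
  }

  // Call a function that lives on the server tier as if it were local.
  // (Sketch: assumes the socket is already open; a real version would queue.)
  call(method: string, ...args: unknown[]): Promise<unknown> {
    const id = this.nextId++;
    const req: RpcRequest = { id, method, args };
    this.socket.send(JSON.stringify(req));
    return new Promise((resolve, reject) =>
      this.pending.set(id, (r) => (r.error ? reject(new Error(r.error)) : resolve(r.result)))
    );
  }
}

// Hypothetical usage:
// const server = new RemoteTier("ws://example.invalid/app");
// const results = await server.call("searchCatalog", "widgets");
```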

            The last thing anyone needs is Microsoft reinventing the web.

            As developers of the most-used web browser, do you really think they could be left out of anything? They shouldn't control the process, but they need to have a seat at the table.

        • by sorak ( 246725 )

          Thank you. You just described the crap-stack that is my livelihood, but the sad part is that you haven't even gotten to Flash yet.

        • You have a text markup language based on the idea of the client choosing how to display data being used to display pixel perfect displays.

          This is the part of it that I really don't have a problem with. I think it's great that HTML is somewhat separated from layout and display, since it theoretically enables people to create semantic markup and then create a style somewhat separately. Though HTML and CSS can each be improved, I like the fact that there is that split. The semantic markup theoretically allows for the content to be parsed for different purposes, e.g. screen readers, news feeds, alternate displays.

          • by AuMatar ( 183847 )

            The idea of HTML was good. The problem is that modern web designers and companies don't want that: they want pixel-perfect control over displays, and are using HTML, CSS, etc. to achieve that. HTML works well for what it was designed for, but it's not good at what it's currently being used for. It's being wedged into use where it really shouldn't be.

            • Well I think it's a bit more complicated than that. It's true that different people come to HTML looking for different things. Some people are happier looking at HTML in a text-based browser like Links, while some want very pretty custom-designed pages that are pixel-perfect and identical on all browsers. The split between content and presentation theoretically allows for both camps (and everyone in between) to walk away satisfied. Well, maybe not completely happy, but close enough to make it a good all

      • I agree. I even started working on a user interface protocol based on TCP (actually, any serial connection - RS-232 or USB - would work just as well), but I ran out of time when moving to a new house, and haven't started up again.

        If you've done event-driven GUI programming, you probably noticed much of it is in the form of (set up windows and controls), ... etc., until the final (close windows and so on). So why not define a standard set of messages so your application can be on a server, and your GUI is on y
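        A sketch of what such a standard message set might look like; the message names and fields are made up, it's only the shape of the idea:

```typescript
// Invented message set: the application lives on the server and drives a thin
// GUI client by streaming these over any serial transport (TCP, RS-232, ...).
type GuiMessage =
  | { kind: "createWindow"; id: number; title: string; width: number; height: number }
  | { kind: "createButton"; id: number; parent: number; label: string }
  | { kind: "setText"; id: number; text: string }
  | { kind: "destroy"; id: number };

type GuiEvent =
  | { kind: "clicked"; id: number }
  | { kind: "closed"; id: number };

// Server side: describe the UI, then react to events sent back by the client.
function openHelloWindow(send: (m: GuiMessage) => void): void {
  send({ kind: "createWindow", id: 1, title: "Hello", width: 320, height: 200 });
  send({ kind: "createButton", id: 2, parent: 1, label: "OK" });
}

function onGuiEvent(ev: GuiEvent, send: (m: GuiMessage) => void): void {
  if (ev.kind === "clicked" && ev.id === 2) send({ kind: "destroy", id: 1 });
}
```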

    • it already exists (Score:5, Insightful)

      by Colin Smith ( 2679 ) on Monday March 08, 2010 @03:19AM (#31398334)

      It's called usenet.

      The web 2.0 version is an RSS feed of a blog (woohoo). And the application is an RSS aggregator.

      Taken to its logical end point, you get Lotus Notes.
       

  • by Anonymous Coward on Sunday March 07, 2010 @10:59PM (#31397148)

    My language parser borked on 'cybersphere.' The words 'cyber' and 'virtual' leave a terrible aftertaste, making whatever comes later deteriorate into gibberish... oh wait, this whole thing is gibberish to begin with. Gibberish that seems (not entirely sure) to be a justification for everyone to throw their data (and I mean ALL their data) into the public space for the sake of... I'm not entirely sure, but I'll assume it's in the interests of whatever social/political/economic institutions he's a member of.

    I know, how about letting the user decide the 'how' as well as the 'what' when it comes to interfacing with the technology at his disposal? I know, I know, that would be asking people to think for themselves for a few nanoseconds, and we can't have that or else the terrorists win, the children lose, and 'freedom' dies. Damn, what was I thinking? Gotta dumb everything down so even the most dull-witted soccer mom can process it without the knees jerking upward...

    • by zappepcs ( 820751 ) on Sunday March 07, 2010 @11:26PM (#31397304) Journal

      Exactly, kind of, I think.

      The Internet is not a thing like the 'winter olympics' or the recording industry. The Internet is the system of communications systems which allows the transfer of information (as well as its aggregation, falsification, and overload). It changes the source of information for those who regularly access it, compared to the time before the Internet.

      What needs to be discussed is not cyber this, or virtual that, but how users use information. Let's face it, for a large portion of the population the phrase 'use information' is rather optimistic. Aggregating information and presenting it in a way that is both intuitive and useful is something of a holy grail. We've seen many attempts to do things like this, and each of them has its fans and foes. What is being suggested is essentially that we all need to have one set of cultural values. Looks good on paper, but it makes a huge mess of things in real life.

      Then again, look at Microsoft Windows. How many people do you know who think this is how computers are supposed to work, and that anything not like Windows is weird?

      A single cultural viewpoint is wrong.

    • Re: (Score:3, Funny)

      by Opportunist ( 166417 )

      Encountering a "virtual" is just telling you that you are going to read a lot of rubbish and yet end up having to do all the thinking yourself anyway.

      Ask any C++ programmer, he'll agree.

    • I got farther than you; thought maybe he had something to say, but then gave up when he said this:

      users of any computing system ought to have a simple, uniform operating system and interface. Users of the Internet still don't.

      Sure, keep building your dream world, I'll stay in the real world, thanks.

    • http://userscripts.org/scripts/show/62062 [userscripts.org] seems like just the thing for you, and trust me greasemonkey scripts work really well! Case in point: http://userscripts.org/scripts/show/5738 [userscripts.org]
    • by lennier ( 44736 )

      On the contrary, this guy has been working on the Lifestream concept for a while, and while I agree 'cybersphere' is a rather meh name for the concept, I think he's onto something very important. Lifestreams are at the heart of blogs, Facebook and Twitter; these infrastructures aren't really doing it very well, which is why we need a new one.

  • The Internet's future is not Web 2.0 or 200.0 but the post-Web, where time instead of space is the organizing principle — instead of many stained-glass windows, instead of information laid out in space, like vegetables at a market — the Net will be many streams of information flowing through time. The Cybersphere as a whole equals every stream in the Internet blended together: the whole world telling its own story. (But the world's own story is full of private information — and so, unfortunately, no human being is allowed to hear it.)

    The future of the Internet is information streams blending together? What the fudge does this even mean?

    Hey, if you like this guy, you will probably enjoy reading this [elsewhere.org] as well.

  • Time to start taking ourselves too seriously

    No moment in technology history has ever been more exciting or dangerous than now, when I started speaking.

  • Gelernter has some really off-the-wall ideas (see for example this post by Prof. Jeffrey Shallit http://recursed.blogspot.com/2009/02/religion-makes-smart-people-stupid.html [blogspot.com]). But in this case, some of what Gelernter has to say might make sense, and he has certainly shown from his prior work that he's someone worth paying attention to when he is talking about computers. However, labeling this as 35 predictions is clearly not a good descriptor of TFA. For example, 12 is not at all a prediction but simply
  • by DynaSoar ( 714234 ) on Sunday March 07, 2010 @11:13PM (#31397256) Journal

    I don't care if he predicted Nostradamus and first described self-sustaining fusion. The points and problems brought up are in large part already known and understood in other terms, with many of them dismissed by those who understand the problems in the terms commonly used.

    6. The internet does not create information overload. It doesn't create information, or anything for that matter. It is constructed and filled by people who either handle the information load well or do not (hence over-load). The number of sources and the amount received from them are under the control of the receiver. This is only a problem if the person does not develop a suitable technique for handling the flow, or is prevented from using it. Simultaneity is not a way to handle a large flow except in unprocessed pass-through. Regardless of the technologies that might be employed for any of this, successful collection of new material requires serial reception with the majority of attention focused on the item of interest.

    Far more useful in developing the ability to absorb more information faster is the concept of 'media richness'. Plain text is just that, very plain, while human behavior is very rich (language plus nonverbals, etc.). Most of the net is low richness. It could be made denser, but to be richer it would also have to be made cleaner, with less noise within the signal.

    14. Creating your own new ideas and presenting them as validated concepts by comparing them with existing concepts is a technique well used in fiction writing. In non-fiction, people expect to be able to compare the old and the new, and to see justification for why the new is useful, before they can be expected to accept arguments about why one is better. Nobody can agree with what they can't understand. You can't even claim to understand something if you can't explain it; you can only say you know what you mean.

    I strongly recommend getting a job selling, installing and supporting a large installation so you can see just how much thought and work goes into making the internet happen. It has never just happened on its own.

    • The fundamental difference between your analysis and his writing is that you are thinking of technical concerns while he is thinking of people first.

      The internet does not create information overload.

      Not by itself it doesn't. What he has observed is the universal truth that humans in combination with the internet produce information overload. It allows us such easy access to information that we can (and do) become overloaded in the mass of it. It allows so many people to create information that independe

  • The article says "internet", but it really means "the HTTP based family of applications that use the internet". Sometimes a customer gets me by mistake when they need help because "their internet is down". I start to get mad because of self contradictory statements, but then I remember that they really mean, "my web browser stopped working". (You can tell I'm not really tech support because next I try to find out what browser they are using, and they are never able to tell me. Which means they are usi

  • So apparently we have someone who predicts a WHOLE DAMN LOT of stuff (seriously, most people wouldn't even THINK of that much, let alone PREDICT it), and he predicted the internet. OK. I'm fairly sure if I spent my life predicting stuff I'd be bound to guess right from time to time. If you want to impress me, give me all his predictions and the percentage that came true. More than 50% and I will start listening.

    And what does the Unabomber have to do with it at all? Is surviving an explosion now somethin

    • by butlerm ( 3112 )

      "predicted the internet"? That takes a lot of talent. The idea that there would be something like the Internet was obvious the day packet oriented networking was invented.

      Ted Nelson coined the term "hypertext" in 1965, when Gelernter was ten. The combination of the two (i.e. "the web") is certainly not beyond the capacity of someone having ordinary skill in the art. It is simply a matter of economics.

      • So that would make an attempt to IMPLEMENT his "prediction" taking place when he was FIVE YEARS OLD.

        Isn't it kind of hard to "predict" something that someone else has already spent the time and energy on to attempt an implementation?

        Oh, and

        The Cloud (or the Internet Operating System, IOS -- "Cloud 1.0") will take charge of your personal machines.

        You might want to check with Cisco first. They might have a problem with you using that TLA and name. It's rather close to what they've been marketing FOR YEARS.

        Now

        • Isn't it kind of hard to "predict" something that someone else has already spent the time and energy on to attempt an implementation?

          I would like to know what fucked-up definition of "prediction" you have.

          "I predict that we will have flying cars in the next five years" is in no way diminished by the huge amount of effort put forth by many people seeking to make flying cars a reality.

    • by rossdee ( 243626 ) on Monday March 08, 2010 @12:37AM (#31397656)

      So when did he predict 'the internet' ? Was this before or after Al Gore invented it?

      AFAIK Shoghi Effendi predicted the internet back in 1936:

      "A mechanism of world inter-communication will be devised, embracing the whole planet, freed from national hindrances and restrictions, and functioning with marvellous swiftness and perfect regularity."

  • 17. There is no clear way to blend two standard websites together, but it's obvious how to blend two streams. You simply shuffle them together like two decks of cards, maintaining time-order — putting the earlier document first. Blending is important because we must be able to add and subtract in the Cybersphere. We add streams together by blending them.

    ---

    This guy is half way to inventing my Feed Distiller [feeddistiller.com], except he didn't see the usefulness of similarity filtering to some source, to keep the st
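    The "shuffle two decks of cards" blend in the quoted point 17 is just a merge of two time-sorted lists. A minimal sketch:

```typescript
interface StreamItem { timestamp: number; body: string; }

// Blend two time-ordered streams into one, earliest items first:
// exactly the "shuffle two decks of cards" merge from the quote.
function blendStreams(a: StreamItem[], b: StreamItem[]): StreamItem[] {
  const out: StreamItem[] = [];
  let i = 0, j = 0;
  while (i < a.length && j < b.length) {
    out.push(a[i].timestamp <= b[j].timestamp ? a[i++] : b[j++]);
  }
  return out.concat(a.slice(i), b.slice(j));
}
```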

  • And the worst part is how similar it is becoming to the Spanish Inquisition.
  • by Hurricane78 ( 562437 ) <deleted AT slashdot DOT org> on Monday March 08, 2010 @12:04AM (#31397508)

    It’s like religion, but without as much power. Kinda like a predecessor.

    The only revelation that ever stunned me, was the following:
    I was still a teenager, and I read in the German computer magazine PC Welt about Nostradamus and which of his predictions had "actually happened" in the computer area.
    And one prediction for the very near future was that a new OS would come to rule the world. Something big.
    Mind you, that was long before Linux (created 1991-92) was even remotely mainstream. I constantly read computer magazines, and I know it was not mentioned once or otherwise known.
    They joked that maybe Nintendo would create a Yoshi OS. (Super Mario World, the first game to feature Yoshi, was released in 1990-91. Which gives you a feeling of when this was written.)

    Years later, when I heard more and more about Linux, and even IBM started to pick it up, I started to realize that this was that OS!
    Doesn’t mean anything, but somehow that was such a moment that really made me think. Like: Was he an Alien and/or time traveler from the future? ;)

    To this day I wish I could get that article back. I know it was in the summer as we were at the beach. But the oldest issues they have in their archive are from 2007. So if you got an old archive from maybe 1990-92, please contact me! :)

  • The idea that the internet is ever going to deliver massive quality ignores a simple fact: previous media were controlled by the elite. To hold a medium controlled by everyone to the same standards as media controlled by a select group is to ignore the very nature of the internet!

    The internet is LolCatz and Rickrolling and Facebook Pickle people talking shit on Nickelback.

    Acting like this fact imperils our ever present need for another Rousseau is elitist bullshit.

    Too long our humanity has been defin

    • by Jack9 ( 11421 )

      Quantity has a quality all of its own. - Joseph Stalin.

    • by bsDaemon ( 87307 )

      If you can't read 90% of Shakespeare, then you're not putting in enough effort, and that's not a failing on his part, it's a failing on yours. Remember, back in his time the theatre was the popular entertainment medium, and the same people who today are trolling *chan and bitching about their homework were going to see his plays and mostly grokking them, linguistically if not always in subtext.

      I'm no great fan of Foucault myself in particular, but structuralism as a means of literary criticism rea

      • "the rest of life is about being a more interesting person capable of enjoying a wider breadth of experience"

        Why does Lolcatz not make me a more interesting person?

        In all seriousness, the article's proposal is this: on a scale of quality, papyrus > handwriting > typing > word processing > Photoshopping "i'z pays atenshun" onto a photo of a cat.

        I'm sorry, but the notion that anything written today is almost by default worse than anything written 100 to 500 years ago is elitist bullshit. He's say

  • by dcollins ( 135727 ) on Monday March 08, 2010 @12:14AM (#31397540) Homepage

    You know, like about 10 or 15 years ago I saw this TV presentation by a guy who swore up and down that filesystems should store & display documents solely by timestamp order of creation. (Is this the same guy?) "Time instead of space... cyberstream or lifestream... shows information-in-motion, a rushing flow of fresh information...," all that jazz.

    I routinely think back on that because it's one of the wrongest, most idiotic epic fails I ever remember seeing. I'm astonished to see it popping back up with a bunch of "web" buzzwords plastered on top.

    • by PaintyThePirate ( 682047 ) on Monday March 08, 2010 @01:20AM (#31397840) Homepage
      Interestingly, this is the approach that OLPC and now Sugar Labs have taken for file access in Sugar, using the Journal activity [laptop.org]. This is also the direction Gnome is heading in, with Zeitgeist [gnomejournal.org] and its GUIs.

      It's a little strange at first, and it certainly can't replace normal file browsers completely, but it ends up being pretty convenient in day to day use. Of course, these aren't filesystems, just layers atop them.
      • by dargaud ( 518470 )
        It's also the way I archive my mass of digital images: with the date in the filename, as I cannot rely only on the filesystem timestamp. I also add some basic keywords to the filenames, something like 20091012_105445-SkiDescentEcrins.jpg. Then searching becomes a breeze, because you always remember the date more or less, and adding a keyword lets you find the right files.
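        A tiny sketch of the kind of search that naming scheme makes possible (the directory path and helper name here are hypothetical):

```typescript
import { readdirSync } from "node:fs";

// Filenames like "20091012_105445-SkiDescentEcrins.jpg" carry the date and
// keywords, so "October 2009 ski photos" becomes a single filename filter.
function findPhotos(dir: string, yearMonth: string, keyword: string): string[] {
  return readdirSync(dir).filter((name) =>
    name.startsWith(yearMonth) && name.toLowerCase().includes(keyword.toLowerCase())
  );
}

// Example (hypothetical path): findPhotos("/photos", "200910", "ski");
```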
    • by SuperKendall ( 25149 ) on Monday March 08, 2010 @01:47AM (#31397932)

      I'm not sure if this is the same guy, but I think it is. In the video I saw the concept was called a "lifestream" then as well.

      To me the idea also seems bad. I understand the motivation: he was trying to get people away from filesystems and into some more natural system for finding data. But time-based is just not it. Humans can have a hard time ordering things absolutely in time, so making access time-based only obscures how to get to things, and also makes things that happened long in the past very hard to access - basically like storing all data in an array instead of a hashmap. People want to be able to get to things quickly, and a time-based interface does not really help much with that except for the most immediate things.
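      To put that array-versus-hashmap analogy in code terms (a toy illustration only): a purely time-ordered store forces a linear scan, while a keyed index finds the same document directly.

```typescript
interface Doc { createdAt: number; name: string; body: string; }

// "Array": the lifestream itself. Finding an old document means scanning history.
function findInStream(stream: Doc[], name: string): Doc | undefined {
  return stream.find((d) => d.name === name);   // O(n) walk back through time
}

// "Hashmap": a name index kept alongside the stream. The same lookup is direct.
function findInIndex(index: Map<string, Doc>, name: string): Doc | undefined {
  return index.get(name);                       // keyed access, no scan
}
```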

      • I have to say I wrote that response before I had read through all 35 points.

        Yes, the lifestream idea is still there. Along with the kinds of concepts that made me question it before in terms of organizing my own data - that something I wanted to deal with later I would just "move into the future".

        But you know what? He has a good point that a great deal of the internet ended up using lifestreams anyway - blogs are all organized inherently along a timeline. And if you think about it meshing streams of data

        • by dzfoo ( 772245 )

          More proof that if the prediction is vague enough, it can be associated with any reality.

          >> But you know what? He has a good point that a great deal of the internet ended up using lifestreams anyway - blogs are all organized inherently along a timeline.

          That's not entirely accurate. Oh yes, blogs and other online content are organized chronologically, but not much more than, say, newspaper stories and classic movies: they are still classified by the inherent taxonomy of their class, e.g. by title o

          • The same with blogs. An individual blog is "organized inherently along a timeline," as you say; but all blogs at once are not.

            They are in just about any rss reader... furthermore you talk about classification, which is true of news sources but much less true of most blogs. Even a lot of developer blogs may wander far afield of development from time to time.

            Likewise with Twitter: you do not read the entirety of the Twitter content (from all feeds or whatever they are called) as a single chronologic

  • "...better suited to the Internet than a conventional website." What?
  • Taming the Natives (Score:3, Insightful)

    by TheVelvetFlamebait ( 986083 ) on Monday March 08, 2010 @01:01AM (#31397752) Journal

    Taking the internet seriously is what leads to all these "internet laws" that Slashdot seems to rally against. In fact, the internet's existence as an international object that isn't, technically, entirely legal in most jurisdictions, for one reason or another, is due in part to the internet not being taken seriously. Now people are taking what they read online reasonably seriously: as seriously as any other medium. The internet is no longer just for geeky adults, but also for children, and as such, a large portion of the population will look to have it censored or at least rated, just like any other medium (the logistics of such a task are another issue entirely).

    The days of the internet being a wild west of vocal freedom are in danger of coming to a close, for as much as living in a wild west can be exhilarating and can make you feel more free, there will always be people who want to develop it to make it as safe as the colonised areas.

  • by w0mprat ( 1317953 ) on Monday March 08, 2010 @01:08AM (#31397778)

    3. Here is a simpler puzzle, with an obvious solution. Wherever computers exist, nearly everyone who writes uses a word processor. The word processor is one of history's most successful inventions. Most people call it not just useful but indispensable. Granted that the word processor is indeed indispensable, what good has it done? We say we can't do without it; but if we had to give it up, what difference would it make? Have word processors improved the quality of modern writing? What has the indispensable word processor accomplished?

    Free speech, that's what. Not only free as in libre, but free as in gratis. It's possible to replicate ideas across the world at real-world cost far too small to meter.

    One of my ancestors wrote a book; the only copy of the manuscript was destroyed when the house was flooded by a nearby river. The publishers also lost the only other copy of the text, though the family reckoned they'd have been unlikely to actually accept it and publish anyway.

    So one can see the fundamental advantage of not being bound by a pencil or a typewriter. In the information age what we really have in excess is truly inexpensive duplication.

    It's ironic, then, that data can still go missing, although for reasons other than the cost of making a backup, like intellectual property.

    The question the author poses is not quite the right one to ask. What has been unbounded by digital word processing is quantity. Quality is different, a subjective and arbitrary value.

    Looking at it another way, I consider readily ubiquitous free speech, too cheap to meter, a pretty nice quality.

    Indeed, the 'du-' in duplication implies you create a second identical copy, which is what you'd have to do with a pen or typewriter. The word is no longer accurate for what is possible with the Internet.

  • Gelernter who. . ? (Score:5, Insightful)

    by Fantastic Lad ( 198284 ) on Monday March 08, 2010 @01:17AM (#31397828)

    So, it seems that David Gelernter was blown up by the Unabomber, survived, and wrote a book [amazon.com] about the experience. In a cavalier attempt to "Take the Internet Seriously" I dredged up two reviews from Amazon's customer comments which show opposing valences of political opinion regarding the book's content. I thought it might help to explain the kind of filters Mr. Gelernter views the world through, and thus help one decide whether his little treatise on the Internet is worth anything.

    Review Number One. . .

    "Drawing Life" is by David Gelernter, a computer science professor who survived one of Ted Kaczynski's mail bombs.

    The book is about a well educated, intelligent man who has descended into a fear of the future and a hatred of the society that nurtured him, who dreams of a glorious American past that never really existed, who has written a venomous yet pedestrian political tract that would never have been printed without the author's notoriety, and who has come to the conclusion that sometimes people must be deliberately killed to remake society.

    This book is also about the Unabomber.

    Gelernter has endured an awful lot, and for this one is prepared to grant him slack. If he's cranky, he's certainly earned the right to be this way.

    Yet, I've come away disappointed, not just with "Drawing Life," but with Gelernter himself. He is a profoundly bitter man who believes modern society has been ruined not just by the Unabomber but by the likes of unwed mothers, liberals, lawyers, feminists, intellectuals, working mothers, left-wing journalists, Hillary Clinton, and the usual gang of suspects straight from Rush Limbaugh's enemies list.

    Tiresome and unoriginal. Not worth reading.

    And David, enough with the kvetching already!

    Review Number Two. . .

    One of the most powerfully written and elegantly thought out books I have ever read. Should be mandatory reading for every American. I used to think only Vietnam veterans had this kind of sane view of the world after adversity. I was wrong. Buy it, read it, pass it along.

    Right. So Gelernter is passing judgment on the great social commons known as the Internet, is he?

    I'll pass, thanks.

    -FL

    • Comment removed based on user account deletion
  • This guy is smarter than you, and he might be right only 10% of the time. I've seen a few ideas of his not gain traction.

    He still has you beat.

    Have you read anything else he's written, or are you just snarking it up with your ignorance?

  • by vlm ( 69642 ) on Monday March 08, 2010 @07:42AM (#31399504)

    I actually read the article; it reads like one of those hack academics in 1995 trying to sound hip (and/or pompous) by writing long, tedious screeds using technical words they don't understand to discuss a culture they have no experience with. About 1/3 of the article is about how great the guy used to be and how important and relevant his every utterance is. However, I'm not buying it.

    I think it's an elaborate hoax, like a modern "Sokal affair", and most of you fell for it.

    http://en.wikipedia.org/wiki/Sokal_affair [wikipedia.org]

    'information overload,' a problem with two parts: increasing number of information sources and increasing information flow per source.

    Yes, access to information without the mediation of the academics and priesthood, and control by multinational corporations is a big problem, for them. Not so much for everyone else. I think we'll survive despite their best FUD.

    The first part is harder: it's more difficult to understand five people speaking simultaneously than one person talking fast -- especially if you can tell the one person to stop temporarily, or go back and repeat. Integrating multiple information sources is crucial to solving information overload.

    Sorry, teacher, I couldn't read chapter 3 last night because chapters 4, 5, 6, 7 all exist, so I was too intimidated to read chapter 3. I can't read my Slashdot Firefox tab because I have other tabs open. WTF is this guy talking about?

    But we won't be able to solve the overload problem until each Internet user can choose for himself what sources to integrate,

    I strongly suggest each user operate their own mouse, as opposed to operating each other's mices. My kids figured this out around K or first grade, although their previous failure to follow that rule was probably more sibling rivalry and/or comic relief than actual ignorance.

    and can add to this mix the most important source of all: his own personal information -- his email and other messages, reminders and documents of all sorts.

    Translation: Google docs, gmail, and google calendar is really cool. Facebook too. Thanks for letting us know, academic dude, without you guys we'd never have known!

    To accomplish this, we merely need to turn the whole Cybersphere on its side, so that time instead of space is the main axis

    Cool idea dude, like a log file, but on the web. I'm sure no one would ever think of putting a log file on a web. Actually the log file could be human generated prose and comments instead of the insights from my /var/log/syslog. Why, we could call it a web log. Or even a 'blog.

    14. The structure called a cyberstream or lifestream is better suited to the Internet than a conventional website because it shows information-in-motion, a rushing flow of fresh information instead of a stagnant pool.

    Stagnant pool... that's Kuro5hin, right? Information-in-motion, that's like the front page of Slashdot.

    Come on Alan Sokal, admit it, you're the one behind this hoax, aren't you?

  • "640k (of usable ram) ought to be enough for anybody" - Bill Gates

    People try to make these kinds of far-reaching predictions without really thinking it through all the time. This is nothing new, though this guy has less balls than most in that his quotes aren't even concrete enough to truly be ridiculed in the future.

    Some, like this one:

    "1. No moment in technology history has ever been more exciting or dangerous than now. The Internet is like a new computer running a flashy, exciting demo. We have bee
  • The first part is harder: it's more difficult to understand five people speaking simultaneously than one person talking fast -- especially if you can tell the one person to stop temporarily, or go back and repeat.

    That's why the Internet is still mostly text. Typing is slower than talking, but reading is faster than listening, and reading doesn't suffer from this problem.

  • "With the Internet, the greatest disseminator of bad data and bad information the universe has ever known, it's become impossible to trust any news from any source at all, because it's all filtered through this crazy yenta gossip line. It's impossible to know anything."

    Soft-science academics have been complaining about the Crazy Yenta Gossip Line ever since it got big enough for them to notice.

    Doesn't stop them from being a hugely active part of it.

  • The Cloud will take care that your information is safely encrypted, distributed and secure.

    I've seen the inside of "The Cloud". It looks a lot like the "non-cloud" environment. The parts that are different have nothing to do with enhancing security. Fail.

  • In many ways the internet is like the Total Perspective Vortex [wikipedia.org] gone wrong. In many ways it gives people too much information for our minds to safely handle, but what happened was that it also gave us a soapbox to voice our response to what we learn.

    So minorities can yell as loud as the majority, and insane unverified half-truths can get as much if not more attention than proven documents. Analysis of actions without correct context, etc... It is too much for people to handle.

    There was an interesting study. People

  • Blogs and other anthology-sites integrate information from many sources.

    Oh, hyphens. Are there two words you can't improperly join?

    Here we have an adjective joined by hyphen with a noun and the compound improperly being used as a noun when it must be used as an adjective, casting the following word not as a verb but as a noun, leaving the sentence without a verb. And of course the plural is misapplied on the compound adjective and must shift to the 'nounified' verb, so you have:

    "Blogs and other anthology-site [adj.] integrates [n.] <missing verb> information from many so

  • David Gelernter was still a kid when Douglas Engelbart and Ted Nelson were inventing all the hypertext ideas the Web was built from.

"You can have my Unix system when you pry it from my cold, dead fingers." -- Cal Keegan

Working...