
Thirty Years in Computing

Jacob writes "Jakob Nielsen, usability guru, writes about the last 30 years of computing and his predictions for the next 30. An interesting read. Quote: 'Computer games in 2034 are likely to offer simulated worlds and interactive storytelling that's more engaging than linear presentations such as those in most movies today.'"
  • by etcremote ( 776426 ) on Friday May 28, 2004 @11:52AM (#9278319)
    I predict that in 30 years, what is and isn't a computer will be hard to distinguish.
    • ...if the network is the computer, or if the computer is the network, or what! Stupid Sun...
    • by sixteenraisins ( 67316 ) <tomorrowsconsonant.yahoo@com> on Friday May 28, 2004 @12:03PM (#9278464)
      We're already there. Cell phones, PDAs, and handheld game consoles (Nokia N-Gage, anyone?) are already blurring the line between what is and isn't a computer.

      What I think will be interesting to watch is how software also starts evolving from apps with a narrow focus (think along the lines of early 90's WordPerfect) to apps which try to do pretty much everything - perhaps a bad example, but MS Word already allows table and cell editing similar to Excel, graphics manipulation, and desktop publishing.
      • umm... blurring the line? They ARE computers. A computer is an electronic device that is one thing and does three: it must be programmable, and it must be able to retrieve, store, and process data.
      • by mhesseltine ( 541806 ) on Friday May 28, 2004 @01:25PM (#9279364) Homepage Journal
        What I think will be interesting to watch is how software also starts evolving from apps with a narrow focus (think along the lines of early 90's WordPerfect) to apps which try to do pretty much everything - perhaps a bad example, but MS Word already allows table and cell editing similar to Excel, graphics manipulation, and desktop publishing.

        One word: EMACS

      • >MS Word already allows table and cell editing similar to Excel,
        >graphics manipulation, and desktop publishing.

        WP 5.1 did do a lot of that in the nineties already.
        WP 6.0 did all of it in 1994.
        In features, WordPerfect was, and still is, way ahead of MS Word.

        You could do calculations, references and use variables in WP5.1 tables.

        WP was always a more serious DTP tool. WP 6.0 already supports folding signatures to do 2-up, 4-up, 16-up, booklet, separate font libraries, absolute page positioning styles, ke
    • by OECD ( 639690 ) on Friday May 28, 2004 @12:08PM (#9278522) Journal

      I predict that in 30 years, what is and isn't a computer will be hard to distinguish.

      Conversely, movies and other linear entertainment will be utterly recognizable. There will always be a place for good stories, and it's very hard to 'write' a good story on the fly and interactively. It starts to look too much like the tangled yarn that is life.

    • by Daniel Dvorkin ( 106857 ) * on Friday May 28, 2004 @12:11PM (#9278558) Homepage Journal
      Guy I know once talked about looking at an old Sears & Roebuck catalog from, I think, just before WW1. In it there was a section for early power tools. They sold power saws, screwdrivers, etc. just like they do now. The difference was, though, to use any of these tools, you had to buy a separate motor. This was a bulky thing that you set on your workbench next to your project. It came with a variety of adapters which you could use via a chain drive or something along those lines to power your saw, screwdriver, etc.

      The analogy here is pretty plain, I think. I'm not sure that the idea of "the computer" as a separate machine will ever entirely go away, but certainly the computing power in everyday appliances (TVs, radios, hell, even toasters and refrigerators) is growing all the time. The standalone computer may eventually go the way of the standalone power tool motor.
      • by bigman2003 ( 671309 ) on Friday May 28, 2004 @12:46PM (#9278951) Homepage
        That's a good point. Because currently WE need to GO TO the computer. Soon, the computer will just be where we need it.

        Tablet computers are an example of this. A small tablet that is hooked wirelessly to your network can be used for e-mail, etc. Of course the tablet will get smaller and smaller, and soon not be recognizable as a 'computer'. It will be similar to a piece of paper.

        Now, most people connect their MP3-type player to their computer and download the music. Eventually, your MP3 player will once again connect wirelessly and just download everything, because storage won't be an issue. Of course it will be smaller, and barely noticeable. But once again, you won't need to go to your computer.

        Currently you can buy things online on your computer. But wouldn't that be better from your TV? Just yesterday there was an article about the next Xbox having more computer functionality. With HDTV-quality screens, I would rather make my purchases from my couch, not sitting at my desk. Why go to the computer, when the rest of my house is more comfortable?

        Sitting in my 'office' at home isn't fun- it's not where I want to spend my time. I'd rather be out with everyone else. We've been tied to the keyboard long enough, and I think we'll start moving away.

        Yes- I really would like a web-enabled refrigerator...It would be nice to walk into the kitchen, and get my news/e-mail while standing there drinking out of the orange juice container.

        When display devices get advanced enough that they can simply be 'printed' then we can have them everywhere. This will be the biggest step forward.

        Your TV is actually a great display device- because it streams in a lot of different information. But it is too big, bulky, expensive and ugly to have everywhere. But when I can place a display in the wall of my bathroom, I can use it while I take a crap. It won't be the luxury device of a Texas oilman anymore- it will show up in everyday life.
        • by joggle ( 594025 ) on Friday May 28, 2004 @01:36PM (#9279473) Homepage Journal
          I don't know about all of that. While some things may become cheaper, many things will simply become better (not cheaper).

          For instance, those power tools in that old Sears catalog probably didn't cost more than modern power tools, possibly less since they were simpler and didn't each have their own motor and battery (even after adjusting for inflation). Laptops only cost about $1000 less than they did 15 years ago and have been pretty steady for the last 6 years or so.

          I predict that many technologies will start off relatively expensive and then stabilize after 5-10 years, just as many technologies before them did (TVs, microwaves, etc.).

    • by filmsmith ( 608221 ) on Friday May 28, 2004 @01:02PM (#9279121)
      But I predict that within 100 years computers will be twice as powerful, 10,000 times larger, and so expensive that only the five richest kings of Europe will own them.

      nngh-hey!

      fs
    • The most significant convergence will involve the computer and our best friend, the Television.

      Computers will be the center of the house. The "Media Center" computers are in their infancy; these are the early attempts to bring the computer to the part of your house that currently receives the most attention.

      When this technology becomes seamless, television will become an interactive and more personalized experience.

      Do you like the outfit that Jennifer Lopez is wearing? "Click here to buy". "MMM
  • by erick99 ( 743982 ) * <homerun@gmail.com> on Friday May 28, 2004 @11:52AM (#9278320)
    I'm sorry, but this is not how you look into the future: simple extrapolation of the present. Nielsen simply takes all of the features we look at today and scales them up (3PHz processor, exabyte hard drives, etc.). My God, whatever computers look like in 30 years will probably bear little resemblance to what we use today.

    He and other futurists might do better to look at what we use computers for now and what we don't, but could, use them for in the future. They could also think way outside the box and think about how computers will physically change (will it still be everything in one box or will the hardware be as distributed as software can be) or how computers will integrate into everyday life.

    I guess I expected a bit more imagination. 30 years is an awfully long time in terms of technological development.

    Keep smiling!

    Erick

    • I think we have a better shot of getting an "out of the box" original thought from an episode of Star Trek than from some of these guys. Though in thirty years it will be cute to hear "My wristwatch has more computing power than the fastest computer in the year 2004." Considering I remember my teacher saying that in 1995, comparing his wristwatch to the ENIAC :)
      • by TGK ( 262438 ) on Friday May 28, 2004 @12:30PM (#9278756) Homepage Journal
        Of all the futurist sci-fi authors out there, the best and (in my opinion) most realistic rendition of future societies is given in Peter F. Hamilton's Reality Dysfunction series.

        Note: Think away the energy manipulating poltergeist possession thing to get my point.

        Human society divides along two lines, Adamists and Edenists. Adamists embrace nanotechnology and information technology. Edenists embrace biotechnology. While the division isn't that plausible, most of the tech described from the Adamist side of things is a real possibility in the distant future. We're already seeing the beginnings of it.

        My predictions:
        1.) Augmented Reality will be the killer app that moves the personal computer from your desktop into the category of wallet, watch, and keys that you need to leave the house.

        2.) Increases in display technology and plummeting memory and processor costs will continue to push more embedded devices into the marketplace.

        3.) Computer interaction will edge out human interaction as the primary means of doing business. How this happens will depend on the particular industry. It has already happened to the banking industry. Some of this will be online interaction; an appreciable portion of it will be based on biometrics and customer tracking. The privacy people will object to this, but will be overcome by the almighty dollar.

        4.) The computer applications we use will continue to become more abstract and separated from the data they handle. The reason this occurs is that the cycle that drives hardware also drives software. Hardware sells because people want to run the latest software. Software sells because people who have the latest hardware want things to run that push their systems to the limit. Programmers thus write applications that allow a more sophisticated rendition of the same dataset. Not to use Microsoft as an example, but compare Excel 95 to Excel XP. What's the difference?

        5.) Longhorn will begin a trend in operating systems that SGI first demonstrated with the Onyx. The OS is the redheaded stepchild of the mainstream software market right now. It is utilitarian, focusing more on getting its job done and less on looking slick. Apple has tried to change this, SGI has tried to change this, Enlightenment has tried to change this. Microsoft will succeed.

        Most of these predictions are more like 10 years down the road instead of 30. What's really interesting is the social change that this kind of technological integration will bring about. What will happen as the governments of the world lag further and further behind the corporations as providers of the day-to-day services that people depend on?

        The next 30 years of computing promises more than just faster systems and bigger drives; it promises radical changes in where computers are found, what computers do, and how human beings interact.

        Thirty years is a long time, and while I wouldn't put a bet in for me being able to get an 802.11 jack for my head in that time frame, it's only because I don't think the FDA would allow it by then.

    • by AKAImBatman ( 238306 ) <<moc.liamg> <ta> <namtabmiaka>> on Friday May 28, 2004 @12:00PM (#9278436) Homepage Journal
      Actually, the extrapolation procedure doesn't work too badly. It's just a matter of connecting the dots. I have a book from 20+ years ago about the future of video games. Some of the claims were:
      • Games could allow more than two players. Perhaps even enough to play a full game of soccer or football! (The picture showed a "dome" with controls in a ring around it.)
      • Games will be able to be played over great distances! (The picture showed a chess board with a wireless antenna on it.)
      • Games will be so much more realistic! (Shows a handheld game with a full scene of a motor bike jumping a dirt ramp.)


      None of these predictions were wrong per se. Rather, the author failed to connect the dots and follow the most likely path of games. Why have an arcade machine with 15 control sets when you can simply hook machines together over long distances? Why have a chess board with an antenna when you can play the same thing on your super-realistic, hi-res, 3D screen?

      The future of computer technology has always been known. It's simply been a matter of developing the power to do it. The only failure of the visionaries was in their lack of understanding market conditions and forces. They thought of each technology in a vacuum and didn't put them together as actually happened.
      • I'm going to go out on a limb here and predict that in 30 years we won't have any computers at all. Instead, we'll have spice, and hot Bene Gesserit women. Yes, I am predicting the Butlerian Jihad.
      • Again I'd have to point to Kurzweil about this. People don't realise how quickly we've advanced since the computer was created. Every technology we create fuels new technologies, which fuel new ones, exponentially.
        In one century we've experienced multiple revolutions, industrial and technological. Considering we've been around over 10 times that long (civilised, anyhow), I'd say it's amazing.

        I just think tech advancement happens on a much shorter scale; you'd almost have to change the measurement yearly.

        I
      • Actually, the extrapolation procedure doesn't work too bad. It's just a matter of connecting the dots. I have a book from 20+ years ago about the future of video games. Some of the claims were:

        * Games could allow more than two players. Perhaps even enough to play a full game of soccer or football! (The picture showed a "dome" with controls in a ring around it.)
        * Games will be able to be played over great distances! (The picture showed a chess board with a wireless antenna on it.)
        * Games will

    • And where does biotech fall in all of this? Forget about only hardware-driven machines. How will they interact with our bodies in 30 years? Just a thought. 30 years is a long time.
    • I'm sorry, but this is not how you look into the future: simple extrapolation of the present. Nielsen simply takes all of the features we look at today and scales them up (3PHz processor, exabyte hard drives, etc.). My God, whatever computers look like in 30 years will probably bear little resemblance to what we use today.

      That's assuming there will be just one processor. Probably the number of processors will be measured in KP (kilo-processors).

    • No kidding... The article is so lacking in imagination it is not even funny.

      I even doubt that he is right, because it will become impractical. Right now we have plenty of CPU power on the desktop. For example, we can build cars (drag racers) with 2000HP, but is it practical for a mainstream car? Not with oil prices being what they are.

      As you point out computers will integrate into mainstream and the features that we pre-occupy ourselves with (RAM, CPU Speed, etc) will become irrelevant.
    • by freeze128 ( 544774 ) on Friday May 28, 2004 @12:47PM (#9278969)
      The computers of 2034 will be impressive indeed! They will have many functions that will be illegal for you to perform.
    • He and other futurists might do better to look at what we use computers for now and what we don't, but could, use them for in the future. They could also think way outside the box and think about how computers will physically change (will it still be everything in one box or will the hardware be as distributed as software can be) or how computers will integrate into everyday life.

      Yeah, I've heard "futurists" like him before. One goofball suggested that "in the future" we would store ALL our music on a dis

  • Gaming won't really change... heck, I'm still waiting for my flying car.
  • Comment removed (Score:5, Interesting)

    by account_deleted ( 4530225 ) on Friday May 28, 2004 @11:52AM (#9278334)
    Comment removed based on user account deletion
    • Re:Um... (Score:4, Interesting)

      by Kenja ( 541830 ) on Friday May 28, 2004 @12:02PM (#9278454)
      "Uh, don't computer games now have simulated worlds and interactive storytelling? Morrowind anyone?"

      While I liked the game (and am currently playing the Bloodmoon expansion), Morrowind was about as interactive as a choose-your-own-adventure book. Sure, you could do things that were outside of the script. But they had no effect.

      You could rob all the great houses blind while the guards watched, you could kill entire towns, you could reach the rank of guild master in any of several guilds. But nothing changed. No one reacted differently to you regardless of what you did (unless you were wanted for murder or something; then you had to pay a small fine. And keep in mind that you really could kill entire towns without getting a price put on your head). One would hope that in the future there will be Morrowind-like games with real interaction rather than scripted events.

      PS: My favorite example of this problem in Morrowind is when I would walk into a house vault naked, turn invisible, unlock the vault doors, take anything not nailed down, stagger back out with more stuff than I'll ever be able to sell, and all the guards say is "We're watching you. Scum!" I bust out laughing every time that happens.

    • uh.... (Score:3, Insightful)

      by Thud457 ( 234763 )
      You overlooked the crucial part of the quote:

      "Computer games in 2034 are likely to offer simulated worlds and interactive storytelling that's more engaging than linear presentations such as those in most movies today "

      Even some of today's primitive games have most movies beat... (Watching Hollywood eat its young [imdb.com] at a prodigious [ign.com] rate, I sometimes think "Tetris" offers more complex, multifaceted and emotional storytelling.)

    • Actually... (Score:3, Interesting)

      more engaging than linear presentations such as those in most movies today

      I disagree with this statement too, but for a different reason.

      I wouldn't underestimate the engaging nature of the narrative. Storytelling is as old as mankind, and it's not likely to disappear just because we can suddenly take control of the story. In fact, I would argue that if you could control the story, what's the point of reading/watching/taking part in it? The point of storytelling is to engage the reader and make him feel

  • by The Ultimate Fartkno ( 756456 ) on Friday May 28, 2004 @11:55AM (#9278366)


    DNF due in March.

  • Compu...what? (Score:4, Interesting)

    by Deflagro ( 187160 ) on Friday May 28, 2004 @11:56AM (#9278372)
    I agree with someone else's post. A computer won't be a box with a monitor, etc..
    It will prolly be like a PDA that has periphs you can plug in and just have everything virtual.
    I mean, 30 YEARS! Considering the exponential advance in technology, all we'd have to do is find a new battery model (nanotech, I'm sure) and voila.

    I'm gonna be in my rocking chair playing Final Fantasy XX, I'm sure.
    • Re:Compu...what? (Score:2, Interesting)

      by grumbel ( 592662 )
      Just consider the last 15-20 years: back then computers were very small, built directly into the keyboard, made almost no noise besides the drives, could be plugged into any TV and 'just worked'. There was no real need to maintain or reinstall the OS; just insert another floppy and voila, it worked.

      Today computers are in most cases big noisy grey boxes. People have to reinstall or maintain their OS, manually install security patches, and every once in a while a nice internet worm does funny things with your machin
      • And that may be the bottleneck too, considering designing an interface requires intelligence and a certain... foresight, or empathy.
        We can make molecular computers and quantum cryptography, etc., but when it comes to using them, we run the same things we're used to.
        We really need an increase in intelligence to advance, and when we can teach our machines to think for us, the bottleneck is gone and the exponential growth is limited only by resources... it's all very interesting.
      • Re:Compu...what? (Score:4, Insightful)

        by mr_mischief ( 456295 ) on Friday May 28, 2004 @02:49PM (#9280183) Journal
        They're bigger because we wanted user-upgradable parts. They're louder because they need to be reliable and not burn up in a couple of months -- it's one of the prices of getting faster. They're less secure because we're connecting them to one another to enable things we couldn't easily do 15-20 years ago.

        You can easily buy an SBC with an AMD Geode 1 GHz CPU and 128 megs of RAM, put your storage on CompactFlash with an IDE convertor, and have integrated Ethernet on it. With no fans needed and solid-state storage, it'd be quiet. With everything but the CF on one board, it'd be small. It would run most software people run on the stock desktops.

        VMS indeed does do versioned filesystems. It's not too long, I'm sure, before there's a Linux filesystem that implements it at the FS level if there's not already. Until then, there are versioning systems at the application level.
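
        As a throwaway illustration of that "application level" versioning (a sketch of my own, not a real tool -- RCS, CVS and friends do this properly), assuming a simple copy-on-save scheme in Python:

        import pathlib, shutil, time

        def save_version(path):
            # Keep a timestamped sibling copy, loosely in the spirit of VMS's foo.txt;1, foo.txt;2 ...
            src = pathlib.Path(path)
            stamp = time.strftime("%Y%m%d-%H%M%S")
            dst = src.with_name(src.name + "." + stamp)
            shutil.copy2(src, dst)   # copies contents and metadata
            return dst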

        There are all kinds of software we have now that we didn't 15-20 years ago. You're almost certainly reading /. in a web browser. SMTP/POP3 email software certainly wasn't the norm on desktops 20 years ago. We have much better animation now than then. We have realistic computer audio done mostly in software (this is enabled largely by the processor speeds and memory sizes, but the software to take advantage of it is fairly new). Instant messengers which work outside the LAN are certainly new within the last 15 years. The programming languages used to write other software have changed much over the last 15-20 years. Machine translation of natural languages was a dream 20 years ago, but now it's getting reasonably accurate. Software in just the last couple of years has taken big strides toward displaying everyone's languages together on screen in the proper character sets -- even with more than one alphabet in use at a time. Desktop operating systems have come from offering filesystem services and port access to one program at a time, through the days of cooperative multitasking, into the days of memory-protected preemptive multitasking and even machine virtualization.

        Sure, the uses of the individual applications may not have changed much -- reading text, editing text, listening to sounds, playing games, todo lists, calendars, address books, etc. Twenty years ago, though, could you open your address book, drag a CD-quality sound clip into it, and type an annotation before clicking a button to send it to someone on another continent?

  • IN 30 years,,,, (Score:2, Interesting)

    by Jailbrekr ( 73837 )
    I think there will be a backlash against technology. We will hit a critical point in our social evolution where we say "enough!" How much of a backlash, I know not.

    At least I hope there is a backlash. Too much, too invasive, too quick.

    • At least I hope there is a backlash. Too much, too invasive, too quick.

      Kind of like the Butlerian Jihad [dunenovels.com]?

    • Too much, too invasive, too quick.

      Aren't those the famous last words of the Dodo bird?
    • Re:IN 30 years,,,, (Score:4, Insightful)

      by SoCalChris ( 573049 ) on Friday May 28, 2004 @12:30PM (#9278752) Journal
      I think technology will keep getting better, but we'll see it less. In fact, it is already happening.

      Take Tivo, for instance. A few years ago, if you wanted to record something, you had to set up your VCR, program it, make sure there was a blank tape, etc... Now you just punch into your Tivo that you like certain kinds of shows, and they are recorded for you. In the future, devices like Tivo probably won't even need you to tell them what to record; they will know what you want to record based on what you watch most.

      Another example is cars. The new Mercedes recognize who is driving, and adjust the seats/mirrors/stereo to what that driver likes automatically. They also recognize if a seat is empty, and in an accident it won't deploy the airbags for empty seats. Some of the new cars don't even require a key to start any more. The owner carries a card with a RFID chip in their wallet that the car recognizes, and allows them to drive the car without having to use a key. Even 10 years ago, the things that are standard on a lot of new cars would have been unimaginable.

      I think things will keep getting far more technologically advanced, but we will see it less and less.
  • Boy! What a limited imagination that guy has. I'm expecting a holo deck by then!
    • you're deluding yourself... they'll never invent a holodeck, and if someone did, he'd be prevented from marketing it, because if everyone had a holodeck, no one would ever go outside of it, except to eat and defecate; this would lead to the downfall of mankind, as reproduction and social activities cease altogether.

  • by lawpoop ( 604919 ) on Friday May 28, 2004 @11:59AM (#9278415) Homepage Journal
    These people should do some reading on narrative theory.

    A story is a meaning applied to events after they have occurred. A game is a game, like sports or a board game. You can only make a story out of it after events have been completed. A story has a status quo, an event that disrupts that status quo, and a hero who overcomes a challenge to create a new status quo. You can only join narrative events to actual events after they have all taken place. If you have a wandering storyline, what's to say that this particular event is the shift to the 2nd or 3rd act? It's only after you have everything that you can make a complete story. And that's not to say that there's only one story. Any event might serve as any of the narrative events, depending on the story you're telling.

    • A story is a meaning applied to events after they have occurred.

      Then perhaps your definition (or the standard definition, whatever) is narrow, or perhaps they just misused the term, but I don't think that is the point.

      I think the point the author of the article was trying to make was that instead of having a "story", or some linear sequence of events happen to the player (the character), you will have a completely interactive world where your actions can change the world, and that changed world affec
      • Remember that a story is something you tell after the fact. It has a punchline, like a joke. Something that hits you. Good stories are planned out, and their telling is practiced.

        For good, unplanned stories to happen, I think that will only happen in MMORPGs with either great AI (unlikely) or a lot of freedom for avatars. And then, again, *a story will be a re-telling of events that have already happened*. Hey, did you see what happened in $_MMORPG yesterday? I finally got my castle fortifications set

    • A story is a meaning applied to events after they have occurred. A game is a game, like sports or a board game. You can only make a story out of it after events have been completed.

      I will never understand why this old chestnut appears every time there's a discussion of interactive storytelling.

      By your definition, fiction is impossible. When the author sits down with a blank sheet of paper, he should be stuck, since there are no past events for him to relate.

      But of course, we know this isn't the case. E
      • But of course, we know this isn't the case. Even school children are capable of inventing fanciful, novel stories. The path to interactive storytelling is collaboration between the player and the computer to produce a narrative which is both interesting to the player and dramatically compelling. The narrative is a product of this process, it is not the process itself.

        The problem is that storytelling is hard. It's easy to create a story. It's tough to create a compelling and interesting story. And it seems
    • Exactly! (Score:3, Insightful)

      I never could figure out the point of the 'holonovel' in Star Trek. Why go to the trouble of taking part in the story of Wuthering Heights if you first have to read the story, learn your lines, and go through the motions of the character? I mean, supposing you're playing Cathy and you decide to marry Heathcliff. Well, then you all live happily ever after and the story is no longer Wuthering Heights.
  • Shared game content (Score:3, Interesting)

    by ArsonSmith ( 13997 ) on Friday May 28, 2004 @11:59AM (#9278425) Journal
    One thing the gaming industry needs is a shared content license similar to how open source is set up. If someone spends 6 months making a detailed landscape for level 14 of a game, and it turns out that everyone blows through level 14 in just a couple of minutes, is level 14 worth those 6 months?

    Not really. But if that level were "open source", so to speak, it could be modified, with modifications going back to the original, and used in the next game. With several improvements over time, that section would eventually become a great piece of collaborative art.
    • Games can already be modded, but modern single-player games tend to require an overall plan for the gameplay and story, and consistency in art and design. It would be difficult to maintain all of that while giving up direct control over the game's contents. I'm not saying it's impossible to make a game that way, just that you're not going to see the equivalent of an A-list commercial title without a similarly centralized development process.
      One thing the gaming industry needs is a shared content license similar to how open source is set up. If someone spends 6 months making a detailed landscape for level 14 of a game, and it turns out that everyone blows through level 14 in just a couple of minutes, is level 14 worth those 6 months?

      Still designing levels in 2034? Can't the games make their own levels on the fly yet? Boy, NetHack still has 'em beat...

  • So.. (Score:3, Interesting)

    by thebra ( 707939 ) * on Friday May 28, 2004 @12:02PM (#9278451) Homepage Journal
    Certainly, our personal computer will remember anything we've ever seen or done online. A complete HDTV record of every waking hour of your life will consume 2 percent of your hard disk.

    Doesn't it already do this? It's called history. I see that he is saying it will screen-capture all of it, but why? This article doesn't really predict anything; it just states the obvious. Yes, we will have faster processors and more hard drive space, bigger screens, higher resolutions: amazing predictions! But I want to know when my computer will talk to my car and refrigerator and let me know, when I'm driving to the grocery store, that my son (future son) just drank the last of the milk.
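
    For what it's worth, the "2 percent" figure is at least the right order of magnitude. A rough back-of-the-envelope check in Python (the bitrate, waking hours, lifespan and exabyte drive are my assumptions, not the article's):

    # ~20 Mbit/s compressed HDTV, 16 waking hours a day, an 80-year life, an exabyte drive
    gb_per_hour = 20e6 / 8 * 3600 / 1e9       # about 9 GB per hour of video
    waking_hours = 16 * 365 * 80              # about 467,000 hours
    total_pb = gb_per_hour * waking_hours / 1e6
    print(f"{total_pb:.1f} PB, i.e. {total_pb / 1000:.2%} of an exabyte drive")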
  • DUKE Nukem (Score:3, Funny)

    by tonywestonuk ( 261622 ) on Friday May 28, 2004 @12:02PM (#9278460)
    Computer games in 2034 are likely to offer simulated worlds and interactive storytelling that's more engaging than linear presentations such as those in most movies today.

    So, I guess this will be when Duke Nukem Forever is completed then....
  • In thirty years there will be fewer people making a living from programming computers, but there will be far more programmers. My prediction is that everyday folks will provide general statements to a deductive build system of sorts that will generate the software they need (within reason). Example - "hey, make me some code for figuring out the future payments on the mortgage we just downloaded"... "let's build a new level for this game with more dragons".

    Consistent with this prediction - the only major piece

    • You're probably right, but why load down the interface with technical terms? The user shouldn't even need to know what "code" is, nor should he have been required to manage the files and data needed to define the mortgage, he should be able to just tell the computer "figure out the future payments on this mortgage" and let the system deal with it.
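
      Something like the following is presumably what such a system would have to generate behind the scenes for the mortgage request -- a minimal sketch using the standard amortization formula (the numbers are only illustrative):

      def monthly_payment(principal, annual_rate, years):
          # Standard amortization: M = P*r / (1 - (1+r)^-n), with monthly rate r and n months
          r = annual_rate / 12
          n = years * 12
          if r == 0:
              return principal / n
          return principal * r / (1 - (1 + r) ** -n)

      # e.g. a $200,000 mortgage at 6% over 30 years works out to roughly $1,199/month
      print(round(monthly_payment(200000, 0.06, 30), 2))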
    • This implies that either 1) language has no ambiguities, or 2) artificial intelligence is possible. Rewind 30 years and see how far AI has come. Not very. But in 30 years we've learnt a lot about language, and it's very ambiguous. Which is why no one wants to program in English when they can use Perl or C or some other abstraction. Will AI be possible? Your assumption is that it will. I'm not so sure.
  • by lildogie ( 54998 ) on Friday May 28, 2004 @12:04PM (#9278482)
    Airplanes will crash, nuclear weapons will detonate, and NBC will air a hokey movie about it all.

    I'm just glad I live in a later timezone. Oh, wait....
  • This quote from the article struck a nerve for me: People who started using computers after the PC revolution have no idea about the miserable user experience that centralised computers imposed. Even the worst PC designs today feel positively liberating by comparison.

    I started using computers about the same time Nielsen did (only 28 years ago for me :). One of the trends that keeps rearing its ugly head is the return to centralized computers. Nowadays they call them "Application Service Providers", or

    • by mschuyler ( 197441 ) on Friday May 28, 2004 @12:25PM (#9278705) Homepage Journal
      Why do you cringe? I'm replacing 150 public computers with $300 thin clients coming off a terminal server (well, a cluster of them), just exactly what you are talking about. Right now, if I need to change anything, I have to visit 150 computers individually, even for the tiniest tweak of a config file. Plus I have to lock these things down tight because John Q. is either stupid and wrecks stuff unintentionally, or he's trying to show me how clever he is by sabotaging the machines and attempting to hack my system. So that means stuff like Centurion Guard, Fortres, keys, and all kinds of crap that wastes my time.

      With thin clients, I make the same change on the server and it's all done. It IS a return to the mainframe model, and it's one I'm extremely happy about because it will make my life so much simpler. Once I get these 150 done I'm going after 150 staff computers. Most people simply do not need real PCs, and half of them couldn't see a difference anyway. As long as they get a login screen and a desktop they couldn't care less if the files they create are stored on a server or locally, or whether they have a hard drive somewhere under their desks. Sure, there are a few folks who are going to need local storage for various reasons, so they can keep their PCs. But the vast majority simply don't need it. I'm also saving money. Even when you amortize the servers over the number of thin clients they can support, my capital cost is half what it would be for PCs.

      I surely would not advocate that approach for any of us, perish the thought. But in the real world in a production environment, which slashdot certainly is not, it's a viable solution.
  • "'Computer games in 2034 are likely to offer simulated worlds and interactive storytelling that's more engaging than linear presentations such as those in most movies today.'"

    Entertainment in 20 years will be more entertaining than entertainment today? Go figure; never saw that coming.
  • Given the current state of privacy and such, not to mention plain old need and common sense, exactly who did he talk to who's asking for a feature that records every moment of your waking life?

    Dick Cheney?
    John Ashcroft?
    Donald Rumsfeld?
    The girl in "50 First Dates"?

    This is basically a 'flying cars' article.
  • I think we can agree that computers - or technology in general - replace workers.

    I remember Arnold B Scrivener - the story of a scrivener (hand copier) left useless by the invention of the typewriter.

    So robotics are edging out industrial line mechanics.

    I suggest that good software will soon be edging out intellectual translators - the people who speak in professional languages because the "rest of us don't understand" - like lawyers, for example.

    even doctors - essentially translate a list of complaints into
  • The Turing Point... (Score:4, Interesting)

    by solarlux ( 610904 ) <noplasmaNO@SPAMyahoo.com> on Friday May 28, 2004 @12:07PM (#9278514)
    What I'd like to know is when... computers will have the same level of consciousness as we do.

    At that point, they will be empowered to invent and innovate creatively without the biological encumbrances we have. Imagine a human-like mind that can, while thinking, remember every fact with equal clarity. And imagine the scope of that knowledge base to include all discovered facts. Every theoretical mathematical conjecture could be instantly evaluated and computed (no more tedious sessions working with Mathematica). Sci-fi writer Vernor Vinge has stated that this point in history will be so revolutionary that we are entirely incapable of seeing what lies after it -- a horizon "singularity".
    • You mean "if"... (Score:4, Informative)

      by sean.peters ( 568334 ) on Friday May 28, 2004 @12:20PM (#9278659) Homepage
      Since it's by no means a sure thing that computers will EVER attain consciousness.

      I also have heartburn with the term "singularity" as applied to the growth in computer capability. "Singularity" is a mathematical term with a precise definition: it's a point where a function blows up, taking arbitrarily large values - think of the limit of f(x) = 1/(x - 1) as x approaches 1. But "Moore's Law" is an exponential function - its value and slope are finite at every point on the curve.

      While I understand what people mean when they discuss a computer "singularity", it's really not a very accurate way to use the word.
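
      To put numbers on that distinction, here's a tiny sketch (mine, not Vinge's or the article's): an exponential is enormous but finite at every finite time, while 1/(x - 1) genuinely diverges as x approaches 1.

      import math

      def moore(t_years, doubling_period=1.5):
          # Exponential growth: both the value and the slope are finite at any finite t
          value = 2 ** (t_years / doubling_period)
          slope = math.log(2) / doubling_period * value
          return value, slope

      def singular(x):
          # f(x) = 1/(x - 1) has a true singularity at x = 1
          return 1.0 / (x - 1.0)

      for t in (10, 20, 30):
          print(t, moore(t))        # large, but finite

      for x in (1.1, 1.01, 1.001):
          print(x, singular(x))     # grows without bound as x -> 1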

      Sean
  • Circular Logic (Score:2, Interesting)

    by beatleadam ( 102396 )
    I knew that it could feel good to use computers, and I wanted to recapture that sense of empowerment and put humans back in control of the machines.

    While I applaud anyone who is willing to attempt to predict "30 years in computing" and, like everyone, cannot say he or she is wrong (after all, it has not happened), I have to say that this is a useless article from Mr. Nielsen. In the same couple of paragraphs that he is talking about his dislike of the mainframe and his pleasant experience with the desktop
    • Something non-empirical doesn't necessarily equate with useless. People's feelings are important to qualitative research and can sometimes get at things that quantitative/empirical studies cannot.
  • What's all this about screens? Why the hell isn't he expecting projection onto the back of the retina? [bbc.co.uk]
  • Diamond Age (Score:3, Interesting)

    by Glog ( 303500 ) on Friday May 28, 2004 @12:11PM (#9278562)
    Nielsen's Law of Internet bandwidth? Puh-lease... Anyone quoting themselves in anything but a scientific paper sounds rather pompous and pretentious.

    Nielsen may be a fine usability expert, but as a futurist and visionary he is lacking in the imagination department. I strongly recommend The Diamond Age by Neal Stephenson for an inspired read of what computing may be like many years from now.

  • Sounds like someone needs bigger iron or to turn off the SETI client.
  • by mst76 ( 629405 ) on Friday May 28, 2004 @12:12PM (#9278570)
    can hold every movie and sound track ever published.
  • by RoufTop ( 94425 ) on Friday May 28, 2004 @12:12PM (#9278573) Homepage
    People have been expecting these interactive movie worlds to tell us non-linear stories for at least a decade. There are several problems with this line of thinking: it's far more expensive to tell a non-linear story than a linear one, moviemakers are much better at telling stories than audiences, and people LIKE linear stories.

    Alternate endings to movies on DVDs and open-ended worlds in games like GTA are good examples of the kinds of things we'll be doing for a while. But a story told from a million angles? Forget it. Even with technology to create those worlds, you still need to think about, well, everything, and all the consequences of every action. It's not gonna happen.

    What we like about linear stories is their flow from conflict to resolution. And we see movies because the people that make them are good at what they do. The original storytellers around a fire could have sat there waiting for their "users" to interact with them ("storyteller, put the mail on the duffel bag" :-), but instead they were valued for their imagination and timing.

    rouftop
  • by dylan_- ( 1661 ) on Friday May 28, 2004 @12:13PM (#9278586) Homepage
    I think it's crazy trying to predict 30 years in the future unless as a sci-fi scenario.

    I mean, if you'd asked me in 1974 what things would be like in 2004 I simply couldn't have guessed what we'd have now. Actually, I'd probably just have replied "Goo! Gah gah gah! Whaaaah!" but that's beside the point...
  • In the future games will be totally portable and totally sidetalkin' [sidetalkin.com]!
  • by JoeBuck ( 7947 ) on Friday May 28, 2004 @12:15PM (#9278605) Homepage
    Dogbert: I can predict the future by assuming that money and male hormones are the driving forces for new technology. Therefore when virtual reality gets cheaper than dating, society is doomed.

    Woman (to Dogbert): Is Dilbert available?

    Dogbert: He's been in the holodeck since March.

  • 'Computer games in 2034 are likely to offer simulated worlds and interactive storytelling that's more engaging than linear presentations such as those in most movies today.'

    I could spew meaningless crap like that all day for a fiver.
  • From the article: If I keep up my exercise schedule, I stand a good chance of experiencing computers 30 years from now. According to Moore's Law, computer power doubles every 18 months, meaning that computers will be a million times more powerful by 2034. According to Nielsen's Law of Internet bandwidth, connectivity to the home grows by 50 percent per year; by 2034, we'll have 200,000 times more bandwidth. That same year, I'll own a computer that runs at 3PHz CPU speed, has a petabyte (a thousand terabytes
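
    Those headline numbers do at least follow from the article's own assumptions. A quick back-of-the-envelope check (the 18-month doubling, the 50%-per-year bandwidth growth and the 3 GHz starting point are the article's; the arithmetic is just a sanity check):

    years = 30
    cpu_factor = 2 ** (years / 1.5)       # Moore's Law: ~1,048,576, i.e. "a million times"
    bandwidth_factor = 1.5 ** years       # Nielsen's Law: ~191,751, i.e. "200,000 times"
    phz = 3e9 * cpu_factor / 1e15         # a 3 GHz chip today scales to ~3.1 PHz
    print(f"CPU: {cpu_factor:,.0f}x, bandwidth: {bandwidth_factor:,.0f}x, clock: {phz:.1f} PHz")
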
  • by Sunken Kursk ( 518450 ) on Friday May 28, 2004 @12:22PM (#9278679) Homepage
    Given the way it seems DRM and such have been going recently, I have a different view of where home computers will be in the future...

    In the past, Internet Terminals were heralded as the wave of the future. This was because of their convenience, ease of use, etc. I see them now as the wave of the future because they don't store content. They are simply a gateway into someone else's content. Once the RIAA and MPAA have finished their buyout of the legislative and legal system, new regulations will require that computers not store any information. That way the big guys don't have to worry about the little guy sharing music or downloading the latest episode of Law & Order - Pothole Repair Crew for free. To listen to music, plug in your credit-card and connect to their services. Only $5.99 for an hour's worth of music. Want to play the latest game? Only $2.99 to plug into the Doom 5 server and play.

    This can even extend to the workplace. Microsoft Office Services. For $15,000 per year, you can get a 10 connection license to allow your employees to work on presentations, software requirements, etc. Then for only $150,000 per year, two of your developers can connect to Microsoft Development Studio Services and work on that software you need written. Then for the low-low price of $200,000 per year, Microsoft will go ahead and host the software you wrote. Imagine, you don't have to worry about backups, and you'll never need to worry about the BSA pounding down your door.

    All that needs to happen is widespread acceptance and availability of broadband. This is sure to have happened in 30 years.

    Think this can't happen? I guess we'll have to wait and see.

  • Doubling transistors doesn't equate to doubling speed, as Nielsen would have you believe. This is a big error for someone of his stature. I respect the guy as far as usability is concerned, but he talks out of his ass in this article.
  • by CycleMan ( 638982 ) on Friday May 28, 2004 @12:25PM (#9278697)
    Okay, petabytes and exabytes sound interesting from a "Wow - technology" perspective, but why do I care? Will they improve how we live our lives, increase the amount of face time we spend with each other, decrease hunger and poverty, elevate the human spirit or cure race relations?

    That amount of computer storage probably won't be enough to help men understand women. =)

    I'm growing in favour of technology being just a little more clunky and difficult so that people will move their heads away from the monitor once in a while - and not just to make new PC mods.

  • 1. Finally!!!! Computers will wreck a nice beach... er recognize speech.
    2. EUI. Emersive User Interface, perhaps something like Minority Report or The Matrix. I mean manipulating virtual objects in real space, not jacking in.
    3. Cyrano Virus proof-of-concept hits on your girlfriend (or mom, in the case of hopeless nerds)
    4. Indian tech giant "Bollysoft" is investigated for anti-competitive practices, cuts costs by farming out tech support to the up-and-coming Afghan tech industry.
    5. Computers finally translate dolphin speech. Turns out it's mostly fart jokes and machismo pick up lines. And, they are very interested in our culture's "beer" and "ESPN"
    6. PentiumXI processor requires a 220-volt electrical connection and liquid oxygen cooling. Intel investigates opening small wormholes between processors and the surface of Jupiter's moon Europa for a joint processor-cooling/planet-heating terraforming project.
    7. That's right, virtual 3D holographic blue screen of death.
  • by kabocox ( 199019 ) on Friday May 28, 2004 @12:28PM (#9278734)
    I don't want to ever have to see another computer again. Period. I want a watch, and eye and ear improvements. I want my eye improvement to be able to give me 20/20 vision. I want it to record everything that I've ever experienced and be able to display anything to the same level. The ear implants should be able to record both ears' worth of audio in the full human hearing range and store it. They should be able to reproduce almost any sound that the human ear can perceive. The watch should be where everything is stored, the CPU where everything is processed, and it should be easily removable and replaceable when we figure out how to make smaller, faster, and cheaper watches. Oh, and the watch should tell time and do GPS as well.

    The next big thing will be the touch interface.
  • "If you'd like Calculon to double-check his tedious paperwork, press 2." *BEEP*, (fry presses 1) you have selected option 2. No I didn't! I'm 99% sure you did.

    Ah....interactive movies.
  • I always thought the holodeck represented what they thought would be the ultimate in video game technology 400 years into the future. However, there was a well-known problem with it, which in later series acquired the name 'holoaddiction'. Basically, it created completely immersive worlds that were completely real to all the senses, down to the finesse of actual replicated matter for some elements. It was something so powerful it made Evercrack addiction look like the equivalent of a jones for skittles v
  • Sounds like Holodecks to me. Can anyone tell me when they were invented (according to Star Trek)?

    Got to admit, for many things it's hard to think of a more perfect interface than a holodeck simulation.

  • I probably still will be using Dosbox, UAE, MESS and MAME to play some 80s games every now and then. :)
  • by theodp ( 442580 ) on Friday May 28, 2004 @12:45PM (#9278933)
    People who started using computers after the PC revolution have no idea about the miserable user experience that centralised computers imposed.

    Check out Plato [platopeople.com]. Pre-1975 bitmapped graphics, audio and photographic quality images, instant messaging, near zero latency multiplayer network gaming, distance learning, groupware, newsgroups, online newspapers, animated email, network delivery of music, client/server computing, touch screen interfaces, flat-panel displays, and multimedia that were delivered across a worldwide educational network with satellite and cable communications using CDC mainframes.
  • by hak1du ( 761835 ) on Friday May 28, 2004 @12:46PM (#9278944) Journal
    No matter how you author or present a story, people will still experience it in some linear order. Authors spend a lot of time making sure that the order a reader actually gets is interesting and makes sense; that's what a big part of good writing is all about. Linearity is added value for a story, not a restriction.

    Many games may well be "non-linear" (i.e., have many different paths), but that's not to make them more engaging, it's to make them more replayable. And there will also continue to be many highly linear games that present a single, well-designed storyline as part of the game, although hopefully authors will find ways of making the interaction with the storyline more natural than "you must find switch A and trigger it to continue".
  • My guess (Score:3, Insightful)

    by Pendersempai ( 625351 ) on Friday May 28, 2004 @12:51PM (#9279003)
    Processing and storage will be recentralized.

    Imagine: a couple hundred corporations around the United States each have dedicated facilities to process and/or store information. Other companies network these commodities together into a coherent computing service. These companies could specialize in redundancy/dependability, power, or affordability. You subscribe to one of these companies' services, and they give you a username and password. Now you can use any compatible I/O device, log in, and you're at your (virtual) computer.

    These I/O devices could be anything from a current monitor/keyboard/mouse desk setup to a wireless touchscreen you carry around with you (assuming pervasive WiFi). Even if it's a palmtop, it'll have all the processing power and storage of your desktop setup. So a gameboy would be just as powerful as a desktop system, and a no-moving-parts $10 MP3 player could access your entire hard drive. The virtual computer recognizes which device you're using to access it, and adopts its interface accordingly.

    But the I/O devices could start posing as appliances: your kitchen telephone AND your cell phone are just computer terminals. Your coffee maker takes commands from the virtual computer: once you've set your alarm clock (another computer I/O device), your coffee maker knows when to start preparing a morning pot of coffee.

    I don't even care to speculate what this model would do to our legal battles over IP and DRM; I think 30 years is far enough in the future that the technology will remake the legality beyond recognition.

    The barriers to this model of computing are bandwidth and (to a lesser extent) wireless permittivity. Many of the gains could be recognized even with only wired technology -- it's just that the alarm clock, coffee maker, and mp3 player would have to jack in to a wall port somewhere.
  • in 30 years... (Score:4, Insightful)

    by hak1du ( 761835 ) on Friday May 28, 2004 @01:02PM (#9279123) Journal
    I don't need Nielsen to tell me that computers will be faster and displays will be bigger (although it is likely that Moore's law will have fallen by then).

    Nielsen seems to be saying that computers will be used largely the same way they are being used today, with some obvious tweaks. While computers have gotten faster, fundamentally we have made little progress in how we interact with them over the last 30 years (Smalltalk and the Alto were being developed in the 1970s and contained most of the paradigms that the most advanced commercial desktops are using today), and Nielsen is basically saying that not much will change over the next 30 years either. That may excite him, since it allows him to continue to peddle his user interface incrementalism, but, frankly, I find it depressing.

    One thing is certain: in 30 years, we will still have self-appointed "gurus" that make a name and a business for themselves by repeating populist techno-babble and buzzwords, but without having any real insight or vision. That has nothing to do with computers, it is just human nature, and that won't change.

"The medium is the message." -- Marshall McLuhan

Working...