Technology

The Computer of 2010

nostriluu writes "With the assistance of award-winning firm frogdesign (the geniuses behind the look of the early Apple and many of today's supercomputers and workstations), Forbes ASAP has designed and built (virtually, of course) the computer of 2010."
This discussion has been archived. No new comments can be posted.

  • If memory serves correctly, didn't someone have an input device that connected to your fingers? You moved your fingers around in certain ways to communicate.
  • After all, who else makes a computer's looks its main selling point?

    Aside from the design of a pretty box to put the computer in, this article might have gotten 3rd place in a local junior high school science fair.

    It's a set of extrapolative predictions that could have been put together by a layman in a couple of hours of searching the Internet. This falls short of what most of us here could probably just sit down and type out without doing any research at all.

    For example, there are no guesses about what specialized coprocessors will be the rage in 2010. Will 3D be the big thing, like now, or will acceleration for certain AI functions be the cool off-CPU gadget? Will we still think a big specialized FPU is a big deal, or will we just have a whole pile of small, parallel integer units?

    This is the interesting kind of question about future computers. We know they'll go faster, use less power, and store more data, and we can put them in any damn box we please - that kind of speculation is as bootless as it gets.

    ---
    Despite rumors to the contrary, I am not a turnip.
  • by xtal ( 49134 ) on Tuesday August 22, 2000 @05:53PM (#835039)

    I don't know why everyone thinks we want to talk to computers. I want to talk to my computer about as bleeping much as I want to talk to my television. I can't talk 100 WPM but I can get close to that on a keyboard - and I don't know why you'd want to change that. Even thought recognition would be a pain in the ass. I can type almost without thinking about it - which might explain some of my posts, ha ha, but surely you must know what I mean; thoughts flow easily to keyboards that might not to voice. Maybe that's conditioning, but writing down thoughts is something that goes back through all of recorded history, and I think it's more than just me.

    Computers of the future will be optical. They'll run at hundreds of GHz. They'll have stupidly huge hard drives. Hell, they might even think. But you won't be talking to them - because it's plainly not efficient compared to other input techniques, like keyboards. Do you know how sore your vocal cords would be after dictating all day?

    Arrggh. That's my rant for the day.

  • I've no idea what the level of insight usually is at Forbes ASAP, but this article seems to be taking the idea of prediction back to the same era as the vehicle-tubed, silver-suited, plastic B-movie period of the '50s.

    While some of the "details" appear to be semi-plausible extensions of current technology-in-progress (there's some holographic storage in there, and it sounds like there's a bit of work being done on optical connections), most come across as partially fanciful, attention-grabbing fictions with a vague or shortsighted basis in reality, with no real reason for being there apart from being different from what we have now.

    For instance, a lack of keyboard is a ridiculous idea. Perhaps it might work for simple dictation, but that assumes there will be some device/method that is faster for navigation (I probably use my keyboard more than my mouse to get around the screen) and for non-dictionary input.

    Other "advancements" are more in tune with the author's desire for the PC to become a fashion accessory, rather than a practical tool. "Digital Butler"? Come on... While there is certainly a (growing) market for this, the majority of sales will still (yes, even in the future...) be for the purposes of functionality. And for functionality, one needs... practicality!

    Further, while it may look good, it's also been designed to be very general purpose - plug it into this wall/that desk/an eyepiece. Surely the author could see that separate appliances (PDAs, desktop terminals, servers) are the way things are going, rather than having a single versatile unit acting as all things?

    Wildly inaccurate. I would hope.

  • And adding to that, this hallucination isn't exactly mindblowing: if I am stuck with a single terabyte of storage in 2010, that will be a true showstopper. Even simple extrapolation (today's roughly 100 GB drives doubling in capacity every year is 100 GB x 2^10) gets us 100 TB ten years from now. And there are more insufficiencies: why is it that I have to plug my computer into the wall at home to make my house come to life? If there's anything obvious about the future, it's networks everywhere connecting computers everywhere. RAM doesn't match hard disk capacity, either: 256 GB of holographic RAM and only 1 TB of hard disk space?
  • Why wouldn't this thing be a wearable stereo-optic widget instead?

    Also, I don't want my whole house on one computer; I want lots of embedded devices that talk to each other using _very_ simple, easy-to-secure protocols. That way viruses don't crank my thermostat up to ultra-bake, close all the windows, and flick the lights on and off until I have a dang seizure.
  • I don't know if I'd be inclined towards speech recognition even then. I -like- typing. (except when my wrists decide they're pissed about having been fractured years ago. Then I'd surrender half my organs for high quality speech recognition)
  • Yes, even his description of the desk turning into a giant touch-sensitive computer screen is from the movie (Dillinger's desk in "real life" is like that) .. when you need to type something, the screen displays a simulated keyboard with touch-sensitive buttons.

    I didn't think anyone remembered that movie .. it was one of my favorites as a kid.

  • I especially like the idea of the EFF or 2600 folks walking to their desks and using biometrics to log in to their computers... uhm, no. Talk about incredibly invasive; next they'll have those blood samplers like in Gattaca.

    FYI, the original Lost in Space took place in 1998. Danger Will Robinson, Danger.
  • ...until someone develops a programming language specifically suited for voice recognition...
    Already been done - it's called COBOL. While not specifically designed for dictation, it sure looks like it was.

    Disclaimer: I haven't touched or even looked at COBOL since they tried to teach it to me back in the dark ages, when a misspelling of the word "environment" caused the compiler to spit out two errors for each line of source code...
    Things may have changed since then.
  • You could see it, of course.

    By 2010, most "computers" will be next to invisible as they will be a natural part of the objects in the home.

    The most computer-like object to be seen will be a thin, magazine-sized color display with a touch-sensitive surface. These will be dirt cheap, found everywhere, and will communicate via IR or wireless IP. Somewhere in the home will be a box with disk storage and an IP connection to the external world (via cable or phone). CD, DVD, etc. players will be freestanding as now -- your TV or HiFi will access them as network devices.

    All will run Linux kernels :-)

  • I haven't read this article yet, but my guess is it'll be just like 2001: A Space Odyssey. None of that shit came true!
  • "Delete all games that I have not played in the last 5 months and then defragment my hard drive." and all other examples in the parent post aren't really about voice recognition. Sure, you can use voice recognition to get this sentence in the computer, but then it will be just that: a sentence, nothing more. What you really need to make this work is to have the computer understand what the meaning of those sentences is, and that, imho, is way more difficult. Then you can use these kind of commands. And whether you use voice recognition or not is trivial. I'd be just as happy to type such commands in a box somewhere.
  • by Anonymous Coward
    Oh yeah, and no one will catch a disease sharing a touch screen, right? Visited a public ATM recently?
  • Yeah, that's what I care about. The single most important thing about the computer of 2010 is that it looks like a flying saucer. Oh, sure, we'll toss in a reference to optoelectronics to show that we're hip to technological issues as well as artsy stuff.

    Duh.

    One form of the 2010 machine will be a tiny watch or pendant running Linux 4.8.16. But another will be a clunky tower, just like today, because the bigger the package, the more you can put inside. I doubt there'll be a place for their inconvenient, unstackable design.

    Technologically, maybe it won't even have a hard disk. Maybe it'll use optics, maybe something else. The only thing I care about is that it'll be big (storage-wise) and fast.

    As to the "swoopy" design, just check out all those 50's-era predictions of the future. Yeah, it'll look like a frisbee -- and no doubt I'll be wearing a silvery one-piece jumpsuit.

  • Mr. Victor Borge had this one nailed back in the '70s with his phonetic punctuation [www.kor.dk]
  • Look, if you are going to claim you are trying to predict the future, have some freaking content! This article had nothing about how we would get from here to there, nothing about what advances would make this machine possible (other than vague platitudes) and nothing about how this would fit into the world it lived in. To the authors of this article: Go read Heinlein, or Niven, or Asimov, or Clarke, or any other decent futurist before operating your keyboard! (because I seriously doubt any professional journalist would want to dictate his stories to a speech recognition system.)
  • uh...Windows 2006 or maybe even 2007?
  • IMHO, it is going away. When all the devices you use can talk to each other over the network (where network = Bluetooth LAN, Internet, whatever), SneakerNet becomes unnecessary.

    I disagree. There are lots of good reasons why removable storage is going to be around:

    • Secure file storage. Have data you need to keep private? Put it on a removable and put it in a safe deposit box or something. Or maybe you have a machine that, for security, can't be connected to a network.
    • Too fat for the pipe. I've dealt with jobs that require transferring files totaling over 1 GB to another office or service provider. Given how far we are from moving that much data quickly over a typical link, it's still faster to have someone stop by and pick up the discs/cartridges/whatever (see the sketch after this list).
    • Is everybody going to be l33t in the future? Will everybody who has a computer have the best/fastest wireless connection? It's not a matter of technology but of price and affordability, not just for the well-paid programmers with custom boxen but also for those who can only afford an eMachine or some such.
    I wouldn't mind being able to do away with removable storage, but IMHO there are too many good reasons why we should keep it around.
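
    Back-of-envelope, in C (the link speeds are assumed, roughly year-2000 figures):

    #include <stdio.h>

    int main(void) {
        /* Assumed link speeds in megabits per second (year-2000-ish). */
        const char *names[] = { "56k modem", "DSL (1.5 Mbps)", "T1", "10BaseT LAN" };
        double mbps[] = { 0.056, 1.5, 1.544, 10.0 };
        double megabits = 1.0 * 8 * 1024; /* the 1 GB job above, in megabits */

        for (int i = 0; i < 4; i++)
            printf("%-16s %8.0f minutes\n", names[i], megabits / mbps[i] / 60.0);
        return 0;
    }

    Roughly 40 hours over the modem and an hour and a half over DSL; the courier across town still wins.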
  • ...optoelectronics, another buzzword we encourage you to start using immediately.

    Optoelectronics? This is one buzzword which will never catch on. I know nothing about optics or electronics, I'm just a programmer, but it seems obvious that "optronics" is the clear choice among buzzwords for this emerging field. I say shun the (soon to be deprecated) term optoelectronics; adopt the much more advanced term, optronics!

  • by Tony Hammitt ( 73675 ) on Tuesday August 22, 2000 @08:12PM (#835057)
    1. You show me a way to enter C code with a voice system and _then_ I'll throw out my keyboard. I could just see it: "up, up, up, left brace..." Screw that.

    2. Processors don't have to spend 2/3 of their time waiting around for data. Real ones don't, at least. I have a 533 MHz Alpha that does 980 MFLOPS; don't tell me it's waiting around most of the time.

    3. I doubt that anyone will want to use Lithium batteries in ten years because fuel cells will have been out for 8 years.

    4. If we have a quarter terabyte of main magnetic memory, what is the terabyte of optical disk for? It's the only moving part in the computer, what the hell do we need it for? Magnetic memory is static.

    5. What about the network connection? OC-192? Better? I'd personally vote for some type of ATM, especially if we're going to use it for all of our communications. QoS is important; I don't want to lose frames on my movie just because someone calls.

    6. They think that absolute security relies on thumbprints? Give me a break (or break-in). What we really need is to make sure that IPv8 is double-key encrypted at all levels.

    7. There's nothing that they describe that is going to take a Cray to process. What does the typical secretary need with a supercomputer? A voice activated webpad is about enough. Gamers are another story entirely. Immersive VR is going to take more than they've got scheduled anyway.

    In short, the Forbes article is a fluff piece.
  • In my opinion, whether there is technical merit to this article or not, it sucks. And the reason is the way it's written. It blows in, drops hints of details like throwing candy at a crowd of children, then moves on to other things. It reads like a sales pitch, and a bad one at that.
  • Did I read correctly? I think it said that it had an optical hard drive and magnetic memory. Isn't that a huge step backwards? If my hard drive was as slow as my CD-ROM, and my RAM was as slow as my hard drive, I wouldn't be reading slashdot right now.
  • what would this input device register if the only finger being moved around in "this certain way" was the middle one?

    It could be a handy shortcut meaning find the root partition in /etc/fstab and run

    fsck /dev/whatever
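
    (For the record, the "find the root partition" half of that shortcut is only a dozen lines of C with glibc's mntent API. A sketch - it prints the command rather than running it, since fsck'ing a mounted root is its own kind of seizure:)

    #include <mntent.h>
    #include <stdio.h>
    #include <string.h>

    int main(void) {
        /* Walk /etc/fstab looking for the filesystem mounted at "/". */
        FILE *fstab = setmntent("/etc/fstab", "r");
        if (fstab == NULL) { perror("setmntent"); return 1; }

        struct mntent *ent;
        while ((ent = getmntent(fstab)) != NULL) {
            if (strcmp(ent->mnt_dir, "/") == 0) {
                printf("fsck %s\n", ent->mnt_fsname); /* print, don't run */
                break;
            }
        }
        endmntent(fstab);
        return 0;
    }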

  • in 10 years, we'll be on 70-140 GHz computers... (if my math skills are up to the task :)
    -moose
  • I have a big pile of Popular Mechanics magazines from 1990 to 1996 in my room. I always get a kick out of looking at all those funky concept car shapes that never came to be. A couple of them could have been viable in today's market filled with new Beetles and PT Cruisers.

    FYI, Popular Mechanics had a feature article in 1994 on upcoming "information appliances" that would pervade our lives and give us access to the "information superhighway" in just a couple of years. These devices would come in the form of slimmed-down desktop computers (iPaq, iOpener), hand-held devices (Palm/Visor) and set-top boxes for the TV (TiVo, ReplayTV, WebTV).

  • Actually the only real reason to do away with the keyboard is mobility. Frankly, it's a pain to haul around a standard-sized keyboard for data entry. For the desktop, I think the keyboard replacement is a tough sell.

    The mobile market, however, doesn't need voice recognition to get rid of the keyboard; Palm seems to have done a fairly good job of it with their entry system. Other companies could come up with a similarly easy-to-use entry system, or an even better one. Indeed, some alternatives are already available for the Palm.

    One question is: do you really want to use a lot of computing/hardware power for voice recognition, or would you rather use it for something else (assuming software designers can come up with an efficient use for that power)?
  • Do you know how sore your vocal cords would be after dictating all day?

    Not to mention the fact that it would IRRITATE THE HELL out of everybody within hearing distance. I mean, COME ON! I would hate to work in a cube, surrounded by a couple dozen people, all talking to their computers (well, we talk to our computers here, but mostly to swear at them). The clicking of a keyboard is pretty easy to ignore, because it's not particularly interesting to listen to, just a bunch of clicks.

    Also, if I had to talk to post all my slashdot comments, I'd probably be fired because people would finally realize how little time I spend working. :P
  • I would think we could come up with something better than biometrics.
    A biometric password is like using the same password everywhere: you know what it is based on, and of all the things that could be spoofed, I would think it would be among the easier ones. I don't know about you, but everything I touch doesn't hold evidence of my root password.
    What we would really want is a system that can't be hijacked - an authentication system that proves it is me (the living, willing me). A system that self-destructs when given the wrong password would be OK; however, you would probably be killed for using it.
    Maybe a system with a flesh-embedded chip (one that needs blood circulation), along with a relative-security-level password. If you are truly being hijacked, you could essentially open up a honeypot that contains very little real data but doesn't seem barren (see the sketch below).
    Perhaps this is too paranoid, but if so, you probably don't need biometrics either. Something like a fingerprint is just too likely to be damaged or non-repeatable to be useful.
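
    (A toy sketch of that duress-password/honeypot flow in C. The passwords and profile names are invented for illustration, and a real system would compare salted hashes rather than plaintext - this only shows the control flow:)

    #include <stdio.h>
    #include <string.h>

    enum profile { DENY, REAL, HONEYPOT };

    /* Invented credentials; real code would check salted hashes. */
    static enum profile authenticate(const char *pw) {
        if (strcmp(pw, "real-secret") == 0) return REAL;       /* normal login */
        if (strcmp(pw, "duress-secret") == 0) return HONEYPOT; /* under duress */
        return DENY;
    }

    int main(void) {
        char pw[64];
        printf("password: ");
        if (fgets(pw, sizeof pw, stdin) == NULL) return 1;
        pw[strcspn(pw, "\n")] = '\0'; /* strip the trailing newline */
        switch (authenticate(pw)) {
            case REAL:     puts("mounting real home directory");  break;
            case HONEYPOT: puts("mounting decoy home directory"); break;
            default:       puts("access denied");                 break;
        }
        return 0;
    }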

  • by Frymaster ( 171343 ) on Tuesday August 22, 2000 @05:59PM (#835066) Homepage Journal
    Face it: keyboards are still around after all these years because THEY WORK

    At least you didn't say they worked well. Hey, let's look at some input device "theory" shall we?

    1. You store information in your brain. It's chemical. It's analog.
    2. You want that information in your computer. It's electric. It's digital.
    3. Can it possibly be that the best way to bridge these two qualitative gaps is by wiggling physical limbs over hard plastic nubbins?
    4. Depressingly, the answer appears to be "yes"...
    5. So now it's down to a matter of appendages, nubbins and how you wiggle them (feel free to make porn jokes now)
    6. Alternate WAN (wiggling appendages over nubbins) techs have risen and fallen. The mouse is a popular WAN... but the guy who came up with the mouse idea (you know, what's-his-name who worked at SRI - Douglas Engelbart) also had this bizarre "chord playing" device for input as well... sorta like using a one-handed accordion.
    7. Text. We want text input because we're slaves to alphabetic, pseudo-phonetic written languages.
    8. WAN techs must not only be efficient but be acceptable by people as well...
    9. So, we need a WAN. It must be text-oriented, efficient and have a high acceptance rate among people.
    10. Your answer to that is the keyboard. I work with a guy who turns blue under the eyes without his stylus... the bottom line is:

    We have WANs now that do the job, but we have seen new WANs (mouse, stylus) come along, and there is no reason to think that WAN evolution will stop just because we like our F-keys and Num Lock. In 1983 I would never have imagined a mouse. But it happened.

  • Can you imagine rooms full of cubicles with everyone reciting, "file menu save as see colon backslash my documents backslash annual report dot doc enter"

    I don't even work in cubicles, but I know I would keep my office door closed a lot more often if everyone in the hall was chanting nonsense to their computers all day.

    Bingo Foo

    ---

  • Yeah, and the "The Desktop as Desk Top" part... Remember the evil CEO's desk?
  • >This should make university computer labs interesting, especially for people writing code.

    Obviously it will be nearly impossible to write code without using a keyboard, but most computer users are not writing code: they're sending e-mails, writing papers and looking up information on the Web. With suitably advanced software (10 years is a long time, and in many areas we're already there), this can all be done vocally, but there will always be a need for a keyboard.

    My point was more along the lines of "Can you imagine trying to think about anything, especially code, in a big room where everyone is busy talking to their computer?" i.e. the noise aspect. I could actually see writing code via voice, especially if you're using higher-level languages with less bizarre syntax--which would likely start being developed once voice recognition became mainstream.

    Heck, you could even write C code with voice, if you had a clever enough interpreter:

    "For I equals zero, array of I dot name not NULL, I plus plus, do printf percent dash twenty-five S space slash slash space percent seven D endstring comma array of I dot name comma lookup of array of I dot value, endblock."

    Okay, I take that back... I'd rather do C with a keyboard. (-:
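
    (For anyone who didn't feel like decoding that out loud - assuming i, array[], and lookup() are declared elsewhere, the dictation above comes out to roughly:)

    for (i = 0; array[i].name != NULL; i++) {
        printf("%-25s // %7d", array[i].name, lookup(array[i].value));
    }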

  • How 'bout a portable with a molecular memory and a completely optical circuit, including the chips? Lumenon has the chips in question already, and you all know the quantum trip. That means you would not only be able to communicate with your coffee machine and brother-in-law by anonymous e-mail at the speed of the dual death-and-life state of Schroedinger's cat, but you could do it in the dark, typing on a transparent plastic laptop while tripping on the play of light in the circuits.
  • <deep voice> Cars... I was promised flying cars. Where are my flying cars? </deep voice>
  • But above all, optoelectronic computing is faster than what's available today. How fast? In a decade, we believe, you will be able to buy at your local computer shop the equivalent of today's supercomputers.

    I hate to break their forward-looking uber-geek bubble, but isn't the Apple G4 [apple.com] considered a "supercomputer" by today's standards? One gigaflop is the cutoff, right? The G4 met it, right? So we can buy "today's" supercomputer TODAY, right???

  • In the long-gone days (1980) of the 80286...

    An 80286 in 1980? *snort* Right, and I had an Apple ][ in 1970.
    --

  • The disk will be ... a spinning, transparent plastic platter with a writing laser on one side and reading laser on the other...

    I like the two-sided laser idea, and the application of holography (which might enable you to exploit the thickness of the disk to store a few layers of bits rather than just one). But will we still be cursed with moving mechanical parts (like rotating media)?

    chips that use silicon to switch but optics to communicate... Instantaneous on-chip optical communication

    It sounds like they plan to replace any sufficiently long signal paths with on-chip optical waveguides, requiring an LED at one end and a phototransistor at the other. Putting LEDs on the same die with transistors is problematic today, but presumably they can solve that problem with some new LED chemistry. Next they need to be able to build optical waveguides into a die and insulate them from one another (so they need transparent and opaque materials that can be built up using photolithography). I dunno if such stuff exists, but they seem pretty confident about it.

    One of the biggest advantages of photonic circuitry is an extremely low power requirement.

    This is supposed to be a consequence of packing the die with LEDs and phototransistors, rather than charging up the RC delays of long signal lines? Hmm, maybe. The LEDs might not need much light to throw a bit a few microns.

  • How many times have we heard that by the year 2000 we'd be driving space cars and have robot maids a la Jetsons? Come on....

    You mean you didn't get your robot maid at the New Year's Eve party like everyone else? Well, that explains it ... you did get the neon jump suit, though, right?

  • What's the point of putting hard disk and CPU together if you have lightning-fast communications? Right: there's no point. You might as well separate them.

    I think this article takes the PC of today and wonders what would happen if all of the components in the PC were improved, and (surprise!) you get a very fast version of the PC. What this article does not do is wonder how we would build computers if we could connect the parts more efficiently. The PC I had six years ago was more than adequate to operate the fridge, microwave, TV and lights in my house. The only problem was that it couldn't communicate with those things out of the box. But what if the lightbulb were Bluetooth-enabled? It might someday become feasible, and what are we going to do then? That's what's interesting. I don't think I'll ever dictate an email to my PC; typing is much faster than speaking. I don't care if my word processor runs at 25 MHz or at 25 THz. I use my home PC for gaming, browsing and typing (in that order). Only the first type of use requires the kind of PC I have on my desk. This is not going to change. I'll probably be playing cooler games in 10 years, but what else am I going to do with the PC outlined in the article?

  • I can't wait until 10 years from now, when we all look at this article and laugh our asses off. :)

    :wq!

  • This is _precisely_ what I was thinking of. So does Forbes predict that we'll be able to throw the 2010 PCs at people to de-rez them? Hope not; I'm not all that good at frisbee or jai-alai.

    ObSimpsons: Has anyone here seen Tron?
  • by blaine ( 16929 ) on Tuesday August 22, 2000 @05:34PM (#835079)
    I mean, really, why do people want to do away with keyboards?

    Keyboards are quick and efficient. This article says that you'll instead use a 3D interface, and simply touch with your hands what you want to do.

    Is it me, or does that sound rather slow and clunky? Do I really want to be waving my arms around just to open a damn program?

    Face it: keyboards are still around after all these years because THEY WORK. They might not look futuristic or uber-high tech, but THEY WORK.
  • But I predict that within 100 years computers will be twice as powerful, 10,000 times larger, and so expensive that only the five richest kings in Europe will own them.
  • I can't talk 100WPM but I can get close to that on a keyboard

    Actually, you probably talk well over 100 WPM. Time yourself: I just checked, and reading fairly technical text ("Experiments in Physical Chemistry", Shoemaker et al.) at my normal speaking speed, I got about 250 WPM. When excited I easily do 300 WPM, which my students sometimes hate.

    I used to debate in high school and college. I "spread" at about 700 WPM; I knew folks who could easily top 1000 WPM. However, I suspect it's going to be many years before voice recognition gets to the point of understanding that.

  • I don't necessarily disagree with this, but of course C is a language designed for the current, keyboard-based, paradigm. A programming language designed for computers with voice-based input would presumably have an entirely different kind of structure.

    And that programming language will be written in C, hammered out on a keyboard.


    Sean

  • I can just hear it now - people trying to have cybersex at cybercafe computers...

    Well, it's better than meatsex on the cybercafe computers ... I mean, those keyboards really distract my girlfriend and the guy at the next terminal always complains that he can't hear the sound on the MP3 he's playing when we get into it ...

  • We've already seen an article, here I think, where a bloke from IBM or someone was talking about non-dynamic RAM with low power requirements to run and NO power requirements to maintain state. Now, I wouldn't be surprised if in fact it could stand a powered refresh every day or so to offset the effects of random magnetic fields - hell, you could put a field detector on the MM and it could open a circuit for a powered refresh every time it thought one was required.

    You don't want a disk in a mobile device, for a simple reason: torque. A fast-spinning disk is essentially a gyroscope. While it would keep the device stable in use, the cost is horrific forces on the axles of the disks. Go on, run around with a mobile computer containing a 10K RPM disk and see how long it takes to fail. I'd be surprised if it worked at the end of the day.

    So, clearly the computer of the future will contain two levels of RAM - something designed for performance with whatever power requirements that entails, and a larger bulk designed for stability over speed, which replaces the hard disk. Only it doesn't, because no matter how optimised it is for stability, it's still gonna spank a disk completely for response time and throughput. In 10 years, memory designed for stability without power (or with only a tiny backup battery rated at '12 months backup') is still gonna look quite nice compared to modern RAM for speed and such.

    So you'd have everything you would normally put into cache and main RAM in your fast RAM, and all the contents of your disk, as well as swapped-out memory pages, in your 'slow' RAM.

    To turn your computer off, all you do is flush the DRAM contents to stable RAM. How long's that gonna take? 0.1 secs, tops.
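
    (Sanity check on that: for the 10-20 GB of "fast" RAM suggested below, 0.1 seconds implies sustained bandwidth on the order of 100 GB/s - my assumption, and an optimistic one, though even a tenth of that still gives a one- or two-second shutdown:)

    #include <stdio.h>

    int main(void) {
        /* flush time = DRAM size / assumed memory-to-stable-RAM bandwidth */
        double sizes_gb[] = { 10.0, 20.0 };
        double bw_gb_per_s = 100.0; /* what the "0.1 secs" claim implies */
        for (int i = 0; i < 2; i++)
            printf("%2.0f GB / %.0f GB/s = %.2f s\n",
                   sizes_gb[i], bw_gb_per_s, sizes_gb[i] / bw_gb_per_s);
        return 0;
    }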

    To turn it on, all you do is read from slow RAM enough to give the user their desktop, and start swapping the relevant parts of the OS into DRAM.

    That removes the power requirements of the disk, which are quite high; removes most of the power requirements of DRAM, as there isn't much of it (perhaps only 10-20 GB - shockingly low for 2010); and leaves your computer safe to cart round with you. All we need now is a decent display. And as for all you guys worrying about the interface: imagine you strap something around each elbow that detects the nerve impulses bound for your forearms, where the muscles that control your fingers are located. Wireless, of course, with an encrypted link to the computer. I'm sure you'll learn to 'want to type' without actually moving your fingers and control your computer that way.

    If you could put such a device on your head, or better still implant it, imagine the "macros" you could use. Your computer is under attack? You think of being protected, and it replaces inetd.conf with a more secure one and HUPs inetd. And shuts down some other programs. And enables network logging. And performs reverse lookups on all IPs currently talking to you. And traceroutes them. And keeps the results... and and and and and

    Given the utility of such an interface to users of realtime computer systems, no not fighter pilots at all, guv, honest, I'm sure we'll see it or something like it by 2010. Once they exist they'll get cheaper and cheaper (until they fry someone's head, then they'll get expensive for a while til that problem is fixed), and then you'll be able to go down to Sony or Toshiba or GE and request surgery to have a neural link implanted.


    Anyway, I've done the Cyberpunk thing enough.

    Cache-boy

  • Here's why the no-keyboard fetish:

    They collect dust and hair and spilled beverages. They eventually break. They make noise when used. If you have a nice wooden desk, they mark it. They require either cables or, if using radio or IR, batteries. They come in standard sizes, while hands don't. They contain useless keys (case in point: the Windows keys). If you have to share them, you get to pick up whatever's on the fingers of your co-workers - and in most cases you probably don't want to know what that is.

    There are probably lots more bad things about keyboards, but that's enough for now.
  • Hey, you know... the PCs of tomorrow will be very similar to today's. Here's my prediction:

    Goddamn, I gotta play this game too...

    2010 State of the Industry
    1. Msft's new mouse goes "beyond optical"... it now tracks movement based on the earth's magnetic field. Naysayers point out that Sun produced a similar mouse back in '01, but you had to use a special planet with it...
    2. In a button-adding frenzy, logitech has released the 101-button mouse (wheel, lever, hand crank and ripcord included as well). Ad campaign: "it's a second keyboard... on wheels!"
    3. Logitech's new mouse prompts Wired Magazine to declare "The Keyboard is Dead"
    4. What the hell comes after "peta"? Now we gotta find out!
    5. Transmeta announces the ultimate in software emulation and completely eliminates all physical components in their new chip. Company officials say the zero mass of the chip will reduce shipping costs and inventory overhead... Torvalds admits in an interview that it's basically a Turing Machine with a box...
    6. Windows '09... It's got fins!
    7. Oni released.
    8. moodMac line released; a throwback to post-gen-X 70's nostalgia, it changes colour depending on your mood. Features quad G9 processors with repoVec, a sub-processor that actually uses Photoshop for you.
    9. Mac releases OS XVII.LXIV.rVII, considers upgrading to Arabic numerals
    10. You're still playing minesweeper?
    11. Compaq releases a "computer so advanced, it's smaller than a dime". Pundits say monitor size is a serious limitation. Ex-VP Lieberman new CEO of Compaq, changes ad campaign to "24x6 nonstop" to keep the sabbath...
    12. Seti@home finds alien life! He's doing 2 units a day on an AMD K21 TweetyBird.
    13. Bill Gates says "640Mb ought to be enough for anybody"

  • by sterno ( 16320 ) on Tuesday August 22, 2000 @06:00PM (#835091) Homepage
    Okay, Forbes comes up with the wonderful computer of the future. It's a sterile, pristine, easy-to-use, consumer-friendly, non-toxic happy computer. It has biometrics, optics, and of course no keyboard, because keyboards just aren't hip.

    Now, let's talk about the computer of the future I imagine. First of all, it will be a half-disassembled box with various optical cables coming out of it and a little bit of dust gathering on the exposed parts. The processor is of course tweaked in some way as to make it 1.5-2 times as fast, if occasionally unstable.

    The computer is hooked up via a wireless VPN to a bunch of my hacker friends all over the world, where we share our thoughts and our music in secrecy. Of course I've got a high-bandwidth Internet connection. It's perfect for serving up movies, music, and games, but it's still not quite enough to handle some of the latest technologies (some things never change).

    I've got several of my older computers hooked up on the other end. Sure, they are slow and primitive, but it's fun! Needless to say these are all in a state of semi-disarray, with cables in a giant spaghetti mess on the floor.

    Sure, I've got one of those cool mega-displays that shows everything in photographic quality on a screen the size of a desk, but I've got some throwbacks. I've of course got a keyboard, since those virtual keyboards are kludgy at best. I've got a scrolling LED display I found in a junkyard and managed to hook up to my box. If somebody tries to hack my box, it displays a message on the LED to let me know what's happening.

    Now, that sounds like my dream computer of the future! Maybe it would be nice to have something portable to go with it, but I want a box I can hack and play with.

    ---

  • Did anyone else find the page navigation a little out of sync with the subject?

    Forbes is talking about the computer of 2010 to people who, they believe, want or need a link at the bottom of the page to get back to the top. Considering that the Forbes readership is supposed to Have All The Money, I'm a bit worried by this...

    Forbes' next article: "The Scrollbar In 2010", with a sidebar on the marvelous research being done on keyboard shortcuts...

  • Silly article. Maybe they should do something on cars of the future and how they'll fly and have 18 cup holders. Or how about the TV of the future with 20,000 channels? Maybe the robot of the future that will clean your flying car and change channels for you. A space-age oven that you put a food pill in and out pops a complete meal.
  • The Desktop as Desk Top
    In 2010, a "desktop" will be a desk top...in other words, by plugging our computer into an office desk, its top becomes a gigantic computer screen--an interactive photonic display. You won't need a keyboard because files can be opened and closed simply by touching and dragging with your finger. And for those throwbacks who must have a keyboard, we've supplied that as well.

    A virtual keyboard can be momentarily created on the tabletop, only to disappear when no longer needed. Now you see it, now you don't.


    This has got to be the most idiotic thing that I have ever heard of in my entire friggin' life. Think about this: you sit down, plug your comp into your desk, and you proceed to work for 8 hours bent over your desk. I don't know about you people, but I have three monitors hooked up to my machine and at the end of the day - 12-14 hour days at that - I have a bastard of a crick in my neck. If I had to hunch over all day, not only would my neck hurt, but so would my back... and as an added incentive, my woman could call me Quasimodo from then on.

    No thanks.

    Rami
    --
  • by Veteran ( 203989 ) on Wednesday August 23, 2000 @03:40AM (#835110)
    If you had asked Forbes in 1990 what the computer of 2000 would look like they wouldn't have been very close. They might have gotten the processor speed and memory size correct - but that would have been about all.

    There would have been no way that they would have predicted the importance of the Internet - or something like Slashdot. In 1990 the communications capability of computers was only known and appreciated by a very few geeks; most people had local-call modem access to bulletin boards, if they had anything. (Please don't post about how you had access to the Internet in 1983 - that just proves YOU are a geek and nothing else. Who could an average person have used as an ISP in 1990?)

    In 1990 very few prognosticators would have predicted anything like a noticeable percentage of people running a Unix style operating system. Nor would they have predicted anything like Windows 2000 or an iMac.

    One of the most interesting things about this article is that they had almost nothing to say about the machine's communication capability with the world outside your house. I suspect this will be one of the most important aspects of that machine.

    One of the reasons that I bought OS/2 Warp 4 was the voice recognition capability built into the OS. I wound up using it very little. Not because it didn't work, it did. The reason I didn't use it much was that in order to activate it I had to say the word 'desktop'. For me at least 'desktop' is a VERY difficult word to pronounce properly. The 'k' sound at the end of one syllable followed by the 't' sound at the start of the next is just tough to say. When I thought about it I realized that I pronounced it 'destop' as do many of the people who say it in normal speech. The computer didn't know what a 'destop' was.

    'Desktop' is a minor stumbling block, but it is the sort of thing that keeps voice recognition from being utilized as much as it could be. One of the keys to a useful voice command computer is to use words in the command structure that people can pronounce.

    There is also a slight misconception in the article; the good thing about optical communication between computer subsections is not the speed of light vs the speed of electrical pulses - the good thing is that optical communications can switch on and off faster; you can obtain higher frequencies.

    The article also gets it a little wrong when it blames the electrical interconnect for causing delays in main memory fetching. The problem is that DRAM speeds have only grown about 10-fold since the days of the Z80, while processor clock speeds are up by a factor of 250 or so. Unless there is a real breakthrough in memory speeds, that trend will continue.

  • To me, it seems as though the author of this article put absolutely no research into designing his "Computer of 2010." It was almost as though he knows nothing at all about computers and found some old tech magazines and science fiction articles and combined them into a semi-realistic computer.

    One of the things that bothered me the most was the appearance of the hardware itself. The author obviously thinks that the computer of 2010 is supposed to look like some kind of ugly disc that plugs into both the house and the desk. However, that is an entirely useless feature for the desktop of the future. If one truly wants his house to be computer-operated, then it will be done with devices specifically designed to do so - much like how cars have special onboard navigation computers to help the driver get around. Though it is possible to hook up a laptop, it certainly won't do the job as well as the onboard computer, nor will it be using the computer itself to its fullest extent. In other words, the technology has existed for over 20 years now; it's just that it is either too expensive or not interesting enough for the common Joe to go out and buy.

    But let's pretend that he didn't say that stupid thing about the house. Let's move on to the stupid things he said about the desk. You will plug this little module into your desk? Why? What advantages does this offer? What if you want to use this computer on the road? What if you don't have access to a desk? Well then, this idea becomes really retarded. Wouldn't it be easier to just carry around a laptop and hook it into a dock? That's basically what the guy "invented" in his little made-up story.

    But let's move past this dock and look at how the thing actually works. You will have some kind of desk that is actually a computer. You will plug it in, wave your arms around and drag your fingers around it? If someone walks into your cubicle, you will look incredibly stupid. Why get rid of the mouse and keyboard when they are such great tools? Why have a magically disappearing/reappearing keyboard? Wouldn't it be a LOT cheaper to have a regular keyboard? The whole interface is retarded. And let's not forget about the cost of this fantasy computer. It costs a fortune to get a 15" LCD screen. I don't think the price is going to come down enough in the future for us to have a desk-sized 3D touchscreen LCD. Even if it were, I wouldn't want it built into my desk. I'd want it the way they're dishing them out right now: a little stand but a big screen. A 30" LCD screen with 1600x1200+ resolution would be much better than what this guy proposes.

    Then there's his idea of security. These ideas won't take 10 years to implement; they're perfectly available now. The only reason they haven't caught on yet is that it's too much effort for something that can be handled just as easily with a 10-letter password. It seems as though this guy was told to write up a story about the "world's most expensive"/"fantasy" computer.

    Once we get past the terrible ideas for user interfaces, we get to terrible technical rantings by the author about the hardware. It almost seems as though he was paid by different manufacturers to drop their names in his article as a form of advertising. All the technologies he talks about have been known about for a long time, and he pretends that when IBM says 5-10 years, that means it will automatically be put into all computers in that time. But I think we all know that none of that is true.

    The author seems to have some misconceptions about the way hardware works, and what he says about RAM makes almost no sense at all. All in all, this article seems like a hack job. I think it's interesting that they put it on Slashdot, because it gives real geeks the opportunity to poke holes in it and gives the rest of the community a place to think about what computers will REALLY be like in 10 years.
  • One fact that I've always found interesting is the incredible resistance that users have to changing the qwerty keyboard. I realize that there are differing opinions, but when I tried using a Dvorak layout in college, it improved my typing speed considerably. Of course, even though I was willing to relearn the keyboard layout, I quickly gave it up when I realized it would be too confusing to go back and forth, and I was going to have to use other computers than my own home PC, which would all have QWERTY.

    (Since then, I've gotten fast enough on QWERTY that I think there might be some truth to the theory that QWERTY can be just as fast as Dvorak. But I guess that's like trying to figure out how many licks it takes to get to the Tootsie Roll center of a Tootsie Pop. "The world may never know.")

    Another example is the FITALY keyboard, a keyboard layout for Palm Pilots that is optimized for one-handed stylus hunt-and-peck speed. It's a great idea, and everyone I've heard of who's tried it claims a huge increase in speed and accuracy. Despite this, competing products with a QWERTY layout are selling extremely well (I think).

    Since users are so incredibly loyal to the old familiar QWERTY keyboard, I am pretty confident it will still be the primary input device in 2010.

  • This has got to be the biggest piece of horse hockey my brain has ever played with. The authors of this article have obviously not been paying attention to trends in computing.

    Most of the technologies that they mention are in the theoretical stage at this point, and as we all know, most theoretical technologies are press fluff. 5 years ago I remember hearing about "ion drives" that would be able to write a GB of data to a square inch by changing the electrical signal of individual molecules. It was an optical technology, etc. Where is it? Still in my May '95 copy of Wired, apparently.

    It is a well-known fact that academia has a cute tendency to announce technologies that will be available "in a few years" knowing full well that they will never materialize. Hell, we're still waiting for Rambus and sapphire chips, aren't we?

    Also, the computer market is moving more toward embedded computing and small "appliances" like wireless web-pads, not the monolithic beastie presented here.

    And the idea that the "Biometric horah-doodah" will make my computer infinitely secure? Yeah, when the Slashdot community has been lobotomized...

    And I can't see my employer shelling out for the future desk they write about either. The f***er won't even get me a separate phone line, for Buddha's sake.

    This might be the computer of 2525 - or, better yet, the computer of 2050 - but even then I doubt it. Most likely this is just the unfortunate side effect of an acid flashback.

    (Besides, I have this scary vision of everybody in my office talking in C code at once and me screaming across the room, "Shut up! You're screwing with my syntax!") But in ten years apparently programming will be something you do in plain English. (Ha, ha, ha.... They said that in 1980 about 1990...)

    At least a cow leaves behind something solid, powerful, and nutritious for geese. Forbes has simply contributed to the landfill... But hey, mental masturbation is almost as fun and doesn't leave your arm all tired...

    ~Hammy

    "The 486 processor is so powerful it is doubtful that it will ever be used in anything other than high end servers." -Byte Magazine, October 1991
  • The last thing I need is a computer that looks like a frisbee

    When I want to use it the dog will have it in the backyard waiting to play catch.

  • To build our new fast cache, we'll need to get rid of the inefficiencies of today's product, which requires the computer to constantly refresh it,... The inefficiencies in cache are so bad, in fact, that once you know the speed of your cache you can assume that its real-world performance will be about a third of that--the missing two-thirds being sacrificed to refresh cycles.
    Isn't cache SRAM (i.e. static RAM)? So it doesn't need its charge refreshed periodically, unlike DRAM (i.e. main memory)? From what I understand, SRAM is currently so bulky and expensive that it would be totally uneconomical to completely replace DRAM with it - but SRAM does have a number of advantages: lower power consumption (no refreshes) and faster performance.

    Then again, maybe I'm smoking crack. Can someone back me up or correct me?

    While I'm on the subject...

    we'll hitch it directly to the CPU with a multiplexed optical bridge
    Wouldn't it be faster to incorporate the cache on-die, as with the Celeron A?

    Holographic memory is three-dimensional by nature
    Uh, why's that? What makes 'holographic memory' any different from regular memory? (I don't think they're wrong, I just want more info).

  • After an extensive look at this computer's specs, I am confident when I say, "Sign me up!" I mean, look at that thing:

    Hard Disc
    *FAST* memory!
    CPU
    Power supply
    and to top it all off, MAIN RAM.

    Who would have thought that by only 2010 we're going to be seeing computers "[that], believe it or not, [are] about the size of a Frisbee". Time to throw this old .75x2.5x2.5-footer out the window (about the size of half the available space on any desktop in the world). Time to get me a PC that can *really* fly!

    The best part of all, they've incorporated 20th century "The Clapper" technology, for us stingy throwbacks who are scared of product ideas that are actually new!

    'Plug it into the wall with a magnetic clamp and watch as our home comes to life. In essence, the computer becomes the operating system for our house, and our house, in turn, knows our habits and responds to our needs. ("Brew coffee at 7, play Beethoven the moment the front door opens, and tell me when I'm low on milk.")'

    Someone weld a misty-mate on the side of one of these suckers, and I'll drop my other testicle!

  • > in 10 years, we'll be on 70-140 ghz computer

    I get 101 GHz, assuming 1.0 now and doubling every 18 months.

    BTW, there was a story elsewhere earlier today where Intel was bragging about Willamette running at 4.0 GHz in 2004. That's right on for the traditional version of "doubles in speed every two years", but industry has been doing better than that for 10 years (+/-) now. If 4G is the best they can offer, they'll be well down the road to bankruptcy, since AMD should be at 6G by then.
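
    Same math as a compilable sketch; the 1.0 GHz starting clock and the doubling periods are the assumptions in play:

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        double base_ghz = 1.0;  /* assumed year-2000 clock speed */
        double months = 120.0;  /* ten years out */
        /* 18-month doubling gives the ~101 GHz above; the "traditional"
           24-month doubling gives only 32 GHz. */
        printf("18-month doubling: %.1f GHz\n", base_ghz * pow(2.0, months / 18.0));
        printf("24-month doubling: %.1f GHz\n", base_ghz * pow(2.0, months / 24.0));
        return 0;
    }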

    --
  • Because it's small (about the size of a Frisbee) and because it has the power of today's supercomputer, the 2010 PC will become the repository of information covering every aspect of our daily life. Our computer, untethered and unfettered by wires and electrical outlets, becomes something of a key that unlocks the safety deposit box of our lives.
    "You will each receive an identity disc. Everything you do or learn will be imprinted on this disc. If you lose your disc, or fail to follow commands, you will be subject to immediate deresolution. That will be all." -- Command Program Sark
  • At the rate and trend we're on, it's likely to be a monolithic box with a single fiber-bundle input, chock full of DSPs and reprogrammable chips - like the Crusoe, except we don't get to go inside.

    It's mounted on a rack in the closet, and the cabling goes all throughout the house. Better yet, BlueTooth.

    Any component you want to add can be plugged in anywhere. A new flat-screen TV is your monitor, as is the PDA in your pocket. You speak into the air in any room, and you are obeyed. You buy a new refrigerator, and it's suddenly online. Where you put the keyboard - and there WILL be one - is a matter of decoration more than functionality.

    And it's completely transparent to all, except the technologists - which is as it should be. Just as I don't care to know the exact air/fuel mix in my engine, neither does my mechanic care about his chip-set or the temperature of his CPU.

    People who are not passionate about the tech find it too complex and too intrusive. They want a box they plug in, more easily than a stereo component or VCR. They just want it to work, seamlessly and without requiring them to RTFM.

    The computer of 2010 may be more like a CD changer than anything else. The computer of 2015 will be a freaking LAMP. Seriously... You can cram a whole lot of hardware into those things - all that empty space. Network the thing via power-lines, and to upgrade your processing power, you just buy another lamp, or TV, or Microwave... Or a slot mounted, monolithic box (the size of a VHS tape at most) that you plug-in to a rack in the closet.. But this is where I came in.

    The REAL jabber has the /. user id: 13196

  • Can you imagine trying to think about anything, especially code, in a big room where everyone is busy talking to their computer?

    Agreed, but that doesn't mean it's not useful in a home setting, for instance, or a private office. Naturally office etiquette would prohibit the regular use of such things in shared areas.

    And even if you are coding away at your latest application, I would think it's more efficient to pause for a moment and say, "Computer, when's my next appointment?" than it would be to move out of your development app into a calendar of some sort, and then go back. Even in a public area, this level of occasional voice control is probably acceptable. *shrug*..

    I totally agree, though: keyed input will still be primary for most industrious work, but simple tasks in a more intimate setting would be so much more efficient if they could be done effectively by voice. Just think: you could browse the web, update your calendar, or compose a few e-mails while cooking yourself dinner or cleaning house. After a long day at the office, that level of ease-of-use would be spectacular.

    I'm a big advocate of "behind-the-scenes" computing, where the PC is hidden and unintrusive (and today's paradigm largely unneeded).
  • by plugging our computer into an office desk, its top becomes a gigantic computer screen--an interactive photonic display. You won't need a keyboard because files can be opened and closed simply by touching and dragging with your finger
    Sounds like Dillinger's computer in the cult movie "Tron".
    Is SF the primary source of inspiration for engineers?
    (No answer needed)
    Anyway, something really scares me: they still have this need for mono-processor machines with one hard disk, etc.
    I think it would be cooler to just design Lego-like components, each of which would be a tiny computer that could interact with the others, like in the good old times of the Atari ATW [computingmuseum.com].
    So, instead of paying a huge amount of money to change computers every 6 months (however quick they are, you know people will still pay to upgrade them; a friend pertinently compared computers to cars: you want them to work properly but also to amaze your neighbours), why wouldn't we pay a few bucks for some more GIPS as needed? With wireless communication, this would then be tomorrow's computer, and I bet my vision is far more realistic than ASAP's nice-looking box.
    --
  • The premise of that article is entirely ridiculous. About two-thirds of the "technologies of the future" have a date of availability listed as "2010 (With luck!)"

    Hell, I may as well say "With luck, we'll all have robot maids and hovercars like the Jetsons in 2010," because it's about as solid a prediction as that article writer's.

    Assuming all of these technologies are released exactly on-schedule, they will be prohibitively expensive (Ex: Any new processor for about 2 months) and poorly implemented in the software-side (Ex: USB in Windows 95).

    Not to mention the 'appearance' of the computer of the future. Apparently, in 10 years, our computers will be comprised primarily of colorful rectangles and circles. Neat.
  • First of all, I hate dirty optics. This includes any layer of glass that I have to read through. So why the heck do I want to read small text on a sheet of glass with fingerprints all over it?

    The thing about voice interaction, or any other form of poorly defined interaction, is ambiguity. Try to build ANY context-free language that understands plain English and you'll quickly realize that English is not context-free. Even if we were to somehow create an incredibly smart interpreter on the computer end, typing '1' means exactly that, but speaking "one" could be a different story.

    Besides, typing ';' is so much damn faster than saying "semi-colon". I'd hate to dictate C code.
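
    (The spoken-symbol problem in miniature: any dictation front end needs a spoken-word-to-token table along these lines - the entries here are invented for illustration - and "one" already shows where the ambiguity starts:)

    #include <stdio.h>
    #include <string.h>

    struct mapping { const char *spoken; const char *token; };

    /* A made-up fragment of a dictation table. */
    static const struct mapping table[] = {
        { "semi-colon", ";" },
        { "left brace", "{" },
        { "one",        "1" }, /* ...or did the speaker mean the word "one"? */
    };

    int main(void) {
        const char *heard = "semi-colon"; /* pretend the recognizer heard this */
        for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
            if (strcmp(table[i].spoken, heard) == 0)
                printf("\"%s\" -> %s\n", table[i].spoken, table[i].token);
        return 0;
    }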

  • You show me a way to enter C code with a voice system and _then_ I'll throw out my keyboard. I could just see it: "up, up, up, left brace..." Screw that.

    I don't necessarily disagree with this, but of course C is a language designed for the current, keyboard-based, paradigm. A programming language designed for computers with voice-based input would presumably have an entirely different kind of structure.

    I have no idea what that structure might be, but it's interesting to think about, yes?

    In short, the forbes article is a fluff piece.

    Most Forbes articles are.

    -

  • by achurch ( 201270 ) on Tuesday August 22, 2000 @06:26PM (#835158) Homepage

    Let's take a little look at this proposed computer of 2010:

    SECURITY

    The PC will be protected from theft, thanks to an advanced biometric scanner that can recognize your fingerprint.

    Now all they need is biometric scanners on screwdrivers too.

    INTERFACE

    You'll communicate with the PC primarily with your voice...

    This should make university computer labs interesting, especially for people writing code. And how about when your friend Bob pops into your office to say hello:

    ... therefore propose that in order to cut the cost of this project by 35%, all managers oh hi, Bob, what's up? Oh, not much, the usual. Find any new porn sites lately? Yeah, check out www.example.com. Cool, thanks. Anyway, all managers should...

    The Desktop as Desk Top

    In 2010, a "desktop" will be a desk top ... You won't need a keyboard because files can be opened and closed simply by touching and dragging with your finger.

    Be careful when drumming your fingers.

    Your Home

    The PC of 2010 plugs into your home so your house becomes a smart operating system.

    "Open the refrigerator door, HAL." "I'm sorry, Dave, I'm afraid I can't do that."

  • If memory serves correctly. Didn't someone have an input device that connected to your fingers? Moving your fingers around in certain ways to communicate.

    Out of curiosity, what would this input device register if the only finger being moved around in "this certain way" was the middle one?

    =================================
  • Will we still be faced with the current decision - go with the flow and deal with crap (Windows), or pioneer into better worlds but live without almost all the useful applications we want (Linux, BSD)?

    Operating systems have an incredible half-life. NT is already the better part of a decade old. We will definitely still be kicking around Win2k, Linux, and OSX in 2010.

  • I don't know what I'd do without a keyboard. How the heck do you dictate Perl?

    As for the Windows keys, I'm using a keyboard that's old enough not to have them. If keyboards really do break as often as you say, maybe you're just mistreating them. The only keyboards I have that don't work are cheap ones I bought just to steal parts from.

  • And picking up what's on your co-workers' fingers: are you a hypochondriac or what?

    Actually, computer keyboards have for a while been the largest vector of cross-patient contamination at hospitals. People disinfect the toilets and occasionally remember to scrub the doorknobs, but people rarely think to try to swab down a keyboard, in part because doing so would be difficult with today's keyboards and their many pits and crevices. A doctor who examines his patients while wearing latex gloves often forgets to remove those same gloves before looking up some record or info on his computer, and those who don't wear gloves often forget to wash their hands first, though they religiously scrub after the whole examination.
  • "The PC will be protected from theft, thanks to an advanced biometric scanner that can recognize your fingerprint. "
    A fingerprint scanner means that you would have to "log in" to your computer before using it and "log out" to protect it from misuse. Unfortunately, this was not mentioned in the article. With a voice-activated computer, wouldn't it just recognise your voice? It seems more likely that the computer of 2010 will scan your retina with an invisible laser. The computer will recognise you if you just glance at it from across the room.

    "A virtual keyboard can be momentarily created on the tabletop, only to disappear when no longer needed. Now you see it, now you don't."
    Personally, I find the thought of typing on a flat, solid surface very uncomfortable. Although a voice-activated computer sounds pleasant, some people will always need a REAL keyboard. Programmers and accountants come to mind.

    "With such capacity, you'll be able to store every ounce of information about your life. But beware. If your computer is stolen or destroyed, you might actually start wondering who you are."
    Will the computer of 2010 lack a backup device? Will it not be subscribed to a backup service? With the increases in bandwidth and encryption technology, I would be surprised if the concept of the PC still exists as it does today.

    In the near future I see the PC being replaced by the Terminal. Your data and applications will be stored at your ISP. Your ISP will encrypt and back up your data to numerous locations around the globe. All communications between your terminal and your ISP will be encrypted. In fact, I would be surprised if the "computer" didn't become as everyday as the telephone. You can pick up a phone anywhere in the world, dial in to your office, and check your voicemail. I can see the computer of 2010 being just a terminal with a slot for your smartcard. Insert your card and the terminal connects to your ISP - all encryption going through the card. If your smartcard is stolen, it can be invalidated with a phone call to your ISP. It is also possible, with advances in cybernetics, that an individual will have an encryption device implanted in the body. The device will communicate with the terminal by radio signals.
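
    The card-based login described above is essentially challenge-response authentication: the secret key never leaves the card, and the terminal merely relays messages. Here's a minimal Python sketch of how that handshake might work, assuming an HMAC-based scheme; the SmartCard class and the ISP-side check are stand-ins for illustration, not any real smartcard API:

        import hmac, hashlib, os

        class SmartCard:
            """Stand-in for the card: it holds a secret the terminal never sees."""
            def __init__(self, secret):
                self._secret = secret
            def sign(self, challenge):
                return hmac.new(self._secret, challenge, hashlib.sha256).digest()

        # The ISP issued the card and keeps its own copy of the secret.
        secret = os.urandom(32)
        card = SmartCard(secret)

        # Login: the ISP sends a fresh random challenge, the card signs it,
        # and the ISP checks the response against its copy of the secret.
        challenge = os.urandom(16)
        response = card.sign(challenge)
        expected = hmac.new(secret, challenge, hashlib.sha256).digest()
        print(hmac.compare_digest(response, expected))  # True -> open a session

    Revoking a stolen card is then just the ISP forgetting its copy of the secret: no future challenge can ever be answered correctly again.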

    The future holds many advances in technology and personal computing. In that future I cannot see anyone carrying their PC home from work and plugging it into their desk. Nowadays people carry pagers, cellphones, and PDAs - let's not add another thing to the list.

    -Mike_L

  • You're absolutely correct. Except you forgot something. Try adding punctuation, capitals, curly braces, quotes, formatting - telling the cursor where to go - and you'll find your speed goes through the floor and aggravation goes through the roof. I had a Newton for a while and this was the main problem - the handwriting recognition was actually really good - the problem was putting the text in the right place on the screen, and that time outweighed any benefit the handwriting recognition brought you.

    Don't believe me? Try programming with ViaVoice. Or doing more than simple dictation, like adding bold or *Removing* bold to a sentence. You'll go insane.

  • by RovingSlug ( 26517 ) on Tuesday August 22, 2000 @10:13PM (#835185)
    I can't talk 100WPM

    Ummm... yes you can. Easily. When was the last time you said, "Mississippi one, Mississippi two, Mississippi three"? Saying that at a moderate pace, each vocalized numeral occurs about once a second. So together, that's two words a second, or 120 WPM. A Mississippi is no small word!

    With that brief analysis, for non-technical text, I'd say it's very reasonable to speak at a steady 200-250 WPM. Type that on your keyboard, then we can talk! :p

    Of course, until someone develops a programming language specifically suited for voice recognition, I won't be coding via voice any time soon.

    BTW, that last sentence was 22 words; if you can read it in 6 seconds, that's 220 WPM. Try it. Now say it at half the speed (whoa! that doesn't feel very fast at all) and that's 110 WPM, still above your asserted top typing speed.
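
    To make the arithmetic explicit, here's the back-of-the-envelope calculation as a tiny Python sketch (the word counts and timings are the ones claimed above):

        # Words per minute from a word count and elapsed seconds.
        def wpm(words, seconds):
            return words / seconds * 60

        print(wpm(22, 6))   # the 22-word sentence read in 6 seconds -> 220 WPM
        print(wpm(22, 12))  # the same sentence at half speed -> 110 WPM
        print(wpm(2, 1))    # "Mississippi one" pace, two words/second -> 120 WPM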

    - Cory
  • by TheDullBlade ( 28998 ) on Tuesday August 22, 2000 @06:36PM (#835189)
    The master control is made up of no fewer than 50 redundant processors, each smaller than a millimeter cube and much faster than a modern computer, though they vary somewhat in their construction, scattered about my reinforced skull. There is no way for all of them to be damaged while I remain alive.

    The workhorse of the system responsible for archiving and heavy processing, a solid-state homogenous massively parallel connection machine about the size of my fist, is tucked away in my chest cavity between my oxygen-storage organ and my 2nd and 3rd backup hearts (cylinders about the size of a thumb).

    The interface can take any form, since the system has full access to all nerves leading into my brain, and has plenty of power to simulate a believable environment; it can be superimposed atop real-world data or it can be fully immersive. The failsafes are carefully trained (in a process taking months of daily feedback training) direct neural connections to the master control, which can be used to cut off any problematic computer functions and reconnect my mind to my body; a spare nervous system is tattooed across my skin so there is no single point of failure. I'd have to be almost fully decapitated to lose control of my body (not likely to occur accidentally, thanks to carbon microfiber reinforcements). I'd certainly be dead before my connections were disabled, unless a very careful surgery was undertaken.

    Connections to other computers can be made by many different forms of electromagnetic transmission, or by tiny electrical currents through contact on any of hundreds of points on my body.

    Power is available through the main storage battery of my body (a distributed system with surprising capacity), but essential functions, such as the master control, can be supported by generation of minute amounts of power from the glucose and oxygen in my blood.

    ---
    Despite rumors to the contrary, I am not a turnip.
  • The disk will be holographic and will somewhat resemble a CD-ROM or DVD. That is, it will be a spinning, transparent plastic platter...

    About the only thing I'm confident saying about handhelds 10 years from now is that they won't have discs. Discs are cheap, but solid-state memory is almost cheap enough NOW for a portable computer. In 10 years, spinning discs will seem as antiquated as the spinning tape reels that adorn movie computers.

    I think that in general, the line between "live" and "archived" storage is going to be blurred more and more, in all computers. I don't expect that portables will distinguish between memory and mass storage at all.

  • Hardware will be so cheap and powerful that we will forgo the dull, mundane materials now used in favor of "optoelectronic substrate". It will be moldable into various shapes, strong enough to be used as a substitute for drywall or maybe even steel, and it will have any "active properties" you might desire.

    Need a computer? No problem. All you have to do is tap the desk and say "keyboard" (remember, everything is made of this material) and a keyboard and active display will appear on your desk. Any data you enter will be stored permanently on a secure central server that you can access from any item that is made of substrate.

    Lightbulbs and sockets? The dark ages! Just tap the wall and say "light, medium intensity" and the wall will emanate a soft glow.

    Children will be born and grow up in entire cities where the walls, sidewalks, and cars are made of substrate. Just tap your foot on the sidewalk and say "hopscotch!" and a hopscotch board appears.

    You'll take your daughter to visit the country. She'll tap a rock and nothing will happen. She'll say "how boring".

  • by matman ( 71405 ) on Tuesday August 22, 2000 @07:06PM (#835200)
    I can just hear it now - people trying to have cybersex at cybercafe computers...
  • The article is tech-lite; for instance, they say that current RAM uses magnetism, which hasn't been true since the days of core.

    Overall, I feel that the article is total marketing B.S., ignoring things such as usability and the limits of Moore's law.

    Hand input and 3D are already here; they're used for molecular modelling. Designers will love this feature too, but I agree with the comment about the lack of keyboards.

    I do have my doubts that we will still be using Li-ion batteries in ten years. There are other technologies that should be working well by then that offer better energy density and lower self-discharge (Li-ion loses charge quickly).

    I'm inclined to agree with your commentator about notebook memory going non-rotating. Spinning a disk of any technology costs power.

    From my own point of view, I see three market slots. There will still be a PDA: smaller but very personal, a lot more powerful than anything we have now, and combined with a mobile telephone and a GPS. For input, we have voice or pen, and for output a small flat panel or a mini-HUD on a pair of glasses. Emphasis on low power and portability.

    I still see the notebook. There is always a place for a larger and more detailed display and lots more memory, all of which can fit in a briefcase. Expect it to look like an A4 pad with keyboard/pen and a 3D mouse. Viewing may be through either the built-in flat-panel display or a mini-HUD.

    Actually, HUDs are quite interesting technology, if they can follow your pupil. You only need detail where the eye is looking; the rest of the picture can be shown at lower detail.
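
    That trick - what's now usually called foveated rendering - is easy to sketch: pick a level of detail as a function of angular distance from the gaze point. A minimal Python illustration, with made-up detail tiers and an assumed pixels-per-degree calibration:

        import math

        def detail_level(px, py, gaze_x, gaze_y, pixels_per_degree=40):
            # Angular distance (eccentricity) from the gaze point, in degrees.
            eccentricity = math.hypot(px - gaze_x, py - gaze_y) / pixels_per_degree
            if eccentricity < 5:     # fovea: render at full resolution
                return 1.0
            elif eccentricity < 20:  # near periphery: quarter resolution
                return 0.25
            else:                    # far periphery: coarse
                return 0.0625

        print(detail_level(960, 540, 1000, 500))  # near the gaze point -> 1.0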

    Expect enough wireless peripherals to be floating around that we'll be worrying about the prevailing EM field around our bodies.

    In the end, the workstation will remain because there are people who don't want to compromise on speed or expansion capability.

    All Forbes used those guys for was to design a game console.
  • No one has any idea what *anything's* going to be like in 10 years, let alone something changing as much as a computer. I read through this and it sounds like some guy's just making up things. Sure, there's technical background to what they say, but it's about 90% speculation.

    How many times have we heard that by the year 2000 we'd be driving space cars and have robot maids a la Jetsons? Come on....
  • by blaine ( 16929 ) on Tuesday August 22, 2000 @05:43PM (#835217)
    First up, I have RARELY had keyboards break. I mean, GOOD keyboards are practically indestructible. I have an IBM keyboard right now that I've actually spilled an entire glass of water on, which I took apart and dried, and it still works fine.

    If you have a wooden desk, get something to put your keyboard on. Try rubber feet, or just put down some sort of pad between the desk and your keyboard. You could say the same thing about a monitor, or the computer itself.

    Cables... who cares? So it's a cable. Most appliances use them. They aren't a big deal.

    There are MANY different types of keyboards. You can probably find one that fits your hands.

    As for useless keys... well, don't buy a keyboard with those keys. They DO exist. Or, make those keys useful by binding them to something you find useful. Nobody said you had to keep the default keybindings.

    And picking up what's on your co-workers' fingers: are you a hypochondriac or what? Ever use public transportation? Christ, ever walk down the street? Ever sit down in a chair in a restaurant? Take a chill pill.
  • IMHO these so-called artists should stick with what they seem to be good at, which, in most cases, is producing piles of junk and collecting money for it. I think this whole evolution is getting way out of hand, and I sure hope they won't get any influence over the design of computers. A computer is supposed to be functional, not a form of decoration.

    We've seen art "evolve" from making paintings and statues to throwing paint on the canvas and calling it a painting, to dumping piles of metal junk on the grass and calling it art (too bad about corrosion), right up to stringing up dead horses in a tree (another exclusive form of "art"). I really don't think we need that crap in the technology sector. While I admire these "artists" for getting paid for doing, IMHO, absolutely nothing, I don't feel good about letting them decide how the machine should operate and feel. This is a totally different sector; people need to work with these devices, which takes another form of expertise. Judging by most modern "art", I really doubt that most artists are capable of having some consideration for the public.

  • I thought that the computer of 2010 would have as many as 40,000 vacuum tubes, weigh only two tons, and cost only a few hundred thousand dollars. Plus fit in only one medium sized room.

  • They did not mention removable storage... I would be interested to see where that is going.

    IMHO, it is going away. When all the devices you use can talk to each other over the network (where network = Bluetooth LAN, Internet, whatever), SneakerNet becomes unnecessary.

  • I'd really like to see a huge advance in non-volatile storage, particularly something that is not fragile, very very small, and has much faster data transfer rates than hard disks. Admittedly, market trends make this unlikely by 2010.
  • Granted, power generation techniques haven't changed much in the last 40 years or so... The method by which we turn the turbines has seen innovation, but the fact remains we are still making big magnetic fields to induce current in wire coils. This doesn't bug me. It is easy and efficient.

    What bugs me is that battery and/or portable power technology has not yet reached anywhere close to a pinnacle in terms of storage or efficiency. They intend to put a lithium battery in this beast and have it run for two weeks? Great. Waste a bunch of engineering time so you can BEND A CIRCA 1999 BATTERY INTO A DONUT SHAPE?

    What about zinc-air batteries? What about fuel cells? Recharging, please! If this thing requires so little power, what about solar? I haven't used a tiny, cheap four-function calculator that needed batteries in about 6 or 7 years!

    These people are just making noise with buzzwords to sell their design services. There is little vision (as far as technology goes) apparent in their work.

    ~GoRK
  • Like you, I can type at ~100 WPM. But it took time to learn how to do that. Years. I wonder how fast I could be dictating if I had trained that as hard as I did typing. I'm also pretty sure that sore vocal cords would be a thing of the past once you mastered it, just as sore hands from typing faded once I learned to do that...

    Why does everybody demand that voice recognition require little or no training whatsoever? I've been able to move my fingers for years before I could type, and I'll happily accept that it'll be a considerable time before I can dictate with any form of precision and speed. Once we can do that (assuming that the current crop of speech recognition software catches up), *THEN* let's discuss if typing or dictating is faster than the other.

  • OK, it's 2000 and I don't even have a robot dog that can vacuum red plush carpet, or even a manned visit to Mars. So I expect the computer of 2010 will run Windows Naught-Seven, and the revolutionary feature of computers in that year will be leopard-skin cases.
  • Do you actually use a Palm regularly? I do, and some of my co-workers are true artists in terms of Graffiti-style writing. But they still write VERY slowly compared to their keyboard typing speed. It's just a slow way to enter data...

    Don't get me wrong - I really like using my Palm. But not for hardcore typing. Good for taking notes during meetings. Going over my calendar in the commute train. Catching up with the news that got hotsync'ed while I was busy typing to earn a living.

  • by DrWiggy ( 143807 ) on Tuesday August 22, 2000 @11:22PM (#835258)
    For some reason people are obsessed about guessing our future, and determining how our lives will be affected and what technologies will be in use. The simple truth of the matter though, is we are really, really bad at doing it.

    In the 1950's and 1960's a lot of predictions (many of which you have probably seen yourselves) were made as to how we would live in the year 2000. Now, I don't know about you guys, but I'm not wearing silver jump boots and driving my hover car to work yet, and I still have to cook all by myself. It was foolish to try and predict 40 or 50 years into the future, and we've learnt our lesson. So now, a design team want to get some publicity and attempt to predict where our computers are going to be in 10 years time. There are some flaws though.

    1. Why do I want a computer shaped like a frisbee in the first place? Too big to carry around, too small to make it look like it deserves a space in my office.

    2. All of these technologies are still on drawing boards and we won't see prototypes until 2010 at the earliest. In addition better and more useful technologies are likely to emerge between now and 2010 meaning some of these components may never get the R&D required to develop them.

    3. Desks as screens is the most stupid thing I've ever heard in my life. After a week's use, the cramp and pain in my neck would probably become unbearable. It's also an "illegal" position for visual display equipment to be placed in under the Health and Safety at Work Act in the UK. But hey, that nasty dude with the supercomputer in Tron had one, right?

    4. More powerful computers don't mean smarter computers. The article implies that because these machines will have power equivalent to something like a Cray J90 or somesuch, they will therefore make our environments "smart". So, does that mean they are predicting advancements in artificial intelligence as well? Funny, that, because for the last 2 decades people have been saying that in 10 years' time we'll have smart machines.

    In short, I think that this is possibly the worst article I have ever seen concerning the future of computers over the next ten years. Seeing as it's completely publicity-generating pie-in-the-sky and not clearly thought out by anybody who understands these issues, why am I even bothering writing this reply? Because I've got nothing better to do? In that case, I think I'll go and design my house of the future (*yawn*).....
  • I think that the future of input devices will not be an all-or-none solution, but rather a combination of input devices and methods.

    As it is, there are many things I want to be able to tell my computer to do. For example:

    Save this file then shutdown for the night.

    Saying that one sentence is much faster than waiting for the file to save, then checking to make sure that the OS actually shuts down OK (granted, more reliable OSes would solve half of this problem).
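
    If the recognizer gets the words right, the rest is just phrase-to-action dispatch. A toy Python sketch, with made-up action functions standing in for the real work:

        # Toy dispatch table: recognized phrase -> action to run.
        def save_file():
            print("saving...")

        def shutdown():
            print("shutting down...")

        COMMANDS = {
            "save this file": save_file,
            "shutdown for the night": shutdown,
        }

        def handle(utterance):
            # Run every known command phrase contained in the utterance, in order.
            for phrase, action in COMMANDS.items():
                if phrase in utterance.lower():
                    action()

        handle("Save this file then shutdown for the night")

    The hard part, as the rest of this comment argues, is the recognizer, not the dispatch.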

    On the other hand, there is no way in the world that you are going to separate me from my Rhino3d command bar; it's a lovely thing. Abbreviations are much easier to type than to say, and oftentimes written words form patterns that spoken words do not. That's the reason I can type my name two or three times faster than anything else: my fingers have become accustomed to typing it. On the other hand, I still stutter while saying my name (ironic, eh?)

    I do like the idea of a virtual desktop, though; there are many times in life when I just want to be able to play around with something in real 3D. In addition, it is easier to tell a computer to do something like:

    Undo that second to last spell check

    than it is to search through a document for the word that you accidentally corrected (spell checkers don't understand most industry buzzwords, go figure!)

    Of course, there is one major problem with voice recognition technology right now, namely:

    THE DAMN THING CAN'T UNDERSTAND A WORD THAT I SAY.

    Sure, sure, I've read the manuals: "may need 3 or more hours of training in order to work as advertised."

    I've also spent half an hour trying to get the trainer to understand a single word that I am saying.

    The reason?

    Simple.

    I have spent so much time reading that most of the words I know the definition of, I have never actually pronounced!

    Honestly, how often does the word defiled or the phrase uber mage come up in daily conversation?

    The fact is that the written English language deviates strongly from the spoken English language; heck, ask almost any English lit teacher if you don't believe me.

    Many words that people use in their daily spoken language, and whose pronunciation they have perfected, they have never actually written down; and many words that they write down, they may have never pronounced in their entire life. (I know I have gone 5 or more years without actually speaking a word while using it almost weekly in my writing.)

    The basic fact is that it is not natural to dictate a document. Most authors who dictate documents (do any writers still actually do that anymore?) have somebody else go through and eliminate the standard mannerisms of speech.

    (This also explains why those medieval fantasy novels sound so immensely lame when read out loud.)

    Yet there is hope for integrating voice recognition into the modern, everyday computer. Many long-winded or multiple-step commands would fit very well into a voice-recognition environment. Granted, all the Linux nuts out there have script files to do most of the work for them if the job exceeds 5 or 6 long-winded tasks, but for other commands, such as the given example of save-then-shutdown, this would be the perfect system.

    Imagine somebody being able to give the following command:

    Delete all games that I have not played in the last 5 months and then defragment my hard drive.

    The best part about the above command is that it is not technologically impossible. Heck, a file system which keeps track of "last date accessed" and allows for categorizing of directories (games, apps, pr0n) is all that is needed.

    What's more, it would save me the 5 or so hours it takes to sort through my games directories.
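
    To see just how mundane the mechanics are, here's a minimal Python sketch of the "delete games not played in 5 months" half of that command, assuming a Unix-like filesystem that records last-access times (atime) and a hypothetical ~/games directory; the hard part is parsing the spoken sentence, not the cleanup itself:

        import os, time

        GAMES_DIR = os.path.expanduser("~/games")  # hypothetical games directory
        FIVE_MONTHS = 5 * 30 * 24 * 60 * 60        # roughly five months, in seconds
        cutoff = time.time() - FIVE_MONTHS

        for root, dirs, files in os.walk(GAMES_DIR):
            for name in files:
                path = os.path.join(root, name)
                if os.stat(path).st_atime < cutoff:  # not accessed in ~5 months
                    print("would delete:", path)     # swap in os.remove(path) once trusted

    (Defragmenting afterwards is OS-specific, and note that some systems don't update access times at all, in which case atime tells you nothing.)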

    It has even another bonus: the average user could understand it! Granted, I'm not a big fan of the "average user" (*cough* newbie *cough*), but it doesn't exactly take a rocket scientist to understand the idea of deleting everything that hasn't been used in at least 5 months. Just follow up by (attempting to) explain defragmenting a HD (you have to do it, shut up, end of story -- my preferred method) and you have a very easy-to-use and often-needed command that is actually easier to use through voice recognition than by keyboard or mouse.

    Examples of other commands that are easier to give through voice recognition than through traditional input:

    Save this file as (document name)final.txt then delete all other previous backup copies of this file.

    Go to visited websites and rotate all passwords

    remove all non-program essential image files

    DELETE THAT DAMN DIRECTORY ALREADY (for those people who are stuck with windoze and uninstall shield, you know what I am talking about!)

    Find the sum of all items in column C and place it in the next available row of column C

    Run virus program, clean up temp files, then run defrag

    Partition the newly installed 20-gig HD into two 10-gig segments labeled D and E

    Copy all user created text files to the CD-RW that was just inserted

    Delete all MP3's that have only been listened to once

    Collect all non-program attached midi files and move them to usr/music/midi

    (yes yes, some people here still listen to midi files, get over it!)

    Anyways, just my, err, uh

    2 and a half bits I guess, heh.

  • by Master of Kode Fu ( 63421 ) on Tuesday August 22, 2000 @07:35PM (#835262) Homepage
    One way to get a sense of what computers will be like at a certain point in the future is to look an equal distance into the past. This approach isn't accurate and can't account for factors such as unexpected technological leaps or setbacks, but it's still a pretty good starting point.

    I recently purchased a used book called Computer Systems in Business: An Introduction. Published in 1986, it features computer systems that seem pretty quaint nearly a decade and a half later.

    One clever thing the book does is discuss the concept of P, a relative-cost measure for discussing how much computers cost in light of the opposing forces of inflation and Moore's Law, which make costs difficult to forecast in actual currency. P is equal to the average cost or value of a standard microcomputer system, which at the time consisted of...

    • CPU
    • Main memory (256K)
    • Keyboard
    • Monitor (they specified monochrome)
    • Two diskette drives
    • Dot matrix printer
    The book does mention some things that seem pretty quaint these days: main memory for some larger micros exceeded 2 megabytes in 1985, two diskette drives would give you a total on-line storage capacity of a megabyte, and my favourite line: "A Winchester is sometimes called a hard disk."

    The book goes on to discuss minicomputers (6 megs of memory with 200 ns access time, an 8K cache, and 1.8 gigs of hard disk -- er, Winchester -- space). Only when the book gets to mainframes do the machine specs seem somewhere in line with a machine you could probably buy at your local appliance shop...

    A diagram of a typical 1985 mainframe system is shown in figure 7-17. Main memory size is 64 megabytes, and access time is 60 nanoseconds. The system also has a small (64K) cache memory. In the typical mainframe system, the access time for cache memory (35 nsec) is 1.7 times more rapid than that for main memory.
    The diagram, which I can't include in this posting, shows 35 disk drives of 800 megabytes each (28 gigs total). In P units, the cost of such a beast was 1,043, or about $3.6 million.

    If we assume that the rate of tech progress over the next 15 years is roughly the same as the past 15, we have a starting point for visualizing what the average personal computer of 2015 could be like -- simply look at machines that cost about a thousand times more than the Celeron 600 at Circuit City.
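
    Working out the implied numbers (a quick Python sketch using only the book's own figures, plus an assumed ~$1,000 price for that Celeron 600 system):

        # One P = the cost of a standard 1985 micro system.
        mainframe_p = 1043          # the mainframe's cost in P units
        mainframe_cost = 3.6e6      # its cost in dollars, per the book
        p_1985 = mainframe_cost / mainframe_p
        print(round(p_1985))        # ~3452 dollars per P in 1985

        # So the "PC of 2015" resembles a machine costing ~1000x a standard PC today.
        standard_pc_2000 = 1000     # assumed price of a Celeron 600 system
        print(standard_pc_2000 * mainframe_p)  # look at ~$1M machines of 2000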

    Then, in the best /. tradition, imagine a Beowulf cluster of them!

  • Choose the future. Choose frisbee-shaped ergonomics and shopping for hardware in Vogue and Cosmopolitan magazine. Choose restrictive software and operating systems that obfuscate what lies underneath. Choose shouting impotently at a multimedia interface that knows what you want far better than you do. Choose software that spends your money for you, wants equal rights and can only be made to work by flattering its (bad) poetry. Choose electronic rebellion from your fridge, your toilet and your television. Choose news, censored by your intelligent agent to suit your delicate sensibilities, and by the FBI to suit their political agenda. Choose slashdot future. Choose life... but why would I want to do a thing like that?
  • The lojban language is being developed to be usable as a spoken programming language, from what I've heard. In lojban, that would be:

    .i go to ge to ga .abu zmadu by. gi cy. zmadu by. toi gi dy. dunli li no toi gi to ko dunse'a .abu to by. pi'i li mupinomuvo fe'i pai toi .ije ko dunse'a lypy'abu .abu toi

    Or something like that. It might even be shorter if I was better at lojban.

    I said that rather slowly and distinctly in 18 seconds.
    --
    No more e-mail address game - see my user info. Time for revenge.
  • "You'll communicate with the PC primarily with your voice... "

    This should make university computer labs interesting, especially for people writing code.


    Obviously it will be nearly impossible to write code without using a keyboard, but most computer users are not writing code: They're sending e-mails, writing papers and looking up information on the Web. With suitably advanced software (10 years is a long time, and in many areas we're already there), this can all be done vocally, but there will always be need for a keyboard.

    Be careful when drumming your fingers.

    With a full desk top of space to work with, I imagine I'd be keeping my critical triggers away from the areas I'd rest my hands. :) With that much space there's room to have some of it empty when you're not working with it.
  • How come every new concept of the future has to be aerodynamic? I mean, please, it's not like my machine is going to be going faster because it has a slender design. How the hell am I supposed to balance things on that thing if it has all those curves? Give me a damn computer that is a square and won't tip over.

    Although now that I think about it, a drink holder in the case would be pretty damn cool.
  • most) that you plug in to a rack in the closet... But this is where I came in. I believe the correct terminology is to "come out" of the closet ;) Rich
