The Computer of 2010 241
nostriluu writes "
With the assistance of award-winning firm frogdesign (the geniuses behind the look of the early Apple and many of today's supercomputers and workstations), Forbes ASAP has designed and built (virtually, of course) the computer of 2010."
Re:whats up with the no keyboard fetish?-Leather (Score:1)
Clearly, the form-over-function /Apple/ of 2010 (Score:2)
Aside from the design of a pretty box to put the computer in, this article might have gotten 3rd place in a local junior high school science fair.
It's a set of extrapolative predictions that could have been put together by a layman in a couple of hours of searching the internet. This falls short of what most of us here could probably just sit down and type out without doing any research at all.
For example, there are no guesses about what specialized coprocessors will be the rage in 2010. Will 3D be the big thing, like now, or will acceleration for certain AI functions be the cool off-CPU gadget? Will we still think a big specialized FPU is a big deal, or will we just have a whole pile of small, parallel integer units?
This is the interesting kind of question about future computers. We know they'll go faster, use less power, and store more data, and we can put them in any damn box we please - that kind of speculation is as bootless as it gets.
---
Despite rumors to the contrary, I am not a turnip.
Voice recognition stinks (Score:4)
I don't know why everyone thinks we want to talk to computers. I want to talk to my computer about as bleeping much as I want to talk to my television. I can't talk 100WPM but I can get close to that on a keyboard - and I don't know why you'd want to change that. Even thought recognition would be a pain in the ass. I can type almost without thinking about it - which might explain some of my posts, ha ha, but surely you must know what I mean: thoughts flow easily to keyboards that might not to voice. Maybe that's conditioning, but writing down thoughts is something that goes back through all of recorded history, and I think it's more than just me.
Computers of the future will be optical. They'll run at 100's of GHz. They'll have stupid huge hard drives. Hell, they might even think. But you won't be talking to them - because it's plain not efficient compared to other input techniques, like keyboards. Do you know how sore your vocal cords would be after dictating all day?
Arrggh. That's my rant for the day.
Pure (50s) sci-fi (Score:2)
While some of the "details" appear to be semi-plausible extensions of current technology-in-progress (there's some holographic storage in there, and it sounds like there's a bit of work being done on optical connections), most come across as partially fanciful, attention-grabbing fictions with a vague or shortsighted basis in reality, and with no real reason for being there apart from being different from what we have now.
For instance, the lack of a keyboard is a ridiculous idea. Perhaps it might work for simple dictation, but that assumes there will be some device/method that is faster for navigation (I probably use my keyboard more than my mouse to get around the screen) and for non-dictionary input.
Other "advancements" are more in tune with the author's desire for the PC to become a fashion accessory, rather than a practical tool. "Digital Butler"? Come on... While there is certainly a (growing) market for this, the majority of sales will still (yes, even in the future...) be for the purposes of functionality. And for functionality, one needs... practicality!
Further, while it may look good, it's also been designed to be very general purpose - plug it into this wall/that desk/an eyepiece. Surely the author could see that separate appliances (PDAs, desktop terminals, servers) are the way things are going, rather than a single versatile unit acting as all things?
Wildly inaccurate. I would hope.
Re:Flying computers! (Score:1)
SCREEN??????? One computer??? (Score:1)
Also, I don't want my whole house on one computer, I want lots of embedded devices that talk to each other using _very_ simple, easy-to-secure protocols. That way viruses don't crank my thermostat up to ultra bake, close all the windows, and flick the lights on and off until I have a dang seizure.
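For illustration only, a minimal sketch of what such a "very simple, easy to secure" device protocol could look like - a closed command whitelist plus hard range limits, so a hostile message has nothing to grab onto. The command names and temperature limits are invented for the example:

```python
# Hypothetical thermostat protocol: fixed-format text messages, a
# closed whitelist of commands, hard range limits. Anything else
# is rejected outright -- no "ultra bake" possible.
ALLOWED = {"set_temp", "get_temp"}
MIN_TEMP, MAX_TEMP = 5, 30  # Celsius

def handle(message: str) -> str:
    parts = message.strip().split()
    if not parts or parts[0] not in ALLOWED:
        return "ERR unknown command"
    if parts[0] == "set_temp":
        if len(parts) != 2 or not parts[1].isdigit():
            return "ERR bad argument"
        t = int(parts[1])
        if not MIN_TEMP <= t <= MAX_TEMP:
            return "ERR out of range"
        return f"OK temp set to {t}"
    return "OK temp is 21"

print(handle("set_temp 99"))  # a virus asking for ultra bake -> ERR out of range
```

The point is that a protocol this small can be audited in one sitting, unlike a general-purpose house computer.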
Re:Dictation requires training!!! (Score:1)
Re:Frisbee; personal information ... shades of TRO (Score:1)
I didn't think anyone remembered that movie .. it was one of my favorites as a kid.
Re:All this is crap (Score:1)
FYI, the original Lost in Space took place in 1998. Danger Will Robinson, Danger.
Re:Voice recognition stinks (Score:1)
disclaimer: i haven't touched or even looked at cobol since they tried to teach it to me back in the dark ages, and a misspelling of the word environment caused the compiler to spit out two errors for each line of source code...
things may have changed since then.
first error on first page (Score:1)
By 2010, most "computers" will be next to invisible as they will be a natural part of the objects in the home.
The most computer-like object to be seen will be a thin, magazine-sized color display with a touch-sensitive surface. These will be dirt cheap, found everywhere, and communicate via IR or wireless IP. Somewhere in the home will be a box with disk storage and an IP connection to the external world (via cable or phone). CD, DVD, etc. players will be freestanding as now -- your TV or HiFi will access them as network devices.
All will run Linux kernels :-)
2001 (Score:1)
Re:Integration of Keyboard, Mouse, and Voice Recon (Score:2)
Re:whats up with the no keyboard fetish? (Score:1)
Tee Hee (Score:1)
Duh.
One form of the 2010 machine will be a tiny watch or pendant running Linux 4.8.16. But another will be a clunky tower, just like today, because the bigger the package, the more you can put inside. I doubt there'll be a place for their inconvenient, unstackable design.
Technologically, maybe it won't even have a hard disk. Maybe it'll use optics, maybe something else. The only thing I care about is that it'll be big (storage-wise) and fast.
As to the "swoopy" design, just check out all those 50's-era predictions of the future. Yeah, it'll look like a frisbee -- and no doubt I'll be wearing a silvery one-piece jumpsuit.
How do you dictate Perl (Score:2)
Another content free article (Score:2)
And the operating system in 2010 will be... (Score:1)
Re:Very Cool (Score:1)
IMHO, it is going away. When all the devices you use can talk to each other over the network (where network = Bluetooth LAN, Internet, whatever), SneakerNet becomes unnecessary.
I disagree. There are lots of good reasons why removable storage is going to be around:
optoelectronics? (Score:1)
Optoelectronics? This is one buzzword which will never catch on. I know nothing about optics or electronics, I'm just a programmer, but it seems obvious that "Optronics" is the clear choice among buzzwords for this emerging field. I say shun the (soon to be deprecated) term Optoelectronics; adopt the much more advanced term, Optronics!
Bah! (Score:3)
2. Processors don't have to spend 2/3 of their time waiting around for data. Real ones at least. I have a 533MHz alpha that does 980 MFLOPs, don't tell me it's waiting around most of the time.
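For what it's worth, the quoted figures are self-consistent: a quick division (numbers exactly as stated above) shows the chip retiring close to two floating-point operations per clock, i.e. a dual-issue FPU running near its peak rather than stalling on memory:

```python
# Back-of-the-envelope check on the quoted Alpha figures.
mflops = 980   # measured MFLOPs, as quoted
mhz = 533      # clock speed, as quoted
flops_per_cycle = mflops / mhz
print(f"{flops_per_cycle:.2f} FLOPs per clock cycle")  # ~1.84, near a 2/cycle peak
```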
3. I doubt that anyone will want to use Lithium batteries in ten years because fuel cells will have been out for 8 years.
4. If we have a quarter terabyte of main magnetic memory, what is the terabyte of optical disk for? It would be the only moving part in the computer; what the hell do we need it for? Magnetic memory is static.
5. What about the network connection? OC-192? Better? I'd personally vote for some type of ATM, especially if we're going to use it for all of our communications. QOS is important, I don't want to lose frames on my movie just because someone calls..
6. They think that absolute security relies on thumbprints? Give me a break (or break-in). What we really need is to make sure that IPv8 is double-key encrypted at all levels.
7. There's nothing that they describe that is going to take a Cray to process. What does the typical secretary need with a supercomputer? A voice activated webpad is about enough. Gamers are another story entirely. Immersive VR is going to take more than they've got scheduled anyway.
In short, the Forbes article is a fluff piece.
Re:All this is crap (Score:1)
This seems slow! (Score:1)
Re:whats up with the no keyboard fetish?-Leather (Score:1)
It could be a handy shortcut meaning find the root partition in /etc/fstab and run
fsck /dev/whatever
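A sketch of what that lookup might amount to, in Python purely for illustration (standard fstab field order assumed: device, mount point, type, options, dump, pass):

```python
def find_root_device(fstab_text):
    """Return the device mounted at / from fstab-formatted text."""
    for line in fstab_text.splitlines():
        fields = line.split()
        # skip blanks and comments; mount point is the second field
        if len(fields) >= 2 and not fields[0].startswith("#") and fields[1] == "/":
            return fields[0]
    return None

sample_fstab = """# sample /etc/fstab
/dev/hda2  swap  swap  defaults  0 0
/dev/hda1  /     ext2  defaults  1 1
"""
print(f"would run: fsck {find_root_device(sample_fstab)}")
```

The real shortcut would of course read /etc/fstab itself and then exec fsck on the result (and never against a mounted filesystem).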
According to moore's law, (Score:1)
-moose
Re:whats up with the no keyboard fetish? (Score:1)
FYI, Popular Mechanics had a feature article in 1994 on upcoming "information appliances" that would pervade our lives and give us access to the "information superhighway" in just a couple of years. These devices would come in the form of slimmed-down desktop computers (iPaq, iOpener), hand-held devices (Palm/Visor) and set-top boxes for the TV (TiVo, ReplayTV, WebTV).
Re:whats up with the no keyboard fetish? (Score:2)
The mobile market, however, doesn't need voice recognition to get rid of the keyboard; Palm seems to have done a fairly good job of it with their entry system. Other companies could come up with a similarly easy-to-use entry system, or an even better one. Indeed, some alternatives are already available for the Palm.
One question is do you really want to use a lot of computing/hardware power for voice recognition, or would you rather use it for something else (assuming software designers can come up with an efficient use for that power).
Re:Voice recognition stinks (Score:1)
Not to mention the fact that it would IRRITATE THE HELL out of everybody within hearing distance. I mean, COME ON! I would hate to work in a cube, surrounded by a couple dozen people, all talking to their computers (well, we talk to our computers here, but mostly to swear at them). The clicking of a keyboard is pretty easy to ignore, because it's not particularly interesting to listen to, just a bunch of clicks.
Also, if I had to talk to post all my slashdot comments, I'd probably be fired because people would finally realize how little time I spend working.
Why the fascination with biometrics? (Score:1)
I would think we could come up with something better than biometrics.
A biometric password is like using the same password everywhere: you know what it is based on, and of all the things that could be spoofed, I would think it would be among the easier. I don't know about you, but everything I touch doesn't hold evidence of my root password.
What we would really want is a system that can't be hijacked. An authentication system that proves it is me (the living, willing me). A system that self-destructs when given the wrong password would be OK, except you would probably be killed for using it.
Maybe a system with a flesh-embedded chip (one that needs blood circulation), along with a relative security-level password. If you are truly being hijacked you could essentially open up a honeypot that contains very little real data but doesn't seem barren.
Perhaps this is too paranoid, but if so, you probably don't need biometrics either. Something like a fingerprint is just too likely to be damaged or non-repeatable to be useful.
Re:whats up with the no keyboard fetish? (Score:5)
At least you didn't say they worked well. Hey, let's look at some input device "theory" shall we?
1. You store information in your brain. It's chemical. It's analog.
2. You want that information in your computer. It's electric. It's digital.
3. Can it possibly be that the best way to bridge these two qualitative gaps is by wiggling physical limbs over hard plastic nubbins?
4. Depressingly, the answer appears to be "yes"...
5. So now it's down to a matter of appendages, nubbins and how you wiggle them (feel free to make porn jokes now)
6. Alternate WAN (wiggling appendages over nubbins) techs have risen and fallen. The mouse is a popular WAN... but the guy who came up with the mouse idea (you know, whats-his-name who worked at SRI) also had this bizarre "chord playing" device for input as well... sorta like using a one-handed accordion.
7. Text. We want text input because we're slaves to alphabetic, pseudo-phonetic written languages.
8. WAN techs must not only be efficient but be acceptable by people as well...
9. So, we need a WAN. It must be text-oriented, efficient and have a high acceptance rate among people.
10. Your answer to that is the keyboard. I work with a guy who turns blue under the eyes without his stylus... the bottom line is:
We have WANs now that do the job, but we have seen new WANS (mouse, stylus) come along and there is no reason to think that WAN evolution will stop just because we like our F-keys and Num Lock. In 1983 I would never have imagined a mouse. But it happened.
Cubicle Hell (Score:1)
I don't even work in cubicles, but I know I would keep my office door closed a lot more often if everyone in the hall was chanting nonsense to their computers all day.
Bingo Foo
---
Re:Just like TRON (Score:1)
Re:Just a moment, here... (Score:1)
>This should make university computer labs interesting, especially for people writing code.
Obviously it will be nearly impossible to write code without using a keyboard, but most computer users are not writing code: they're sending e-mails, writing papers and looking up information on the Web. With suitably advanced software (10 years is a long time, and in many areas we're already there), this can all be done vocally, but there will always be a need for a keyboard.
My point was more along the lines of "Can you imagine trying to think about anything, especially code, in a big room where everyone is busy talking to their computer?" i.e. the noise aspect. I could actually see writing code via voice, especially if you're using higher-level languages with less bizarre syntax--which would likely start being developed once voice recognition became mainstream.
Heck, you could even write C code with voice, if you had a clever enough interpreter:
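Picture dictating even a trivial loop - a purely hypothetical transcript of what you'd have to say out loud:

```
for open paren int i equals zero semicolon i less than ten semicolon
i plus plus close paren open brace printf open paren quote percent d
backslash n quote comma i close paren semicolon close brace
```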
Okay, I take that back... I'd rather do C with a keyboard. (-:
Sooner than that.... (Score:1)
more thoughts on the future (Score:1)
"Today's supercomputers" (Score:2)
I hate to break their forward-looking uber-geek bubble, but isn't the Apple G4 [apple.com] considered a "supercomputer" by today's standards? One gigaflop is the cutoff, right? The G4 met it, right? So we can buy "today's" supercomputer TODAY, right???
Take the technical accuracy with a pinch of salt (Score:1)
In the long-gone days (1980) of the 80286...
An 80286 in 1980? *snort* Right, and I had an Apple ][ in 1970.
--
More _rotating_ media?? (Score:2)
I like the two-sided laser idea, and the application of holography (which might enable you to exploit the thickness of the disk to store a few layers of bits rather than just one). But will we still be cursed with moving mechanical parts (like rotating media)??
chips that use silicon to switch but optics to communicate... Instantaneous on-chip optical communication
It sounds like they plan to replace any sufficiently long signal paths with on-chip optical waveguides, requiring an LED at one end and a phototransistor at the other. Putting LEDs on the same die with transistors is problematic today, but presumably they can solve that problem with some new LED chemistry. Next they need to be able to build optical waveguides into a die and insulate them from one another (so they need transparent and opaque materials that can be built up using photolithography). I dunno if such stuff exists, but they seem pretty confident about it.
One of the biggest advantages of photonic circuitry is an extremely low power requirement.
This is supposed to be a consequence of packing the die with LEDs and phototransistors, rather than charging up the RC delays of long signal lines? Hmm, maybe. The LEDs might not need much light to throw a bit a few microns.
Re:All this is carp (Score:1)
You mean you didn't get your robot maid at the New Year's Eve party like everyone else? Well, that explains it.
Re:Close: not! (Score:2)
I think this article takes the PC of today and wonders what would happen if all of the components in the PC were improved, and (surprise!) you get a very fast version of the PC. What this article does not do is wonder how we would build computers if we could connect the parts more efficiently. The PC I had six years ago was more than adequate to operate the fridge, microwave, TV and lights in my house. The only problem was that it couldn't communicate with those things out of the box. But what if the lightbulb were Bluetooth enabled? It might someday become feasible, so what are we going to do then? That's what's interesting. I don't think I'll ever dictate an email to my PC; typing is much faster than speaking. I don't care if my word processor runs at 25 MHz or at 25 THz. I use my home PC for gaming, browsing and typing (in that order). Only the first type of use requires the kind of PC I have on my desk. This is not going to change. I'll probably be playing cooler games in 10 years, but what else am I going to do with the PC outlined in the article?
Bwahaha. (Score:1)
Re:Frisbee; personal information ... shades of TRO (Score:1)
ObSimpsons: Has anyone here seen Tron?
whats up with the no keyboard fetish? (Score:5)
Keyboards are quick and efficient. This article says that you'll instead use a 3D interface, and simply touch with your hands what you want to do.
Is it me, or does that sound rather slow and clunky? Do I really want to be waving my arms around just to open a damn program?
Face it: keyboards are still around after all these years because THEY WORK. They might not look futuristic or uber-high tech, but THEY WORK.
Re:All this is crap (Score:1)
You underestimate speaking speed (Score:1)
I can't talk 100WPM but I can get close to that on a keyboard
Actually, you probably talk well over 100 WPM. Time yourself: I just checked myself and reading fairly technical text ("Experiments in Physical Chemistry", Shoemaker et al.) at my normal speaking speed I got about 250 WPM. When excited I easily do 300 WPM, which my students sometimes hate.
I used to debate in high school and college. I "spread" at about 700 WPM: I knew folks who could easily top 1000WPM. However, I suspect it's going to be many years before voice recognition gets to the point of understanding that.
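Translating those rates into raw keystrokes (using the usual convention of 5 characters per "word" for WPM figures) makes the gap concrete:

```python
# Rough throughput comparison between typing and speaking, using
# the standard 5-characters-per-word convention for WPM figures.
def chars_per_second(wpm, chars_per_word=5):
    return wpm * chars_per_word / 60.0

typing = chars_per_second(100)   # a fast typist, as quoted above
speech = chars_per_second(250)   # normal reading-aloud speed, as measured above
print(f"typing: {typing:.1f} chars/s, speaking: {speech:.1f} chars/s")
```

So speech wins on raw word rate; the catch, as the parent post argues, is whether recognition can keep up.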
Re:Bah! (Score:1)
And that programming language will be written in C, hammered out on a keyboard.
Sean
Shaken, not /.ed (Score:2)
Well, it's better than meatsex on the cybercafe computers
Re:Moving Parts? Definitely no disk (Score:1)
We've already seen an article, here I think, that a bloke with IBM or someone was talking about non-dynamic RAM with low power requirements to run and NO power requirements to maintain state. Now I wouldn't be surprised if in fact it could stand a powered refresh every day or so to offset the effects of random magnetic fields - hell you could put a field detector on the MM and it could open a circuit for a powered refresh every time it thought it was required.
You don't want a disk in a mobile device, for a simple reason - torque. A fast spinning disk is essentially a gyroscope. While it would keep the device stable for use, the cost is horrific forces on the axles of the disks. Go on, run around with a mobile computer containing a 10K rpm disk and see how long it takes to fail. I'd be surprised if it still worked at the end of the day.
So, clearly the computer of the future will contain two levels of RAM - something designed for performance with whatever power requirements that entails, and a larger bulk designed for stability over speed, which replaces the hard disk. Only it doesn't, because no matter how optimised it is for stability, it's still gonna spank a disk completely for response time and throughput. In 10 years, memory designed for stability without power (or with only a tiny backup battery rated at '12 months backup') is still gonna look quite nice compared to modern RAM for speed and such.
So you'd have everything you would normally put into cache and main RAM in your fast RAM, and all the contents of your disk, as well as swapped-out memory pages, in your 'slow' RAM.
To turn your computer off, all you do is flush the DRAM contents to stable RAM. How long's that gonna take? 0.1 secs, tops.
To turn it on, all you do is (a) read enough from slow RAM to give the user their desktop, and (b) start swapping the relevant parts of the OS into DRAM.
That removes the power requirements of the disk, which are quite high, removes most of the power requirements of DRAM, as there isn't much of it (perhaps only 10-20 GB - shockingly low for 2010), and leaves your computer safe to cart round with you. All we need now is a decent display. As for all you guys worrying about the interface: imagine you strap something around each elbow that detects the nerve impulses bound for your forearms, where the muscles that control your fingers are located. Wireless, of course, with an encrypted link to the computer. I'm sure you'll learn to 'want to type' without actually moving your fingers and control your computer that way.
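Sanity-checking the 0.1-second flush claim above against that 10-20 GB of DRAM is pure arithmetic; whether a 2010 memory bus actually sustains the implied rate is the open question:

```python
# What does flushing 10-20 GB of DRAM in 0.1 s require?
def required_bandwidth(size_gb, seconds):
    return size_gb / seconds  # GB/s sustained

for size_gb in (10, 20):
    bw = required_bandwidth(size_gb, 0.1)
    print(f"{size_gb} GB in 0.1 s needs {bw:.0f} GB/s sustained")
```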
If you could put such a device on your head, or better still implant it, imagine the "macros" you could use. Your computer is under attack? You think of being protected, and it replaces inetd.conf with a more secure one and HUPs inetd. And shuts down some other programs. And enables network logging. And performs reverse lookups on all IPs currently talking to you. And traceroutes them. And keeps the results... and and and and and
Given the utility of such an interface to users of realtime computer systems, no not fighter pilots at all, guv, honest, I'm sure we'll see it or something like it by 2010. Once they exist they'll get cheaper and cheaper (until they fry someone's head, then they'll get expensive for a while til that problem is fixed), and then you'll be able to go down to Sony or Toshiba or GE and request surgery to have a neural link implanted.
Anyway, I've done the Cyberpunk thing enough.
Cache-boy
Re:whats up with the no keyboard fetish? (Score:2)
They collect dust and hair and spilled beverages. They eventually break. They make noise when used. If you have a nice wooden desk they mark it. They require either cables, or if using radio or IR, batteries. They come in standard sizes, while hands don't. They contain useless keys. (case in point - the Windows keys) If you have to share them you get to pick up whatever's on the fingers of your co-workers - and in most cases you probably don't want to know what that is.
There are probably lots more bad things about keyboards, but that's enough for now.
Re:Way off.... (Score:2)
Goddamn, I gotta play this game too...
2010 State of the Industry
1. Msft's new mouse goes "beyond optical"... it now tracks movement based on the earth's magnetic field. Naysayers point out that Sun produced a similar mouse back in '01, but you had to use a special planet with it...
2. In a button-adding frenzy, logitech has released the 101-button mouse (wheel, lever, hand crank and ripcord included as well). Ad campaign: "it's a second keyboard... on wheels!"
3. Logitech's new mouse prompts Wired Magazine to declare "The Keyboard is Dead"
4. What the hell comes after "pita"? Now we gotta find out!
5. Transmeta announces the ultimate in software emulation and completely eliminates all physical components in their new chip. Company officials say the zero mass of the chip will reduce shipping costs and inventory overhead... Torvalds admits in an interview that it's basically a Turing Machine with a box...
6. Windows '09... It's got fins!
7. Oni released.
8. moodMac line released, a throwback to post-gen-X 70's nostalgia it changes colour depending on your mood. Features quad G9 processors with repoVec, a sub-processor that actually uses photoshop for you.
9. Mac releases OS XVII.LXIV.rVII; considers upgrading to Arabic numerals
10. You're still playing minesweeper?
11. Compaq releases a "computer so advanced, it's smaller than a dime". Pundits say monitor size is a serious limitation. Ex-VP Lieberman new CEO of Compaq, changes ad campaign to "24x6 nonstop" to keep the sabbath...
12. Seti@home finds alien life! He's doing 2 units a day on an AMD K21 TweetyBird.
13. Bill Gates says "640Mb ought to be enough for anybody"
My Computer of 2010 (Score:4)
Now, let's talk about the computer of the future I imagine. First of all, it will be a half-disassembled box with various optical cables coming out of it and a little bit of dust gathering on the exposed parts. The processor is of course tweaked in some way so as to make it 1.5-2 times as fast, if occasionally unstable.
The computer is hooked up via a wireless VPN to a bunch of my hacker friends all over the world where we share our thoughts, and our music in secrecy. Of course I've got a high bandwidth Internet connection. It's perfect for serving up movies, music, and games, but it's still not quite enough to handle some of the latest technologies (some things never change).
I've got several of my older computers hooked up on the other end. Sure, they are slow and primitive, but it's fun! Needless to say these are all in a state of semi-disarray, with cables in a giant spaghetti mess on the floor.
Sure, I've got one of those cool mega-displays that show everything in photographic quality on a screen the size of a desk, but I've got some throwbacks. I've of course got a keyboard, since those virtual keyboards are kludgy at best. I've got a scrolling LED display I found in a junkyard and managed to hook up to my box. If somebody tries to hack my box, it displays a message on the LED to let me know what's happening.
Now, that sounds like my dream computer of the future! Maybe it would be nice to have something portable to go with it, but I want a box I can hack and play with.
---
"Back Up" (Score:2)
Forbes is talking about the computer of 2010 to people whom they believe want or need a link at the bottom of the page to get back to the top. Considering that the Forbes readership is supposed to Have All The Money, I'm a bit worried by this...
Forbes' next article: "The Scrollbar In 2010", with a sidebar on the marvelous research being done on keyboard shortcuts...
Next article for Forbes (Score:2)
Desk display?? (Score:2)
This has got to be the most idiotic thing that I have ever heard of in my entire friggin' life. Think about this: you sit down and plug your comp into your desk and you proceed to work for 8 hours bent over your desk.. I don't know about you people, but I have three monitors hooked up to my machine and at the end of the day.. 12-14 hour days at that.. I have a bastard of a crick in my neck. If I had to hunch over all day, not only would my neck hurt, but so would my back.. and as an added incentive, my woman could call me Quasimodo from then on.
No thanks.
Rami
--
Future Designs (Score:3)
There would have been no way that they would have predicted the importance of the Internet - or something like Slashdot. In 1990 the communications capability of computers was only known and appreciated by a very few geeks; most people had local-call modem access to bulletin boards, if they had anything. (Please don't post about how you had access to the Internet in 1983 - that just proves YOU are a geek and nothing else. Who could an average person have used as an ISP in 1990?)
In 1990 very few prognosticators would have predicted anything like a noticeable percentage of people running a Unix style operating system. Nor would they have predicted anything like Windows 2000 or an iMac.
One of the most interesting things about this article is that they had almost nothing to say about the - external to your house - communication capability of the machine. I suspect this will be one of the most important aspects of that machine.
One of the reasons that I bought OS/2 Warp 4 was the voice recognition capability built into the OS. I wound up using it very little. Not because it didn't work, it did. The reason I didn't use it much was that in order to activate it I had to say the word 'desktop'. For me at least 'desktop' is a VERY difficult word to pronounce properly. The 'k' sound at the end of one syllable followed by the 't' sound at the start of the next is just tough to say. When I thought about it I realized that I pronounced it 'destop' as do many of the people who say it in normal speech. The computer didn't know what a 'destop' was.
'Desktop' is a minor stumbling block, but it is the sort of thing that keeps voice recognition from being utilized as much as it could be. One of the keys to a useful voice command computer is to use words in the command structure that people can pronounce.
There is also a slight misconception in the article; the good thing about optical communication between computer subsections is not the speed of light vs the speed of electrical pulses - the good thing is that optical communications can switch on and off faster; you can obtain higher frequencies.
The article also gets it a little wrong when it blames the electrical interconnect for delays in main memory fetching. The real problem is that DRAM speeds have only grown about 10 times since the days of the Z80, while processor clock speeds are up by a factor of 250 or so. Unless there is a real breakthrough in memory speeds, that trend will continue.
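Those two growth factors make the gap easy to quantify - a trivial ratio, using the numbers as quoted:

```python
# If CPU clocks scaled ~250x since the Z80 era while DRAM speed
# improved only ~10x, a main-memory fetch now costs ~25x more
# CPU cycles, relatively speaking, than it did back then.
cpu_speedup = 250
dram_speedup = 10
gap = cpu_speedup / dram_speedup
print(f"a memory fetch costs roughly {gap:.0f}x more cycles than in the Z80 era")
```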
Summary of Tech Magazines and Science Fiction (Score:2)
One of the things that bothered me the most was the appearance of the hardware itself. The author obviously thinks that the computer of 2010 is supposed to look like some kind of ugly disc that plugs into both the house and the desk. However, that is an entirely useless feature for the desktop of the future. If one truly wants his house to be computer operated, then it will be done with devices specifically designed to do so. Much like how cars have special onboard navigation computers to help the driver get around: though it is possible to hook up a laptop, it certainly won't do the job as well as the onboard computer, nor will it be using the computer itself to its fullest extent. In other words, the technology has existed for over 20 years now; it's just that it is either too expensive or not interesting enough for the common Joe to go out and buy.
But let's pretend that he didn't say that stupid thing about the house. Let's move on to the stupid things he said about the desk. You will plug this little module into your desk? Why? What advantages does this offer? What if you want to use this computer on the road? What if you don't have access to a desk? Well then, this idea becomes really retarded. Wouldn't it be easier to just carry around a laptop and hook it into a dock? That's basically what the guy "invented" in his little made-up story.
But let's move past this dock and look at how the thing actually works. You will have some kind of desk that is actually a computer. You will plug it in, wave your arms around and drag your fingers across it? If someone walks into your cubicle, you will look incredibly stupid. Why get rid of the mouse and keyboard when they are such great tools? Why have a magically disappearing/reappearing keyboard? Wouldn't it be a LOT cheaper to have a regular keyboard? The whole interface is retarded. And let's not forget about the cost of this fantasy computer. It costs a fortune to get a 15" LCD screen. I don't think the price is going to come down enough in the future for us to have a desk-sized 3D LCD touchscreen. Even if it were, I wouldn't want it built into my desk. I'd want it the way they're dishing them out right now: a little stand but a big screen. A 30" LCD screen with 1600x1200+ resolution would be much better than what this guy proposes.
Then there's his idea of security. These ideas won't take 10 years to implement. They're perfectly available now. The only reason they haven't caught on yet is because it's too much effort for something that can be handled just as easily with a 10-letter password. It seems as though this guy was told to write up a story about the "world's most expensive"/"fantasy" computer.
Once we get past the terrible ideas for user interfaces, we get to the author's terrible technical rantings about the hardware. It almost seems as though he was paid by different manufacturers to name-drop them in his article as a form of advertising. All the technologies he talks about have been known for a long time, and he pretends that when IBM says 5-10 years, that means it will automatically be put into all computers in that time. But I think we all know that none of that is true.
The author seems to have some misconceptions about the way hardware works, and what he says about RAM makes almost no sense at all. All in all, this article seems like a hack job. I think it's interesting that they put it on Slashdot, because it gives real geeks the opportunity to poke holes in it and gives the rest of the community a place to think about what computers will REALLY be like in 10 years.
User resistance to change (Score:2)
(Since then, I've gotten fast enough on the QWERTY that I think there might be some truth to the theory that QWERTY can be just as fast as DVORAK. But I guess that's like trying to figure out how many licks it takes to get to the tootsie-roll center of a tootsie-pop. "The world may never know.")
Another example is the FITALY keyboard, a keyboard layout for Palm Pilots that is optimized for one-handed stylus hunt-and-peck speed. It's a great idea, and everyone I've heard from who's tried it claims a huge increase in speed and accuracy. Despite this, competing products with a QWERTY layout are selling extremely well (I think).
Since users are so incredibly loyal to the old familiar QWERTY keyboard, I am pretty confident it will still be the primary input device in 2010.
My B.S. Meter is off the dial.... (Score:2)
Most of the technologies they mention are in the theoretical stage at this point, and as we all know, most theoretical technologies are press fluff. Five years ago I remember hearing about "ion drives" that would be able to write a GB of data to a square inch by changing the electrical signal of individual molecules. It was an optical technology, etc. Where is it? Still in my May '95 copy of Wired, apparently.
It is a well known fact that Academia has a cute tendency to announce technologies that will be available "in a few years" knowing full well that they will never materialize. Hell, we're still waiting for Rambus and Sapphire chips aren't we??
Also, the computer market is moving more toward embedded computing and small "appliances" like wireless web-pads. Not the monolithic beastie presented here.
And the idea that the "Biometric horah-doodah" will make my computer infinitely secure?? Yeah, when the Slashdot community has been lobotomized...
And I can't see my employer shelling out for the future desk they write about either. The f***er won't even get me a separate phone line, for Buddha's sake.
This might be the computer of 2525, or, better yet, the computer of 2050, but even then I doubt it. Most likely this is just the unfortunate side effect of an acid flashback.
(Besides, I have this scary vision of everybody in my office talking in C code at once and me screaming across the room, "Shut up! You're screwing with my syntax!") But in ten years apparently programming will be something you do in plain English. (Ha, ha, ha.... They said that in 1980 about 1990...)
At least a cow leaves behind something solid, powerful, and nutritious for geese. Forbes has simply contributed to landfill... But hey, mental masturbation is almost as fun and doesn't leave your arm all tired...
~Hammy
"The 486 processor is so powerful it is doubtful that it will ever be used in anything other than high end servers." -Byte Magazine, October 1991
Would I need this? (Score:2)
The last thing I need is a computer that looks like a frisbee
When I want to use it the dog will have it in the backyard waiting to play catch.
cache & refreshes (Score:2)
Then again, maybe I'm smoking crack. Can someone back me up or correct me?
While I'm on the subject...
Wouldn't it be faster to incorporate the cache on-die, like with the Celeron As? Uh, why's that? What makes 'holographic memory' any different from regular memory? (I don't think they're wrong, I just want more info.)
Wow, the computer of 2010! (Score:2)
Hard Disc
*FAST* memory!
CPU
Power supply
and to top it all off, MAIN RAM.
Who would have thought that by 2010 we're going to be seeing computers "[that], believe it or not, [are] about the size of a Frisbee". Time to throw this old
The best part of all, they've incorporated 20th century "The Clapper" technology, for us stingy throwbacks who are scared of product ideas that are actually new!
'Plug it into the wall with a magnetic clamp and watch as our home comes to life. In essence, the computer becomes the operating system for our house, and our house, in turn, knows our habits and responds to our needs. ("Brew coffee at 7, play Beethoven the moment the front door opens, and tell me when I'm low on milk.")'
Someone weld a misty-mate on the side of one of these suckers, and I'll drop my other testicle!
Re:According to moore's law, (Score:2)
I get 101 GHz, assuming 1.0 now and doubling every 18 months.
BTW, there was a story elsewhere earlier today where Intel was bragging about Willamette running at 4.0 GHz in 2004. That's right on target for the traditional "doubles in speed every two years," but the industry has been doing better than that for 10 years (+/-) now. If 4 GHz is the best they can offer, they'll be well down the road to bankruptcy, since AMD should be at 6 GHz by then.
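The doubling arithmetic behind both figures is easy to check. A minimal sketch; the 1.0 GHz baseline and the doubling periods are just the assumptions already stated in the posts above:

```python
# Sketch of the clock-speed projections discussed above.
# Assumption: a 1.0 GHz part "now" (2000), with speed doubling at a fixed rate.

def projected_ghz(base_ghz: float, years: float, doubling_years: float) -> float:
    """Clock speed after `years` if speed doubles every `doubling_years`."""
    return base_ghz * 2 ** (years / doubling_years)

# Doubling every 18 months over 10 years gives the ~101 GHz figure:
print(int(projected_ghz(1.0, 10, 1.5)))  # 101

# Doubling every 2 years gives 4 GHz by 2004, matching the Willamette claim:
print(int(projected_ghz(1.0, 4, 2.0)))   # 4
```

The same formula shows why the choice of doubling period matters so much: at two years per doubling, 2010 only reaches 32 GHz instead of 101.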
--
Frisbee; personal information ... shades of TRON? (Score:2)
Actually (Score:2)
It's mounted on a rack in the closet, and the cabling goes all throughout the house. Better yet, BlueTooth.
Any component you want to add can be plugged in anywhere. A new flat-screen TV is your monitor, as is the PDA in your pocket. You speak into the air in any room, and you are obeyed. You buy a new refrigerator, and it's suddenly online. Where you put the keyboard - and there WILL be one - is a matter of decoration more than functionality.
And it's completely transparent to all, except the technologists - which is as it should be. Just as I don't care to know the exact air/fuel mix in my engine, neither does my mechanic care about his chip-set or the temperature of his CPU.
People who are not passionate about the tech find it too complex and too intrusive. They want a box they plug in, more easily than a stereo component or VCR. They just want it to work, seamlessly and without requiring them to RTFM.
The computer of 2010 may be more like a CD changer than anything else. The computer of 2015 will be a freaking LAMP. Seriously... You can cram a whole lot of hardware into those things - all that empty space. Network the thing via power lines, and to upgrade your processing power, you just buy another lamp, or TV, or microwave... Or a slot-mounted, monolithic box (the size of a VHS tape at most) that you plug in to a rack in the closet. But this is where I came in.
The REAL jabber has the /. user id: 13196
Re:Just a moment, here... (Score:2)
Agreed, but that doesn't mean it's not useful in a home setting, for instance, or a private office. Naturally office etiquette would prohibit the regular use of such things in shared areas.
And even if you are coding away at your latest application, I would think it's more efficient to pause for a moment and say, "Computer, when's my next appointment?" than it would be to move out of your development app into a calendar of some sort, and then go back. Even in a public area, this level of occasional voice control is probably acceptable. *shrug*..
I totally agree, though, keyed input will still be primary for most industrious work, but simple tasks in a more intimate setting would be so much more efficient if they could be done effectively by voice. Just think, you could browse the web, update your calendar, compose a few e-mails while cooking yourself dinner, or cleaning house. After a long day at the office, that level of ease-of-use would be spectacular.
I'm a big advocate of "behind-the-scenes" computing, where the PC is hidden and unintrusive (and today's paradigm largely unneeded).
Tron, etc. (Score:2)
Sounds like Dillinger's computer in the cult movie "Tron".
Is SF the first inspiration source for engineers?
(No answer needed)
Anyway, something really scares me: they still have this need for mono-processor machines with one hard disc, etc.
I think it would be cooler to just design Lego-like components, each of which would be a tiny computer that could interact with one another like in the good old times of Atari ATW [computingmuseum.com].
So, instead of paying a huge amount of money to change computers every 6 months (however quick they are, you know people will still pay to upgrade them; a friend pertinently compared computers to cars: you want them to work properly but also to amaze your neighbours), why wouldn't we pay a few bucks for some more GIPS to fit? With wireless communication, this would then be tomorrow's computer, and I bet my vision is far more realistic than ASAP's nice-looking box.
--
What a ridiculous article. (Score:2)
Hell, I may as well say "With luck, we'll all have robot maids and hovercars like the Jetsons in 2010," because it's about as solid a prediction as that article's writer made.
Assuming all of these technologies are released exactly on-schedule, they will be prohibitively expensive (Ex: Any new processor for about 2 months) and poorly implemented in the software-side (Ex: USB in Windows 95).
Not to mention the 'appearance' of the computer of the future. Apparently, in 10 years, our computers will be comprised primarily of colorful rectangles and circles. Neat.
Re:Keyboards.. Good cheap and effective (Score:2)
The thing about voice interaction, or any other form of poorly defined interaction, is ambiguity. Try to build ANY context-free grammar that understands plain English and you'll quickly realize that English isn't context-free. Even if we were to somehow create an incredibly smart interpreter on the computer end, typing '1' means exactly that, but speaking "one" could be a different story.
Besides, typing ';' is so much damn faster than saying "semi-colon". I'd hate to dictate C code.
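The '1' vs. "one" problem can be shown with a toy homophone table. This is made up purely for illustration (no real speech engine works from a dict like this), but it captures why spoken input needs context where typed input doesn't:

```python
# Toy illustration: the same spoken token can map to several written forms,
# so a dictation front end must use context to pick one. (Hypothetical table.)
SPOKEN_TO_WRITTEN = {
    "one":        ["1", "one", "won"],
    "to":         ["to", "too", "two", "2"],
    "semi-colon": [";"],
}

def candidates(spoken_tokens):
    """Return every written form each spoken token might mean."""
    return [SPOKEN_TO_WRITTEN.get(tok, [tok]) for tok in spoken_tokens]

print(candidates(["one", "to"]))
# Typing '1' is one keystroke and unambiguous; speaking "one" yields three candidates.
```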
Re:Bah! (Score:2)
I don't necessarily disagree with this, but of course C is a language designed for the current, keyboard-based, paradigm. A programming language designed for computers with voice-based input would presumably have an entirely different kind of structure.
I have no idea what that structure might be, but it's interesting to think about, yes?
Most Forbes articles are.
-
Just a moment, here... (Score:4)
Let's take a little look at this proposed computer of 2010:
SECURITY
The PC will be protected from theft, thanks to an advanced biometric scanner that can recognize your fingerprint.
Now all they need is biometric scanners on screwdrivers too.
INTERFACE
You'll communicate with the PC primarily with your voice...
This should make university computer labs interesting, especially for people writing code. And how about when your friend Bob pops into your office to say hello:
The Desktop as Desk Top
In 2010, a "desktop" will be a desk top ... You won't need a keyboard because files can be opened and closed simply by touching and dragging with your finger.
Be careful when drumming your fingers.
Your Home
The PC of 2010 plugs into your home so your house becomes a smart operating system.
"Open the refrigerator door, HAL." "I'm sorry, Dave, I'm afraid I can't do that."
Re:whats up with the no keyboard fetish?-Leather (Score:2)
Out of curiosity, what would this input device register if the only finger being moved around in "this certain way" was the middle one?
=================================
Will our software still suck in 2010?? (Score:2)
Operating systems have an incredible half-life. NT is already the better part of a decade old, if I am not mistaken. We will definitely still be kicking around Win2k, Linux, and OS X in 2010.
Re:whats up with the no keyboard fetish? (Score:2)
As for the Windows keys, I'm using a keyboard that's old enough not to have them. If they really do break as often as you say, maybe you're just mistreating it. Only keyboards I have that don't work are cheap ones I bought just to steal parts from.
Re:whats up with the no keyboard fetish? (Score:2)
Actually, computer keyboards have for a while been the largest vector of cross-patient contamination at hospitals. People disinfect the toilets and occasionally remember to scrub the doorknobs, but people rarely think to try to swab down a keyboard, in part because doing so would be difficult with today's keyboards and their many pits and crevices. A doctor who examines his patients while wearing latex gloves often forgets to remove those same gloves before looking up some record or info on his computer, and those who don't wear gloves often forget to wash their hands first, though they religiously scrub after the whole examination.
I really doubt it (Score:2)
The future holds many advances in technology and personal computing. In that future I cannot see anyone carrying their PC home from work and plugging it into their desk. Nowadays people carry pagers, cellphones, and PDAs - let's not add another thing to the list.
-Mike_L
All you people who think you can talk 100WPM... (Score:2)
You're absolutely correct. Except you forgot something. Try adding punctuation, capitals, curly braces, quotes, formatting - telling the cursor where to go - and you'll find your speed goes through the floor and your aggravation goes through the roof. I had a Newton for a while and this was the main problem: the handwriting recognition was actually really good; the problem was putting the text in the right place on the screen, and that time outweighed any benefit the handwriting bought you.
Don't believe me? Try programming with ViaVoice. Or doing more than simple dictation, like adding bold or *Removing* bold to a sentence. You'll go insane.
Re:Voice recognition stinks (Score:3)
Ummm... yes you can. Easily. When was the last time you said, "Mississippi one, Mississippi two, Mississippi three"? Saying that at a moderate pace, each vocalized numeral occurs about once a second. So together, that's two words a second, or 120 WPM. And a Mississippi is no small word!
With that brief analysis, for non-technical text, I'd say it's very reasonable to speak at a steady 200-250 WPM. Type that on your keyboard, then we can talk!
Of course, until someone develops a programming language specifically suited for voice recognition, I won't be coding via voice any time soon.
BTW, that last sentence was 22 words; if you can read it in 6 seconds, that's 220 WPM. Try it. Now say it at half the speed (whoa! that doesn't feel very fast at all) and that's 110 WPM, still above your asserted top typing speed.
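For anyone who wants to redo the arithmetic, it's just words divided by time; nothing here beyond the numbers already in the post:

```python
# Words-per-minute from a word count and elapsed seconds.
def wpm(words: int, seconds: float) -> float:
    return words * 60 / seconds

print(wpm(2, 1))    # 120.0 -- two "Mississippi" words per second
print(wpm(22, 6))   # 220.0 -- the 22-word sentence read in 6 seconds
print(wpm(22, 12))  # 110.0 -- the same sentence at half speed
```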
- Cory
My Personal Computer of 2010 (Score:3)
The workhorse of the system responsible for archiving and heavy processing, a solid-state homogenous massively parallel connection machine about the size of my fist, is tucked away in my chest cavity between my oxygen-storage organ and my 2nd and 3rd backup hearts (cylinders about the size of a thumb).
The interface can take any form, since the system has full access to all nerves leading into my brain, and has plenty of power to simulate a believable environment; it can be superimposed atop real-world data or it can be fully immersive. The failsafes are carefully trained (in a process taking months of daily feedback training) direct neural connections to the master control, which can be used to cut off any problematic computer functions and reconnect my mind to my body; a spare nervous system is tattooed across my skin so there is no single point of failure. I'd have to be almost fully decapitated to lose control of my body (not likely to occur accidentally, thanks to carbon microfiber reinforcements). I'd certainly be dead before my connections were disabled, unless a very careful surgery was undertaken.
Connections to other computers can be made by many different forms of electromagnetic transmission, or by tiny electrical currents through contact on any of hundreds of points on my body.
Power is available through the main storage battery of my body (a distributed system with surprising capacity), but essential functions, such as the master control, can be supported by generation of minute amounts of power from the glucose and oxygen in my blood.
---
Despite rumors to the contrary, I am not a turnip.
Moving Parts? (Score:2)
About the only thing I'm confident to say about handhelds 10 years from now, is that they won't have discs. Discs are cheap, but solid-state memory is almost cheap enough NOW for a portable computer. In 10 years, spinning discs will seem as antiquated as the spinning tape reels that adorn movie-computers.
I think that in general, the line between "live" and "archived" storage is going to be blurred more and more, in all computers. I don't expect that portables will distinguish between memory and mass storage at all.
Carried To The Logical End. (Score:2)
Hardware will be so cheap and powerful that we will forgo the dull, mundane materials now used in favor of "optoelectronic substrate". It will be moldable into various shapes, strong enough to be used as a substitute for drywall or maybe even steel, and it will have any "active properties" you might desire.
Need a computer? No problem. All you have to do is tap the desk and say "keyboard" (remember, everything is made of this material) and a keyboard and active display will appear on your desk. Any data you enter will be stored permanently on a secure central server that you can access from any item that is made of substrate.
Lightbulbs and sockets? The dark ages! Just tap the wall and say "light, medium intensity" and the wall will emanate a soft glow.
Children will be born and grow up in entire cities where the walls, sidewalks, and cars are made of substrate. Just tap your foot on the sidewalk and say "hopscotch!" and a hopscotch board appears.
You'll take your daughter to visit the country. She'll tap a rock and nothing will happen. She'll say "how boring".
Re:whats up with the no keyboard fetish? (Score:3)
Marketing hype.... (Score:2)
Overall, I feel that the article is total marketing B.S. ignoring things such as usability and the limits to Moore's law.
Hand input and 3d is already there, they use it for molecular modelling. Designers will love this feature too but I agree with the comment about the lack of keyboards.
I do have my doubts that we will still be using Li-ion batteries in ten years. There are other technologies that should be working well by then that offer better energy density and a higher internal resistance (Li-ion loses charge quickly).
I'm inclined to agree with your commentator about notebook memory going non-rotating. Spinning a disk of any technology costs power.
From my own point of view, I see three market slots, there will still be a PDA. It is smaller but very personal and a lot more powerful than anything we have now but it will be combined with a mobile telephone and a GPS. For input, we have voice or pen and for output a small flat-panel or a mini HUD on a pair of glasses. Emphasis on low-power and portability.
I still see the notebook. There is always a place for a larger and more detailed display and lots more memory, which can fit in a briefcase. Expect it to look like an A4 pad with keyboard/pen and a 3D mouse. Viewing may be either through the built-in flat-panel display or a mini-HUD.
Actually HUDs are quite interesting technology, if it can follow your pupil. You only need detail where the eye is looking, the rest of the picture can be shown at lower detail.
Expect enough wireless peripherals to be floating around that we are worrying about the prevailing EM field around your body.
In the end, the workstation will remain because there are people who don't want to compromise on speed or expansion capability.
All Forbes used those guys for was to design a game console.
All this is crap (Score:2)
How many times have we heard that by the year 2000 we'd be driving space cars and have robot maids a la Jetsons? Come on....
Re:whats up with the no keyboard fetish? (Score:3)
If you have a wooden desk, get something to put your keyboard on. Try rubber feet, or just put down some sort of pad between the desk and your keyboard. You could say the same thing about a monitor, or the computer itself.
Cables... who cares? So it's a cable. Most appliances use them. They aren't a big deal.
There are MANY different types of keyboards. You can probably find one that fits your hands.
As for useless keys... well, don't buy a keyboard with those keys. They DO exist. Or, make those keys useful by binding them to something you find useful. Nobody said you had to keep the default keybindings.
And picking up what's on your co-workers' fingers: are you a hypochondriac or what? Ever use public transportation? Christ, ever walk down the street? Ever sit down in a chair in a restaurant? Take a chill pill.
A computer is NOT a form of (modern) art (Score:2)
We've seen art "evolve" from making paintings and statues, to throwing paint on the canvas and calling it a painting, up to dumping piles of metal junk on the grass and calling it art (too bad about corrosion), right up to stringing up dead horses in a tree (another exclusive form of "art"). I really don't think we need that crap in the technology sector. While I admire these "artists" for getting paid for doing, IMHO, absolutely nothing, I don't feel good about letting them decide how the machine should operate and feel. This is a totally different sector; people need to work with these devices, which takes another form of expertise. Judging by most of the modern "art", I really doubt that most artists are capable of having some consideration for the public.
Re:All this is crap (Score:2)
Re:Very Cool (Score:2)
IMHO, it is going away. When all the devices you use can talk to each other over the network (where network = Bluetooth LAN, Internet, whatever), SneakerNet becomes unnecessary.
When will hard drives be replaced? (Score:2)
Lithium Battery? WTF? (Score:2)
What bugs me is that battery and/or portable power technology has not yet reached anywhere close to a pinnacle in terms of storage or efficiency. They intend to put a lithium battery in this beast and have it run for two weeks? Great. Waste a bunch of engineering time so you can BEND A CIRCA 1999 BATTERY INTO A DONUT SHAPE?
What about zinc-air batteries? What about fuel cells? Recharging, please! If this thing requires so little power, what about solar? I haven't used a tiny, cheap four-function calculator that needed batteries in about 6 or 7 years!
These people are just making noise with buzzwords to sell their design services. There is little vision (as far as technology goes) apparent in their work.
~GoRK
Dictation requires training!!! (Score:2)
Like you, I can type at ~100 WPM. But it took time to learn how to do that. Years. I wonder how fast I could be dictating words if I had trained at that as hard as I did at typing. I'm also pretty sure that sore vocal cords would be a reminder of the old days once you mastered it, just like sore hands from typing when I was still learning to do that...
Why does everybody demand that voice recognition require little or no training whatsoever? I've been able to move my fingers for years before I could type, and I'll happily accept that it'll be a considerable time before I can dictate with any form of precision and speed. Once we can do that (assuming that the current crop of speech recognition software catches up), *THEN* let's discuss if typing or dictating is faster than the other.
the cynics speak... (Score:2)
Palm entry is NOT efficient (Score:2)
Do you actually use a Palm regularly? I do, and some of my co-workers are true artists in terms of Graffiti-style writing. But they still write VERY slowly compared to their keyboard typing speed. It's just a slow way to enter data...
Don't get me wrong - I really like using my Palm. But not for hardcore typing. Good for taking notes during meetings. Going over my calendar in the commute train. Catching up with the news that got hotsync'ed while I was busy typing to earn a living.
I wouldn't take this too seriously (Score:3)
In the 1950's and 1960's a lot of predictions (many of which you have probably seen yourselves) were made as to how we would live in the year 2000. Now, I don't know about you guys, but I'm not wearing silver jump boots and driving my hover car to work yet, and I still have to cook all by myself. It was foolish to try and predict 40 or 50 years into the future, and we've learnt our lesson. So now, a design team want to get some publicity and attempt to predict where our computers are going to be in 10 years time. There are some flaws though.
1. Why do I want a computer shaped like a frisbee in the first place? Too big to carry around, too small to make it look like it deserves a space in my office.
2. All of these technologies are still on drawing boards and we won't see prototypes until 2010 at the earliest. In addition better and more useful technologies are likely to emerge between now and 2010 meaning some of these components may never get the R&D required to develop them.
3. Desks as screens is the most stupid thing I've ever heard in my life. After a week's use, the cramp and pain in my neck would probably become unbearable. It's also an "illegal" position for visual display equipment to be placed in under the Health and Safety at Work Act in the UK. But hey, that nasty dude with the supercomputer in Tron had one, right?
4. More powerful computers don't mean smarter computers. The article implies that because these machines will have power equivalent to something like a Cray J90 or somesuch, they will therefore make our environments "smart". So, does that mean they are predicting advancements in artificial intelligence as well? Funny that, because for the last 2 decades people have been saying that in 10 years' time we'll have smart machines.
In short, I think that this is possibly the worst article I have ever seen concerning the future of computers over the next ten years. Seeing as it's completely publicity-generating pie-in-the-sky and not clearly thought out by anybody who understands these issues, why am I even bothering writing this reply? Because I've got nothing better to do? In that case, I think I'll go and design my house of the future (*yawn*).....
Integration of Keyboard, Mouse, and Voice Recon (Score:3)
As it is, there are many things I want to tell my computer to do. For example:
Save this file then shutdown for the night.
Saying that one sentence is much faster than waiting for the file to save and then checking to make sure that the OS actually shuts down OK (granted, more reliable OSes would solve half of this problem).
On the other hand, there is no way in the world that you are going to separate me from my Rhino3D command bar; it's a lovely thing. Abbreviations are much easier to type than to say, and oftentimes written words form patterns that spoken words do not. That's the reason I can type my name two or three times faster than anything else: my fingers have become accustomed to typing it. On the other hand, I still stutter while saying my name (ironic, eh?)
I do like the idea of a virtual desktop, though; there are many times in life when I just want to be able to play around with something in real 3D. In addition, it is easier to tell a computer to do something like:
Undo that second to last spell check
Than it is to search through a document for the word that you accidentally corrected (spell checkers don't understand most industry buzzwords, go figure!)
Of course, there is one major problem with voice recognition technology right now, namely:
THE DAMN THING CAN'T UNDERSTAND A WORD THAT I SAY.
Sure sure, I've read the manuals "may need 3 or more hours of training in order to work as advertised"
I've also spent half an hour trying to get the trainer to understand a single word that I am saying.
The reason?
Simple.
I have spent so much time reading that most of the words I know the definition of, I have never actually pronounced!
Honestly, how often does the word defiled or the phrase uber mage come up in daily conversation?
The fact is that written English deviates strongly from spoken English; heck, ask almost any English lit teacher if you don't believe me.
Many words that people use in their daily spoken language, and whose pronunciation they have perfected, they have never actually written down; and many words that they write down, they may never have pronounced in their entire lives. (I know I have gone 5 or more years without actually speaking a word while using it almost weekly in my writing.)
The basic fact is that it is not natural to dictate a document. Most authors who dictate documents (do any writers still actually do that anymore?) have somebody else go through and eliminate the standard mannerisms of speech.
(This also explains why those medieval fantasy novels sound so immensely lame when read out loud.)
Yet there is hope for integrating voice recognition into the modern, everyday computer. Many long-winded or multi-step commands would fit very well into a voice-recognition-driven environment. Granted, all the Linux nuts out there have script files to do most of the work for them if the job exceeds 5 or 6 long-winded tasks, but for other commands, such as the given example of save-then-shutdown, this would be the perfect system.
Imagine somebody being able to give the following command:
Delete all games that I have not played in the last 5 months and then defragment my hard drive.
The best part about the above command is that it is not technologically impossible. Heck, a file system which keeps track of "last date accessed" and allows for categorizing of directories (games, apps, pr0n) is all that is needed.
What's more, it would save me the 5 or so hours it takes me to sort through my games directories.
It has even another bonus: the average user could understand it! Granted, I'm not a big fan of the "average user" (*cough* newbie *cough*), but it doesn't exactly take a rocket scientist to understand the idea of deleting everything that hasn't been used in at least 5 months. Just follow up by (attempting to) explain defragmenting a HD ("you have to do it, shut up, end of story" -- my preferred method) and you have a very easy-to-use and often-needed command that is actually easier to give through voice recognition than by keyboard or mouse.
Examples of other commands that are easier to give through voice recognition than through traditional input:
Save this file as (document name)final.txt, then delete all other previous backup copies of this file.
Go to visited websites and rotate all passwords
remove all non-program essential image files
DELETE THAT DAMN DIRECTORY ALREADY (those of you who are stuck with Windoze and InstallShield know what I am talking about!)
Find the sum of all items in column C and place it in the next available row of column C
Run virus program, clean up temp files, then run defrag
Partition the newly installed 20-gig HD into two 10-gig segments labeled D and E
Copy all user created text files to the CD-RW that was just inserted
Delete all MP3's that have only been listened to once
Collect all non-program attached midi files and move them to usr/music/midi
(yes yes, some people here still listen to midi files, get over it!)
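For what it's worth, the "delete what I haven't used in 5 months" part really is within reach today. A rough sketch using plain file access times, which filesystems already track; the games path is a made-up example, and this version only prints what it *would* delete:

```python
# Minimal sketch of "delete all games I haven't played in 5 months",
# built on file access times. Dry-run only: it prints, it doesn't delete.
import os
import time

def stale_files(root: str, months: int):
    """Yield paths under `root` not accessed in roughly `months` months."""
    cutoff = time.time() - months * 30 * 24 * 3600
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.path.getatime(path) < cutoff:
                yield path

# Dry run against a hypothetical games directory:
for path in stale_files("/home/me/games", 5):
    print("would delete:", path)
```

Categorizing directories (games vs. apps) is the only piece the filesystem doesn't already give you; last-accessed dates come for free.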
Anyways, just my, err, uh... 2 and a half bits, I guess, heh.
Historical perspective (Score:3)
I recently purchased a used book called Computer Systems in Business: An Introduction. Published in 1986, it features computer systems that seem pretty quaint almost two decades later.
One clever thing the book does is discuss the concept of P, a relative-cost measure for discussing how much computers cost in light of the opposing forces of inflation and Moore's Law, which make costs difficult to forecast in actual currency. P is equal to the average cost or value of a standard microcomputer system, which at the time consisted of...
The book goes on to discuss minicomputers (6 megs of memory with 200 ns access time, an 8K cache, and 1.8 gigs of hard disk -- er, Winchester -- space). Only when the book gets to mainframes do the machine specs seem somewhere in line with a machine you could probably buy at your local appliance shop...
The diagram, which I can't include in this posting, shows 35 disk drives of 800 megabytes each (28 gigs total). In P units, the cost of such a beast was 1,043, or about $3.6 million. If we assume that the rate of tech progress over the next 15 years is roughly the same as the past 15, we have a starting point for visualizing what the average personal computer of 2015 could be like: simply look at machines that cost about a thousand times more than the Celeron 600 at Circuit City.
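The dollar figures above reduce to simple division. A quick sketch; the mainframe numbers are from the book as quoted above, while the Celeron price is my own round-number guess for illustration:

```python
# Back-of-envelope version of the book's "P" relative-cost unit.
mainframe_cost_p = 1043          # disk farm cost, in P units (from the book)
mainframe_cost_usd = 3_600_000   # about $3.6 million (from the book)
p_value_1986 = mainframe_cost_usd / mainframe_cost_p
print(f"1 P in 1986 was about ${p_value_1986:,.0f}")

# If ~15 years of progress turns that mainframe into a desktop, then the
# desktop of 2015 is whatever costs ~1000x today's machine:
celeron_600_usd = 800            # assumed street price, circa 2000
print(f"2015-desktop equivalent today: ${celeron_600_usd * 1000:,}")
```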
Then, in the best /. tradition, imagine a Beowulf cluster of them!
Life choices... (Score:2)
Re:whats up with the no keyboard fetish? (Score:2)
.i go to ge to ga
Or something like that. It might even be shorter if I was better at lojban.
I said that rather slowly and distinctly in 18 seconds.
--
No more e-mail address game - see my user info. Time for revenge.
Re:Just a moment, here... (Score:2)
This should make university computer labs interesting, especially for people writing code.
Obviously it will be nearly impossible to write code without using a keyboard, but most computer users are not writing code: they're sending e-mails, writing papers, and looking up information on the Web. With suitably advanced software (10 years is a long time, and in many areas we're already there), this can all be done vocally, but there will always be a need for a keyboard.
Be careful when drumming your fingers.
With a full desk top of space to work with, I imagine I'd be keeping my critical triggers away from the areas I'd rest my hands.
Aerodynamic?? (Score:2)
Although now that I think about it, a drink holder in the case would be pretty damn cool.
Re:Actually (Score:2)