Operating Systems of the Future 436
An anonymous reader writes: "'Imagine computers in a group providing disk storage for their users, transparently swapping files and optimizing their collective performance, all with no central administration.' Computerworld is predicting that over the next 10 years, operating systems will become highly distributed and 'self-healing,' and they'll collaborate with applications, making application programmers' jobs easier."
A vision of OS future : tiny reliable components. (Score:5, Interesting)
It's definitely a good approach, although EROS is still quite experimental.
A nice conspiracy theoretic rant (Score:2, Interesting)
Your digital "rights" managed TrustedPCs will connect to a giant virtual disk array via the network, where what you store will be subject to government and corporate monitoring and removal.
Think this is nuts? Where are the 200GB drives? Why is Intuit pushing us to store tax and financial information on their site? Why does Microsoft want to give us an authentication token that's good for retrieving our information "anywhere, anytime"?
Why would anyone (other than a legitimate large corporation) have a need for local storage, once the Internet storage product is fast and cheap? I can only imagine one use for local storage--copyright infringement.
Scary (Score:2, Interesting)
Now if it were an open-source, distributed OS with self-healing, I might be OK. I guess I just object to giving that much control to a large corporation whose main concern is profits, not my privacy.
Hmmm... (Score:5, Interesting)
I predict that there will never be a revolutionary new operating system until we break free of the chains imposed by Posix compliance. Until then, we're stuck with files that have to be streams of bytes, ugo-style permissions, non-wandering processes, incompatible RPC calls, &c.
And the real pain is that there have been OSes with simple & elegant solutions to problems that are hard under Unix (Aegis, Multics, VMS, TOPS, ...) that were pushed aside by the steamroller that is Unix.
But to be fair, many of the forgotten OSes are now forgotten because they weren't as general-purpose as Unix. Unix is the great compromise. But it's hard to strive for the best when you've already accepted compromise.
The #1 Rule of Network Security (Score:4, Interesting)
Whoever thought up this pipe dream apparently doesn't understand the Zeroth Law of Network Security: If you want information to be secure, DON'T PUT IT ON THE FUCKING NETWORK!
Seriously! As if most business OSes don't default to the least-secure settings already! Why would you want to run important apps on a system where the default is to share anything and everything with any computer in listening distance?
If the last 18 years are any indication... (Score:3, Interesting)
I'll believe the distributed file-storage myth when I see it. To me, it sounds as if it would hog bandwidth, just like gnutella does. I don't see any change coming in the way I store files on my computer. It's fast, efficient, and hasn't needed a change.
SysAdmins need not quit their day-jobs. As long as Microsoft is providing this technology, you can be sure that it will run into snags and security vulnerabilities. Increased complexity = increased vulnerability.
...and that's all I've got to say about that
Re:Futurists are stupid (Score:2, Interesting)
That's just in general. Apply it to the technology sector, and it becomes even more true. About the best you can do is say "wouldn't it be cool if...?" But basically these guys just take an interesting research paper (out of the thousands out there) and act like that's what's actually going to happen.
But I'm better than them! I really can predict the future! I predict that in 10 years, there'll be a bunch of people predicting what will happen 10 years from then, and nearly all of them will end up being wrong. That's right, you heard it here first.
poorly researched article (Score:2, Interesting)
IBM believes that we are at just such a threshold right now in computing. The millions of businesses, billions of humans that compose them, and trillions of devices that they will depend upon all require the services of the I/T industry to keep them running. And it's not just a matter of numbers. It's the complexity of these systems and the way they work together that is creating a shortage of skilled I/T workers to manage all of the systems. It's a problem that's not going away, but will grow exponentially, just as our dependence on technology has.
From my understanding, autonomic computing and other projects like it are going for something much bigger than "let's make our OS smarter." I seriously doubt this is targeted at the consumer, since there are too many privacy issues. The real benefit of "self-healing" is in the corporate environment where uptime is critical. Autonomic's goal, as I read it, is about making systems work together seamlessly to improve reliability and scalability. Say a server has some hardware problem or a switch is dying. Things like these could cause real financial losses, so having smart systems that reconfigure and heal themselves could reduce the cost of hardware and software failures. How many times have admins had to get up at 3 am to fix the web server because some log ran amok and ate up all the HD space? Having a standard system for handling these problems would help make systems more reliable.
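The 3 am runaway-log scenario above is easy to sketch. The following is a minimal, hypothetical watchdog of the kind a "self-healing" system might run: when free space drops below a threshold, it compresses and truncates any oversized log. All thresholds and paths here are illustrative assumptions, not part of any real autonomic product.

```python
import os
import gzip
import shutil

# Hypothetical thresholds -- a real autonomic system would tune or learn these.
MAX_LOG_BYTES = 100 * 1024 * 1024   # rotate any log larger than 100 MB
MIN_FREE_FRACTION = 0.10            # act only when under 10% disk free

def free_fraction(path="/"):
    """Fraction of the filesystem holding `path` that is still free."""
    st = os.statvfs(path)
    return st.f_bavail / st.f_blocks

def rotate(log_path):
    """Compress the runaway log to <name>.1.gz and truncate the original."""
    with open(log_path, "rb") as src, gzip.open(log_path + ".1.gz", "wb") as dst:
        shutil.copyfileobj(src, dst)
    with open(log_path, "w"):
        pass  # opening in "w" mode truncates the file in place

def heal(log_dir="/var/log"):
    """One watchdog pass: rotate oversized .log files when space is low."""
    if free_fraction(log_dir) >= MIN_FREE_FRACTION:
        return []  # nothing to heal
    rotated = []
    for name in os.listdir(log_dir):
        path = os.path.join(log_dir, name)
        if name.endswith(".log") and os.path.getsize(path) > MAX_LOG_BYTES:
            rotate(path)
            rotated.append(path)
    return rotated
```

Running `heal()` from cron (or an OS-level service, as the article envisions) is the whole "standard system for handling these problems" in miniature: detection, a safe remedial action, and a report of what was done.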
Too many reporters are getting way too lazy.
Predictable Predictions (Score:3, Interesting)
My strong belief is that the best "predictions" occur when you find something in use today - only too expensive for the home user - and "predict" it will be ubiquitous within a few years. So here are my completely predictable predictions.
Notice how all of my predictions sort-of exist already. This is what makes predictions so easy.
Where Are the Flying Cars? (Score:2, Interesting)
We were shown slides of how the OS would link multiple machines and faults could be automatically tolerated and hardware hot-swapped for repairs. Plasma panels would provide fully bitmapped presentations. A new language (PLAS) would make bugs a thing of the past. We thought it was pretty cool.
THEN, we were told that this is EXACTLY THE SAME SHOW (slides and all... except for PLAS) as was presented for the System/360... and THAT WAS EXACTLY THE SAME show as presented for the 7090... and THAT WAS EXACTLY THE SAME SHOW... Dumb as we were, we did realize that we hadn't done crap and that all the plans had come to naught.
So... now that it's 2002, where're the flying cars I was promised would be here by 2000!?!?
Expectation is Key to Reliability (Score:3, Interesting)
First off, we should learn a lesson from biology. The bee, for example, has about a million interconnected neurons. Yet the bee's highly sophisticated behavior is extremely robust and efficient. How does nature do it? The answer has to do with parallelism and expectations.
1. Parallel processing ensures that signals are not delayed, i.e., their relative arrival times are guaranteed to be consistent.
2. Expectations are assumptions that neurons make about the relative order of signal arrival times.
We can emulate the robustness of nature by first realizing that computing is really a species of a genus known as signal processing. We can obtain very high reliability by emulating the parallelism of nature and enforcing a program's expectations about the temporal order of messages: no signal/message should arrive before its time. The use of stringent timing constraints will ensure that interactions between multiple tiny modules remain consistently robust. Enforcement should be fully automated and an integral part of the OS.
Of course, this is only part of it. The other constraints (e.g., the use of plug-compatible links, strong typing, etc.) are known already. No message should be sent between objects without first establishing that plugs are connected to compatible sockets, i.e., that they are of the same type.
The most problematic aspect of computing, IMO, is that it is currently based on the algorithm. The problem is that algorithms wreak havoc on process timing, and the end result is unreliability. The algorithm should not be the basis of computing. To ensure reliability, computing should be based on signal processing. Algorithms should only be part of application design, not process design. Just one man's opinion.