Operating Systems of the Future 436

An anonymous reader writes: "'Imagine computers in a group providing disk storage for their users, transparently swapping files and optimizing their collective performance, all with no central administration.' Computerworld is predicting that over the next 10 years, operating systems will become highly distributed and 'self-healing,' and they'll collaborate with applications, making application programmers' jobs easier."
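The article is light on mechanism, but the flavor is roughly a peer-to-peer storage layer with no coordinator. Below is a purely illustrative Python sketch of one common way to get "no central administration": peers agree on file placement with a consistent hash, so any node can compute where a file lives on its own. All names (PeerRing, the peer hostnames) are invented for illustration; nothing here is from the article.

    # Illustrative only: peers decide where files live by hashing, so no
    # central server needs to administer placement.
    import hashlib
    from bisect import bisect_right

    def h(key: str) -> int:
        return int(hashlib.sha1(key.encode()).hexdigest(), 16)

    class PeerRing:
        """Consistent-hash ring: every peer can compute a file's owners."""
        def __init__(self, peers, replicas=2):
            self.replicas = replicas
            self.ring = sorted((h(p), p) for p in peers)

        def owners(self, filename):
            """Return the peers responsible for storing this file."""
            keys = [k for k, _ in self.ring]
            start = bisect_right(keys, h(filename)) % len(self.ring)
            return [self.ring[(start + i) % len(self.ring)][1]
                    for i in range(self.replicas)]

    ring = PeerRing(["alice-pc", "bob-laptop", "lab-server"])
    print(ring.owners("thesis.tex"))   # the same answer on every peer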
  • by chrysalis ( 50680 ) on Monday February 11, 2002 @02:35PM (#2988115) Homepage
    IMHO, future operating systems will tend toward something like the EROS operating system [eros-os.org]. This OS is built from many tiny, extremely reliable components within a strong capability model, providing a high level of security.
    It's definitely a good approach, although EROS is still quite experimental.
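    For readers unfamiliar with the term, here is a minimal sketch of the capability idea the comment refers to: access is granted only by holding a reference that names both an object and the rights over it, rather than by ambient user identity. The class names are hypothetical illustrations, not EROS APIs; in this Python sketch the object reference itself plays the role of the unforgeable token.

        # Hypothetical sketch of a capability: an unforgeable reference that
        # bundles an object with the operations the holder may perform on it.
        class Capability:
            def __init__(self, resource, rights):
                self._resource = resource
                self._rights = frozenset(rights)

            def invoke(self, op, *args):
                if op not in self._rights:
                    raise PermissionError(f"capability does not grant '{op}'")
                return getattr(self._resource, op)(*args)

        class File:
            def __init__(self):
                self._data = b""
            def read(self):
                return self._data
            def write(self, data):
                self._data = data

        f = File()
        read_write = Capability(f, {"read", "write"})  # kept by the owner
        read_only = Capability(f, {"read"})            # safe to hand out

        read_write.invoke("write", b"hello")
        print(read_only.invoke("read"))                # b'hello'
        # read_only.invoke("write", b"x")              # raises PermissionError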


  • by base3 ( 539820 ) on Monday February 11, 2002 @02:44PM (#2988209)
    Beware this "distributed storage" push. As the intellectual "property" "industries" gain more and more control of the world's governments, storage will be in the hands of a few large companies, and not under the control of individual users.

    Your digital "rights" managed TrustedPCs will connect to a giant virtual disk array via the network, where what you store will be subject to government and corporate monitoring and removal.

    Think this is nuts? Where are the 200GB drives? Why is Intuit pushing us to store tax and financial information on their site? Why does Microsoft want to give us an authentication token that's good for retrieving our information "anywhere, anytime"?

    Why would anyone (other than a legitimate large corporation) have a need for local storage, once the Internet storage product is fast and cheap? I can only imagine one use for local storage--copyright infringement.

  • Scary (Score:2, Interesting)

    by amaprotu ( 527512 ) on Monday February 11, 2002 @02:46PM (#2988238) Homepage
    'Self-healing' scares me. I'm not entirely sure why, but I want to be in control of my computer. I'm afraid that with 'self-healing,' my computer could install things I don't want installed, uninstall things I do want, and send all my information to Big Brother.

    Now, if it were an open-source, distributed OS with self-healing, I might be OK with it. I guess I just object to giving that much control to a large corporation whose main concern is profit, not my privacy.
  • Hmmm... (Score:5, Interesting)

    by dghcasp ( 459766 ) on Monday February 11, 2002 @02:48PM (#2988253)
    Oh, you mean something like Plan 9 [fywss.com] from Bell Labs?

    I predict that there will never be a revolutionary new operating system until we break free of the chains imposed by POSIX compliance. Until then, we're stuck with files that have to be streams of bytes, ugo-style permissions (illustrated in the sketch after this comment), non-wandering processes, incompatible RPC calls, &c.

    And the real pain is that there have been OSes with simple & elegant solutions to problems that are hard under Unix (Aegis, Multics, VMS, TOPS, ...), all pushed aside by the steamroller that is Unix.

    But to be fair, many of the forgotten OSes are now forgotten because they weren't as general-purpose as Unix. Unix is the great compromise. But it's hard to strive for the best when you've already accepted compromise.
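    As a quick illustration of the "ugo-style permissions" mentioned above: POSIX packs file access rights into nine mode bits, three each for the user (owner), the group, and others. A small Python sketch using only the standard stat module:

        # The classic user/group/other permission model in nine bits.
        import stat

        mode = 0o750                               # u=rwx, g=r-x, o=---
        print(stat.filemode(stat.S_IFREG | mode))  # -rwxr-x---
        print(bool(mode & stat.S_IRGRP))           # group can read   -> True
        print(bool(mode & stat.S_IWOTH))           # others can write -> False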

  • by Guppy06 ( 410832 ) on Monday February 11, 2002 @03:14PM (#2988457)
    "Imagine computers in a group providing disk storage for their users, transparently swapping files and optimizing their collective performance, all with no central administration."

    Whoever thought up this pipe dream apparently doesn't understand the Zeroth Law of Network Security: If you want information to be secure, DON'T PUT IT ON THE FUCKING NETWORK!

    Seriously! As if most business OSes don't default to the least-secure settings already! Why would you want to run important apps on a system where the default is to share anything and everything with any computer in listening distance?
  • by Whatsthiswhatsthis ( 466781 ) on Monday February 11, 2002 @03:19PM (#2988499)
    ...there won't be much drastic change over the next 18 years. For evidence of this, look at the Apple Lisa. The Lisa had windows, icons, a menu bar, a WYSIWYG interface, and a mouse. Today's computers are little more than a glorified Lisa interface, whether they are running Mac OS X or Windows XP (I know because I run both). Like the Lisa, today's computers still crash and still corrupt themselves. I doubt that this could be easily changed in the next five, ten, or even fifteen years.

    I'll believe the distributed file-storage myth when I see it. To me, it sounds as if it would hog bandwidth, just like Gnutella does. I don't see any change coming in the way I store files on my computer. It's fast, efficient, and hasn't needed a change.

    SysAdmins need not quit their day-jobs. As long as Microsoft is providing this technology, you can be sure that it will run into snags and security vulnerabilities. Increased complexity = increased vulnerability.

    ...and that's all I've got to say about that
  • by Chris Burke ( 6130 ) on Monday February 11, 2002 @03:37PM (#2988620) Homepage
    No one can predict what will happen in 10 years. Anyone who claims that this "is" what will happen is selling something. Anyone who says "maybe" is admitting that they are engaging in pointless masturbation. In 10 years, events run far outside of anyone's ability to predict cause-and-effect.

    That's just in general. Apply it to the technology sector, and it becomes even more true. About the best you can do is say "wouldn't it be cool if...?" But basically these guys just take an interesting research paper (out of the thousands out there) and act like that's what's actually going to happen.

    But I'm better than them! I really can predict the future! I predict that in 10 years, there'll be a bunch of people predicting what will happen 10 years from then, and nearly all of them will end up being wrong. That's right, you heard it here first.

  • by f00zbll ( 526151 ) on Monday February 11, 2002 @03:48PM (#2988668)
    The article is poorly researched. IBM's autonomic computing != Farsite. IBM's autonomic computing [ibm.com] is a very ambitious project. Here's the opening paragraph from the autonomic site:

    IBM believes that we are at just such a threshold right now in computing. The millions of businesses, billions of humans that compose them, and trillions of devices that they will depend upon all require the services of the I/T industry to keep them running. And it's not just a matter of numbers. It's the complexity of these systems and the way they work together that is creating a shortage of skilled I/T workers to manage all of the systems. It's a problem that's not going away, but will grow exponentially, just as our dependence on technology has.

    From my understanding, autonomic computing and other projects like it are going for something much bigger than "let's make our OS smarter." I seriously doubt this is targeted at the consumer, since there are too many privacy issues. The real benefit of "self-healing" is in the corporate environment, where uptime is critical. Autonomic computing's goal, as I read it, is to make systems work together seamlessly to improve reliability and scalability. Say a server has a hardware problem or a switch is dying. Things like these can cause real financial losses, so smart systems that reconfigure and heal themselves could reduce the cost of hardware and software failures. How many times have admins had to get up at 3 a.m. to fix the web server because some log ran amok and ate up all the disk space? (A rough sketch of that kind of watchdog follows this comment.) Having a standard system for handling these problems would help make systems more reliable.

    Too many reporters are getting way too lazy.
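    To make the 3 a.m. example concrete, here is a toy sketch of one narrow kind of "self-healing": a watchdog that notices a filesystem filling up and rotates the largest log before it eats all the space. The directory, threshold, and rotation policy are invented for illustration; real autonomic systems are of course far broader than this.

        # Toy self-healing watchdog: compress and truncate a runaway log
        # before the disk fills, instead of paging an admin.
        import glob
        import gzip
        import os
        import shutil
        import time

        LOG_DIR = "/var/log/myapp"   # hypothetical application log directory
        USAGE_LIMIT = 0.90           # act when the filesystem is 90% full

        def disk_usage_fraction(path: str) -> float:
            usage = shutil.disk_usage(path)
            return usage.used / usage.total

        def rotate_largest_log(log_dir: str) -> None:
            logs = glob.glob(os.path.join(log_dir, "*.log"))
            if not logs:
                return
            biggest = max(logs, key=os.path.getsize)
            with open(biggest, "rb") as src, gzip.open(biggest + ".gz", "wb") as dst:
                shutil.copyfileobj(src, dst)   # keep a compressed copy aside
            open(biggest, "w").close()         # then truncate the live log

        def watchdog(interval: float = 60.0) -> None:
            while True:
                if disk_usage_fraction(LOG_DIR) > USAGE_LIMIT:
                    rotate_largest_log(LOG_DIR)
                time.sleep(interval)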

  • by Drazi100 ( 458128 ) on Monday February 11, 2002 @04:14PM (#2988874)
    Not only that, but the mouse-and-click interface has been around since '83, and if you count the Star workstation, then that was the '70s too.
  • VMS (Score:2, Interesting)

    by D.Throttle ( 432930 ) on Monday February 11, 2002 @04:40PM (#2989026)
    VMS has been doing all of those things for years. Now can anyone tell me where it is right now?
  • by nathanh ( 1214 ) on Monday February 11, 2002 @04:41PM (#2989029) Homepage

    My strong belief is that the best "predictions" occur when you find something in use today - only too expensive for the home user - and "predict" it will be ubiquitous within a few years. So here are my completely predictable predictions.

    1. Stereo equipment will start to offer Ethernet ports and "integration with your home computer". Initially this will be limited to song selections via Windows-only software.
    2. Affordable SANs will become popular. Initially this will occur within school/university labs but the gear will spread into "tech homes" as well.
    3. The word processor will become "that thing you get for free with your computer" thanks to efforts from Sun and OpenOffice, similar to what currently occurs with web browsers and media players.
    4. People will get sick of managing hundreds of incompatible devices; stereo, computer, MP3 player, discman, mobile phone, PDA, etc. Vendors will form large alliances to offer an integrated system.

    Notice how all of my predictions sort-of exist already. This is what makes predictions so easy.

  • by WeBMartians ( 558189 ) on Monday February 11, 2002 @06:13PM (#2990046)
    The Geezer remembers a presentation at IBM when it decided to make FS ("Future System")... about 1980...

    We were shown slides of how the OS would link multiple machines and faults could be automatically tolerated and hardware hot-swapped for repairs. Plasma panels would provide fully bitmapped presentations. A new language (PLAS) would make bugs a thing of the past. We thought it was pretty cool.

    THEN, we were told that this is EXACTLY THE SAME SHOW (slides and all... except for PLAS) as was presented for the System/360... and THAT WAS EXACTLY THE SAME show as presented for the 7090... and THAT WAS EXACTLY THE SAME SHOW... Dumb as we were, we did realize that we hadn't done crap and that all the plans had come to naught.

    So... now that it's 2002, where're the flying cars I was promised would be here by 2000!?!?
  • by Louis Savain ( 65843 ) on Monday February 11, 2002 @08:57PM (#2991346) Homepage
    Sure, you've got to start with reliable components, but you have to combine them in just the right way, too.

    First off, we should learn a lesson from biology. The bee, for example, has about a million interconnected neurons. Yet the bee's highly sophisticated behavior is extremely robust and efficient. How does nature do it? The answer has to do with parallelism and expectations.

    1. Parallel processing ensures that signals are not delayed, i.e., their relative arrival times are guaranteed to be consistent.

    2. Expectations are assumptions that neurons make about the relative order of signal arrival times.

    We can emulate the robustness of nature by first realizing that computing is really a species of the genus known as signal processing. We can obtain very high reliability by emulating the parallelism of nature and enforcing a program's expectations about the temporal order of messages: no signal/message should arrive before its time. The use of stringent timing constraints will ensure that interactions between multiple tiny modules remain consistently robust. Enforcement should be fully automated and an integral part of the OS (a toy sketch of such enforcement follows this comment).

    Of course, this is only part of it. The other constraints (e.g., the use of plug-compatible links, strong typing, etc.) are already known. No message should be sent between objects unless it is first established that the plugs are connected to compatible sockets, i.e., that they are of the same type.

    The most problematic aspect of computing, IMO, is that it is currently based on the algorithm. The problem is that algorithms wreak havoc on process timing, and the end result is unreliability. The algorithm should not be the basis of computing. To ensure reliability, computing should be based on signal processing. Algorithms should only be part of application design, not process design. Just one man's opinion.
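    A toy Python sketch of the two constraints described above, with invented names: a TypedLink refuses to carry a message whose kind does not match its socket (plug compatibility), and an OrderedReceiver declares the relative order in which it expects signals and rejects anything that arrives before its time. This is only one way to model the idea, not the commenter's design.

        # Invented illustration: typed links plus enforced arrival order.
        from dataclasses import dataclass

        @dataclass
        class Message:
            kind: str          # signal name, e.g. "sensor" or "motor"
            payload: object

        class OrderedReceiver:
            """Enforces an expectation about the relative order of signals."""
            def __init__(self, expected_order):
                self.expected_order = expected_order
                self.position = 0

            def deliver(self, msg: Message) -> None:
                expected = self.expected_order[self.position % len(self.expected_order)]
                if msg.kind != expected:
                    raise RuntimeError(
                        f"'{msg.kind}' arrived before its time; expected '{expected}'")
                self.position += 1
                print(f"handled {msg.kind}: {msg.payload}")

        class TypedLink:
            """A plug/socket pair: only one message kind may cross this link."""
            def __init__(self, kind: str, target: OrderedReceiver):
                self.kind = kind
                self.target = target

            def send(self, msg: Message) -> None:
                if msg.kind != self.kind:
                    raise TypeError(
                        f"plug '{msg.kind}' does not fit socket '{self.kind}'")
                self.target.deliver(msg)

        receiver = OrderedReceiver(expected_order=["sensor", "motor"])
        sensor_link = TypedLink("sensor", receiver)
        motor_link = TypedLink("motor", receiver)

        sensor_link.send(Message("sensor", 42))    # matches the expectation
        motor_link.send(Message("motor", "go"))    # next in the expected order
        # motor_link.send(Message("motor", "go"))  # would raise: before its time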
