Where Are Operating Systems Headed?

An anonymous reader writes "Dr. Dobb's Michael Swaine breaks down the question of where operating systems are headed. Among his teasers: Is Vista the last version of desktop Windows? (Counterintuitively, he says no.); Did Linux miss its window on the desktop? (Maybe.) And, most interestingly, are OSes at this point no longer necessary? He calls out the Symbian smartphone OS as something to keep an eye on, and reassures us that Hollywood-style OSes are not in our short-term future. Where do you weigh in on the future of operating systems? In ten years will we all be running applications via the internet?"
  • What's the point? (Score:5, Interesting)

    by itsmilesdavis ( 985700 ) on Friday February 09, 2007 @12:52PM (#17949808)
    Everybody is talking about running applications through the internet. Why would we, as consumers, want to do this? The RIAA and MPAA are attempting to limit our ability to make backups of things we purchase. Now, software appears to be heading in the same direction. If we start streaming applications, then we could easily get into a pay-as-you-use function, or some other horrid distribution system. Frankly, I would not want to be charged every time I open a text document, or an IM window, or an internet browser. And I don't like the idea of paying a subscription fee either. I think forcing people to stream applications through the internet will only push more people into using Linux, so that everything is right there on the machine.
  • by tchuladdiass ( 174342 ) on Friday February 09, 2007 @12:53PM (#17949828) Homepage
    The definition of an operating system I like to use is:
    An OS is a collection of code that is used by software to manage access to system hardware via a well defined API, along with a collection of standardized utilities that provide for user access and management of system hardware and data structures and data streams associated with that hardware.

    So, under this definition, the kernel is a piece of the OS, disk access utilities are part of the OS, but applets such as a mini word processor and paint program are merely bundled utilities.
  • by Peter Trepan ( 572016 ) on Friday February 09, 2007 @12:53PM (#17949836)

    I'm worried that we're going to keep building on top of the macrokernels we already have, without cleaning up and simplifying things as we go. I'm worried that the future will be as presented in Vernor Vinge's A Deepness in the Sky, where everyone runs an operating system too large, un-modular, and spaghetti-like for anyone to understand, much less debug. Hurry with The Hurd, RMS!

  • by 192939495969798999 ( 58312 ) <info AT devinmoore DOT com> on Friday February 09, 2007 @12:55PM (#17949886) Homepage Journal
    The vast, vast majority of internet-goers are already running a lot of stuff on the internet, like email, various activex controls, etc., which technically aren't traditionally installed apps, even if they're not entirely internet-based either. The transition phase is over, and now that more and more internet-based apps are coming out, it will just be a more diverse environment -- not just a "pc only" or "internet only" world.
  • Consumer devices (Score:5, Interesting)

    by pubjames ( 468013 ) on Friday February 09, 2007 @12:56PM (#17949900)

    Two words: Consumer devices.

    I think Steve Jobs has seen the future, and realised that the PC won't be so important, the action is all going to move to various types of devices aimed at consumers. So, he started with music players, is moving into portable video/gaming and now of course telephones, and has made the first steps towards TV. Television is the biggie of course, and I believe Jobs is being deliberately low key about his intentions there - with the low key announcement of the Apple TV box, for instance.

    Here's a prediction, in the next few years Steve Jobs is going to make a presentation where he says something like "First we revolutionised the personal computer, then the music player and the telephone. Now we're going to revolutionise television..."

  • by AKAImBatman ( 238306 ) * <akaimbatman@gmaYEATSil.com minus poet> on Friday February 09, 2007 @12:59PM (#17949942) Homepage Journal
    What is OpenGL? ODBC? SDL? XLib? They aren't part of the Operating System, and yet they're not programs. What are they?

    Programmers think of them individually as APIs. Collectively, however, they add up to the platform the software targets. As long as that platform is available, the software is portable.
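    That "APIs add up to a platform" point can be illustrated with a toy abstraction layer (all names here are hypothetical, not from any real toolkit): the application is written only against the API, so the backend underneath can be swapped without touching application code -- the same trick SDL, OpenGL, or ODBC pull at much larger scale.

    ```python
    # Toy platform layer: the app targets an abstract API, not the OS.
    # Backend names are made up; real layers (SDL, OpenGL) work analogously.

    class Surface:
        """Abstract drawing API the application is written against."""
        def draw_text(self, text):
            raise NotImplementedError

    class X11Surface(Surface):          # stand-in for a Unix backend
        def draw_text(self, text):
            return f"[x11] {text}"

    class GDISurface(Surface):          # stand-in for a Windows backend
        def draw_text(self, text):
            return f"[gdi] {text}"

    def application(surface: Surface):
        # Portable: only the platform API is used, never the OS directly.
        return surface.draw_text("hello")

    print(application(X11Surface()))    # [x11] hello
    print(application(GDISurface()))    # [gdi] hello
    ```

    As long as every target ships a working backend, the application itself never changes -- which is exactly what makes the platform, not the OS, the thing software targets.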
  • BeOS (Score:1, Interesting)

    by Anonymous Coward on Friday February 09, 2007 @01:01PM (#17949972)
    BeOS was/is the future. If only we could get the source code.
  • Why? (Score:3, Interesting)

    by khasim ( 1285 ) <brandioch.conner@gmail.com> on Friday February 09, 2007 @01:01PM (#17949978)
    Why include things like "a mini word processor"? That gets into too much interpretation of what "mini" is.

    I prefer to define an OS as the code that controls the local hardware.

    If the OS allows some other app to control the local hardware then that OS has a "vulnerability" and is not "secure". There are lots of examples of that in history.

    Apps run on the OS. And app can be something such as Java which can run apps itself. But Java should never be touching the local hardware.
  • by wsanders ( 114993 ) on Friday February 09, 2007 @01:01PM (#17949982) Homepage
    I'm inspired by Ray Kurzweil's keynote at RSA Conference 2007. http://singularity.com/ [singularity.com]

    If you're a M$ hater, just wait until "sap and impurify your precious bodily fluids" is a system requirement.

    Among other nanotechnological breakthroughs, Kurzweil says it will be possible to inject robotic blood cells that will enable you to "sit at the bottom of a swimming pool for 4 hours."

    OK, for now I'll settle for Fedora Core 42 and nano-robots that will let me drink as much red wine as I want without getting a headache.
  • by davidwr ( 791652 ) on Friday February 09, 2007 @01:07PM (#17950052) Homepage Journal
    For a computer to be useful, you need hardware, applications, and input and output. That's it, nothing more.

    Everything in between is there as a convenience.

    Whether it's convenience library routines like math libraries, a hardware-abstraction or -virtualization layer, or things that let more than one application coexist and even communicate, or whatever, OSes and other "in between" parts of a computer are there to make the application more useful, easier to write and maintain, or both.

    We will always have these in-between layers. Whether the "in between" layers of the 22nd century are anything like today's OSes only time will tell.

    Personally, I think 10 years from now you will see just about every application running in an isolated environment, possibly a VM of sorts. In particular, applications which access machines or applications that are not "trusted" will be run isolated from other applications on the system. They will be able to save files to a scratchpad area and send events to certain other applications such as a printing subsystem, but that's about it. Applications will communicate with other applications on the same PC in much the same way distributed applications, such as a web application, communicate today.
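    A minimal sketch of that last idea -- two programs on one machine talking the way distributed apps do, over a socket rather than shared state. The "printing subsystem" is a hypothetical stand-in, and threads stand in for isolated processes:

    ```python
    # Two "applications" on one machine talking over a local socket,
    # exactly as distributed apps would across a network.
    import socket
    import threading

    def printing_subsystem(server_sock):
        conn, _ = server_sock.accept()
        data = conn.recv(1024)          # receive a "print job" event
        conn.sendall(b"ACK:" + data)
        conn.close()

    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))       # ephemeral port on loopback
    server.listen(1)
    port = server.getsockname()[1]

    t = threading.Thread(target=printing_subsystem, args=(server,))
    t.start()

    # The isolated "application" side knows only the socket endpoint,
    # not anything about the other program's internals.
    client = socket.create_connection(("127.0.0.1", port))
    client.sendall(b"print report.txt")
    reply = client.recv(1024)
    client.close()
    t.join()

    print(reply.decode())               # ACK:print report.txt
    ```

    Nothing in the client changes if the "subsystem" later moves to another machine -- only the address does, which is the appeal of the distributed-style model.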

    By 2017, I also see most applications using virtually no local storage except security credentials and cached data. All "real data" will be stored on "the big server in the sky" or "the big server run by the IT department." The exceptions will be applications demanding extreme privacy, such as diaries and non-networked dayplanners, applications demanding offline use, such as cellphone notepads, and "convenience applications" like calculators and non-networked games.

    By the time our Kindergarteners reach High School, the distinction between wristwatch, cellphone/PDA, and laptop/desktop/home-entertainment-center will be one of scale and purpose, not architecture or raw capability.
  • by Morgaine ( 4316 ) on Friday February 09, 2007 @01:10PM (#17950098)
    Until devices and other hardware components have enough built-in intelligence to communicate with each other and with user programs, and until their built-in intelligence is presented to applications through a standardized communications interface, there will always be a role for operating systems.

    And the reason is simply that this is the primary role of an O/S: to glue together many rather dumb components (some virtual, some non-local), and to provide a standard abstraction for them, so that applications can be programmed with a degree of sanity. Everything that O/Ss do can be considered in those terms.

    Host operating systems will disappear when they are no longer needed. And *that* will happen only when/if their key functions have migrated into the hardware, so it's a defensible argument to say that actually they will never really disappear, but transform.
  • Re:Consumer devices (Score:3, Interesting)

    by pubjames ( 468013 ) on Friday February 09, 2007 @01:10PM (#17950104)
    Hate to kill your Apple dream but Microsoft will do mainstream IPTV with the xbox360 way sooner,

    Yes, and they did phones sooner, and I believe they had PDAs that play music before the iPod.

    The difference is that Jobs has a very clear idea of what consumers want. My old mum isn't going to buy an XBox360 to watch TV on it. Nor am I for that matter.
     
  • by johnhennessy ( 94737 ) on Friday February 09, 2007 @01:15PM (#17950218)

    While the author correctly identifies a huge potential market for smartphones in the coming years, maybe his assumptions about Symbian are a little naive.

    These smartphones are becoming popular because they are becoming more and more like a standard PC every day. The only exception being the user interface (if anyone has an idea how to fix this, give me a call ! I promise to share in the huge profits ! ).

    This is facilitated by the increasing processor power that these phones have available to them. Symbian was designed for small-memory, low-performance processors with incredibly strict power consumption requirements and limited connectivity, running in a highly controlled (i.e. locked-down software) environment.

    The cost of developing drivers for Symbian (with all its quirks) is enormous. At the moment, the semiconductor companies are getting hit with the cost of this development. This will not last forever; they will always strive for the cheapest possible solution - and this helps explain Linux's large penetration in this market.

    The company that holds the best cards in this field is Apple. They have waited until mobile devices have become powerful enough to run (only slightly modified) standard PC kernels (XNU). This is going to save them a fortune in the years to come. Microsoft has missed this boat - they are trying to split their OS into as many different branches/versions/flavours as possible, while neglecting the requirement to try and maintain a common "brand" across all devices.

  • by davidwr ( 791652 ) on Friday February 09, 2007 @01:23PM (#17950338) Homepage Journal
    The point of Internet applications, or equally, Intranet applications, is "run anywhere" convenience.

    My ISP offers webmail. If I use it instead of POP, I can read my mail anywhere, anytime. In exchange, I lose the privacy that comes with keeping my data local. I also lose the ability to read my mail when the ISP has a hiccup.

    Google offers maps. In most cases Google Maps is a lot more convenient than firing up my local street-maps program. It's also "run anywhere."

    On the other hand, I don't think I'd want my doctor to put my medical records on any online database unless it was very secure and run by trustworthy people and didn't allow unencrypted connections.
  • Some cleanup happens (Score:3, Interesting)

    by davidwr ( 791652 ) on Friday February 09, 2007 @01:29PM (#17950440) Homepage Journal
    There was a time when people tried to cram an http server into the Linux kernel.

    It may still be there, but it's not used outside special-purpose environments.

    Likewise, until recently people tried to cram almost every filesystem and pseudo-filesystem under the sun into the Linux kernel. With the advent of FUSE, future pseudo-filesystems and even real ones will be in userland. Sure they won't perform as well but at least they won't kill the kernel when they bug out.
  • Re:Consumer devices (Score:3, Interesting)

    by ObiWanKenblowme ( 718510 ) on Friday February 09, 2007 @02:36PM (#17951578)
    No, Microsoft knows how to cram a bunch of features into a product and tout that as an improvement. More features does not necessarily equate to a better product. FM radio? Really, is that a feature that the portable music player market has been demanding? Furthermore, I'd expect the Zune's ability to share music (but only for 3 days) to frustrate and alienate more average users than excite them. Please don't make the same mistake that a lot of other /.ers do and assume that because you want it, the rest of the target market must want it.
  • by Dunbal ( 464142 ) on Friday February 09, 2007 @02:49PM (#17951778)
    robotic blood cells that will enable you to "sit at the bottom of a swimming pool for 4 hours."


          I wonder what those "blood cells" are going to do with all that HCO3-? Lung physiology doesn't simply consist of oxygenating the blood. The lung also has to get rid of that excess CO2 (dissolved in the blood as HCO3-), otherwise the blood pH will decrease very quickly, leading to respiratory acidosis and death. I'm not sure how you can breathe out without breathing in, though...

          Oh, and don't forget about how those lungs you're not using will collapse as the gases in them get absorbed over those 4 hours. It's called atelectasis. Then think about how this prevents the lung from cleaning itself, and how many bacteria will have reproduced in those static lungs in 4 hours. The person will be looking forward to a severe bilateral pneumonia within a day or so. Whoever proposed this is obviously NOT a physician - or needs to review physiology rather urgently. It's a terrible idea.
  • by misleb ( 129952 ) on Friday February 09, 2007 @02:54PM (#17951852)

    My ISP offers webmail. If I use it instead of POP, I can read my mail anywhere, anytime. In exchange, I lose the privacy that comes with keeping my data local. I also lose the ability to read my mail when the ISP has a hiccup.


    Try a service that has IMAP. Or have POP leave a copy on the server (though that's not as good as IMAP). One big problem with relying on webmail is that you can't easily integrate multiple accounts into one interface. Most webmail services are designed to access that service only (although some have the option to pull in other accounts, but then you're back to IMAP/POP).

    It is a big pain to have to browse to different sites to get your mail. I have Apple Mail configured on several computers, all accessing the same IMAP accounts. Webmail is a nice *option* to have available in case I'm somewhere that doesn't have Apple Mail, but I'd hate to use webmail all the time. Gmail is "OK," but I still prefer local applications... particularly Apple Mail.

    Google offers maps. In most cases Google Maps is a lot more convenient than firing up my local street-maps program. It's also "run anywhere."


    For quick queries, yeah, the online stuff is good enough. But if you want to do anything more advanced like complex trip planning, firing up a local app is usually best.

    Personally, I don't see local apps disappearing for quite some time. I think online apps tend to work IN ADDITION TO local apps. Not as a replacement.

    -matthew
  • by Doctor_Jest ( 688315 ) * on Friday February 09, 2007 @03:00PM (#17951942)
    We had that once before... and the personal computer shattered that model.... Ever since then, companies, governments, and people who crave control have been trying to push us back to that model.

    I'll use a pad and paper before I'll go back to dinosaur computing.... I don't have to be connected to the internet to use my computer/game console/phone... But it does add convenience... I don't know if I want to trade autonomy and 100% control of my computing devices for the ubiquity of "scaling purpose". That's just me....

    Of course, I'm not the target market for such changes... and you hit the nail on the head regarding who will be most likely to embrace this shift in technology... Ah crap... I'm such a dinosaur... :)

  • Michael Swaine. Dr. Dobbs Journal. Yeah sure. Geez, what a pretentious twat you are.

    I think your own post shows far more pretentiousness than mine does. I have the highest respect for Mr. Swaine, and the work he has done in the field of computer journalism. But that doesn't mean that everyone always expresses themselves clearly, or even has a solid enough concept of what they wish to communicate on paper. (As a fellow author - no, I'm not talking about blogging - I can identify.)

    Fundamentally, I'm not disagreeing with Mr. Swaine. Only expounding on what he's attempting to say, and (hopefully) removing his confusion. If and when he reads this (which is a very likely possibility), I hope he thinks, "Yes! That is exactly what I was thinking!" :)
  • by mlgm ( 61962 ) on Friday February 09, 2007 @03:57PM (#17952778) Homepage
    I believe (or at least I hope) that the future of operating systems does not lie in fancier user interfaces, but in making the computer more responsive.

    Do you know that today's computers are really fast? I mean, those GHz processors are incredibly fast, it is unbelievable what they are able to do in a second. But you might not know it from just using a computer.

    In my daily work I often receive very slow responses from both Windows and Linux machines. I often have to wait seconds for things that should (and could) be instant. I mean after the screen saver on my desktop machine locks the screen, the next user request invariably will be to unlock it. The OS should know that. And it should sit there waiting for any sign that its master wants to work again and then it should instantly present the password dialog.

    Or what about those apps where I have to look for seconds at animated splash screens saying that they load this or that module or plugin? Why can't the OS provide a means for loading pre-initialized applications (some folks might remember the undump utility)?

    There are possible performance improvements all over the place, which could be achieved by using techniques like caching, database technology, or being able to hint to the operating system which resources might be needed next. Together with maybe a little more RAM, this could create a really reactive user experience.
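    A rough sketch of the pre-initialization idea (in the spirit of undump, though undump itself dumped a whole process image): do the expensive startup once, snapshot the resulting state, and have later launches restore the snapshot instead of redoing the work. All names here are illustrative:

    ```python
    # Sketch of the undump idea: pay the expensive initialization once,
    # snapshot the initialized state, and "launch" later instances by
    # restoring the snapshot instead of re-initializing.
    import pickle
    import time

    def expensive_init():
        # Stand-in for plugin loading / module scanning at app startup.
        time.sleep(0.2)
        return {"plugins": ["spellcheck", "exporter"], "ready": True}

    # Done once, ahead of time (e.g. at install or login).
    snapshot = pickle.dumps(expensive_init())

    def fast_launch():
        # Restoring the dump is near-instant compared to re-initializing.
        return pickle.loads(snapshot)

    t0 = time.perf_counter()
    app = fast_launch()
    elapsed = time.perf_counter() - t0

    print(app["ready"], f"launched in {elapsed * 1000:.1f} ms")
    ```

    The same principle shows up today in prelinking and in zygote-style process forking: the OS or runtime keeps a warm, initialized image around so launch time is dominated by restore, not setup.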

    I often wonder how you can spend so much money for creating software and come up with such bad and slow design :-).

  • by ratboy666 ( 104074 ) <fred_weigel@[ ]mail.com ['hot' in gap]> on Friday February 09, 2007 @04:21PM (#17953116) Journal
    And you have it.

    BEGIN RANT

    Isolated processes, running on hardware or VMs, or as processes under an OS. Using network semantics to communicate. A simple model -- forget about threads and the attendant semantic issues. The model is already supported, and even "Windows" can participate (although that locks us into the SOCKETS API). On top of that we can have RPC, shared storage, time and identification services, etc.

    Works wonders, and it has brought us to where we are today. The model can continue growing. Except that it really isn't the "preferred" model for Windows development. Indeed, the preferred Windows methodology is to use MSVC, and bind the application code into the GUI. Windows doesn't even ship with an X server!

    For HPC, we need somewhat different models -- the latencies imposed by typical network stacks do not permit the performance levels needed by the computation "parcels". But, this is (generally) dealt with by source language extensions, that hide the interconnect issues.

    Is this the future? Maybe (or not, I am horrible at this game). But it is the present. My home computer is a network. Storage is centralized into a RAID-5 server, serving out NFS directories, including HOME directories, and Operating Environment pieces. Using automount, of course, to give a consistent internal view of the filesystem. Stations use NIS for login, and automount maps, etc., giving consistent login and home directories and tools. NTP keeps the time the same on the different parts. IMAP provides consistent mail services. DHCP handles the mundane assignment of IP address space, and informs the parts of where such things as the local NTP resources are. It doesn't matter whether a part is running on a machine, or under a Virtual Machine (I deploy VMware Server). A CVS server handles projects, and an SQL (MySQL) server handles database storage as needed (for media tracking for MythTV, mostly, although there are other databases).

    It Just Works. The Network Is The Computer. Two ideas, melded together. Of course, Windows is an ugly stepchild in my environment (It works, but needs tweaking, and there is an almost ungodly amount of bending in the infrastructure for support). MAC OS X? I don't know. Nobody has ever tried an Apple laptop in my home office, so I can't comment. (but, initial feelings -- NIS support may or may not work, NFS probably does, X probably does, automounter seems to be almost a foreign idea to most MAC users I talk with -- take that with a grain of salt). Solaris? An easy fit -- I use it. Linux? A no-brainer, HP/UX and AIX? Easy. (though I don't use them).

    I even extend the network with fixed-function devices (DSM-320 DLink media receiver). It uses the "UPNP standard". Now, I am not sure that standard was actually needed, but I do support it.

    All brought to us by the "simple" POSIX API and semantics, and SOCKETS.

    A new direction of OS design? It's a bit of a marketing show. It's easy to add glitz and shizzle to upper UI elements, but the OS is generally considered the resource-controlling layer.

    I don't want to make it sound like I think that layer is static. I think static is a good thing for basing current and future development on, but extensions are certainly welcome. The biggest changes, in my opinion, are the support for "zero-copy" operations. These can require either a great deal of care in setting up the exact circumstances under which such an optimization can be utilized, or a new API, opening the feature up to broader use. Fast select semantics, possibly through a new API, are another such area.

    Then we have the implementation of that layer -- the big news being virtualization and complete isolation.

    Everything else I have seen is, to be kind, a marketing driven "OS feature" that really shouldn't be discussed as an OS feature. This includes "3D desktops", the whole idea of a desktop, included applications, and even "what applications are supported".

    The last point is important. If the semantics of the OS conf
  • by eno2001 ( 527078 ) on Friday February 09, 2007 @04:22PM (#17953134) Homepage Journal
    I agree. But the average person doesn't make these distinctions. As far as they're concerned Microsoft Office or even Microsoft Word is the operating system. In terms of technical discussions, it would be nice if people would stick 100% to the technical viewpoint. But, even technical folks get distracted and slip into using "OS" to mean a complete kernel + subsystem + desktop environment + applications. Sadly I don't think most people will ever get the distinctions.

    I have yet to read the article so my comments should be taken in that light. One of the things I disagree with when I hear people saying that desktop apps will be replaced with web based apps is that this doesn't apply across the board. Who, in their right mind, would do video editing, audio production, 3D modeling and rendering via a web app? Those are legitimate uses for a computer that will never find a home on the web unless we all have guaranteed 1 terabit per second links to the server on which the application is hosted as well as dedicated resources on that server. In short, it ain't gonna happen. Some may argue that these aren't mainstream uses for computers, but you'd be wrong. With Microsoft and Apple packing video and audio editing tools into their OSes, you'd be VERY wrong to say that.

    The other thing that people who make such claims seem to assume is that everyone uses the computer just like they do. They assume that all people want is word processing, e-mail and the web. Maybe a little streaming media and music downloads and that's it. Again, they're totally wrong. The computer is such a flexible tool, it would be a shame to put the albatross of the perception of what an average user does with a machine on it from the factory. OSes are here to stay. They might eventually get a different designation, but they will still be OSes at the core from today's mainstream definition (OS kernel + apps to make the kernel useful to a user).

    Another reason I would assume the writer is all wet is that I've seen the future of OSes. In fact I'm living with the future of OSes at home and work. What people think of as an OS will in the future be completely disassociated from hardware. It will be ephemeral. It will be able to jump from one set of resources to another without the user even perceiving the switch, with all processes intact. This is what I'm currently doing with the Xen virtualization system. Your VM is only associated with its storage and subnet. The CPU and RAM that it's being executed on are irrelevant. The VM can be made to jump from one physical host to another without missing a beat and, more importantly, without your users ever knowing.

    Combine that feature with a system where CPU and RAM resources can be partitioned and allocated to or away from these system hopping VMs as well as a robust thin client approach and you see the beauty of this approach. No longer are you forced to give a user a ton of wasted RAM and CPU just to do typical desktop stuff. Now you give them only what they need and allow the system to increase or decrease the resources within parameters and on demand. So the typical user may get 128 Megs for basic use, but if they suddenly need more RAM, the system is configured to allocate their VM up to a max of 512 Megs. Meanwhile another resource intensive user might be off that day. The typical 512 Megs they are allocated is not being used. The lower limit for their VM is 64 Megs at idle. That gives you enough RAM to reallocate to the first user. The system takes care of this for you. The same with CPU time. All on the fly.
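    The reallocation policy described above can be sketched as a toy function (the numbers and VM names are hypothetical, and real Xen memory ballooning is considerably more involved): each VM's demand is clamped to its floor/ceiling band, and if the host is oversubscribed, RAM is reclaimed from VMs sitting above their floor:

    ```python
    # Toy sketch of the on-the-fly RAM reallocation policy described
    # above. All names and numbers (MB) are hypothetical.

    def rebalance(vms, host_ram):
        """Give each VM its demand, clamped to [floor, ceiling],
        then shrink toward floors if the host is oversubscribed."""
        alloc = {name: max(v["floor"], min(v["demand"], v["ceiling"]))
                 for name, v in vms.items()}
        # If the total exceeds host RAM, reclaim from VMs above their floor.
        excess = sum(alloc.values()) - host_ram
        for name, v in vms.items():
            if excess <= 0:
                break
            give_back = min(alloc[name] - v["floor"], excess)
            alloc[name] -= give_back
            excess -= give_back
        return alloc

    vms = {
        "desktop_user": {"floor": 128, "ceiling": 512, "demand": 512},
        "idle_user":    {"floor": 64,  "ceiling": 512, "demand": 64},
    }
    print(rebalance(vms, host_ram=576))  # {'desktop_user': 512, 'idle_user': 64}
    ```

    The idle user sits at their floor while the busy one gets the headroom, which is the essence of the scheme: allocate within parameters and on demand, not statically.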

    That's where OSes are going. Users will still interact with a desktop of a sort, but what happens in the background is going to be hugely different. Look up the info for Intel's HVM and AMD's SVM support (hardware assisted virtualization). They didn't bet the farm on that for nothing. And Longhorn from Microsoft is slated to have a hypervisor to take advantage of HVM and SVM, so Microsoft can't be ignoring this either. In fact they've shown a lot of interest in Xen. So I think the article is likely completely wrong. But, I'll be able to say for sure after I read it. Off to read it now...
  • by patternhunter ( 681871 ) on Friday February 09, 2007 @10:34PM (#17958520)
    My Thinkpad died after apparently one too many Linux distro installations (I was adding a new one about every other day for a while). As a long-time Mac user in a Windows world, adding another OS to the mix made sharing files an often frustrating experience. With much more emphasis on compatible formats, moving from Mac to Windows to Linux and back again is relatively painless these days. And yet, maybe the real promise of Web2.0 is to make the OS irrelevant.

    Cory Doctorow describes himself as "someone who lives in his browser." I would put myself in that category as well and have been messing around with the idea of creating my own application service provider ever since I first heard that term back in the 90s. I love the idea of using any cpu as a terminal on the net where all my data and applications are stored.

    Here are a few of the apps that are making this more possible all the time:

            * gmail - now with nearly 3GB of storage, my current store of over 1000 messages is only using 12% of capacity. At the rate that the service continues to upgrade capacity, I may never come even close to tapping out this service. Of course, Google may be running algorithms on all of us that will soon create a Minority Report world where we are bombarded with highly customized ad-sense commercials every time our rfid-embedded brains pass a location-aware plasma screen.
            * google calendar - with nice integration with gmail and the ical standard, this is a shareable and syncable web calendar that seems to get the job done for now and is sure to improve over time.
            * del.icio.us - still the best social bookmarking / tagging service for my money (as in none since it's free)
            * thinkfree online - this is a seriously cool product that I just started playing with over the past couple weeks. Despite the slower start time, this nifty little web app kicks Writely's ass by allowing you to create, share and store (up to 1GB for now) MS Office compatible docs, spreadsheets and presentations, all using a relatively intuitive interface that duplicates the look and feel of ThinkFree's desktop product (which is very similar to its Office counterpart). It even has wiki-type versioning history and allows you to post to a remote blog too.
            * openomy is one of a bunch of new data storage services on the web these days. Openomy is written in Rails and gives you a nice interface and 1GB of free storage.
            * bloglines is still my favorite web-based RSS reader. It is incredibly easy to use and is one of the first things that I open when I am traveling or just have a quick minute to check in with what is going on in the world (or at least the world that I am interested in)
            * So this sound great for common productivity tools but web-based apps will never replace apps like iTunes to play the music you have, right? Actually, Pandora, BlogMusik and similar apps to come might be even better to help you explore music you don't have (and both are free, at least for now)
            * E-Messenger and KoolIM are cool web-based instant messengers that allow you to IM with AIM, MSN, and Yahoo (including Yahoo Beta) without downloading any client software.

    With web-based applications and data storage that enable us to work and play beyond the desktop, could the "OS wars," and maybe the OS itself, soon be a thing of the past?
  • by master_p ( 608214 ) on Saturday February 10, 2007 @07:46AM (#17961638)
    The programming language C and the user/kernel mode will not survive for much longer.

    First of all, the C model has been proven to cause more problems than benefits. The C model is defined as the model where native code is executed directly by the hardware, absolute barriers exist between programs, the kernel routines live in a different universe than the programs etc.

    There are great problems with this model:

    1) co-operation between programs proves very difficult both for the O/S designer and the programmer. Very specialized mechanisms are required for programs to communicate: pipes, sockets, shared memory, etc. Those things work nicely, no doubt about that. But to code an API on top of them is not straightforward and it takes time.

    2) viewing a process as a giant array of bytes resulted in billions of dollars of damage in buffer overflow exploits, null & wild pointers, etc.
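    For reference, here is what one of the "very specialized mechanisms" from (1) looks like in practice -- a pipe between two cooperating processes, mediated entirely by the kernel (a POSIX-only sketch; the "programs" and their message are illustrative):

    ```python
    # A pipe, one of the specialized IPC mechanisms mentioned in (1):
    # two processes cooperate only through a byte stream the kernel
    # mediates; neither can touch the other's memory. POSIX-only.
    import os

    r, w = os.pipe()
    pid = os.fork()
    if pid == 0:
        # Child: the "producer" program.
        os.close(r)
        os.write(w, b"result=42")
        os._exit(0)

    # Parent: the "consumer" program.
    os.close(w)
    data = os.read(r, 1024)
    os.close(r)
    os.waitpid(pid, 0)
    print(data.decode())   # result=42
    ```

    It works, as the parent says -- but everything crossing the boundary must be serialized to bytes and parsed back, which is precisely the API-building overhead being complained about.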

    My prediction is that at some point in time, someone will come out with an O/S that is not based on C, but on a more advanced programming language, like Java, Smalltalk, Erlang or Haskell. And those O/Ses will prove that APIs are more important than O/Ses, and that modules are better than processes.
