Windows Microsoft Operating Systems Software Technology

Fresh Air For Windows? 645

Posted by timothy
from the reinvention dept.
jmcbain writes "The NY Times has an opinion piece on how the next Windows could be designed (even though Microsoft has already laid plans for Windows 7). The author suggests 'A monolithic operating system like Windows perpetuates an obsolete design. We don't need to load up our machines with bloated layers we won't use.' He also brings up the example of Apple breaking ties with its legacy OS when OS X was built. Can Windows move forward with a completely new, fast, and secure OS and still keep legacy application support?"
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Sunday June 29, 2008 @08:12PM (#23994563)

    Can Windows move forward with a completely new, fast, and secure OS and still keep legacy application support?

    Based on past performance: No.

    This has been another edition of Short Answers to Stupid Questions.

    • by Bill, Shooter of Bul (629286) on Sunday June 29, 2008 @08:34PM (#23994765) Journal
      The wise answer is "maybe". There are only two companies that have done something similar. Apple tried doing it from scratch, basically killed itself in the process, and had to adapt the already-written NeXT. Even that took forever and sucked for a couple of years before they got everything right. Microsoft did something similar with Windows NT: a ground-up modern rewrite that was mostly compatible with the existing Windows, but a lot of time passed between Win NT 3.50 and Win XP. So if they started right now from scratch, maybe in ten years they could have something that would be decent.
      • Re:Short answer: no (Score:5, Interesting)

        by siddesu (698447) on Sunday June 29, 2008 @09:14PM (#23995133)

        Yeah, indeed ;)

        Or they could, like, ditch all their work done so far, fork wine and make the new OS run on top of linux+wine, possibly off a sqlite-based WinFS ;)

        Then just port their platform libraries onto that, redo their visual tools as eclipse plugins -- and presto, you have best of both worlds.

        And fast ;)

        • by lilmunkysguy (740848) on Sunday June 29, 2008 @09:32PM (#23995279)
          I'm not sure how you were modded flamebait. I like your ideas.
        • by Tangent128 (1112197) on Sunday June 29, 2008 @10:49PM (#23995909)
          I don't know about the "Eclipse plugin" part, but I do suspect Microsoft has a contingency plan of that sort.

          Move to a nix-y kernel, release a full .NET port; maybe fork wine, or just use some more dog-foody compatibility layer.

          I suspect they'd introduce/keep their own API, though. I wouldn't expect X Windows to be bundled with (let's say) "Windows X"; they likely would use the transition to more strongly push Windows Forms over the older system, though.

          And of course, don't expect their addons to be Open Source, even if they do adopt the Linux kernel.

          In short, see OS X.
          • by hairyfeet (841228) <bassbeast1968@ g m a i l.com> on Monday June 30, 2008 @04:13AM (#23997579) Journal
            Personally, I wouldn't be surprised if, should Win7 turn out to be another dud, their next OS turns out similar to Xandros (in fact I wouldn't be surprised if they bought Xandros and used it for the base), in that you would have a proprietary GUI on top of a Linux base. Just add the Win APIs from Win98 and WinXP in a Parallels-style compatibility layer and voila! A new stable Windows, a lot faster than it would be if they did a total rewrite.


            The fact that all we are hearing on sites like Microsoft Watch is how much isn't going to be changed from Vista doesn't fill me with much hope for Win7. Maybe if they stripped down the CPU-hogging DRM and made it more like XP there would be hope, but I personally get the sinking feeling that it is going to be an even more piggy, more flashy, and more newbie-centric Vista SP2. From what little I have seen and heard it looks to be another turkey. It is almost like them saying after the disaster that was WinME, "Hey I know! We'll just keep putting more junk on top of WinME until they like it!". Maybe with the head of the Office division in charge things will change. But I get the feeling that marketing is in charge and they want more and more DRM so they can try to become "the Apple of home entertainment". But as always this is my 02c, YMMV.

        • by Anpheus (908711) on Sunday June 29, 2008 @11:05PM (#23996037)

          Eclipse? Fast?

      • by postbigbang (761081) on Sunday June 29, 2008 @09:52PM (#23995437)

        Windows NT was a re-write of OS/2 when Microsoft divorced IBM (or vice versa, depending on whom you believe). It started a new code branch, one that ran in 32-bit only (advanced at the time) and inter-version compatibility was often iffy at best-- NOT mostly compatible.

        These two code branches merged at Windows 2000.

        I smell a rat behind the entire thing. Windows 7 might be a hypervisor with plug-ins for whatever. I think Microsoft is floating trial balloons to see what might be marketable after the enormous and embarrassing mistakes found in Vista. It's an actual nightmare for them, along with a PR one, and justifiably so. Were I a stockholder, I'd have their heads.

        Make no mistake: Microsoft is still seeking solutions to the enormous problems they have with stagnation. Vista was supposed to be a monumental endeavor, and it's a monumental disaster for them. Now that Bill's gone, who knows what's going to happen.

        • Re:Short answer: no (Score:5, Informative)

          by Dog-Cow (21281) on Sunday June 29, 2008 @10:11PM (#23995597)

          If by "These two code branches" you are referring to NT and Windows 9x, you are off by a release. They merged with XP, not 2000.

          • Sorry, but (Score:5, Informative)

            by postbigbang (761081) on Sunday June 29, 2008 @10:23PM (#23995709)

            No. The code bases were to merge at Windows 2000 Professional. Windows 95/98/ME were based on DOS. Win2K was the merge point at server and 'desktop'. XP came after Win2K, sealing the fate. At Vista, support for 8/16-bit code using DOS functionality essentially died. Try Duke Nukem II if you're unsure.

          • Re:Short answer: no (Score:5, Informative)

            by Dun Malg (230075) on Monday June 30, 2008 @01:24AM (#23996861) Homepage

            They merged with XP, not 2000.

            No, XP was only a point release of 2000 (i.e. XP = WinNT 5.1, 2000 = WinNT 5.0). Win2K was the merge point. Anyone who was using NT before that remembers the pain of getting DOS/Win3.1 things to run properly under NT 4 (or 3.51!)

      • by bcrowell (177657) on Sunday June 29, 2008 @09:58PM (#23995497) Homepage

        The wise answer is "maybe". There are only two companies that have done something similar. Apple tried doing it from scratch, basically killed itself in the process, and had to adapt the already-written NeXT. Even that took forever and sucked for a couple of years before they got everything right. Microsoft did something similar with Windows NT: a ground-up modern rewrite that was mostly compatible with the existing Windows

        It may be dangerous to reason by historical analogy, because the hardware situation is qualitatively different now. CPUs are no longer showing the kind of Moore's-law growth in power that they used to. Meanwhile RAM and hard disks are ridiculously cheap. For the typical user who just uses a computer for websurfing, email, and word-processing, it's kind of silly to spend any significant amount of money on a new system. They already have more RAM and disk space than they need, and the CPU isn't going to be that much faster. We're seeing perfectly reasonable desktop hardware now for $200, and it won't be long until you can get that same hardware for $50.

        If I were one of the people at the helm of Microsoft, I'd be really worried about this, because when the hardware is $50, there's not going to be much room left for profit on the OS. Most retailers have been reluctant to sell cheap hardware, because their own margins on it are thin, but it's just a matter of time until that changes. Fry's sold $200 Great-Quality-brand machines for years, and WalMart is now selling the gPC online for $200. Once people realize that they can get a computer for $100, or $50, the dam is going to have to break, and retailers are no longer going to be able to sell machines at prices of $500 or $1000. It's going to be like the transition from the radio as a big wooden box to the transistor radio that you could carry with you to the beach, and throw in a dumpster if it got sand and water in it.

        In this new landscape, there's very little reason for MS to exist. One of the few reasons left for them to exist is that people have money invested in software, and they don't want to have to buy new software. The insane success of the Eee PC -- even at much higher prices than they originally thought they could get -- shows how vulnerable MS is. There are a lot of users out there who just use their computers for word-processing, email, and websurfing. Maybe first they buy a $50 Linux box for their kid to use to write her high school papers. That works out okay, and pretty soon the kid is like, "Mom, are you crazy? You're talking about spending $400 for a new computer? Just buy one like mine."

        • by peragrin (659227) on Sunday June 29, 2008 @10:07PM (#23995573)

          It is already beginning. I submit the Eee PC, the OLPC, and the sudden burst of real computers with real OSes being shipped for under $400 right now. Windows is holding back more development than anything else, especially with the Intel Atom processor. Sorry, you can't get a $100 OS onto a $400 device.

          Why do you think MSFT is still selling XP only for low-powered devices that Vista couldn't run on even if it went on a diet? Why do you think MSFT is intentionally trying to limit the specs of such devices when they are already as powerful as any computer of six years ago?

        • by smallfries (601545) on Sunday June 29, 2008 @10:35PM (#23995797) Homepage

          Erm, when did CPUs stop showing exponential growth in performance? Was that a memo that nobody sent to Intel [intel.com]?

          Although clock speeds are stuck because it is no longer economical to raise them, performance and transistor density are still scaling at the same rate. If anything we are in a period of performance increases slightly above trend, because now that the horrific NetBurst microarchitecture has been killed off, its Core 2 replacement is rather lovely. Clock for clock it runs twice as fast as the old design because of shorter pipeline stages that have reduced instruction latency, and so far Intel have doubled the number of cores every 18 months. Given that they are ready to scale up to new fabs that can handle 2B transistors, I would assume that they can continue to do so for the near future.

          It would be a seismic shift for the industry if processor performance flatlined but I don't see that happening for a long time. What we are seeing with the introduction of the Eee Pc et al is actually a trend that has been going on for decades. Roughly every ten years a new form factor is introduced at the bottom of the market, with the same performance, but with the price halving each time.

          So although your analysis of what changes are happening is way off, your final paragraph is quite accurate about what it means. The amount of performance that people actually require for most day-to-day tasks was exceeded when processors passed the Ghz mark. Now we are seeing cheaper and cheaper devices that deliver that (roughly) constant power. The effect on Microsoft is likely to be as you predict.

          • by ucblockhead (63650) on Monday June 30, 2008 @02:07AM (#23997033) Homepage Journal

            Though you are generally correct, it is important to note that doubling the number of cores doesn't improve performance as much as doubling the clock speed (or reducing the number of cycles the average instruction takes), because of the trouble of running serial software in parallel. Doubling is great, quadrupling is pretty good, but eventually you just don't get much bang. (Or, at least you won't without serious software improvement.) It's a serious problem the industry will soon face.

            But as you say, it is questionable if most people actually need more performance.
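[Editor's note: the diminishing returns described in the comment above are what Amdahl's law quantifies. A quick illustrative sketch in Python; the 90%-parallel workload is an arbitrary assumption for illustration, not a measured figure.]

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup when only part of the workload
    can run in parallel; the serial remainder caps the gain."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# With a 90%-parallel workload, each doubling of cores pays off less:
# speedup can never exceed 1 / 0.1 = 10x, no matter how many cores.
for n in (1, 2, 4, 8, 64):
    print(n, round(amdahl_speedup(0.9, n), 2))
```

With the assumed 90% parallel fraction, going from 1 to 2 cores gives about 1.8x, but 8 to 64 cores adds less than a factor of 2: exactly the "eventually you just don't get much bang" effect.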

        • by Swampash (1131503) on Monday June 30, 2008 @12:36AM (#23996599)

          One of the few reasons left for them to exist is that people have money invested in software, and they don't want to have to buy new software

          I'd restate it as "One of the few reasons left for them to exist is that people have money invested in data locked up in proprietary Microsoft filetypes". I don't care that I have lots of .xls files on my hard disk - I care that I have tax returns and invoices on my hard disk. If Excel ever goes away, so does easy access to my data.

    • Re:Short answer: no (Score:5, Interesting)

      by liquidpele (663430) on Sunday June 29, 2008 @08:38PM (#23994807) Journal
      It could be done... in a sense. If they used their new virtualization technology (which actually isn't half bad; the beta even lets you take multiple snapshots, unlike VMware Server), they could theoretically build in a "compatibility" mode that could be enabled/disabled but could run older Windows applications even if the new OS is radically different in how it handles such things.
      • Re:Short answer: no (Score:5, Interesting)

        by frdmfghtr (603968) on Sunday June 29, 2008 @08:59PM (#23995015)

        It could be done... in a sense. If they used their new virtualization technology (which actually isn't half bad; the beta even lets you take multiple snapshots, unlike VMware Server), they could theoretically build in a "compatibility" mode that could be enabled/disabled but could run older Windows applications even if the new OS is radically different in how it handles such things.

        Sort of like what Apple did with OS 9/OS X?

        If so, the trouble with that might be that the legacy OS (Win XP or Vista) is so large that the legacy OS portion would double the size of the installation. If I recall correctly, the OS 9 support in OS X only added 400 MB to the installation, as OS 9 itself wasn't that large. What was really nice about it was that it could easily be removed if you didn't need the legacy support.

        (I may be wrong in my size estimates or misunderstand the OS 9 legacy support, as I moved from Windows XP to OS X when Tiger was released and have little experience with OS 9.)

        • by Zaiff Urgulbunger (591514) on Sunday June 29, 2008 @09:32PM (#23995287)
          The installation size probably isn't an issue given that the target customer, corporates who have invested heavily in Win2K/XP, will be largely using high-end hardware (as opposed to the "new" low-end hardware à la the Asus Eee).

          Memory requirements might matter; but since we're looking at release two years from now, then 2GB is a reasonable requirement. If they base the "compatibility" code on XP rather than Vista, then it might be viable.

          The biggest problem I see is what to tell people right now. Saying, "oh yeah, the next version of Windows will be completely different" is not likely to go down well, and is unlikely to encourage anyone to "upgrade" to Vista prior to Windows 7. But saying "Windows 7 will be based on Vista" isn't particularly inspiring either!

          The marketing solution will likely be to not really give any concrete answers for as long as possible whilst telling people Windows 7 will build on their existing investment. If they don't do this, people might start looking elsewhere!!
      • Re:Short answer: no (Score:5, Informative)

        by bignetbuy (1105123) <r0ck.operamail@com> on Sunday June 29, 2008 @09:00PM (#23995031) Journal

        Commercial versions of VMware allow multiple snapshots. The version you refer to is the freeware version.

      • by v1 (525388) on Sunday June 29, 2008 @09:17PM (#23995169) Homepage Journal

        Apple's policy is to provide approximately 100% transparent support ONE version back. They did an incredible job with Classic (supporting OS 9 in OS X) and an even better job in the transition with Rosetta (supporting PPC on Intel).

        While it was fairly obvious you were running an OS 9 app in Classic, almost no one notices a Rosetta app running on an Intel. Now notice, Intels do NOT support Classic. That's their "one hop" rule at work. And you can bet their next big one will drop support for PowerPC.

        So this can be done, but it's hard to get right. But when you get it right, nobody notices. And that's a good thing.

        This is a bit like Windows. The problem they've had is that there's been a lot more transition: from DOS to 95 to 98 to 2000 to XP to Vista. None of those was entirely pleasant, and none of them was very transparent. Only half of them provided major new features, but all of them clung to numerous existing problems. So in the same timeframe, Apple has made just two massive leaps, with less "transition shock" in its two bumps than Windows has seen in its five. The interim transitions (OS 8 to OS 9, 10.1 all the way to 10.5 really) were almost completely transparent.

        They've got a lesson to learn here. XP probably would have been a good time to do a "major bump" such as Apple did with 9 to X, but they dropped the ball. They chose to break less, but to fix less as a consequence. Eventually they have to bite the bullet and fix as many of the underlying design problems as they possibly can in one fell swoop. It's going to break stuff. Maybe a lot of stuff. But if they could provide something like Apple did with Classic support for OS 9, it wouldn't be so bad. Apple proved that it's not necessary to just totally break all your old software if you can provide decent emulated support for your previous OS inside the new one, invisibly.

        Sadly I don't see this happening with Windows anytime soon. Microsoft has never had a knack for making those internal transparent emulators like classic and rosetta. Unless they can get something like this together, it's either going to continue to be a wreck, or it's going to be a disastrous pill to swallow. Continuing to try to make these "baby step" fixes is going to drive the world crazy.

        • Re:Short answer: no (Score:5, Interesting)

          by Chrononium (925164) on Sunday June 29, 2008 @10:31PM (#23995769)

          I'm definitely a big fan of Apple stuff and am likely more tolerant of the small bugs that come out from Cupertino, but I think many people here are missing the big picture: Windows is all about compatibility. That's why a business might spend millions of dollars developing apps on Windows, because they can milk that cow for a long time afterwards. Vista is a significant enough break from Windows XP that many businesses don't want to switch because it means a potentially lower bottom line. Windows has incredible software inertia, while the Mac really doesn't. Comparing Mac OS and Windows is, well, comparing apples to oranges.

          Basically, if your bottom line depends upon a very slowly moving software architecture, then Vista is probably a bad thing. Making big changes, on the other hand, makes things potentially easier for Microsoft as there is less legacy and code can be refactored given years of experience.

        • by gnuman99 (746007) on Sunday June 29, 2008 @10:33PM (#23995781)

          Microsoft is many, many times more developer friendly than Apple. For example,

          http://trolltech.com/company/newsroom/announcements/press.2007-06-19.6756913411/?searchterm=codebase [trolltech.com]

          At least the Windows API has been stable for a LONG time. You can get code that was running on Windows NT to continue to run, mostly. Or at least have a reasonable way of porting it. Stuff doesn't suddenly disappear in Windows.

          This is good news for developers. For some reason, users think that Apple is better. I guess if people only care about the latest-greatest app instead of having an in-house or custom-made application working for a decade or so, then Apple may look better.

          • by Slur (61510) on Monday June 30, 2008 @01:06AM (#23996757) Homepage Journal

            I'm not sure what you're trying to point out with your link. Existing Qt 32-bit applications continue to work, and either Apple will provide 64-bit Carbon in an update or they'll fix the HIView dependent Qt libraries when they transition away from Carbon. This issue has a negligible impact, and has nothing to do with "stuff suddenly disappearing" as you imply.

            I've been developing, publishing, supporting, and updating my Mac shareware program for 12 years - since Mac OS 7.5. Originally written to the Mac OS classic toolbox, I adapted it to CarbonLib in 1999 with some effort, to get ready for Mac OS 9, and I ported it to Carbon OS X in 2001, making it much better in the process. And I'll be porting it to Cocoa later this year, and taking it to an entirely new level through the use of the latest Mac OS X APIs for compositing and animation.

            All along the way Apple has been great, and always getting better, especially since they released Xcode. The tools are free, very usable, and every bit of API documentation is right there in Xcode. And now they've released Cocoa 2, which is just a clear and wonderful programming API.

            Apple may have made a lot of changes over the last 12 years, but the changes have been constant improvements, and have had minimal impact on legacy applications. I am grateful for the quality of the work they do to save me time and make the work easier. And as a guy who started programming as a young hobbyist, I'm especially happy to see Apple giving away their development tools for free. It means kids can stumble into programming just like I did way back in 1977.

  • Wine? (Score:3, Interesting)

    by karearea (234997) on Sunday June 29, 2008 @08:13PM (#23994573)

    They could throw some time and effort (and $$?) into supporting WINE to allow the use of legacy Windows applications in an 'archaic OS'.

  • by jeffmeden (135043) on Sunday June 29, 2008 @08:14PM (#23994577) Homepage Journal
    Remember Vista? Supporting legacy apps is already something MS has no interest in, apparently.
    • by Anonymous Coward on Sunday June 29, 2008 @08:36PM (#23994781)

      Any software created in the past few years which Vista 'broke' was most likely poorly designed, or was associated with managing or doing the functions expected of the OS itself (with a few exceptions).

      Vista really isn't that 'buggy.' It is top heavy and uses way too much resources if you are only using it for limited things, but as a general purpose OS it really isn't that bad. I would still prefer Windows XP on new computers simply because I can get away with more power with a smaller investment in hardware, but I'm not necessarily 'against' Vista.

      • by dreamchaser (49529) on Sunday June 29, 2008 @09:06PM (#23995075) Homepage Journal

        Bashing Vista has become like pouring hot grits on Natalie Portman around here. It's just a meme now. It was funny for a while but now it's just old.

        Vista really isn't all that bad. I still have XP machines (and Linux, and OS X, and Solaris, and OS/2 even) but I don't mind my Vista machine at all. I also run a lot of old apps on it just fine.

        • by Toll_Free (1295136) on Sunday June 29, 2008 @10:22PM (#23995701)

          Having run Vista32 on this laptop when new, and just recently moved to Vista X64, I agree.

          I turned most of the "eye candy" off on 32 bit, but 64 doesn't seem to get bogged down nearly as bad with the eye candy turned on. NOTHING else was changed, only the OS.

          Anywho, yes, Vista is fine. Pisses me off that I can't run Win16 apps on Win64 (like, install C&C, for instance), but oh well.

          I think I'll try 64-bit Linux next... Never tried a 64-bit rev... Any suggestions? I've always run Slackware since my first install, but it's not always the most "hardware friendly". It's an HP DV2000-based laptop, x64, 1 GB RAM.

          --Toll_Free

        • by 19061969 (939279) on Sunday June 29, 2008 @10:51PM (#23995941)

          If only I had mod points today... ;-)

          Seriously, I'm not a fan of MS by any standards but I have Vista installed on my desktop box and it annoys me less than XP on my laptop does. It's not a bad OS really and good enough for me not to scrub it and install Linux instead. Years ago, I couldn't stand Windows and always hosed the HD so I could put Mandrake or Debian on, but now I find Vista to easily be good enough.

  • by I Want to be Anonymo (1312257) on Sunday June 29, 2008 @08:14PM (#23994579)

    but I still wouldn't buy it.

  • by Anonymous Coward on Sunday June 29, 2008 @08:14PM (#23994581)

    Now that Bill Gates is retired from Microsoft, the editors should get with the times and lose that dated, painfully unfunny logo they use for Microsoft.

    Most people probably wouldn't get the Borg reference to begin with, and now the Bill Gates era at MS is officially in the past.

    Only MS gets this ridiculous logo... now it's finally time they got rid of it.

  • oh come on (Score:3, Insightful)

    by Brian Gordon (987471) on Sunday June 29, 2008 @08:14PM (#23994583)
    I don't have any problem bashing Windows, but being modular is exactly the change from XP to Vista and what Server 08 does even better. Which is it going to be, that Vista should go monolithic for performance or that Vista should go modular for ease of design?
  • Why Not for Linux? (Score:3, Insightful)

    by Doc Ruby (173196) on Sunday June 29, 2008 @08:14PM (#23994587) Homepage Journal

    Why bother pretending that Microsoft will do anything with Windows that's interesting at all, when it's clearly spending its time and money making "more of the same", and its design constraints are clearly defined by its corporate interests?

    How about just making a version of Linux like that? If more work also made Wine run a lot more reliably for most Windows apps, the whole thing could do a lot better than Microsoft at making "Windows" users happier.

    • by Darkness404 (1287218) on Sunday June 29, 2008 @08:27PM (#23994709)
      Because most Linux users want apps made for Linux, not Windows apps running through an emulated Windows API on top of Linux. WINE is great and has its uses, but basing a distro around it really isn't a great idea, as WINE changes so quickly. Also, most Linux distros that are popular don't even try to act like Windows (Ubuntu, Mint, Debian, Fedora, etc.), and the ones that do act like Windows usually fade into obscurity (Linux XP, etc.).
      • by Doc Ruby (173196) on Sunday June 29, 2008 @10:04PM (#23995545) Homepage Journal

        No, most people just want apps that do what they need to do. They don't care whether it's "Linux" or "Windows" or "both" or "neither". They don't even want an app, just to do what they need to do. Something that just runs Windows apps, because those do what people think they need to do, and does it without the crap that is Windows, but rather with a simpler new paradigm, would be welcomed. Some of the extra Linux apps would probably be welcomed too, especially if they could be used side by side with their familiar Windows apps. And they won't care whether it's running on top of "Linux", or "Winedows" or whatever, so long as it runs. Since Linux is a good base to roll out a new PC OS on, especially with its existing developer and other community, which keeps any Linux-based OS compatible with most HW, it's a good means to that end. At an adequate degree of maturity, Wine doesn't "change", it just remains stable and the apps "just work". That's a long way away still, but we're talking about a way to give people the "next generation" of PC environments. Without waiting for "Windows 8", or probably "Windows 9", or probably "Windows Never".

        That's the point of new PC paradigms. Not to "do Windows" better, or to "do Linux" at all, but to make people's computers "do my job" better.

  • by Drinking Bleach (975757) on Sunday June 29, 2008 @08:14PM (#23994591)

    Actually it stands for Windows NT 7.0. Here's a quick run-down:
    NT 3.1
    NT 3.5
    NT 3.51
    NT 4.0
    NT 5.0 (aka Windows 2000)
    NT 5.1 (aka Windows XP)
    NT 5.2 (aka Windows 2003)
    NT 6.0 (aka Windows Vista/2008)
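[Editor's note: the run-down above amounts to a simple marketing-name-to-kernel-version mapping; a sketch in Python, where the table is just the list above restated.]

```python
# Marketing name -> NT kernel (major, minor), per the run-down above.
NT_VERSIONS = {
    "Windows NT 3.1": (3, 1),
    "Windows NT 3.5": (3, 5),
    "Windows NT 3.51": (3, 51),
    "Windows NT 4.0": (4, 0),
    "Windows 2000": (5, 0),
    "Windows XP": (5, 1),
    "Windows Server 2003": (5, 2),
    "Windows Vista": (6, 0),
}

def nt_version(name):
    """Return the NT (major, minor) kernel version for a marketing name."""
    return NT_VERSIONS[name]

print(nt_version("Windows XP"))  # (5, 1)
```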

  • No (Score:5, Insightful)

    by mangu (126918) on Sunday June 29, 2008 @08:15PM (#23994597)

    Can Windows move forward with a completely new, fast, and secure OS and still keep legacy application support?


    As someone who started developing applications for Windows in 1991 and stopped around 1999, I doubt it. Better let legacy applications (and the whole x86 mess too, BTW) fade away, they have gone far beyond their useful life.

  • Apple could do that because they were much smaller than Microsoft, and had a small but relatively loyal customer base, and their rewrite did pay off, as people are generally very happy with OS X and don't care about the incompatibility with OS 9 and older anymore.

    Microsoft has a huge userbase with much less loyalty, and generally a huge existing investment in software.

    We don't need a MS Windows rewrite, we've already got Ubuntu, because that's essentially what the article author wants: an operating system that Just Works[tm], even at the expense of compatibility. That's a pretty good description of any popular Linux distribution.

    • by rbanffy (584143) on Sunday June 29, 2008 @08:58PM (#23994999) Homepage Journal

      I would add that Apple did not do a full rewrite but, instead, adopted a stable, mature and very sophisticated OS from NeXT. Apart from that, OS X is very different from the classic Mac OS and deeply incompatible. Any compatibility had to be bolted on top.

      Microsoft has nothing like it and will not buy an OS outside.

      Or they could just grab any flavor of BSD, close it, build a Win32 subsystem on top of it, and sell it as Windows 8. They already did that with a TCP/IP stack.

  • Why is this news? (Score:4, Insightful)

    by Ecuador (740021) on Sunday June 29, 2008 @08:17PM (#23994621) Homepage

    Oh, yeah, this is slashdot.
    Microsoft already said they will build on Vista instead of going the microkernel way, and we have discussed that fact to death.
    Windows 7 will not be "Fresh Air", to the delight of /.ers everywhere. I mean, imagine if MS actually delivered a wonderful, light OS! That would certainly be the end of /. as we know it!

    • by nine-times (778537) <nine.times@gmail.com> on Sunday June 29, 2008 @10:49PM (#23995915) Homepage

      Is there really anything wrong with the Windows kernel? I mean, if Microsoft improved the shell, cleared out some of the cruft, and implemented standard file formats, protocols, etc., wouldn't it at least be relatively decent?

      Lots of what people complain about are GUI problems, bundled applications, copy protection, and a failure to support standards. Not to downplay those complaints, but those aren't really an issue of the technical capabilities of the kernel itself.

  • WinCE. Pity about the name, though.

    • by rbanffy (584143) on Sunday June 29, 2008 @09:01PM (#23995041) Homepage Journal

      Yeah... right... 16 processes, right?

      CE is Windows done from the ground up, but it's not particularly elegant. And I _did_ write software for it. The 2002 model of the Brazilian electronic voting machine runs Windows CE.

      Writing for it is every bit as ugly as it is for desktop Windows.

    • by fwarren (579763) on Sunday June 29, 2008 @11:57PM (#23996361) Homepage

      WinCE done right?

      I have written some software on the WinCE platform. It is NOT Windows done right. Let's start with the evolution of the platform. It was designed for displays like 600x300, with full menus and dialogs. The OS has no concept of a "current directory"; every file has to be specified from the root of the drive every time. They figured the devices would have a touch display, so there was no need for a mouse, and the standard Windows mouse API was ripped out. Essentially the only thing left was a click or double click, left or right, and where on screen it happened.

      They then "re-imagined" it to compete with Palm. So now it is redesigned to work on a device that is 240x320. The menu is at the top of the device. The pop-up keyboard soft-input-device (SIP) pops up from the bottom. There are issues where a window that gets into the background cannot be brought back to the foreground.

      Now we "re-imagine" again for the smartphone, with an even smaller display. Microsoft decides that a mouse is needed again, so they create a brand new API for dealing with a mouse instead of using the Win32 API.

      If you think the Win95-98 API vs. Win NT code base API wars were a problem, kick it up a notch. Take your pick: drawing graphics, initializing windows, dealing with the SIP. Whatever fun I had dealing with the Win32 API was ground out of me when I started working on WinCE.

      You want proof? Why did Microsoft extend the life of Windows XP for 3 more years for UMPC style devices to compete with Linux? Because WinCE in any incarnation is not up to the job. Microsoft is not even trying to pretend anyone will want it on a UMPC style device.

  • by mashuren (886791) <dukeofthebump@gmai[ ]om ['l.c' in gap]> on Sunday June 29, 2008 @08:19PM (#23994637) Homepage
    "Can Windows move forward with a completely new, fast, and secure OS and still keep legacy application support?"

    Well, considering the fact that Vista's all but killed the chance of running any software made before the year 2000, I'd have to say "no".

    It's pretty bad when old Windows software is much more likely to run under Wine than with the latest version of Windows.
  • by catwh0re (540371) on Sunday June 29, 2008 @08:20PM (#23994645)
    Keeping 'legacy' support has always been a nice excuse for not significantly upgrading the OS (or spring cleaning). Having tried to run many older programs under the promised legacy support (including the options to emulate previous versions of Windows), I can say that I've had only small successes in keeping old software running on Windows.

    To me it's always been an excuse to keep windows bloated, and not actually any effort to keep old software functional.

    • Keeping 'legacy' support has always been a nice excuse for not significantly upgrading the OS (or spring cleaning).

      Are we not in the time where everyone and their brother is using virtual machines? It would seem that MS should relegate legacy support to virtual machines instead. They have the source code, so they could "easily" create a VM (or some very transparent layer that makes it look like it's running natively) for each version they've ever sold.

      Then they can do whatever they want and just keep the VM layers up-to-date.

      I surely can't be the first to think of this...

  • by LoTonah (57437) on Sunday June 29, 2008 @08:23PM (#23994667)

    Windows NT had an emulation layer that handled 16-bit apps. OS X had Rosetta and the Classic environments. And Microsoft now owns Virtual PC.

    They have the technology to make Windows a clean OS with emulation layers for whatever legacy OS you want. They just seem too lazy to do it.

  • Fluff piece (Score:5, Insightful)

    by ejdmoo (193585) on Sunday June 29, 2008 @08:23PM (#23994671)

    He really doesn't know anything about the internals of the Windows kernel or the Mach kernel; he's just assuming that since the NT kernel is "monolithic" and the Mach kernel is a "microkernel," the latter must be better, and that the reason it's better is that it's "smaller."

    If you want to know where the real problems with Windows lie, they're in the API and the shell, not the kernel. The NT kernel is perfectly fine. See this Ars write-up by someone knowledgeable:
    http://arstechnica.com/articles/culture/what-microsoft-could-learn-from-apple.ars [arstechnica.com]

    I'd like to point out that Microsoft employs one of the original authors of the Mach kernel, Rick Rashid. He runs Microsoft Research. Look it up.

    • Re:Fluff piece (Score:5, Insightful)

      by Tumbleweed (3706) on Sunday June 29, 2008 @08:34PM (#23994775)

      I'd like to point out that Microsoft employs one of the original authors of the Mach kernel, Rick Rashid. He runs Microsoft Research. Look it up.

      Being put in MS 'Research' is the kiss of death if you want to make something that MS will ship. They seem to hire those brilliant people and give them massive funding only to keep them happy and prevent them from working for a competitor who might want to actually SHIP something brilliant they would come up with. Rather like IBM, only substitute incompetence in place of amorality as motivation.

    • Re:Fluff piece (Score:5, Insightful)

      by bcrowell (177657) on Sunday June 29, 2008 @08:55PM (#23994969) Homepage

      The Ars Technica piece is interesting, but I'm pretty skeptical about this whole idea of making radical changes in Windows and breaking backward-compatibility.

      One thing you have to keep in mind is that there's a huge downside for the user when you break backward-compatibility. Apple actually did an amazing job of maintaining backward-compatibility when they made the switch from 68000 to powerpc, but when they brought out MacOS X, the backward compatibility was lousy. You could still run classic apps on X, but they typically worked very poorly -- some features wouldn't work, apps would crash, and it took a really long time to start up the classic environment. Essentially Apple expected you to buy all new applications. Then Apple kept on bringing out frequent point-upgrades to MacOS X, and every single one cost a significant amount of money. My wife bought one of the early lamp-shaped iMacs, and we stayed on the upgrade treadmill for a while, but it really got old spending money every six months or so for a new version of the OS, so at this point we're still running an old version of MacOS on that (expensive) machine. Now we basically can't run any new software, because it only works on newer versions of MacOS X.

      It's also worth looking at it from MS's point of view. They're a monopoly, and their interest is in keeping users sucking at the tit. Maintaining backward compatibility has worked very well for them. One of the main things keeping Windows users from jumping ship for another OS is that they know their apps will continue to work. It's actually kind of amazing. I teach at a community college, and some of my colleagues are still using an old DOS shareware planetarium app. It still runs on Windows XP.

  • by ThorGod (456163) on Sunday June 29, 2008 @08:24PM (#23994685) Journal

    Just switch to Mac and get parallels :P

    Yeah, I know, not very funny. But does every comment have to be great?

  • Die Monkey Boy (Score:5, Interesting)

    by MCSEBear (907831) on Sunday June 29, 2008 @08:25PM (#23994693)
    There was a time when a much leaner Microsoft highly respected and rewarded employees who could write good code. These were the people who rose to positions of responsibility. Today, Microsoft is run by Sales and Marketing and coders are viewed as an expense. Until this situation reverses itself, don't expect any improvement in the product they create. They are too stupid to realize their product is the code. Ballmer being from sales only reinforces this problem. Perhaps he should be moved to a chair throwing division that does the monkey boy dance, and someone who can both create great code themselves and manage coders should be brought in as CEO.
  • Meh. (Score:4, Insightful)

    by zx-15 (926808) on Sunday June 29, 2008 @08:55PM (#23994975)

    I think the author of the article doesn't realize the difference between legacy code and kernel architecture. The kernel architecture of Windows is fine: it's a hybrid kernel, in general similar to Linux's. You're not able to run HPC on it, but hey, it's better than DOS! It's the legacy code that creates so much bloat, and swapping out the kernel won't change anything if the same mountain of code still runs.

    Of course Microsoft could create a virtualization layer, but Linux has QEMU, Xen and Wine, OS X has Parallels and Wine, and of course there is VMware. So if Microsoft ever supported legacy code through virtualization, an alternative implementation of it would be released pretty quickly, and everybody here knows how Microsoft likes competition.

    My guess is they will spend the next 10-15 years dying an agonizing death, dragging any progress in the industry down with them.

  • by nighty5 (615965) on Sunday June 29, 2008 @09:36PM (#23995303)

    What a whole lot of trolling effort.

    Windows isn't a monolithic design. It's a hybrid kernel, and with every release of Windows Microsoft has separated out user space even further, including tackling DLL hell, to further improve the design.

    One of the main guys behind Windows NT was David Cutler, a renowned software engineer and designer of VMS. Go and Google him; I can't be bothered to look up the URL.

    That should at least give you a clue as to the seriousness of the product and what they set out to achieve: to copy the bits of the system that mattered most to Microsoft.

  • Microsoft's Problem (Score:4, Interesting)

    by Baldrson (78598) * on Sunday June 29, 2008 @10:55PM (#23995967) Homepage Journal
    If I were in Ray Ozzie's shoes I would apply something like the Hutter Prize for Lossless Compression of Human Knowledge [hutter1.net] to the entirety of MS's software services suite. This, of course, requires making a rigorous spec for testing purposes.

    Make the engine, upon which the winning succinct byte code runs, a new W3C standard browser programming language (or at least virtual machine) and reduce the Microsoft OS CD to those components required to create a web-delivered application platform using the winning engine. Such an engine would, of course, have some features that dynamically cached expansions, memoizations, tablings and/or materialized views similar to the HotSpot optimization technology that originated with the Self programming language (and was later adopted by Sun's Java Virtual Machine). Hence it would make sense to have the OS CD contain a partially pre-expanded, hence time-optimized, code base.

    Then, for delivery of software services to pre-existing platforms, create a legacy port of the services code to pre-existing W3C standards like XForms implemented in a downloadable ECMAScript Client/SOA library in a manner similar to the way TIBET(tm) [technicalpursuit.com] does. The idea is to go "Live", ie: web-delivered, with a fundamentally new W3C base (whatever engine won the prize) but support legacy W3C environments for migration.

    Again, this prize-oriented strategy would, of course, require a rigorous specification of the software services so the testing could be largely automated.

    This approach addresses Microsoft's 2 biggest problems deriving from the same fundamental reality: Everyone has needed their OS to interoperate with the bulk of the information industry.

    The first problem is ethical and really goes beyond the scope of my professional opinions to my public opinions about the support of property rights [geocities.com]. Suffice to say, I have no trouble with someone who goes after a natural monopoly position and succeeds. I have a problem with someone who then refuses to use that position of success to fix the bug in the society that made them inordinately rich and their technology inordinately influential.

    The second problem is technical, which is what my argument here is really all about.

    Basically Microsoft's code bloat problem derives from its monopoly position. This may seem like a truism since all of the software "profession" suffers from code bloat, but only Microsoft can take this to monopolistic proportions -- proportions that make Ma Bell's monopolistic complexities of yore look Spartan.

    So Microsoft has this problem and it has many programmers (contributing to the code-bloat problem). It also has mountains of cash.

    So how can Microsoft bust its own monopoly position turning its many programmers and mountains of cash into succinct code?

    Monetary Incentives for the Programmers, ala the Hutter Prize:

    S = size of uncompressed code-base
    P = size of program outputting the uncompressed code-base
    R = S/P (the compression ratio).

    Award monies in a manner similar to the M-Prize [mprize.org]:

    Previous record ratio: R0
    New record ratio: R1=R0+X

    Fund contains: $Z at the time of the new record
    Winner receives: $Z * (X/(R0+X))
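
    As a rough sketch of the payout rule described above (in Python, with hypothetical numbers chosen purely for illustration):

    ```python
    def compression_ratio(S, P):
        """R = S/P: size of the uncompressed code-base over the size
        of the program that outputs it."""
        return S / P

    def award(Z, R0, X):
        """Payout when the record ratio improves from R0 to R0 + X
        and the fund holds $Z, per the M-Prize-style rule above."""
        return Z * (X / (R0 + X))

    # Hypothetical example: a 100 MB code-base refactored down to a
    # 20 MB generator program gives a ratio of 5.
    R1 = compression_ratio(100, 20)       # 5.0

    # If the previous record ratio was 4.0 and the fund holds
    # $1,000,000, the improvement X = 1.0 pays the winner $200,000.
    print(award(1_000_000, 4.0, 1.0))     # 200000.0
    ```

    Note how the payout shrinks as the record ratio climbs: each further improvement of the same size X earns a smaller fraction of the fund, which is what motivates the exponential-growth caveat below.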

    It may turn out that, due to the incomputability of Kolmogorov complexity, the growth of the reward may ultimately need to go exponential, but the principle remains true.

    What happens very rapidly is the programmers first apply their skills to maximally refactoring. What falls out is a series of legacy API layers written atop a tight core.

    They'd have to spend more money on code testing to verify the compressed code-bases of the competing teams actually worked to spec but the results should be quite gratifying.
