Gartner Analysts Warn That Windows Is Collapsing 868
spacefiddle writes "Computerworld has an article about a presentation from Gartner analysts in Las Vegas claiming that Windows is 'collapsing', and that Microsoft 'must make radical changes to the operating system or risk becoming a has-been.' Michael Silver and Neil MacDonald provided an analysis of what went wrong with Vista, and what they feel Microsoft can and must do to correct its problems. Larry Dignan of ZDNet has his own take, and while he agrees, he suggests that the downfall of Windows will be slow and drawn-out. As an interesting tangent to this, there's also a story from a few days prior about Ubuntu replacing Windows for a school's library kiosks, getting good performance out of older hardware. '[Network administrator Daniel] Stefyn said he was "pleasantly surprised" to discover that the Kubuntu desktops ran some applications faster with Linux than when they ran on Windows. An additional benefit of Windows' departure from student library terminals saw the students cease 'hacking the setup to install and play games or trash the operating system.'"
Important lines from TFA (Score:5, Interesting)
You mean the almost-constant nag screens?
or do not see Vista as being better enough than Windows XP...
Making them smarter than the lying marketroids selling it...
to make incurring the cost and pain of migration worthwhile.
Translation: People are smarter than they think, and an OS that takes twice the hardware to be twice as slow AND even more incompatible with previous software isn't worth my money.
Of course, they still get sales - from the same idiots at my work who want to be upgraded from Office 2003 to Office 2007 because it's a bigger number, and then complain that they are confused by Office 2007 and want the tech support guys to "fix" it.
Re:Here we go again, eh? (Score:5, Interesting)
Man, when did this happen?
You are right about one thing... the morons still equate "windows" with "computer". But thanks to the 'tubes, TV, and Apple's marketing, that _is_ changing.
Death knell? Windows will not die with a bang, but with a whimper... but what do I know... I'm posting on Gartner, er Slashdot.
Part of technology life cycle (Score:5, Interesting)
When a technology service becomes ubiquitous and homogeneous and - importantly - ceases being innovative, it runs the risk of becoming a candidate for conversion into a public utility. To stave this off, either ongoing innovation is required or the illusion of innovation and change is required. Microsoft has done a bit of both with Windows. But it's a thin veneer. As a result, populist efforts to 'socialize' this technology into a public utility are surging; hence, Ubuntu et al.
Re:Here we go again, eh? (Score:5, Interesting)
Unfortunately for MS, virtually the entire world's population now has Windows experience. It was not a great experience.
Some are cretins, and could not interface with a 4x2, but enjoy blaming Windows.
Some are experienced IT people who have seen Linux/Unix and know how it could be.
Most are now in a position to ask the professionals "Is this as good as it gets?" and are being told - no, there IS another way.
Some are migrating to Vista, and realising that if it can get worse, it sure as hell could get better somehow. They know who to ask for advice, and it's not the guy in PC World.
Re:Here we go again, eh? (Score:5, Interesting)
Same as how the Roman empire was invincible, really. And the British empire. And let's not even get started on the American empire, which is crumbling before our very eyes.
Where is IBM? Where is Word Perfect? Both ruled supreme in their days, but those days are long gone. And just like IBM, Microsoft will still be around - but not as the powerhouse it once was. It will just be another big player instead.
One day soon the stockholders will ask why Microsoft is sinking so much money into XBox 360 or any of those other loss-making projects that Microsoft enjoys so much. And once they pull the plug on such projects, they will start to wonder if profits wouldn't be higher if Office were in a separate company, not fettered to any particular operating system.
Windows will survive that, as will Microsoft. But it will gradually become a niche product, one of many choices available for the operating system. Hardware will be controlled more and more through hypervisors. Applications will more and more be in virtualized environments of their own (be it virtual machines like Java or
And one day, someone will ask "what operating system are you running that on?", and despite being a card-carrying geek with a 4-digit slashdot ID, you will be forced to admit "Uhm, I'm not actually sure." Because it won't matter anymore.
legacy code (Score:5, Interesting)
Over time more competitors showed up in the marketplace, and as the economy shifted IBM stopped tossing money in our laps. Our engineers (of which I was one) spent most of their time trying to figure out how to shoehorn new features and entire new parallel products on top of the existing legacy codebase. The inevitable result was that we struggled while our competitors came out with newer, more modern & more powerful software. I eventually left that company for a startup that 7 others from this company had already joined. That company was acquired a couple years later, and the application pretty much no longer exists.
If the engineers, who had requested the ability to create a new product from the ground up, had been listened to, then perhaps that company would still be around and competitive. It was mainly because of the business decisions to retain backward compatibility, like MS has done with Windows, that they eventually disappeared. As long as MS maintains their own demand for backward compatibility they'll be waging a slow & prolonged war that they have no chance of winning.
Re:Here we go again, eh? (Score:5, Interesting)
Think about it--every self-respecting business decided to hold off on Vista until at least after SP1. Well, SP1 has only just arrived, but before those businesses even have a chance to think about migrating, M$ is talking about releasing a completely new OS. It's speculation, sure, but it looks like Redmond believes it too, if they're willing to make a move like this...
Re:Here we go again, eh? (Score:4, Interesting)
"Give me a job, I'll support your company." (Score:2, Interesting)
Re:Collapsing? (Score:2, Interesting)
The biggest problem stays Balmer (Score:4, Interesting)
Re:Really? (Score:1, Interesting)
They shouldn't be, yet here this tripe is as a featured Slashdot story, as it will surely top the other meme sites.
And as you mentioned, it's just complete and utter bunk. The idea that OSX was just copied over to the iPhone is absurd. "OSX" on the iPhone is to OSX on the desktop as Windows CE on PDAs and embedded devices (which Microsoft has been doing for at least 8 years or so) is to desktop Windows -- yeah, there's some cross branding, shared libraries (from a source-code perspective -- C is cross-platform, even in the Windows world), API similarities, but underneath it all it isn't the same, and both are best-purposed for their respective targets, which is a much better decision than any run-anywhere, lowest-common-denominator approach.
Of course I knew Gartner's opinion was nonsense when they went down the ridiculous-yet-truthy-through-repeated-assertion "monolithic" line of argument (which they likely picked up on Slashdot, it should be mentioned). Vista is a failure not because of any sort of code maintenance problem, but rather that Microsoft aimed far too high with Vista, taking far too many risks for a big, big change.
Like many such highly speculative, large-scale projects (the whole WinFS initiative, for instance), it failed spectacularly, and the result was a backtrack and then a polishing of XP to pretend it was something new. The loss of WinFS alone, which was to be a major foundation for many of the features of the new OS, was a massive blow to the project.
Re:legacy code (Score:3, Interesting)
No arguments there.
achieving backward compatibility != keeping legacy code
That depends entirely on the software. Ours was a high-level client/server programming language. It was an English-like language, along the lines of BASIC. Since there were no statement separators (like semicolons in C, Java, etc.), the language parser (built via YACC) had to be extended significantly. YACC generates, by default, a look-ahead-1 parser. Because our language had no statement separators, as new features were added the grammar was eventually extended to the point where it had to look ahead 7 tokens. Trying to improve on that while maintaining backward compatibility would have required maintaining all that legacy code in the modified YACC parser, etc.
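The lookahead problem described above can be sketched with a toy grammar (hypothetical, not the actual product's language): statements are NAME = VALUE..., with no separator between them, so the parser can only tell where one statement ends by peeking past the next token.

```python
# Toy parser for a separator-less language: statements have the form
# NAME = VALUE..., with one or more value names and no ';' between
# statements. After reading a value, the parser cannot tell whether
# the next NAME is another value or the target of a brand-new
# statement without peeking one token further -- a miniature version
# of the multi-token lookahead the modified YACC grammar needed.

def parse(tokens):
    """Split a flat token list into (target, [values]) statements."""
    stmts = []
    i = 0
    while i < len(tokens):
        target = tokens[i]
        if i + 1 >= len(tokens) or tokens[i + 1] != "=":
            raise SyntaxError("expected '=' after %r" % target)
        i += 2
        values = []
        while i < len(tokens):
            # Lookahead: if the token AFTER tokens[i] is '=', then
            # tokens[i] starts the next statement, not another value.
            if i + 1 < len(tokens) and tokens[i + 1] == "=":
                break
            values.append(tokens[i])
            i += 1
        stmts.append((target, values))
    return stmts

# "x = a b  y = c" arrives as one flat token stream:
print(parse(["x", "=", "a", "b", "y", "=", "c"]))
# -> [('x', ['a', 'b']), ('y', ['c'])]
```

Every new construct added to such a grammar tends to push the decision point further away, which is how a language can creep from 1 token of lookahead to 7.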
The application also saved itself by basically dumping the entire contents of the running application's memory to a disk image. So to load/run an application you just read the entire image into memory and started executing it. In order to maintain binary compatibility with earlier versions of the product, you had to keep every feature that existed in those earlier versions, since any binary image that got loaded could make use of them. Again, it effectively required a reliance on legacy code. If the legacy code was modified/replaced then customers would likely have had to modify their code and recompile, which is exactly what the decision makers wanted to avoid.
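A rough analogue of that dump-the-running-image scheme, sketched in Python with pickle standing in for the product's proprietary binary images (the class name here is made up for illustration):

```python
import pickle

# Stand-in for a legacy feature shipped in version 1 of the product.
class LegacyWidget:
    def __init__(self, name):
        self.name = name

# A customer "application image": a dump of live objects to disk.
image = pickle.dumps([LegacyWidget("report"), LegacyWidget("chart")])

# Years later, loading that image still requires LegacyWidget to exist
# with a compatible layout -- delete or rewrite the class and every
# saved image out in the field fails to load.
restored = pickle.loads(image)
print([w.name for w in restored])
# -> ['report', 'chart']
```

That is the bind described above: once customer images exist in the wild, every feature those images can reference has to stay in the codebase forever, or customers must rebuild.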
Re:Important lines from TFA (Score:3, Interesting)
The UAC "nag" screens are not, in principle, any different from Ubuntu's sudo pop-ups. They're more ubiquitous because of the Windows software ecosystem's DOS pedigree. DOS was not an OS; it was more like a library of system access routines. Any process could access any resource on the system and do as it pleased. Windows software tends to be designed around that assumption. Too many things think they need administrative privileges, perhaps. Under the circumstances, where the policy was being overlaid on a large body of existing software, perhaps a more coarsely grained privilege escalation procedure would have been better, but it would be impossible to avoid excessive prompting altogether.
Windows Defender is another thing that is not fundamentally unreasonable in conception. Vista's policy on configuration and program files is not exactly foreign to Unix users: it thinks they should go in different places. The problem is that Vista treats every piece of non-MS software as presumptively spyware, and thwarts the user by silently sandboxing his attempts to use non-MS tools. If you use an MS tool, Vista does the right thing -- it pops a UAC "nag" dialog.
I can't speak to things like DRM problems -- I haven't had any, nor am I likely to. But in many ways Vista is more Unix-like than XP. Early on in my evaluation of Vista, I had the audio system crash, with the usual cryptic error message. But the rest of the system was unaffected. I didn't say to myself, "The audio system crashed, Vista is a piece of shit." I was impressed. This is how it's supposed to work. Programmers are fallible, and one part of a system shouldn't trust another more than it has to.
The basic mark against Vista is that it was never finished to production release quality. It has prodigious memory requirements, even with the eye candy turned off. Its performance on average is acceptable, but people don't live "on average", they live in the moment. The performance is a little inconsistent, which is much worse than the average performance being a little slow. Attempts to "fix" long-standing problems sort of work, but they often have unintended consequences. Some things it tried to do were reasonable in conception but need to be taken back to the drawing board and redesigned.
All in all, this is something you'd tolerate from a middling beta release, something between initial beta and a serious release candidate. I'm using a beta of Ubuntu Hardy now, and it's tolerable, but actually a bit less polished than Vista. But we should expect it to be a LOT less polished at this point in time, because Vista is already at SP1.
I think the inability of MS to get Vista to production quality shows it is probably just too complex. It feels like a product shoved out the door when the clock ran out.
Re:Important lines from TFA (Score:5, Interesting)
Slackware 12.0 boots up in 47 s and once you log in, KDE grinds the HD for about 30 s more. Now, the response times I'm getting are better than my 7-year-old desktop.
An OS shouldn't limit your hardware performance. This, more than the nagging, is what turned me off of Vista.
Re:Really? (Score:2, Interesting)
Unless "high end" has changed sometime recently, Vista runs quite well on machines that are decidedly less than "high end". My laptop cost less than $1000 and it runs Vista well (Ultimate edition with all the bells and whistles, including Glass).
I just looked - today, you can go to the Dell web site and buy a $500 desktop computer with 2GB of RAM, a dual-core CPU, a Radeon 2400, and Vista Home Premium that will run all the cool features (again, including Glass). To me a $500 computer is not a "high end" computer.
The REAL reason (Score:4, Interesting)
Re:Really? (Score:3, Interesting)
Re:Here we go again, eh? (Score:3, Interesting)
Going from 96% to 92% is a tiny loss, but it's more significant to look at the "other", which has gone from 4% to 8%. It has doubled, and as the trend continues, "other" is getting economically interesting to support. As that happens, Microsoft's "safe" monopoly markets come under competition, and they have to start diverting development dollars back there, again.
As for the whole Vista/XP thing, it shows up as a stumble, which is bad for PR and the "invincible" image. It leaves the crack open for "other" to make significant gains. (When you're in the single digits, single digit gains are big news.) It also puts them in an awkward revenue position. Sure both Vista and XP are revenue, but Vista is more revenue, and they'd have to be very careful about raising the price of XP, because that could be seen as making it "better" than Vista, the flagship product.
As others have said, I don't expect to see Microsoft die off, just become "another software company." By the same token, I expect the process to be very painful for Microsoft, just like it was for IBM.
Re:Really? (Score:5, Interesting)
I don't know whether OSX on the desktop and OSX on an iPhone are the same, because I don't like Apple and have never written anything for either. However, I've written lots of software for BSD, including on embedded devices, and lots of software for Linux, including on phones; and I can verify that BSD on embedded devices is just the same as BSD on the desktop, and that Linux on phones is the same - the codebase with the same libraries and many of the same applications - on phones as it is on the desktop. So there's nothing 'absurd' about the idea that MacOS on an iPhone could be just the same as MacOS on a desktop.
And, again, having written software for it: Windows CE is not - not even remotely - the same as either Windows95/98/Me or Windows NT/XP/Vista. It's completely different.
Vista's failure is down to poor engineering and poor management. Vista could have been brought out on time with all its features as promised by half a dozen of the companies out there - but not by Microsoft.
Re:Hacking the setup (Score:3, Interesting)
Re:Here's what you guys need to do... (Score:5, Interesting)
That's bad. Really really bad. It's bad because they won't be able to afford to develop their way out of their problems if the cashflow into the OS division becomes a serious drag on the bottom line. The current Windows system is so large that it requires armies of programmers to develop its many little pieces, and any sort of "global project" is simply impossible -- as Vista demonstrated.
The situation is extremely similar to Apple in the mid-90s with the Copland project (go read the wiki article). As the project grew it got to the point where they needed an infinite number of people to develop it (see "Mythical Man Month"). Combined with rapidly dwindling sales, and thus revenue, they couldn't even afford a finite number of developers, and the entire project imploded.
As Copland demonstrated, the solution is to start over with a new plan. Let's not forget that Apple has switched platforms _four_ times (68k -> PPC -> OS X -> Intel). If they can do it, so can MS. But if MS is going to do it, they are going to have to pull the trigger, and every release of the existing code base makes that decision harder and harder.
Working against MS is the fact that they are *not* near death. Apple's brush with extinction meant there were very few people to piss off when the inevitable happened and the old systems were semi-abandoned into the "penalty box" (Blue Box). MS has hundreds of millions of users, which is going to make their life extremely difficult. VMs may indeed work, given recent advances, and if they can isolate applications in different VMs then they might make the system more secure as a free offshoot.
Maury
Re:Gartner analysts? (Score:4, Interesting)
And really, there are a lot of people who don't have a clue, who need "analysts" to help them form opinions: they're called "customers" or in some circles "clients".
Re:Really? (Score:3, Interesting)
I'm not disputing that your purchase will run Vista fine, just the idea that people who buy "disposable" computers are idiots.
Your 2.8ghz machine would go for chump change on craigslist today...and RAM upgrades would cost next to nothing (thanks, China!).
Those of us who buy "disposable" machines don't make any investment in technology. We buy cheapass machines that run the technology of the day very, very well. In the long run, since we are not invested, we can afford frequent upgrades. In that timeframe, we were spending $1000 for computers that could handle XP (released in 2001). Today we might drop $1000 for a computer that can handle Vista...but are more likely to spend $300 on a box that can run XP. Exactly how much did your computer cost you in 2000?
Sadly, Sam Vimes' "Boots Theory" of economics does not hold true for computers.
Re:Really? (Score:3, Interesting)
Re:legacy code (Score:3, Interesting)
Apple basically transitioned an entire ecosystem with barely a hitch. It's a shame that Microsoft did not take the opportunity to do something similar when passing from XP to Vista.
Re:Really? (Score:3, Interesting)
There are three overlapping issues here:
1) Endness of a machine purchased new today.
2) Endness of a machine purchased sometime before Vista was released (if they weren't targeting at least some of these, why bother releasing an upgrade version right away?)
3) Microsoft slapping "Vista-ready" or whatever on machines not capable of running Vista in a full-featured way.
The truth is, Vista runs fine on some machines in category 2, as long as it's got enough RAM and a graphics card capable of running whatever the glitzy 3D graphics stuff is called. However, with such a broad category (machines that were purchased before Vista came out), it's hard to judge how high- or low-end the machine is, and is thus largely pointless to try to discuss that.
Vista also runs fine on low-end machines in category 2, given the caveats. (I disagree that 2GB of RAM is low-end today -- there are plenty of machines configured by default with only 1GB, and a smattering configured with 512MB.)
Category 3 has to do with machines which were capable of running Vista, but not capable of running the glitzy 3D. The question is whether or not such a machine is "Vista ready." I'm not really interested in arguing on this point, but it bears mentioning since it ties in with endness.
Re:Really? (Score:5, Interesting)
8 years ago was still P3 time. The original P4 wasn't released until late 2000.
Re:Really? (Score:2, Interesting)
- I perhaps should have used the word "modern" (computer) instead of the word "high end".
- While Vista was being developed (and for many years), it was likely developed on what was then considered "high end" computers.
Re:Really? (Score:3, Interesting)
The biggest difference for my buddy has been the lack of cruftyness. Whatever foul being infected his old P4 machine caused Windows to start in like 10 minutes or so. I'm positive that it was infected with SOMETHING, but I couldn't fix it without reinstalling. So far (almost a year), his laptop is mostly cruft-free.
I have no idea if this is to the credit of Vista, his use of Firefox with adblocker, or to my advice that he not install EVERYTHING that he comes across. Every once in a while he calls me with a "Vista won't let me do this!" and so far it has been GOOD that Vista wouldn't let him.
For the record, he still prefers XP and he absolutely HATES Office 2007. I kind of like the idea behind the interface but confess to not having used it much at all.
Re:Hacking the setup (Score:3, Interesting)
Change it to explorer.exe. Type in 'telnet://', hit enter, and you were in.
Those were the days.
Re:Really? (Score:5, Interesting)
These three computers now run beautifully and I thoroughly enjoy noticing that after upgrades sometimes things run faster not slower.
One thing that bothers me, both as a consumer and as someone who tries to be environmentally conscious, is that the continual trend towards more bloat in Windows results in the premature obsolescence of perfectly good hardware. I can foresee getting a total of 8-10 years of good use out of these computers (even more if I do things like reuse them as NAS devices or routers). I save money, do a bit to reduce waste in landfills, and don't have to deal with the frustration of working with an operating system that prevents me from fully utilizing the potential of hardware I bought.
Frankly, I'm seeing fewer and fewer valid reasons for the continued use of Windows other than 'it works' or 'that's what I'm familiar with.' And even those arguments are becoming less and less valid themselves.
Re:Really? (Score:3, Interesting)
The "bloat" in Vista isn't the kernel, it's all the stuff that goes on top like the GUI.
Then why is it that when Vista came out, my 3-year old laptop ran it like a 3-legged dog, even after I upgraded the RAM to 2 GB? It wouldn't even run the Aero GUI (even though the same laptop runs compiz in Ubuntu without any problems). It was obviously not the GUI getting in the way, since I was forced to revert back to the W2K style GUI. The machine does great with Ubuntu and WinXP, though. Besides, if a GUI can be the cause of slow file copies, I'd say there's something seriously wrong with the OS.
And as a side note, I just recently replaced my home machine, which was an 8.5-year-old P3 running XP... It was a great PC when I got it, but it wasn't doing so hot by the end of its Windows life. I think I'll turn it into a mythbox, though; it will handle that just fine.
Re:Hacking the setup (Score:2, Interesting)
Figures Microsoft changed it; they now call it SteadyState. If you want this: http://technet.microsoft.com/en-us/magazine/cc160970.aspx [microsoft.com] It did work with our public computers. The decision was made to kill all the public computers, since we do have wireless here and most people who come here have their own laptops.
Re:Really? (Score:2, Interesting)
Like the Gartner guy in TFA, I believe it has more to do with Microsoft's engineering and product management practices. I think there's also a philosophical difference. The Mac OS X people are all of the Unix tradition. Windows, on the other hand, mixes the just-make-it-work Windows approach with the spirit of VMS (via Cutler and NT).
The Unix philosophy seems to have scaled better. The Windows mobile offering is a totally different beast than Vista, but both OS X and Linux can be made to fit on everything from phones to handhelds to high-end systems. Linux and OS X (as well as some of the other Unix children) also have been releasing more frequently and (IMHO) improving more than Windows has.
I hope Microsoft pulls it together, but the philosophical differences may be too deeply baked into the culture and into the code base to turn it around quickly.
Re:students will hack *anything* (Score:4, Interesting)
Not easy. First, they use Idesk for their desktop (on Windowmaker), so all you can open is Firefox. I used the local browser code execution trick to get a shell, and took the home directory back for myself, but had no root. I eventually had to look up an old, old, old overflow in ping, compile it on another box (since there's no local compiler), and copy it to the terminal, and then I had a root shell. Total time: 5 hours. That's roughly 60 times what it took for me to break an XP kiosk.
The moral is either "don't admit to fucking with kiosks online," or "Ubuntu is, despite its friendliness, surprisingly more secure than Windows."
That's the Whole point (Score:3, Interesting)
Re:Here's what you guys need to do... (Score:3, Interesting)
Re:Hacking the setup (Score:3, Interesting)
Re:Here we go again, eh? (Score:3, Interesting)
Re:Really? (Score:4, Interesting)
Such installs, when automated, tend to take, in my experience, around ten minutes off a disk image in a VM, compared to an hour and a half for installation (not counting the time wasted when you don't know it's asking you a question because you're off being productive elsewhere), plus the hours and hours of installing drivers for networking and video, rebooting, updating Windows Update, rebooting, running Windows Update, rebooting, running Windows Update again, rebooting, and so on.
You can trim a Windows Vista installation (between 2GB-4GB, according to TPB) down to around 600M, trimming out all the crap that I personally could afford to lose. The result was so absurd that I just wiped it out without bothering to test it.
So, if Windows Vista is really just 'XP with prettiness and UAC', why is it an extra 450M? It's not drivers (I wiped out everything that Vista comes with). It's not useful apps or productivity tools (everything Windows comes with, I replace). So where's it all going?
I know there are a lot of under-the-hood changes, but given the loss of performance, the ballooning of requirements, the complexity and the frustration, surely it can't be justified... can it?
Re:Really? (Score:3, Interesting)
When you strip everything 99% of users want out of XP, it goes from ~600M to ~150M. When you do the same to Vista, it goes from ~2-4GB to ~600M.
I'd imagine a lot of that is Vista's purported 'multiple DLL versions', which keeps every version of each DLL, so that apps that need a specific one will get it... but still, that seems absurd to me, and it doesn't explain the bloat in system requirements.
Let's face it, whatever Microsoft did to cock up Vista so badly, it was enough that even if they wanted to they couldn't build a good OS off it. Just like the Pentium M was a fantastic chip based not off the then-current P4 but the previously-retired P3, so too would Microsoft have to build their next-generation OS off XP at best, unless they refused to acknowledge (internally) their abhorrent failures.
Re:Really? (Score:4, Interesting)
DirectX 9.0c is 218MB.
Does that explain why you cannot have it on an embedded device?
OpenGL is tiny, for reference. The core of it is 0.7MB on my computer.
And locking the iPhone in that manner isn't difficult.
But it's impossible with Windows. You need to be Admin to install stuff.
How did 'ping' get you root ? (Score:3, Interesting)
Seriously - I'm curious. I'm not a hacker, but do understand things a bit. I get how you compiled a vulnerable version of ping, and copied it to your now-available $USER shell. I assume this would mean the ping executable is at most UID/GID User:User, rwx 777.
How do you get from there to root? A local buffer overflow in a non-privileged ping executable allows you to get access to privileged memory ranges not controlled by ping, but rather by some privileged process, and you use that access to that privileged memory area to get to root?
If that is somewhat correct, it seems like the memory manager is to blame, not a bad ping programmer. Why should ANY non-privileged application be allowed to do that by the MMU? If not a buggy ping, then what's to keep you from downloading a purposely-written overflow app from a website and breaking out with that?
Is that what NX fixes? But wouldn't some non-kernel privileged memory still need to be marked executable for root and setuid apps? Does NX thus have some policy mechanism for what program and/or memory range is and isn't vulnerable to overflows?
I understand the 50,000 foot view of SELinux and AppArmor - do they operate in this domain, or more at the file-and-kernel-ABI access permission level (rather than in this memory-range level)?
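For what it's worth, one piece the question may be missing (an assumption on my part; the grandparent doesn't spell out the exploit): classic ping binaries ship setuid root, because crafting raw ICMP packets historically required root. An overflow in a setuid-root program runs with euid 0, so no MMU or memory-manager failure is involved -- the process being smashed was already privileged. A quick way to check any binary on a Unix-like system:

```python
import os
import stat

def is_setuid_root(path):
    """True if 'path' has the setuid bit set AND is owned by root --
    the combination that turns a buffer overflow in it into a root
    shell, since the kernel runs it with the owner's (root's) euid."""
    st = os.stat(path)
    return bool(st.st_mode & stat.S_ISUID) and st.st_uid == 0

# On older Linux systems /bin/ping was typically setuid root; modern
# distros often use file capabilities instead, so this may print False.
for candidate in ("/bin/ping", "/usr/bin/ping"):
    if os.path.exists(candidate):
        print(candidate, is_setuid_root(candidate))
```

As for NX: it marks data pages non-executable so injected shellcode can't run directly from the stack, but it doesn't change whose privileges the process holds. SELinux and AppArmor operate at the policy level (which files and syscalls a confined process may touch), not at the memory-range level.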
Thanks for the info....
ReactOS (Score:2, Interesting)
Re:Microsoft forgot their customer (Score:3, Interesting)
Now why on earth would I want to watch BluRay movies on a tiny little VGA-connected monitor when I have a standalone player and a 32" television for that?