Windows Is Dead – Long Live Midori?
parvenu74 writes "A story from InfoWorld suggests that the days of Windows are numbered and that Microsoft is preparing a web-based operating system, code-named Midori, as a successor. Midori is reported to be an offshoot of Microsoft Research's Singularity OS, an all-managed-code microkernel OS which leverages a technology called software isolated processes (SIPs) to overcome the traditional inter-process communication issues of microkernel OSes."
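The SIP idea the summary describes (processes that share no memory and talk only through typed message channels) can be sketched roughly like this. This is a toy Python illustration of the general pattern, not Singularity's actual API; every name here is made up:

```python
# Toy sketch of the software-isolated-process (SIP) idea: no shared
# mutable state, all communication through channels that enforce a
# message-type contract. Illustration only; not Singularity's real API.
from queue import Queue
from threading import Thread

class Channel:
    """One-directional channel carrying messages of a single type."""
    def __init__(self, msg_type):
        self.msg_type = msg_type
        self._q = Queue()

    def send(self, msg):
        if not isinstance(msg, self.msg_type):
            raise TypeError("channel contract violated")
        self._q.put(msg)

    def recv(self):
        return self._q.get()

def worker(requests, replies):
    # The "SIP": owns none of its peer's memory; sees only messages.
    n = requests.recv()
    replies.send(n * n)

requests, replies = Channel(int), Channel(int)
t = Thread(target=worker, args=(requests, replies))
t.start()
requests.send(7)
result = replies.recv()  # 49
t.join()
```

The point of the contract check is that isolation is enforced at the communication boundary in software, which is what lets Singularity drop hardware memory protection between processes.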
Prediction (Score:5, Interesting)
Re:Thin Client? (Score:3, Interesting)
Re:This is great news! (Score:2, Interesting)
Re:Huh? (Score:3, Interesting)
Defense against Linux boxes? (Score:5, Interesting)
Re:Prediction (Score:2, Interesting)
Proprietary Javascript (Score:3, Interesting)
This probably means that M$ is going to add a bunch of proprietary extensions to Javascript through IE and start adding language features to build a proprietary platform. Even more so, probably access to the win32 api via Javascript. Or probably JITed C#... wait, wasn't Java supposed to do this?
Re:Prediction (Score:5, Interesting)
How does one have a web-based operating system anyway? If you're running your OS inside a web browser, what is the web browser running on? Is it just turtles all the way down?
What's old is new again... (Score:4, Interesting)
This is almost exactly the same thing, in spirit at least, as Inferno (http://www.vitanuova.com/inferno/), which started in 1995 and has been under continuous development since. Managed kernel, runs on real hardware, uses software isolation between managed threads... oh, and has code flying, for real, right now. :)
Re:Thin Client? (Score:5, Interesting)
It seems that every ten years, someone re-invents the thin client.
First it was dumb terminals connected to a mainframe, then to a serial-port box so one could connect to a UNIX box.
Then came XStations which used various (direct, indirect, broadcast) forms of XDMCP to find a host to download microcode and run apps from.
Then, it was JavaStations where people talked about fast broadband access to stuff on the ISP's server, and not to worry about all their private documents being stored offsite.
This just seems like more of the same, perhaps an offshoot of cloud computing. It will work for a couple niches here and there, but as a whole, Net based operating systems will fail, as people want to keep their stuff private on their own systems.
Same disadvantages apply. Security of stored files for example -- I trust my external TrueCrypt encrypted drive that uses both a long passphrase and a set of keyfiles a lot more to securely store my Word documents than I do some random ISP's computer.
Re:Prediction (Score:5, Interesting)
If the replacement rate for a desktop computer is 3 years, and everyone buys a PC for $250 and Windows for $130 - that's less than $400 over 3 years... or just over $10 monthly.
If I had a website that offered full MS Office functionality and compatibility for $10/month... wanna bet I'd have some takers? They'd need 366 million customers to equal their current revenue using this model.
Worldwide, PC sales are supposed to grow to over 250 million/year by 2010, so while their target would be ambitious - it is feasible if they could rope roughly half of new PC buyers into this new model.
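For what it's worth, the parent's arithmetic holds up. A quick check, using the poster's own figures (which are the poster's assumptions, not verified numbers):

```python
# The poster's figures: a $250 PC plus $130 for Windows, on a 3-year cycle.
pc_cost, windows_cost, years = 250, 130, 3
monthly = (pc_cost + windows_cost) / (years * 12)
print(round(monthly, 2))  # 10.56 -- "just over $10 monthly"

# At $10/month, the poster's 366 million subscribers would imply
# annual revenue of:
implied_revenue = 366_000_000 * 10 * 12
print(implied_revenue)  # 43920000000, i.e. roughly $44 billion/year
```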
Trivia ... (Score:5, Interesting)
"Midori" is Japanese for "green". It is also a common female first name.
I don't know how either would apply to an OS, unless it has some connection to this [wikipedia.org].
Re:Prediction (Score:5, Interesting)
If I had a website that offered full MS Office functionality and compatibility for $10/month...
I concur, there would probably be tremendous interest. I just wonder whether it being a Microsoft-branded product would be a detriment to its success, as opposed to it being judged purely on the merits of what it offers. But allow me to play devil's advocate for a moment and suggest that for gamers this might not be such a bad thing. (Potentially) less OS on the hard disk could mean lower resource utilization, and I'm sure a few enterprising users would find further ways to enhance performance, maybe something akin to tuning current Windows services so as to prevent unnecessary network access.
Re:Prediction (Score:5, Interesting)
Wanna check your email? That'll be $1. Wanna post to /.? That'll be $2.
[after searching Clippy pops up]
I'm sorry, I was unable to process your credit card number on file. To see all of the search results, please enter a valid credit card number.
Some confusion about Singularity / Midori (Score:5, Interesting)
Brian Madden is either talking about something else, or he's confused by references to hypervisors elsewhere. Midori will run under hypervisors... but as one possible deployment of the OS, not as an essential part of the system. Singularity is more like ".NET" taken to the next level, with the entire OS running without hardware memory protection (let alone hypervisors), so it can run anywhere... even as a module inside another application... without any specific hardware support.
Re:Problems (Score:1, Interesting)
The ideal situation is to have the CPU execute the managed bytecodes natively, essentially handling all the isolation in hardware.
With 64-bit addressing there is a huge opportunity for hardware support of packed data structures or hardware-based compression.
With native execution and some form of data packing there may actually be the possibility of fast execution, reasonable memory use, and a more forgiving managed environment.
Now if only we could do something about exception handling (essentially goto statements) and allow assured execution of functions that can't fail unless the CPU/northbridge melts (for example, _everything_ shouldn't have to allocate memory), it would interest me.
I think there is a basic opportunity for new directions...
More intelligence in the CPU - offload higher level instructions
Generic hardware interfaces - DMA, out-of-order/scatter-gather, queues, interrupt handling, etc. Hardware includes an interface specification in its firmware... think Bluetooth-like hardware profiles.
But the devil's in the details, and there is a reason zillions of hours have been spent on today's computers. If you can get rid of all legacy software and hardware concerns, there is a lot of opportunity... given how cheap hardware is, and how flexible source code is (or can be made to be via an automated process), a totally new direction might be both affordable and worth trying.
Re:Prediction (Score:4, Interesting)
Of course, why not share it between users, especially if the cost is high? And if it supports multiple desktops, won't every household maintain one OS for multiple users?
Interesting part is not that it's web-based... (Score:3, Interesting)
Wouldn't really work outside of America (Score:3, Interesting)
I mention America specifically as a generic example that everyone understands, for one reason: "unlimited Internet bandwidth." This type of model (even if it is a model where MOST of the OS is on current hardware but then periodically checks the Internet for its main "modular" pieces, vs. having it all on the hard drive as we currently do) cannot work well, because other countries actually have to pay for specific amounts of bandwidth.
And even now, I've read random articles about ISPs (in America) which are considering moving to "pay-for-bandwidth tiers" models. WTF is the point of getting an OS that eats up all of your bandwidth just to stay turned on and run a screen saver? It would need to periodically connect out and update things, after all...
Some might argue that this is already being done, and that "caching" would solve the problem ... except that caching would negate the whole purpose of an online-OS (it needs to always have the latest thing to work well). Currently windows ALREADY connects out and randomly checks things and uses bandwidth, but it's NOT downloading entire modules as something like that would require.
Sorry, but if I lived somewhere with pay-as-you-go internet (I'm considering moving to Australia) I sure as hell wouldn't pay more money to an ISP on a monthly basis just so that I can use the "latest and greatest" Windows.
Re:Prediction (Score:2, Interesting)
The problem arises when I start being forced to do it. For example, when the machines start using trusted computing to expel a free OS and a free office suite.
Re:Thin Client? (Score:1, Interesting)
I think you're just trying to be confrontational.
Yes, Windows XP still has its problems, but it is quite stable if you're not stupid about it.
I keep my work machine up for weeks at a time, hibernating about twice a week during that, and I have never had problems. Some things start going a little weird at times; I recently silently lost my sound device after an uptime of 35 days.
I have 3 home machines; one is a Windows XP media centre machine hosting 1.5TB of data, movies, and music that I share and stream out to the other 2. The "server" stays up for weeks at a time with nary a problem. Heck, the MythTV box (standard install and nothing else) that hooks into my TV has memory issues if I leave it up for longer than about 8 hours. Upon a normal boot it can play a high-def (720p) movie streamed across my home network just fine. After about 9 hours, it has trouble playing a 320x240 video clip without stuttering. (It has 2GB memory and an E6550 Core 2 Duo.) I suppose it's entirely possible that I need to tweak some configuration, but I didn't have to do that for the Windows machines.
The other machine (a small Windows XP media centre laptop I move around the house) stays on, or in and out of hibernation, for weeks at a time. It always comes back and chugs along with no problems. I often fall asleep and wake up to it playing along just fine; sometimes it stays playing for a day or two at a time with no problems (the stereo sleep timer is on, so I usually forget to turn the computer off until I notice it is on).
At work, I administer about 400 Windows XP machines. I can go through our help logs over the past 3 years and count the number of blue screens on one hand. The number of times Windows has screwed up by itself (where the user hasn't done something stupid or installed some crap that screwed it up) is about 43 for the last 3 years. And over half of those coincidentally happened whenever we got a power bump through the building. Notebooks on docking stations always have trouble with power bumps.
Yes, it's anecdotal, but Windows has progressed nicely enough that it handles itself well.
Of course whether you count viruses and potential security problems directly against Microsoft or not will change that.
Vaporware (Score:3, Interesting)
Anyone remember Cairo? ;-)
Re:No longer associated with BSOD? (Score:1, Interesting)
I can say that I've encountered a BSOD in XP but it must have been less than a dozen times spread across 5 years and over 80 computers.
Agreed, and in my case it can almost invariably be traced back to one of:
1) bad network drivers, particularly wifi
2) bad video drivers
3) faulty ram
4) faulty hard drive
I've had linux kernel panics about as often, and for generally the same reasons.
And when Windows crashes for those reasons people still blame Windows not the drivers/hardware.
Re:Prediction (Score:4, Interesting)
Also, in terms of practicality I have to say that I wouldn't know what to do with a few hundred thousand dollars myself. What am I going to do, stuff it under my bed? I feel like there is a purpose in having institutions that make it their business to do with my money what I can't really do myself. For me it's not out of paranoia that I don't store files with Google, but that I don't see the point. I don't NEED Google to store my files, I've been doing it for years myself.
Re:Prediction (Score:5, Interesting)
Well, it all depends on how you use it. Back when I was married to The Bitch, we had one master computer running Linux that we both used. Sharing time on it was a bitch because I used it for work and she used it for play. To solve this issue I rounded up an old '486, a 20 MB HD, and a 15" display. Piece of crap. I installed a very slimmed-down Linux, just enough to boot and connect the X server to the central host.
She had her play computer and I had a work computer and everything was fine.
Actually, there was an interesting turn on that setup. After we separated, she and some of her cult buddies broke into my house and stole that X terminal I made her. I found out through a friend that they did that because they didn't want me reading the email she left on "it" or having access to her ICQ logs. I found it very amusing that she had stolen the wrong computer.
And if you're wondering: yes, I did look through the ICQ logs and email. I did show them to the judge and use them in court. I found out her nuttiness was nuttier than I ever imagined. I found out she had been abusing my son and what she had planned. So if you're going to bitch about her privacy or some such BS, save it.
Re:Prediction (Score:3, Interesting)
The banker is not a blabbermouth.
He isn't looking over my shoulder whenever I dictate a letter.
He isn't reading our internal reports and planning documents - and - no matter how richly deserved - he isn't feeding the minutes of our daily conference calls to Scott Adams and The Simpsons.
Re:Prediction (Score:3, Interesting)
Re:Not Web Based (Score:4, Interesting)
Midori is being designed in such a way that components of the OS communicate with each other in a location independent manner. API calls to a local machine are no different than API calls to a remote machine.
This strikes me as being similar to a design goal shared by Plan 9 [wikipedia.org], and its spiritual descendant Inferno [wikipedia.org], both of which were based around the 9P [wikipedia.org] protocol.
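The location-independence idea can be sketched like this. It's a toy Python illustration of the general pattern the comment describes (not Midori's design or the 9P protocol; all names are made up): the caller invokes the same interface whether the implementation is local or notionally remote, and only the transport behind the proxy differs.

```python
# Toy sketch of location-transparent calls: the caller can't tell a
# local backend from one reached through a serializing proxy.
import json

class LocalFS:
    def read(self, path):
        return {"path": path, "data": "hello"}

class RemoteProxy:
    """Serializes each call as a message, the way a wire transport would."""
    def __init__(self, backend):
        self.backend = backend

    def read(self, path):
        request = json.dumps({"op": "read", "path": path})
        # ...the request would travel over the network here; for the
        # sketch we just decode it and dispatch in-process.
        decoded = json.loads(request)
        return self.backend.read(decoded["path"])

def show_contents(fs, path):
    # The caller is oblivious to where `fs` actually lives.
    return fs.read(path)["data"]

local = show_contents(LocalFS(), "/etc/motd")
remote = show_contents(RemoteProxy(LocalFS()), "/etc/motd")
assert local == remote  # same interface, different "location"
```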
Re:Prediction (Score:3, Interesting)
What? Do you work for Citrix?
You describe thin client architecture, or hosted computing. I totally agree, this is known, established technology. But from your comment you don't seem to understand these in any amount of detail.
Both Citrix and RDP clients, and especially X, can transmit drawing primitives to a client when dealing with a hosted application. However, "serving up nothing more than a picture" is generally slower than issuing higher-level commands. In simple terms, telling the client "draw a dialog box asking 'Do you want to continue | Yes | No'" takes less bandwidth than a bitmap of the dialog box.
X, which I am most familiar with, takes this concept very far. You can run an application on a remote computer (the X client), which will still use your *local* video card/computer (running the X server) (yes, client/server notation is kind of reversed in X), and run 3D apps over a relatively low-bandwidth connection with fluidity. Imagine trying to do that with a compressed bitmap being pushed over the pipe: it would be a bad slideshow.
RDP does transmit high level instructions, and can be relatively fast. It even has switches for "bitmap caching" of common things, and can compress data too. In my experience deploying the two, RDP is far from bloated and universally faster when compared to ICA, although ICA definitely offers a greater degree of control and customization in the server environment. We do use ICA exclusively, but I think it is primarily for business and historical reasons.
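The "primitives vs. pictures" point above can be put in rough numbers. The dialog dimensions and the command syntax here are assumed purely for illustration:

```python
# Back-of-the-envelope: a drawing command for a small dialog vs. an
# uncompressed bitmap of the same dialog. Sizes are assumed, not
# measured from any real protocol.
command = b"DIALOG 'Do you want to continue' BUTTONS Yes No"
width, height, bytes_per_pixel = 300, 120, 3  # assumed 300x120, 24-bit

bitmap_bytes = width * height * bytes_per_pixel
print(len(command))   # tens of bytes
print(bitmap_bytes)   # 108000 bytes, over a thousand times larger
```

Compression and caching narrow the gap (as the RDP paragraph notes), but the asymmetry is why primitive-based protocols stay usable on thin pipes.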
From your last sentence:
>'probably run the whole thing on VM -- most large companies are doing that now for there employees'
I assume you are not mixing up running a virtual machine on a server (to host a Windows Citrix server, or a virtual Windows session for a remote client) with running a Windows server that hosts a Citrix application. The first is a virtual machine; the second is just a hosted application. You didn't really separate the two concepts.
My corporation runs Windows Server and has Citrix-hosted applications for our five main business apps. I detest our setup for controlling access (it was handed to me and I am in the process of changing it), which uses a Juniper VPN host to allow clients access to specific servers, but that is a separate story.