The State of the Internet Operating System
macslocum writes "Tim O'Reilly: 'I've been talking for years about "the internet operating system," but I realized I've never written an extended post to define what I think it is, where it is going, and the choices we face. This is that missing post. Here you will see the underlying beliefs about the future that are guiding my publishing program as well as the rationale behind conferences I organize.'"
Meh (Score:3, Informative)
Re: (Score:1, Funny)
Is there a daily prize for obviousness?
No, there isn't.
Re: (Score:2)
yeah! neither does the article embiggen our knowledge on the subject!
Oh pipe down and eat your rootmarm!!
He's channelling Stallman is why it sounds familiar (Score:3, Interesting)
Re: (Score:3, Informative)
You've got to be kidding that I'm channelling Stallman. He's finally waking up to an issue that I put in front of him all the way back in 1999. At the time, he said "It didn't matter." See for yourself, in the transcript of our interchange at the 1999 Wizards of OS conference in Berlin. They are a fair way through the PDF of the transcript, so read on down: http://tim.oreilly.com/archives/mikro_discussion.pdf [oreilly.com]
At the time I was talking about "infoware" rather than "Web 2.0" but the concepts I was working
Re: (Score:2)
The guy also seems to have a problem differentiating between an operating system and a network infrastructure.
Re: (Score:2)
The internet has an operating system just as much as a colony of ants has a hive mind. They don't, but they sure act like they do.
Metaphors. Learn them. Use them. Love them. But don't anthropomorphise them. They hate it when you do that.
Re: (Score:2)
How cromulent is it? We know it's not exactly cromulent, but you've left its actual cromulence vague.
Dumb terminals and smart people don't mix (Score:5, Interesting)
This whole "Internet OS" thing reminds me of the periodic resurgences of the dumb terminal/thin client idea that goes back to the mainframe days. It seems like every ten years or so, everyone is talking about thin clients in every office, with the OS and apps running on some offsite server somewhere (now with the added twist of multiple servers over the internet). Ostensibly this is seen as a good way to save IT money and overhead. But in every actual deployment I've seen, it only causes hassles, additional expense, and headaches.
Back in the 90's we tried this at my old university. We networked all our computers and put all our apps on a central server. Even though this was all done on a local network (much more reliable in those days than the internet), it was still a complete disaster. Every time there was a glitch in the network, every student, professor, and staff member at the university lost the ability to do anything on their computer--they couldn't so much as type a Word document. Now, with little network downtime, you would think this wouldn't be so much of a problem--but when you're talking about thousands of people who live and die by the written word, and who are often working on class deadlines, you can imagine that even 30 minutes of downtime was a nightmare. I was skeptical of this system from the get-go, but got overruled by some "visionaries" who had bought into the whole thin client argument with a religious fervor. Of course, long story short, we ended up scrapping the system after a year and going back to the old system (with a significant cost to the state and university for our folly).
Re: (Score:3, Insightful)
Actually, it's all just one big cycle. When I first broke into the IT world, PCs were a bit of a novelty in most businesses. Then the PC explosion moved things toward a "client-side" setup, with faster desktops and laptops and not as much horsepower required on the server side. Then, in an effort to save money, tied in with servers/CPUs/memory becoming cheaper, and security concerns, companies started (or have started) to slowly pull things back from the client side and put more emphasis on the
Re: (Score:2)
I don't think we are going to go back to the old days of client-side apps. There's a big difference today: the growing ubiquity of network access. Decades ago we didn't have the internet (or it was crappy, slow, and too expensive); every once in a while a new computer generation focused on client-side software because networks didn't really matter that much. With the ubiquity of the internet I don't think we'll see that again. We are starting to see MB/s of internet bandwidth, it won't be too long until
Re:Client-side / Server Side (Score:2)
I like a hybrid approach.
Our Enterprise accounting system is on the server, but office apps are local. Daily workflow seems to produce a lot of "debris", which conveniently forms little digital compost heaps on people's local machines. (With a little nudging) if there's a document that's usefully finalized, post that version to the server folder.
MS Office Basic is "essentially almost-free" for OEM hardware purchases, so why put Word and Excel on a server?
Re: (Score:2)
Does the internet go down a lot? I haven't noticed.
Re: (Score:1, Funny)
"Does the internet go down a lot? I haven't noticed."
Then you are watching the wrong videos. They almost always start with going down.
Re: (Score:2)
I get phone calls from my mother all the time telling me that the internet has gone down once again. Her home phone, which she uses when she calls me about this, is VoIP.
Re: (Score:2)
"Actually, it's all just one big cycle."
Not since the internet. The problem with thin clients is that they create a single point of failure.
The great thing about the net is redundancy; even if that comes at a cost, it gives you extreme amounts of flexibility. Is a site down? Find it in the caches of the net.
Re: (Score:1, Funny)
Dumb peeple and dumb terminalz dont mix either.
Re: (Score:2)
It's also dumb. Even if you bought a low-end Intel Atom machine, why would you want to waste that CPU letting it be a dumb terminal? Put that CPU to work by enabling it to do tasks independently even if the network connection fails.
Re: (Score:2, Funny)
Re: (Score:2)
Or something a little more useful like Folding@home.
Re:Dumb terminals and smart people don't mix (Score:4, Insightful)
It's also dumb. Even if you bought a low-end Intel Atom machine, why would you want to waste that CPU letting it be a dumb terminal? Put that CPU to work by enabling it to do tasks independently even if the network connection fails.
I weep for OpenMOSIX. I was hoping that the project would continue and ere long we'd be motivated to buy all one architecture in our house simply because all the machines would form a cluster almost without our involvement and just accelerate each others' tasks. A terminal cluster where the terminals also make the entire system faster is kind of an ideal dream.
Re: (Score:2)
Re: (Score:2)
You're not suggesting... A world wide beowulf cluster?!
That would be nice too, but there are many issues to be worked out first. Let Amazon &c work them out before we start building intentional cloud botnets. This would only provide you a single system image cluster in your house, and because Unix works on a process model, MOSIX works on process relocation. But when combined with LTSP and a bunch of machines of the same architecture (you could treat anything from Pentium on up, in x86 land, as i586 for example) then it would eliminate the need for local sto
Re:Dumb terminals and smart people don't mix (Score:5, Interesting)
I weep for OpenMOSIX. I was hoping that the project would continue and ere long we'd be motivated to buy all one architecture in our house simply because all the machines would form a cluster almost without our involvement and just accelerate each others' tasks. A terminal cluster where the terminals also make the entire system faster is kind of an ideal dream.
What happened to OpenMOSIX, anyway? I used it very successfully to turn groups of workstations into build servers; they all ran OpenMOSIX, and then make -j8 on any of the workstations would farm out the build to all the workstations. And it all Just Worked, and there was bugger all maintenance involved, etc. I was really looking forward to it getting mainlined into the kernel and then it just all kind of vanished.
There's no indication of what happened on the mailing list --- it just stops. There's a new project called LinuxPMI [linuxpmi.org] that claims to be a continuation but there's no mailing list traffic...
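The build-farm behavior described above can be reduced to a simple scheduling idea: each new job goes to whichever node currently has the least work. This is only a toy in-process sketch of that load-balancing policy (real openMOSIX migrated live processes inside the kernel; the `Node`/`schedule` names here are made up for illustration):

```python
# Toy sketch of the load-balancing idea behind an openMOSIX-style build farm:
# each job is assigned to whichever "node" currently has the least work queued.

class Node:
    def __init__(self, name):
        self.name = name
        self.queue = []          # jobs currently assigned to this node

    def load(self):
        return len(self.queue)

def schedule(jobs, nodes):
    """Greedily place each job on the least-loaded node."""
    placement = {}
    for job in jobs:
        target = min(nodes, key=Node.load)
        target.queue.append(job)
        placement[job] = target.name
    return placement

nodes = [Node("ws1"), Node("ws2"), Node("ws3")]
jobs = [f"cc file{i}.c" for i in range(7)]
placement = schedule(jobs, nodes)
# 7 jobs over 3 nodes even out to loads of 3/2/2.
print(sorted(n.load() for n in nodes))   # -> [2, 2, 3]
```

The appeal of the openMOSIX version was that this policy needed no central scheduler at all: any workstation running `make -j8` effectively became the submit node, and the cluster absorbed the work.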
Re:Dumb terminals and smart people don't mix (Score:4, Informative)
According to Wikipedia, "On July 15, 2007, Bar announced that the openMOSIX project would reach its end of life on March 1, 2008, due to the decreasing need for SSI clustering as low-cost multi-core processors increase in availability."
Re: (Score:2)
Yeah, in fact I built just that for my school many years ago. 10 computers (PIII's), set up as an openmosix terminal cluster. It worked really well. If all terminals were in use people had the power of one PIII just like normal, and if fewer people used it, then there would be more power for everyone. This was far more efficient, especially as the computers would be on anyway, and scaled really well, as we didn't need to invest in really beefy servers to host all the apps on. It really was cost effective, a
Re: (Score:2)
This is assuming a perfect system; the server has to be upgraded appropriately and have proper data, power, and network backups to prevent the same issues. But how often does Slashdot go down these days?
Re: (Score:2)
Well coming from an era when we had dumb terminals, I have no desire to go back to that. I like being able to use my computer even when it's not connected to the net. Like last night, I was watching videos without a connection. I couldn't do that with one of those so-called "cloud" computers, because neither the movie nor the player software would be on my machine.
And if you're really concerned about backing-up your data, there are services you can use NOW to upload your HDD to the net, so if your house
Re: (Score:2)
You can already connect your computer to the cell phone network for internet; improve that for bandwidth and reliability and there is no reason a computer cannot always be connected to the internet.
Additionally, having the OS on the internet instead of your device allows you to be working on a document on your desktop, move it over to your iPhone to continue work on
Re: (Score:2)
I'm sure there are other examples that would work too, ex GPS maps or grocery lists...
Re: (Score:3, Insightful)
Actually, that's a good way to phrase it. That is, it may be true that slashdot itself is almost always up and running. But from my viewpoint, out here on an internet "leaf" node, slashdot quite often seems to be "down". It's fairly common that when I do a refresh, it can take a minute or more to complete. Sometimes when the "Done" appears at the bottom left of the window, the window is mostly blank, and it takes another refresh to get the summaries bac
Re: (Score:3, Interesting)
On the other hand an internet OS will use a lot of that bandwidth, likely leading to increased lag even as bandwidth increases (see hardware requirements of Win 95 vs Win 7...).
Unfortunately the only sure way to know how well it will work or not is to try it and see what happens.
Re: (Score:2)
Unfortunately the only sure way to know how well it will work or not is to try it and see what happens.
Probably, and of course the open nature of the Internet means that people are free to experiment with a network OS. Actually, I've done that myself. Some 25 years ago, I demoed a "distributed POSIX" library that allowed me to do things like type "make" on one machine, and watch as it spun off subprocesses that compiled source on N other machines, linked them with libraries on other machines, and installe
Re: (Score:2)
Re: (Score:1)
Running things locally would work great if >90% of apps these days didn't need files from some sort of network drive/server/export/etc, requiring network access anyways. Lots of commercial software won't run if it can't get a license from the network, and Outlook is just about worthless without a network connection. So really you need that connection anyways. Why do you seem to think that the loss of network access would need to immediately kill anything you were doing at the time? Wait for the network to come back up
Re: (Score:2)
We may be at the point where things are stable enough. (How often do you lose your Gmail? Yes, it went down for me the other day, but it's the first time in at least a couple years.) The risks are much higher than the gains but they can be overcom
Re: (Score:2)
So your implementation didn't handle faults well, therefore we should throw out the idea?
There are certainly criticisms to be made of the centralized model, but your anecdote isn't one of them. If the product you bought and/or the stuff you built wasn't fault tolerant, then you bought and/or built the wrong solution.
Re: (Score:2)
Re: (Score:2)
Yeah, I feel like there are a few problems with the vision of running a terminal/mainframe model, first and most obvious being, as you said, it introduces a central point of failure for everyone. If the server goes down, everyone on that server is suddenly unable to work. People will counter by saying, "well you just distribute it across a bunch of servers so there's no more single point of failure." It's harder than it sounds. If you distribute across servers, how do you manage that distribution? What
Re: (Score:2)
I was skeptical of this system from the get-go, but got overruled by some "visionaries" who had bought into the whole thin client argument with a religious fervor.
Or alternately, those "visionaries" were expecting to profit personally from the thin client manufacturer.
What we really want is the best of both worlds (Score:2)
There's no reason why we can't have both - data backed up/synchronized to the "cloud", and applications that can continue to run on locally cached data when the network is unavailable for whatever reason. There are still some cases where this is problematic - e.g. my iPhone Google Maps application really doesn't work in the hinterlands, as the phone won't have the maps locally stored - but this is really just a problem of caches not being big enough or smart enough to do what we need. The problem will be pa
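The "best of both worlds" pattern described above can be sketched quite compactly: writes always land in a local cache first, failed uploads are queued, and a later sync drains the queue. This is a minimal illustrative sketch, not any real cloud API (`RemoteStore`, `OfflineFirstClient`, and their methods are all invented names):

```python
# Hybrid pattern: a local cache that keeps working when the network is
# down and syncs queued changes once connectivity returns.

class RemoteStore:
    """Stand-in for a cloud backend; flip `online` to simulate an outage."""
    def __init__(self):
        self.data = {}
        self.online = True

    def put(self, key, value):
        if not self.online:
            raise ConnectionError("network unavailable")
        self.data[key] = value

class OfflineFirstClient:
    def __init__(self, remote):
        self.remote = remote
        self.cache = {}          # always-available local copy
        self.pending = []        # writes queued during outages

    def save(self, key, value):
        self.cache[key] = value  # local write always succeeds
        try:
            self.remote.put(key, value)
        except ConnectionError:
            self.pending.append((key, value))

    def load(self, key):
        return self.cache[key]   # reads never touch the network

    def sync(self):
        while self.pending:
            key, value = self.pending[0]
            self.remote.put(key, value)   # raises if still offline
            self.pending.pop(0)

remote = RemoteStore()
client = OfflineFirstClient(remote)
client.save("doc", "draft 1")            # online: cached and uploaded
remote.online = False
client.save("doc", "draft 2")            # offline: cached locally, queued
print(client.load("doc"))                # -> draft 2 (readable offline)
remote.online = True
client.sync()                            # queued write reaches the server
```

The Google Maps example in the comment is exactly the `load()` path failing because the cache was never populated; the pattern only helps when the data was fetched (or prefetched) at least once while online.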
Re: (Score:2, Insightful)
Every time there was a glitch in the network; every student, professor, and staff member at the university lost the ability to do anything on their computer--they couldn't so much as type a Word document.
Meh. That's true for my workplace despite our thick clients. Network folders, Internet connection, Active Directory... If anything goes down the office just sort of grinds to a halt.
Re: (Score:2)
Back in the 90's we tried this at my old university. We networked all our computers and put all our apps on a central server.
That's the point of local storage in HTML 5. Applications that make good use of it can run without a network connection, or when the server suffers a 30-minute "glitch."
Re: (Score:1)
Re: (Score:1, Interesting)
I think that's where old-school software download sites shine again. They are basically app stores for free/shareware apps; and they've been around for decades.
With the advent of Google-level search engines, they became a lot less relevant. Now that Google & co are spammed to death, they regain part of their old glory.
It's not all black and white though. App-stores suffer from fraudulent entries that try to game the system, too. I've followed the reports of various Apple App Store developers for a while
P or NP (Score:4, Insightful)
It seems the hardest and most time-consuming problem with Internet operating systems is figuring out how to work offline.
And the easiest solution, which seems to escape almost everybody, is "don't work online in the first place".
Re:P or NP (Score:4, Interesting)
The converse is not true. Of course you can retain the capabilities of an offline environment even after you add a wire to it, but those capabilities do not generalize to managing the resources on the other end of the wire.
The easiest solution to implement is a pencil and a piece of paper. Oh, you want capabilities too? Well, that's different.
Re: (Score:3, Interesting)
And the easiest solution, which seems to escape almost everybody, is "don't work offline in the first place".
FTFY. Having my data available on any online computer or device that I happen to be at *increases* its availability to me, even in the presence of occasional outages. There are downsides, such as privacy, but availability isn't one of them: it's a net positive.
Internet as a living entity (Score:3, Interesting)
It has its strengths too: it is maturing (hopefully), has a good defense system so the sicknesses spreading around don't infect everything, and it evolves fast (even if limited by laws, patents, trolls, etc.), getting more personal and localized.
With a bit of luck, people, institutions, and governments will start to worry about its health and the ecosystem that it is, and start working on preserving it as much as the planet we live on.
Plan 9 Anyone? (Score:1, Funny)
It does sound like everything Plan 9 was trying to solve and did solve to a certain extent.
The trouble is Plan 9 was too early for its time, and it still is.
There is a larger problem too: ownership. It is clear who owns and is responsible for
individual machines. But who owns the mystical "between the machines" space?
Google? Government? United Nations? Can't pick which is worse.
Re: (Score:2)
Me of course!
Your botnet provider. ^^
Internet OS (Score:1)
P.S. For non-networking types, IOS is Cisco's OS.
Re: (Score:1, Funny)
Sounds more like he's summarizing the most popular services of the World Wide Web today, and calling all that the Information Operating System.
He missed porn.
Mobile code, redundant data (Score:2)
I think a better version of the future is to secure the PC using sandboxing and capabilities to limit the side effects of applications. This then allows you to download and run apps on your PC, without the need to trust them. You could then have redundant copies of your stuff spread across your various devices. Your stuff includes photos, videos, documents, and the code to manipulate them.
The focus on services is a result of the distortions caused by the lack of a good security model on the PC. Once that ge
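The capability idea in the parent can be sketched as: an app receives explicit handles for exactly what the user granted, and has no ambient authority to reach anything else. This is only a rough illustration of the pattern (`Capability`, `make_photo_reader`, and `untrusted_app` are invented names, and Python cannot truly confine code this way; a real sandbox needs OS or language-runtime support):

```python
# Capability pattern sketch: side effects are bounded by construction,
# because the app can only call the handles it was explicitly given.

class Capability:
    """A named, callable grant of one specific authority."""
    def __init__(self, name, func):
        self.name = name
        self._func = func

    def __call__(self, *args):
        return self._func(*args)

def make_photo_reader(photos):
    # Grants read access to the photo list and nothing else.
    return Capability("read_photos", lambda: list(photos))

def untrusted_app(caps):
    # The app uses only what it was handed; there is no ambient
    # authority like open() or sockets in its world.
    if "read_photos" in caps:
        return len(caps["read_photos"]())
    return 0

photos = ["beach.jpg", "cat.jpg"]
caps = {"read_photos": make_photo_reader(photos)}
print(untrusted_app(caps))        # granted: can count the photos -> 2
print(untrusted_app({}))          # no grant: can do nothing -> 0
```

Under this model, "download and run apps without trusting them" becomes a question of which capabilities you hand over at install time, rather than a blanket trust decision.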
Re: (Score:1)
Agreed. The idea of cloud computing is a power play to make users feel more secure given the inherent problems of (primarily) Microsoft Windows usage on the Internet.
The pitch is: "We'll do everything for you in the cloud and then it won't matter what you are running on your internet access device."
The problem with that model is that everything gets controlled by someone else. But the majority of non-technical customers do not understand how much they are giving away with that service model. They feel safer