The Uncertain Promise of Utility Computing
icke writes "A quick overview of where the Economist thinks we are with The Next Big Thing, also known as Stuff that doesn't work yet. Quoting: 'It is increasingly painful to watch Carly Fiorina, the boss of Hewlett-Packard (HP), as she tries to explain to yet another conference audience what her new grand vision of "adaptive" information technology is about. It has something to do with "Darwinian reference architectures", she suggests, and also with "modularising" and "integrating", as well as with lots of "enabling" and "processes". IBM, HP's arch rival, is trying even harder, with a marketing splurge for what it calls "on-demand computing". Microsoft's Bill Gates talks of "seamless computing". Other vendors prefer "ubiquitous", "autonomous" or "utility" computing. Forrester Research, a consultancy, likes "organic". Gartner, a rival, opts for "real-time". Clearly, something monumental must be going on in the world of computing for these technology titans simultaneously to discover something that is so profound and yet so hard to name.'"
Carly's explanations (Score:5, Insightful)
If you can't explain what you do in a way a 10-year-old can understand, your business will probably fail.
Re:Carly's explanations (Score:3, Insightful)
Carly - "...[let's party like it's 1999]..." (Score:2)
Re:Carly's explanations (Score:5, Funny)
HP is a weird place (Score:5, Insightful)
My job was to disassemble brand-new packaged printers for rebuilding as prototypes for new models, and to load the base-unit CPU boards with Unix code for their prototype firmware.
I worked in a locked warehouse room with an outdoor loading ramp and about a million dollars worth of packaged printers stacked to the ceiling.
(They'd given me a marijuana urine test, so they knew that they could trust me, but of course there were no benefits, not even morning coffee.) My boss and I were the only people who had keys to this locked storage workroom.
I put a picture of Claudia Schiffer in an evening gown on my PC desktop as wallpaper to keep from going insane in this sealed environment.
After about three weeks, I was fired for 'creating an environment conducive to sexual harassment' over this picture of Claudia Schiffer in an evening gown.
I can't recommend that anyone seriously consider working at Hewlett-Packard. Sooner or later their bizarre culture is going to wipe you out, regardless of how well you work or how hard you try to avoid their weird company politics.
I'm sure that Carly's only made a bad situation worse.
Thank you,
Re:HP is a weird place (Score:3, Interesting)
Thank you for taking the time to reply. I posted the photo on a PC that was in a room locked to everyone except me and my boss. It was not an accessible workplace. My boss, who was 15 years younger than me, had me tossed out of the company without review or comment.
It was only a
Re:Carly's explanations (Score:5, Interesting)
Re:Carly's explanations (Score:5, Funny)
Wal-Mart: We sell everything everywhere, for cheap.
Banks: We give money to people, and they give us more money back later.
McDonald's: We make fast food that kids like and parents put up with.
In-N-Out: We make fast food that everyone likes.
Dell: We make cheap computers.
Microsoft: We make software, and whatever else we want.
SCO: We sue people.
Re:Carly's explanations (Score:4, Funny)
Microsoft and Slashdot dislike (Score:3, Insightful)
That's not actually what Carly Says (Score:4, Funny)
We automate stuff.
Well, yes, that's what all I.T. departments do. HP doesn't even do it particularly well. That's why they need to say that they enable adaptive cross-platform solutions for process-centric business applications. I used to facilitate reliable time-sensitive information distribution services, because it wasn't that impressive to just have a paper route.
I'll tell you what's "painful"... (Score:3, Insightful)
Newsflash #1: Carly doesn't actually RUN anything. She's the CEO of a 150,000-person company. Asking her to explain any computing architecture in detail is like asking Arnold Schwarzenegger to explain California's budget. Yeah, it's painful. She's also not the person to look to for a good explanation.
Newsflash #2: You won't *really* get it until it happens. Do you remember the first time you heard about the web? I was a VAX/VMS program
utility computing (Score:3, Funny)
Profane, not profound. (Score:3, Insightful)
Re:Profane, not profound. (Score:2, Insightful)
Re:Profane, not profound. (Score:2, Informative)
It really is a cool idea, and on
What's funny is... (Score:3, Insightful)
Until Microsoft, IBM, et al. spoke up and decided what DTDs and protocols they were going to use, it wouldn't help at all. In fact, you could just drop the XML and call it any old binary protocol, as long as everyone agreed on what
Re:Profane, not profound. (Score:5, Insightful)
They're just spinning off commodity computing as if it's the latest, greatest product offering, rather than the natural evolution of technology. Commoditization of technology has been the downfall of just about every past for-profit technology fad. What these companies and groups are doing is trying to pretend that they created the trend, for some reason. In the end, the result is still the same.
Re:Profane, not profound. (Score:3, Interesting)
This exactly describes what Oracle's doing with their "Grid" computing. They want you to shaft Sun, HP, etc., by buying super-cheap white box computers, and putting Linux on them. What they never seem to mention is that their SOFTWARE doesn't get any damned cheaper, even if the hardware is free, relatively speaking.
Hmmmm, let's see how this works. If I buy two 4 CPU Sun Fire 480 systems at $35k each, plus a couple of smallish NetApp Filers at
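The licensing math behind this point is easy to sketch. A rough Python sketch, where the Sun price comes from the post and every other figure is an invented placeholder:

    # Back-of-the-envelope version of the parent's point: cheap grid
    # hardware doesn't help much when the database is licensed per CPU.
    SUN_BOX_PRICE = 35_000     # 4-CPU Sun Fire 480, per the parent post
    SUN_BOXES = 2
    WHITEBOX_PRICE = 3_000     # hypothetical cheap 2-CPU Linux white box
    WHITEBOX_COUNT = 4         # same 8 CPUs in total
    LICENSE_PER_CPU = 40_000   # hypothetical per-CPU database license fee

    def cost(boxes, box_price, cpus_per_box):
        hardware = boxes * box_price
        software = boxes * cpus_per_box * LICENSE_PER_CPU
        return hardware, software

    for label, (hw, sw) in [("big iron", cost(SUN_BOXES, SUN_BOX_PRICE, 4)),
                            ("white boxes", cost(WHITEBOX_COUNT, WHITEBOX_PRICE, 2))]:
        print(f"{label:>11}: hardware ${hw:,}, software ${sw:,}, total ${hw + sw:,}")

    # Hardware drops from $70,000 to $12,000, but the $320,000 license
    # bill is unchanged -- which is the complaint in a nutshell.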
Computers will be everywhere (Score:5, Insightful)
Re:Computers will be everywhere (Score:5, Funny)
What will they talk about?
Re:Computers will be everywhere (Score:2)
Re:Computers will be everywhere (Score:2)
Oh, you know...stuff, and things. And stuff.
Re:Computers will be everywhere (Score:2)
the best way to enslave humanity.
Re:Computers will be everywhere (Score:4, Funny)
Re:Computers will be everywhere (Score:5, Funny)
If it's like computers nowadays, it will be porn.
Huge winner = information management at O/S level (Score:3, Interesting)
But humans want to manage information, not an O/S. The first operating system that manages information instead of binary files will be the basis for a huge winner.
Re:Computers will be everywhere (Score:2)
Candidate #2 - The Open Source World
Candidate #3 - Microsoft
Candidate #4 - Government
IBM has the giant gorilla approach with massive marketing, a broad product range, and ties to open source. They have the people and the power to get it done.
The OSS world would have professionals getting it done (those tired of waiting on others). A programmer/hacker with an IQ of 200 is going to get it done for his company and release it into the wild. We have the manpower BAR NONE.
Microsoft will just pour i
Re:Computers will be everywhere (Score:3, Informative)
Companies will not own their hardware, but rent it. If they suddenly need 3 times as much CPU, then they get it immediately, and only pay for what they use.
This is different from the current situation, where a company must always keep enough hardware around to handle peak loads that almost never occur. And then, if they guessed wrong, they are still screwed.
It's really that simple, but hard to implement. IBM plans
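The rent-vs-own point reduces to simple arithmetic. A toy Python model, with all prices invented for illustration:

    OWNED_SERVER = 10_000    # hypothetical purchase price per server
    RENT_PER_HOUR = 2.0      # hypothetical on-demand rate, $/server-hour

    baseline = 2             # servers needed for normal load
    peak = 6                 # servers needed during the rare spike
    peak_hours = 200         # hours per year the spike actually occurs

    own_for_peak = peak * OWNED_SERVER
    rent_the_burst = (baseline * OWNED_SERVER
                      + (peak - baseline) * RENT_PER_HOUR * peak_hours)

    print(f"own enough for the peak:  ${own_for_peak:,}")
    print(f"own baseline, rent burst: ${rent_the_burst:,.0f}")
    # $60,000 versus $21,600 -- and the gap widens the rarer the peak is.
    # Guess the peak wrong when you own, and you pay the difference again.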
Re:Computers will be everywhere (Score:4, Insightful)
This is different from the current situation, where a company must always keep enough hardware around to handle peak loads that almost never occur. And then, if they guessed wrong, they are still screwed.
The problem with that scheme is that most business problems are more dependent on I/O bandwidth than on CPU crunching. Today, you can mail order a gigaflop of CPU horsepower for less than $100. Compute horsepower is not an issue.
The problem is that if you try to ship your computing problems to some other location, you've got to get the data from your site to theirs, so you still need I/O bandwidth at your site. What's worse, now you need a high-capacity WAN link to move it to these arbitrary locations.
You may also have massive databases of background data that need to be referenced to solve your problems. How do you handle this? Send terabytes of data offsite so that a third party can run their Opteron against it for a few minutes? Or do you install a massive Internet pipe so that they can mount your database remotely? Either choice costs more than buying your own Opteron.
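Rough numbers bear this out. A quick Python sketch, assuming a 2 TB dataset, a T3-class WAN link, and the sub-$100 gigaflop mentioned above (all figures illustrative):

    DATASET_BYTES = 2e12    # 2 TB of background data (assumed)
    WAN_BPS = 45e6          # T3-class link, ~45 Mbit/s (assumed)
    LOCAL_FLOPS = 1e9       # ~1 gigaflop of cheap local CPU
    OPS_PER_BYTE = 10       # assumed work per byte of data

    transfer_days = DATASET_BYTES * 8 / WAN_BPS / 86_400
    local_hours = DATASET_BYTES * OPS_PER_BYTE / LOCAL_FLOPS / 3_600

    print(f"shipping the data out: {transfer_days:.1f} days on the wire")
    print(f"crunching it locally:  {local_hours:.1f} hours")
    # ~4 days of WAN transfer versus ~6 hours of local CPU: for data-heavy
    # jobs, the pipe, not the processor, is what the grid has to beat.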
Re:Computers will be everywhere (Score:2, Informative)
Same deal. You suddenly need 43 database servers instead of just two, how will you cope? IBM's strategy is about EVERYTHING in a computer system, not just CPU. The hardware doesn't even live on the customer site.
Here's how the customer sees it:
Customer: Holy shit! We've got to process 43 times our normal data volume for the next 36 hours, starting right now! Better call IBM.
Customer: Hello, IBM, we're going to be handling 43 times the transactions that we're norma
Re:Computers will be everywhere (Score:3, Insightful)
Need I go on?
Re:Computers will be everywhere (Score:4, Funny)
Re: (Score:2)
Re:Computers will be everywhere (Score:3, Insightful)
But I still assert that, for the most part, they haven't a clue how to leverage that into a product or service
It isn't a product or service; it's an architecture: a way of connecting machines together so that they can offload some of their processing, or seamlessly access processes (for example, a print house printing 5,000 customer notification letters).
No one pays much for those ideas. They pay for tangible products that come in a box.
Look at dial up networ
Discrete projects (Score:3, Insightful)
Absolutely. But I don't see large-scale distributed computing or "utility computing" working in the public domain for more than a few conceptually cohesive projects (think SETI and Folding@home for publicly available projects). On the other hand, individual companies could certainly take advantage of this concept for internal projects while harnessing the computing power that many of them already have in abundance. The problem is bringing all of this computing power (desktop systems) together easily and without hassle. Software like Pooch [daugerresearch.com] and Xgrid [apple.com] is decidedly the way to go here, allowing companies to harness spare CPU cycles for anything from rendering to bioinformatics to modeling airflow or turbulence. For instance, how many computers are at organizations like Lockheed Martin? Or Genentech? Or at most universities?
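As a minimal sketch of the spare-cycles idea, standing in for what a Pooch- or Xgrid-style system would do across a LAN (this is not their actual API), a local process pool can play the role of the idle desktops:

    from concurrent.futures import ProcessPoolExecutor

    def render_frame(frame: int) -> str:
        """Stand-in for an expensive, embarrassingly parallel job
        (a render, a protein fold, an airflow cell)."""
        checksum = sum(i * i for i in range(100_000))
        return f"frame {frame} done ({checksum})"

    if __name__ == "__main__":
        # Each worker plays the role of one idle desktop on the network.
        with ProcessPoolExecutor(max_workers=4) as pool:
            for line in pool.map(render_frame, range(16)):
                print(line)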
Re:Discrete projects (Score:3, Insightful)
So, I'm basically agreeing with you that utility computing is applicable to only a small subset of interesting problems. A useful subs
"Organic," Grab your shovels (Score:3, Insightful)
Carly Fiorina doesn't give a shit about anything (Score:3, Informative)
She knows nothing about technology, and rather little about business. She only knows how to drain money. Don't expect to see HP change the face of computing with her in the captain's chair.
It's simple (Score:4, Interesting)
Re:It's simple (Score:3, Informative)
As for what that means for us *normal* people, maybe it means we can opt to make an extra penny or two running an IBM branded screensaver tha
Clearly, something monumental must be excreted... (Score:2)
No, just more crapspeak from the usual pack of rabid weasels jockeying for the best position from which to loot the citizenry.
Well. It's Marketing-Speak. (Score:3, Insightful)
The real problem is that if the masters of saying nothing by saying a lot, like the Economist, don't understand what these IT heavyweights are saying, there must not really be much behind the terms...
Good Word for it (Score:4, Funny)
Bullshit Computing
or maybe PADOS "Pump And Dump Our Stock" Computing
The unexplainable e-business on demand (Score:5, Informative)
The instructor couldn't explain it, so she brought in a marketing exec, who could only define it in terms of itself. "E-business on demand is about computing, on demand, for e-business." Sprinkle in a healthy dose of meaningless adjectives, and you get the picture.
I'll tell you, it's pervasive. Since then, I've not found one person who can give a cohesive definition at this company. And yet, it's supposed to be my driving force and ultimate goal.
yay.
Re:The unexplainable e-business on demand (Score:2)
It seems to me the really "big things" were driven by new abilities suddenly being given to large numbers of people. Witness the PC (computing power for the masses), computer games (amazing fun new technology available to everyone), and the web (point-and-click internet for the masses). Heck, even the porn industry - in my day it took work to get your pron, sonny. I don't think it's possible to decide what the "next big thing" will be and then go out and create it, but I'm reasonably certain that some
I know what your problem is: (Score:2, Funny)
Re:The unexplainable e-business on demand (Score:4, Informative)
On Demand from IBM (Score:3, Interesting)
Re:On Demand from IBM (Score:5, Interesting)
Look at it from IBM's perspective. You can have 8 extra processors on-site for each client for those few times when they need the extra CPU, or you can have massive datacenters all over the world with a pool of extra CPUs to draw from. The latter will lead to unprecedented economies of scale, as you can reassign computrons dynamically between clients to whoever needs them most, while still maintaining a comfortable cushion. Those economies of scale likely mean both lower prices for the customers and increased profit for IBM, because it drastically increases the efficiency of their services.
I would be surprised if IBM was *not* working on a way to make applications portable across architectures also, and the push towards Linux on everything would seem to support this endeavour, irrespective of all the other reasons.
Imagine buying systems capabilities instead of machines. Let's say you need gobs of CPU but not so much I/O bandwidth. Your jobs are allocated to a Power-based compute node. Let's say you need gobs of I/O bandwidth but not so much CPU. Your jobs are allocated to a zSeries machine. Now things get *really* interesting when your job first needs lots of I/O, then lots of CPU, then settles down for a bit. Your job could get reallocated across the grid based on its needs at any given moment.
The technical end of making transfers of processes and datasets seamless is where the difficulty lies, and all of the 800 lb gorillas are chomping at the bit to get it working first. The first one to do it right stands to make a fortune.
Dan
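The allocation idea described above can be caricatured in a few lines of Python. The node names and the matching rule are invented for illustration, not taken from IBM:

    from dataclasses import dataclass

    @dataclass
    class Phase:
        name: str
        cpu_need: float   # 0..1: how compute-hungry this phase is
        io_need: float    # 0..1: how I/O-hungry this phase is

    def allocate(p: Phase) -> str:
        if p.cpu_need > p.io_need:
            return "power_compute_node"   # CPU-bound work to a Power-style box
        if p.io_need > p.cpu_need:
            return "zseries_io_node"      # I/O-bound work to a zSeries-style box
        return "general_node"             # balanced or idle

    job = [Phase("load dataset", 0.2, 0.9),
           Phase("crunch numbers", 0.9, 0.1),
           Phase("write results", 0.3, 0.8)]

    # Re-run allocate() as the job's profile changes, and the work
    # migrates around the grid, as the comment above describes.
    for phase in job:
        print(f"{phase.name:>14} -> {allocate(phase)}")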
Re:On Demand from IBM (Score:2)
Biz Lingo (Score:2)
Not a new idea, but a good one (Score:5, Informative)
At a high level, it's a pretty simple idea, and very powerful.
At the detailed level, there are some amazingly hard problems to solve. Like, for example: how does software get split into parts that can be separated with minimal communications overhead, how do you decide when a task would run more efficiently spread across a bunch of CPUs, and how do you keep running smoothly when a network outage causes 10% of your CPUs to drop off the grid?
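The middle problem - deciding when spreading a task pays off - reduces, in its simplest form, to trading parallel speedup against communication overhead. A Python sketch with made-up costs:

    def runtime(work_secs: float, n: int, comm_secs_per_node: float) -> float:
        """Idealized: perfect linear speedup plus per-node comm overhead."""
        return work_secs / n + comm_secs_per_node * n

    def best_node_count(work_secs, comm_secs_per_node, max_nodes=64):
        return min(range(1, max_nodes + 1),
                   key=lambda n: runtime(work_secs, n, comm_secs_per_node))

    print(best_node_count(600, 10))    # chatty task: stays at ~8 nodes
    print(best_node_count(600, 0.1))   # compute-heavy task: maxes out at 64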
I suspect that the reason that all of the big companies are pitching this is that:
1) CPUs and operating systems have been commoditized by Intel/AMD/etc. and Linux, and they want to have a reason for you to buy bigger/better/more expensive systems.
2) Once one of them announced it, they all have to have a "response".
That being said, I think that what they're doing is going to be of real value to high-end customers. If you're running a farm of 5,000 servers, you really need the software to be self-healing, etc.
Re:Thinking Machines (Score:2)
Re:Not a new idea, but a good one (Score:2)
It is a very interesting model. Why? Well, because under it, computing resources are like electricity. You pay for what you use, and the kit they sell you is like the grid, with all kinds of excess ca
The Big Thing (Score:4, Insightful)
Absolutely. It's called saturation and we're closing in on it. So the marketing drones are in red alert to find something different to sell before the old business runs out.
Note the keyword "different". Also note that to marketing it means something entirely... uh, different, than it does to you and me.
It's a bit like C++ and C - there is a new paradigm, a new approach, and some real technical differences. A lot of books get written, some people become famous, some rich, a few both. In the end, though, 90% of what you're actually writing doesn't change. It's still "i++;" and "exit 1
something tells me (Score:2)
it'll look something like this
___
I
o-o
This is great news for software developers... (Score:5, Funny)
Re:This is great news for software developers... (Score:2, Insightful)
So by the use of the term 'Darwinian', would that mean that HP have now sacked anyone capable of developing a long term plan, and they are now blindly altering and testing things they already have to see if they are in some way better than they used to be, without any real understanding of what they're doing?
What is old is new again... (Score:2, Insightful)
Everything else is marketing gobbledygook.
Gordian Knot (Score:4, Insightful)
But if you have no idea what it is, how can you claim it to be profound? Remember the Segway?
Perhaps the simpler explanation is that they are making lame-brained babble about how there are lots of computers now, there are going to be even more, and they need to be easier to use? They then pick some highfalutin-sounding words that kind of describe some aspect of that as they see it.
Just maybe?!
Really, anything short on details and full of buzzwords probably isn't a big deal - or anything at all. Yes, there are current trends that are changing the way computers are used. There usually are. There IS a push that people want SERVICES, not computers. They want INFORMATION, not machines. People don't want to worry about running servers and infrastructures, and they also don't want to have to deal with a lot of computery stuff to do things in their daily life like listening to music, communicating, etc.
Nothing new here.
Profound indeed. (Score:2)
For those that don't know it, the ASP model has generally proven to be a failure, and this "new" concept seems like just another rehashing of the ASP model. But this time they are going after CPU cycles rather than just applications.
Hand in hand with offshoring (Score:5, Insightful)
Re:Hand in hand with offshoring (Score:4, Insightful)
IBM is currently offshoring 100 lawyers to do this, and Indians are being trained in US accounting. In the future, large service organizations like H&R Block will have tons of Indians or Chinese trained in US laws and practices; you will interface with an American account manager who hands you the reports and answers basic questions. Meanwhile, your data will be input by Americans working for around $12 an hour, the data will be shuttled off to the Indian or Chinese service centers, and the product will come back to be given to you by the account manager.
The efficiency gains that these large businesses are getting from using this model internally will be scaled and productized to appeal to small business, which will be considered a growth market. Local CPAs, and a lot of the basic work that local lawyers do, will be acquired more cheaply by small businesses using these large service organizations. Some of the large service orgs will partner with local service providers to gain access to their clients, the same way Intuit markets its services to CPAs.
The problem for the average middle-class US professional is that there are not really any jobs outside of health services (nurses, doctors) that don't fall into this model. The problem for the country is that we can't just be people who take care of the old and sick and sell stuff. This country has to produce something, and there has to be opportunity for the middle class and those who are trying to seek entry into the middle class. Democracy and capitalism don't function without a strong wealth-owning middle class.
Does anyone see a solution to this problem? I haven't found one. I've been looking too. Any new industries that we'll be able to move to?
Only one way to get it. (Score:3, Insightful)
Something profound is happening and it is hard to explain. Computing on demand, I like it. Still, it's hard for people to really get it.
Terms don't work very well. I've told them about apt-get, dselect and aptitude, but they get lost.
Showing them the tools in action is impressive, but they still don't get it. I've demonstrated apt-get and dselect, and it's generally impressive. Changing out Exim for Sendmail flawlessly and remotely, without a reboot, was kind of cool and impressed several people at work once. A demonstration of dselect came tantalizingly close to clueing in my brother-in-law. At first he was unimpressed with all the software he saw listed, because he is used to music-sharing programs. His eyes nearly popped out of his head when I told him that all of that software was free, and intended to be by the authors. He still lacks an appreciation for the quality of the software and has yet to get it. Aptitude, while it may be easier to use and put a pretty face on the process, will have about the same result.
No, I'm afraid that the only way people will understand that there is a vast collection of software ready to fill any and all of their computing needs is for them to use it. Free software, to me, is the ultimate computing-on-demand, co-operative, utility-type computing. The ability to demand only comes from control, and control only comes from freedom.
The candy available from the NOT net and all the other followers of Netscape's browser and remote desktop computing is nothing in the face of free computing.
Darwin (Score:2)
I may be wrong, but weren't Darwin's theories used by the "upper class" as an excuse for why they are better than the "lower class"? Something to the effect of "we have evolved and you have not, so we deserve all these riches and you deserve nothing." I wish I still had my history notes. In any case, veiled references to Darwin such as this "Darwinian reference architectures" have left me skeptical about the person's motives.
Agents, Agents Everywhere (Score:2, Insightful)
Nothing monumental yet... (Score:5, Insightful)
Clearly, something monumental must be going on in the world of computing for these technology titans simultaneously to discover something that is so profound and yet so hard to name.
There's nothing monumental that's really floated to the surface yet. I work in grid computing, which itself is an amazing buzzword that everyone wants to say and no one understands (hell, I am not really sure what the purpose of what I do is).
Everyone's grasping at straws right now, b/c when some research project actually does become useful, they want to be in front of the wave so they can ride it all the way. This is everyone throwing out made-up words in the hopes that people will like some (or at least one) of them. Around here, our made-up phrase that I love is that we are being called "the cornerstone of cyberinfrastructure." It's even been used so much that they've shortened cyberinfrastructure to "CI" in big rambling memos about our future and direction. It's sort of depressing, though, when you realize that none of this actually means anything yet. Maybe it will one day, but that's not quite here yet.
Reason for Simultaneous Discovery (Score:4, Insightful)
Yup. It's called "Bandwagon."
Lack of innovation (Score:2, Insightful)
What is going on is that the marketroids are harvesting what they've sown. They are a little short on ideas at the moment. They have nothing better to do, so they copy one another. Having witnessed the overindulgence in irrational exuberance and the trade of talent for third-world coding monkeys (aka offshoring), the creative peo
Where do I apply for a job like that? (Score:5, Insightful)
JIT (Just In Time) Computing (Score:2, Insightful)
Don't worry if you miss this current trend, there will be new names for working faster next year.
You too? (Score:2)
Ever since then I've been pulling things out of MI-ASS.
People seem to love it...
I think I know what it is... (Score:2)
The marketing departments of these multi-national corporations all simultaneously decided to invent a new, improved, generic computing paradigm/infrastructure/idea/program/something else. And each of these companies is fighting to define what it is the marketing department is marketing.
Ahh, to work in a large corporation!
It's called fear (Score:2)
Clearly, something monumental must be going on in the world of computing for these technology titans simultaneously to discover something that is so profound and yet so hard to name.
Yeah. They all bet the farm on maintaining exponential growth forever, without asking anyone who knew math if it was possible. Now they are tap dancing.
The sound you hear is just a high speed mixture of marketspeak, fear, and tap shoes.
-- MarkusQ
I know why (Score:5, Funny)
Re:I know why (Score:4, Funny)
But remember: ubiquitous autonomy of enabling processes in a modularizing environment, synergetically adapting and integrating to provide on-demand seamless real-time organic utility computing, is extremely vital in the face of Darwinian reference architectures.
Behind the glass (Score:3, Insightful)
Well-designed hosting environments can make this happen. Portable APIs such as those available in Unix/Linux and in Java help make it happen, and help make the apps relocatable. Truly transparent network filesystems like NFS allow for application and server load balancing. Transparent graphics systems like X11 help make the apps truly independent of the display they're viewed on -- applications moving to the Web is a big piece, too.
This was the original vision of "network computing" and it's still a good idea -- it's still being worked on and there are places where it's being deployed. The reason why the original McNealy/Ellison vision of network computing failed is because they required everyone to move exclusively to pure Java applications. In reality, most environments can't make that big of a move that quickly.
So what we're seeing is a gradual shift of applications off the desktop and back into the data center. For the time being, most users are still using a fat PC to access them, but IT organizations will wake up one morning and suddenly realize that everything has moved behind the glass and they really are in a utility computing environment. If they've done it right, they will then be able to move applications and storage resources around the data center without an impact on the users. This is the promise of utility computing and it's a good idea.
And organizations that don't want the expense of running their own data center can enlist the services of a hosting company [xand.com] that specializes in this type of thing -- IT keeps control of its applications, while someone else keeps the air conditioners, UPSes, and routers running.
AI (Score:3, Insightful)
But that's just what all of these sound like! "Darwinian reference architectures" sounds like a system that learns using a genetic algorithm. "autonomous" and "organic" are even more descriptive. But everybody is just dancing around the real issue so they don't scare off anybody.
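For anyone who hasn't seen one, a genetic algorithm really is that simple an idea. A bare-bones Python example, evolving bit strings toward all ones as a stand-in for evolving a configuration against a fitness measure:

    import random

    GENES, POP, GENERATIONS, MUTATION = 20, 30, 40, 0.02

    def fitness(ind):                 # count of 1-bits: the "score"
        return sum(ind)

    def mutate(ind):                  # flip each bit with small probability
        return [bit ^ (random.random() < MUTATION) for bit in ind]

    def crossover(a, b):              # single-point crossover
        cut = random.randrange(1, GENES)
        return a[:cut] + b[cut:]

    population = [[random.randint(0, 1) for _ in range(GENES)]
                  for _ in range(POP)]

    for _ in range(GENERATIONS):
        population.sort(key=fitness, reverse=True)
        parents = population[:POP // 2]   # selection: keep the fittest half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(POP - len(parents))]
        population = parents + children

    print("best:", fitness(max(population, key=fitness)), "of", GENES)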
Grid Computing != Ubiquitous Computing (Score:3, Informative)
I'm doing this already. (Score:3, Insightful)
I have an architecture in place which can scale pretty much linearly from 10 concurrent users to 1000 concurrent users and probably beyond just by adding boxes, completely transparently and with spectacularly little administrative effort.
Stop thinking of computers as individual machines, they are really just little blocks in the whole, treat them as such.
Oh, and we haven't spent a penny on setting up the system, making use of older kit, so it's cheap, scalable, easy to manage, highly available, fast, etc. etc. Am I going to tell you how to do it? Am I buggery - you'll be able to buy such a system from a web site near you soon. Any administrator with a bit of imagination, a few years of experience, and a penchant for infrastructures.org could come up with a similar system fairly easily, but thankfully they are few and far between.
Bleh! (Score:3, Insightful)
This is akin to the RIAA realizing that everyone else has moved to subscription services except them. That pay-once, play-forever model wasn't sitting well with them, hence DRM and all its associated ills were born.
The thing that will keep this kind of computing a pipe dream for now is bandwidth. I think you'd be hard pressed to find a company, or a regular "Joe User", willing to give up the power they have in machines they own outright just so they can pay a monthly bill. It's the same reason why so many idiots are happy with a big powerful PC on their desktops even if all they ever do is browse the web and do some word processing. Once the bandwidth is such that you have super fast connections to centralized processing, then this kind of thing might take off.
I'm doing it at home in a fashion, with a terminal server and some wireless X terms. Instead of having to have fully outfitted PCs in every room, I just have one big honkin' nasty box that does everything I need it to. It's a file server, web server, mail server, DNS, application server, print server, streaming music server, IM server, etc... I put all the money and resources into this box, and then everything else is just a glorified GUI dumb terminal. So far, no problems between my wife and me when we simultaneously run normal processes (or even some of my heavier ones). But I've got the bandwidth here at home: 100Mb wired to every machine except the wireless terminals. Until we get at least that kind of speed dedicated to every node on the net, this stuff isn't really going to happen.
I remember the come on for this crap that we got where I work when it was Compaq preaching this stuff. My boss and I looked at it and laughed. Sounds like another NT... Not Today.
The Network Is the Computer (Score:3, Interesting)
If the network is everywhere, easy, and economic, outsourcing storage is perfect. Outsourcing CPU takes much more work (as we've pointed out). What will become profitable is not what you can do with a computer but, through the internet as it is now, what you can know through a computer. It will become VERY profitable to make masses of information meaningful (information discovery). Take, for example, Google. Or how about big brother... er, I mean Tom Ridge's TSA/Homeland Stupidity initiative to link your grades, credit score, and medical records together to determine if you're a terrorist. Yes, it's awful, but this is what the powers that be want. And it's what you want, judging from the popularity of Google.
They are expecting networking to get better and better to make this happen, so that information and its software are more interesting. The problem is politics. The FCC and the various state public utility commissions are all bribed by big telecom, and have NO interest in doing something innovative that might help its citizens and break the business monopoly of build it, sit on your rear, make money. You want to know who will? China. No infrastructure in place, slave labor, easy government bribery... it's the perfect business-growth environment.
Ya, the network will make it happen, but the pessimist in me says it's not going to be here.
Also known as "Timesharing" (Score:3, Informative)
time-sharing: 1. Computing The automatic sharing of processor time so that a computer can serve several users or devices concurrently, rapidly switching between them so that each user has the impression of continuous exclusive use.
(Oxford English Dictionary)
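That definition maps directly onto a round-robin scheduler. A miniature Python version, with generators standing in for user processes:

    from collections import deque

    def job(name, steps):
        for i in range(steps):
            yield f"{name}: step {i + 1} of {steps}"

    def round_robin(jobs, quantum=2):
        """Give each job `quantum` steps, then switch -- switch fast
        enough and every user feels they have the machine to themselves."""
        queue = deque(jobs)
        while queue:
            current = queue.popleft()
            for _ in range(quantum):
                try:
                    print(next(current))
                except StopIteration:
                    break
            else:
                queue.append(current)   # not done yet: back of the line

    round_robin([job("alice", 3), job("bob", 5), job("carol", 2)])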
More 'grid computing' nonsense (Score:5, Insightful)
Irving Wladawsky-Berger, an in-house guru at IBM, pictures an ambulance delivering an unconscious patient to a random hospital. The doctors go online and get the patient's data (medical history, drug allergies, etc), which happens to be stored on the computer of a clinic on the other side of the world. They upload their scans of the patient onto the network and crunch the data with the processing power of thousands of remote computers - not just the little machine which is all that the hospital itself can nowadays afford.
This "guru"'s story is so unrealistic that it's downright dishonest. First, how is the patient identified among the millions of medical records in this miraculous database? The patient must be carrying some kind of identity card, so why not embed his/her medical records in the card instead of putting them online where they are exposed to hackers? (Of course it's still possible for someone to steal a smartcard, but at least it requires a separate attack on each patient rather than a single attack on the entire database.)
Second, how do the doctors authenticate themselves, or is everyone allowed to browse and update the medical records? These are doctors at a "random hospital", so in order to help this patient they must have access to the medical records of everyone in the country. Every doctor has access to every patient's records - great, what happens when one doctor's smartcard goes missing? The entire database is compromised. Again, the only sensible option is to keep each patient's data on a separate smartcard (with an offline backup in case the card is lost). The 'grid' is not the solution here.
Finally, we have the touching story of The Little Computer That Could - the hospital's computer is too slow to crunch the data on its own so it makes use of idle cycles donated by other computers. This completely misses the point of utility computing, which is to make it possible to buy and sell computing resources. If grid computing ever becomes widespread, all those idle CPU cycles will become a commodity and you will have to pay for them. Perhaps some philanthropic souls will donate cycles to the hospital for free, but they're just as likely to donate a real computer - the idea that the 'grid' solves the problem of equipment shortages is absurd.
Next Big Thing (Score:3, Insightful)
Old, bad idea (Score:3, Insightful)
CPUs are cheap.
Once upon a time, computers were really expensive. Control Data Corporation proposed designs in the 1960s with one supercomputer (of about 5 MIPS power) per metropolitan area. They went on to build time-sharing data centers, and for a decade or so, it was a viable business. Back then, when a CPU cost over a million dollars, time-sharing made economic sense. It hasn't been that way for a long time. A very long time.
It's notable that there's little enthusiasm for "grid computing" from the businesses best positioned to provide it - hosting services. They have the right infrastructure in place. If they wanted to sell number-crunching power during off-peak periods, they could. But nobody wants that service.
The ASP business is a disaster. The biggest player, Corio, has had its stock price decline from 25 to 3 over the last three years. Their revenue is declining, and they're losing money. Many smaller ASPs have gone bankrupt, often leaving their customers in desperate straits. There are risks to outsourcing key business functions.
The real trend in business computing is "buy once, run forever". That's what "utility computing" is really about. How often do you replace your power transformer? The real push for Linux comes from businesses that hate Microsoft's "buy once, pay forever" plan, "Software Assurance".
It all hinges on quality (Score:4, Insightful)
I mean, today, I buy any other piece of consumer electronics, I plug it in, and I use it. It breaks, I throw it out.
With a PC, I have this thing that needs to be maintained, occasionally turned on and off, needs to be asked permission to be turned off, becomes useless when its OS gets EOL'd, has software from dozens of companies on it, and still has stone-age-level means of really assessing/changing how it's configured. It's a big load on a consumer's patience and requires much more skill to really safely wield than all but a few geeks possess. (Aside: I think this is one reason MS will be surprised at how fast Linux catches on, because the extra ease of use of MS is eclipsed by the 'you can fix anything, there are no dead ends' attribute of Linux.) Plus, more and more our PCs hold valuable content (your baby photos, your music library).
So... eventually, if someone instead offers a cheap, indestructible, maintenance-free terminal and leaves the ugly issues of data storage, backup, application upgrades, virus definitions, and more to be handled for you remotely somewhere, and if it is done cleanly over a super fast connection, I think this idea will take off, because consumers will value convenience over the flexibility and pain of essentially being a one-man IT department for your own house.
Computing on Demand: CODpiece? (Score:3, Insightful)
In the old days (and today for some really big shops), everyone generated their own electricity - they had to. Either that, or they bought it from local collectives. As you can imagine, that was relatively expensive and way inefficient. If you needed a few thousand kW more than your generators could produce, well, you'd have to buy new generators.
Well heck, why not use some kW from your neighbor? You can, but the interconnect cost is high, as are the risks. What happens if you overload your neighbor's generator? Both of you are hosed. For your neighbor, the incremental benefit of selling you their excess electricity is far outweighed by the downside of a total loss of all electrical power. Doh!
Back then it might have been called "electricity on demand": as much electricity as you needed, when you needed it, on a metered basis. Hey, you don't have to worry about your electricity needs anymore. And by leveraging electricity generation across a region, the total price is magnitudes less than what you would pay on your own. A no-brainer, and something with benefits so great that local governments gave monopolies to local power companies so they'd build out their infrastructures.
Fast-forward to now, and COD is a major problem. No sane computer vendor wants to become a commodity like electricity, except...IBM. Only IBM has the scope to survive computing commoditization, because it believes its boxes are what's going to be at the end of that data cable snaking into your (or someone else's) business.
Face it, nobody except geeks really cares how stuff happens on computers, just that it happens quickly, reliably, and as expected, three things that most IT departments are mostly incapable of doing. Why not let IBM do it?
Right now there are a bunch of things to work out, like management, uptime, performance, and getting internal apps on hosted systems, stuff like that. It's the annoying management and administration stuff that's bogging everything down. But this is more than outsourcing, this is outsourcing to the next level.
Think about it. Why does every business need their own accounting program? They don't, not really. How about for payroll? HR? Inventory? Email? They don't. They might like to think they do, but realistically speaking if accounting software adheres to GAAP they'll live with it. If they can customize reports, they'll be fine. Same with everything else.
It would save millions (or billions) of dollars if the world were like this. Why have 5,000 instances of PeopleSoft running all over the US, when they basically do the same thing in the same way, with minimal customization? Etc, etc.
That's the promise of COD - getting rid of your IT department completely. IT is generally the worst-performing, least responsive part of any business. Let it be handled by pros instead of the yokels you've got. And you'll save money to boot.
Re:Carly Fiorina (Score:2, Insightful)
Re:Carly Fiorina (Score:2, Funny)
Re:Carly Fiorina (Score:4, Insightful)
As an AC below me suggested, it is precisely this behavior which might see her head roll from a guillotine someday.
* oh, did I say "little"? I meant $150,000,000, or about $25,000 for every employee she put out of a job in order for HP to "remain competitive" (her words, not mine).
Re:Carly Fiorina (Score:3, Insightful)
Layoffs suck. I've been there, I've been unemployed in this crappy mar
Re:Carly Fiorina (Score:3, Informative)
http://www.theinquirer.ne
Re:Carly Fiorina (Score:3, Interesting)
However, HP/Compaq had numerous reports of pretty awful problems combining things. The press, at least, represented it as one of the messier major mergers in recent history.
gender equality (Score:3, Insightful)
Obviously, you believe the lies spread by political hacks who get paid by perpetuating the myth that a bias against women still exists in corporate America.
1. Harassment laws and corporate policies favor women over men.
2. Diversity training always includes training on how we should all be sensitive to women, but never has any training on how everyone should be sensitive to men.
Re:Carly Fiorina (Score:5, Insightful)
I think giving women an equal chance is great, but if they are going to do all the same boneheaded, greedy crap that men do, why bother?
Re:Carly Fiorina (Score:4, Insightful)
Re:Carly Fiorina (Score:3, Interesting)
Re:Carly Fiorina (Score:2)
Btw, these are not fake. They are real, and I think she deserves a lot of credit for pulling this one off.