The Uncertain Promise of Utility Computing

icke writes "A quick overview of where the Economist thinks we are with The Next Big Thing, also known as Stuff that doesn't work yet. Quoting: 'It is increasingly painful to watch Carly Fiorina, the boss of Hewlett-Packard (HP), as she tries to explain to yet another conference audience what her new grand vision of "adaptive" information technology is about. It has something to do with "Darwinian reference architectures", she suggests, and also with "modularising" and "integrating", as well as with lots of "enabling" and "processes". IBM, HP's arch-rival, is trying even harder, with a marketing splurge for what it calls "on-demand computing". Microsoft's Bill Gates talks of "seamless computing". Other vendors prefer "ubiquitous", "autonomous" or "utility" computing. Forrester Research, a consultancy, likes "organic". Gartner, a rival, opts for "real-time". Clearly, something monumental must be going on in the world of computing for these technology titans simultaneously to discover something that is so profound and yet so hard to name.'"
  • by feed_those_kitties ( 606289 ) on Friday January 16, 2004 @11:33AM (#7998057)
    Sounds a LOT like the way Enron tried to "explain" how their business worked.

    If you can't explain what you do in a way a 10 year old can understand, your business will probably fail.

    • It's kind of weird for the press to actually start asking hard questions. Think tanks like Gartner et al live and die by techno-hype. The latest thing going around in CIO-land is Utility Computing, so we'll see what comes of that.
      • The press (and a lot of investors) got burnt hard, because they never seemed to question all that talk a couple of years ago. Just another "buzz word" CEO, looking to recapture some of the glory that made her "great".
      • by Walt Dismal ( 534799 ) on Friday January 16, 2004 @12:25PM (#7998651)
        Um, Carly's having a hard time explaining what it is because she really means "utility outsourcing", but she doesn't know how to translate that from Hindi to English. But that's okay; it's not her audience's god-given right to understand her.
      • by Simonetta ( 207550 ) on Friday January 16, 2004 @12:56PM (#7999088)
        In 1994 I got a temp job (temp in the sense that they weren't hiring anyone below PhD level as permanent staff, to avoid paying benefits, but full-time in every other respect) at HP in Vancouver, Washington.
        My job was to disassemble brand-new packaged printers for rebuilding as prototypes for new models and loading the base unit CPU boards with Unix code for their prototype firmware.
        I worked in a locked warehouse room with an outdoor loading ramp and about a million dollars worth of packaged printers stacked to the ceiling.
        (They'd given me a marijuana urine test, so they knew that they could trust me, but of course no benefits, not even morning coffee.) My boss and I were the only people who had keys to this locked storage workroom.
        I put a picture of Claudia Schiffer in an evening gown on my PC desktop as wallpaper to keep from going insane in this sealed environment.
        After about three weeks, I was fired for 'creating an environment conducive to sexual harassment' over this picture of Claudia Schiffer in an evening gown.
        I can't recommend that anyone seriously consider working at Hewlett-Packard. Sooner or later their bizarre culture is going to wipe you out, regardless of how well you work or how hard you try to avoid their weird company politics.
        I'm sure that Carly's only made a bad situation worse.

        Thank you,
    • by jeffy124 ( 453342 ) on Friday January 16, 2004 @12:06PM (#7998439) Homepage Journal
      to me, they sound like all the different Microsoft execs (Ballmer, Gates, etc) trying to answer the question "What is .NET?" I know there was a business2.com article that sampled some responses, but I can't seem to find it at the moment. IIRC, one quote was along the lines of "So much of our stuff has a '.NET' label attached to it, even we don't know what it is at times."
    • by TALlama ( 462873 ) on Friday January 16, 2004 @12:08PM (#7998455) Homepage
      Examples:

      Wal-Mart: We sell everything everywhere, for cheap.
      Banks: We give money to people, and they give us more money back later.
      McDonalds: We make fast food that kids like and parents put up with.
      In-N-Out: We make fast food that everyone likes.
      Dell: We make cheap computers.
      Microsoft: We make software, and whatever else we want.
      SCO: We sue people.
      • by NanoGator ( 522640 ) on Friday January 16, 2004 @01:45PM (#7999702) Homepage Journal
        Slashdot: We promote Linux and Mozilla, and we bash Microsoft in every way we can, even if it involves writing award winning fiction.
    • Watching Slashdot readers spew on topics they know nothing about.

      Newsflash #1: Carly doesn't actually RUN anything. She's the CEO of a 150,000-person company. Asking her to explain in detail any computing architecture is like asking Arnold Schwarzenegger to explain California's budget. Yeah, it's painful. She's also not the person to look to for a good explanation.

      Newsflash #2: You won't *really* get it until it happens. Do you remember the first time you heard about the web? I was a VAX/VMS program
  • by Anonymous Coward on Friday January 16, 2004 @11:33AM (#7998058)
    damn if the lights would just stop flickering
  • by robslimo ( 587196 ) on Friday January 16, 2004 @11:35AM (#7998080) Homepage Journal
    If they can't describe it in real-world, understandable terms, it's either pseudo-marketing babble or some ethereal vapor-concept that its purveyors can't quite wrap their own minds around. In either case, they need to put up or shut up. I'm growing weary of it.

    • I am totally in agreement. All ideas that don't come out fully formed and ready to be manufactured (or at worst, ready for detailed engineering) should be kept to oneself.
    • It's just XML-based data transfers with a bunch of new names; the nebulous part comes from all the companies (IBM and MS, this is mostly you two) selling everything under this new brand name. Remember when everything from MS was .NET: "new from Microsoft, Socks.NET (they have our logo on them)". Now IBM is selling everything as onDemand, and HP is selling adaptive everything, regardless of what it used to be called or how tortured the path back to XML data transfers would be.
      It really is a cool idea, and on
      • aside from XSLT, there isn't anything really valuable about XML anyway. It looks like HTML, so maybe that's comforting. Of course it only helps humans to read it, but never mind that you need a DTD anyway to make sense of it, so it's not really convenient for humans OR computers.
        Until Microsoft, IBM, et al. spoke up and decided what DTDs and protocols they were going to use, it wouldn't help at all. In fact, you could just drop the XML and call it any old binary protocol, as long as everyone agreed on what
    • by BooRadley ( 3956 ) on Friday January 16, 2004 @12:15PM (#7998520)
      It's not all technobabble. They're trying to avoid using the word "commodity."



      They're just spinning off commodity computing as if it's the latest, greatest product offering, rather than the natural evolution of technology. Commoditization of technology has been the downfall of just about every past for-profit technology fad. What these companies and groups are doing is trying to pretend that they created the trend, for some reason. In the end, the result is still the same.

      • I wish I had mod points today, to mod this post up even more.

        This exactly describes what Oracle's doing with their "Grid" computing. They want you to shaft Sun, HP, etc., by buying super-cheap white box computers, and putting Linux on them. What they never seem to mention is that their SOFTWARE doesn't get any damned cheaper, even if the hardware is free, relatively speaking.

        Hmmmm, let's see how this works. If I buy two 4 CPU Sun Fire 480 systems at $35k each, plus a couple of smallish NetApp Filers at
  • by glenrm ( 640773 ) on Friday January 16, 2004 @11:35AM (#7998081) Homepage Journal
    Computers will be everywhere and they will all talk to each other all of the time. That is all they are talking about; however, what makes them nervous is that whoever makes this work seamlessly first will be a huge winner.
    • by transient ( 232842 ) on Friday January 16, 2004 @11:42AM (#7998163)
      they will all talk to each other all of the time

      What will they talk about?

    • The winner will be the one that provides information management at O/S level. Right now, O/Ss can do a lot of technical stuff, but they can't manage information. External apps are needed for that.

      But humans want to manage information, not an O/S. The first operating system that manages information instead of binary files will be the basis for a huge winner.
    • Candidate#1 - IBM
      Candidate#2 - The Open Source World
      Candidate#3 - Microsoft
      Candidate#4 - Government

      IBM has the giant gorilla approach with massive marketing, a broad product range, and ties to open source. They have the people and the power to get it done.

      The OSS world would have professionals getting it done (those tired of waiting on others). A programmer/hacker with an IQ of 200 is going to get it done for his company and release it into the wild. We have the man power BAR NONE.

      Microsoft will just pour i
    • by Anonymous Coward
      Nope, sorry. On-demand means that the computers can scale to cope with any load, in real-time.

      Companies will not own their hardware, but rent it. If they suddenly need 3 times as much CPU, then they get it immediately, and only pay for what they use.

      This is different than the current situation where a company must always keep enough hardware around to handle peak loads, which almost never occur. And then, if they guessed wrong, they are still screwed.

      It's really that simple, but hard to implement. IBM plans
      • by Waffle Iron ( 339739 ) on Friday January 16, 2004 @11:59AM (#7998363)
        Companies will not own their hardware, but rent it. If they suddenly need 3 times as much CPU, then they get it immediately, and only pay for what they use.

        This is different than the current situation where a company must always keep enough hardware around to handle peak loads, which almost never occur. And then, if they guessed wrong, they are still screwed.

        The problem with that scheme is that most business problems are more dependent on I/O bandwidth than on CPU crunching. Today, you can mail order a gigaflop of CPU horsepower for less than $100. Compute horsepower is not an issue.

        The problem is that if you try to ship your computing problems to some other location, you've got to get the data from your site to theirs, so you still need I/O bandwidth at your site. What's worse, now you need a high-capacity WAN link to move it to these arbitrary locations.

        You may also have massive databases of background data that need to be referenced to solve your problems. How do you handle this? Send terabytes of data offsite so that a third party can run their Opteron against it for a few minutes? Or do you install a massive Internet pipe so that they can mount your database remotely? Either choice costs more than buying your own Opteron.
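
(A rough back-of-envelope sketch of the trade-off described above, in Python; the prices, link speed and dataset size are invented for illustration, not vendor figures.)

```python
# Rough cost comparison (all numbers are illustrative assumptions): shipping a
# background dataset to a remote "utility" provider vs. buying a local server.

dataset_tb = 2.0                # terabytes of background data the job must reference
wan_mbps = 100.0                # assumed dedicated WAN link, in megabits/second
wan_cost_per_month = 5000.0     # assumed monthly cost of that link
local_server_cost = 3000.0      # assumed one-time cost of a commodity server

# 1 TB = 8e6 megabits, so transfer time in hours:
transfer_hours = dataset_tb * 8e6 / wan_mbps / 3600
print(f"Time to ship {dataset_tb} TB offsite: {transfer_hours:.1f} hours")
print(f"One year of the WAN link: ${wan_cost_per_month * 12:,.0f}")
print(f"One local server:         ${local_server_cost:,.0f}")
```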

        • by Anonymous Coward
          more dependent on I/O bandwidth than on CPU crunching

          Same deal. You suddenly need 43 database servers instead of just two, how will you cope? IBM's strategy is about EVERYTHING in a computer system, not just CPU. The hardware doesn't even live on the customer site.

          Here's how the customer sees it:

          Customer: Holy shit! We've got to process 43 times our normal data volume for the next 36 hours, starting right now! better call IBM.

          Customer: Hello, IBM, we're to be handling 43 times the transactions that we're norma
    • Computers will be everywhere and they will all talk to each other all of the time.
      What do you mean will be? There are 14 CPUs in my car, one in the iPod, at least one in the PDA, one in the cellphone, one in each of my kids' toys, the GPS has a 386 in it, the toaster's got a processor, as does the garage door opener, the inkjet printer, our hot tub...

      Need I go on?
  • Discrete projects (Score:3, Insightful)

    by BWJones ( 18351 ) * on Friday January 16, 2004 @11:36AM (#7998091) Homepage Journal
    Clearly, something monumental must be going on in the world of computing for these technology titans simultaneously to discover something that is so profound and yet so hard to name.

    Absolutely. But I don't see large-scale distributed computing or "utility computing" working in the public domain for more than a few conceptually cohesive projects (think SETI and Folding@home for publicly available projects). On the other hand, individual companies could certainly take advantage of this concept for internal projects while harnessing the computing power that many of them already have in abundance. The problem is bringing all of this computing power (desktop systems) together easily and without hassle. Software like Pooch [daugerresearch.com] and Xgrid [apple.com] is decidedly the way to go here, allowing companies to harness spare CPU cycles for anything from rendering to bioinformatics to modeling airflow or turbulence. For instance, how many computers are at organizations like Lockheed Martin? Or Genentech? Or at most universities?

    • by glinden ( 56181 ) *
      The problem with distributed ("utility") computing is that communication between the processing units is expensive (high latency, in particular), so your task needs to be divisible into many almost completely independent pieces. While some CPU-intensive tasks do fit that model (protein folding, Seti@Home, DNA analysis, some search problems, some AI problems), most don't.

      So, I'm basically agreeing with you that utility computing is applicable to only a small subset of interesting problems. A useful subs
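
(A minimal sketch of the "almost completely independent pieces" model the parent describes; the scoring function is a made-up stand-in for a CPU-heavy, self-contained chunk such as a folding or search work unit.)

```python
# Embarrassingly parallel work: each chunk is independent, so the only
# communication is one send and one receive per chunk.
from multiprocessing import Pool

def score_sequence(seq: str) -> float:
    """Stand-in for a CPU-heavy, self-contained computation."""
    return sum(ord(c) for c in seq) / len(seq)

if __name__ == "__main__":
    work_units = ["ACGTACGT", "TTGACCAA", "GGGCATCA", "AACCGGTT"]
    with Pool(processes=4) as pool:
        # chunks never need to talk to each other while running
        results = pool.map(score_sequence, work_units)
    print(results)
```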
  • by UrgleHoth ( 50415 ) on Friday January 16, 2004 @11:36AM (#7998092) Homepage
    Sounds like the standard round of buzzword bingo.
  • by krog ( 25663 ) on Friday January 16, 2004 @11:36AM (#7998093) Homepage
    She has enough money in her coffers (thanks to over 6000 layoffs translating to a $150M bonus last year) to give everyone she's ever met the finger, buy an island somewhere near the equator, and sip margaritas all day every day until she dies a miserable and lonely death.

    She knows nothing about technology, and rather little about business. She only knows how to drain money. Don't expect to see HP change the face of computing with her in the captain's chair.
  • It's simple (Score:4, Interesting)

    by rm -rf $HOME ( 738703 ) on Friday January 16, 2004 @11:36AM (#7998096)
    All that they're flogging is this: lots of intercommunicating little computers in everything. We're already about halfway there -- between the XBox, Tivo, and KISS Technology's (GPL-violating) DVD player, *normal* people are more likely than ever to have a computer connected to their television without even knowing it.
    • Re:It's simple (Score:3, Informative)

      by acramon1 ( 226153 )
      Well, I think they're flogging a bit more than just that: they're envisioning more widespread use of distributed computing. Distributed computing, according to these market leaders, will enable companies to come up with a working, marketable, and profitable way to sell computing power to other companies through "utility"-like means (think "metered", like electricity).

      As for what that means for us *normal* people, maybe it means we can opt to make an extra penny or two running an IBM branded screensaver tha
  • Clearly, something monumental must be going on in the world of computing

    no, just more crapspeak from the usual pack of rabid weasels jockeying for the best position from which to loot the citizenry.

  • by Czernobog ( 588687 ) on Friday January 16, 2004 @11:37AM (#7998102) Journal
    Because managers have taken over from engineers.
    The real problem is that if the masters of saying nothing by saying a lot, like the Economist, don't understand what these IT heavyweights are saying, there really must not be much behind the terms...

  • by Stiletto ( 12066 ) on Friday January 16, 2004 @11:38AM (#7998116)
    An appropriate term is:

    Bullshit Computing

    or maybe PADOS "Pump And Dump Our Stock" Computing
  • by Anonymous Coward on Friday January 16, 2004 @11:40AM (#7998134)
    When I started at IBM recently (posting as AC to protect the innocent), I had to attend an orientation class covering IBM's policies and so forth. One of the topics was the "E-business on demand" initiative, IBM's next big thing.

    The instructor couldn't explain it, so she brought in a marketing exec, who could only define it in terms of itself. "E-business on demand is about computing, on demand, for e-business." Sprinkle in a healthy dose of meaningless adjectives, and you get the picture.

    I'll tell you, it's pervasive. Since then, I've not found one person who can give a cohesive definition at this company. And yet, it's supposed to be my driving force and ultimate goal.

    yay.

  • On Demand from IBM (Score:3, Interesting)

    by mekkab ( 133181 ) * on Friday January 16, 2004 @11:40AM (#7998139) Homepage Journal
    I thought the core idea of IBM on-demand computing was having a box with 12 processors, but you are only paying for 4. Then, during a particularly busy time, your CPU usage goes way up - 80%. You then have the flexibility of having other CPUs "turn on" to meet the load... think of it as being able to handle a slashdotting dynamically ('cept with CPU, not bandwidth).
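
(A toy sketch of the capacity-on-demand idea described above; the 80% threshold, CPU counts and rebalancing rule are illustrative assumptions, not IBM's actual licensing mechanics.)

```python
# 12 CPUs physically installed, 4 initially paid for; extra CPUs "turn on"
# whenever utilisation of the currently licensed CPUs crosses 80%.

INSTALLED_CPUS = 12
licensed = 4

def rebalance(load_pct: float, licensed: int) -> int:
    """Activate installed-but-idle CPUs until load drops back under 80%."""
    while load_pct > 80 and licensed < INSTALLED_CPUS:
        # the same work spread over one more processor
        load_pct = load_pct * licensed / (licensed + 1)
        licensed += 1
    return licensed

for load in (40, 75, 95, 180):      # load as % of currently licensed capacity
    licensed = rebalance(load, licensed)
    print(f"load {load:>3}% -> {licensed} CPUs active (billed)")
```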
    • by Daniel Boisvert ( 143499 ) on Friday January 16, 2004 @12:08PM (#7998456)
      I'm fairly certain that used to be the case when *everybody* was running mainframe environments (not that lots of folks still aren't), but the key to the new version of this is that it'll be done over the network.

      Look at it from IBM's perspective. You can have 8 extra processors on-site for each client for those few times when they need the extra CPU, or you can have massive datacenters all over the world with a pool of extra CPU's to draw from. The latter will lead to unprecedented economies of scale as you can reassign computrons dynamically between clients to whomever needs them most, while still maintaining a comfortable cushion. Those economies of scale likely mean both lower prices for the customers as well as increased profit for IBM, because it drastically increases the efficiency of their services.

      I would be surprised if IBM was *not* working on a way to make applications portable across architectures also, and the push towards Linux on everything would seem to support this endeavour, irrespective of all the other reasons.

      Imagine buying systems capabilities instead of machines. Let's say you need gobs of CPU but not so much I/O bandwidth. Your jobs are allocated to a Power-based compute node. Let's say you need gobs of I/O bandwidth but not so much CPU. Your jobs are allocated to a zSeries machine. Now things get *really* interesting when your job first needs lots of I/O, then lots of CPU, then settles down for a bit. Your job could get reallocated across the grid based on its needs at any given moment.

      The technical end of making transfers of processes and datasets seamless is where the difficulty lies, and all of the 800lb gorillas are chomping at the bit to get it working first. The first one to do it right stands to make a fortune.

      Dan
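
(A hypothetical sketch of the placement decision described above; the node classes, resource profiles and matching rule are invented for illustration, not IBM's scheduler.)

```python
# Jobs declare a rough CPU/I/O profile and are steered to the node class
# whose strengths are the closest match.

NODE_CLASSES = {
    "compute-node": {"cpu": 0.9, "io": 0.2},   # e.g. a CPU-heavy box
    "io-node":      {"cpu": 0.3, "io": 0.9},   # e.g. an I/O-heavy box
}

def place(job: dict) -> str:
    """Pick the node class whose profile is closest to the job's demands."""
    def distance(profile: dict) -> float:
        return abs(profile["cpu"] - job["cpu"]) + abs(profile["io"] - job["io"])
    return min(NODE_CLASSES, key=lambda name: distance(NODE_CLASSES[name]))

print(place({"cpu": 0.8, "io": 0.1}))   # -> compute-node
print(place({"cpu": 0.2, "io": 0.8}))   # -> io-node
```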

  • ..."adaptive" information technology is about. It has something to do with "Darwinian reference architectures", she suggests, and also with "modularising" and "integrating", as well as with lots of "enabling" and "processes"... Microsoft's Bill Gates talks of "seamless computing". Other vendors prefer "ubiquitous", "autonomous" or "utility" computing. Forrester Research, a consultancy, likes "organic". Gartner, a rival, opts for "real-time". Clearly, something monumental must be going on in the world of co
  • by laird ( 2705 ) <lairdp@gmail.TWAINcom minus author> on Friday January 16, 2004 @11:43AM (#7998181) Journal
    I was at Thinking Machines (the company that invented massively parallel computing) a decade ago, and back then Danny Hillis talked frequently about "utility computing" -- the idea that your computations would know how to flow back to wherever they needed to be done. So you'd work on a desktop computer and the user-interactive bits would run locally, harder parts would flow back to a big CPU in the basement, and the really hard parts could flow back to a city supercomputer, in a CPU equivalent of the power grid.

    At a high level, it's a pretty simple idea, and very powerful.

    At the detailed level, there are some amazingly hard problems to solve. Like, for example, how does software get split into parts that can be separated with minimal communications overhead, or how do you decide when a task would run more efficiently spread across a bunch of CPU's, or how do you keep running smoothly when a network outage causes 10% of your CPU's to drop off of the grid. ...

    I suspect that the reason that all of the big companies are pitching this is that:

    1) CPU's and operating systems have been commoditized by Intel/AMD/etc. and Linux, and they want to have a reason for you to buy bigger/better/more expensive systems.

    2) Once one of them announced it, they all have to have a "response".

    That being said, I think that what they're doing is going to be of real value to high-end customers. If you're running a farm of 5,000 servers, you really need the software to be self-healing, etc.
    • If they invented the idea, everyone should follow their lead, that company has grown by leaps and bounds over the last decade!
    • Close, but I think a previous poster had it closer. As I understand the HP and IBM pitch (and I have seen the HP pitch), you buy a big box from them which has lots of CPU/Disk/bandwidth, and they plug the CPU/Disk/bandwidth meter into the box. You pay for the amount of CPU/disk/bandwidth you use per period.

      It is a very interesting model. Why? Well because under it, computing resources are like electricity. You pay for what you use and the kit they sell you is like the grid with all kinds of excess ca
  • The Big Thing (Score:4, Insightful)

    by Tom ( 822 ) on Friday January 16, 2004 @11:43AM (#7998183) Homepage Journal
    Clearly, something monumental must be going on in the world of computing for these technology titans simultaneously to discover something that is so profound and yet so hard to name.'"

    Absolutely. It's called saturation and we're closing in on it. So the marketing drones are in red alert to find something different to sell before the old business runs out.

    Note the keyword "different". Also note that to marketing it means something entirely... uh, different, than to you and me.
    It's a bit like C++ and C - there is a new paradigm, a new approach, and some real technical differences. A lot of books get written, some people become famous, some rich, a few both. In the end, though, 90% of what you're actually writing doesn't change. It's still "i++;" and "exit 1 /* fucking bug I can't find! */"

  • this big unknown "thing" is a laptop on two wheels that doubles up as a scooter. the big feature of this new design is that it warns the user that the battery is running down by throwing the user onto the street and crashing the harddisk.

    it'll look something like this
    ___
    I
    o-o
  • by D-Cypell ( 446534 ) on Friday January 16, 2004 @11:44AM (#7998187)
    Look at it this way... if we can't work out what a "Darwinian reference architecture" is, the Indians must be totally fucking baffled!

    • Well, Darwinian is the method of evolution, by which blind forces may make an alteration, and if that is more successful than the previous mode, then it will succeed.

      So by the use of the term 'Darwinian', would that mean that HP have now sacked anyone capable of developing a long term plan, and they are now blindly altering and testing things they already have to see if they are in some way better than they used to be, without any real understanding of what they're doing?

  • I think this is computing coming full circle. At the beginning, you paid for computing by the amount you used it. As PCs became ubiquitous, that fell by the wayside, as the accounting just seemed to be too much. Now that times are getting tight again, they are looking toward providing computing power as needed (and paying for it) as opposed to having it all on standby.

    Everything else is marketing gobbledygook.
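
(A minimal sketch of metered, pay-per-use billing as described above; the rates and usage figures are made up.)

```python
# Utility-style billing: pay for what was used each month instead of owning
# enough hardware to cover the occasional peak.

CPU_HOUR_RATE = 0.50     # assumed $ per CPU-hour
GB_MONTH_RATE = 0.10     # assumed $ per GB stored per month

usage = [
    {"month": "Jan", "cpu_hours": 120, "gb_stored": 500},
    {"month": "Feb", "cpu_hours": 900, "gb_stored": 520},   # one busy month
    {"month": "Mar", "cpu_hours": 80,  "gb_stored": 530},
]

for u in usage:
    bill = u["cpu_hours"] * CPU_HOUR_RATE + u["gb_stored"] * GB_MONTH_RATE
    print(f'{u["month"]}: ${bill:,.2f}')
```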
  • Gordian Knot (Score:4, Insightful)

    by Moeses ( 19324 ) on Friday January 16, 2004 @11:46AM (#7998215)
    "Clearly, something monumental must be going on in the world of computing for these technology titans simultaneously to discover something that is so profound and yet so hard to name.'"

    But if you have no idea what it is how can you claim it to be profound? Remember the Segway?

    Perhaps the simpler explanation is that they are making lame-brained babble about how there are lots of computers now, there are going to be even more, and they need to be easier to use? They then pick some high-falutin-sounding words that kind of describe some aspect of that as they see it.

    Just maybe?!

    Really, anything short on details and full of buzzwords probably isn't a big deal - or anything at all. Yes, there are current trends in the way computers are used that are changing. There usually are. There IS a push that people want SERVICES, not computers. They want INFORMATION, not machines. People don't want to worry about running servers and infrastructures, and they also don't want to have to deal with a lot of computery stuff to do things in their daily life like listen to music, communicate, etc.

    Nothing new here.

  • Just like the profound, paradigm-shifting, mind-blowing, earth-shattering concept that was Application Service Providers (ASPs).

    For those that don't know it, the ASP model has generally proven to be a failure, and this "new" concept seems like just another rehashing of the ASP model. But this time they are going after CPU cycles rather than just applications.
  • by Phaid ( 938 ) on Friday January 16, 2004 @11:49AM (#7998246) Homepage
    IBM, HP, etc, are already offshoring massive numbers of jobs. This makes their outsourcing business, where they get paid to manage e.g. AT&T's networks, really profitable. But the problem for them is, the small fish are getting away. Small and medium-sized businesses don't really tend to outsource their IT processes, so there's a lost opportunity. If Carly's vision is implemented and IT becomes a generic service, they'll be able to market it to these smaller organizations and really rake in the dough. It's summed up quite well in this passage:
    Some day, firms will indeed stop maintaining huge, complex and expensive computer systems that often sit idle and cannot communicate with the computers of suppliers and customers. Instead, they will outsource their computing to specialists (IBM, HP, etc) and pay for it as they use it, just as they now pay for their electricity, gas and water. As with such traditional utilities, the complexity of the supply-systems will be entirely hidden from users.
    This way, the "specialists" can offshore the whole thing, pay a bunch of Indian tech slaves peanuts to run it, and charge you a rate that's just low enough to make it seem like a great deal compared to buying your own systems and paying your own people to run them. Hooray for progress.
    • by GOD_ALMIGHTY ( 17678 ) <curt DOT johnson AT gmail DOT com> on Friday January 16, 2004 @03:52PM (#8001220) Homepage
      This is the exact model that will kill any remotely technical job in the US. You can move this model to accounting, contract review and numerous other routine professional services that businesses of all sizes use.

      IBM is currently offshoring 100 lawyers to do this, and Indians are being trained in US accounting. In the future, large service organizations like H&R Block will have tons of Indians or Chinese trained in US laws and practices; you will interface with an American account manager who hands you the reports and answers basic questions. Meanwhile, your data will be input by Americans working for around $12 an hour, the data will be shuttled off to the Indian or Chinese service centers, and the product will come back to be given to you by the account manager.

      The efficiency gains that these large businesses are getting from utilizing this model internally will be scaled and productized to appeal to small business, which will be considered a growth market. Local CPAs and a lot of the basic work that local lawyers do will be acquired more cheaply by small businesses using these large service organizations. Some of the large service orgs will partner with local service providers to gain access to their clients, the same way Intuit markets its services to CPAs.

      The problem for the average middle-class US professional is that there are not really any jobs outside of Health Services (nurses, doctors) that don't fall into this model. The problem for the country is that we can't just be people who take care of the old and sick and sell stuff. This country has to produce something, and there has to be opportunity for the middle class and those who are trying to seek entry into the middle class. Democracy and Capitalism don't function without a strong wealth-owning middle class.

      Does anyone see a solution to this problem? I haven't found one. I've been looking too. Any new industries that we'll be able to move to?
  • by twitter ( 104583 ) on Friday January 16, 2004 @11:50AM (#7998265) Homepage Journal
    Clearly, something monumental must be going on in the world of computing for these technology titans simultaneously to discover something that is so profound and yet so hard to name.

    Something profound is happening and it is hard to explain. Computing on demand, I like it. Still, it's hard for people to really get it.

    Terms don't work very well. I've told them about apt-get, dselect and aptitude, but they get lost.

    Showing them the tools in action is impressive, but they still don't get it. I've demonstrated apt-get and dselect and it's generally impressive. Changing out Exim for Sendmail flawlessly and remotely without a reboot was kind of cool and impressed several people at work once. A demonstration of dselect came tantalizingly close to clueing in my brother-in-law. At first he was unimpressed with all the software he saw listed, because he is used to music sharing programs. His eyes nearly popped out of his head when I told him that all of that software was free and intended to be by the authors. He still lacked an appreciation for the quality of the software and has yet to get it. Aptitude, while it may be easier to use and put a pretty face on the process, will have about the same result.

    No, I'm afraid that the only way people will understand that there is a vast collection of software ready to fill any and all of their computing needs is for them to use it. Free software, to me, is the ultimate computing-on-demand, co-operative, utility-type computing. The ability to demand only comes from control, and control only comes from freedom.

    The candy available from the NOT net and all the other followers of Netscape's browser and remote desktop computing are nothing in the face of free computing.

  • something to do with "Darwinian reference architectures", she suggests

    I may be wrong, but weren't Darwin's theories used by the "upper class" as an excuse for why they were better than the "lower class"? Something to the effect of "we have evolved and you have not, so we deserve all these riches and you deserve nothing." I wish I still had my history notes. In any case, veiled references to Darwin such as "Darwinian reference architectures" leave me skeptical about the person's motives.
  • Say bye-bye to old-fashioned object-oriented computing and embrace a new era of autonomous agents. Physical proactive agents will be the minds of our robots, data-mining web-harvesting soft agents are already populating the Internet, and personal agents are being developed to advise us from our handheld computers. Revolution comes, and a new era for IT is here.
  • by s.d. ( 33767 ) on Friday January 16, 2004 @11:52AM (#7998286)

    Clearly, something monumental must be going on in the world of computing for these technology titans simultaneously to discover something that is so profound and yet so hard to name.

    There's nothing monumental that's really floated to the surface yet. I work in grid computing, which itself is an amazing buzzword that everyone wants to say and no one understands (hell, I am not really sure what the purpose of what I do is).

    Everyone's grasping at straws right now, b/c when some research project actually does become useful, they want to be in front of the wave so they can ride it all the way. This is everyone throwing out made-up words in the hopes that people will like some (or at least one) of them. Around here, our made-up phrase that I love is that we are being called "the cornerstone of cyberinfrastructure." It's even been used so much that they've shortened cyberinfrastructure to "CI" in big rambling memos about our future and direction. It's sort of depressing, though, when you realize that none of this actually means anything yet. Maybe it will one day, but that's not quite here yet.

  • by jeddak ( 12628 ) on Friday January 16, 2004 @11:54AM (#7998312)
    "Clearly, something monumental must be going on in the world of computing for these technology titans simultaneously to discover something that is so profound and yet so hard to name.'"

    Yup. It's called "Bandwagon." :)
  • Clearly, something monumental must be going on in the world of computing for these technology titans simultaneously to discover something that is so profound and yet so hard to name.

    What is going on is the marketroids are harvesting what they've sown. They are a little short on ideas at the moment. They have nothing better to do, so they copy one another. Having witnessed the overindulgence in irrational exuberance and the trade of talent for third-world coding monkeys (aka offshoring), the creative peo

  • by rudy_wayne ( 414635 ) on Friday January 16, 2004 @11:55AM (#7998324)
    Carly Fiorina spews out a bunch of meaningless bafflegab and everyone just nods their head. Once again we see that nobody learned anything from the story of The Emperor's New Clothes.

  • We had the JIT (just in time) manufacturing wave hit our plant about 10 years ago. When people want processes to run faster, but can't get them to do so, they come up with names for new technologies that should solve their problems -- before developing the technologies.

    Don't worry if you miss this current trend, there will be new names for working faster next year.
    • We had a similar experience around that time. I, on the other hand, came up with the "Managed Information Access [Secured] System".

      Ever since then I've been pulling things out of MI-ASS.

      People seem to love it...
  • Clearly, something monumental must be going on in the world of computing for these technology titans simultaneously to discover something that is so profound and yet so hard to name.

    The marketing departments of these multi-national corporations all simultaneously decided to invent a new, improved, generic computing paradigm/infrastructure/idea/program/something else. And each of these companies is fighting to define what it is the marketing department is marketing.

    Ahh, to work in a large corporation!


  • Clearly, something monumental must be going on in the world of computing for these technology titans simultaneously to discover something that is so profound and yet so hard to name.

    Yeah. They all bet the farm on maintaining exponential growth forever, without asking anyone who knew math if it was possible. Now they are tap dancing.

    The sound you hear is just a high speed mixture of marketspeak, fear, and tap shoes.

    -- MarkusQ

  • I know why (Score:5, Funny)

    by vorwerk ( 543034 ) on Friday January 16, 2004 @12:06PM (#7998438)
    It's so hard to name because these companies all lack the synergistic, results-driven leverage that will incentivize their paradigm shift.
    • by Walterk ( 124748 ) <slashdot@@@dublet...org> on Friday January 16, 2004 @12:23PM (#7998632) Homepage Journal
      Marketing lesson #1: The synergy of the result driven leverage can never incentivize a paradigm shift.

      But remember: ubiquitous autonomy of enabling processes in a modularizing environment, synergetically adapting and integrating to provide on-demand seamless real-time organic utility computing, is extremely vital in the face of Darwinian reference architectures.
  • Behind the glass (Score:3, Insightful)

    by IGnatius T Foobar ( 4328 ) on Friday January 16, 2004 @12:15PM (#7998519) Homepage Journal
    Applications moving behind the glass. Any application accessible from any location, without having to load it on "your" computer first. Basically, it's a return to mainframe-like computing, but without the green screens.

    Well-designed hosting environments can make this happen. Portable API's such as those available in Unix/Linux and in Java help make it happen, and help make the apps relocatable. Truly transparent network filesystems like NFS allow for application and server load balancing. Transparent graphics systems like X11 help make the apps truly independent of the display they're viewed on -- applications moving to the Web is a big piece, too.

    This was the original vision of "network computing" and it's still a good idea -- it's still being worked on and there are places where it's being deployed. The reason why the original McNealy/Ellison vision of network computing failed is because they required everyone to move exclusively to pure Java applications. In reality, most environments can't make that big of a move that quickly.

    So what we're seeing is a gradual shift of applications off the desktop and back into the data center. For the time being, most users are still using a fat PC to access them, but IT organizations will wake up one morning and suddenly realize that everything has moved behind the glass and they really are in a utility computing environment. If they've done it right, they will then be able to move applications and storage resources around the data center without an impact on the users. This is the promise of utility computing and it's a good idea.

    And for organizations that don't want the expense of running their own data center, they can enlist the services of a hosting company [xand.com] that specializes in this type of thing -- IT keeps control of its applications, while someone else keeps the air conditioners, UPS's, and routers running.
  • AI (Score:3, Insightful)

    by LuxFX ( 220822 ) on Friday January 16, 2004 @12:20PM (#7998578) Homepage Journal
    The reason why it's so hard to name is because everybody is scared of the phrase "Artificial Intelligence." (read: everybody == investors) 'AI' used to be one of those buzzwords like 'convergence' but no longer. After a while it turned into this impossibility and the term 'AI' turned into a serious no-no when you make a presentation to an investor.

    But that's just what all of these sound like! "Darwinian reference architectures" sounds like a system that learns using a genetic algorithm. "autonomous" and "organic" are even more descriptive. But everybody is just dancing around the real issue so they don't scare off anybody.
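
(A toy genetic algorithm, as one guess at what "Darwinian" might be gesturing at: blind variation plus selection; the target value, population size and mutation range are arbitrary.)

```python
# Random mutation + "survival of the fittest" selection, nothing planned ahead.
import random

TARGET = 42

def fitness(x: int) -> int:
    return -abs(TARGET - x)          # closer to the target is fitter

population = [random.randint(0, 100) for _ in range(20)]
for generation in range(30):
    survivors = sorted(population, key=fitness, reverse=True)[:5]
    # each survivor produces four mutated offspring
    population = [s + random.randint(-3, 3) for s in survivors for _ in range(4)]

print(sorted(population, key=fitness, reverse=True)[0])   # usually lands near 42
```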
  • by ExistentialFeline ( 696559 ) on Friday January 16, 2004 @12:21PM (#7998590)
    I think that a lot of people are missing that there are two concepts here. The first is grid computing, which is as far as I understand being able to offload processing to multiple computers. The second is ubiquitous computing, which is being able to use computers anywhere you want and access data anywhere you want in a natural fashion such that you're not even thinking about the fact that you're using a computer. See this google-cached page for an example [216.239.41.104]. The two may be used together but are not dependent on each other.
  • by Moderation abuser ( 184013 ) on Friday January 16, 2004 @12:30PM (#7998722)
    Honestly, it isn't difficult. You just need a load of machines and a way to manage the distribution of jobs, and that's been possible for decades.

    I have an architecture in place which can scale pretty much linearly from 10 concurrent users to 1000 concurrent users and probably beyond just by adding boxes, completely transparently and with spectacularly little administrative effort.

    Stop thinking of computers as individual machines, they are really just little blocks in the whole, treat them as such.

    Oh, and we haven't spent a penny on setting up the system, making use of older kit, so it's cheap, scalable, easy to manage, highly available, fast, etc etc. Am I going to tell you how to do it? Am I buggery; you'll be able to buy such a system from a web site near you soon. Any administrators with a bit of imagination, a few years of experience and a penchant for infrastructures.org could come up with a similar system fairly easily, but thankfully they are few and far between.
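
(A minimal sketch of the "just add boxes" dispatch idea the parent alludes to; the round-robin rule and node names are assumptions, not the poster's actual system.)

```python
# Spread incoming jobs round-robin over however many machines are in the pool;
# scaling out is just appending another machine to the list.
from itertools import cycle

def dispatch(jobs, machines):
    assignment = {m: [] for m in machines}
    ring = cycle(machines)
    for job in jobs:
        assignment[next(ring)].append(job)
    return assignment

machines = [f"node{i:02d}" for i in range(4)]   # add entries here to scale out
jobs = [f"job-{n}" for n in range(10)]
for machine, assigned in dispatch(jobs, machines).items():
    print(machine, assigned)
```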

  • Bleh! (Score:3, Insightful)

    by Trolling4Dollars ( 627073 ) on Friday January 16, 2004 @12:31PM (#7998734) Journal
    Just more business crap. I think what it really means is that yet another industry is about to fall victim to the "monthly bill" business model.

    This is akin to the RIAA realizing that everyone else has moved to subscription services except them. That pay-once, play-forever model wasn't sitting well with them, hence DRM and all its associated ills were born.

    The thing that will keep this kind of computing a pipe dream for now is bandwidth. I think you'd be hard pressed to find a company or a regular "Joe User" willing to give up the power they have in a machine they own outright just so they can pay a monthly bill. It's the same reason why so many idiots are happy with a big powerful PC on their desktops even if all they ever do is browse the web and do some word processing. Once the bandwidth is such that you have super fast connections to centralized processing, then this kind of thing might take off.

    I'm doing it at home in a fashion with a terminal server and some wireless X terms. Instead of having to have fully outfitted PCs in every room, I just have one big honkin' nasty box that does everything I need it to. It's a file server, web server, mail server, DNS, application server, print server, streaming music server, IM server, etc... I put all the money and resources into this box and then everything else is just a glorified GUI dumb terminal. So far, no problems between my wife and me when we simultaneously run normal processes (or even some of my heavier ones). But I've got the bandwidth here at home: 100Mb wired to every machine except the wireless terminals. Until we get at least that kind of speed dedicated to every node on the net, this stuff isn't really going to happen.

    I remember the come on for this crap that we got where I work when it was Compaq preaching this stuff. My boss and I looked at it and laughed. Sounds like another NT... Not Today.
  • by ChaosMt ( 84630 ) on Friday January 16, 2004 @12:45PM (#7998958) Homepage
    There's been lots of good talk about market saturation (cpu, db & os) scaring marketing drones, the desire to lower labor costs (outsourcing), software getting smarter and so on... The thing I haven't seen mentioned is how the real commodity will be networking and the need for shared information (storage). I don't think the next big thing will really be about CPU. I think this is going to be more about information "discovery", and that's the next big wave for which we should be prepared. More CPU cycles are used for collecting, saving, distributing and presenting information than for actually "doing" things with a system (such as interactive entertainment, analysis, etc). Credit agencies make far more than ANY ASP EVER has.

    If the network is everywhere, easy and economic, outsourcing storage is perfect. Outsourcing CPU takes much more work (as we've pointed out). What will become profitable is not what you can do with a computer but, through the Internet, what you can know through a computer. It will become VERY profitable to make masses of information meaningful (information discovery). Take for example Google. Or how about Big Brother... er, I mean Tom Ridge's TSA/Homeland Stupidity initiative to link your grades, credit score and medical records together to determine if you're a terrorist. Yes, it's awful, but this is what the powers that be want. And it's what you want, judging from the popularity of Google.

    They are expecting networking to get better and better to make this happen, so that information and its software become more interesting. The problem is politics. The FCC and the various state public utility commissions are all bribed by big telecom, and have NO interest in doing something innovative that might help their citizens and break the business monopoly of build it, sit on your rear, make money. You want to know who will? China. No infrastructure in place, slave labor, easy government bribery... it's the perfect business growth environment.

    Ya, the network will make it happen, but the pessimist in me says it's not going to be here.
  • by K-Man ( 4117 ) on Friday January 16, 2004 @12:56PM (#7999102)

    time-sharing: 1. Computing The automatic sharing of processor time so that a computer can serve several users or devices concurrently, rapidly switching between them so that each user has the impression of continuous exclusive use.

    (Oxford English Dictionary)
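
(A sketch of the dictionary definition above: a round-robin scheduler that gives each user a short quantum in turn, so each sees apparently continuous service; the quantum and workloads are arbitrary.)

```python
from collections import deque

def time_share(jobs: dict, quantum: int = 2) -> None:
    """jobs maps user -> units of work left; run each for one quantum in turn."""
    queue = deque(jobs.items())
    while queue:
        user, remaining = queue.popleft()
        ran = min(quantum, remaining)
        print(f"{user} runs for {ran} ticks")
        if remaining - ran > 0:
            queue.append((user, remaining - ran))

time_share({"alice": 5, "bob": 3, "carol": 4})
```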

  • by mrogers ( 85392 ) on Friday January 16, 2004 @01:06PM (#7999245)
    From the article:

    Irving Wladawsky-Berger, an in-house guru at IBM, pictures an ambulance delivering an unconscious patient to a random hospital. The doctors go online and get the patient's data (medical history, drug allergies, etc), which happens to be stored on the computer of a clinic on the other side of the world. They upload their scans of the patient on to the network and crunch the data with the processing power of thousands of remote computers-not just the little machine which is all that the hospital itself can nowadays afford.

    This "guru"'s story is so unrealistic that it's downright dishonest. First, how is the patient identified among the millions of medical records in this miraculous database? The patient must be carrying some kind of identity card, so why not embed his/her medical records in the card instead of putting them online where they are exposed to hackers? (Of course it's still possible for someone to steal a smartcard, but at least it requires a separate attack on each patient rather than a single attack on the entire database.)

    Second, how do the doctors authenticate themselves, or is everyone allowed to browse and update the medical records? These are doctors at a "random hospital", so in order to help this patient they must have access to the medical records of everyone in the country. Every doctor has access to every patient's records - great, what happens when one doctor's smartcard goes missing? The entire database is compromised. Again, the only sensible option is to keep each patient's data on a separate smartcard (with an offline backup in case the card is lost). The 'grid' is not the solution here.

    Finally, we have the touching story of The Little Computer That Could - the hospital's computer is too slow to crunch the data on its own so it makes use of idle cycles donated by other computers. This completely misses the point of utility computing, which is to make it possible to buy and sell computing resources. If grid computing ever becomes widespread, all those idle CPU cycles will become a commodity and you will have to pay for them. Perhaps some philanthropic souls will donate cycles to the hospital for free, but they're just as likely to donate a real computer - the idea that the 'grid' solves the problem of equipment shortages is absurd.

  • Next Big Thing (Score:3, Insightful)

    by Anonymous Coward on Friday January 16, 2004 @01:17PM (#7999380)
    The Next Big Thing will not come from large initiatives at HP, Microsoft, IBM, or any big business. If the history of computers teaches us anything, it's that great innovations arise from small groups or unexpected places, with people trying to solve real-world problems, not just trying to find anything new to sell someone. See UNIX, the WWW, and desktop computers, just to start. Also, very few people, especially the big execs, will see it coming.
  • Old, bad idea (Score:3, Insightful)

    by Animats ( 122034 ) on Friday January 16, 2004 @01:50PM (#7999763) Homepage
    "Grid computing" is a dumb idea for a very simple reason.

    CPUs are cheap.

    Once upon a time, computers were really expensive. Control Data Corporation proposed designs in the 1960s with one supercomputer (of about 5 MIPS power) per metropolitan area. They went on to build time-sharing data centers, and for a decade or so, it was a viable business. Back then, when a CPU cost over a million dollars, time-sharing made economic sense. It hasn't been that way for a long time. A very long time.

    It's notable that there's little enthusiasm for "grid computing" from the businesses best positioned to provide it - hosting services. They have the right infrastructure in place. If they wanted to sell number-crunching power during off-peak periods, they could. But nobody wants that service.

    The ASP business is a disaster. The biggest player, Corio, has had its stock price decline from 25 to 3 over the last three years. Their revenue is declining, and they're losing money. Many smaller ASPs have gone bankrupt, often leaving their customers in desperate straits. There are risks to outsourcing key business functions.

    The real trend in business computing is "buy once, run forever". That's what "utility computing" is really about. How often do you replace your power transformer? The real push for Linux comes from businesses that hate Microsoft's "buy once, pay forever" plan, "Software Assurance".

  • by taradfong ( 311185 ) * on Friday January 16, 2004 @03:15PM (#8000770) Homepage Journal
    If PCs continue to live in a world of their own among consumer products, utility computing will become 'the answer' to its own problems.

    I mean, today, I buy any other piece of consumer electronics, I plug it in, and I use it. It breaks, I throw it out.

    With a PC, I have this thing that needs to be maintained, occasionally turned on and off, needs to be asked permission before being turned off, becomes useless when its OS gets EOL'd, has software from dozens of companies on it, and still has stone-age-level means of really assessing/changing how it's configured. It's a big load on a consumer's patience and requires much more skill to really safely wield than all but a few geeks possess. (Aside: I think this is one reason MS will be surprised at how fast Linux catches on, because the extra ease of use of MS is eclipsed by the 'you can fix anything, there are no dead ends' attribute of Linux.) Plus, more and more our PCs hold valuable content (your baby photos, your music library).

    So... eventually, if someone instead offers a cheap, indestructible, maintenance-free terminal and leaves the ugly issues of data storage, backup, application upgrades, virus definitions, and more to be handled for you remotely somewhere, and if it's done cleanly over a super fast connection, I think this idea will take off, because consumers will value convenience over the flexibility and pain of essentially being a one-man IT department for their own house.
  • by mveloso ( 325617 ) on Friday January 16, 2004 @04:13PM (#8001460)
    The whole COD brouhaha is driven by the same things that drove electricity generation back in the day, namely, that today getting incremental computing power is expensive, time-consuming, error-prone, and hard to manage.

    In the old days (and today for some really big shops), everyone generated their own electricity - they had to. Either that, or they bought it from local collectives. As you can imagine, that was relatively expensive and way inefficient. If you needed a few thousand kw more than your generators could produce, well, you'd have to buy new generators.

    Well heck, why not use some kw from your neighbor? Well you can, but the interconnect cost is high, as are the risks. What happens if you overload your neighbor's generator? Both of you are hosed. For your neighbor, the incremental benefit of selling you their excess electricity is far outweighed by the downside of a total loss of electrical power. D'oh!

    Back then it might have been called "electricity on demand." As much electricity as you needed, when you needed it, on a metered basis. Hey, you don't have to worry about your electricity needs anymore. And by leveraging electricity generation across a region, the total price is orders of magnitude less than what you would pay on your own. A no-brainer, and something with benefits so great that local governments gave monopolies to local power companies so they'd build out their infrastructures.

    Fast-forward to now, and COD is a major problem. No sane computer vendor wants to become a commodity like electricity, except...IBM. Only IBM has the scope to survive computing commoditization, because it believes its boxes are what's going to be at the end of that data cable snaking into your (or someone else's) business.

    Face it, nobody except geeks really cares how stuff happens on computers, just that it happens quickly, reliably, and as expected, three things that most IT departments are mostly incapable of doing. Why not let IBM do it?

    Right now there are a bunch of things to work out, like management, uptime, performance, and getting internal apps on hosted systems, stuff like that. It's the annoying management and administration stuff that's bogging everything down. But this is more than outsourcing, this is outsourcing to the next level.

    Think about it. Why does every business need their own accounting program? They don't, not really. How about for payroll? HR? Inventory? Email? They don't. They might like to think they do, but realistically speaking if accounting software adheres to GAAP they'll live with it. If they can customize reports, they'll be fine. Same with everything else.

    It would save millions (or billions) of dollars if the world were like this. Why have 5000 instances of PeopleSoft running all over the US, when they basically do the same thing in the same way, with minimal customization? Etc, etc.

    That's the promise of CoD - getting rid of your IT department completely. IT is generally the worst-performing, least responsive part of any business. Let it be handled by pros, instead of the yokels you've got. And you'll save money to boot.
