The Internet

Multiple Experts Try Defining "Cloud Computing"

jg21 writes "Even though IBM's Irving Wladawsky-Berger reports a leading analyst as having said recently that 'There is a clear consensus that there is no real consensus on what cloud computing is,' here are no fewer than twenty attempts at a definition of the infrastructural paradigm shift that is sweeping across the Enterprise IT world — some of them really quite good. From the article: 'Cloud computing is...the user-friendly version of grid computing.' (Trevor Doerksen) and 'Cloud computing really is accessing resources and services needed to perform functions with dynamically changing needs. An application or service developer requests access from the cloud rather than a specific endpoint or named resource.' (Kevin Hartig)"
  • by Mean Variance ( 913229 ) <mean.variance@gmail.com> on Thursday July 17, 2008 @05:25PM (#24234823)
    • Web 2.0
    • .NET
    • I think this presents a cloudier issue.
  • by gmuslera ( 3436 ) on Thursday July 17, 2008 @05:27PM (#24234841) Homepage Journal
    ... is mainly water vapor.

    OK, unless we speak about software, where it is mainly vaporware.
    • So, either they are cloudy or hazy... Or, they are all trying to think of a way to play misty for the cloud... (Play Misty for Me...)

    • by sm62704 ( 957197 )

      Erm, I hate the term "cloud computing". But a meaningless buzzword that started life on a hastily drawn chart has actually gotten some meaning. I wish that the less intelligent and less creative people would stop coining new words; blog and blogosphere come to mind. But Wikipedia actually says [wikipedia.org]

      Cloud computing refers to computing resources being accessed which are typically owned and operated by a third-party provider on a consolidated basis in Data Center locations. Consumers of cloud computing services pur

      • Uncyclopedia needs your help.
        • by sm62704 ( 957197 )

          I did in fact add two lines to the entry on slashdot. [uncyclopedia.org]

          "In Soviet Russia, slashdot trolls YUO!." ~ Russian Reversal on Slashdot

          "On the streets these days, a dime bag of kittens costs a pretty penny." ~ Oscar Wilde on Slashdot's "offtopic" moderation

          They actually stayed without anybody messing with them. I thought for sure someone would change "yuo" to "you", but was happily surprised.

  • WTF is Software 10.0? I must have missed the previous nine versions...
    • be happy you missed FORTRAN, COBOL, assembly, and punch cards.

      oh, and enjoy Java and C#

      • Re: (Score:3, Interesting)

        i didn't just miss assembly. i still do.

        it's a shame no one gives assembly the respect it deserves.

        • by lymond01 ( 314120 ) on Thursday July 17, 2008 @05:59PM (#24235147)

          Assembly? I wish we were so advanced. We have to do all our modeling in real-time. Though there's something terribly satisfying when your final calculations are complete and the giant mousetrap falls atop the little rodent...

        • by Chemisor ( 97276 )

          > it's a shame no one gives assembly the respect it deserves.

          That's because aside from you, me, and a dozen other people, nobody knows assembly any more. Today's generation lives in the world of the 500M web application [thedailywtf.com].

          • Re: (Score:3, Informative)

            Maybe in the IT world, but in the electronics world (at least in the UK) it's still taught as micros. 6502 assembly is still used quite extensively in teaching here. Civilian industry may like Java and so on, but those of us working on systems that require proven reliability and standards conformity still need to know it.

            • You can program assembler for the Java virtual machine as well. See project Jasmin [sourceforge.net] if interested.
            • by KGIII ( 973947 )
              I'm NOT a guru by any means, just a person who had the right skillset at the right time. But I have often wondered why malware authors didn't target a smaller audience and write in assembler/machine code?
          • by colmore ( 56499 )

            You know there's a lot more to programming than webdev and business logic.

            Embedded and realtime systems are enormously important and employ a lot of programmers. Not to mention that things like compilers, drivers, and operating systems are still important and heavily developed.

            Treated as a single language, assembler is perennially among the top ten languages.

        • "i didn't just miss assembly. i still do.

          it's a shame no one gives assembly the respect it deserves."

          The problem is, someone needs to come up with something similar to C but much easier to use, which comes with very good tutorials and is worked on by many people, like, say, Python.

          The biggest barrier to assembly, to my mind, is the lack of compelling demonstrations of why to use it. I think what needs to be done is... remember the demo "scene", where people did demos like Second Reality?

          http://www.youtube.com/watch?v=XtCW-axRJV8 [youtube.com]

          Stuf

  • What they mean is (Score:5, Insightful)

    by Scareduck ( 177470 ) on Thursday July 17, 2008 @05:32PM (#24234883) Homepage Journal
    buzzword-compliant computing. I hate stories like this, which are really just cover for somebody's marketing.
    • Re: (Score:3, Funny)

      I don't mean to rain on your parade, but there has been a thunderous demand for buzzwords that truly represent the crystallization of otherwise cloudy ideas.
      • Cloud crystallization? You mean Snow 1.0? As in, "we sure snowed the customer re: the necessity of paying for that upgrade"? Sorry for the cloudburst, but marketers have been snowing (both customers and each other) since marketing began...
  • by Wowsers ( 1151731 )

    Cloud computing is a privacy destroyer. That's my definition.

  • It seems that it's kind of clouded what cloud computing really means.
    • Re: (Score:2, Informative)

      by Tablizer ( 95088 )

      At least the name does not pretend to be clear.

      My pet definition is resources that can be allocated to different departments, divisions, and users as needed rather than the "box-per-department" model that is common now. In other words, as-needed allocation.
                     

      • Re:nebulous (Score:5, Funny)

        by Red Flayer ( 890720 ) on Thursday July 17, 2008 @06:18PM (#24235309) Journal

        My pet definition is resources that can be allocated to different departments, divisions, and users as needed rather than the "box-per-department" model that is common now. In other words, as-needed allocation.

        I think you're looking at it from the wrong perspective -- one needs to look at it from the application's perspective, not the system's perspective. The "cloud" represents the resources needed to perform a task -- it's an abstraction used to represent resource acquisition, not resource allocation.

        In practice, though, you're pretty close to the truth. Instead of having an allocated set of computers for processing a group's tasks, they can draw from the cloud, which is available to multiple groups. As your computing needs grow, you can have the Cloud take over another computer, which reduces the number of computing resources, but increases the power of the Cloud. This has the advantage of reducing single points of failure, and more efficiently allocating computing resources. Say you start with 100 Macs... as each Mac is subsumed by the MacCloud, the MacCloud grows in strength. Eventually, there can be only one.

        Sorry.

  • Gives Wired and other mags yet another buzzword topic to claim is newfangled and great when really it's just a new paint job on an idea that has been around for decades. But no, really, it's a paradigm shift, we SWEAAAR. Bleh.

    • Re: (Score:3, Informative)

      by jacquesm ( 154384 )

      Why mod this a troll? He's quite right, actually; there have been quite a few instances of virtualization and scalable computing facilities in the last 20 years. Think transputers, Thinking Machines, and some even older.

      The only difference between then and now is the level of ease-of-deployment. Basically anybody can do it now, whereas in the past you'd have to have a pretty serious budget.

    • by ch-chuck ( 9622 )

      You could say that the cultural imperative and demand for new innovative inventions and technological progress is so great that, even when nothing new comes along, people have to make a to-do about something.

    • Yea uhm.. this whole buzzword sounds strangely like.. oh i dunno.. many terminals connected to a unix system circa 1980..?
  • by MR.Mic ( 937158 )

    What is Cloud Computing - Video

    Summary -
    "At the Web 2.0 Expo, we asked Tim O'Reilly, Dan Farber, Matt Mullenweg, Jay Cross, Brian Solis, Kevin Marks, Steve Gillmor, Jeremy Tanner, Maggie Fox, Tom McGovern, Sam Lawrence, Stowe Boyd, David Tebbutt, Dave McClure, Chris Carfi, Vamshi Krishna and Rod Boothby the same question: "What is Cloud Computing?". Here's what we got. (more)"

    http://www.youtube.com/watch?v=6PNuQHUiV3Q [youtube.com]

  • by ibanezist00 ( 1306467 ) on Thursday July 17, 2008 @05:40PM (#24234965)
    It's obviously the latest Web 2.0 .NET technology-based user-driven blogging paradigm that gives the blogosphere the synergy for cloud-based dynamic content platforms!

    /business-mode
    • But is it customer centric?
    • * Ostensibly, the dynamic upside of this new process innovation goes beyond the ubiquitous expansion of existing consolidation synergies for cloud computing consumers.
      * It also represents a radical reduction in physical footprint, power consumption and management headaches, err challenges in replacing and disposing old servers in a constant cycle.
      * This eco-friendly initiative is a solid platform to establish PR campaigns.
      * It further maximizes up-time through cloud redundancy, and fewer hardware upgrade cy

    • Re: (Score:2, Interesting)

      by Tablizer ( 95088 )

      You've been poking around the Dilbert Buzzword Generator, haven't you?

      http://www.unitedmedia.com/comics/dilbert/games/career/bin/ms.cgi [unitedmedia.com]

      Samples:

      It's our responsibility to continually provide access to low-risk high-yield benefits and collaboratively administrate economically sound materials while promoting personal employee growth

      It's our responsibility to authoritatively negotiate market-driven technology so that we may conveniently build low-risk high-yield opportunities to stay competitive in tomorrow's w

  • Shouldn't the title read 'Multiple experts try to define "Cloud Computing"'?

  • Joyent, the best web host I've ever used, recently wrote an extended piece attempting to define cloud computing [joyeur.com]. They introduced what they call a "CloudScore", and rated themselves as 7/9 on it. Interesting read.
  • Keep it away from the Silver Iodide [wikipedia.org]
  • buzz words (Score:4, Informative)

    by jacquesm ( 154384 ) <j@wwAUDEN.com minus poet> on Thursday July 17, 2008 @05:45PM (#24235015) Homepage

    It's interesting that a fairly large number of these guys refer to the term itself as a buzz word.

    I think cloud computing is less of a buzzword than most, but I really think that most of these definitions miss the biggest difference: with cloud computing you outsource *all* your hardware. So, any application where you never have to talk about which physical piece of hardware the software runs on is cloud computing to me.

    • by Tablizer ( 95088 ) on Thursday July 17, 2008 @05:51PM (#24235081) Journal

      With cloud computing you outsource *all* your hardware.

      My firm practices "cloud staffing" then.
           

    • Re: (Score:2, Informative)

      by Aloisius ( 1294796 )
      So how is cloud computing different than the old model of renting time on mainframes/supercomputers?

      Maybe IBM was right. Maybe there will be only 5 computers in the whole world in the future...
      • related to but not exactly the same, since you still *could* know where the data was being processed and it mattered. Now you *can't* know where the data is being processed or stored and nobody cares, as long as it works. You can't 'grasp' a real life cloud any more than that you can grasp the hardware that your software now runs on. In fact you probably have no idea of the underlying hardware at all, it could be anything. In the case of that mainframe/supercomputer you probably had to jump through quite a

    • Re: (Score:3, Insightful)

      by verbamour ( 1308787 )

      In regular computing, you don't know what's being done.

      Cloud computing is the same, except that you don't know where it's being done either.

    • by leenks ( 906881 )

      That's one definition of cloud computing - but another says that it doesn't always have to be outsourced hardware, e.g. it can quite easily be a bunch of stuff in your company computer halls.

    • With cloud computing you outsource *all* your hardware. So, any application where you are not physically talking about what software runs on which piece of hardware is cloud computing to me.

      Yeah, I think my impression was that it was sort of like what used to be called "utility computing". The idea being that someone else has taken care of the hardware and resource allocation, and maybe even the OS. If you're setting up an OS, you're doing it on a virtual machine or something, so you aren't really worried about setting things up, supporting them, what's running where, what kind of hardware it's running on, etc.

      The reason you call it a "cloud" is because it's amorphous. When I feed multiple

    • So, in other words, it's like the shared mainframe/minicomputer model that we got rid of in favor of desktop PCs, then?

    • Interesting concept.

      But I am very much a concrete thinker, not an abstractionist. So for me, "cloud computing" is what a company is moving toward when it recognizes that putting its data in one of Google's data storage facilities is both more secure and less expensive than continuing to manage security, backups/restores, and so forth, in house. I've got no idea how widespread this practice is as yet, but it seems like a natural and rational extension of co-location practices. Sort of like how businesses i

  • If cloud computing has bugs, it's Fart Computing.

  • by fpgaprogrammer ( 1086859 ) on Thursday July 17, 2008 @05:47PM (#24235041) Homepage
    i always thought cloud computing is what happens when a bunch of researchers score really good pot. "i bet we can get more funding if we call it a paradigm shift"
  • Never trust a man's definition of something when he tells you what it does rather than what it is.
  • aint it ironic?

  • by mikael ( 484 ) on Thursday July 17, 2008 @05:57PM (#24235133)

    Reminds me of the infrastructure diagrams of corporate LANs and WANs back in the 1990s. They would have a diagram of the local network of each site with servers, workstations, routers and firewalls. Then each firewall would be connected to an X.25 cloud (which looked exactly like a big puffy cloud). If it was an internal IT department diagram, then someone would usually add four or more legs and a face, or some lightning flashes (then it became an X.25 spider, an X.25 sheep, or an X.25 packet storm).

  • ...what you use to run Final Fantasy VII?

  • by mattmarlowe ( 694498 ) on Thursday July 17, 2008 @06:12PM (#24235257) Homepage

    Hrm, maybe it's just my background in systems administration, but I thought cloud computing was just an inevitable combination of large scale web hosting with virtualization.

    In late 1990's, businesses generally had their own internet server(s) in a colo facility.

    In the early 2000's, some companies outsourced their internet infrastructure to managed service providers - other companies built their own in-house data centers to keep up with escalating application requirements.

    In the mid 2000's, server sprawl started to impact practically everyone... the first 100 boxes you deploy can be somewhat interesting, but after that... your entire admin staff (outsourced or not) ends up spending all its time dealing with faults in existing hardware rather than deploying new services... plus electricity/cooling/etc. all get more expensive, so everyone starts to figure out ways to avoid putting in new boxes. Poof, in comes virtualization that's actually reliable, and actually interesting when it disassociates the virtual machines from worrying about hardware at all and allows them to move from system to system without any need for sysadmins to press the "fail over" or "load balance" buttons.

    Now, in 2007, smart marketing and product development people at Amazon and elsewhere decide they can take over the web hosting industry by heavily commercializing the large virtualization clusters Amazon has already deployed... and poof, wrappers to allow developers to create virtual machines and access back-end SAN storage for the clusters are written, along with other stuff that will appeal to anyone who doesn't have a large existing infrastructure... and it's called "cloud computing". To avoid losing out, everyone else says they have their own cloud computing plans/etc...

    Now, I guess this is all well and good... but I always thought that what differentiated good hosting facilities from each other was the quality of the admin staff, customer service, defined SLAs and 24/7 emergency response, comprehensive application monitoring, combined with general availability of senior system architects... all of which I don't think Amazon et al. have seriously addressed. That means good managed service or web hosting companies can still succeed by either building their own large virtualization clusters and calling them clouds, or by rebranding and adding value on top of Amazon and other cloud providers.

    • The reason many are so cynical about cloud computing is that your "inevitable combination of large scale web hosting with virtualization" is no different from the old model of businesses renting time on IBM mainframes to do their data processing. The computing industry has seen several cycles of "inevitable" centralization, followed by the equally inevitable decentralization. This is just the latest instance of that repetition.

      I bet businesses will learn that having control of one's own data can be an advantage.

      • Re: (Score:2, Insightful)

        by mattmarlowe ( 694498 )

        OK, let's analyze the arguments you make against cloud computing again:

        a) centralization versus decentralization waves, with the mainframe as metaphor

        Mainframes were replaced by PCs because of affordability/cost. People who had very limited access to a $100K - $1M mainframe could suddenly have unlimited access to a PC for under $5K. The economics drove the change.

        Let's look at the cost of an extremely minimal cloud: 3 Server Class PC's with extensive networking/ram/cpu and virtualization software ($15K), San

        • The driver behind the successive decentralization/centralization waves in computing is economics, to some extent, but I'd argue that bandwidth is also just as important. In other words, when bandwidth (as compared to CPU time) is cheap, people choose to centralize. When bandwidth is expensive, people choose to decentralize.

          Examples: When widespread use of computing started, bandwidth (phone lines) was relatively cheap compared to the cost of getting time on a mainframe. Therefore, centralization was t

  • An "infrastructural paradigm shift" that cannot be succinctly described. Or even not-succinctly described. A paradigm shift into the unknown.

    Suddenly, this sounds a heckuva lot like the late 90's.

    Excuse me, I've gotta go find some VC.

  • ...a term sprinkled liberally through grant proposals, business plans, etc to maximize funding and buzz. It's this year's marketing spin on "Grid Computing".

  • Now the buzzword is pontificating computing.

  • by jmcbain ( 1233044 ) on Thursday July 17, 2008 @06:24PM (#24235355)

    Cloud computing refers to a cluster computing environment hosted by a single company. This approach is also referred to as "utility computing," and back around 1999 or so, the companies providing these services used to be called "application service providers."

    The difference between cloud computing and grid computing, which was all the rage around 2000 (see the academic Globus project), is that grid computing aggregates *widely* heterogeneous computers under different authorities across Internet-scale wide-area networks. A common approach is aggregating universities' computers to form a large-scale cluster. Disadvantages included the fact that you had to program with MPI, communication latencies were high, and there were a lot of authentication issues.

    Cloud computing avoids these difficult issues by having a single company host these services for you, and it's typically being done by the big players who can afford to do so (Amazon, Microsoft, Google). Cluster farms are controlled in data centres under one authority. The programmatic interface is simpler, and computation is typically done through a fixed paradigm like MapReduce, although there are also SQL-like approaches that run on clusters. Communication over Gigabit Ethernet is typical in a cluster within a data centre.

    Is cloud computing a buzzword? Possibly, but then "multi-core," "data centre," and "XML" used to be buzzwords too. Within five years, doing development on a particular vendor's cloud computing infrastructure may be as viable a (specialised) skill as programming for Windows, Linux, or MacOS.
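    To make the "fixed paradigm like MapReduce" point concrete, here is a minimal word-count sketch in plain Python (no vendor's actual framework or API; the shard data and function names are purely illustrative):

      from collections import defaultdict

      # Toy MapReduce-style word count. A real cloud framework would shard the
      # input, run map tasks on many machines, shuffle intermediate pairs by key,
      # and run reduce tasks elsewhere; here everything runs locally to show the shape.

      def map_phase(shard):
          # Emit (word, 1) pairs for each word in one input shard.
          for word in shard.split():
              yield word.lower(), 1

      def shuffle(pairs):
          # Group intermediate values by key, as the framework would between phases.
          grouped = defaultdict(list)
          for key, value in pairs:
              grouped[key].append(value)
          return grouped

      def reduce_phase(key, values):
          # Combine all counts for a single word.
          return key, sum(values)

      shards = [
          "the cloud is just someone else's computer",
          "the grid was someone else's computers",
      ]
      intermediate = [pair for shard in shards for pair in map_phase(shard)]
      counts = dict(reduce_phase(k, v) for k, v in shuffle(intermediate).items())
      print(counts)  # e.g. {'the': 2, 'cloud': 1, ...}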

    • by jd ( 1658 )
      Grid computing has largely fixed the limitations on protocols and authentication, making "cloud computing" look, well, just a little bit wet. (There is now a very nice SASL layer for Globus, for example.) The main problem I have with the term "cloud computing" is that the term "cloud" already has a well-defined meaning in computer networks - it's any topology where you don't care about where things are or what things look like, you pass packets in and you get packets out. To use programming jargon, it is a
      • If you have to care about it being centralized somewhere, then it is not black box, because you DO care about what is on the inside.

        Well, not necessarily. For example, if I outsource my web hosting to Amazon's EC2, do I care what OS their servers are running? Do I care what kind of hardware or load balancing they're employing? Of course not. Indeed, if I had to care about such things, it'd defeat the purpose of the service. Note that Amazon is still free to distribute its services across multiple locations - as long as it's all transparent to the user, it's still cloud computing, according to most definitions I've read.
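        As a rough illustration of how little the client has to say about the machine behind the curtain, here is a minimal sketch using Amazon's Python SDK, boto3 (which arrived well after this thread); the region and AMI ID below are placeholders, not real values:

          import boto3

          # Ask the EC2 "cloud" for a server: we say what we want (image, size),
          # never which physical machine, rack, or host will actually run it.
          ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

          response = ec2.run_instances(
              ImageId="ami-12345678",   # placeholder AMI ID
              InstanceType="t3.micro",
              MinCount=1,
              MaxCount=1,
          )
          instance_id = response["Instances"][0]["InstanceId"]
          print("Launched", instance_id)

          # Tearing the server down is just as opaque about the underlying hardware.
          ec2.terminate_instances(InstanceIds=[instance_id])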

    • by DarkOx ( 621550 )

      XML still is a buzzword. While there are all sorts of applications that could make use of XML for easy interoperability and such, I mostly see it being used for config files and single-application data stores that could be handled better in a flat format.

      The other place it gets used is as a crude method of database replication by developers who should have talked to a DBA before writing their application. They would have learned there was probably no need to do their own synchronization, no matter how quick and

    • by rgviza ( 1303161 )

      Actually it's application service providers using clustered virtualization technologies to provide web 2.0 on the grid.

      The cloud comes from the guy at the desk in the corner of the datacenter where it all runs, right after he eats a bacon egg and cheese croissant from burger king.

      His evil plans are coming together and he wants to eat your bebbeh.

      -Viz

  • by Anonymous Coward

    Cloud computing is a buzzword referring to an environment in which all of your enterprise's data and communications resides in another company's servers. The perceived benefit is that your enterprise does not need to have any of its own servers, and thus your IT department does not need to have any engineers.

    IT managers love the concept of cloud computing, as the entire IT budget (beyond what is paid to the company that provided the cloud servers) can be used for salaries and perks for IT managers and thei

    • Somebody please mod the parent up. I had to read all the way to the bottom to find this, the only really insightful comment. For God's sake, man, mod it up!

  • by plopez ( 54068 ) on Thursday July 17, 2008 @06:35PM (#24235445) Journal

    Cloud computing is all about visionary modular concepts creating adaptive logistical projection using a distributed scalable core for multi-tiered background ability, resulting in an inverse didactic pricing structure.

    I hope I cleared that up. It's actually good to see a healthy level of skepticism on this board.

  • Not quite "grid"... (Score:3, Interesting)

    by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Thursday July 17, 2008 @07:08PM (#24235729) Journal

    I don't remember "grid computing" being quite the on-demand system that, say, Amazon EC2 is. What makes it cool is the ability to scale it up and down on demand, rather than in months or years.

    Or maybe it's some combination of grid computing with virtualization.

    And yes, it's pretty much a buzzword. Just like Web 2.0 or AJAX or all the rest. It's a useful abstraction, but not a world-changing "paradigm shift".

  • getting lots of data from sources outside your control.

  • by Duncan3 ( 10537 ) on Thursday July 17, 2008 @07:26PM (#24235891) Homepage

    No, sadly this one is EASY...

    Cloud computing is how computing worked in the 1960s-80s: large centralized systems that did everything, and that you connected to with dumb terminals. Well, it's back, but this time with a different name.

    Simple yes, but simple is not exciting.

  • will place an ad on Dice.com requiring 5 years of in-depth experience in all aspects of Cloud Computing.

    • by Adelle ( 851961 )

      Well, the thing about "Cloud Computing" being a buzzword is that there are thousands of developers who have been doing it for 5, 10, 15, or 20 years; it just wasn't called "Cloud Computing" for all of that time.

      Asking for 5 years of .NET experience in 2001 was stupid, but asking for 5 years of cloud computing experience in 2008 is not. The difference is that .NET refers to a specific product line; "Cloud Computing" does not.

      Likewise, if anyone asks for Web 2.0 experience, I've been doing it for 10 years, e

  • by ghostlibrary ( 450718 ) on Thursday July 17, 2008 @08:12PM (#24236285) Homepage Journal

    Defining things has always been a problem:

    The king's three scholars had accused Nazrudin of heresy, and so he was brought into the king's court for trial.

    In his defense, Nazrudin asked the scholars, "Oh wise men, what is bread?"

    The first scholar said, "Bread is sustenance; a food."

    The second scholar said, "Bread is a combination of flour and water exposed to the heat of a fire."

    The third scholar said, "Bread is a gift from God."

    Nazrudin spoke to the king, "Your Majesty, how can you trust these men? Is it not strange they cannot agree on the nature of something they eat every day, yet are unanimous that I am a heretic?"


    (From The Trial of Nasrudin [wikibooks.org])

  • by BCW2 ( 168187 )
    Cloud computing is a buzzword in search of a function!
  • lots of PCs and stuff -|LAN Cloud|->|router |<-->| Internet (big fluffy[scary?] cloud)|<-->|router |<--| LAN 2 Cloud |- lots of other PCs and stuff
    (imagine some crappy ascii depiction of the above)

    Now we throw a VPN link into this and this becomes the WAN cloud.
    Or let's say we get a bunch of leased lines to remote sites and expand our token ring segment off our main LAN...

  • Run like hell because there's a very good chance some vendor just farted in your data center and called it a cloud.

    It could also allude to the 'vaporware' that has yet to accomplish anything other than dynamic provisioning and configuration of virtual servers. Sure that's neat, but it doesn't warrant a buzzword.

    Marketing loves buzzwords. They could not get what they do to fit any of the accepted definitions of 'grid' , so they picked 'cloud'.

    I'm not saying it's not useful, just separate the worth from the hype.

  • Progress? (Score:3, Funny)

    by Archtech ( 159117 ) on Friday July 18, 2008 @05:15AM (#24239691)

    Leslie Lamport famously defined a distributed system as "one in which the failure of a computer you didn't even know existed can render your own computer unusable".

    http://research.microsoft.com/users/lamport/pubs/distributed-system.txt [microsoft.com]

    In this vein, I would define cloud computing as "a computing system in which the failure of a network you didn't even know existed can render your own computer unusable".

  • Anyone else see similarities with Super Mario?!

  • by Anonymous Coward

    "This has the advantage of reducing single points of failure." That must be the quest. Definiton unnecessary, for me anyway.

  • by Anonymous Coward

    Is a system of programming where you just click on pictures so I can still code after my competitors burn out the part of my brain that uses language.

    (steadies hand carefully to click the preview button instead of cancel)

  • Plus, cloud platforms/OSes have become more usable and buyable from various vendors - Amazon, IBM, Google, Sun ...
