The Internet

The Ideal, Non-Proprietary Cloud

jg21 writes "As previously discussed on Slashdot, the new tendency to speak of 'The Cloud' or 'Cloud Computing' often seems to generate more heat than light, but one familiar industry fault line is becoming clear — those who believe clouds can be proprietary vs. those who believe they should be free. One CEO who sides with open clouds, so that companies can pick and choose among vendors depending on precisely what they need, has written a detailed article outlining how, in his opinion, Platform-as-a-Service should work. He identifies nine features of 'an ideal PaaS cloud', including the requirement that 'Developers should be able to interact with the cloud computer, to do business with it, without having to get on the phone with a sales person, or submit a help ticket.' [From the article: 'I think this means that cloud computing companies will, just like banks, begin more and more to "loan" each other infrastructure to handle our own peaks and valleys. But in order for this to happen we'd need the next requirement.']"
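
As a rough illustration of that 'no sales call, no help ticket' requirement, here is a minimal sketch of what self-service provisioning over a plain HTTP API might look like. The endpoint, request fields, and token are hypothetical, invented purely for illustration; no real provider's interface is being described.

    # Hypothetical sketch: requesting capacity from a PaaS over a plain HTTP API,
    # with no human in the loop. Endpoint, fields and token are made up.
    import json
    import urllib.request

    API = "https://paas.example.com/v1/instances"   # hypothetical endpoint
    TOKEN = "my-api-token"                          # hypothetical credential

    body = json.dumps({
        "image": "debian",     # what to run
        "instances": 4,        # how many workers
        "hours": 12,           # how long we expect to need them
    }).encode("utf-8")

    req = urllib.request.Request(
        API,
        data=body,
        headers={
            "Authorization": "Bearer " + TOKEN,
            "Content-Type": "application/json",
        },
        method="POST",
    )

    # The provider answers with machine-readable details (addresses, billing rate),
    # so a script, not a phone call, is all that is needed to scale up or down.
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp))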
Comments Filter:
  • Security? (Score:4, Insightful)

    by llamalad ( 12917 ) on Monday July 21, 2008 @09:17AM (#24272645)

    Am I missing something, or does the article make no mention of security?

  • Huh? (Score:3, Insightful)

    by sarathmenon ( 751376 ) <(moc.nonemhtaras) (ta) (mrs)> on Monday July 21, 2008 @09:19AM (#24272679) Homepage Journal

    What makes him so sure that interoperability will even be on the providers' list? I don't see any easy way to use EC2 with some third-party solution for storage. Plus, it would be lame if I had to go over the internet for every request that should ideally be local.
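
    One way that kind of interoperability can work is for storage providers to speak a common wire protocol, so the same client code talks to whichever vendor exposes it. Below is a minimal sketch using boto3 against an S3-compatible endpoint; the endpoint URL, credentials, and bucket name are hypothetical, and it assumes the third-party service actually implements the S3 protocol.

        # Sketch: the same S3 client code pointed at a third-party, S3-compatible
        # storage service. Endpoint, credentials and bucket name are hypothetical.
        import boto3

        s3 = boto3.client(
            "s3",
            endpoint_url="https://storage.example.net",  # hypothetical non-Amazon provider
            aws_access_key_id="EXAMPLE_KEY",
            aws_secret_access_key="EXAMPLE_SECRET",
        )

        # Upload and read back an object exactly as you would against Amazon S3;
        # only endpoint_url changes when switching vendors.
        s3.put_object(Bucket="my-bucket", Key="report.csv", Body=b"col1,col2\n1,2\n")
        obj = s3.get_object(Bucket="my-bucket", Key="report.csv")
        print(obj["Body"].read())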

  • Re:Security? (Score:5, Insightful)

    by thatskinnyguy ( 1129515 ) on Monday July 21, 2008 @09:24AM (#24272739)

    Am I missing something, or does the article make no mention of security?

    Or some sort of business model where someone makes money to run all of this.

  • by rs232 ( 849320 ) on Monday July 21, 2008 @09:27AM (#24272765)
    Relying on third-party technology is never going to provide the reliability or uptime required. The more straightforward solution is to hire some rackspace and host your own solution. 'Cloud Computing' is just the latest marketing promotion designed to move us to renting software.
  • by dkf ( 304284 ) <donal.k.fellows@manchester.ac.uk> on Monday July 21, 2008 @09:35AM (#24272861) Homepage

    'Cloud Computing' is just the latest marketing promotion designed to move us to renting software.

    For some software that makes sense. Some apps cost an enormous amount to buy a copy of (no, MS Office isn't one of these!) and many smaller businesses don't need a copy continually. For example, a small engineering firm probably doesn't need a Computational Fluid Dynamics package the whole time, but when they're designing a product it's useful to rent some use of one.

    Does this mean that everyone will be hiring everything? I really doubt it. I reckon that the end result will be a mixed economy with some purchases and some hiring. Which will be the dominant mode at any time? Well, that'll probably change from year to year. Guess what? That's true for other parts of the economy too. IT's not that special...

  • by samkass ( 174571 ) on Monday July 21, 2008 @09:37AM (#24272871) Homepage Journal

    You're making a lot of assumptions about needs, uptime, costs, and levels of in-house expertise when you make those blanket statements. There's always a balance between "relying on third parties" and "not invented here syndrome". In the latter case, you'll have people attempting things way outside their area of expertise and reliability or uptime will be significantly worse than if they'd let the experts do their job and paid a fair price.

  • by conspirator57 ( 1123519 ) on Monday July 21, 2008 @09:42AM (#24272939)

    so we'll end up with a sub-prime computing crisis?

    how can you bail out companies that fail to keep sufficient computing reserves in hand to cover their potential obligations?

  • by querist ( 97166 ) on Monday July 21, 2008 @09:43AM (#24272951) Homepage

    I believe that you are partly correct in your assertion that cloud computing is, essentially, marketing hype intended to move us toward renting software.

    One advantage that cloud computing has over your proposed solution is that you are not paying for the idle time where your rack of computers is not doing anything. You only pay for what you use (within limits - I suspect a cellphone-like billing plan will emerge). This and the rapid scalability would be wonderful for smaller businesses.

    Imagine that you have minimal needs during most of the year - word processing, billing, etc. - but on a quarterly basis you need to do your taxes (US businesses normally must file tax reports on a quarterly basis) and on an annual basis you need to do a large amount of computing - employee tax records, inventory, other annual processing. With cloud computing, if you are willing to accept having your data somewhere else that is not in your physical control, you simply ramp up the computing you need in December and then you're done. You finish on time and have a larger "bill" at the end of the month. This is very much like electricity - in cooler months you don't run your AC in the house, but when a heat wave comes along you run the AC more and you just pay a higher bill. You don't maintain your own power generation capacity, you simply use more of the available supply when you need it.

    One of the nice ideas behind "cloud" computing is that computation is treated as a consumable resource, much like electricity. Cloud computing, in that way at least, is similar to "grid computing". The differences are important, however.

    "Grid" computing is related to raw computing power being distributed for a large problem. Cloud computing, on the other hand, is not so much about one user being able to access huge amounts of processing power at once as it is about making computing resources available on demand and from anywhere.

    Imagine it like this for a moment: every device that plugs into a wall outlet has its own "power meter" like the one the electric company uses to determine how much to bill you each month. (Let's not go into a discussion about estimates, how often they really read the meters, etc., please. This is only an analogy.) You can take your devices anywhere, and when you plug them into the wall, the little meter records how much electricity you use.

    So, when you are at a hotel, a friend's house, or the public library, you are still being billed personally for the electricity that your laptop computer is using. You can do what you like with the electricity as long as you don't violate any laws of physics and as long as you stay within the limits of your connection or access. (In other words, don't try to draw 40 amps from a 20 amp outlet - you'll trip the breaker.)

    But, instead of electricity, you are accessing computational services in the form of data storage and software as well as data transfer. The nice thing is that you can access it from anywhere (such as Google Apps) with little dependence on operating system or platform.

    If (and this is a big "if") they can work out the security concerns, this could be very useful for large businesses.
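
    The 'pay only for what you use' argument in the comment above comes down to simple arithmetic. A back-of-the-envelope sketch, with invented prices and workloads, comparing a year of always-on rackspace against metered usage that spikes once a quarter:

        # Back-of-the-envelope comparison of owning capacity vs. renting it on demand.
        # All prices and workloads are invented for illustration only.
        HOURS_PER_MONTH = 730

        # Option 1: a rack you pay for whether it is busy or idle.
        owned_monthly_cost = 400.0                 # hosting + amortised hardware, per month
        owned_yearly_cost = owned_monthly_cost * 12

        # Option 2: metered cloud capacity -- small baseline, big quarterly spikes.
        rate_per_instance_hour = 0.10
        baseline_hours = 1 * HOURS_PER_MONTH       # one small instance, always on
        spike_hours = 20 * 100                     # 20 instances for ~100 hours each quarter

        cloud_yearly_cost = (12 * baseline_hours + 4 * spike_hours) * rate_per_instance_hour

        print(f"owned rack, per year:    ${owned_yearly_cost:,.0f}")
        print(f"metered cloud, per year: ${cloud_yearly_cost:,.0f}")
        # With these invented numbers the metered option wins because the rack sits
        # mostly idle; raise the baseline load enough and the conclusion flips.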

  • by sm62704 ( 957197 ) on Monday July 21, 2008 @09:55AM (#24273103) Journal

    For example, a small engineering firm probably doesn't need a Computational Fluid Dynamics package the whole time, but when they're designing a product it's useful to rent some use of one.

    Except that the training required to learn this software is more expensive than the software. It would be cheaper to hire an engineer who had his own tools.

    It's like when your car breaks - it's cheaper to hire a mechanic than to rent diagnostic computers and other tools the mechanic has and learn about internal combustion engines and how to use the tools you rented.

    Remember, the term "cloud computing" was coined by someone clueless who didn't understand the chart's meaning, or he would have simply said "distributed computing".

  • by mosel-saar-ruwer ( 732341 ) on Monday July 21, 2008 @10:12AM (#24273429)

    In this day and age - when hardware is essentially worthless [today, for under $200, you can get what would have been a $10 million supercomputer ten years ago], and when even RDBs are essentially worthless [MySQL & PostgreSQL being free downloads], the only things which add value are:

    1) Your schema [or your customizations of the vendor's standard template of the schema for your industry], and

    2) Your business logic for manipulating the schema [or your customizations of the vendor's standard template of the business logic for your industry], and

    3) The actual data in your database, and

    4) Your algorithms for analyzing the data in your database [or your customizations of the vendor's standard template of the analysis algorithms for your industry].

    Of those, at least 1), 3), and 4) are going to have to be uploaded to "The Cloud" [and 2) might have to at least interact with "The Cloud"], and unless "The Cloud" encrypts everything - both data & logic [and how do you really "encrypt" something if ultimately the registers in the CPU have to see unencrypted data, and especially unencrypted logic & algorithms?] - then you've just uploaded the crown jewels of your entire enterprise for all the world to see.
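
    The data half of that worry can at least be blunted by encrypting everything client-side before it is uploaded, so the provider only ever stores ciphertext; the logic and algorithms are harder, since the provider's CPUs do have to execute them in the clear, exactly as the parent says. A minimal sketch using the Python cryptography package, with key handling deliberately oversimplified:

        # Sketch: encrypt records locally before handing them to "The Cloud", so the
        # provider stores only ciphertext. This protects the data (point 3 above),
        # not the business logic or algorithms, which still run on the provider's CPUs.
        from cryptography.fernet import Fernet

        # In practice the key would live in your own key management, never in the cloud.
        key = Fernet.generate_key()
        f = Fernet(key)

        record = b"customer=ACME; credit_limit=250000"
        ciphertext = f.encrypt(record)        # this is what gets uploaded
        # ... upload ciphertext to the provider, fetch it back later ...
        plaintext = f.decrypt(ciphertext)     # only possible with the locally held key
        assert plaintext == record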

  • by Timothy Brownawell ( 627747 ) <tbrownaw@prjek.net> on Monday July 21, 2008 @10:17AM (#24273529) Homepage Journal

    Relying on third party technology is never going to provide the reliability or uptime required.

    Even if the third party has way more experience and better hardware than you do?

  • by Z34107 ( 925136 ) on Monday July 21, 2008 @10:48AM (#24274037)

    "Cloud computing" sounds exactly like how (I'd imagine, beinga young'un) mainframe time was rented back in the Bad Old Days. Except that one mainframe has been replaced with one "cloud."

    However they billed for a batch job back in the '50s is how I'd expect them to bill for their cloud. Just replace dumb terminals or an operator with the interwebs, and you're good to go.

  • by rs232 ( 849320 ) on Monday July 21, 2008 @11:46AM (#24275187)
    "You can't get the same scaling from a physical server as you can get from "the cloud" for anywhere near the same price"

    Most people don't need such scaling, and I can get more for the price from a box hosted in a server farm. The reason "the cloud" would be cheaper is that they build and staff it at the lowest possible cost. Things happen like forgetting to test the emergency generators [theregister.co.uk], or what probably really happened, skimping on routine maintenance.
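
    Whether the hosted box really is the better deal is mostly a utilisation question; a quick sketch with invented prices shows roughly where the break-even sits:

        # Rough break-even between a flat-rate hosted box and a per-hour cloud
        # instance of similar capacity. Prices are invented for illustration.
        hosted_box_per_month = 150.0    # flat rate, runs 24/7
        cloud_rate_per_hour = 0.40      # metered instance of comparable size

        break_even_hours = hosted_box_per_month / cloud_rate_per_hour
        print(f"break-even: {break_even_hours:.0f} instance-hours per month")
        # About 375 hours here, roughly half a month of one always-on instance.
        # Below that, metered cloud capacity is cheaper; above it (or if the box
        # genuinely needs to be up around the clock), the flat-rate box wins.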
  • by Anonymous Coward on Monday July 21, 2008 @01:16PM (#24277033)

    how can you bail out companies that fail to keep sufficient computing reserves in hand to cover their potential obligations?

    Simple: The computing provider uses a standard contract that doesn't offer any particular service level guarantee or compensation for downtime and calls it 'industry standard'.

    Then if they don't have enough reserves to cover their obligations they laugh in their customers' faces.

  • by random name 6721 ( 876265 ) <andre@merzky.net> on Monday July 21, 2008 @01:42PM (#24277513)

    Imagine it like this for a moment: every device that plugs into a wall outlet has its own "power meter" like the one that the electric company use to determine how much to bill you each month...

    Well, true, Cloud computing could provide that. But you are missing the point of the name 'Grid Computing' - the original idea was to model compute time provisioning after power *grids*: you plug your laptop into an outlet, and, voila, ...

    So, your wall outlet idea was already promised by Grid Computing -- what Cloud computing seems to add, IMHO, is support for (a) very simple interfaces to use the provided resources, and (b) support for specific usage modes. Grids are more all-purpose infrastructure, which makes them rather complex and non-trivial to use. Clouds focus on a few, but very common, use cases, which makes them much simpler to use. IMHO, the infrastructure to implement a Cloud can very well be a Grid...

    Best, Andre.
