
Google and IBM to Provide Cloud Computing to Students

John "butter/oreo" Bajana-Bacall writes to tell us that IBM and Google have decided to team up to provide cloud computing resources to participating college students. "Most of the innovation in cloud computing has been led by corporations, but industry executives and computer scientists say a shortage of skills and talent could limit future growth. 'We in academia and the government labs have not kept up with the times,' said Randal E. Bryant, dean of the computer science school at Carnegie Mellon University. 'Universities really need to get on board.' Six universities will be involved in the initiative. They are Carnegie Mellon, Massachusetts Institute of Technology, Stanford University, the University of California, Berkeley, the University of Maryland and the University of Washington."
Comments Filter:
  • by Anonymous Coward
    Students to ask what the hell that means.

    Many busy contemplating brand new concept: 'clouds of porn.'
  • by monkeyboythom ( 796957 ) on Monday October 08, 2007 @03:26PM (#20902679)

    on the grounds that the Rolling Stones will sue me and everyone else for the use of the word "cloud."

    • on the grounds that the Rolling Stones will sue me and everyone else for the use of the word "cloud."
      Pfffft! Get offa my cloud!
  • by Dareth ( 47614 ) on Monday October 08, 2007 @03:31PM (#20902723)
    I often wonder what form modern computing would be in today if the personal computer had not been so widely accepted. Look around you at the walls. Some of the things you see are very ubiquitous. People take electrical outlets and phone jacks for granted. It is just part of the infrastructure we are used to. Now imagine a computer port next to all the rest. All you need is simple input (keyboard, mouse) and simple output (monitor, printer) devices attached to an adapter that plugs into this outlet. That is all you would need to know about computing. Computing power would be offered by a "Computer Utility" company. They would handle all the technical details. You simply pay your bill and the "technical goodness" comes down the line.

    Sure, you certainly pay through the nose for your time slices of CPU power. But to those of us fortunate to be "Computer Wizards" who live and work at the Computer Utility, life would be grand!
    • by pohl ( 872 ) *
      In that "grand" alternate universe Lily Tomlin would have done a different character [youtube.com], perhaps, and we'd be accepting abuse from the computing utility company.
    • by mosel-saar-ruwer ( 732341 ) on Monday October 08, 2007 @04:14PM (#20903237)

      Originally the IBM machines were strictly lease-only [little money upfront, big money down the road].

      Then sometime later they moved to the sales model [big money upfront, but little money down the road], and Thomas Watson Jr always felt that that was a disastrous mistake.

      In fact, the entire industry [M$FT, Oracle, IBM, Sun, HPQ, Unisys, Google, pretty much everybody] has been working desperately for the last ten or fifteen years to get away from the sales model, and back into the rental/services model - everyone seems to agree that that's where the big $$$s lie.

      • by rayzat ( 733303 )
        IBM didn't decide to switch from a lease model to a sales model; they were forced to switch as a result of the antitrust lawsuit that was filed against them.
    • Re: (Score:1, Insightful)

      by Anonymous Coward

      But to those of us fortunate to be "Computer Wizards" who live and work at the Computer Utility, life would be grand!

      And here you have identified what I believe is the root cause of the Microsoft hatred around here. The "Computer Utility" is not really any different than the "Glass Computer Rooms" of the 70s and early 80s before the PC came along. So now many of those "Computer Wizards" who formerly inhabited the glass rooms are dyed-in-the-wool Microsoft bashers largely because of Microsoft's role in provi…

    • Re: (Score:3, Informative)

      by Animats ( 122034 )

      often wonder what form modern computing would be in today if the personal computer had not been so widely accepted. Look around you at the walls. Some of the things you see are very ubiquitous. People take electrical outlets and phone jacks for granted. It is just part of the infrastructure we are used to. Now imagine a computer port next to all the rest. All you need is simple input (keyboard, mouse) and simple output (monitor, printer) devices attached to an adapter that plugs into this outlet. That is all yo…

    • by rtb61 ( 674572 )
      In this case it's access to cloud computing, and you could be paying through the nose for it. Google is a strong supporter of patent-first, not invent-first, legislation; with Google's history I would stay well clear of running any possibly patentable work on a computer system that they had anything to do with.

      Whilst IBM certainly has a good reputation, you would think they would hesitate to associate themselves with Google's privacy-invasive history. As for those famous research universities, did they stop to t…

    • Computing power would be offered by a "Computer Utility" company. They would handle all the technical details. You simply pay your bill and the "technical goodness" comes down the line...

      ... as you wave goodbye to your freedom. [gnu.org]
  • by Anonymous Coward
    I got very very hungry afterwards.
  • by User 956 ( 568564 ) on Monday October 08, 2007 @03:34PM (#20902757) Homepage
    industry executives and computer scientists say a shortage of skills and talent could limit future growth

    That doesn't seem to have stopped Microsoft.
    • by umghhh ( 965931 )
      By definition, industry needs something different from what universities provide. That is because universities teach about things that already exist, not about things that are being developed now (some of them do research too, but that is not teaching). The role of a school has always been to provide the base on which one can build; the role of a company is to provide the means to build on such foundations.
      Instead of complaining, they should use the talent and skills that are already in place.

      I thought that is what the…
      • by jd ( 1658 )
        Last I heard, universities taught the underlying theories and principles, the science behind the practices and the maths behind the science. At least, that is what happened at the university I went to. And, as Inmos mostly worked with recent graduates, I feel confident in saying that many of the better European universities work this way.

        Knowing what was common practice at the time the textbooks were written is useless. Books take years to write, by people who aren't usually researching at the…

  • ...I wanted Squall computing!

    Chris Mattern
    • $ telnet d4thc.cloud.computer.lan
      Trying dead:beef:deca:fbad:ba5e:ba11 ...
      Connected to dead:beef:deca:fbad:ba5e:ba11
      Escape character is '^]'
      user: root
      password: ********
      Welcome to the Cyclone Beowulf Cluster. Please don't blow up the world.
      #
      # hack whitehouse.gov | find football.txt | crack | rsh pushthebutton.sac.mil
      Operation successful. You may kiss your ass goodbye.
      #
    • I would rather have Tempest, personally. You can get more power from it (although, admittedly, it tends to overload whatever I/O device you use, and capsize^H^H^H^H^H^H^Hcrash your computer).
  • John "butter/oreo" Bajana-Bacall


    Are you kidding me? Is this some kind of inside joke or is this guy's name really that messed up?
  • Open to the public (Score:2, Informative)

    by proidiot ( 747008 )
    While I'm glad they're opening this up to top universities before businesses, I would think that both companies should probably open this up to open source development as well. While dealing with real hardware is ultimately a must for any serious package, it would be nice to have a way to get a package off the ground without killing local resources (not to mention potential advantages in version control).
  • Sorry guys, I missed the memo. WTF is cloud computing?

    • Re:Cloud computing? (Score:4, Informative)

      by Nite_Hawk ( 1304 ) on Monday October 08, 2007 @03:53PM (#20902981) Homepage
      From wikipedia:

      Cloud computing is a term used to describe applications that were developed to be rich internet applications. In the cloud computing paradigm software that is traditionally installed on personal computers is shifted or extended to be accessible via the internet. These "cloud applications" or "cloud apps" utilize massive data centers and powerful servers that host web applications and web services. They can be accessed by anyone with a suitable internet connection and a standard browser.
      http://en.wikipedia.org/wiki/Cloud_computing [wikipedia.org]
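      To make the quoted definition concrete, here is a minimal sketch of a server-hosted "cloud app" (my illustration, not from Wikipedia or the article): all the logic runs on the server, and any client with a browser or curl reaches it over HTTP. It assumes only Python's standard-library http.server; the /add endpoint and port 8000 are made up for the example.

        import json
        from http.server import BaseHTTPRequestHandler, HTTPServer
        from urllib.parse import urlparse, parse_qs

        class CloudAppHandler(BaseHTTPRequestHandler):
            """Toy 'cloud app': the application logic lives on the server, not the client."""
            def do_GET(self):
                # Expect requests like /add?a=2&b=3 and answer with JSON.
                query = parse_qs(urlparse(self.path).query)
                try:
                    result = float(query["a"][0]) + float(query["b"][0])
                    status, body = 200, json.dumps({"sum": result}).encode()
                except (KeyError, ValueError):
                    status, body = 400, json.dumps({"error": "expected ?a=<num>&b=<num>"}).encode()
                self.send_response(status)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)

        if __name__ == "__main__":
            # Any browser pointed at http://<server>:8000/add?a=2&b=3 gets back {"sum": 5.0};
            # nothing has to be installed on the client beyond a standard browser.
            HTTPServer(("", 8000), CloudAppHandler).serve_forever()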

      • Examples: Google Maps, Google Docs & Spreadsheets, etc., and, to a greater or lesser extent, Google Earth. All rely on massive computing power connected via the Internet and HTTP.

        IOW, didn't the students already have cloud computing? Or is this an implementation of the server side?
      • Re: (Score:3, Insightful)

        by jlarocco ( 851450 )

        Does that really need a new buzzword? Sounds like the same old shit that people have been doing with the internet for 10 years now. At the very least, isn't that basically the definition of "Web 2.0"? What's the difference?

        I was a little worried I had completely missed out on some new phenomenon, but that Wikipedia page has only been around since March. Sounds to me like Google and IBM just want to inspire "OMG!!1 We're missing out on 'cloud computing'!1" in idiot PHBs and investors.

        • I think it's because the internet is represented as a cloud in most diagrams, so it means "internet computing". I think "internet computing" might be better, but then someone would decide it takes too long or isn't cool to write "internet computing" and would use "IC".

          The best thing to do when someone says a stupid acronym is to do just what the GP* did: ask "Exactly what does that mean?", and don't use it yourself.

          * Ahh, the irony..
  • A Little Confused (Score:3, Interesting)

    by MrCrassic ( 994046 ) <deprecated&ema,il> on Monday October 08, 2007 @03:47PM (#20902929) Journal

    I am confused about the concept of cloud computing. Is it supposed to be similar to that of the famed beowulf cluster, as in making a supercomputing platform out of regular computer networks? Or does it use more powerful computers and cluster them together?

    Furthermore, what would be the point of doing this exactly?

  • This sounds an awful lot like what grids do (and have been doing for a while). I did rtfa and I didn't see much defining "cloud computing" other than "large data centers that students can tap into over the Internet to program and research remotely". Is "Cloud" the new "Grid"?
    • Cloud computing is the new name for mainframe computing. It's a marketating word, devised by some marketating person.
  • letters 'PFM occurs here' in it ???

  • Hangon... (Score:2, Funny)

    by CRX588 ( 1002741 )
    Does this "Cloud computing" require "Tags" in any way? I mean we're talking "Web 2.0" right? Someone did mention this in the "Blogosphere"?!

    Either way, as long as this stuff does not run on a "Hypervisor" I don't want anything to do with it!
    • Sir, I believe you are trivializing this breakthrough concept. In actuality, "cloud computing" involves a direct high-bandwidth fiber optic connection to Cloud-Cuckoo Land. Perhaps the greatest benefit of this new concept is that it supports greatly simplified Power Point presentations: all you need is a really fuzzy picture of just about anything. Then you sketch a little stick figure labeled "user" and draw arrows connecting him to the fuzz. Finally, you add a single bullet item: "Profit".


  • Presumably these clusters are for really hard problems - folding proteins, or simulating nuke explosions, or searching for exotic primes, or classifying Lie Groups, or proving Four Color theorems, or whatever - i.e. presumably these programs are expected to run for a long, long time before they terminate.

    On the other hand, a fellow named Alan Turing once proved that we can't know whether an arbitrary program will ever terminate.

    Now here's the question: If you allow a student onto one of these clusters…
    • Re: (Score:1, Troll)

      by deander2 ( 26173 ) *
      Or tried anything in signal analysis without the benefit of O(nlog(n)) algorithms?

        O(n*lg(n))

      nitpick, i know, but "log" w/o specifying the base usually means base 10, and i assume you mean to say base 2 (which is usually written "lg")

      • It's irrelevant, since lg(n) is a constant multiple of log(n) -- more generally, a logarithm of any base is a constant multiple of a logarithm of any other base -- therefore O(lg(n)) = O(log(n)).
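        To spell the constant-factor point out (standard change-of-base identity; the notation below is mine, not the poster's):

        \[
          \log_b n \;=\; \frac{\log_k n}{\log_k b}
          \quad\Longrightarrow\quad
          \lg n \;=\; \frac{\log_{10} n}{\log_{10} 2} \;\approx\; 3.32\,\log_{10} n
          \quad\Longrightarrow\quad
          O(\lg n) \;=\; O(\log_{10} n) \;=\; O(\ln n).
        \]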
    • Re: (Score:3, Funny)

      by g0at ( 135364 )
      What does "meta question" in your subject line mean?

      -b

    • Easy. Just ask the student for their graduation date. If the app hasn't reported by then, either terminate it or send an e-mail asking if the results are going to form part of their postgraduate studies.
    • by Ckwop ( 707653 ) *

      So if you are one of the lucky few who gets chosen [or at least pre-selected] for this sort of thing, then will you have to submit a "proof" of the finiteness of your program before you're given the green light?

      The halting problem is actually tractable for the vast majority of algorithms. If you were to select a program at random from the vast sea of possible programs, the vast majority are known to halt; however, the precise percentage is itself uncomputable.

      Most commonly used algorithms such as qu…

      • by pikine ( 771084 )
        Is there a name for this function?
      • your reply looked like it would never end
      • by Krakhan ( 784021 )
        I thought the issue with the halting problem wasn't whether a *particular* program could be shown to halt, but rather whether one could come up with a single, general algorithm (or program if you want) that would decide, for *all* programs on a given input, whether they would halt or not. That is an undecidable problem.

        However, while it is possible to prove that certain algorithms terminate, the proof has to be worked out specially for each one (a small example of such an argument follows below).

        With that in mind, I always found the interesting thin…
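        A minimal sketch (my example, not from the thread) of that kind of one-off termination argument: Euclid's gcd, written in Python, terminates because its second argument is a non-negative integer that strictly decreases on every pass, so this particular loop is known to halt without any general halting decider.

          def gcd(a: int, b: int) -> int:
              """Euclid's algorithm; terminates by a simple variant argument."""
              a, b = abs(a), abs(b)
              while b != 0:
                  # Variant: b is a non-negative integer and a % b < b,
                  # so b strictly decreases each pass and must reach 0.
                  a, b = b, a % b
              return a

          assert gcd(252, 105) == 21   # 252 = 2^2*3^2*7, 105 = 3*5*7, gcd = 21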
  • What a great way for IBM and Google to get the best and brightest to solve all kinds of problems for FREE! Well, it costs them $30 million a year, but that's probably nothing compared to having their OWN employees develop something. Industry has been raping and pillaging academia for years (e.g. the Pharma industry), so why not extend that opportunity to the computer world. GENIUS! I'll have to hand it to IB-Goo (sick foreshadowing???), they sure are generous donating all that computing power (which no do…
  • How much to rent a bot net to do "real work"?

    Hey, results are results, right? And if it lessens my spam, oh well.
