NVIDIA Targeting Real-Time Cloud Rendering
MojoKid writes "To date, the majority of cloud computing applications have emphasized storage, group collaboration, or the ability to share information and applications with large groups of people. So far, there's been no push to make GPU power available in a cloud computing environment — but that's something NVIDIA hopes to change. The company announced version 3.0 of its RealityServer today. The new revision sports hardware-level 3D acceleration, a new rendering engine (iray), and the ability to create 'images of photorealistic scenes at rates approaching an interactive gaming experience.' NVIDIA claims that the combination of RealityServer and its Tesla hardware can deliver those photorealistic scenes on your workstation or your cell phone, with no difference in speed or quality. Instead of relying on a client PC to handle the task of 3D rendering, NVIDIA wants to move the capability into the cloud, where the task of rendering an image or scene is handed off to a specialized Tesla server. Then that server performs the necessary calculations and fires back the finished product to the client."
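A minimal sketch of what that thin-client round trip could look like from the client's side: send camera parameters to a remote renderer, get a finished frame back. The endpoint URL, request fields, and response format here are illustrative assumptions only, not NVIDIA's actual RealityServer API.

# Hypothetical thin client for a remote render service.
# Everything about the endpoint and payload is an assumption for illustration.
import json
import urllib.request

RENDER_URL = "http://render.example.com/render"  # hypothetical endpoint

def fetch_frame(camera_position, camera_target, width=1280, height=720):
    """Ask the remote renderer for one frame; return the raw image bytes."""
    payload = json.dumps({
        "camera": {"position": camera_position, "target": camera_target},
        "resolution": [width, height],
    }).encode("utf-8")
    request = urllib.request.Request(
        RENDER_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        return response.read()  # e.g. a JPEG produced on the server's GPU

# The client never runs a renderer itself; it only decodes and displays
# whatever the server sends back.
frame = fetch_frame([0.0, 1.7, 5.0], [0.0, 1.0, 0.0])
print(f"received {len(frame)} bytes")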
No more!! (Score:4, Insightful)
Re:No more!! (Score:5, Insightful)
It's got a very well-defined meaning: performing computing and storing data on an internet-connected server from an internet-connected client. It's a new term for, arguably, a very old thing, coined because the average end-user these days isn't familiar with the idea of doing their computing from a dumb terminal.
Re:No more!! (Score:4, Insightful)
Pay to Play? (Score:5, Insightful)
Re:No more!! (Score:2, Insightful)
I thought they had peaked with the hype around AJAX. But you're right, computing publications have taken it to the next level with "cloud computing".
The people who hype "cloud computing" tend to be young and ignorant. Here is a perfect example of this. [roadtofailure.com]
Simply put, these young punks have a huge ego, but no knowledge of computing history. They don't realize that "cloud computing" is merely what we called "mainframes" back in the day. Their low-powered hand-held devices that'd supposedly benefit from the cloud really aren't different at all from the dumb terminals we hooked up to our mainframes.
Most enterprises moved away from the mainframe because it just wasn't as useful and efficient as individual desktop systems on each user's desk. Unfortunately, most of those fools pushing "cloud computing" these days were born well after we made that transition. They don't realize that they're just resurrecting problems that we dealt with in the early 1980s.
Re:Question (Score:3, Insightful)
Maybe those patients don't want you to know anything about themselves?
Re:No more!! (Score:3, Insightful)
No, your whole point was that it's meaningless. Which we've established it isn't.
Your new argument is that the distinction between cloud computing and local computing is unimportant. Well, ask anyone who's had a computer-time grant on one of the monstrous IBM research clusters how they feel about the distinction between "fucking regular old computing" that "just happens to be taking place somewhere else" and going out and just buying their own hardware.
Won't work in some areas (Score:3, Insightful)
There is still this thing called a "bandwidth quota" where you get overcharged to death if you go over it. As an example, say $40/month for 50 GB, then $10 per additional GB.
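Rough numbers, assuming a cloud-rendered stream compressed to about 8 Mbit/s (an assumption, roughly what game-quality video needs):

# Back-of-the-envelope math for the quota problem above.
# The 8 Mbit/s stream rate is an assumed figure for illustration.
STREAM_MBIT_PER_S = 8
QUOTA_GB = 50
OVERAGE_PER_GB = 10  # dollars

gb_per_hour = STREAM_MBIT_PER_S * 3600 / 8 / 1000   # Mbit/s -> GB per hour
hours_in_quota = QUOTA_GB / gb_per_hour

print(f"{gb_per_hour:.1f} GB per hour of play")        # ~3.6 GB/h
print(f"{hours_in_quota:.0f} hours before overage")    # ~14 hours
print(f"one extra 3-hour session: ~${gb_per_hour * 3 * OVERAGE_PER_GB:.0f}")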
And please no stupid "change ISP" comments, a lot of people aren't lucky enough to even have a choice of high-speed providers. It's either high-speed cable/DSL, or dial-up. Sometimes from the same ISP, even.
Latency (Score:5, Insightful)
There's one big reason - latency. 30 FPS is one frame every ~33.3 ms. What's your ping time? Add the rendering time to that, and that's what your interactivity is going to look like. Remember that many games have ways of hiding the latency between client and server - in particular, the client already knows the player's POV and the static environment, so those things can be handled locally very well.
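A rough input-to-display budget, using assumed but typical numbers for ping, render, and codec time (all three are illustrative, not measured):

# How far behind the action a cloud-rendered frame arrives.
# ping, render, and encode/decode times below are assumptions for illustration.
FPS = 30
frame_period_ms = 1000 / FPS   # ~33.3 ms per frame at 30 FPS

ping_ms = 40            # assumed round-trip time to the render farm
render_ms = 15          # assumed server-side render time per frame
encode_decode_ms = 10   # assumed video encode on server + decode on client

input_to_photon_ms = ping_ms + render_ms + encode_decode_ms
print(f"frame period: {frame_period_ms:.1f} ms")
print(f"input-to-display lag: {input_to_photon_ms} ms "
      f"(~{input_to_photon_ms / frame_period_ms:.1f} frames behind)")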
As someone else said, cloud rendering is fine for making movies. It's not viable for games. And besides, if a GPU can do this stuff in real time, why do we need to push it into the cloud? This sounds like OTOY all over again.
BTW, CPUs will be doing realtime ray tracing soon anyway - give me a bunch of Bulldozer cores and a frame buffer.
Re:No more!! (Score:3, Insightful)
Re:No more!! (Score:5, Insightful)
with CPUs and memory and HDDs and the like -- it just happens to be taking place somewhere else.
That's an important distinction.
Bandwidth & Latency? (Score:3, Insightful)
Backbone and last-mile providers are already crying about filesharers overburdening the infrastructure, especially here in the U.S., where ISPs typically devote well more than 95% of capacity to downstream traffic to try to cope. The modern graphics card works with bandwidth [wikipedia.org] measured in GB/s. There's no way a 50+ FPS, 1080p-or-better video feed from a rendering farm could be supported for every console user. While they don't need as high a resolution, mobile devices communicate over cellular networks whose capacity problems make the in-ground network's look petty. Even if these could be remedied, the latency involved in even a same-city rendering farm would still make for a lackluster experience.
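A quick sanity check on that claim, assuming an uncompressed 1080p, 50 FPS, 24-bit feed (all numbers are assumptions for illustration):

# Bandwidth needed to ship rendered frames instead of rendering locally.
width, height, fps = 1920, 1080, 50
bytes_per_pixel = 3  # 24-bit color

raw_bytes_per_s = width * height * bytes_per_pixel * fps
raw_gbit_per_s = raw_bytes_per_s * 8 / 1e9

print(f"uncompressed: {raw_gbit_per_s:.1f} Gbit/s")   # ~2.5 Gbit/s per viewer
print(f"even at 100:1 compression: {raw_gbit_per_s * 1000 / 100:.0f} Mbit/s")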
Re:No more!! (Score:3, Insightful)
It's got a very well-defined meaning: performing computing and storing data on an internet-connected server from an internet-connected client.
I disagree. If it doesn't involve large server farms, in which the location of your data/process is arbitrary and ideally diffuse, then it's not cloud computing.
"Cloud" is a fairly good analogy for that.