NVIDIA Targeting Real-Time Cloud Rendering
MojoKid writes "To date, the majority of cloud computing applications have emphasized storage, group collaboration, or the ability to share information and applications with large groups of people. So far, there's been no push to make GPU power available in a cloud computing environment — but that's something NVIDIA hopes to change. The company announced version 3.0 of its RealityServer today. The new revision sports hardware-level 3D acceleration, a new rendering engine (iray), and the ability to create 'images of photorealistic scenes at rates approaching an interactive gaming experience.' NVIDIA claims that the combination of RealityServer and its Tesla hardware can deliver those photorealistic scenes on your workstation or your cell phone, with no difference in speed or quality. Instead of relying on a client PC to handle the task of 3D rendering, NVIDIA wants to move the capability into the cloud, where the task of rendering an image or scene is handed off to a specialized Tesla server. Then that server performs the necessary calculations and fires back the finished product to the client."
No more!! (Score:4, Insightful)
Re:No more!! (Score:5, Insightful)
It's got a very well-defined meaning: performing computing and storing data on an internet-connected server from an internet-connected client. It's a new term for, arguably, a very old thing, coined because the average end-user these days isn't familiar with the idea of doing their computing from a dumb terminal.
Re:No more!! (Score:4, Insightful)
Re: (Score:3, Insightful)
No, your whole point was that it's meaningless. Which we've established it isn't.
Your new argument is that the distinction between cloud computing and local computing is unimportant. Well, ask anyone who's had a computer-time grant on one of the monstrous IBM research clusters how they feel about the distinction between "fucking regular old computing" that "just happens to be taking place somewhere else" and going out and just buying their own hardware.
Re: (Score:2)
If you have Comcast, Time-Warner, or Cox internet you don't have the old style 80s-era time-sharing, but you still have an allotment. About 250 gigs per month. Have fun watching youtube videos, or CBS.com tv shows, or netflix.com rentals, AND doing cloud computing at the same time. You'll have overage fees galore.
Re: (Score:2)
...okay?
Re: (Score:2)
>>>"cloud" computing phenomenon -- it is only a new (and exceptionally stupid) buzzword for something that we have been doing for a long, long time
>>>
Well it's similar to when my local TV station started talking about "phantom power". i.e. When you leave your VCR or TV plugged-in, it uses about 5 watts of power. They act as if this is something new, but we engineers have known it as "parasitic" or standby power for a long long time.
And bell-bottom jeans. Today they call them "flares" o
Re: (Score:2)
I don't think anyone pushing to cut down standby power actually thought they'd discovered some shocking new phenomenon hitherto unreported to science. I've never heard it called "phantom power" at any rate: it must be unique to your region.
Re: (Score:2)
Even better is Cloud 2.0 Computing which is done in actual clouds using standing stone circles and such.
Re:No more!! (Score:5, Insightful)
with CPUs and memory and HDDs and the like -- it just happens to be taking place somewhere else.
That's an important distinction.
Re: (Score:2)
So, let me see if I get this right. For some reason my web hosting (colocated dedicated servers, virtualization, load-balancing servers) has just become "cloud computing" because it takes place somewhere other than my desktop?
So what the f*** have I been doing for the last 10-15 years? For that matter, anything else that has happened on a network or the Internet over say the last 30 years or more?
It is a silly piece of marketing to rebrand the client / server paradigm.
Re: (Score:2)
So, let me see if I get this right. For some reason my web hosting (colocated dedicated servers, virtualization, load-balancing servers) has just become "cloud computing" because it takes place somewhere other than my desktop?
No. They became cloud computing when the servers were doing the job your desktop normally did and you were using your computer as basically just a forwarder for keyboard inputs. Your blog? Not 'cloud'. Using Google Docs? Cloud.
It is a silly piece of marketing to rebrand the client / server paradigm.
I don't really disagree. Can't say my panties are bunched about it, either. Now that we have the masses using these services 'cloud' is easier to pass on to Joe Sixpack than 'client/server-your-work-is-over-there-and-not-over-here'.
In any event, which side of your NIC your prog
Re: (Score:2)
...when I hear these jack off tech companies
You mean like the company that developed the fleshlight? I'm surprised slashdot doesn't post more articles about this type of technology, considering the typical slashdotter. Jack off tech: it's the future, man!
Re: (Score:2)
Yeah, I'm gonna have to go ahead and disagree with you there... I think there are subtle but important differences between the old mainframe approach from the 70s, the kind of hosted computing we had in the 90s and the latest cloud computing stuff.
What I think differentiates cloud computing from earlier iterations of client-server architecture is the ability for a single device to transparently access virtually unlimited (or at least orders of magnitude greater) computing resources with little additional re
Re: (Score:2)
It's got a very well-defined meaning:
No it does not. People continue to misuse what it means in everything from daily speech to presentations, manager proposals and articles. The solution to everything nowadays is "Put it in the cloud", but few people really understand what they are saying when they say that.
It's just like most other buzzwords of the past 10 years. People hear them and then think they are smart for repeating them to get what they want.
Re: (Score:2)
By your criterion, "CPU" has no well-defined meaning. Bugger-all people who say that have a clue what it actually means. However that does not magically undefine it.
Re: (Score:2)
People misuse the word "CPU" the way they abuse the word "cloud computing"? Really? I've not heard anyone saying they need to buy a new 1920x1080 CPU, or a new 10 gigabyte CPU for their machines.
Re: (Score:3, Informative)
I would humbly suggest that the people who talk about a 1920 by 1080 anything are unlikely to misuse the term "cloud computing", either. The people who use "cloud computing" as a magic talisman without bothering to know what it means are the sort of people who start their "CPU" with the front-panel lock key and download internets from the email.
This is besides the point. It was argued that, because people use "cloud computing" without knowing what it means, then the term has no meaning. This is simply an ab
Re: (Score:2)
Given some of the download times I've seen, I'm pretty sure this has happened once or twice.
Re: (Score:2)
People misuse the word "CPU" the way they abuse the word "cloud computing"? Really? I've not heard anyone saying they need to buy a new 1920x1080 CPU, or a new 10 gigabyte CPU for their machines.
Never heard someone refer to the entire desktop case and its contents as "the CPU"? "I plugged the monitor into the CPU, but nothing seems to be happening". I got that all the time when I was in IT support.
Re: (Score:2)
Re: (Score:2)
The thing that bugs me about this "cloud computing" nonsense is we already had a perfectly good and well-established term for this: thin clients.
That would be an excellent point, if only cloud computing and thin-client were related in anything but the most tangential of ways.
But they're not, so it isn't.
5 years from now this will be just another dotbomb buzzword chucked in the trashcan of history
Meanwhile, people are actually using cloud services right now, and they're saving money.
http://blogs.smugmug.com/don/2006/11/10/amazon-s3-show-me-the-money/ [smugmug.com]
These people aren't affected by the ISP caps you mention -- because cloud computing isn't what you think it is.
OTOH, I do think that service aimed at end-users will become popular. Things along th
Re: (Score:2)
It has to have a meaning for people to get the meaning wrong.
Re: (Score:3, Insightful)
It's got a very well-defined meaning: performing computing and storing data on an internet-connected server from an internet-connected client.
I disagree. If it doesn't involve large server farms, in which the location of your data/process is arbitrary and ideally diffuse, then it's not cloud computing.
"Cloud" is a fairly good analogy for that.
Re: (Score:2)
You're right, its meaning is even more specific and better-defined than I had laid out. In my haste I made it too general.
Re: (Score:2)
You missed something... I agree that the term is not clear and is commonly misused, but it does have a meaning.
A cloud service is one that is not provided by A server, but by many servers. Additionally, to be considered a cloud service, it must be distributed geographically.
In the beginning computing was centralized, you would use a dumb terminal to access a mainframe system and all of your computing needs were centralized. Then with the PC, computing was distributed. Finally they centralized much of it
Re: (Score:2, Insightful)
I thought they had peaked with the hype around AJAX. But you're right, computing publications have taken it to the next level with "cloud computing".
The people who hype "cloud computing" tend to be young and ignorant. Here is a perfect example of this. [roadtofailure.com]
Simply put, these young punks have a huge ego, but no knowledge of computing history. They don't realize that "cloud computing" is merely what we called "mainframes" back in the day. Their low-powered hand-held devices that'd supposedly benefit from the cloud
Re: (Score:3, Insightful)
Re: (Score:2)
"Another is that where the mainframe were typically placed close to the workstations, the servers in the cloud can be placed remotely."
Uh, but what about latency? When we talk about rendering in the cloud, we need clients to be as close to the servers as possible.
"A third is that the workstations often were unable to function without access to the mainframe, modern desktops are able to use the advantages of the mainframe/cloud as well as the advantages of an autonomous desktop."
Not in every model, specifically not in OnLi
Re: (Score:2)
>>>these young punks have a huge ego, but no knowledge of computing history. They don't realize that "cloud computing" is merely what we called "mainframes" back in the day.
>>>
What I don't understand, even if these young'uns have no knowledge of history, why do they think cloud computing is a good idea? Why would they want to offload all the processing onto some distant central computer, when they have a quadruple CPU sitting right here in front of them? It makes no logical sense.
My own
Re: (Score:2)
It's a trade-off. For trivial tasks like word processing, the performance trade-off is worth the convenience benefit. For the home user's idea of a high-performance-computing task, such as gaming and video watching, the convenience benefit is negligible compared to the huge performance trade-off.
For real high-performance-computing tasks where purchasing a lot of computing resources for one project might not be justified, again there's a very large convenience benefit to just renting at a distance, which is why main
Re: (Score:2)
What I don't understand, even if these young'uns have no knowledge of history, why do they think cloud computing is a good idea? Why would they want to offload all the processing onto some distant central computer, when they have a quadruple CPU sitting right here in front of them?
It's that phrase "central computer" that suggests to me you've misunderstood cloud computing. If there's one "central computer" handling my request, I wouldn't consider that a cloud service. A cloud service is by definition distributed. Don't think "big mainframe in a datacentre". Think "huge datacentre full of servers with dynamically managed roles".
Why [...] when they have a quadruple CPU sitting right here in front of them?
Maybe they don't have a quad CPU, and maybe they don't want to buy one.
Re: (Score:2)
>>>A cloud service is by definition distributed.
A distinction that matters not. A network of computers at some distant Microsoft facility still has the same appearance as a "central computer" from the user's viewpoint, and still offloads workload from a terminal.
>>>Maybe they don't have a quad CPU, and maybe they don't want to buy one.
Yeah but you failed to read the rest of my sentence. Even my single-core ancient Pentium 4 is faster than my 750 kbit/s network connection. It makes mor
Re: (Score:2)
Even my single-core ancient Pentium 4 is faster than my 750 kbit/s network connection
A nonsensical assertion. Instructions/second != bytes/second.
For a computation such as "what is the millionth prime", you'd get the answer faster by going to a faster remote service than you would computing it locally, even if you did it over a 1200 baud modem.
Or, something you probably do more often, a question like "find the most relevant web pages matching these words, from your enormous database".
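To make the remote-vs-local point concrete, here is a minimal Python sketch (the 16-million sieve limit is an assumption that happens to cover the millionth prime): the request and the reply are each a handful of bytes, while the computation is millions of operations, so a much faster remote machine wins even over a very slow link.

# Minimal sieve of Eratosthenes; nth_prime(1_000_000) returns 15485863,
# an answer that fits in a few bytes even though the work is heavy.
def nth_prime(n, limit=16_000_000):
    sieve = bytearray([1]) * limit
    sieve[0:2] = b"\x00\x00"
    count = 0
    for i in range(limit):
        if sieve[i]:
            count += 1
            if count == n:
                return i
            sieve[i * i::i] = bytes(len(range(i * i, limit, i)))
    raise ValueError("limit too small for the requested prime")

print(nth_prime(1_000_000))   # query: ~20 bytes; answer: 8 digits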
Re: (Score:2)
Why send data to a server to process over a slow link when I could get the result faster by processing that data locally? That's the question he's asking.
And who is suggesting an application in which you do that?
All the cloud services being offered or suggested offer something on the remote side that you're likely not to have locally. Whether it's massive processing power and access to data (Google search); fast, highly specialised processing power (this Nvidia project); highly redundant cheap storage (Amazon S3); and so on.
I've given two examples of how, even with a fairly fast local processor and a super-slow connection, it would be worth sending a job to
Re: (Score:2)
Nothing for the end user that can't be accomplished through plain old offline data synchronisation, if you're patient. For the computer provider, it gives them control over your upgrade pattern and what features you have to pay for. You can hardly save money by not bothering to upgrade the CPU and RAM on the purely conceptual machine hundreds of miles away that you rent time from: if the cloud is getting its quarterly upgrade, it's happening, and you're paying for it.
Re: (Score:2)
I am not that sure actually. It's not very well defined and different people use it differently, sometimes with a marketing agenda. ;-)
But it also conveys some property quite clearly:
- cloud computing is not precisely located and you don't really care
- it's not happening in your home
- it's everywhere or almost
- it's out of your control (others may access it without your knowledge etc.)
- it can disappear and be unavailable anytime (just like real clouds
The previous ter
Re: (Score:2)
>>>it's nice to change the buzzwords every so often...
Bad is good. And good is bad. War is peace, and chocolate rations have been increased from 10 to 5.
How about instead of inventing words we just use the ones we have? Rather than "cloud" computing we could just call it internet-based computing, because that's what it is.
Re: (Score:2)
How about instead of inventing words we just use the ones we have? Rather than "cloud" computing we could just call it internet-based computing, because that's what it is.
Yeah, we could do away with all kinds of pesky specific descriptions, if we just call everything that touches the internet "internet-based computing". I mean, what idiot coined the tedious and unnecessary buzzword "World Wide Web"?
"Cloud computing" has a meaning. If you can't be bothered to know what that meaning is, that's your problem.
Clue:
If you put up a web server and I browse it, that's internet-based computing, but it's not cloud computing.
If your web server performs some processing for me (to stay on
Re: (Score:2)
"Cloud" is a pretty stupid name, one that bugs me almost as much as "AJAX", but it's hurt even more by being associated with two things at once.
The first is simply a client->server connection, or perhaps hosting your data online. This, I think, doesn't need a new name. The old names were working fine.
The second, and far more interesting, is for much more complex systems that are marking a move from managed server hosting to scalable application hosting. These guys design their systems from the ground
Re: (Score:2)
The first is simply a client->server connection, or perhaps hosting your data online. This, I think, doesn't need a new name. The old names were working fine.
The only people using this meaning of "cloud", are people constructing a strawman argument.
"Since cloud computing is simply [insert old concept], it's a pointless buzzword for an old concept".
If it's not dynamically distributing tasks on a large cluster of servers, it's not a cloud.
Re: (Score:2)
Re: (Score:2)
Off-topic (as it is rated) for rendering on the cloud, but potentially on-topic for cloud in general. At the moment people want some degree of privacy of data, but "cloud" wants us to throw it to teh interwebz and process it there. Anyone care to guess how much easier it may become to get the data the OP wanted? ;)
Re: (Score:2)
Cloud is as private as anything else you would put in a web application: privacy is bound to your service provider's terms of use. Those terms of use are a contract between you and them.
Ya, so in other words, there is no privacy.
They are as bound to it as you are, except they have the right to change it anytime and you have the right to refuse the modifications and quit using their service.
And don't forget, lose access to my data. Great, thanks.
If they break their terms and for any reason your data ends up
Re: (Score:2)
>>>It's nothing new, but it hasn't been harnessed to do general-purpose computing 'till recently.
That's odd. I seem to recall using my VAX terminal to "cloud compute" and do general computing (math problems) back in the 80s. Maybe you think that doesn't count for some reason?
Re: (Score:2)
>>>You said yourself, it was a terminal. The computation was done on a mainframe
If I only use my PC to connect to Microsoft.com applications like Cloudword or Cloudexcel, then in effect I've turned my PC into a terminal.
Pay to Play? (Score:5, Insightful)
Re: (Score:2)
Think of it as paying for everyone else's video cards. On credit. Forever.
Re: (Score:2)
Re: (Score:2)
Someone has to pay for the computer eventually. [slashdot.org]
Re: (Score:2)
OnLive won't be $1 a month. With the amount of bandwidth they'll be paying for, plus the higher server-end requirements, I'll be shocked if it's under $12 a month. I wouldn't be surprised to see it as high as $24.99 a month or beyond.
Imagine fees in line with the concept of "interactive cable TV".
Re: (Score:2)
Return Of The Mainframe! (Score:2)
YOUR RENDER FARM ESPLODE! (Score:2)
So like, are they gonna use ATI cards for this or something? LOL
End of the upgrade path? (Score:2)
So if the GPU becomes a glorified web client, how will they keep soaking everyone for a (bi)yearly card upgrade? If all of the most complex tasks are handed off to a remote server, that's where the upgrades should be handled.
Also, if part of the secret sauce is being handled remotely, NVidia has no further excuse for keeping its Linux drivers closed.
Re: (Score:2)
So if the GPU becomes a glorified web client, how will they keep soaking everyone for a (bi)yearly card upgrade?
Oh, but there's a better revenue stream here: subscription fees.
Re: (Score:2)
Well, now you get soaked for the upgrade whether you want/need it or not. That's the rub.
Won't work in some areas (Score:3, Insightful)
There is still this thing called "bandwidth quota" where you get overcharged to death if you go over it. As an example, say $40/month for 50 GB, then $10 per additional GB.
And please no stupid "change ISP" comments, a lot of people aren't lucky enough to even have a choice of high-speed providers. It's either high-speed cable/DSL, or dial-up. Sometimes from the same ISP, even.
Re: (Score:2)
Fine by me; if NVIDIA thinks they can make this work, it'll be just one more industry supporting net neutrality. Maybe we should encourage more and more industries to implement high-bandwidth, questionably useful technologies. Eventually, the lobby money from the Net Neutrality group will be greater than the lobby money from the Telcos/ISP group.
Re: (Score:2)
But doesn't the target of cloud rendering mean that one day I can have my own render farm set up to run a game? For example, the minimum requirement specs for a game could be "20 rendering GPUs running to a total of X speed" instead of "Nvidia Card X or greater"?
Re: (Score:2)
Perhaps you could call my condo association and have them reverse their decision to disallow Verizon's deployment of FIOS.
You're right though, I suppose I should just be asking myself "Have you thought about changing residences?"
That makes a lot of sense just to play a video game, way more sense than going to NewEgg and buying a new video card for my computer.
Latency? (Score:2)
So, even if I had the bandwidth to upload graphics data (geometry, textures, etc.) and download 1080p video in realtime without any buffering, my 5 ms monitor would now have to deal with at least 30 ms of video latency?
Oblig. Penny Arcade (Score:3, Funny)
It happens in the sky:
http://www.penny-arcade.com/comic/2009/3/25/ [penny-arcade.com]
Great... (Score:2)
A big-ass binary hairball to further clog the tubes.
How much additional traffic is this going to add to all the other interactive high-bandwidth stuff transiting the infrastructure?
Re: (Score:2)
From TFA:
NVIDIA fielded a question on this topic during the Q&A session, and insists that RealityServer applications will have a bandwidth footprint equal to or less than that of a YouTube video stream, and it erred on the side of "less." Assuming this is true, the new services should have little to no effect on current-generation networks.
Though, I have to disagree... We aren't talking about a tiny YouTube video screen here; I want full 1920x1200 x 16-bit x 60 fps (at least) rendering, and I doubt that's less bandwidth than a YouTube video.
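For a rough sense of why NVIDIA's claim lives or dies on compression, here is a back-of-the-envelope sketch; the ~5 Mbit/s figure for a compressed HD stream is an assumption used purely for comparison.

# Uncompressed bandwidth for the stream described above vs. a rough
# compressed-HD figure (assumed), just to show the scale of the gap.
width, height, bits_per_pixel, fps = 1920, 1200, 16, 60
raw_bps = width * height * bits_per_pixel * fps      # ~2.21 Gbit/s uncompressed
compressed_hd_bps = 5_000_000                        # ~5 Mbit/s, assumed
print(raw_bps / 1e9)                                 # 2.21184 Gbit/s
print(raw_bps / compressed_hd_bps)                   # ~442x a compressed stream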
Bandwidth & Latency? (Score:3, Insightful)
Backbone and last-mile providers are already crying about filesharers overburdening the infrastructure, especially here in the U.S. ISPs here typically devote well more than 95% of capacity to downstream traffic to try to cope. The modern graphics card works with bandwidth [wikipedia.org] measured in GB/s. There's no way a 50+ FPS 1080p or better video feed from a rendering farm could be supported for every console user. While not needing as high a resolution, mobile devices communicate over cellular networks that make in-ground network capacity problems seem petty. Even if these could be remedied, the latency involved in even a same-city rendering farm would still make for a lackluster experience.
Re: (Score:2)
Highlighting how consumer internet companies have oversold their capacity could be a benefit, mind you. If people get oversold on flights they get some sort of comeback; maybe people whose internet is unusable when the sun's up would start to ask for a discount.
Re: (Score:3, Interesting)
Which is why this isn't currently targeted at the gaming market (though there is some startup doing "streaming" games; I forget their name, but you can play Crysis!). The target here is for tasks which used to be sent off to render farms for a day or two and would return a half dozen high-resolution pictures. Previously the architect had to anticipate all the possible views/angles that their clients wanted to see.
Now you can get the same high quality ray-traced graphics in almost real time which allows the a
Re: (Score:2)
Video cards need high bandwidth to get textures, models and arrays into memory when needed. If the server farm already has all this in memory, then you're just updating some state arrays to indicate the state of those objects, which lowers your bandwidth drastically.
Your video subsystem has massive amounts of bandwidth to deal with the instantaneous needs that require huge amounts when the scene changes completely. If you're just standing there in some game, looking at the same spot and not moving, almost
Re: (Score:2)
The servers have to get the scene assets somehow. That is why modern video cards use high bandwidth PCI Express slots, so the assets of the scene can be sent quickly when needed without adding delay.
Just displaying the video is pretty low bandwidth compared to that, although if you were to try and feed me video over the internet for the same resolution that I play games at, I'd need at least 4 times the bandwidth Time Warner lets me burst to right now. 4 times probably won't cut it since you rarely ever g
On your marks, get set....! (Score:2)
Ok, it's time everybody! Break those old Sun SPARCstation ELCs and SLCs out of storage!
Oh, wait, you don't have one? How about all those SunRays you've got in the garage?
No?
Right.
Re: (Score:2)
How about all those SunRays you've got in the garage?
The SunRays are sitting on top of my sun enterprise rack in my room, you insensitive clod! (not kidding, behind me to my right :) )
Hard to imagine a worse idea... (Score:2)
This is marketoid-think at its worst.
Graphic rendering requires very low latency.
Of all the things that might be done in the "cloud", realtime graphics is the silliest.
But, the marketoids have been convinced that the "cloud" is the future, so they invent nonsense scenarios where their products can be used.
Wouldn't it be cool if... (Score:2)
1) We all had free, unrestricted and unlimited fast Internet links...
2) Our GPU could pull on the idle processing power of all GPUs in the world...
3) Everyone in the world got on with each other...(ok ok, off-topic here)...
Seriously though, there is no way that one could support the network requirements of this... How many Nvidia GPUs are sold a year? F***ing eh... multiply that by the bandwidth required for 25-50 fps of a 1080p image... the number is frightening.
Re: (Score:3, Insightful)
Maybe those patients don't want you to know anything about themselves?
Re: (Score:2)
Yes for example if my records are opened, it will be discovered my IQ is only 90, and I may lose my engineering job. Shhh.
Re: (Score:2)
OK, so you've been modded offtopic.
But I'm curious why you thought a question about publicly available datasets had anything to do with cloud computing? It doesn't look as if you were trolling. So what was it you were misunderstanding?
Re: (Score:2)
You realise that those restrictions are the only reason that the data could be gathered in the first place, right? People won't allow their information to be disclosed at all unless there's some reassurance about what will be done with it. Maybe you should collaborate with somebody who can get access instead of trying to work around it, if only for your own good. It's not good for your career to be known as "the guy who stole all that private medical data and wrote a paper with it". The journals frown upon
Re: (Score:3, Funny)
The rendering of clouds in the cloud computing will stop.
With a 31% chance of rain.
Re: (Score:2, Troll)
This is the "whoops" of cloud computing and why it doesn't work for these purposes. Render farms do what they do well, and so does distributed computing. Neither of these is cloud.
Can we please stop the marketing hype for everything cloud?
Re: (Score:3, Informative)
I take it you don't have a render farm. If you're closer to delivery your render farm is probably completely occupied rendering final frames. If you are in the middle of a project it's probably running at quarter or half capacity. A render farm is often either over burdened or under burdened. That's a situation that's perfect for cloud computing. Instead of wasting thousands and thousands of dollars in idle machines you simply pay for the time when you need processing power. And since most of the w
Re: (Score:2)
Apparently you work for a company that only works on one project at a time and only has one team.
If a company isn't partway into its next project by the time the current project is done, I wouldn't want to work there - too unstable.
Re: (Score:2)
With multiple teams and multiple projects you tend to just increase the frequency of the spikes in demand, not even them out. Yesterday our farm was at capacity and queued up in excess of 5 hours. Right now there is no wait to start rendering. Yesterday we needed a farm 4x our size. Today we need a farm a quarter the size. But even that isn't really the way it works either. Yesterday we would prefer to have had a farm 100x our size for 30 minutes. Today we would want to have a farm 100x our size for 1
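Spelling out the utilization math behind that kind of spike, a minimal sketch in which every figure is an illustrative assumption rather than a real farm's numbers:

# If you sized an in-house farm for the peak ("100x our size for 30
# minutes"), its average utilization over a day would be tiny.
peak_factor = 100            # peak demand relative to today's farm (from the post)
peak_hours = 0.5
baseline_factor = 1          # assumed normal load relative to today's farm
day_hours = 24
used = peak_factor * peak_hours + baseline_factor * (day_hours - peak_hours)
capacity = peak_factor * day_hours
print(used / capacity)       # ~0.03 -> about 3% utilization of a peak-sized farm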
Re: (Score:2)
I can't fathom, though, why nVidia -- a graphics chipset maker which has nothing to do with the ray tracing you're describing -- would be interested in this.
The bandwidth between the CPU and the graphics chipset is a frequent bottleneck. This is why the graphics adapter often has a special slot (VLB vs. ISA; AGP vs. PCI; PCI-e x16 vs. PCI-e x1) and why we're starting to see the marriage of the CPU and graphics chipset (AMD buying ATI, nVidia talking about making their own CPU, Intel making graphics chips,
Re: (Score:2, Interesting)
Well, the new name is supposed to be for the specific case of moving traditionally local computing tasks off to farms. Doing a movie on a remote render-farm is hardly cloud computing, but re-encoding your holiday video is.
Latency aside, my worry is that you're buying a gaming timeshare. It's cheaper to pay for the computing time you actually use, in principle. However online game communities depend on lots of people playing at the same time, which is exactly the sort of thing that would make online gaming u
Re: (Score:2)
Hell, an average of less than 3 users. If those four are online for only four hours, and your system is only loaded with two users the other eight hours, then you're only getting an average of one and a third users over four computers. How do you get one and one third users to pay for four games machines?
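The arithmetic behind that, spelled out (assuming the remaining twelve hours of the day see no users at all):

# Average concurrent users over a 24-hour day for the scenario above.
user_hours = 4 * 4 + 2 * 8 + 0 * 12    # 4 users for 4 h, 2 for 8 h, 0 otherwise = 32
avg_concurrent = user_hours / 24        # ~1.33 average concurrent users
machines = 4
print(avg_concurrent, avg_concurrent / machines)   # 1.33 users paying for 4 machines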
Re: (Score:2)
Managing for capacity will certainly be the difficult part of running a cloud server farm.
It's fairly obvious, that if you build your farm to cope with the peaks, there will be spare capacity during the troughs.
But there are strategies to deal with this. For a start you can soften the peaks with pricing strategies. You could even offer discounts for off-peak gaming.
Plus, you could sell your off-peak capacity for other purposes. For example, a Hollywood animation could be rendered using the spare off-peak ca
Re: (Score:2)
I suppose that's nVidia's idea with the new platform, then: to make graphics rendering genuinely fungible.
Incomparable to OnLive - different goals (Score:5, Informative)
NVidia's offering performs full scene raytracing/pathtracing, with effects ranging from reflections and refractions to global illumination and caustics all the way through to sub-surface scattering and participating media.
Some of these things can be done in proper realtime (say, at least, 30fps at 720p) on existing GPUs, but typically by using hacks that look 'good enough' but aren't actually correct. Which is fine for gaming (where refresh rates matter), but not fine for product visualization, architectural visualization or, to go to an extreme, materials and lighting analysis, where you don't care if it's not 30fps, but are more than happy to wait 10 seconds for something that used to take 15 minutes.
That said... if the cards keep getting faster, then eventually 30fps@720p will be possible and there's no reason, in the time in between, that games couldn't add the fancier effects and have GPGPU solutions take care of those on a 'cloud' platform.
Latency (Score:5, Insightful)
There's one big reason - latency. 30 FPS is one frame every 33.333 ms. What's your ping time? Add the rendering time to that, and that's what your interactivity is going to look like. Remember that many games have ways of hiding the latency between client and server - in particular they know the player's POV and the static environment, so those things can be handled very well.
As someone else said, cloud rendering is fine for making movies. It's not viable for games. And besides, if a GPU can do this stuff in real time, why do we need to push it into the cloud? This sounds like OTOY all over again.
BTW, CPUs will be doing realtime ray tracing soon anyway - give me a bunch of Bulldozer cores and a frame buffer.
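To put rough numbers on the latency point above, a minimal sketch in which every figure is an illustrative assumption, not a measurement:

# Input-to-photon latency budget for a cloud-rendered frame vs. the
# per-frame budget at 30 FPS.
ping_rtt_ms = 30        # assumed round trip to the render farm
render_ms = 16          # assumed server-side render time per frame
codec_ms = 10           # assumed video encode + decode overhead
display_ms = 5          # the "5 ms monitor" mentioned upthread
total_ms = ping_rtt_ms + render_ms + codec_ms + display_ms
frame_budget_ms = 1000 / 30
print(total_ms, round(frame_budget_ms, 1))   # 61 ms of lag vs a 33.3 ms frame budget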
Re: (Score:2)
It'll be fine for people who're happy with low-grade graphics that existing hardware can do quickly enough for the latency to be the limit, or gameplay which is not in real-time. Unfortunately this is a market that probably won't see the point in signing up to the service in the first place, and could be just as easily served by a cheap local box.
Re: (Score:2)
What if the game server also had the GPU engine on board for all of the clients?
Current case: Game server tells client "you are at coordinates X,Y,Z within the game, you're facing this way, the following is happening in front of you, etc." Client takes all of that and renders a local copy of the game map to match what the server says.
New case: Game server sends game imagery as compressed streaming video and audio for each client, and receives back motion/command instructions from client. All of the calcu
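A rough sketch of the per-second traffic in the two cases, with all payload sizes assumed purely for illustration:

# Current case: small state updates per tick. New case: a compressed
# video frame per displayed frame. Both payload figures are assumptions.
tick_rate = 60
state_update_bytes = 200                 # position/orientation/events (assumed)
current_case_bps = tick_rate * state_update_bytes * 8      # ~96 kbit/s
fps = 60
compressed_frame_bytes = 40_000          # one compressed 1080p frame (assumed)
new_case_bps = fps * compressed_frame_bytes * 8             # ~19.2 Mbit/s
print(current_case_bps, new_case_bps)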
Re: (Score:2)
Cool for small computers (iPod touch, iPhone, cell phone, netbooks) that don't have GPUs.
Re: (Score:2)
*20,000 frames included in monthly fee. Additional frames are charged at $0.004 per frame
Re: (Score:2)
I assume it's "in the cloud" for the same reason people outsource other tasks. The architect doesn't need to invest in the hardware/software platform and a render farm. Instead they contract the work out and don't need to worry about the technical details. This is not much different than what many people do today only instead of getting some static images back they get an interactive utility.
Re: (Score:2)
Which makes me wonder what the point is here really. OnLive is in testing already. TFA doesn't compare to OL so I don't know why Nvidia's offering is so much better.
Nvidia and OnLive are partners: http://www.onlive.com/partners.html [onlive.com]
These announcements are probably related to OnLive.
Re:Why is rendering clouds so important? (Score:5, Funny)
The gaming industry has been trying to jump start the flight simulator market again.
Re: (Score:2)
RealityServer is designed to deliver single frames for visualization, not interactive games. The acceleration structures would make it pretty much useless for a video game with lots of deforming meshes. The applications are things like an Ikea website where you can build your living room, place furniture, and see a photorealistic rendering of the outcome without waiting the few minutes that would be required on a local machine without a render farm.
Re: (Score:2)
The applications are things like an Ikea website where you can build your living room, place furniture, and see a photorealistic rendering of the outcome without waiting the few minutes that would be required on a local machine without a render farm.
THIS!
Once you start thinking like that, potential applications start leaping out at you.
There's also the web site where you type in a script with camera directions, and it renders it into a speech-synth-narrated machinima (I forget the name). This technology could vastly improve both the quality and the responsiveness of such a site.
Re: (Score:2)
In such a hurry to post that you didn't even read the summary, not even the first sentence.
It's about cloud computing, not rendering images of clouds.