Nvidia's RealityServer to Offer Ubiquitous 3D Images 82
WesternActor writes "ExtremeTech has an interview with a couple of the folks behind Nvidia's new RealityServer platform, which purports to make photorealistic 3D images available to anyone on any computing platform, even things like smartphones. The idea is that all the rendering happens 'in the cloud,' which allows for a much wider distribution of high-quality images. RealityServer isn't released until November 30, but it looks like it could be interesting. The article has photos and a video that show it in action."
It takes chutzpah to use the term "RealityServer" (Score:4, Funny)
...for demoware.
Re:It takes chutzpah to use the term "RealityServe (Score:5, Funny)
Re: (Score:2)
Hey, don't rain on their parade.
Re: (Score:1, Insightful)
Any "new" technology that is marketed with the phrase "cloud computing" is starting to get a really bad reputation with software developers.
The "cloud" is the sort of idea that managers and other fucking morons like that think is totally great, but those of us who actually have to work with such systems know how shitty most of them are.
"Cloud computing" is this year's version last year's "web services", "SOA" and "SaaS". So many bold claims, but in the end nothing but a steaming pile of fecal matter pushed
What about Data Transfer (Score:5, Insightful)
Aren't photorealistic images pretty big? If I want 30 frames per second, how am I ever going to push 30 photorealistic frames over the internet? I can hardly get 5 Mb/s from my ISP.
Re: (Score:2)
Re: (Score:3, Insightful)
Video's pretty big - but it's always compressed to the point that I wouldn't call it photorealistic.
Re: (Score:2)
Re:What about Data Transfer (Score:4, Insightful)
Really you wouldn't describe Netflix HD as photorealistic? Even things... originally shot on film? With a camera?
Re: (Score:3, Insightful)
Another way to see this is that Nvidia just wants to expand its marketshare. They are likely hoping that with something like this, they could sell expensive serve
Re: (Score:2)
Re: (Score:3, Insightful)
Re: (Score:2, Insightful)
Many applications do not need 30 fps, though. For example, architectural design software could use this to render various shots of a designed house.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3, Insightful)
which purports to make photorealistic 3D images available to anyone on any computing platform, even things like smartphones. The idea is that all the rendering happens 'in the cloud,' which allows for a much wider distribution of high-quality images. RealityServer isn't released until November 30, but it looks like it could be interesting. The article has photos
Notice there is no emphasis on video or animation. This is for 3D images only. Or were you seriously hoping to play realistic 3D games on your phone?
Re: (Score:2)
Maybe not the phone - I can't imagine why anyone would really need high-quality photorealistic renderings on a phone. I mean, once the image is rendered you can just put it on your phone and show people, if that's what you're going for. But there isn't exactly an engineering or architecture app for the iPhone, as far as I'm aware (don't hold me to that).
However, in my experience, the only time where rendering is preferable over a picture is for entertainment purposes. Though someone above mentioned this wou
Re: (Score:2)
With this technique, it might be possible with a 4g connection.
Re: (Score:2)
RTFA. Animation of a dress worn by a model of a size specified by the user is given as an example.
Re: (Score:2)
how am I ever going to push 30 Photorealistic Frames through the internet - I can hardly get 5 Mb/s from my ISP.
I'm far from being a computer programmer/expert.
But say you have a display at, for argument's sake, 1280x1024 pixels at 32 bits per pixel. That's 41.9 million bits per frame. Call it 42 Mbit. You want to do that at 30 frames per second? You're up to 1.26 Gb/s. Now please raise your hand if you have a 2 Gb/s internet connection. OK, there will be some compres
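A quick sketch of that arithmetic, in case anyone wants to check it (the resolution, bit depth and frame rate are just the example numbers above, and compression is ignored entirely):

```python
# Back-of-the-envelope bandwidth for uncompressed frames (example numbers only).
width, height = 1280, 1024    # pixels
bits_per_pixel = 32
fps = 30

bits_per_frame = width * height * bits_per_pixel     # ~41.9 Mbit per frame
bits_per_second = bits_per_frame * fps               # ~1.26 Gbit/s at 30 fps

print(f"per frame:  {bits_per_frame / 1e6:.1f} Mbit")
print(f"per second: {bits_per_second / 1e9:.2f} Gbit/s")
```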
Re: (Score:2)
Then I beg you to come up with more than 5 practical applications.
Re:What about Data Transfer (Score:4, Informative)
How big is your screen?
That's the real question here. "Photorealistic" (a meaningless term in the context of transferring image data) on a smartphone screen is a whole lot smaller than on my full 1920x1280 desktop monitor.
"Photorealistic" will only ever be as high resolution as the screen you view it on.
Re: (Score:2)
You can use supersampled pixels to avoid jagged lines - for every pixel in the framebuffer, your raytracer might generate a grid of 8x8 or 16x16 rays, each of which has its own unique direction. This leads to smoother, blended edges on objects. It takes considerably more time, but it helps to improve the appearance of low-resolution images, especially on mobile phone screens, which may only be 640x480 or 320x200 (early VGA-era resolutions).
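A minimal sketch of that idea, assuming a hypothetical trace_ray(x, y) callback (this is just the general supersampling scheme described above, not Nvidia's renderer):

```python
# Supersampling sketch: shoot an n x n grid of rays per pixel and average them.
# trace_ray(x, y) is a hypothetical callback returning an (r, g, b) colour for a
# sub-pixel sample position; any raytracer core could be plugged in here.
def render_pixel(trace_ray, px, py, n=8):
    r = g = b = 0.0
    for i in range(n):
        for j in range(n):
            # Place each sample at a distinct position inside the pixel.
            sx = px + (i + 0.5) / n
            sy = py + (j + 0.5) / n
            cr, cg, cb = trace_ray(sx, sy)
            r, g, b = r + cr, g + cg, b + cb
    samples = n * n
    return (r / samples, g / samples, b / samples)
```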
Re: (Score:2)
That's a rendering trick that has zero impact whatsoever on the final size of a rendered frame. I highly doubt they're sending raytracing data to smartphones.
A pixel is a pixel is a pixel, no matter how you go about generating it, and a screen can only display so many of them. The smart phone or whatever isn't generating these images, it doesn't give a crap about the raytracing methods behind them. It just downloads them and puts them on the screen.
Reminds me of people who take a 300x300px image into Photos
Re: (Score:2)
The screen resolution of an iPhone is around 640x480. I would guess that there are probably applications to allow larger images to be viewed through the use of scroll and zoom functionality. What I meant is that the server is going to do the raytracing, and all it has to do is send an image back to the iPhone.
Re: (Score:3, Informative)
Forget the data transfers - bandwidth will keep increasing; it's the latency that's the problem. Games using this technology will be almost useless, especially action games. Currently you get practically 0 ms latency when you interact with a game, which is what makes it feel fast. If it's a multiplayer game, the only latency comes from other people, and if they appear to go left 50 ms later than when they pressed the button, it doesn't make a difference to you, since you don't know when they pressed the
Re: (Score:3, Interesting)
Not all games. Many genres would work great, such as an RTS, or RPGs like WoW or Baldur's Gate, or any other game where the interface could run locally on the portable's hardware while the server handles the rendering.
I imagine even a local 3D copy which is hidden from the user but handles all of the 3D mechanics of detecting unit selection etc. Since it's not being shaded and it only needs collision meshes, it would run fine on a cell phone. Then let the server render the well-shaded and lit v
Re: (Score:3, Insightful)
Good point, I didn't think about it that way. More specifically, the server could, for example, render expensive global illumination and then send the resulting textures to the client, which can use a simple GPU to apply them to local meshes.
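To make that split concrete, here's a toy sketch (the function names and arrays are purely illustrative and have nothing to do with RealityServer): the server bakes expensive lighting into a lightmap texture, and the client only has to multiply it against its local albedo texture.

```python
import numpy as np

# Toy server/client split for baked lighting (illustrative only).

def bake_lightmap_on_server(irradiance):
    """Stand-in for the expensive part: a global-illumination solve done
    server-side. Here we just clean up a precomputed irradiance array
    (H x W x 3, linear RGB)."""
    return np.clip(irradiance, 0.0, None).astype(np.float32)

def shade_on_client(albedo, lightmap):
    """Cheap client-side step: modulate the local albedo texture by the
    lightmap the server sent over."""
    return np.clip(albedo * lightmap, 0.0, 1.0)

# Example: a flat grey texture lit by a simple gradient standing in for GI.
albedo = np.full((64, 64, 3), 0.5, dtype=np.float32)
irradiance = np.linspace(0.2, 1.0, 64, dtype=np.float32)[None, :, None] \
             * np.ones((64, 1, 3), dtype=np.float32)
lightmap = bake_lightmap_on_server(irradiance)
final = shade_on_client(albedo, lightmap)
print(final.shape, float(final.min()), float(final.max()))
```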
Re: (Score:3, Funny)
Yes, no one could ever get you 30 frames a second, that's why we can't watch tv shows and movies online~
Re: (Score:2)
Re: (Score:2)
Uh oh... (Score:1)
Re: (Score:2, Funny)
Stop saying "cloud" (Score:5, Funny)
FTFA:
Why not just say:
I guess it's just not as cool...
I wonder if this would work for cooking?
Comment removed (Score:5, Insightful)
Re:Stop saying "cloud" (Score:4, Insightful)
Don't worry, in 6 months we will have another buzz word we can hate and cloud will be history.
Re:Stop saying "cloud" (Score:5, Funny)
I'm all about the "river computing" system. You dump whatever crap you want in, and its downstream's problem.
Re: (Score:2)
Duh! Come on, man, everybody knows that the Internet is not something that you just dump something on!
Re: (Score:2)
I've just worked out what "the cloud" means (Score:2)
"Not my responsibility".
Re: (Score:2)
I too was skeptical. But last night there was a presentation on cloud computing at Monadlug, and re-rendering for a video service to insert new advertisements was given as an example. This is something that is being done NOW: a few dollars pays for 20 minutes of time on someone's "cloud" that would otherwise require the video service to buy a whole roomful of expensive multiprocessor computers.
Amazon and Rackspace and others are already offering cloud services. I don't like it - I think everyone should
Re: (Score:2)
I think everyone should own all the processing power they need - but cloud computing is here, it's real, and it performs a valuable economic function.
Old news. It used to be called "server time". There are bits and pieces related to "server time" billing left in most Unix or Unix-like systems (which could probably be brought back to life if need be). No need to bring meteorology into it.
"Sorry, your cloud computing operations have been cancelled because of an unexpected storm which washed away our reserve of zeroes"
Re: (Score:1)
Buzzwords can be fun. Next time you're scheduled for a sales presentation make up a bunch of cards with different sets of mixed buzzwords and give each attendee a card and a highlighter. The first person to get five buzzwords marked off should yell BINGO! and win a small prize for paying attention. It's called buzzword bingo. It works equally well whether you warn the presenters or not, since they can't help themselves. Some salespeople can't get past the first slide without "BINGO" ringing out.
Here's
Re:Stop saying "cloud" (Score:5, Insightful)
Shhhhh! You'll ruin the scam (of convincing uninformed people that an old idea is a new idea by renaming it).
Thin client -> fat client -> thin client -> fat client. *yawn*
Every time this happens, things move away from the client for "performance", "flexibility" and "scalability" reasons, and then everyone realises it's a pain because of the lack of control or reliability; by that point the client hardware has moved on to the point where it can do the job better anyway, so everyone moves back to it.
Re:Stop saying "cloud" (Score:5, Funny)
Thin client -> fat client -> thin client -> fat client. *yawn*
We were forced to stop using the term "fat client" here at Big Bank; our end-users got offended when they heard the term; apparently they thought we were talking about the /users/ and not the systems... Instead, we must call it "thick client"* -- which is odd, since if they interpret it the same way it's just as insulting from another direction.
*go ahead, laugh, but it really happened!
Re: (Score:3, Informative)
We were forced to stop using the term "fat client" here at Big Bank; our end-users got offended when they heard the term; apparently they thought we were talking about the /users/ and not the systems... Instead, we must call it "thick client"* -- which is odd, since if they interpret it the same way it's just as insulting from another direction.
You forgot how we used to refer to IDE devices as either a "master" or a "slave"... this wasn't back in the 50s either.
Re: (Score:2)
Just tell them that it's the BDSM release.
Re: (Score:2)
We used to call "computers on wheels" COWs, except apparently a customer and/or customer's customer got very offended during implementation.
And yes, that really happened, too.
Re: (Score:1)
Thick client is also insensitive these days. You want to go with "fluffy client".
Just kidding... the least offensive term is, I believe, "rich client", though in a bank that could be confusing too.
Re: (Score:2)
lack of control (Score:2)
That depends on which side you are on.
For the people hosting (or governments that want to butt in) there is plenty of control.
If you own the symbolset, you own the mindshare (Score:1)
IBM tried it when they went to OS/2. Suddenly a hard drive was a "Fixed disk" and a motherboard was a "Planar board".
It's a sad game but it's the only one there is. It's fun to watch megacorporations fight to the death over ownership of a word.
Re: (Score:2)
Oh, and it's not real-time at all. It will *at least* have the lag of one ping round trip. Then add some milliseconds of rendering time and input/output on the device. On a mobile phone that can mean 1.5 seconds(!) of delay. Or even more.
It's only real-time when it no longer sounds weird to press a key in a music game and hear the sound.
That's below 10 ms for me, but something around 50 ms TOTAL for the average Joe.
Oh, and don't even think about winning a game against someone with real real-time rendering.
I
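To put some rough (entirely assumed) numbers on that point, the input-to-photon delay for server-side rendering stacks up something like this:

```python
# Rough input-to-photon latency budget for server-side rendering.
# Every figure here is an assumption for illustration, not a measurement.
network_rtt_ms   = 100   # one ping round trip on a decent connection; mobile is worse
render_ms        = 30    # server-side render time for one frame
encode_decode_ms = 20    # video encode on the server + decode on the device
display_ms       = 16    # one refresh of a 60 Hz screen

total_ms = network_rtt_ms + render_ms + encode_decode_ms + display_ms
print(f"~{total_ms} ms from keypress to pixel")   # vs. well under 50 ms locally
```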
Re: (Score:1)
Photo-realistic on smart phones! (Score:2)
That low-resolution BlackBerry in your pocket will suddenly be capable of producing high resolution images?
Uh-huh.
Nvidia also claims that simply by wiring money into their account, they can make you lose weight by doing nothing at all!
Re: (Score:2)
The point is that the BlackBerry doesn't do any processing. It just streams the end result. Which is certainly doable, considering the Zune HD can play back 720p HD footage and it's not much bigger than a BlackBerry.
Re: (Score:2)
I'm not talking about real-time processing (which cloud rendering can help with).
The new Zune HD is one of a few select devices that actually supports a decent resolution. It pisses me off because I can't use a Zune in Linux, and I won't buy a Zune, but it does have perhaps the nicest screen of any portable device on the market right now.
Most smart phones have low-resolution screens. You can't produce a photo-realistic image on a low-resolution screen, regardless of pushing rendering out to the cloud.
Re: (Score:1)
...it does have perhaps the nicest screen of any portable device on the market right now. Most smart phones have low-resolution screens.
Really?! I'm looking at the specs now; from what I see, the Zune HD 32 has a 480x272 pixel screen.
There are quite a few smartphones out there with better than that. The Droid has 480x854; the HTC Hero has 320x480; the Nokia N900 has 800x480. Even the iPhone, which doesn't have stellar resolution, is 480x320.
Paranoid I am (Score:1, Offtopic)
A few security questions
Any attempt at encryption?
Considering that pretty much all internet traffic is copied, how hard would it be to watch someone's screen?
Is this processing limited to extreme graphics, or is that spreadsheet being watched too?
Yes there are plenty more, but enough for now.
Re: (Score:3, Insightful)
Note: I actually LIKE nVidia video cards, but the writing is on the wall. AMD is going to be putting out a veritable monster with CPU + GPU on a single chip, and Intel is going to do something similar with Larrabee (more general purpose, though).
nVidia can't compete without its own line of x64 chips, and they are just too far away from that ca
Re: (Score:2)
Actually, Intel is putting its GMA into the CPU, not Larrabee. In no way will Intel GMA spell the end for discrete graphics cards, a category which includes Larrabee.
Re: (Score:2)
AFAIK, GMA is never going to be integrated into the CPU. It's going to continue to be integrated into motherboards.
Re: (Score:2)
Re: (Score:1)
And we haven't heard promises of technology X doing Y spelling the end of Z before?
I'll believe it when they have functional units past the prototype phase at a reasonable cost.
You may only disagree if you post while driving your flying car.
The article kinda misses the point (Score:2)
While the comments here are mostly negative, I can say this is a big leap forward for rendering technology, mainly because the rendering is occurring at the hardware level - rendered on Nvidia GPUs on a video card instead of on the CPU via software rendering. They are calling this iray, and it's developed by mental images, not Nvidia. While video cards are currently great at rendering games in real time, they require a tremendous amount of shader programming and only do this sort of rendering within the c
Re: (Score:2)
It's not the leap you make it out to be.
Ray tracing has been done on video hardware for quite a while. It still takes a pile of shader programming. These things are programmed using CUDA, which is really just another layer over top of a shader. The 200 parallel processors in a Tesla are really just a modified version of the X number of shaders on your video card. Yeah, the Tesla boxes are cool, but they're not a revolutionary change - people have been talking about GPU clusters for a long time.
The "clou
Games make no sense... (Score:3, Interesting)
Wanna know what playing games on a system like this would be like? Go to your favorite video streaming site and change the player settings (if you can) to 0 caching. The end result is, approximately, what you'd get here. The internet is a very unstable place. The only reason online games work is that programmers have gotten really good at developing latency-hiding tricks, which all stop working when the video rendering is done by the server. And don't think this will just affect FPS games. Just because it doesn't make or break a game like WoW doesn't mean you'd want the stuttering gameplay you'd have to put up with. As far as I can see, the only kind of game this would be useful for is photo-realistic checkers.
Re: (Score:2)
What about Dragon's Lair or Space Ace? Or how about all the "games" out there which are mostly noninteractive cut scenes?
Hmmm... I see the big game studios may be moving back to those "choose your own adventure [wikipedia.org]" video titles of the late 1990s... except in 3D!!!! Mwahahaha! (**cue cheesy villain-is-victorious music**)
Though I did think the Star Trek: Klingon one was a bit cool...
SGI?! (Score:1)