Nvidia's RealityServer 3.0 Demonstrated

robotsrule writes "As we discussed last month, RealityServer 3.0 is Nvidia's attempt to bring photo-realistic 3D images to any Internet-connected device, including the likes of Android phones and the iPhone. RealityServer 3.0 pushes the CPU-killing 3D rendering process to a high-powered, GPU-based back-end server farm built on Nvidia's Tesla or Quadro hardware. The resulting images are then streamed back to the client device in seconds; such images would normally take hours to compute even on a high-end, unassisted workstation. ExtremeTech has an article up containing an interview with product managers from Nvidia and Mental Images, whose iray renderer is used in a two-minute video demonstration of near-real-time ray-traced rendering." Once you get to the ExtremeTech site, going to the printable version will help to preserve sanity.
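For readers wondering what "pushing the rendering to a server" looks like from the client's side, here is a minimal sketch of the round trip the summary describes: ship scene and camera parameters to a remote GPU farm, then receive the finished frame streamed back. The endpoint, field names, and response format below are invented for illustration and are not the actual RealityServer web-services API.

    # Hypothetical client for the round trip described above. The URL, JSON
    # shape, and parameter names are placeholders, NOT the RealityServer API.
    import json
    import urllib.request

    RENDER_URL = "http://render-farm.example.com/render"  # placeholder endpoint

    def request_frame(scene_id, camera_pos, camera_dir, width=640, height=360):
        """Ask the remote GPU farm to ray-trace one frame and return the image bytes."""
        payload = json.dumps({
            "scene": scene_id,  # scene data assumed to be resident on the server
            "camera": {"position": camera_pos, "direction": camera_dir},
            "resolution": [width, height],
        }).encode("utf-8")
        req = urllib.request.Request(
            RENDER_URL, data=payload, headers={"Content-Type": "application/json"}
        )
        with urllib.request.urlopen(req, timeout=30) as resp:
            return resp.read()  # e.g. JPEG bytes streamed back by the farm

    if __name__ == "__main__":
        frame = request_frame("office_demo", [0, 1.7, 5], [0, 0, -1])
        with open("frame.jpg", "wb") as f:
            f.write(frame)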
This discussion has been archived. No new comments can be posted.


  • Re:Hours and hours (Score:3, Interesting)

    by Idiomatick ( 976696 ) on Monday November 16, 2009 @01:49AM (#30112342)
    Fine... but whatever you did back in MS-DOS likely isn't a fraction as complex. And with it being done on someone else's servers, there's no need to hold back on complexity... I'm thinking rendering a bird's-eye shot in LOTR would have taken a damn long time on a phone...

    BTW, it took Weta 4 hours per frame to render... likely not on a cellphone.
  • Re:Hours and hours (Score:3, Interesting)

    by MichaelSmith ( 789609 ) on Monday November 16, 2009 @01:56AM (#30112378) Homepage Journal

    Maybe if you are trying to render an MMO, a single render farm can do less work in total than all the clients rendering from their own POV.

  • Re:Hours and hours (Score:1, Interesting)

    by Anonymous Coward on Monday November 16, 2009 @02:35AM (#30112536)

    You should have tried rendering something other than the simple POV-Ray sphere tutorials. I used to use POV-Ray under MS-DOS, and some of my more complex scenes (e.g. a model of the solar system with space stations and starships) took weeks to months to render on a 486DX2-66. Take one of those scenes, multiply the detail and polygon count by a factor of 100, then have it render at a minimum of 60 frames per second.

    So yeah, you're off by quite a bit there.
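    A rough back-of-envelope of that gap, taking "weeks to months" as about two weeks per frame and assuming cost scales linearly with the 100x bump in complexity (both just placeholder numbers):

        # Back-of-envelope only; every input is an assumption stated above.
        seconds_per_frame_486 = 14 * 24 * 3600   # "weeks to months" taken as ~2 weeks
        complexity_factor = 100                  # "multiply ... by a factor of 100"
        target_seconds_per_frame = 1.0 / 60      # 60 frames per second

        required_speedup = (seconds_per_frame_486 * complexity_factor
                            / target_seconds_per_frame)
        print(f"Speedup needed over the 486DX2-66: ~{required_speedup:.1e}x")
        # Comes out to roughly 7e9x, i.e. billions of times faster than the old setup.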

  • Good for VR (Score:4, Interesting)

    by cowtamer ( 311087 ) on Monday November 16, 2009 @03:12AM (#30112682) Journal

    This is a great advancement for high-end virtual reality systems, but the current state of "rendering in the cloud" sounds like either a solution looking for a problem or the wrong application of the technology.

    On a future Internet with sub-30 ms latency, this would ROCK. [You could have low-powered wearable augmented-reality devices, "Rainbows End"-style gaming, and maybe even the engine behind a Snow Crash-style metaverse that remote users can log in to.]

    Nvidia is NOT doing itself a favor with the lame empty-office-with-boring-blinds demo. They'd better come up with something sexier quickly if they want to sell this (and I don't mean the remote avatar someone posted a link to).

    This reminds me of the "thin client" hype circa 1999. "Thin clients" exist now in the form of AJAX-enabled web browsers, netbooks, phones, etc., but that technology took about a decade to come to fruition and found a different (and more limited) niche than all the hype of a decade ago [they were supposed to replace workers' PCs for word processing, spreadsheets, etc.].

  • Re:Hours and hours (Score:3, Interesting)

    by Artraze ( 600366 ) on Monday November 16, 2009 @04:15AM (#30112962)

    Did your computer have an FPU? Your cellphone doesn't, so despite its 200+ MHz(?) clock, you'll be lucky to get much past 10 MFLOP/s, especially since the library code may often miss the cache (which is pretty limited on ARM). Also, POV scenes frequently use parametric surfaces rather than meshes, making the calculations easier and much less memory-intensive than the high-poly meshes used in the demo scenes.

    So maybe a month is a bit long, but I don't really think it'd be able to do much better than a week per frame (especially since the only way it could store the data, mesh and textures, would be an SD card, which ain't quick either).
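    To see how a week-per-frame guess could fall out of that ~10 MFLOP/s figure, here's a toy estimate; every input is an assumption made up for illustration, not a measurement:

        # Toy frame-time estimate; all inputs below are invented placeholders.
        pixels         = 1_000_000   # ~1 MP frame
        rays_per_pixel = 50          # anti-aliasing plus reflection/refraction bounces
        tests_per_ray  = 2_000       # intersection tests against a dense, high-poly scene
        flops_per_test = 50          # ray/triangle math plus shading overhead
        device_flops   = 10e6        # ~10 MFLOP/s without an FPU

        total_flops = pixels * rays_per_pixel * tests_per_ray * flops_per_test
        seconds = total_flops / device_flops
        print(f"~{seconds / 86_400:.1f} days per frame")  # lands around 5-6 days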

  • by nateb ( 59324 ) on Monday November 16, 2009 @04:43AM (#30113098)
    One word: iPhone app.

    Imagine Street View rendered in the direction you are holding your phone, from your position. With all the goodies from that 3D map someone was building a while back (which could well be ongoing), plus a live application of the algorithm from Canoma and similar applications, you could have a pretty interesting "virtual" world. Another benefit: while using the application, you could be aiding the mapping backend with live GPS data to refine the map and the 3D model on top of it.
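    Purely hypothetical, but something like the payload such an app might ship to the render backend, with position from GPS and view direction from the compass (field names, endpoint, and behaviour are invented for illustration, not any real API):

        import json

        def build_view_request(lat, lon, alt_m, heading_deg, pitch_deg, fov_deg=60):
            # Bundle the phone's pose into a render request for the backend.
            # All field names here are made up for the sake of the example.
            return json.dumps({
                "position": {"lat": lat, "lon": lon, "alt": alt_m},
                "view": {"heading": heading_deg, "pitch": pitch_deg, "fov": fov_deg},
                "want": "rendered_street_view",
            })

        print(build_view_request(37.7793, -122.4193, 16.0, heading_deg=270, pitch_deg=5))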

  • Comment removed (Score:3, Interesting)

    by account_deleted ( 4530225 ) on Monday November 16, 2009 @05:48AM (#30113368)
    Comment removed based on user account deletion
  • Re:Hours and hours (Score:3, Interesting)

    by poetmatt ( 793785 ) on Monday November 16, 2009 @12:07PM (#30116264) Journal

    Bandwidth issues will continue even into 4G and beyond, so it has uses, just not mobile ones. I agree there may be some PC use; this general idea was not unexpected. With graphics support being added to mainstream virtualization, this is somewhat of a next step.
