Graphics Software

Final Fantasy At 2.5FPS

Rikardon writes: "Adding a little fuel to the ATi-vs-NVIDIA fire started earlier today on Slashdot, NVIDIA and Square are showing a demo at SIGGRAPH of Final Fantasy: The Spirits Within being rendered in 'real time' (four-tenths of a second per frame) on a Quadro-based workstation. Now that I think of it, this should also inject new life into this debate." Defender2000 points to the Yahoo article. Update: 08/14 09:30 PM by T: Original headline was wrong, said ".4FPS", but as cxreg pointed out, .4 frames per second isn't the same as .4 seconds per frame. Sorry.
  • by Pop n' Fresh ( 411094 ) on Tuesday August 14, 2001 @04:24PM (#2112275)
    Rendering a movie at 320 x 240 or 640 x 480 is much easier than rendering it at the resolution and size of a movie theater's screen. If the Quadro was rendering the movie at 100 x 75 pixels, all this doesn't mean much.
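
    To put rough numbers on that point, here is a back-of-the-envelope sketch, assuming render cost scales roughly linearly with pixel count; the demo's actual resolution wasn't stated, so every resolution below is an illustrative guess:

        # Relative per-frame work if render cost scales ~linearly with
        # pixel count. All resolutions here are illustrative assumptions.
        resolutions = {
            "100 x 75 (tiny preview)": (100, 75),
            "640 x 480": (640, 480),
            "2048 x 1536 (film-ish)": (2048, 1536),
        }

        base = 100 * 75  # pixels in the smallest frame
        for name, (w, h) in resolutions.items():
            pixels = w * h
            print(f"{name:24s} {pixels:>9,} px  ~{pixels / base:6.1f}x the work")

    At these assumed sizes, a film-ish frame is roughly 400 times the fill work of the 100 x 75 preview, which is why the unstated resolution matters so much.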
  • by donglekey ( 124433 ) on Tuesday August 14, 2001 @05:05PM (#2118746) Homepage
    Yep, very true. Also, I think a lot of people aren't considering that even though frames might have taken 90 min. on an SGI, the entire frame is not rendered all at once. I don't know for sure if the 90 min. refers to the entire frame, but I doubt it. There are layers upon layers: backgrounds, main characters, the ghost alien phantom things, shadow passes, reflection passes, caustic passes (in rare cases), and on and on.
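
    For illustration, a minimal sketch of that multi-pass idea; the pass names and the simple combine below are assumptions for the example, not Square's actual pipeline:

        import numpy as np

        H, W = 480, 640  # illustrative frame size

        # Each pass is rendered separately (random stand-ins here).
        beauty     = np.random.rand(H, W, 3)  # base character/background layer
        reflection = np.random.rand(H, W, 3)  # reflection pass
        shadow     = np.random.rand(H, W)     # shadow pass, 0 = fully shadowed

        # A toy combine: darken by the shadow pass, then add reflections.
        # Real compositing uses many more layers plus per-layer mattes.
        frame = np.clip(beauty * shadow[..., None] + 0.3 * reflection, 0.0, 1.0)
        print(frame.shape)  # (480, 640, 3) -- one composited frame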
  • Re:It's amazing... (Score:3, Insightful)

    by TBone ( 5692 ) on Tuesday August 14, 2001 @04:39PM (#2132714) Homepage
    Not only that, but holy bejeezus, there isn't a single link to the pertinent information in the submitter's italicized text. Timothy had to pull the story link out of some other submission. Come on, people: I don't care about your freaking thread on Slashdot in the last 8 articles that mentioned NVIDIA or SIGGRAPH or Squaresoft, I want to see the story.
  • by Rothron the Wise ( 171030 ) on Tuesday August 14, 2001 @05:19PM (#2133070)
    1. The size of the rendered frames probably doesn't matter at all. A scene from the FF movie is most likely bound by polygon throughput and texture memory, which has to be swapped in and out (a real performance killer, that one). You'd probably get roughly the same performance at 640x480 as at 1600x1200, antialiasing or no antialiasing (rough numbers below, after this comment).

    2. RenderMan shader code implemented using pixel shaders? Hah, surely not in current hardware, and I doubt we'll see it for a few years at least; by then RenderMan will have moved on.

    3. Of course, the lighting model is a lot more primitive in the real-time version, and the card can't do all the nifty post-processing done in the movie.

    All the macho marketing crap from Square and NVIDIA aside, this shows that graphics cards are able to give a pretty darn good preview of the finished frame in a very short time, which will be very valuable to animators when lighting and compositing scenes.
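
    A rough sanity check on point 1, as promised. Every hardware figure below is an assumed 2001-era ballpark, not a measured Quadro spec; the point is only the shape of the comparison:

        # When is a frame geometry-bound rather than fill-bound?
        fill_rate  = 1.0e9   # pixels/second the card can shade (assumed)
        tri_rate   = 30.0e6  # triangles/second it can transform (assumed)
        scene_tris = 5.0e6   # triangles in a film-quality scene (assumed)

        for w, h in [(640, 480), (1600, 1200)]:
            t_fill = (w * h) / fill_rate
            t_geom = scene_tris / tri_rate
            print(f"{w}x{h}: fill {t_fill*1e3:6.2f} ms, geometry {t_geom*1e3:6.2f} ms")

    With numbers like these, geometry time (~167 ms) dwarfs fill time (under 2 ms) at either resolution, so changing resolution barely moves the frame rate.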
  • Apples to Oranges? (Score:5, Insightful)

    by All Dat ( 180680 ) on Tuesday August 14, 2001 @04:37PM (#2143859) Homepage
    Notice how the official "press release" doesn't state the resolution it was rendered at? What's the movie's resolution? Several thousand by several thousand, I imagine. Does doing it at 640x480 or lower mean the same thing? I have a hard time believing that a Quadro setup can render in 0.4 of a second what their SGI setup takes 90 minutes to do. If NVIDIA really were ~13,500 times faster at this (5400 seconds vs. 0.4 seconds) using a Quadro setup, wouldn't we have heard of this before? Something's missing here, methinks.
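
    The speedup arithmetic behind that skepticism, spelled out, using only the figures quoted in the story:

        # The claimed speedup: 90 minutes per frame on the SGI setup
        # versus four-tenths of a second on the Quadro.
        sgi_seconds    = 90 * 60  # 5400 s per frame
        quadro_seconds = 0.4

        print(sgi_seconds / quadro_seconds)  # 13500.0 -> a ~13,500x claim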
  • by lordpixel ( 22352 ) on Tuesday August 14, 2001 @05:44PM (#2157114) Homepage
    Hmm, I seem to remember reading a very convincing analysis which said about 3000x2000 was what was required to match standard 35mm film *well projected* in an average-size theatre (a rough version of that calculation follows this comment).

    I don't have a link, but it was all to do with the physical optics of the eye and the point at which the eye can't tell the difference between one dot and two dots when projected onto the opposite wall. Sooner or later you just don't have enough retinal cells to be able to see any more detail.

    My fear is that by pushing this through a couple of years too early at this slightly lower resolution we'll see a net loss of quality. If the switch to digital were to happen in 5 years' time, theatres' projectors and studios' cameras would be more likely to be 3000x2000 equipment.

    If the public accepts the lower resolution, why spend the money on upgrading?

    That said, I saw Akira digitally projected this year on a huge screen (of course, it was originally film, not digital tape) and it was beautiful.

    Of course, given that most movies most of us see are projected using dirty equipment by an untrained 16-year-old at a multiplex, it probably doesn't make any difference. The current resolution is probably good enough. A bit like DVD and HDTV.
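
    For reference, a rough version of the retinal-limit calculation mentioned above. The screen width, seat distance, and the one-arcminute acuity figure are all assumptions for the example:

        import math

        # The eye resolves roughly one arcminute; past that, extra pixels
        # are invisible. Screen and seat figures below are assumed.
        acuity_arcmin = 1.0   # resolvable angle per dot
        screen_width  = 12.0  # metres (assumed theatre screen)
        distance      = 12.0  # metres (assumed mid-theatre seat)

        angle_deg = 2 * math.degrees(math.atan((screen_width / 2) / distance))
        pixels    = angle_deg * 60 / acuity_arcmin
        print(f"screen spans {angle_deg:.1f} deg -> ~{pixels:,.0f} px across")
        # ~53 deg -> ~3,188 px across: the same ballpark as 3000x2000.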
  • by dillon_rinker ( 17944 ) on Tuesday August 14, 2001 @08:35PM (#2170438) Homepage
    I'd submit that the ultimate limit of real-time rendering will be when the onscreen characters are able to pass a sort of Turing test: are they human or computer-generated actors? When the audience can't decide (i.e., when the vote is split), the point of diminishing returns will have been reached. Further effort beyond that point will be devoted to better physics and more realistic modeling of human behavior; it doesn't matter if you have a perfect rendering of a human face if the eyes never smile when the mouth does.
  • Re:Lets see... (Score:2, Insightful)

    by benb ( 100570 ) on Tuesday August 14, 2001 @10:14PM (#2170696) Homepage Journal
    > 60 frames per second divided by .4 (frames per
    > second) = 150

    Not .4 frames per second, but "four-tenths of a second per frame", i.e. the other way around, which is the 2.5fps you can see in the title.

    If you double that about 3.5 times, you get 30 fps. Assuming performance keeps doubling every 9 months, that's ~3.5 * 9 ≈ 32 months, i.e. roughly 2.5-3 years until we have it.
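
    That estimate, spelled out; the 9-month doubling period is the comment's assumption, not a measured figure:

        import math

        # Doublings needed to go from 2.5 fps to 30 fps, at one assumed
        # doubling of graphics performance every 9 months.
        doublings = math.log2(30 / 2.5)  # ~3.58
        months    = doublings * 9
        print(f"{doublings:.2f} doublings -> ~{months:.0f} months"
              f" (~{months / 12:.1f} years)")
        # 3.58 doublings -> ~32 months (~2.7 years)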
