Graphics Software

Final Fantasy At 2.5FPS

Rikardon writes: "Adding a little fuel to the ATi-vs-NVIDIA fire started earlier today on Slashdot, NVIDIA and Square are showing a demo at SIGGRAPH of Final Fantasy: The Spirits Within being rendered in 'real time' (four-tenths of a second per frame) on a Quadro-based workstation. Now that I think of it, this should also inject new life into this debate." Defender2000 points to the Yahoo article. Update: 08/14 09:30 PM by T : Original headline was wrong: it said ".4FPS," but as cxreg pointed out, .4 frames per second isn't the same as .4 seconds per frame (.4 seconds per frame works out to 2.5 frames per second). Sorry.
  • by soboroff ( 91667 ) on Tuesday August 14, 2001 @04:25PM (#2116825)
    "It has long been an artist's dream to render CG animation in real-time," stated Kazuyuki Hashimoto, CTO at Square USA.
    We've been able to render CG animation in real time since Ivan Sutherland was a grad student. What makes it hard is a classic case of Parkinson's Law: your needs expand to fill the available processor power. When the movie companies and animation houses have more horsepower, they will go to the next level and push the state of the art in CG back out beyond what can be done in real time.

    The FF render times sound about the same as numbers I heard from Pixar about Toy Story. What was that post a couple weeks ago, about the machine you want always costing $5000? Well, the frame you want to render will always take 90 minutes.

  • Blinn's Law (Score:1, Informative)

    by Anonymous Coward on Tuesday August 14, 2001 @05:19PM (#2120694)
    It's called Blinn's Law (after Jim Blinn): the artist will increase the complexity of the scene to negate any speed improvement from upgraded hardware, so wait time stays constant.
  • by ThisIsNotATest ( 515208 ) on Tuesday August 14, 2001 @04:40PM (#2124677)
    While it is impressive to see the movie rendered in real time (with adjustable lighting sources, shadows, and reflections), it really doesn't look as good as the movie did. I'm at SIGGRAPH now (just saw the demo five minutes ago), and the interactive polygon rendering techniques just can't match the radiosity/raytracing used for professional movies. It's getting close, though!
  • Let's see... (Score:4, Informative)

    by swordboy ( 472941 ) on Tuesday August 14, 2001 @04:23PM (#2125189) Journal
    60 frames per second divided by .4 frames per second = 150. If we oversimplify and apply Moore's Law to the speed of 3D processors, that factor of 150 will halve every 18 months.

    As I see it, we are about 7-8 years away from this kind of rendering in real time.

    Thoughts? Comments? Complaints?
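
    A quick sketch of that extrapolation, spelled out in Python. It assumes, per the oversimplification above, that 3D hardware performance cleanly doubles every 18 months; the fps pairs are the ones debated back and forth in this thread:

        from math import log2

        DOUBLING_PERIOD_YEARS = 1.5  # "Moore's law," loosely applied to 3D hardware

        def years_until_realtime(current_fps, target_fps):
            # Number of doublings needed, times 18 months per doubling.
            speedup = target_fps / current_fps
            return log2(speedup) * DOUBLING_PERIOD_YEARS

        for current, target in [(0.4, 60), (0.4, 24), (2.5, 60), (2.5, 24)]:
            years = years_until_realtime(current, target)
            print(f"{current} fps -> {target} fps: ~{years:.1f} years")

        # Output:
        # 0.4 fps -> 60 fps: ~10.8 years
        # 0.4 fps -> 24 fps: ~8.9 years
        # 2.5 fps -> 60 fps: ~6.9 years
        # 2.5 fps -> 24 fps: ~4.9 years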
  • by misaka ( 13816 ) <[moc.xobop] [ta] [akasim]> on Tuesday August 14, 2001 @04:45PM (#2127260) Homepage
    It sure sounds nice when they write that they can render something that took 90 mins per frame at .4 seconds per frame, but is this really a fair comparison? I don't doubt that NVIDIA is bringing some wicked technologies to the table, but let's also consider:
    1. Size of rendered frames. What resolution was NVIDIA rendering at, maybe 640x480? 1024x768? FF was probably rendered out at 1880x1024 (about 2.4 times the pixels of 1024x768; see the quick sketch after this comment), if not more.
    2. How did they have to massage the data before passing it to the rendering pipeline? I hear FF was rendered with RenderMan ... are they claiming they can render RIB files through the Quadro chipset? If not, how much time does it take to convert/cook the data? If so, then ... wow
    3. How good did it look in the end? Were all the elements rendered properly, and does it really look anywhere near as good as the movie we saw in the theatre?
    Don't get me wrong, I'm excited to see this kind of technology coming; I can totally see it replacing, or at least complementing, our Linux render farm at some point in the future. But it sure would be nice if we had some useful technical details to qualify this 90 minutes versus .4 seconds render time comparison.

    --M
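
    To put item 1 into numbers, here is a trivial pixel-count sketch in Python. The resolutions are the guesses from the comment above, not confirmed figures:

        resolutions = {
            "640x480":   640 * 480,     #   307,200 pixels
            "1024x768":  1024 * 768,    #   786,432 pixels
            "1880x1024": 1880 * 1024,   # 1,925,120 pixels
        }
        base = resolutions["1024x768"]
        for name, pixels in resolutions.items():
            print(f"{name}: {pixels:>9,} pixels ({pixels / base:.2f}x vs 1024x768)")

        # 1880x1024 works out to ~2.45x the pixels of 1024x768.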

  • by zhensel ( 228891 ) on Tuesday August 14, 2001 @05:03PM (#2130342) Homepage Journal
    It doesn't actually take 90 minutes per frame to render; that's what it would take if the rendering were done on a single CPU. Square used a massive render farm, so each frame took a variable amount of time depending on the complexity of the rendered image and the fraction of the farm dedicated to that particular render operation. That's why you see things like "Final Fantasy took 1 million years to render" or whatever when you know it isn't exactly true. Look at Ars Technica, where they interviewed some people from Square about the rendering process. I think there was even a Slashdot article about it.

    And yes, it's a little ridiculous for NVIDIA to suggest that their card is 100,000 times faster than Square's rendering hardware for FF-TSW. But what's more ridiculous is that Yahoo took that statement and printed it in its article with no explanation of exactly what NVIDIA means by it.
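
    A toy illustration of why a single "90 minutes per frame" figure is slippery: per-frame CPU-time and farm wall-clock time are very different quantities. The farm size below is hypothetical, purely for illustration, not Square's actual node count:

        # Aggregate CPU-time vs. wall-clock time on a render farm.
        # All numbers here are illustrative, not Square's real figures.
        cpu_minutes_per_frame = 90      # the oft-quoted per-frame cost on one CPU
        farm_nodes = 1000               # hypothetical render farm size
        frames = 24 * 60 * 106          # a roughly 106-minute film at 24 fps

        total_cpu_hours = frames * cpu_minutes_per_frame / 60
        wall_clock_days = total_cpu_hours / farm_nodes / 24

        print(f"Total: {total_cpu_hours:,.0f} CPU-hours")                        # ~228,960
        print(f"Wall clock on {farm_nodes} nodes: ~{wall_clock_days:.1f} days")  # ~9.5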
  • Do the math... (Score:2, Informative)

    by tweakt ( 325224 ) on Tuesday August 14, 2001 @04:19PM (#2132239) Homepage
    .4FPS is NOT the same as "four-tenths of a second per frame." Which is it?
  • Re:Let's see... (Score:3, Informative)

    by jedwards ( 135260 ) on Tuesday August 14, 2001 @04:25PM (#2132942) Homepage Journal
    You only need 24fps for the cinema. Knocks a year or so off your estimate.
  • Re:So what? (Score:3, Informative)

    by donglekey ( 124433 ) on Tuesday August 14, 2001 @04:42PM (#2139388) Homepage
    You have obviously never worked in a production environment. Rendering isn't everything. Nothing is everything. No one will ever be saying that rendering is the only thing, because anyone with half a brain cell knows that it isn't, and anyone who has ever looked at CG knows that you are stating the obvious.

    Rendering fast is a big deal, though. Actually, it's a fucking big deal. The faster something can be rendered, the faster people can work, because the interactivity is there. Many 3D programs are introducing semi-real-time fully rendered previews over limited regions, like Softimage, 3DS, etc. Everyone realizes the extensive work that goes into a movie. Toy Story took around a month and a half to render; I don't think anyone believes a movie can be made in a month and a half, and it probably never will be. (A good movie, that is.) Fast rendering is what drives the animation industry, by allowing more interactivity, more complexity, and an ever more powerful toolset.

    I can't make a movie sitting here on my computer; I don't have the computing power for it. All of those other things keep me from the mecca of the one-man movie as well, but I could do them in theory. What I cannot overcome is the power it takes to render, and that takes computers, which likewise take money. So "yippee" is right: it is a big deal to render faster.

    Now, does this particular demo mean anything? Yes and no. GeForce 3s and Radeon 8500s won't mean anything to final rendering time for a while; that would take a lot of programming that hasn't been done yet. But interactivity is a huge deal, and it makes all the difference in the world to an artist who doesn't want to be constrained.
  • by RadagastTheMagician ( 469373 ) on Tuesday August 14, 2001 @04:59PM (#2140545) Homepage
    Lucasfilm's Sony camera, on which they filmed Episode II, and which was considered to completely supersede analog film, captures 1920x1080. You don't really need that much resolution to look fantastically better than what passes for film these days.
  • by Anonymous Coward on Tuesday August 14, 2001 @05:02PM (#2143017)
    http://www.nvidia.com/view.asp?IO=final_fantasy
    Notice that the picture of the girl is virtually textureless: no moles or skin blemishes like in the movie. Also, they say the "Technology Demo" runs fast while the movie took 90 minutes per frame; it does not say they rendered the actual movie at that speed.
  • Re:Let's see... (Score:3, Informative)

    by skroz ( 7870 ) on Tuesday August 14, 2001 @04:36PM (#2143858) Homepage
    Actually, at 2.5 frames a second, you'll only need about 5 years, give or take a few months.
  • by MarkoNo5 ( 139955 ) <<MarkovanDooren> <at> <gmail.com>> on Tuesday August 14, 2001 @05:07PM (#2144710)
    From http://movieweb.com/movie/toystory/toystory.txt [movieweb.com]

    "RESOLUTION:The resolution of a digital image refers to the number of pixels stored. For "Toy Story," the resolution is typically 1536 x 922 pixels."

    Marko No. 5
  • Give me a break. (Score:1, Informative)

    by rash ( 83406 ) on Tuesday August 14, 2001 @07:06PM (#2145756) Homepage
    1. Antialiasing
    2. Resolution
    3. Motion blur
    4. Catmull-Clark subdivision surfaces
    5. Color correction
    6. MASSIVE TEXTURE SIZES
    7. Extreme numbers of hairs
    8. Displacement shaders
    9. Texture shaders
    10. Light shaders
    11. Displacement of all scene geometry at once, i.e. every object has different xyz values for every vertex in every frame

    I seriously doubt that this DEMO does all that. Until you get PRMan to run on your gfx card, this kind of thing means shit. No system has enough bandwidth to play back a movie even in preview mode; you need to render it first.
  • by MtnMan1021 ( 47935 ) on Tuesday August 14, 2001 @05:20PM (#2155060)
    I'd be curious to see if directors/artists will ever be "satisfied" with the quality of CG. If CG becomes indiscernible from reality to the untrained eye, then as processor speed increases while render complexity remains constant, render times would finally drop. Given the teasers I've seen of FF, that level can't be too many years off. Do you think fully "realistic" CG is attainable?
  • Re:Hmm... (Score:1, Informative)

    by Judas96' ( 151194 ) on Tuesday August 14, 2001 @05:38PM (#2157048)
    Square has already said that Final Fantasy XI would come out on PS2, Xbox, PC, and possibly even the GameCube... http://gamespot.com/gamespot/stories/news/0,10870,2784927,00.html http://www.rpgamer.com/news/Q3-2001/071401f.html
  • by Dynedain ( 141758 ) <slashdot2 AT anthonymclin DOT com> on Wednesday August 15, 2001 @12:35AM (#2171122) Homepage
    I was at SIGGRAPH; they are using a 34" (widescreen-proportioned) plasma (HDTV?) display, with absolutely no visible pixelation. As for the speed difference, remember that the SGI cluster Square used was handling composite rendering, and as such the various "layers" (specular, shading, etc.) can easily be split up to significantly increase the speed. The NVIDIA solution doesn't break it up the same way, and simplifies a lot of it (the hair, for instance!). Lighting and radiosity are significantly downgraded from the original, her hair is definitely not at the same level as in the movie, and the bump-mapped cloth textures seem much more pronounced... but this is far beyond any previous demonstration of on-the-fly 3D rendering measured against a final movie-quality product.
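
    For context on those "layers": offline pipelines commonly render separate passes and recombine them in compositing, which is what lets a cluster split the work per pass. A minimal numpy sketch of the idea follows; the pass names and the simple recombination rule are illustrative, not Square's actual pipeline:

        import numpy as np

        # Toy render passes for one frame (height x width x RGB), values in [0, 1].
        # A real renderer would produce these; random arrays stand in here.
        h, w = 4, 4
        rng = np.random.default_rng(0)
        diffuse  = rng.random((h, w, 3))   # surface color under direct light
        specular = rng.random((h, w, 3))   # highlights, rendered as their own pass
        shadow   = rng.random((h, w, 1))   # occlusion: 0 = fully shadowed, 1 = lit

        # A common (much simplified) recombination: shadow attenuates the diffuse
        # pass, specular is added on top, and the result is clamped for display.
        beauty = np.clip(diffuse * shadow + specular, 0.0, 1.0)
        print(beauty.shape)  # (4, 4, 3)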
