Final Fantasy At 2.5FPS
Rikardon writes: "Adding a little fuel to the ATi-vs-NVIDIA fire started earlier today on Slashdot, NVIDIA and Square are showing a demo at SIGGRAPH of Final Fantasy: The Spirits Within being rendered in 'real time' (four-tenths of a second per frame) on a Quadro-based workstation. Now that I think of it, this should also inject new life into this debate." Defender2000 points to the Yahoo article. Update: 08/14 09:30 PM by T: Original headline was wrong; it said ".4FPS", but as cxreg pointed out, .4 frames per second isn't .4 seconds per frame. Sorry.
At what resolution? (Score:2, Insightful)
oh my bad memory and deleted old stories (Score:1)
1. They actually were able to do it, this quickly.
2. It was said that NVIDIA had no chance of ever doing something like this, since theirs weren't serious graphics cards.
This comes from an article several months ago on Maximum PC [maximumpc.com] (sorry, I checked for the link but it's not to be found; if a staff member from there reads this and can find it, please post it). It covered a flame war between someone from NVIDIA and one of the guys from SGI (I might be wrong about SGI, and I forget the names; it's not that I want to leave them out, I just don't remember). The SGI guy said NVIDIA was off its rocker for claiming its cards could ever come near the level of performance needed for something like Toy Story, and that it would take many years, though the article is less than a year old.
Well, I guess the guy from SGI is eating crow now: in less than a year the NVIDIA cards are doing what he said wouldn't happen until Bill Gates becomes a Linux lover, and I don't think Bill has really embraced the penguin quite yet.
It was just an interesting side note to this story.
Re:oh my bad memory and deleted old stories (Score:4, Interesting)
0.04166 SPF (Score:1)
Re:0.04166 SPF (Score:2, Interesting)
Re: (Score:2, Funny)
So what? (Score:3, Offtopic)
So, yippee, it can render fast... too bad that has NO BEARING on the actual quality of the production (with the possible exception that the team gets to iterate on the work a little more).
Re:So what? (Score:3, Informative)
Rendering fast is a big deal, though. Actually, it's a fucking big deal. The faster something can be rendered, the faster people can work, because the interactivity is there. Many 3D programs, like Softimage and 3DS, are introducing semi-real-time fully rendered previews over limited regions. Everyone realizes the extensive work that goes into a movie. Toy Story took around a month and a half to render; I don't think anyone believes a movie can be made in a month and a half, and it probably never will be. (A good movie, that is.) Fast rendering is what drives the animation industry, by allowing more interactivity, more complexity, and an ever more powerful toolset.
I can't make a movie sitting here on my computer. I don't have the computing power for it. All of those other things keep me from the mecca of the one-man movie as well, but I could do them in theory. What I cannot overcome is the power it takes to render, and that takes computers, which likewise take money. So "yippee" is right; it is a big deal to render faster.
Now, does this particular demo mean anything? Yes and no. GeForce 3s and Radeon 8500s won't mean anything to final rendering time for a while; that would take a lot of programming that hasn't been done yet. But interactivity is a huge deal, and it makes all the difference in the world to an artist who doesn't want to be constrained.
Re:So what? (Score:1)
Rocky (1976) was shot in 28 days. Granted, that doesn't include editing and other post-production, but it's possible (probable?) the final product was finished in around 1.5 months. And yes, I think it's a good movie. =P
Re:So what? (Score:2)
Re:So what? (Score:5, Funny)
Re:So what? (Score:2)
Yeah, well, it doesn't take long to string clichés together....
-j
You actually saw the movie? (Score:1)
I hope you at least snuck into the theater?
If not, demand your money back.
What's the deal with panning FF's writing? (Score:2)
Final Fantasy was anime. Since when have we expected anime films to have good scripts? OK, The Matrix had a pretty good script. Apart from that.
Re:What's the deal with panning FF's writing? (Score:1)
Hell, even Digi Charat was more internally consistent!
Re:So what? (Score:2)
Once the hard work is done... (Score:2)
what does realtime rendering give us?
1. "What now, master?"
2. "Now turn around, bend down and touch your toes!"
But the big question is (Score:5, Funny)
Re:But the big question is (Score:3, Funny)
The what?
-j
Rendering in real-time won't happen... (Score:5, Informative)
The FF render times sound about the same as numbers I heard from Pixar about Toy Story. What was that post a couple weeks ago, about the machine you want always costing $5000? Well, the frame you want to render will always take 90 minutes.
Blinn's Law (Score:1, Informative)
Re:Rendering in real-time won't happen... (Score:1)
For audio, MP3 uses the quirks in our hearing systems to filter out useless or less important data; similarly, how often do we stare at what's in our peripheral vision?
Of course, this would be good for the general population, and not CG artists and fans of their work.
~poloco
Re:Rendering in real-time won't happen... (Score:2)
I can just see the wave of "real-time rendering" promos.
Sure... we can render this at 30fps - it's a polar bear in a snow storm. Or there's our other demo... a story of one man's view of the world around him... oh, did we mention the man is blind, so the screen stays black the whole time?
Or is that lossy-type algorithms applied to human intelligence?
I suppose one could do things like "This part of the scene will be blurred in post - render it in low-res" kinda optimizations, if they are not done already.
Re:Rendering in real-time won't happen... (Score:2, Informative)
Re:Rendering in real-time won't happen... (Score:3, Insightful)
digital film specs (Score:1)
So now which is smaller...? (Score:1)
Re:So now which is smaller...? (Score:1)
Finally some screenshots (Score:5, Interesting)
The article (on Yahoo) is pretty exaggerated and sensationalistic, but the images are still very impressive, even if they are about what you would expect at 2.5 FPS with such a powerful card. I think it is a pretty good indication of what the next generation of console games (after GameCube and Xbox) will look like.
Ask the RenderMan users, not PC card marketing (Score:1)
I still read newsgroups :) and while there's not much about SIGGRAPH 2001 on c.g.r.renderman at the moment, there's some stuff about the GS Cube. If the RenderMan users bother about this FF demo at all, I suspect they won't be impressed. Vermifax's /. post (#44), Score 5 Interesting [slashdot.org], makes the point well, plus it has the Tom Duff quote.
It doesn't look as good as the movie (Score:5, Informative)
Re:It doesn't look as good as the movie (Score:2)
Also, at this year's RenderMan course it was mentioned that hair (particularly Aki's) was the major bottleneck, enough that "upwards of 80 percent of Aki's render time could be spent rendering just her hair." Keep that in mind when you consider the render time for a character-oriented scene like this one.
Lets see... (Score:4, Informative)
As I see it, we are about 7 - 8 years away from this kind of rendering in real time.
Thoughts? Comments? Complaints?
Re:Lets see... (Score:1)
Moore's Law says absolutely NOTHING about performance (let alone "double speed in 18 months").
All Moore's Law states is that chip COMPLEXITY (*NOT* performance) doubles every 18-24 months.
Re:Lets see... (Score:3, Informative)
Re:24 fps computer != 24 fps film (Re:Lets see...) (Score:2)
Re:Lets see... (Score:3, Informative)
Re:Lets see... (Score:2, Insightful)
> second) = 150
Not
If you double that 3.5 times, you have 30 fps, i.e. it will be ~3.5*9 months = ~2.5 years until we have it.
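For what it's worth, the doubling arithmetic above checks out; here's a small Python sketch (the 9-months-per-doubling pace is the parent's assumption, 18 months is the more traditional Moore's Law figure):

```python
# How many performance doublings to get from the demo's 2.5 fps to ~30 fps?
import math

current_fps = 2.5
target_fps = 30.0

doublings = math.log2(target_fps / current_fps)     # ~3.58 doublings
print(f"doublings needed: {doublings:.2f}")

for months_per_doubling in (9, 18):                  # 9 is the parent's assumption
    years = doublings * months_per_doubling / 12
    print(f"at {months_per_doubling} months per doubling: ~{years:.1f} years")
# ~2.7 years at 9 months/doubling, ~5.4 years at 18 months/doubling
```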
A few factors to consider ... (Score:5, Informative)
--M
Re:A few factors to consider ... (Score:1)
Re:A few factors to consider ... (Score:3, Insightful)
2. Renderman shader code implemented using pixel shaders? Hah, surely not in current hardware, and I doubt we'll see it for a few years at least, and by then Renderman will have moved on.
3. Of course, the lighting model is a lot more primitive in the real-time version, and the card can't do all the nifty post-processing done in the movie.
All the macho marketing crap from Square and NVidia aside, this shows that graphics cards are able to give a pretty darn good preview of the finished frame in a very short time, which will be very valuable to animators when compositing and lighting scenes etc.
THEIR original headline was wrong, not mine (Score:1)
My suggested headline was "FF:TSW rendered in real time on NVIDIA Quadro."
I said nothing in the headline about
Just to clear that up.
I don't know about you... (Score:1)
. . . I kind of think
Just kidding, of course. :-)
a few more years... (Score:2)
The user could interact with the movie and affect the animation in real time. Or, to put that in perspective, imagine fragging your office mates in a photo-realistic Quake VIII. :)
Re:a few more years... (Score:1)
I can rephrase "DVD quality" as "looks as good as DVD", if it will help you understand. I'm sure everyone knew what I meant.
Do the math... (Score:2, Informative)
Re:Do the math... (Score:2)
Re:Do the math... (Score:2)
"Quadro" vs. "GeForce" (Score:3, Interesting)
As I pointed out previously [slashdot.org], NVidia's "Quadro" and "GeForce" lines are actually the same hardware. GeForce 2 boards can be "converted" to Quadro 2 boards with a jumper. [geocities.com]
The GeForce 3 and "Quadro DCC" boards both use the NVidia NV20 chip, have the same driver, and appear to be very similar if not identical. It's hard to find differences in the feature set. Only ELSA (which is basically a unit of NVidia) sells the Quadro DCC, and apparently only through 3DS Max dealers, along with a special 3DS MAX driver. It's more of a private label than a real product line at this point.
Rendered Goosebumps (Score:2)
Ok, quick Geek Test: If, upon reading this news post (despite the ditzy title), you did not instantly gasp, shiver, or become aroused, you are NOT a geek. Period.
Which sort of answers my question to my friend after we watched FF for the first time. Is this the top of our abilities in CG? Or was it a matter of the producers saying, "Um...no. We can do a LOT better, but we'd have to wait 100 years for it to build/animate/render instead of 2, so we cut it down to size."
If that is the case, then it's just a matter of BBF (Bigger, Better, Faster (tm)) in terms of hardware before we see something twice as good as FF. Otherwise, if this is the height of skill we have, then we're talking development of new technologies and methods of doing this sort of detail before we see something else come out.
I'm no graphics expert, so maybe someone can answer that question for me. At any rate, the movie still made me shiver. Now I can watch it on my desktop...at 2.5fps, not
at the same resolution (Score:2, Interesting)
Hmm... (Score:2, Interesting)
- Square has tie-ins to Sony (exclusivity clause of Final Fantasies on the PS1, rights to publish the movie).
- Microsoft has tie-ins to nVidia (nVidia makes some of the chips for the XBox).
-Square now has tie-ins to nVidia with this demonstration.
Does this mean that more Square games will get ported to the nVidia chipsets, most notably Final Fantasy for the XBox? If I had a choice between the relative hardwares (rather than my PC, which would come first) I'd love to see what Square could do with an nVidia chipset.
Resolution, details, etc. (Score:2)
Also, 2.5 FPS isn't "real time". 24 fps film is "real time". 30 fps on video is "real time".
HOWEVER, this would be incredibly useful for generating dailies, spot render checks, web-based trailers and streaming video, television-quality animation, etc.
Now you can PROVE to a director that a plot sucks, even in final form, and no, all the whiz-bang graphics don't help!
Apples to Oranges? (Score:5, Insightful)
Re:Apples to Oranges? (Score:3, Insightful)
Re:Apples to Oranges? (Score:3, Informative)
And yes, it's a little ridiculous for NVidia to suggest that their card is 100k times faster than Square's rendering hardware for FF:TSW. But what's more ridiculous is that Yahoo took that statement and printed it in its article with no explanation of exactly what NVidia means when they say that.
Re:Apples to Oranges? (Score:3, Informative)
Excuse me? (Score:2)
Re:Apples to Oranges? (Score:3, Insightful)
I don't have a link, but it was all to do with the physical optics of the eye and the point at which the eye can't tell the difference between one dot and two dots when projected onto the opposite wall. Sooner or later you just don't have enough retinal cells to be able to see any more detail.
My fear is that by pushing this through a couple of years too early at this slightly lower resolution, we'll see a net loss of quality. If the switch to digital were to happen in five years' time, then theatres' projectors and studios' cameras would be more likely to be 3000x2000 equipment.
If the public accepts the lower resolution, why spend the money on upgrading?
That said I saw Akira digitally projected this year on a huge screen (of course, it was originally film, not digital tape) and it was beautiful.
Of course, given that most movies most of us see are projected using dirty equipment by an untrained 16-year-old at a multiplex, it probably doesn't make any difference. The current resolution is probably good enough. A bit like DVD and HDTV.
Re:Apples to Oranges? (Score:2, Interesting)
I know that no one is broadcasting or releasing at 1080i resolution yet, but it's only a matter of time. DVD has allowances for this, as do some of the new tubes coming out of Sony. Even my 19-inch monitor sitting here on my desk does 1920x1080.
Scott
PS. DivX encoded 1920x1080 Lightwave rendered animations look sweeeeeeeet...
Re:Apples to Oranges? (Score:1)
Re:Apples to Oranges? (Score:2, Interesting)
Re:Apples to Oranges? (Score:2)
Re:Apples to Oranges? (Score:1)
Re:Apples to Oranges? (Score:2)
Sorry, but no (Score:3, Interesting)
From IMDB:
Final Fantasy was shot at 1.85:1.
Anyway, movie aspect ratios have varied ever since the advent of TV. Movies were originally all shot at 1.33:1, and when TV was popularized, it used that aspect ratio. The movie industry panicked that TV would steal all its customers, so it came up with all sorts of names for new and exciting aspect ratios like "Panavision" and "CinemaScope". It had nothing to do with technical matters like shooting on 35 millimeter film (and not all films are shot on 35 millimeter, BTW), and everything to do with marketing. Because different companies used different systems, the aspect ratios varied wildly by film. Today, the aspect ratio is a choice of the director. 1.85:1 is the most common, but not the only one by any means, and is mostly used for movies where the look of the film is secondary. Special effects movies usually use something wider. Here [geocities.com] is some more info.
Re:Apples to Oranges? (Score:2)
16:9 and other wide aspect ratios were created by the movie industry to differentiate itself from traditional broadcast media in an attempt to drive people to movie theaters.
Re:Apples to Oranges? (Score:2)
The first is called a matte, whereby the top and bottom of the film frame are simply not used.
The second method involves using an anamorphic lens, which stretches the image vertically to fill the full frame. In the theaters, another lens is used to do the reverse.
There is actually a third method, used by a few high-profile directors, involving a lens that projects onto the sound strip area of the frame in conjunction with a matte, but I won't go into that.
Anyhow, a great site which explains most of this stuff is http://www.hometheaterforum.com/home/wsfaq.html [hometheaterforum.com].
Re:Apples to Oranges? (Score:5, Informative)
"RESOLUTION:The resolution of a digital image refers to the number of pixels stored. For "Toy Story," the resolution is typically 1536 x 922 pixels."
Marko No. 5
Re:Apples to Oranges? (Score:2)
You have a point, but it's not as crazy as you think. Although SGIs have 3D acceleration (as our gaming machines do), this is used for the actual modelling. The render farms don't use the video card to render (they can't, for multiple reasons). Consider this: try playing Quake 3 on a 1GHz Athlon in software mode (if you can). It looks like crap and runs sloooow. Put a GeForce 3 in it, and you get a 100x speed boost (an ambiguous number to make a point). This is because the rendering gets done in the hardware.
The only problem is that when you move to hardware acceleration, you can't use the super-complex rendering engine that was used to render the movie. Therefore, the visual quality can't possibly be as good.
One final example: go buy a mega PRO 3D card that accelerates the modeling in 3DS Max. Ask your vendor if the card will also speed up the rendering. They will tell you, as I have: the rendering can't use hardware acceleration! Therefore, the machine with the GF3 has a HUGE advantage, albeit at a visual quality loss.
Re:Apples to Oranges? (Score:2, Informative)
It's all about image quality... (Score:2)
Until their chip can produce a single frame that matches the image quality, they're still just making toys for quake fiends. Diffraction, interference, antialiasing...just a few of the photorealistic rendering staples, and nvidia has only recently been able to do antialiasing. They've got a long, long way to go before we're going to see actual movies rendered using their hardware.
.4FPS IS NOT 4/10s of a second per frame!! (Score:2, Redundant)
4/10s of a frame per second means you can do just over 2 frames per second.
God damn. People go to college and come out knowing this much about math?
Re:.4FPS IS NOT 4/10s of a second per frame!! (Score:1)
Re:.4FPS IS NOT 4/10s of a second per frame!! (Score:2, Redundant)
Wouldn't it be 4/10s of a second per frame means you can do just over 2 frames per second?
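Since the units keep tripping people up: the two are just reciprocals of each other. A trivial Python sketch:

```python
# Seconds-per-frame and frames-per-second are reciprocals.
seconds_per_frame = 0.4
print(f"{seconds_per_frame} s/frame = {1 / seconds_per_frame} frames/s")  # 2.5 fps

fps_headline = 0.4                                   # the mistaken ".4FPS" headline
print(f"{fps_headline} frames/s = {1 / fps_headline} s/frame")            # one frame every 2.5 s
```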
Resolution, film, and other stuff. (Score:2)
Consider the resolution. Images rendered for film are typically done at about 3000 x 2000 (give or take depending on aspect ratio, etc). Now, even assuming we could gang up 16 or 25 or whatever of these nvidia boards, we're left with another problem: you can't record VGA signals on film. All the hardware shortcuts and special-purpose circuitry in the latest video card are useless when it comes to final render for film, because they're not built into the gadget (and there are several different sorts) that's actually bombarding the emulsion with photons. (Typically some sort of three-pass (R,G,B) laser scanner).
Yes, it'll make for wonderful computer games (if you like that sort of thing) and maybe even some interesting experiments in real-time porno animation, but it doesn't do much for the film industry, nor would it at 10 times the speed (24 FPS is the typical movie framerate). It'd have to be about 250 times faster for full-framerate, full-resolution images. About 12 years at Moore's Law rates. (Although I suspect at that resolution the flaws in the rendering and physics would become very distracting.)
(Actually, it helps the film production process, where animators can preview their work that much quicker. Faster graphics is always good, just let's not get carried away with the hype.)
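The "250 times faster" and "12 years" figures follow from the pixel counts and frame rates; here's a rough sketch, assuming the demo ran at roughly VGA resolution (that resolution is a guess on my part, the article doesn't say):

```python
# Rough speedup needed to go from the demo to film-resolution, film-rate rendering.
import math

demo_w, demo_h, demo_fps = 640, 480, 2.5     # assumed demo settings (resolution is a guess)
film_w, film_h, film_fps = 3000, 2000, 24    # figures from the comment above

speedup = (film_w * film_h * film_fps) / (demo_w * demo_h * demo_fps)
doublings = math.log2(speedup)

print(f"speedup needed: ~{speedup:.0f}x")                                  # ~188x with these assumptions
print(f"~{doublings * 18 / 12:.0f} years at one doubling per 18 months")   # ~11 years
```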
Re:Resolution, film, and other stuff. (Score:2)
Not as far as you might think. Actually, it's about 100 feet down the hall from where I'm sitting right now.
Consider the resolution. Images rendered for film are typically done at about 3000 x 2000
The final product is about half that. Resolutions up to 4k x 3k are used for intermediate special-effects editing. Digital cinema will be operating at about HDTV resolutions.
we're left with another problem: you can't record VGA signals on film.
With digital cinema, it's (roughly speaking) VGA in - so you could sit in the theater watching what's being generated at that moment. For film, it's not that big a deal to replace the VGA out with a connector to a real-time film recorder (remember the hall I mentioned above? go to the other side of the hall).
Re:Resolution, film, and other stuff. (Score:2)
Eh? You could easily read the contents of the video framebuffer out after it has finished rendering. Then you could save it to disk, spit it out to a special-purpose film framebuffer, or whatever. Yes, this is a relatively slow operation compared to writing to the video card's framebuffer, but if you're only rendering 2.5 frames per second anyway it would be a negligible hit.
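To make that concrete, here's a minimal sketch of a framebuffer readback using the moderngl Python binding (my choice of library; any OpenGL binding with a glReadPixels-style call works the same way). The readback is a single bulk copy, cheap next to a 0.4-second render:

```python
# Render off-screen, then pull the finished pixels back to system memory,
# where they could be written to disk or sent on to a film recorder.
import moderngl

WIDTH, HEIGHT = 1024, 768                      # hypothetical demo resolution

ctx = moderngl.create_standalone_context()     # off-screen GL context
fbo = ctx.simple_framebuffer((WIDTH, HEIGHT))  # RGBA8 color buffer
fbo.use()
ctx.clear(0.1, 0.2, 0.3)                       # stand-in for the real render pass

pixels = fbo.read(components=3)                # raw RGB bytes, WIDTH * HEIGHT * 3
print(len(pixels), "bytes read back")
```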
There are tons of issues they are glossing over (the resolution issue you mentioned, the fact that current video cards don't have enough color precision for complex multipass effects, and many others), but this "VGA" issue isn't one of them.
At any rate, nobody who knows what they are talking about is saying that this process will replace traditional raytracing for film, but it's a fairly good indicator of how quickly video card performance and quality are progressing.
Also, it has uses other than final rendering. If you can get a "pretty good idea" of what a particular scene will look like at near-realtime rates, it would speed up some processes (like light placement for a scene) tremendously.
UPDATE (Score:2)
Has the whole world gone mad?
a new class of people emerged... the innumerates...
.4 FPS? (Score:2, Redundant)
Re:.4 FPS? (Score:1)
Is it 4/10 or 1/10 of a second? (Score:3, Interesting)
Raytracing versus gaming graphics (Score:2)
I may be very, very wrong, but it was my impression that the "rendered" graphics of modern video cards are shortcut 3D images that are very, very unlike raytraced images: i.e., Quake 3 looks nice, but it looks absolutely nothing like the stunning beauty of a Truespace or 3ds max image (one is averaging surface-point lighting, whereas the other is actually tracing rays of light, throwing shadows, umbras, etc.). I thought the Quadro cards were only really relevant for modeling (i.e., moving stuff around and such), and that they still used the FPU for the real rendering.
What was the quality? (Score:2)
I'm writing this FROM SIGGRAPH!!! (Score:2, Interesting)
Have you seen the Zoltar demo? (Score:3, Interesting)
Re:Lies damned lies and benchmarks (Score:1)
Re:Lies damned lies and benchmarks (Score:1)
But then, he's an AC; I should really filter those out.
Re:Wireframe (Score:3, Interesting)
I had a friend in the early 90's in the computer animation field who was wowed when his first 486, with an astounding 8MB of RAM, could render a full frame of a 640x480 scene in under an hour or so. So I can imagine that wherever he is now, he's happier than can be.
And yeah, if they wanted to demo some huge frame rate, they could dump the textures down to a lower quality... but then it wouldn't be all that impressive, now would it?
Re:Wireframe (Score:1)
However, I suspect (although I am not certain) that Ed Avis's point was that watching a 100fps, rendered-in-realtime, hidden-line-removal, wireframe-only version of FF:TSW (with the wireframes colored based on a stripped-down version of the shading algorithm) would be really fucking cool. I mean, if I wanted to see FF:TSW, I'd go to the theater. But if I went to SIGGRAPH, I'd want to see something I haven't seen. I'd want to see FF:TSW specially rendered in such a way that all skin textures were replaced with astroturf.
C'mon, Square, where's your creativity? You can render the thing in realtime, yeah, and it's maybe among the most impressive demonstrations ever committed by a human, but you could do so much more. Why not set up a booth like that, but with an interactive console so that spectators can replace certain predetermined aspects of the movie with values that make no sense? I mean, just think of the possibilities of something like that! Replace the old guy with a model of Sonic the Hedgehog. Continuously render all of the members of Alec Baldwin's team wearing NVIDIA t-shirts, or (yum) in skimpy bathing suits. Screw with the viscosity values in the renderer so that Aki's hair always acts as if it were in zero-g. Or take out all the backgrounds and use the extra processing power to render all the characters with waist-length hair. I mean, just think. Thirty minutes of work by the renderer programmers, and that would be the coolest demo EVER!!!
That being said, I too am boggling at NVIDIA. How long until we're playing FPSes with Aki-quality hair on the characters?
OK, really, I have nothing terribly socially relevant or interesting to say. However, in summary:
d00d!!
Re:Well then (Score:1, Redundant)
Which is it? A frame roughly every two seconds (.4 FPS) or roughly two frames each second (.4 SPF)?
--
Evan
Re:FPS (Score:1)
I presume you're talking about when AMD was still running the K6 (classic) and Intel had just come out with the PII. AMD generally ran two comparison charts: one with the systems configured similarly (just swap a 200MHz PII for a 200MHz K6) and one with the systems configured for the same price. The idea was "Look, we're pretty close clock-for-clock, but if you buy our stuff, you have enough money left over to really boost the performance." Neither benchmark is more valid than the other, so they ran both.
Re:Wait a minute: (Score:1)
Re:... (Score:2)
You say that Intel is 100% X86 and AMD is 99.9% compatible. You state it as if obvious, and it sure _sounds_ obvious, but any meaning that can be attached to that statement is either false or irrelevant.
The x86 architecture is documented in technical manuals published by Intel. Actually, I'm going to specify the "IA-32 architecture" and thus ignore anything before the 386. Anyway, these manuals detail what the processor is going to do when fed certain instructions. Assemblers are written to this documentation.
The first thing you have to realize is that each Intel processor has a different technical manual, because there are instructions added with every major revision. So if we take any given Intel CPU model (say, an i80386DX), and compare it to any other processor model (Pentium II), we can say with certainty that they are not 100% compatible. The Pentium II will react differently if given certain instructions; for example, it will process MMX instructions instead of objecting to them. You could not say these processors are 100% compatible and retain any meaning in "100%".
Secondly, there are bugs. No Intel chip matches the specifications perfectly, and so every chip would be slightly under 100% compatible even with its own manual. (Yes, you hear about very few bugs, but there are many more that aren't really very important that you can read about on Intel's site if you want).
Now we can say that AMD chips are, in this respect, no different from other Intel chips. Some AMD chips have capabilities like 3DNow! that an i80386DX does not have (and this is admitted by the chip in its processor flags). But that's no different from a Pentium II having MMX. And AMD chips have bugs too, but there's no evidence that AMD chips have more bugs than Intel's.
Intel chips are not "100% X86 compatible" just because Intel makes them. That's like saying that Windows NT is 100% MS-DOS compatible just because Microsoft made it and can define what MS-DOS is. Even Microsoft will admit that certain applications which will run under MS-DOS will not run on Windows NT.
And just as an aside, there are no undocumented instructions of even the most remote practical significance in Intel chips. Undocumented instructions are ignored by assembler and compiler developers.
Re:good god (Score:1)
Re:It's amazing... (Score:3, Insightful)
A sunburn waiting to happen (Score:2)
Heh, if you think I'm going to the beach with .4 SPF sunscreen on, you're out of your mind!
Re:His error correction has an error!!! (Score:2)
Re:His error correction has an error!!! (Score:2)
Re:Just what are they smoking at NVIDIA? (Score:2)