GeForce3: Real-time RenderMan?
b0ris writes "This review of the NVIDIA GeForce3 at The Tech Report does a nice job explaining how the GF3 chip can create advanced graphics effects in real time. The author raises the prospect of having real-time Final Fantasy or Shrek-style animation on the desktop in a consumer graphics card. The examples from the GF3 he uses to back it up are almost convincing, even if it isn't quite there yet. Will render farms go the way of the dodo?" Well, I'm all for dreaming, but it's gonna be a few years before the GeForce8 can do RenderMan in real time. When we get there, though, Final Fantasy 21 is gonna rule.
Re:Jobs showed it at MacWorld (Score:1)
Re:Not for years.!!!! Quote from pixar about Nvidi (Score:1)
2->4->8->16->32->64->128
Between 6 and 7 years until Pixar-quality rendering (by your transistor requirements) can happen in real time on a commodity card. Factor in that most users are going to be rendering at computer-screen resolutions, and 5 years is still a safe bet.
No, it's not time to unscrew your case, but it IS time for game and software companies to pay attention.
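As a rough sanity check on that timeline, here's a quick sketch in Python. The six doublings come from the 2->4->...->128 progression above, and the 12-18 month doubling period is a Moore's-law-style assumption, not a measured figure.

doublings = 6                      # 2 -> 4 -> 8 -> 16 -> 32 -> 64 -> 128
for months_per_doubling in (12, 18):
    years = doublings * months_per_doubling / 12.0
    print("%d-month doubling: ~%.0f years" % (months_per_doubling, years))
# prints ~6 and ~9 years, roughly the range the parent is guessing at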
Re:It's not just the rendering. (Score:2)
Render Farms (Score:3)
When a video card has the power of a render farm, then people will simply make a render farm using those cards.
This will always be the case, until the rendering abilities of a card become indistinguishable from reality, and can render twice as fast.
All your race are belong to Gus.
Re:render != raytrace (Score:1)
Re:Jobs showed it at MacWorld (Score:2)
Rendering is not speed (Score:2)
It takes a LOT more than polygon-pushing power to make a realistic image. The GeForce3 (and the OpenGL or D3D that drives it) cannot do motion blur (REAL distributed motion blur, not accumulation), accurate reflection or refraction, shaders of arbitrary complexity, or any scene management and geometry generation operations.
Source code vs. performance (Score:2)
Now 5 years after the hoopla, one screenshot of a 320x240 chameleon that looks like a movie, and 500 layoffs later, let's all say it in unison, "who gives a fuck if NVidia doesn't release any source code!"
Re:render != raytrace (Score:1)
Most of what people think of as CG work is raytraced... but this isn't always the case, nor is it what I was really going to talk about.
Real time raytrace isn't going to happen for a long time...
But... for animation, under Maya, it takes longer to go through all the math with the deformers and the skeleton solvers and all that than it does to display. So sometimes, you can only get 1 fps.
Games use a much different approach, where everything is approximated to some extent.
It's true that they have IK solvers and are more realistic than in the past, but the dependency is not like what you would have in some animated scene files. What happens with secondary effects, such as hair moving, is all based on the character's movements. This is not really foreseen... and only a few animators would want that level of control over a character in most situations.
I also agree with you on the direction for 3D cards and movie CG. It will get better and better. And render times will still be the same, because artists always want to add more realism or quality. As speed goes up, so does complexity, but render times stay about the same. That's what I call job security.
Re:render != raytrace (Score:1)
I was typing faster than thinking... not rare for me.
Everything you state is correct.
What I was intending to say is that what the graphics card is doing is different from what renderfarm rendering is doing. I don't consider it really rendering unless it's from some render package such as Maya, PRMan, Mental Ray, or the like.
I have had to do some raytraced rendering before. I know the difference. My fingers must not have.
Thanks for pointing this out so others are not as confused as my fingers.
Re:Real time Rendering IS here (Score:1)
For the image quality of a render, even from Maya, you will have to wait a while... or you are doing some really simple stuff.
The lighting the GF3 does doesn't compare to what Maya renders. There are a couple of levels of complexity in the difference.
Re:Render Farms (Score:2)
Some software does use the video hardware to render. It is grabbing the frame off the buffer and saving it to disk. Slower than real time, faster than a full-blown software render.
Re:P2P OpenSource Rendering? (Score:2)
It would take a lot of development time and cooperation from the software companies to support low-bandwidth render systems.
The scene files that we are working with are tens of MB, and they reference other files of similar sizes. You may pull across 100MB of proprietary scene files (which means encrypted to the users) and then the system determines what to render. It may take 30+ minutes a frame, while either creating a scad of miscellaneous files or eating up memory (such as shadow map files, motion blur files...) and then assemble all of them to make a 2-3 MB image to upload.
The average user's home machine would only be a waste to studios. The bandwidth would kill us. Legal would kill us for letting proprietary data out. Your system would be smoked while rendering... or it would take a long long time.
All the transfer time of the scene files and the textures would take longer than the render.
We keep a nice fat backbone to the renderfarm for a reason. No sense in having 200+ procs waiting on data.
We do use software that allows us to use the users desktop, but this is over a LAN and not a WAN... and that makes a big difference.
render != raytrace (Score:5)
It's true that they are getting close and blurring the line between rendering and desktop 3D, but for all practical purposes there is still a difference.
I just hope rendering never goes away... I need this job!
Another difference is that game movement is not nearly as complex as cinematic animation. Most game movement is pre-defined movements triggered by something. A lot of secondary animation and even some primary animation is done by a complicated set of equations. It all depends on the package, but sometimes with these solvers on, you might get 1 fps when viewing the animation. Until issues like that are fixed, you will not be able to generate stuff like that on the fly.
Re:FPS (Score:2)
The eye can detect framerates above 120, depending on the person. My threshold is around 80 or so; anything above that adds little to the gameplay, other than that the framerate is less likely to dip below what I notice.
What makes 24-30fps acceptable in film and TV is motion blurring. Search the archives for the arguments, as I don't feel like getting into it again.
Re:Wow.... (Score:1)
That's called marketing
Vermifax
It is relevant ... (Score:1)
Vermifax
Not for years.!!!! Quote from pixar about Nvidia (Score:5)
Achieving Pixar-level animation in real-time has been an industry dream for years. With twice the performance of the GeForce 256 and per-pixel shading technology, the GeForce2 GTS is a major step toward achieving that goal.
-Jen-Hsun Huang, President of NVIDIA Corp.
Here [google.com] is what Tom Duff from Pixar thinks about that:
These guys just have no idea what goes into `Pixar-level animation.' (That's not quite fair, their engineers do, they come and visit all the time. But their managers and marketing monkeys haven't a clue, or possibly just think that you don't.)
`Pixar-level animation' runs about 8 hundred thousand times slower than real-time on our renderfarm cpus. (I'm guessing. There's about 1000 cpus in the renderfarm and I guess we could produce all the frames in TS2 in about 50 days of renderfarm time. That comes to 1.2 million cpu hours for a 1.5 hour movie. That lags real time by a factor of 800,000.)
Do you really believe that their toy is a million times faster than one of the cpus on our Ultra Sparc servers? What's the chance that we wouldn't put one of these babies on every desk in the building? They cost a couple of hundred bucks, right? Why hasn't NVIDIA tried to give us a carton of these things? -- think of the publicity mileage they could get out of it!
Don't forget that the scene descriptions of TS2 frames average between 500MB and 1GB. The data rate required to read the data in real time is at least 96Gb/sec. Think your AGP port can do that? Think again. 96 Gb/sec means that if they clock data in at 250 MHz, they need a bus 384 bits wide [this is a typo: 384 _bytes_ wide!]. NBL!
At Moore's Law-like rates (a factor of 10 in 5 years), even if the hardware they have today is 80 times more powerful than what we use now, it will take them 20 years before they can do the frames we do today in real time. And 20 years from now, Pixar won't be even remotely interested in TS2-level images, and I'll be retired, sitting on the front porch and picking my banjo, laughing at the same press release, recycled by NVIDIA's heirs and assigns.
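For what it's worth, Duff's figures do hang together arithmetically. A quick sketch in Python (all of the inputs are his guesses from the post above, not published numbers):

cpus = 1000                         # "about 1000 cpus in the renderfarm"
days_to_render = 50                 # his guess for rendering all of TS2
cpu_hours = cpus * days_to_render * 24
movie_hours = 1.5
print(cpu_hours)                    # 1,200,000 cpu-hours
print(cpu_hours / movie_hours)      # ~800,000x slower than real time

frame_bytes = 500e6                 # low end of the 500MB - 1GB scene size
fps = 24
print(frame_bytes * fps * 8 / 1e9)  # ~96 Gb/sec, the data rate he quotes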
Vermifax
Re:QED (Score:1)
But anyway, you don't really want a QED renderer. Imagine rendering a soap bubble. In order to get an accurate shifting-rainbow effect, you'd have to model extremely subtle air currents, and the thickness of the film in micrometers. It's far more efficient to take your usual surface, apply a time-varying texture to it, and tweak it until it looks convincingly like a soap film.
The computation required to accurately render QED is absurd. Instead it would be better to have a class of objects in your renderer (diffractors, thin films, etc.) that can simulate diffraction. But don't expect those to behave nicely when they interact with non-diffracting objects; the computation required would just be too huge. If you could do that, it's time to start coding The Matrix.
--Bob
Re:QED (Score:2)
You're kidding, right?
How often, in everyday life, do you notice diffraction and interference? I never do. Consider also that the objects which cause diffraction are on the same order of magnitude in size as the wavelength of light (i.e. 10^-7 m). Which, BTW, is far smaller than you can see. Now imagine you're going to keep track of polygons/voxels 10^-7 m in size, for a room that's 10m by 10m by 3m. That's 10*10*3/(10^-7)^3 =~ 3*10^23 voxels to keep track of. Forget it. There are far better ways to simulate diffraction, if you really wanted it.
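That voxel count is easy to verify; a one-line sketch, assuming the same 10m x 10m x 3m room and wavelength-scale (1e-7 m) voxels as above:

room_m3 = 10 * 10 * 3        # room volume in cubic metres
voxel_m3 = (1e-7) ** 3       # volume of one wavelength-sized voxel
print(room_m3 / voxel_m3)    # ~3e+23 voxels, as claimed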
What I have seen that's really cool is Relativistic [fourmilab.ch] ray [man.ac.uk] tracing [mu.oz.au]. Do that Nsuck^H^H^H^Hvidia!
--Bob
Re:P2P OpenSource Rendering? (Score:2)
2) The people who MAKE movies are a different group of people than those who SHOW movies.
3) Seti@home has to do a ton of redundant work, because people turn seti@home off in the middle of a block and never turn it on again, kids download a block and then try to upload spoofed workfiles to crank their work-completed stats, and other garbage that the Studios just wouldn't tolerate well.
Consider this: Would the Seti project buy a server farm to perform this work if they could afford it? Or do you think they'd go through all of this crap, simply because they enjoy dealing with crap more than doing science?
Re:Not for years.!!!! Quote from pixar about Nvidi (Score:1)
Dude. Both of those quotes are referring to the GeForce 2 GTS. The Tom Duff post to comp.graphics.rendering.renderman is over a year old. The review being posted is about the GeForce 3.
Yes, much of what Tom Duff said probably still holds true, but let's try to quote material that is actually referring to the subject at hand, mm'kay?
--
Re:FPS (Score:1)
The reason watching a film on a TV monitor differs from watching it in the theater is that the film is translated from 24fps to 30fps (usually by frame doubling, but I don't know the process well enough to describe it -- search the web. I've seen a good review of progressive scan DVD players that gave a good background on the whole film->video conversion process). And as far as films go, many theaters actually use projectors with shutters that open two or three times per frame, rather than once. It doesn't change the fact that there are still only 24 frames in a second, but it does make it seem a bit smoother (making it seem as if there are really 48 or 72 frames, though those are doubled or tripled).
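For reference, the film-to-video conversion being described is usually "3:2 pulldown" (it's also mentioned further down the thread). A minimal sketch of the idea, purely illustrative:

# Four film frames become ten interlaced video fields (five video frames):
# each film frame is held for 3 fields, then the next for 2, alternating.
film_frames = ["A", "B", "C", "D"]       # 4 frames = 1/6 second of 24fps film
fields = []
for i, frame in enumerate(film_frames):
    fields.extend([frame] * (3 if i % 2 == 0 else 2))
print(fields)            # ['A','A','A','B','B','C','C','C','D','D'] -> 10 fields
print(len(fields) // 2)  # 5 video frames per 4 film frames, so 24fps maps onto ~30fps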
Re:FPS (Score:1)
Don't I know it! There are no real cutting-edge theaters out here (IMAX doesn't count, as that's not really cutting-edge), though one would expect at least something, what with this being a big technology center. Oh well. I guess I'll continue to be happy watching movies at the Bella Botega or Cineplex Odeon (two eastside theaters with stadium seating). As far as better filming processes, the best I've heard of is 32fps. Of course, I'm assuming these get translated down for play in "normal" theaters, because to do otherwise would require a massive layout of cash by theater owners. And we all know that they don't make much money off of anything but the concessions :).
Re:Not for years.!!!! Quote from pixar about Nvidi (Score:1)
Re:Man power... (Score:2)
I don't remember what movie it was, but I read an article on the making of some movie (set in the 1800's, one of those cheesy romantic dramas, released about 2 years ago) and they showed the original filmed scene where you could see scaffolding, cameras, and lights. Then they showed the final result, which was a fully convincing 1800's-era scene. Most of the buildings and background people were created in 3D Studio MAX and rendered with Mental Ray, and you just can't tell. It was truly impressive. The buildings moved perfectly with the camera angle, the CG people walked and moved perfectly (there were no closeups of them, which removed the hardest part - facial modelling. The human eye is very good at picking up inconsistencies, especially in objects we observe every day, such as human facial emotions. It was very impressive nonetheless).
Of course, convincing facial modelling isn't impossible - look at this picture [raph.com] by Asier Hernaez Laviña, which was modelled and rendered in 3D Studio MAX. Not video, but it's an amazing technical achievement and is almost indistinguishable from a photograph.
--
Tell this to the BeOS community (Score:1)
--
Re:Tell this to the BeOS community (Score:1)
--
Movies vs. games (Score:1)
Re:PRMan does raytrace (Score:2)
>>rendered with renderman.
Actually, I'm not. Other than Antz/Shrek from PDI (which has its own in-house renderer), I'd say 95% (conservative estimate) of the feature-quality CGI put out is done with PRMan.
Mental Ray - yeah, it's been used for a few things (e.g. Flubber) when you absolutely HAVE to use raytracing. But other than that, no way. It can't swallow the type of scenes that PRMan can handle, the memory requirements are FAR too high, and its motion blur is weak at best in comparison. There have been plenty of post houses that have tried to use it, but once you start to throw large scenes at it that require quality anti-aliasing, motion blur, and HUGE geometry databases, it just falls apart. MR3.0 will tackle *some* of these problems, but the fact that it's a raytracer gives it some inherent limitations on what it can handle.
And this isn't from somebody who hates Mental Ray --- I worked on it at Softimage for 3+ years. It's a good renderer, but it's no PRMan.
Re:PRMan does raytrace (Score:2)
Correct. And PRMan returns BLACK whenever you call trace(). Therefore PRMan doesn't raytrace - ever.
You can hook it up to another renderer (e.g. BMRT, RenderDotC) to handle the trace() calls, but that raises its own issues.
Re:render != raytrace (Score:3)
>>not raytraced.
That makes zero sense.
'Real' rendering is not raytracing. Raytracing is one approach to simulating light propagation. It's not the be-all and end-all; it has its own serious problems.
Go see a movie. 99% of the CGI that you will see in feature films is done with PRMan (Pixar's implementation of the RenderMan standard). PRMan doesn't raytrace. Ever.
Raytracing has its place. So do a lot of other approaches. Open your mind.......
Let's see if I can beat... (Score:4)
Sarcasm mode on:
Will computers continue to get faster? Will we someday have lightbulbs in every room of the house? Will everyone who wants one be able to afford an automobile one day?
Well, it'll be a few years before we're able to play color video games on our personal computers, but when we do the arcade games will really rock!!
Sarcasm mode off:
Really? What kind of senseless 'wow-computers-are-getting-faster' stuff is this? The article actually makes sense and is interesting. It explains how computers are getting faster. It's the silly, so-called 'editorializing' that's stoopid.
Re:Lightwave 6 3D (Score:1)
Lightwave 6 3D (Score:3)
Re:It's not just the rendering. (Score:1)
By "things we know" I mean human motion and things that we see every day and notice subconciously. Dinosaurs and spaceships are easy to fake since most people only see those in the movies -- and that is Hollywood motion, not reality anyway.
If you notice, it isn't that uncommon to see a rendered STILL that is indistinguishable from reality. However, rendered MOTION is still a bitch.
-chill
--
Charles E. Hill
Re:Not gonna happen anytime soon. (Score:2)
Other shading methods (radiosity for proper lighting) are used elsewhere.
Real-time rendering CAN be achieved by using the proper methods and not just throwing the entire ball of wax at any scene.
The idea is SMART rendering: Z-culling (so you only render pixels that affect the scene); polygon reduction (so you don't bother with a 10,000-poly item that is so far away in the frame it is a single pixel); variable mapping (using environment maps for reflections when appropriate, like fly-throughs where there are only "background" objects).
Think Hollywood set -- build (and shoot) only what the camera will see, nothing else.
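A minimal sketch of the polygon-reduction point above, in Python; the distances, polygon counts, and mesh names are made up for illustration, not taken from any particular engine:

def select_lod(distance_to_camera, lods):
    # lods: list of (max_distance, mesh) pairs, sorted by max_distance
    for max_distance, mesh in lods:
        if distance_to_camera <= max_distance:
            return mesh
    return lods[-1][1]                  # beyond the last threshold: cheapest mesh

lods = [(10.0, "statue_10000_polys"),
        (50.0, "statue_1000_polys"),
        (1e9,  "statue_single_quad")]   # effectively a billboard
print(select_lod(3.0, lods))            # statue_10000_polys
print(select_lod(200.0, lods))          # statue_single_quad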
--
Charles E. Hill
Hot stuff (Score:1)
Great Graphics do not make a Great Game (Score:2)
My dream is that GeForce8 can make it unnecessary to discuss the quality of the game in the same sentence we discuss the quality of the graphics. For years now, we have seen one product after another try to top the preceding generation in terms of delivering beauty and graphic heat -- and yet it has been a long time since games have really done, IMHO, a great job of delivering fun.
This is not to say that twitch isn't fun -- or that pretty isn't interesting. It's just to say that I'm not sure that more photorealism equates to great gaming.
Re:Not quite right (Score:2)
Re:FPS (Score:3)
The electron gun scan rate is CONTROLLED by the video card, so the frame rate coming out of the card is constant. The RAMDAC accesses memory at a constant rate, determined entirely by this refresh rate. Frames are generated into the back buffer and flipped into the front buffer once the entire frame is generated.
Nothing is actually "sampled" in the chain from frame generation to displaying the image (unless we want to talk about pixels rather than frames).
This means you need 30fps generated by the card to get 30fps displayed on the screen - not the 120fps you are suggesting!!
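A sketch of the double-buffered flow described above. draw_scene and wait_for_vertical_retrace are stand-ins for whatever the real driver or API provides, so treat this as an illustration of the ordering, not as anybody's actual interface:

front_buffer, back_buffer = [], []

def draw_scene(buffer, frame):
    buffer.append("pixels for frame %d" % frame)   # stand-in for rasterization

def wait_for_vertical_retrace():
    pass                                           # stand-in: block until the next refresh

for frame in range(3):
    draw_scene(back_buffer, frame)      # generate the frame into the back buffer
    wait_for_vertical_retrace()         # RAMDAC keeps scanning the front buffer out at a fixed rate
    front_buffer, back_buffer = back_buffer, front_buffer   # flip: the finished frame is shown whole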
Re:render != raytrace (Score:4)
Rendering is not raytracing. Rendering (in terms of 3D) tends to be an all-encompassing term which covers the conversion of the model (ie bytes that describe a scene) into the image (ie bytes that depict a scene). Raytracing is simply one tool at the disposal of the rendering engine.
Raytracing isn't even the best you can do, as it can't cater for atmospheric effects and diffusion of light through a scene.
A commercial renderer (LW, Maya, 3DSMax) will use a lot of different methods to generate the final scene. Some objects will use simple renders like those you find on a Voodoo 1 chip; others will use complex raytraced algorithms that can't be done in 3D hardware yet.
The GF3 with its pixel and vertex shaders is just one step closer to what Pixar and ILM managed to achieve in the 80s. The problem is that Pixar and ILM are just getting better and better every day. There is no way a GF3 would ever be able to produce something like Shrek in real time, and by the time a GF* does, it will pale in comparison to what is coming out of the movie studios.
Many games now have some rather complex IK effects to get more realistic motion. The Halo engine produces some fairly impressive physical effects - look at the way the jeep drives sometime, it is quite realistic. Games have the distinct disadvantage to cinematic animation at this point - in the movie you know exactly what is going to happen and you can write special exceptions where needed, even altering the vertexes by hand if needed. In a game EVERYTHING has to be either anticipated, or computed in real time. No wonder things are still a little forced.
3D cards are getting there. Most can put out plenty of FPS when required (remember, cinema renders are only 24fps - below what most gamers consider even passable). It's really about getting the polygon count up and the parallel processing power up now. Given that most 3D cores have MORE processing power than the CPU in your machine, it's hardly surprising that the processing load is steadily moving from the CPU to the 3D card.
Who knows where the future is going, but I'll assure you that 3D engines are just going to get better and better, and movies are certainly going to improve to the point where you won't be able to tell an animated film from the real thing.
Re:Let's see if I can beat... (Score:1)
LS
Re:Like rain on your wedding day (Score:1)
Re:PRMan does raytrace (Score:2)
Disclaimer: I work for DotC.
RenderDotC doesn't raytrace either. You might be thinking of Mirage-3D, the author of which, the great Timm Dapper, also works for DotC.
FPS (Score:2)
The eye can't even detect anything above 30 FPS or so.
Bullshit (Score:2)
Cells in the retina have a recovery time of ~30 milliseconds. Do the math.
(If "80 FPS" seems choppy to you, it is because this is an average. The framerate only has to dip below 30 or so for a hundred milliseconds or so to be detectable.)
Here's a link [utexas.edu] from google
Re:Like rain on your wedding day (Score:1)
Re:It's not just the rendering. (Score:2)
Exactly. Why? Because we use textures as a form of compression. Computers just don't have enough memory and bandwidth to allocate a unique texture for EVERY surface. (Light maps push this boundary, though, as can be seen in Quake with its light map cache.)
The reason textures even "work" to begin with is that from a distance, a surface looks pretty much "flat". But at the microscopic level (atoms) the "surface" is extremely hilly. In the real world, *ALL* those micro details ADD UP when that object is lit. And that is why the textures in any game stand out like a sore thumb. It's not the "textures" themselves that are the problem. It's the surface roughness and lighting that we are CRUDELY approximating (for real-time rendering). Bringing this back on topic, that's why offline render farms can look SO much better and more realistic. They have the time to do all the expensive math needed for realistic lighting (i.e. ray-tracing).
> It didn't appear flat, but rather bumpy
That's why bump-mapping is so badly needed in today's games. It fakes the atomic "roughness" of a surface.
I'll dig up a link to that Quake 1 client (with source) that added bump-mapping later today. The cool part was that you could adjust the level of bumpiness. A textured brick with a little bit of bump-mapping looked WAY better and started to look like a real brick (with indents.)
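A minimal sketch of what bump mapping is doing: perturb the surface normal from a height map so a flat polygon shades as if it had relief. The numbers and the simple Lambert lighting here are purely illustrative, not any engine's API:

import math

def bump_normal(height, x, y, bumpiness=1.0):
    # finite-difference slope of the height field at (x, y)
    dhdx = (height[y][x + 1] - height[y][x - 1]) * 0.5
    dhdy = (height[y + 1][x] - height[y - 1][x]) * 0.5
    nx, ny, nz = -bumpiness * dhdx, -bumpiness * dhdy, 1.0
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)

def lambert(normal, light_dir=(0.0, 0.0, 1.0)):
    # simple diffuse lighting: brightness = N . L, clamped at zero
    return max(0.0, sum(n * l for n, l in zip(normal, light_dir)))

height = [[0.0, 0.0, 0.0],
          [0.0, 0.2, 0.4],    # a small slope in the "brick" surface
          [0.0, 0.4, 0.8]]
print(lambert(bump_normal(height, 1, 1, bumpiness=0.0)))  # flat surface: 1.0
print(lambert(bump_normal(height, 1, 1, bumpiness=3.0)))  # perturbed normal: ~0.76, visibly darker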
Re:render != raytrace (Score:2)
Frame-based animation is an example of the former.
Skeletal animation (and motion blending, a la Granny) is an example of the latter.
Re:Jobs showed it at MacWorld (Score:2)
> The fill rate on the Geforce series is reasonably high. The color depth is 32 bits
For the GeForce, 32 bits per pixel is only 8 bits per channel, which can leave bad banding and Mach-band artifacts with overlays.
16 bits per channel is 64 bits per pixel (ARGB). Unfortunately it will be a while before consumer cards even start thinking of supporting it.
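A tiny illustration of why 8 bits per channel bands: quantize a narrow, dark gradient and count how many distinct levels survive. The numbers are illustrative only:

def distinct_levels(lo, hi, samples, bits):
    scale = (1 << bits) - 1
    return len({round((lo + (hi - lo) * i / (samples - 1)) * scale)
                for i in range(samples)})

# a dark gradient spanning 2% of full brightness, sampled 1000 times
print(distinct_levels(0.10, 0.12, 1000, 8))    # a handful of steps -> visible banding
print(distinct_levels(0.10, 0.12, 1000, 16))   # hundreds of steps -> looks smooth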
Jobs showed it at MacWorld (Score:4)
Re:... (Score:1)
Yeah, one of my housemates brought a copy of FF II back from his house. I went nuts on it for like 6 hours, saved the game, quit - next day, gone. Play the start again for maybe an hour, save, quit, gone again. Game is fried. Too bad, the first few hours were a lot of fun...
(I don't see what the problem is with DBZ-style hair, but then again I do spend about 2 hours a day watching it).
It's not just the rendering. (Score:4)
Here, even the most advanced renderer won't help much if you're talking about real-time interactive stuff -- it is sitll raw CPU speed here...
Re:Like rain on your wedding day (Score:2)
We've been through this before... (Score:1)
Source [fgnonline.com]
Some of the stuff the GeForce3 can do is great, but let's calm down. Move along...nothing to see here...
-brennan
How about The Product? (Score:2)
www.theproduct.de [theproduct.de]
It's really amazing, and it would seem that what they were describing in the article is already here, but maybe I'm not quite clear on what they meant.
Malcolm solves his problems with a chainsaw,
Not quite right (Score:1)
Searching for "human eye framerate" on Google provided this link [www.ping.be]. A very good point raised here is that the screen is turned on and off many times a second. This makes us much more perceptible to refresh rates above 30 Hz. Especially on TVs and monitor pictures with higher intensities, where white colour is the brightest. If you don't believe me, adjust your monitor refresh rate to 60 Hz and notice the difference. Compared to 100 Hz, I notice the blinking extremely well. Hell, I even notice it a little when switching from 100 Hz to 85 Hz. However, if you use a lower refresh rate, your "eyes" adjust after a while. Especially using darker colours on the screen makes it easier. This might be a synchronisation problem, and that we start synchronizing with the lower refresh rate after a while. However, we DO notice extremely well when comparing, and working on a lower refresh rate may give you more headaches!
Notice the difference between refresh rate and framerate. IMHO refresh rate has everything to do with how "smooth" the motion can be. With lower refresh rates, it's much easier to create completely "smooth" scrolling (we perceive the motion as continuous), but we might notice the blinking of the screen. This is why games on TV can look PERFECTLY smooth, but "horrible" on a high-Hz monitor. The more Hz you have, the higher a STABLE framerate you need to get the same effect. So if you want smoother motion in games, I recommend learning to play at a lower monitor refresh rate. Really! Your head may throb, but it's smoooth
All in all, I think of the problem as in two parts:
A) A synchronisation problem between refresh rate and framerate. (Which is really the same as your conclusion) Sometimes, a frame can take longer than a refresh and people will notice.
B) A synchronisation problem between the eye and the refresh rate of the monitor and its intensity (remember, colours are frequencies too!). Remember that the human eye isn't built for watching rapidly blinking objects.
They don't pay me, so I won't clarify much more than this.
- Steeltoe
Re:Not quite right (Score:1)
Actually, I lied (they still don't pay me though). Modern 3D games usually take more time to draw a frame than just one vertical refresh of the monitor, so you notice lag in motion in most new 3D games anyway. The more refreshes a frame takes, the more noticeable "jaggy motion" you get (depending on refresh rate). You won't notice anything in between, except occasional skips (by chance) now and then. Try playing an older 3D game. With a low enough detail level and resolution, you should be able to push the limit so that everything is calculated in the vertical refresh period and get "perfect motion" (Doom, for instance). If this fails, try adjusting the refresh rate of the monitor down.
The VR period is when the beam on the monitor moves from the lower right to the upper left corner and the screen is blanked. If you want "perfect motion", this short time is all you've got to draw the next frame. That is why it is easier to have "perfect motion" at lower frequencies. Actually, the motion is not perfect at all, since there is no motion(!). However, the brain is fooled into seeing perfect motion. If you skip just one refresh, that's enough to notice a small lag in motion (depending on refresh rate).
In reality though, you have a little more than the vertical refresh. As the beam draws down the screen, if you manage to stay ahead of it (drawing the scene from top to bottom), the player can't notice it. I know this from experience. This does not, of course, apply if you are using double-buffering. It's harder to have "perfect motion" with double buffering, since you need to synchronize with the VR in order to set the screen address every refresh. I believe most modern games draw directly on the screen nowadays since the GPUs are so fast, so this might not be a problem anymore.
So all in all. It doesn't matter how high you can push your fps. As long as you don't synchronize properly with the monitor refresh rate, you'll not get "perfect motion". The refresh rate is what is fooling the brain in the first place. A higher framerate than the refresh rate is meaningless. Humans DO recognize the difference between objects blinking 30-120 Hz with high difference in intensities. Humans are NOT simple math and simple science.
- Steeltoe
Re:Not quite right (Score:1)
- Steeltoe
Man power... (Score:4)
PRMan does raytrace (Score:2)
You are wrong. The shader language can raytrace.
Using BMRT [bmrt.org] together with PRMan, it can ray trace, and many people use it. Like in Hollow Man [imdb.com], for instance.
Here is a gallery [exluna.com], which includes Hollow Man. The call looks like this :
color trace (point from, vector dir)
Traces a ray from position from in the direction of vector dir. The return value is the incoming light from that direction.
Source [exluna.com]
Fear? (Score:2)
What's so frightening about terminating a program?
PRMAN CAN RAYTRACE (Score:2)
BMRT Raytracing Howto [exluna.com]
You people are amazing. You don't even bother to look at my link, and you tell me I'm wrong.
Re:PRMan does raytrace (Score:2)
BMRT Raytracing Howto [exluna.com]
You can hook up PRMan and BMRT together, using BMRT to do the trace() calls. This is in fact a semi-supported function that Pixar gives to people. When you get PRMan, they'll happily give you BMRT, as well. Many things Pixar does use BMRT. BMRT is good. Don't diss BMRT. When people use PRMan, BMRT is a natural thing to include in many cases!
Hypothetical (Score:2)
If I'm in Word, and I add an Excel spreadsheet to it - and then I print it - is that Word printing a spreadsheet? By my definition, yes - by your definition, no.
By my definition, Word can use Excel as a spreadsheet renderer. By your definition, apparently, just because Excel is not built in to Word, it means that Word is incapable of printing spreadsheets.
They're telling me, "There's no way for Word to print a spreadsheet," and I'm saying they're wrong. You're also saying I'm wrong. But I'm not, I'm right - and the page I pointed to shows how it can be done. It's not easy, and there are problems, but it can be done. The actual facts are on my side.
Your definition might be more technically correct (since "we don't say that program 1 is performing the specific task"), but mine is certainly more useful, since my point was that program 1 is able to perform a specific task by communicating with another program. Many programs are incapable of communicating with other programs in such a manner, and that makes PRMan pretty cool, in my book.
By the way, it's "English," not "english."
echo, sed, double negatives (Score:2)
After executing your commands, I am left with the following statement from you :
prman can't do radiosity via an SL trace without BMRT.
Now, I will apply the English language suggestion for good writing, "Don't never use double negatives," after which, your statement becomes :
prman can do radiosity via an SL trace with BMRT.
This is shockingly like my original statement :
PRMAN CAN RAYTRACE USING BMRT AS A TRACER.
And I guess I agree with myself. So, then I can only laugh, when I read your insulting statement :
If you still think that's prman doing raytracing then I suggest you take a remedial english class.
Because I just proved that you "think that's prman doing raytracing"! So, why exactly did you need to insult me twice in your post, in order to agree with me?
Okay, my last post in this thread (Score:2)
Thanks - having the last word is kind of fun.
You might consider getting some help for that persecution complex.
If this is a back-handed apology, I accept. If it's merely another insult, you might want to consider taking some aggression management classes. I think it's somewhat childish of you to move an argument of fact into a namecalling bout, suggesting that I don't live in the real world, don't know how to use the English language, and now that I need therapy, and have a "widdle head."
Semantic debates are sometimes enthralling, because you can twist words to make facts lie - but they don't really further understanding. Your definitions of "program" and "call" are well stated, and I believe I understand them, but I don't believe that they really help you.
Word can call Excel to do spreadsheets, but that doesn't make it a spreadsheet program. PRman can call BMRT to do raytracing, but that doesn't make it a raytracing program. Eudora can call PGP to do encryption, but that doesn't make it an encryption program. The JVM can call methods in a class file to give you the long-distance phone calling capabilities in DialPad, but that doesn't make the JVM a long-distance phone calling program. Quake III : Team Arena can call jpeg library functions to load jpeg images, but that doesn't make it a jpeg image loading program. Internet Explorer can call Hotmail to send email, but that doesn't make Internet Explorer an email program. PRMan can call the shader language to do shading, but that doesn't make it a shading program.
Utility is an interesting thing. I can use a butter knife to turn screws, and I will agree with you that my ability to use a butter knife in that manner doesn't somehow turn it into a "screwdriver," in the traditional sense. But, if your definition of a screwdriver is merely that it is an implement with which one may turn a screw, it becomes a pretty hazy line. If you ask me for a screwdriver, and I hand you a butter knife, I'll laugh at myself, right along with you. It's silly, it's not what you asked for, but it'll do the job. You have to agree that if a demolitions expert is trying to defuse a nuclear bomb without his tools, the clock says 26 seconds, and he asks me for a screwdriver, if I hand him a butter knife - I've saved the day!
Most effects houses have a hard time staying in the black, and most use PRMan. If an effects house is down to the wire, and their client demands that a certain effect needs to look more real, and the only way to pull it off is by having PRMan call BMRT to do raytracing, they'd rather use my definitions of "program" and "call" than yours.
I love your last paragraph - I think you should use it the next time you're on Jerry Springer.
Re:FPS (Score:1)
Not many features released in ShowScan, I'll agree. But you occasionally see it turning up in amusement rides & Vegas "experiences".
Re:Final Fantasy XXI, eh? (Score:1)
Re:The real problem (Score:2)
Nah, it works fine (Score:2)
I'm using Win2K SP1, and the driver for the GeForce3 works quite well. All the graphics features work. The chameleon demo is indeed impressive. Considering that NVidia just started shipping the Win2K driver, I'm quite impressed.
I have more fans in my systems than most users go for. Nothing is overclocked. GeForce 3 boards have heat sinks on the RAM and a fan on the graphics chip. This board is pushing the limits of what's possible with current semiconductor technology. Power and cooling should be sized accordingly. Just shoving this into some low-end PC with a minimal power supply and fan may not work.
Re:Not for years.!!!! Quote from pixar about Nvidi (Score:3)
However, I'd say we ARE about advanced enough to do crap like this [indiana.edu] in realtime... ;) (No goat links, I swear!!)
However, there is some hope... I remember reading in a great book about 3D games (Black Art of Macintosh Game Programming) that raytrace-quality realtime games would be (according to the author's math) about 20 years away. Interestingly, that's exactly what the Pixar guy predicted, and that book was printed in 1996.

My observations: today, Pixar does far more than simple raytracing. It's radiosity up the wazoo, for example (I assume ;). So to me, this suggests that ~20 years from when the book was published, we will be able to have realtime raytracing of 1996 quality. Still not too shabby. BUT. There are gazillions of optimizations you can make in realtime games that you can't make in raytracing. Here's how I see it: We can improve the algorithms a few powers of ten, efficiency-wise. (Don't say we can't, you'd be very wrong.) We can speed up our processors a few powers of ten. I think we're getting there faster than these guys are suggesting, just as long as we don't aim for the moving target of Today's Pixar Production$. (As he points out, there will never be a day when the realtime graphics are as good as the prerendered ones, simply because the big companies have the cash to throw at it to make it look better.)

Anyway. Sorry this was so long. Great stuff ahead, though. :)
Post was correct, except title (Score:2)
It is slightly ironic that the same people who one day were saying digital art is still art are the next day saying that animation on the level of a movie that took thousands of man-hours to create can be generated by a computer. Thus stripping away the art value of the movie (or at least the animation in the movie).
It does piss me off when people misuse words, especially words that are very nuanced and clever. But there you go. People are stupid.
BTW, good examples.
Re:Not for years.!!!! Quote from pixar about Nvidi (Score:2)
It's not that powerful (Score:2)
Re:It's not just the rendering. (Score:2)
Re:Not for years.!!!! Quote from pixar about Nvidi (Score:2)
Re:Not for years.!!!! Quote from pixar about Nvidi (Score:4)
After all, that orange tree would have taken years to grow.
Re:Jobs showed it at MacWorld (Score:2)
Re:... (Score:2)
I never liked FF. Big Dragonball style hair, people riding these weird chickens... silly big swords... that's all I ever saw. Well, a friend of mine was playing FF8 and his character had to dress up like a woman to go do something. I guess that's more interesting than a big chicken. I'd rather watch Record of Lodoss War or some other old classic.
The FF movie looks nothing like the video games I have seen, thank goodness. I hope the characters in the movie aren't breeding those giant chicken things...
- Someone Confused by the FF Hype
Re:FPS (Score:2)
Video transfer uses something called "3:2 pulldown"; I used to read up on that stuff when I collected LDs. Pretty tricky, actually, turning a 24fps non-interlaced data format into a 30fps interlaced stream... but no matter how the frames are sliced and diced, a film only has 24 frames per second of data to offer.
I just want true 30fps in theaters! Maybe in an all digital theater... they do shoot some movies on HD video cameras, don't they? Still haven't been in one of them newfangled digital theaters. Seattle has crappy theaters.
Re:Jobs showed it at MacWorld (Score:2)
Re:Like rain on your wedding day (Score:5)
"10,000 spoons when all you want is a knife", how is that ironic? It would only be ironic if later you discovered that a spoon would have done just as well for say, opening a can of paint.
Re:PRMan does raytrace - if the render supports it (Score:2)
If you read the RenderMan Interface Spec. (V 3.2) the entry for trace() reads as follows:
So, you can call trace() in prman, but it's not going to do you any good.
That said, it is possible to write a ray tracer in the shading language! This has been done, in fact, by an insane person named Katsuaki Hiramitsu [http]. This shader, however, does not use the trace() call. The trick lies in actually defining the objects you're going to do ray tracing on in the shader along with your own version of trace(), which is, by necessity, intimately bound to the type of object you've defined.
So, saying the shading language can ray trace is like saying you can keep yourself alive for a while by eating selected portions of your own body. It's possible, but certainly pessimal.
Gameplay, not graphics (Score:3)
I still play Ms. Pac Man, but I hardly ever play games from just five years ago.
Graphics are cool and all, but they're essentially just pornographic. Not in the sexual sense, but pretty graphics just sit there vacuously to amuse your eyes. As has been said long and loud, game developers should strive to focus at least as much on gameplay as they do on making their graphics cutting edge. Give the user an elegant interface, something fun to interact with, something new, and something challenging.
This will make the big brother MSN telescreens fly (Score:3)
Remember: Freedom is slavery, War is peace, and Ignorance is strength!
Now I need to stop goofing around here on the slashdot insoc-msdn party news and go back to work studying the 11th edition of NewSpeak by MSN expedia. I keep hearing people here on slashdot speaking in oldspeak.
You people all need to learn how to excell (smart tag link to Microsoft Office homepage) on what you do to learn and explore (smart tag link to Internet Explorer site) your newspeak party language. With free (link to how free you are with Hailstorm/.NET) enterprise and innovation to lead the market, great Microsoft can actively access (link to MS Access) all the information we need. We need an active innovator to actively explore, and actively lead the market, and they ask that we all support the revolution by your activation subscription.
See, that wasn't hard. You all need to speak newspeak and only use these adjectives: innovation, lead, explore, access, active, word, excell. This will make thoughtcrime impossible. Less is better. For something doubleplusuninnovative like Linux you should not use the words bad or sucks. You are all required to use the words above with the -un extension. If something is really innovative you need to say doubleplus innovative, or if it's really bad, doubleplus uninnovative. Everything non-Big Brother is just plus uninnovative. So remember, it's not GNU/Linux but doubleplusuninnovative GNU/Linux. Now let's hear you all respect Big Brother. After my newspeak lesson I will play some video games and render Linus doing doubleplusuninnovative things to scare people, so that Bill Gates can actively explore my record and I can be considered a loyal member who doesn't doublethink.
Re:Wow.... (Score:2)
Well, I had to throw my TNT2 card into the trash because it had unstable drivers. I recently built a low end machine for my wife to play everquest, and since eq doesn't need a high end graphics card I purchased a TNT2. About 50% of the time that she zoned or started the game her comp would get a fatal exception 0E. In fact it would get this fatal exception:
A fatal exception 0E has occurred at 0028:C0006EB2 in VXD VMM (01)+ 00005EB2
After days of playing with it - buying new memory, trying everything I could - I finally found out this is a problem the TNT2 card has been having for years. I would say the drivers are relatively unstable. I bought a Voodoo5 card for the machine and it hasn't crashed since. If you go to this link [techadvice.com] you'll see that the only workaround I could find (changing the AGP aperture size and disabling video caching did not work) was to TURN OFF HARDWARE ACCELERATION. ROFL! Why buy a 3D card at all if you have to turn off the hardware acceleration to get it to work properly? You can't even run EQ with the acceleration turned off. And just to prove that the problem is not EQ's, I had the same problem when running Half-Life on the card.
It saddens me that Nvidia is quickly gaining a monopoly on the graphics industry, because I truly do not want to purchase another card from this company after they knowingly allowed the old TNT2 cards to have driver problems without fixing them.
Re:Wow.... (Score:2)
Heheheh.. Yeah, the Voodoo5 is AGP. I don't think it's the TNT2 chipset itself; the way the specific card vendor (IOMAGIC) integrated the chipset into the card just doesn't work with the drivers - i.e., the TNT2 chipset works fine, but the IOMAGIC card has problems with the drivers. I agree that it is probably a problem with the IOMAGIC card, and not Nvidia, but the people who purchased IOMAGIC cards need support too!
If you see my name up there, I develop device drivers for PCI cards, so uhh.. I think I can tell the difference between an AGP slot and a PCI slot.. heheheheh.. funny to think about trying to fit a keyed AGP card into a PCI slot.. I'm certain the pin locations are different, so if you did manage to get it in I don't think you would see ANY graphics..
The machine had 256 megs of RAM, and I tried all the various AGP aperture size settings with no success.. the only way I could get the card to work reliably was to turn off hardware acceleration in Windows altogether, which wasn't an acceptable fix for me.
As for the card, I've seen several IOMAGIC cards have the same problems (3 or 4 different cards that people have had) in some 3D games (it seems EQ and Half-Life are the worst). Again, I think the chipset is probably okay, but the drivers didn't seem to work with IOMAGIC's specific implementation of the card.
As to the person who mentioned that it was funny that I got a Voodoo5.. Well I had only built the machine for 1 purpose, and that was to play Everquest. I purchased the Voodoo card AFTER Nvidia had purchased 3dfx, but I got it because I knew that card had very few issues with EQ, compared to the number of issues I saw people were having with the GeForce2 cards at the time.
Re:Wow.... (Score:2)
I would guess you either had a heating problem, old drivers lying around in the windows directory, or some motherboard/card interaction. nVidia has probably the best driver support out of all the consumer card companies, compared to say 3dfx who couldn't even be bothered supporting my Voodoo2 properly a little over a year after it was out and never did get a real OpenGL implementation going.
Well.. I'm 99% sure that it was a driver problem, because if you follow my link lots of people are having the problem, and I have seen it happen on several different cards on different machines. All of the cards were from one vendor though, IOMAGIC. Now, that aside, you guys have definitely made me MUCH less worried about my decision to purchase a GeForce3 card for the new machine I am building. I have honestly been very worried about purchasing any Nvidia chipsets since I saw the problem. I know that lots of people HAVEN'T had the problem, but the problem definitely does exist...
Re:It's not just the rendering. (Score:2)
The hard(er) part is getting the models and animations to look "right." The easy(ier) part should be rendering the textures for those models. Right now they're both expensive. The GF3 should lower the cost of the latter.
Re:Man power... (Score:2)
I totally disagree. Take a look at the Quake movies that were made. Before Quake came out, how many thousands of man-hours would it have taken to render and animate those movies "the old-fashioned way"? A shitload! Then Quake came out, and you could get semi-realistic 3D graphics, and people could "act out" the scenes with rudimentary tools.
Skip ahead 5-10 years, where the CPU power and "acting" tools available are much more sophisticated. Are you still going to claim that "pixar-level animation" cannot be done with a good 3D model artist, a scene artist, and an electronic actor?
Do we really want realistic rendering in games? (Score:3)
If we do get game companies trying to produce games with "realistically rendered" graphics, won't they need budgets of 100 million for each game to develop all of the data (detail, world, etc.) that the hardware will operate on? Then we'll be walking into the software store laying down $5,000 for a game instead of $50.
Re:QED (Score:2)
But the first person who wants to model proper rainbows, sun-dogs, or coatings...
The really hard part about QED isn't the iterations. It's defining the integration regime in the first place. I haven't looked in a couple of years, but I bet even the best Feynman diagram tools still can't work without heuristic input.
--Blair
"Luxo, Jr. always wanted to grow up to be an electron microscope."
Re:QED (Score:2)
It's all based on waves (classical theory). And it seems to be a surface phenomenon only, and dependent only on the geometric surface description.
Real QED would include interactions of photons with the subatomic particles of the atoms within the body of the material.
Diffraction and thin-foil effects are too-simple examples, with classical analogues. Phase-conjugate mirrors or simulacral holograms; now there you have to have QED.
This isn't to take away from Stam's work. It's gorgeous. The idea of walking into a bar with a double-barrelled shotgun and blowing away the pseudo-retro Wurlitzer with the wave-rendered CDs rotating on top, wave-rendered shards of CD spinning through space...
The idea of finding a secret because of its slight change in lustre vs its surroundings when the overhead lights dim and an accent spotlight becomes dominant...
The idea of being able to tell painted plastic from painted metal and painted wood, or black dirt from gunpowder and incinerated-demon charcoal...
Someone get nVidia on the horn.
--Blair
QED (Score:3)
--Blair
Like rain on your wedding day (Score:3)
Believe me, there is a lot of artistic skill that goes into making animation like that, from storyboarding to complicated modeling and animating to directorial talent and writing ability.
Just because Avid-style editing has been brought to the desktop, doesn't mean what you see on iFilm is as good as what you see in theaters. Most of the time it isn't. It's all about the talent, not the tools.
Case in point: Robert Rodriguez [amazon.com], who scraped together only $7,000 to become one of Hollywood's hot young directors. For those who don't know about him, his latest film was the hit Spy Kids.
The problem... (Score:2)
Real time Rendering IS here (Score:2)
Re:It's not just the rendering. (Score:2)
This basically proves your point, but just to throw my own slant on it: having worked on small 3D rendering projects myself, it's never been the graphics hardware that made anything I've worked on more realistic, and only very rarely things implemented in code (deformations, etc.) - it's been texture detail.
A quick explanation, I feel, is in order before I leave work. First and foremost, the most important aspect of any 3D environment is its actual construction - in this case, polygons. To the end user, though, a bunch of polygons will look like just that: a bunch of polygons. With smart texturing (explanation of THAT in another minute), those polygons can actually begin to LOOK like something.
Now, smart texturing. This is the tough part, I think, but also the most rewarding. I define smart texturing to be using textures which simulate real life features. Not just having a brick wall where all the bricks look the same, but a brick wall where some of the bricks are broken, some are off color, some are missing, but NEVER in any repeating pattern. The last part is the most important, but rarely done these days. However repetition of textures is usually the first thing that triggers a message in the brain that says, "ooh, right. this is a videogame"
I first realized this a few years back when I was driving to work one morning. I was looking at the road ahead of me and was more or less studying the blacktop. I was trying to figure out why I had never seen a realistic-looking road in a videogame before. I realized that it was because most game roads consist of very simple textures: black surface, yellow and white lines. But the road I saw in front of me was drastically different: it wasn't just black, but a multitude of color. It didn't appear flat, but rather bumpy and "textured". It was covered in skid marks, and there were signs that accidents had taken place there before. The surface wasn't just some asphalt; it told a story. Humans had been there, damaged it, and that's what made it interesting to look at that day.
But I think we're still a ways off from being able to do away with the repetitive textures that dominate roads or brick walls in videogames. But sometime while texture mapping, experiment with details in the environment that show humans have been there: wear and tear. It's what makes it cool to see bullet holes in the walls and blood splattered all over the floor when you play Quake 3.
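A small sketch of the "never in any repeating pattern" idea: pick a per-brick texture variant from a hash of the brick's coordinates, so the variation is stable from frame to frame but doesn't tile on a fixed cycle. The variant names and weights are made up for illustration:

import hashlib, random

# weight "clean" more heavily; purely invented names and proportions
VARIANTS = ["brick_clean", "brick_clean", "brick_clean",
            "brick_chipped", "brick_discolored", "brick_missing"]

def brick_variant(x, y, seed=0):
    # hash the brick coordinates so the choice is deterministic per brick
    # but never falls into a repeating pattern across the wall
    digest = hashlib.md5(("%d:%d:%d" % (seed, x, y)).encode()).digest()
    rng = random.Random(int.from_bytes(digest[:8], "big"))
    return rng.choice(VARIANTS)

for y in range(3):
    print([brick_variant(x, y) for x in range(6)])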
Just some thoughts. Yeah, I rambled. But work is over now. Mission accomplished.
-NeoTomba