Graphics Software

3DFX Motion Blur In Action 110

Mr.Tweak writes: "TweakTown has some "exclusive" pictures of 3DFX's new Motion Blur effects on a 3DFX Voodoo5 in Quake 3 Arena @ 1280x980, 4X FSAA, and oh boy - it looks absolutely sensational and so futuristic! According to 3DFX, together with the Voodoo5 5000's support of FXT1(TM) and DXTC(TM) texture compression and textures as large as 2048 x 2048 texels, the T-Buffer(TM) can render 3D landscapes with unparalleled realism and with an impressive cinematic effect: Motion Blur. Their article quickly explains Motion Blur and has lots of pictures of the new feature, which is currently unreleased by 3DFX - or should we now say nVidiaFX?"
  • anything above 60 FPS is pointless

    Sorry. I can readily pick the difference between 60fps and 90fps, unless the monitor refresh is already at 60. It does make a difference - it means my mouse sampling rate is better synchronised with the monitor refresh.

    why people wouldn't want T-Buffer features

    I've got news for you. There's nothing special about T-Buffer. Everything that can be done with T-Buffer can be done with standard OpenGL calls with no significant performance losses. It's just marketing.

    No, no-one will support T-Buffer now that 3Dfx is gone. Practically no-one was supporting it anyway (I can't think of anyone). It simply wasn't worth losing a chunk of the potential audience, especially when the benefits are minimal and the vast majority are using NVIDIA cards anyway.

  • by toofast ( 20646 ) on Friday December 22, 2000 @06:16AM (#543588)
    oh haha I thought it was 6 with motion blur on it...
  • Heh, at first I thought you said 'my old mother's 3d card she got 3 years ago.'
  • I think so too; it doesn't even seem to be a real motion blur. It looks like they're just blending the previous frame with the current one, which, at 60fps, actually creates the appearance of smoother motion, but is not true motion blur. Real motion blur, on the other hand, would involve a Gaussian-style blur along the angle of motion; I believe there are Photoshop and GIMP plugins that do this (both approaches are sketched below)... implementing that in hardware would make this effect look twice as good. It's how rendered 3D movies get their "realistic" look (well, for motion, anyway), not through frame blending.

    Don't believe the hype.
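    For what it's worth, both approaches described here are easy to sketch in image space. Below is a hypothetical CPU-side C++ illustration (grayscale, and not anything 3dfx or NVIDIA actually shipped): the first function is the previous-frame blend that produces trails, the second is a Photoshop-style blur along a single motion direction.

    #include <algorithm>
    #include <cmath>
    #include <cstddef>
    #include <vector>

    // Hypothetical illustration of the two approaches discussed above:
    //  - "trail" blending: out = (1-a)*current + a*previous  (what the screenshots resemble)
    //  - directional blur: average taps along the motion vector (what Photoshop/GIMP do)
    struct Image {
        int w = 0, h = 0;
        std::vector<float> px;                       // grayscale, for brevity
        float at(int x, int y) const {
            x = std::max(0, std::min(w - 1, x));
            y = std::max(0, std::min(h - 1, y));
            return px[std::size_t(y) * w + x];
        }
    };

    // Frame blending ("motion trails"): cheap, but ghosts of old frames linger.
    void blendPrevious(Image& current, const Image& previous, float a) {
        for (std::size_t i = 0; i < current.px.size(); ++i)
            current.px[i] = (1.0f - a) * current.px[i] + a * previous.px[i];
    }

    // Directional blur: average `taps` samples along the (dx, dy) motion direction.
    Image directionalBlur(const Image& src, float dx, float dy, int taps /* >= 2 */) {
        Image dst = src;
        for (int y = 0; y < src.h; ++y)
            for (int x = 0; x < src.w; ++x) {
                float sum = 0.0f;
                for (int i = 0; i < taps; ++i) {
                    float t = i / float(taps - 1) - 0.5f;      // -0.5 .. +0.5 along the streak
                    sum += src.at(int(std::lround(x + t * dx)),
                                  int(std::lround(y + t * dy)));
                }
                dst.px[std::size_t(y) * src.w + x] = sum / taps;
            }
        return dst;
    }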

  • ...all I have to do is run it on a crappy LCD screen!
  • I can do this to the cursor in Winblows. Just turn on "Mouse Trails".

    Hah.
  • I think the motion blur thing is a bit gimmicky. It looks crap, and kind of like the mouse cursor trails in Windows.

    Also, I like the reflection of the bloke in the monitor; reminds me of a porno video I've watched where you can see a reflection of a bloke cracking one off on his sofa - quality.



    wray
  • If you could track down a hardware raytracer board
    Thank you for confirming my point.

    I guess you haven't seen Quake III recently - or any other game to come out in the last year.

    Yes, I've seen Quake 3 on my GeForce2. It would be realistic looking if the entire world were made out of molded plastic. On a brick wall in real life I wouldn't be able to use the corner to shave. Light does things in the real world that aren't represented at all with present 3d hardware. We're not going to fix the problem by increasing the framerate either.

    I'm not saying I'm not impressed by the technical feats of current 3d hardware, I'm just saying that it's not going in the direction I'm looking for. I don't need my first or third person shooter, or whatever other rehash they're working on right now, to have a higher framerate. I would like to use my video hardware for other things too. I want real lighting effects, I want realistic reflections instead of perfectly sharp ones, I want acceleration for textile motion. Is it wrong for me to want these things?

  • Those screenshots are pitifully bad. Not only are they small and low-res, but they look like they were taken with a Coolpix ten feet away from a crummy, warped 1980s-era monitor. What a POS.

    Also, that is not a good motion blur effect. It just echoes the image with decreasing transparencies. Quake is the worst game to demo this effect on, too. Those of us who live for driving games want motion blur bad. You just can't get the effect of going 200+ MPH without it. This 3dfx crap wouldn't help much, but a Photoshop-quality motion blur effect on a driving game would really be cool.
  • I can't get excited about motion blur. It's a hack, and even though it can improve animation at low fps speeds, it's not natural.

    The problem is that our eyes can track a moving object (even at pretty high speeds), and if you do, you'd expect the object to be sharp again, not blurry. With motion blur, the object will be blurry, even if you track it with your eyes, so it will be unnatural.

    The only real solution is more frames per second. Now, this is not easily done on a TV screen for example, so I can see where motion blur in Toy Story animations is a good thing. But not for computer games, played on a monitor that's capable of high frame rates. Instead of using hard- and software to perform motion blur, use it to increase the frame rates...
  • by ALG ( 41966 )
    Now I don't need to drop a hit before I play Quake!

    ALG
  • by Chris Johnson ( 580 ) on Friday December 22, 2000 @06:50AM (#543598) Homepage Journal
    The effect is _still_ being demoed at _ten_ times the time duration it should be, just to make it obvious. If they don't get a clue they'll never figure out how to make it worth a damn- though frankly we're not likely ever to see this in the market now that nVidia controls the technology- I think they are going to suppress it and sue anyone else who tries to compete with them using the idea.

    All they needed to do was use the T-Buffer in line with the way they use FSAA, and keep the last rendered frame to average with the current one. It'd only look (worse than) those awful screenshots at rotten framerates like 9fps, but as you went above 30fps it would start looking just like unblurred rendering _except_ that fast-moving detailed textures would be _softened_ dynamically. Combining that with 2X FSAA would be a fantastic effect with a great deal of 'you are thereness', but NOOOOO... can't these people check with other professionals other than just computer programmers? Ask any cinematographer if that's a normal cinematic level of photographic motion blur :P they'll look at you like you are _insane_. Almost nothing moves through the frame enough to produce _that_ much blur.

  • Part of the reason "pro" gamers have horrible display quality is to attain super high (120-200) fps, which can actually modify some of the physics in Quake 3, allowing you to make certain jumps that can at times be especially useful.
  • Basically, any vertex which faces backwards from the direction of movement is transformed by the previous frame's matrix. Then you put in some alpha blending and you've got motion blur (a rough sketch follows below). NVidia has a demo of this on their page, and it looks great in action. It actually looks like a blur rather than four transparent images of an object. And it doesn't require four complete rendering passes, either.

    The one problem is that it requires alpha blending on moving objects, which means that you have to sort the objects by z-distance before rendering. Not too hard to do if you take a few shortcuts, but doing it perfectly can be very difficult depending on the situation.

    ------
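    For illustration only, here is a rough CPU-side C++ sketch of the geometry-stretching idea described in this comment. Real implementations do the per-vertex work on the graphics chip, and the matrix and mesh types here are made up for the example, not any vendor's API.

    #include <array>
    #include <vector>

    struct Vec3 { float x, y, z; };
    using Mat4 = std::array<float, 16>;   // column-major 4x4, positions assume w = 1

    Vec3 transformPoint(const Mat4& m, const Vec3& v) {
        return { m[0]*v.x + m[4]*v.y + m[8]*v.z  + m[12],
                 m[1]*v.x + m[5]*v.y + m[9]*v.z  + m[13],
                 m[2]*v.x + m[6]*v.y + m[10]*v.z + m[14] };
    }

    Vec3 transformDir(const Mat4& m, const Vec3& v) {   // rotation/scale only, no translation
        return { m[0]*v.x + m[4]*v.y + m[8]*v.z,
                 m[1]*v.x + m[5]*v.y + m[9]*v.z,
                 m[2]*v.x + m[6]*v.y + m[10]*v.z };
    }

    float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    struct Vertex { Vec3 position; Vec3 normal; };

    // Vertices whose normals face away from the direction of motion are left at their
    // previous-frame position, stretching the mesh along its motion path. The stretched
    // geometry is then drawn with alpha blending to fake the blur.
    std::vector<Vec3> stretchForMotionBlur(const std::vector<Vertex>& mesh,
                                           const Mat4& current, const Mat4& previous) {
        std::vector<Vec3> out;
        out.reserve(mesh.size());
        for (const Vertex& v : mesh) {
            Vec3 now    = transformPoint(current,  v.position);
            Vec3 before = transformPoint(previous, v.position);
            Vec3 motion = { now.x - before.x, now.y - before.y, now.z - before.z };
            bool facesBackward = dot(transformDir(current, v.normal), motion) < 0.0f;
            out.push_back(facesBackward ? before : now);
        }
        return out;
    }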

  • It doesn't add anything to the game; it does, however, make it look worse - at least from the perspective of the player models.

    I can understand implementing motion blurring, BUT only for objects that are supposed to be moving very fast. Perhaps if it could be tied to objects by assignment it would be an excellent addition to the game. Items would be any super-fast projectile (not beam weaponry), fragmentation projectiles from grenade-type objects, and explosive debris from detonating vehicles or the destruction of large objects.

    Otherwise keep it, I don't need something else to spoil my aim....
  • I've got news for you. There's nothing special about T-Buffer. Everything that can be done with T-Buffer can be done with standard OpenGL calls with no significant performance losses. It's just marketing.

    OK, so that's why no one but 3dfx brought out anything that even resembles T-buffer results, and why NVIDIA has major problems with FSAA... marketing hype... uh huh. I'm sure you'll find the "vast majority" is using ATI cards, as most OEM setups carry them - they're the Microsoft of video cards.
  • Motion blur effects in video games, whether done through hardware or software (see old versions of Motorhead for the PC platform) are always exaggerated, mostly for the same reason that every game that came out in 1997 looked like it took place in a disco - because colored lighting had just become a common effect in games and no one had learned to use it with restraint. Maybe they will eventually. Keep your fingers crossed.

    I'm a little surprised by the tone of your post, though. It seems like you're saying that the technology here sucks (or at least, the application thereof), but you predict that nVidia will sit on it. Maybe they will, but I suspect that since they own the patent now, they'll include it on future boards.

    Interestingly, I think that this is one refinement that is going to see very little acceptance among people who are serious about winning. If you're really trying to up your frag count, you don't mind higher frame rates, or higher resolutions, or better texture management, but a feature that blurs details of the scene - that sounds like a really bad idea. That's why I stopped using motion blur in Motorhead, actually: what good is a feature that prevents you from being able to clearly identify an object you're about to crash into?
  • Ummm, actually this is a very important development, learn a little about animation and you'll see that.

    Humans do, in fact, see motion blur. The brain just filters it out for you. Humans can't see much better than 60-70 frames/sec, and interpret anything happening faster than that as motion blur. Wave your hand in front of your face as fast as you can. You should see some blurring.

    This is extremely important for very fast-paced action, like Quake, where you can have fights where something will happen so fast that you only see it for 1 or 2 frames. The problem is this - if you only see it for a few frames, that's not enough data for your mind to assimilate the information. All you see is a couple of still frames one after another, very quickly. You're not quite sure where anything's going or where they're coming from. Ever watch the sword-fights in Beast Wars? They suffer from this badly, as does a lot of 3d combat. If those images were blurred, the mind could fill in the gaps between the frames better. This is also why framerates higher than 60 are often a good thing - if the computer won't blur it, then the human eye will instead.

    Mind you, the effect they've got there looks a little bad. They're handling it simply by fading previous frames, when it's actually much more complicated.
  • Too bad that 3dfx is at this moment being cut up and sold off by Nvidia. This T-buffer story is too little, too late.
  • Since when does taking pictures of your monitor count as a screenshot?
  • by citizenc ( 60589 ) <cary&glidedesign,ca> on Friday December 22, 2000 @05:45AM (#543607) Journal
    Interesting... when he says 'exclusive' he really means 'nobody else bothered to download the demo, which is freely available ALL OVER THE INTERNET'

    From FilePlanet [fileplanet.com]:
    This is the early release of the stand-alone Q3Test that does not require the game to play. The difference is 3dfx Interactive's motion blur technology implemented into the game. Those using Voodoo 3 and up based video cards will see a cool motion blur from 3dfx Interactive's T-buffer being used. It requires the use of 4sample Rotated-Grid Full-Scene Anti-Aliasing. This is not a patch but a stand-alone demo of the early Quake 3 Test.
    Download the demo here [fileplanet.com].

    ------------
    CitizenC
  • All you people who say this is dumb are the ones that keep the output of 3d hardware looking shitty. I don't care what any of you say, when compared to a real world scene or a raytraced rendering GeForce2 output looks like shit. No, we can't expect realtime raytraces yet, because our computers are much too slow, but the T-buffer from 3dfx was an attempt at being able to simulate such things. Do you all want to look at unrealistically sharp corners and harsh angles in your games forever?

    Sure I'm being obnoxious, sure this may be flamebait, but I'm selfish, damnit, and I want realistic-looking realtime 3d. The only way I'm gonna get it is if other people want it too.

    Say all you want about 3dfx, but their T-buffer tech is the only innovative thing to come out of the video card industry in recent years. Unfortunately, FPS stats sell video cards these days, not image quality.

  • From the pictures it looks like motion blur is total bunk. Why would I want to screw up my aiming? I can only see a few applications for this blur crap, such as an anime game or something like The Matrix.
  • #!/bin/sh
    # s1a5h D07 3d170r v3r51on 2.3.3020
    while [ "${keyword}" = '3dfx' -o "${keyword}" = 'MPAA' -o "${keyword}" = 'linux' ]; do
        post ${article}
    done
  • "BECAUSE FSAA WAS CRANKED TO 4X AND THEY WERE RUNNING 1280x980"

    Is that really going to give you only 9fps? I know I've cranked it up above 800x600 with max FSAA and still had something playable.

    And from what I recall, motion blur was supposed to be a free calculation if you were already doing FSAA, wasn't it?
  • Motion blur is an integral part of fooling the brain into thinking it is seeing real motion. It's like looking out a car window as you move and seeing things blur in the extremes of your vision; without that blur there is no real sense of speed or motion.
  • Yeah, just like in real life, when something is moving, and you just can't quite touch it because you touch the blurs. Oh wait, you're not stupid. The brain recognizes motion blur because we experience it in real life.

    And another thing, how can you judge this by the screenshots? Hmm? You don't know what motion blur would be like to play with. It could be a great experience, but you're an NVidia bigot, so you'll never try it.
  • The reason the framerates are so low is because the screenshots are taken with 4x FSAA on. If FSAA were off the numbers would be more like 40-50fps. I would imagine that the motion blur would have a framerate hit, but it is not going to drop it to 1/4 what it should be in the first place.

    Also, not only were they running Quake 3 with 4x FSAA on, they were doing it in 1280x960. The poor Voodoo5 is heavily strained when running at that resolution anyway; then the load is increased some more by the motion blur and quadrupled by the FSAA, so you are going to see crappy framerates (a rough back-of-the-envelope calculation follows below).
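    Purely as an illustration of that arithmetic, here is a tiny C++ back-of-the-envelope calculation. The fill rate and overdraw figures are assumptions for the example, not measured Voodoo5 numbers, and real performance is also limited by memory bandwidth and the extra blur passes.

    #include <cstdio>

    int main() {
        const double pixels      = 1280.0 * 960.0;  // ~1.23 million pixels per frame
        const double fsaaSamples = 4.0;             // 4x rotated-grid FSAA
        const double overdraw    = 3.0;             // assumed average overdraw in Quake 3
        const double fillRate    = 667e6;           // assumed samples per second the card can fill

        double samplesPerFrame = pixels * fsaaSamples * overdraw;
        std::printf("fill-rate-bound fps: %.1f\n", fillRate / samplesPerFrame);
        // With these assumed numbers the upper bound is already only ~45 fps;
        // add bandwidth pressure and the blur passes and single-digit framerates
        // stop looking mysterious.
        return 0;
    }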

  • When I see real life motion blur, I don't see two or three discrete after images. I see a smudge. Perhaps it would be more accurate for 3dfx to have called this motion-strobe or motion-stutter.
  • Let me just say that I think those screenshots were absolutely terrible. It would be nice if the motion blur applied only to specific objects in the game, or was only applied at certain times. Let's say you're shooting a (fictitious) slime cannon - would you want that to blur? Does slime really travel that fast, unless of course it's supersonic slime? Now if I were shooting a rocket launcher or something, it would make more sense. People moving or running at a regular pace should not blur nearly as much as they were in the game! They had a completely separate "blurred" image! And the one shot they had of some very busy scene was very confusing. All I saw were a few balls of flame and this giant ball of stuff in the middle, and I couldn't figure out what was what.

    And as mentioned before, 9 FPS just does not cut it. I think the best FPS they got was around 12, and that STILL sucks! Someone else also said they'd rather get blurred motion effects from being drunk and playing the game; I would too, at least it would look better and the frame rates would be higher! Either way, I think it's wasted technology. Well, OK, maybe not wasted, but I'm sure it's not worth the inflated price that this "breakthrough technology" will drive it up to.
  • yeah considering that most people use high speed film to eliminate motion blur...

    the fact that while playing a game you would want to see that is ridiculous. I do NOT think it is cool looking, I find it horribly annoying.

    Then again, I don't play Q3 for the simple fact that the lag does not "feel" like lag, seeing my lag wouldn't work too well I don't think ;-)
  • Damn right. You'd get really confused, especially if an internet game got laggy. It doesn't even look that great anyway, it just looks tacky. You don't get motion blur like that in real life, do you? Last time I saw somebody running past I saw pretty much what you get in CounterStrike. It's like when they brought in lens flare... wooh, wow, whizz bang crackle and pop, adds nothing to the game, it just satisfies the dumbass kids who would pay money for it. What a silly idea. Let's hope NVidia never takes this idea up, or you'd no doubt have to spend money even though you'd just disable it later!
  • ...in the screen shots? It hung nicely between 8-15 fps. Very very nice. I blink my eyes a lot so 5 fps will do fine for me.
    Old technology dies quick in today's market. I wonder what 3dfx will have to offer NVIDIA besides warm bodies?

  • He sounds more like an adapted press release from 3dfx.
  • Jesus Christ... think, people... remember that little movie Jurassic Park? To get the motion of the dinosaurs to appear real they used a blurring effect. Regardless of the framerate, the motion does not look natural. It looks CG... throw a little (not a lot, like in the screenshots) blur at it and it appears softer... more like real life...
  • "Actually anything above 30 FPS is not detected by the human eye."

    You could not be more wrong.
  • Did anyone else see the FPS on the screenshots? Is the 'pretty motion blurring' worth it if the FPS drops to 12 or lower???

    Maybe it's not so bad at 800x600... does anyone know?
  • (we put the weak in tweaktown...)

    Tweaktown threatening lawsuits because they weren't linked by a site [detonate.net] from Detonate.net [detonate.net]
    and an email sent out by tweaktown and posted on detonate.net that prompted the reply above [detonate.net]

    Makes for some humorous reading ;)

  • I kinda thought the same thing, but the words that sprang into my head weren't as harsh as yours. :P
    To me, it just looks like an excuse for 3Dfx to put the buzzword "Motion Blur" bullet on the back of their box.

    For starters, I for one would not appreciate trying to aim my railgun at 6 models, trying to figure out which one is going to score me a frag.

    Secondly, it's just not that impressive. Whoopee, the Voodoo 5 can render several time-lapsed translucent models of a character! I'm positive my GeForce could do that just as easily without having special support for it in hardware.

    I would say maybe that's the reason 3Dfx didn't end up supporting it. It just isn't interesting or useful enough to justify. Slashdot must be having an incredibly slow day.
  • The difficulty (level) of the achievement is not really that interesting, nor is the effect pleasant. Now, if the blur was ONLY for those "instances" where you see a few frames (not enough to register, as you put it), that might be nice, but equivalent to a crutch. Part of the FPS is being able to perceive as well as to aim, no?

    Often wrong but never in doubt.
    I am Jack9.
  • If they don't get a clue they'll never figure out how to make it worth a damn- though frankly we're not likely ever to see this in the market now that nVidia controls the technology- I think they are going to suppress it and sue anyone else who tries to compete with them using the idea.

    Prior art. SGI has been doing this for years. The T-buffer stuff is all well-documented OpenGL tricks with accumulation buffers.

    All they needed to do was use the T-Buffer in line with the way they use FSAA, and keep the last rendered frame to average with the current one.

    Ack! No! You're confusing motion blur with motion trails. Motion blurred objects shouldn't linger from past frames. It should only represent movement between frames. Anything from the previous frame should be gone from the present frame.

    can't these people check with other professionals other than just computer programmers?

    They can and they do. The problem is not with the motion blur technique. The problem is with the fact that they can't do it at an acceptable frame rate. Obviously, you haven't checked with either a programmer or a cinematographer.

    Almost nothing moves through the frame enough to produce _that_ much blur.

    At 9fps? You'd better believe things can move that much.
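    For reference, this is a minimal sketch of the standard accumulation-buffer trick referred to here, roughly as it appears in the old OpenGL documentation. drawSceneAt() is a hypothetical callback that renders the world at an intermediate point in time; the displayed frame is composed from several temporal sub-samples, so nothing lingers from the previous displayed frame.

    #include <GL/gl.h>

    void drawSceneAt(double t);   // assumed: renders the scene at time t

    void renderMotionBlurred(double frameStart, double frameEnd, int subSamples) {
        glClear(GL_ACCUM_BUFFER_BIT);
        for (int i = 0; i < subSamples; ++i) {
            double t = frameStart + (frameEnd - frameStart) * (i + 0.5) / subSamples;
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
            drawSceneAt(t);                         // render one temporal sub-sample
            glAccum(GL_ACCUM, 1.0f / subSamples);   // add it, weighted, to the accumulation buffer
        }
        glAccum(GL_RETURN, 1.0f);                   // write the averaged result back to the color buffer
    }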

  • Well, let me ask you:

    In 1/60th of a second, can you tell if 'Thresh' is turning towards you or away from you?

    Given _good_ motion blur (i.e. absolutely minimal) this is going to mean an additional type of motion cue. You're talking about FPSes (apart from Motorhead, evidently) and FPSes are the one place where you most care whether the gun barrel is swinging toward you or moving parallel to you. You don't want to take too many frames to figure this out. Motion blur is an additional movement cue that accentuates the perception of motion visually- for instance that gun barrel- moving parallel to you it's going to have a tiny softening. If it suddenly turns toward you, it foreshortens and key details stop moving relative to you and will become clearer. These are subliminal details but perfectly legitimate- except that the sort of 'motion blur' you're seeing currently is miserably inadequate at rendering any such cues because it's 10X too diffuse and wastes way too much framerate.

    Yes, I do think nVidia will sit on it. Call it fiduciary duty- why spend the money to implement this properly when you can just sue anybody to stop them from using it to compete with you?

  • You can actually witness this yourself if you have Q3: http://www.3dpulpit.com/Q3Arena/3dfx/index.html
  • by plone ( 140417 ) on Friday December 22, 2000 @05:27AM (#543630) Homepage
    Isn't it odd that the submitter of the story is also the owner of the website where the page is hosted? It seems to me as if it is a quick attempt to generate some banner ad revenue. Especially since screenshots of this sort have been around since the initial introduction of the T-buffer.
  • Amen, brother. Usually I have to drink 5 or 6 pints of Keith's pale ale before I start seeing that sort of "effect" =)

  • I'm still using my old Monster 3D that I got 3 years ago. I guess it might be time to upgrade now.
  • Movies need motion blur because they're done at 24 fps. TV is a little better at 25 or 30. Gamers usually want 60 fps, which is close to the refresh rate of a monitor. If a monitor is refreshing the screen fast enough, you don't see the picture flashing on and off like a strobe light, even though your eyes are very sensitive to changes in contrast and brightness. A blank page in Word is flashing on and off, and it still looks like a steady piece of paper. Why wouldn't an object moving across the screen look real?
  • Ack - I meant if you have a Voodoo5
  • I looked at the screenshots and it looks like the blur would be more annoying than anything else.
    Also, wouldn't it screw with your aim in Quake? I would turn it off personally.


    --

  • What exactly is the point of this thing? From a functional standpoint, it's bunk. Motion blur is only useful if it's placed against a, erm, "regular" background, and from these screenshots it looks like that doesn't really exist. About the only place this would serve a function would be at a trade show or something:

    nVidiaFX rep: "Look! It's all blurry!"
    Collective audience: "Oooooh. Indistinguishable features. Me want!"
  • Motion blur will allow you to get away with lower framerates to a degree. After all, it does what your eyes do which is time average over a number of frames.

    I do agree that 9fps is pretty shitty. I figure you'd need to be talking 20 at the extreme lowest before you're any better off than just pumping out the framerates that are available today

    Rich

  • by MikeTheYak ( 123496 ) on Friday December 22, 2000 @05:51AM (#543638)
    Motion blur is antialiasing. Instead of in the spatial domain, as most people think of antialiasing, it's done in the time domain. A blurry monitor doesn't recreate the effect any more than a blurry monitor improves image quality by antialiasing edges of polygons.

    Like the antialiasing that gets rid of jaggies, it can occasionally be a bad thing, but in general it's a good thing. At 9 fps, it's a bad thing; it's like rendering a picture at 1/4 resolution for the sake of removing jaggies. However, as hardware gets faster, and motion blur at 30fps becomes possible, it does improve the user experience. You don't really see the blurring, but the movement becomes smoother.

    Is it really beneficial? Consider this. People start complaining about frame rates when they drop below 40fps or so for a first-person shooter (which I won't abbreviate to a TLA for obvious reasons). Movies, including the CG parts, are played at only 24fps, yet the motion looks smoother. This is because cameras naturally add motion blur (because of a finite shutter speed), and the folks who add CG elements take great pains to include motion blur effects.
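    To put a rough number on the finite-shutter point, here is a small worked example in C++. The 180-degree shutter is the conventional film setting; the object speed is an assumption purely for illustration.

    #include <cstdio>

    int main() {
        const double fps          = 24.0;
        const double shutterAngle = 180.0;                        // degrees the shutter is open per frame
        const double exposure     = (shutterAngle / 360.0) / fps; // ~0.021 s of smear per frame
        const double screenWidthsPerSecond = 0.5;                 // assumed: object crosses half the screen each second

        std::printf("exposure: %.4f s, streak length: %.1f%% of screen width\n",
                    exposure, 100.0 * screenWidthsPerSecond * exposure);
        return 0;
    }

    With these assumed numbers the streak is only around one percent of the screen width per frame, yet that small, consistent smear is what keeps 24 fps film motion from looking like a slideshow.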

  • I'm not so sure how I feel about motion blur in videogames... I could just get a crappy monitor for that... I mean, it is a neat visual effect, and I think it's pretty crazy that they can do that in hardware now, but if I remember correctly, that same technology could be giving us depth of field or soft shadows...
    How do other people feel about motion blur? I guess it is sort of handy from a gaming point of view because it helps you estimate, at a glance, the heading and velocity of an object so you can lead it with the gun or avoid it with the car or whatever...
  • They've finally managed to emulate the slow refresh of older LCDs... Now we can all enjoy the sheer gaming nirvana of playing first-person shooters on aging laptops!! I've been waiting for this day...
  • ... this is nothing that can't be done on 'normal' hardware at similar performance levels to what 3Dfx hardware is getting.

    The blur would seriously piss me off while playing - I'd try to rail someone, and hit their shadow instead. Looking at some of the 'hardcore competitive' players in my area, with GeForce II's, etc, who play 640x480, minimum texture detail and vertex lighting - I really don't think they give a damn about the enemy models being smeared all over the screen.

    I've said it many times before. Death to 3Dfx! I finally got my wish!

  • I couldn't help but notice, on the screenshots, the FPS... 8-10 fps. That's totally unplayable. Some people consider 30fps unplayable. Besides, I like to have precise aim on my opponents. The blur would just confuse me when I appear to have 25 enemies lobbing a few rockets around my head.

  • Looks more like badly compressed JPEGs to me!
    Does everything have to be blurred?
    Could be an awesome effect if used appropriately and in small quantities, rather than just because they can, as these shots appear to be.

  • Here's why it would actually make sense to take the "screenshots" when the game was running at 9fps - notice that the blur looks awful? That's because it's aggregating frames that are 0.1s apart. So you get ass. But it proves that the "screenshot" is running motion blur. If it were running at an acceptable frame rate, it wouldn't be so noticeable - better gameplay, worse "screenshot".
  • Just to be fair, there was one shot @ 16fps.
  • All you people who say this is dumb are the ones that keep the output of 3d hardware looking shitty.

    No, the fact that Voodoo cards have only had a stencil buffer for a few MONTHS has kept things looking shitty, whereas the rest of the industry has had it, and other elementary features that 3Dfx 'forgot' so it could get higher fillrates, since 1998.

    I don't care what any of you say, when compared to a real world scene or a raytraced rendering GeForce2 output looks like shit.

    That's because it's generated several times faster. If you could track down a hardware raytracer board, you'd probably be pleasantly surprised by the performance level.

    Current games don't contain enough information to warrant raytracing anyway. "Ooh - there's another shadow on the intersection of those two walls!" There's very little difference.

    but the T-buffer from 3dfx was an attempt at being able to simulate such things

    No, the T-buffer was an attempt by 3Dfx to market itself into a good position. It failed miserably. It doesn't offer anything at all to programmers except narrowing the potential market to Voodoo users.

    Do you all want to look at unrealistically sharp corners and harsh angles in your games forever?

    I guess you haven't seen Quake III recently - or any other game to come out in the last year.

    Sure I'm being obnoxious

    You sure are. And clueless. And a lot of other adjectives that would be suitably preceded by the word 'fucking'.

    their T-buffer tech is the only innovative thing to come out of the video card industry in recent years

    It's not innovative. It's not even new, and it's not properly done. It's too restrictive to be useful - do you see Carmack running around extolling the virtues of T-Buffer? No. He's focusing on what people actually have - the GeForce line. I believe he described T-Buffer as a 'simple accumulation buffer' - ie. something that any old 3D card can do.

    As evidence, notice that the Motion Blur Q3 also runs on Voodoo 3's - which don't have T-Buffer. Coincidence?

    Unfortunately, FPS stats sell video cards these days, not image quality.

    Damn right. We want our games fast, smooth and sharp. FSAA makes all three points worse, and supposedly 'improves' image quality. It's pathetic. And now we're expected to believe that blurry images are better. What next? "Black screen, the ultimate high! Discover intellectual wonders in an empty screen on your NEW IMPROVED* (compared to S3 Virge) Voodoo 8 With External Power Supply And 128 Processors! And No T&L So If You Don't Have A Quad Athlon 2GHz, You're Fucked!"

    Have a nice day.

  • Does anyone else think this visual effect "feature" is cheesy, cartoonish, and overall really stupid looking? Looking at it makes me think of really bad cartoons. Maybe it'd actually be better playing the game than seeing stills...

    -- iCEBaLM
  • No, that's "why you want (calculated motion blur) or (higher resolutions and faster frame rates)". I love your claim that without motion blur, there is no sense of speed or motion, by the way. Excellent use of hyperbole to disarm whatever point you may have had.
  • By the year 2000, we'll all have flying cars, video phones, and -- as shown in the Bugs Bunny cartoon from circa 1945 -- hunting rifles that shoot lightning bolts.

    Just what is considered "futuristic" looking nowadays? When you look at old comic books, etc., it's bright, gleaming stainless steel and glass. Nowadays, it seems like gray, dark, dank Blade Runner-style is "futuristic". Probably says something about society. :)

    On the other hand, Star Trek-style futurism looks like a Doctor's waiting room, with a few extra technological features.


    --

  • Turns Quake into Tron - why would they call that a feature? I haven't seen anyone run fast enough to make a blur anyway. To be realistic the whole screen should blur when you change directions quickly, but screw that, I want to see what I'm shooting at.
  • People use high speed film to eliminate motion blur in still images, not animations. If it can be done without sacrificing frame rate, you WANT motion blur. Done properly, it is not noticeable, except that motion seems to look less choppy.
  • Because of course, our senses are real.

    And it was a bloody typographical error, not "poor form". I was transcribing your text, and missed typing a word which WAS considered in my point.

    I am looking out the window at cars driving roughly 60mph. I'm about 500 feet away from them. I have a REAL MOTHERFUCKING REAL sense that they are moving, and yet, I don't witness any motion blur. Of course, your claim is that the motion blur is STILL THERE, it's just so SMALL that I don't NOTICE it. However, if I move my cursor at the same speed across my monitor, I get EXACTLY the same sense that it is moving. I don't know how much more REAL MOTHERFUCKING REAL my senses get than the degree to which I am able to sense them.

    ps dont make grammatical errors when yur flaming someone because of a typographical error.
  • Oops, my bad - the threatening is in the second one and the first one is just a whiny message to detonate.net.
  • I remember seeing motion blur sometimes while playing Carmageddon 2, but it was very intermittent. It was weird seeing it go blurry for a second, and then back to normal when the opponent car got closer to you.

    Really, why are the screen shots that of a camera taking pictures of a monitor? How do you expect to see anything clearly, especially with this "motion blur" he's talking about? Pretty bad article.
  • by mjh ( 57755 )
    The most interesting thing that I see in these pictures is that they mention the frame rate. The pics that I see have 8 FPS, 19 FPS, and 12 FPS respectively.

    That's effectively unplayable.
  • Uh, motion blur is a different feature from 3dfx's FSAA; that site is using both.

    That's a (I think good) design decision on the part of 3Dfx for the T-buffer. The cost is four samples per pixel. If you take the samples intelligently, they give you antialiasing, motion blur, depth of field, soft shadows, fuzzy reflections, and other things I may have forgotten, all for almost the same price as just the FSAA.
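    To illustrate the "take the samples intelligently" point, here is a hypothetical C++ sketch of how a fixed budget of four samples per pixel can be spread across several effects at once: jittered screen positions give antialiasing, a time offset gives motion blur, a lens offset gives depth of field, and an area-light offset gives soft shadows. The field names are made up for the example, not 3dfx's actual interface.

    #include <array>
    #include <random>

    struct Sample {
        float subpixelX, subpixelY;   // rotated-grid offsets       -> antialiasing
        float time;                   // position within the frame  -> motion blur
        float lensU, lensV;           // point on the lens aperture -> depth of field
        float lightU, lightV;         // point on an area light     -> soft shadows
    };

    std::array<Sample, 4> makeSamples(std::mt19937& rng) {
        // Fixed rotated-grid positions for the spatial part; everything else jittered.
        constexpr float grid[4][2] = { {0.125f, 0.375f}, {0.375f, 0.875f},
                                       {0.625f, 0.125f}, {0.875f, 0.625f} };
        std::uniform_real_distribution<float> u(0.0f, 1.0f);
        std::array<Sample, 4> s{};
        for (int i = 0; i < 4; ++i) {
            s[i] = Sample{ grid[i][0], grid[i][1],
                           (i + u(rng)) / 4.0f,     // stratified over the frame interval
                           u(rng), u(rng),          // lens aperture
                           u(rng), u(rng) };        // area-light position
        }
        return s;
    }

    Rendering the scene once per sample and averaging the four results is the same accumulation idea discussed elsewhere in this thread; the only difference is what varies between the samples.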

  • I'm afraid you're not going to get it. Even given that 3dfx have been demoing wildly overblown versions of this effect to make it more obvious (a _good_ motion blur would be maybe three very faintly overlaid images- no big streaky effects but _textures_ and _edges_ would soften in the direction of motion, which would not require anywhere near that much performance hit), why should nVidia ever let this see daylight when they can simply sue anyone else in the industry who ever attempts to bring it to the market? You're not going to get it. It's more profitable for nVidia to keep anyone else from getting it and not bother to come out with it themselves. Software patents at work for you again.
  • why don't you FUCKING inform me then MOTHERFUCKING JACKASS?

    Or bring your ass over here and show me a 1200dpi image at 240Hz that DOESN'T appear to simulate motion blur without software motion blurring?
  • I've seen motion blur being used in an arcade. It is a tennis game and I believe it was Atari's.

    The motion blur effect is breathtaking, even at about 30 fps. I doubt it is produced by a Voodoo though.
  • Why is this so special? Just take Windows 98, open disk defrag, format a floppy disk, and surf the web...look at that mouse pointer and menus blur!!! And you don't even need mouse trails for it.
  • I have an old Voodoo2 add-on card, and when I play UT, I see motion blur........... and when I type, I see motion blur.... and when I pay for new Linux boxes, I see motion blur..... and when people run away from my discussions about Beowulf clusters.....
  • This story may have been posted by the owner of the actual website in order to generate ad revenue, but I still checked it out anyway. My opinion? The shots aren't anything special. Nothing that we haven't already seen, and it still looks like it would suck because the card was only doing 8-10 fps with the blur extension enabled. It's worthless to me to sacrifice all that FPS for the sake of having a motion blur effect... woohoo. If it moves, shoot it. I get 90 fps in quake3 with my g400 (if I play in 640x480), and that's fine with me.

    /me runs like heck and sets up a firewall :)

    - webfreak

    webfreak@themes.org
    http://e.themes.org developer
  • ...of why 3dfx did not survive. Look at this. Is it true innovation? Does it serve to improve gameplay without sacrificing other important performance aspects?

    The "motion blur" in these scenes is not actually motion blur... it's the same thing as pixel memory (remember those old phosphor screens and my favorite xscreensaver). It buffers the previous frame then additively dissolves it over the top of the new frame. (You can't even call it blurring.) It's poor, and if you'll notice, is it really worth losing 40-50 frames/sec?

    It's technology hype. Sure, it's kind of neato, but it's used where it doesn't apply (notice the walls get blurred? wtf?) and it only serves to muddy up the images. You wouldn't even *need* a Voodoo chipset to do this if you were willing to sacrifice enough video memory. Think about it. Gimp does this same effect real time too. :-)

    NVIDIA's approach, however, is a bit more in the right direction. It uses a real blur, and maintains frame rate. I highly suggest that you check out http://www.nvidia.com/Marketing/Developer/DevRel.nsf/pages/64A26BA3A82A992188256993007AC623 [nvidia.com] if you want the real deal.

  • I imagine it's because not only is it rendering the current frame, but also the six or seven frames of motion-blur objects!

    Baz
  • Man, I wish I could get "screenshots" of the motion blurs I get on acid.
  • I totally agree. Slashdot has gone way down hill since VA Linux took over.

    More promotion than "news"

    Fight the power at SlashDuh [slashduh.org]
    --

  • Maybe more people should read . . .

    To quote the page providing the binaries:
    "What we have here is a demonstration of 3dfx Interactive's "T-Buffer" being used to effect "Motion Blur" in the game Quake3Arena (or rather an earlier Q3Test version). It requires the use of 4sample Rotated-Grid Full-Scene Anti-Aliasing."

    Yes, that's right, it says that it REQUIRES the use of 4xFSAA.

    Journey
  • Put your monitor on top of the clothes dryer while it's on spin cycle.
  • You are fucking shitting me, right? Look at the screenshots! That's not natural blur. That's freaking mouse trails.

    For us to see blur, something has to be moving pretty damn fast - ie. much faster than what happens in 99% of games. Quake III is pretty slow-paced.

    I'm not an NVIDIA bigot. I just can't stand 3Dfx.

  • How is this exclusive? 3DFX motion blur pictures have been out since before the V5 line of cards was released!!!!
    I only think they can call it an exclusive because they were the only ones to run it at 1280x980 with 4X FSAA.
    Way to go, look at the fps here [tweaktown.com], 5fps? niiiice!
  • Technically we AREN'T seeing this new effect, since it can only be produced on a high-end card we don't own.

    Thus if we head over to that site with our current cards, we will see some footage that will not impress us and thus not encourage us to buy the newer card. I love it!

  • I just got my Voodoo 5 5500 and I love it. I have a PII 450 w/256M RAM and I get 30-40 fps with motion blur on. What I don't understand is why people wouldn't want T-Buffer features. High framerates are cool and all, but anything above 60 FPS is pointless - I'd rather have better looking / more realistic graphics. Does anyone know if any new games will be supporting the T-Buffer now that 3Dfx is gone?
  • And this is good...why?

    Oh! I know! I can make my own "Agent dodging bullets" scene in Quake!

    at 9fps. Woo.
  • Some kid takes shots of his monitor, calls it exclusive, and posts it on /. I've seen shots from the Q3A demo with T-buffer effects over a year ago on Sharky Extreme and various other gaming sites; the demo to actually play with the effect has been out for ages too. Just someone looking for attention, I guess.
  • by CBoy ( 129544 ) on Friday December 22, 2000 @05:40AM (#543675) Homepage
    I prefer the NATURAL motion blur induced by a few good beers. :-)
  • by Snowfox ( 34467 ) <`snowfox' `at' `snowfox.net'> on Friday December 22, 2000 @06:04AM (#543676) Homepage
    So this <fingerquote> motion blur technology </fingerquote> involves a kind of smearing technique? Kind of like you'd get from taking photos of monitors [tweaktown.com]?
  • I don't think Quake is the best application to demonstrate "motion blur" with, because it's really not appropriate. How often do you walk down the street and see the people moving around you as "blurred" - I mean, maybe if you're on drugs or something.

    Yeah, this looks cool and there are probably some good applications for it, but is it necessary?

    The advances in computer graphics are, in my view, meant to bring the computer world closer and closer to the ability to accurately display what one sees in the real world, while allowing you the freedom to create false realities. So, having motion blur does not seem to me to be that big of an addition, since it is rarely seen in the real world.

    You really only see true motion blur when items are in your peripheral vision or are moving *extremely* fast. A guy jogging in front of you is not going to produce a blur. The bullet aimed at your head might, though.

    So, I guess it's great that there are now better tools to produce this effect, but it's up to the developers to use it effectively and not to overdo it or use it in an inappropriate place.

    --

  • Thanks, I feel more enlightened =:-)

    Happy [late] solstice and any other assorted holidays =:-)
  • Look at the settings:

    1280x980, 4X FSAA

    Nobody runs at this res. Sure, it looks pretty there, but it also looks pretty damn nice at 800x600 4X FSAA, which is a much more realistic resolution, and that doesn't run at 9 fps.

    So try it out before you start bitching about how unrealistic the framerate is.

  • Past a certain point, you're not going to notice a difference in framerate. The threshold depends on the viewer - I usually hear quotes of 60fps, although I prefer around 90 to get a smooth game. That threshold isn't hard to hit with current hardware.

    I agree with the second poster - the 3D card should emulate the environment. Wouldn't it be stupid if the card tried to account for the fact that your peripheral vision is blurrier than your central vision?

  • by Anonymous Coward
    This feature has been around for ages, just under a different name. It was originally called dual-scan lcd, and people would pay to get rid of it.
  • "real sense of" as opposed to "sense of". There is a difference, one feels like speed and motion and one *is* speed and motion as far as your brain is concerened. The point was, to really fool the brain into thinking it is moving an integral part is motion blur. Is that better?

    ps dont claim someone says something and then simplify what they said to fit into the context of your flame. its poor form.
  • You know, I saw VIDEOS (not pics) of this being used on a variety of games including Q3 MONTHS ago. So I guess my question is: why is Slashdot the last place to get news?

  • recap: 9-12 fps screenshots of a special effects hack on a poorly engineered product from a now defunct company.

    sign me up for more!

    :::
  • frame rate is probably the most important thing to me in gaming. That's why I bought a 3d Prophet2 GeForce2 GTS 64mb video card over the Voodoo5 5500. I gave 3dfx a chance with the Voodoo3 2000 PCI. Worst thing I ever did.

    By the way, sfindley, 3dfx doesn't make absolutely shitty stuff - sure, some sucks - but you know why they were getting 9 fps? BECAUSE FSAA WAS CRANKED TO 4X AND THEY WERE RUNNING 1280x980! FSAA at 4x should be running at 640x480. With that they would probably be getting 30-40fps with the motion blur. I can't be certain of that, but it's a good guess considering the Voodoo5 is king of FSAA (as far as I remember from articles in the past).

    I wonder what will happen to the next batch of GeForce2 cards (maybe GeForce3 next) now that Nvidia has 3dfx. Hopefully superior FSAA and motion blur. We can only speculate right now.

    ----------
  • by seanmeister ( 156224 ) on Friday December 22, 2000 @05:40AM (#543695)
    I've had these blur effects ever since the original Doom was released!

    Oh wait, those are fingerprints on my monitor...

    Sean
