ATI Radeon 9700 Dissected
Bender writes "The guys who laid out the future of real-time graphics a while back have now dissected ATI's Radeon 9700 chip. Their analysis breaks down performance into multiple components--fill rate, occlusion detection, pixel shaders, vertex shaders, antialiasing--and tests each one synthetically before moving on to the usual application tests like SPECviewperf and UT 2003. You can see exactly how this chip advances the state of the art in graphics, piece by piece. Interesting stuff."
Well ATI is on top... for now (Score:3, Insightful)
ATI isn't even on the chart. (Score:5, Interesting)
ATI releases some specs, and that's all. They either don't bother writing drivers for their cards and just hope someone else will (maybe the Weather Channel, maybe soon, maybe later, maybe not for your specific card), or they release binary-only drivers (great, at least they exist) that don't have anything like the performance of their Windows drivers. The UT2003 benchmark, if run under Linux, won't even start on a Radeon 8500 (which ATI does have fast, binary-only drivers for) because it's missing correct support for S3 texture compression, which isn't exactly a new technology by any means.
So I can get Open Source 2D support for a Radeon 9700? Great. I'm sure 2D support is why people buy a Radeon 9700.
Vote with your dollars.
Nvidia only releases binary drivers (Score:3, Informative)
What do you think the 1MB 'Module-nvkernel' file in their NVIDIA_kernel-1.0-nnnn.tar.gz [nvidia.com] is?
NVIDIA_kernel-1.0-2960> file Module-nvkernel
Module-nvkernel: ELF 32-bit LSB relocatable, Intel 80386, version 1 (SYSV), not stripped
You didn't seriously think the few snippets of C code in that package were the complete driver, did you? That's just a kernel wrapper for their binary blob.
Re:Nvidia only releases binary drivers (Score:2)
No shit! At least they have drivers! (Score:2)
I'm not. Read what I wrote. I'm slamming ATI for not releasing any drivers for their current generation cards, releasing poor quality binary only drivers for their older cards, and expecting the community to write drivers for the rest.
Because ATI's binary drivers suck? (Score:3, Informative)
a) Open-source drivers - No S3TC support, UT2K3 won't even run
b) Binary-only drivers sorely lacking in performance. (I don't even recall seeing any Linux binary drivers from ATi - Does he mean the XiG drivers you have to *pay extra for*?)
With Nvidia, your only choice for 3D is unfortunately the binary drivers. While I'd rather not have it be that way, NV's drivers are maintained from the *same* source base that ATi's are, and hence are kept as up-to-date as the Windows drivers. In fact, the Linux drivers often *outperform* NV's Windows drivers by 1-2 FPS. (Not a big difference, but the fact is that they are not only "as good", but they are FASTER.)
So overall, given that binary drivers are the ONLY real option for both cards, NVidia is the way to go because their binary drivers are *far* superior to ATi's.
Re:Because ATI's binary drivers suck? (Score:2)
Huh?? Perhaps you mean to say NV's Linux drivers are from the same source base as their Windows drivers. Yes their unified driver model is a very Good Thing.
Re:ATI isn't even on the chart. (Score:3, Informative)
The DRI people have some problems with supporting s3tc with the Radeon 7200/8500, but these are not technical problems, and they don't have anything to do with ATI - s3tc is covered by patents. mczak
Gentoo and ATI - IMPORTANT! (Score:2, Interesting)
Fun facts about NVIDIA's drivers... (Score:4, Informative)
As for the complaint that NVIDIA is no better than ATI because of a binary driver release: that is not NVIDIA's fault. NVIDIA tries to make as much of their driver open source as possible (which is kind of a necessity because of the plethora of kernel configurations out there). However, the closed-source portions are kept closed because of SGI's patents on OpenGL. Assign blame where blame is due, please.
Patents (Score:1, Interesting)
I believe SGI sold most (all?) of their OpenGL patents to Microsoft some time ago...
Re:Fun facts about NVIDIA's drivers... (Score:2)
When I buy a piece of hardware, I have every reasonable expectation to get full register level documentation on how to interface with the hardware. I don't care if the chipset itself is a black box. Disclosing how to talk to the hardware does not give up any trade secrets--it's just giving the customer what they paid for.
Nvidia is full of crap. Of course, it'd be nice if ATI would actually help/fund the DRI people too so we don't have to wait a year or more for 3D support on each new Radeon card.
ATI support has NOT gotten any better..... (Score:2)
Re:ATI support has NOT gotten any better..... (Score:2)
The driver disc I was sent with the card was bad, and I COULD NOT download drivers (not available), and it took 2 weeks for them to stop sending the canned "go to ati.drivers.com and download" WHAT WASN'T THERE. I've had ATI cards before, and before you say they work FINE, why don't you go READ SOME of the reports. The list of defects is more than TWO PAGES LONG. Again, it is a new card, but like I said before, IF YOU AIN'T GOT ONE YET, HOLD OFF A WHILE and let them iron out the bugs, or you can believe the AC who seems to know everything...
BTW thanks Marco for putting up your ATI catalyst disk on FTP so I could actually USE MY CARD.
Re:Well ATI is on top... for now (Score:2, Insightful)
Re:Well ATI is on top... for now (Score:1)
ATI Still the Best (Score:1)
Re:ATI Still the Best (Score:1)
Re:ATI Still the Best (Score:2, Insightful)
Re:ATI Still the Best (Score:2)
Re:ATI Still the Best (Score:1)
usual suspects (Score:1)
The UT2003 demo has been out for a grand total of 3 days and it's already a "usual" test?
Re:usual suspects (Score:1, Informative)
Epic released it so people would at least have a rough idea of how one card would compare to another (I believe only relatively speaking, not in terms of absolute fps, as the test was preliminary).
Re:usual suspects (Score:2)
Re:usual suspects (Score:1)
Re:usual suspects (Score:2, Informative)
But you're probably looking for:
Horrendously offtopic, I know
How about Xfree86 ? (Score:4, Insightful)
Re:How about Xfree86 ? (Score:4, Insightful)
If you can't be bothered with the clicks, look here [ati.com]
Re:How about Xfree86 ? (Score:3, Informative)
Re:How about Xfree86 ? (Score:2)
Rumor? Huh? Hate to break it to you, but the Weather Channel has confirmed that they're paying Tungsten Graphics to develop the drivers, and Tungsten Graphics has already begun work.
Dinivin
Re:How about Xfree86 ? (Score:2)
from Tungsten Graphics' web site....
UPDATE (June 7, 2002): The Weather Channel® is funding Tungsten Graphics Inc. to develop an open source 3D DRI driver for the ATI Radeon 8500 graphics card. It is anticipated that the driver will be released to the XFree86 Project by year-end 2002, and will be distributed to the public in future versions of the XFree86 X Server.
yup...looks unfounded to me.
110 million transistors of joy (Score:2, Interesting)
My power supply struggles with the Radeon 8500. I am going to have to upgrade before I get one of these babies. Running dual LCDs, the Radeons are the only real option.
I have to hand it to ATI; they have absolutely walloped the rest of the market by getting this baby out before Christmas.
Re:110 million transistors of joy (Score:2, Insightful)
Re:110 million transistors of joy-GHZ (Score:1)
drivers, drivers, drivers (Score:1)
Meet the new ATI, same as the old ATI.
Re:drivers, drivers, drivers (Score:2)
And if you look at the forums for nVidia cards, you'll see the same basic posts... Please remember that people post on forums and newsgroups like that in order to complain, not to praise.
Dinivin
Enthusiasm for procedural shaders (Score:5, Interesting)
It's partly a working style issue. Texture-map people go out with cameras and photograph nice-looking surfaces, which they then apply to models. Or they paint the textures. Procedural shader people try to code up the "meaning" of a texture. Texture maps are created by artists; procedural shaders are created by programmers.
The basic problem with texture maps, of course, is that you can't get too close before the texture gets blurry and the illusion breaks down. In film work, you know where the camera is, so you can figure out how much texture detail you need. Games don't have that luxury; you can get close to a surface and blow the illusion.
Most film work other than Pixar's has used texture maps. There are exceptions, but they're typically for hair, fur, and water, where the problem is motion.
The price you pay for using procedural shaders is that they usually model surface, not detail. So you have to model the detail. Lots of it. Again, Pixar is notorious for this. ("We modelled the threads on the screws, even though you couldn't see them!")
Texture maps, bump maps, and displacement maps can be used to modulate procedural shaders, and that's probably how surface detail will be done, rather than getting carried away with building complex textures in some programming language.
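To make the distinction concrete, here's what "coding up the meaning of a texture" looks like in the small: a toy procedural wood-grain shader written as plain C. This is only a sketch of the idea (real production shaders are written in something like RenderMan's shading language, and the constants here are made up), but it shows why a procedural texture never goes blurry: it can be evaluated at any (u, v) at any scale.

#include <math.h>
#include <stdio.h>

/* A toy procedural "shader": instead of looking up a painted texture,
 * it computes a wood-grain-ish color from the surface coordinates.
 * Illustrative only, not production shader code. */
static void wood_shader(double u, double v, double rgb[3])
{
    /* concentric rings around the origin, perturbed by a cheap wobble */
    double r = sqrt(u * u + v * v);
    double rings = sin(r * 40.0 + 3.0 * sin(u * 7.0) * cos(v * 5.0));
    double t = 0.5 + 0.5 * rings;            /* map [-1,1] to [0,1] */

    rgb[0] = 0.45 + 0.35 * t;                /* blend light and dark wood */
    rgb[1] = 0.25 + 0.20 * t;
    rgb[2] = 0.10 + 0.08 * t;
}

int main(void)
{
    double rgb[3];
    /* Evaluate at any (u, v): no resolution limit, unlike a texture map. */
    wood_shader(0.3, 0.7, rgb);
    printf("color = (%.2f, %.2f, %.2f)\n", rgb[0], rgb[1], rgb[2]);
    return 0;
}

The cost, as above, is that everything interesting about the surface has to be expressed in code rather than painted or photographed.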
Re:Enthusiasm for procedural shaders (Score:4, Funny)
Again, Pixar is notorious for this. ("We modelled the threads on the screws, even though you couldn't see them!")
How else would the objects stay together? Magic? Sheesh.
Re:Enthusiasm for procedural shaders (Score:2, Interesting)
Procedural shaders are as much 'art' as texture maps.
Speaking with an 'artsy' analogy, using texture maps is akin to doing a rough sculpture and painting patterns on it to make it look more real, while using procedural shaders is like doing a very detailed sculpture. How can you say that modelling the threads on the screws is 'less art'?
This I agree with, though. In the short term, bump & displacement maps are a fast way to beauty.
But maybe it'll become easier for artists to use procedural shaders in the future, and there might be more ready-made objects available.
Re:Enthusiasm for procedural shaders (Score:1)
This problem was addressed long ago. The solution was mip-mapping. Mip-mapping is a technique where textures are swapped in and out of a scene, depending on how far you are from an object. Most games have at least 2 and sometimes 3-4 different resolutions for each texture so the texture is never "blurry" or out of focus.
That's not what MIP mapping was for: (Score:2, Informative)
Perhaps I'm being pedantic here, but that's not what MIP mapping was for. Lance Williams invented it as an inexpensive means of texture antialiasing (see "Pyramidal Parametrics", Computer Graphics (SIGGRAPH) Vol 17, No 3, July 1983 (reprinted in Seminal Graphics)). Once the highest resolution texture map is defined, a "pyramid" of smaller, down-filtered maps is created from that original source.
You cannot obtain more detail than that which is defined in that top level map.
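If it helps to picture the pyramid, here's a minimal sketch of how one level is down-filtered from the one above it, assuming a square, even-sized greyscale image and a plain 2x2 box filter (real down-filtering usually uses a better kernel):

#include <stdlib.h>

/* Build one MIP level from the one above it by averaging 2x2 blocks.
 * 'src' is size x size greyscale; returns a (size/2) x (size/2) image,
 * or NULL on allocation failure. Sketch only. */
static unsigned char *downsample(const unsigned char *src, int size)
{
    int half = size / 2;
    unsigned char *dst = malloc((size_t)half * half);
    if (!dst)
        return NULL;

    for (int y = 0; y < half; y++) {
        for (int x = 0; x < half; x++) {
            int sum = src[(2 * y)     * size + 2 * x]
                    + src[(2 * y)     * size + 2 * x + 1]
                    + src[(2 * y + 1) * size + 2 * x]
                    + src[(2 * y + 1) * size + 2 * x + 1];
            dst[y * half + x] = (unsigned char)(sum / 4);
        }
    }
    return dst;
}

Every level is derived purely from the level above it, which is why no amount of level-swapping can add detail that wasn't in the top map to begin with.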
Oh, incidentally, it is not a great idea to go swapping the MIP map levels in and out of (texture) memory, because on true hardware the levels that the texels are read from are chosen by the hardware on a pixel-by-pixel basis. You could easily end up with texture aliasing if the hardware is forced to read inappropriate texture levels. (The P(o)S2, of course, has b'all texture memory, and so developers often don't have a choice.)
What you are probably looking for are solutions either based on virtual texturing (i.e. specific HW support for swapping texture data) or use of detail textures.
Simon
Re:That's not what MIP mapping was for: -update (Score:1)
Re:That's not what MIP mapping was for: (Score:1)
There is also that little problem of texture memory. As was so eloquently pointed out, you don't really want to swap MIP map levels in and out of texture memory. Using highly detailed base textures eats up that memory fast.
As we continue to purchase PCs with more and more main memory (did you ever think you'd get a gig?), I am surprised at how long it has taken graphics card manufacturers (including SGI) to take the step beyond the precious 64MB that has been standard for the past few years.
-Jeremy
Re:Enthusiasm for procedural shaders (Score:5, Informative)
The combination of pixel and vertex shaders allows stunning effects like a flag that flaps in the wind and still casts the right shadows, and it's all done on the card (an example I stole from an NVidia presentation).
It's no cure-all, but it is another large step forward.
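For the curious, the vertex-shader half of that flag demo is essentially a travelling sine wave evaluated per vertex on the card; here's a rough CPU-side sketch of the same math (the names and constants are made up for illustration - the pixel-shader half then lights each pixel using the perturbed surface):

#include <math.h>

typedef struct { float x, y, z; } vec3;

/* Roughly what a flag-waving vertex shader computes for each vertex:
 * displace the mesh along z by a travelling sine wave, scaled so the
 * edge attached to the pole (x == 0) stays put. Illustrative only. */
static vec3 wave_vertex(vec3 v, float time)
{
    float amplitude = 0.15f;               /* made-up constants */
    float frequency = 4.0f;
    float speed     = 2.0f;

    float phase = frequency * (v.x + 0.3f * v.y) - speed * time;
    v.z += amplitude * v.x * sinf(phase);  /* v.x scaling keeps the pole edge fixed */
    return v;
}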
Re:Enthusiasm for procedural shaders (Score:2)
Re:Enthusiasm for procedural shaders (Score:1)
Re:Enthusiasm for procedural shaders (Score:1)
Procedural shaders are equally for surface detail (displacement) as they are for surface appearance. I certainly write shaders for screw threads, fuzzy fur, corrugations etc.
Pure proceduralism is as rarely used as pure texture mapping. Almost everyone uses a good combination of the two, using maps to modulate the proceduralism. Procedural shaders generate any level of detail you need, but are hard to control - textures are the reverse, so using both together works great.
More ATI/UT2K3 info at linuxgames.com (Score:3, Informative)
(I only skimmed through part of it, but it looks like if you have an ATI card, you may not have much luck with UT2K3.)
-B
Re:More ATI/UT2K3 info at linuxgames.com (Score:2)
Re:More ATI/UT2K3 info at linuxgames.com (Score:1)
So, not sure what the discussion is about over on the boards there, but unless folks who have the setup are posting, then it is a lot of uninformed discussion. I have no problem and love it! It's also great in the Asheron's Call 2 beta, Morrowind, and other games. Never had any driver problems yet; I just keep up with the latest drivers as they come out.
jay
Re:More ATI/UT2K3 info at linuxgames.com (Score:2)
...
So, not sure what the discussion is about over on the boards there, but unless folks who have the setup are posting, then it is a lot of uninformed discussion
If you mean that you have Windows XP, then the discussion there doesn't really apply to your setup. Everything on that site is about gaming in Linux, and they are talking about the issues with regards to the ATI drivers in Linux and the Linux version of the demo.
Sorry if I wasn't clear enough.
-B
Waiting for the All In Wonder Version (Score:1)
Just imagine an AIW powered by the RADEON 9700, with dual screen output, perfect (preferably hardware deinterlaced) TV picture, FireWire connectors and all the stuff that would make us happy geeks.
Seems the AIW product is the best at getting video cards (GPUs, your preferred abbreviation here...) to the next level....
Linux drivers, anyone?
I may be asking too much here... (Score:3, Interesting)
Now, if you can provide that, let me throw in a twist that makes me think nobody has done it - I've run 100% Linux for several years. Is there a site that reviews video cards plus all the extras (like TV-in and out) with an eye toward their Linux-compatible features? I have a G400 and an ATI All in Wonder Pro and can do TV-in (but not record video) and TV-out (although losing a monitor and having to swap cables makes that a PITA).
For that matter, I'd like to do video editing at some point in the future (when I get a digital camcorder). I'd like to convert all my VHS tapes to a digital format. Anybody know of a good import card at a reasonable price (under that $5k prosumer/low end professional bracket)? If it doesn't pull the absolute *best* quality possible from the VHS format, I'm content to wait rather than reencode a couple years from now.
I kick this question out to Slashdot every year or so. To those with experience: what's the latest?
--
Evan (no reference)
Re:I may be asking too much here... (Score:2)
I've mastered video on it that has been broadcast on MTV and ITV.
(Mine's the dps PVR system - it uses a dedicated SCSI UW drive and records in M-JPEG; it's a Windows-only project. They say that other drivers will never be available because of some licensing issues with Adaptec.)
Re:I may be asking too much here... (Score:2)
Using Avid DV Xpress now, which is simply incredible. And of course, is available on both PC and Mac formats.
Re:I may be asking too much here... (capture? Use (Score:3, Informative)
For that matter, I'd like to do video editing at some point in the future (when I get a digital camcorder).
Video capture on Linux... from a "freebie" capture port on your video card??
Forget it man.
Video capture requires drivers AND applications. You buy a video card for Linux, and IF the manufacturer supports Linux, video drivers are all you get. ATI has drivers for Linux... but not even the 3D part. See what I mean?
The only way to get Linux capture drivers is to buy a dedicated capture card for Linux. That way you get what you paid for, with no "missing features" on the Linux side.
Besides, the way things improve and drop in price, you never want to buy this hardware BEFORE you are ready to begin using it.
Me? I have a MSI GeForce4 4400 (oc'd of course). Capture only works on Windows, but in a few years I expect Linux capture support to become a competitive feature... just like primitive driver support has become now.
I've used broadcast capture equipment, and while this capture port could be called a "toy", the MSI Video-In/Out port handles uncompressed 720x480 fine (if your drive can't handle uncompressed YUV, I suggest HuffYUV, which is lossless compression).
Whatever you use, "realtime" MPEG compression sucks. It looks OK if you consider how hard your PC is working to do the job in software, but there's just no substitute for variable-bitrate multipass compression. CBR video creates fixed-size files that are compromised everywhere... multipass VBR allows you to lower the "average" bitrate by 25% AND gives better quality (presuming you lower the bitrate floor and ceiling and have a good encoder).
I've transferred 8 hours of VHS to DVD so far. Did someone say Star Wars? I didn't. ;-)
With VHS, you shouldn't have to capture at 720x480 because of the resolution limitations of the VHS tape... you can get away with 360x480 (not a typo!) and then double the horizontal lines... a good capture card does this in hardware.
IF there's a way to use 360x480 on DVD and specify the aspect ratio as 8:3 (did I do that right?), you'd save a LOT of DVD space but I have not tested this. Until I figure that problem out, there's no advantage to capturing at this res... but it's worth mentioning if your hardware cannot keep up (you would have to stretch the video afterwards).
In short, dual boot... or fork out real cash for professional capture under Linux. You have a limited selection under Linux and will pay more until the market becomes more viable.
You'd also need to MASTER your DVD's under Windows. No authoring sw for Linux anywhere (AFAIK). Once you HAVE mastered your DVD, you CAN burn it under Linux using dvdrtools [fsf.org].
Re:I may be asking too much here... (capture? Use (Score:2)
No. The question was in two parts - the first about video cards, the second about dedicated analog capture solutions. In fact, the reason I phrased it like that was in case there is a nice hardware analog-to-FireWire solution that would be OS independent.
Hardly freebie... my price range was "something under $5k". The Linux specific capture cards that I have seen have not compared well to other capture solutions. Since my analog capture requirements are primarily VHS, that's a very low hurdle. Since my home theater system is HDTV, I'd rather have something now to encode my rare VHS tapes (stuff that will never be available on DVD - quite a bit is converted from messy 16mm stock. Cult movies, low budget horror flicks, etc.) so I can safely get rid of all the tapes (well, put them in deep storage, and just flip them every 24 months).
In other words, I'm looking to do for video what album collectors are doing to vinyl that will never be converted to CD (for that matter, I have a bunch of Tim Curry and Little Nell albums... ;) ).
You'd also need to MASTER your DVD's under Windows.
I'm looking to store them on DAT tapes on an HP drive, three duplicates, one local, one in another state, and one in storage. Collectors have a thing about threes (with physical stuff, it's one to play with, one to keep, and one to sell or trade). No need for DVDs in the immediate future - as I need the movies for festivals or whatnot, I'll restore the file, use them, and then delete.
--
Evan (no reference)
Re:I may be asking too much here... (capture? Use (Score:2)
Oh. Then you want a RCA-to-Firewire bridge. You can get them at CompUSA or online for $150 up. They are basically realtime capture devices using a constant bitrate. I have no idea what the quality is, but I would doubt it's as clean as software-based multipass variable bitrate stuff.
Like you said though... the current Linux offerings are sub-par. Now you can GROW that market yourself by buying an inferior product, but who the hell wants to do that?
It will take a few years for Linux multimedia to gain traction. The special effects houses are ALREADY on Linux, but you're talking niche stuff that we can't afford. If you can't wait a few years, get an iMac or PC capture board.
Re:I may be asking too much here... (capture? Use (Score:2)
I can, and since nobody this year can suggest anything decent, I'll probably wait another year and ask around again. :) As I said, I'm in no hurry to encode now and then reencode everything a few years later because there's something better. Since it's VHS, I have a low bar for quality required (but I want the maximum quality out of that poor source).
get an iMac or PC capture board
I've been thinking about getting a Mac for my next laptop. If a PC capture board that ran under Linux and delivered good quality existed, I'd do it. I may wind up just buying a new motherboard and putting together a Windows machine for this... but then, as I said, I'm not in a hurry, I prefer Linux, and making a dedicated system that I'll use for one (albeit long) project seems a bit of overkill - I'd rather be able to use the capture card for occasional casual use later without hauling out a different machine, having to keep that machine working right, etc.
That adds up to a standalone hardware solution (which you say is not as good as software-based stuff), or a dedicated capture card, which everybody seems to agree is split between the subpar and high-end professional levels.
So I'll wait.
--
Evan (no reference)
Re:I may be asking too much here... (capture? Use (Score:2)
The hurdles one has to go through to use an x86 box... it's just sad.
Re:I may be asking too much here... (capture? Use (Score:2)
Why bother with such a snide remark? You read the article; obviously I have a working setup and I am happy with it.
BTW -- this isn't 1995 anymore... x86 plug and play WORKS as well as on the Mac; sometimes BETTER. Don't believe me? Tell me how you get an external DVD-R recorder working on an iMac. The blinders that some Mac users wear... it's just sad.
BTW, I had a G3 up until 2 years ago. The Mac has a chance of becoming the "best of" both Linux and Windows, but they'll never get the new titles without expanding the user base.
YEARS AWAY, but it's more likely that Linux will become more usable AND get the needed apps... before Apple gets their prices in line. Or maybe neither will happen.
Re:I may be asking too much here... (Score:1)
Re:I may be asking too much here... (Score:2)
nVidia's newest Linux drivers [nvidia.com] claim to support up to 16 monitors. I'm not sure of the performance though.
Occlusion detection... (Score:2, Funny)
occlusion detection? Is that used for detecting the occult?
I can see how it might be useful for games like Quake, Doom etc, but I'm not so sure about GTA etc
Ok ok, it's probably just a typo
Re:Occlusion detection... (Score:2, Informative)
Occlusion Culling (Score:2)
Re:Occlusion detection... (Score:2)
occlusion detection? Is that used for detecting the occult?
I can see how it might be useful for games like Quake, Doom etc, but I'm not so sure about GTA etc
Actually, occlusion detection will have a massive impact on GTA3. Basically, occlusion detection determines whether the pixel/block of pixels currently being rendered will be obscured by some other pixel/block of pixels closer to the viewer. It is most effective when you have a lot of overlapping objects to render - a cityscape like GTA3 will benefit the most from this sort of processing. Consider that you can normally see at most about 15 or 20 buildings in GTA3, even with the maximum viewing distance set, although there are hundreds you could potentially see if you had x-ray vision. If the card can skip rendering any of the buildings you can't see, you get faster performance.
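To make that concrete, here's a toy version of the conservative per-tile test that hierarchical-Z style schemes rely on (a sketch of the idea only; the actual hardware logic isn't public):

/* Conservative per-tile occlusion test, hierarchical-Z style (toy sketch).
 * Smaller z = closer to the viewer.
 *
 * tile_max_z    : farthest depth already written anywhere in this tile
 * incoming_min_z: nearest depth of the block of pixels about to be drawn
 *
 * If even the nearest incoming pixel is behind the farthest pixel already
 * in the tile, nothing in the block can be visible, so skip it entirely. */
static int tile_is_occluded(float tile_max_z, float incoming_min_z)
{
    return incoming_min_z >= tile_max_z;
}

The win in a city scene is that whole buildings' worth of blocks fail this test once the nearer buildings have been drawn, so their pixels never get shaded at all.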
Cheers,
Toby Haynes
Re:Occlusion detection... (Score:1)
Basically, occlusion detection determines whether the pixel/block of pixels currently being rendered will be obscured by some other pixel/block of pixels closer to the viewer
I thought any decent graphics engine would do that already. If it doesn't, then yeah, that would make things a lot faster!
Ah, but I guess that if it can be done in the hardware then it'll be quicker.
Just what we need (Score:2, Funny)
The crossbar solution is very nice as a memory interface, though. It has 19 GB/sec of memory bandwidth. I would like to have that bandwidth in the main CPU, though. An Athlon/Pentium 4 would smoke those cards with such memory bandwidth.
FPS isn't just about your eyes (Score:2)
FPS is also related to responsiveness, and remember, it is peak FPS. Get 15 people filling a nice open zone with plasma and expect your FPS to drop.
The Quake champions I know could tell you the FPS without having it displayed from the responsiveness.
Re:FPS isn't just about your eyes (Score:2)
We're all different (Score:2)
That's you. We're all different in what we can perceive.
I can see 75 and I doubt I've got the most sensitive eyes.
Another aspect to consider is what happens to fps when you up the resolution or image complexity. Per-pixel shading, resolution, Anti-Aliasing, etc., will all combine to slow the cards down.
What's keeping me away from this card is ATI's notorious reputation when it comes to drivers. Why buy killer hardware if the software for it is dodgy? Add to that the fact that ATI isn't saying how they'll handle 8X AGP, and it doesn't make me comfortable that it's a good choice.
agp 8x (Score:1)
"Tweaktown has a news item (dated 9/12, 7:08 AM) which states that Epox Taiwan told them that the 8X AGP problem is being caused by the GPU, and a new stepping corrects the problem.
"
if it was just software problem i'd not care that much about it since there was theoretical possibility of convinient update..
Re:Just what we need (Score:1)
FPS over 60 is very important in a ... uh, FPS.
In reality, people only see about 30 frames a second. The difference between seeing something in reality and looking at it rendered on a monitor is, when you watch something move across your field of vision in real life it actually occupies the space between the two frames you saw. Your eyes are not digital devices scanning each pixel of something you see, so you see fast motion as a blur.
On a monitor OTOH, each frame is distinct and separate from the others, so if something moves very far it will just jump from one place to another and will confuse your built-in physics engine when you're trying to line the crosshairs up on [TITAN]SexualHarrasmentPanda's head.
Depending on the level of action in a game, you're going to want anywhere from 60-150 FPS in order to be able to predict trajectories and track targets.
The reason you don't notice this on television, even though it runs at slightly under 30 FPS, is that the cameras keep the shutter open for the entire length of a frame, so you still get the blur that your brain interprets as motion. Now if only they could figure out how to add that to a game, we would never have to worry about FPS again. Other than the fact that with today's hardware we'd get about .2 FPS to achieve an effect like that, of course.
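For what it's worth, the brute-force way to "add that to a game" is temporal supersampling: render several sub-frames spread across each frame's shutter interval and average them. A rough sketch (render_scene is a placeholder for whatever draws one instant in time; as noted above, doing this naively multiplies the rendering cost):

#include <stdlib.h>

#define SUBFRAMES 4

/* Placeholder: draws the scene as it looks at time t into 'buf'
 * (width*height greyscale, to keep the sketch tiny). Assumed to exist. */
void render_scene(float t, unsigned char *buf, int width, int height);

/* Fake a camera shutter: render SUBFRAMES sub-frames spread across the
 * shutter interval and average them. Costs SUBFRAMES times the fill rate,
 * which is the ".2 FPS" problem mentioned above. Returns 0 on success. */
int render_blurred_frame(float frame_time, float shutter,
                         unsigned char *out, int width, int height)
{
    int npix = width * height;
    unsigned int  *accum = calloc((size_t)npix, sizeof *accum);
    unsigned char *sub   = malloc((size_t)npix);
    if (!accum || !sub) { free(accum); free(sub); return -1; }

    for (int s = 0; s < SUBFRAMES; s++) {
        float t = frame_time + shutter * ((float)s / SUBFRAMES);
        render_scene(t, sub, width, height);
        for (int i = 0; i < npix; i++)
            accum[i] += sub[i];
    }
    for (int i = 0; i < npix; i++)
        out[i] = (unsigned char)(accum[i] / SUBFRAMES);

    free(accum);
    free(sub);
    return 0;
}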
Re:Just what we need (Score:2)
IANAFPSD, but it seems that all you'd have to do is render two frames, then do some sort of morph-type thing between them. Since the game engine knows where all the vertices are in the rendered images, it has all the info it needs to figure out which objects are going which way. It would definitely be a performance hit, but surely you could do it faster than 5 seconds/frame if it were implemented in the GPU itself.
One caveat though - if you're in a complex scene and your framerate drops really low, the motion blur should probably turn off - both to get the rate back up, and to prevent your screen from just being a mess of blurred players and rockets screaming toward your head.
I found its weakness! (Score:2, Insightful)
Not to dog on ATI or Nvidia, but 400 bucks for a video card is just too much. Sure it can pump out 200+ fps, but it kind of gets pointless after 30. What can the human eye sample at? 24 fps (guess) or so? Cards like this are made to stroke egos, and mine is big enough. I can only pray it doesn't fall into the wrong hands (sucky gamers that cry lag).
Re:I found its weakness! (Score:4, Informative)
Actually, it does matter with more FPS. Don't compare it to film, because even though they both use the term 'frame' they mean different things.
A 24 fps film means that each frame records 1/24th of a second. That means that if an object being filmed is moving fast enough, the frame will have motion blur. When strung together with the other frames, this gives the illusion of smooth movement. A 24 fps 3D engine, however, means that you have 24 static shots. There's no transition from point to point, unless you wind up rendering the in-between shots.
Or, put another way, a 5 fps film of a hand waving in front of the camera will produce five frames full of motion-blurred hand, which, when played, will look relatively smooth. A 5 fps render, however, will have five static shots of a hand sitting motionless in space, and when played, the hand will appear to 'teleport' from spot to spot to spot. It's as if you recorded that hand with a camera shooting five instantaneous frames per second (not several frames of 1/5th of a second exposure each) and then strung the negatives into a film reel, splicing in copies to make the whole thing last one second.
This is one of the reasons, I always thought, that 3dfx was trying to get their T-buffer out into the world, because then, yes, if you could LOCK the rendering at 30 FPS and throw in motion and acceleration blur, it would still look better than a card rendering the exact same thing at 300 FPS.
Re:I found its weakness! (Score:2)
No it wouldn't.
a) T-buffer's motionblur only did 4 subframe samples (V5 6k could do 8), which resulted in a very stepped-looking blur for even moderate motion, not a smooth blur at all. It'd look much the same as a card that rendered & displayed those 4 frames individually.
b) The card (and host CPU) still had to calculate, transform, upload, project & rasterise all those subframes, so the system had to be capable of 120 fps anyway. It's just as slow as a card that rendered & displayed those 4 frames individually.
c) A card that could render & display at 120 fps would show more detail than one that smeared the same thing down to 30 fps. It's much like anti-aliasing - great if you're limited by what you can display rather than what you can calculate, but not a substitute for real detail.
Motion blur isn't necessarily a good thing, it's just a way to convey movement information beyond the limitations of the display. The V5 was too limited in other ways (mostly speed) to take good advantage of this, but current cards could do a better job.
Since people's eyes/brains are generally limited to perceiving 60-90 fps anyway, there's a valid argument for using the ability to render at 300 fps to calculate subframes & create motionblur at 75 fps instead. Contrary to the ugly & overexaggerated blur that 3dfx liked to demo, this would actually help FPS players to track their target's movement at high speeds, by still conveying some of the extra information that a true 300 fps visual system could handle.
Re:I found its weakness! (Score:1)
Hooray for Adam Smith, and let the pricing war begin.
Mmmmmm.. dissection. (Score:1, Funny)
Now those Radeon 9700's know what it feels like to be a frog.
Ribbit.
Catch up nVidia .... (Score:2, Interesting)
It's too bad (Score:2)
So, despite this card's impressive numbers, expect 60 fps under XP and 2K running on FAT32 only.
Re:WTF?? (Score:2)
Re:WTF?? (Score:2)
May be a problem associated with large system cache but I've read something somewhere that had a much better explanation.
The other problem, straight from ATI:
"In certain cases, after installing the RADEON 9700 PRO 128MB in an AGP 8x capable motherboard, the system will not post, or boot up.
"Not all AGP 8x motherboards are affected. Some customer's have reported that lowering the AGP transfer speed in some instances, will allow the system to then post.
"ATI Engineering has been advised of this issue and is investigating. Any updates will appear on this page when they become available."
Experience with two monitors on one of these? (Score:2, Interesting)
Wake me up when we... (Score:1)
(Wait a minute, hasn't the PlayStation 3 PR team claimed they're debugging that at the moment?)
Most hardware in card not yet supported (Score:2)
Card can do 30 bit color. ATI has no drivers for Windows that can do this, however.
Card has floating point for color mapping. ATI has no drivers for Windows that can do this, however.
And so on. In short, there are many cool things in the hardware that do you no good right now, because they aren't supported in Windows.
<voice character="biff">McFly! Hello!</voice>
Were ATI to release the interfaces to these things to the XFree86 guys, they could have an environment in which all of this cool stuff was supported very quickly. And since you can get access to the XFree86 code easily, supporting things like 30 bit color depth becomes a great deal easier than doing so under Windows. Yes, you might have to modify (GTK|QT) to get full support, and you might have a few apps asking for a 24 bit visual because they don't support 30 bit, but imagine if you had (Gnome|KDE) running at 30 bit depth, running The Gimp.
Imagine running Q3, UT, or RTCW in 30 bit color with floating point shaders.
Imagine the pain on MS's collective faces when the boot logo of the demonstration machine is not broken glass but rather flightless waterfowl.
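As an aside, "30 bit color" just means 10 bits per channel instead of 8, typically packed with 2 leftover bits into the same 32-bit word; something along these lines (illustrative layout only, not ATI's actual framebuffer format):

#include <stdint.h>

/* Pack 10-bit R, G, B channels (0..1023) plus a 2-bit alpha into one
 * 32-bit word, the usual "2-10-10-10" style layout for 30-bit color.
 * The bit layout shown here is illustrative; real formats vary. */
static uint32_t pack_2_10_10_10(unsigned r, unsigned g, unsigned b, unsigned a)
{
    return ((uint32_t)(a & 0x3u)   << 30) |
           ((uint32_t)(b & 0x3FFu) << 20) |
           ((uint32_t)(g & 0x3FFu) << 10) |
            (uint32_t)(r & 0x3FFu);
}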
Re:Most hardware in card not yet supported (Score:1)
Wait for DirectX 9 [neoseeker.com] to come out if you want to see floating-point color. Even if the drivers do support it, you'll never see software using it until there's another HAL there. Remember selecting your video card in every game? No way developers are going to go back to that.
Oh, and as for Linux supporting FP color before Windows... don't bet on it. I'm not doubting the Open Source community's ability to implement it, but MS has the specs shipped to them in lead-lined boxes with motion-detecting turrets mounted 360 degrees.
Re:Most hardware in card not yet supported (Score:2)
Hence my point - that ATI is really only screwing themselves by continuing to allow that sort of favoritism.
And under Linux, all I have to do is tell the game that it is talking to my native libGL, and the differences are handled there. That's what libGL is - the ultimate HAL.
Ditto for 2D stuff: new stuff would be linked against a version of libX11 that knows how to access the extended color depths - and all that would do is pass the requests on to the X server.
In short - we actually have it BETTER than the Windows people in this regard, if the damn vendors would just throw us a bone!
Re:Most hardware in card not yet supported (Score:2)
Clearly, you're no programmer. Imagining these things is a great deal easier, and requires much less time, than actually doing them.
I don't suppose you've wondered why it is that open-source programmers have been missing all these fabulous opportunities to cause MS pain? It's because the relatively few people with sufficient talent, time and inclination to do this sort of stuff are for the most part being paid to work full-time for a closed-source company instead...
Re:Most hardware in card not yet supported (Score:2)
I have worked both with MS WinNT and with Linux. I have written low-level drivers for both. I have designed systems of great complexity [p25.com].
I can trivially turn your points around by pointing to Linux, to Mozilla, to Apache, to Sendmail, in fact to all of Sourceforge.
The single biggest thing holding back drivers under XFree is the fact that talented individuals such as myself cannot get the programming documents for boards like the ATI without signing an NDA - and to be given the opportunity to do so requires you to ALREADY be a "registered" XFree developer. Can you say Catch-22?
Analytically, not synthetically (Score:3, Funny)
If you're taking pieces apart and considering them separately, that's analysis.
If you're explaining this on Slashdot, that's anal-retentiveness.
Re:Analytically, not synthetically (Score:2)
It's only anal retentive if you hyphenate anal-retentiveness.
Want to use this for my "2D" desktop (Score:2)
-adnans
fill rate?? (Score:2)
it is listed at "2600" on the chart which is just a little over a few of them, and below another one.
either I'm retarded (*very* likely), or there is some sort of typo on there.
Re:fill rate?? (Score:1)
sorry to say, but you are retarded
2600 for the 9700; the highest competitor reaches 1200 (Mpixels/s). Do the math.
You looked at the peak fill rate (Mtexels/s). According to the article:
The 9700 Pro's texel fill rate is good, but it's not head and shoulders above the other cards.
So you see, everything is fine...
Re:fill rate?? (Score:2)
thanks for pointing it out. gotta run, the short bus calls.
Re:fill rate?? (Score:2)
Yep, you're retarded. :-) Nah, seriously, the pixel fill rate is 2600 (million pixels/sec) compared to 1100, 1200 and 880 for the three compared cards. So, yes, more than double that of the closest competition.
You're looking at the texel fillrate, which is also 2600 (million texels/sec) for the Radeon 9700, and is not blistering ahead of the competition (indeed, as you say, it's below one of them, the Matrox Parhelia).
But, as they say a few paragraphs down, it's better to get a high max texel fill rate from high clock speed and lots of pipes (as the Radeon 9700 does) rather than from lots of texture units per pipe (as the Parhelia does), because not every game is going to want to use 4 textures on every pixel. They all want to draw a helluvalot of pixels, though.
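If you want to sanity-check the chart yourself, the arithmetic is just core clock x pipelines for pixel fill, times texture units per pipe for texel fill. A quick back-of-the-envelope version (the clock and pipe counts below are the commonly quoted specs, so treat them as assumptions and check them against the article):

#include <stdio.h>

/* Back-of-the-envelope fill rates: pixel fill = clock * pipes,
 * texel fill = clock * pipes * texture units per pipe.
 * Clock/pipe figures are commonly quoted specs, not gospel. */
struct gpu { const char *name; double mhz; int pipes; int tex_per_pipe; };

int main(void)
{
    struct gpu cards[] = {
        { "Radeon 9700 Pro", 325.0, 8, 1 },
        { "Matrox Parhelia", 220.0, 4, 4 },
    };

    for (int i = 0; i < 2; i++) {
        double pixel = cards[i].mhz * cards[i].pipes;
        double texel = pixel * cards[i].tex_per_pipe;
        printf("%-16s %6.0f Mpixels/s  %6.0f Mtexels/s\n",
               cards[i].name, pixel, texel);
    }
    return 0;
}

That reproduces the 2600 Mpixels/s figure for the 9700 Pro, and shows how the Parhelia's 4 texture units per pipe push its texel number higher despite the lower clock and fewer pipes.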
Why didn't they mention... (Score:2)
Re:UT2003 scores??? (Score:2)