Graphics Software

ATI Radeon 9700 Dissected 192

Bender writes "The guys who laid out the future of real-time graphics a while back have now dissected ATI's Radeon 9700 chip. Their analysis breaks down performance into multiple components--fill rate, occlusion detection, pixel shaders, vertex shaders, antialiasing--and tests each one synthetically before moving on to the usual application tests like SPECviewperf and UT 2003. You can see exactly how this chip advances the state of the art in graphics, piece by piece. Interesting stuff."
  • by Gooner ( 28391 ) on Tuesday September 17, 2002 @01:47AM (#4271276)
    The big question for an ATI part is: how are those drivers? In addition, it looks like the move to .13 micron is slowing Nvidia down on the NV30, but that's a bump in the road ATI will hit too.
    • by Nailer ( 69468 ) on Tuesday September 17, 2002 @04:05AM (#4271496)
      Every NVIDIA card since the GeForce2 Ultra has had Linux drivers before they even hit the shelves. This is because Nvidia pay people to write and maintain the drivers. They might not have specs, but at least NVidia support your choice of operating system.

      ATI release some specs, and that's all. They either don't bother writing drivers for their cards at all and just hope someone else will (maybe The Weather Channel, maybe soon, maybe later, maybe not for your specific card), or they release binary-only drivers (great, at least they exist) that don't have anything like the performance of their Windows drivers. The UT2003 benchmark, if run under Linux, won't even start on a Radeon 8500 (which ATI do have fast, binary-only drivers for), because it's missing correct support for S3 texture compression. Which isn't exactly a new technology by any means.

      So I can get Open Source 2D support for a Radeon 9700? Great. I'm sure 2D support is why people buy a Radeon 9700.

      Vote with your dollars.

      • Why are you slamming ATI for releasing binary-only drivers, while hailing Nvidia? Nvidia does exactly the same thing.
        What do you think the 1MB 'Module-nvkernel' file in their NVIDIA_kernel-1.0-nnnn.tar.gz [nvidia.com] is?

        NVIDIA_kernel-1.0-2960> file Module-nvkernel

        Module-nvkernel: ELF 32-bit LSB relocatable, Intel 80386, version 1 (SYSV), not stripped

        You didn't seriously think the few snippets of C code in that package were the complete driver, did you? That's just a kernel wrapper for their binary blob.
        • The point was that NVidia release binary-only drivers that use the full potential of the chip, and achieve pretty much the same performance as the Windows drivers.
        • Why are you slamming ATI for releasing binary-only drivers

          I'm not. Read what I wrote. I'm slamming ATI for not releasing any drivers for their current-generation cards, releasing poor-quality binary-only drivers for their older cards, and expecting the community to write drivers for the rest.

        • I think the poster's point was that you have two choices for drivers with an ATi card:

          a) Open-source drivers - No S3TC support, UT2K3 won't even run
          b) Binary-only drivers sorely lacking in performance. (I don't even recall seeing any Linux binary drivers from ATi - Does he mean the XiG drivers you have to *pay extra for*?)

          With Nvidia, your only choice for 3D is unfortunately the binary drivers. While I'd rather not have it be that way, NV's drivers are maintained from the *same* source base that ATi's are, and hence are kept as up-to-date as the Windows drivers. In fact, the Linux drivers often *outperform* NV's Windows drivers by 1-2 FPS. (Not a big difference, but the fact is that they are not only "as good", but they are FASTER.)

          So overall, given that binary drivers are the ONLY real option for both cards, NVidia is the way to go because their binary drivers are *far* superior to ATi's.
          • "While I'd rather not have it be that way, NV's drivers are maintained from the *same* source base that ATi's are"

            Huh?? Perhaps you mean to say NV's Linux drivers are from the same source base as their Windows drivers. Yes, their unified driver model is a very Good Thing.
      • The binary driver for the R8500 is indeed missing texture compression. However, ATI have had a driver which supports s3tc for quite some time already, but it hasn't been released - don't ask me why. ATI have also already demonstrated a 3d xfree/linux driver for the R9700 (they ran a demo on it), but there's been no release so far (and I haven't heard anything about a release plan either).
        The DRI people have some problems with supporting s3tc on the Radeon 7200/8500, but these are not technical problems, and they don't have anything to do with ATI - s3tc is covered by patents. mczak
      • by Anonymous Coward
        Epic Games' much-anticipated Unreal Tournament 2003 Demo is now available on a self-booting Gentoo Linux-based LiveCD, allowing you to play the Unreal Tournament 2003 Demo using any modern PC with an NVIDIA GeForce 2 or greater graphics card and a CD-ROM drive. Full networking, OSS sound and Creative Soundblaster Live! and Audigy support included, allowing for the full gaming experience including LAN/Internet play, EAX environmental audio and of course 3D accelerated OpenGL graphics. The CD also serves as a fully-functional Gentoo Linux installation CD. Go grab it here! [gentoo.org]
      • by Lethyos ( 408045 ) on Tuesday September 17, 2002 @08:07AM (#4272132) Journal
        NVIDIA uses the same codebase for their Windows, Mac OS, and Linux drivers. This same codebase will also be used for their FreeBSD drivers to come [netexplorer.org]. Their unified driver architecture ensures that every platform the card runs on gets the latest version of the code and can take advantage of each card's features. So this is definitely a few notches above ATI, who won't even produce drivers for my platform, let alone release full specifications to the public so others can write them.

        As for the complaint that NVIDIA is no better than ATI because of a binary driver release: that is not NVIDIA's fault. NVIDIA tries to make as much of their driver open source as possible (which is kind of a necessity because of the plethora of kernel configurations out there). However, the closed-source portions are kept closed because of SGI's patents on OpenGL. Assign blame where blame is due, please.
        • Patents (Score:1, Interesting)

          by Anonymous Coward

          I believe SGI sold most (all?) of their OpenGL patents to Microsoft some time ago...
        • There's nothing stopping NVidia from releasing full specs for their cards so that the DRI people can make their own attempts at developing drivers. They may not be as good, but at least they'll be open and allow development for as long as Xfree86 exists. Or if NVidia was non-lame, they'd help develop the DRI drivers using the non-SGI-licensed Mesa libraries.

          When I buy a piece of hardware, I have every reasonable expectation to get full register level documentation on how to interface with the hardware. I don't care if the chipset itself is a black box. Disclosing how to talk to the hardware does not give up any trade secrets--it's just giving the customer what they paid for.

          Nvidia is full of crap. Of course, it'd be nice if ATI would actually help/fund the DRI people too so we don't have to wait a year or more for 3D support on each new Radeon card.
    • A card is only as good as its drivers, and ATI support BLOWS. I've had a Radeon 9700 since the day they came out. Two weeks after they hit the market you still could not get ANY HELP from ATI. The card is great, their support SUCKS, and DRIVERS make the card. While I love the performance, I got a 5-day response time to a simple driver question, which WAS NOT answered; instead I got a form letter telling me to go to a broken URL to download drivers that did not exist. After 2 PHONE calls I was still down, and it took requesting an RMA# to actually get anyone to read my email. ATI's big problem has always been their LOUSY drivers, and things have not changed. While the Radeon outperforms my Ti-4600, if drivers keep coming this slow and buggy I will be tempted to put the Nvidia card back in the main machine. NWN STILL HAS ISSUES with the Catalyst drivers, and it crashes on exit about 85% of the time, needing a hard boot. My advice: if you've not already bought one, HOLD OFF. Let ATI demonstrate their commitment to driver support before you invest that much money. BTW, the WDM drivers they shipped with the card were NOT Windows certified, even though they claimed they were. It seems the WDM drivers were left out of the bundle shipped to M$ and they just tacked them on after the fact. ATI support claims a certified driver is forthcoming... or so they've said for 3 weeks now.
  • I've had ATI's video cards in the computers I've built for myself, and I must say that they are great. I love the All-in-Wonder series in particular. It's nice to be able to watch cable television on one's computer. Looks like ATI has another winner on its hands, and that's good news for all of us.
    • I dunno if having a TV tuner built into your vid card is a great idea. If you're an early adopter of the latest and greatest then it makes more sense to have a standalone card or a USB one.
    • by Anonymous Coward
      I find using a separate TV card works quite well for the same purpose, and when I upgrade my video card for better performance, I don't have to worry about paying a premium for one with TV functionality built in - you're essentially paying for that TV functionality over and over again, whereas a cheap Hauppauge PCI TV card could be used through several generations of main graphics cards.

      • I had noticed that reviews of the Radeon 8500DV (with TV tuner) card pointed out that its bandwidth is a smidgen lower than the regular 8500's (without tuner). I think ATI said this was because they used slower memory (4 ns) on the 8500DV compared to the 8500 (3 ns?) to give it an affordable price point. So if you're right, a separate PCI tuner card might give good TV without hindering 3D performance for other things such as games.
    • I couldn't give a monkey's about the TV tuner crap, but ATi have consistently offered BY FAR the best DVD playing and RAMDAC performance (Matrox excepted on the RAMDAC) in their cards. For Mac users with BIG CRTs, ATi provides MUCH better image quality EVERY time. The MacOS nVidia drivers are buggy bastards, I'm afraid.
  • "usual application tests"

    UT2003 demo has been out for a grand total of 3 days and it's already a usual test?
    • Re:usual suspects (Score:1, Informative)

      by Anonymous Coward
      a UT tech test has been around for months - Anandtech has been using it for a while.

      Epic released it so people would at least have a rough idea how one card would compare to another (I believe only relatively speaking, not in terms of absolute fps as the test was preliminary)
    • "The UT 2003 demo is a late addition to this review. Epic released it right as we were finishing up, and we decided to include test results, because UT 2003 uses more advanced DX8 graphics features, more polygon detail, and larger textures than most current games."
    • Maybe I will be a bit off-topic here, but there's one thing that has always bugged me: how does one measure the frame rate, for example in Quake? I want to benchmark my notebook but couldn't find anywhere in the docs how to show the frame rate. Do I need a patch/plug-in, or is there a magic command line option?
  • by Fred_A ( 10934 ) <fred@freds[ ]e.org ['hom' in gap]> on Tuesday September 17, 2002 @01:56AM (#4271292) Homepage
    The real question for those of us who don't run Windows is how well it works in X. What is ATI's attitude towards open source? Are their specs public? Do they provide drivers? In short, is there a reason to switch from nVidia when I upgrade?
    • by StArSkY ( 128453 ) on Tuesday September 17, 2002 @02:09AM (#4271315) Homepage
      Go to the ATI web site. Just click on "Built by ATI" in the Drivers section, choose Linux/XFree86 and then Radeon 9700 Pro, and there yah have it. Their approach is not perfect, but at least they consider it, and they actively support the 2d side. As for 3d... doesn't look too promising...

      IF you can't be bothered with the clicks, look here [ati.com]

    • Tungsten Graphics [tungstengraphics.com] were recently contracted by The Weather Channel to write accelerated XFree86 drivers for the Radeon. You can get a beta from their site. Given that ATI make their specs available, and given the influx of cash, you'd expect the drivers to develop well.
  • What a quote on page 16. "110 million transistors of joy".

    My power supply struggles with the Radeon 8500. I am going to have to upgrade before I get one of these babies. Running dual LCDs, the Radeons are the only real option.

    I have to hand it to ATI, they have absolutely walloped the rest of the market by getting this baby out before Christmas.

  • All the hardware wizardry that ATI can come up with won't matter a damn unless they get their drivers straightened out. ATI has a long and sordid history of terrible, terrible driver support - crashes/lockups/glitches etc. etc. (I'm talking Windows here). Take a look at alt.comp.periphs.videocards.ati to see how badly many are faring with this new card, even though many reviewers claim that ATI has fixed their historical driver problems.

    Meet the new ATI, same as the old ATI.

    • And if you look at the forums for nVidia cards, you'll see the same basic posts... Please remember that people post on forums and newsgroups like that in order to complain, not to praise.

      Dinivin
  • by Animats ( 122034 ) on Tuesday September 17, 2002 @02:20AM (#4271332) Homepage
    There seems to be considerable enthusiasm for procedural shaders amongst graphics card designers. This is not necessarily shared by animators.

    It's partly a working style issue. Texture-map people go out with cameras and photograph nice-looking surfaces, which they then apply to models. Or they paint the textures. Procedural shader people try to code up the "meaning" of a texture. Texture maps are created by artists; procedural shaders are created by programmers.

    The basic problem with texture maps, of course, is that you can't get too close before the texture gets blurry and the illusion breaks down. In film work, you know where the camera is, so you can figure out how much texture detail you need. Games don't have that luxury; you can get close to a surface and blow the illusion.

    Most film work other than Pixar's has used texture maps. There are exceptions, but they're typically for hair, fur, and water, where the problem is motion.

    The price you pay for using procedural shaders is that they usually model surface, not detail. So you have to model the detail. Lots of it. Again, Pixar is notorious for this. ("We modelled the threads on the screws, even though you couldn't see them!")

    Texture maps, bump maps, and displacement maps can be used to modulate procedural shaders, and that's probably how surface detail will be done, rather than getting carried away with building complex textures in some programming language.
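
    To make the working-style difference concrete, here's a toy sketch in C (purely illustrative, nothing from the article; the names are made up). The procedural version is a function with crisp detail at any magnification; the texture version is a finite table of samples that eventually runs out of texels:

    #include <math.h>

    /* Procedural: a function evaluated at any surface coordinate u in [0,1].
       Detail exists at every scale the function can express. */
    float stripe_shader(float u, float frequency)
    {
        float s = sinf(u * frequency * 6.2831853f);  /* smooth bands */
        return s > 0.0f ? 1.0f : 0.3f;               /* sharpened to stripes */
    }

    /* Texture lookup: can only return what was painted or photographed
       into the map (nearest-neighbour sampling for brevity). */
    float texture_lookup(const float *texels, int w, int h, float u, float v)
    {
        int x = (int)(u * (float)(w - 1) + 0.5f);
        int y = (int)(v * (float)(h - 1) + 0.5f);
        return texels[y * w + x];
    }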

    • by cooldev ( 204270 ) on Tuesday September 17, 2002 @02:59AM (#4271398)

      Again, Pixar is notorious for this. ("We modelled the threads on the screws, even though you couldn't see them!")

      How else would the objects stay together? Magic? Sheesh.

    • Texture maps are created by artists; procedural shaders are created by programmers.


      Procedural shaders are as much 'art' as texture maps.

      Speaking with an 'artsy' analogy, using texture maps is akin to doing a rough sculpture and painting patterns on it to make it look more real, while using procedural shaders is like doing a very detailed sculpture. How can you say that modelling the threads on the screws is 'less art'?

      Texture maps, bump maps, and displacement maps can be used to modulate procedural shaders,


      This I agree with, though. In the short term, bump & displacement maps are a fast way to beauty.

      But maybe it'll become easier for artists to use procedural shaders in the future, and there might be more ready-made objects available.
    • "The basic problem with texture maps, of course, is that you can't get too close before the texture gets blurry and the illusion breaks down...Games don't have that luxury; you can get close to a surface and blow the illusion."

      This problem was addressed long ago. The solution was Mip-mapping. Mip-mapping is a technique used where textures are swapped in and out of a scene, depending on how far you are from an object. Most games have at least 2 and sometimes 3-4 different resolutions for each texture so the texture is never "blurry" or out of focus.
      • "The basic problem with texture maps, of course, is that you can't get too close before the texture gets blurry and the illusion breaks down...Games don't have that luxury; you can get close to a surface and blow the illusion."

        This problem was addressed long ago. The solution was Mip-mapping. Mip-mapping is a technique used where textures are swapped in and out of a scene, depending on how far you are from an object. Most games have at least 2 and sometimes 3-4 different resolutions for each texture so the texture is never "blurry" or out of focus.

        Perhaps I'm being pedantic here, but that's not what MIP mapping was for. Lance Williams invented it as an inexpensive means of texture antialiasing (see "Pyramidal Parametrics", Computer Graphics (SIGGRAPH) Vol 17, No 3, July 1983, reprinted in Seminal Graphics). Once the highest-resolution texture map is defined, a "pyramid" of smaller, down-filtered maps is created from that original source.

        You cannot obtain more detail than that which is defined in that top level map.
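
        To make the construction concrete, here is a minimal sketch in C (my own names, assuming 8-bit greyscale texels and power-of-two sizes): each MIP level is a 2x2 box-filtered copy of the level above it, so every level is derived from the artist's level 0 and can never add detail to it.

        #include <stdlib.h>

        /* Produce the next-smaller MIP level by averaging 2x2 blocks. */
        unsigned char *mip_downsample(const unsigned char *src, int w, int h)
        {
            int dw = w / 2, dh = h / 2;
            unsigned char *dst = malloc((size_t)(dw * dh));
            for (int y = 0; y < dh; y++)
                for (int x = 0; x < dw; x++) {
                    int sum = src[(2 * y) * w + (2 * x)]
                            + src[(2 * y) * w + (2 * x + 1)]
                            + src[(2 * y + 1) * w + (2 * x)]
                            + src[(2 * y + 1) * w + (2 * x + 1)];
                    dst[y * dw + x] = (unsigned char)(sum / 4);  /* box filter */
                }
            return dst;  /* caller frees; repeat down to 1x1 for the full pyramid */
        }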

        Oh, incidentally, it is not a great idea to go swapping the MIP map levels in and out of (texture) memory, because on real hardware the levels that the texels are read from are chosen by the hardware on a pixel-by-pixel basis. You could easily end up with texture aliasing if the hardware is forced to read inappropriate texture levels. (The P(o)S2, of course, has b'all texture memory, and so developers often don't have a choice).

        What you are probably looking for are solutions either based on virtual texturing (i.e. specific HW support for swapping texture data) or use of detail textures.

        Simon
        • One thing I should add. Pyramids of textures were used in texturing systems prior to "MIP" mapping, but it was Williams who introduced trilinear filtering. (The earlier systems used simpler filtering, e.g. Dungan(?) just used linear interpolation of samples).
        • There is also that little problem of texture memory. As was so eloquently pointed out, you don't really want to swap MIP map levels in and out of texture memory. Using highly detailed base textures eats up that memory fast.

          As we continue to purchase PCs with more and more main memory (did you ever think you'd get a Gig?), I am surprised at how long it has taken graphics card manufacturers (including SGI) to take the step beyond the precious 64Mb that has been standard for the past few years.

          -Jeremy

      • by Crag ( 18776 ) on Tuesday September 17, 2002 @04:28AM (#4271528)
        It's not just the pixelation or blurring that procedural shaders solve. Combined with other techniques such as bump and environment mapping, surfaces can be given depth without increasing the poly count. A texture can be made to look like water without transmitting wave information to the video card. Just send a function.

        The combination of pixel and vertex shaders allows stunning effects like a flag that flaps in the wind and still casts the right shadows, all done on the card (an example I stole from an NVidia presentation).

        It's no cure-all, but it is another large step forward.
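
        To sketch the "just send a function" idea in C (purely illustrative; the names are mine): the water or flag surface is a displacement function of position and time, so the only per-frame traffic to the card would be the time value, with a vertex shader doing this work on the GPU.

        #include <math.h>

        typedef struct { float x, y, z; } vec3;

        /* CPU-side stand-in for a wave vertex shader: height is a pure
           function of (x, y, time), so no wave geometry is ever uploaded. */
        void wave_displace(vec3 *verts, int n, float time,
                           float amplitude, float frequency)
        {
            for (int i = 0; i < n; i++)
                verts[i].z = amplitude
                           * sinf(frequency * verts[i].x + time)
                           * cosf(frequency * verts[i].y + time);
        }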
      • Mip-mapping is basically a form of data compression. You still have to have large areas of texture data at some high resolution for it to work.
    • Procedural shaders are not mostly about creating procedural maps. These shaders can be used to specify how the different maps (texture or procedural) are combined together, and how they interact with scene and object parameters (transparency, lights, fog, etc.). Complex BRDF lighting, toon shading, smooth shadows, slimy-shiny-bumpy reflective surfaces and all the cool effects you can think of (cloak, electricity, glow, holograms, light beams and lasers) - this is what procedural shaders are used for.
    • You've never written renderman shaders, have you?? I do so on a daily basis, and when I'm not reading Slashdot while waiting for a render to come back, that's exactly what I'm doing.

      Procedural shaders are equally for surface detail (displacement) as they are for surface appearance. I certainly write shaders for screw threads, fuzzy fur, corrugations etc.

      Pure proceduralism is as rarely used as pure texture mapping. Almost everyone uses a good combination of the two, using maps to modulate the proceduralism. Procedural shaders generate any level of detail you need, but are hard to control - textures are the reverse, so using both together works great.
  • by Wee ( 17189 ) on Tuesday September 17, 2002 @02:28AM (#4271348)
    There's a huge discussion [linuxgames.com] going on over at linuxgames.com about the new ATI cards, Nvidia stuff, and the new Unreal Tournament demo if anyone's curious. There's also a link to a great review of the demo from a Linux perspective.

    (I only skimmed through part of it, but it looks like if you have an ATI card, you may not have much luck with UT2K3.)

    -B

    • XiG [xig.com] has a free, fully functional demo of the X server which supports all the necessary features to run the UT2K3 demo. The X server runs for 30 minutes before stopping, but it can be run as many times as you like. In addition, Epic has said they're working with ATI to get S3TC supported in the binary drivers ATI has released.
    • I played the demo for over an hour last night with all options at their fullest at 1280x1024 with my ATI Radeon 9700 Pro on a P4 1.8GHz, 512MB RDRAM. It was absolutely stunning! Had a great time playing it! I have XP SP1, and the latest ATI drivers that support XP SP1.

      So, not sure what the discussion is about over on the boards there, but unless folks who have the setup are posting, then it is a lot of uninformed discussion. I have no problem and love it! Also is great in Asheron's Call 2 beta, Morrowind, and other games. Never had any driver problems yet, just keeping up with the latest drivers as they come out.

      jay
      • I have XP SP1, and the latest ATI drivers that support XP SP1.
        ...
        So, not sure what the discussion is about over on the boards there, but unless folks who have the setup are posting, then it is a lot of uninformed discussion

        If you mean that you have Windows XP, then the discussion there doesn't really apply to your setup. Everything on that site is about gaming in Linux, and they are talking about the issues with regards to the ATI drivers in Linux and the Linux version of the demo.

        Sorry if I wasn't clear enough.

        -B

  • I would surely hope that ATI comes up with the next generation of the AIW series powered by this chip. Let's hope, though, that they make a product that includes ALL the features of all their products (no AIW model has dual-screen output, some have FireWire ports while others don't, etc...)

    Just imagine an AIW powered by the RADEON 9700, with dual-screen output, a perfect (preferably hardware-deinterlaced) TV picture, FireWire connectors and all the stuff that would make us happy geeks.

    Seems the AIW product is the best in getting the Video Cards (GPUs, your-preferred-abbreviation-here...) to the next level....

    Linux drivers, anyone?
  • by JabberWokky ( 19442 ) <slashdot.com@timewarp.org> on Tuesday September 17, 2002 @02:48AM (#4271379) Homepage Journal
    Are there any 2d/extras reviews of the modern crop of cards? I don't play games whatsoever, especially 3d games, but I like a nice video system that can handle three or more monitors at high resolution with a high res wallpaper.

    Now, if you can provide that, let me throw in a twist that makes me think nobody has done it - I've run 100% Linux for several years. Is there a site that reviews video cards plus all the extras (like TV-in and out) with an eye toward their Linux-compatible features? I have a G400 and an ATI All-in-Wonder Pro and can do TV-in (but not record video) and TV-out (although I lose a monitor, and having to swap cables makes that a PITA).

    For that matter, I'd like to do video editing at some point in the future (when I get a digital camcorder). I'd like to convert all my VHS tapes to a digital format. Anybody know of a good import card at a reasonable price (under that $5k prosumer/low end professional bracket)? If it doesn't pull the absolute *best* quality possible from the VHS format, I'm content to wait rather than reencode a couple years from now.

    I kick this question out to Slashdot every year or so. To those with experience: what's the latest?

    --
    Evan (no reference)

    • I got a broadcast level digital edit system from www.dps.com for £1500 and that was 5 years ago.

      I've mastered video on it that has been broadcast on MTV and ITV.

      (Mine's the dps PVR system - it uses a dedicated SCSI UW drive and records in M-JPEG; it's Windows-only. They say that other drivers will never be available because of some licensing issues with Adaptec.)

      • I used to use the same thing to do TV commercials locally. Worked great - the drivers were a little buggy and crashed a little more than I appreciated, but the quality was excellent.

        Using Avid DV Xpress now, which is simply incredible. And of course, it's available for both PC and Mac.
    • For that matter, I'd like to do video editing at some point in the future (when I get a digital camcorder).

      Video capture on Linux... from a "freebie" capture port on your video card??

      Forget it man.

      Video capture requires drivers AND applications. You buy a video card for Linux, and IF the manufacturer supports Linux, video drivers are all you get. ATI has drivers for Linux... but not even the 3D part. See what I mean?

      The only way to get Linux capture drivers is to buy a dedicated capture card for Linux. That way you get what you paid for, with no "missing features" on the Linux side.

      Besides, the way things improve and drop in price, you never want to buy this hardware BEFORE you are ready to begin using it.

      Me? I have a MSI GeForce4 4400 (oc'd of course). Capture only works on Windows, but in a few years I expect Linux capture support to become a competitive feature... just like primitive driver support has become now.

      I've used broadcast capture equipment, and while this capture port could be called a "toy", the MSI Video-In/Out port handles uncompressed 720x480 fine (if your drive can't handle uncompressed YUV, I suggest HuffYUV, which is lossless compression).

      Whatever you use, "realtime" MPEG compression sucks. It looks OK if you consider how hard your PC is working to do the job in software, but there's just no substitute for variable-bitrate multipass compression. CBR video creates fixed-size files that are compromised everywhere... multipass VBR allows you to lower the "average" bitrate by 25% AND get better quality (presuming you lower the bitrate floor and ceiling and have a good encoder).
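
      To see why in miniature, here's a toy two-pass allocator in C (illustrative only; real encoders are far more involved, and the names are mine). Pass 1 measures how "hard" each frame is; pass 2 hands out bits in proportion, clamped to the floor and ceiling mentioned above, so easy frames subsidise hard ones at the same average bitrate:

      /* complexity[i]: per-frame difficulty measured in pass 1.
         bits[i]: output bit budget per frame for pass 2. */
      void allocate_bits(const double *complexity, double *bits, int nframes,
                         double avg_bits, double floor_bits, double ceil_bits)
      {
          double total = 0.0;
          for (int i = 0; i < nframes; i++)
              total += complexity[i];

          double budget = avg_bits * nframes;    /* same total as CBR */
          for (int i = 0; i < nframes; i++) {
              double b = budget * complexity[i] / total;
              if (b < floor_bits) b = floor_bits;
              if (b > ceil_bits)  b = ceil_bits;
              bits[i] = b;  /* clamping skews the total; real encoders iterate */
          }
      }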

      I've transferred 8 hours of VHS to DVD so far. Did someone say Star Wars? I didn't. ;-)

      With VHS, you shouldn't have to capture at 720x480, because of the resolution limitations of VHS tape... you can get away with 360x480 (not a typo!) and then double the horizontal lines... a good capture card does this in hardware.

      IF there's a way to use 360x480 on DVD and specify the aspect ratio as 8:3 (did I do that right?), you'd save a LOT of DVD space but I have not tested this. Until I figure that problem out, there's no advantage to capturing at this res... but it's worth mentioning if your hardware cannot keep up (you would have to stretch the video afterwards).

      In short, dual boot... or fork out real cash for professional capture under Linux. You have a limited selection under Linux and will pay more until the market becomes more viable.

      You'd also need to MASTER your DVDs under Windows. No authoring sw for Linux anywhere (AFAIK). Once you HAVE mastered your DVD, you CAN burn it under Linux using dvdrtools [fsf.org].

      • Video capture on Linux... from a "freebie" capture port on your video card??

        No. The question was in two parts - the first about video cards, the second about dedicated analog capture solutions. In fact, the reason I phrased it like that was in case there is a nice hardware analog-to-FireWire solution that would be OS independent.

        Hardly freebie... my price range was "something under $5k". The Linux-specific capture cards that I have seen have not compared well to other capture solutions. Since my analog capture requirements are primarily VHS, that's a very low hurdle. Since my home theater system is HDTV, I'd rather have something now to encode my rare VHS tapes (stuff that will never be available on DVD - quite a bit is converted from messy 16mm stock: cult movies, low-budget horror flicks, etc.) so I can safely get rid of all the tapes (well, put them in deep storage, and just flip them every 24 months).

        In other words, I'm looking to do for video what album collectors are doing to vinyl that will never be converted to CD (for that matter, I have a bunch of Tim Curry and Little Nell albums... ;) ).

        You'd also need to MASTER your DVDs under Windows.

        I'm looking to store them on DAT tapes on an HP drive, three duplicates, one local, one in another state, and one in storage. Collectors have a thing about threes (with physical stuff, it's one to play with, one to keep, and one to sell or trade). No need for DVDs in the immediate future - as I need the movies for festivals or whatnot, I'll restore the file, use them, and then delete.

        --
        Evan (no reference)

          • >No. The question was in two parts - the first about video cards, the second about dedicated analog capture solutions. In fact, the reason I phrased it like that was in case there is a nice hardware analog-to-FireWire solution that would be OS independent.

            Oh. Then you want an RCA-to-FireWire bridge. You can get them at CompUSA or online from $150 up. They are basically realtime capture devices using a constant bitrate. I have no idea what the quality is, but I doubt it's as clean as software-based multipass variable bitrate stuff.

          Like you said though... the current Linux offerings are sub-par. Now you can GROW that market yourself by buying an inferior product, but who the hell wants to do that?

            It will take a few years for Linux multimedia to gain traction. The special effects houses are ALREADY on Linux, but you're talking niche stuff that we can't afford. If you can't wait a few years, get an iMac or PC capture board.
          • If you can't wait a few years

            I can, and since nobody this year can suggest anything decent, I'll probably wait another year and ask around again. :) As I said, I'm in no hurry to encode now and then reencode everything a few years later because there's something better. Since it's VHS, I have a low bar for quality required (but I want the maximum quality out of that poor source).

            get an iMac or PC capture board

            I've been thinking about getting a Mac for my next laptop. If a PC capture board that ran under Linux and delivered good quality existed, I'd do it. I may wind up just buying a new motherboard and putting together a Windows machine for this... but then, as I said, I'm not in a hurry, I prefer Linux, and making a dedicated system that I'll use for one (albeit long) project seems a bit of overkill - I'd rather be able to use the capture card for occasional casual use later without hauling out a different machine, having to keep that machine working right, etc.

            That adds up to a standalone hardware solution (which you say is not as good as the software-based stuff), or a dedicated capture card, which everybody seems to agree is split between the subpar and the high-end professional levels.

            So I'll wait.

            --
            Evan (no reference)

      • Or both of you could get a Mac and realize this works flawlessly out of the box with no tweaking

        The hurdles one has to go through to use an x86 box... it's just sad.
        • >Or both of you could get a Mac and realize this works flawlessly out of the box with no tweaking

          Why bother with such a snide remark? You read the article; obviously I have a working setup and I am happy with it.

          BTW -- this isn't 1995 anymore... x86 plug and play WORKS as well as on the Mac; sometimes BETTER. Don't believe me? Tell me how you get an external DVD-R recorder working on an iMac. The blinders that some Mac users wear... it's just sad. ;-)

          BTW, I had a G3 up until 2 years ago. The Mac has a chance of becoming the "best of" both Linux and Windows, but they'll never get the new titles without expanding the user base.

          YEARS AWAY, but it's more likely that Linux will become more usable AND get the needed apps... before Apple gets their prices in line. Or maybe neither will happen.

    • The best way I have found yet to get video into a PC on a budget is a high-quality digital camcorder hooked up to FireWire. The encoding circuitry is designed to do its job in the cam, and FireWire is designed to get the video off the cam; then you can edit to your heart's desire. VHS still looks like the garbage that it is, though.
    • I like a nice video system that can handle three or more monitors at high resolution

      nVidia's newest Linux drivers [nvidia.com] claim to support up to 16 monitors. I'm not sure of the performance though.

  • occlusion detection? Is that used for detecting the occult?

    I can see how it might be useful for games like Quake, Doom etc, but I'm not so sure about GTA etc

    Ok ok, it's probably just a typo

    • by Anonymous Coward
      Occlusion is a perfectly normal English word. Occlusion detection is an important feature in complex scenes where algorithms like BSP trees fail. It saves fill rate by not drawing fragments which would be overdrawn by opaque pixels anyway.
    • I've noticed that the Wintel world loves to use "detection" in many of its buzzwords (such as motion detection)... Occlusion Detection may be slightly more familiar to you as Occlusion Culling, or just the "cull" step in the render pipeline. It's nothing new; in fact this time-saving step is what allowed oldschool 3D hardware (such as an SGI RealityEngine) to obtain decent performance back in the day.
    • occlusion detection? Is that used for detecting the occult?

      I can see how it might be useful for games like Quake, Doom etc, but I'm not so sure about GTA etc

      Actually, occlusion detection will have a massive impact on GTA3. Basically, occlusion detection determines whether the pixel/block of pixels currently being rendered will be obscured by some other pixel/block of pixels closer to the viewer. It is most effective when you have a lot of overlapping objects to render - a cityscape like GTA3 will benefit the most from this sort of processing. Consider that at most you can normally see about 15 or 20 buildings in GTA3, even with maximum viewing distance set, although there are hundreds you could potentially see if you had x-ray vision. If the card can skip rendering any of the buildings you can't see, you get faster performance.

      Cheers,

      Toby Haynes
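
      For the curious, here's a very rough sketch in C of block-level rejection in the spirit of hierarchical Z (the real hardware is more sophisticated; the names are mine). Each screen tile remembers the farthest depth drawn into it; if an incoming triangle's nearest point is still farther away, every pixel it would cover is hidden and the whole block is skipped without shading a single fragment:

      typedef struct { float max_depth; } Tile;  /* farthest depth in tile */

      /* Depth grows away from the viewer. Returns 1 if the triangle
         cannot possibly be visible in this tile -- no fill rate spent. */
      int block_is_occluded(const Tile *t, float tri_nearest_depth)
      {
          return tri_nearest_depth > t->max_depth;
      }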

      • Basically, occlusion detection determines whether the pixel/block of pixels currently being rendered will be obscured by some other pixel/block of pixels closer to the viewer

        Thought any decent graphics engine would do that already. If it doesn't, then yeah, that would make things a lot faster!

        Ah, but I guess that if it can be done in hardware then it'll be quicker

  • Yeap, I really need such a monster card to play UT2003 at 120 FPS... really... what, you don't believe me? why? just because my eye can't tell FPS after 60?

    The crossbar solution is very nice though as a memory interface. It has 19 GB/sec memory bandwidth. I would like to have that bandwidth in the main CPU though. An Athlon/Pentium 4 will smoke those cards with such a memory bandwidth.
    • For starters, I can discern between 60fps and 120fps (but not much higher), but, like being able to tell an mp3 from a wav, it's not something everyone can do.

      FPS is also related to response and remember it is peak FPS. Get 15 people filling a nice open zone with plasma and expect your FPS to drop.

      The Quake champions I know could tell you the FPS without having it displayed from the responsiveness.

      • And I can tell the difference between a 120 Hz refresh rate and a 160 Hz refresh rate. It's about more than your eyes. It's also about pressing the button that makes the OSD pop up and tell me what the refresh rate is.
    • just because my eye can't tell FPS after 60

      That's you. We're all different in what we can perceive.

      I can see 75 and I doubt I've got the most sensitive eyes.

      Another aspect to consider is what happens to fps when you up the resolution or image complexity. Per-pixel shading, resolution, Anti-Aliasing, etc., will all combine to slow the cards down.

      What's keeping me away from this card is ATI's notorious reputation when it comes to drivers. Why buy killer hardware if the software for it is dodgy? Add to that that ATI's not saying how they'll handle 8X AGP and it doesn't make me comfortable that it's a good choice.

      • Some info about the AGP 8X (dys)functionality can be found at http://www.overclockers.com/tips00114/ [overclockers.com].

        "Tweaktown has a news item (dated 9/12, 7:08 AM) which states that Epox Taiwan told them that the 8X AGP problem is being caused by the GPU, and a new stepping corrects the problem.
        "

        If it were just a software problem I wouldn't care that much about it, since there'd be the theoretical possibility of a convenient update...
    • FPS over 60 is very important in a ... uh, FPS.

      In reality, people only see about 30 frames a second. The difference between seeing something in reality and looking at it rendered on a monitor is that when you watch something move across your field of vision in real life, it actually occupies the space between the two frames you saw. Your eyes are not digital devices scanning each pixel of something you see, so you see fast motion as a blur.

      On a monitor, OTOH, each frame is distinct and separate from the others, so if something moves very far it will just jump from one place to another and will confuse your built-in physics engine when you're trying to line the crosshairs up on [TITAN]SexualHarrasmentPanda's head.

      Depending on the level of action in a game, you're going to want anywhere from 60-150 FPS in order to be able to predict trajectories and track targets.

      The reason you don't notice this on television, even though it runs at slightly under 30 FPS, is that the cameras keep the shutter open for the entire length of a frame, and so you still get the blur that your brain interprets as motion. Now if only they could figure out how to add that to a game, we would never have to worry about FPS again. Other than the fact that with today's hardware we'd get about .2 FPS to achieve an effect like that, of course.

      • Now if only they could figure out how to add that to a game, we would never have to worry about FPS again. Other than the fact that with today's hardware we'd get about .2 FPS to achieve an effect like that, of course.

        IANAFPSD, but it seems that all you'd have to do is render two frames, then do some sort of morph-type thing between them. Since the game engine knows where all the vertices are in the rendered images, it has all the info it needs to figure out which objects are going which way. It would definitely be a performance hit, but surely you could do it faster than 5 seconds/frame if it were implemented in the GPU itself.

        One caveat though - if you're in a complex scene and your framerate drops really low, the motion blur should probably turn off - both to get the rate back up, and to prevent your screen from just being a mess of blurred players and rockets screaming toward your head.
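
        In C, the geometric half of that morph is just a linear blend (a sketch under the stated assumptions, with made-up names; it ignores thorny cases like objects appearing or disappearing between the two frames):

        typedef struct { float x, y, z; } vec3;

        /* Blend two known vertex poses at parameter t in [0, 1] to get
           an in-between frame's geometry. */
        void morph_pose(const vec3 *a, const vec3 *b, vec3 *out, int n, float t)
        {
            for (int i = 0; i < n; i++) {
                out[i].x = a[i].x + (b[i].x - a[i].x) * t;
                out[i].y = a[i].y + (b[i].y - a[i].y) * t;
                out[i].z = a[i].z + (b[i].z - a[i].z) * t;
            }
        }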
  • by Niadh ( 468443 )
    off ATI's site: RADEON 9700 PRO 128MB AGP $399

    Not to dog on ATI or Nvidia, but 400 bucks for a video card is just too much. Sure it can pump out 200+ fps, but it kind of gets pointless after 30. What can the human eye sample at? 24 fps (guess) or so? Cards like this are made to stroke egos, and mine is big enough. I can only pray it doesn't fall into the wrong hands (sucky gamers that cry lag).
    • by SuiteSisterMary ( 123932 ) <{slebrun} {at} {gmail.com}> on Tuesday September 17, 2002 @08:17AM (#4272204) Journal

      Actually, it does matter with more FPS. Don't compare it to film, because even though they both use the term 'frame' they mean different things.

      A 24 fps film means that each frame is recording 1/24th of a second. That means that if an object being filmed is moving fast enough, the frame will have motion blur. When strung together with the other frames, this will give the illusion of smooth movement. A 24 fps 3d engine, however, means that you have 24 static shots. There's no transition from point to point, unless you wind up rendering said inbetween shots. Or, put another way, a 5 fps film of a hand waving in front of the camera will produce five frames full of motion-blurred hand, which, when played, will look relatively smooth. A 5 FPS render, however, will have five static shots of a hand sitting motionless in space, and when played, the hand will appear to 'teleport' from spot to spot to spot.

      Or, put another way, record that hand with a standard camera shooting at five 'frames per second' not 'several frames, each 1/5th of a second exposure' and then string the negatives into a film reel, splicing in copies to make the whole thing last one second.

      This is one of the reasons, I always thought, that 3dfx was trying to get their T-buffer out into the world, because then, yes, if you could LOCK the rendering at 30 FPS, and throw in motion and acceleration blur, it would still look better than a card rendering the exact same thing at 300 FPS.
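
      The same idea can be faked today, slowly, with OpenGL's accumulation buffer (a portable sketch, not the T-buffer itself; draw_scene is a hypothetical application callback, not a GL call): render several sub-frames spread across the 1/30 s shutter interval and average them, so fast movers smear the way film does.

      #include <GL/gl.h>

      void render_motion_blurred(void (*draw_scene)(float t),
                                 float frame_start, int subframes)
      {
          const float shutter = 1.0f / 30.0f;        /* locked at 30 FPS */
          glClear(GL_ACCUM_BUFFER_BIT);
          for (int i = 0; i < subframes; i++) {
              float t = frame_start + (i + 0.5f) * shutter / subframes;
              glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
              draw_scene(t);                         /* one sub-frame */
              glAccum(GL_ACCUM, 1.0f / subframes);   /* weighted add */
          }
          glAccum(GL_RETURN, 1.0f);                  /* write the average */
      }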

      • ...if you could LOCK the rendering at 30 FPS, and throw in motion and acceleration blur, it would still look better than a card rendering the exact same thing at 300 FPS.

        No it wouldn't.

        a) T-buffer's motionblur only did 4 subframe samples (V5 6k could do 8), which resulted in a very stepped-looking blur for even moderate motion, not a smooth blur at all. It'd look much the same as a card that rendered & displayed those 4 frames individually.

        b) The card (and host CPU) still had to calculate, transform, upload, project & rasterise all those subframes, so the system had to be capable of 120 fps anyway. It's just as slow as a card that rendered & displayed those 4 frames individually.

        c) A card that could render & display at 120 fps would show more detail than one that smeared the same thing down to 30 fps. It's much like anti-aliasing - great if you're limited by what you can display rather than what you can calculate, but not a substitute for real detail.

        Motion blur isn't necessarily a good thing, it's just a way to convey movement information beyond the limitations of the display. The V5 was too limited in other ways (mostly speed) to take good advantage of this, but current cards could do a better job.

        Since people's eyes/brains are generally limited to perceiving 60-90 fps anyway, there's a valid argument for using the ability to render at 300 fps to calculate subframes & create motionblur at 75 fps instead. Contrary to the ugly & overexaggerated blur that 3dfx liked to demo, this would actually help FPS players to track their target's movement at high speeds, by still conveying some of the extra information that a true 300 fps visual system could handle.

    • just wait a few months when nvidia rolls out its line of new cards and i'm sure the prices will go even lower, and consumers like me who were once part of a fallen group of dotcommers that played quake after hours on the company lan will rise from the ashes and hail forth a new video card upgrade :-)

      Hooray for Adam Smith, and let the pricing war begin.
  • Reminds me of Biology class.

    Now those Radeon 9700's know what it feels like to be a frog.

    Ribbit.
  • Catch up nVidia .... (Score:2, Interesting)

    by walkunder ( 607693 )
    Building your own XFree86 driver that enables all the features of your own card under non-MS systems, and not expecting a 3rd party to do you this favor..... that's being a friendly and good company
  • about the AGP 8X error, the lack of Win 9x driver support, and the data corruption under NTFS.

    So, despite this card's impressive numbers, expect 60 fps under XP and 2K running on FAT32 only.
  • Right now I'm using a Matrox G550 with two CRTs and I have nothing but problems: when using video editing software (Avid ExpressDV, Premiere, Final Cut, you name it) with more than 16-bit colours, the mouse disappears on the primary display. When I try to use software with some kind of fullscreen mode, like media players or ACDSee, the computer reboots... I really loved my G550 with one display. But (for me, YMMV) it just plain sucks with two displays... Anybody using the 9700 with two monitors at different resolutions (1600*1200, 1280*1024, with win2k) who can tell about their experiences?
  • Have murderous AI's locking us in the holosuite and not before then...

    (Wait a minute, hasn't the Playstation 3 PR team claimed they're debugging that at the moment?)

  • Several things jumped out at me in the beginning of the article:

    Card can do 30 bit color. ATI has no drivers for Windows that can do this, however.

    Card has floating point for color mapping. ATI has no drivers for Windows that can do this, however.

    And so on. In short, there are many cool things in the hardware that do you no good right now, because they aren't supported in Windows.

    <voice character="biff">McFly! Hello!</voice>
    Were ATI to release the interfaces to these things to the XFree86 guys, they could have an environment in which all of this cool stuff was supported very quickly. And since you can get access to the XFree86 code easily, supporting things like 30-bit color depth becomes a great deal easier than doing so under Windows. Yes, you might have to modify (GTK|QT) to get full support, and you might have a few apps asking for a 24-bit visual because they don't support 30-bit, but imagine if you had (Gnome|KDE) running at 30-bit depth, running The Gimp.

    Imagine running Q3, UT, or RTCW in 30 bit color with floating point shaders.

    Imagine the pain on MS's collective faces when the boot logo of the demonstration machine is not broken glass but rather flightless waterfowl.
    • Wait for DirectX 9 [neoseeker.com] to come out if you want to see floating-point color. Even if the drivers do support it, you'll never see software using it until there's another HAL there. Remember selecting your video card in every game? No way developers are going to go back to that.

      Oh, and as for Linux supporting FP color before Windows... don't bet on it. I'm not doubting the Open Source community's ability to implement it, but MS has the specs shipped to them in lead-lined boxes with motion-detecting turrets mounted 360 degrees.

      • ...but MS has the specs shipped to them in lead-lined boxes with motion-detecting turrets mounted 360 degrees.


        Hence my point - that ATI is really only screwing themselves by continuing to allow that sort of favoritism.

        And under Linux, all I have to do is tell the game that it is talking to my native libGL, and the differences are handled there. That's what libGL is - the ultimate HAL.

        Ditto for 2D stuff: new stuff would be linked against a version of libX11 that knows how to access the extended color depths - and all that would do is pass the requests on to the X server.

        In short - we actually have it BETTER than the Windows people in this regard, if the damn vendors would just throw us a bone!
    • And who would write all of this? You?

      ...you might have to modify...to get full support...

      Clearly, you're no programmer. Imagining these things is a great deal easier, and requires much less time, than actually doing them.

      I don't suppose you've wondered why it is that open-source programmers have been missing all these fabulous opportunities to cause MS pain? It's because the relatively few people with sufficient talent, time and inclination to do this sort of stuff are for the most part being paid to work full-time for a closed-source company instead...

      • Actually, I make a very good living as a software engineer, thank you very much.

        I have worked both with MS WinNT and with Linux. I have written low-level drivers for both. I have designed systems of great complexity [p25.com].

        I can trivially turn your points around by pointing to Linux, to Mozilla, to Apache, to Sendmail, in fact to all of Sourceforge.

        The single biggest thing holding back drivers under XFree is the fact that talented individuals such as myself cannot get the programming documents for boards like the ATI without signing an NDA - and to be given the opportunity to do so requires you to ALREADY be a "registered" XFree developer. Can you say Catch-22?
  • by vaxer ( 91962 ) <<sylvar> <at> <vaxer.net>> on Tuesday September 17, 2002 @07:38AM (#4271985) Homepage
    If you're putting pieces together and considering them as a whole, that's synthesis.

    If you're taking pieces apart and considering them separately, that's analysis.

    If you're explaining this on Slashdot, that's anal-retentiveness.
  • "Quartz Extreme" for XFree86 anyone? I have a huge amount of power locked up in my NVidia Gefore4 Ti card, wish I could use it for my regular 2D work (blending, translucancy, etc.)

    -adnans
  • "The Radeon 9700 Pro leads in nearly every category. It's endowed with gobs of memory bandwidth and a blistering pixel fill rate that's more than double that of the closest competition. "

    it is listed at "2600" on the chart, which is just a little over a few of them, and below another one.

    either I'm retarded (*very* likely), or there is some sort of typo on there.
    • ...either I'm retarded (*very* likely)...

      Sorry to say, but you are retarded :-):
      2600 for the 9700, while the highest competitor reaches 1200 (Mpixels/s). Do the math :-)
      You looked at the Peak fill rate (Mtexels/s). According to the article:
      The 9700 Pro's texel fill rate is good, but it's not head and shoulders above the other cards.

      So you see, everything is fine...
    • Yep, you're retarded. :-) Nah, seriously, the pixel fill rate is 2600 (million pixels/sec) compared to 1100, 1200 and 880 for the three compared cards. So, yes, more than double that of the closest competition.

      You're looking at the texel fillrate, which is also 2600 (million texels/sec) for the Radeon 9700, and is not blistering ahead of the competition (indeed, as you say, it's below one of them, the Matrox Parhelia).

      But, as they say a few paragraphs down, it's better to get a high max texel fill rate from high clock speed and lots of pipes (as the Radeon 9700 does) rather than from lots of texture units per pipe (as the Parhelia does), because not every game is going to want to use 4 textures on every pixel. They all want to draw a helluvalot of pixels, though.
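
    The arithmetic in C, for anyone checking (the clock and pipe counts are my recollection of the published specs, not numbers from the article):

    #include <stdio.h>

    /* pixel rate = core clock (MHz) x pixel pipelines
       texel rate = pixel rate x texture units per pipeline */
    int main(void)
    {
        /* Radeon 9700 Pro: ~325 MHz, 8 pipes, 1 texture unit per pipe */
        printf("R9700:    %4d Mpix/s, %4d Mtex/s\n", 325 * 8, 325 * 8 * 1);

        /* Parhelia: ~220 MHz, 4 pipes, 4 texture units per pipe */
        printf("Parhelia: %4d Mpix/s, %4d Mtex/s\n", 220 * 4, 220 * 4 * 4);
        return 0;
    }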

  • ... that there seems to be a great deal of trouble with the AGP 8X interface as documented here [overclockers.com] and acknowledged here [ati.com]? This does not appear to be an isolated case, as many people with many different mainboards are reporting this. If one looks only at performance without the chance of actually getting the thing working, the review is incomplete, if not downright misleading.
