Graphics Software

ATi Radeon 9700 Full Release Review w/ Benchmarks

Chalupa_Man writes: "ATi Technologies has officially released their new Radeon 9700 Pro today. Real benchmark numbers and a full review can be found here. The card is impressive for sure and should have NVIDIA on the ropes for a while, as it beats out a GeForce 4 Ti 4600 handily, especially with Anti-Aliasing and Anisotropic Filtering enabled. Image quality is also top notch for this new high end DX9 compliant product from ATi." sunny_talwar adds links to more reviews of the new high-end Radeon at AnandTech and Tom's Hardware. Update: 08/20 03:06 GMT by T : Cp writes "Gamers Depot also has their full review up of the Radeon 9700 Pro, including nice images of the driver tabs and 6x Antialiasing performance."
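
For readers wondering what those two benchmark switches actually flip at the API level: in OpenGL they correspond to the GL_ARB_multisample and GL_EXT_texture_filter_anisotropic extensions. A minimal sketch in C, assuming a GL context already created on a multisample-capable visual; the helper name and the 8x anisotropy degree are just illustrative, not from any of the reviews:

```c
#include <GL/gl.h>
#include <GL/glext.h>   /* GL_MULTISAMPLE_ARB, GL_TEXTURE_MAX_ANISOTROPY_EXT */

/* Hypothetical helper: turn on the two features the benchmarks toggle.
 * Assumes the context was created on a multisample-capable visual. */
static void enable_aa_and_aniso(GLuint texture)
{
    GLfloat max_aniso = 1.0f;

    /* Multisample anti-aliasing: the driver does the per-sample work
     * once this state is enabled. */
    glEnable(GL_MULTISAMPLE_ARB);

    /* Anisotropic filtering is a per-texture parameter; query the card's
     * maximum degree and request 8x, clamped to what the hardware allows. */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                    max_aniso < 8.0f ? max_aniso : 8.0f);
}
```

(The reviews flip these in the driver control panel instead, which forces the equivalent state on every application.)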
  • by Anonymous Coward on Monday August 19, 2002 @02:55PM (#4099381)
    This is old news.. the Radeon 9700 has been out for a few hours already. Why do we have to wait so long for news on this site?
  • HardOCP too... (Score:4, Informative)

    by Marasmus ( 63844 ) on Monday August 19, 2002 @02:57PM (#4099409) Homepage Journal
    [H]ardOCP [hardocp.com] also has a review and benchmarks. Good stuff from the [H]ard crew.
  • Competition is a good thing. The last thing I particularly want is for nVidia to get stomped by ATI because they start getting complacent like 3Dfx did. Let's hope they keep each other on their toes.
  • by pnatural ( 59329 ) on Monday August 19, 2002 @03:01PM (#4099445)
    ...until John Carmack responds with his take on the card.

    I'm serious. How many of us base our video card purchases on the recommendations he makes? He knows the cards in detail, knows what features they support and how well, and he sure as hell knows how well they'll perform with the next id game.

    So John, is this card worthy?
    • According to a recent .plan of John's, he's already decided to demo Doom III on the Radeon platform. He decided this a few short weeks after doting on the Nvidia stuff, too.

      It sounds to me like ATI has a serious card here. Now if they can overcome their pitiful history of sorry drivers...
      • According to this article (did you read it?) the drivers recently released by ATi are anything but pitiful. They're apparently stable and solid, so feast your eyes on a new card if that's what was holding you back!
    • by Anonymous Coward
      http://www.anandtech.com/showdoc.html?i=1656 [anandtech.com]

      Quote:

      When ATI started talking about R300 and hinted that it would be significantly faster than anything NVIDIA had up their sleeves, we were understandably skeptical. The progression from there is best summed up by what our own Matthew Witheiler had to say about the R300: "It all started with Carmack's endorsement of the card; that was huge for them. Now it has erupted into something that I didn't think was possible"

      Matthew's final statement sums up the feelings all of us at AnandTech had about the R300; we were impressed that John Carmack provided such a glowing endorsement of the technology back at Quakecon, but we were floored once we actually saw working silicon in action.

    • From what I remember of an earlier post of his on one of the innumerable Doom ]|[ threads, his opinion was that the Radeon cards are superior to the NVidia cards, and have been for a while... his main beef seemed to be that ATI's drivers were so crappy that NVidia was able to outperform them, even though the Radeon card has better specs. Of course, IANJC, so take with as many grains of salt as appropriate.
    • In gamespy's coverage of quakecon [gamespy.com], Carmack says Doom3 will be, "believe it or not, based around the technology that became available with GeForce1-level hardware." (That's a quote from the article writer, not Carmack himself.) But if you buy the new ATI card, you'll see its advanced technology put to good use in 2-3 years.
    • I'm serious. How many of us base our video card purchases on the recommendations he makes?

      I don't base my decisions on any one data point, especially one from someone wanting to sell me a product. To do so would be stupid.

      He knows the cards in detail, knows what features they support and how well,

      As do hundreds (if not thousands) of others.

      and he sure as hell knows how well they'll perform with the next id game.

      As I said above, his views can potentially be seen as one trying to sell you his product. I prefer to get my reviews from more notably unbiased sources. I'm not saying Mr. Carmack's opinions are biased. But, at the same time, I wouldn't necessarily base my Goodyear tire purchase on Ford's recommendations.
      • All opinions are biased.

        All of them.

        The most important thing is to determine how significant the bias is, and how it plays a role in the decision process.

        Mr. Carmack, for instance, may wish to get you to buy his latest game, but his bias is toward presenting his game as well as possible with the special effects/speed etc.

        This is different from a bias where a reviewer likes a particular company, instead of a product.

        Not to say either is better, or worse, but there is always bias.

    • by cwebster ( 100824 ) on Monday August 19, 2002 @03:56PM (#4099839)
      the doom3 demo theater at quakecon was run by a box with a radeon 9700, so that should answer your question.
    • by toofast ( 20646 ) on Monday August 19, 2002 @10:53PM (#4101721)
      From ati's website [ati.com]:

      John Carmack

      "The R300 is an ideal rendering target for the DOOM engine, it can do both our highly complex pixel shaders for light surface interactions and can very rapidly render all the stencil shadow volumes which deal with all our dynamic masking of way light operations"

      "3D accelerators are all about performance, quality and flexibility and the R300 breaks new ground over anything thats come before it in all three areas."
  • by wilburdg ( 178573 ) on Monday August 19, 2002 @03:01PM (#4099446)

    Compared to some other companies *cough*NVIDIA*cough* ATI has been very helpful to linux developers. While NVIDIA only releases binaries, and only for x86, ATI actually provides developers with technical specs to aid development on other platforms (PowerPC anyone?).

    From ATI's website:

    While ATI does not develop Linux or XFree86 drivers for its graphics cards in house, we actively support 3rd party developers that provide driver support for the majority of ATI products with development kits and information.

    Radeon drivers for Linux are in development. XFree86 and the DRI Open Source Project offer Radeon 2D support with their latest released source code. 3D support is scheduled to be released Q1 2001.

    • Do they provide full specs, or do they keep details of the fanciest features secret? How does their support of free software compare to Matrox?

      I used to buy Nvidia-based cards (mostly from Asus), and technically they're great, but nowadays I try to only buy documented hardware. I've been wondering for a while which video cards I should buy for a desktop system.

      My Fujitsu Lifebook (P2040) has an ATI Rage Pro Mobility of some sort, and it seems to work fine. To get the X Video extension working so I could view CDs, I had to download a new driver that isn't yet in the XFree86 distribution, but it works great.

    • Compared to some other companies *cough*NVIDIA*cough* ATI has been very helpful to linux developers. While NVIDIA only releases binaries, and only for x86, ATI actually provides developers with technical specs to aid development on other platforms

      Probably because they want some competent people to write some drivers for them. :-)

    • by Clue4All ( 580842 ) on Monday August 19, 2002 @03:26PM (#4099659) Homepage
      And? ATi has dumped partial specs on XFree86 developers and said "here." 3D support for the Radeon series of cards is abysmal and non-existent for the 8500 and higher cards. 2D isn't that impressive either. If you want OpenGL from ATi, you can get it, though, with BINARY-ONLY DRIVERS. You're comparing this to high quality nVidia drivers that have gotten 95-99% of the Windows framerate from the start? Gimme a break.
    • It's easy to complain, but how long will this be out before there are drivers for XFree86? 2, 3, 4 generations?! (There are still no Radeon 8500 drivers.) You can complain about NVIDIA binary drivers for linux all you want, but I for one appreciate being able to use the latest technology when it is released without having to use windows. I would really like to see ATI release drivers (binary or otherwise) for linux.
      • >but I for one appreciate being able to use the latest technology when it is released without having to use windows.

        And I appreciate being able to build a box extra cheap with an old 2D video card.

        That's the problem with binary drivers -- you're buying a time limited product.

        And yeah, I do still have a machine on an old Trident 8900 video card. It just doesn't need a $500 upgrade (no AGP slot and 486 processor == No NVIDIA for it). Thank God for the longevity of open source.
        • Umm, your comments make no sense. There is no AGP slot on your computer, so it can't accept an NVIDIA card. Open Source drivers would help you how? If you're talking about NVIDIA dumping support for older cards in newer releases of the binary drivers, which (presumably) support newer kernel versions or whatnot, then consider:

          A) A RivaTNT2 (minimum reqs to get into NVIDIA driver land) is $20. How poor are you?

          B) The kernel wrapper is Open Source. You can modify it to run on whatever kernel you want. A bunch of people hacked it to make it run on FreeBSD, and NVIDIA techs even provided support!
      • I would really like to see ATI release drivers (binary or otherwise) for linux.


        This must be your happy day then, cause XFree86 4.2 comes with Radeon 8500 drivers (has since it was released, I use it with my 8500 every day) and ATI just released some really sweet FireGL drivers for XFree86 4.1 & 4.2 that also work with the Radeon 8500's.

    • You're fscking kidding, right? Their support appears to be a "we're going to toss bones at some open source developers and see if _they_ can get our sh!t to work".

      I bought an ATI All-in-Wonder to use to do some video capture, VCD some TV shows, etc. On the windows side, the box wouldn't work (it needs to be the ONLY video card in the system, or it either doesn't work or locks up your system in 90 seconds - I timed it.). So with my top-of-the-line 3D card, I need to physically remove it when I want to record video - even telling the AGP card to be secondary doesn't work... it STILL locks up. Their vaunted tech support answer? "um, just use our card for 3d, or remove the other one each time". Um, pass.

      So I decided to use it for a PVR in Linux. Good luck getting that to work. There's this thing called GATOS, which works on SOME of the ATI Video Capture cards (and not necessarily all of the same model), but is apparently too complicated for me... (okay, I need to install this Kernel Module, recompile the kernel, make sure the headers are where it thinks they should be, download from a CVS tree the latest source, install that, install this other thing, then a program to watch TV and another to record!)

      ATI - NEVER AGAIN.
    • what?

      I never understood this anti-NVIDIA fud.

      Look, they write drivers for us, which these days outperform the windows ones sometimes.

      what the fuck are you complaining for?

      and this crap you say about binary only: they ARE released in source, I have it right here. OK, sorry, their OpenGL libraries I don't have the code to. But you can download the driver code off their website.

      here are other things about them. Each release has a rather substantial ChangeLog. They support cool things like Xrender. They give us support for that mouse cursor-shadow hack that you see in windows. They even let Brian Paul implement some of their proprietary openGL extensions in Mesa.

      so, troll, tell me again why NVIDIA sucks. last I checked, running an NVIDIA card under linux you have a MUCH MUCH MUCH better chance of having fast 3D than with an ATI card. and when I say much better, I mean like 10 to 1.
    • Not really.
      The docs they gave were sparse and contradictory. They only gave docs under a strict NDA with a chicken-and-egg test for who gets access to them, which filters most people out.
      They are not allowing people to develop support for TV-out or Hyper-Z or a lot of other features. Hyper-Z I can understand, but TV-out? Come on... Don't tell me I have to go back to nVidia just for TV-out.
    • It was the last time I saw Linux Radeon drivers.

      Apparently if you have a really fast Rage 128 games like Q3 will run fast. But who needs a fast Rage 128...we need drivers that treat an N-generation card as such, not an (N-1) generation card.

      So my true questions are (a quick programmatic check for 2-4 is sketched after this list): do the _current_ drivers support

      1. hardware T&L?
      2. vertex shaders?
      3. pixel shaders?
      4. FSAA / SmoothVision?

      and last but not least,
      5. TV-out / Multiple monitor / Video-in?
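
      One rough way to answer 2-4 yourself: with a GL context up, check which extensions the driver advertises. A hedged sketch in C; the helper names are made up, the ARB extension names are the standard tokens, and absence from the string only means the driver doesn't expose a feature, not that the silicon lacks it (hardware T&L has no single extension to test for):

```c
#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* Requires a current GL context (e.g. created via GLX or GLUT). */
static int has_ext(const char *name)
{
    const char *exts = (const char *) glGetString(GL_EXTENSIONS);
    return exts != NULL && strstr(exts, name) != NULL;
}

static void report_driver_features(void)
{
    printf("renderer:       %s\n", (const char *) glGetString(GL_RENDERER));
    printf("vertex shaders: %s\n", has_ext("GL_ARB_vertex_program")   ? "yes" : "no");
    printf("pixel shaders:  %s\n", has_ext("GL_ARB_fragment_program") ? "yes" : "no");
    printf("FSAA:           %s\n", has_ext("GL_ARB_multisample")      ? "yes" : "no");
}
```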

  • Quack Benches (Score:2, Insightful)

    by yeoua ( 86835 )
    Sure, we know it runs awesome on Quake3... but will it run awesome on Quack3?
  • It doesn't surprise me one bit that ATI can push the envelope of 3D graphics.

    They've been in business since the dawn of the x86 age. They always made solid cards.

    Around the time of the stealth64 ATI lost its edge because they didn't see the potential for the consumer gaming market. (Stealth64 was the hot gaming card back in the doom days, ask thresh) Despite companies like 3dfx releasing the voodoo1 and Creative releasing the VLB 3D blaster, it was years before ATI came out with a graphics chip with even rudimentary 3D support.

    Nvidia, a new company, only took a couple of chip revisions before they were able to match 3dfx's performance. It's no surprise that a company like ATI, with years of 2D behind them, would be able to quickly beat out the new top dog Nvidia.

    Kudos and good job ATI. Now if you could only price these new cards in a reasonable range, let's say less than $200, you could definitely become the new king.
    • True, if they could offer the top card at $200, no one would buy the top card of their competitor. I don't think the economics justify that. Also, doing that would kill any further revenues from their former high end cards (now the mid-range). The very high end is always very expensive. It allows the companies to make some good money at the expense of the richest / most demanding. The rest of us reap the benefits 6-12 months down the road when the technology those with cash paid dearly to develop trickles down into affordable price ranges.


      I'm sure you know this and I'm not really sure why I responded, but there it is.

  • I don't think anyone should purchase the ATI 9700 until the Geforce5 (or nv30) comes out.

    Honestly anyone could have told you months ago the 9700 would beat the gf4... it's a new generation card.

    And what's the use in getting it this month, since most games out now are still based on 5-year-old GFX engines that run decently on a GeForce2.

    and please spare me the tears of 60fps vs 200fps :)

    --me

    • Thanks for telling everyone what to do, where would we be without you? :)

      Yes, the NV30 will probably beat the 9700. But when is it arriving? Some estimates I've heard don't have it showing up until Feb '03. Six months.

      Anytime you're ready to buy a new vid card (or CPU, or mobo) you can wait six months to get something better. But sooner or later, you have to buy something. Putting off a purchase because something better is coming down the road is never a good choice, because there will always be something better down the road. If you need a machine right now, buy it now and get the best components for your needs.

      And in six months or whenever the NV30 arrives, you know what? Most games will still be based on 5 year old GFX engines that run decently on a GF2. :)
    • It's always a coin toss:

      Heads - wait 3 years for games to be developed that can actually take advantage of the features of a new card (good luck finding a 9700 or GeForce5 in 3 years)
      - or -
      Tails - buy that card now and know that it'll be good for at least 3 years

      Decisions, decisions.....
    • Yup, and when the GF5/NV30 comes out ATI will have another card out: "I don't think anyone should purchase the GF5 until the (new ATI model) comes out".

      Buying PC components is like this: your computer is off the leading edge within weeks, if not days or hours.

      However, you're pretty much right in that most games don't require the 3D power of a new card unless you're running at 1600x1200x32, and even a GF2/3 should handle that OK. However, there will be those that absolutely must have the latest & (presumably) greatest.

  • by Critical_ ( 25211 ) on Monday August 19, 2002 @03:17PM (#4099590) Homepage
    It's a demo/test model that I was using in the lab to verify compatibility with our applications. Yes, that is corporate speak for "I played quake for a couple hours on company time". I am paid to do that. Anyway, here it goes...

    * 2D: WOW! I have been a diehard Matrox fan because of the awesome 2D on their boards. However, I think Matrox might have a challenger on their hands. Even at dizzyingly high resolutions, the fonts were crisp and clean.

    * 3D: Very nice. It has better image quality than the Geforce Ti's with FSAA enabled. However, it cannot compete with the Matrox Parhelia here. The Parhelia, though it has slower framerates, has better color saturation and 16x FSAA w/o a massive performance hit.

    * Drivers: so far it has worked fine under WinXP. I got the SVGA xserver running on it after mucking around with Redhat for a couple hours. I am hoping a dedicated XServer is coming out for this card since it needs one badly.

    Anyone else have any luck under Linux?
    • X server (Score:3, Insightful)

      by wowbagger ( 69688 )
      It is unlikely you will see an effective Xserver for this card any time soon. While nVidia may only provide closed-source drivers (save for the barest minimum source-level shim to allow their drivers to work with a few different kernels), at least nVidia pays programmers to support their cards under !MSWindows.

      ATI will provide some documentation to selected members of the XFree development team, but they do not release all the programming information to the world, nor do they pay anybody to support their cards.

      Perhaps that might change if enough people make it clear to ATI that Free Software drivers for XFree, source on the CD that comes with the card and pre-compiled binary modules for the current releases of XFree will sell more cards.

      Of course, the odds of this happening any time soon are roughly 2-to-the-9421 power, and falling...
    • > I am hoping a dedicated XServer is coming out for this card since it needs one badly.

      me too! Maybe The Weather Channel will $$support$$ the development of open source XFree86 drivers [slashdot.org].
      I'm still waiting for the 8500 drivers due out in Q4.

      -metric

    • Dedicated Xserver?

      I'd really like to hear your information if it pertained to the XFree86 4.x tree. XFree86 3.x and its separate X server binaries for each card disappeared a long time ago. XFree86 4.x has an ABI which allows driver .o files to be loaded automatically based on the config file.

      Now, if there was a way that per-user accounts could have an XFree86 override, and there were easy tools for both CLI and GUI configuration, and these were all the default settings in distributions, and the changes made in a session were stateful (i.e., if I changed the res down a notch and restarted X, it'd be at that res, even if I had many modes defined), we'd finally be close to where Windows / MacOS is in terms of easy GUI configuration.

      Setting up X is still too much black magic.
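
      For anyone still picturing the 3.x model: under 4.x the driver is just a line in the config file. A minimal, hypothetical XF86Config-4 fragment (the identifiers are made up; "radeon" is the stock open-source driver module name):

```
# XF86Config-4 fragment: 4.x loads a driver module named here instead of
# shipping one monolithic X server binary per card family (the 3.x way).
Section "Device"
    Identifier "MyRadeon"        # made-up label, referenced by the Screen
    Driver     "radeon"          # stock XFree86 4.x open-source driver module
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "MyRadeon"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1280x1024" "1024x768"
    EndSubSection
EndSection
```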
  • ..because it seems that every time a new game comes out, I have to upgrade my existing video card just to take advantage of all the new and improved features. Actually I'm still trying to max out my little GeForce2 MX, which by the way has been doing a damn good job so far. Not bad for the 70 bucks I paid for it. I'll just have to look for the next big bang for the buck.

  • I'm using an Nvidia GeForce2 GTS right now, and I've long been a fan of Nvidia -- because they have released superior products, not because of any allegiance.

    But, let me say it, the 9700 wipes the floor with everything else -- though that's pretty much obvious. As a side note, put them on a log scale and my GeForce2 would sit as far into the negatives as the 9700 does into the positives.

    ATI's released a great product here. Kudos to them. I could probably play Descent 3 or Tomb Raider 4/5 with the 9700 at 1600x1200 with all the eye-candy at 5000fps. :-) It'll definitely be able to handle Tomb Raider 6 when it comes out.

    The question here is do you buy it now or wait for the NV30? I say it depends on who you are. I myself am not much of a game person, though I have a few (very limited) favorites: I like the Tomb Raider series and the Descent series; I'm also partial to Jane's USAF. That said, I'll never buy a game outside of these areas, UNLESS GameCube's Eternal Darkness [eternaldarkness.com] comes out for the PC. (I might be buying it anyway, as there's an emulator for the GameCube which works on Linux.)

    What I say you do is figure out how much power the games you want require, and how much power the 3D apps you want require. Buy your graphics card based on that, and buy for value.

    To me, that means you don't buy it as soon as it comes out. You wait until the next major release comes out, then buy it; i.e., you buy the Radeon 9700 when the next Radeon comes out, or when its price drops significantly. Similarly, you buy the NV30 when the NV35 or whatever comes out, or when its price drops significantly.

    Once a product is no longer bleeding edge, you normally pay what it's actually worth, and not some insanely high cost. Look at the prices of the GeForce3's now: pretty reasonable, usually $100 to $250 or so. GeForce4's sell for about $400...are they really 1.6-4x better than the GeForce3's? In other words, if you could cluster GPUs, would one GeForce4 beat out 2-4 GeForce3's? I don't think so. That's what the price ratio says, but I think they fall short of it.
  • I was hoping for a less ass-kissing review from a good site like the Tech Report, but they were not included in the boards sent out to reviewers [tech-report.com]. I can only guess why.
  • by Com2Kid ( 142006 ) <com2kidSPAMLESS@gmail.com> on Monday August 19, 2002 @03:59PM (#4099868) Homepage Journal
    Polygons hit the road and we get some real time NURBS rendering going on. :)

    Seriously though, heh. I want decent multi-head support in ALL my games, as in BUILT AROUND the concept. Imagine having one screen for your inventory and maps and so forth and another one for your actual playing field.
  • by gatekeep ( 122108 ) on Monday August 19, 2002 @04:27PM (#4100073)
    Sharky Extreme has a review up now too. [gamers.com]

    Mostly the same glowing reviews... It'll be interesting to see how Nvidia's next card responds.
  • by D_Fresh ( 90926 ) <slashdot@dougalP ... om minus painter> on Monday August 19, 2002 @05:02PM (#4100285) Journal
    The card is impressive for sure and should have NVIDIA on the ropes for a while, as it beats out a GeForce 4 Ti 4600 handily, especially with Anti-Aliasing and Anisotropic Filtering enabled.

    Holy polygons, would you just quit the hype already? I *just* ordered a dual-867 Power Mac with GF4 Ti, and I spent a pretty penny for that upgrade - can't a guy bask in hardware glory without some bithead like you going and raining on his GPU parade? Sheesh.

    • I just made my return to Mac-land. Bought a G4/800 with 768 megs of RAM and a 40 gig HD with Radeon 7500 over the weekend. It runs OS X 10.1.5 freaking sweet! Can't wait to get 10.2 on it.

      Got a sweet deal on it for $1399, CompUSA was making room for all the dual processor G4's. Hope you enjoy it.

      -Pat
