Graphics Software

ATI R300 and RV250

Chuu writes "The ATI R300 (Radeon 9700) and RV250 (Radeon 9000/Radeon 9000 Pro) reviews are out at all the usual suspects, but the one you want to pay attention to is over at anandtech.com, since somehow Anand got permission to publish his benchmark results for the R300 while the other sites were stuck with whitepapers. The results? The RV250 is a GF4MX killer, which is not saying much. On the other hand, the R300 absolutely trounces the GeForce4 Ti4600, running 54% faster in Unreal Tournament 2003 and 37% faster in Quake 3 at 1600x1200x32 on a Pentium 4 2.4GHz."
This discussion has been archived. No new comments can be posted.

  • Anand's benchmarks (Score:5, Insightful)

    by Zathrus ( 232140 ) on Thursday July 18, 2002 @09:32AM (#3908385) Homepage
    The reason Anand got to post his benchmarks is that he doesn't have actual numbers... just relative data. The R300 didn't post 256 fps in UT2k3; it just did x% better than a GF4 Ti4600.

    Of course, as he points out, the GF4 numbers are available, and it only takes some simple math to extrapolate from there.

    The card looks very impressive. It's out 4 months before the NV30. Maybe by then ATI will have drivers worth a crap too.
    • by linzeal ( 197905 )
      Drivers have been good, well, since they released the Catalyst set. It took them two days to fix a fog table problem in GTA 3.
      • I'll personally take it seriously when I see some reviews of the Linux drivers, if any exist. NVIDIA's Linux drivers are of very high quality, but closed source. If ATI releases some high-quality open-source Linux drivers that get equivalent framerates to the Windows counterparts, I'll definitely switch.
        • I am not trying to diss nVidia's Linux drivers... They are pretty fast, but I hear there are a lot of things they don't have... like support for certain SDL video modes that things like MPlayer use. But then again, this may just be bias from certain programmers who hate the idea of closed-source binaries.

          I will note that often, the Linux drivers for nVidia cards are faster than in Windows. They use the same driver model.

          I've been pretty happy with the PowerVR Kyro 2 drivers in Linux. They are only beta drivers right now, but they seem faster than what I got under Windows 2000. RtCW flies, and looks gorgeous.

          Personally, I don't care if they are closed or open, as long as I have something that works well. Not everything can be open source. There need to be some exceptions at times, which is why I fronted $35 to www.opensound.com for proper CS4630 (for the Santa Cruz) drivers.
  • Am I the only one who read that story and had their eyes glaze over at all the capital letters and numbers?
  • Not that ATI doesn't make some good shit - I loved their ATI-TV for the longest time (until I got a Mac, but that's another story).

    But they've been known to, um, "help" their drivers along with specific applications. When I see one plugged into my PowerMac while I'm playing Medal of Honor or Warcraft III and I see better performance, then I'll believe it.

    Now, see what happens to the boy who cried "framerates", kids?
    • Speaking of Macs: I really hope to see these new cards in the updated desktops expected mid-August. The low end of the line currently carries a 7500, and the higher end a GF4MX or Titanium.

      Apple needs to keep up with the high end graphics cards if they want to keep attracting gamers (and games) to their platform.

      -Spyky
  • It should! (Score:1, Troll)

    by Zone5 ( 179243 )
    No big surprise, it SHOULD trounce the GF4... after all, it's the first of the next-generation cards. It's interesting to compare it to previous-generation cards, but it would more appropriately be put head-to-head against the GF4's successor, the NV30. Trouncing current cards is a big yawn, but if it can go toe-to-toe with the big boys in 3 or 4 months, they'll have something.

    That said, congrats to ATI - I love competition in the marketplace. Now if only they could write some decent drivers for once in their lives.
    • Re:It should! (Score:3, Insightful)

      by _UnderTow_ ( 86073 )
      As I see it, the R250 and R300 are ATI's answer to the GF4 line. The Radeon 8500 was meant to compete with the GF3 line, but that hasn't stopped everyone from comparing it to the GF4s. Now the R300 will be out a few months before nVidia's next part, so why not compare it to current cards?

      If you don't compare it to current cards you don't have a frame of reference for how powerful the new cards are.
    • two things:
      Now if only they could write some decent drivers for once in their lives.
      This is true, but their latest drivers, the ones they're branding "Catalyst", are supposedly very, very solid. Time will tell, naturally, but most of the tech sites have had nice things to say about them. (Incidentally, better quality or not, I still can't fscking believe they're branding their drivers... oh, well.)

      Secondly, about your sig: the "Linux is only free if your time is worth nothing" quote is NOT anonymous. It's from jwz [jwz.org], a rather famous (among geeks) Linux user. ;)
      • Incidentally, better quality or not, I still can't fscking believe they're branding their drivers... oh, well.

        Why not? Is it any different from nVidia and their "Detonator" drivers?

        And the current Catalyst drivers have been having some issues with Neverwinter Nights, often requiring that they be uninstalled and the user revert to an older version of the drivers.

        My main contention is that, since they seem to have dropped the pretense that their drivers for the 8500 and up were going to be binary compatible with newer cards, they will orphan the 8500 series drivers. Yes, it can happen; ask anyone who has a Rage Maxx and Windows 2000.

        • >> ask anyone who has a Rage Maxx and Windows 2000

          That was a somewhat different issue, as it had as much to do with how Win2k handled the two GPUs on the card together as with how ATI designed the Rage Maxx. If you check now, you might find it surprising how many of ATI's legacy cards have had newer drivers released. I've even found W2K drivers for my old AIW Pro card, which most people thought ATI abandoned close to a decade ago.

          ATI still has a ways to go, but their current level of driver support has gone from non-existent to visible, which is an unbelievable improvement from the viewpoint of most of their users. If this keeps up, they might even be able to claim a consistent release schedule some time soon.

          This is also less of a break than it was when ATI went from the Rage 128 series to the Radeon, as the R300 is still based on the same design as the original Radeon. They may never be able to claim a complete unified driver (UD) model for all their cards, but there is still some uniformity that can carry through all generations of the Radeon.
    • Re:It should! (Score:3, Insightful)

      by Dalroth ( 85450 )
      It's a game of leapfrog. It doesn't matter who had the best performance first, or who has the best performance now. It doesn't matter whose generation of cards is compared with whose generation of cards.

      All that matters is who has the best cost/performance ratio (right now), and who has the best performance come Christmas time, when people really start spending money.
    • Re:It should! (Score:3, Insightful)

      by bogie ( 31020 )
      "Trouncing current cards is a big yawn, but if it can go toe-to-toe with the big boys in 3 or 4 months they'll have something"

      Nice logic. What are you, some sort of nVidia fanboy?
      So when the NV30 comes out (it's more like 5 months away), is it OK for the ATI people to say "yawn, big deal, put it up against our next card in 3 or 4 months"?

      Disrespect ATI all you want, but don't act like a card that will crush the top-of-the-line Nvidia for several months to come is just something to take for granted.

  • Now the Radeon 8500 will come down in price enough for me to afford one. Sweet!
  • by cOdEgUru ( 181536 ) on Thursday July 18, 2002 @09:40AM (#3908453) Homepage Journal
    The page you really need to go to is this [anandtech.com]. It talks about not just raw FPS, but about running UT2003 with 4X anti-aliasing enabled at 1600x1200x32. This is where ATI trounces Nvidia, with a whopping 251% faster performance.

    Though the framerates at 1600x1200 in UT2003 are not exactly playable (there go my hopes of running Doom III at 1600x1200 on this baby), ATI has finally produced a card worthy of its name.

    Nvidia has at least six months to go before they can have something to show. And running the 927 leaked build of UT2003 on a GF4 Ti 4600, you don't get playable framerates beyond 1024x768 with every detail notched up.
    • Actually, it's more like 151% faster performance.

      2.51 is 251% of 1. 1 + 1.51 (which is 151% of 1) = 2.51. Here endeth the lesson.

      You could, on the other hand, say "2.51 times as fast as the nVidia card."
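
      Since this percentage-vs-multiple mixup comes up in every review thread, here is a minimal sketch of the arithmetic in Python (the fps values are made up for illustration; only the 2.51 ratio comes from the benchmark):

        ati_fps, nvidia_fps = 25.1, 10.0   # hypothetical fps values in a 2.51:1 ratio

        ratio = ati_fps / nvidia_fps       # 2.51 -> runs at 251% OF the baseline's speed
        speedup = (ratio - 1) * 100        # 151.0 -> but only 151% FASTER than it

        print(f"{ratio:.2f}x as fast = {speedup:.0f}% faster")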
  • The only problem (Score:3, Informative)

    by vlad_petric ( 94134 ) on Thursday July 18, 2002 @09:45AM (#3908504) Homepage
    How long will we have to wait for good XFree86 drivers? ATI has a very bad record with drivers, even on the Windoze platforms. For them, Linux doesn't even exist, so my bet is that it'll take at least a year 'till we get decent drivers for these babies.

    The Raven.

    • by wowbagger ( 69688 ) on Thursday July 18, 2002 @10:00AM (#3908619) Homepage Journal
      <voice="BillClinton">That depends on what your definition of drivers is....</voice>

      If by "drivers" you mean "closed source drivers for the FireGL card based on this chip that support all the card's rendering features, but none of the video capture or tuner functions of the inevitable AIW version", then I would guess about 8 months.

      If you mean "closed source drivers that support all the rendering, video capture, tuner, etc. functions of this card" from ATI, then I suggest you monitor Mr. Andy Krist's credit cards for purchases of cold weather gear - this will happen about the same time the MBA selects Dr. Hawking as a star player.

      If you mean "Open source drivers that support some of the rendering, none of the video capture, and none of the tuner", then I would guess about 18 months.

      Sad but true. A pity - were there to be good drivers for this card (good = open source, all features supported by the standard APIs (Xv, Video4Linux2, DRI)) then I would pay up to $500 for one.

      Now, the question is, what about all the Mac owners?
      • For the most part, you're sadly correct on much of what you said.

        On the bright side, though, Linux actually has working tuner and capture drivers for a lot of ATI hardware here at the gatos project. [sf.net]
        • The drivers for the ATI 7500 AIW aren't working very well with the latest DRI code right now, and there are problems with the I2C bus on the ATI card - it's not documented where the bits for it are. ATI has been asked to provide this information, and last I heard, it had not been provided.
      • Perhaps you meant N BA? I'd imagine Hawking is smart enough to take some MIS classes at Wharton...

        Although, with his new exoskeleton [theonion.com], I'd think Hawking could take anybody on a little pick-n'-roll...
      • this will happen about the same time the MBA selects Dr. Hawking as a star player.

        MBA? What is that? Someone with a Master's degree in Business Administration? Wacky.

        Oh, maybe you meant NBA, the National Basketball Association. In that case, your presumption isn't too far-fetched. Dr. Hawking already has a powered exoskeleton [theonion.com] he can use to fight crime AND play basketball. So I guess those ATI drivers are just around the corner!
  • Linux drivers? (Score:2, Interesting)

    by Brummund ( 447393 )
    Will there be Linux drivers for these cards supporting 3D acceleration/OpenGL under XFree86, so I can play RtCW and Flightgear on my favorite platform?

    If so, count me in. Otherwise, I'll stick to NVidia.

    • Re:Linux drivers? (Score:2, Insightful)

      by davros74 ( 194914 )
      I don't see why there won't be. I play RtCW under Linux/XFree86 with my Radeon VIVO and it runs just fine at 1280x1024x32. The composite video in also works, as well as video capture. Give the GATOS people some credit, their Radeon drivers have performed better for me than any ATI Windows or OS/2 (back in the day) drivers ever did.

      It really isn't a question of will _ATI_ release linux drivers, but will they release enough documentation so folks like GATOS can implement a driver in a reasonable amount of time.
  • by MtViewGuy ( 197597 ) on Thursday July 18, 2002 @09:47AM (#3908520)
    If you think the Radeon 9700 is amazing, just wait till ATI produces the R300 chipset in a 0.13 micron process version and cranks the graphics card clock speed up to likely well over 400 MHz.

    Slow it definitely won't be. :-)
    • And if you had read the Anandtech review, you'd see that he comments on these rumors - that they're most likely baseless.

      The only product scheduled to come out of ATI by Q4 is the 9500, which is a slower, stripped down R300 for less money.

      And by that time it'll have to compete against the NV30, which is allegedly going to blow the R300 away (as it should given the time differences involved).
      • However, given that the current R300 chipset is built on a 0.15 micron process, it doesn't take much to figure out that ATI may produce a 0.13 micron R300 chip, which would allow cooler operation and/or faster clock speeds. The current 0.15 micron R300 chip requires a big heatsink/fan to cool it and extra power drawn through a floppy-drive power connector from the main system power supply, something ATI may want to dispense with on a cooler-running R300 chip.

        I can imagine the nVidia NV30 (née GeForce5) chip is probably going to need the floppy-drive power connection too, given its even higher on-die transistor count than the ATI R300 chip.

        By the way, the Intel Northwood Pentium 4s are well-liked because the switch to the 0.13 micron process allowed Intel to run a much cooler CPU, which in turn allowed Intel to crank the CPU clock speed up to 2.53 GHz.
        • Yes, but who are they going to have fab a 0.13 um chip? TSMC is (allegedly) going to be full, with nVidia taking up the majority of their capacity. I'm not in the fab business anymore, so I don't follow who's using whom. And a 0.13 um fab isn't something you find lying around on the corner. Maybe someone has capacity, but will they be willing to lease it out at terms that are agreeable to ATI?

          Odds are they're stuck with .15 um for 6 months or more.

          The NV30 probably will need additional power as well - I don't expect a 0.02 um change to reduce power consumption enough to eliminate the need while at the same time adding 10-15% more transistors.

          I'd expect the next revision of AGP to seriously bolster the power available from the bus, though. 3dfx hit the wall 2 years ago, and now the non-monstrous die sizes are hitting it too.
          • There are a number of fabs that could probably go to 0.13 micron process fairly quickly--SGS-Thomson, IBM Microelectronics, or excess capacity from AMD.

            Given the fact that the 0.13 micron R300 variant isn't going to need huge production capacity, I think there is fab capacity around that could make the chip.
  • You see this is what upsets me about technology today.

    What we need to clue into is that, for marketing reasons, ATI wanted to get the chip running at 300MHz. They didn't care about the possible performance loss; all the marketing assholes want is a high MHz figure, and they had to take a "different approach" (meaning a shittier design) to reach the 300MHz mark.

    The sooner the average joe can accept that MHz no longer equals performance... the better off chip design will be.

    The Pentium 4 is basically less efficient than a Pentium 3; however, 2GHz makes morons happy. So 2GHz, whatever the cost!

    Noodle.

      You don't seem to understand the direct relationship between MHz and pixel/texel fill rates in video cards.

      The Parhelia gets beaten in DX8 benchmarks by the GF4 Ti because of the difference in clock speeds. If the 9700/10000 wants to compete with the NV30 at the top, then raw MHz is needed. The DX9 specs state that both cards need 8 pixel pipelines for compliance (and both ATI and Nvidia say their next-gen cards comply), which means whoever has the highest clock rate will have the highest pixel fill rate. We need to wait and see if the NV30 has more than one texture unit per pipeline; the R300 only has one (for a total of 8 texture units), which means that if the NV30 has two or more per pipeline, it will beat the R300 in texel fill rates by architecture alone (though more than one texture unit per pipeline would need a HUGE amount of memory bandwidth, which is not likely to happen until DDR-II is utilized).

      So MHz does equal performance... which can mean a marketing success or failure. The Parhelia is not great for gamers (other than TripleHead) due to low clock speeds. Nvidia has delayed the NV30 to make use of the 0.13u process to get higher clock speeds (rumoured to be 400+), and ATI plans on releasing the Radeon 10000 next year based on a 0.13u process as well, to try and beat the NV30 if it proves to outperform the 9700... which is expected to happen.

      - HeXa
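
      HeXa's fill-rate point is just multiplication. A minimal Python sketch with the pipeline counts claimed above (the 325 MHz core clock is an assumption for illustration, not a confirmed spec):

        core_clock_hz = 325e6         # assumed core clock, illustration only
        pixel_pipelines = 8           # both next-gen parts claim 8 pixel pipelines
        tex_units_per_pipe = 1        # R300: one texture unit per pipeline

        pixel_fill = core_clock_hz * pixel_pipelines        # pixels/second
        texel_fill = pixel_fill * tex_units_per_pipe        # texels/second

        print(f"pixel fill: {pixel_fill / 1e9:.1f} Gpixels/s")   # 2.6 Gpixels/s
        print(f"texel fill: {texel_fill / 1e9:.1f} Gtexels/s")   # 2.6 Gtexels/s

      Doubling the texture units per pipeline doubles texel fill at the same clock, which is the point about architecture beating raw MHz.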
    • Just a bit of food for thought...

      Intel had to REALLY stretch to get the PIII core up to 1.1GHz on a 180nm fab process, including having to recall their first attempt entirely. With the P4, they had little trouble releasing a 2.0GHz core on the exact same 180nm fab process.

      Which do you think is faster, a 1.1GHz PIII or a 2.0GHz P4? Intel's design strategy for the P4 wasn't all about marketing; being a little bit less efficient but clocking a LOT higher isn't entirely a bad thing.

      Of course, I don't know just how well this correlates to ATI's newest video card, we'll just have to wait and see.
    • How do you know it meant a shittier design? Were you in on any of the design meetings? Didn't think so.

      Now, reread the section you misquote so heavily. The reviewer was specifically discussing an Intel-like design, in that they hand-picked transistors to go in specific places rather than letting a VLSI design program lay out the chip however it thinks is best. Intel and others have shown that this strategy - hand-tuning critical junctures - can pay off in performance and manufacturing.

      Intel's chip designs have been pretty damn amazing for the past two decades. They've frequently been the ones pushing Moore's law (yeah, go ahead and take the obvious whine - "because they needed to, their chips are so inefficient"), and they've eked a helluva lot more features and performance out of designs than anyone in their right mind expected. Their fabbing is second to none, and their processes are emulated industry-wide. A 35% yield for a first-run Intel design is godawful, but would be considered spectacular at other companies.

      Have they made missteps? Yup. And I largely attribute those to upper management sticking its head deeply up its ass rather than to the engineers. Intel's brass stopped listening to their engineers 4 or 5 years ago, and it's been biting them ever since. It remains to be seen if they've figured this out yet.

      that MHz no longer equals performance

      No, it doesn't. But it's often a damn good indicator of performance, particularly in the GPU world. Frankly, the only people who know the clock speeds on these chips are the geeks who are into this stuff. They're not advertised on the packaging.
    • If you had a clue you'd realize that nobody buys graphics cards based on clock speed. They're not advertised on the box! The only people who know the clock speed are geeks who read up on it, and those people depend on benchmarks anyway!
  • "The power requirements will be more than what ATI wants to run through the AGP port, so the card will have an extra floppy-drive sized power connection."

    That's very interesting. For one thing, I don't know of many cases that come with two floppy power connectors any more. Other than that, it sounds like a good idea. Finally, the legacy floppy crap gets put to a modern purpose...
    • The picture on Anandtech clearly shows a pass-through connector with a floppy power connection in the middle. So that should solve that.

      Honestly though, the past few power supplies I've bought did have a 2nd floppy connector on them. Never figured out what the hell they'd be used for until now though.
    • I just (4 months ago) bought an InWin mid-tower ATX case (300 watt PSU) and it has 2 of the floppy-style power connectors, for whatever reason. So apparently they're still there, but I don't see why they didn't just use a regular Molex connector for power.
    • Some Firewire cards have a power connector to provide power over the bus... I've seen one or two use the floppy style power connector and a couple using the hard drive style power connector. I also think the Sound Blaster Audigy Platinum (both the regular and the eX models) use a floppy power connector to provide power for the FireWire ports...

      My older computer power supplies don't have two floppy power connectors, but my newer machine does.
  • If I change the name from "QUAKE" to "QUACK", will the performance drop by 20%?
  • I thought this might help those people who don't want to take the time to calculate the FPS from the earlier article Anandtech did. All numbers are average fps.

    Unreal Tournament 2003 (DM-Antalus) 1024x768x32 High Detail Settings
    Radeon 9700: 130.4
    GF4 Ti4600: 94.5
    Parhelia: 54.4
    Radeon 8500: 57.6

    Unreal Tournament 2003 (DM-Antalus) 1280x1024x32 High Detail Settings
    Radeon 9700: 87.8
    GF4 Ti4600: 59.3
    Parhelia: 35.1
    Radeon 8500: 37.9

    Unreal Tournament 2003 (DM-Antalus) 1600x1200x32 High Detail Settings
    Radeon 9700: 63.3
    GF4 Ti4600: 41.1
    Parhelia: 24.6
    Radeon 8500: 25.2

    Unreal Tournament 2003 (DM-Asbestos) 1024x768x32 High Detail Settings
    Radeon 9700: 210.3
    GF4 Ti4600: 178.2
    Parhelia: 100.4
    Radeon 8500: 91.1

    Unreal Tournament 2003 (DM-Asbestos) 1280x1024 High Detail Settings
    Radeon 9700: 144.3
    GF4 Ti4600: 115.4
    Parhelia: 65.5
    Radeon 8500: 58.9

    Unreal Tournament 2003 (DM-Asbestos) 1600x1200 High Detail Settings
    Radeon 9700: 104.1
    GF4 Ti4600: 82.0
    Parhelia: 46.9
    Radeon 8500: 42.0

    Jedi Knight 2 'demo jk2ffa' @ 1600x1200
    Radeon 9700: 124.3
    GF4 Ti4600: 113.0
    Parhelia: 65.9
    Radeon 8500: 93.2

    Serious Sam 2: The Second Encounter 'Little Trouble' 1024x768x32
    Radeon 9700: 115.2
    GF4 Ti4600: 100.2
    Parhelia: 67.4
    Radeon 8500: 58.2

    Serious Sam 2: The Second Encounter 'Little Trouble' 1280x1024x32
    Radeon 9700: 102.6
    GF4 Ti4600: 72.9
    Parhelia: 49.5
    Radeon 8500: 45.3

    Serious Sam 2: The Second Encounter 'Little Trouble' 1600x1200x32
    Radeon 9700: 77.6
    GF4 Ti4600: 51.7
    Parhelia: 37.3
    Radeon 8500: 32.1
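
    If you'd rather have the relative numbers than the raw fps, the conversion is one line per card. A small Python sketch using the DM-Antalus 1600x1200 row above (this is where the story's 54% figure comes from):

      results = {"Radeon 9700": 63.3, "GF4 Ti4600": 41.1,
                 "Parhelia": 24.6, "Radeon 8500": 25.2}

      baseline = results["GF4 Ti4600"]
      for card, fps in results.items():
          print(f"{card:12s} {fps:6.1f} fps ({(fps / baseline - 1) * 100:+6.1f}% vs. Ti4600)")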
  • Now, with no claims of being objective, go buy a 9700 for yourself, and show your family how much you care by buying one for all your family members.

    If the stock triples, I might be able to afford a 1987 Honda Civic with only 200,000 kilometres on the engine.
  • Many pages only have 10 or 11 sentences of text on them. Is the high number of page/banner views that Anand gets really worth the extra traffic (and annoyance) that this causes?
  • I am personally waiting to see benchmarks from retail off-the-shelf cards using the supplied drivers. ATI always seems to have good lab numbers but comes up short when the released product is tested under everyday conditions. Also, as others have pointed out, it's not too impressive comparing it to video cards that have been out for a while now. I hope that ATI can bring out a truly competitive final product, including stable drivers, and I look forward to seeing the benchmarks against a released NV30 in the next several months. Christmas could be expensive this year.
  • by Phoenix ( 2762 ) on Thursday July 18, 2002 @10:47AM (#3908952)
    Who cares if ATI is top dog until the next chip comes out? Who cares if ATI will be in second place next month? Who cares who has the current kick-ass product?

    Personally I don't, and you know why? Competition. Good clean, healthy, product innovating competition.

    Something that is sadly lacking in the desktop OS market. Not to name any names *cough*microsoft*cough*, but there is a very good example there of what having one and only one player in the field gets you: a poor-quality product with so many bugs that we've become desensitized to them. Really, who falls over stunned anymore when /. releases a story about a security hole big enough to drive a semi-trailer through? Or do we just read it and patch as needed (or mutter something to the effect of "thank god I have Linux")?

    So ATI is ahead today, so nVidia will be ahead tomorrow, so what?

    Be glad that there is more than one dog fighting over the bone

    Phoenix
  • Seems to me that if ATI is going to be making a new video card, perhaps first they should make some drivers for their old cards that actually work. I switched a while back from a Voodoo3 3500TV to a Radeon AIW. It's a really nice card: it's got lots of features, it's fast... but the drivers are such crap that it almost never works like it's supposed to. The Win98 drivers are swell, but in XP it doesn't run anything that runs fullscreen (that used to run on the 3500... XP was a fresh install, so it's not a residual driver issue), and 90% of the DirectX stuff doesn't work (including Warcraft III and other new games). Note to ATI: If you make a good card, make drivers that make it work!
  • Anand states: We didn't have much time with the Radeon 9700 so we couldn't run a full suite of AA tests nor take screen shots

    Come on, this has got to be bullshit. All it takes in most games to take a screenshot is to push Print Screen on the keyboard. All these tests, different games, different settings... it must have taken at least half a day to complete, and not once did they have time to push Print Screen?

    Just say 'ATI wouldn't let us publish screenshots' instead of lying about it. Oh, maybe you weren't allowed to say that either? Bah.

    I think it's great we'll soon have some competition in the arena, but these are really previews of things to come, previews tightly controlled by ATI, I'll wager.

  • "37% faster in Quake 3 at 1600x1200x32 on a Pentium4 2.4ghz."

    Does anyone else get the feeling there's a director of marketing at the GPU companies who just pulls a Dr. Evil inspired number out of the air and declares to the engineers that that's the FPS for Q3 they need for the next card?

    "I want our next card to not just give me a frame per second but TEN frames per second in Quake. We'll see how they like that, huh?!"
    "Uh, Sir? We already do one hundred and ninety two frames per second at maximum resolution."
    "I see. Then I want..." finger curls against his bottom lip, "....One Beeeeeeeellion frames per second."
    "But Sir, that's insane! No one can tell after about 50fps anyway!"
    "Mini-Marketing-Me, keeeeeell him!"

  • The article at Anand's says that ATI has abandoned their previous 'unified' drivers, that the drivers for the R300 are 'brand new', and that this could create compatibility issues with older games and so on...

    Now, if there's somebody here who knows more: what exactly does 'unified' mean in technical terms (not marketspeak)? How can a 'unified' driver work for vastly different cards like the GeForce1 (basic T&L), GeForce3/4 (shaders and so on), and NV30 (FP pipeline)?

    I don't see much that can be 'unified' in those architectures, and even less between the DX_8.1_and_lower and DX_9_and_higher parts, given the jump from integer to FP pipelines.

    So, is the claim of 'unified' drivers purely marketspeak (and maybe it's just a collection of 'workarounds' for specific game problems) or is there a technical case to be made for them?
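
    In other drivers, 'unified' usually means one shared codebase (the API layer, game workarounds, resource management) with per-generation back ends for the hardware-specific parts, selected once at probe time - not one code path for every chip. A hypothetical Python sketch of that structure; every name below is invented for illustration, none of it is any vendor's actual code:

      class Backend:
          """Hardware-specific hooks each chip generation must implement."""
          def upload_shader(self, program): raise NotImplementedError
          def draw(self, vertices): raise NotImplementedError

      class OldGenBackend(Backend):        # fixed-function part: no programmable shaders
          def upload_shader(self, program):
              pass                         # nothing to compile; emulate or ignore
          def draw(self, vertices):
              print("old gen: fixed-function draw")

      class NewGenBackend(Backend):        # DX9-class part: FP shader pipeline
          def upload_shader(self, program):
              print(f"new gen: compiling {program!r} to native microcode")
          def draw(self, vertices):
              print("new gen: programmable-pipeline draw")

      def probe(chip_generation):
          """The shared front end picks a back end once, at probe time."""
          return NewGenBackend() if chip_generation >= 9 else OldGenBackend()

      driver = probe(9)                    # everything above Backend is shared code
      driver.upload_shader("water.ps")
      driver.draw([(0, 0), (1, 0), (0, 1)])

    On that reading, a 'unified' driver is partly technical (a shared front end and workaround tables) and partly marketing.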
  • Not a troll, and not off topic, or at least, not intended to be.

    Guess I've just been burned too many times by crappy drivers that don't do everything that was promised. And I'm talking Windows drivers here. Not quite complete OpenGL support, games not working correctly when they come out, games still not working correctly months after they come out because they are not big enough titles to get ATI's attention...

    Now, I'm not the world's greatest NVidia fan either. I'll complain about their lack of innovation, the way they seem to just want to throw more hardware at the problem rather than find a more elegant solution, whatever. But as long as their drivers manage to play the new games, and they keep new drivers coming out so that I can play the new games when they come out, I'll take their cards, even if they are slower.

    A faster video card that doesn't play what I want it to, when I want it to, is of no use to me...

    And to bring this a bit back on topic... This is basically a warning to prospective buyers: check out ATI's track record for drivers before making a purchase. I haven't heard of too many problems with the Radeon cards, so they may have finally turned around their policy of not caring about you after you give them your money. But let's be honest here: current, correctly working drivers are more important than the gap from 120 fps to 150 fps...
  • No, that's not 51% faster. If the GF4 is normalized to 1 and the ATI card is at 1.38, that means it's running at 138% of the GF4's capabilities, or 38% faster.

    Honestly, if they'd used an fps base and you'd had to do this with Gnome's calc, I could see you screwing up. But screwing up (1.38 - 1) * 100? Ouch. You should be able to do that in your head.
  • Yeah, it's fast, but what about the drivers?
    Remember how long it took them to get the drivers right for their other Radeon? (Forgive me if I can't remember exactly which one.)
    Competition is great, but I'm not buying another ATI card until the have worthwhile Windows and Linux drivers.
    I had an All-in-Wonder Pro card, and I could never get it to run anything 3D correctly. Yes, I tried the newest drivers. I tried the experimental drivers. I tried the drivers for non-AIW cards. Nothing made it stop crashing. I couldn't run Quake III for more than a couple of minutes without my computer completely locking up.
  • I can't help but wonder if the performance stays the same with the game's executable renamed. That really concerned me when ATI tweaked their drivers a while back to score high benchmarks in only certain games (Quake 3).
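
    For anyone who missed the "Quack" episode: the driver matched the running executable's name against a table of per-game tweaks, so renaming quake3.exe to quack3.exe changed the scores (and the image quality) because the tuned path no longer matched. A hypothetical Python sketch of that kind of check - the profile table and values below are invented, not ATI's actual code:

      import os

      APP_PROFILES = {                      # invented per-game profile table
          "quake3": {"texture_detail": "reduced"},   # tuned, quality-lowered path
      }
      DEFAULT = {"texture_detail": "full"}

      def select_profile(exe_path):
          name = os.path.splitext(os.path.basename(exe_path))[0].lower()
          return APP_PROFILES.get(name, DEFAULT)

      print(select_profile("C:/Games/Quake3/quake3.exe"))   # tuned path
      print(select_profile("C:/Games/Quake3/quack3.exe"))   # default path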
