Graphics Software

Carmack On ATI's Driver Modifications 219

CitizenC points out that John Carmack's .plan file has been updated to discuss ATI's driver optimizations. If you weren't paying attention, ATI put code in their drivers to optimize for Quake3, based on the name of the executable - so when running Quake3, you'd get a (good) set of optimizations for the game, but when running the same game after changing the name of the executable, you'd get a default set of optimizations with lesser performance. Some people called this cheating since Quake3 is a typical benchmark application these days.
  • Well then... (Score:2, Insightful)

    by Wire Tap ( 61370 )
    The solution is simple: when benchmarking, don't run the executable under its default name. No reason to panic about this "cheating".
  • This is reminiscent of when manufacturers altered the BIOS of a computer to report it was faster than it was - mostly by pretending the motherboard had a level 2 cache - back in the days when memory was expensive.

    Now that there is a lot of money to be made on graphics, it shouldn't be a surprise that something similar was attempted.

    This will probably backfire on them now that it is out in the open.
    • Not even close, dude. An ATI user actually receives the performance gains while playing Quake in this case. In your example, the user is sold a "bill of goods", believing they receive a performance benefit when they actually do not.

      The real problem here is that so many of the cry babies in the world of 3D benchmarking want to benchmark xyz 3D app by proxy, accepting the Quake3 benchmark as the golden rule. Obviously, this is their own folly, and they really ought to accept responsibility for THAT rather than blaming ATI for optimizing performance for what is arguably the most popular 3D app on the market. Reading through Carmack's diatribe on the topic, I hear him saying that different apps and even different versions of the same app will perform differently. To me, this strongly suggests each pertinent application needs to be benchmarked individually, rather than by proxy. Well, if one really wants to know what is going on, that is.

      The problem is accepting the Q3 benchmark alone. Savvy buyers should also take other raw benchmarks into consideration before plunking down their cash.

  • by Anonymous Coward
    Full Plan:

    Welcome to id Software's Finger Service V1.5!

    Name: John Carmack
    Email: johnc@idsoftware.com
    Description: Programmer
    Project:
    Last Updated: 11/16/2001 23:22:17 (Central Standard Time)
    ------
    Nov 16, 2001
    -----
    Driver optimizations have been discussed a lot lately because of the quake3
    name checking in ATI's recent drivers, so I am going to lay out my
    position on the subject.

    There are many driver optimizations that are pure improvements in all cases,
    with no negative effects. The difficult decisions come up when it comes to
    "trades" of various kinds, where a change will give an increase in
    performance, but at a cost.

    Relative performance trades. Part of being a driver writer is being able to
    say "I don't care if stippled, anti-aliased points with texturing go slow",
    and optimizing accordingly. Some hardware features, like caches and
    hierarchical buffers, may be advantages on some apps, and disadvantages on
    others. Command buffer sizes often tune differently for different
    applications.

    Quality trades. There is a small amount of wiggle room in the specs for pixel
    level variability, and some performance gains can be had by leaning towards
    the minimums. Most quality trades would actually be conformance trades,
    because the results are not exactly conformant, but they still do "roughly"
    the right thing from a visual standpoint. Compressing textures automatically,
    avoiding blending of very faint transparent pixels, using a 16 bit depth
    buffer, etc. A good application will allow the user to make most of these
    choices directly, but there is good call for having driver preference panels
    to enable these types of changes on naive applications. Many drivers now
    allow you to quality trade in an opposite manner -- slowing application
    performance by turning on anti-aliasing or anisotropic texture filtering.

    Conformance trades. Most conformance trades that happen with drivers are
    unintentional, where the slower, more general fallback case just didn't get
    called when it was supposed to, because the driver didn't check for a certain
    combination to exit some specially optimized path. However, there are
    optimizations that can give performance improvements in ways that make it
    impossible to remain conformant. For example, a driver could choose to skip
    storing of a color value before it is passed on to the hardware, which would
    save a few cycles, but make it impossible to correctly answer
    glGetFloatv( GL_CURRENT_COLOR, buffer ).
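
    A minimal illustration of this particular trade, written as hypothetical
    driver code (hw_emit_color() is an assumed hardware write, not a real
    entry point):

    /* Shadow state the driver keeps so it can answer state queries. */
    typedef struct {
        float current_color[4]; /* backs GL_CURRENT_COLOR */
    } gl_state;

    void hw_emit_color(const float *c); /* assumed hardware write */

    /* Conformant path: record the color, then pass it to the hardware. */
    void color4fv_conformant(gl_state *gl, const float *c)
    {
        int i;
        for (i = 0; i < 4; i++)
            gl->current_color[i] = c[i];
        hw_emit_color(c);
    }

    /* Non-conformant path: skips the store to save a few cycles, but now
     * glGetFloatv( GL_CURRENT_COLOR, buffer ) cannot be answered correctly. */
    void color4fv_fast(gl_state *gl, const float *c)
    {
        (void)gl;
        hw_emit_color(c);
    }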

    Normally, driver writers will just pick their priorities and make the trades,
    but sometimes there will be a desire to make different trades in different
    circumstances, so as to get the best of both worlds.

    Explicit application hints are a nice way to offer different performance
    characteristics, but that requires cooperation from the application, so it
    doesn't help in an ongoing benchmark battle. OpenGL's glHint() call is the
    right thought, but not really set up as flexibly as you would like. Explicit
    extensions are probably the right way to expose performance trades, but it
    isn't clear to me that any conformant trade will be a big enough difference
    to add code for.
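
    For reference, the hint mechanism looks like this from the application
    side; the driver is free to ignore the request entirely, which is part
    of why it is not as flexible as you would like:

    #include <GL/gl.h>

    void request_speed_over_quality(void)
    {
        glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_FASTEST);
        glHint(GL_FOG_HINT, GL_FASTEST);
    }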

    End-user selectable optimizations. Put a selection option in the driver
    properties window to allow the user to choose which application class they
    would like to be favored in some way. This has been done many times, and is a
    reasonable way to do things. Most users would never touch the setting, so
    some applications may be slightly faster or slower than in their "optimal
    benchmark mode".

    Attempt to guess the application from app names, window strings, etc. Drivers
    are sometimes forced to do this to work around bugs in established software,
    and occasionally they will try to use this as a cue for certain optimizations.
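
    One plausible mechanism for such a name check - an illustrative guess,
    not any vendor's actual code - is that a user-mode driver DLL, loaded
    into the game's process on Windows, queries the host executable's path:

    #include <windows.h>
    #include <string.h>

    static int host_app_is(const char *exe_name)
    {
        char path[MAX_PATH];
        const char *base;

        if (!GetModuleFileNameA(NULL, path, sizeof(path)))
            return 0;
        base = strrchr(path, '\\');
        base = base ? base + 1 : path;
        return _stricmp(base, exe_name) == 0; /* case-insensitive */
    }

    /* e.g. at context creation: if (host_app_is("quake3.exe")) ... */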

    My positions:

    Making any automatic optimization based on a benchmark name is wrong. It
    subverts the purpose of benchmarking, which is to gauge how a similar class of
    applications will perform on a tested configuration, not just how the single
    application chosen as representative performs.

    It is never acceptable to have the driver automatically make a conformance
    tradeoff, even if they are positive that it won't make any difference. The
    reason is that applications evolve, and there is no guarantee that a future
    release won't have different assumptions, causing the upgrade to misbehave.
    We have seen this in practice with Quake3 and derivatives, where vendors
    assumed something about what may or may not be enabled during a compiled
    vertex array call. Most of these are just mistakes, or, occasionally,
    laziness.

    Allowing a driver to present a non-conformant option for the user to select is
    an interesting question. I know that as a developer, I would get hate mail
    from users when a point release breaks on their whiz-bang optimized driver,
    just like I do with overclocked CPUs, and I would get the same "but it works
    with everything else!" response when I tell them to put it back to normal. On
    the other hand, being able to tweak around with that sort of thing is fun for
    technically inclined users. I lean towards frowning on it, because it is a
    slippery slope from there down into "cheating drivers" of the see-through-
    walls variety.

    Quality trades are here to stay, with anti-aliasing, anisotropic texture
    filtering, and other options being positive trades that a user can make, and
    allowing various texture memory optimizations can be a very nice thing for a
    user trying to get some games to work well. However, it is still important
    that it start from a completely conformant state by default. This is one area
    where application naming can be used reasonably by the driver, to maintain
    user selected per-application modifiers.

    I'm not fanatical on any of this, because the overriding purpose of software
    is to be useful, rather than correct, but the days of game-specific mini-
    drivers that can just barely cut it are past, and we should demand more from
    the remaining vendors.

    Also, excessive optimization is the cause of quite a bit of ill user
    experience with computers. Byzantine code paths extract costs as long as they
    exist, not just as they are written.
  • It's an excuse for their normally sub-par driver quality: they can't get their generic all-purpose drivers to deliver the goods, so they instead do what they have done. They'll never live down the driver thing until they get their act together and just build some quality drivers that fully exploit their hardware.
    • Some drivers that don't reduce the system to a quivering pile of crap would be nice.

      Oh, and driver upgrades that don't cripple the whole system (like my Radeon AIW on Win2K: update the drivers and no more movies, no more DVD.... only a reinstall fixes it. Yeah yeah, Win2K, I know... it's only my multimedia box...).

      I care less about performance than I do about stability. A few fewer frames/sec is faaar less crucial than crashes and inoperable features.
      • by Sawbones ( 176430 )
        yeah yeah win2K i know... its only my multimedia box

        It's really pretty sad that we have to say stuff like this. Isn't one of the (many) mantras around here to use whatever tool works best for you (which is often argued when switching from Win2K to Linux)? For those of us who can't just "code our own", if a solution isn't available on one platform, we use another. Sheesh.

        Well, there goes my karma.
  • by mESSDan ( 302670 ) on Saturday November 17, 2001 @08:35AM (#2578135) Homepage
    Basically, here's the gist of what Carmack said:

    Uhh, it's bad, but don't quote me on that.
  • by trilucid ( 515316 ) <pparadis@havensystems.net> on Saturday November 17, 2001 @08:37AM (#2578137) Homepage Journal

    "Making any automatic optimization based on a benchmark name is wrong. It subverts the purpose of benchmarking, which is to gauge how a similar class of applications will perform on a tested configuration, not just how the single application chosen as representative performs.

    It is never acceptable to have the driver automatically make a conformance tradeoff, even if they are positive that it won't make any difference. The reason is that applications evolve, and there is no guarantee that a future release won't have different assumptions, causing the upgrade to misbehave. We have seen this in practice with Quake3 and derivatives, where vendors assumed something about what may or may not be enabled during a compiled vertex array call. Most of these are just mistakes, or, occasionally, laziness."


    Carmack seems pretty well decided on this one, and not in favor of it. He *does* show a bit of support for having super-ultra-tweak control panels on driver config screens, but that's (almost) an entirely different matter.

    It seems pretty cut and dried, at least from his perspective. I for one have got to agree with this viewpoint. And to anyone who has said or will say that you should just rename the executable: I completely agree, that's the only real way to get an objective test.

    Card manufacturers who do this sort of thing *will* get egg on their faces when they start hearing all about the crappy performance of their cards after a new game version comes out with different thinking on what's important visually. Unfortunately, as John points out, a lot of the flak will end up on the developers' doorsteps (misplaced, but a lot of gamers won't know that).

    • I think Carmack is supporting what MS has been doing for quite a long time, since Windows 3.x: compatibility mode. It detects whether a certain program is running and adjusts itself accordingly to support quirks and bugs. But a driver that deceives end users, "optimizing" on their behalf (and degrading quality as if that were good for them) without their ever knowing, is not acceptable.

      Many people would cry foul if MS attempted to "optimize" MS Office in such a way. In fact, they have been claiming that IE is faster than Netscape because it is preloaded, when that is not the case at all (it is only preloaded when Active Desktop is enabled). Netscape is bloated, and you will see the difference if you compare the load time of Netscape Messenger versus Outlook Express, in case you still insist that IE is preloaded. Outlook Express is not integrated and thus not preloaded. Of course, you get additional features like automatic virus loading, but that's a different story.


      • "I think Carmack is supporting what MS has being doing for quite a long time since Windows 3.x: Compatibility mode. It detects whether a certain program is running and adjusts itself accordingly to support quirks and bugs. But a driver that deceives the end-users without they ever knowing that it is "optimizing" on their behalf (and degrading the quality and as if it is good for them) is not acceptable."

        I wholeheartedly agree. I can even think of several cases where such "compensation" has been very valuable in solving certain engineering problems (not all related to gfx, but what the heck). It just seems they went a bit stir-crazy in their mad pursuit of benchmark glory. Of course, they're telling customers something resembling "hey, you've got it all wrong... we just wanna help you frag faster!" he he he.

        About the other stuff (IE and LookOut), it's just been so gosh darned long since I used Windows... all I can really comment on is KMail ;).

  • How does a graphics card driver know the name of the application which is using it?
  • This is sort of similar to the trick Intel did with MMX. Certain Photoshop filters rendered much faster when certain common settings were used, and not others.

    This created a big stink at the time.

    Of course, there is the question of how much of that sort of thing was really hardware dependent, vs a minor feature update that justified another round of hardware upgrades.

    And so the tactics of the marketing monster enter the picture once again. What can you do, what can you get away with, and how much does it cost ...

    • Intel doesn't make drivers for a CPU, and thus can't cheat on the drivers. And not even Intel is messed up enough to design an entire instruction set just to cheat on a couple of Photoshop filters. The real reason that only certain Photoshop features were helped is that MMX is a very limited instruction set and only a few Photoshop filters could really take advantage of it.
      • You don't understand: the same plug-in was much slower if you used (slightly) different settings.
        • It could still be the limitations of MMX. We're talking about an instruction set that shares its registers with the processor's FPU. I wouldn't be surprised if the speed increases were limited to special cases where the numbers worked out right.
    • There's a big difference between these two situations. First, with the Intel situation, any modifications to the code had to be done by the application developer. Second, there was no quality loss in the filter; the same filter produced the exact same results, only faster.

      With ATI, the drivers took control of the quality settings and achieved higher framerates by forcibly degrading image quality. These drivers were released by ATI to improve the ATI product's benchmark scores. Nobody can argue that going from 200 fps to 220 fps is worth degrading image quality, a feature for which ATI cards have come to be known and revered.

      If Intel somehow made the Photoshop filters work on every other pixel/area to cut the processing time in half, that would be analogous to ATI's situation. However, if ATI wrote drivers that set small things and had optimizations for the QIII engine that worked at any image setting, nobody would be complaining; we'd be applauding ATI for releasing good drivers.
  • Two things really surprise me:

    (1) That they didn't think anyone would notice. And that people would be a little upset when they realised that the graphics card they'd bought wasn't as quick at Half Life as hoped.

    and

    (2) They went to so much trouble. Sure, optimising the drivers so they scored well on Quake is not a bad idea. But it must have been quite a lot of work against the risk of detection.

    That said, if I want to play Quake, this is definitely the card I'll buy. I'll just make sure I don't rename any of the files.
    • They didn't do any harm. They didn't optimize the game at the expense of features, or use any other subversive method to gain benchmark points without actually making it faster. If you like quake3, you know you have a video card optimized for it.
      • The issue is: if you dropped the visual quality on an nVidia card to the same level the ATI drivers force automatically (despite what you've selected in the game), would the ATI card STILL be slightly faster than the nVidia card? I'd guess it wouldn't be, but we'll never know, because we don't know exactly WHAT the ATI drivers are modifying to make the card run faster. If I select High Quality, 32-bit textures in a game, I expect to get 32-bit textures; I don't expect my video card's drivers to decide that since it's Quake 3, 16-bit textures will be just fine.
    • They no longer have trade-off optimisations (sacrificing quality for more performance) for quake3.exe.

      Instead they now have genuine optimisations for the Quake engine, so any program (no matter what the executable is called) that uses the Quake engine gets genuine optimisations.

      I read a couple of reviews of the latest drivers, and they said that the Radeons (using the new drivers) do perform better against the GeForce 3 in all (tested) Quake engine games, relative to how the Radeon compares with the GeForce 3 in non-Quake engine games.

      Plus there's no image compromise this time.
  • by brianvan ( 42539 ) on Saturday November 17, 2001 @08:57AM (#2578156)
    It's not surprising to see ATI do something like this. In the business world, this kind of thing might pass as slightly unethical, but for the target market of graphics card vendors, this is just plain #$*&'ed up. It's quite true that Quake 3 Arena is a very standard benchmarking application nowadays, so any performance gain in the driver for that specific application that does not apply to all programs in general will mislead the consumer in making a purchasing decision. An analogy: no one would appreciate it if a company said their car goes from 0 to 60 in 3.2 seconds when in reality it takes 3.2 seconds just to make it up to 20.

    Carmack's right, this whole driver situation in general is a slippery slope. One of the biggest hassles of his job since Quake (#1) came out has been getting all the graphics card companies to play nice and write good drivers. This has not happened for one second. Some companies are better than others (Nvidia is an example of one of the better companies in the field all around), and the situation right now is FAR better than it was. But between the race for speed, trying to keep up with the most popular applications, and having to support various APIs being pushed by different 500 lb gorillas in the field... it's a mess. ATI was ALWAYS a mess with their drivers, so this doesn't really surprise me. But this isn't right, 'cause it'll lead to a day when nothing but the latest Quake game works on new graphics cards. I mean, nothing else works AT ALL. And if it's easy enough to get to that situation, I can just see bribes and payoffs... ahem, "strategic partnerships" being made to accommodate game vendors who want to publish a working 3D game at some point in time...

    But anyway...

    One small comfort is that no one buys ATI cards for performance. Any cards of the GeForce (Nvidia's brand) variety handily whip anything at the same price point from ATI. The big thing ATI has is OEM agreements, and they also sell some really exotic TV/Video Capture/MPEG recording cards that are really snazzy sometimes. I know 'cause I'm using one of 'em right now. But their drivers suck in a lot of ways, they were never the fastest, and I'd love to see them stick to a product release schedule EVER. This is the kind of company that gets wiped out when someone new on the scene releases something better/faster/cheaper.

    Unfortunately, the last time we saw better/faster/cheaper in the graphics industry was five years ago. Nowadays, you usually get one of those improvements in a new release video card... never all three.
    • Err... you are somewhat behind the times with that statement about the R8500's speed. With the new drivers, ATI has really upped the speed of the R8500 and removed the Quake 3 issue.

      The only game in which the Ti500 (around 100 quid/dollars more expensive than the R8500) beats the R8500 by any significant margin is UT. With SMOOTHVISION, the R8500 has better IQ and in many cases better performance than the Ti500.

      http://www.anandtech.com/video/showdoc.html?i=1558
    • by Hollins ( 83264 ) on Saturday November 17, 2001 @10:26AM (#2578299) Homepage

      One small comfort is that no one buys ATI cards for performance. Any cards of the GeForce (Nvidia's brand) variety handily whip anything at the same price point with ATI.

      This really isn't true any longer. ATI finally released better drivers for the 8500 this week and it keeps up fine with a GeForce3 Ti 500, for $50 less. Here's [tomshardware.com] a review at Tom's Hardware.

    • So ATI runs Quake3 at lower visual quality to get a higher FPS score. So fscking what? The Quake3 crowd [quickly dons flame suit] are the ones who killed off 3DFX because nothing mattered but RAW FRAMES anyway. Now they're upset that ATI gives them more RAW FRAMES in Quake3 by sacrificing a teensy bit of image quality?

      What's the difference? All the Quake3 players are playing at 640x480, no detail, low color depth anyway, with their video cards cooled by liquid nitrogen so that they can overclock by 400% and get 500fps, which they can CLEARLY tell from the 488fps they get from the same card when only overclocked with an aluminum 44 lb. heatsink/fan combo.

      Come on, ATI gave the crazy gamers exactly what they wanted: raw speed in Quake3 'cause that's all I play especially when showing frame rate off to my geek friends at lannies and who gives a fsck about anything else.

      And the car thing: every damn American performance car is built for and sold on the basis of a 0-60 time, meaning that the 0-100 times and quarter mile may suffer so that 0-60 marketing hype can look good. It's the same thing. [2nd flame layer on by now]

      Disclaimer: Yes, I own a Voodoo5 card. I bought it to *replace* a Geforce2 Ultra early this year after seeing the much better image quality, esp. FSAA, at a friend's house. Yes, I also own a foreign car. Down with apple pie!
      • by Namarrgon ( 105036 ) on Saturday November 17, 2001 @12:55PM (#2578716) Homepage
        The two problems with ATI's "optimisations" and their quality tradeoffs are:

        a) There's no way to turn them off, except by hex-editing the app. They happen automatically, and without the player (or reviewer) even realising, especially in the high-speed benchmark mode.

        b) This is not just any old game, not even a particularly heavily played game these days. Its major importance is as the #1 benchmark used by gaming sites.

        The conclusion is inescapable. This "optimisation" was not made for players, it was made to subvert benchmarks, pure & simple.

        And if you claim to prefer a higher image quality, take a look at what ATI has actually done [tech-report.com] to the visual quality of the game!

        • Yes, but you missed my point: there is nobody left playing Quake3 but the people who turn ALL IMAGE FEATURES off anyway because while they're playing, they're bragging about their frame rate, their overclock, etc. I just don't play Quake3 period. It's a boring game, my cash was wasted. I generally enjoy playing games that "hardcore gamers" hate with a passion. Ultima IX, Mask of Eternity, RealMyst, Rune, etc. And in these, ATI doesn't dick around with image quality. They turned the quality down in Quake3 because they realized that none of the Quake3 players would notice. Unfortunately, someone else did, and they told the Quake3 players and now in true dick-size-competition fashion they're all upset about it...

          Neener. I can enjoy myself without having to constantly show off the size of my balls with faster frame rates, bigger cards, and borrrrriing games like Quake3.
          • I'm all for ATI focussing their efforts on higher-quality for slower-paced games, if they choose to. But I think you missed my point: This is not just a game, it's a benchmark, and these tricks mislead people looking for the performance of the card.

            Again, if this hack was obvious & could be turned off, that'd be just fine. Reviewers could still use a known configuration & get reliable numbers, and hardcore gamers would be happy too.

            But Quake3 already has a texture quality setting. Hardcore gamers turn it down, as you say, but reviewers turn it up to max for a reason - to evaluate the card's ability to process high-quality frames at speed. And in fact, this hack makes no perceivable difference at low-texture settings, but it makes a huge difference at the high-texture settings used primarily by benchmarks.

            Don't you think that's a hell of a coincidence?

          • You're missing my point. I'm saying that if you're using Quake3 as your benchmark, you don't care about anything but frame rates anyway and you get what you deserve.

            If you really care about image quality, you're sure as hell not paying attention to any of the Quake3 benchmark numbers because reality has taught us that 3D benchmarks are utterly meaningless except if you're an I-don't-care-about-quality 3D gamer anyway.
  • Let me get this straight. ATI takes the time to create optimizations in their drivers that make Quake III run faster, and people are unhappy about it?

    I'm not implying that ATI did it in a selfless manner; enlightened self interest is a good thing. ATI does well in reviews and Quake III players that buy ATI cards get faster operation. Other than the competition, who loses here?

    It's not as if ATI contracted with id to make other games slower. They just chose to optimize for the common case. There's a phrase to describe that type of choice: Good engineering.

    There's another as well, that I suspect may be a part of the "controversy" here: Good business. And as we all know, business is bad.

    <sigh>

    Maybe I'm wrong, and folks just haven't taken the time to think about this issue and instead are reacting w/o understanding. Frankly, I'm not sure which thought depresses me more.

    • by Mark J Tilford ( 186 ) on Saturday November 17, 2001 @09:18AM (#2578180)
      What they did was not "Have Quake run in mode which is faster" but "Have Quake run in mode which has lower quality than it should so it will run faster"
        • Actually, that's *not* what they did. Carmack makes their hack quite clear. Like all applications, Quake only uses a limited subset of the OpenGL API. Because of that, the graphics driver can make certain assumptions that would normally break certain features, but really don't hurt anything, since Quake doesn't use them. Thus, the program can run faster. Still, it's pretty devious nonetheless. If they came up with some nifty hacks to make Quake 3 faster on their (cruddy) drivers, then they should have released this as a feature, not hidden it. The fact that it is hidden is what makes people think it is just there to cheat on benchmarks.
        • That brings up this question:

          If the gains come from assuming certain routines are not used, why can't the game design houses share - not source (open source drivers for commercial hardware would be a bad thing IMO), but a simple list of what OpenGL calls they use, so that the driver can adapt itself accordingly for -any- program? I'm talking about a paper that says, "we use this, this and this" so the driver could adapt itself around the missing calls. (A sketch of the idea follows below.)

          I'm sure that UT in OpenGL mode doesn't use all the OpenGL calls, nor does any other 3d engine.
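
          A sketch of what such a published list might look like - purely
          hypothetical, since no such OpenGL mechanism exists:

          /* Hypothetical "GL usage manifest" a game could publish so a
           * driver can adapt to ANY engine, not just recognized .exe names. */
          static const char *engine_gl_calls[] = {
              "glDrawElements",
              "glLockArraysEXT", /* compiled vertex arrays */
              "glBlendFunc",
              "glTexEnvf",
              /* ...the rest of the calls the engine actually makes... */
              0
          };
          /* A driver could read this and, for example, drop shadow state
           * for queries the engine never issues. */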
      • Actually, if you go look at the information that was published (HardOCP and Tech-Report), you'll see that ATI DID sacrifice image quality for speed, only in the case of Quake3. To give some background, in the early 90's a card company tried to do this to PC Magazine's benchmarks by hard-coding the text fill string used by PC Magazine into their BIOS. In principle, this is no different.

        This is all fairly moot at this point, since ATI has fixed the drivers so they don't screw with the default texture settings when you are running quake3.

        However, the damage is done. And I know I sure as hell won't be buying an ATI card for a while.
    • Read the old article (Score:2, Informative)

      by mESSDan ( 302670 )
      The issue wasn't that it made Quake 3 faster and nothing else; the issue was that it made it faster by degrading its visual quality, and did it without informing the user.

      old article:
      ATI Drivers Geared For Quake 3? [slashdot.org]
      Or if you hate clicking and want to cut-n-paste:
      http://slashdot.org/article.pl?sid=01/10/24/1643204&mode=thread
    • The point is that ATI weren't just optimising their code to run faster in Quake3; they were forcing Quake3 to use low-quality textures etc. in order to make it run faster. This impacts the user's experience and makes things difficult for developers.
      Imagine an inexperienced user with an ATI card, loading up quake3 and say UT. In UT he/she gets smooth high-quality textures, and in Quake3 shabby low-quality stuff. Who gets the blame? Not ATI.

      It also makes it more difficult to develop in general when driver writers don't stick to standards.

      It'd have been fine to allow the low-quality/high-performance setting as an option; it's not fine to force users to use it.
    • I'm not implying that ATI did it in a selfless manner; enlightened self interest is a good thing. ATI does well in reviews and Quake III players that buy ATI cards get faster operation. Other than the competition, who loses here?
      It is in fact their motivation which is questionable, and it does not appear to be enlightened self-interest.

      If they were optimizing for Quake 3 because that is where their largest audience was, and therefore they could satisfy the greatest number of their customers in that way, then yes, I would have a hard time faulting their actions from the standpoint of the free market (although I still wouldn't buy an ATI card, because that is a stupid engineering decision). However, this is not why they optimized for Quake 3.

      Confining the discussion solely to first-person shooters, it should be pointed out that Quake 3 is *not* the most popular game in this genre. Gamespy.com [gamespy.com] tracks playing statistics -- the numbers they display for today have been consistent for some months (excepting the addition of RtCW). I presume Carmack didn't want to dwell on this at great length in his .plan, and who can blame him? Quake 3's failures are not failures of technology (for which he is responsible) but of gameplay (for which he is not). The reason that ATI optimized for Quake 3 is that it is used as a benchmarking tool. Their expectation, probably, is that gamers will see relatively high benchmarks for Quake 3 in reviews of ATI cards, and generalize from that to assume that all OpenGL games will enjoy excellent performance on ATI's card. This is likely a false assumption for anything other than Quake 3 or a Quake 3 mod. (Even games based on the Quake 3 engine will not share the performance benefits, unless you rename their executable -- and possibly not even then, depending on how modified the engine is.)

      So in other words, the problem is not that ATI has cheated on their drivers to please the massive Quake 3 community. The problem is that they have cheated on their drivers to deceive reviewers, and (they hope) customers who read reviews. This is "self-interest," to be sure, but I do not find it to be "enlightened."

    • by Penrif ( 33473 ) on Saturday November 17, 2001 @10:03AM (#2578249) Homepage
      Let me get this straight. ATI takes the time to create optimizations in their drivers that make Quake III run faster, and people are unhappy about it?

      You're damn right I am. As someone who uses graphics hardware for scientific visualization, any time hardware manufacturers spend on making games faster is a complete waste. With the latest cards and drivers from ATI, we've seen this problem en masse. Games run great, but code that we write (that's general OpenGL and works great on GeForce cards) runs slower than hell on the 8500.

      So, if all ATI wants to do is sell cards to gamers (and that's certainly a good market), they can go right on ahead and do that. Nvidia takes more time with their drivers and makes them much more optimized for the general case... so they've got my business for now.

      Just my thoughts. I'm going to skip the part where I tell everyone else that they're being knee-jerk.
      • Without games, there wouldn't be consumer-level 3D graphics cards. You just don't get the economies of scale to be able to produce a 100 dollar 3D card.

        Games are the reason that you're not spending 10's/100's of thousands of dollars on SGI equipment.
    • The only reason this is bad is that they might have misled people into buying their card because of Quake benchmarks. People base their buying decisions heavily on Quake framerates, whether or not they're going to run Quake all the time. So they look at a review, see ATI runs Quake quickly, buy the card, take it home, try playing UT and wonder why they're not getting the same framerate.

      They could have released it as a feature: "ATI drivers especially optimized for Quake!" and people who care a lot about Quake would have been happy to buy it, and perhaps some people who didn't care as much would have bought it as well.

      I think it would be a great idea if more graphics card manufacturers tried to specifically optimize their drivers for some of the most popular games out there. After all, if half your clientele is playing the same thing, there's no reason that shouldn't be part of the driver. Except it has to be publicized properly.

      • Maybe I'm wrong, and folks just haven't taken the time to think about this issue and instead are reacting w/o understanding

      Are you a games or gfx developer? If not, then please feel free to take a long walk off a short pier. This ATI driver kludge blows my stack; when I tell the driver what to do, I expect it to damn well do it and not to second guess me without even having the common courtesy to document that it's doing so.

      Picture the situation where I write a generic 3D engine that I use in multiple applications. Do I want the driver to decide how to act based on the application name? Do I hell. It's a support nightmare, and ATI should be roundly cuffed not for doing it per se, but for failing to be upfront that they have done it and for failing to disclose exactly what they've done.

    • As others have pointed out, the quality was lowered to gain faster frame rates. However, you also say that "They just chose to optimize for the common case".

      I'm not so sure Q3 is the common case, except in benchmarks. I believe the main reason Q3 is used is that there is a reproducible method of testing, done by running the demos. Other 3D games often don't have that feature, making it harder to do benchmarks. As Q3 is easy to benchmark, people use it as such even if Half-Life or UT are more common (Note: I have no idea which is most common).

    • Let me get this straight. ATI takes the time to create optimizations in their drivers that make Quake III run faster, and people are unhappy about it?

      [snip]

      Maybe I'm wrong, and folks just haven't taken the time to think about this issue and instead are reacting w/o understanding. Frankly, I'm not sure which thought depresses me more.

      I agree with your last paragraph. You need to understand what you're talking about before you make assumptions. Take a look at some of the graphics quality comparisons [3dcenter.de] between the original and quack executable names for Quake 3. It's pretty damn obvious that ATi forced some reduced visual settings (which are normally settable by the user in-game) in order to gain extra FPS, which is really, really bad. ATi is delivering uneven visual quality, compared to a GeForce running at better image quality, in order to get better benchmark scores.

      It's not the job of the graphics card manufacturers to dictate visual qualities; benchmarkers will benchmark each game with the same settings (when available) for all cards. When the reviewer loses control of these settings, the benchmarks become tainted and meaningless, and become just a tool for ATi marketing.

    • This guy has OBVIOUSLY not looked at comparison screenshots, which will blow your mind. Quake3 with ATI has enormously lower quality than it should; they didn't practice 'Good Engineering' or whatever this idiot is blabbing about. They cheated and got caught.

      That this is modded as 'informative' is ridiculous; this guy is barely informed.
  • Isn't it about time for Asus to release a new driver to "aid inexperienced players in single player games"? I mean it's been what? 6 months since they last tried to get those cheating-llama drivers out, about time they tried again methinks.
    • Isn't it about time for Asus to release a new driver to "aid inexperienced players in single player games"?

      Oh, good news [zeroping.com]! That's not necessary! See, now punk bitches [penny-arcade.com] of the non-Asus card-owning variety can enjoy all the benefits of superpowers in online games!

  • The results of using application X as a benchmark tell you nothing more than how fast X runs on your hardware with your drivers. If people want to know how fast hardware Y can run application X using the provided drivers, that's what they should be told -- not how fast it runs with crippled drivers.

    As for testing the general performance of a card, software-specific optimizations should, of course, be turned off, and you shouldn't test the card on one application only. In particular, you shouldn't use a so-called "benchmark" program, as these are usually poorly written, don't resemble the performance of the card in the Real World, and card manufacturers can easily optimize their drivers for these applications. Quake 3 is a nice Real World example, but it shouldn't be the only one if you're planning on doing a serious benchmark.

    Do compare the hardware's performance on several applications using different kinds of vertex submission, lighting models, amounts of textures and so on.

    As for benchmarks, I have another complaint to make: watch out for drivers that come with the framerate clamped to the monitor's vertical synchronization rate (e.g. maximum 75 fps on a 75 Hz monitor). This looks better visually (since the picture doesn't get updated while it's being drawn on your monitor), but a lot worse in benchmarks that do framerate comparisons.

  • by freaker_TuC ( 7632 ) on Saturday November 17, 2001 @09:26AM (#2578191) Homepage Journal
    So I need to rename all my games - SimCity, Theme Hospital, UT - to quake3.exe to get an optimized version? :o)

    cd \sc2000
    ren simcity.exe quake3.exe
    cd \hospital
    ren theme.exe quake3.exe
    cd \ut
    ren unreal.exe quake3.exe
    ...

    (already can see it happen!)
  • Who gives a shit about Quake 3 any more? I want to know when Doom 3 is going to be out.
  • ATI put code in their drivers to optimize for Quake3, based on the name of the executable

    That's not how I'd describe it. As I understand it, they made what Carmack called a "conformance tradeoff", arbitrarily reducing image quality to increase speed. That's not a Quake3-specific optimization, that's a Quake3-specific dirty, sneaking, benchmark-busting hack.

    • Yes, it is a Quake 3 specific optimization.... the ONLY time it does anything other than its "normal" drivering is when you launch an app called "quake3.exe". I'm sure whatever it does could be of benefit to other OpenGL apps (I dunno the technical details well enough) but it doesn't matter, because the special code will never get run for any app other than Quake3.


      I don't think you can get more specific than that =)

      • "Yes it is a Quake 3 specific optimization.... the ONLY time it does anything other than it's "normal" drivering is when you launch an app called "quake3.exe"."

        I suggest you re-read the last sentence of the post that you replied to. The poster does not dispute that the thing being discussed is Quake-specific. Instead, the poster is (correctly) disputing that the thing being discussed is an optimization. It's not. It's a quality change designed to give the illusion of all-round superior performance.

        • It gives you more frames per second. As a gamer, where FPS is often more desirable than visuals, it is an optimization. I'd rather have an extra 10 fps so my mouse is smoother and my railgun aim is better than have that lava in the corner look nice.

          Though I get your point... I guess it depends on which side of the coin you are sitting on... looks vs. speed.
      • Here's a little analogy.

        In the state where I live, it's required by law that you get your car's emissions tested annually. If the car pollutes too much, you cannot renew your registration.

        There are a few mechanics around that offer a special (and illegal) service to help your old clunker pass the pollution benchmark test. They basically adjust the timing and fuel mixture on your car to force the emissions level way down until you pass. Now, when adjusted in this way, your car will barely run, and is certainly not driveable, but it looks good on the test.

        Lowering the usability/quality of a product in a particular situation in order to improve the numbers on an arbitrary benchmark is *not* an optimization, it's a dirty hack.

  • by be-fan ( 61476 ) on Saturday November 17, 2001 @09:38AM (#2578212)
    To all you OSS zealots out there, *this* is why NVIDIA's drivers are closed source. You can bet ATI would love to put this whole driver fiasco behind them and just steal the high level OpenGL code (an OpenGL driver has to implement the whole GL API, not just hardware interfacing) from NVIDIA's ICD.
    • Why would they need the high-level OpenGL code (besides, here's a clue for you - they probably DO happen to have the same high-level code, licensed from SGI...) when it's the innards of the driver, working with the low-level pieces, that makes the card fast or slow? Since the low-level pieces are not going to be the same (nor would they likely ever be), the low-level stuff would be of little help for either company.
      • The high level OpenGL code that comes from SGI isn't terribly well optimized. It is a reference implementation meant to be feature complete and correct, not fast. NVIDIA licensed the code from SGI and has spent the last several years hacking on it, ever since the original ICD came out with the Riva TNT-1. In that time, the code has probably changed a huge amount. Also, the high level code can have a lot of influence on how a card performs. The OpenGL ICD driver is basically the entire OpenGL library. How it handles and optimizes the various calls made by the application to the driver makes a big difference in speed. For example, if the application calls for a certain set of rendering parameters, is the driver smart enough to take certain shortcuts based on those parameters to make rendering faster? A lot of this stuff is hardware independent, and the optimizations in the high level layer would help any 3D card. (A sketch of that kind of shortcut follows.)
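
        As an illustration of that kind of high-level, hardware-independent
        shortcut (all names here are invented for the sketch):

        typedef struct {
            int blend_enabled;
            int fog_enabled;
            int texture_enabled;
        } render_state;

        typedef void (*draw_fn)(const void *verts, int n);

        void draw_general(const void *verts, int n);  /* handles all state */
        void draw_tex_only(const void *verts, int n); /* common fast case */

        /* Pick a specialized path when the current state allows it.
         * Forgetting to check a flag here is exactly the kind of
         * unintentional conformance bug Carmack describes: the fast
         * path runs when it shouldn't. */
        draw_fn select_draw_path(const render_state *rs)
        {
            if (rs->texture_enabled && !rs->blend_enabled && !rs->fog_enabled)
                return draw_tex_only;
            return draw_general;
        }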
    • by Animats ( 122034 ) on Saturday November 17, 2001 @02:28PM (#2578960) Homepage
      There's another reason. NVidia has two product lines: the "GeForce" for consumers and the "Quadro" for professionals. The same chips are used for both [geocities.com]; there's a jumper on the card that identifies the model, and the driver turns off some features on the consumer model.
  • I seem to recall an article on Tom's Hardware about some new drivers being released for the GeForce (1, not 3) that provided "incredible" performance gains in Quake2. Other benchmark progs showed very little or no benefit at all, but Quake2 benchmarks went up by quite a bit. I don't know if this article is still available, and yes, I could be wrong, but it seems everyone is lambasting ATI for something others have done - namely, creating drivers aimed at a specific benchmark. Is it wrong? Yes, if you want to see the true performance of the card in a particular prog. Should they be getting this hammered for it? No, I don't believe so.

    BTW, the benchmark may have been ZDNet's old graphics benchmark prog....
    • They should be hammered if they effectively override the quality settings made by the user in the q3a settings menu. And that's what ATI's drivers did...

      It's just plain cheating. If the user is willing to sacrifice image quality for better performance he can easily change those settings himself. There is absolutely no reason why the driver should do that for him (except to get better results in benchmarks).
  • Do I care that ATI optimized the drivers for Quake? No - and I didn't buy it to play Quake.

    I bought the card because I wanted a card with good 2D image quality (something lacking in most nvidia-based cards, as I read in most reviews), good TV-out with all the tweaks (a friend of mine has an Asus Geforce3 and its TV-out looks like a joke compared to ATI's), and decent 3D performance is just the icing on the cake. Since I spend more time looking at web pages, text and Divx movies than I do "gaming", this was my card of choice.

    Do I regret the decision? Nope. The card performs respectably in every game I do play and even looks good in 3D Mark 2001... Would a hardcore gamer buy this card? I think not - but again, I'm not a hardcore gamer...
  • Do you honestly think ATI is the only one doing this? I sincerely doubt it. All the video card makers know the HW sites use X set of benchmarks and will work to ensure the best possible performance on them. Besides - who cares? It's not cheating. Having certain optimizations kick in for various popular apps is just giving your customer better performance for stuff they are likely to use.

    There are unknowns - perhaps these optimizations impact overall performance, and that's why they are selective? What would you rather have? Faster Quake performance at the expense of overall performance, or better Quake performance with the best overall performance you can get?

    This is just nitpicking to try and make news. Everyone knows the HW makers tune their systems around benchmarks - hell even the CPU makers try to do it I'm sure.

    • The optimizations impact image quality.

      Set quake3 to low quality in the in-game menu and you get similar performance as with ATI's hack. And equally degraded image quality.
      • What would you rather have? Faster Quake performance at the expense of overall performance or better Quake performance with the best overall performance you can get

      I'd rather have the driver do what I damn well tell it to do, and not second guess me based on the application name. As you clearly haven't followed this thread, feel free to take a clue check. These aren't "optimisations" they are "trade offs". The driver ignores the quality settings and drops the quality to up the frame rate, without even telling you it's doing it.

      What that means in real terms is that on kick ass hardware, the peak frame rate goes from a theoretical 150fps to 160fps (which you never get to see because of your monitor refresh) while at the same time reducing the image quality, which you will see on a big enough monitor.

      If this kludge attempted to do something sensible, like dynamically reduce texture bandwidth to increase minimum framerate in busy scenes (which is what matters), I'd be more inclined to like it, but that's not what it does at all.

      I'm a games developer (hobby, ex commercial), by the way. I do not appreciate having my engines run differently depending on what application I use them in.

  • I'm sure those people who play Quake 3 will be VERY unhappy about getting better performance. :-)~
  • The best option would be to let the users themselves set low-level details for the drivers, like different command-buffer sizes and so forth. I'm sure that more than one hardcore tweaker would drool over the possibilities. With the fanbase many games have, there would surely appear an extensive archive of "ultimate" tweaks for each game.

    Another feasible option would perhaps be to *cough*OPEN-SOURCE THE DRIVERS*cough*

    What is the current status of linux-Radeon 8500 drivers? I'm guessing that more people than me are using their 'nux boxes for 3d-gaming, and would jump at the chance of getting a card with NICE drivers (not to pick on NVdriver *cough*)
  • This just demonstrates the weakness of using an application (and a single one at that) as a hardware benchmark.

    ATI shouldn't be bashed just for optimizing its drivers. OK, so it's a software optimization, and it's geared toward a single application (albeit a popular one). It's still no different from what a lot of manufacturers are already doing with winmodems and other pieces of hardware 'optimized for windows'.

    The solution is obvious: stop using single applications/OSes/etc. as benchmarks; use real benchmark applications or, at the very least, a battery of different apps.
  • It doesn't matter if the end result is desirable or not.

    It doesn't matter if other manufacturers are doing it too.

    It is blatant benchmark manipulation (cheating) and it is dishonest, because the intent isn't to provide better performance to Quake players; it is to make their product look better in magazine benchmark shootouts.

    It is deceitful and wrong. ATI, and any other manufacturer who engages in this sort of activity, deserves every bit of flak they get for it.

    ATI products have long lagged behind the competition. Apparently, they can't build better products, so they have to resort to dirty tricks.

  • ATI is cheating (Score:3, Informative)

    by kill-1 ( 36256 ) on Saturday November 17, 2001 @10:29AM (#2578303)
    A lot of people still don't realize what ATI's "optimizations" are about. It's not an optimization specific to Quake 3, which no one could complain about. ATI's drivers forced some features to be disabled, massively sacrificing image quality, only when Quake 3 is run. It's not clear what they did exactly, but it looks like they were forcing 16-bit textures and/or using lower-resolution mipmaps. See the results here [3dcenter.de]. (The site is in German, but they have some nice detail screenshots showing the difference on mouseover.)
  • Back in the day, there were cards optimized to speed up ONE application. Radius (before it became just another name on cheap monitors) used to make cards that were specifically optimized to speed up Photoshop filters. And they sold them in droves for $500+. Others did the same thing. The difference here is that Radius TOLD you beforehand that the card was basically designed to give you 24-bit graphics (when most computers were 8-bit) and to speed up Photoshop.

    Evidently ATI never told anyone they were buying a "Quake Accelerator". And therein lies the problem.
  • If you'd read the analysis articles at all, you'd note that the trick ATI is trying to pull is NOT (good) driver optimization for Quake 3, but instead is intentionally degrading image quality to improve benchmarks. Quake3 looks like total crap with the ATI drivers; that's why it's fast.

    It's essentially forcing any game with the name quake3 to run below the minimum detail levels, regardless of what the user has selected, just in order to manipulate benchmarks.

    You can debate whether optimizing for a certain game is good or not, but that is a totally different question from what ATI is actually doing, which is intentionally manipulating benchmarks.

    I know it's hard to keep track of all the news, but before saying that some driver changes are "good optimizations", you should really check out the facts first. You can look through the comments for this article and see that most of the slashdot readers only read the headlines and initial blurb. Because of this, a lot of people are misinformed about what is really going on.

  • LINKS. (Score:5, Informative)

    by leuk_he ( 194174 ) on Saturday November 17, 2001 @11:48AM (#2578523) Homepage Journal
    If you missed the discussion in the first place.

    The Register [theregister.co.uk] : As we say, if you like the 8500's Quake III frame-rate but aren't willing to put up with the dip in image quality, buy a different card. Or wait for ATI to change its drivers, which, we understand, it's in the process of doing.

    HardOCP [hardocp.com] was the first to publish about this: The Facts As We See Them: It certainly seems to us here at [H]ardOCP that ATi has in fact included application specific instructions in their version 5.13.01.3276 Win2K drivers that make Quake 3 arena benchmarks faster by up to over 15%. Either way, the driver optimisations for Quake III are just one of the (many) factors that differentiate different vendors' products.

    firingsquad [gamers.com] shows some details of how the quack.exe is made and concludes:
    To some of us, it seems like the evidence points towards intentionally deceptive code designed not only to inflate benchmark scores, but also to keep anyone from finding out. To others, this is nothing more than an overreaction to a perfectly legitimate game optimization. In our eyes, anyone who vehemently peddles either of these explanations is either naive or pushing an agenda of their own.

    there later in Q&A with ati [gamers.com] explains in 2 pages that :Our goal for the RADEON 8500 is and always has been to deliver the best possible gaming experience to our customers.

    Yeah, right!
  • by tcc ( 140386 ) on Saturday November 17, 2001 @11:50AM (#2578528) Homepage Journal
    If they had DOCUMENTED the changes and put a button in the control panel (like where all the antialiasing on/off, buffer level and all that is), it would have been legit; they could have put it in their press release and sent it around WITHOUT getting any flames.

    There are ways to do things; you can bend the rules, but breaking them will only draw fire back. I'm all for specific-engine optimization, and like Carmack says, conformance trade-offs shouldn't be made at the driver level in general. But to this I'll say: if it's documented and available as an OPTION somewhere, it won't apply only to techies; it can be applied to Joe Schmoe as well ("if you click here, the game will run faster with a small quality degradation that you will probably not even notice at the speed of events in Quake 3"). Heck, if they can put something like Quincunx antialiasing in the panel options, you can be sure people will understand specific game optimization as well.

    Anyway, forcing something on users without their knowledge is just plain bad and lame. Of course people will end up finding out, and of course it will backfire... what is gained from doing that!? There sure is poor judgement at ATI, and there's probably a bunch of people there right now saying "see? told you so!". Management, you should listen to the people in the lower part of your food chain (i.e. QA testers); they might know more about the end-user market and hardware benchmarking scene (not the OEM, of course) than you do!
  • New ATI drivers (Score:3, Informative)

    by queequeg1 ( 180099 ) on Saturday November 17, 2001 @11:57AM (#2578548)
    Anand took a new look at the 8500 with the new ATI drivers. He claims that the image quality is now identical to the GeForce3's and that the driver optimizations work for all Quake3-engine-based games. Anandtech article [anandtech.com]
  • Carmack makes a good case for why this would be reasonable to do in certain cases.

    ATI says that they optimise their drivers for some other popular games. I would be interested to know which other games. I doubt they could list them all for PR reasons, but it would be cool if they would just list 2 or 3 other games that they optimise for.
  • by Namarrgon ( 105036 ) on Saturday November 17, 2001 @01:33PM (#2578822) Homepage
    If ATI had just optimised for a specific game, I think nearly everyone would be fine with that. Perhaps the only objection you could make is that they might have spent their time optimising for the Quake3 engine instead, covering a wider range of games and benefiting more users. When that game is a benchmark (or rather, THE primary benchmark used to determine real-world performance under a set of well-known and duplicatable conditions), then optimisations specific to that are a little more questionable.

    While many companies focus upon optimising for benchmarks, most simply optimise the driver paths for the specific cases that those benchmarks use. Any other app that uses similar settings can gain performance from that work. But ATI have made their optimisations dependent upon the name of the app, so no other apps can benefit from their work. While a Quake3 player might not mind, Q3 isn't as widely played today as it once was, and that same player might be less pleased when the card fails to perform to the same standard on any of their other games.

    But what makes this particular "optimisation" underhanded is that it's not better or more tuned code; it trades off quality. Have a look here [tech-report.com] to see the mess that it makes of textures!

    Now, if players wanted to see blurry textures in exchange for more performance, they'd simply lower the texture quality slider in Quake3 itself. ATI's drivers do this for them: they're forcing the mipmaps two levels down - a 16x reduction in texture detail (see the sketch at the end of this comment) - to get the extra speed. And this isn't optional. You can't turn it off, short of using a hex editor.

    In particular, a reviewer running the standard Quake3 High Quality benchmark will never notice the difference (as the frames go by way too fast). There are no extra sliders in the driver, no other indications or switches, and of course no notification in the driver documentation. All the reviewer sees is higher framerates, because the drivers are, quite literally, cheating. They're giving low-texture numbers on the high-texture setting.

    I for one applaud ATI's renewed efforts to improve their drivers (and I still plan to buy an 8500DV alongside my existing QuadroDCC), but I feel ATI really have attempted to subvert the benchmark process, and shot themselves in the foot. The strong implication is that, even though their hardware is fine, they don't feel they can compete with nVidia's driver team so they have to resort to methods like these.

    Very unwise - they've lost a lot of the support they had as the underdog to nVidia, through these tactics.
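
    The 16x figure follows from mipmap arithmetic: each level down halves both the width and the height, so one level down is a quarter of the texels and two levels down is a sixteenth (a 256x256 base becomes an effective 64x64: 4096 texels instead of 65536). ATI's driver forces this internally, but an application could opt into the same visual trade-off itself via the EXT_texture_lod_bias OpenGL extension, where the driver supports it. A minimal sketch in C; the function name is made up, and the bias value is the only "trick" involved:

        #include <GL/gl.h>

        /* Tokens from EXT_texture_lod_bias, for toolchains whose
         * headers lack them. */
        #ifndef GL_TEXTURE_FILTER_CONTROL_EXT
        #define GL_TEXTURE_FILTER_CONTROL_EXT 0x8500
        #define GL_TEXTURE_LOD_BIAS_EXT       0x8501
        #endif

        /* A +2.0 LOD bias makes texture sampling pick mipmap levels
         * two steps smaller than it normally would: blurrier textures,
         * far less memory bandwidth per sample. */
        static void apply_blurry_speed_hack(void)
        {
            glTexEnvf(GL_TEXTURE_FILTER_CONTROL_EXT,
                      GL_TEXTURE_LOD_BIAS_EXT, 2.0f);
        }

    The difference is that a game doing this to itself is a settings choice; a driver doing it silently, keyed to a benchmark's executable name, is a misleading benchmark result.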

  • We can all agree that nVidia has the best drivers in the industry. ATI is #2.

    Sometimes when you are #2 you try harder. ATI, being behind on drivers, has been much more open about their cards' architectures, and so XFree86 has fully open 3D drivers for the Radeon but not for nVidia cards.

    Even under Linux, the nVidia drivers perform better than the ATI drivers. But there is a chance that the open source community will improve the ATI drivers. It may take a long time -- Mozilla took years before it became really good -- but at least under XFree86, ATI may catch up or even pull ahead.

    If I were ATI, I would be paying money to one or more XFree86 developers. Not only would that mean one more developer working to improve the drivers, but that developer would also be collecting patches and integrating them. Long-term it could pay off very well. (The free software community will improve the ATI drivers no matter what, but it would go faster if ATI paid one or more people to work on the drivers full-time.)

    In fact, if I were ATI I think I would release the Windows drivers as open source! It's not as if they need to worry about nVidia stealing their code.

    P.S. If you feel strongly about free software, then a Radeon is the card for you. With a decent computer you get enough performance to play games. (Tux Racer runs great.) Maybe ATI should get Richard Stallman to endorse their 3D cards? :-)

    steveha
    ...what ATI would say if the next id title was designed to run slower on an ATI card?
    I get the feeling they wouldn't like it much, but I'd love to see it happen.
    They could always patch it, so you suckers that continue to buy ATI products shouldn't whine too loudly.
    • They'd probably say that that was unfair competition, and I'd say that it was stupid.

      ATi products are quality products - sure, the drivers are less than perfect, but for the price/performance ratio, and the feature set, it's great, and the support is excellent (I always feel like I'm dealing with a five-person company, but at least it feels like those five people really know what they're doing).

      Hell, I got an All-In-Wonder Pro when it was still a $180 card, for something like $120, just by agreeing to send them in my old video card. I figured, I had a spare one lying around, I could send that one - but no, they told me, you don't have to send US a card until you get yours.

      ATi provides quality mobile chips, and they provide video cards for all situations - I've seen AGP video cards (decent chipsets too) for less than $100 CDN, and I've seen AIW Radeons for $589, and everything in between.

      They're a good company, they've just always been a little bit behind nVidia. The Radeon was a leap, but they're only in second place now.

      Anyway, their new drivers were apparently released, and the performance is better in all Quake3-based apps, AND the image quality is comparable to the GeForce cards.

      Why would id Software piss off the second-biggest video card maker on the planet? Why burn bridges you don't have to, just to be an asshole? You don't get anywhere by stepping on toes.

      --Dan
  • Let's put it this way. Many companies cheat when it comes to "demoing" their products. And benchmarks are, for pretty much all intents and purposes, a demo of the product being benchmarked. So they're cheating. OH WELL.
