Graphics Software

Radeon 8500/GeForce3 Ti500 comparison 140

RainDog writes: "The Tech Report has put together a pretty detailed comparison of ATI's Radeon 8500 and NVIDIA's GeForce3 Titanium 500 graphics parts. Despite being incredibly thorough, the review is also a pretty entertaining read. Definitely the best comparison of these cards I've seen to date."
This discussion has been archived. No new comments can be posted.

  • Open Drivers (Score:5, Insightful)

    by riggwelter ( 84180 ) on Friday December 07, 2001 @10:11AM (#2670420) Homepage Journal
    For any Linux users looking at these cards, remember that you can get 3D hardware acceleration on the Radeon with the open source drivers, whereas you need to download the closed drivers for the NVidia card...
    • Re:Open Drivers (Score:5, Informative)

      by brunes69 ( 86786 ) <`gro.daetsriek' `ta' `todhsals'> on Friday December 07, 2001 @10:27AM (#2670499)

      Also remember that the closed source NVidia drivers are far more advanced than the open source Radeon ones, and include options like Full Screen Anti-aliasing, TwinView with TV-out or a second monitor, etc. The NVidia Linux drivers use the same core as the Windows NVidia drivers (the benefit of their Unified Driver Model), so the latest Linux drivers are usually as fast as or faster than the latest Windows Detonators.

      • But they can be difficult to get working, have caused many crashes, and don't work with SMP.
        • and don't work with SMP

          This used to be very true, but the latest, or at least the 1541, drivers work quite well with SMP. I have a dual 500 PIII that the nvidia card used to lock up all the time; now I can play all of the 3D games for hours at a time with no issues at all. I still don't like the fact that a portion of the drivers is closed source, and my G400 has better 2D, but it does work quite well.
      • Re:Open Drivers (Score:5, Interesting)

        by Junta ( 36770 ) on Friday December 07, 2001 @11:34AM (#2670783)
        Also note that if your nVidia has TV capture ability, it's not likely to have Video4Linux compatibility. ATI All-in-Wonder cards have historically had this ability, through the GATOS project.
        As far as TV-Out goes, it really annoys me how ATI protects this for the express purpose of protecting the bits that control Macrovision. I mean, under Windows there are always hacks for Macrovision even when there is no documentation available, and if you are a Linux head and want to do this, you can use the framebuffer and XFree FBDev and mplayer in console mode to achieve TV-out that sucks for normal usage (unaccelerated) but suffices for those who want to copy stuff to tape.
        All that said, the TV-Capture capabilities combined with a really nice 3D-chipset and open drivers make me want a Radeon AIW 8500DV. I'll wait til GATOS has official support though.
        The problem with binary drivers is that:
        1) You are stuck with Linux on x86. No *BSD, no alpha/powerpc/etc.
        2) You are stuck with whatever Linux kernel nVidia deems OK. This may be fine for now, but when nVidia releases new products and ceases support of older ones, and you upgrade your distro to something with, say, kernel 3.0, you're screwed because they only support the GeForce 4 and newer (a hypothetical future).
        • Well, as long as nvidia keeps using unified drivers I don't think the second problem will ever be a problem. I'm still using a TNT 1, but I use the same drivers the GeForce 3 Ti500 uses.
        • You are stuck with whatever Linux kernel nVidia deems OK

          Bzzt. Wrong. The kernel module is open source, and thus is compilable with any kernel. The only binary-only part is the X module, which will work with any version of X4. In fact, I'm using the driver on kernel 2.5.1-pre1 right now, and that sure as hell isn't listed on the NVidia site.

        • Re:Open Drivers (Score:5, Insightful)

          by aussersterne ( 212916 ) on Friday December 07, 2001 @01:49PM (#2671538) Homepage
          You're wrong about this.

          a) There is an open-source component which hooks the driver core into your kernel. As long as you have XFree86 4, you should be able to use the latest NVidia driver by issuing "make install" in the source directory. I have not had any problems with NVidia drivers yet, on any version of the kernel, and I'm now in the 2.5.x-pre series.

          b) Which brings me to support of older cards... You haven't bothered to look at the list of hardware supported by the NVidia driver, have you? You might be surprised... driver support goes all the way back to the NVidia RivaTNT... which predates Linux DRI 3D support!

          This anti-NVidia-Linux stuff is just a lot of GPL-fanatic FUD.

          I've personally owned and tried a Voodoo5 5500, a Radeon (original) and my current hardware, a Geforce2Pro, under Linux. There is no comparison in driver support/how well the cards work... The NVidia card "just works" with Linux and is as fast or faster than under Windows. By comparison, the others feel half-supported by Linux at best.
          • >The NVidia card "just works" with Linux

            So explain to me why Alan Cox and the other kernel hackers have added code to detect when a kernel has a non-free driver loaded.

            Because they refuse to troubleshoot problems caused by NVidia non-free drivers (as said by Alan Cox himself).

            The driver is working for you, and to be honest, I once had a TNT (bought because I believed that the driver was going to be open-source) and the driver also worked for me.
            But does it work with SMP? Or in other unusual situations?
            I wouldn't bet on it!
            • I am using the Nvidia drivers under 2.4.16 w/ kernel pre-emption patches, and unfortunately, it is true that this is still the best overall OpenGL implementation available for Linux (without paying; I haven't used Xig's commercial drivers. I'd like to hear comments if anyone has used them). The ATI drivers are not bad, but miss some useful extensions (Imaging, for example). The Nvidia drivers seem to (finally) work well in SMP.

              That said, I originally bought a TNT2 for my home machine when I heard about "open source" drivers for Linux being released; I was VERY annoyed with the first round of obfuscated code. I would very much like to see the Radeon 8500 supported well under linux; I would choose it for my home machine over an Nvidia board for that reason alone (and this after swearing to NEVER buy an ATI again, after dealing with their HORRIFIC old Windows drivers).

              But for my job, I simply can't recommend other than Nvidia for Linux, currently. Their drivers are simply the best available at the moment, and they do seem to be committed to supporting Linux well.

              As for the kernel hackers, Alan has said he doesn't think the Nvidia drivers are particularly bad (not the more recent ones, anyway). Just that USERS often pester the kernel list for help with problems that cannot be debugged with proprietary drivers running. They came up with a technical solution that helps them more easily weed out the drivers that cannot be debugged by them.
      • Re:Open Drivers (Score:2, Insightful)

        by shepd ( 155729 )
        And also don't forget that because the drivers are closed Nvidia can:

        - Stop making updates when linux isn't so "hot"
        - Stop you from adding features
        - Stop you from making modifications to make the binaries work with your system
        - Stop supporting newer kernel revisions
        - Stop supporting newer X releases
        - Stop support for older cards
        - Stop support for "clone" cards (should they be made)
        - Make you pay for updates

        And many, many, many other uncontrollable things.

        Fortunately, with open source, almost all of those problems can be avoided.

        I'd take slow and reliable over fast and loose any day. But that's just me.
        • Re:Open Drivers (Score:1, Flamebait)

          by brunes69 ( 86786 )

          You obviously have no idea how NVidia's cards work. Every NVidia chipset, on every platform, uses the same driver code. This totally negates points 4 and 5, and effectively negates points 1 and 6, since as long as they make Windows drivers, the cost to port them to Linux is almost zero for them, so they'll probably continue to do it. Point 4 makes no sense, since the kernel loader is open source, and compiles on any kernel. As for point 2, the drivers support the full range of features on the card (including TV out with no gay macrovision *ahem ATI lovers*), so what more could I add? As for points 3 and 5, it is up to the XFree86 team to make sure that binary XFree drivers work with new XFree versions, not NVidia.

          So that pretty much destroys your whole argument. Any more FUD?

          • Re:Open Drivers (Score:1, Flamebait)

            by shepd ( 155729 )
            >You obviously have no idea how NVidia's cards work.

            You obviously don't know how to set up a proper counter argument.

            Let's pick it apart, shall we?

            >Every NVidia chipset, on every platform, uses the same driver code.

            Yes. But closed source.

            >This totally negates points 4 and 5, and effectively negates point 1 and 6,

            It does? You mean because their cards all use the same closed-source code they can avoid releasing updates for the kernel and X?

            Wrong. Here's the counterpoint: The VGA code in the kernel is the same on every card. Yet it has undergone updates. VGA is supported by many more cards yet it made no difference. Therefore points 4 and 5 stand as regards to Nvidia cards.

            Why do you think they will keep updating their closed-source drivers because their cards are the same? You are making no sense. If they drop linux, they will drop it. Plonk! No support for any cards, old or new. Points 1 and 6 stand.

            >since as long as they make Windows drivers, the cost to port them to Linux is almost zero for them, so they'll probably continue to do it.

            Ahh, ok. I see. You are saying that because the effort involved in porting from one to the other is zero they will continue in the future.

            You obviously have no idea about the political decisions companies have been known to make. Reply with business sense next time.

            >Point 4 makes no sense, since the kernel loader is open source, and compiles on any kernel.

            You obviously don't know how the kernel works! If they change it enough, trust me, there's absolutely no way that closed source module will fit.

            Give it a couple of years. Tell me if kernel 3.0 will compile against the first revision of the closed source drivers Nvidia released, using whatever hacking you need in the open source section. If it's successful, tell me if your kernel panics when you insert the module. I think it'll break horribly.

            >As for point 2, the drivers support the full range of features on the card

            That's just peachy. I guess you put 100% trust in Nvidia, even though they've been known to perform the same dirty tricks as ATi to fake performance in their cards.

            Personally, if I can't see it work, I won't trust it.

            >(including TV out with no gay macrovision *ahem ATI lovers*)

            Great. And when they decide to change that your recourse is... ? Run an outdated version of the code?

            >so what more could I add?

            Counterpoints that make sense, perhaps?

            >As for point 3 and 5, it is up to the XFree86 team to make sure that binary XFree drivers work with new XFree versions, not NVidia.

            Now you really show you don't have a clue.

            XFree has no obligation to keep their software compatible with outdated binaries. Look how long DOS was supported in the MS Windows line and just see what it did for them (ugggh hellish nightmares coming to mind)...

            It's called X, not XP. Get with the times. Continuing support for binaries in Linux was dead from the start.

            >So that pretty much destroys your whole argument. Any more FUD?

            No, but then again I was right to start with and there's no FUD in my argument. Any more lies from you?
            • Wrong. Here's the counterpoint: The VGA code in the kernel is the same on every card. Yet it has undergone updates. VGA is supported by many more cards yet it made no difference. Therefore points 4 and 5 stand as regards to Nvidia cards.

              This paragraph makes no sense, grammatical or otherwise, so I don't know how to respond.

              Why do you think they will keep updating their closed-source drivers because their cards are the same? You are making no sense. If they drop linux, they will drop it. Plonk! No support for any cards, old or new

              This would totally not be in NVidia's best interest, and they are not idiots. It only makes sound financial sense to continue selling and supporting a product that costs a minimal amount to support.

              Give it a couple of years. Tell me if kernel 3.0 will compile against the first revision of the closed source drivers Nvidia released, using whatever hacking you need in the open source section

              See previous statement. NVidia has no reason to discontinue support for their cards.

              To condense the rest of your argument, and most of the previous, you are basically saying that "We can't trust NVidia to maintain the drivers, therefore they are bad. The only way we know for sure the drivers will be updated is to have open source ones." Well, let me put this to you: the only reason accelerated 3D works on the Radeon at all is because ATI has been providing specs to the open source community. Don't believe me? Check it out. Therefore, apply all your big, bad NVidia comments to ATI, and see where you stand. What if in the future, the Radeon 12500 XP Pro specs aren't released? Goodbye ATI 3D.

              No thanks, I'd rather have a product that works fully NOW, including FSAA and hardware T&L (yeah, ATI drivers don't have that), rather than rant on about some damn licensing problem which isn't even an issue.

              • >Wrong. Here's the counterpoint: The VGA code in the kernel is the same on every card. Yet it has undergone updates. VGA is supported by many more cards yet it made no difference. Therefore points 4 and 5 stand as regards to Nvidia cards.

                >This paragraph makes no sense, grammatical or otherwise, so I don't know how to respond.

                Sorry for the imperfect English. I was kind enough to skip over yours and was hoping you might do the same for me.

                Let me explain it.

                Let's say that for Subject A, situation 1 occurs.
                Subject B is identical to Subject A except that Subject B was created at a later time.

                Barring differences occurring over time (and in my example there have been none), situation 1 should occur to Subject B too. i.e.: VGA drivers vs. Nvidia drivers. What happened above to the VGA support will happen to Nvidia support.

                Is that a little more clear?

                >This would totally not be in NVidia's best interest, and they are not idiots. It only makes sound financial sense to continue selling and supporting a product that costs a minimal amount to support.

                I can't come up with examples right now, but it wouldn't be the first time a company has dropped support for a product even when it requires absolutely minimal effort.

                Oh wait, I just did... :-) Remember how SoundBlaster dropped support for the Cyrix processor in Windows way back when? All they had to do was recompile their drivers for the 486 but they wouldn't.

                Even today, most companies still provide 486-compatible [closed source] Windows drivers because the financial hardship imposed by compiling for another fully compatible processor is nil. Yet a company as big as Creative Labs wouldn't do it.

                If Creative Labs would do that for hardware, why not Nvidia with software?

                Nvidia might have no good reason to discontinue support. That is until Linus (or another important Linux member... I'm thinking RMS) does something to anger them (like Cyrix did to Creative). Then they'll drop Linux like a hot potato.

                >"We can't trust NVidia to maintain the drivers, therefore they are bad. The only way we know for sure the drivers will be updated is to have open source ones." Well, let me put his to you: The only reason accellerated 3D works on Raedon at all is because ATI has been providing specs to the open source community. Don't believe me?

                Oh, I believe you. But now the community has the specs and can continue support for the ATi Radeon into perpetuity.

                Nvidia users can only wish to be so lucky.

                I don't base my high-price hardware decisions on maybes and "good vibes". I like to base them on hard facts. The hard fact is that (barring a lack of knowledge) one can keep their Radeon working in Linux (and in any other operating system) for as long as they want.

                With an Nvidia card you will only have support on systems they want to support, and only for as long as they want to support it. Better hope they don't pull an S3, Number Nine, 3DFX, et al. on you!

                >What if in the future, the Radeon 12500 XP Pro specs aren't released? Goodbye ATI 3D.

                Huh? Will my original Radeon all of a sudden stop working because a new one is released?

                When they stop providing support for newer cards, this will affect future purchasing decisions. At least they can't hold me hostage into buying their new cards by not supporting older ones (or gasp! try to hold me hostage attempting to make me buy a proprietary OS if they give up on Linux).

                I think my old Radeon (the one I bought with Open Source support) will continue to work in Linux long after your card requires revisions of software that don't work with anything made that decade.

                >No thanks, I'd rather have a product that works fully NOW, including FSAA and hardware T&L (yeah, ATI drivers don't have that), rather than rant on about some damn licensing problem which isn't even an issue.

                Just like I said. Fast and loose. I prefer slow and stable.

                Pick your poison.

                Don't come crying to anyone about your latest Nvidia card not working in Linux, though. It will fall on deaf ears, just as yours are deaf to the cries of open source software.
                • There is another large aspect of NVidia you are also totally dismissing: the UDM. The UDM means I can use the original NVidia X drivers, which were released before the GeForce3 even existed, on my GeForce3. As such, any future card that follows the UDM (and all NV cards do, and most likely will continue to do) will be supported under current Linux drivers. Therefore, even if NVidia "drops linux like a hot potato", any card they make will continue to function until the end of time. Unlike future ATI cards, which one can only hope someone may possibly be able to reverse engineer.

                  • >any card they make will continue to function until the end of time.

                    You are telling me something about the hardware, not the software.

                    Yes, I agree, their hardware is nice. We are talking about software in this thread, not hardware.

                    Please stay on topic!
        • And don't forget that because Microsoft win2k for example isn't open source they can:

          -Stop making good software for it
          -Stop supporting your new hardware
          -Stop supporting your hardware of choice
          -Drop all old support because it's "cluttering" things up
          -Make you start paying for updates!

          wait lemme look at that.. lol anyone see the problem haha!

          frankly nvidia has the most stable graphics card I've seen, I'm happy with my gf2 256DDR 32meg card! :P and yes amd and gf2 work great together!

          After Christmas I'll have this in a new computer, a 1.4gig, with that card. I expect to get nice fps, and still not have these crashes that shepd speaks of, which reminds me more of my voodoo2s than an nvidia product!


          • Note:
          I may be biased as all my friends and I run one of the nvidia cards and have no problems, or wait a second, doesn't that mean we all just like not having crashes, and having consistent and good drivers? :)
          • >and still not have these crashes that shepd speaks of

            Why is it people are having such a hard time looking into the future?

            Yes, your card runs well now. Kernel 2.6 (or whatever) is coming up.

            Tell me then how much support for Nvidia you feel.
  • Yes, I am utterly biased & asking for flamage:

    Linux ATI Radeon Drivers:
    Open Source, reliable, fully featured.

    Linux NVIDIA GeForce Drivers:
    Binary only, unreliable, cause frequent hangs on many systems, not fully featured.

    'Nuff said.
    • I don't think so. Slashdot IS a pro-open source/linux/geek board. While some people just enjoy running Linux, others carefully make sure all of their software is open source, so that's useful info.

    • by cs668 ( 89484 ) <cservin&cromagnon,com> on Friday December 07, 2001 @10:23AM (#2670471)
      I have to say that I have had really good luck with the last 3 releases of the nvidia drivers for Linux - X.

      I would prefer they were open sourced. But, I am not going to slam drivers that have worked well for me just because they are not.
      • I know a number of people who have had problems with these drivers, including myself. That's why I changed to the Radeon. I also know a couple of people who have had absolutely no probs whatsoever, like yourself (lucky bastard ;-). However, I am sure that if the drivers were OS, these problems would be sorted out by the community pretty quickly.

        It's not that I am anti-anything that isn't Open Source. It's just that NVIDIA haven't really made a particular effort with the Linux drivers themselves:

        1. They haven't bothered to make an easy installation. They are obviously of the opinion of "Hey, they're running Linux, they are used to difficult installations and getting under the bonnet, therefore we shouldn't bother making it easy for them".

        2. There are obviously some fundamental problems with the drivers' stability on some systems. This is well known, but NVIDIA aren't doing much about it.

        3. Certain features have not been enabled in the Linux drivers.

        I can understand why NVIDIA have done this: Linux isn't a particularly big market for them, so there's no point wasting time & effort perfecting the drivers.

        However, if they can't be bothered to do it, there are a lot of people out there who would be willing to help - for free.

        All NVIDIA need to do is OS the drivers. They get to look like "Good Guys" & get some reliable Linux drivers - for free. Is open sourcing the drivers really going to give away proprietary secrets to the opposition? Surely they make their money from the hardware, so they can't have much to lose from OS.

        --MB
        • 1> Easy install? Umm...hello. You can grab the RPMs for SuSE, RedHat, Mandrake. rpm -i is hard? For the rest of us, grab the tarballs. Extract and make install, a little pull from your spice weasel, and bam! Debian (woody, sid) even has the packages nvidia-glx-src and nvidia-kernel-src that will build Debian packages from the downloaded tarballs.

          2> In general, those who have compiled stuff themselves experience fewer problems than those using the binary packages.

          3> What features? Read the docs. There are plenty of features there. FSAA, VSyncToBlank, Shadow Cursor, TV Out, TwinView, all there. What's missing? I should have modded you as flamebait instead of commenting, but oh well.
          • Well, it was deliberate flamebait ;-) I said so in my post. Thought I would get some replies though...

            In general, those who have compiled stuff themselves experience fewer problems than those using the binary packages.

            You can't frigging compile them yourself. I may have been talking shit elsewhere, but I do know this.

            You can grab the RPMs for SuSE, RedHat, Mandrake.

            Can't speak for the others, but the SuSE RPMs are no way just a matter of rpm -ivh ....
            Yeah, SuSE have got a nice little script, switch_to_nvidia or something, but it didn't work correctly for me. This isn't entirely SuSE's fault, as due to dumb licensing restrictions, SuSE can't put the real nvidia drivers in their distro. Furthermore, nvidia should be making their own installer if they are so bent on keeping the source closed.

            What features? Read the docs.
            Yes, you're right. Sigh. I was getting confused with the nvidia RIVA, which ain't fully featured.
        • nvidia has stated repeatedly that they cannot OS the drivers because their drivers contain a ton of code licensed from other companies. In other words, they can't OS the code for the same reason MS can't put GPL code under a proprietary license: it would be a copyright violation.
        • Regarding the comment about driver stability and "the fact that NVidia weren't doing much about it" or that "NVidia haven't made a particular effort with linux drivers".

          Absolutely not true in my experience: When I had problems with a Ge 3 card, they were extremely helpful - for example, letting me test new (unreleased) drivers.
    • When it comes to MS Windows, I've never really believed that ATI fully supports their cards in terms of driver development. At least after having both an ATI Rage Fury and a Radeon 64MB DDR.
    • Re:Biased comparison (Score:3, Informative)

      by brunes69 ( 86786 )

      Umm, you are totally wrong under almost all assumptions.

      The NVidia drivers have been totally, unwaveringly stable for me, and I have been using them for over 8 months. This I CANNOT say for previous XFree drivers I have used.

      The NVidia drivers are totally fully featured, and support a lot of things the Radeons don't (TwinView and Full Screen Anti-Aliasing, to name a few).

      Because of NVidia's Unified Driver Model, the same code core is used in both the Windows and Linux drivers (this is why new Linux drivers come out at the same time as new Detonators). This assures you of as good or better performance than what you would get on Windows.

      Next time, spread your FUD elsewhere.

      • Umm, you are totally wrong under almost all assumptions.

        * The NVidia drivers have been totally, unwaveringly stable for me, and I have been using them for over 8 months. This I CANNOT say for previous XFree drivers I have used.


        Great as it is that you've had luck with the Nvidia drivers, you'd have to be living under a rock not to know that plenty of others have not. I've wasted weeks of my life trying to get them to work with a couple of systems (although they did work the first time on system #3).

      • Umm, you are totally wrong under almost all assumptions. * The NVidia drivers have been totally, unwaveringly stable for me, and I have been using them for over 8 months. This I CANNOT say for previous XFree drivers I have used.

        Well, that's nice and dandy for you, but I cannot say the same. All the latest NVIDIA driver releases have crashed my machine in under 2 hours when using Opera/Konqueror (weirdly, galeon doesn't crash).
        And my machine isn't even VIA! It's just a normal 440GX board with nothing really new/unsupported.


        One thing I don't see being compared is visual quality in 2D, I mean those pesky RF filters. All I can say about it is that my GF2 SUCKS deeply. I'd trade it for a Matrox Millennium I if I could get some more memory into it!

    • I will update this with my personal experience

      Linux ATI Radeon Drivers:
      Open Source, fully featured, cause complete hangs of my system whenever GLX is accessed.

      Linux NVIDIA GeForce Drivers:
      Binary only, fully featured, work completely and perfectly with my system.
      • OK, let's finish this crappy debate here & now. Seems everyone has different experiences - now there's a surprise. Seeing as my parent comment was buggy, I should really post a revised version... Here goes:

        UNBIASED COMPARISON:

        Linux ATI Radeon Drivers:
        Open Source, NOT fully featured (I just checked the docs), has caused some problems on some systems.

        Linux NVIDIA GeForce Drivers:
        Binary only, fully featured (well, very very nearly almost - just checked the docs), many people have reported problems.

        Now what have we learned? SOD ALL.
        • I think a lot of crashes and hangs that users attribute to unstable video drivers are actually related to cheap ass hardware. My system will randomly lock up tight while doing 3D but I've never run across anyone else having a problem with my video card. I went for the cheapest motherboard I could find 2 years ago with the intent of upgrading to the dual processor Athlon boards when they came out, my expectation being that it wouldn't be more than 2 or 3 months.

          As soon as the Christmas gifts are paid off from the credit cards, I'm going to upgrade to a Tyan dual Athlon board. I'll give you good odds that my system hangs will disappear when I do that.

    • Look at the post #s; clearly not redundant.
    • by aussersterne ( 212916 ) on Friday December 07, 2001 @01:55PM (#2671580) Homepage
      You got it wrong. I've owned both. Truth:

      Linux ATI Radeon Drivers:
      Open Source, incomplete (no HW T&L), slower than Windows drivers, difficult to compile, unstable (prone to X hangs).

      Linux NVIDIA GeForce Drivers:
      Open-source kernel module, binary core identical to Windows drivers (Detonator UDM), complete hardware support (incl. HW T&L and FSAA), as fast as Windows drivers, available as RPM download, complete OpenGL support, and I have never once had an X hang.

      I sold my Radeon because I just couldn't get it to work right with Linux even after months of trying. I bought a GeForce2 Pro card for cheap, downloaded the NVidia drivers, and have been sailing ever since without problems or crashes.

      GPL fanatic FUDder, you are.
      • The kernel module for NVidia drivers is in no way, shape, or form open source. It consists of two parts: an open source wrapper and a binary core. The binary part is where *all* of the code lives; the wrapper is just there so that it can be run on multiple kernel versions without NVidia needing to change the core every time a new kernel is released.
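
        To make that split concrete, here is a minimal, hypothetical sketch (not NVidia's actual code; all names are invented) of how a thin open "glue" layer can sit between an OS interface that changes over time and a fixed binary core. Only the glue gets rebuilt when the surrounding kernel changes; the core ships precompiled.

        /* glue.c -- stand-in for the open-source wrapper, rebuilt per kernel.
         * The "binary core" below represents a precompiled object whose source
         * is never shipped. All names here are hypothetical. */
        #include <stdio.h>

        /* Interface the closed binary core would export (normally a .o blob). */
        int  core_init(int chip_id);              /* set up hardware state      */
        int  core_handle_ioctl(int cmd, void *p); /* dispatch accelerated calls */
        void core_shutdown(void);

        /* Open glue: adapts whatever the OS of the day expects. In a real
         * kernel module this would register devices, map PCI regions, etc.,
         * using the current kernel's APIs; the core never sees them. */
        int driver_load(int chip_id)
        {
            if (core_init(chip_id) != 0) {
                fprintf(stderr, "core refused chip %d\n", chip_id);
                return -1;
            }
            return 0;
        }

        void driver_unload(void)
        {
            core_shutdown();
        }

        /* Stub "binary core" so this sketch links and runs on its own. */
        int  core_init(int chip_id)              { (void)chip_id; return 0; }
        int  core_handle_ioctl(int cmd, void *p) { (void)cmd; (void)p; return 0; }
        void core_shutdown(void)                 { }

        int main(void)
        {
            if (driver_load(0x20) == 0) {   /* 0x20: made-up chip id */
                core_handle_ioctl(1, NULL);
                driver_unload();
                puts("glue + core linked and ran");
            }
            return 0;
        }
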
  • Radeon, definitely (Score:3, Informative)

    by Tinfoil ( 109794 ) on Friday December 07, 2001 @10:16AM (#2670437) Homepage Journal
    When I decided to replace my Asus 7700 Deluxe with something a little newer, my budget seemed to limit me to a GeForce 3 Ti200. Further research showed me I could get a Radeon 8500 for just a little more. I of course picked that up. It performs far better than a Ti200, and nearly as well as a Ti500 (better in many tests), for the price of a Ti200. Can't beat that, eh?
  • by gmhowell ( 26755 ) <gmhowell@gmail.com> on Friday December 07, 2001 @10:18AM (#2670442) Homepage Journal
    It looks like sol.exe is really gonna rock on these things.
  • by turbine216 ( 458014 ) <turbine216@NosPAm.gmail.com> on Friday December 07, 2001 @10:20AM (#2670455)
    Hopefully NVidia will wise up and drop the price on the GeForce 3 line...at a little over $200 (OEM), I can get two 8500's for the price of a single GeForce3 Ti500. And the difference is SO negligible. Since my idea of "practical uses for a video card" is not "watching 3DMark 2001 run all day", I think I can give up that imperceptible 10 FPS without any guilt.
  • The shipping drivers left out many promised features. The first driver update supported them, but from what I read, not very well. What is the current state of ATI drivers?
    • You will notice that the latest drivers make it as good as (or maybe a bit better than) the nVidia.

      It's the usual ATI thing. The first drivers suck, the second are OK, but the third are very good.

      My that sounds familiar [microsoft.com] ;)

      • And once you get under that crusty old ATI veneer of lousy drivers

        The features and performance are up to the nvidia card's. But do you know about the stability of the card (drivers)? If people say "the card is good, but the driver needs some work, I trust it will get better", _I_ do not want to buy (or recommend) it.

        It is like the Kyro 2. It promises good things, but if you look at the tests, some are omitted because the program would not run with that card. What if you bought that game that did not work? (Or worse, what if it does not run Diablo II? 8-)

        The fact that it did run all the tests in this review is good. However, the one statement about lousy drivers makes me shiver.

        My advice: don't buy a card that promises it will get good drivers, buy a card that already has good drivers.

        e.g., in other tests the 8500 did not perform antialiasing at some resolutions.

        and your point:
        Did microsoft do it ok after 3 tries? (really)
        • Well, at the time Windows 3.0 was better than anything except OS/2, which IBM refused to market properly.

          Taking a look at MS today vs IBM 12 years ago there are some interesting parallels in the corporate hubris area.

    • the 3286 (?) drivers for XP have Smoothvision enabled, Hydravision (dual monitor support) works great, though under Win2k it isn't that fantastic (98/Moron Edition is supposed to work fine as well), and the Quake3 'cheat / bug' has been fixed. All in all, they are decent drivers and a lot better than the latest NVidia experimental drivers that a large number of people report problems / crashes with.
  • by wiredog ( 43288 ) on Friday December 07, 2001 @10:23AM (#2670478) Journal
    All sorts of high-end 3-d capability in these cards means that the very good 2-d capability (which used to only be in high-end cards) is much less expensive. A card that's Good Enough(TM) for non-gamers (like me) is now incredibly inexpensive. One more step in the commoditization of the PC.

    Why do I care? Well, my father (age 72) is looking for a new PC and has budgeted $2,000 for it. He uses it for editing (of Photogrammetric Engineering & Remote Sensing [asprs.org]), web surfing, Quicken, and e-mail. He needs the best LCD monitor/card combo because his eyes are 72 years old, but any CPU that's on the market will do. Plus 256 Mb ram, any current hard drive capacity, and cd-rw.

    Remember when you couldn't get much more than the basics for 2 grand? I like Moore's Law.

    • A card that's Good Enough(TM) for non-gamers (like me) is now incredibly inexpensive.

      Was that a typo? Because there are plenty of "Good Enough" cards flying around for well under $100 these days. There are actually some really nice ones, like the GeForce MX lineup...they range from $70-$100, and they're quite capable.

    • All sorts of high-end 3-d capability in these cards means that the very good 2-d capability (which used to only be in high-end cards) is much less expensive.


      2 weeks ago I built a new server that wasn't going to run x-windows, much less any sort of games. I'd gone to 2 different Fry's, a Best Buy, and CompUSA to find a low end, PCI video card (the 2u case riser card didn't support AGP). The lowest end card I could find was a 3D TNT2 for $55.

      Just a few years ago you could easily find a cheap ~4 meg card for around $10-20. Doesn't seem to be the case anymore. Of course, I probably coulda found one on-line but I like being able to actually walk into the store for a refund/exchange if something goes wrong.
    • matrox cards usually are nowhere near the fastest in terms of 3d performance, but their 2d performance is top notch. The image quality and text clarity on a matrox card is soooo much better than anything nVidia or ATI offers it's just not even funny. Plus, matrox cards have good LCD/DVI support. (Neither of these two features is surprising given that they aim for the professional/graphics design market.) Although, for $2000, your options re: LCD monitors will be extremely limited if not nonexistent. Truthfully, I'd say go for a Trinitron tube or the like and run it at 80+Hz (you won't see any distortion or flicker) because a 21" Trinitron is probably less than a 15-17" LCD (more screen space will be nice for page layout). Oh, and caution Gramps against moving said 21" Trinitron monitor, he'd probably herniate if he tried. (Hell, I almost would and I'm 30% of his age...) ;-)
  • .... after this fiasco [hardocp.com].

    For those who don't know, ATI was basically hacking their drivers so that when someone ran Quake3, it turned down the quality of the rendering so that you could get better framerates. They did this because many 3D sites use Quake as a benchmark. More details, and a better description of what they did, can be found here [3dcenter.de].

    Now, I know, "well in Linux, this wouldn't be an issue since the drivers are open source". Well guess what, WHO CARES? If a company is going to be this underhanded with its users, I sure as hell am not going to support them.

    • by puetzk ( 98046 ) on Friday December 07, 2001 @11:09AM (#2670631) Homepage
      Face it... many games optimize special cases for specific cards, many cards optimize special cases for specific games. Mostly the cards optimize for the current generation of games (since they can't know about new games), and the games optimize for the current generation of cards (since they can't know about new cards). It's common practice, and it improves performance quite significantly. nVidia's new drivers delivered a 30% boost in performance for a lot of apps... care to guess at what they did underneath?

      Admittedly, ATI did this to a fairly unacceptable degree in this case (since there was significant image quality damage), but they probably didn't optimize Quake because it was a benchmark; they probably did so because it's a popular game full of framerate-freaks who do things like hack their drivers to turn off texturing anyway :-). Read Carmack's comments on the issue before you burn them at the stake for giving you a significant performance boost. The one thing they did wrong was not providing the ability to turn the optimization off for benchmarking.
      • This was NOT an 'optimization', it was plainly CHEATING on ATI's part to get better benchmark scores.

        If I select in q3 that I want a certain level of detail, the card better give it to me, otherwise what's the point!

        It's as if whenever you ran q3 the card defaulted to non textured polygons in 320x200 'because in this way you get 600fps', come on, please be objective, ATI blew it big time.

        I am not saying that NVidia are saints, but AFAIK people haven't found this kind of cheating in their drivers yet...

        (I actually don't own either of those brands, I own a Matrox G400Max so I consider myself fairly unbiased).
        • Just like AMD & Intel have never introduced (nearly) useless features designed to influence benchmarks? Face it, the people you're ticked at are _not_ ATi - it's ATi's _marketing_ division, & NVidia's & AMD's & Intel's & Apple's & ....
          • This is NOT a feature. Have you even _read_ the articles in question? What they did was raise the threshold to do mip-mapping so that it basically encompassed the entire playing area, but only if you were running "Quake.exe". This has the effect of giving you MUCH poorer quality than if you rename the executable to "Quack.exe" and run it again. So this is not an "optimization", it is a deliberate attempt to get higher framerates through lower quality.
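
            What that amounts to is an application-name check driving a quality knob. Purely as an illustration (this is not ATI's code; the function names, the bias value, and the exact filenames are assumptions), the pattern being described looks something like this:

            /* Illustrative sketch only -- not ATI's driver code. It shows how a
             * check on the running executable's name could silently trade texture
             * detail for frame rate: a large mip LOD bias pushes texturing onto
             * blurrier (cheaper) mip levels. Renaming the binary defeats the
             * string match, which is what the "Quack" rename demonstrated. */
            #include <ctype.h>
            #include <stdio.h>
            #include <string.h>

            /* case-insensitive filename compare */
            static int name_matches(const char *exe, const char *target)
            {
                size_t i;
                if (strlen(exe) != strlen(target))
                    return 0;
                for (i = 0; exe[i]; i++)
                    if (tolower((unsigned char)exe[i]) != tolower((unsigned char)target[i]))
                        return 0;
                return 1;
            }

            /* LOD bias the driver would apply for this process:
             * 0.0 = render what the application asked for; positive = blurrier. */
            static float lod_bias_for(const char *exe_name)
            {
                if (name_matches(exe_name, "quake3.exe"))
                    return 2.0f;   /* hypothetical value: skip ~2 levels of detail */
                return 0.0f;
            }

            int main(void)
            {
                const char *apps[] = { "quake3.exe", "quack3.exe", "ut.exe" };
                size_t i;
                for (i = 0; i < sizeof apps / sizeof apps[0]; i++)
                    printf("%-12s -> mip lod bias %.1f\n", apps[i], lod_bias_for(apps[i]));
                return 0;
            }
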

    • The last thing about ATI cards in Carmack's .plan [planetquake.com] is about "Quake3 name checking in ATI's recent drivers".
      I checked whether JC has written something about the Radeon 8500, like the 6/1/00 update [planetquake.com], but there's nothing.
      Maybe in the next update?
    • what the hell - nvidia did the same shit with the first Detonator XP, many hardware sites said so. But they didn't do it only for Quake 3, they degraded the 3D quality all over for faster drivers because they were scared of ATI's hardware.
    • If ya read toms hardware.... you would know that they got rid of that little cheat...
  • by zeno_lee ( 125322 ) on Friday December 07, 2001 @10:41AM (#2670541)
    I called ATI tech support about a week ago, because my All-In-Wonder 128 Pro wouldn't work properly with Windows 2000. Specifically, the DVD player doesn't work, and in order for Windows 2000 not to crash you need to reapply Service Pack 2, which in itself means that ATI's drivers are messing up fixes from Service Pack 2. The details aren't important, but I basically hashed the issue out with the tech for a long time and we both came to the conclusion that the drivers that ATI offers for this product don't work properly on Windows 2000. When I asked when a fix will be available, he told me that I'll just have to wait and keep on checking the website for updates.

    What kind of company sells you a non working product (driver) and tells you to wait and sit pretty while they fix it? In the meanwhile, I can't use the product for what it was advertised for.
    • Heh that's what happened to me with Sony Vaio exactly...the thing came with the foil sticker on it that said "Designed for Windows 2000 Professional", so I was like great we'll take this laptop and stick 2K on it later.

      No drivers available at this time...few months later, some drivers show up on sony's site. They don't work properly.

      Anyways $200 or so and 9 months later the laptop is still on ME which sucks for business use compared to 2K. Hehe I was quite pissed on the phone a few times also.

      Rrrrrrrrr....no more vaio's here to say the least!

      Anyways off main topic I guess. But a black mark on Sony in this company!
    • It's marketing. You bought marketing, not a product. Any company trying to compete seems to have no choice. We demand that we get the latest product. If it doesn't work, do we return it? No way, we wait for the driver. Things would be different if we returned defective products, and yes, I do consider a product defective if the marketing does not match the actual results.

    • The scary thing is that this sort of thing is more common in the computer industry than you might realise. With OSs constantly evolving in order to attract more sales, their APIs change once in a while, and third parties are left trying to support their hardware and update their software for the new OS. This is no easy task, especially when the necessary documentation is not always ready. MacOS X and Win2K both suffer from this issue, and there are still a good number of hardware devices that are left out in the cold, waiting for drivers for the new systems. Developing new drivers is certainly not cheap and most companies will concentrate on the products that are making them the most money.

      At the same time, if Apple decided to go with NVidia it was because they felt that ATI wasn't making a good enough effort when it came to developing the drivers for the MacOS - sure there were other reasons, but this is the one that is best known.
  • by WyldOne ( 29955 ) on Friday December 07, 2001 @11:03AM (#2670593) Homepage
    About time they got volumetric texturing on a 3D card. When I nuke someone I want to see the smoking hole.
  • by Anonymous Coward on Friday December 07, 2001 @11:12AM (#2670642)

    I've been looking to buy a new system (for running Linux and the BSDs), but the choice of video card is a sticking point...

    Nvidia cards are supported via binary drivers - there is also an open-source 2D driver for XF86 4.1.0. The kernel driver is open source. My fear here is that I won't be able to follow development kernels closely, as the drivers will break. Ditto for changes to Glibc. OTOH, the Nvidia drivers offer full support, including 3D w/ hardware T+L. The 2D hardware on GeForces has been lacking (i.e. blurry at higher resolutions).

    As for ATI cards: XF86 4.1.0 supports up to the Radeon 64 DDR/VIVO.

    The CVS of XF86 supports that plus the Radeon 7500 (2D+3D), and Radeon 8500 (2D only). None of the XF86 releases or CVS supports hardware T+L, and probably never will (that support is complicated to write, and ATI isn't paying Precision Insight anymore). The Radeon 64DDR is a safe choice, but not the fastest. Very good 2D clarity at high resolutions.

    Matrox G400max/450 are supported pretty well - slower than Radeons, but they work. Excellent 2D quality. The G550 is supported in CVS.

    PowerVR Kyro 1+2 drivers are being worked on by the company - they say they'll be released in February. They haven't decided whether they'll be open source or not.

    OpenGL performance and features should improve when the Mesa 4.x sources are folded into the main XF86 tree.

    Xig, the makers of the commercial Accelerated X, have now released Summit, with improved 3D support. The fastest card they support now is the Radeon 7500, with full T+L, full acceleration, and support for pretty much everything the card can do except for the TV/VIVO hardware. The only problem here is that they, bare minimum, cost $79, and the software key you buy is good for EXACTLY ONE driver, on EXACTLY ONE computer. I.e. if you change cards, or even your hardware appreciably, you're screwed.

    Bottom line: if you want open source drivers: Radeon 7500 (risky, probably haven't got all the bugs worked out yet, but fastest open source performance), Radeon 64 (stable), or Matrox G400/450/550 -- one of these together with the XFree86 CVS tree, compiling it yourself.
    • by Anonymous Coward
      uh, nvidia's drivers don't change with kernel changes. Remember that only PART of their drivers is closed source. You compile the part that is open sourced, whose Makefile automatically includes the closed source parts. See the source package for NVDriver.
      • by Anonymous Coward
        Yeah, but kernel structures and the module interface DO change from time to time, and in the past it HAS broken NVdriver. Waiting for someone to fix that crimps my style. :)

        Glibc is also evolving, and changes there can make the nvidia drivers segfault. With open source drivers, you can just recompile the whole shootin' match. (i.e. libGL, libGLU, etc.)

        Plus there's the whole _taint_ issue.

        The Nvidia driver situation doesn't totally disqualify the card for me - but it is irksome.

        With things as they are, I'd go for a Radeon 8500 or Matrox G550, or Kyro II *IF* the drivers were open source, AND of decent quality. At the moment, that isn't quite where we are.

        I really hope PowerVR releases the Kyro drivers open source. Really.
        • I have 2 different nvidia cards on two different machines (home and work, GF2 Pro and GF2 MX). I've yet to have problems with nvidia drivers breaking after an upgrade, and I always run the latest stable kernel, and Debian sid. Granted, I haven't been running Linux as my primary OS on both of them forever (maybe about 8 months now).

          I love my nvidia cards, you can't beat em if you wanna play quake3. :)
    • The only problem here is that they, bare minimum, cost $79, and the software key you buy is good for EXACTLY ONE driver, on EXACTLY ONE computer. I.e. if you change cards, or even your hardware appreciably, you're screwed.

      Yes, this is a bit annoying, but the license does not get lost if you change anything on your computer EXCEPT for the graphics card. So the license is bound to one particular graphics card model. Ok, this is annoying enough, but you get a helluva lot free updates and really superb and prompt support! And if you are into SERIOUS OpenGL usage that doesn't include only playing Quake 3, then you gotta stick with Xig. You've also gotta go with them if you want FULL OpenGL compliance, which they offer for all their drivers.
      The 2D hardware on GeForces has been lacking (i.e. blurry at higher resolutions).

      This is a result of the OEMs that make the actual boards using cheap components for the analog output. LeadTek's newest Ti offerings use much improved RAMDACs and filters, and the 2D quality rivals or surpasses Matrox or ATI cards. I've used all of them at 1600x1200x85 and can honestly say that the LeadTek Ti cards are the best looking Nvidia cards I've ever seen and can easily compare with Matrox and ATI.

      As stated, the ONLY problem with going with Nvidia is the possibility that experimental kernels will break their drivers. Their OpenGL drivers are the best in the (consumer) business, and are not too bad for "real" (CAD, etc.) work either.

      • Let's not forget Visiontek - an American company (Chicago based, and the cards are actually made in the USA) that is producing excellent alternatives to the Taiwanese giants. The 2D quality is great. One other really nice thing is they have actual tech support that you can call. Google "visiontek 6964" for a bunch of reviews of their latest.
    • I'm using the NVidia drivers on a GF2Pro. I've been tracking the 2.4 kernel series very closely (as in same day) as well as the latest NVidia driver releases, and they have yet to break when the two latest are used together. I don't know if this will hold up through 2.5... but I don't plan to track 2.5 closely until it starts to become 2.6-pre, so that worry is a long way off for me.

      But in any case, the driver has been rock solid (other than the fact that DPMS is still not correctly supported) -- it does not crash, freeze, artifact, etc., and OpenGL support is fast and excellent both for gaming (including FSAA!) and for applications not related to gaming. I've had some OpenGL-based simulations running for 72+ hours without a crash or hiccup. That is impossible with any 3D hardware+driver combination under Windows.

      The NVidia driver for Linux is a *real* piece of working, supported software which happens not to be open-source (just like their Windows drivers). Don't let anyone tell you otherwise.
  • there were a few glaring issues:

    It was pointed out that synthetic/"looking to the future" benchmarks favored the Radeon, but "real world" seemed to lean toward the GF.

    Hummm.

    Also a concern (well, maybe just for me) is that the Mac version seems non-existent. You can buy or flash the GF MX line, and older Radeons...what about the current line?
    What really tweaks my nipples is that Nvidia stated point blank that "adding bi-endian support was trivial"...sooo, why don't they make all their cards like that?

    And put a little pressure on ATI (or v/v)?

    Which raises the question, again: why is/was the Mac version more expensive than the PC version when you could flash the darn thing?

    I thought about submitting this link yesterday, but alas, I can no longer handle the "rejection".

    And did anyone else notice that the 8500 is a perfect GF2 Ultra killer? Only problem is that pesky GF3 Titanium...

    Now if only we could get Win95 and DOS drivers [slashdot.org] for these new cards.

    Moose.

    .
    • Also a concern (well, maybe just for me) is that the Mac version seems non-existent. You can buy or flash the GF MX line, and older Radeons...what about the current line?


      Generally new Mac graphics cards are introduced by Steve Jobs alongside new Macs. 18 months ago at the summer MacWorld Steve got really pissed at ATI because they announced the Mac Radeon version the day before his keynote when he was supposed to unveil it. I would hope that at the MacWorld in January we'll see Mac versions of the GF3 Titanium and/or Radeon 8500.

  • by mystery_bowler ( 472698 ) on Friday December 07, 2001 @11:53AM (#2670901) Homepage
    A friend of mine said a while back that he hoped ATI sold plenty of video cards so nVidia would have a reason to keep progressing forward. "Without competition," he said, "nVidia will just stagnate and 3d gaming will go nowhere."

    Hogwash. nVidia has a great reason to keep progressing: profit. My mother (family EQ addict) runs a TNT2-based card and pretty soon I'll be upgrading her to a GeForce 3. I run a GeForce 2 Ultra, but I imagine I'll be upgrading to something else come spring time. If nVidia didn't keep moving 3d gaming forward, there would be no need to replace your 3d card with a new one...ergo, limited amounts of repeat customers. As it is, nVidia releases a new, more powerful 3d card every six months in both high-priced and value varieties. Game developers often adopt the latest and greatest as the standard by which they'll be producing a game, so gamers always have a reason to go out and get the latest smokin' piece o' silicon.

    But I am still glad to see that there is competition out there, which probably contributes to nVidia pushing the envelope harder and faster than if there were no competitors.

    • Although they might not stagnate, the competition will tend to push innovation farther faster, or at the very least, force them to use more competitive pricing. Look at the whole Intel vs AMD scenario. Sure Intel would still innovate without the competition, but the current competitive market forces both companies to stay on their toes technically and price things competitively. Although the battle might not be necessary to keep things moving, serious competition will certainly yield better results.
  • Is there a mirror? (Score:2, Informative)

    by GOTO 10 ( 413994 )
    Appears to be /.ed.

    Personally, I've had much respect for NVidia's quality. I've hated ATI since their All-In-Wonder-128 years (bleaak!).

    Besides, this is temporary until the new GeForce4(?) comes out and the GF3 Ti drops to a price comparable to the Radeon 8500.

    And for those of you who complain that ATI is like Microsoft (ie: trying to make a buck first), may I remind you that NVidia purposely forced mobo manufacturers to keep the price of new mobos with their top-end n420 chipset *under* $180, when the manufacturers said "Hey, you could charge over $200 retail for this thing!"

    Just my $.02
    • by Anonymous Coward
      GF3 Ti 500 and Radeon 8500 came out at the same time. GF4 and Radeon 9xxx (known as R300) will come out at about the same time as well, so when the Ti 500 drops in price, so will the 8500.
    • by ergo98 ( 9391 )

      I have great respect for nvidia too, but I certainly don't want to take the ball and go home: they need competition, and it is fantastic that ATI stepped up to the plate where so many other companies faltered (S3, 3dfx, Matrox, Trident, Cirrus Logic). On top of that, the local price for a Radeon 8500 64MB Retail is $390 CDN, compared to almost $600 for a GeForce 3 Ti500: I have to confess that the Radeon 8500 is at the top of my list right now (so long as they don't try "optimizing" Q3 again). The OEM down-clocked version of it is going for $300 CDN. Those prices are fantastic, and the reality is that the cutting edge in consumer grade graphics cards was always around the $300 mark until nvidia started losing competition, at which point it has ebbed upwards of $600 now (when every other computer component, from monitor to hard drive, has dropped in price for the latest and greatest).

  • by Billly Gates ( 198444 ) on Friday December 07, 2001 @12:27PM (#2671106) Journal
    I bet debian 3.1 will include the drivers and will also include new and innovative things like the 2.4 kernel and KDE 2.x! YIPEEEE!!
    • I don't get the joke. I'm a Debian user, and I have a computer with a Radeon DDR 32MB board. It's running Debian (the "unstable" branch). It has a recent 2.4.x kernel, the latest stuff from Xfree86, and 3D just works. The performance definitely isn't as good as the Windows drivers (no hardware T&L under Xfree86 yet) but everything just works, and it's pure Debian.

      Well, maybe I do get the joke: the Debian "stable" branch is legendary for being behind the times. But most of the people running Debian on a desktop are using "unstable" or "testing", not "stable"; stable is more often found on servers. And Debian stable is also legendary for being, well, stable.

      By the way, the current version of KDE in unstable is 2.2.2. The next stable version of Debian will therefore have at least that recent a version of KDE.

      Debian has a sort of split personality: the stable branch has aging packages, with bug fixes and security patches lovingly applied (and back-ported from newer software versions if necessary). Meanwhile, the unstable branch is always right up on the bleeding edge, with the latest packages arriving within a day or two of the upstream release.

      steveha
  • For those of you who haven't been paying attention, ATI got caught with their hand in the cookie jar, literally. They blatantly attempted to cheat on benchmarks, lowering the quality of Quake3 to improve performance in it, since Quake3 is used as a benchmark by many sites. Rename quake3 to quak3 and you get back the lost image quality and see the real (and slower) framerates.

    Add to that HORRENDOUS driver support and driver issues across all their platforms. I don't know what these "ATI open source drivers are soo good" folks are smoking, but the nvidia drivers are CONSISTENTLY more stable (which is what I really care about) and often faster on Linux.

    I realize some users have weeks to spend futzing with their drivers, bios etc and for them the historical low quality of ATI drivers will not be a problem. But if you expect even a decent set of features to work right out of the box without waiting for six new releases, consider nvidia.

    They also have good justification for failing to open source on Linux, since they have a common core and have implemented an entire ICD, I believe, in their OpenGL variants. There is a real competitive advantage to doing things right in that space, and keeping it closed allows them to bring those changes to the Linux platform as well.

    Not to mention, the hype machine at ATI won't quit; their paper specs always show them BLOWING the socks off nvidia, but whenever anyone gets down to benchmarking things (especially with legit drivers) they never seem to measure up.

    • I don't think that word means what you think it means.

      If ATI were LITERALLY caught with their hand in the cookie jar, it would mean someone caught the ATI employees, all with their hands in a cookie jar.

      Sheesh.
  • With the closed-source driver, there seems to be no way to put two nvidia cards with DVI-D flat panels in the same system. I don't believe the open driver supports DVI-D at all.

    This leaves me running one head when I boot Linux on either of my 3-headed systems. nvidia seems to have approximately zero interest in fixing this problem, as users with multiple geforce cards and multiple digital displays running Linux are a pretty small minority.

    Is ATI any nicer about this? Do the ATI drivers support multiple digital heads, or are the relevant bits of the source open?

  • by Namarrgon ( 105036 ) on Friday December 07, 2001 @05:51PM (#2672982) Homepage
    I'm surprised no-one brought this up.

    The article had some great coloured-mipmap shots of the two cards. The GeForce shots showed lovely trilinear filtering of the mipmaps, true per-pixel range-based transitions with nice soft blending. The older Radeon drivers did pseudo-approximate-range-based transitions with soft (but not as nice) blending between the mipmap levels.

    But the new Radeon drivers don't bother with soft trilinear blending at all. There is only one 50% blend level between mipmap levels, when trilinear is turned on. That's not trilinear - that's a "dual-bilinear" hack of some kind. And it's still not properly range-based.

    Worse, when anisotropic filtering is enabled, you don't get trilinear at all. The mipmap level transitions are bilinear, hard edged. Looks awful. And they're still not properly range-based. THIS is the reason anisotropic filtering doesn't cause the same performance hit on an 8500.
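
    To spell out the difference being described: with real trilinear filtering, the blend weight between adjacent mip levels follows the pixel's fractional LOD continuously, while the shortcut described above collapses it to a single fixed 50% mix near the transition and plain bilinear everywhere else. A toy sketch (not any vendor's code; bilinear_sample() is a placeholder for a real texel fetch, and the band width is an assumption):

    /* Toy illustration of the filtering difference; not any vendor's code.
     * bilinear_sample() stands in for fetching a bilinearly filtered texel
     * from the given mip level. Compile with -lm. */
    #include <math.h>
    #include <stdio.h>

    static float bilinear_sample(int mip_level)
    {
        /* placeholder: pretend each mip level returns a distinct shade */
        return 1.0f / (float)(1 << mip_level);
    }

    /* True trilinear: the weight between mip N and N+1 follows the fractional
     * LOD, so transitions are smooth per-pixel gradients. */
    static float trilinear(float lod)
    {
        int   base = (int)floorf(lod);
        float frac = lod - (float)base;            /* continuous blend factor */
        return bilinear_sample(base) * (1.0f - frac)
             + bilinear_sample(base + 1) * frac;
    }

    /* "Dual-bilinear" shortcut: one fixed 50% mix in a narrow band around the
     * transition, hard bilinear everywhere else -- cheaper, but mip banding
     * stays visible. (Band of 0.25..0.75 is an assumed value.) */
    static float dual_bilinear(float lod)
    {
        int   base = (int)floorf(lod);
        float frac = lod - (float)base;
        if (frac > 0.25f && frac < 0.75f)
            return 0.5f * (bilinear_sample(base) + bilinear_sample(base + 1));
        return bilinear_sample(frac < 0.5f ? base : base + 1);
    }

    int main(void)
    {
        float lod;
        for (lod = 0.0f; lod <= 2.0f; lod += 0.25f)
            printf("lod %.2f  trilinear %.3f  dual-bilinear %.3f\n",
                   lod, trilinear(lod), dual_bilinear(lod));
        return 0;
    }
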

    I don't understand why people keep insisting that ATI cards have superior quality images. Certainly not in 3D - they're taking all kinds of quality-reduction shortcuts to try & boost their benchmarks. Fine so long as it's optional, but as before, it isn't.

    Their 2D output is fine, better than some brands of nVidia chip-based cards, but you can certainly find other GeForce-based brands which look great in 2D. My reference QuadroDCC looks superb, better than my Matrox G400.

    I really wish ATI would stop forcing these compromises on us just to squeeze a few more fps from Q3A. If I want faster performance on a game, I'll lower the resolution, or turn down the texture size or something - in the game, or the driver. If I ask the card for max quality, trilinear & all the nice stuff, I want to get max quality, not some half-assed performance hack.

    That said, I'm keen to see what Smoothvision looks like these days. Sounds nice :-)

    • Yep. The Radeons just don't seem to be doing things close enough to correct for me. A fair bit of cutting corners here and there.

      That's why I'd go for the GeForce, it'll probably be better for the general case - might run slower in some cases, but less likely to get some really ugly looking artifact.

      Actually I'm not so interested in AA. Because in the future when screens run at really high resolutions my eyes are going to do the AA.

      Of course in the distant future maybe my brain will be doing the AA, not my NuEye[TM] eyes ;) (full high res 32 bit colour peripheral vision + UV,Infra, + combo + native modes , better than the real thing, blahblahblah).

      Cheerio,
      Link.
