New NVidia Graphics Cards Reviewed 298

UnixRevolution writes "Tom's Hardware has a review of Nvidia's new FX5950 and FX5700. According to Tom's Hardware, ATI's Radeon 9800XT is still at the top of the heap." They're still some pretty slick cards, if only for their heat sink designs.
  • by Indy1 ( 99447 ) on Sunday November 09, 2003 @12:51AM (#7427283)
    is over three weeks old.

    • You missed it! (Score:5, Interesting)

      by randyest ( 589159 ) on Sunday November 09, 2003 @01:07PM (#7429372) Homepage
      Actually, I scanned this entire thread, and everyone missed it, or at least failed to mention the real news on the "Questionable optimizations in ATi's drivers?" page. Seems ATI may be pulling an Nvidia lately, to wit:

      Epic's Mark Rein confirmed that in some cases, high-res detail textures were not displayed in some areas by ATI's drivers and that standard, lower-res textures are used instead. Randy Pitchford of the Halo development team also mentioned that there were optimizations present in ATI's drivers which are detrimental to Halo's image quality. However, Randy didn't want to go into more detail here. Finally, Massive's new DX9 benchmark, AquaMark 3, also displayed some irregularities of ATI's drivers in the overdraw test.

      This page [tomshardware.com] shows some screenshots that do seem to show that ATI is cheating. And, part of the conclusion:

      The irregularities ATi's drivers allegedly display in AquaMark 3 and UT2003 require further investigation. Factors such as image quality, driver reliability, and compatibility are hard to convey in a review anyway. Then again, game developers such as Gearbox (Halo), Epic (Unreal Tournament), and EA (Battlefield 1942) all give NVIDIA good grades in this respect. Surely, NVIDIA's close contact with game developers will help to improve the image quality and the performance of current and future DX9 games even further.

      Even more interesting, Nvidia is touting a new policy and procedure for driver optimizations. Details are here [tomshardware.com]. In summary:

      These are NVIDIA's optimization guidelines for driver developers:

      • An optimization must produce the correct image
        • Compare against refrast (the reference rasterizer), competitor, and unoptimized versions
        • DVS automatically verifies image quality
      • An optimization must accelerate more than just a benchmark
        • Is it general enough to help more than a single app? If so, can you point one out?
        • Algorithm must not be reducible to
          • Benchmark = true
          • If (benchmark) do_one_thing(); else do_something_else();

      • An optimization must not contain pre-computed state
        • Like pre-computed geometry, cached textures, movie playback, etc.
        • Must not rely on a particular order of state that is particular to a single application.

      So far, this kind of self-imposed discipline in the form of rules and mechanisms is unique within the industry.
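      As a toy illustration of the "not reducible to if (benchmark)" rule, here is a sketch in Python (nothing from NVIDIA's actual drivers; the executable names and code paths are invented) contrasting the forbidden pattern with a workload-keyed optimization that generalizes:

```python
# Hypothetical sketch, NOT real driver code: the benchmark list and
# path names below are invented for illustration.
BENCHMARK_EXES = {"3dmark03.exe", "aquamark3.exe"}

def forbidden_optimization(app_name: str) -> str:
    """Reducible to 'if (benchmark) do_one_thing(); else do_something_else();'
    -- exactly the pattern the guidelines prohibit."""
    if app_name.lower() in BENCHMARK_EXES:
        return "special-cased fast path"
    return "normal path"

def allowed_optimization(shader_uses_low_precision: bool) -> str:
    """Keyed to a property of the workload, not the app's identity, so it
    helps every application with the same characteristics."""
    if shader_uses_low_precision:
        return "reduced-precision path (same image)"
    return "full-precision path"
```

      The first function's behavior changes with the application's identity; only the second would pass the "general enough to help more than a single app" test.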

      When ATI first cheated way back when, it hit the /. headlines. Then even more front-page attention (2 stories) was garnered by Nvidia's dubious benchmark optimizations earlier this year. Here we have some pretty compelling evidence that ATI is still cheating at the numbers game, while Nvidia seems to have had enough. Wonder why this wasn't mentioned in the summary? It's a lot more interesting than benchmarks showing ATI and Nvidia neck-and-neck throughout.

  • Cooling (Score:5, Insightful)

    by Tyler Eaves ( 344284 ) on Sunday November 09, 2003 @12:52AM (#7427291)
    You know what I find pretty damn interesting? That my Radeon 9600 operates with NO active cooling at all, only a simple heat conductor. Quite is good.
    • You know what I find pretty damn interesting? That my Radeon 9600 operates with NO active cooling at all, only a simple heat conductor. Quite[sic] is good.

      And quiet is even better.

    • Re:Cooling (Score:4, Interesting)

      by BWJones ( 18351 ) on Sunday November 09, 2003 @01:11AM (#7427370) Homepage Journal
      Fanless would always be preferred, but I should also say that on the cards on which ATI does use fans, they appear to be of higher quality than the fans Nvidia uses. I've had two GeForce Ti cards whose fans I've had to replace when they died, as opposed to four ATI cards in which the fans are still going strong. The other frustrating issue with Nvidia is support. When you call them about support for bad fans, they refer you to the OEM manufacturer (Apple in this case), whose response to a bad fan was to replace the whole card!!! Now, I have been a fan of Nvidia's performance (thus my purchase of them in the dual G4's), but their cooling needs some more attention, and in my latest dual G5, ATI got the nod with their 9800.

      • Re:Cooling (Score:2, Informative)

        by Kaboom13 ( 235759 )
        ATI doesn't actually produce their cards. They develop a reference implementation and license it out to manufacturers. One of the few places the manufacturers are allowed to differentiate from the reference model is the cooling system. This means you get what you pay for. The cheapest cooling solutions come with the cheapest cards. The blame for your problem here is entirely with Apple, for choosing a crappy fan. It's also worth pointing out that the fans aren't always necessary. The fan on my Abit Ge
        • Re:Cooling (Score:4, Informative)

          by goates ( 412876 ) on Sunday November 09, 2003 @04:39AM (#7427853)
          Actually ATI does manufacture their own cards as well as license the design. I don't think Nvidia produces any cards though.
        • The fan on my Abit Geforce 4 Ti 4200 died and it ran for months before I even noticed.

          If you didn't know it died, how can you know how long it had been dead? The only way you would know it ran for months with a non-functional fan is if you previously had observed the fan not working. In that case, you clearly would have noticed.

          So either you noticed the fan not working at some point and chose to ignore it for months, OR you made this whole story up.

          Oh, and ATI does actually produce their own cards, but a
    • that IS interesting, regardless of the flames you've gotten/will get. I remember cursing the loud-azz fans on the card I borrowed in 1999 while my defective Matrox (Please don't attack me for this brand, trust me, good or bad, I agree with you) was en route for replacement. I am quite certain that the loud-fan card was 25% as powerful as your card (at best), yet your Radeon is silent and fanless.

      Is that due to better heatsinks? Better chips? I wanna know because I REALLY don't need another fan in
    • Re:Cooling (Score:5, Informative)

      by gad_zuki! ( 70830 ) on Sunday November 09, 2003 @01:54AM (#7427492)
      Let's see:

      Extasy GeForce 3: after 8 months the fan fails and it burns out the network card two slots over. The GeForce survives. The replacement fan arrives and it's still running to this day.

      Chaintech GeForce 4: after 3 months the fan fails and burns out the video card itself. Still waiting to hear back from Chaintech for warranty service.

      Why do they put the world's cheapest fans on these things? Saving 10 cents can't be worth the warranty replacements when these things burn themselves out.
      • Re:Cooling (Score:3, Funny)

        by Kenja ( 541830 )
        By removing the cat from on top of the computer, you may find that fans last longer.
        • You may have intended this as a joke, but I've found that most of the gunk and dust bunnies clogging up my puter is 99% hair.
      • That's surprising. I have a PNY GeForce4, which I believe most people consider one of the more bottom-rung companies, and have had no problems with my fan whatsoever. Perhaps you have something blocking an intake or exhaust fan?
    • Re:Cooling (Score:2, Interesting)

      by C4P741N ( 698920 )
      I'm definitely for passive cooling where possible. My Gigabyte Radeon 9700 Pro had the stock fan die on it within a couple of months; I've replaced it with a Zalman ZM80A-HP passive heatsink. While it cost a bit more ($50AU), it's keeping the card cooler than the stock cooler, with zero noise.
  • slick cards? (Score:2, Informative)

    by Anonymous Coward
    I suppose it depends on how you define it. If you've got a more holistic view of things, they are bad; their production materials are made with a disregard for toxicity and sustainability, and their electrical consumption goes ever upwards. They're not slick to me.
  • Woah. (Score:5, Funny)

    by Anonymous Coward on Sunday November 09, 2003 @12:53AM (#7427296)
    When did Slashdot take over from the Wayback Machine?

    The article's old. Really, quite old. As in, "Hello? The 90s are calling -- they want their articles back" kind of old.
  • ...I want a video card with an HDTV tuner built in, not just NTSC like the misnamed ATI all-in-wonder, or NVIDIA's version. I want to build a portable media center PC with HDTV!
    • I'd be happy if I could just figure out how to hook up my video card to the new satellite HDTV DVR's that are coming out, like this one. [vssll.com]

      Your HDTV signal comes at you already MPEG encoded, right? It seems to me that every other solution out there decodes the signals, then re-encodes it before it hits your disk.

      The nice thing about the satellite HDTV DVR's is that the MPEG stream goes right from the dish onto the disk, so there's no loss of fidelity.

      Or am I totally confused here?
      • The model name of the unit I linked to starts with DVR...
      • by Babbster ( 107076 ) <aaronbabb@NOspaM.gmail.com> on Sunday November 09, 2003 @02:06AM (#7427528) Homepage
        You are indeed a bit confused. The truth is that the PC HDTV cards available DO store the "raw" stream as it comes over the air in compressed form. This allows them to much more easily (less hardware overhead) save one HD stream while playing another - since the reception/save to hard drive doesn't need decoding, the card can handle decoding either a previously played stream or a delayed play of the stream currently being recorded. The same also applies to the currently available D-VHS tape recorders which receive and save HDTV signals in their compressed form with decoding saved for playback.

        You can see this demonstrated on the page you linked to which says that the Dish PVR's 250-GB hard drive can store 25 hours of HDTV while the MyHD FAQ [digitalconnection.com] (a popular HDTV tuner card) lists the card as storing HDTV signals at a rate of 9.4 GB/hour.
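        For what it's worth, the two quoted figures hang together arithmetically, and the implied bitrate lands near the ATSC payload rate. A quick check (assuming decimal gigabytes):

```python
# Sanity-check the quoted HDTV storage figures (decimal GB assumed).
disk_gb = 250            # Dish PVR hard drive capacity
rate_gb_per_hour = 9.4   # MyHD FAQ figure for recorded HDTV

hours = disk_gb / rate_gb_per_hour
print(round(hours, 1))   # 26.6 -- in line with the advertised "25 hours"

# The same rate implies a stream of roughly:
mbit_per_s = rate_gb_per_hour * 1e9 * 8 / 3600 / 1e6
print(round(mbit_per_s, 1))  # 20.9 -- near the 19.39 Mbit/s ATSC payload
```

        The small excess over 19.39 Mbit/s is plausibly container overhead, or just round numbers in the FAQ.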

        As to why current HD recorders (both PC-based and stand-alone D-VHS) can't take satellite signals, it's because DirecTV and DishNetwork use a different signal from the OTA standard (FCC-mandated) 8VSB modulation. So, somewhat like NTSC VCRs and DVRs, you can't store the raw (compressed) satellite signal unless the unit is integrated or a method is provided by which the compressed signal can be transmitted (after the actual demodulation of the original satellite transmission) to the outside recording unit.

        This is where the IEEE-1394 (Firewire) interface on the new Dish DVR 921 will eventually (when the software is enabled) come in. It will have at least the ability to connect to a D-VHS VCR so that HDTV programs can be permanently archived (compressed, of course). It *may* (given the right software on the PC) be able to connect to a computer so that the compressed stream can be dumped to a PC hard drive/server. Of course, such a PC would need either decoding software (and a pretty decent amount of power) or a decoder card like the MyHD to decode the stream.

        As I read back over this, it may be even more confusing, so I'll sum up:

        1. As it currently stands, no high-definition recording solution decodes and then re-encodes before saving to hard drive. This is done a) to reduce the hardware overhead and b) because there are no current consumer-level hardware HDTV encoding solutions.
        2. The DishNetwork PVR you mentioned (again, the 921 [vssll.com]) WILL have the capability, through Firewire, to connect to outside HDTV recording solutions - at least D-VHS and very likely PCs.
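        The difference between the two approaches in point 1 can be sketched in a few lines (a toy model: the "encoder" below is a stand-in byte quantizer, not real MPEG-2):

```python
# Toy model of the two PVR recording designs; the lossy "codec" is a stand-in.

def lossy_encode(frame_data: bytes) -> bytes:
    """Crude stand-in for a lossy encoder: quantize each byte."""
    return bytes(b & 0xF0 for b in frame_data)

def record_passthrough(transport_stream: bytes) -> bytes:
    """Satellite-DVR style: the compressed stream hits the disk untouched."""
    return transport_stream                 # bit-for-bit identical

def record_reencode(transport_stream: bytes) -> bytes:
    """Decode-then-re-encode style: a second lossy pass discards detail."""
    decoded = transport_stream              # stand-in for an MPEG-2 decode
    return lossy_encode(decoded)

stream = bytes(range(16))
assert record_passthrough(stream) == stream   # no generation loss
assert record_reencode(stream) != stream      # fidelity lost on re-encode
```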

        Hope that helps! :)

        • no high-definition recording solution decodes and then re-encodes before saving to hard drive

          I think maybe we're both confused.

          Right now, if I hook up a TiVO to my digital cable system, the cable box decodes the MPEG, gives it to my TiVO, which ends up re-encoding it before saving it to the disk. This sucks.

          The same would happen with digital satellite, with one notable exception that I'm aware of: DirecTV and TiVO jointly produce a unit which saves the MPEG stream directly to the disk.

          This
          • by Babbster ( 107076 ) <aaronbabb@NOspaM.gmail.com> on Sunday November 09, 2003 @03:18AM (#7427730) Homepage
            Right now, if I hook up a TiVO to my digital cable system, the cable box decodes the MPEG, gives it to my TiVO, which ends up re-encoding it before saving it to the disk. This sucks.

            Yes it does (and my ReplayTV works the same way). And it's NTSC, having nothing to do with HDTV.

            The same would happen with digital satellite, with one notable exception that I'm aware of: DirecTV and TiVO jointly produce a unit which saves the MPEG stream directly to the disk

            Your one notable exception isn't the only one. The same thing happens on DishNetwork with their PVR501/721 line (the 721 was the full-featured dual-tuner big dog before the HDTV-capable 921). The reason that it doesn't happen outside of satellite right now is because the vast majority of channels are NTSC, and NTSC channels aren't "naturally" compressed. Thus, if you want this kind of capability with current digital cable/satellite, yes, you have to go proprietary.

            This is how I want to see it being done for HDTV.

            You listed the DVR721 from DishNetwork which IS that way. DirecTV I'm sure will be releasing a similar unit at some point. What's the question again?

            What you're talking about doesn't exist... where are you getting these signals from? Over the air? That doesn't interest me. I'm not going to invest the kind of cash to make this work just so I can watch CBS broadcast in HD.

            What are you referring to? Everything I described in my post exists. There are currently at least two over-the-air PC HDTV card solutions that I know of, both of which can interface with D-VHS recorders. The DVR921 (again, YOUR example) is planned to be able to interface via Firewire with a D-VHS VCR, allowing you to archive HDTV programs on tape. It's not a stretch at all to assume that this will probably be compatible with PCs in the same fashion. Again, at all points until actual viewing, the MPEG-2 HDTV stream will REMAIN compressed.

            Keep in mind that when I talk about signal modulation (8VSB and whatever the satellite companies are using - I can't recall the acronym off the top of my head), that's different from MPEG-2 compression. The former is the method by which the latter is transmitted through the atmosphere - once it hits the satellite or set-top box it is DEmodulated (before being decompressed) into the MPEG-2 stream. That stream can then be read and decoded by any HD-capable MPEG-2 decoder, whether it ends up on a satellite box hard drive (in the case of the DVR921), a D-VHS video tape, or a PC hard drive. HDCP (High-bandwidth Digital Content Protection, which the MPAA is trying to force on everyone) adds a layer of complexity, but the basics I describe still hold true as long as the decoder can handle and pass HDCP.

            And the unit I linked to earlier is far better than using the standard tuner the cable/satellite co. gives you, and then plugging that into a HD PVR.

            I will only say that not once in my post did I describe anything like a standalone HD PVR. You're reading something that isn't there.

            I'm fully aware that people will need to go through a cable or satellite box to receive all the available HD signals (right now, about half). That's no different, really, than the way the current NTSC signals are handled - I can't watch ESPN, Comedy Central, etc. without having a satellite decoder and most digital cable systems have the same limitation (though in many cases cable companies are required to offer a basic analog package that doesn't require a box).

            Again, summing up: You seem to be confused as to what the DishNetwork DVR921 is capable of. Specifically, it can receive and store both NTSC and HDTV signals via satellite and "over the air" (regular broadcast networks). Said signals can then be decoded immediately for viewing and/or stored (BEFORE decoding) on the hard drive. With the Firewire, once it is enabled, it will be able to send the STILL-COMPRESSED recorded streams to other devices (such as a PC or D-VHS VCR) for archiving.

            As to other devices that are available, they are indeed all restricted to over-the-air broadcasts unless they are sent a stream from a cable/satellite device such as the DVR921.

            • OK, the confusion came in when you said the following:

              The truth is that the PC HDTV cards available DO store the "raw" stream as it comes over the air in compressed form.

              This may be true, but not for any streams I'm interested in watching.

              And I'm not interested in D-VHS, at least not yet. Time-shifting is all I want to do. And the only way I can do that for HDTV without re-encoding the signal is with a unit like the DVR-921 (until of course the TiVO/DirecTV model comes out.)

              Now, how do I connect my
              • by Babbster ( 107076 )
                The only relatively economical option ($100-200) would be to get a component-to-VGA transcoder like this [dcpuraty.com] (this one [digitalconnection.com] should also work even though it's designed for Xbox, since that console outputs standard HDTV signals). Unfortunately, the DVI standard on HDTV products is different from that of computer monitors, so even if you have that connector it wouldn't help.

                Good luck!

    • Honestly, I don't remember the specs, but why bog down the AGP bus when you have a perfectly good PCI-X bus for the HD broadcast? PCI, PCI-X, and AGP tend to run on separate buses, so I'd think this would be cleaner / faster than the All-In-One approach.

      Also, and this might be different with HDTV, but the quality of recordings from one of those ATI TV cards isn't all that great. One would think a dedicated card could do a better job. ATI just slaps these things on almost as an afterthought.
    • Correct me if I'm wrong, but it is going to be tough to get any significant amount of HDTV reception on a computer. As I see it, most of HDTV is going to come over satellite or cable broadcasts. In the case of satellite, you will need a card that can decrypt/authenticate with their systems. The same with cable. Most of their HDTV transmissions will come over the digital (FAT) bandwidth.

      You may be able to pick up local, over the air stations, but this will amount to a very small percentage of the eventu
  • Nvidia the new 3dfx? (Score:5, Interesting)

    by NightWulf ( 672561 ) on Sunday November 09, 2003 @12:55AM (#7427300)
    Is Nvidia doomed to not have learned from 3dfx? Seems to me all they're doing now is adding slightly faster/better boards, while charging the same prices. Where are the new cutting edge features and options? It seems ironic, 3dfx was put out of business by Nvidia, and now Nvidia may be put out of business by ATI, and 5 years from now we'll be talking about how ATI will be put out of business by XYZ. The problem I think is once the company starts making serious cash, the founders just don't care anymore.
    • Nvidia really isn't close to being put out of business. Sure, ATI's $500 card is slightly better than Nvidia's $500 card, but guess what, very few people spend $500 on a video card. Look at the GFFX 5700 Ultra vs Radeon 9600XT [anandtech.com] if you want to see a more mainstream comparison. Nvidia is very competitive in the mainstream market, where the business is. Oh, and if you are running anything other than Windows you might want to go grab a Nvidia card because their drivers are still better than anything AT
      • Actually, the thing that is saving nVidia nowadays is the release of the Detonator 52.16 driver for Windows 2000/XP.

        Not only did they fix a lot of weird bugs that plagued earlier releases, but also the new driver has actually made nVidia's latest cards run quite fast with excellent 3-D graphics quality.

        I think nVidia will probably within six months introduce a whole new line of graphics chipsets that will probably beat ATI's, mostly because nVidia is aware of the known weaknesses of their current chipsets
    • Er, maybe you haven't been paying attention, but ATI has -just- done the exact same thing with their XT line.
      The cards are identical in both cases except for very minor clock speed increases.
      I don't see either company going out of business any time soon.

      In fact, this is what nvidia has done since it started, and it doesn't seem to be doing toooo badly ;)

      IE:
      TNT..TNT2, TNT2 Ultra..
      Then Geforce, Geforce DDR..Geforce 2 (not just a clock speed increase, but offered little for new features)...Geforce 2 Ultra..
      the
    • by Sycraft-fu ( 314770 ) on Sunday November 09, 2003 @09:41AM (#7428497)
      They have major, minor, and speed release cycles. Every couple of years you have a major release cycle. The GeForce FX or ATi 9700 would be an example. So would the GeForce 3. This is when they go to a new architecture with majorly different features. For example, the GeForce 3 introduced (for the nVidia line) programmable pixel and vertex shaders.

      Well, within those major releases, they also have minor releases. The ATi 9800 or the GeForce 4 would be an example of that. Both had some actual different features over their predecessors, but only minor ones. The platform was still fundamentally the same. Both the GF3 and 4 are DirectX 8 cards and there is no real important feature difference between them.

      Then there are the little speed releases. This is when they just bump speeds up, or release a slower economy version, maybe move to a smaller fabrication process, etc. The GeForce 3 Ti line was an example of that. Two new cards, totally functionally equivalent to the original 3; one was just slower, and one faster.

      The problem 3dfx had was they, literally, kept remaking the same Voodoo chip over and over again. The Voodoo 2 was the original chip, with support for 3 texture units, though only 2 were ever implemented (the original actually supported 2, and some Quantum3D units implemented both), SLI, and a higher clock speed. The Voodoo 3 was just all 3 Voodoo 2 chips on a single chip with a higher clock speed and a larger unified RAM. And there it sat for a long time.

      That's why they had their problems. Because all the while nVidia and ATi were moving up, in line with DirectX increases. The TNT2 was the last DirectX 6 card from nVidia. The GeForce was a DX7 card and supported the fixed-function T&L unit that implied. Then, when the GeForce 2 was out and the 3 was nearing completion, the VSA-100 that composed the Voodoo 4 and 5 came out. Basically, it was doomed to failure from the start. It didn't have any of the new DirectX 7 or upcoming 8 features. It was also a return to the expensive multi-chip design and non-shared memory. So while it had neat features like FSAA, it was too expensive and too dated to really make a big showing. Then the GeForce 3 and DX8 came out. This introduced a programmable T&L line (programmable pixel and vertex shaders). This was something really worth having, and completely out of the question for VSA-100 anytime soon. Plus the 3 was quite a bit faster, and it ALSO did all the FSAA stuff. It was done for 3dfx soon afterwards (they also made some other mistakes along the line, like buying STB).

      No, nVidia has kept up well with the technology trends. The FX series are just as capable as the Radeon series, function-wise. However, they've lost their crown as speed king; ATi is offering a better price/performance ratio AND a higher high end right now, though not by a whole lot. Couple that with ATi drivers that finally work right, and nVidia is threatened. But it's not the same as with 3dfx. nVidia's products are still competitive, and they still have new designs in the pipe, not just rehashes of what they've got now. Doesn't mean they won't get run out of business, but it means they have a fighting chance at least.
  • Considering the heatsinks take up half of the card, why don't they come out with better technology rather than patching their old stuff (i.e. clocking it higher and putting an even more massive heatsink on it)?
  • We still have Nvidia's in our dual G4 Powermacs, but the new G5's all have the 9800 Pro. Granted, these are for Photoshop and Final Cut Pro, but I don't notice any great advantage with FCP.

    At home, my PC came with a 8MB Starfighter in 1998. I upgraded to an ATI Rage Fury 32MB card in 1999. Then I rebuilt the system in 2001 and purchased a Nvidia Geforce 2 MX400 card with 64MB ram for like $70 two years ago. And that seems to run the two games I play quite well.

    The "who has the fastest video card" no lo

    • by antiMStroll ( 664213 ) on Sunday November 09, 2003 @02:10AM (#7427539)
      ....purchased a Nvidia Geforce 2 MX400 card.....And that seems to run the two games I play quite well.

      Pong and Frogger?

    • by Artifex ( 18308 ) on Sunday November 09, 2003 @02:17AM (#7427563) Journal
      The latest ATI card isn't going to render text any faster than the 1MB trident card in my old 486...


      Funny you should say that. This week I learned first hand what it's like to scale back. My Nvidia Ti 4600 finally was crashing my computer too much, so I had to take it out and use my old 2MB S3 Virge card, while I try to get Visiontek to give me an RMA. (Check my sig for how that's not turning out)

      Anyway, bootup seems just the same speed, and most of the time, all the screen elements seem to work about the same speed when I'm just browsing the web or doing email. It's only when I'm trying to load up a database or spreadsheet or something like that that it's really noticeably slower. Of course, I can't play any of the newer games, but I've been playing PS/2 mostly, lately.

  • by rice_web ( 604109 ) on Sunday November 09, 2003 @01:10AM (#7427365)
    Who cares about the high-end graphics card market? Intel owns the graphics department, and they do for several important reasons.

    Firstly, they border on monopolistic and can force manufacturers like Dell to use their integrated chipsets. By offering the cheapest video cards on the market and likely offering package deals (CPU and GPU together) to drag the cost down further, there are a number of Benjamins on the line for the likes of Dell in using Intel's graphics chips.

    Secondly, however, the "Average Joe" comes into play. Quite simply, very few people buy high-end video cards because no game makes use of it, and many people do not game on their computer (it's typically less expensive to game on a console). As much as I hate the "Average Joe" spiel, it fits perfectly with the graphics department.

    The difference between my Radeon 9600 Pro and NVidia's latest offerings is surprisingly little, and I, a Slashdotting, video-gaming computer nerd, will probably not be moving from my 9600 Pro until games come along that choke my system. Most users will probably just stop buying the latest games and wait several hardware revisions before becoming a gamer again, or they will buy a gaming console like everyone else, leaving their Intel chipset to crunch through webpages and Word documents.
    • Firstly, they border on monopolistic and can force manufacturers like Dell to use their integrated chipsets. By offering the cheapest video cards on the market and likely offering package deals (CPU and GPU together) to drag the cost down further, there are a number of Benjamins on the line for the likes of Dell in using Intel's graphics chips.

      Why then have I seen the majority of Dells with Radeon 7500 or 9000s (in Optiplex systems) and NVidia GeForce 4s (in Dimensions)?
      Dell picks the Intel integrated
  • by tim_mathews ( 585933 ) <tmathews04@gm[ ].com ['ail' in gap]> on Sunday November 09, 2003 @01:28AM (#7427408)
    It's late and I can't think of a better title for this post. I remember long ago when you could buy video RAM chips and plug them into sockets on your video card and add more memory. The idea never really caught on, the chips were hard to find and no one seemed to bother (that I recall). However, now with the amount of attention video cards get, I think it's time to bring that idea back. I just can't see spending 400 - 500 USD to buy a new video card just so that I can get another 50MHz and another 128MB of RAM. I want to be able to swap GPUs on my video card when new ones come out, add RAM and update the firmware without buying a new card.

    I know this will never happen, because it would be a huge loss for the card manufacturers. Or maybe it will. Once upon a time, you bought computers with the CPU and RAM soldered to the motherboard (think pre-386 and some 386's). True, there was a socket for a math co-processor, but often upgrading the CPU was out of the question. This is where we are with video cards now. The upgrade path is rather steep.

    I'm waiting for the day when you buy a video card and then have the option of buying the fast processor, the really fast one, or the processor-thats-so-fast-it-melts-the-card, and then have the option of buying lots of RAM, a lot more RAM, or way too much RAM. Of course, I'll take option 3 :-)

    Anyway, I know I'll update my video card a lot more often if that ever happens.

    • the FX series has full DirectX 9 support. This is more than just a little bit more RAM. The new high-end nVidia cards are not targeted toward 5900 owners; they're targeted at GeForce 4 owners who want more performance, and more new features.
    • Unfortunately that probably isn't feasible with the current state of things. You see, the memory bus is pushed to its absolute limits as it is with the chips soldered to the PCB; a socketed connection would degrade the signal enough that you wouldn't be able to run the memory bus nearly as fast. Not to mention a socket system that could even come close to the 700+MHz bus speeds we currently see would be prohibitively expensive.

      What we can hope for is the return of SLI type configurations when PCI-X becomes the car
  • Scorecard: (Score:3, Insightful)

    by Wolfier ( 94144 ) on Sunday November 09, 2003 @01:28AM (#7427409)
                 ATI   nVidia
    speed         10        9
    price          5        5
    heat           9        1
    noise          9        2
    features      10        9

    TOTAL         43       26

    Choose ATI.
    • Re:Scorecard: (Score:4, Insightful)

      by RedWizzard ( 192002 ) on Sunday November 09, 2003 @02:05AM (#7427524)
      "TOTAL (ATI) 43 (nVidia) 26"

      That's all very well if you consider those five factors to be equally important. I suspect very few people do though. Personally I don't care much about the heat or noise but value for money is key. Interesting too that you don't consider support (including driver updates) to be relevant.
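      To put numbers on that: with the grandparent's scores ATI leads or ties on every row, so no non-negative weighting can flip the ranking, but the margin swings wildly. A quick sketch (the weight sets are invented for illustration):

```python
scores = {               # (ATI, nVidia) from the parent scorecard
    "speed":    (10, 9),
    "price":    (5, 5),
    "heat":     (9, 1),
    "noise":    (9, 2),
    "features": (10, 9),
}

def weighted_total(weights):
    """Sum each factor's score times the buyer's weight for that factor."""
    ati = sum(weights[k] * scores[k][0] for k in scores)
    nv  = sum(weights[k] * scores[k][1] for k in scores)
    return ati, nv

equal = {k: 1 for k in scores}
print(weighted_total(equal))   # (43, 26) -- the 17-point gap above

# A buyer who ignores heat and noise and weights price most heavily:
speed_and_price = {"speed": 1, "price": 3, "heat": 0, "noise": 0, "features": 0}
print(weighted_total(speed_and_price))   # (25, 24) -- nearly a dead heat
```

      A buyer who doesn't care about heat or noise sees a near dead heat, not a blowout.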

      • Re:Scorecard: (Score:3, Interesting)

        by Sycraft-fu ( 314770 )
        "Interesting too that you don't consider support (including driver updates) to be relevant."

        Especially given that was THE reason I bought nVidia, until recently. I mean nVidia cards always performed well and were on the cutting edge as well, but that wasn't what really made me buy them. It was the 100% rock solid drivers. I just don't like playing around with that kind of thing.

        Well I feel ATi's drivers have reached that level too, but it's an important question. I want to know if the cards have any reliabi
    • You're right, with all the noise my video card's making I can't even hear the six hard drives, power supply and system fans. Oh wait, it's the other way around.
  • by Chas ( 5144 ) on Sunday November 09, 2003 @01:30AM (#7427419) Homepage Journal

    Sure. If all you're looking for is umpty-bazillion frames a second, the 9800 is going to be what most power-gamers go drooling after.

    But, until ATI can actually come out with a stable driver that works with all games and apps, neither I, nor anybody I know can, in good conscience, actually recommend an ATI card.

    Additionally, if you want a decent 3D card for Linux, you can pretty much forget ATI.

    And don't just take my word for it. Go browse around a few of the ATI-centric sites that cater to ATI's users. Take a look at the issues being raised.

    And before some frothing fanboi starts yelling about driver cheats, DX9 compliance, etc...I acknowledge the issues with nVidia. But, even in the light of those issues, nVidia's drivers still work.

    PERIOD.

    • I have to agree with your post. ATI has some work ahead of them on their drivers. They don't seem to have what it takes in that area.
      Rage3D Forum For Driver Incompatibilities [rage3d.com]
    • But, until ATI can actually come out with a stable driver that works with all games and apps, neither I, nor anybody I know can, in good conscience, actually recommend an ATI card.

      Oh, stop trolling. Have you even tried any of the 3.x Catalyst series? When 3.0 hit, the driver quality matched that of the Detonators and has ever since.

      I have never had a single problem with any driver compatibility except with Enter the Matrix on my Mobility Radeon 7500. All the textures were being misaligned. With the l
      • Tried running Neverwinter Nights with the newest Catalysts? They don't work; period. And the installation fucks something up so that you can't roll back to a working version, either. ATI makes good hardware, but their drivers are total crap, especially if you run linux. Like the parent, I cannot in good conscience recommend ATI products to anyone.
    • by Babbster ( 107076 ) <aaronbabb@NOspaM.gmail.com> on Sunday November 09, 2003 @02:52AM (#7427679) Homepage
      Considering the number of times I've had to go back and forth on Nvidia driver versions over the past few years because of incompatibilities, I would conclude that NEITHER company has it all together in that area.

      PERIOD.

    • Heh, I had a nvidia card at work. I had to revert to the XFree86 drivers to actually get some work done. Now I have a Radeon (pinched from a dead PC), and I get to have working 3d without it locking my machine solid!

      Not everybody gets on with Nvidia's 'fantastic' binary-only drivers.
    • Linux drivers do work, but they aren't easy to install and set up. It is not impossible.
    • Over the last year, I've never really had a problem with driver stability with either my ATi Radeon VE or a Fire GL X1. I never had a crash except for the time I did something weird with the VE's originally supplied DVD player, but I don't use that thing anyway. Of course, I never really played games with either.

      nVidia does have an entry supporting 1280x960 out of the box which is commendable because 1280x1024 on a CRT results in non-square pixels and everything gets squashed by about 7% and makes thi
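      (A quick sanity check of that ~7% figure, sketched in Python: on a 4:3 CRT, the horizontal squash at 1280x1024 is the ratio of the screen's 4:3 aspect to the resolution's 5:4 aspect.)

      ```python
      # Pixel aspect ratio of 1280x1024 displayed on a 4:3 CRT:
      # pixel aspect = screen aspect / resolution aspect
      screen_aspect = 4 / 3            # physical CRT shape
      res_aspect = 1280 / 1024         # = 1.25 (5:4)
      pixel_aspect = screen_aspect / res_aspect
      squash_pct = (pixel_aspect - 1) * 100
      print(f"pixel aspect {pixel_aspect:.3f}, squashed by {squash_pct:.1f}%")
      # prints: pixel aspect 1.067, squashed by 6.7%
      ```

      So pixels come out 16:15 rather than square, which rounds to the "about 7%" quoted above; 1280x960 keeps them exactly square.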
    • ATi's unstable drivers have become a bit of an urban legend. Yes, we all know that ATi's drivers were piss-poor back in the Rage 128 days. But if you've used ATi cards recently you'll probably notice an improvement. Their drivers have been steadily improving ever since they moved to an integrated architecture, and they're now pretty solid.

      I've been using ATi cards since the Catalyst 3.2 drivers and they've been very stable for me. I only get lockups when I do something stupid, like try to render a v
    • I'm sold on ATi's new drivers. Been a longtime nVidia fan (since the GeForce DDR) and still use them in most of my systems, but I think ATi is ready for prime time. Thus far on the two systems with 9800s in them, I've seen no crashing (at all, not just none that appears to be GFX related) and no visual flaws in any of the games tried.

      I completely agree with you, I want a card that WORKS, and if it has to be slower for that, so be it. However I now feel, and this is a first for me, that ATi can make that cl
  • by SensitiveMale ( 155605 ) on Sunday November 09, 2003 @01:46AM (#7427459)
    I just got the ATi 9800 Pro with Half-Life 2 included (DL later when available).

    ATi has always had the best video quality but they always released buggy drivers that were updated every 5 months. Not any more.

    New Catalyst drivers are released every other month and are no longer buggy.

    The card's performance is outstanding. My card settings are 6X anti-aliasing and 16x anisotropic filtering with all of the eye-candy. I run all my games at 1600x1200 and there is no stuttering at all.

    Simply amazing.
    • Video Quality (Score:3, Informative)

      by Drathos ( 1092 )
      Um.. No.

      The best video quality (in the x86 market) has always been Matrox. They've just gotten themselves way behind the curve in terms of performance.
    • well, considering ATI has been known to advertise stuff their cards can't do, I wouldn't count HL2 as being in your basket yet.

      And their drivers have a long history of sucking and 'getting better any time now', as tech support promises.

    • New Catalyst drivers are released every other month and are no longer buggy.

      Ok, I don't know much about video cards--but, isn't releasing a driver every month a bad thing? What are they fixing every month if the thing is not buggy?
      • Ok, I don't know much about video cards--but, isn't releasing a driver every month a bad thing? What are they fixing every month if the thing is not buggy?

        Well, I said every other month. :)

        Sure, there are bugs in every video driver from every manufacturer.

        But by releasing new drivers every other month they are fixed quickly (old ones and new ones from new games) and new features are added.

        ATi used to release drivers every year. Yup, it was that bad.
    • With All-In-Wonder cards and v3.9 drivers, OpenGL games no longer work (crashes). See this thread [rage3d.com].
  • would include a few scores from previous generation cards as a baseline. I'd love to know how my GF4 stacks up to these cards; it might make me consider buying one if I could see a definite speed gain. But these scores mean nothing to me because I have nothing to compare them to in my mind. How do I know a MS flight sim score of 64 is good when I don't know how my card would rate?

    please review sites, include some last gen cards as a baseline, if only the GF4 and Rad8500.

  • by rpozz ( 249652 )
    Look at some of the benchmarks in the article. Erratic isn't the word for it. While ATI has a clear gain with image quality settings, the FX 59x0 has a clear gain in OpenGL-based [tomshardware.com] games. The Doom 3 benchmarks will be interesting.
  • How long before one of these cards needs its own direct line to the wall plug? The only tricky part I see is fitting the Belden connector on the card's mounting bracket.
    • Re:It's Only 110VAC (Score:2, Informative)

      by dFaust ( 546790 )
      How long before one of these cards needs its own direct line to the wall plug?

      It's funny that you should mention this... 3dfx's last card (the Voodoo 5 6000 [eurogamer.net], which never ended up making its way into the consumer channels) did EXACTLY this. It required that you plug it into the wall via its own external power cable. Not even hooking it up to your power supply was enough at that point (which the Voodoo 5 5500 and some modern cards require).

      And for those of you not in the know, nVidia of course bought o

  • Bozo reviewers (Score:5, Insightful)

    by Animats ( 122034 ) on Sunday November 09, 2003 @03:08AM (#7427709) Homepage
    Hundreds of people labor for months to design a graphics processor that can do real-time procedural shading of near-Renderman complexity, and the "reviewers" at Tom's Hardware focus on the heat sink.

    Then there's this endless fascination with how many FPS you can get on some antique game. That's not what it's about. The question is how detailed a scene you can render at full frame rate.

    • How true. In fact I would bet this endless fascination with FPS on antiquated games has an effect on where money goes for R&D at these places. I can see it now: marketing insisting they don't need a new engine--just a faster processor on the chip or new algorithms so that Half Life can get a high number of frames per second.
    • You sound like those people who want to stop using horsepower when reviewing cars. After all, horses are so obsolete. True perhaps. However it is a reference everybody can relate to. You know how your current card does in quake (in terms of looks and speed), so when you then see in a stat it has X better AA or X times the framerate you can put it in perspective. This is kinda hard to do with a game no one has yet seen. Silent Storm is a current game that really pushes your hardware at the moment. It has a frame
  • XGI (Score:5, Interesting)

    by zeekiorage ( 545864 ) on Sunday November 09, 2003 @03:16AM (#7427724)
    While the nvidia article is a little old, there is an interesting article about a new company called 'XGI' [tomshardware.com], which was formed when SiS spun off its graphics division Xabre. According to THG, cards based on XGI chips could arrive within one or two months and their top model could retail for a good $100-200 less than the flagship models of NVIDIA or ATi. The article includes a review of the prototype card called 'Volari Duo V8 Ultra' based on the XGI chip.
    • Some cheeky people managed to run a benchmark...

      Here...
      http://www.theinquirer.net/?article=12492
      Also details and possible specs here....
      http://www.theinquirer.net/?article=11609

      Looks quite nice! Let's see what the prices are like.... If they're good (compared to nV & ATi) then SiS/Xabre will likely have a volume seller. Nice. This graphics war has got boring. 2 "competitors" and the prices and features are more or less the same.... Mind you, that was guaranteed! It's a tech race which seem to based
    • Many tests had it falling to last place, edged out by ATi 9500 and 9600 products, so I hope they can fix things before going to market.

      The fact that they are using SiS's and Trident's technologies doesn't inspire me. Both have had a knack for making a great deal of fuss about how much their graphics product will dominate the industry, and then fall flat on their face with effectively last year's product.

      I often try to root for the underdog but too often they simply don't have what it takes to compete
  • One thing you don't see in these reviews is how compatible the cards are. I bought a 9800 Pro and have been kicking myself for it ever since. I get better FPS in Half-Life mods with my GF2. I've actually reinstalled my GF2 to play Half-Life, and my $300 card is just sitting on a shelf. Like most problems, not everyone has it, but if X video card has problems with Y game, you may want to factor that into your buying decision. Just go to support forums for the card you are interested in, and if you see a 41 p
  • I just got a new computer [mini-itx.com] for the price of one of these cards.

    Rich.
