ATi FireGL X1 Vs. NVIDIA Quadro FX 2000

SpinnerBait writes "The professional graphics card arena has been heating up as of late, with new products from ATi and NVIDIA hitting the streets on the heels of SIGGRAPH unveilings. In the first of a two-article series, HotHardware has a showcase with benchmarks of the ATi FireGL X1 and NVIDIA Quadro FX 2000. It seems as though NVIDIA still has a stronghold in this market, as their card seems to dominate many of the benchmark runs shown here."
  • by Dancin_Santa ( 265275 ) <DancinSanta@gmail.com> on Thursday August 21, 2003 @07:16PM (#6760585) Journal
    Neither runs faster than the Orchid Fahrenheit 180.

    I used Lotus 123 and WordPerfect 5.1 as the test applications.
    • by WIAKywbfatw ( 307557 ) on Thursday August 21, 2003 @07:24PM (#6760636) Journal
      Neither runs faster than the Orchid Fahrenheit 180.

      I used Lotus 123 and WordPerfect 5.1 as the test applications.


      Aah, but do you have the ISA, EISA or MCA version of the card? That EISA version really kicks ass, especially on a system running MS-DOS 5.0 and tweaked with 386MAX.

      • I'm not sure if I should laugh or cry; the fact that I know exactly what you're talking about is disquieting. I remember spending hours talking about the virtues of QEMM/MemMaker/386MAX, and why OS/2 was gonna beat them all. I'm startin to feel really old. There's some really snazzy new-fangled things a comin these days, ain't there? Then I hear 'bout this new-fangled penguin thing made by a bunch of young whipper-snappers... who are older than I am...

        Maybe I'm spending too much time talking to the old fo
  • by Gherald ( 682277 ) on Thursday August 21, 2003 @07:23PM (#6760627) Journal
    > It seems as though NVIDIA still has a stronghold in this market, as their card seems to dominate many of the benchmark runs shown here."

    Not really. The benchmarks were very close in most of the tests, and if you consider what the end of the article says:

    At this point in time, various price search engines have the ATi FireGL X1 listed at or around $530. Conversely, the NVIDIA Quadro FX 2000 is listed at no less than $1250, and that's in the 128MB variant, not the 256MB model we tested. So with this in mind, the FireGL X1 price/performance ratio is rather compelling, at less than half the cost of the competing NVIDIA product.

    ...The FireGL looks like a much better value.
    • Even though $720 seems like a huge price difference, small gains in performance can translate into thousands of dollars in saved time (see the rough break-even sketch after this thread). I would think the better performance of the Nvidia would end up costing far less than the ATI in the long run.
      • by Anonymous Coward
        But "the long run" doesn't exist when you can buy another much faster card in 6 months.
      • "Even though $720 seems like a huge price difference, even small gains in performance can result in thousands of dollars in saved time."

        Not as much as you'd think. These cards are for the UI to the 3D app, not for rendering. The difference between 30 fps and 60 fps isn't going to save any significant amount of money.
        • These cards are for the UI to the 3D app, not for rendering. The difference between 30 fps and 60 fps isn't going to save any significant amount of money.

          First of all, that isn't true even with last-generation hardware - CAD apps certainly need realtime rendering. Speeding up a complex model from 0.5 FPS to 10 (or 100) FPS can result in big productivity gains.

          Even the VR apps like 3D Studio can use the programmable shader features of these newest cards to render production quality scenes in realtime or near r
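
    To put the cost-versus-productivity claims in this thread into rough numbers, here is a minimal back-of-the-envelope sketch. All of the rates and time savings below are illustrative assumptions, not figures from the article or its benchmarks:

    ```python
    # Rough break-even sketch: how much saved time justifies the pricier card.
    # The hourly rate and minutes saved per day are assumptions for illustration.

    card_premium = 1250 - 530        # Quadro FX 2000 vs. FireGL X1 street price (USD)
    loaded_hourly_rate = 75.0        # assumed fully loaded cost of an artist/engineer (USD/hour)

    break_even_hours = card_premium / loaded_hourly_rate
    print(f"Premium: ${card_premium}, break-even: {break_even_hours:.1f} hours saved")

    # If the faster viewport saves, say, 10 minutes per working day:
    minutes_saved_per_day = 10
    payback_days = break_even_hours * 60 / minutes_saved_per_day
    print(f"At {minutes_saved_per_day} min/day saved, payback after ~{payback_days:.0f} working days")
    ```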

    • This article doesn't factor in a number of important things, such as extreme model sizes (which kill the FireGL), driver quality (ATI Radeon drivers are historically bad, and the new FireGL drivers (ATI parts, not IBM parts) are worse), and precision. The FireGL X1/X2 have horribly low precision. The biggest area where this shows is their sub-pixel precision, which results in many rendering errors per frame (holes, spots, tears, speckles).
    • If you actually look closely at the article's benchmarks, the ATI card is ahead more often than not.

      Remember, for a lot of the graphs lower numbers are better.
  • Didn't RTFA (Score:3, Insightful)

    by EmagGeek ( 574360 ) on Thursday August 21, 2003 @07:28PM (#6760668) Journal
    Didn't have to.

    Do I trust benchmarks? No.

    Will I ever trust benchmarks? No.

    Are benchmarks meaningful in any way? No.

    Do benchmarks have any credibility whatsoever? No.

    'nuff said
    • Addition:

      Do I play games? Yes, I'm a level infinity Elf Warrior in Nethack and I don't need no stinkin' 3D acceleration.

      FPS are for wusses!
    • Re:Didn't RTFA (Score:5, Insightful)

      by VoiceOfRaisin ( 554019 ) on Thursday August 21, 2003 @09:29PM (#6761499)
      "Are benchmarks meaningful in any way? No.
      Do benchmarks have any credibility whatsoever? No."

      are you insane? are the people that modded you up insane?

      so a company that does 3d design needs new cards for their systems.. what do you suggest they do? buy the most expensive card on the market and hope it's the best? buy the most expensive one they can afford? buy an assortment of cards and try every one themselves then decide which is best then try to return the other cards?

      or be sane and read a review of various cards that TESTS them, with something that's called BENCHMARKING.
      is benchmarking a perfect science? no. will it ever be? no. but benchmarks are of use, like it or not.
      • Re:Didn't RTFA (Score:4, Insightful)

        by virtual_mps ( 62997 ) on Friday August 22, 2003 @06:55AM (#6763652)
        so a company that does 3d design needs new cards for their systems.. what do you suggest they do?...buy an assortment of cards and try every one themselves then decide which is best then try to return the other cards?

        Well, yes. You don't make that kind of investment (you're probably talking a bunch of cards, not just one, if you're outfitting a whole shop) without testing how it works in your environment. Hell, if you're going to buy enough of them you can probably get the vendors to loan you a test sample. Buying on benchmarks alone is just nuts.
    • On Slashdot, it's cool to be an irrational pessimist.
  • Uhm (Score:5, Insightful)

    by Dark Lord Seth ( 584963 ) on Thursday August 21, 2003 @07:30PM (#6760687) Journal
    It seems as though NVIDIA still has a stronghold in this market, as their card seems to dominate many of the benchmark runs shown here.

    Apart from the fact that NVIDIA got their asses kicked in most benchmarks, it does indeed rock, yes. Especially the bit which claims the price for the damned thing is over 1200 USD a piece. Ah well, next time it will be an ATi, I guess; considering they both fecking cheat with benchmarks these days, I might as well go for the cheapest cheater.

  • by niko9 ( 315647 ) on Thursday August 21, 2003 @07:32PM (#6760707)
    It doesn't bother me that much who has the fastest card. All I know is that this sort of competition is great in the Linux arena. With the recent trend of 3D animation studios transitioning to Linux, they can't ignore the need for high-quality drivers.

    Nvidia has really polished up their Linux drivers [nvidia.com] recently, and in response ATI has done the same [rage3d.com].

    This means Linux is one step closer to gaining a foothold on the desktop. Hopefully this will spur interest in 3D gaming on the Linux platform.

    One can dream of the day of playing Battlefield 1942 on Linux. I'm using the Linux FireGL drivers on my Radeon 9700 Pro, and so far, they work great for playing RTCW ET.
    • I'm sorry for having to reply to myself, but the link to rage 3d shows that ATI has recently called for Linux Catalyst driver beta testers.

      They're not saying if they are going to support all the multimedia features, like TV-out and capture, but it looks like it might be going in that direction. You have to sign an NDA to be eligible.

      People have always bitched on the mailing lists that their AIW cards were half-ass supported, so this might be the turning point as far as these cards are concerned.
      • the link to rage 3d shows that ATI has recently called for Linux Catalyst driver beta testers.

        Recently? That post was done in October last year according to the time stamp.

        And ATI Linux drivers are *still* horrible. I've yet to be able to properly play a game on my 8500. The Neverwinter Nights client locks up at the title screen, Unreal Tournament has some really weird artifacts...hmm...I haven't tried any other games, but as we all know, the choices are not plentiful. Not to mention I can't use TV-out...

        • The time stamp clearly shows August 11 2003.

          Now, as far as your driver troubles, are you using the November 2002 drivers? If so, please, download the FireGl drivers. Click on Linux, then FireGL, then the first card on the list. You could also follow this link [ati.com].

          They were updated last month, will work with your Radeon 8500 (I have both a 9700 Pro and an 8500), perform better, and have fixed Xv video support. They also convert nicely with alien if you're using Debian (which I am).
          • The time stamp clearly shows August 11 2003.

            Ack...yeah, sorry about that. As another poster who replied to my message said, I got confused when looking at the date that the poster joined the forum

            Now, as far as your driver troubles, are you using the November 2002 drivers?...

            Actually, I am using the latest ones. The November 2002 ones don't work with XFree86 4.3. I still have issues though, but it makes me happy to know they're still working on it. I thought ATI had pretty much abandoned the who

        • Recently? That post was done in October last year according to the time stamp.

          No, you got that wrong. Oct 2002 is the date Catalyst Maker registered, the post was made Aug 11, 2003 - I'd call that recently...

          And ATI Linux drivers are *still* horrible. I'm yet to be able to properly play a game on my 8500. Neverwinter Nights client locks up at the title screen, Unreal Tournament has some really weird artifacts...hmm...I haven't tried any other games, but as we all know, the choices are not plentiful. No

    • Battlefield 1942 is a Direct3D game; OpenGL drivers won't really help. Till game developers drop Direct3D completely, which I don't foresee happening in the near future, Linux (& Mac) gamers will always be missing a few titles.
      • Alas, I wasn't sure which one, but there's always Carmack's games.
      • Battlefield 1942 is a Direct3D game; OpenGL drivers won't really help. Till game developers drop Direct3D completely, which I don't foresee happening in the near future, Linux (& Mac) gamers will always be missing a few titles.

        Yep, too bad that author was shortsighted and used sucky APIs that locked 'em to a single platform.

        Still, doing an OpenGL/SDL port probably wouldn't be too hard...

    • Wow! Thanks for the link to that Catalyst thread. I may very well be buying an ATi card soon. ;)

      That's just the news that I wanted to see. I'm glad that they're still serious about Linux drivers.
  • I've always gone for featureset when looking at graphics cards. Speed is a secondary and usually fairly costly function.

    If I need the speed, I turn off AA and lower the resolution and game detail settings. But if it's fast enough for me as is and looks like it'll suffice for a couple of years, I don't care about the benchmarks.
    • by NanoGator ( 522640 ) on Thursday August 21, 2003 @08:22PM (#6761063) Homepage Journal
      " But if it's fast enough for me as is and looks like it'll suffice for a couple of years, I don't care about the benchmarks."

      I have a similar philosophy about cards. I also do 3D work. Here are my requirements:

      1.) Does it support dual monitor? (not only support it, but good desktop support as well.)

      2.) Does it do decent anti-aliasing? In 3D modelling, anti-aliasing makes a huge difference. When you're building your object, this graphics mode can reveal more about how your model will look when you go to render. It's worth the hit in FPS.

      3.) Does it offer enough of a boost over the card I have now? Can I spend $200 and get 2x the speed?

      I used to want a professional card. However, by limiting myself a bit, I've grown to become more efficient in handling 3D assets. I now have a more structured workflow than I would have had with a much, much quicker machine. I'm insanely curious what a Quadro would do for me, but man, I have a hard time imagining it's worth the price tag.

      However, I will likely change my tune soon. All the major 3D apps are taking much more advantage of the cards, previewing more and more of what the renderer will do. Modelling is anti-aliased, lens flares show up in real time, texturing, depth of field, motion blur, you name it. I can't help but think one day I'll be buying rendering cards instead of real time 3D cards.
  • by binaryDigit ( 557647 ) on Thursday August 21, 2003 @07:36PM (#6760740)
    Why do they bother with these "standardized" benchmarks? We already know that the manufacturers tend to gear their products towards scoring well on these things. And from a content point of view, anybody with the requisite hardware could do what they did. Whatever happened to the days when a group with solid domain knowledge would take some products and run them through their "own" benchmarking? Instead of using some canned 3DStudio simulation benchmark, find a bunch of models you've created and test them out. Run the cards through tests that YOU (not YOU the reader, YOU the fictitious reviewer) know are important. That way people get a MUCH more realistic feel for what type of performance they can expect, and the reviewer actually adds some value by doing the review in the first place (not just running the same thing that the eight "other" benchmarking sites do).
  • by heli0 ( 659560 ) on Thursday August 21, 2003 @07:38PM (#6760755)
    "Why is it that both products we'll be looking at today, the [$725--128MB]ATi FireGL X1 and [$1500--256MB]NVIDIA Quadro FX 2000, share nearly identical hardware with their consumer counterparts, yet cost 3 to 5 times as much? The answer goes back to those highly specialized applications again, and optimizing the hardware and drivers to accelerate performance to the best of the core Graphics Processor's ability"

    ---

    It would have been nice if they also benchmarked a $400 GeForceFX5900-256MB and a $425 Radeon 9800Pro-256MB then. (current prices from pricewatch)

    Anyone have a link to another review that includes these?
    • The article points out that the cards are identical (besides the DVI daughterboards) to the consumer versions, with "optimizations" done to the hardware, plus the optimized drivers. You'll notice that neither card has a beefier heatsink.

      I wonder if the "hardware optimizations" are FUD, since you can get the FireGL and Quadro workstation drivers from their websites for free.

      • "I wonder if the "hardware optimizations" are FUD"

        I had similar thoughts, which is why I would be curious to see their consumer-level counterparts benchmarked alongside them.
      • Yes, the consumer and "pro" boards are almost identical. In some earlier NVidia boards, as others have pointed out, you could change a jumper and make a "consumer" board into a "pro" board. And all the jumper did was change the board type, which the driver read (the sketch after this sub-thread illustrates the general idea).

        This is why NVidia won't provide the hardware interface specs for their boards - the consumer/pro distinction would disappear. That's why the Linux drivers are closed source (and drivers can't be written for some other OSs.)

        I'm surprised NVidia s

        • Man, trying to keep up with all the things you know nothing about is hard but I'll try.

          NVidia can't provide hardware interface specs for their boards because they don't own the rights to do so. That's why they go out of their friggin way to make the best 3d drivers and experience for games and graphics for linux out of the kindness of their hearts on their time on their dime. It's not like they are making money doing it. Me and some other guys started a petition a while back, and after about a year they st
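
    Purely to illustrate the mechanism described above (driver features gated on a board-type ID), here is a hypothetical sketch. The ID values and feature names are invented for illustration; this is not NVidia's or ATI's actual driver logic:

    ```python
    # Hypothetical sketch: same silicon, but the driver unlocks "pro" code paths
    # based on a board-type ID set by a jumper/resistor. All values are made up.

    CONSUMER_ID = 0x02  # invented value reported by the "consumer" strapping
    PRO_ID = 0x04       # invented value reported when the jumper/resistor is changed

    def read_board_type() -> int:
        """Stand-in for reading the board-type strap/ID from the card."""
        return PRO_ID  # pretend the jumper has been moved

    def enabled_features(board_type: int) -> set:
        features = {"basic_3d", "dual_head"}
        if board_type == PRO_ID:
            # Same hardware, but the workstation driver paths are switched on.
            features |= {"antialiased_lines", "overlay_planes", "certified_cad_paths"}
        return features

    print(sorted(enabled_features(read_board_type())))
    ```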
    • It would have been nice if they also benchmarked a $400 GeForceFX5900-256MB and a $425 Radeon 9800Pro-256MB then. (current prices from pricewatch)

      Just for grins, it'd be great to see the Radeon 9600/9800 benchmarked in the new Apple G5s. Those also claim to support pro level applications.

      The dirty little secret of the graphics chip industry is that these cards are really no different from the 'consumer' versions. It is simply a matter of the pro driver sensing a firmware dongle, then enabling the pro fe

      • yes, at least with earlier nvidia boards you could simply solder one resistor (or unsolder, can't remember which way) to trigger it into being a quadro.

        sounds like a scam? yeah.

    • Gamer boards work fine in DCC apps, as long as you don't push them too hard. It's not that they don't work; I mean, OpenGL is still OpenGL regardless of the card. BUT, there may be bugs in the app that only appear on the non-'pro' boards.

      how do i know? well, i run Maya, 3D S.Max, SolidWorks and Alias StudioTools on both sets of cards: at work on Quadros, at school on FireGLs, at home on a GeForce 4. the only thing i've noticed are small bugs at home, like funny screen draws, and wireframes that don't display 'per

  • Have we, the public, not learnt yet that ATI and Nvidia have "optimized" their drivers for whatever benchmark?

    Unless reviewers compare the same motherboard, same amount of RAM, same processor, same BIOS version, and same version of the motherboard as their audience has, the numbers are MEANINGLESS.

    • Benchmark optimization can only go so far. With non-synthetic benchmarks, manufacturers must attempt to cheat without it being noticed in normal usage of the software. Synthetic benchmarks have only one or maybe a few ways they can be run, so, for example, objects that are obscured could be omitted from a full rendering without the viewer noticing. Non-synthetic benchmarks have infinite ways they can be run and analyzed, generally preventing such cheating (when's the last time we've heard of cheating in a ga
  • Um.... WTF? (Score:3, Funny)

    by Raxxon ( 6291 ) on Thursday August 21, 2003 @07:45PM (#6760803)
    I thought FireGL was from Diamond, not ATI. :p

    In fact, I'm holding a FireGL right now... I'll sell it for $200! Less than what ATI charges.

    god I wish people would think before naming things sometimes.... USB2.0 vs USB2.0 "Hi-Speed"....
    • Wow. I know that Slashdot readers don't read the linked articles, but you take the cake. Not only did you not read the article synopsis, you didn't even read the headline.

      It's the new ATi FireGL X1 that the review is using, not the extremely old and outdated Diamond FireGL1.

  • I Love it when a Writer decides to capitalize Random Words when writing an Article.
  • Where exactly are they getting "new" from? The FireGL X1 card may as well have cobwebs on it. The current workstation cards being pushed by ATI are the FireGL X2 and FireGL T2 (the X2 being high-end as the X1 was, the T2 being targeted at the budget market). Claiming "NVIDIA still has a stronghold in this market" is deceptive at the very least. Would you find a CPU benchmark accurate if it compared an Athlon XP 3200+ with a Pentium 4 @ 2.4GHz and concluded that AMD was leading the market?
  • 3Dlabs (Score:4, Insightful)

    by Distinguished Hero ( 618385 ) on Thursday August 21, 2003 @07:51PM (#6760852) Homepage
    They should have also benchmarked the latest 3Dlabs [3dlabs.com] cards in order to give us a proper frame of reference. For all we know, both these cards could be providing inferior performance compared to the latest Wildcat [3dlabs.com]; good gaming performance doesn't necessarily translate into a good professional video card.

    The Wildcats are also cheaper: $899 for the 512 MB VP990 Pro and $499 for the 256 MB VP880 Pro or the 128 MB VP970 (from the 3Dlabs eStore [3dlabs.com]) compared to $530 for the cheapest 128 MB ATi FireGL X1 and $1250 for the cheapest 128 MB nVidia Quadro FX 2000 (the 256 MB variant was used for benchmarking).

    Anyways, these aren't even ATi's and nVidia's top of the line cards; ATi's is the FireGL X2-256 and nVidia's is the Quadro FX 3000.
    • Re:3Dlabs (Score:2, Interesting)

      The (true) Wildcat 4 cards are much slower than the Quadro FX 2000 and Quadro FX 3000. They are also slower than the ATI FireGL products. The Wildcat VP products are slower than just about anything you put next to them. 3dlabs is no longer a significant player in the workstation graphics space.
      • That's news to me. Can you give me a reference to a Pro/E or 3DS benchmark site for this? I would be very curious. I am looking for the highest quality rendering with 1 million polygons and 8 lights. None of those NVidia tricks with reduced quality texturing and what not. Please, if you know of a good reference on this it would help me for real. Thanks.
  • To The Misinformed (Score:2, Informative)

    by palp ( 90815 )
    Since I'm seeing a lot of this, a note to the uninformed/misinformed who didn't RTFA or even much of the blurb:

    These are NOT cards for gaming, they are for professional graphics work. Very different market, so please refrain from telling us how you don't need a high end video card or don't play video games. It's of no consequence.
    • Since I'm also seeing a lot of this, a note to the uninformed/misinformed who did RTFA or even much of the blurb, please don't point out the obvious.

      It's about as redundant as asking if a one-legged duck swims in circles.
  • by DraconPern ( 521756 ) on Thursday August 21, 2003 @08:02PM (#6760942) Homepage
    I don't see dual monitor setups mentioned in the article. Does ATI's output quality stand up to NVidia's?

    I have a Radeon 8500, and I can tell you that ATI has some serious issues with output signal quality. On my main CRT monitor, I can still see occasional shearing and small display glitches. The 2nd monitor quality was even worse. I am using a PCI TNT card to get 2nd monitor support.

    Judging by the picture of the ATI card, the second DVI connection may have problems. It is on an extra board, so there is not a continuous trace, which can introduce all sorts of problems (like contact resistance, oxidation, etc.). Yes, it is a digital signal, but it's like using an IDE ribbon cable with really short wires. You are going to get all sorts of problems...
  • by SuperBanana ( 662181 ) on Thursday August 21, 2003 @08:25PM (#6761091)

    HotHardware

    Um, pardon me, but...who?

    Call me when you've got benchmarks from a real magazine (say a CAD/CAM, 3D graphics and/or animation, etc. related magazine), and not two-guys-in-a-dorm-room-who-write-reviews-for-kickbacks websites who run Unreal Tournament to benchmark professional graphics cards.

    Case in point is their 'testbed' system: they used a "DFI LAN Party 875Pro" motherboard. They used Pentium 4's instead of workstation-class Xeon processors. IDE drives instead of SCSI. Folks, that's NOT a "workstation". A dual Xeon cHomPaq is a workstation.

    Oh, and the benchmarks? One no-name benchmark, and 3D Studio Max. Oh, and Unreal Tournament. No fill rates, no polygon counts, no NOTHIN. No mention of Linux, which is tearing into the market like crazy among top computer animation houses.

    This is pathetic- they're just a bunch of guys who compile daily linkages to other cheeseball review sites. They have no industry background, no experience, no nothing...just a P4 3GHz and a (probably pirated) copy of 3D Studio Max.

    • Solidworks is certainly not a no-name program; I'm not even a mechanical engineer (its primary market), and I've known what solidworks is for quite a long time...
    • One no-name benchmark, and 3D Studio Max.

      No name???
      Guess you don't do much with CAD/CAM; SolidWorks is one of the most featureful CAD apps out there, and its usefulness is second only to possibly CATIA. At my last job the physical design guys modified their AP encasement after running SolidWorks simulations which pointed out non-optimal heatflow from the CPU to the case exterior. They built up the case from components whose exact thermal, electrical and other properties were in the materials database.

      You
    • What is the difference between current (as in P4 architecture) Xeons and normal P4s?

      Answer to rhetorical question: Xeons do multi-processor and do not currently run on the 800MHz bus.

      That's it. Just recently Intel did release one with double the cache (1MB), but any Xeon slower than a 3.06 is a 512k, just like the normal P4.

      A normal P4 is fine for a workstation, in fact Intel notes it as being a workstation chip. Given the higher memory bus it can even be faster for some tasks. Xeons are basically only
    • SCSI drives only have significant benefits when used in a server environment with numerous portions of the disk being accessed in rapid succession. When you're saving to a few files within a 3D graphics creation environment, IDE is fine (which is why there has been a huge decline in SCSI usage across the board, especially with the advent of SATA and good SATA drives like the WD Raptor).

      Someone else pointed out that SolidWorks is by no means a "no-name" benchmark. I'll add that fill rate and polycount be
  • Peny wars (Score:2, Insightful)

    by Anonymous Coward
    do we really need these video card peny wars on Slashdot? is this "stuff that matters" by any account?

  • Professional cards (Score:4, Interesting)

    by Ian-K ( 154151 ) on Thursday August 21, 2003 @08:36PM (#6761159) Homepage
    While this may be brushing 'redundant/offtopic', I have to say that getting one of those may cost you a bit more, but it's much nicer than a consumer graphics card.

    What the author fails to mention is that there's better R&D (build quality?) put in there. Not just application-specific optimisations. If they *had* tested the consumer equivalents, they'd see them outperformed, methinks. That's my experience, anyway.

    Back in '98 I had a Diamond FireGL 1000 Pro (yes, the FireGL series was owned by Diamond then), which was matching/outperforming many 'new' gaming cards my mates were buying (it was a fairly old model at the time, IIRC). Thing is, I hadn't paid a fortune for it, as you might think. It was a bit expensive, but not *that* different from what my mates were paying.

    Now I have a FireGL 8800 and again the performance is there. Gaming-wise, I can play GTA3 and CMR3 at resolutions previously undreamt of with the 9500 (1600x1200).

    Having said that, it's a pain to get (linux) support by ATI. Ever tried emailing them? Up 'till March (IIRC) things were OK and they even had good drivers. But now it's all shaky and iffy, as we all know.

    Now I'm looking for a 3DLabs/NVidia. The former are increasing their linux support (I even recall a /. article on it), while the latter have been traditionally good with it.

    It would have been very interesting if they'd included the VP990 Pro or the VP970 in the comparison...

    Trian
  • by archen ( 447353 ) on Thursday August 21, 2003 @08:51PM (#6761252)
    Other benchmarks include a leaf blower and a flaming jar of gasoline. The leaf blower actually did quite well in the noise tests against the Nvidia card but lost out due to the fact that it consumes less power than the video card. Ati unfortunately did not fare as well, and lagged behind with the noise factor. An ATI spokesman recommended that the card be coupled with a "cheap ass Athlon CPU fan" which would develop a good rattle to help the card become more obnoxious.

    The jar of flaming gasoline also did pretty good in the heat department against both cards, but unfortunately had to be refilled which was considered a drawback. Aside from that the life like animation the fire produced only ran at one frame rate, but was always consistent. Unfortunately the jar of gas lost out big time in the cost arena, but it seems that can be compensated for by tossing $1 bills into the flames at various intervals to get the costs up higher.

    Some wondered why we didn't benchmark a toaster as well, or instead of a jar of gasoline - but as we pointed out before, a toaster is far too practical to compare in a contest of flushing money down the toilet.
  • by Raven42rac ( 448205 ) * on Thursday August 21, 2003 @08:58PM (#6761308)
    How often does it fsck up a render? With consumer cards, who cares if you mess up a render, because it may just be a temporary jaggy; they just want to be all-out speed demons. But with these corporate cards, a messed-up render could be a misplaced weld, or something along those lines.
  • I am a CAD administrator, and use several different CAD packages. The problem I have with most graphics cards now isn't performance - it's accuracy. When you zoom in on an intersection and the lines "move" at different zoom levels, it becomes impossible to know which surface is which.

    I have had this problem with Quadro cards. I have not had a chance to try ATI cards. I have had the best results with older 3Dlabs cards (GX1 Pro and GMX 2000). Those cards did not offer the fastest performance, but were bett
  • by Alpha_Nerd ( 565637 ) on Thursday August 21, 2003 @10:49PM (#6761901)
    There's no need to actually buy a FireGL X1, as you can easily soft-mod a Radeon 9700 Pro into one. Instructions can be found here. [nvworld.ru]
  • Could someone with recent ATi 3D products say a few words about the current state of their Linux drivers? Any glitches? Stability? Do they provide OpenGL headers for their libraries? Is TV-out functional? etc...

    Personally I'm still a content Nvidia user, solely due to their drivers; they even run smoothly on top of 2.6.0-test3 (with the minion.de patch). _But_ I'm seriously thinking about ATi the next time around, which is around the corner as Doom 3 comes out, also for Linux as you all know.
  • HotHardware, where they test pro graphics cards with games... cool...

    Now, for those who want a rather better review of the FireGL X1, QuadroFX 2000 and FireGL Z1, compared against 6 other pro boards (including the 3DLabs Wildcat VP970), Tom's Hardware has a nice one, dated March 21st (so not only is HotHardware's review far from complete, it's much later, too):

    Tom's Hardware FireGL X1 vs QuadroFX 2000 Review [tomshardware.com]

    Have fun...
  • What interests me is, if we keep getting better and better cards like this, will we one day get games which look so good as to be indistinguishable from reality (albeit still on a screen)? I certainly hope so, because when/if this happens, games companies will have nowhere to go with graphics and will actually have to give more focus to making games more enjoyable - fun to play instead of just flash - whereas the emphasis these days tends to be on graphics that take advantage of graphics card feature X.

"I am, therefore I am." -- Akira

Working...