NVIDIA's New Pro Graphics Quadro FX 3000 Reviewed 222

SpinnerBait writes "NVIDIA recently took the wraps off their next-generation workstation graphics card, the Quadro FX 3000. This card is based on the same general GPU architecture as the NV35 but optimized for CAD and DCC applications. This article over at HotHardware shows what the new Quadro FX 3000 is capable of, and it makes a strong showing. However, you've got to pay to play (or work) on this card, that's for sure."
  • by Hayzeus ( 596826 ) on Thursday September 11, 2003 @02:20PM (#6934857) Homepage
    But once I see FOUR-dimensional acceleration, /then/ I might upgrade.
  • by qmrq ( 648586 ) <qmrq@hotmail.com> on Thursday September 11, 2003 @02:21PM (#6934876) Journal
    I've yet to find anything that gives my Quadro4 a real workout... I'll stick with what I have for now.
  • Optimized (Score:4, Funny)

    by mopslik ( 688435 ) on Thursday September 11, 2003 @02:22PM (#6934881)

    ...but optimized for CAD and DCC applications.

    That's funny, I thought nVidia was "optimized" for 3DMark2003.

  • Games! (Score:2, Funny)

    by Ikeya ( 7401 )
    Yeah, yeah... So it's good at doing work. That's boring.
    What we really wanna know is how many FPS can we get in Quake?!
  • Dual DVI output: Drives two independent digital displays at 1600x1200, or one at 3840x2400.

    Nice.
  • crazy (Score:2, Interesting)

    I can't believe how powerful video cards are getting. Is it just me, or has the computer industry gone a little too far with the whole "bigger is better/more-more-more" consumerist mantra? I mean, currently video cards render full color frames faster than the human eye can perceive. A cheap PC has enough RAM to store the entire contents of the Library of Congress many times over. Most commodity hard drives are in the 100+ GB range. And yet the hardware producers continue churning out bigger, faster machines, as if anyone will ever need a 10GHz CPU. When will enough be, finally, enough?
    • Who let the non-geek on here?

      just kidding buddy. But you should know... you need p0w3r for t3h sake of p0w3r.

    • Yeah, but they are also getting more and more expensive. On the other hand, though, someone must be buying them.
    • Re:crazy (Score:3, Informative)

      by pheared ( 446683 )
      And yet the hardware producers continue churning out bigger, faster machines, as if anyone will ever need a 10GHz CPU. When will enough be, finally, enough?

      It's because people keep using more and more things like perl and Java.

      OK sorry, cheap shot. :-)
      • You say cheap shot, but I agree. Java, .NET, and any other programming language that runs by relying on a "runtime environment" is definitely going to eat up CPU time. Yes, it's easier to program with, but it's the reason hardware needs to be faster and faster all the time.
        • >and any other programming language that runs by
          >relying on a "runtime environment"

          You mean like C and C++? That msvcrt.dll actually stands for something, you know. Microsoft Visual C Run-Time. IIRC Linux has glibc, Mac OS X probably has glibc too.

          If you were to program a Win32 application without msvcrt, you'd probably find, say, your malloc (using GlobalAlloc directly) to be a bit slower, because msvcrt's malloc is a bit better optimized for common usage. That, and having a malloc implementation p
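          For the curious, here's roughly what the parent describes, done by hand: a minimal allocator built directly on Win32's GlobalAlloc instead of msvcrt's malloc. my_malloc and my_free are made-up names, and this is an illustrative sketch, not a drop-in replacement.

          /* Hypothetical allocator that bypasses the CRT and calls the OS
           * directly. GMEM_FIXED makes GlobalAlloc return a directly usable
           * pointer, so no GlobalLock call is needed. */
          #include <windows.h>
          #include <stdio.h>
          #include <string.h>

          static void *my_malloc(size_t size)
          {
              return (void *)GlobalAlloc(GMEM_FIXED, size);
          }

          static void my_free(void *p)
          {
              GlobalFree((HGLOBAL)p);
          }

          int main(void)
          {
              char *buf = my_malloc(64);
              if (buf != NULL) {
                  strcpy(buf, "allocated straight from the OS heap");
                  printf("%s\n", buf);
                  my_free(buf);
              }
              return 0;
          }

          As the parent says, the CRT's malloc layers small-block caching on top of calls like this, so raw GlobalAlloc tends to be slower for lots of little allocations. (And yes, printf drags the CRT right back in; it's only there to show the result.)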
    • A cheap PC has enough RAM to store the entire contents of the Library of Congress many times over.

      I am not really sure how big in megs or gigs the LOC is, but I really doubt that this statement is true. The LOC is always passed around as a joke; anyone know the actual size?
      • Re:crazy (Score:2, Informative)

        According to this [yahoo.com], 20 terabytes. Now please direct me to the store that sells cheap PCs with 20 terabytes of RAM.
        • even Apple's new G5 models can only address 42 bits of memory (4 TB theoretically, limited substantially by the absence of 512 GB DIMMs)... But since he also speaks of a 10GHz CPU, perhaps he knows of a hidden stash of future technology.
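          Taking both figures at face value, the back-of-the-envelope works out like this (a quick C check, nothing more):

          /* 42-bit address space vs. a ~20 TB Library of Congress. */
          #include <stdio.h>

          int main(void)
          {
              unsigned long long tb  = 1ULL << 40;   /* one terabyte */
              unsigned long long g5  = 1ULL << 42;   /* 42-bit addressing */
              unsigned long long loc = 20 * tb;      /* ~20 TB estimate */

              printf("42-bit address space: %llu TB\n", g5 / tb);  /* 4 */
              printf("Library of Congress:  %llu TB\n", loc / tb); /* 20 */
              printf("full copies in RAM:   %llu\n", g5 / loc);    /* 0 */
              return 0;
          }

          So even the roomiest desktop of the day falls a factor of five short of holding one copy, never mind "many times over."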
    • Re:crazy (Score:4, Interesting)

      by Verteiron ( 224042 ) * on Thursday September 11, 2003 @02:30PM (#6935018) Homepage
      You may not need a 10GHz CPU. I know I don't, at least not yet. But as interfaces improve and software becomes more complicated, hardware like this may well be a requirement. After all, monitor images are still nothing like as detailed as a printed page, and even a very high-end monitor has only a fraction of the resolution that can be produced by a $100 printer. It's going to take serious hardware to achieve that kind of image clarity in real time. 10GHz CPUs may be just the beginning.
      • You may not need a 10GHz CPU. I know I don't, at least not yet. But as interfaces improve and software becomes more complicated, hardware like this may well be a requirement. After all, monitor images are still nothing like as detailed as a printed page, and even a very high-end monitor has only a fraction of the resolution that can be produced by a $100 printer. It's going to take serious hardware to achieve that kind of image clarity in real time. 10GHz CPUs may be just the beginning.

        Maybe. But some tech
        • Kind of off-topic, but seeing "impossibly complex" and X-Window in the same sentence reminds me of those one-line descriptions of XFree like "You need X like a fish needs a bicycle." Can someone please post a link to a comprehensive source for these? I think there are some for Emacs, among others, as well, so please post links to any similar set of quips if you've got them.
    • Re:crazy (Score:3, Interesting)

      by 2nd Post! ( 213333 )
      When someone designs something that uses all of it.

      I own a Mac, so the situation is slightly different, but as a user of iMovie, iDVD, and Final Cut Express, I can use *all* the RAM, CPU, and HD space I can get. Two hours of footage take 30 GB of raw storage. Rendering 5 minutes of video takes 30 minutes. iPhoto takes 1.2 GB of RAM.

      So for me, it's not enough. I need more.
      • I own a Mac, so the situation is slightly different...

        Yeah, Mac owners can use up all the resources of their machines with a fast Tetris level ;).

    • books? come on now! 1 gig for the OS and 99 gigs for porn.
    • The big plus in newer cards, from my point of view, is that you can now turn on full-scene anti-aliasing (FSAA) and still be able to play games. That wasn't the case a few years back.

      FSAA can make scenes *beautiful*. Image quality is improving at the same time as framerate (not necessarily on the same platform, though...).
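      For reference, asking for FSAA from an application of that era looks roughly like this. A minimal GLUT sketch; whether you actually get multisampled buffers is up to the driver, and the control panel can force it either way.

      /* Request a multisampled (FSAA-capable) framebuffer via GLUT and
       * enable multisample rasterization via ARB_multisample. */
      #include <GL/glut.h>

      #ifndef GL_MULTISAMPLE_ARB
      #define GL_MULTISAMPLE_ARB 0x809D
      #endif

      static void display(void)
      {
          glClear(GL_COLOR_BUFFER_BIT);
          glBegin(GL_TRIANGLES);      /* the edges here are what FSAA smooths */
          glVertex2f(-0.8f, -0.8f);
          glVertex2f( 0.8f, -0.7f);
          glVertex2f( 0.1f,  0.9f);
          glEnd();
          glutSwapBuffers();
      }

      int main(int argc, char **argv)
      {
          glutInit(&argc, argv);
          /* Ask for a double-buffered, multisampled visual. */
          glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_MULTISAMPLE);
          glutCreateWindow("FSAA sketch");
          glEnable(GL_MULTISAMPLE_ARB);   /* enable multisample rasterization */
          glutDisplayFunc(display);
          glutMainLoop();
          return 0;
      }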
      • FSAA doesn't make scenes beautiful, it makes them blurry and slower. Sharpness > Jaggies.
        • I dunno, when I compare a game with FSAA on vs off, I'm most impressed with the "on" version. They look way, way better. It's getting to the point that relatively inexpensive hardware can do it without letting the framerate drop to unacceptable levels (for me, 40FPS is the line).
    • A cheap PC has enough RAM to store the entire contents of the Library of Congress many times over.

      From http://www.loc.gov/about/:

      It is also the largest library in the world, with more than 126 million items on approximately 530 miles of bookshelves. The collections include nearly 19 million books, 2.6 million recordings, 12 million photographs, 4.8 million maps, and 56 million manuscripts.

      If there is about a meg of info in each of the 19 million books, that's roughly 19 terabytes for the books alone, which squares with the 20 TB figure cited elsewhere in this thread.

    • Re:crazy (Score:3, Insightful)

      by niko9 ( 315647 )
      Is it just me, or has the computer industry gone a little too far with the whole "bigger is better/more-more-more"

      I mean, currently video cards render full color frames faster than the human eye can perceive

      Currently, I am addicted to Desert Combat, which is a mod for Battlefield 1942. I constructed a top-of-the-line gaming system just for this game: a 3.0 GHz Pentium and a Tyan 9700 Pro, 1 Gig 'o RAM.

      The game is buttery smooth most of the time, even with AA and AF on and at a resolution of 1600x1200.

      But, t
    • 1. Games. Falls into new and shiny anti-aliased toy department.
      2. Multimedia. Encoding and editing stuff still can make a powerful computer cry.
      3. Compiling. Those things can never be too fast.
      4. Poor programming. Some newer applications are just resource hogs, and it's partially due to crappy programming/design.
      5. Never, ever underestimate the amount of pr0n which can be obtained. NEVER.
    • When will enough be, finally, enough?

      Never. (Trivial answer, but true)
    • Many others have replied to you, so I'll be brief: we need to constantly improve technology, making bigger-and-better systems, so that we will reach the singularity as fast as possible.

      Why? Because 50 million people will die this year. They don't have to die; once we reach the singularity our computers will be able to solve problems faster than we humans can (and the rate of problem-solving will increase, as those machines build stronger and faster successors), eliminating old age and disease and the need

  • by Eros ( 6631 )
    Okay, I know what a CAD program is, but what is a DCC application?
  • Does anyone know how their drivers for Linux are licensed? Greetings, Norbert.
    • Does anyone know how their drivers for Linux are licensed?

      The usual closed source stuff is the best I would hope for. Surely I would prefer a GPL driver, or at least specifications released, so somebody else could write it. But I was afraid they might have gone the other way and just released a closed source driver for Windows, and no specs. Luckily that seems not to be the case (yet). Still, your question is important, much more important than so much other stuff, but unfortunately I don't know the
      • The Linux drivers will be around for a while. A lot of high-end companies (like ILM) run their graphics workstations on Linux/NVIDIA. Since NVIDIA is the only high-end graphics company with quality Linux drivers, they're pretty much the only option in the market.
        • Since NVIDIA is the only high-end graphics company with quality Linux drivers, they're pretty much the only option in the market.

          The quality is being questioned. NVIDIA drivers have been blamed for some kernel crashes. The lack of options is the major reason we don't want to see those drivers go. But that is not necessarily what counts for NVIDIA. How large a fraction of the NVIDIA cards sold are being used with Linux? That is important to NVIDIA.
          • by be-fan ( 61476 ) on Thursday September 11, 2003 @10:09PM (#6939309)
            In random configurations, yes, there are issues with NVIDIA's stability. In closed configurations, there are not. I've never had a problem on any of my systems with NVIDIA's drivers, so it's quite possible to build a system that runs rock solid with NVIDIA's drivers. Companies that are using NVIDIA on Linux can use these configurations, so that's not a problem.

            As for Linux's importance to NVIDIA, remember three things:

            1) Being the only option for high-end 3D on Linux is a big boon to them. A lot of computer graphics companies are moving to Linux (ILM, as I mentioned, runs Linux on almost all of their graphics workstations). These users are a very profitable market segment for NVIDIA's high end hardware (like the Quadro this article is about).
            2) Being the only option on Linux is great for publicity. Gaming geeks are at least aware of Linux, and many have fooled around with it. Running on Linux is nice PR towards this segment.
            3) Maintaining these drivers doesn't cost NVIDIA much. Their driver architecture is highly modular, so being cross platform is relatively inexpensive.
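            On point 3, the usual way drivers get that kind of modularity is an OS shim: the cross-platform core calls the host OS only through a small table of functions, so a port mostly means rewriting the glue. A hedged sketch of the pattern; the names here are hypothetical and not NVIDIA's actual interface.

            /* Core driver code sees the OS only through this table. */
            #include <stdio.h>
            #include <stdlib.h>

            struct os_interface {
                void *(*alloc)(size_t n);   /* wraps kmalloc, ExAllocatePool, ... */
                void  (*free_)(void *p);
                void  (*log)(const char *msg);
            };

            /* The cross-platform core never calls the OS directly. */
            static void core_driver_init(const struct os_interface *os)
            {
                void *scratch = os->alloc(4096);
                os->log(scratch ? "core initialized" : "allocation failed");
                os->free_(scratch);
            }

            /* Per-OS glue: the only part rewritten for each platform. */
            static void host_log(const char *msg) { fprintf(stderr, "%s\n", msg); }

            int main(void)
            {
                struct os_interface host = { malloc, free, host_log };
                core_driver_init(&host);
                return 0;
            }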
    • Binary only under a proprietary license. They're decent, though. On par with the Windows counterparts, and latest hardware tends to be supported. No (official) 2.6 support yet, though. There's a patch [minion.de], but I haven't had much luck with it.
  • by Serapth ( 643581 ) on Thursday September 11, 2003 @02:23PM (#6934903)
    I have never understood how this breed of cards exists to this day. Really... the difference between a "stock" GeForce and a workstation class Quadro GeForce... just doesn't justify the cost difference anymore.

    When you go back about 3 or 4 years, when you contrasted an Oxygen video card or a FireGL vs. a TNT or 3DFX card, you could see where the extra money went. But now, today's commercial-grade video cards are more than capable. In fact, a lot of people I know that work as graphic artists use traditional Radeon or GeForce 4 cards in their workstation machines. Outside of, say, Pixar, I just don't understand people buying the workstation-class cards.

    Now, to go back to this argument... it's nice to see that nVidia managed to get rid of the vacuum-cleaner-sized fan!
    • Did you read the article? Some of the benchmarks show that the workstation cards do the job as much as three times faster in some tests than the fastest gaming card from NVIDIA.

      If you need this speed, then you pay for it. Paying say $2k for a video card is not much if it makes your employees more productive.
      • For a little perspective, here's a quote from one of NVIDIA's driver authors:

        I use a Quadro FX 3000 with 256MB of VRAM, which is the absolute high end of NVIDIA's professional graphics series. It's basically a souped-up FX 5900 core manufactured for the CAD, DCC and visualization markets. There are many decent used vehicles that cost less than a Quadro FX 3000.

        Hmmm. A car or a video card.... For me it's an easy question to answer, but for work the equation works out a bit differently. There the CAD engin
    • Now, to go back to this argument... it's nice to see that nVidia managed to get rid of the vacuum-cleaner-sized fan!

      What do you mean? The fan is huge on this thing. I've had one for the past month or so and it is a nice card, but man, it's a total hog. The card requires you to plug it into a hard drive power connector because it can't suck enough juice off the AGP slot. It also requires that the PCI slot adjacent to the AGP slot remain open for cooling.

      I find that price is usually commensurate with the
    • Do you really know what you are talking about? Workstation-class cards are made for stability, game cards for performance or price/performance. See this review [3dchips.net]: using CAD applications with a game card will lead to crashes and freezes.

      Comparing workstation graphics cards and gaming graphics cards is like saying you can put your white box PC on a high-speed connection and use it as a server. Why do you think people still buy IBM servers with Pentium IIIs when the Pentium 4 costs the same? Because they want stab

    • I have never understood how this breed of cards exists to this day. Really... the difference between a "stock" GeForce and a workstation class Quadro GeForce... just doesn't justify the cost difference anymore.

      Just take a model of some hundred thousand triangles, and render it as wireframe. The gaming card ("stock" as you call it) will just slow down to a crawl. Add a few clipping planes, and the frame rate gets even worse. But the Quadro cards do indeed cater to this common usage in CAD, and charge ac

    • Yah, I looked at the high-end cards and looked at what software I would be using that would push it, mostly popular first-person-shooter games, and decided on the NVidia GeForce4 Ti 4200 128MB. It's the most bang for the buck. Hell, running on a P4 2.0 GHz with 1GB RAM, any game would be happy.
    • I have never understood how this breed of cards exists to this day. Really... the difference between a "stock" GeForce and a workstation class Quadro GeForce... just doesn't justify the cost difference anymore.

      Well, a specialist model requires retooling of the manufacturing process, which costs a fixed amount of money, and you sell fewer of them, so the added cost per unit produced is higher than you'd expect from the extra features alone.

      Of course, if nvidia would churn out Quadro models by defau
  • Why is it that I have to pay more for a video card alone than I have to pay for an entire gaming console?

    I know PC gamers are very die-hard (I'm a little more casual than most), but do you ever stop and look at the damn COST of the hobby? It's ridiculously expensive.

    I can understand why people want to have the best boxes out there to play UT2003, but do you ever question how much you are spending? You've got to stop and think about it sometimes.

    I'm just figuring off the top of my head, but I'd bet
    • (Slaps forehead)

      No wonder I have three computer systems in my house!

      1) 1.1 GHz, 500 MB RAM
      2) 2.2 GHz, 1.5 GB RAM
      3) 4.4 GHz, 1.0 GB RAM

      The funny thing is, I have never been into console games. Reason is, I also do a lot of net research, computer graphics, coding JavaScript (C and Java). System number 3 is my main gaming system; I use my others for my research and other work. Hence, I have tended to "accumulate" systems since the early 1990s. Not just for upgrading for games, but also for faster comp

    • It's not a gaming card; it's primarily for CAD/CAM stuff. It's loosely based on an older gaming card, and that's why they decided on a whim to try out UT2003, etc.
    • >>Any hardcore gamers like to give their opinions? I'm very curious why people would drop SO MUCH money for cards like this...

      RTFA, this is not a gaming card, it's a workstation card and the drivers are tweaked for such apps as 3DSMax and Photoshop rather than Quake and UT. It's NOT for the average gamer and it's NOT for the average desktop user. It's for a very niche market, nothing more. That question is sort of like asking why people drop $10k+ into an SGI workstation that is clocked lower and per
    • I personally never pay much more than $100 for a video card. I look at what's available at that price point and decide if it fits my needs, every time I've looked it has =) Besides I buy games that keep me interested for years instead of months like most console offerings. I have played Moo3, Diablo2, Diablo, Civ, MoM, HoMM3, and several other computer games many years after their release, Soul Caliber and Dragon Warrior are the only console titles I can say that about. My annual outlay for my PC including
    • Why is it that I have to pay more for a video card alone than I have to pay for an entire gaming console?

      Yeah, but the games for a console are expensive, while games for the PC are free!

  • by FileNotFound ( 85933 ) on Thursday September 11, 2003 @02:23PM (#6934918) Homepage Journal
    Having RTFA, I am surprised that they liked the card.

    I mean, how can you say that the Quadro is a good card when it costs six times more than the competition and is less than 10% faster?

    Six times the price for 10% more speed? No thanks.
    • People like Macs for the same reason.

      Call them fans/zealots/cheerleaders, whatever.

      I'm sure someone will tell you how super-fantastic nVidia's drivers are and how mega-gay ATI's are. (In my experience it's been exactly the opposite, but such is life)

      Just like someone will no doubt tell me how rootin' tootin' fantastic OSX is, and how it's worth a $2000 difference in price.
    • Another way of looking at it.
      If the competition card is rated at a speed of 10, then this card goes to

      11!!!

      The movie was Spinal Tap, for all those wet-behind-the-ears kiddies...

      • Thing is, this is the business market.

        It's often OK for home users to pay $400 for a card that's only marginally faster than a $150 card because they want "The fastest card in the world" in their PC.

        This is not the case in the business world. Nobody cares if your employees have "the fastest card in the world" in their PCs. They care if you have "the best ROI in the world!"
    • by JungleBoy ( 7578 ) on Thursday September 11, 2003 @03:29PM (#6935852)
      There IS a reason, but it's a high-cost setup. The Quadro FX 3000 can do Dual DVI at 1600x1200 on each screen. Very few cards can do dual DVI at this resolution. In fact, the only other one I know of is the Quadro4 XGL 900 (which I use to drive my ViewSonic VP201mb LCDs). Many high-end game cards can now do insane dual-monitor resolutions, but only dual analog is supported; or if they do support dual DVI, they only support it at 1280x1024. I think the Matrox Parhelia can do Dual DVI @ 1600x1200, but I bet UT2k3 doesn't run as well as on my Quadro. And with 3200 pixels of width, you can set a nice high field of view.
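      To put a number on that field-of-view remark: if a game holds the vertical FOV and widens the horizontal to fill the aspect ratio (the hfov = 2*atan(tan(vfov/2)*aspect) relation; whether a given engine behaves this way is another matter), a 90-degree view at 4:3 opens up to about 127 degrees across a 3200x1200 span. A quick C sketch of the arithmetic:

      #include <math.h>
      #include <stdio.h>

      #ifndef M_PI
      #define M_PI 3.14159265358979323846
      #endif

      static double deg2rad(double d) { return d * M_PI / 180.0; }
      static double rad2deg(double r) { return r * 180.0 / M_PI; }

      int main(void)
      {
          /* Vertical FOV implied by 90 degrees horizontal at 4:3. */
          double vfov = 2.0 * atan(tan(deg2rad(90.0) / 2.0) / (4.0 / 3.0));

          /* Same vertical FOV stretched across 3200x1200 (8:3). */
          double hfov = 2.0 * atan(tan(vfov / 2.0) * (3200.0 / 1200.0));

          printf("vertical FOV: %.1f deg\n", rad2deg(vfov));   /* ~73.7  */
          printf("dual-screen:  %.1f deg\n", rad2deg(hfov));   /* ~126.9 */
          return 0;
      }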
      • There IS a reason, but it's a high-cost setup. The Quadro FX 3000 can do Dual DVI at 1600x1200 on each screen.

        Are CAD and DCC users that have two monitors going to be using LCDs instead of CRTs? Perhaps LCDs are much improved now, but I remember "color fidelity" issues and response time being considerations.
      • Pretty much all cards with DVI connectors can do 1600x1200 per connector nowadays, though it should be mentioned that all consumer ATI Radeon cards only have one DVI connector. However, all newer Matrox cards (Parhelia, P650, P750) as well as some NVIDIA consumer cards, including the dirt-cheap GeForce4 MX (not all of them have two DVI ports, of course, but some do), can easily do 1600x1200 with both DVI connectors.
        However, the FX2000 (and FX3000, but no older NVIDIA workstation graphics cards) indeed have an adva
      • Ok, maybe I'm a noob, but what good is an FPS game with two monitors? Wouldn't your view be centered where the two monitors meet?

        It would seem to me like only an odd number of monitors would be good for playing games...
  • by lateralus_1024 ( 583730 ) <mattbaha@gm a i l . com> on Thursday September 11, 2003 @02:25PM (#6934940)
    One has to wonder how well it can run Police Quest. Time will tell. I'm sure NVidia is dodging this test.
  • The review is all well and good, but what framerate do I get when I run Half-Life 2?
  • I like it a lot [hothardware.com]!
  • by LxDengar ( 610889 ) on Thursday September 11, 2003 @02:57PM (#6935378)
    Max 4.2 as a benchmark? Please, move up to 5.1 at least, and get XSI and Maya in there (although admittedly Maya can be a troublesome beast for new graphics cards). It's a workstation card; run it against workstation apps.

    I appreciate the effort with the SPECopc benchmarks, but review sites need to put more effort into testing a card like this. I cannot see what the reviewers were thinking by not putting this card up against what its price point is aimed at - specifically Wildcat cards.

    • They likely couldn't afford the X-thousand-dollar license for Maya, let alone one for AutoCAD, SolidWorks, etc. The best place to get info on what video card to purchase is from your software vendor. All of the CAD/DCC companies have a short list of recommended cards, which are the ones they test the most with and which are best supported by their products. You're spending big cash on the software and the talent; throw a little time and money at getting the right hardware.
  • OK, graphics cards are finally where we want them to be on price/performance. Now what I need is the same value in video projection systems, like 2000 lumens for $200. Because until you have experienced X-Plane in 8-by-10-foot format, you have not experienced X-Plane.

    Porn would also be an incredible experience at that scale, yes?
    • Nah, almost all p0rn is shot with equipment like Sony Handycams. Low budget rules the sleazy underworld, so projecting a lousy picture on a big screen just makes you realize how poor the quality is. It's even worse with DivX rips, ugh. But I have to agree with you, 2,000 lumens for even $500 would be great.
  • by bombom ( 168256 ) on Thursday September 11, 2003 @03:19PM (#6935674)
    I work as a CAD developer and you won't believe how much Quadro cards can outperform regular GeForce cards (don't get me started about ATI and their crappy drivers for the FireGLs).

    The single biggest limitation of the GeForce cards is that they are optimized for one window. A Quadro card, OTOH, can have multiple windows open in a 3D CAD program (e.g., GeForce 4s choke after having 3-4 moderately complex parts open, while a good Quadro4 can handle 12-15 windows, no sweat). AutoCAD is pretty light on the GFX card, so an older Quadro might suffice, but something like Inventor or SolidWorks or Catia can really benefit from these cards. If one of your engineers sees a 20% speedup generating drawings of a complex assembly, or a 10% speedup rotating a gear assembly, $3K is money very, very well spent.
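    For a feel of that multi-window workload, here is a toy GLUT sketch that opens several OpenGL windows at once, each with its own context, the way a CAD session keeps many part views live. The window count and the teapot stand-in are arbitrary choices; this is an illustration, not a benchmark.

    /* Open NUM_WINDOWS GL windows, each with its own context. */
    #include <GL/glut.h>
    #include <stdio.h>

    #define NUM_WINDOWS 8

    static void display(void)
    {
        glClear(GL_COLOR_BUFFER_BIT);
        glutWireTeapot(0.5);   /* stand-in for a moderately complex part */
        glutSwapBuffers();
    }

    int main(int argc, char **argv)
    {
        char title[32];
        int i;

        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
        for (i = 0; i < NUM_WINDOWS; i++) {
            sprintf(title, "part view %d", i + 1);
            glutCreateWindow(title);    /* each call makes a new GL context */
            glutDisplayFunc(display);   /* registered per current window */
        }
        glutMainLoop();
        return 0;
    }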
  • At the latest Siggraph conference in July, there were quite a number of papers describing systems that used these new graphics cards in novel ways. There is a tremendous amount of horsepower available, but it is often not obvious how to apply that horsepower to your particular computation.

    Graphics cards seem to be increasing performance on a faster curve than CPUs these days, so people hope that the effort made in restructuring programs to use these graphics cards will pay off big down the road a year or
