
Affordable Workstation Graphics Card Shoot-Out 141

MojoKid writes "While workstation graphics cards are generally much more expensive than their gaming-class brethren, it's absolutely possible to build a budget-minded system with a workstation-class graphics card to match. Both NVIDIA and ATI have workstation-class cards that scale down below $500, a fraction of the price of most high-end workstation cards. This round-up looks at three affordable workstation cards, two new FireGL cards from AMD/ATI and a QuadroFX card from NVIDIA, and offers an evaluation of their relative performance in applications like Cinema 4D, 3D StudioMax, and SpecViewperf, as well as their respective price points."
This discussion has been archived. No new comments can be posted.

Affordable Workstation Graphics Card Shoot-Out

  • Um... I use my 'workstation' for spreadsheets and web browsing. The Dell integrated shared-system-memory controller is fine and didn't cost me 500 bucks.
    • Re: (Score:3, Interesting)

      Well, I guess you really don't have a clue what they are referring to when they say 'workstation'.
      • I'd hazard a guess that by "workstation" they might just mean people who make a living with graphics ;).
      • by bytesex ( 112972 ) on Wednesday February 06, 2008 @06:32AM (#22319198) Homepage
        And I daresay that he isn't the only one. The write-up is confusing, at best. Had me going for a bit, anyway. To ordinary people, even 'ordinary' slashdot-readers, a 'workstation' is some 'station' (a desk with a computer) that you do your 'work' on. These days that machine usually has an on-board graphics controller whose cost is folded into the price of the board, and it certainly isn't so expensive that a gamer's graphics controller would cost a 'fraction' of it. Complain all you like about lay (non-graphics) people reading topics that aren't meant for them, but to act as if the meaning here is logical, implicit, or otherwise self-explanatory is disingenuous, and not much different from those Slashdot write-ups that start off describing some event in Second Life as if it happened in real life and pretend that everybody knows what they're talking about. Clarity is king, no man is an island, and that sort of thing.
        • by drsmithy ( 35869 )

          To ordinary people, even 'ordinary' slashdot-readers, a 'workstation' is some 'station' (a desk with a computer) that you do your 'work' on.

          Truly, then, much has been lost...

          • by bytesex ( 112972 )
            Well, I've just looked at the wiki article somebody helpfully posted and I must apologize. I've been in this business for a good number of years now, but I'd never realized that there's an official distinction desktop -> workstation -> server. I mean, if you say it so explicitly, it kinda makes sense, but if I run all my server programs on my desktop (which I do), do I end up in the middle somewhere and do I now have a workstation? I always thought that desktops and workstations were the same thing an
        • Have you looked up 'Workstation' on wikipedia or any number of 'jargon' sites? The overwhelming interpretation is that a workstation is a high-end computer, with exceptional graphical or computational abilities.
        • To ordinary people, even 'ordinary' slashdot-readers, a 'workstation' is some 'station' (a desk with a computer) that you do your 'work' on.

          Well, even just a few years ago 'workstation' implied a computer that is more powerful than a normal desktop computer, or one designed for a specific purpose, such as 'Sun workstations' or 'CAD workstations'.

          Workstations typically have ECC memory and are just beefier all around (including cost) and are used by folks where time = money.

          (It's a marketing ter
    • by merreborn ( 853723 ) on Wednesday February 06, 2008 @02:48AM (#22318304) Journal

      applications like Cinema 4D, 3D StudioMax, and SpecViewperf
      You may not use those applications on your "workstation", but there are thousands of professionals who do

      Note that the term workstation usually means a high end system used for something a little more complex than web browsing and spreadsheets:

      http://en.wikipedia.org/wiki/Workstation [wikipedia.org]

      I believe the progression, marketing-wise, goes:
      Desktop -> Workstation -> Server

      You're thinking of desktop hardware/software.
      • by TheLink ( 130905 )
        "Desktop -> Workstation -> Server"

        Don't think that has been true for the past number of years - they're different breeds now.

        Servers nowadays don't have much video - really low end video.

        Whereas workstations have much better video cards, if only to be able to display stuff from the render farm at high enough resolution.

        And desktops are the cheap and nasty stuff where 1 in 10 (or 7) are dead on arrival :).
    • Spreadsheets and web browsing? How is that a workstation?

      I'm thinking more along the lines of CAD and EDA.
  • Difference? (Score:5, Interesting)

    by AdeBaumann ( 126557 ) on Wednesday February 06, 2008 @02:38AM (#22318252) Homepage
    I'm sure I'm not the only one who doesn't know... so can anybody explain the difference between a high-end workstation card and a high-end gaming card?
    • Re:Difference? (Score:5, Insightful)

      by Broken scope ( 973885 ) on Wednesday February 06, 2008 @02:53AM (#22318324) Homepage
      I think it works like this.

      Game cards are designed to render stuff as fast as possible, many times a second.

      Workstation cards are designed to render everything at the desired quality, however long that takes.
      • Re:Difference? (Score:5, Insightful)

        by prefect42 ( 141309 ) on Wednesday February 06, 2008 @04:21AM (#22318706)
        I'd say it's more complicated than that. Gamer cards push game graphics around fast. This often means high memory bandwidth for texturing, fast full-screen anti-aliasing, and these days fast shader performance. Workstation cards are often better at line anti-aliasing, much better with high-polygon-count work, and much better at working with multiple windows. Quadros always used to support more clipping planes in hardware, for example. How much of this is a real hardware difference, who knows.

        We've got a home-grown application rendering a 4 million polygon model. Quadro 4500 is an order of magnitude faster than a 7800 GTX. You wouldn't guess that from the tech specs.
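        A minimal fixed-function OpenGL sketch of the user clip planes mentioned above; the helper name is invented for illustration, and nothing here is specific to Quadro or FireGL hardware:

            #include <GL/gl.h>

            // Hypothetical helper: enable one user clip plane so geometry on the
            // far side of the plane is discarded by the pipeline.
            void enableSectionPlane()
            {
                // Plane equation Ax + By + Cz + D = 0; this one clips away z > 0.
                const GLdouble planeEq[4] = { 0.0, 0.0, -1.0, 0.0 };
                glClipPlane(GL_CLIP_PLANE0, planeEq); // transformed by the current modelview matrix
                glEnable(GL_CLIP_PLANE0);             // subsequent geometry is clipped against it
            }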

        • Quadros also have 'full' OpenGL compliance, whereas GeForces apparently don't (at least, that's what one of the engineers here claimed). I remember a tool a few years back for fooling your computer into thinking that your GeForce was a Quadro, but it wouldn't run at the same speed as a Quadro, so there must have been more parallelism for basic geometry rendering in the Quadros (as you point out). But when I tried Quadros for actual gaming in the Source engine a few years back, it sucked compared to my
          • Or it could have been because you were on a "graphics workstation", not just a regular "workstation". Took me forever to figure out what everyone was talking about.
            I'm a semi-pro CGI guy, and I'm getting a kick out of...
            sorry, wrong board.
            It's getting really weird in the world of CGI; most of the major, and some of the minor, graphics apps are letting you make use of your GPU during rendering; these cards HAVE to be designed with that in mind.
            But I'll admit I don't get it. Outside of rendering, most Major
            • Well, we do a lot of 3D CAD rather than CGI, though they're okay with 3DSM too, and you can use Inventor to render shiny images of your designs if you wish. We used to always get in Dell workstations, but more recently I found an independent company down in England who do nice workstations; they were basically the only place I found that had a choice of AMD processors back when Athlons were on the rise (these days you have a choice of Athlon X2s or Core 2 Duos for the basic workstations), and they also hav
    • Re:Difference? (Score:5, Informative)

      by kcbanner ( 929309 ) * on Wednesday February 06, 2008 @02:54AM (#22318330) Homepage Journal
      The workstation cards tend to have very low error tolerance, while the real-time graphics cards allow for quite a bit of error in the name of speed. This is fine unless you're rendering something.
      • Re:Difference? (Score:4, Informative)

        by Psychotria ( 953670 ) on Wednesday February 06, 2008 @03:41AM (#22318538)
        I am not sure "error tolerance" is the correct term; there is no "tolerance" (you are correct, though; I am just debating the term), it's just that the high-end workstation cards sacrifice speed for accuracy. To say "error tolerance" implies that both types of card have errors (which they may or may not have, and may or may not compensate for) and that one tolerates them more than the other. This, strictly, isn't true. A better analogy would be something like: high-end gaming cards have (for example; I'm making the figures up) 24-bit precision and the high-end workstation cards have 64-bit precision. There is no "tolerance" involved; one just does the math for accuracy and the other does the math for speed.
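        To make the precision point concrete, here is a small stand-alone C++ sketch (the bit-widths above are made up, and so is this example); it only shows how single-precision accumulation drifts where double precision stays close to the exact value:

            #include <cstdio>

            int main()
            {
                float  f = 0.0f;
                double d = 0.0;
                // Add 0.0001 ten million times; the exact sum is 1000.
                for (int i = 0; i < 10000000; ++i) {
                    f += 0.0001f; // single precision loses accuracy as the running sum grows
                    d += 0.0001;  // double precision stays very close to 1000
                }
                std::printf("float: %f  double: %f  (exact: 1000)\n", f, d);
                return 0;
            }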
        • Re: (Score:2, Informative)

          by 0xygen ( 595606 )
          Error tolerance refers to pixel errors in the output image compared to a reference rendering.

          E.g., the fast texture-sampling methods on gaming cards lead to aliasing errors, where a pixel is in error compared to a reference rendering.

          There are also a lot more factors to this than just floating point precision, for example how the edges of polys are treated, how part-transparent textures are treated and how textures are sampled and blended.
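          A rough sketch of what comparing a card's output against a reference rendering can look like; the function and the 8-bit RGBA pixel layout are assumptions for illustration, not something from the article:

              #include <cstdint>
              #include <cstdlib>
              #include <cstddef>

              // Count pixels whose RGBA channels differ from the reference by more than
              // a given tolerance -- one simple way to quantify "pixel error".
              std::size_t countPixelErrors(const std::uint8_t* test,
                                           const std::uint8_t* reference,
                                           std::size_t pixelCount, int tolerance)
              {
                  std::size_t errors = 0;
                  for (std::size_t i = 0; i < pixelCount; ++i) {
                      bool differs = false;
                      for (int c = 0; c < 4; ++c) {
                          int delta = std::abs(int(test[i * 4 + c]) - int(reference[i * 4 + c]));
                          if (delta > tolerance)
                              differs = true;
                      }
                      if (differs)
                          ++errors;
                  }
                  return errors;
              }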
      • >This is fine unless you're rendering something.

        You don't render anything on the GPU (at least not yet); video cards are only for visualisation, which means your theory is not valid.
      • No, the workstation drivers emphasize precision over speed. The cards are exactly the same, except that there are laser-burned fuses on the workstation cards so you can't use "gaming" cards with the workstation drivers.

        Interestingly enough, at least on the NV side the drivers are the same too. They just enable/disable specific options (and optimizations) based on whether or not the card is a Quadro.
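        A toy illustration of that claim; all of the names below are invented, and this is not actual driver code, just the shape of a capability check keyed off the reported device class:

            // Hypothetical: one shared code path, with the workstation SKU unlocking more of it.
            struct DeviceInfo { bool isQuadro; };

            bool hardwareClipPlanesEnabled(const DeviceInfo& dev)
            {
                // Same silicon either way; the driver simply gates the feature.
                return dev.isQuadro;
            }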
    • Advertising
      • by AvitarX ( 172628 )
        This post had an ad as a reply.

        I like that, though I hate the inline ads.

        Whatever happened to the tasteful extra ads they used to put in (those were the square ones)?

        I wish whoever had the genius idea to interfere with discussions gets a plague.
    • I can. You tell the boss you need a high-end workstation card. What you mean is you need a high-end gaming card. It's not a difference in the physical card you hold in your hand; it is the difference of being worked into the budget. It is pure coincidence that your nifty workstation card plays the latest and greatest games.

      I really apologise this is so unhelpful. I know what you meant, and I know what I've been telling the powers that be for years. I guess in some regards I am being truthful and correct -
    • Re:Difference? (Score:4, Informative)

      by TheSpengo ( 1148351 ) on Wednesday February 06, 2008 @03:02AM (#22318368)
      The basic answer is that high-end gaming cards specialize in pure speed, while high-end workstation cards specialize in extreme accuracy and precision. They are incredibly accurate, with FSAA and sub-pixel precision. Workstation graphics cards also have other features, such as overlay plane support, which really helps in things like 3ds Max.
    • Re: (Score:2, Informative)

      by acidream ( 785987 )
      Workstation cards typically have certain features enabled that their gaming counterparts do not. Some are just driver features; others are in silicon. Hardware overlay planes are a common example. They are required by some 3D applications, like Maya, in order to display parts of the GUI properly.
    • Memory and accuracy in rendering. The reason most of these cards have 1 GB+ is the need to display assemblies with thousands to tens of thousands of parts. A consumer-grade card in most cases simply does not have the memory to display them at the resolutions CAD operators work at, which are often beyond common resolutions and go up to QXGA [wikipedia.org] monitors, which cost a pretty penny [amazon.com].

      The cards are often the same GPUs you find in gaming cards, with two important differences: drivers and chip quality. Thes

    • Seeing how absolutely amazing my NVIDIA 8800 GT is in Crysis and Gears of War, the bottleneck is the 3D software like Max and Maya, not the card. 3D animation software is definitely lagging behind games. But NVIDIA's recent purchase of mental ray might change that... I hope... I think the difference between a pro card and a high-end gaming card is minimal. It would be great to see comparative benchmarks, though.
    • can anybody explain the difference between a high-end workstation card and a high-end gaming card?

      About $2000 (check Quadro 5600 vs GeForce 8800)

      Seriously, there is no difference in the hardware any more, and anyone who tells you otherwise has no idea what they're talking about. The only substantive difference is in the drivers: for gaming cards, NVIDIA and ATI omit certain driver features that games tend not to use, and omit performance tweaks for modeling programs, in a blatant attempt at price discrim [wikipedia.org]

      • For many applications, you are correct. It's a crime that some PC vendors will only sell you a "workstation class" graphics card with a "workstation class" PC. But if you need the differences, you need them. Line anti-aliasing, overlay planes, quad-buffered stereo, hardware synchronization across cards for tiled displays, etc.
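        A minimal sketch of one of those features, quad-buffered stereo, in plain OpenGL; the function is hypothetical and assumes a context created with a stereo-capable pixel format:

            #include <GL/gl.h>

            void drawStereoFrame()
            {
                glDrawBuffer(GL_BACK_LEFT);  // render the left-eye view
                glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
                // ... set the left-eye projection and draw the scene ...

                glDrawBuffer(GL_BACK_RIGHT); // render the right-eye view
                glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
                // ... set the right-eye projection and draw the scene ...

                // The platform's buffer swap then presents both eyes together.
            }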
    • Re: (Score:3, Informative)

      Another difference that at least existed in the past, and probably still holds true today, is that workstation cards have more geometry pipelines, whereas gaming cards have more pixel pipelines. The gamer stuff puts out very pretty polygons, but fewer of them, whereas workstation apps often just use kajillions of tiny untextured polygons. It's a tradeoff that affects silicon size and internal chip bandwidth, and explains why games and workstation apps run slowly on the wrong 'type' of card with their diffe
      • Re:Difference? (Score:4, Informative)

        by Molt ( 116343 ) on Wednesday February 06, 2008 @06:22AM (#22319148)
        This doesn't hold true any more; the latest generation of hardware all uses the Unified Shader Model [wikipedia.org]. This removes the distinction between a pixel pipeline and a vertex pipeline: a unified pipeline is used which can be switched between pixel and vertex processing as the scene demands.
        • This doesn't hold true any more; the latest generation of hardware all uses the Unified Shader Model. This removes the distinction between a pixel pipeline and a vertex pipeline: a unified pipeline is used which can be switched between pixel and vertex processing as the scene demands.

          That doesn't mean that the pipelines couldn't be optimized for vertex over pixel shading or the other way around. Nor does it mean that there couldn't in fact be different pipelines optimized for different operations

    • Workstation cards are chip for chip identical to their gaming brethren, except that the drivers identify them as such and tweak settings accordingly. You more or less buy a cheap GeForce or Radeon but with a fancier label and a pricetag oriented around corporate budgets.

      3D engineering applications use somewhat different rendering techniques, as far as I've heard: fewer polygons and more high-order math of some sort. Better visual fidelity at the cost of performance. Also, CAD doesn't need high framerates.

      Gaming car
      • Workstation cards are chip for chip identical to their gaming brethren, except that the drivers identify them as such and tweak settings accordingly.

        I'll bet that's true when they come out of the fab, but I'll also bet that certain crucial bits are burned out afterwards to prevent mere software modifications from converting a GeForce into a Quadro (there was such a hack maybe 8-10 years ago, I think).
    • Re: (Score:3, Interesting)

      "Workstation" generally means you're using some sort of 3D application, pushing hundreds of millions (or billions) of textured, lit triangles around. I have a 2S/2P Opteron workstation with 8GB of RAM and two Quadro FX 3500 cards, and I use it with 3DSMax.

      The difference in cards is subtle. Most gaming cards are tuned for ultimate speed (framerate) but perhaps not as much accuracy or quality. Workstation cards have things like hardware anti-aliasing of wireframes, a great feature when you're working with
      • I softmodded some gaming cards to workstation cards a few years back. Worked like a charm. However, it got to be more trouble than it was worth because NVIDIA kept trying to break the softmods with driver updates.

        I think we just found the real reason why NVIDIA won't release open-source drivers for Linux.

        • While I think you were speaking tongue in cheek, I don't think that has any real bearing on it. The specifications for doing the softmods are well understood by the parties writing the modding software (i.e. Unwinder). There's even been speculation that Unwinder is either a current or former NVIDIA employee, although I think that might be stretching it a bit.
      • Just an FYI, NVidia's latest Quadro drivers (169.61) now work with almost any NVidia card. With a tweaked INF and SP3, I'm getting between 7% and 16% 3DMark score improvements with my GeForce 7600GT mobile as compared to SP2 with the original drivers, and about a 3% increase in framerate over the last WHQL driver release (163.something I believe).
  • by TheSunborn ( 68004 ) <mtilsted@NoSPAm.gmail.com> on Wednesday February 06, 2008 @02:43AM (#22318278)
    It's a shame they don't test them against 'game cards'. It would be really interesting to find out how these cards differ from normal gaming cards when doing realtime 3D.

    • Not really, they are meant for completely different purposes. In a rendering competition between the latest GeForce and the latest Quadro in Maya or 3ds Max or something, the Quadro would completely obliterate the GeForce, but in the gaming scene the GeForce would have a definite and very noticeable advantage. People who spend over $2k on their workstation card probably aren't too interested in how well it runs Crysis, though, hehe.
      • Re: (Score:3, Interesting)

        by TheSunborn ( 68004 )

        In a rendering competition between the latest GeForce and the latest Quadro in Maya or 3ds Max or something, the Quadro would completely obliterate the GeForce

        I know that's how it's supposed to be, but I have never seen a benchmark between 'workstation cards' and 'gaming cards' which included example images from the different cards that showed the difference.

        This benchmark doesn't even include any example images, which I don't understand, because image quality might be the biggest difference between the cards. Having a benchmark of 'workstation cards' that are supposed to look better than the gaming cards, and then not including anything about the image quality, is weird.

      • Re: (Score:2, Insightful)

        by neumayr ( 819083 )
        I think the OP meant that a test against consumer cards would be very interesting for 3D artists on a budget.
        As in, do I stick to this GeForce and get that quadcore CPU in order to speed up my test renderings or does it make more sense to spend my money on a Quadro and stick to my slower CPU?
        • Ah, that is a good point. Unfortunately I have not seen any comprehensive comparisons of this kind, so I guess the best you can do is look at the performance of the CPUs and GPUs separately and take a best guess. :/
        • by Animaether ( 411575 ) on Wednesday February 06, 2008 @05:38AM (#22318968) Journal
          Sorry, but a graphics card does not speed up your rendering unless your renderer can take advantage of the graphics card; hint: not many can, and those that do only use it for very limited tasks.

          The only reason you should have for upgrading your graphics card within the 'consumer' market is if your viewport redraws are being sluggish; this will still allow you to play games properly* as well.
          The only reason to upgrade to e.g. FireGL or a QuadroFX is if you're pushing really massive amounts of polys and want a dedicated support line; e.g. for 3ds Max, there's the MaxTreme drivers for the QuadroFX line - you don't get that for a consumer card.

          * on the other hand, do *not* expect to play games with a QuadroFX properly. Do not expect frequent driver upgrades just to fix a glitch with some game. Do not expect the performance in games to be similar to, let alone better than, that of the consumer cards.

          For 3D Artists dealing with rendering, the CPU should always be the primary concern (faster CPU / more cores = faster rendering**) followed by more RAM (more fits in a single render; consider a 64bit O/S and 3D Application), followed by a faster bus (tends to come with the CPU)/faster RAM, followed by a faster drive (if you -are- going to swap, or read in lots of data, or write out lots of data, you don't want to be doing that on a 4200RPM drive with little to no cache) followed by another machine to take over half the frames or half the image being rendered (** 'more cores' only scales up to a limited point. A second machine overtakes this limit in a snap), as long as you don't have something slow like a 10MBit network going (for data transfer).
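          A back-of-the-envelope illustration of the point that "more cores" only scales so far, using Amdahl's law with an assumed 95% parallel fraction (the fraction is made up; real renderers vary):

              #include <cstdio>

              int main()
              {
                  const double parallelFraction = 0.95; // assumed; not a measured figure
                  for (int cores = 1; cores <= 64; cores *= 2) {
                      // Amdahl's law: speedup = 1 / ((1 - p) + p / n)
                      double speedup = 1.0 / ((1.0 - parallelFraction) + parallelFraction / cores);
                      std::printf("%2d cores -> %.2fx speedup\n", cores, speedup);
                  }
                  return 0;
              }

          With those numbers the curve flattens out well before 64 cores, which is why a second machine taking over half the frames overtakes it so easily.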
          • by neumayr ( 819083 )
            .o(How on earth could that have been modded insightful?)

            Barely (or rather, not really) suppressing the urge to just reply "no shit, Sherlock", let me ask you this:
            Who claimed that the rendering takes place on the GPU (ignoring NVIDIA's Gelato; I don't know how relevant it is)?

            Also, the major bottleneck in rendering is the CPU, followed by memory and bus. Bandwidth of the storage system and the network is a very distant third.
    • by Sique ( 173459 )
      It's time for the confusing and wrong car analogy.

      Normally you also don't see tests of vans against trucks, even though they may be built on the same frame and engine and both are designed to carry more than a car.
  • by alwaystheretrading ( 750171 ) on Wednesday February 06, 2008 @02:46AM (#22318294)
    I'd really like to see a low end workstation card like one of these compared to a high end consumer card. When I'm working with half a million polys in 3DS Max 2008 is it really going to be worth the extra money to get the workstation card?
    • I think workstation cards typically have more memory and some optimizations. If you want to make the best and most accurate image, yeah, get the workstation card.
    • I'm not too sure about 3ds Max, but I know in Maya a Quadro 5700 would blow the 8800 GTS away. We have some at the studio and they're ridiculously fast.
    • I bought an 8800 GTS because I wanted to both play games and do some 3D and compositing work.
      It seems, though, that the combination of the 8800 GTS, Windows XP x64, and Maya is not a very good one; my viewports seem to freeze after orbiting around for a while, and sometimes it gets sluggish.
      Since only the Quadro line is certified for this sort of work, NVIDIA doesn't seem too eager to fix those issues.

      If I could, I'd return this card and get a Quadro.
  • All I can say is... (Score:5, Informative)

    by snl2587 ( 1177409 ) on Wednesday February 06, 2008 @02:51AM (#22318322)

    ...if you're planning on using a Linux workstation, don't buy an ATI card. I don't mean this as flamebait, just practical advice. Even with the new proprietary drivers or even the open source drivers, there are still many, many problems. Of course, I prefer ATI on Windows, so it all depends on what you want to do.

    • Re: (Score:3, Informative)

      It depends. I bought a new ATI card after they opened up the 2D driver specs. When booted into Linux, I haven't had any problems with my day-to-day activities. It's only when it tries to render anything in 3D that it shits bricks. To be fair, there may be a problem besides the driver that I haven't found yet, but right now all signs point to driver/card problems. Honestly, it's not a big deal to me. I just don't use any fancy compositing manager and I never played games in Linux anyway. While I'm o
      • It's only when it tries to render anything in 3D that it shits bricks.

        Mine too...as well as everyone else's on the seemingly endless discussion boards.

        While I'm on the subject, I know when they released the 2D specs they said the 3D specs were on their way, but then I never heard anything out of that again.

        Actually, they released a driver in January that was supposed to correct all of the issues. Apparently that claim didn't hold any water, and so last I heard they were trying to push out a new one by March.

      • by neumayr ( 819083 )
        Hardly relevant, seeing as this is a discussion about an article about workstation GPUs and their respective performance. In 3D.
        If all this were about 2D performance, we'd probably still be using Matrox cards...
      • by zIRtrON ( 48344 )
        Like flash 8 for linux or something or other...
    • Although I would generally agree that avoiding ATI is a good idea for Linux systems, I think this kind of misses the spirit of the article. Most graphics professionals are using a platform, not just a random mix of computing hardware and software. For these people, it usually winds up being what their software vendor will directly support.
      • by neumayr ( 819083 )
        But then, this article is about "Affordable Workstation Graphics Card[s]".
        Whoever cares about the price of that single component is not in the market for a platform.
  • We're next (Score:3, Funny)

    by Hansele ( 579672 ) * on Wednesday February 06, 2008 @03:00AM (#22318354)
    As soon as the shootout's over, they'll come gunning for us. I, for one, welcome our new graphical overlords.
  • Whatever - all I want to see is open specs on the cards, and support for open drivers a la Intel [intellinuxgraphics.org]. Then I'll start thinking about buying ATI/NVIDIA.
    • Does Intel, or any GPU maker besides NVIDIA/ATI, even make a workstation card capable of high-end 3D modeling?

      If they don't, then yes, we do care, because people need these cards to do their jobs, regardless of how much they want open-source drivers.

    • by splutty ( 43475 )
        As far as I'm aware there aren't any open source projects that would have any use for a workstation graphics card, so your sentiment of open source drivers is really nice, but somewhat beside the point.

      They're specifically in the market for 3D CAD, 3DS, Maya, that sort of stuff, of which there really isn't a heavy weight open source equivalent.

      So, although in principle I agree with you, I don't think it's even remotely important. I'd much rather see open source drivers for the gaming cards, since those *are*
      • Re: (Score:3, Interesting)

        by TimFenn ( 924261 )

        They're specifically in the market for 3D CAD, 3DS, Maya, that sort of stuff, of which there really isn't a heavy weight open source equivalent.

        I don't do 3D CAD, but being a biochemist type, I actually hang out with lots of folks that do work with all kinds of 3D data, such as molecular models and volumetric MRI datasets. Workstation cards are especially useful for their stereo support, which many bio-folks find helpful when modelling. Most of the development is done on Linux using stuff like VTK [vtk.org] or VMD [uiuc.edu]; it's not just the engineering guys doing CAD in Windows that want workstation cards.

        As a scientist that uses linux daily for 3D applicatio

        • by splutty ( 43475 )
          I wasn't aware of that, and that does look and sound rather interesting. In which case I have to retract my earlier statement and go picket the doors at ATI for open source drivers :)
      • by Verte ( 1053342 )
        The other thing these would be great for is as a GPGPU: open drivers and hooks for libraries such as the GSL could allow visualisation workstations to double as mini-supercomputers after hours.
  • by ludomancer ( 921940 ) on Wednesday February 06, 2008 @03:36AM (#22318510)
    There was a time when you could purchase a 3D card that worked excellently for both work and play. These new "workstation" cards are a farce: an attempt at a solution where there is no problem. I am a professional 3D artist and I can attest to this from personal experience over the last 15 years. Don't buy into this crap. They DO perform better for workstation use, but only because gaming cards are intentionally crippled in this area in order to push the alternative product. Luckily most gaming cards currently on the market work well enough for 3D workstations, so I encourage everyone to ignore this attempt at artificial market segmentation as much as possible, because it's perfectly possible to get great performance out of a gaming card for both purposes.

    • These new "workstation" cards are a farce.

      Is your point that these NEW workstation cards are a farce, or that THESE new workstation cards are a farce?

      If the former, you may be interested to learn that the concept of extra-powerful "workstation" graphics accelerators goes back quite a ways -- even back to the days before "workstation" simply meant "high-end PC". Consider the difference between the capabilities of a circa-1990 386/VGA machine, and a contemporary SGI Indigo.

      • Sorry, THESE new workstation cards are a farce. I've been a 3D developer for decades. I well remember SGI, but because the technology was in its infancy then it was simply the only way to get things done. NOW it's just an excuse to split the market and charge developers more money for something they don't need.
  • Time to whip out my old SPARCstation and maybe my SGI; anyone have a copy of IRIX I could borrow? I thought the whole concept of workstations being a separate thing from high-end PCs went out when you could have 8 cores, RAID, and gigabit Ethernet on the family PC in the living room. Actually, I am looking for some hardware X servers if anyone has one of those.
    • Oh, I almost forgot: 64-bit processor support. That and how to properly type.
    • Re: (Score:1, Troll)

      by jacquesm ( 154384 )
      Sure, would you like your distribution to come on CD, QIC tape, or DAT? And what about the devkit, or will you be using that open source stuff? The last release I've got, I think, was IRIX 6.2; after that I switched to KDE/Linux and gave all my SGI stuff away.

      Can't say I miss it either :)
        • CD. That open source stuff is UNIX for the peasants, and it will never be used in REAL workstations; why RISC it when you have quality software developed for the superior race by the superior race?
  • Even better: buy a gaming card, then change its PCI deviceID and unlock the professional capabilities. Ta-dah!
  • by MindPrison ( 864299 ) on Wednesday February 06, 2008 @04:19AM (#22318700) Journal
    The biggest curtain that has ever been pulled over artists' eyes is the "PRO" graphics card fad! You're paying to feel "pro"; you don't get more "pro" for your money at all, you just get to feel like a pro, with very little extra to justify the real bucks you're spending on the Quadro & FireGL series.

    I know this; I'm a "graphics pro" myself who makes a living designing 3D models & prototyping every day, and I've used nearly every card known to mankind.

    Here's my advice - take it or leave it:

    Buy a gaming NVIDIA card! The difference between the gaming-series cards and the Quadro-series cards is just some extra driver software that is optimized for your insert-favorite-3D-app-here. Yes, there are somewhat fewer pixel flaws, but this will never, ever affect your final render unless you're using NVIDIA's Gelato (which has, by the way, proven in many cases to render less effectively than modern multi-core CPUs with software rendering).

    You will save up to THOUSANDS of dollars by not buying into the "PRO" hype, and you'll be one happy puppy that you didn't, and you'll work just as efficiently (I know; we do) as the ones with the "PRO" cards. The game cards actually use the same chipsets (remember the Quadro mod you could perform on their cards; it ain't fake, you know!). It would make absolutely NO SENSE for them, business-wise, to produce two different cards when their cards can in fact do the same thing... and actually use the same chips.

    • by prefect42 ( 141309 ) on Wednesday February 06, 2008 @04:30AM (#22318736)
      Problem is, your advice sounds reasonable even though it's not.

      Looking at the hardware spec sheets, I'd agree with you. But when it came to it, and I compared what at the time were the top cards (Quadro 4500 vs 7800 GTX), the difference was night and day. If you wanted to play games, buy the 7800 GTX; it was waaaay faster. Want to do your own OpenGL apps that are quite demanding (high polygon count, multiple clipping planes, lots of transparency), and it's clear that not only is the 4500 faster, it gives almost twice the bang for the buck. That's pretty impressive for a 1500 ukp card, where you're not expecting value for money...

      What you need to see are benchmarks of a Quadro 1700 against a similarly priced 8800. I'd be tempted to call it in favour of the Quadro for the things that matter to me, but short of buying some to test, it's hard to get decent figures.
      • Re: (Score:2, Interesting)

        by Spy Hunter ( 317220 )
        Clipping planes are one of the features NVIDIA cripples in the gaming drivers (as games hardly use them); it has nothing to do with hardware. Buy a GeForce and flash it with the Quadro firmware if you really care about clipping planes, but honestly, features like clipping planes, hardware overlays, etc. are better implemented in your application and in your shaders anyway, where they will run equally fast on gaming and workstation cards.
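        A sketch of what "in your shaders" can mean for clipping: a GLSL fragment shader, held in a C++ string for context, that discards fragments on one side of a plane instead of relying on driver clip planes. The uniform and varying names are invented for illustration:

            // Hypothetical shader source; the application supplies clipPlane and eyePos.
            const char* kClipFragmentShader = R"(
                #version 120
                uniform vec4 clipPlane;   // plane equation in eye space: Ax + By + Cz + D
                varying vec3 eyePos;      // eye-space position passed from the vertex shader
                void main()
                {
                    if (dot(vec4(eyePos, 1.0), clipPlane) < 0.0)
                        discard;                  // fragment lies on the clipped side
                    gl_FragColor = vec4(1.0);     // placeholder shading
                }
            )";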
      • by vimh42 ( 981236 )
        But your counterargument isn't taking into account what the parent is suggesting: that the difference is really in the drivers. Sure, the Quadro will win with professional apps in terms of speed, but why is that? If it's purely because of the drivers, then I'm not too happy paying substantially more money for a few lines of code. I'll take the cheap hack if the consumer card's performance doesn't cut it.

        Is it possible that if Nvidia drivers were completely open source that we'd quickly see drivers that
    • by geonik ( 1003109 )
      You have just solved a mystery that has been puzzling me for years, thanks for the great advice!
    • by Anonymous Coward
      You may think that your 3d modeling and prototyping is professional work - and I'm sure it is.

        However, you should be thinking of people using CATIA to build an entire car, or even more exotic pieces of software for building entire airplanes. We're not talking the piddly few million polies that the average Disney/Pixar movie ponders about in Maya etc., even though those would benefit as well; we're talking a few hundred million polies. Now we're talking 'pro'. Now we're talking the kind of people who
      • What you're saying about big models and big projects is true. However, getting a single workstation card is not going to allow you to render 10X more polygons/sec than a single gaming card. It is more useful for features like anti-aliased lines, tiled displays / multi-card rendering, etc.
    • Re: (Score:2, Interesting)

      by Anonymous Coward
      You are exactly right. Many GeForce-series cards can be made to function exactly the same as the Quadro series with RivaTuner and the NVStrap driver. I have actually done this myself on one of my cards.

      The only people who buy Quadros are non-savvy artist types. Those of us who know better can have the exact same thing for a fraction of the cost.
  • by nano2nd ( 205661 ) on Wednesday February 06, 2008 @04:44AM (#22318782) Homepage
    Have a look at this site - it is possible to flash an 8800 GTX to Quadro FX 5600:

    http://aquamac.proboards106.com/index.cgi?board=hack2&action=display&thread=1178562617 [proboards106.com]
  • What we need for our audio workstations is a fanless (silent) graphics card that will do OpenGL nicely, using Free/Libre/Open Source drivers. Affordable is helpful, but not essential.

    I've been watching the gradual progress of the Open Graphics Project [duskglow.com] (and now Open Hardware Foundation [openhardwa...dation.org]) with interest and hope they can release something good before the major manufacturers get a clue - quite likely considering their years of promises (ATI) and proprietary drivers (nVidia). It seems that Intel [intellinuxgraphics.org] are doing good

  • The basic answer is that high-end gaming cards specialize in pure speed, while high-end workstation cards specialize in extreme accuracy and precision.

    What accuracy? You are just rendering a complex assembly. You don't need accuracy; the human eye compensates. When you spin it, it drops detail, then renders accurately again once you stop. I used workstation cards. A QuadroFX, and one day it died. I needed to do a job quickly, so I rushed out and bought a cheap gaming card. I mean cheap. It worked just as well. I COULD HARDLY TELL THE DIFFERENCE. This workstation business is a scam. Try this test, as long as you have a fast CPU (i.e. less than a couple of years old) and about 2 GB of RAM: download a demo of ProEngineer. You get 30 days. Open the most complex assembly and spin to your heart's content. I did this with an assembly with many thousands of parts. It flies. Graphics cards have advanced to a point; this software worked well with top-end cards in 1998, and we have advanced 10 years since. The budget cards work great. Giorgis

  • Interesting, at last OpenGL benchmarks! But I wonder how gaming cards' results compare to these? Unfortunately, in every recent computer magazine I buy, all the benchmarks are about DirectX performance only, with no word about OpenGL... and since I'm running Linux, only the OpenGL figures are of any interest to me... At least in the past they used to have both OpenGL and DirectX tests...
  • What does this tag mean? And on a related note why does whatcouldpossiblygowrong show up every third post? /rant
  • Okay, I can appreciate the ability to render using OpenGL in hardware for programs which (a) are on non-Windows workstations or (b) have no support for DirectX. However, why do you give a rat's ass about sub-pixel, or even pixel-level, accuracy for cards which are rendering in realtime for workstation graphics? It's not like users can actually see, or need to see, that type of effect. Unless you're recording off the video output (why?) instead of rendering to a completed file, it won't matter - and you would
