Nvidia 6600 Series Examined

DrunkenTerror writes "Yesterday at QuakeCon, Nvidia debuted their new affordable GPU mentioned a few days ago on Slashdot. Dubbed the GeForce 6600 and 6600 GT, they differ from their higher-end brethren by having only 8 pixel pipes (unlike the 12 & 16 of the 6800 line), and appear to be limited to 128MB of RAM. Both GPUs support Shader Model 3.0. The 6600 GT sports fast GDDR3 RAM, while the 6600 appears to use plain-jane DDR. The GT also supports the recently much-discussed SLI, which could 'enable millions of users to experience the power of two GPUs in their system.' The best part, however, may be the price/performance: with a suggested street price of US$199, the 6600 GT runs at a steady 42 FPS in Doom 3 at high-quality 1600x1200." Reader aceh0 adds a few links: "Nvidia is announcing their NV4x sub-$200 graphics hardware today with the GeForce 6600 series. The 6600 series is feature-complete with the 6800s; the differences come in the number of pipelines and memory configuration. SLI has trickled down to the 6600 GT as well. Coverage is available at Neoseeker, Tech Report and PC Perspective, as well as other sites."
  • by blanks ( 108019 ) on Friday August 13, 2004 @10:53AM (#9959475) Homepage Journal
    http://www.xbitlabs.com/news/video/display/20040806105201.html "GeForce 6600 GT cards come with a 500 MHz clock and memory rate, 128-bit (GDDR3, 128 MB) and will cost $200-230, GeForce 6600 with 128-bit bus (GDDR, 128 MB) will cost $150. According to preliminary results and unconfirmed tests GeForce 6600 GT performs 20% better than RADEON 9800XT."
  • SLI (Score:4, Informative)

    by Anonymous Coward on Friday August 13, 2004 @10:54AM (#9959482)
    Now we just need a motherboard with 2 PCIe 16X slots. Some of Intel's new server-class motherboards have it but they cost around $500.
    • Actually they don't, they have a single dedicated PCI Express x16 slot, and a physical x16 slot which is electrically an x4 slot coming off of a different x16 bus. There is no chipset with two dedicated x16 buses for single slots AFAIK.
    • Why do server class boards need multiple graphics card slots? Or are there ultra-high-speed network/compression/encryption cards than can take advantage of the bandwidth?
  • Not limited to 128MB (Score:5, Informative)

    by Pu'be ( 618443 ) on Friday August 13, 2004 @10:54AM (#9959488)
    It is limited to 256MB, but most manufacturers will be shipping 128MB versions.
  • by LiberalApplication ( 570878 ) on Friday August 13, 2004 @10:55AM (#9959494)
    Dubbed the GeForce 6600 and 6600 GT, they differ from their higher-end brethren by having only 8 pixel pipes (unlike the 12 & 16 of the 6800 line)
    ...how long do you guys think it will be before someone releases a driver mod/patch or hardware howto for unlocking the other pipelines? Or are they actually going to use chips that don't physically have them?
  • by Meat Blaster ( 578650 ) on Friday August 13, 2004 @10:57AM (#9959522)
    Graphics processing speed no longer seems to be the primary limiting factor in games.

    I.E. I noticed a bigger jump in performance by upgrading my mainboard, cpu, and memory while retaining my relatively mediocre (but fully DirectX 9 compliant) graphics card, whereas my friend who had a similar configuration spent his cash on the latest Nvidia and didn't seem to come out significantly ahead.

    If you can afford all of the above, I suppose this is the card for you (hell get two and run them together). But too often gamers focus on the graphics to the overall detriment of their performance.

    • Which games are you specifically referring to? I'm 90% sure that people buying these 'now' are primarily looking for gaming rigs for Doom 3 (heh, as am I :)), and that, from the reviews, is heavily dependent on GPU performance.
    • **I.E. I noticed a bigger jump in performance by upgrading my mainboard, cpu, and memory while retaining my relatively mediocre (but fully DirectX 9 compliant) graphics card,**

      and which card is that and what were your previous specs before that?
      with games like Doom 3, the graphics card is the dominant bottleneck.

      in fact I would say _just_the_opposite_: the latest CPU no longer helps you as much as it used to (in pre-1.5GHz days).
    • by ameoba ( 173803 ) on Friday August 13, 2004 @11:13AM (#9959718)
      Without mentioning system specs, this is a pointless post. Are we talking some guys coming from 800MHz P3s or do y'all have 2.4GHz machines?

      I'd suspect that your original CPU was somewhat lacking; 2-2.5GHz seems to be the low end of what you can get away with for a respectable gaming machine these days. Once you reach this point, you're going to see a big difference jumping from your 5200 to a 5900.
    • That's what I thought with Doom 3. So my coworkers and I all pitched in a few bucks to buy Doom. We wanted to see how it ran on our dual Xeon 3.06GHz workstations with 1GB of RAM. All I have to say is it was unplayable with our Quadro cards (still worth $300+, aka a GeForce 3 with line antialiasing).

      My box at home with a radeon9700 and 2.4 ghz P4 creams my workstation. 1024x768 on Medium, I can't even get 640x480 low to run smoothly on our workstations.

      • I think it is more like a little from column A and a little from column B.
      • That's because the Quadros don't take shortcuts that fuck up the image like consumer cards do. You can't afford to have your diagrams munged to get some extra fps. Gamers can, so that's what their cards do.
        • There are Quadro cards that use very old Nvidia chipset architectures. No amount of additional memory is going to make an NV25 (Geforce4 - 4x2 texel pipelines @ 300Mhz, ~10GB/s memory bandwidth, DX8 features) perform like an NV40 (Geforce6800 - 16 pixel pipelines @ 400Mhz, ~35GB/s memory bandwidth, DX9 compatible). Newer chips have more pipelines, more memory bandwidth, more advanced geometry engines, and run at higher speeds.
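The bandwidth figures quoted in the parent fall out of a one-line formula: bus width (in bytes) times effective memory clock. A quick sketch with assumed clocks (the 650 MHz and 1100 MHz effective rates are illustrative round numbers, not official specs):

```python
def mem_bandwidth_gbps(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s: (bus width in bytes) x effective clock."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

# GeForce4 Ti-class part: assumed 128-bit bus, ~650 MHz effective DDR.
gf4 = mem_bandwidth_gbps(128, 650)    # ~10.4 GB/s, matching the "~10GB/s" above
# GeForce 6800-class part: assumed 256-bit bus, ~1100 MHz effective GDDR3.
nv40 = mem_bandwidth_gbps(256, 1100)  # ~35.2 GB/s, matching the "~35GB/s" above
```

The doubled bus width and the near-doubled clock multiply, which is why the two generations are separated by roughly 3.5x in raw bandwidth before any architectural improvements are counted.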
    • "Graphics processing speed no longer seems to be the primary limiting factor in games."

      Try playing 1600x1200 with 4xAA (or higher)... then see how your graphics card isn't the limiting factor :).

      Even with no AA, FS2004 can get my Radeon 9500 Pro down to 10 fps or less in heavy weather at 1600x1200.
    • It depends on the resolution you're running the game at. If you're running Doom 3 at 800x600 on a 6800GT then the CPU *IS* almost certainly the limiting factor for the framerate; you would probably see the exact same fps on a 5900. However, moving up to 1600x1200 (or above) and the rate's going to drop as the graphic card struggles to render the much larger images (the CPU is doing pretty much the same work as it was at 800x600).

      Of course, once you're running dual 6800GTs, then it's back to the CPU (and pr
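The resolution argument above can be made concrete with a back-of-envelope fill-rate model. Everything here is an assumption for illustration (peak pixel fill for an 8-pipe 500 MHz part, a flat overdraw factor of 3); real Doom 3 frame rates are far lower because shading cost dominates, but the 4x scaling between the two resolutions is the point:

```python
# Toy fill-rate ceiling: frames/s = peak pixel rate / (pixels per frame x overdraw).
# All inputs are assumed illustrative numbers, not measured Doom 3 data.
def fill_limited_fps(fillrate_mpix_s, width, height, overdraw=3):
    return fillrate_mpix_s * 1e6 / (width * height * overdraw)

# 8 pipes x 500 MHz ~ 4000 Mpixel/s theoretical peak (6600 GT-class).
print(round(fill_limited_fps(4000, 800, 600)))    # -> 2778
print(round(fill_limited_fps(4000, 1600, 1200)))  # -> 694
```

The GPU-side workload quadruples between 800x600 and 1600x1200 while the CPU's per-frame work stays essentially constant, which is exactly why the bottleneck migrates from CPU to GPU as resolution rises.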
  • by Anonymous Coward on Friday August 13, 2004 @10:57AM (#9959529)
    which will need to have a true GPU in order to run its rendering engine, much like Mac. If NVIDIA doesn't have a low-end GPU it wouldn't get as big a part of the market.
    • quartz extreme has pretty liberal requirements.
      from apple's website:

      NVIDIA GeForce2 MX, GeForce3, GeForce4 MX, or GeForce4 Ti or any AGP-based ATI RADEON GPU. A minimum of 16MB VRAM is required.

      Tiger, coming out mid 2005, has an addition called Quartz 2D, in which the entire portion of the pipeline that used to require main system memory is totally eliminated. This enables you to run OpenGL pixel shaders on top of your already existing window compositions. If you haven't heard, or seen, dashboard for
    • If they don't make it even cheaper, and get it on OEM boards, it won't stand a chance anyway.

      The Nforce boards were nice, but they still failed to achieve good penetration.

      Steven V>

  • by Nos. ( 179609 ) <andrew@nOSPAm.thekerrs.ca> on Friday August 13, 2004 @10:59AM (#9959559) Homepage

    US$199, the 6600 GT runs at a steady 42 FPS in Doom 3, at high-quality 1600*1200

    In the end, regardless of what memory is being used, and what technologies, if I can play the newest game at its highest level of graphics at 42fps, then I'm a happy gamer, especially when the price is under $200 (USD).

    • The highest level of detail in Doom 3 is "Ultra" mode, not "High." Ultra requires a video card with 512MB of memory. Basically the difference between ultra and high is the use of completely uncompressed textures in ultra. I'm sure there's other minor differences, but not many.

      In fact, I've played Doom 3 at all 4 levels (low, medium, high, and ultra). The game looks damn near identical all the way through. You have to look pretty close to see differences.
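The Ultra-mode memory requirement is easy to sanity-check with rough texture arithmetic. A sketch with illustrative numbers (not id Software's actual texture budget): an uncompressed 32-bit texture stores 4 bytes per texel, while DXT-style block compression cuts that by 4x-8x.

```python
# Rough sketch of why uncompressed textures balloon VRAM usage.
# All figures are illustrative assumptions, not Doom 3's real asset sizes.

def texture_bytes(width, height, bytes_per_texel=4, mipmaps=True):
    """Bytes for one texture; a full mip chain adds about one third."""
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmaps else base

# One 1024x1024 texture, uncompressed 32-bit RGBA vs DXT5 (4:1) and DXT1 (8:1):
uncompressed = texture_bytes(1024, 1024, 4)   # ~5.3 MB with mips
dxt5 = uncompressed // 4                      # ~1.3 MB
dxt1 = uncompressed // 8                      # ~0.7 MB

# A scene keeping 100 such textures resident:
print(100 * uncompressed // 2**20, "MB uncompressed")  # -> 533 MB
print(100 * dxt5 // 2**20, "MB with DXT5")             # -> 133 MB
```

Under these assumptions the same texture set that fits comfortably in a 128MB or 256MB card when compressed blows well past it uncompressed, which is consistent with Ultra wanting a 512MB card while High looks nearly identical.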
    • Does anyone remember the scandal a little while ago when the Half-Life 2 benchmarks came out and the NVidia cards were doing all sorts of crazy things to show misleading frame rates (like special optimizations only for HL2 framerate tests, etc)? I imagine that these are real frame rates, but I wonder.
    • YMMV. I personally wouldn't play at anything less than 60.
  • version before the more ubiquitous AGP.
    Now if only they would write some firmware for the mac, I could finally have a half decent video card for my g5 that didn't cost me an arm and a leg.

    • The chip is native PCI Express. NVidia apparently is betting on the success of a second chip that they made for AGP-to-PCI Express translation. The chip can operate in both directions and thus older GPUs can use this chip to work with PCI Express boards and this new 6600 GPU can use this same chip in the other direction to work with AGP boards. I wonder how much of a slowdown this chip brings though.
  • PCI Express only! (Score:5, Informative)

    by kneecarrot ( 646291 ) on Friday August 13, 2004 @11:01AM (#9959577)
    It should be noted that these cards will initially only arrive with PCI Express support. Given the fact that most people have only AGP ports, this is a barrier to adoption. It has been reported that AGP versions will follow.
    • I'd have PCI Express, if only I could find an AMD mobo with the slots. Anyone got any pointers for actual boards from decent manufacturers?
      • Re:PCI Express only! (Score:3, Interesting)

        by hawkbug ( 94280 )
        I guess by Christmas Nvidia will have released the Nforce4 chipset, which supports PCI Express. So, if you were to purchase a socket 939 AMD64 chip and one of these new boards, not only would you have a single PCI Express slot, but you'd have two! The new Nvidia chipset being worked on supposedly supports 2 slots for SLI gaming, which would be cool if I could afford it. I would think you'd need a 600 watt power supply or something though. And don't quote me on the Christmas thing, I had just read somewh
        • Re:PCI Express only! (Score:3, Interesting)

          by drinkypoo ( 153816 )
          My PC is an Athlon XP 2500+ with two optical drives, two hard drives, and a GF4Ti4200. It runs fine on a 350W power supply. I have a 450 now, just to be safe, but a system with a couple disks, a couple optical drives, a couple video cards and an opteron processor (more power-efficient than an Athlon XP) should be able to do fine with a 500W if not a 450.
  • YAY HYPEMACHINE (Score:2, Interesting)

    by ameoba ( 173803 )
    Great. We've got a dozen different hardware sites reciting press releases, specifications and mfgrs' performance promises, all the while speculating about what the hardware may be capable of. Until somebody can actually say "I've been playing with one of these and they're pretty nifty", we might as well just have links point to pressrelease.nvidia.com.
    • I also like all those sites parroting the Nvidia slides and graphs. The "comparison" graphs that show that the 6600 is "3x" better than the X600 are a bunch of bullshit. Who the hell makes a bar graph where the ATI card is placed at 1.0 and then the Nvidia card is shown in towering bars that are so much bigger, thus means they are better. Give me a break. Show the actual fucking numbers instead of that bullshit graph.

      (As a sidenote, I'll still probably get one of these cards, but the damn biased graphs
  • Nvidia Rocks (Score:2, Insightful)

    by krgallagher ( 743575 )
    Nvidia rocks!

    OK, glad I got that off my chest. Now, I run Linux and the only real gaming I do is in Neverwinter Nights. Maybe because I do not do any first-person shooter / real-time gaming I do not notice a problem, but all my computers run the $49.00 special GeForce card. I really like GeForce cards. I love Nvidia's support for Linux. I appreciate all you hard-core gamers buying the new cards so they keep dropping the price on the other cards. I just can't get enthusiastic over a new video card when

  • by L0neW0lf ( 594121 ) on Friday August 13, 2004 @11:08AM (#9959660)
    This is going to be a very interesting comparison when the 6600 series comes out. Up until now, one could assume, at least in part, that a lot of the performance gains in the new NVidia 68xx series of hardware come from the additional pipelines. I'd like very much to see how the 6600 series stacks up against its older 8-pipeline brethren and ATI's 8-pipeline cards, such as the Radeon 97xx/98xx models.
  • Fanless? (Score:3, Insightful)

    by crow ( 16139 ) on Friday August 13, 2004 @11:08AM (#9959672) Homepage Journal
    Faster cards for gamers may be nice, but what I'm really interested in is a better card for my MythTV box. My main concern is having MPEG decoding for HDTV output and minimal heat output (no fans).

    I seem to recall nVidia promising better MPEG/HDTV support in their upcoming cards. Will the low end of this generation be fanless?
  • TCO? (Score:5, Funny)

    by FerretFrottage ( 714136 ) on Friday August 13, 2004 @11:08AM (#9959674)
    2x6600 card: $400
    New Pci express motherboard: $150
    New case (cause you know your old pc will just be termed a "server"): $100
    New faster 1GB ram: $200
    New cpu because you're buying the other stuff anyway: $250
    Bigger SATA 300GB HD because bittorrents are all about sharing: $200
    Wife cutting off your broadband connection: priceless

    • by swb ( 14022 )
      Very funny!

      It actually works that way, too. I bought a new mainboard and CPU. Had to buy a case, because my old one wouldn't support the new mainboard. I thought it would stop there, but I couldn't resist a new DVD-R drive. Since the new motherboard supported SATA, shit, I might as well get a SATA drive, too.

      Thank god the board has sound and NIC, or I'd have bought those too. The only thing I really DIDN'T buy was a new graphics card. The "server" did need it.
    • Two display cards? If you intend to SLI, please show me a $150 motherboard that has two x16 PCIe slots.
  • SLI hype? (Score:4, Insightful)

    by Daagar ( 764445 ) on Friday August 13, 2004 @11:10AM (#9959695)
    As nifty as the card sounds, the hype of SLI might be just that - as the Tech Report preview points out, there aren't any sub-$500 motherboards currently that sport dual PCI-Express slots. For people looking to incrementally upgrade, they'll have to factor in needing a new motherboard as well. We can only hope an "nForce3.5" chipset with dual PCI-Express slots and a sane price point shows up in tandem with the new cards...
  • by Mustang Matt ( 133426 ) on Friday August 13, 2004 @11:12AM (#9959712)
    Apple has a special version of the 6800 called the DDL (dual dual-link) that can drive two of their new 30-inch monitors. In total this would give you 8.2 million pixels.

    Does anyone know if this $600 card will work on Windows?

    http://store.apple.com/1-800-MY-APPLE/WebObjects/AppleStore.woa/71801/wo/Xh3MfiiL68pu26PtlpU8CzA3csU/0.0.9.1.0.6.21.1.2.1.3.0.0.1.0 [apple.com]
    • No, it will not work on Windows, for the same reason I can't go out and buy any video card I want for my G5. The DVI connections on the card are the newest version of the DVI spec. Just wait a little longer, and you'll be able to get a PC video card that supports Apple's 30" monitors.
  • If the price of the 6600 GT part doesn't go much higher than $199, it will cannibalize sales of the vanilla 6800 series. Man, it is such a challenge buying a video card these days. It's really difficult to find the "sweet spot" of price versus performance. For example, I can walk across the street to a retailer who will charge me $450 for a 9800XT at this very moment.
    • Man, it is such a challenge buying a video card these days. It's really difficult to find the "sweet spot" of price versus performance.

      No kidding.

      I prefer not to pay more than $200 for a video card, because that's the pain threshold for my wallet. (If a $500 card goes up in smoke, I'd be seriously put out, but if a $200 card goes poof I can be more reasonable about it.)

      Combined with the fact that there are easily two dozen manufs of NVIDIA cards, complete with their own differences (memory speeds, e
  • I wonder if a pair of 6600 GT's SLI'd would perform better than a 6800 Ultra? If so you could get more performance for a lot less $. Of course you would need a mobo with dual 16x PCI Express slots, which from searching NewEgg doesn't exist (or is so rare they don't carry it). Btw how DO they expect you to run SLI if you can't find a board with two PCI Express x16 slots?
  • by DroopyStonx ( 683090 ) on Friday August 13, 2004 @11:32AM (#9959926)
    Not to take points away from the article, but if you're looking to get a graphics card, take a peek at Nvidia's 6800 GT.

    16 pipelines AND it can *easily* be overclocked from its 350MHz core / 1000MHz memory to the 425/1100 speeds of a 6800 Ultra (which is $150 more).

    Compare benchmarks: http://www.nzone.com/object/nzone_doom3_benchmarks.html [nzone.com]

    ATI's X800 Pro has 12 pipelines.

    I dunno, if you're gonna spend money on a graphics card, might as well go balls-out with this one. Best deal I've seen on a card in quite some time.
    • That's what I ultimately decided. $400 is a hefty price for a video card, no doubt about that, but since I intend on playing a lot of Doom 3 (in GNU/Linux, of course!) and not upgrading for another two years, the 6800GT just seemed like the best choice for prolonged gaming pleasure.

      Couple that with fantastic Linux support from nVidia, and the 6800GT is definitely a winner.
      • by Aadain2001 ( 684036 ) on Friday August 13, 2004 @12:59PM (#9960970) Journal
        The support for Linux is what is going to keep me buying Nvidia cards for the foreseeable future. Ati driver support is BAAAAAAAAAAD under Linux (I have a laptop with an Ati card running Linux). Nvidia gives soooooo much information! Their readme file that comes with the driver explains every option that you can put in the XFree86 or xorg config file and talks about setting up TV out and Dual Head displays. The Ati site does nothing more than say "yes, we can do TV out and dual heads" and then never explains how!!! Nvidia has done a great job embracing Linux, and I'm going to reward them with my $$ :)
    • Yes, the 6800 is $300, but double the performance of the 9800 Pro, which costs $200. So it has a better price/performance ratio!
    • It is also twice as expensive (MSRP of $399).
    • by bogie ( 31020 ) on Friday August 13, 2004 @01:11PM (#9961139) Journal
      "I dunno, if you're gonna spend money on a graphics card, might as well go balls-out with this one."

      Too bad that the majority of gamers are not in a position to do anything like that. I mean, how likely is it that everyone has $350 to blow on a GPU? Not bloody likely is the reality. At that price and above you're talking about a small percentage of gamers. The rest, as always, will be sticking with way sub-$200 cards, where GPU vendors continue to make their bread and butter.
      • If it was any other card w/ 16 pipelines then I wouldn't have made mention of it, but you get a LOT out of the 6800 GT. Not very many graphics cards (that I know of, anyway) are easily overclockable to the one above it like this one is.

        After Doom 3, games will no doubt start really pushing the limits when it comes to graphics. How long will an 8 pipeline card be able to hold its own?

        It might be good for another year or two, but beyond that you're gonna end up pushing it to the limits, or at the very least
  • Pennies Less (Score:4, Interesting)

    by Nom du Keyboard ( 633989 ) on Friday August 13, 2004 @11:53AM (#9960167)
    they differ from their higher-end brethren by having only 8 pixel pipes (unlike the 12 & 16 of the 6800 line)

    And I'll bet they cost pennies less to make than the higher-end chips. Translation: the higher-end chips should cost pennies -- not hundreds of $$$s -- more to buy.

    Think about it. How much has it cost Nvidia to engineer this new chip? Either it is a crippled version of their existing chip, or they had to re-engineer it, make new masks, and setup a new, qualified production line at quite high costs.

    Wouldn't we -- and they -- have been better off if they just punched out larger quantities of the higher-end chips at less cost?

    • Re:Pennies Less (Score:5, Insightful)

      by squarooticus ( 5092 ) on Friday August 13, 2004 @12:23PM (#9960482) Homepage
      Wouldn't we -- and they -- have been better off if they just punched out larger quantities of the higher-end chips at less cost?

      No, neither you nor they would be better off. Companies like nVidia and ATI rely on the quick infusion of lots of cash from the early adopters to fund R&D on better GPUs. If they sold their best chips at the same cost as an FX5200, funding for innovating these great chips would dry up and you'd have to wait much longer for new designs and better performance.

      The best way to get these companies to reduce their costs is simply not to buy their equipment. The laws of supply and demand will naturally produce an equilibrium in which they sell their products at a price point that maximizes their own profit. If their best cards are $500, then you can be assured that there are enough people out there willing to pay $500 to make it more profitable for them to sell it for that price than for $499, $250, $120, or $60. If you aren't one of those people willing to pay $500, then either (a) produce your own damn GPU or (b) wait for the prices to come down. Either way, stop whining.
    • There is a significant chance that the 6600 is a way of salvaging "reject" 6800s in which one or more of the pipelines failed QA. (Possibly because of a dust speck or other such problems - Even in the cleanest of clean rooms there are still contaminants, which is why IC yield is NEVER 100% and is typically lower the larger the chip is.)

      Solution: Rather than just junk the entire chip, disable the pipelines that don't work and sell the chip as a lower-performance one with the pipelines that do.

      It was once
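The salvage economics described above can be sketched with a toy Poisson defect model. The defect rate here is an assumed illustrative figure, not foundry or Nvidia data:

```python
import math

# Toy die-salvage model: defects land randomly on a die, and a die with a
# single defect can often be sold with the affected pipelines fused off.

def p_defects(k, lam):
    """Poisson probability of exactly k defects on a die with expected count lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam = 0.8  # assumed expected defects per large die

perfect = p_defects(0, lam)   # ~45%: sellable as a full 16-pipe part
one_bad = p_defects(1, lam)   # ~36%: often salvageable as a 12- or 8-pipe part

sellable_without_binning = perfect
sellable_with_binning = perfect + one_bad  # ~81% of dies now earn revenue
```

Under these assumptions, disabling bad pipelines nearly doubles the sellable fraction of each wafer, which is why "crippled" derivatives of a big chip can be profitable even at much lower prices.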
  • 42fps (Score:3, Insightful)

    by Nom du Keyboard ( 633989 ) on Friday August 13, 2004 @12:05PM (#9960310)
    runs at a steady 42 FPS in Doom 3

    And how many people are disturbed by this on their 85Hz to 110Hz vertical refresh monitors? More than should be, I'll bet.

    Standard movies only run at 24fps, and American television is only a true 30fps (1/2 of the interlaced frame is written every 1/60 of a second). Demanding frame rates much above those seems an absurd form of posturing.

    • If your monitor is set to a slow 85Hz, then getting more than 85FPS is indeed a waste. But 100+ Hz on both monitor and game would be optimal. Just because YOU can't see it doesn't mean that it doesn't irritate some of us.

      And, for your comment that standard movies are fast enough, I CANNOT go to movie theaters because of their slow frame rate. 100+ Hz would be better, or just project it through an LCD projector.
      • I do go to movies, but when they do sideways pans it makes me want to spew. It feels like someone is ripping my forehead apart with a meat tenderizer when there are full-screen pans; it does something that just really, really disturbs my head. One of the fantastic things about IMAX is that it's 30 fps, which solves that problem. However, I sometimes have a different problem with IMAX: I went to the Tijuana cultural center (there is such a thing) and in their IMAX watched a presentation on the culture of Mexico

    • by raygundan ( 16760 ) on Friday August 13, 2004 @12:28PM (#9960551) Homepage
      It may seem absurd, but there are legitimate reasons why 3D cards need a higher framerate to represent the same smooth motion of 24fps movies.

      I've explained this before, and I'll do it again. Television and video have motion blur-- the effect of the capture device essentially "averaging" the motion that occurred across the duration of capturing that frame.

      Video cards generate a crisp, instantaneous frame that represents only the precise instant the frame was rendered, not the whole time the "shutter" was open.

      At a *bare minimum* producing motion that looks as smooth as blurred 24fps requires double that. (You have to have two frames for your eye to blur between) To do it as well as a film camera requires even more, since their motion blur is effectively an infinite number of samples averaged together over the duration the shutter was open. I'd guess you could get a reasonable approximation at 3x the framerate.

      TV and Movies are also filmed with the 24fps limitation in mind-- good cinematographers are well aware of the limits and know how to avoid situations that would result in jerky movement.
      • At a *bare minimum* producing motion that looks as smooth as blurred 24fps requires double that.

        The simplest example I give to people who say '24fps is enough' is to tell them to wave their mouse cursor around onscreen as fast as they can. The cursor image is being updated at at least 30fps (more like 60fps), yet you can still see discrete cursor images with gaps between them as opposed to one smoothly-moving cursor.
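The "bare minimum of double" argument above can be sketched numerically. A toy model, assuming constant-velocity motion and a film shutter that stays open for the full frame interval:

```python
# Toy model of film motion blur vs crisp GPU frames.
# Object moves at constant velocity: position(t) = t.

def film_frame(position, t0, t1, samples=100):
    """A film camera averages position over the open-shutter interval [t0, t1)."""
    dt = (t1 - t0) / samples
    return sum(position(t0 + i * dt) for i in range(samples)) / samples

def gpu_frame(position, t):
    """A GPU samples one exact instant -- no motion blur."""
    return position(t)

position = lambda t: t

# A 24 fps film frame smears the object across the whole 1/24 s interval,
# so its blurred image sits near the midpoint of the motion (~1/48):
blurred = film_frame(position, 0.0, 1 / 24)
# A 24 fps GPU leaves the eye a full 1/24 s positional jump between frames:
gap_24 = gpu_frame(position, 1 / 24) - gpu_frame(position, 0.0)
# Doubling the render rate halves that jump -- the "double at minimum" claim:
gap_48 = gpu_frame(position, 1 / 48) - gpu_frame(position, 0.0)
```

The mouse-cursor demonstration is the same effect: crisp instantaneous samples leave visible positional gaps that a blurred exposure would have filled in.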

  • You know, if my GF4 Ti4200 had SLI capabilities that would make my life much easier right now. I could step up the whole level of graphics performance in my system by a VERY noticeable margin. I look forward to these new cards because it means by the time my $200 video card is going the way of the dodo I can shell out $75 or whatever and add a second one and voila, extended life.
