
NVIDIA 6200 w/ TurboCache Released

duanep writes "Gamers Depot has posted a first-look review of NVIDIA's just announced GeForce 6200 cards with TurboCache - the first graphics cards that truely take advantage of the PCI Express bus by using system RAM to store textures."
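
For a rough sense of why the bus matters here, a quick back-of-envelope sketch of theoretical peak bandwidth (textbook figures only, not benchmark results):

    # Rough theoretical peak bus bandwidth figures -- illustrative only.
    AGP8X_MB_S = 2133                    # AGP 8x peak (effectively one direction at a time)
    PCIE_LANE_MB_S = 250                 # PCI Express 1.0: 250 MB/s per lane, per direction
    PCIE_X16_MB_S = 16 * PCIE_LANE_MB_S  # x16 slot: 4000 MB/s each way, full duplex

    print(f"AGP 8x peak:          {AGP8X_MB_S / 1000:.1f} GB/s")
    print(f"PCIe x16, per dir:    {PCIE_X16_MB_S / 1000:.1f} GB/s")
    print(f"PCIe x16, both dirs:  {2 * PCIE_X16_MB_S / 1000:.1f} GB/s")
    # That extra, bidirectional headroom is what lets a TurboCache-style card treat
    # system RAM as (slow) texture memory without choking the rest of its traffic.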
  • More reviews (Score:5, Informative)

    by florin ( 2243 ) * on Thursday December 16, 2004 @09:02AM (#11102836)
    Here are some other reviews:
    TechReport [techreport.com]
    AnandTech [anandtech.com]
    HotHardware [hothardware.com]

    Some of these make a little more sense because they benchmark the 6200TC against some of its direct competitors in the low end instead of against a mid range card.

    I think Gamers Depot's conclusion is a bit off too. What's notable isn't that it is slower than enthusiast cards. Of course it is. What's surprising is how well it still runs the very newest games, despite the drawbacks that come with that price range.
    • HotHardware's "thx nvidia for giving us this card" remark

      hothardware:

      Here is something we should mention about performance, however, especially considering the measurable performance hits the 6200s took with AA enabled. Here's a quote from NVIDIA that explains what was happening...

      "The 6200 with TurboCache was designed for the mainstream user. Because of this we made some architectural design decisions such as not supporting color and z-compression. This is not a TurboCache limitation. This was a consci
  • Great for windows / productivity use, and for running the spinning-cube 3D screensaver.

    What's sad is that this card will pop up in a gazillion 'budget' home machines that are then sold by clueless salesdroids to even more clueless moms and pops as 'gaming machines' with 'TURBOcache' (so it must be TURBO good).

    And naturally such a computer will stutter along with anything even slightly more demanding than CounterStrike (the original one).

    *sigh*
    • Clueless Salesdroid must be pretty smart to realise that "mom and pop" don't give a rat's arse how well their PC plays games.
      • by Jarnis ( 266190 ) on Thursday December 16, 2004 @09:18AM (#11102938)
        Well, actually, some do. If their kid is bugging them for a gaming machine, and the salesdroid sells the clueless parents a computer with a part like this, the kid is bound to be disappointed.

        I work in computer repair. EVERY Christmas I get people coming in to ask if their computer is somehow broken because "it's so slow". Almost none of them bought it from us - they bought it cheap from some big chain electronics store (HP, Compaq, Fujitsu... you name it) with non-existent support for computers ('call the manufacturer'). Quite often the 'so slow' is because they've been sold a cheap year-old system with 256MB RAM & Windows XP and a video card that can't run games - sold as a 'computer great for all uses, kids can even play games on it!'

        Basically they were duped into buying not-so-cheap old tech with crap specs. Often, for the same money they could've bought a noticeably faster computer built from parts, but they trusted the 'big name' retail chain more than a specialist store.

        So, I stand by my original post. Clueless salesdroids will sell computers that contain these cards as 'great for gaming', and their target audience will be disappointed.

        If you want a computer for productivity apps, any built-in onboard video works just fine, and is cheaper to boot. A PCI-E 'TurboCache' low-end card is not going to change your Windows desktop experience one iota. It's just a piece of junk 'low-end gaming card' that underperforms for its target use (gaming). Selling cheap crap cards under the same brand name (GeForce) as their top-end $500 ubermonster cards is called 'milking the brand at all price points'. At least AMD has the decency to sell their low-end stuff under another brand (Sempron). Video card companies should do the exact same thing.

        Thankfully it's noticeably faster than crap like Geforce 4 MX and GeForce FX5200.
        • by Anonymous Coward on Thursday December 16, 2004 @09:22AM (#11102959)
          Maybe mom and pop should buy a "homework" PC, so their snot nosed kid can pay for their own gaming machine some day.
          • Maybe mom and pop should buy a "homework" PC, so their snot nosed kid can pay for their own gaming machine some day.

            Hmm, sounds like someone has issues.... I'm guessing Mom and Dad didn't spring for that new 386 back in the day? :)

            Anyway, your argument is rather silly and counterproductive. Plenty of kids (including me) played a ton of computer video games and still got straight A's. Besides, getting a C64 when I was 7 is what got me interested in computers in the first place. After playing around with th
            • My son and I built his most recent computer. We started a while ago with a leftover Athlon 1200 of mine and a $20 "refurb" motherboard, and other various junk I had in the bin. Over the past couple of years virtually every component has been replaced through Christmas gifts, "great grades" gifts, birthday presents, etc., and he's purchased some parts with his summer job money. I think the only component he still has that he started with is a hand-me-down keyboard. My only restriction on him so far has b
        • I am one of those sales droids, unfortunately. I will specifically tell people what each component does if they will listen to me, and when I say it will play a lot of games fine, but not most newer games, they tell me, "oh, I don't want it for games anyway," and then bring it back in a week saying their kid couldn't play Doom 3 on it.

          So, it's not just the sales people that create this problem, it's customers not listening to the sales people. I will, however, admit I do sometimes tell someone "o
        • I have an FX5200 and recently completed Half-Life. It's not terrible; I practically bought it for the price of its construction. It does run Counterstrike well, and that's actually why I haven't invested in new hardware for a long time. Counterstrike has contented me for the last year and a half.
        • Sorry, it's hard to beat an $800 system from Dell, Gateway, HPaq, or EMachines that also includes a printer, a 17" LCD monitor, and a decent 3GHz P4.

          It's just not possible to do this on your own. Maybe you could if you spec'd it from a whitebox seller, but not if you're getting your parts out of the retail bin, even if you're buying OEM parts.

          For one, buying XP Home/Pro retail eats a good chunk of your budget right there.
          • And, while an FX5200 may *seem* like a crap card, when I bought mine last year for $120, it seemed like a good price point.

            There is just no fucking way I could ever justify buying a $400 video card, not even to myself.

            Besides, Halo runs just fine with it, once I turned off most of the essential crap (like antialiasing, etc), on my Athlon 1400 to boot.

            If I were to recommend a "training" program to FPS players, it would be to play on-line using a slow system with framerate and lag issues. You tend to get V
    • Comment removed based on user account deletion
      • Yeah, but for those stats they tested it on a 3.6GHz P4 with Corsair RAM and a good mobo with an Audigy card in it.

        Anyone who puts this cheap ass video card in a system like that is only crippling their system. Heck, that CPU alone retails for $450. I would rather save money and buy a slightly cheaper CPU and then get a better video card.

        If you match this card up with what it's meant to be paired with, IE a value CPU, then it really is a piece of crap and you'd be better off getting a cheap Geforce 4 off E
      • I ignore benchmarks that test a low-end card with a high-end monster setup.

        Besides, 36FPS is kinda poor. Sure, it beats old low-end crap, and it's actually not that far from good midrange cards. But still, I would *not* recommend it for gaming. Maybe as a low-cost option for those who just can't afford better and understand it might underperform a bit in the latest games.

        Besides, both Doom 3 and HL2 are *not* that intensive on video cards. Try EverQuest 2. It makes 6800 Ultras cry. Thing is, that's forward-looking e
      • The Inquirer (http://theinquirer.net/?article=20318) is getting 6fps?

        Suddenly the Achilles' heel shows up. It uses system RAM as a substitute for video RAM, and boom, with 512MB of RAM the FPS drops to one third of what it is with 1GB.

        Good thing NVIDIA gave guidelines to testers of this card - 'use a fast CPU and 1GB of RAM'.

        This card is pure crap.
    • Well, if you actually RTFA, you would see that the card is playable for modern games at 1024x768, and for slightly older games at 1280x1024. Seeing as, in the types of systems this card is meant for, the consumer is likely to have an LCD with a max resolution of 1024x768 or 1280x1024, the card will work fine.

      Not everyone wants an uber gaming card that takes up multiple slots, sounds like a dustbuster, and costs as much as an entire mid-spec computer. http://arstechnica.com/guides/buyer/system-09-2
      • I read the fucking article, and YOU'RE WRONG. Start from this benchmark [anandtech.com] and keep going. Doom 3 at 800x600 with no AA got 34 and 21.5fps with the new cards. That's barely playable, and unplayable at 1024. Far Cry at 800x600 gets 36.3 and 27.1. In just six months the newest games will only be playable at 640x480.
    • You're too harsh on these low-end cards. A while back I bought my daughters a PC with built-in nForce graphics. Sure, it's not that quick, but it runs their games okay, if not brilliantly. Sure the resolution is normally 640x480 or 800x600, but they seem happy with it, and it's cheap. I figured I could get them a decent card for it if they needed it, but now it doesn't look like I'll need to.

      This is (IIRC) an nForce2 board. It basically has a GeForce4 MX built in, using 32MB of system RAM for video memory.

      N
    • Mmm, yes, sad. I can feel myself getting a little misty even as I type this.

      In other news, mass murderer Osama Bin Laden released a new tape today [reuters.com], confirming he is alive and kicking and intent upon more mass murdering; x people got blown up in Iraq today [news.com.au], where x is a real number between 10 and 300; The Sudanese are starving [breakingnews.iol.ie]; and N. Korea and Iran will probably have a shitload of nukes [rferl.org] by the end of the decade.

      Goddamn those bastards at Nvidia for needlessly adding to the world's sadness.
  • A much better review (Score:3, Informative)

    by Anonymous Coward on Thursday December 16, 2004 @09:05AM (#11102854)
    Since the review posted in the blurb is about as informative as an NVIDIA press release, check out the review at Hexus. [hexus.net] It's not Beyond3D, but it will do.
  • Also at Anand's (Score:5, Insightful)

    by Emil Brink ( 69213 ) on Thursday December 16, 2004 @09:06AM (#11102863) Homepage

    AnandTech [anandtech.com] also has a review up. I'm wondering if this solution will be interesting to... anyone, basically. Perhaps if/once it becomes available integrated into or onto motherboard chipsets.

    Btw, I find AnandTech's terminology annoying: they refer to all graphics memory as "the framebuffer", which I find inaccurate. In my world, the frame buffer is only that part of graphics memory that has a 1-to-1 mapping to on-screen pixels. Front and back buffers, stencil and Z buffers, basically. Not texture buffers, off-screen rendering targets, geometry arrays, and all that stuff. Oh well. Nice review anyway. :)
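
    To put rough numbers on that narrower definition, a quick illustrative sketch (the resolution and buffer formats are assumptions, not figures from the reviews):

        # Size of the buffers that map 1-to-1 to on-screen pixels -- the "framebuffer"
        # in the narrow sense above. Illustrative figures for a 2004-era desktop setup.
        width, height = 1024, 768
        color_bytes = 4          # 32-bit front buffer, same again for the back buffer
        zstencil_bytes = 4       # 24-bit Z + 8-bit stencil

        framebuffer = width * height * (2 * color_bytes + zstencil_bytes)
        print(f"Front + back + Z/stencil: {framebuffer / 2**20:.0f} MB")
        # Roughly 9 MB -- on a 16 MB TurboCache card that alone eats most of the
        # local memory, which is why textures are the part pushed out to system RAM.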

    • Re:Also at Anand's (Score:2, Informative)

      by Riff6809 ( 780550 )
      One definition of a frame buffer is any buffer that stores the contents of an image using individual pixels. Your preferred usage adds the distinction that the buffer is used to refresh a raster image. I do prefer restricting the use of 'frame buffer' to the memory buffer used to refresh the raster display, but there are other instances where the other definition has been used. The Nuon architecture and programming documentation refers to any memory region that is capable of being displayed or manipula
  • Older card better? (Score:4, Insightful)

    by uid100 ( 540265 ) on Thursday December 16, 2004 @09:08AM (#11102882)
    Wouldn't it make more sense to buy a six-month to year-old card that has onboard (and *faster*) memory?
    • And be that much farther behind the Curve? NO! more power to the Engines, Mister Scott!
    • by Quarters ( 18322 )
      If this card were meant to be sold primarily at retail, yes, but you and I are not the target market for this card. Dell, IBM, HP, eMachines, Apple, etc. are the customers nVidia wants with this. To a systems integrator, "Runs the latest DX9 (or OGL) apps and is dirt cheap because it uses system RAM" is a huge selling point. nVidia wants a lucrative contract to supply these things to Dell for 12-18 months.
  • Advantage? (Score:5, Insightful)

    by RAMMS+EIN ( 578166 ) on Thursday December 16, 2004 @09:12AM (#11102902) Homepage Journal
    ``the first graphics cards that truely take advantage of the PCI Express bus by using system RAM to store textures''

    The advantage of which is that you have less system RAM available for other stuff?
    • Re:Advantage? (Score:3, Insightful)

      by Tx ( 96709 )
      The advantage of which is that you have less system RAM available for other stuff?

      The advantage of which is that system RAM is cheaper, and most people have more of it than they need.
      • I strongly disagree. Most people have 256MB or less. There are actually people out there running a PC with 128MB that CAME with XP preloaded. Win2k uses 128MB just booting and logging in, let alone XP. Most people have far less RAM than they need. For instance, one app should not put you into swap.
    • Not necessarily, because this card is largely meant to compete with integrated video chipsets from the likes of Intel, SiS, ATI and yes Nvidia themselves, which typically also use system RAM, but which are nowhere near competitive with this thing when it comes to feature support and performance.
    • What else are you doing while you're gaming? Isn't it hard enough playing Day of Defeat against 14-year-olds who have the reflexes of a pissed-off cheetah?
    • This is the same way Intel used the i740 as a tool to promote AGP. The i740 drivers (and perhaps the hardware) were incapable of mapping textures from local memory. Since all textures had to be stored in system memory, the marketing diagrams showed "OMG! AGP2x is 4 times faster than PCI!!!".

      The i740 architecture eventually became integrated into the motherboard chipset as the i810. In-memory texturing makes a lot more sense there.

  • Am I the only one (Score:3, Insightful)

    by PhrostyMcByte ( 589271 ) <phrosty@gmail.com> on Thursday December 16, 2004 @09:12AM (#11102907) Homepage
    who wouldn't pay $80 for a card with 16mb of video ram? you can get a faster geforce4 card for the same price. no applications that use dx9 are going to run properly on the thing anyway, so what's the point?
    • You can't get a GF4 card in PCI Express, that's why, and most PCI-E motherboards don't have AGP slots.
    • by ionpro ( 34327 ) on Thursday December 16, 2004 @02:12PM (#11106785) Homepage
      ... because obviously, 45.9 fps in Half-Life 2 [anandtech.com] at 1024x768 is unplayable. And just look at those screenshots! All those missing features! [/sarcasm]
  • by WhatAmIDoingHere ( 742870 ) * <sexwithanimals@gmail.com> on Thursday December 16, 2004 @09:14AM (#11102919) Homepage
    Covered on TheReg [theregister.co.uk].
  • "...truely take advantage of the PCI Express bus by using system RAM to store textures."

    Isn't this just a so-called feature of the original AGP spec that nearly no one used because performance sucked and it was cheaper to just place the RAM onboard to begin with?

    • Isn't this just a so-called feature of the original AGP spec that nearly no one used because performance sucked and it was cheaper to just place the RAM onboard to begin with?

      Kinda... the whole idea with AGP was to use system memory instead of the überexpensive RAM on the video card that was available at the time. Unfortunately (?) memory prices started a 3-year dive at the same time the first systems were introduced, making the whole thing unnecessary.
      • But still, this feature has been available in all AGP cards (PCI Express as well). Why are they now telling people that this is a new and exciting technology, when in fact it has been around since AGP was introduced?

        Am I missing something here? What's the difference between TurboCache and regular AGP texturing?
    • SiS made extensive use of it on cards like the 6326, although note that the 6326 also did this for PCI as well as AGP. Ironically this seems to work very well again with modern games as they use shaders and so need a lot less texture bandwidth anyway.

  • by B5_geek ( 638928 ) on Thursday December 16, 2004 @09:17AM (#11102930)
    This feels like Deja Vu all over again.

    I thought we were supposed to hate any graphics card that uses system RAM?!?!

    My guess is either:

    a) Nvidia & ATi want more profit per card than they are getting. Onboard RAM is expensive, so let's try this trick again.

    b) PCI-E is honestly and truly better able to keep up with the performance and memory requirements of modern gamers in a gaming box.

    I think it's all about the $$.
    • First off, for those who didn't RTFA: the card DOES have fast RAM onboard -- 16MB of it -- which hasn't been enough to run a 3D game in a long time, so it's capable of keeping textures (and only textures) in system RAM. This is really only a 'good' solution for the 'value' segment. Frankly, I think the boys at NVIDIA have been feeling the crunch as their 'low end' cards have been 'too good' and not enough people are paying for 'premium' cards. You can find 128MB video cards for as low as $35 online nowadays.
    • PCI-E is pretty damn snappy. I can run UT2004 at 1600x1200 with all the eye candy turned on and the only time I notice the framerate drop is when I'm doing tight turns in fast vehicles. This is with a mid-range ATI PCI-E card, though I doubt that the top of the line model would dramatically improve performance.
  • by Anonymous Coward
    NVIDIA's just announced GeForce 6200 cards with TurboCache - the first graphics cards that truely take advantage of the PCI Express bus by using system RAM to store textures."

    BZZT, WRONG. Here [3dlabs.com] is the first PCI Express video card that stores textures in system memory.

    (For that matter, 3Dlabs were the first to release an _AGP_ card that stored textures in system memory: anyone remember the Oxygen chip?)
  • by Kosi ( 589267 ) on Thursday December 16, 2004 @09:31AM (#11103002)
    And these morons at Nvidia try to sell it as

    a) new - WTF, abusing system RAM for graphics RAM is really old!

    and

    b) faster - BS, direct-attached RAM on the card itself can't be outperformed over whatever bus the card sits in!

    instead of what it really is: a bad and old trick to save costs for real graphics memory.

    They even encourage the card manufacturers to conceal the crippled RAM size, telling them to write "supports up to 128 MB" instead of "has only 16 MB" on the packaging.

    The bad thing is that there are enough idiots out there who will buy this shit that Nvidia will get away with it.
    • Errr... thanks for the helpful comments. And good work there by the mods, modding this up. If you read the AnandTech review, all the way to the end (yes, yes, /. RTFA never happens...) you'll find that NV is forcing packagers to declare both the 'supports up to' figure and the size of the onboard memory, much to the chagrin of system builders and Dell (who don't produce computers, just pieces of shit). The point of this card is the price, and the fact that it fully supports DX9 and perhaps even beyond (when it gets defined
    • The first Intel AGP graphics cards used the same trick and some reviewers actually claimed that they'd out-perform other graphics cards from the likes of 3dfx, etc.

      That didn't happen by a long shot -- AGP never kept up with on-board memory speeds.
    • This card will probably end up being the number one card for OEMs to use in their not-shit systems. Hardly a 'moronic' target market for nVidia.

      Yes, having only 16/32 or 64 MB of graphics memory on a card is cheaper than having 128 or 256 MB of graphics memory on the card.

      Also, nVidia is forcing sellers to state on the packaging how much local memory there is on the card.

      I think that anyone getting a low-end computer with this card will be happy with the gaming performance. Then again, I read the revi
      • It turns out that the memory isn't the part that cranks the costs up. RAM is pretty damn cheap; the difference in memory cost between 64MB and 128MB is negligible. But to add a 128-bit memory bus, you have to add more layers to the card, which boosts production costs significantly. With the 32-bit bus on these cards, a 3-layer video card becomes possible, rather than the 6-8 layers on something like a 6800GT.
    • ...instead of what it really is: a bad and old trick to save costs for real graphics memory.

      But see, you've missed the point entirely. This card is billed as a "value" card; it's not for us, it's for people (read: OEMs) who want to put a $60 card into a machine. Using this "trick", with the bandwidth that PCI-E provides, gets the cards unprecedented performance at that price point.

      PRICE is the priority, here, not performance. They're using this old trick with these new tools (PCI-E) to get good perfor

    • The card costs $79, which means the consumer saves money. Sure, it won't perform as well as a card with onboard memory, but for that price, it is a good card.
  • Truely? (Score:4, Funny)

    by AyeRoxor! ( 471669 ) on Thursday December 16, 2004 @09:42AM (#11103062) Journal
    Interesting. [reference.com]
  • This is not so bad (Score:2, Informative)

    by GeLeTo ( 527660 )
    This TurboCache thing is much better than the original AGP texturing idea (that Intel used to push with their i740 chipsets).

    Imagine that when texturing instead of using 128 bit bus to the on-card memory - the card now uses a 128 bit bus to the on-card memory PLUS(!!!) another 128 bit bus to the local memory thus giving you higher bandwidth for the same cost.

    Of course this can be used to boost the speed of cards with a crippled (slow, 64-bit) memory bus a bit, but in the end you get what you paid for.
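
    A rough back-of-envelope sketch of that combined-bandwidth idea (the local memory width and transfer rate below are assumptions for illustration, not spec-sheet figures):

        # Back-of-envelope combined bandwidth for a TurboCache-style card.
        # Local memory figures are assumed for illustration, not NVIDIA specs.
        local_bus_bytes = 32 // 8        # assumed narrow 32-bit local memory bus
        local_mt_s = 700e6               # assumed effective DDR transfer rate
        local_gb_s = local_bus_bytes * local_mt_s / 1e9

        pcie_x16_gb_s = 4.0              # PCI Express 1.0 x16, per direction (peak)

        print(f"Local memory (assumed): ~{local_gb_s:.1f} GB/s")
        print(f"PCIe x16 reads (peak):  ~{pcie_x16_gb_s:.1f} GB/s")
        print(f"Best case combined:     ~{local_gb_s + pcie_x16_gb_s:.1f} GB/s")
        # In practice system RAM is shared with the CPU and PCIe adds latency,
        # so the usable figure is well below this -- you get what you paid for.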
    • Imagine that when texturing instead of using 128 bit bus to the on-card memory - the card now uses a 128 bit bus to the on-card memory PLUS(!!!) another 128 bit bus to the local memory thus giving you higher bandwidth for the same cost.

      Isn't that the way it's done in AGP texturing? The difference is that these days video cards have so much on-board RAM that it's not really needed. But if you try to store too many textures, the ones that do not fit in the RAM on the video card are stored in system RAM and are

  • I'm hoping they fixed their quality problems on the 6600GT line. Between myself and a friend, 5 cards so far, all with bad video RAM. Go ahead, fire up 3DMark and see if your screen pixelates...

    I ended up getting a Radeon X700 Pro instead, and I had said I'd never buy another ATI card because of their crappy drivers.
  • At the price range of the TC64, one might as well just go get a Radeon 9800 Pro.

    First SLI, now this TC crap, and the decline in quality of the nForce chipset after the shining pinnacle that was the nForce2 Ultra: NVIDIA is really backsliding. Looks like I will refrain from buying NVIDIA products for a while.

  • I'll grant you that's a driver issue, but since the manufacturer insists on maintaining control, they are also responsible for any and all driver bugs.

    If they're the only ones who can fix it, then it's their responsibility.
  • I noticed some interesting text on the graphics [gamers-depot.com]: "Can render directly to system memory with 100% efficiency"

    Isn't this a fix for one of 3D rendering artists' biggest complaints? I don't do a lot of it now, but back when I was working in a shop that did CG animation rendering, our 3D geeks constantly complained that our graphics cards didn't have as much bandwidth out to the system as they did through their video out port. They even tossed around the idea of doing video capture on the graphics card
  • Sounds to me like AGP DiME under another name. Anyone else remember the i740 and its complete lack of texture memory? Yeah, that was a great idea, since it still had to have an onboard 8MB framebuffer. Oh, surprise, surprise, so does the 6200 TC (32MB). Oh yeah, that 32MB is probably also why the card sucks at AA... it doesn't have the memory to spare.

    So, let's see. You're buying an expensive system with PCIe and dual-channel DDR, plus an expensive CPU, and then you pair it with this excuse for a vide
  • by Kris_J ( 10111 ) * on Thursday December 16, 2004 @06:22PM (#11109904) Homepage Journal
    I notice the lack of a fan. Is this the best passively cooled video card on the market? It's better than everything I've currently got, so when I do my next upgrade it might be worth trying for a silent PC, instead of giving up and going for the fastest, loudest thing available. A nice Zalman 7000-series heatsink for the CPU and my gaming PC doesn't have to sound like a plane taking off.
    • Here's a review [firingsquad.com]. Funny you should mention Zalman: this card comes with one of their heatpipe coolers on it. Quite a bit faster, but also nearly twice the price.

      -Ryan
      • You can bung a Zalman on lots of cards, but cards that need something that large still dump a lot of heat into the case that you have to shift. Small, simple, passive is much healthier if you have, say, an Antec Phantom fanless power supply.
