GeForce 7800 GTX Review

ThinSkin writes "ExtremeTech has the first review of nVidia's latest GPU architecture, the GeForce 7800 GTX. Benchmarked against nVidia's previous 6800 Ultra and ATI's latest Radeon X850 XT PE, the 7800 GTX comes out as the fastest video card to date. The unit ships today with a price tag of $599. While nVidia may enjoy this brief moment in the limelight with the fastest card, it may be short-lived once ATI comes out with its latest GPU technology, code-named R520, which is expected to come out within the next two months."
  • And the ATI R520... (Score:3, Informative)

    by daveschroeder ( 516195 ) * on Wednesday June 22, 2005 @09:49AM (#12880862)
    ...has hardware H.264 codec support [xbitlabs.com].

    And this technology is, in part, targeted at low- to mid-range systems and laptops, meaning it won't be limited to $599 video chipsets...further meaning that it wouldn't be beyond the realm of possibility, since Apple is already an ATI customer, for Apple to use something like this in a Mac mini-type product, answering the question of "how could the Mac mini possibly play back HD?" in the Mac-mini-as-HD-media-center or Mac-mini-as-iTunes-HD-Movie-Store-player scenarios.

    Off-topic? No, the R520 is mentioned directly in the submission, and one of its primary features is H.264 [apple.com] hardware acceleration. This is huge.
    • So in that case, if the R520 (and its associated H.264 support) is aimed at low- to mid-range systems, why is the article saying that it'll possibly surpass the speed of the new NVidia offering? Is it going to be both fast and cheap? That'd be nice.

      -Jesse
      • by TobyWong ( 168498 ) on Wednesday June 22, 2005 @10:08AM (#12881040)
        The R520 is going to be insanely expensive, just like the 7800GTX. These are bleeding edge enthusiast level cards not really intended for the mass market. The nutjobs aka early adopters spend $600+ USD to get the latest and greatest vid card and then a year later everyone else gets the same technology for $150.

        I say this as one of the aforementioned nutjobs.

    • uh, BFD? (Score:4, Informative)

      by ashpool7 ( 18172 ) on Wednesday June 22, 2005 @10:27AM (#12881222) Homepage Journal
      • Re:uh, BFD? (Score:2, Interesting)

        The 7800 is $599.

        Some of the R520 family offerings will be targeted at entire computers that are under $500.

        So, yes, BFD.
        • Re:uh, BFD? (Score:3, Insightful)

          by Slack3r78 ( 596506 )
          Are you really this unaware of how the GPU market works? There will be budget versions of the nVidia cards. Period.

          In fact, nVidia is the choice of many enthusiasts right now specifically because they pulled through with a solidly performing mid-range card in the 6600 where ATI failed to do so. ATI's competitor to the 6600GT was supposed to be the X700XT - IE: the card that was paper launched and *NEVER* made it to market. So while ATI had a slight lead on the high end this time around, nVidia was the way
    • It is only H.264 decode support. It would be a dream come true if it had H.264 encode support in hardware.
    • by DeadBugs ( 546475 ) on Wednesday June 22, 2005 @10:33AM (#12881259) Homepage
      Actually NVIDIA's old cards that have been out for over a year support H.264.

      http://www.nvidia.com/object/IO_16213.html [nvidia.com]

      If you download the latest Windows Media Player and have a 6xxx series video card with "PureVideo" you can run hardware-accelerated H.264 video.

      Even the ultra-cheap 6200 line (which would work just fine in a Mac mini) can do this.
      • In that whole document H.264 is mentioned once, and only to say that PureVideo is "adaptable," which presumably means that it may support it in the future. From Anandtech:

        NVIDIA has also said that the 7800 GTX should support H.264, but that the driver will not have support until near year's end. As we have already seen an H.264 demo from ATI, the lack of anything tangible from NVIDIA at this point is disappointing. We are hesitant to even mention NVIDIA's claimed "support" before we see it

      • There is a lot of difference in power consumption and heat between the Radeon 9200 and the 6200. The 9200 is basically a Radeon 8500, but built on a smaller die to reduce heat and power, and underclocked compared to the 8500. This leads to very little heat and power use. While the 6200 is not the glutton that the 6800s or X850s are, it still requires lots of power and a large heatsink.
  • by mister_llah ( 891540 ) on Wednesday June 22, 2005 @09:50AM (#12880865) Homepage Journal
    Hmm, $600?

    I think I have a spare kidney. ...

    Bastards.
    • Bundle this card with two 17" flat panels and some means of driving both of them, for $1,000, and I might go knock over a few grocery stores.
      However, having my naughty bits tied to Redmond is unappealing; if you haven't the guts to GPL the drivers, at least make sure that the media-video/nvidia-kernel ebuild is up to snuff.
      <lumberg voice>Thanks,</lumberg voice>
      Chris
    • Spending over $200 for any video card is just nuts to me, but if you stop and think about the transistor count and sheer power in today's cards, the price is justified compared to a motherboard and CPU.

      If they bundled a driver that emulated an x86 processor on the GPU and showed up as a normal CPU in Windows or Linux, so you could run highly optimized vector and matrix math on it, they might open a whole new market for these cards in the scientific community.
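
      A rough sketch of the kind of kernel being described here: dense vector/matrix arithmetic in which every output element is independent, exactly the sort of work a GPU's parallel pipelines are built for. The C below is only a plain reference implementation of such a kernel (a matrix-vector multiply), not GPU code; in 2005 actually offloading it would have meant packing the data into textures and writing the loop as a fragment shader, since no vendor-neutral GPU compute API existed yet.

          #include <stddef.h>

          /* Reference matrix-vector multiply: y = A*x, with A stored row-major.
           * Each y[i] depends only on row i of A and on x, so all rows could be
           * computed in parallel -- the property that makes this kind of math a
           * natural fit for a GPU exposed as a vector coprocessor. */
          static void matvec(const float *A, const float *x, float *y,
                             size_t rows, size_t cols)
          {
              for (size_t i = 0; i < rows; i++) {
                  float sum = 0.0f;
                  for (size_t j = 0; j < cols; j++)
                      sum += A[i * cols + j] * x[j];  /* one multiply-add per element */
                  y[i] = sum;
              }
          }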
  • RSX (Score:2, Interesting)

    by Anonymous Coward
    How does this compare to the RSX planned for the PS3?
    • Re:RSX (Score:3, Funny)

      by VAXcat ( 674775 )
      It's a crime that the sacred letters RSX are being used for anything on the PS3, or on cars...RSX11M V3.2 forever!
    • Re:RSX (Score:3, Insightful)

      by The Kow ( 184414 )
      I was at a presentation NVidia held here in San Francisco where they talked about it, and my vague understanding (they mentioned the PS3 mostly just to keep the crowd whipped up) was that the PS3 card was based on the 6800 model, though it would still support stuff like Shader Model 3.0, and possibly their High Dynamic Range rendering, too.
  • Wonderful (Score:5, Funny)

    by WayneTheGoblin ( 843267 ) on Wednesday June 22, 2005 @09:51AM (#12880877) Homepage
    So now I can almost play Doom 3.
  • by Anonymous Coward on Wednesday June 22, 2005 @09:52AM (#12880886)
    That's not limelight but the healthy green glow of a power supply pushed to the limit.
  • by It doesn't come easy ( 695416 ) * on Wednesday June 22, 2005 @09:52AM (#12880888) Journal
    While nVidia may enjoy this brief moment in the limelight with the fastest card, it may be short-lived once ATI comes out with their latest GPU technology, code-named R092064262670, which is suspected to come out within the next two minutes.
  • In some respects, laptops will always lag behind desktops as it always takes longer to miniaturize than to develop in the first place. So I wonder what the best graphics chips are for laptops and how much of a time lag we can expect between the release of a new card and the time it takes to put it into a portable machine.

    Desktops are very cumbersome and difficult to carry to LAN parties and elsewhere, which is why I prefer laptops (even desktop-replacement laptops are more portable than true desktop compu
    • The reason is that the latest and greatest graphics cards guzzle power like there's no tomorrow, and battery development is lagging way behind all the other bits.
      Oh, and cooling them isn't too easy either.
      Even when laptop versions of GPUs are released, they're usually castrated versions with lower clocks and fewer pipelines (and occasionally completely different cores than the name would suggest).
      As with all things, you have to compromise. You want the fastest performance - you have to pay a premium. You want
  • The 6600GT seems to be the best bang for the buck right now, and unless you have money to burn and do nothing but play video games (or you make money off of using such a high-quality video card), there's no point in buying the 7800.
  • So much money for a card that hardly performs all that much better than a 6800 Ultra. I'm an nVidia fan, but if R520 is as good as everyone says, I'm getting that...in a year or two of course...unless an IT job actually pays me good money... *runs to closet to cry*
    • I'm an nVidia fan, but if R520 is as good as everyone says, I'm getting that

      Stick with nVidia, especially if you're running an OS other than Windows. ATI drivers in Linux still stink.

      unless an IT job actually pays me good money

      So.. much.. pain.. felt.
      • I don't mean to troll or be flamebait here, but why would you buy an extremely high end video card if you're going to use Linux? Generally speaking, people buy high end video cards to game. The overwhelming majority of games only have Windows support (though things are happily getting better for the Linux crowd, if slowly). Linux support isn't generally a concern for those spending $600 on a video card to play video games with.

        Completely off topic, but are there emu10k1 drivers for the 2.6.x kernel yet
  • I know the lines have been drawn between the Nvidia folks and the ATI folks. Having used both myself, I'm more of an Nvidia guy personally, but I respect both sides... I'm just glad that both companies are actively involved in making better products, because without competition, I think the market would stagnate pretty quickly. It's competition that drives us (ok, them!)

    As much as I'm not happy about a $600 card, I'll probably wait a year or two until it drops to maybe around $400, then I'll bite. I'd like to
    • by rAiNsT0rm ( 877553 ) on Wednesday June 22, 2005 @10:36AM (#12881285) Homepage
      This is precisely what is wrong: we need stagnation so that developers can actually focus on and utilize a video card. In the current state, NONE of the features of current cards are being utilized properly.

      Ever notice how it takes a year or so for console games to really begin to shine? This never happens on the PC because in 14 months 8-10 cards have come and gone. If there were some standardization and a slowdown, the industry could focus on content rather than FPS in a two- or three-year-old game that doesn't utilize ANY of the new cards' features.

      The FX line of cards had the ability to be great but needed to be programmed for directly, and because of trying to cover ATI and other vendors, none of the cool features ever saw daylight (remember the cloth/transparency demos?).

      I know ATI and Nvidia will never try to standardize, nor will they slow the flow of cards with small increases in actual performance at high prices, but if they would, PCs could actually get utilized to their fullest potential (hell, this 7800 GTX TURNS OFF TRANSISTORS to save power, just showing how under-utilized and un-needed they truly are).

      Same for game consoles: standardize, build them into consumer electronics... sell in quantity with less marketing, R&D, and loss, and sell billions of games. It is a win/win for hardware manufacturers and developers... just as soon as people wake up.
      • um, Direct3D? OpenGL? Nobody, strictly speaking, targets any particular video card. They target specific APIs, and it's up to video card vendors to cater to those APIs. Just because NVIDIA makes the only cards today that support Shader Model 3.0 doesn't mean that vendors writing games for SM 3.0 are only supporting NVIDIA. It just means that ATI doesn't support SM 3.0 yet, but their next-gen chip will, and then those games will run the same code as the NVIDIA cards do. :P
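
        To make that concrete: a game asks the API at runtime whether a capability is present and picks a code path accordingly, never "is this an NVIDIA or an ATI board?". A minimal sketch of such a check against OpenGL in C (assuming a GL context has already been created; the extension name is just one real example):

            #include <string.h>
            #include <GL/gl.h>

            /* Pick a rendering path based on what the driver advertises, not on
             * the vendor. Any card whose driver exposes the extension gets the
             * same code path. */
            static int supports_extension(const char *name)
            {
                const char *exts = (const char *) glGetString(GL_EXTENSIONS);
                return exts != NULL && strstr(exts, name) != NULL;
            }

            static void choose_render_path(void)
            {
                if (supports_extension("GL_ARB_fragment_shader")) {
                    /* per-pixel shader path */
                } else {
                    /* fixed-function fallback */
                }
            }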
  • by alstor ( 587931 ) on Wednesday June 22, 2005 @09:58AM (#12880948) Homepage
    Tom's also has a set of reviews and links available.

    http://graphics.tomshardware.com/graphic/200506221/index.html/ [tomshardware.com]
  • er nvidia.. (Score:2, Interesting)

    by DeathByDuke ( 823199 )
    ...why is this called Geforce 7? It behaves and performs more like a Geforce 6 refresh. It should've been a 6900.

    On another note, is the price tag worth it? There are a lot of GeForce 6800 Ultra/Radeon X800XT/850 users who aren't going to see more than a 10fps increase in Doom 3 at 1600x1200 4xFSAA.

    Don't get me wrong, I'm not trolling. I'm disappointed. I, like many others doubtless are, am checking these reviews with a view to maybe purchasing such a card in the future. I'm actually conce
    • You're definitely right. And if you look at some of the gains, you'd notice that they start from really high frame rates to begin with, so it doesn't really make THAT much of a difference with respect to how much you're paying for the card.
    • They name the cards based on the GPU version. This is the GeForce 7 chip. This also isn't going to be the end-all, super-high-end GeForce 7, as I would assume an Ultra version is to be released with a price tag that includes an arm and a leg.
      • No, they're replacing 'Ultra' with 'GTX' now. So this is the equivalent of the Ultra model from previous generations.
        • Well I'll stick with my recently-purchased 6600GT for now then. Those numbers are horrible for a new top-of-the-line card. Shame on you nVidia!
    • Because it's not a refresh?

      It's a completely new GPU.

      "refresh" = same GPU with faster clocked memory or something along those lines.

      If you believe any of the rumors then Nvidia is waiting for ATI to announce their R520 before they unleash their own "ultra" 7800 model or whatever they end up calling it.

      I wouldn't be surprised at all if this were true, considering the conservative memory clockings on the GTX. The GTX also only has 256MB of RAM on board, and you gotta figure there's a 512 meg part waiting in the w
    • Re:er nvidia.. (Score:3, Insightful)

      by 2megs ( 8751 )
      On another note, is the price tag worth it? There are a lot of GeForce 6800 Ultra/Radeon X800XT/850 users who aren't going to see more than a 10fps increase in Doom 3 at 1600x1200 4xFSAA.

      If you already own one of those cards, possibly not, but not everyone has bought a new video card in the last six months. To someone with, say, a Radeon 9800 XT, perhaps the jump in performance has now gone from "not worth it" for a 6800 Ultra to "hey, that's a big step up". Similarly, the X800 was a worthwhile upgra
  • Brand loyalty... (Score:4, Insightful)

    by Xugumad ( 39311 ) on Wednesday June 22, 2005 @10:00AM (#12880966)
    "...gamers and PC builders staunchly defend their favorite brands while throwing mud in the face of the other, treating anecdotal evidence as gospel"

    Am I truly the only person willing to switch happily between Nvidia and ATi, depending on which best fits my needs at the time?
    • by crow ( 16139 ) on Wednesday June 22, 2005 @10:09AM (#12881052) Homepage Journal
      I'll only buy nVidia because it's the only one with halfway decent drivers for Linux.
    • I tend to be loyal to NVIDIA for a few reasons. But from a purely practical point of view, I just think ATI's drivers tend to suck. It's really a shame, because I feel like, in general, they make pretty nice hardware. It's not always the most innovative hardware, but it's good. But their drivers suck and are buggy, and their OpenGL implementation is abysmal. Don't get me started on Linux drivers.

      That said, ATI is working on rewriting their OpenGL implementation, and I heard they're expanding their Li

      • You know what though? ATI always has "good" drivers just around the corner. Heck, I know it's not a really fair comparison, but for just one game (City of Heroes) I have a desktop with a GeForce 5900 and a laptop with a Mobility Radeon X300. Now, the X300 is a low end card, so I'm not expecting a whole lot out of it, but the fact that 20% of the time it forgets to render scenes entirely (on the latest drivers no less!) and just leaves the screen black is just annoying. The fact that it messes up the cur
    • Usually the reason for this is that the person has been burned by one company or the other in the past and has switched. Or they've had exceptionally good performance from one brand and seen others get abysmal performance from the other.

      Personally, I'm an ATI fan because back when I was upgrading my computer for the first time, I saw how the GeForce2s were the best cards on the market (at the time Voodoo, which I'd used before, was in its death throes). I saved my pennies and bought a GeForce2...

      • Re:Brand loyalty... (Score:5, Informative)

        by default luser ( 529332 ) on Wednesday June 22, 2005 @12:35PM (#12882421) Journal
        I saved my pennies and bought a GeForce2...MX...and found out what a horrible decision that was. That card was actually worse than the card I already had AND the GeForce 1 Ultras

        This was standard practice well before Nvidia released the GeForce 2 MX. Nvidia had already pissed off the world by releasing the TNT2 M64, which performed worse than the original Riva TNT.

        Your venerated 3DFX is also guilty of such actions, by releasing the Banshee six months after the Voodoo 2. This single-pipe combo card performed worse than a single Voodoo 2, and offered no SLI upgrade path.

        You'll get no condolences from me. Price normally reflects performance in this market. The GeForce 2 MX was actually a steal at the time it was released; it was one of the best-performing budget cards ever. It bested the previous-generation GeForce SDR in performance, something you wouldn't expect from a budget card. It was, however, beaten in performance by the GeForce DDR...and the later breakdown into the MX 200 (64-bit) and MX 400 (128-bit) models only cheapened the MX brand.

        As far as I know (as in, this might not be the case in the recent past with the new PCI-X cards) ATI's numbering scheme is straightforward.

        Actually, ATI has been the WORST offender in this category, especially in the 9xx0 series of cards. For a simple example, the Radeon 9000, 9100, 9200 and 9250 are all DirectX 8.1 cards, and are all actually slower revamps of the Radeon 8500. This is contrary to the "9000" series numbering, which at the very least would imply these cards would have *some* defining new features.

        But let's look at your examples; they have issues too...

        A 9600 is worse than a 9700. A 9600 Pro is worse than a 9700.

        True, but is a 9600 XT faster than a 9700? The performance is closer than you'd think. Is there really a need for the 9600 XT when the 9700 already exists? ATI sure thought so.

        A 9700 Pro is worse than a 9800, etc.

        Not true.

        9700 Pro: 325MHz core, 620MHz DDR memory.
        9800: 310MHz core, 580MHz DDR memory.

        There was little change in the core between 9700 and 9800, so the clock speeds can be directly compared.

        This, of course, ignores the extremely annoying lower-cost "128-bit" Radeon 9800 cards (which are not well marked), the 9600 SE cards that are barely as capable as a 9200, and the 9550 series (introduced well after the 9500 was replaced by the 9600).

        It's much easier than trying to explain to them "Oh, get the 7800GT, not the 7800LT" (or whatever their latest business-class card is for that generation)

        While Nvidia is just as guilty as ATI of playing the name game and causing ludicrous overlap (Nvidia FX series especially), they have really cleaned up their act with the 6000 series.

        This is the entire lineup:

        6200 TC, 6200

        6600, 6600 GT

        6800, 6800 GT, 6800 Ultra

        That's it. Compared to ATI's xXX0 PCIe lineup numbers, this is a walk in the park. Furthermore, there is no overlap between series (except say, overlap created by companies like BFG Tech who sell overclocked parts, but that's out of Nvidia's hands).

        The 6200 is slower than the 6600.

        The 6600 GT is slower than the 6800.

        And now, the 7800 is faster than the 6800 Ultra.

        What's so confusing here?
    • I'm still pissed off at ATi for not releasing XP drivers for a card I had when nVidia did. 4 years ago, sure, but I hold a grudge. I do my best to never give ATi any money.
  • by gmknobl ( 669948 ) on Wednesday June 22, 2005 @10:00AM (#12880974) Journal
    A few things:

    a) They are pricing themselves beyond reason even for enthusiasts. Not too long ago, the top level for a graphics card was $400. That was expensive but within reach. I think they may be passing the point where even the enthusiast crowd will purchase this.

    b) Most people will wait until the next products come out from them and ATI. I mean, when you know that cheaper products will come out with most of the performance AND that better products will come out with better performance in this same series, why buy this? Just one example - remember ATI's 9700.

    c) It's just for prestige anyway. That's the real reason this card has been released. They'll wait until ATI comes out with a reply card, wait a few months, and come out with something faster again and get good PR OR not have anything faster and suffer the consequences in bad PR.
    • I think I agree with you. But it's hard to speculate. NVIDIA's sales will determine if they're pricing it out of reach. If people buy it for $600 anyway, then they'll continue to price new boards at $600 when the 8000 series first comes out. If nobody buys this thing now, then hopefully they'll lower the price for the next series.
    • Newest top end (Score:3, Informative)

      Well ... the very newest, top-end cards may be out of reach of all but those who consider "quad opteron" a serious option for their next gaming box. Don't laugh, there are certain to be a few out there.

      Thing is, it doesn't matter. Doing so:
      • means they can still claim to have "the fastest card on the market" even if they can't afford to sell many (remember, at small yields it can cost MEGABUCKS to make these things);
      • makes the other cards in their range look more reasonably priced by comparison
      • makes the
  • Price Point Comment (Score:3, Interesting)

    by dannyitc ( 892023 ) on Wednesday June 22, 2005 @10:07AM (#12881026)
    Quoted from anandtech:

    One of the most impressive aspects of this launch is that the part is available now. I mean right now. Order it today and plug it in tomorrow. That's right, not only has NVIDIA gotten the part to vendors, but vendors have gotten their product all the way to retailers. This is unprecedented for any graphics hardware launch in recent memory. In the midst of all the recent paper launches in the computer hardware industry, this move is a challenge to all other hardware design houses.

    ATI is particularly on the spot after today. Their recent history of announcing products that don't see any significant volume in the retail market for months is disruptive in and of itself. Now that NVIDIA has made this move, ATI absolutely must follow suit. Over the past year, the public has been getting quite tired of failed assurances that product will be available "next week". This very refreshing blast of availability is long overdue. ATI cannot afford to have R520 availability "soon" after launch; ATI must have products available for retail purchase at launch.

    I would assume one of the reasons the price point is higher is the fact that this card was pushed to retail much faster than either NVIDIA or ATI has managed before. I would suspect that, given an amount of time comparable to the normal lag between launch and having the card available on shelves, the price will be more comparable to the launch prices we're accustomed to seeing.

  • by rAiNsT0rm ( 877553 ) on Wednesday June 22, 2005 @10:10AM (#12881054) Homepage
    No doubt that this is the heart of the PS3; even the 302 million transistor comparisons were the same ones used at E3. We are seeing today what the GPU of the PS3 will be, and it is pretty darn impressive. However, even in volume the price point is very high... even a year down the road I can't see this bugger going lower than $300-400 retail.

    Even at a loss the PS3 seems to be placing itself in the $400+ market as thought.
    • No doubt that this is the heart of the PS3.

      Don't be silly. PS3 is coming out in a year, there's no way Sony will be buying a chip that has been out in mass market for a year for their new flagship entertainment product. On top of that, this GPU is the same as the previous one from nVidia, with extra pipelines -- it's hardly impressive!

      IMO, the PS3 will be using the next-generation GPU that will most likely be available for PC at about the same time as the PS3.
    • No, they've stated before that the RSX is not yet finished.

      This is the GPU that was used at E3 for the PS3 demos. But this is not the same as the RSX GPU that is going to go into the PS3. They were just using the G70 because that's the best thing that they had immediately, and it's what they've been providing the PS3 developers to use until they can actually get RSX hardware to them. That's the same thing Microsoft was doing by sending Xbox 360 developers Apple PowerMac machines to develop with. It's

    • by rAiNsT0rm ( 877553 ) on Wednesday June 22, 2005 @10:52AM (#12881443) Homepage
      I understand what you all are saying, but this is basically the heart of the PS3. It is what was running at E3, and no one would put that much effort into developing for a temporary vid card (especially the Unreal engine). Nothing major is going to be different between this and the RSX, just small tweaks... otherwise the transistor count wouldn't be similar nor the featureset.

      Sony needs the price to be reasonable, these will be stable in production by then and even if there are slight differences in production the major core will be the same. The costs will be down and this will indeed be basically the heart of the PS3. HDR, transparency, AA/AF, all these will be what the PS3's new titles utilize. Any variation from the 7800GTX to the RSX will be minimal.
      • One interesting thing to note is the focus on generic shaders. In all previous-generation GPUs from both NVIDIA and ATI, the drivers have done runtime substitution of shaders, in order to optimize certain parts of very popular games for certain hardware.

        The 7800 is relying on having a really robust general-purpose shader engine. For example, they recognized that the MADD instruction is used a lot, so they've got it supported in multiple ALUs rather than one.

        This is important for the PS3, f
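
        For reference, the MADD mentioned above is just a multiply feeding an add, treated as one operation; shader inner loops (dot products, matrix transforms, lighting) are essentially chains of them, which is why giving it a home in more than one ALU pays off. A small illustration of the pattern in C, where the C99 fmaf() call maps to a single fused multiply-add on hardware that provides one:

            #include <math.h>

            /* A dot product is nothing but repeated multiply-adds, the operation
             * shader ALUs are built around. */
            static float dot4(const float a[4], const float b[4])
            {
                float acc = 0.0f;
                for (int i = 0; i < 4; i++)
                    acc = fmaf(a[i], b[i], acc);  /* acc = a[i]*b[i] + acc */
                return acc;
            }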

  • Reportedly will come with a nail stuck in it!
  • Need more power... (Score:5, Interesting)

    by Duncan3 ( 10537 ) on Wednesday June 22, 2005 @10:14AM (#12881087) Homepage
    100 watts... joy.

    Someone needs to build a card that draws single digit wattage and will drive 2048x1536 displays, and they will sell loads of them. I cannot be the only one sick of the jet engine noise and space heater performance.

    Ya know, like a Mac mini, only with high resolution.
    • It's called compromise.

      An MSI FX5200 has no moving parts, gets ~30-45FPS at 800x600 or so [1024x768x16bpp works fine too] in games like UT2K4.

      Yeah, sure it isn't 1600x1200 with 16xAA and 78-bit colour ... but it's also only 100 bucks and doesn't make noise ;-)

      Tom
      • My FX5900 is still doing a great job for me both in Windows and Linux. Considering that my most GPU intensive games are Half Life 2, Doom 3, and Neverwinter Nights, I see no reason to upgrade in the near future.

        Maybe when Duke Nukem Forever [3drealms.com] is released I'll think about an upgrade. The 5900 was a pretty substantial leap from my previous 4MX and has held up pretty well in the year or so that I've been using it.

      • Low-end nVidia cards such as the 5200 you mention do a poor job [anandtech.com] with high-res 2D, though, which was the focus of the parent poster's question. In the 2D arena, ATI's offerings do a better job, apparently because they come from fewer OEMs, who use consistently higher-quality parts.
        • they do what? I use the MSI FX 5200 on my 17" LCD and it looks just fine.

          Maybe I'm not videophile enough, but it's crisp enough to edit source code/papers [with odd fonts and symbols] and yet capable enough to handle a bttv device and run 3D games...

          Tom
    • by Zed2K ( 313037 )
      As far as noise goes, I was also tired of it with my 6800 Ultra. I picked up a Zalman cooler for these cards. Yes, the one with the massive heatsink and fan. And it is amazing. Not only do I run about 20 degrees cooler, but it is silent even at the fastest fan rate. I liked it so much I just ordered their CPU cooler for my 3.4GHz P4, now that that has turned into a turbine under load.
    • Any card [newegg.com] can drive a 2048x1536 display if all you're doing is relatively static desktop stuff (check the specs).

      But not even a vacuum cleaner video card will play Doom 3 at that resolution today. And as soon as you create a quiet, low-power card that can handle Doom 3 at that resolution, someone will write a game that only runs at 800x600 at 10 fps on that card, and you'll need to buy a newer card.

      Which are you asking for? You can get a quiet card that will drive a big-ass display, but don't expect it to
    • Someone needs to build a card that draws single digit wattage and will drive 2048x1536 displays, and they will sell loads of them. I cannot be the only one sick of the jet engine noise and space heater performance.

      You most surely are not. I see no need to buy a card that takes more power than most entire laptops. Sure, the explosions might be prettier, but do people not realize that the power that goes into their PCs actually comes from somewhere? These things are like the Ford Excursions of the comput
    • by radish ( 98371 )
      The problem isn't the GPU, it's the cooler. It's amusing to me that you pay $600 for a video card and get a $5 fan on it that sounds like a leaf blower. Solution: replace the cooler. I switched the POS on my 6800 with a Zalman cooler and it's wonderful. Took 10 minutes and $25 and now it's (virtually) silent, cooler, better looking (if that matters to you), and more stable. There are also some Gigabyte cards out there based on 6800 chipsets with passive (heatpipe) cooling. Haven't tried one myself, but thos
  • Death of PC gaming (Score:3, Insightful)

    by Snowbeam ( 96416 ) on Wednesday June 22, 2005 @10:15AM (#12881099) Homepage
    There has been some talk about the death of PC gaming. With video cards costing this much, it's cheaper to just buy a gaming console and get better effects out of that.
    • Keep in mind that NVIDIA has so far only released the 7800 GTX (the equivalent of their 'Ultra' names from before), and that we have expected these to be really expensive. Once they release the 7800 GT, we'll discover if they're actually pricing the video cards out of reach.

      The Ultra and GTX boards are the super high-end enthusiast boards. And I doubt that Xbox 360 or PS3 GPUs will be better than the best PC GPUs even at the time of their release. If they are, it will be very short-lived. The PCs will

  • Summary: (Score:2, Funny)

    by beef3k ( 551086 )
    Your games will run about 15-20 fps faster, or a general 30% performance gain.

    Conclusion: Spend your money on beer instead.
  • Two months is not a "brief moment" in the graphics card industry.
  • R520 (Score:3, Funny)

    by williamhooligan ( 892067 ) on Wednesday June 22, 2005 @10:22AM (#12881169)
    it may be short-lived once ATI comes out with their latest GPU technology, code-named R520

    What a crap codename. If I was inventing what was going to be the fastest chip around, I'd have called it "Codename: BASTARDFIRE" or "SHITSTORM" or something. Let the marketing guys mod it down to R520 upon release.

  • While it's true that this doesn't seem to be much of an increase in speed, especially compared to the move from nVidia's 5xxx FX series to the 6xxx series of video cards, the 7800 GTX is faster than ATI's current fastest card. This means that when a regular (non-nerd) person who plays the occasional computer game asks his/her computer nerd friend "hey, who makes the fastest video cards right now," the response will be Nvidia. That means that the non-nerd person will be more likely to buy the Nvidi
  • by aendeuryu ( 844048 ) on Wednesday June 22, 2005 @10:26AM (#12881213)
    When all those articles were coming out about Doom 3 being such a sophisticated engine that current hardware couldn't take full advantage of it, I couldn't help but wonder, how do you know that? How do you test a claim regarding performance on non-existent hardware? So, that got me wondering, was Doom 3 tested on nvidia 7800 prototypes, or maybe 8800 (pretending it exists)? Further to the point, if Id has access to this avant garde stuff, what can we expect?

    I'm not writing this as a skeptic. I'm honestly just curious.
    • That's just his way of deflecting from the fact that the game sucks ass to "oh, your card is not good enough".

      People tried that in the early-to-mid '90s, saying you'd need 16MB of RAM and a 90MHz 586 to play a certain game or two. People didn't go for that then... why are they going for it now?

      Of course I question the use of these cards and think that their mass production is just a drain on society in terms of wasted power. When people can render a game at 500FPS and still not enjoy it ... will they finally see th
    • id does get new graphics cards _far_ in advance of their release... something like 6 months to a year in advance. Nvidia and ATI both keep in very close contact with John all the way through the development process. John has even talked about how he has helped them track down driver bugs for unreleased hardware before.

      If you can, try to go to QuakeCon [quakecon.org] sometime. John's keynote is always enlightening (except that last year he gave it via a prerecorded DVD... which was kind of boring.... but I guess the bi
  • Nvidia Solid (Score:5, Interesting)

    by augustz ( 18082 ) on Wednesday June 22, 2005 @10:47AM (#12881396)
    Lots of somewhat bogus postings.

    The 7800 performs significantly better than the 6800. In fact, reading through the (many) reviews that all popped up with the NDAs expiring, at higher resolution / anti-aliasing settings a single 7800 is beating dual 6800s in SLI. Of course, choice of benchmark affects these results, but it does look like a generational increase in speed.

    In addition, it uses LESS POWER. No one seems to be mentioning this, but these cards suck up ridiculous amounts of power. This bodes well for cheaper versions.

    And cheaper versions are going to be coming, this release is for the insane gaming crowd that is already spending $1k on SLI setups. The price/value at this point is not the point, it is just about how fast you can go.

    ATI feels like they are a generation behind to me. They are coming out with first-gen SLI and first-gen Shader Model 3 support, while Nvidia is already on its second spin.

    The key of course is when they release their next gen part (and by this I mean actual retail volume, not a paper launch). In six months another cycle of cards will be coming through, so one has to be careful to compare apples to apples.

    Plus of course there is the nice AMD64 and Linux support (not perfect, but good) from Nvidia. Bottom line, will wait to see the ATI part, and how available it actually is, before singing its praises.
  • by glMatrixMode ( 631669 ) on Wednesday June 22, 2005 @10:53AM (#12881457)
    They fail to deliver useful drivers for *nix. X.Org developers should be able to implement everything they want, and for that they need better-documented hardware. Only then will we have a real eye-candy, hardware-accelerated desktop à la Quartz Extreme.

    This is why the Open Graphics Project [duskglow.com] is so important.

    The project has already been mentioned twice on Slashdot, but since then it has made a lot of progress. Skimming through their mailing list archives shows that they're even creating their own company to produce the graphics card. The company's name is "Traversal Technology". A website is coming soon.
  • Is it me, or does saying "ATI will beat it in a couple of months" sound awfully like "Longhorn will be better than this though"?
  • And For That Price (Score:3, Insightful)

    by crawling_chaos ( 23007 ) on Wednesday June 22, 2005 @11:10AM (#12881619) Homepage
    I can buy a PS3 and an Xbox 360, both of which will have games that aren't technology demos masquerading as entertainment (hi id!). Six hundred bucks for a video card is outrageous given the sorry state of PC games today. The kinds of games that excel on the PC (RTS, MMORPGs, and other RPGs) don't really need that kind of processing power, particularly at that price point.

    Anybody else think that this sort of thing just isn't sustainable?

    • by Zed2K ( 313037 )
      "I can buy a PS3 and an Xbox 360,"

      Actually you can't because the PS3 and Xbox 360 don't exist yet and no pricing info has been officially announced for either new console.

      " both of which will have games that aren't technology demos masquerading as entertainment (hi id!)."

      But instead will have games that are repeats of games we've all played a million times over again.
  • by Vaystrem ( 761 ) on Wednesday June 22, 2005 @11:57AM (#12882057)
    http://www.anandtech.com/video/showdoc.aspx?i=2451 [anandtech.com] Anandtech has an excellent review which includes power consumption information and a good overview of the technology in the new chip.

    http://www.beyond3d.com/previews/nvidia/g70/ [beyond3d.com] Beyond 3D, as always, has a fantastic writeup including information on CPU utilization for video decoding, noise, power consumption, etc.
