GeForce 7800 GTX 512 Reviewed

ThinSkin writes "Today Nvidia released its latest combatant in the desktop graphics wars in the wake of ATI's new X1800 line, the GeForce 7800 GTX 512. The clock rate has been upped as well as the memory, partly thanks to a truly massive cooling solution. ExtremeTech's Jason Cross does all the benchmarking on a board from XFX, which is slightly overclocked and includes VIVO capabilities. At $650 list, it also sets a new price record for a new generation desktop graphics card."
This discussion has been archived. No new comments can be posted.

    • Re:Other Reviews (Score:3, Informative)

      by plover ( 150551 ) *
      Bigger question: have any of the reviews discovered if nVidia's cheated [extremetech.com] on this benchmark yet?

      It seems that any time ATI or nVidia releases a new card, they've also got some drivers that "optimize" for the 3DMark benchmarking software. So I figure it must just mean that nobody's found out how they're doing it yet.

      It's kind of sad to think that when they announce some obviously kick-ass hardware, all I can think of is "how did they cheat this time?"

      • Re:Other Reviews (Score:3, Informative)

        Bigger question: have any of the reviews discovered if nVidia's cheated on this benchmark yet?
        It seems that any time ATI or nVidia releases a new card, they've also got some drivers that "optimize" for the 3DMark benchmarking software. So I figure it must just mean that nobody's found out how they're doing it yet.

        It's kind of sad to think that when they announce some obviously kick-ass hardware, all I can think of is "how did they cheat this time?"


        The GeForce FX scandal was a few years ago, and no major
        • Re:Other Reviews (Score:3, Interesting)

          by plover ( 150551 ) *
          Because nVidia wasn't the only company cheating. ATI was also found [extremetech.com] to be "optimizing" for benchmarks. Yes, it was a couple of years ago. But you give people too much credit if you think they've stopped being stupid. It won't surprise me at all if it happens again.
      • Re:Other Reviews (Score:2, Informative)

        by dimfeld ( 247690 )
        That's just another reason that reviewers are using real games as benchmarks more and more. It also helps that they give a much better impression of the card's capabilities in real-life situations.
        • Re:Other Reviews (Score:3, Interesting)

          by plover ( 150551 ) *
          Don't be fooled just because it's not an official benchmark program. Remember Quack vs Quake? A benchmarker renamed Quake3 to Quack3, and found the ATI card's performance dropped because the drivers were tuned to the specific application named Quake3.exe.

          What they never identified was an answer to this question: "Was this an optimization for Quake, or was this a deliberate attempt to improve ATI's standings in the benchmark wars?" If you download the ATI drivers, you'll find that every new driver patc

          • What they never identified was an answer to this question: "Was this an optimization for Quake, or was this a deliberate attempt to improve ATI's standings in the benchmark wars?"

            It lowered the quality settings a bit only when the executable was named specifically "quake3.exe". So, whatever the "real" intent was, doing so improved ATI's standing for Quake3. (A rough sketch of that kind of executable-name check follows the comments.)

            One of my favorite not-so-malicious tactics was Nvidia holding back greatly improved drivers until ATI launched a new line.
          • I'd imagine that the answer to the question is "both." Personally, I don't have a problem with this sort of optimization, since the user will probably get the same benefit when actually playing the game, and that's what really matters.

            I'd be unhappy if there were optimizations that only turned on when running a game in demo mode or when FRAPS was loaded (or whatever is being used for FPS measurements nowadays). I haven't heard about anything like that yet though.
      • It seems that any time ATI or nVidia releases a new card, they've also got some drivers that "optimize" for the 3DMark benchmarking software. So I figure it must just mean that nobody's found out how they're doing it yet.

        Actually, it's not every time; it's much rarer now. They've learned that not all publicity is good publicity. On top of that, they will always get caught, because there are masses of fanboys of the opposing line of cards watching for that type of mucking with the drivers.
  • I only partially RTFA, but I can't see any mention of SLI; 1 GB of video memory is a dream.
  • And now for the slew of, "WTF? Who pays that much for a video card!" "If game developers knew what they were doing, we could all play Half Life 2 on my Geforce 2." "This is why consoles are cheaper/more appealing."
    • How about "WTF? Who wants to look at crappy graphics on a console when you can have creamy goodness from high-end cards like these?" Once you get used to nice graphics, consoles just look like Fisher-Price toys - big, safe, chunky, blocky, slow, and mostly made of primary colors. Put a real rendering engine to work for you some time and there is just no way you want to play on crap like an XBox or PS2 ever again. And unless you've got a spectacular television, there's simply no way an NTSC-based displa
    • Sure, why not. Someone has to keep these mindless idiots in check.

      The problem is, thinking like yours means you'll never have "the right card". Once you save up your salary from Walmart to buy this 7800, the "7800 X-treme e-gamer E-edition" will be out and you'll be like "shit, damn, and I just bought this 350W graphics card for nothing!"

      And frankly I don't know why you guys are impressed. My graphics card spends 99.9999% of the time in 2D mode [rendering my glorious X11/Gnome desktop]. I do play games but
      • And frankly I don't know why you guys are impressed ... I do play games but really not a big part of my daily life.

        Because games ARE a big part of the daily life of many people. (Not ALL people, but many.) Leaving computer graphics aside for a moment, how many sports fans do you know? Football, baseball, hockey, basketball? How many of them covet HDTV sets so they can watch the big game in more detail, or see larger-than-life closeups? And when one of these guys gets a plasma TV, is it coincidence th

        • Ah, don't worry about him. Sounds more like he is the jealous one, working at Walmart on his bargain-bin Lindows PC. Why else would he cry that he doesn't need that kind of expensive video card? It would be like posting on a car website and ranting about why I would spend $250,000 on a Ferrari when all I need is a Honda. Morons. If you have nothing to contribute then shut the hell up.
          • Actually I have my own "nifty" computing devices [among other things AMDx2 and Pentium D desktops]. But I buy them because I actually do productive work on them [ok the pentium was solely for benchmarking ...]. When building a project that has 100K lines in it [from scratch when doing customer drop tests] it's nice to know I can do 30 builds an hour instead of 5. [to fine tune scripts and such].

            Things like these cards make no sense. My FX5200 was capable of playing games like UT2k4 just fine. My 6600 i
            • You're still not getting the point. It might not make sense to you, but to others it does. These over-the-top cards are for people who want bragging rights as well as top performance.
              And did I say I was impressed? I have a 6800GT and it runs most of today's cutting-edge games just fine.

              I don't mean to be an asshole, but who cares what you do with your computer? If you don't need that card, then don't buy it. End of story. Regardless of what you say, people will buy it. Just like the people who buy Ferraris to dri
              • It's called a discussion. I'm raising the question of whether we should encourage this trend. Suggesting perhaps that a better course of action would be looking at new sources of performance.

                I'm not saying I'm right or everyone should bow to my wisdom. I'm just trying to act anti-sheep and suggest we as customers ought to demand more than power-hungry graphics cards [or processors for that matter].

                Tom
            • I'm asking why are you impressed? Engineering is about making the most from the least.

              And this card is about making more from less. They're clocking their stock chipset far above their previous 7800 card and massively piling on heat-reduction gear like a rabid overclocker in a copper tube factory. And they're making more money :-)

              Keep in mind that it's not about the frame rate. It's about keeping the frame rate high in increasingly complex situations. Obviously, a frame rate higher than your monitor

              • I upgraded to the 6600 because my AMDx2 motherboard is PCI-E. So is my 915G [and 945] I use for my Intel gear.

                Trust me, I wasn't happy to have to buy the new card. My choice was a 6200, a 6600, or not buying the dual-cores. The 6200 is just a piece of shit. I mean, I'm not into excess, but I DO like some ability to render a 3D scene ;-)

                Tom
            • "Things like these cards make no sense. My FX5200 was capable of playing games like UT2k4 just fine. My 6600 is more than enough for things like Far Cry. I'm not saying people shouldn't be allowed to get faster cards. I'm asking why are you impressed? Engineering is about making the most from the least."

              Yes, making the most from the least is one thing engineering is about but it isn't the only thing. I know engineering types often like to focus on that aspect of engineering. I tend to find it impressive as
              • Pumping a lot of electricity into a circuit is ***NOT*** "cutting edge".

                I'm sure my 3.2GHz P4 could run at 4.2GHz if I nitrogen-cooled it and pumped 16 times the current into it.

                Big deal. Show me where they get this performance at an EQUAL or LOWER amount of waste?

                The SR-71 is not exactly a good example. They sacrifice payload and a lot of fuel to fly that high and fast. Compare that to a 757. Last time I did the rough calculation per person transported a 757 wasn't that much less efficient than your av
                • "Pumping a lot of electricity into a circuit is ***NOT*** "cutting edge".

                  I'm sure my 3.2GHz P4 could run at 4.2GHz if I nitrogen-cooled it and pumped 16 times the current into it.

                  Big deal. Show me where they get this performance at an EQUAL or LOWER amount of waste?

                  The SR-71 is not exactly a good example. They sacrifice payload and a lot of fuel to fly that high and fast. Compare that to a 757. Last time I did the rough calculation per person transported a 757 wasn't that much less efficient than your avera
                  • Improvements are not always obvious.

                    Are the new GPUs faster? Yes.

                    Do they get more detail at same/higher FPS? Probably yes.

                    Is this the result of some new groundbreaking design? No.

                    Is this EVEN LESS power efficient than the last series of non-power-efficient cores? Yes. (A back-of-envelope on clock-versus-power scaling follows the comments.)

                    You're trying to go along the lines of "this is the way it is and that's all there is to it," and I'm trying to say it isn't impressive or worth spending money on. What is their incentive to come up with a new GPU design if people ke
                    • The incentive is that at some point silicon turns into a pile of goop when you pump too much power through it. If, and when, we are able to move onto something stronger, like diamond semiconductors, then we will. But, even diamond breaks down at some point. We are hitting the practical limits of how much we can shrink the die size. This leaves us with only three real options for improvement. Those are to change semiconductor material (allowing us to pump more power in), to improve the chip designs, and to m
          • It would be like posting on a car website and ranting about why I would spend $250,000 on a Ferrari when all I need is a Honda. Morons. If you have nothing to contribute then shut the hell up.

            It's 99% sour grapes that creates those kinds of comments about Ferraris and such.

            Those who have thousands of dollars of extra money each month for entertainment aren't the ones complaining. It's the ones up to their ears in debt, or barely living from paycheck to paycheck, who whinge about people (who they
        • Silly argument, IMO. There are $500 HDTVs now. I think spending $650 on a video card is worse because the retail value often drops by half in six months, and the card would likely need to be replaced every year, while current HDTVs could be in service for ten years.
          • Silly argument, IMO. There are $500 HDTVs now.

            Sure, tiny ones. 60-inch HDTVs are a little more than that. A quick look at Best Buy's site reveals a price tag of $3,500 for a 50" plasma or a 62" rear-projection TV.
    • And now for the slew of, "WTF? Who pays that much for a video card!"

      Yes, they often miss the fact that if it weren't for people with enough spare cash to pay for those high-priced cards, development would stagnate and the masses wouldn't reap the benefits of buying those same cards a year later at a much lower price. Those who pay high prices subsidise the rapid development and the resulting lower prices for the previous generation of video cards.
  • Comment removed based on user account deletion
  • GeForce 6600 DDR2 (Score:5, Interesting)

    by RzUpAnmsCwrds ( 262647 ) on Monday November 14, 2005 @11:15AM (#14026683)
    Slashdot users may be far more interested in the GeForce 6600 DDR2:

    http://www.neoseeker.com/Articles/Hardware/Reviews/geforce6600ddr2/ [neoseeker.com]

    At $99, it's a lot easier to swallow than the $600 GPUs we're now seeing, and it still offers excellent performance and decent Linux support.
  • Sorry for the fanboi rant, but ATI is going to have to do some SERIOUS work to catch up to nVidia at this point. With Crossfire being pretty much a joke and their top-of-the-line card still not besting nVidia's now-second-string card, I see a good deal of the gaming community losing faith in them quickly. Even being a huge nVidia fan, I see this going one of two ways: either nVidia will crush ATI and be the lone wolf in the graphics industry, thus bringing innovation to a halt and bringing prices even
    • My, that is a fanboi rant. The ATi card that this 7800 is being compared with is priced at around $400 vs. nVidia's $650 for a card that marginally beats it in performance. Your words sound a lot like those of the folk who used to proclaim that one day 3dfx would be a "lone wolf in the graphics industry", and we all know how that turned out. Just like with AMD and Intel or any other big component manufacturer, you will always see a performance see-saw between iterations of their products. But when one of them ha
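
A note on the Quack vs. Quake exchange above: the sketch below is a minimal, hypothetical illustration of how a user-mode component could key its behavior off the name of the running executable, which is the mechanism the Quack3 renaming test exposed. It is not taken from ATI's or nVidia's actual drivers; the pick_profile helper and the profile names are invented for illustration, and a Windows build (Win32 API) is assumed.

    #include <windows.h>
    #include <stdio.h>
    #include <string.h>

    /* Illustrative only: choose a "quality profile" based on the name of the
       executable that loaded this code, e.g. the game's .exe. */
    static const char *pick_profile(void)
    {
        char path[MAX_PATH] = {0};
        const char *name;

        /* GetModuleFileNameA(NULL, ...) returns the full path of the current
           process's executable. */
        if (GetModuleFileNameA(NULL, path, sizeof(path)) == 0)
            return "default";

        /* Keep only the file name, then compare case-insensitively. */
        name = strrchr(path, '\\');
        name = name ? name + 1 : path;

        if (_stricmp(name, "quake3.exe") == 0)
            return "reduced-texture-quality";   /* hypothetical profile name */

        return "default";
    }

    int main(void)
    {
        printf("active profile: %s\n", pick_profile());
        return 0;
    }

Renaming the binary, as the Quack3 test did, changes what GetModuleFileNameA reports and therefore which branch runs, which is why the image quality and frame rate shifted with nothing but a file name change.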
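
On the clock-versus-power sub-thread ("16 times the current", "EVEN LESS power efficient"), here is a back-of-envelope using the textbook CMOS dynamic-power relation, with the simplifying assumption that supply voltage has to rise roughly in proportion to clock speed; the numbers are illustrative only (alpha is the switching activity, C the switched capacitance, V the supply voltage, f the clock frequency):

    P_{\mathrm{dyn}} \approx \alpha\, C\, V^2 f
    V \propto f \;\Rightarrow\; P_{\mathrm{dyn}} \propto f^3
    \left(\frac{4.2}{3.2}\right)^3 \approx 2.3

Under that crude model, a 3.2 GHz to 4.2 GHz jump costs roughly 2.3 times the dynamic power rather than anything like sixteenfold, but the direction of the argument holds: clock increases are paid for disproportionately in power and cooling, which is consistent with the "truly massive cooling solution" mentioned in the summary.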
