Dual GeForce 7800 GT SLI Single Card Performance

Maximus writes "Asus is the first board partner out of the blocks with a single-board, dual-GPU design based on NVIDIA's GeForce 7800 GT graphics chip. The Asus Extreme N7800 GT DUAL essentially takes a dual-board SLI setup and packs it all onto a single PCI Express card. HotHardware has a performance preview posted showing that this card can even compete in some cases with a GeForce 7800 GTX SLI setup, thanks to the improved inter-GPU latency characteristics inherent to a single-board design. This board is decidedly pricey though, so only gaming speed freaks need apply."
This discussion has been archived. No new comments can be posted.

  • Does it have twice the fans and heatsinks? How could you get it to stay cool with twice the card in one spot?
    • Re:Cooling? (Score:5, Funny)

      by FidelCatsro ( 861135 ) <fidelcatsro&gmail,com> on Saturday October 15, 2005 @12:51PM (#13797765) Journal
      Looking at the price, I believe a small cottage in Siberia is included; early orders also receive a free flight.
      Which should amply handle the cooling.
    • Most of you people already know this, but... I'll explain a little further. Air can only hold so much heat easily, so if the card produces too much heat per unit time, you'll need a fancy system to carry away all the excess. You can't just speed the fan up, or the air won't absorb enough heat in the time allowed before it gets jetted out the back. You might just need something with a better heat capacity, that's compressible, to create a normal compression cooling cycle. Or you could get "water cooling"…
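
      A back-of-the-envelope sketch of that airflow arithmetic, in Python. The 300 W combined card power and the 15 K allowable air-temperature rise are illustrative assumptions, not figures from the article:

      ```python
      # How much airflow does it take to carry away a graphics card's heat?
      # Q = m_dot * c_p * dT  =>  m_dot = Q / (c_p * dT)

      CARD_POWER_W = 300.0  # assumed combined draw of two 7800 GT GPUs (illustrative)
      AIR_CP = 1005.0       # specific heat of air, J/(kg*K)
      AIR_DENSITY = 1.2     # kg/m^3 at room temperature
      DELTA_T = 15.0        # allowable air temperature rise through the cooler, K (assumed)

      mass_flow = CARD_POWER_W / (AIR_CP * DELTA_T)  # ~0.02 kg/s
      volume_flow = mass_flow / AIR_DENSITY          # m^3/s
      cfm = volume_flow * 2118.88                    # 1 m^3/s = 2118.88 CFM

      print(f"Required airflow: {volume_flow:.4f} m^3/s (~{cfm:.0f} CFM)")
      # ~35 CFM is feasible for a dual-slot cooler, but only if the heatsink actually
      # transfers the heat into that air; that's why fin area and fan pressure matter.
      ```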
  • by Inoshiro ( 71693 ) on Saturday October 15, 2005 @12:50PM (#13797756) Homepage
    "This board is a bit pricey though for sure so only gaming speed freaks need apply."

    I'm really, really curious about the high-end sales for ATI and nVidia. What kind of people honestly go out and spend almost $1,000 USD on a card every year? What benefits are there? Despite the fact that these hot, sexy cards come out, I don't see any real push to get software out that uses them. Windows Vista isn't out. Linux still doesn't have X rendering done via OpenGL. Mac OS X is the only OS that uses 3D everywhere.

    Beyond that, what games push the card? WoW? Doom 3? Half-Life 2? Add in Far Cry and UT, and that's pretty much it for 3D games. If you spend that same amount of money on any console, you can buy more than double that number of games.

    What niche does this represent? I'm really curious as to the people that buy this kind of stuff.
    • If you spend that same amount of money on any console, you can buy more than double that number of games.


      Unless you (as I) get into just one game. (UT for me.)

      But point taken. Obscene amount to pay. Anyone spending that much on video cards for home use literally has far more money than sense, IMHO.

      (Must be getting old when I start using phrases my Dad used against me...)
    • by utuk99 ( 656026 ) on Saturday October 15, 2005 @01:01PM (#13797808)
      Speed freaks. The computer equivalent of the people who buy sports cars to go 20 miles an hour on the freeway. It doesn't matter that you can't use it for anything, just that you have it. Anyway, these function as previews of what normal gamers will be able to buy for $100-$200 in a year or so, after the next couple of uber video cards come out. The company gets a few sales to the freaks. We get to see what's next. Everybody wins.
      • Agreed! Look at the ridiculous sculptures on fashion catwalks [I refuse to call them clothes] that sell for hundreds of thousands as collectible art. They function as composite caricatures of the coming trends the design elite have deemed suitable for us to wear, tipping off the lower-down-the-chain manufacturers, serving as test balloons to gauge reactions within the industry and the market, and stimulating interest and generating massive publicity.
      • In fairness to car nuts, most people who buy a sports car speed on a regular basis. They may not get much over 120, but it's also not like they're sitting at the speed limit either.
      • The computer equivalent of the people who buy sports cars to go 20 miles an hour on the freeway. It doesn't matter that you can't use it for anything, just that you have it.

        Sounds like class envy to me.
      • "Speed freaks. The computer equivilent of the people who buy sports cars to go 20 miles an hour on the freeway. It doesn't matter that you can't use it for anything, just that you have it."

        Good analogy. A friend of mine bought a de-restricted, Japanese import Mitsubishi GTO that was chipped for 200+ mph [clocked at 207]. He bought it to do high speeds on the motorway - not that he ever did - the roads in the UK just aren't built for those sorts of speeds.

        Before he sold it he fitted a flame kit to the exhaust…
    • The people that buy ATI/Nvidia/Matrox's high-end cards (the workstation ones at least; I don't know of any consumer-grade cards that are $1000+) are usually CAD and DCC people (Digital Content Creation, i.e., CGI video, rendering images, etc.). It's usually BUSINESSES that buy these cards; that's why they can afford them at such a cost.
      • The people that buy ATI/Nvidia/Matrox's high-end cards (the workstation ones at least; I don't know of any consumer-grade cards that are $1000+) are usually CAD and DCC people (Digital Content Creation, i.e., CGI video, rendering images, etc.). It's usually BUSINESSES that buy these cards; that's why they can afford them at such a cost.

        I'm sure the gaming companies buy lots of the high-end gaming cards, so their developers, designers, and testers can play around with tomorrow's mainstream-level cards.
    • I don't think anyone actually buys the top card every year. Everyone's upgrade cycle isn't at the same time, and someone's always upgrading. But to answer your main question: the only piece of software that can use all this power that I know of right now is Battlefield 2 [battlefield2.com]. And for me, that's enough to make me consider it (read: want it) but I'll wait until it drops in price a bit.
      • Word. My dual 6600 does a fine job with Battlefield 2, but I've noticed RAM goes a long way towards helping that game. When I had 1 gig it was chunky, but after that second gig went in, it became buttery smooth.

          Just wait for Quake 4 next week; then we'll see yet another graphics-card killer from your pals at id.
    • Same people that drop 50 grand into speaker wire because it sounds better. Or 10k for gold leads, because they sound "warmer".

      In other words, pretentious assholes. Correction, pretentious RICH assholes.
    • by freidog ( 706941 ) on Saturday October 15, 2005 @01:43PM (#13797986)
      The same market that won't buy a console because they think the graphics suck right now. Or that the ~500 lines of resolution on a TV are woefully insufficient to render the 'proper' graphical detail they desire in their games.
      Or, of course, the rich yet clueless. (Note: those two are not mutually exclusive...)

      Personally, my 9600XT is plenty good for my gaming needs. I'd like to be able to run everything at 1280x1024 (the native res for my LCD), but I'm not complaining about 1024x768 or even lower; they look just fine to me. Which is why I'm definitely not the target audience for SLI.

      Right now it's definitely unneeded, but a year from now we may see games where 1280x1024 or 1600x1200 bring a 7800 GT / GTX to 'marginally' playable frame rates (say, about 30 FPS); Splinter Cell: Chaos Theory was already brought under 40 FPS at 1600x1200 with AA and AF enabled. It's certainly not unreasonable to expect far more graphically demanding games over the next few years.
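
      As a rough sanity check on numbers like those, a sketch of the per-pixel shading budget in Python. The ~10.3 Gpixel/s theoretical rate (24 pipelines at 430 MHz for a 7800 GTX) is quoted from memory, and treating AA samples as consuming pixel throughput one-for-one is a deliberate simplification:

      ```python
      # Rough budget: how many per-pixel operations can the card afford at a given
      # resolution and frame rate before it falls below the target fps?

      GTX_PIXEL_OPS = 24 * 430e6  # ~10.3e9 pixel-pipeline ops/s (24 pipes x 430 MHz)

      def ops_budget_per_pixel(width, height, fps, aa_samples=1):
          pixels_per_sec = width * height * fps * aa_samples
          return GTX_PIXEL_OPS / pixels_per_sec

      for w, h in [(1024, 768), (1280, 1024), (1600, 1200)]:
          plain = ops_budget_per_pixel(w, h, fps=40)
          aa4 = ops_budget_per_pixel(w, h, fps=40, aa_samples=4)
          print(f"{w}x{h} @ 40 fps: ~{plain:.0f} ops/pixel, ~{aa4:.0f} with 4X AA")

      # The budget shrinks linearly with resolution and AA samples, which is why a
      # shader-heavy title that is comfortable at 1024x768 can drop under 40 FPS
      # at 1600x1200 with AA and AF enabled.
      ```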
      • by Fweeky ( 41046 ) on Saturday October 15, 2005 @02:53PM (#13798298) Homepage
        Soft shadows in games like Chronicles of Riddick and FEAR really take it out of my 7800GTX, especially at my TFT's native resolution (1600x1200). SLI is probably pretty much the only way to play at such high settings reasonably. Complex maps/situations in other games can also make it chug, and I'm sure it only gets worse at higher resolutions and AA levels (I normally play with 4X AA; an SLI user will probably be breezing along at 16X).

        Then of course there are the people who use 3D hardware as part of their job: CAD, 3D artists, level designers, game engine developers; one of the first SLI forum threads I read was by a guy involved in medical imaging. SLI is also laying the groundwork for future multicore cards; in much the same way that SMP was the realm of rich bastards and high-end professional users until multicore consumer-level CPUs arrived, SLI will probably remain the realm of the same sort of people for a year or two until we start seeing multicore NV chips.
        • Does NV need to go multicore? Just multiply the pipelines like they already do and they're set.

          Did you know I'm still using a GeForce 2, and I'm okay with that!
          • I have no idea, really, I'm just speculating; I suspect it's not as simple as that. They're already hideously complicated and expensive to design; putting two on a chip, or even just two on a card, may well prove to be easier than working out how to integrate 700 million transistors into a single core. *shrug*

            I started with the GeForce 256DDR (T&L, woo!), Ti4200, 5900XT, 6800GT and now a 7800GTX. I'm also okay with this :)
    • by lewp ( 95638 ) *
      What kind of people honestly go out and spend almost $1,000 USD on a card every year?

      Me. But it's more like $500+ twice a year.

      New cards make the games I play (basically just WoW and occasionally CS:Source now) run more smoothly and let me crank up the resolution to my LCD's native res (1920x1200) with all the eye candy on without turning into a slideshow. Other than that, I don't really think about it that much.

      My gaming PC is on about a (unintentional, I just get the itch about the same time) 6-month upgrade…

      • Once a machine falls off the end of the ol' upgrade queue (I have 4 right now including my Powerbook, that's enough...) I usually end up packing it up and shipping it off to someone I know who needs a computer.

        Hello...
    • Your priorities are different than mine. Simple as that. NVidia knows there is a market that will find value in these cards despite the high price.

      As another poster said, the newer cards allow the older games to run more smoothly. A definite plus, since I still manage to game quite a bit. I am also an amateur 3D visualization artist. The OpenGL aspect of cards like these agrees with my 3D software of choice. Most 3D animation and modeling software on the market has some kind of hardware GPU-accelerated…
    • F.E.A.R. (Score:1, Interesting)

      by Anonymous Coward
      The GeForce 7800 GTX (Nvidia's top-end card) gets only 30-40 FPS with everything turned on high. That's pretty low for something that's supposed to be Nvidia's flagship card. That's where SLI comes in.

      Of course, you could turn the visuals down a notch. But I tried playing on my friend's computer (he happens to be one of those hardcore gamers) and the experience just isn't the same compared to my Radeon 9800 Pro. The graphics are on a whole other level.

    • I tend to only build a new box every 2-3 years. I go bleeding edge on most every component, and my system can run every new title at high settings damn near until the time I usually end up building a new one 2-3 years later. The bonus is that I don't usually have to open my system up for anything but cleaning the whole time.
    • You're right; there are no games out that would really push this graphics card. I haven't found a game yet that will yield unplayable framerates at maximum detail and resolution on my GeForce 6800 Ultra.

      But when you're creating 3D graphics (such as a video game, or in my case, virtual reality/simulations), the data that you deal with is uncompressed and uses much more horsepower. Running multiple instances of an OpenGL application can get pretty intensive. I am usually working in 3dsMax, and I also have a…
      • Not always. I knew a guy at my school who would go out and buy the top-of-the-line anything for a computer whenever it hit the market. When I say everything, I mean he had two top-end PC systems and a Mac system, simply because he wanted the top of Mac, Intel, AMD, Nvidia, and ATI. He also kept a Matrox card.

        There wasn't any real reason that he did it. He wasn't getting any noticeably different performance (once the res is all the way up and the eye candy is turned on, the extra 2 frames a second don't…
    • "What kind of people honestly go out and spend almost 1,000$ USD on a card every year?"

      Do they really need people spending that much money every year? There are enough computers out there of various ages and specs; couldn't it be more of a staggered approach to sales? Lots of people have a two-year life cycle on their computers: every couple of years or so they dump a ton of money into a big whiz-bang new system intended to 'last' a long time. Get enough people with that sort of mentality buying…
      • Don't forget the forced upgrades that Microsoft hands everyone every 3 to 5 years. Try running XP on a machine 'designed for Win98' or even a Millennium-spec machine. Every software developer expects people to be on the upgrade treadmill, and even Linux is getting to be like this (if you're running the latest and greatest kernel, X.org, multimedia apps, desktop, etc.).

        Let's just face it: developers are no longer limited to a low-spec platform, so code isn't tight and 'on the metal' like it used to be…
    • Beyond that, what games push the card? WoW? Doom 3? Half-Life 2? Add in Far Cry and UT, and that's pretty much it for 3D games.

      It's absolutely all about games, sure. Doom 3, Half-Life 2, Far Cry, and even the modest graphics of WoW will push any single card currently on the market at moderately high resolutions if (and that's the kicker, if) you have the quality turned up.

      It's fair to say people don't actually set the high-detail options, though; they just set the in-game quality to 'High' and leave it at what…
    • Mostly for NEWER/UPCOMING games, not the current ones. WoW is fine on my old ATI Radeon 9800 Pro AIW card at 1152x864 resolution with everything cranked up in the game's video options. I don't use FSAA.

      Although newer/upcoming games like Battlefield 2, Call of Duty 2, etc. are choppy, and I had to lower the video options. I still need to find something to replace my ATI's TV tuner (AIW) and video card. I will be going to NVIDIA.
    • Please don't be quick to assume that all users of high-end graphics cards are gaming freaks.

      I do CAD/CAE/CAM with ProE/Wildfire/Mechanica on a system with twin Xeons, 4 gigs of SDRAM, an Nvidia Quadro FX 4000 video card (bought for $1,661), triple 60-gig high-speed drives, etc. In my case a high-end video card isn't needed to boost gaming frame rates but to create and edit large models consisting of lots of assemblies and lots & lots of parts.

      BTW, sure am glad the next version of Windows (Vista) will handle…
    • Beyond that, what games push the card? WoW? Doom 3? Half-Life 2? Add in Far Cry and UT, and that's pretty much it for 3D games. If you spend that same amount of money on any console, you can buy more than double that number of games.

      Not with that much detail. The next generation might make it, but the current consoles aren't quite there. And I'm quite capable of spending all my free time playing UT. If you want to be good at a game it needs to be your primary game or at least your primary genre.

    • What kind of people honestly go out and spend almost $1,000 USD on a card every year? What benefits are there? Despite the fact that these hot, sexy cards come out, I don't see any real push to get software out that uses them.

      You need a card like this, or two 7800 GTs in SLI, to run games on this bad boy [westinghousedigital.com].

      4X AA at 1920x1080 can bring your comp to its knees.
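
      For a sense of scale, a quick sketch in Python of the framebuffer memory that setting implies. The buffer layout (4 samples of 32-bit color plus 32-bit depth, plus resolved front and back buffers) is a simplifying assumption about how the hardware stores multisampled surfaces:

      ```python
      # Approximate GPU memory consumed by a 4X-multisampled framebuffer at 1920x1080.

      WIDTH, HEIGHT = 1920, 1080
      AA_SAMPLES = 4
      BYTES_COLOR = 4  # 32-bit RGBA
      BYTES_DEPTH = 4  # 32-bit depth/stencil

      samples = WIDTH * HEIGHT * AA_SAMPLES
      msaa_buffer = samples * (BYTES_COLOR + BYTES_DEPTH)  # multisampled color + depth
      front_back = 2 * WIDTH * HEIGHT * BYTES_COLOR        # resolved front + back buffers

      total_mb = (msaa_buffer + front_back) / 2**20
      print(f"~{total_mb:.0f} MB gone before a single texture loads")
      # Roughly 80 MB of a typical 256 MB card of the era, just for the framebuffer.
      ```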
    • Black & White 2 can also push the cards pretty hard.
    • The high-end cards drive the low-end market. ATI and Nvidia duke it out for ratings at the top end to create buzz, which drives people to (perhaps incorrectly) project the performance of these top-of-the-line parts onto the more reasonable budget offerings from the same companies.

      Also the folks who shell out for these kinds of parts will be sure to tell all their friends how great their respective graphics chip company's kit is. You don't usually drop a G on hardware and then tell everyone how bad it is.

      It…
    • Linux still doesn't have X rendering done via OpenGL.

      Java2D is now OpenGL accelerated under Mustang. [java.net]

      Glitz [freedesktop.org] provides support for hardware acceleration too.

      So, usage of OpenGL is increasing...

    • "What kind of people honestly go out and spend almost 1,000$ USD on a card every year?"

      The kind of people you're not in the target market of.

      "What benefits are there?"

      At the risk of sounding like a marketing-bot (which I am *not*): the greater utility of having tomorrow's performance today. What kind of techno-geek are you, trashing the early adopters? They pay the premium that gives you cheaper performance later!

      Without them initially wasting their money, you'd be spending a lot more in the long run.
  • What about a Dual GeForce 7800 GT SLI Single Card SLI configuration? 4 GPUs for the price of two! [goes into blissful fantasy]
    • Wake up! (Score:2, Insightful)

      by voxel ( 70407 )
      A single Dual 7800 GT card costs MORE than TWICE what a true two-card 7800 GT SLI setup does.

      So your 4-GPU setup would end up costing a lot more than "the price of two!"

      Besides, you can't run these cards in "SLI" mode again. This card is it; you can't add another.

      Wake up from your fantasy!
  • Multi-SLI?? (Score:2, Interesting)

    by Coleco ( 41062 )
    Now they need to figure out how to get two of these things working together in SLI... or what about *four* of them in Gigabyte's crazy quad motherboard:

    http://www.tomshardware.com/motherboard/20051004/index.html [tomshardware.com]

    sweeeeeeet.
  • by Anonymous Coward
    This card doesn't need external power! It runs a fusion reactor off its own heat!
    • Actually, you might be able to recover 1/8 of the ~300W two of these cards would be using by running a small steam turbine off them. That would be an awesome mod: outfitting an OCed Pentium D and SLI rig with water cooling, a heat exchanger, and a steam generator.
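
      The parent's 1/8 figure can at least be checked against the thermodynamic ceiling. A quick Python sketch; the 90 °C source and 25 °C sink temperatures are assumptions:

      ```python
      # Sanity check on "1/8 of ~300 W" using the Carnot limit.

      T_HOT = 273.15 + 90   # assumed GPU heatsink temperature, K
      T_COLD = 273.15 + 25  # assumed room temperature, K

      carnot = 1 - T_COLD / T_HOT  # ~18%: the ceiling for any heat engine
      claimed = 1 / 8              # the comment's 12.5% recovery figure

      print(f"Carnot limit: {carnot:.0%}; claimed {claimed:.1%} is below it,")
      print(f"so ~{300 * claimed:.0f} W is at least thermodynamically permitted.")
      ```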
  • "HotHardware has a performance preview posted..." It's going to be a hot piece of hardware indeed... Have we come to the point yet where a graphics card draw more power than the actual computer?
  • Naa... (Score:3, Funny)

    by Anonymous Coward on Saturday October 15, 2005 @12:59PM (#13797794)
    The Radeon 9200 in the Mac Mini should be enough for anybody.
  • by blankoboy ( 719577 ) on Saturday October 15, 2005 @01:00PM (#13797799)
    Not looking to create a flamewar between ATI and Nvidia folks here. I am currently putting together my parts list for a new PC and am down to deciding what to do about a VGA card. The two options on the table at the moment are:

    - Nvidia 7800 GT (probably going with Albatron, as it has the best price)
    - ATI X1800XL (would most likely be Sapphire)

    My question is with regard to... 2D quality... SURPRISE! Around three years ago, when I was upgrading my PC, it was a toss-up between a GeForce4 Ti 4400 and a Radeon 9700 Pro. I initially bought the GeForce and was horrified by the 2D quality. The store was kind enough to allow me to switch over to the Radeon 9700 Pro, which has been serving me well ever since. I know ATI has always had superior 2D quality in the past, but is this still the case? Has Nvidia improved in this area? Thanks, and I look forward to your objective and knowledgeable opinions!
    • I've been using a GeForce 6600 along with Radeon 9600/9500 and 9800s, and I have to say, Nvidia still needs to improve on 2D quality. Text has been blurred in games (even the HUD text was more blurred), and squinting at text at 1280x1024 in IRC on a 19" monitor is painful; when going back to a Radeon, I never needed to worry, everything was nice and sharp. I used to like GeForces, since they were smaller and faster than Voodoos. When I went from a GeForce 3 to a 9800, I couldn't believe how sharp my monitor picture looked…
    • by FidelCatsro ( 861135 ) <fidelcatsro&gmail,com> on Saturday October 15, 2005 @01:10PM (#13797849) Journal
      If you want to use Linux, don't buy ATI.
      Nvidia's 2D quality is excellent these days (under Linux and OS X at least; not sure about Windows) and certainly on par with ATI's, though it does not really compare to Matrox cards (in my opinion).
      Looking at recent benchmarks: if you want the best performance, go with Nvidia; if you want good cross-platform support, again Nvidia; if you want a fancy 3DMark score, ATI is a good option.

      • I build quiet PCs with fanless video cards. One of mine has a Matrox G550, another has an nVidia Quadro4 550XGL. I run these at 1600x1200 analog to a Samsung 213T LCD display, and Samsung includes an "auto pattern" program that displays a black/white checkerboard pattern that is optimal for tuning the LCD's A/D clock to the card.

        The nVidia display for this is dead sharp and visually quiet, indistinguishable from DVI. The Matrox isn't generally bad, but this kind of display shows a lot of scanning flicker…
      • In my personal experience with NVIDIA cards from GeForce 2 Pro to GeForce 4 Ti4200 and ATI Radeon 9800 Pro AIW (128 MB), I would have to pick NVIDIA cards.

        1. Linux support. ATI's Linux driver is horrible and harder to set up than NVIDIA's.

        2. In Windows, NVIDIA's drivers and software seem to be less buggy than ATI's. I use the All-In-Wonder software (MMC), and it is VERY buggy. Sometimes the drivers don't work, like video out to my TV, and I have to reboot to make it work. I know NVIDIA doesn't make TV tuners…
    • Does 2D quality even matter anymore? Data is sent over DVI, meaning the RAMDAC specs we were used to seeing on analog signals don't matter anymore. As long as both companies follow the DVI specs and support color profiles correctly, the only thing that matters is the 2D quality of your LCD.

      And will people please start using the DVI connection on the projector? I'm sick and tired of seeing blurr-o-vision on LCD presentations caused by 50 ft of VGA cable.
    • Trident 9680, 1 meg with 1 meg possible expansion!

      Seriously. It's 2005. Who cares? I've never seen a problem with any GeForce or Radeon card I've owned in the past.
      • Yeah, who cares if there's ghosting on all of your text and everything looks like a blurry mess? Oh wait... I DO. I've always been a fan of nvidia because of their driver performance and stability, but their cards have always been known for their substandard 2D visual clarity. The fact that it's 2005 and 3D is king... that's the problem. It's allowed nvidia to be wildly successful while completely ignoring a very important aspect of the end-user experience. You've never had a problem? Well, you've…
    • by Scorchmon ( 305172 ) on Saturday October 15, 2005 @01:18PM (#13797880)
      The Geforce 4s were known for using a certain low quality 2D filter setup in their circuitry. There were user mods to improve the 2D quality with minimal effort. I don't know how their current tech is, but I just wanted to let you know that your previous experience was tarnished by lackluster design. Whether or not this practice continues with the current generation, I can't say.
      • The GeForce 4s are now three generations old, so I hardly think it's relevant.

        It's kind of like pointing out that the original Pentium had a flaw which reduced its accuracy in certain kinds of division operations: true, but irrelevant to the current generation of Pentium 4s.
    • My experience is that the ATI cards are better, much better. But remember, nVidia cards can come from many different manufacturers, and some may use substandard parts.

      The drawback is that I still get driven crazy by ATI and their stupid drivers. I just installed the latest last night and it completely messed up the TV function of the card; I had to remove and reinstall all the software to get it to work. That is why I hadn't upgraded the video drivers for so long. Every time I do, it is a big mess. You…
    • My conclusion is that ATI still can't figure drivers out. I have a 9700 Pro and the drivers have been a constant pain in the ass for me. My system is rock-solid stable and passes any test I can find to throw at it, yet Catalyst Control Center bluescreened it. Without CCC the drivers work, mostly. My next card will be an nvidia, simply because ATI still can't get drivers right. (The Linux drivers worked better than the Windows drivers; how weird.)
    • I think the moral of the story is: use DVI and it is perfectly fine. I have been using DVI with my old GeForce4 Ti 4200 for ages now and have never noticed any 2D blurriness, ghosting, or anything. I just recently upgraded to a 7800 GT and likewise, no problems.
  • Vista (Score:2, Insightful)

    by CDPatten ( 907182 )
    As much as you guys hate Microsoft, they are going to be driving higher-performance graphics cards with the release of Vista.
    • So what? What does it mean that they are going to be driving higher-performance graphics cards with the release of Vista? It's still a year away, so the point is moot until something actually happens.

      I'm willing to bet that the release of OS X on Intel based Macs will push higher performance graphics cards too; but still, what is the point of stating something like that? Is there any point at all?
      • The radical increase in video card performance and the accompanying price drops are going to occur simply because of Vista. OSX is a tiny market and won't have much of an impact on sales of video cards; the idea of OSX driving down the costs of the video card market really shouldn't even be mentioned in the same breath as Vista. The volumes are so far from ever being comparable, it's just absurd. You can offer some wishful thinking on your part that MACTEL is going to take out Windows, but even the Mac world's Holy Grail…
        • You miss my point (and in doing so make some stupid statements)

          I said you didn't have a point; all you said was:

          as much as you guys hate microsoft, they are going to be driving higher performance graphic cards with the release of Vista.

          My point was not that your view was wrong, or that my view was right; my point was that your statement was incomplete.

          I never said anything about OS X having any impact on the sales of video cards; when I said, "I'm willing to bet that the release of OS X on Intel based Macs…

      • To be fair to the GP, most of the speculation I've seen about Vista has predicted that a mid-to-high-range card will be necessary for acceptable graphics performance, so there probably are people upgrading already in anticipation of higher system requirements when Vista comes out.

        Having said that, I'd imagine that there are far more people upgrading because of a combination of larger monitors and graphically demanding games like HL2, Battlefront 2, and Doom 3.
  • HotHardware reviewed it. I'll just buy a cooler video card that they don't review so that I don't need to worry about extra cooling fans.

    I'll wait for a review from these guys: http://www.coolhardware.co.uk/ [coolhardware.co.uk]
  • by rathehun ( 818491 ) on Saturday October 15, 2005 @01:44PM (#13797996) Homepage
    Has anyone else noticed that this ships with an external power supply? This might then be a decent card for systems with only a 350/400-watt SMPS.
  • You can still buy a perfectly good solution in the form of earlier-generation adapters, like the Radeon 9800 Pro; they are quite cheap now. What amazes me is that all the hardware sites foam over the latest and greatest nVidia and ATI have to offer, but what I would like to see is how much a new adapter would actually benefit me.
  • ...do we need all this power? :) I am wondering why someone should buy the latest ultra PC card when no game actually uses all its power...?
    • It does help that you won't have to upgrade it for a long time. Hell, I have a friend who's still running on a GeForce 2 MX and he can run most modern games, albeit on minimum settings. Until just recently we felt little need to upgrade it.
      • Yes indeed: I have a Radeon 9000, quite a new card, but looking at the tech specs, mine already looks old compared to this new one... :) But I won't upgrade so soon; I'll just reduce the game settings!
  • by MikShapi ( 681808 ) on Saturday October 15, 2005 @07:25PM (#13799520) Journal
    The card offers a humongous amount of horsepower, yet the vast majority of people have monitors that can do 1280x1024 (most mid-sized LCDs out there) or 1600x1200 (most CRTs). So most of the power this card can produce beyond what a mid-range last-generation card (or high-end two-generation-old card) delivers goes largely unused.

    All of these new cards will give more-than-playable rates at either of these resolutions in most modern games without breaking a sweat, with the heavier game engines requiring you to drop a notch or two on the FSAA or AF.

    In fact, even my trusty OEM Radeon 9700 Pro, bought in December 2002 for $270, does that just fine.

    But where is all that horsepower needed? The answer is obvious, and yet promptly ignored: all these cards have (at least) two outputs. Which can very well work simultaneously in a game, thank you very much. If one LCD can't go over 1280x1024, why not have two?

    I run a two-monitor setup on my Rad (dual Samsung 172Xs). Both nVidia and ATI drivers support spanning (turning all outputs into one very large virtual screen). Three problems arise that require attention for this to work in gaming:

    1. The game must support using SPAN. Many games (UT2k4, NWN, Fable, etc.) support this reasonably well.

    2. Unrelated to issue #1 above, the game must support *weird* aspect ratios. Contrary to popular belief, unlike 640x480, 800x600, and 1024x768, the 1280x1024 res that our modern LCDs do best is not 4:3. It is 5:4. Do the math. The next 4:3 notch is 1280x960. The 5:4 aspect ratio aside, dual monitors give some very new ARs altogether: 8:3 for two 4:3 monitors, or 10:4 for two 1280s side by side. Fable, for example, while neatly putting the rendered picture within my virtual 10:4 display area, promptly puts the (quite essential) dialog subs and game choices outside the viewable area, because it is unfamiliar with this aspect ratio. (The arithmetic is sketched in the code after this comment.)

    3. Not a showstopper, but very easy to work around if only the game devs would give it one ounce of thought:

    Most action in almost any type of game (bar, perhaps, RTSs) happens dead in the center of your display. Which is good if you're playing with three displays, all the important stuff happening flat in the center of your middle one. But with the simple solution 90% of people can afford and implement - purchase an additional monitor and hook it up to their existing dual-head-supporting graphics card - all the action happens right on top of the split between the two monitors. Everything from your character in NWN (who gets split by 2cm of space in the middle if you're lucky and chose your monitors wisely - 5cm if you're not - and ends up looking somewhat 'fat') to that little pixel marking the business end of my sniper rifle in UT. VERY annoying (though I got used to it, to an extent, and it's very much worth the wider viewport).

    GAME DEVELOPERS, PLEASE, PRETTY PRETTY PLEASE, PUT AN OPTION IN THE CONFIG TO OFF-CENTER THE GAME ACTION, SO THE CENTER OF THE GAME IS 40% FROM THE LEFT EDGE OF THE DISPLAY AND 60% FROM THE RIGHT (OR OTHERWISE ADJUSTABLE). IT'S AN OUTRIGHT NUISANCE! TIA.

    Those issues aside (and in some games, at least the former two definitely are), two monitors and a 2560x1024 resolution would give even the newest GPU (with FSAA, AF, and shadow rendering cranked up to max, of course) a very decent workout, and put all that unusable horsepower back on the fringes of the usable realm.

    My two cents.
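
    To make the aspect-ratio and off-centering arithmetic above concrete, a small Python sketch. The aspect() helper and the 40% example offset are illustrations of the comment's numbers, not anything from a real game engine:

    ```python
    # Aspect ratios of common modes, plus the off-center viewport the comment asks for.
    from math import gcd

    def aspect(width: int, height: int) -> str:
        d = gcd(width, height)
        return f"{width // d}:{height // d}"

    print(aspect(1024, 768))   # 4:3
    print(aspect(1280, 1024))  # 5:4 -- not 4:3, as noted above
    print(aspect(1280, 960))   # 4:3 -- the next true 4:3 notch
    print(aspect(2560, 1024))  # 5:2 -- i.e. the "10:4" of two 1280x1024 panels

    # Off-centering: put the action 40% from the left edge instead of dead center,
    # so it lands on one panel instead of on the bezel split.
    SPAN_WIDTH = 2560
    center_x = round(SPAN_WIDTH * 0.40)  # 1024, inside the left panel
    print(f"game center at x={center_x} instead of x={SPAN_WIDTH // 2}")
    ```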

  • I wonder if you can run two of these in SLI for quad GPUs; if not, I wonder if Asus will revise the card to do so.
