Dual GeForce 7800 GT SLI Single Card Performance
Maximus writes "Asus is the first board partner out of the blocks with a single-board, dual-GPU design based on NVIDIA's GeForce 7800 GT graphics chip. The Asus Extreme N7800 GT DUAL essentially takes a dual-board SLI setup and packs it all into a single PCI Express-based card. HotHardware has a performance preview posted that shows this card can even compete in some cases with a GeForce 7800 GTX SLI setup, thanks to the improved inter-GPU transaction latency inherent to a single-board design. This board is a bit pricey though, so only gaming speed freaks need apply."
Re:This means... (Score:2)
So, can you spell (Score:2)
I mean, can this card be linked to another identical one, or not?
Re:So, can you spell (Score:2)
Re:So, can you spell (Score:2)
The standard single PCI Express video slot uses an x16 link. However, when you throw in another card, that link gets cut in half, so each card gets an x8 link. This is due to lane limitations on the motherboard itself and is part of the PCI Express spec.
Having a single card which is an SLI card doesn't change the PCI Express spec. Rathe
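For a rough sense of what that halving costs, here's a sketch assuming first-generation PCI Express at 2.5 Gbit/s per lane, which works out to roughly 250 MB/s per direction after 8b/10b encoding (actual throughput is a bit lower once protocol overhead is counted):

    # Rough per-direction PCI Express bandwidth, assuming PCIe 1.x lanes
    # at ~250 MB/s each (post-8b/10b; protocol overhead shaves off more
    # in practice).
    MB_PER_LANE = 250

    def link_bandwidth_mb(lanes: int) -> int:
        """Peak one-way bandwidth in MB/s for a given lane count."""
        return lanes * MB_PER_LANE

    print(link_bandwidth_mb(16))  # x16, single card: 4000 MB/s
    print(link_bandwidth_mb(8))   # x8, each card when the link is split: 2000 MB/s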
Re:So, can you spell (Score:2)
The lanes-per-board limit isn't imposed by PCI-e, it's imposed by the chipset. There are chipsets [nvidia.com] that have enough lanes to provide two x16 slots (and motherboards that use them to do this), though they are still somewhat high-end.
Re:So, can you spell (Score:2)
Re:This means... (Score:2)
Cooling? (Score:1)
Re:Cooling? (Score:5, Funny)
Which should amply handle the cooling
Compression cooling cycle? (Score:1)
Re:err.. (Score:2)
Re:err.. (Score:2)
Indeed (Score:2)
But some were produced for the AAlchemy graphics station.
There are still some in circulation; from time to time they show up on eBay, and thanks to community efforts like the AmigaMerlin drivers, you even have a recent Windows XP-compatible driver.
AAlchemy even has 8-way [libero.it] multi-VSA-100 PCI boards.
I just find it ironic that everybody (especially nVidia fans) once made fun [3dchip.de] of the Voodoo 5 multichip-with-external-power-su
Gaming freaks indeed. (Score:5, Interesting)
I'm really, really curious about the high-end sales for ATI and nVidia. What kind of people honestly go out and spend almost $1,000 USD on a card every year? What benefits are there? Despite the fact that these hot, sexy cards come out, I don't see any real push to get software out that uses them. Windows Vista isn't out. Linux still doesn't have X rendering done via OpenGL. Mac OS X is the only OS that uses 3D everywhere.
Beyond that, what games push the card? WoW? Doom 3? Half-Life 2? Add in Far Cry and UT, and that's pretty much it for 3D games. If you spend that same amount of money on any console, you can buy more than double that number of games.
What niche does this represent? I'm really curious about the people who buy this kind of stuff.
Re:Gaming freaks indeed. (Score:1)
Unless you (as I do) get into just one game. (UT for me.)
But point taken. Obscene amount to pay. Anyone spending that much on video cards for home use literally has far more money than sense, IMHO.
(Must be getting old when I start using phrases my Dad used against me...)
Re:Gaming freaks indeed. (Score:4, Insightful)
Re:Gaming freaks indeed. (Score:1)
Re:Gaming freaks indeed. (Score:2)
Re:Gaming freaks indeed. (Score:1)
Sounds like class envy to me.
Re:Gaming freaks indeed. (Score:1)
Good analogy. A friend of mine bought a de-restricted, Japanese import Mitsubishi GTO that was chipped for 200+ mph [clocked at 207]. He bought it to do high speeds on the motorway - not that he ever did - the roads in the UK just aren't built for those sorts of speeds.
Before he sold it he fitted a flame kit to the exh
Re:Gaming freaks indeed. (Score:1)
So that's why the robots in Futurama belched flames!
Re:Gaming freaks indeed. (Score:2, Interesting)
Re:Gaming freaks indeed. (Score:2)
I'm sure the gaming companies buy lots of the high-end gaming cards, so their developers, designers, and testers can play around with tomorrow's mainstream-level cards.
Re:Gaming freaks indeed. (Score:3, Insightful)
Re:Gaming freaks indeed. (Score:2)
Just wait for Quake 4 next week, then we'll see yet another graphics card killer from your pals at iD.
Re:Gaming freaks indeed. (Score:2)
In other words, pretentious assholes. Correction, pretentious RICH assholes.
Re:Gaming freaks indeed. (Score:5, Interesting)
Or of course, the rich yet clueless. (note: those two are not mutually exclusive...)
Personally, my 9600XT is plenty good for my gaming needs. I'd like to be able to run everything at 1280x1024 (native res for my LCD), but I'm not complaining about 1024x768 or even lower; they look just fine to me. Which is why I'm definitely not the target audience for SLI.
Right now it's definitely unneeded, but a year from now we may see games where 1280x1024 or 1600x1200 bring a 7800 GT / GTX to 'marginally' playable frame rates (say about 30 FPS); Splinter Cell: Chaos Theory was already brought under 40 FPS at 1600x1200 with AA and AF enabled. It's certainly not unreasonable to expect far more graphically demanding games over the next few years.
Re:Gaming freaks indeed. (Score:4, Interesting)
Then of course there are the people who use 3D hardware as part of their job: CAD, 3D artists, level designers, game engine developers; one of the first SLI forum threads I read was by a guy involved in medical imaging. SLI is also laying the groundwork for future multicore cards; in much the same way that SMP was the realm of rich bastards and high-end professional users until multicore consumer-level CPUs arrived, SLI will probably remain the realm of the same sort of people for a year or two until we start seeing multicore NV chips.
Re:Gaming freaks indeed. (Score:2)
Did you know I'm still using a GeForce 2? And I'm okay with that!
Re:Gaming freaks indeed. (Score:2)
I started with the GeForce 256 DDR (T&L, woo!), then a Ti4200, 5900XT, 6800GT, and now a 7800GTX. I'm also okay with this.
Re:Gaming freaks indeed. (Score:3, Informative)
Me. But it's more like $500+ twice a year.
New cards make the games I play (basically just WoW and occasionally CS: Source now) run more smoothly and let me crank up the resolution to my LCD's native res (1920x1200) with all the eye candy on without it turning into a slideshow. Other than that, I don't really think about it that much.
My gaming PC is on about a (unintentional, I just get the itch about the same time) 6 month upgr
Re:Gaming freaks indeed. (Score:2)
Once a machine falls off the end of the ol' upgrade queue (I have 4 right now including my Powerbook, that's enough...) I usually end up packing it up and shipping it off to someone I know who needs a computer.
Hello...
Re:Gaming freaks indeed. (Score:2)
As another poster said, the newer cards let the older games run more smoothly. A definite plus, since I still manage to game quite a bit. I am also an amateur 3D visualization artist. The OpenGL aspect of cards like these agrees with my 3D software of choice. Most 3D animation and modeling software on the market has some kind of hardware GPU accelerated
F.E.A.R. (Score:1, Interesting)
Of course, you could turn the visuals down a notch. But I tried playing on my friend's computer (he happens to be one of those hardcore gamers) and the experience just isn't the same compared to my Radeon 9800 Pro. The graphics are on a whole other level.
For some, future-proofing. (Score:3, Interesting)
Re:Gaming freaks indeed. (Score:1)
But when you're creating 3D graphics (such as a video game, or in my case, virtual reality/simulations), the data that you deal with is uncompressed and uses much more horsepower. Running multiple instances of an OpenGL application can get pretty intensive. I am usually working in 3dsMax and I also have a ru
Re:Gaming freaks indeed. (Score:2)
There wasn't any real reason that he did it. He wasn't getting any noticeably different performance (once the res is all the way up and the eye candy is turned on, the extra 2 frames a second don't
Re:Gaming freaks indeed. (Score:3, Interesting)
Do they really need people spending that much money every year? There's enough computers out there that are of various ages and specs, couldn't it be more of a staggered approach to sales? Lots of people have a two year life cycle on their computers. Every couple of years or so they dump a ton of money into a big whiz bang new system intended to 'last' a long time. Get enough people with that sort of mentality buying
Re:Gaming freaks indeed. (Score:2)
Let's just face it. Developers are no longer limited to a low spec platform, so code isn't tight and 'on the metal' like it used to be
Why? Quality! (Score:2)
It's absolutely all about games, sure. Doom 3, Half-Life 2, Far Cry, and even the modest graphics of WoW will push any single card currently on the market at moderately high resolutions if (and that's the kicker, if) you have the quality turned up.
It's fair to say people don't actually set the high-detail options, though; they just set the in-game quality to 'High' and leave it at what
Re:Gaming freaks indeed. (Score:2)
Although, newer/upcoming games like Battlefield 2, Call of Duty 2, etc. are choppy. I had to lower video options. I still need to find something to replace my ATI TV tuner (AIW) and video card. I will be going to NVIDIA.
Re:Gaming freaks indeed. (Score:1)
I do CAD/CAE/CAM with ProE/Wildfire/Mechanica on a system with twin Xeons, 4 GB of SDRAM, an NVIDIA Quadro FX 4000 video card (bought for $1,661), three 60 GB high-speed drives, etc. In my case a high-end video card isn't needed to boost gaming frame rates but to create and edit large models consisting of lots of assemblies and lots & lots of parts.
BTW sure am glad the next version of Windows (Vista) will handle
Re:Gaming freaks indeed. (Score:2)
Not with that much detail. The next generation might make it, but the current consoles aren't quite there. And I'm quite capable of spending all my free time playing UT. If you want to be good at a game it needs to be your primary game or at least your primary genre.
Re:Gaming freaks indeed. (Score:2)
You need a card like this, or two 7800 GTs in SLI, to run games on this badboy [westinghousedigital.com].
4xAA at 1920x1080 can bring your comp to its knees.
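A rough sketch of why, assuming 32-bit color and 32-bit depth per sample and plain 4x multisampling (actual GPU memory layouts vary, so treat this as back-of-the-envelope only):

    # Framebuffer arithmetic for 4xAA at 1920x1080, assuming 4-byte
    # color and 4-byte depth/stencil per sample (illustrative only).
    width, height, samples = 1920, 1080, 4
    bytes_color, bytes_depth = 4, 4

    pixels = width * height                              # ~2.07 million pixels
    msaa = pixels * samples * (bytes_color + bytes_depth)
    resolved = pixels * bytes_color * 2                  # front + back buffers

    print(f"{(msaa + resolved) / 2**20:.0f} MB")         # ~79 MB before any textures

On a 256 MB card that's nearly a third of the memory gone before a single texture is loaded.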
Re:Gaming freaks indeed. (Score:1)
Re:Gaming freaks indeed. (Score:1)
Also, the folks who shell out for these kinds of parts will be sure to tell all their friends how great their respective graphics chip company's kit is. You don't usually drop a G on hardware and then tell everyone how bad it is.
It
OpenGL (Score:1)
Java2D is now OpenGL accelerated under Mustang. [java.net]
Glitz [freedesktop.org] provides support for hardware acceleration too.
So, usage of OpenGL is increasing...
Re:Gaming freaks indeed. (Score:2)
The kind of people you're not in the target market of.
"What benefits are there?"
At the risk of sounding like a marketing-bot (which I am *not*): the greater utility of having tomorrow's performance today. What kind of techno-geek are you, that you trash the early adopters? They pay the premium that gives you cheaper performance later!
Without their initial waste of money, you'd be spending a lot more in the long run.
Re:Gaming freaks indeed. (Score:1)
I have a couple of places in mind that would hire them... Hint: one starts with n, another starts with A.
Ooooh! what about... (Score:2, Troll)
Wake up! (Score:2, Insightful)
So your 4-GPU setup would end up costing a lot more than "the price of two!"
Besides, you can't run these cards in "SLI" mode again. This card is it; you can't add another.
Wake up from your fantasy!
Multi-SLI?? (Score:2, Interesting)
http://www.tomshardware.com/motherboard/20051004/
sweeeeeeet.
What the announcement didn't mention... (Score:2, Funny)
Re:What the announcement didn't mention... (Score:2)
Nice and hot (Score:1)
Naa... (Score:3, Funny)
Nvidia 2D quality compared to ATI? (Score:4, Interesting)
Re:Nvidia 2D quality compared to ATI? (Score:2, Insightful)
Re:Nvidia 2D quality compared to ATI? (Score:1)
Re:Nvidia 2D quality compared to ATI? (Score:4, Informative)
Nvidia's 2D quality is excellent these days (under Linux and OS X at least, not sure about Windows) and certainly on par with ATI, though it does not really compare to Matrox cards (in my opinion).
Looking at recent benchmarks, if you want the best performance then you should go with Nvidia.
my nVidia Quadro4 smokes my matrox's (Score:2, Informative)
The nVidia display for this is dead sharp and visually quiet, indistinguishable from DVI. The Matrox isn't generally bad, but this kind of display shows a lot of scanning flicker,
My two cents on NVIDIA vs. ATI (Score:3, Interesting)
1. Linux support. ATI's driver in Linux = horrible and harder to set up compared to NVIDIA's.
2. In Windows, NVIDIA's drivers and software seem to be less buggy than ATI's. I use the All-In-Wonder software (MMC), and it is VERY buggy. Sometimes driver features don't work, like video out to my TV. I have to reboot to make it work. I know NVIDIA doesn't make TV tuner s
Re:My two cents on NVIDIA vs. ATI (Score:2)
Re:My two cents on NVIDIA vs. ATI (Score:2)
Re:Nvidia 2D quality compared to ATI? (Score:2)
And will people please start using the DVI connection on the projector? I'm sick and tired of seeing blurr-o-vision on LCD presentations caused by 50 ft of VGA cable.
Re:Nvidia 2D quality compared to ATI? (Score:2)
Seriously. It's 2005. Who cares? I've never seen a problem with any GeForce or Radeon card I've owned in the past.
Re:Nvidia 2D quality compared to ATI? (Score:3, Informative)
Re:Nvidia 2D quality compared to ATI? (Score:2)
If there's ghosting, there's probably something else wrong with the card, not the GPU.
Re:Nvidia 2D quality compared to ATI? (Score:5, Interesting)
Re:Nvidia 2D quality compared to ATI? (Score:1)
It's kind of like pointing out that the original Pentium had a flaw which reduced its accuracy in certain kinds of division operations: true, but irrelevant to the current generation of Pentium 4s.
Re:Nvidia 2D quality compared to ATI? (Score:2, Informative)
And the low quality of ATI's drivers continues to affect 2D performance, 3D performance, stability, and user sanity. Great image quality, though. When the drivers on my laptop crash every few hours and drop back to VGA mode with a cute little error message, I al
Re:Nvidia 2D quality compared to ATI? (Score:2, Insightful)
The drawback is that I still get driven crazy by ATI and their stupid drivers. I just installed the latest last night and it completely messed up the TV function of the card, and I had to remove and reinstall all the software to get it to work. That is why I hadn't upgraded the video drivers for so long. Every time I do it, it is a big mess. You
Re:Nvidia 2D quality compared to ATI? (Score:2, Informative)
Re:Nvidia 2D quality compared to ATI? (Score:2)
Vista (Score:2, Insightful)
Re:Vista (Score:2)
I'm willing to bet that the release of OS X on Intel-based Macs will push higher-performance graphics cards too; but still, what is the point of stating something like that? Is there any point at all?
Re:Vista (Score:1)
Re:Vista (Score:2)
I said you didn't have a point; all you said was:
My point was not that your view was wrong, or that my view was right; my point was that your statement was incomplete.
I never said anything about OS X having any impact on the sales of video cards; when I said, "I'm willing to bet that the release of OS X on Intel-based Macs
Re:Vista (Score:1)
Having said that, I'd imagine that there are far more people upgrading because of a combination of larger monitors and graphically demanding games like HL2, Battlefront 2, and Doom 3.
Heat will be an issue if... (Score:1)
I'll wait for a review from these guys: http://www.coolhardware.co.uk/ [coolhardware.co.uk]
external power supply (Score:4, Interesting)
Re:external power supply (Score:1)
Re:external power supply (Score:1)
Things that make you go hmm.... (Score:1)
Re:Things that make you go hmm.... (Score:2)
Re:Things that make you go hmm.... (Score:2)
Re:Things that make you go hmm.... (Score:2)
yeah great but... (Score:1)
Re:yeah great but... (Score:1)
Re:yeah great but... (Score:1)
Tapping the untapped... (Score:3, Insightful)
All of these new cards will give more than playable rates at either of these resolutions on most modern games without breaking a sweat, with the heavier game engines requiring you to drop a notch or two on the FSAA or AF.
In fact, even my trusty OEM Radeon 9700 Pro, bought in December 2002 for $270, does that just fine.
But where is all that horsepower needed? The answer is obvious, and yet promptly ignored. All these cards have two outputs (at least). Which can very well work simultaneously in a game, thank you very much. If one LCD can't go over 1280x1024, why not have two?
I run a two-monitor setup on my Radeon (dual Samsung 172Xs). Both nVidia and ATI drivers support spanning (turning all outputs into one virtual very large screen). Three problems arise that require attention for this to work in gaming:
1. The game must support using SPAN. Many games (UT2k4, NWN, Fable, etc.) support this reasonably.
2. Unrelated to issue #1 above, the game must support *weird* aspect ratios. Contrary to popular belief, 1280x1024 - the res our modern LCDs do best - is not 4:3 like 640x480, 800x600, and 1024x768. It is 5:4. Do the math. The next 4:3 notch is 1280x960. The 5:4 aspect ratio aside, dual monitors give some very new ARs altogether: 8:3 for two 4:3 monitors, or 10:4 for two 1280s side by side (see the sketch at the end of this post). Fable, for example, while putting the rendered picture within my virtual 10:4 display area neatly, promptly puts the (quite essential) dialog subs and game choices outside the viewable area because it is unfamiliar with this aspect ratio.
3. Not a showstopper, but very easy to work around if only the game devs would give it one ounce of thought:
Most action in almost any type of game (bar, perhaps, RTSs) happens dead in the center of your display. Which is good if you're playing with three displays, all the important stuff happening flat in the center of your middle one. But with the simple solution 90% of people can afford and implement - purchase an additional monitor and hook it up to their existing dual-head-supporting graphics card - all the action happens right on top of the split between the two monitors. This affects everything from your character in NWN (who gets split by 2cm of space in the middle - 5cm if you didn't choose your monitors wisely - and looks somewhat 'fat') to that little pixel marking the business end of my sniper rifle in UT. VERY annoying (though I got used to it, to an extent, and it's very much worth the wider viewport).
GAME DEVELOPERS, PLEASE, PRETTY PRETTY PLEASE, PUT AN OPTION IN THE CONFIG TO OFFCENTER THE GAME HAPPENINGS SO THE CENTER OF THE GAME IS
Those issues aside (and in some games, at least the former two definitely are), two monitors and a 2560x1024 resolution would give even the newest GPU (with FSAA, AF, and shadow rendering cranked up to max, of course) a very decent workout, and put all that unusable horsepower back on the fringes of the usable realm.
My two cents.
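As a footnote, a small sketch of the aspect-ratio arithmetic above, plus the kind of off-center offset the all-caps plea is asking for (the helper names are illustrative, not from any real game's config):

    from math import gcd

    def aspect(width: int, height: int) -> str:
        """Reduce a resolution to its aspect ratio."""
        g = gcd(width, height)
        return f"{width // g}:{height // g}"

    print(aspect(1024, 768))   # 4:3
    print(aspect(1280, 1024))  # 5:4 - not 4:3, as the post says
    print(aspect(1280, 960))   # 4:3 - the next true 4:3 notch
    print(aspect(2560, 1024))  # 5:2, i.e. 10:4 - two 5:4 panels side by side

    # The requested off-center option boils down to shifting the projection
    # center from the bezel split to the middle of one panel. For a
    # two-monitor span, each panel center sits a quarter of the span width
    # from the middle.
    def panel_center_offset(monitors: int = 2) -> float:
        """Offset from span center to the nearest panel center, as a
        signed fraction of total span width (negative = left)."""
        return -1.0 / (2 * monitors)

    print(panel_center_offset())  # -0.25 for a dual span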
duel duel (Score:1)
Re:duel duel (Score:1)
-D