Running Video Cards in Parallel
G.A. Wells writes "Ars Technica has the scoop on a new, Alienware-developed graphics subsystem called Video Array that will let users run two PCI-Express graphics cards in parallel on special motherboards. The motherboard component was apparently developed in cooperation with Intel. Now if I could only win the lottery."
In other news... (Score:5, Funny)
Re:In other news... (Score:5, Funny)
However, they went on to say that Clippy was still intact. They're going to try again with a bigger catapult and a concrete-reinforced barrier.
Man am I out of the loop. (Score:4, Funny)
Seriously, I've been out of the PC market for too long. Alas, poor wallet. I had cash flow, Horatio.
Re:Man am I out of the loop. (Score:4, Informative)
Re:Man am I out of the loop. (Score:3, Informative)
http://www.pcworld.com/news/article/0,aid,94724,0
Re:Man am I out of the loop. (Score:2)
Not only that, but PCI is a shared half-duplex bus, while each PCI Express lane is a dedicated full-duplex pair. A lane signals at 2.5 Gb/s in each direction, which 8b/10b encoding cuts to about 2 Gb/s of usable data, so a x1 link is only about twice as fast as plain 32-bit/33 MHz PCI (~1 Gb/s) for the mostly unidirectional traffic that matters here. Graphics cards, for example, are rarely used to send data back to the CPU. The difference is scale: instead of one lane, a x16 slot bundles sixteen of them, for 40 Gb/s raw (about 32 Gb/s usable) in each direction.
Also note that the gigabit figures quoted for PCI Express are raw signalling rates, before the encoding overhead is subtracted.
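Here's a quick back-of-the-envelope check of those figures (a Python sketch; the 8b/10b overhead is the standard PCIe 1.0 line encoding):

```python
# Back-of-the-envelope PCI Express 1.0 bandwidth. Figures are per
# direction; each lane is full duplex.

RAW_GBPS_PER_LANE = 2.5        # 2.5 GT/s signalling rate per lane
ENCODING_EFFICIENCY = 8 / 10   # 8b/10b: 10 bits on the wire per 8 data bits

def effective_gbps(lanes):
    """Usable bandwidth per direction for an xN link, in Gbit/s."""
    return lanes * RAW_GBPS_PER_LANE * ENCODING_EFFICIENCY

for width in (1, 4, 16):
    print(f"x{width}: {effective_gbps(width):.1f} Gb/s per direction")
# x1: 2.0 Gb/s, x4: 8.0 Gb/s, x16: 32.0 Gb/s
```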
Re:Man am I out of the loop. (Score:5, Informative)
Re:Man am I out of the loop. (Score:5, Interesting)
Re:Man am I out of the loop. (Score:2, Insightful)
He must have been out of the market for a decade to have never heard of something that is only just now coming to market? What?
I've been halfway out of the market for about five years, and I only recently heard of PCI-Express, and I didn't have many details about it. Researching new, not yet marketed t
Re:Man am I out of the loop. (Score:4, Informative)
That's rather over-stating the case.
Roughly 10 years ago, PCI was finally just supplanting EISA/VESA and ISA boards were still common.
I build a few machines per year, and PCI-Express only just hit my radar screen in the past 12-18 months. Even today, I have yet to see mainstream motherboards or cards for it, so it's still rather theoretical at this point.
It is an interesting design. Whether or not it will live up to its promise remains to be seen.
Re:Man am I out of the loop. (Score:5, Informative)
Quad-screen? (Score:5, Interesting)
I want tri-head or quad-head video, but with at least AGP speeds. You can do it now, but only with PCI cards getting involved.
Comment removed (Score:5, Informative)
Re:Quad-screen? (Score:2)
NOC situations (monitoring)
Graphics work (GIMP, Maya, etc)
Coding (reference materials on one screen, IDE/terminals on another, debugging/output on another)
While I do game some, nothing I play has any worthy multi-head capabilities, so this is meaningless for gaming, at least for me.
Re:Quad-screen? (Score:3, Interesting)
On another note, I suspect the only way it will really accelerate single images is in cases where render-to-texture is used, i.e. per-frame generation of shadow or environment maps. The completed maps could then be passed to the card that actually has the active frame buffer, to be used in regular rendering. Two cards could at BEST double performance, and nothing ever scales optimally.
Re:Quad-screen? (Score:5, Interesting)
As long as you can live with at least one of the screens not running quite as fast (maybe an informational type of screen as opposed to 3D scenery?), 3 screens is really easy today. Almost all decent AGP cards these days support 2 screens at 1600x1200. Throw in a good PCI card and you've got 3. I've been running this way for years and it works well. Actually, the PCI card isn't shabby.
The only problem I encounter in Windows is an occasional tooltip coming up on the primary monitor instead of a secondary monitor. This is not the fault of the OS, rather the application is constraining the tooltip to be on the primary monitor by forcing it to be within the primary monitor's coordinates.
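To make that bug concrete, here's a hypothetical sketch (the rectangles and names are invented for illustration): the fix is to clamp against the whole virtual desktop rather than the primary monitor's rectangle.

```python
# Hypothetical illustration of the tooltip bug described above.
# Rectangles are (left, top, right, bottom) in virtual-desktop coordinates.

PRIMARY = (0, 0, 1600, 1200)          # primary monitor only
VIRTUAL = (0, 0, 3200, 1200)          # primary + one secondary, side by side

def clamp_point(x, y, rect):
    """Force a point inside a rectangle."""
    left, top, right, bottom = rect
    return min(max(x, left), right), min(max(y, top), bottom)

cursor = (2400, 600)                  # mouse is on the secondary monitor

# Buggy: the app clamps against the primary monitor's coordinates,
# so the tooltip jumps over to the primary screen's edge.
print(clamp_point(*cursor, PRIMARY))  # (1600, 600)

# Correct: clamp against the whole virtual desktop.
print(clamp_point(*cursor, VIRTUAL))  # (2400, 600)
```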
Note that Matrox's single board AGP solution does not compete with this. Using a high end NVidia for the main two screens provides too much of a performance advantage to give up for Matrox's slow cards. Matrox's cards, even though on AGP, run about like the PCI cards.
Regardless, when these systems become more available, I will be one of the first to put 2 video cards in and run 3 or 4 screens from my PCI Express system. But, though I like playing 3D games this way, I do it for the extra informational surface for programming. It greatly eases things to run your application on one screen and your development environment on all of the others so that you can see everything at once. And with 19" 1920x1440 monitors (which usually manage 1600x1200 with better focus than a native 1600x1200 monitor) running around $250 a pop, it's a very worthwhile investment.
Re:Quad-screen? (Score:3, Informative)
here [matrox.com]
Matrox, because all the other cards are merely toys for the kids at home.
Re:Quad-screen? (Score:2)
While I do game at home, nothing I play can use dual-head, other than stretching--which is unplayable. Ever tried playing with your crosshair split in half by 4 inches?
I probably will end up doing what others have suggested--just mix a PCI card in wi
Re:Quad-screen? (Score:2)
Had a better option, did you?
If it weren't for Matrox, regardless of your own issues with drivers on your chosen platform, do you think there would be _any_ multi-monitor cards today? And even if there were, do you think they'd be up to the quality that they are now?
Come on, at least give credit where it is due.
Because of Matrox, you can now get a multi-monitor nVidia or ATI card that is _better_ than a Matrox card for your needs.
Re:Quad-screen? (Score:2)
Press Release (Score:5, Informative)
Re:3dfx has done it again ... (Score:2)
Re:3dfx has done it again ... (Score:2)
Secondly, and more importantly, nahh, you're obviously trolling if you can dis the Voodoo2 for what it was at the time...THE card, PERIOD.
(No, not for long, but without any doubt at the time it came out it was by FAR the best consumer card available, also cost me twice as much as
Voodoo (Score:5, Interesting)
Re:Voodoo (Score:5, Informative)
Re:Voodoo (Score:2, Interesting)
Good idea implemented too early. Such is life.
Re:Voodoo (Score:3, Interesting)
I think you could string something like 4 Voodoo Rush cards together or something (who knows if you got 4x performance, but I'm sure it went up, not down)
Problem was, by the time they put this out there, the tech it was running was months behind cutting edge. 4x something old is easily forgotten.
Re:Voodoo (Score:5, Informative)
Re:Voodoo (Score:5, Interesting)
Benchmarks for the old 3dfx V2 SLI can be seen here:
http://www4.tomshardware.com/graphic/19980204/
I was (and still am, although it's in the junk pile) a 3dfx V2 owner; the performance of that card was just amazing at the time. The Voodoo and the Voodoo2 definitely changed the world of 3D gaming.
Also of interest is an API that came out much later for the 3dfx chipsets that actually let you use your 3dfx chipset (they didn't call it a GPU back in the day) as another system processor. If you were an efficient coder you could actually offload geometric and linear calculations to the card for things other than rendering. I can't seem to find the link for that, though; it may be gone forever.
Re:Voodoo (Score:2)
Light on Info (Score:2, Interesting)
Re:Light on Info (Score:2, Informative)
It splits the screen in half. Alienware claims a ~50% boost.
interesting technology (Score:5, Interesting)
I'll admit I haven't yet read the whole article, but even though it says that it isn't tied to any one video card, that doesn't say to me that it can have multiple disparate cards. If it is doing something along the lines of SLI, I would guess that the speeds would need to be matched between the two cards. And that would imply having two of the same card, whatever card the user chooses.
But maybe not... maybe it's the advent of asymmetric multi-video processing.
Re:interesting technology (Score:5, Interesting)
Chromium replaces your OpenGL library with one that farms the OpenGL drawing out to multiple machines. It's how display walls [psu.edu] are built.
You can use the same technique for multiple cards in the same box.
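As a rough sketch of the sort-first idea (illustrative only, not Chromium's actual configuration API), you carve the frame into tiles and hand each renderer the tiles it owns:

```python
# A toy sort-first splitter in the spirit of Chromium: carve the frame
# into tiles and assign each tile to one of N renderers (cards or hosts).
# This is an illustration, not Chromium's real configuration interface.

def assign_tiles(width, height, tile, renderers):
    """Map each tile region to a renderer, round-robin."""
    jobs = {r: [] for r in range(renderers)}
    index = 0
    for y in range(0, height, tile):
        for x in range(0, width, tile):
            region = (x, y, min(x + tile, width), min(y + tile, height))
            jobs[index % renderers].append(region)
            index += 1
    return jobs

# Two renderers, a 1600x1200 frame, 400-pixel tiles.
for renderer, regions in assign_tiles(1600, 1200, 400, 2).items():
    print(renderer, regions)
```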
Re:interesting technology (Score:2)
Each card will receive a number of scanlines to process according to its strength, making the rendering speeds similar, with the final output synced to the slower card.
There are still problems with features such as pixel shaders that might be available on only one of the cards. Also, anti-aliasing can be weird.
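A minimal sketch of that proportional split (the benchmark numbers are made up):

```python
# Split a frame's scanlines between cards in proportion to measured speed.

def split_scanlines(height, speeds):
    """Return per-card (start, end) scanline ranges, proportional to speed."""
    total = sum(speeds)
    ranges, start = [], 0
    for i, speed in enumerate(speeds):
        # The last card absorbs any rounding leftover.
        end = height if i == len(speeds) - 1 else start + round(height * speed / total)
        ranges.append((start, end))
        start = end
    return ranges

# A fast card benchmarked at 150 fps and a slower one at 100 fps:
print(split_scanlines(1200, [150, 100]))   # [(0, 720), (720, 1200)]
```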
this isn't new (Score:5, Informative)
PCI and PCI Express have had this written into the spec
AGP does too, but when was the last time you saw dual AGP slots on a mobo? (they do exist)
Re:this isn't new (Score:2)
Re:this isn't new (Score:2)
I can dream...
Re:this isn't new (Score:5, Informative)
Re:this isn't new (Score:2)
Would you mind enlightening us as to where we might find such a board?
Not quite the same... (Score:2)
Although it is impressive that an SSI can run across them. Very neat.
But I wonder if you can actually run a single X11 session using both screens.
Big Deal - PCI Express. Any one can add two video (Score:3, Insightful)
So they have one of the first MB's with two PCI Express slots. Big deal; soon MB's will contain many PCI-Express slots. Hopefully a lot more than 2.
Re:Big Deal - PCI Express. Any one can add two vid (Score:2)
Re:Big Deal - PCI Express. Any one can add two vid (Score:2)
too many standards (Score:2, Funny)
Oh, come on! (Score:4, Insightful)
Easy solution? Several high-speed serial connections in parallel between the two cards. With a little bit of circuitry on the card dedicated to keeping the data identical.
Or, with a little bit of a performance hit, you could keep each section of RAM separate, and route misses over the cables.
Nice, A complete Vapor-article. (Score:5, Informative)
From the article: "The answers may have to wait until Q3/Q4". There are no performance numbers, no real statements of how it works, nothing much at all. Just wow, gee whiz, dual graphics cards in parallel. What exactly does "in parallel" mean? That's not even addressed.
Some things I thought of immediately on reading this: great - two displays each driven by a separate card, or, better yet, quad displays driven by two cards. Nope, not a word about either possibility. The implication of the PR/article is that 3D graphics will be processed faster. How? Do they have some nifty way of combining the signals of two standard off-the-shelf graphics cards into a single monitor? (Hint: it's hard enough getting the monitor to properly sync up with a single high-performance graphics card!)
Since when does ArsTechnica merely regurgitate PRs? This was 99.999% vacuum.
Re:Nice, A complete Vapor-article. (Score:2)
Perhaps something like that?
Re:Nice, A complete Vapor-article. (Score:2)
well this is Alienware so I think they mean that one card is on top of the other and both are perpendicular to the MB, hence they are running "in parallel" with each other.
Not really hard.... (Score:3, Informative)
Duplicate the data stream (should be doable in hardware), have them render half each (every 2nd scanline?) and merge them with a trivial buffer (keep two flags: firsthalf done/not done, secondhalf done/not done). You'd limit yourself to the minimum of the two, but since they eac
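Something like this toy version, where the "rendering" is faked and the done flags gate the final present:

```python
# Two worker threads render the top and bottom halves of a frame; a pair
# of done flags gates the present. Rendering is simulated with sleeps.

import threading, time

HEIGHT = 1200
framebuffer = [None] * HEIGHT
done = [threading.Event(), threading.Event()]

def render_half(card, start, end):
    for line in range(start, end):
        framebuffer[line] = f"card{card}:line{line}"  # stand-in for real work
    time.sleep(0.01 * (card + 1))                     # cards finish at different times
    done[card].set()

threads = [
    threading.Thread(target=render_half, args=(0, 0, HEIGHT // 2)),
    threading.Thread(target=render_half, args=(1, HEIGHT // 2, HEIGHT)),
]
for t in threads:
    t.start()

# Present only once both halves are complete -- effectively syncing
# to the slower card, as noted above.
for flag in done:
    flag.wait()
print("frame complete:", framebuffer[0], "...", framebuffer[-1])
```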
Re:Nice, A complete Vapor-article. (Score:2, Informative)
So, you have the press release to go on. And
Are we going to need this... (Score:5, Funny)
Intel's Chipset only supports One x16 PCIe (Score:2, Informative)
I doubt that Intel is going to make a 2 port one especially for Alienware.
So I expect it means that the second graphics card is plugged into a x4 or x1 PCIe connector.
Anyway, this is nothing special, it is all part of the specification. Hell, you could have two AGP v3 slots in a machine working at the same time - how do you think ATI's integrated graphics can work at the same time as an inserted AGP ca
Re:Intel's Chipset only supports One x16 PCIe (Score:2)
Re:Intel's Chipset only supports One x16 PCIe (Score:2, Informative)
Under Linux, run "lspci" as root, and see if the two cards are on different PCI buses.
You can do something similar under Windows XP:
Go to the device manager, and look at the Location field of your two video devices. The box I'm on only has one, but here's what an AGP card's location field looks like: "PCI bus 1, device 0, function 0"
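If you want to automate the Linux check, a small throwaway script (not a standard tool, just an illustration) can group lspci's VGA entries by bus:

```python
# Run lspci and list VGA controllers with their bus numbers, so you can
# see at a glance whether two cards sit on different PCI buses.

import re, subprocess

def vga_devices():
    """Yield (bus, description) for every VGA controller lspci reports."""
    output = subprocess.run(["lspci"], capture_output=True, text=True).stdout
    for line in output.splitlines():
        # lspci lines look like: "01:00.0 VGA compatible controller: ..."
        match = re.match(r"([0-9a-f]{2}):[0-9a-f]{2}\.\d+ (VGA.*)", line, re.I)
        if match:
            yield match.group(1), match.group(2)

for bus, desc in vga_devices():
    print(f"bus {int(bus, 16)}: {desc}")
```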
Re:Intel's Chipset only supports One x16 PCIe (Score:2)
Have you seen anything talking about second-generation chipsets that support two 16x PCI-express connectors?
This is what I want and I'm not getting a new computer until it happens.
Re:Intel's Chipset only supports One x16 PCIe (Score:2)
If it's not a bus but a port, I don't see how it's radically better than AGP.
Re:Intel's Chipset only supports One x16 PCIe (Score:2)
Everything old is new again? (Score:2, Informative)
The article seems to claim that the cards will be able to split processing duties, even if they're not from the same manufacturer. That particular claim seems very dubious to me for some reason. Other than integrating
Re:Everything old is new again? (Score:2, Insightful)
Re:Everything old is new again? (Score:2)
-fren
Metabyte PGC (Score:3, Informative)
Nothing wrong with it, though - PGC actually did work, and was previewed independently by several people (I think Sharky?).
-Erwos
But do you need multiple monitors? (Score:3, Interesting)
Besides the obvious issue of the hardware cost of multiple graphics cards and multiple monitors, you also have to consider desktop space issues. Even with today's flat-panel LCDs, two monitors will hog a lot of desktop space, something that might not be desirable in many cases.
I think there is a far better case for a single widescreen display instead of multiple displays. Besides hogging a lot less desktop space, widescreen displays allow you to see videos in their original aspect ratio more clearly, and also allow for things like seeing more of a spreadsheet, a clearer preview of work you do with a desktop publishing program, and (in the case of a pivotable display) easier reading of web pages and/or easier single-page work with a DTP program. Is it any wonder people liked the Apple Cinema Display, with its roughly 1.85:1 aspect ratio, so much?
Re:But do you need multiple monitors? (Score:3, Interesting)
Re:But do you need multiple monitors? (Score:2)
I agree with that, but the desktop space hogged by two 17" LCD monitors is surprisingly large, far more than what you get with the Apple Cinema Display.
Besides, with large-scale manufacturing of widescreen LCDs the cost would come down very quickly. Remember, most of today's latest graphics cards can easily drive display modes akin in aspect ratio to the Apple Cinema Display (they're already part way th
Re:But do you need multiple monitors? (Score:2)
Re:But do you need multiple monitors? (Score:2)
Re:But do you need multiple monitors? (Score:2)
If you had left that third paragraph off your post, I would have rated it Funny. (I thought you were talking about 'desktop' in terms of the root window.)
Hell yes, some of us NEED [tjw.org] multiple monitor setups. I've been using a dual monitor setup for about 5 years, and although I could get by with one, doing so would make my day-to-day work much more annoying. It would be lik
Re:But do you need multiple monitors? (Score:2)
Besides, most games are designed with the assumption that you're using only one monitor. Wouldn't it be better for a game to take advantage of a wider aspect ratio display so the view becomes a bit more realistic?
Re:But do you need multiple monitors? (Score:2)
Right now I'm posting to Slashdot on my secondary monitor and playing EVE Online on my primary. While I've used the two-monitor setup for web design and programming, you will find uses beyond the standard CAD/programming argument really fast once you've got it in front of you.
Like most Widescreen monitors... (Score:2)
I just hope that we in Europe can *please* have HDTVs and none of those fucking stupid region codes, yes? Or do I have to wait for DeCSS2 before I can buy any of the special offers??? (ever notice how those with no cod
Re:But do you need multiple monitors? (Score:2)
Ironic you mention that on the day that my development computer at work is moved onto a new desk to cope with the 4 monitors (for 4 PCs, admittedly), while our game-playing PCs are filled with multiple-output graphics cards, and with outputs serving multiple monitors. Oh, and the
Re:But do you need multiple monitors? (Score:2)
It's like going back to the days of the Voodoo 2 cards running in SLI fashion? Such an idea seems superfluous nowadays, especially starting with the ATI R300 and nVidia GeForce FX generation of cards, which already have a tremendous amount of graphics processing power to start with. The very latest ATI X800 and nVidia GeForce FX 68
Re:But do you need multiple monitors? (Score:2)
Maybe someone's done this already, but I've never come across it. In the age of more widescreen laptops and d
The real question (Score:5, Interesting)
Imagine an openmosix cluster of dual-processor machines that run bioinformatic calculations and simulations. Lots of matrix math and such - pretty fast (and definitely a lot faster than a single researcher's machine).
Now imagine the same cluster, but each machine has 2 or 4 dual-head graphics cards and every algorithm that can be written in Brook or similar is. That gives each machine up to 2 CPUs and maybe 8 GPUs that may be used for processing. The machines are clustered, so a group of ~12 commodity machines (1 rack) could have 24 CPUs and 96 GPUs. Now that would be some serious computing power - and relatively cheap too (since 1-generation-old dual-head cards are ~$100-$150).
By the way, does anyone know if there is any work going on to create toolkits for Octave and/or MATLAB that would utilize the processing power of a GPU for matrix math or other common calculations?
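As a toy version of the data-parallel idea (plain Python processes standing in for GPUs or cluster nodes), splitting a matrix multiply into row blocks looks like this:

```python
# Split a matrix multiply into row blocks and farm the blocks out to
# worker processes, the way a cluster (or a pile of GPUs driven by a
# stream language like Brook) would divide the work.

from multiprocessing import Pool
import random

def matmul_rows(args):
    """Multiply a horizontal slice of A by the full matrix B."""
    a_rows, b = args
    return [[sum(a * b[k][j] for k, a in enumerate(row))
             for j in range(len(b[0]))] for row in a_rows]

def parallel_matmul(a, b, workers=4):
    chunk = (len(a) + workers - 1) // workers
    blocks = [(a[i:i + chunk], b) for i in range(0, len(a), chunk)]
    with Pool(workers) as pool:
        return [row for block in pool.map(matmul_rows, blocks) for row in block]

if __name__ == "__main__":
    n = 64
    a = [[random.random() for _ in range(n)] for _ in range(n)]
    b = [[random.random() for _ in range(n)] for _ in range(n)]
    print(len(parallel_matmul(a, b)), "rows computed")
```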
Re:The real question (Score:2)
It would be interesting to see some of the distributed computing efforts (seti@home, etc) take advantage of GPUs if there is anything there of use.
The Return of Voodoo 2 SLI (Score:3, Informative)
Power to them if they can pull it off! (Score:4, Insightful)
So here's the question:
-How is pixel processing going to work? For a given frame, there is vertex and texture information, as well as the interesting little shader routines that work their magic on these pixels. How are you going to split up this workload between the 2 GPUs? You can't split a frame up between the GPUs; that would break all texture operations, and there would be considerable overhead with the GPUs swapping data over the PCI bus. *MAYBE* having each GPU handle a frame in sequence would do the trick, but, again, it's a dicey issue.
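To make the alternate-frame option concrete, here's a toy dispatcher (worker threads standing in for GPUs; a real driver would also have to replicate texture and vertex data to both cards):

```python
# Frame N goes to GPU (N mod 2); results are presented in frame order.

import queue, threading

frames_in = [queue.Queue(), queue.Queue()]
frames_out = queue.Queue()

def gpu(index):
    # Consume frames assigned to this card until a shutdown sentinel arrives.
    while True:
        frame = frames_in[index].get()
        if frame is None:
            return
        frames_out.put((frame, f"rendered by GPU {index}"))

workers = [threading.Thread(target=gpu, args=(i,)) for i in (0, 1)]
for w in workers:
    w.start()

for frame in range(6):            # round-robin dispatch
    frames_in[frame % 2].put(frame)
for q in frames_in:
    q.put(None)                   # shutdown sentinel
for w in workers:
    w.join()

results = []
while not frames_out.empty():
    results.append(frames_out.get())
for frame, note in sorted(results):  # present frames in order
    print(frame, note)
```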
It would appear to me that this dual-card graphics rendering is quite similar to dual-GPU graphics cards. Except, where on a graphics card you can handle cache/memory coherency and arbitration easily due to the proximity of the GPUs, with this discrete solution you run into the problem of having to use the PCI Express bus, which, as nice as it is, is certainly not that much faster than AGP.
So I say, power to you, Alienware. If you can pull it off with Nvidia, ATI et al., great. It's too bad the cynical side of me thinks this idea reeks of those blue crystals marketing departments love
More interested in real dual PCI-Express GFX slots (Score:2)
Having 3 slots would be ideal but I won't say no to 2 GFX cards so I can drive two monitors from two independent graphics cards at last.
It doesn't say how this technology will combine the two cards and whether it will need software support from the games. Hopefully it won't but the
Re:More interested in real dual PCI-Express GFX sl (Score:2)
What's stopping you from plugging an extra PCI based card in today? I've been running two monitors from two independent graphics for years.
But will they run? (Score:3, Insightful)
Something tells me you need special drivers AND/OR a standardized graphics card accelerator protocol just to pull it off; otherwise you're stuck with two of the same cards.
Re:But will they run? (Score:2)
That's not the problem.
Presuming they're doing interleaved frame processing (you simply cannot do scan line interleaving -- things as simple as anti-aliasing make that impossible), then you'll have every other frame rendered by each card.
You realize that the cards don't produce the exact same output, right? You'd probably get a headache from trying to visually process the subtle differences that occurred between each successive frame (but w
Who else will obviously support this on mobos... (Score:2, Insightful)
ATI will be doing this in its Catalyst drivers soon (Score:3, Informative)
ATI's Terry Makedon says: "Something big is coming for CATALYST in the next 2-3 months. It will take graphic drivers to a brand new level, and of course will be another ATI first. It will be interesting to see how long before other companies will copy the concept after we launch it."
Hmmm... just in time for PCI Express, and it's not something specific to ATI's hardware.
In Depth Interview (Score:2)
This is multi-display, not faster, graphics (Score:2)
It's possible to design and build GPUs that will play together to provide higher-performance graphics. The Apple QuickDraw 3D Accelerator Card [kicks-ass.net], from the early PowerPC days, does exactly that. If you get two, drawing speed nearly doubles. That device was more of a coprocessor, closer to the CPU than a modern GPU. It didn't drive the display; it jus
Re:This is multi-display, not faster, graphics (Score:2)
UPDATE: The current implementation splits the screen in half, assigning rendering for each half to one of two cards using a software load balancer to try and ensure proper synchronization. ...
Alienware is currently saying that they expect users to see a ~50% performance boost over single card implementations.
So that's a dual head graphics system mapped onto one display. Like picture-in-picture TVs.
With a really wide display, it might be worth it.
Hardly a new concept (Score:3)
1. The GeForce 256 was released shortly after this was announced.
2. The PCI bridge required both the AGP card and the PCI card to operate in PCI DMA mode. Unfortunately, there never was such a thing as an "AGP bridge".
In any case, other companies have since successfully implemented a simple framebuffer-splitting concept on-card, where the bandwidth is more plentiful. The ATI Rage Fury MAXX and the 3dfx VSA-100 come to mind; these chips simply split the framebuffer rendering according to complexity. Beyond that, NOTHING was shared - triangle and texture data were replicated for each chip.
The key to this: on the software side, in 3D mode the software automatically splits the framebuffer between the two cards. As for the "special" chipset, whatever scene data is sent to one video card, the same data is sent to the other video card. I can't imagine it being any more complex than this.
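Purely as an illustration of that broadcast scheme (the class and command names here are invented):

```python
# A driver-level "tee": every scene command is broadcast to both cards,
# and each card renders only the framebuffer rows it owns, so nothing
# has to be shared between the cards.

class Card:
    def __init__(self, name, rows):
        self.name, self.rows, self.commands = name, rows, []

    def submit(self, command):
        self.commands.append(command)   # full scene data, replicated per card

    def render(self):
        return (f"{self.name} rendered rows {self.rows[0]}-{self.rows[1]} "
                f"from {len(self.commands)} commands")

def broadcast(cards, command):
    for card in cards:
        card.submit(command)

top, bottom = Card("card0", (0, 600)), Card("card1", (600, 1200))
for command in ("upload textures", "draw terrain", "draw player"):
    broadcast((top, bottom), command)
print(top.render())
print(bottom.render())
```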
Doom III (Score:2)
It's about time (Score:2)
Re:what will this do? (Score:3, Funny)
Re:Is that new? (Score:2, Insightful)
Re:Next comes dual AGP graphics. (Score:5, Informative)
Re:Next comes dual AGP graphics. (Score:5, Informative)
Re:Next comes dual AGP graphics. (Score:2)
But this is
Re:Next comes dual AGP graphics. (Score:2)
Anyway, it will be nice to be able to get back to the old methods of multiple video cards, even if only the highest-end systems will be able to handle them. I really don't see this happening in too many systems based simply on the power draw. nVidia seems to think they can knock the power requirements on the new cards back to needing only a single rail, which is good, but even then the d
Re:I don't think the author really got it there... (Score:2, Insightful)
Re:How is this different than what I've been doing (Score:3, Informative)
You are running a bunch of video cards INDEPENDENT of each other. Clearly NOT THE SAME THING...
Re:Hello? Matrox, anyone? (Score:2, Informative)
Re:Hello? Matrox, anyone? (Score:2)
Frankly, though, I couldn't care less about using a pair of cards to crunch graphics faster. I'd rather have more displays. I want PCI Express so I can have a quad-head setup with all the screens running decently fast. The only way to do this now is with PCI cards which aren'