
Running Video Cards in Parallel

G.A. Wells writes "Ars Technica has the scoop on a new, Alienware-developed graphics subsystem called Video Array that will let users run two PCI-Express graphics cards in parallel on special motherboards. The motherboard component was apparently developed in cooperation with Intel. Now if I could only win the lottery."
This discussion has been archived. No new comments can be posted.

  • by Unloaded ( 716598 ) * on Thursday May 13, 2004 @09:54AM (#9138789)
    ...Microsoft announced that Clippy had broken the previously unheard-of 2,000 fps barrier.
  • by Randolpho ( 628485 ) on Thursday May 13, 2004 @09:55AM (#9138802) Homepage Journal
    PCI-Express? What happened to AGP?

    Seriously, I've been out of the PC market for too long. Alas, poor wallet. I had cash flow, Horatio.
    • by Laebshade ( 643478 ) <laebshade@gmail.com> on Thursday May 13, 2004 @09:59AM (#9138855)
      PCI-Express is meant to replace AGP. From what little I've read about it, it requires lower voltages than AGP and offers more bandwidth.
    • by Plutor ( 2994 ) on Thursday May 13, 2004 @10:03AM (#9138924) Homepage
      You've been out of the PC market for about a decade then, if you've never heard of PCI-Express. It's been proposed and talked about and raved about for years, but it's just now finally coming to market. The best thing is that it's not limited to a single slot per board! That's why this parallel thing is even possible.
      • by The_K4 ( 627653 ) on Thursday May 13, 2004 @10:28AM (#9139162)
        Since the PCI-Express spec defines switches (these are like PCI-to-PCI bridges, only with 2 sub-buses), a motherboard manufacturer could add 3 of these and get 4 PCI-Express graphics ports (or 7 and get 8 ports). The problem is that every time you do this you have to share the total bandwidth at the highest level. PCI-Express does have more bandwidth than AGP 8x, and 1/2 of that bandwidth is dedicated upstream, the other 1/2 downstream. So the downstream path (where video cards use most of their bandwidth) is greater than AGP 8x's TOTAL bandwidth. This data-path bottleneck shouldn't be bad if you have 2 cards (and might work well for 4 if they use the bus right).
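
        A rough back-of-the-envelope comparison of that claim, assuming first-generation PCI Express (~250 MB/s per lane, per direction) and AGP 8x at roughly 2.1 GB/s total; the figures are approximate peak numbers, not measurements:

          pcie_lane_mb_s = 250           # first-gen PCIe: ~250 MB/s per lane, per direction
          agp8x_total_mb_s = 2100        # AGP 8x peak, roughly 2.1 GB/s total

          x16_down_mb_s = 16 * pcie_lane_mb_s    # one x16 link: ~4000 MB/s downstream on its own

          for cards in (1, 2, 4):
              per_card = x16_down_mb_s / cards   # cards behind one switch share the uplink
              print(f"{cards} card(s): {per_card:.0f} MB/s downstream each "
                    f"(~{per_card / agp8x_total_mb_s:.1f}x AGP 8x total)")

        Even split two ways, each card still sees about as much downstream bandwidth as a whole AGP 8x slot, which is why two cards behind one switch shouldn't starve.
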
      • You've been out of the PC market for about a decade then, if you've never heard of PCI-Express. It's been proposed and talked about and raved about for years, but it's just now finally coming to market.

        He must have been out of the market for a decade, to have never heard of something which is only just now in the market? What?

        I've been half way out of the market for about five years, and I only recently heard of PCI-Express, and I didn't have many details about it. Researching new, not yet marketed t
      • by WuphonsReach ( 684551 ) on Thursday May 13, 2004 @02:47PM (#9142508)
        You've been out of the PC market for about a decade then, if you've never heard of PCI-Express.

        That's rather over-stating the case.

        Roughly 10 years ago, PCI was finally just supplanting EISA/VESA and ISA boards were still common.

        I build a few machines per year, and PCI-Express only just hit my radar screen in the past 12-18 months. Even today, I have yet to see mainstream motherboards or cards for it, so it's still rather ephemeral at this point.

        It is an interesting design. Whether or not it will live up to its promise remains to be seen.
    • by Auntie Virus ( 772950 ) on Thursday May 13, 2004 @10:07AM (#9138972)
      There's a White Paper on PCI Express from Dell: Here [dell.com]
  • Quad-screen? (Score:5, Interesting)

    by Vrallis ( 33290 ) on Thursday May 13, 2004 @09:55AM (#9138808) Homepage
    Hell, I couldn't care less about parallel processing for the video cards.

    I want tri-head or quad-head video, but with at least AGP speeds. You can do it now, but only with PCI cards getting involved.
    • Comment removed (Score:5, Informative)

      by account_deleted ( 4530225 ) on Thursday May 13, 2004 @10:02AM (#9138909)
      Comment removed based on user account deletion
    • Re:Quad-screen? (Score:3, Interesting)

      by gr8_phk ( 621180 )
      I'll second that. Flight simulation begs for 3 screens, as do some driving and other games.

      On another note, I suspect the only way it will really accelerate single images is in cases where render-to-texture is used. i.e. per-frame generation of shadow or environment maps. The completed maps could then be passed to the card that actually has the active frame buffer to be used in regular rendering. Two cards could at BEST double performance and nothing ever scales optimally.
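
      A small sketch of the best case for that render-to-texture idea: if a second card takes over the shadow/environment map passes and runs them a frame ahead, the frame time drops from main + maps to roughly max(main, maps), so the speedup is capped at 2x. The millisecond figures below are made up purely for illustration:

        def two_card_speedup(main_ms, maps_ms):
            """Best-case gain when a second card handles the render-to-texture
            passes in parallel, pipelined one frame ahead of the main card."""
            single = main_ms + maps_ms        # one card does everything in series
            dual = max(main_ms, maps_ms)      # passes overlap; the slower side dominates
            return single / dual

        print(two_card_speedup(10.0, 10.0))   # perfectly balanced split: 2.0x
        print(two_card_speedup(14.0, 6.0))    # maps are 30% of the frame: ~1.43x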

      • Re:Quad-screen? (Score:5, Interesting)

        by RhettLivingston ( 544140 ) on Thursday May 13, 2004 @11:09AM (#9139673) Journal

        As long as you can live with at least one of the screens not running quite as fast (maybe an informational type of screen as opposed to 3D scenery?), 3 screens is really easy today. Almost all decent AGP cards these days support 2 screens at 1600x1200. Throw in a good PCI card and you've got 3. I've been running this way for years and it works well. Actually, the PCI card isn't shabby.

        The only problem I encounter in Windows is an occasional tooltip coming up on the primary monitor instead of a secondary monitor. This is not the fault of the OS, rather the application is constraining the tooltip to be on the primary monitor by forcing it to be within the primary monitor's coordinates.

        Note that Matrox's single board AGP solution does not compete with this. Using a high end NVidia for the main two screens provides too much of a performance advantage to give up for Matrox's slow cards. Matrox's cards, even though on AGP, run about like the PCI cards.

        Regardless, when these systems become more available, I will be one of the first to put 2 video cards in and run 3 or 4 screens from my PCI Express system. But, though I like playing 3D games this way, I do it for the extra informational surface for programming. It greatly eases things to run your application on one screen and your development environment on all of the others so that you can see everything at once. And with 19" 1920x1440 monitors (which usually manage 1600x1200 with better focus than a native 1600x1200 monitor) running around $250 a pop, it's a very worthwhile investment.

    • Re:Quad-screen? (Score:3, Informative)

      by Lumpy ( 12016 )
      How's it feel to want? Especially about something that has been available for a decade now.

      here [matrox.com]

      Matrox, because all the other cards are merely toys for the kids at home.
      • I do know about what Matrox has. I used Matrox dual-head for years (G450, I believe), and it sucked--bad--under Linux, using their proprietary drivers or not. I eventually convinced work to let me get a low-end nVidia dual-head, and things greatly improved.

        While I do game at home, nothing I play can use dual-head, other than stretching--which is unplayable. Ever tried playing with your crosshair split in half by 4 inches?

        I probably will end up doing what others have suggested--just mix a PCI card in wi
        • Sucked for years, ok, but compared to what?
          Have a better option did you?

          If it weren't for Matrox, regardless of your own issues with drivers on your chosen platform, do you think there would be _any_ multi-monitor cards today? And even if there were, do you think they'd be up to the quality that they are now?

          Come on, at least give credit where it is due.

          Because of Matrox, you can now get a multi-monitor nVidia or ATI card that is _better_ than a Matrox card for your needs.
      • As other posters have pointed out, these cards are horrible at 3d rendering.
  • Press Release (Score:5, Informative)

    by Anonymous Coward on Thursday May 13, 2004 @09:56AM (#9138817)
    over here: clicky [alienware.com]
  • Voodoo (Score:5, Interesting)

    by Eu4ria ( 110578 ) on Thursday May 13, 2004 @09:56AM (#9138818)
    Didn't the early Voodoo cards allow something similar to this? I know they had a pass-through from your 'normal' video card, but I seem to remember the ability to run more than one, and they would each do alternating scan lines.

    • Re:Voodoo (Score:5, Informative)

      by scum-e-bag ( 211846 ) on Thursday May 13, 2004 @09:59AM (#9138851) Homepage Journal
      The company was 3dfx, and it was their Voodoo II cards that allowed the use of two cards a few years back, sometime around 1998 IIRC.
    • Re:Voodoo (Score:2, Interesting)

      by Trigun ( 685027 )
      Yes, they did. Unfortunately, at the time they were too expensive and took up all of your extra slots on your mobo. Now, with integrated everything, it's not so bad.

      Good idea implemented too early. Such is life.
    • Re:Voodoo (Score:3, Interesting)

      by naoiseo ( 313146 )
      without a special motherboard, yes.

      I think you could string something like 4 voodoo rush cards together or something (who knows if you got 4x performance, but I'm sure it went up not down)

      Problem was, by the time they put this out there, the tech it was running was months behind cutting edge. 4x something old is easily forgotten.

    • Re:Voodoo (Score:5, Informative)

      by UnderScan ( 470605 ) <jjp6893&netscape,net> on Thursday May 13, 2004 @10:01AM (#9138886)
      SLI - scan-line interleave - was available for 3dfx Voodoo IIs (maybe even the Voodoo 1), where the first card would process all the odd lines and the second card would process all the even lines.
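
      A toy illustration of that split (pure software, just to show which rows each card would own):

        SCREEN_HEIGHT = 8   # tiny 'screen' for illustration

        def rows_for_card(card_index, height=SCREEN_HEIGHT):
            """Scan-line interleaving: card 0 takes the even rows, card 1 the odd rows."""
            return [row for row in range(height) if row % 2 == card_index]

        print("card 0 renders rows:", rows_for_card(0))   # [0, 2, 4, 6]
        print("card 1 renders rows:", rows_for_card(1))   # [1, 3, 5, 7]
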
      • Re:Voodoo (Score:5, Interesting)

        by kamelkev ( 114875 ) on Thursday May 13, 2004 @10:07AM (#9138980)
        Voodoo was basically the beginning of the performance PC market, with tons of weird options and card types.

        Benchmarks for the old 3dfx V2 SLI can be seen here:

        http://www4.tomshardware.com/graphic/19980204/

        I was (and still am, although it's in the junk pile) a 3dfx V2 owner; the performance of that card was just amazing at the time. The Voodoo and the Voodoo2 definitely changed the world of 3d gaming.

        Also of interest is an API that came out much later for the 3dfx chipsets that actually let you use your 3dfx chipset (they didn't call it a GPU back in the day) as another system processor. If you were an efficient coder you could actually offload geometric and linear calculations to the card for things other than rendering. I can't seem to find the link for that though, it may be gone forever.

  • Light on Info (Score:2, Interesting)

    by the morgawr ( 670303 )
    The PR mess is light on information and I don't have flash to view their site. Can someone give some technical information? e.g. How does this work? What does it really do? What can a typical gamer actually expect (surely it doesn't just double your power by sending every other frame to each card)?
  • by cheese_wallet ( 88279 ) * on Thursday May 13, 2004 @09:58AM (#9138835) Journal
    I think it is great that a company has the will to do something like this, even if it doesn't catch on. It's cool to try something new, instead of just hanging back and doing the tried and true.

    I'll admit I haven't yet read the whole article, but even though it says that it isn't tied to any one video card, that doesn't say to me that it can have multiple disparate cards. If it is doing something along the lines of SLI, I would guess that the speeds would need to be matched between the two cards. And that would imply having two of the same card, whatever card the user chooses.

    But maybe not... maybe it's the advent of asymmetric multi video processing.
    • by jonsmirl ( 114798 ) on Thursday May 13, 2004 @10:17AM (#9139067) Homepage
      You can do this today with Chromium [sourceforge.net].

      Chromium replaces your OpenGL library with one that farms the OpenGL drawing out to multiple machines. It's how display walls [psu.edu] are built.

      You can use the same technique for multiple cards in the same box.
    • How about this:
      Each card will receive a number of scanlines to process according to its strength, making the rendering speeds similar, and after that the output is synced to the slower one.

      There are still problems with different features that might or might not be available on one of the cards such as pixel shaders. Also anti-aliasing can be weird.
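
      A quick sketch of that proportional split, assuming each card's 'strength' is just a relative benchmark number (the figures are invented for illustration):

        def split_scanlines(total_lines, strengths):
            """Give each card a contiguous band of scan lines proportional to its
            relative strength; any leftover lines go to the strongest card."""
            total_strength = sum(strengths)
            counts = [total_lines * s // total_strength for s in strengths]
            counts[counts.index(max(counts))] += total_lines - sum(counts)
            bands, start = [], 0
            for n in counts:
                bands.append(range(start, start + n))
                start += n
            return bands

        # e.g. a fast card and a slower one sharing a 768-line frame
        for i, band in enumerate(split_scanlines(768, [300, 180])):
            print(f"card {i}: lines {band.start}-{band.stop - 1} ({len(band)} lines)")
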
  • this isn't new (Score:5, Informative)

    by f13nd ( 555737 ) on Thursday May 13, 2004 @10:00AM (#9138877) Homepage
    Alienware didn't invent this.
    The PCI and PCI-Express specs have had this capability written in all along.
    AGP does too, but when was the last time you saw dual AGP slots on a mobo? (They do exist.)
    • Do any non-Mac dual AGP motherboards exist? If so, could you list some or all of them so that I can do some research? Thanks!
    • Re:this isn't new (Score:5, Informative)

      by BenBenBen ( 249969 ) on Thursday May 13, 2004 @10:25AM (#9139129)
      The AGP port spec lays it out; AGP is a preferred slot on the PCI bus, with four main enhancements (pipeline depth etc.) designed to... Accelerate Graphics. Therefore, if you had more than one PCI bus, you could technically have more than one AGP port. However, I cannot find a single motherboard that offers 2 AGP slots, including looking at numerous AV/editing specialists, where I'd expect this sort of thing to turn up.
    • AGP does too, but when was the last time you saw dual AGP slots on a mobo? (they do exist)

      Would you mind enlightening us as to where we might find such a board?
  • by liquidzero4 ( 566264 ) on Thursday May 13, 2004 @10:02AM (#9138893)
    So what technology did Alienware create here? None.

    So they have one of the first motherboards with two PCI Express slots. Big deal; soon motherboards will contain many PCI-Express slots. Hopefully a lot more than 2.

  • Oh, come on! (Score:4, Insightful)

    by Short Circuit ( 52384 ) <mikemol@gmail.com> on Thursday May 13, 2004 @10:04AM (#9138933) Homepage Journal
    All you really need is some way to copy the data in memory from one card to another.

    Easy solution? Several high-speed serial connections in parallel between the two cards. With a little bit of circuitry on the card dedicated to keeping the data identical.

    Or, with a little bit of a performance hit, you could keep each section of RAM separate, and route misses over the cables.
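
    A toy model of those two options, just to make the trade-off concrete (the 'link' here is an ordinary Python counter standing in for the inter-card cables; all names are invented):

      class MirroredMemory:
          """Option 1: every write goes to both cards, so all reads stay local."""
          def __init__(self, size):
              self.card = [[0] * size, [0] * size]
              self.link_transfers = 0

          def write(self, addr, value):
              self.card[0][addr] = value
              self.card[1][addr] = value     # the copy that crosses the inter-card link
              self.link_transfers += 1

          def read(self, card_index, addr):
              return self.card[card_index][addr]

      class SplitMemory:
          """Option 2: each card owns half the addresses; misses cross the link."""
          def __init__(self, size):
              self.half = size // 2
              self.card = [[0] * self.half, [0] * self.half]
              self.link_transfers = 0

          def read(self, card_index, addr):
              owner = 0 if addr < self.half else 1
              if owner != card_index:
                  self.link_transfers += 1   # the miss routed over the cables
              return self.card[owner][addr % self.half]

    Mirroring spends link bandwidth on every write; splitting spends it only on remote reads, which is the performance hit mentioned above.
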
  • by Gr8Apes ( 679165 ) on Thursday May 13, 2004 @10:04AM (#9138940)

    From the article: "The answers may have to wait until Q3/Q4". There are no performance numbers, no real statements of how it works, nothing much at all. Just wow, gee whiz, dual graphics cards in parallel. What exactly does "in parallel" mean? That's not even addressed.

    Some things I thought of immediately reading this, great - two displays each driven by a separate card, or, better yet, quad displays driven by two cards. Nope, not a word about either possibility. The implication of the PR/article is that 3D graphics will be processed faster. How? Do they have some nifty way of combining two standard off the shelf graphics card signals into a single monitor? (Hint, it's hard enough getting the monitor to properly synch up with a single high performance graphics card!)

    Since when does ArsTechnica merely regurgitate PRs? This was 99.999% vacuum.

    • A couple years back you could link two Voodoo2 cards together to run games at a higher resolution. It did require a special cable as I recall.

      Perhaps something like that?
    • >What exactly does "in parallel" mean?

      well this is Alienware so I think they mean that one card is on top of the other and both are perpendicular to the MB, hence they are running "in parallel" with each other.
    • Not really hard.... (Score:3, Informative)

      by Kjella ( 173770 )
      How? Do they have some nifty way of combining two standard off the shelf graphics card signals into a single monitor? (Hint, it's hard enough getting the monitor to properly synch up with a single high performance graphics card!)

      Duplicate the data stream (should be doable in hardware), have them render half each (every 2nd scanline?) and merge them with a trivial buffer (keep two bools: firsthalf=done/not done, secondhalf=done/not done). You'd limit yourself to the minimum of the two, but since they eac
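
      A minimal sketch of that merge step (two flags, and the frame is presented only once both halves have landed); the class and method names are made up for illustration:

        class MergeBuffer:
            """Waits for both cards' halves before flipping the frame to screen."""
            def __init__(self):
                self.first_half_done = False
                self.second_half_done = False

            def half_finished(self, which):
                if which == 0:
                    self.first_half_done = True
                else:
                    self.second_half_done = True
                if self.first_half_done and self.second_half_done:
                    self.present()

            def present(self):
                print("frame complete - flip to screen")
                self.first_half_done = self.second_half_done = False

        buf = MergeBuffer()
        buf.half_finished(0)   # the faster card finishes its half first
        buf.half_finished(1)   # the slower card arrives; only now is the frame presented
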
    • E3 is on this week. E3 is dominated by product announcements, including products that won't see the light of day for years. So, the "vapor" aspect of it is pretty much what happens at most of the trade shows. By that measure, half of the news out this week is "vapor." Sure, some people might blow smoke and tell you about the performance stats, but we all know that when the products actually ship, their "performance stats" will probably have changed as well.

      So, you have the press release to go on. And
  • by CodeMonkey4Hire ( 773870 ) on Thursday May 13, 2004 @10:05AM (#9138945)
    for Longhorn [slashdot.org]?
  • In fact all the first generation PCI-Express chipsets only support one x16 PCIe for graphics controller.

    I doubt that Intel is going to make a 2 port one especially for Alienware.

    So I expect it means that the second graphics card is plugged into a x4 or x1 PCIe connector.

    Anyway, this is nothing special, it is all part of the specification. Hell, you could have two AGP v3 slots in a machine working at the same time - how do you think ATI's integrated graphics can work at the same time as an inserted AGP ca
    • Are you quite sure that ATI's integrated graphics are AGP based and not on the PCI bus? Last I checked, there was not a publicly available chipset that could handle more than one AGP video device in a system at a time. If it were possible, I'd throw out my nVidia card today and worship at the altar of ATI... I fear though that you must be mistaken.
      • Anyone with that hardware could tell us.

        Under Linux, run "lspci" as root, and see if the two cards are on different PCI buses.

        You can do something similar under Windows XP:

        Go to the device manager, and look at the Location field of your two video devices. The box I'm on only has one, but here's what an AGP card's location field looks like: "PCI bus 1, device 0, function 0"
    • In fact all the first generation PCI-Express chipsets only support one x16 PCIe for graphics controller.

      Have you seen anything talking about second-generation chipsets that support two 16x PCI-express connectors?

      This is what I want and I'm not getting a new computer until it happens.
    • I thought PCI Express is a bus, like the current PCI; even with one slot on the motherboard, you could connect multiple devices in parallel into it.

      If it's not a bus but a port, I don't see how it's radically better than AGP.

  • When Windows 98 came out, there was a new feature (that before had pretty much been limited to Matrox cards with a special driver) that would let you use multiple PCI and AGP video cards in the same motherboard with multiple monitors. At first glance, this seems like pretty much the same idea.

    The article seems to claim that the cards will be able to split processing duties, even if they're not from the same manufacturer. That particular claim seems very dubious to me for some reason. Other than integrating
  • Metabyte PGC (Score:3, Informative)

    by Erwos ( 553607 ) on Thursday May 13, 2004 @10:14AM (#9139051)
    It looks like the same thing as Metabyte PGC - and Alienware was supposed to be the roll-out partner for that.

    Nothing wrong with it, though - PGC actually did work, and was previewed independently by several people (I think Sharky?).

    -Erwos
  • by MtViewGuy ( 197597 ) on Thursday May 13, 2004 @10:23AM (#9139111)
    I think the big question we need to ask is do we really need multiple monitor setups?

    Besides the obvious issue of hardware cost of multiple graphics cards and multiple monitors, you also have to consider desktop space issues. Even with today's flat-panel LCD's, two monitors will hog a lot of desktop space, something that might not be desirable in many cases.

    I think there is a far better case for a single widescreen display instead of multiple displays. Besides hogging a lot less desktop space, widescreen displays allow you to see videos in the original aspect ratio more clearly, and also allow for things like seeing more of a spreadsheet, a clearer preview of work you do with a desktop publishing program, and (in the case of a pivotable display) easier reading of web pages and/or single-page work with a DTP program. Is it any wonder people liked the Apple Cinema Display, with its approximately 1.85:1 aspect ratio, so much?
    • Two 4:3 displays can be bought at a lower cost than one widescreen display.
      • Two 4:3 displays can be bought at a lower cost than one widescreen display.

        I agree with that, but the desktop space hogged by two 17" LCD monitors is surprisingly large, far more than what you get with the Apple Cinema Display.

        Besides, with large-scale manufacturing of widescreen LCD's the cost would come down very quickly. Remember, most of today's latest graphics cards can easily add display drivers that can support something akin in aspect ratio to the Apple Cinema Display (they're already part way th
    • So, uh, why not multiple widescreen displays?
    • Even with today's flat-panel LCD's, two monitors will hog a lot of desktop space, something that might not be desirable in many cases.

      If you had left that third paragraph off of your post, I would have rated it Funny. (I thought you were talking about 'desktop' in terms of the root window.)

      Hell yes, some of us NEED [tjw.org] multiple monitor setups. I've been using a dual monitor setup for about 5 years, and although I could get by with one, it would make my day-to-day work much more annoying. It would be lik

      • The only places I know where multiple monitor setups are a good idea are in CAD/CAM work (where seeing multiple views of an object being drawn is a good idea) and in the financial industry (where seeing multiple real-time charts and other financial data are a must for equities traders).

        Besides, most games are designed with the assumption that you're using only one monitor. Wouldn't it be better for a game to take advantage of a wider aspect ratio display so the view becomes a bit more realistic?
        • I've been using 2 monitors for years now and I have to say that once you've used them for a while you won't go back.

          Right now I'm posting to Slashdot on my secondary monitor and playing EVE Online on my primary. While I've used the two-monitor setup for web design and programming, you will find uses beyond the standard CAD/programming argument really fast once you've got it in front of you.
    • Apple's is 16:10. Not that I quite figured out why, but TVs seem to be 16:9, Monitors 16:10. Now, if only there was HDTV content near me (HDTV broadcasts? None of the national broadcasters at least. HD-DVD? Still up in the blue. HDTV newsgroups *cough*? Too large for a measly broadband line).

      I just hope that we in Europe can *please* have HDTVs and none of those fucking stupid region codes, yes? Or do I have to wait for DeCSS2 before I can buy any of the special offers??? (ever notice how those with no cod
    • "I think the big question we need to ask is do we really need multiple monitor setups? Besides the obvious issue of hardware cost of multiple graphics cards and multiple monitors, you also have to consider desktop space issues."

      Ironic you mention that on the day that my development computer at work is moved onto a new desk to cope with the 4 monitors (for 4 PCs, admittedly), while our game-playing PCs are filled with multiple-output graphics cards, and with outputs serving multiple monitors. Oh, and the
  • The real question (Score:5, Interesting)

    by 241comp ( 535228 ) on Thursday May 13, 2004 @10:26AM (#9139144) Homepage
    Is this compatible with Brook [stanford.edu] and other general-purpose GPU [gpgpu.org] programming techniques? The use I see for it is this:

    Imagine an openmosix cluster of dual-processor machines that run bioinformatic calculations and simulations. Lots of matrix math and such - pretty fast (and definitely a lot faster than a single researcher's machine).

    Now imagine the same cluster, but each machine has 2 or 4 dual-head graphics cards and every algorithm that can be written in Brook or similar is. That gives each machine up to 2 CPUs and maybe 8 GPUs that may be used for processing. The machines are clustered, so a group of ~12 commodity machines (1 rack) could have 24 CPUs and 96 GPUs. Now that would be some serious computing power - and relatively cheap too (since 1-generation-old dual-head cards are ~$100-$150).
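
    Tallying that up (the per-machine figures are the parent's assumptions, not measured numbers):

      machines = 12            # one rack of commodity boxes
      cpus_per_machine = 2
      cards_per_machine = 4    # dual-head cards at roughly $100-150 apiece
      gpus_per_machine = 8     # the parent's upper estimate of usable GPUs per box

      print("CPUs :", machines * cpus_per_machine)    # 24
      print("GPUs :", machines * gpus_per_machine)    # 96
      print(f"cards: {machines * cards_per_machine} at ~$125 each "
            f"-> ~${machines * cards_per_machine * 125} for the graphics hardware")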

    By the way, does anyone know if there is any work going on to create toolkits for Octave and/or MatLab which would utilize the processing power of a GPU for matrix math or other common calculations?
    • Wired carried a story on this very topic last year (or maybe the year before). I don't have the issue handy. It was nicely done, and was the first inkling I'd had that these GPUs were some serious hardware looking for a problem to solve.

      It would be interesting to see some of the distributed computing efforts (seti@home, etc) take advantage of GPUs if there is anything there of use.
  • by MoZ-RedShirt ( 192423 ) on Thursday May 13, 2004 @10:35AM (#9139243)
    History repeating: Who can (or can't) remember [tomshardware.com]
  • Discrete parallel graphics processing has been around for a while. The most notable example of it is probably 3dfx and their Voodoo 2 cards. However, there's a problem with this tactic, namely in the "diminishing gains" department.

    So here's the question:

    -How is pixel processing going to work? For a given frame, there is vertex and texture information, as well as the interesting little shader routines that work their magic on these pixels. How are you going to split up this workload between the 2 GPUs? You can't split a frame up between the GPUs; that would break all texture operations and there would be considerable overhead with the GPUs swapping data over the PCI bus. *MAYBE* having each GPU handle a frame in sequence would do the trick, but, again, it's a dicey issue.
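
    A sketch of that 'each GPU handles a frame in sequence' idea (alternate-frame rendering), assuming the full scene data can simply be fed to both cards so neither has to fetch the other's textures mid-frame:

      def dispatch_frames(frame_count, gpu_count=2):
          """Alternate-frame rendering: frame N goes to GPU N % gpu_count.
          Every GPU still needs the complete scene data for its frames."""
          schedule = {}
          for frame in range(frame_count):
              schedule.setdefault(frame % gpu_count, []).append(frame)
          return schedule

      for gpu, frames in dispatch_frames(8).items():
          print(f"GPU {gpu} renders frames {frames}")
      # GPU 0 renders frames [0, 2, 4, 6]
      # GPU 1 renders frames [1, 3, 5, 7]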

    It would appear to me that this dual-card graphics rendering is quite similar to dual-GPU graphics cards. Except, where on a single graphics card you can handle cache/memory coherency and arbitration logic easily due to the proximity of the GPUs, with this discrete solution you run into the problem of having to use the PCI Express bus, which, as nice as it is, is certainly not that much faster than AGP.

    So I say, more power to you, Alienware. If you can pull it off with Nvidia, ATI et al., great. It's too bad the cynical side of me thinks this idea reeks of those blue crystals marketing departments love :)
  • They might be able to make this work for games, but I'm personally more interested in the simple fact that Intel chipsets will support dual PCI Express graphics buses. Hopefully this will be possible on a reasonably priced mobo.

    Having 3 slots would be ideal but I won't say no to 2 GFX cards so I can drive two monitors from two independent graphics cards at last.

    It doesn't say how this technology will combine the two cards and whether it will need software support from the games. Hopefully it won't but the
    • "Having 3 slots would be ideal but I won't say no to 2 GFX cards so I can drive two monitors from two independent graphics cards at last."

      What's stopping you from plugging an extra PCI-based card in today? I've been running two monitors from two independent graphics cards for years.
  • But will they run? (Score:3, Insightful)

    by strredwolf ( 532 ) on Thursday May 13, 2004 @10:57AM (#9139523) Homepage Journal
    Okay, we have a GeForce and a Radeon in parallel. What's the communication protocol that's running over PCI Express that allows them to do that?

    Something tells me you need special drivers AND/OR a standardized graphics card accelerator protocol just to pull it off, otherwise you're stuck with two of the same cards.
  • Let us not forget that nVidia and ATI both produce chipsets. Multiple graphics card purchases per system would be a dream for them, and they can help in a direct manner. Although there are not many (read as 'maybe a dozen worldwide') boards with dual AGP, the PCI-Express standard will lead to much easier multi-GPU setups. Also, the newest ATI chipsets with embedded GPUs support multi-monitor if an ATI card is used in the empty AGP slot, so you know that these guys already have to have agendas for PEG in
  • by 89cents ( 589228 ) on Thursday May 13, 2004 @11:58AM (#9140323)
    Ok, so they haven't explicitly said so, but they have been hinting at it [legitreviews.com]

    Ati's Terry Makedon says: "Something big is coming for CATALYST in the next 2-3 months. It will take graphic drivers to a brand new level, and of course will be another ATI first. It will be interesting to see how long before other companies will copy the concept after we launch it."

    Hmmm... just in time for PCI Express, and it's not something specific to ATI's hardware.

  • There's also an in depth [homelanfed.com] interview up at HomeLAN, which talks more about the specs of the X2 systems, along with how Alienware's going to handle powering and cooling the beast.
  • This is just a motherboard with two slots for graphics boards. Period. This is not about somehow using two GPUs on separate cards to run one display faster.

    It's possible to design and build GPUs that will play together to provide higher performance graphics. The Apple 3D Quickdraw Accelerator Card [kicks-ass.net], from the early PowerPC days, does exactly that. If you get two, drawing speed nearly doubles. That device was more of a coprocessor, closer to the CPU than a modern GPU. It didn't drive the display; it jus

  • by default luser ( 529332 ) on Thursday May 13, 2004 @12:58PM (#9140983) Journal
    And no, I'm not referring to SLI, which was specifically designed to pair two Voodoo 2s together. I'm talking about technology that can bridge any two cards together. This is nothing more than the complex bridging involved in say Metabyte's TNT 'SLI' [sharkyextreme.com] solution that consisted of a PCI bridge and software to split the framebuffers. It was never released for two reasons

    1. GeForce 256 released shortly after this was announced.
    2. PCI bridge required both the AGP and the PCI card to operate in PCI DMA mode. Unfortunately, there never was such a thing as an "AGP bridge".

    In any case, other companies have now successfully implemented a simple framebuffer-splitting concept on-card, where the bandwidth is more plentiful. The ATI Rage Fury MAXX and the 3dfx VSA-100 come to mind; these chips simply split the framebuffer rendering according to complexity. Beyond that, NOTHING was shared - triangle and texture data were replicated for each chip.

    The key to this: on the software side in 3D mode their software automatically splits two framebuffers between the two cards. As for the "special" chipset, whatever scene data is sent to one video card, the same data is sent to the other video card. I can't imagine it being any more complex than this.
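
    A guess at what that could look like in software: send the same scene to both cards, give each a horizontal slice of the screen, and nudge the split line each frame toward whichever card is slower. This is only a sketch of the general technique, not Alienware's actual implementation, and the names are invented:

      def rebalance_split(split_y, time_top_ms, time_bottom_ms, height=1200, step=16):
          """Move the screen split toward the slower card so both finish together.
          split_y is the first scan line owned by the bottom card."""
          if time_top_ms > time_bottom_ms:
              split_y -= step        # shrink the top card's share
          elif time_bottom_ms > time_top_ms:
              split_y += step        # shrink the bottom card's share
          return max(step, min(height - step, split_y))

      split = 600                    # start with an even split of a 1200-line frame
      for top_ms, bottom_ms in [(9.0, 6.0), (8.5, 6.5), (8.0, 7.0)]:
          split = rebalance_split(split, top_ms, bottom_ms)
          print("bottom card now starts at row", split)
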
  • I hope this technology doesn't delay Doom III any more. John Carmack will probably want to program in full support for his new engine before it ships, because he has to make money off licenses until he makes the next one.
  • Graphics algorithms are some of the easiest on earth to distribute and run in parallel. I'm surprised that this hasn't been more popular already.
