Sneak Peek at ATi's CrossFire Graphics System

Kez writes "While at Computex in Taipei, HEXUS.net grabbed some benchmarks of an ATi CrossFire-powered system. They have since had the chance to reconstruct a similar system and run the same benchmarks with other cards and configurations to give us an idea of how CrossFire will perform. Obviously, CrossFire's performance will almost certainly change before release time, but at the very least the article provides an idea of what to expect. Interestingly, from these tests it looks like Nvidia's SLI may remain top dog for graphics performance."
  • CrossFire (Score:1, Funny)

    by Jeet81 ( 613099 )
    I thought Crossfire was a trademark of Chrysler. Or... don't they have a patent on Crossfire? If not, I should get one.
    • Top Gear's Clarkson gave Chrysler's Crossfire a review that was nothing short of disappointing.

      I'd rather choose ATI's, thanks :)
    • Re:CrossFire (Score:4, Informative)

      by DreadCthulhu ( 772304 ) on Sunday July 10, 2005 @12:01PM (#13026996)
      Trademarks are generally only valid for one sort of product, and exist to prevent consumer confusion between the makers of similar products. If two companies/people make different products that are not likely to be confused, they can have the same name for their trademarks: for example, Apple Records and Apple Computer. But if you tried to start a computer company called "Appletastic Computers", Apple Computer could sue you and probably win. Or for a real-life example, think of the Lindows case, and how that name could get confused with the MS Windows OS.
      • Whoa, if that's true, whatever happened to StarCraft (the boat) and StarCraft (the game)? I could've sworn Blizzard sought and won a legal victory over that. Or something.
      • So say someone creates a new dog crap scooper and names it the 'crap picker upper' at first. The inventor doesn't like Ford vehicles very much and generally thinks they are crap, so he renames it the 'Ford picker upper', and then for laughs officially names it the Ford Pick-Up. Ford (the auto company) has no way of stopping that? Wow, sweet!
        • They do; it's called trademark dilution. It only applies to famous marks, though. Ford is famous, so they can probably prevail.
      • what about crapple computers?

        just kidding. don't kill me.
    • Don't worry, by the time it hits the market the name will be changed to Foxfire... er... um... no, Crossfox, that's it, and the logo will be a cross-dressing fox. I can't wait, can you?
    • Re:CrossFire (Score:2, Insightful)

      I thought it was a trademark of Milton Bradley.

      ATi should borrow their brilliant advert's song:
      Crossfire, you'll get caught up in the... Crossfire!

    • Whew... and here I thought Crossfire was a show on CNN. [cnn.com] I guess the good names are all used ;)

  • by Saiyine ( 689367 ) on Sunday July 10, 2005 @11:49AM (#13026934) Homepage
    Will it cost more than the computer alone?
  • architecture (Score:4, Interesting)

    by imboboage0 ( 876812 ) <imboboage0@gmail.com> on Sunday July 10, 2005 @11:51AM (#13026941) Homepage
    It would be interesting to compare diagrams of the architectures that SLI/Crossfire use to see why one would be better than the other.
    • They reviewed the technology behind CrossFire a month or so ago.

      From what I saw, I think CrossFire is going to be better: it might have a little less performance than nVidia's SLI, but it seems like it will be a LOT more compatible with existing and new games.

      AND, you don't have to match boards. So you can have an X850 from Company A, use it for a year, then get the CrossFire card from Company B, slap it in, and you're good to go. No compatibility issues.
  • Wait and see (Score:4, Informative)

    by Radicode ( 898701 ) on Sunday July 10, 2005 @11:59AM (#13026976)
    That motherboard they used for testing looks like a monster! 8 SATA connectors... I don't want to think about the noise produced by 8 HDs spinning.

    Anyway, as with any ATI product... it's better to wait for the final release before declaring it a winner or a loser. I tested many beta revisions of their TV Wonder USB2 and I saw performance change with every release, sometimes for the better, sometimes for the worse.

    -Radicode
    • That motherboard they used for testing looks like a monster! 8 SATA connectors... I don't want to think about the noise produced by 8 HDs spinning.
      This guy [k04.com] doesn't seem to mind.
      • How did he assemble that with such hairy palms?

        Seriously though, the noise from any system with a large number of hard drives is going to be mainly from the fans cooling them. Stacking up 8 drives in a computer without properly cooling them is asking for trouble.
      • OMGWTFBBQ! No girlfriend will allow such a thing in the bedroom... oh wait, this is Slashdot, nobody has a girlfriend!
    • Actually, less than you might think...
      While fans have become louder and louder, every generation of hard discs has become quieter (with exceptions).

      I actually HAVE a system here with 8 HDs, and it's quiet enough that it isn't noticeable unless you actually try to hear it.
      (They're Samsungs, decoupled from the case because they sit on hot-swap trays.)

      And even 8 loud HDs (like the Maxtors) would still be less loud than one of those turbo GPUs with a stock fan.
    • Hard drives are really not that loud anymore. My 2TB RAID has 10 SATA drives, and though you can sometimes hear them seeking when you listen closely, the fans dwarf the drives in actual sound. The heat is pretty nasty, though. Consumer RAID-5 is becoming more and more common nowadays; I mean, how do you realistically back up 500GB of HD space?

      Also, from my experience ATI tends to have large variations between driver releases. Their flagship driver releases are handled by a different team, and have made
  • by CyricZ ( 887944 ) on Sunday July 10, 2005 @11:59AM (#13026980)
    Are these cards compatible with SGI Prism systems? The current SGI Prism systems appear to include an ATI FireGL card.

    http://www.sgi.com/products/visualization/prism/ [sgi.com]

    • by Anonymous Coward
      Who cares? SGI is dead anyway
      • That's what people said about Apple, too. And now Apple is quite undead. SGI may very well turn themselves around.
  • Bias? (Score:3, Informative)

    by LTC_Kilgore ( 889217 ) on Sunday July 10, 2005 @12:00PM (#13026990)
    In the last test (3DMark05 - 1280x1024 4xAA 16xAF), they are running the Nvidia cards at 4x Anti-Aliasing, while the ATI cards are running at 6x.

    • Re:Bias? (Score:3, Informative)

      by MightyPez ( 734706 )
      At first glance that is always the conclusion people draw. But Nvidia and ATI use different methods of anti-aliasing depending on the level that is chosen. In Nvidia's case they use super-sampling, multi-sampling, and sometimes both. ATI uses multi-sampling only. The result is that the same level of AA on each card will produce different visual results.

      This article [xbitlabs.com] goes into depth about the FSAA issue between ATI and Nvidia. Look at page 12 [xbitlabs.com] and beyond for the full poop.
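
      A rough way to see why "the same AA level" isn't the same amount of work (a back-of-the-envelope sketch, not either vendor's actual pipeline): super-sampling runs the pixel shader for every sub-sample, while multi-sampling shades once per pixel and only keeps per-sample coverage/depth.

      ```python
      # Toy cost model for super-sampling (SSAA) vs multi-sampling (MSAA).
      # Purely illustrative; real GPU pipelines are far more complicated.

      def ssaa_shading_cost(width, height, samples):
          # SSAA: the pixel shader runs once per sub-sample.
          return width * height * samples

      def msaa_shading_cost(width, height, samples):
          # MSAA: the pixel shader runs once per pixel; only coverage and
          # depth are evaluated per sample, so shading cost stays flat.
          return width * height

      w, h = 1280, 1024
      for n in (4, 6):
          print(f"{n}x: SSAA shades {ssaa_shading_cost(w, h, n):,} samples, "
                f"MSAA shades {msaa_shading_cost(w, h, n):,} pixels")
      ```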
      • Re:Bias? (Score:2, Insightful)

        by LTC_Kilgore ( 889217 )
        I believe that has changed with the modern lineup of ATI cards. I could be wrong, but I think the differences in AA quality were more prominent with the older-generation cards from both manufacturers.
    • Re:Bias? (Score:1, Informative)

      by Anonymous Coward
      I don't think Nvidia cards CAN run that 6x mode, it's ATI only. Perhaps these were driver defaults? As in, the driver defaults one is supposed to have set for "official" 3dMark tests?
      • Re:Bias? (Score:2, Insightful)

        by LTC_Kilgore ( 889217 )
        I don't think Nvidia cards CAN run that 6x mode, it's ATI only. Perhaps these were driver defaults? As in, the driver defaults one is supposed to have set for "official" 3dMark tests?
        Then why not run the ATI cards at 4x like the Nvidia ones?
        • Re:Bias? (Score:2, Informative)

          by DeathByDuke ( 823199 )
          Why not? Cos ATi cards are more efficient at FSAA than nVidia's cards. If you look at all the benchmarks, ATi takes less of a hit at 4x AA compared to nVidia. In fact, in some cases you get the same hit with 6x on ATi as with 4x on nVidia... they're just balancing it out.
          • Re:Bias? (Score:2, Insightful)

            by LTC_Kilgore ( 889217 )
            The point of a benchmark is not to 'balance things out'. The ATI card is pushing more pixels in 6x mode. Period. Whether it takes a bigger performance hit or not isn't the issue; it's doing calculations on more pixels, which will adversely affect its score.
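
            To put a number on "more pixels" (simple sample arithmetic at the review's 1280x1024 setting; a sketch that ignores everything else the benchmark measures):

            ```python
            # Raw AA samples touched per frame at 1280x1024.
            width, height = 1280, 1024
            for aa in (4, 6):
                print(f"{aa}x AA: {width * height * aa:,} samples per frame")
            # 6x touches 50% more samples than 4x, so an "equal" frame-rate
            # hit still means the 6x card is doing substantially more work.
            ```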
  • ...to get twice the money from customers with little performance gain.

    Further, NVIDIA drivers are (still) quite unstable (on Linux at least) with just one video card, and I was told that ATI drivers are even worse. I can't even imagine how often they will crash on an SLI/CrossFire system...

  • by taskforce ( 866056 ) on Sunday July 10, 2005 @12:04PM (#13027014) Homepage
    They tested it on 3DMark... that's totally irrelevant to anyone looking to buy the card; Nvidia are notorious for optimising their drivers for synthetic benchmarks, meaning Nvidia cards almost always perform much better in tests like 3DMark, but when you get the cards into a game, anything can happen.
    • Re:3Dmark (Score:2, Informative)

      by softends ( 886321 )
      Both sides are notorious for cheating, but the 3DMark05 benchmark is generally a good indicator of gaming performance.
      • Re:3Dmark (Score:5, Insightful)

        by Anonymous Coward on Sunday July 10, 2005 @12:32PM (#13027139)
        The 3DMark05 benchmark is generally a good indicator of 3DMark05 performance. Any similarity to real gaming is coincidental.
        • Re:3Dmark (Score:2, Informative)

          Why was this modded funny? It is completely true. 3DMark scores have almost no bearing on real-world game performance.
          • Re:3Dmark (Score:3, Interesting)

            by ionpro ( 34327 )
            That's what people bitched about when 3DMark03 came out and gave advantages to ATI's cards. Guess what? Those 9700 Pros really did blow away the 5800 Ultras in next-gen games. 3DMark is much maligned, but I think they've done an excellent job of trying to predict the future 2 years in advance and figure out what games will be using then. Can it be perfect? No. Will the performance deltas differ between 3DMark and current-gen games? Absolutely. But when Unreal Engine 3 comes out, go back and look at those 3dma
        • In the same vein...

          Half-Life 2 benchmarks are generally good indicators of Half-Life 2 performance. Any similarity to Doom 3, Far Cry, blah blah blah, is coincidental.

          Synthetic benchmarks are useful at times. Ideally 3DMark05 would be most useful for seeing what your graphics card would do with only driver-level optimizations, not application-level optimizations (which nVidia fully admits to doing and ATI has committed not to do). This would be most useful for knowing if some obscure game that you like would
          • The problems with 3DMark are manifold: it uses a very similar engine for all of its gaming tests.

            It gets rid of certain aspects of graphics overhead that create huge performance hits in real games.

            It uses new techniques at levels that real games moderate to keep frame rates reasonable; things that aren't moderated, like polygon count, anisotropic filtering, and OpenGL vs DirectX issues, need to be its focus instead.
    • ATI are equally notorious for optimising drivers for synthetic benchmarks.
      http://www.theregister.co.uk/2003/05/27/ati_admits_it_optimised_drivers/ [theregister.co.uk]
      They both do it, and it makes artificial tests about as stupid as paying 900 quid for graphics cards you know will be outdone next year.
      • Yeah, but Nvidia shit all over image quality to do it.

        ATI and Nvidia both target major games to offer good performance; it's just that doing it to a benchmark is shady.
  • If the predictions are accurate, these tests will be meaningless when the R520 based card from ATI is released. The comparison that matters in the uber-high end will be the 7800GTX in SLI vs. R520 in Crossfire.
  • When ATI first announced their CrossFire solution, hyping the fact that you could use older cards alongside newer generations for improved performance, I thought it was a great idea. Spending $500+ on a video card today, only to have it replaced a year or two later, is kind of a waste, but if it could still contribute to improved gaming performance, then I could see spending the money.

    Then details about CrossFire came out. It requires using only CURRENT generation ATI cards, the X850 and X800, a

    • With nVidia's SLI, sure you need 2 expensive and matching cards to work, but that is it, you don't need any specialized motherboards.

      I hate to be the one to burst your bubble, but whether you realise it or not, SLI itself is a "specialized" motherboard requirement. So-called "normal" motherboards come with a single slot for either an AGP or PCI-E solution. SLI itself is still considered "special", at least to those who consider the need to swap to an SLI-based motherboard an unnecessary expense.
    • With nVidia's SLI, sure you need 2 expensive and matching cards to work, but that is it, you don't need any specialized motherboards. I think this will be CrossFire's major downfall, the requirement for specialized hardware, especially if VIA decides not to make their own CrossFire compatible chipset.

      You very much do need a specialized motherboard to run nVidia SLI. It requires two full-length PCI-E expansion slots, and an nVidia-manufactured chipset to boot.
      • I don't think it's that you need an SLI-capable MB so much as you need two full PCI-E slots. About the only systems that have these are nForce-based motherboards marketed for "SLI". Do you know of any motherboards without nVidia chipsets that have more than one PCI-E 16x slot? If so, has anyone tried to run SLI on them?
        • Well, licensing according to Nvidia says you do need a specialised SLI motherboard. DFI and some other manufacturers got into a little trouble over this. DFI's nForce4 Ultra-D [anandtech.com] could originally be modified to run in SLI mode without an SLI bridge. Nvidia quickly had DFI remove that "flaw" in later revisions and blocked out the bridgeless "feature" in their later drivers.

          Funnily enough, the block looks to have been disabled in the newest betas. Some speculate it is because of pressure from ATI's CrossFire.
      • Incorrect. SLI can and does run on some Intel chipsets, at the minimum. This was the original place SLI worked, before it ever worked on an Nvidia chipset (at least outside Nvidia). (The article is from prior to the actual introduction, but that was one of the boards which worked.) http://www.xbitlabs.com/articles/video/display/geforce6-sli_3.html [xbitlabs.com]
    • had to finish your sentence:

      It seems like next-generation video cards are already boasting the capability to outperform current-generation dual-card systems with only ONE GPU. Wasting $1000+ on a dual system today, only to find out that a $500 video card 6 months from now outperforms it, would be quite disappointing.

      ... for GAMING.

      FYI, CrossFire was originally designed for commercial OpenGL flight simulators.

      However, as one of those customers with, as a good friend once told me, "more money than sense"

  • .... Like two months ago?

    Here are benchmarks from a real journal: http://www.anandtech.com/video/showdoc.aspx?i=2432 [anandtech.com]

    HJ
  • In the future, the general-purpose CPU may be on a card, and the graphics component may be the motherboard, consisting of multiple GPUs.

    From a financial standpoint, it is already that way today for the computer gamer teens want.

    For example, on a computer I recently built for my cousin, the motherboard, CPU, and RAM came in TOTAL under $400, but the graphics card was around $450. If you draw a conceptual diagram showing the size/area of the components as representative of cost, the graphics card is much l
  • nVidia's solution requires changes to your game's codebase and build in order to function. AFAIK, ATI's solution will work on ANY game old or new with zero changes. That's a huge advantage.
    • Not true. nVidia's solution has them creating profiles for popular games for them to function, though I believe you can add SLI awareness to a game to increase its support.

      My current release has at least a hundred games, and there aren't that many popular games out there that need this kind of graphics firepower.
      • Aaah, my mistake. When SLI was first introduced, hardware sites were saying that this was the case. I'm glad to see that it has been changed to a profile system. Profiles look a little hairy, but it appears there are some free apps/utilities that will help you out with nVidia's.

        My apologies to those who read my comments.
  • by Travoltus ( 110240 ) on Sunday July 10, 2005 @01:46PM (#13027518) Journal
    It looks like those video cards overlap not just one, but TWO empty expansion slots that I could use for other cards!

    http://img.hexus.net/v2/features/dfi_crossfire_computex/images/crossfire_big.jpg [hexus.net]

    This is why I have avoided upgrading to this new generation of cards... I have the lowly 6600 now, and that's going to be it, perhaps. I don't like onboard sound (I prefer my Audigy 2, especially for Linux); thank God for the onboard USB, FireWire and NIC, though. I have a video capture card and a SCSI card for legacy stuff, and there'd be no room for those two cards in any PCI-E system I'd upgrade to... they all come with fewer slots now.
    • I think you can attribute this to the low quality of the DFI motherboard layout; I've always despised their layouts. nVidia's motherboards manage it safely, so I'm sure ATI's will accomplish the same.
  • drivers.. (Score:1, Interesting)

    by Hamelius ( 898716 )
    My luck with ATI has been pretty bad. I have installed 3 cards, all of them different models, always having driver problems and crashes. With nVidia I had 4 with no problems on any of them, but maybe that's just me...
    • It's been that way for me as well. Every ATi I've owned (4 of them over the past 6 years) has given me tremendous problems. I even had a friend with an ATi card get his whole computer fried. He leaves his computer on all the time, and the fan on the card fell off. The card boiled and died, and the resulting surge took out every piece of hardware on his computer except for the memory. Hard drives, sound card, network card, and the motherboard all had to be replaced. Granted, this could have happened wi
      • Comment removed based on user account deletion
        • Actually, the last ATi card I owned was a Radeon 9700, and before that an ATi Radeon Mobility in a laptop.
          The 9700 had issues locking up whenever I started a 3D app (fresh install of Windows/DirectX/latest drivers from ATi), so I took it back and said it was faulty.
          The Mobility was just crap: slideshow performance, texture issues, rendering in the wrong order so that some things you could see through and some you couldn't. Luckily I wasn't using it for gaming much, but I was hoping to have something I could t
  • I've had a number of Nvidia cards (TNT, GF2, GF3 Ti200), then I decided to try ATI and got a Sapphire 9500; it had some driver difficulties and video corruption on boot-up sometimes. I got a 9600 AIW PRO, and it was pretty good, except the Guide Plus software ATI supplies with it SUCKS, and MMC likes to lock up the whole computer by crashing now and again. After about 11 months the AIW's TV tuner apparently decided to die on me, so I had to RMA it (ATI got me a replacement pretty quickly, but it was a pain). I got a 6
  • The Nvidia 7800 comes from their next-generation G70 line of chips... the X850 is not ATI's new generation of GPUs (the codenamed R520 line), so the comparisons are not exactly fair either.
  • Are there proper ATI drivers available for Linux yet? I will stay with nVidia until then.
  • I didn't know the GF7800GTX SLI could get 11301 frames per second in 3DMark. That's really quite impressive!

"Being against torture ought to be sort of a multipartisan thing." -- Karl Lehenbauer, as amended by Jeff Daiell, a Libertarian

Working...