SLI On Life Support For the AMD Platform

JoshMST writes "For years AMD and Nvidia were like peas and carrots, and their SNAP partnership proved to be quite successful for both companies. Things changed dramatically when AMD bought up ATI, and now it seems like Nvidia is pulling the plug on SLI support for the AMD platform. While the chipset division at AMD may be a bitter rival to Nvidia, the CPU guys there have had a long and prosperous relationship with the Green Machine. While declining chipset margins on the AMD side were attributed to AMD's lackluster processor offerings over the past several years, the Phenom II chips have reawakened interest in the platform, and they have found a place in enthusiasts' hearts again. Unfortunately for Nvidia, they are seemingly missing out on a significant revenue stream by not offering new chipsets to go with these processors. They have also curtailed SLI adoption on the AMD platform, which couldn't be happening at a worse time."
  • by carp3_noct3m ( 1185697 ) <slashdot@NoSpAm.warriors-shade.net> on Friday June 19, 2009 @10:10PM (#28398489)
    This is pure conjecture, but to me it seemed as if when AMD and ATI became one team and Nvidia and Intel became the other, it would make sense for each one to offer incentives (read: threats) so that their partner would not bend over for the competition. So it's not like it's completely up to Nvidia to start improving their standing with AMD because of pressure from Intel. If that made any sense, then I'll drink a couple more beers before posting next time. Out
    • Always... (Score:5, Funny)

      by maz2331 ( 1104901 ) on Friday June 19, 2009 @10:12PM (#28398503)

      It is very important to always drink more beers before posting here. Otherwise, there is no chance of a +5 Insightful mod.

    • Nvidia and Intel became the other

      You forgot one important thing: Larrabee.

      • Re: (Score:3, Insightful)

        by billcopc ( 196330 )

        Larrabee doesn't change a damned thing. A beowulf cluster of shitty Intel GPUs doesn't magically remove the stench of failure. It's just a whole lotta more suckage on one die.

        • by jonwil ( 467024 )

          Shows you don't know much about Larrabee.
          Larrabee has nothing to do with the current Intel GPU architecture.

          Larrabee is a bunch of older Pentium cores re-engineered to be REALLY good at the kinds of floating point operations 3D graphics need, combined with a really good software setup to actually provide 3D for the thing.
          It's all x86.

          • Re: (Score:2, Insightful)

            by hattig ( 47930 )

            Yeah, it's 32 cores of x86 overhead.

            Why not use a small RISC core alongside the new 512-bit vector unit? No more x86 decoder overhead (non-trivial on a Pentium-level core replicated 32 times), remove the cruft, tighten up the ISA, etc.

            Right now it looks like 2x the die area to achieve the same in 2010 as NVIDIA achieved in 2008, and rumoured power consumption figures that make a GT200 look lean and athletic.

            However, it is a major improvement for Intel, and Larrabee 2 or Larrabee 3 will get it right. Also the…

            • by jonwil ( 467024 )

              The most interesting thing will be whether Intel follows its usual practice and releases the source code for all this stuff. And if so, whether it's the host side, the Larrabee side, or both.

              Also, with Larrabee, it's possible to just add more cores and boost the horsepower, I believe. And new features (including new DirectX versions) can be added via updates to the software (host, Larrabee, or both).

            • by frieko ( 855745 )
              This isn't 1992; x86 decode IS trivial.
        • I know the beowulf cluster is cool and all, but come on now, you're starting to lose geek cred for inaccuracy here. Beowulf clusters are whole machines clustered to act as one - so you can have a beowulf cluster of Wii's, PS3's, and Barbara Streisands, but not a beowulf cluster of Phenom IIs.

          Besides, I thought it was an assload of CPUs in a GPU package, not the other way around? Intel makes damn good CPUs, no matter how much their GPUs suck (they are adequate, but only for very small values of adequate).

        • Larrabee is being built by a bunch of engineers that used to work at 3DLabs [wikipedia.org]. They know what they're doing.
    • Re: (Score:3, Interesting)

      by CajunArson ( 465943 )

      That's not really an accurate portrayal of what's going on. In reality it's more like this: Intel is against the CPU side of AMD, in a semi-cordial relationship with the graphics side of AMD (ATI), and swatting at Nvidia like an annoying bug... which is all that Nvidia is compared to Intel, despite Jen-Hsun Huang's deluded sense of grandeur.
      Remember that Intel has supported ATI's Crossfire configuration natively for a long time, and this support continues into the high-end X58 chipsets, making Crossfire a very easy…

    • This is pure conjecture, but to me it seemed as if when AMD and ATI became one team and Nvidia and Intel became the other, it would make sense for each one to offer incentives (read: threats) so that their partner would not bend over for the competition. So it's not like it's completely up to Nvidia to start improving their standing with AMD because of pressure from Intel. If that made any sense, then I'll drink a couple more beers before posting next time. Out

      nVidia has made it quite clear on many occasions that they are not team players. They don't care about anyone except themselves, will constantly shift blame onto anyone else they can, and act only in the interest of protecting their own 'precious' IP. Not that Intel is any better on that last part, though.

      nVidia and Intel aren't a team. They're competitors. DAAMIT is only starting to piss off nVidia more because they can actually push a whole platform (CPU + Chipset + Graphics), something Intel's had since…

  • amd (Score:2, Interesting)

    by Anonymous Coward

    Beginning of the end?

    • Re: (Score:3, Funny)

      by robot_love ( 1089921 )
      Good lord. The end of AMD started about 3 years ago. Where have you been?

      This has got to be at least the middle of the end.
      • Re:amd (Score:5, Insightful)

        by laughingcoyote ( 762272 ) <barghesthowl.excite@com> on Friday June 19, 2009 @10:46PM (#28398697) Journal

        Good lord. The end of AMD started about 3 years ago. Where have you been? This has got to be at least the middle of the end.

        I heard that about three years ago, and I've been right here, using an AMD Athlon XP that worked well for many years after it was built and still serves nicely as a server, with my aging Athlon T-Bird as a fileserver, with no issues other than one power supply replacement a couple of years ago. I'm posting this on the AMD Phenom-based system I built about a month ago, and I couldn't be happier with it, especially since the price I paid vs. the performance I got is absolutely amazing. I've built many AMD systems for others, and haven't had a single complaint yet. I will of course build you an Intel-based system if that's what you want, but it's going to cost you more, because the parts cost me more.

        I've always personally used AMD systems, and have never found them lacking. Your mileage may vary, of course, but if nothing else it's a good thing there are two competitive forces in this market. It forces them both to innovate at a much faster rate than either one would if they were the only game in town.

        Of course, I've always been happy with Nvidia as well, but if they decide not to support what I use, I'll just have to head across the street and check out their competitor who does. That tends to happen when you choose to engage in turf wars rather than providing your customers what they want.

        • I was interested until I saw that word: Phenom. I pronounce it "Phail".

          It Phails to match Intel's offerings, and also Phails to compete on price. Perhaps most importantly, it Phails to offer a good selection of motherboards to put it on. Intel's entry-level quads are in the same price bracket yet are typically 10% to 20% faster for the same clock speed, under typical CPU-bound loads like media encoding and floating point math (graphics).

          The reason AMD got so much love in the late 90's and early 00's is because…

        • My neighbour's got a nice Rover 75; he's been using it for years after it was built, and it still serves nicely as a mid-sized, full-featured hothatch. That doesn't mean Rover didn't go bust.

          I always tended to favour AMD; this computer I'm on now is some flavour of Duron, and I always preferred Athlons in their prime. But that doesn't stop me acknowledging that their offerings haven't been great in the last few years. As a consumer, I'm always going to buy whatever product is the best at any given time, and…

          • by MooUK ( 905450 )

            You're calling a Rover 75 a hothatch? It's not exactly hot, and it's (to the best of my knowledge) purely a saloon. The "hothatch" definition tends to be small powerful hatchbacks. A Golf, for example.

            • Dang, knew I was getting confused somewhere there. I was assuming it was one of those hatchbacks-that-are-supposed-to-look-like-a-saloon hatchbacks.

              Consider me happily educated.

        • by Xest ( 935314 )

          "I've always personally used AMD systems, and have never found them lacking."

          I'd be surprised if you did find them lacking if they're all you've used.

          Only using one type of system doesn't really put you in the best of places to judge what you may or may not be missing out on, though.

      • by scumdamn ( 82357 )
        I hope this comment is in jest, pointing out that people have been claiming that AMD is in trouble for years and that despite those prognostications AMD continues to carry on.
        Either that or you're totally wrong. AMD's not the GM of the PC industry.
        • by macshit ( 157376 )

          I hope this comment is in jest, pointing out that people have been claiming that AMD is in trouble for years

          I never quite figured that out. Sure, AMD's had a rough patch in recent years, but Intel spent years churning out crap, sub-standard processors. Intel tried to fix their problems and have come back with great products; there's no reason to think AMD can't do the same -- and indeed the Phenom II seems to be excellent (it doesn't completely crush Intel's offerings like AMD's products did a few years ago, but Intel's not turning out complete crap these days).

          What was particularly surprising to me, though, is…

  • Talk about stupid (Score:4, Interesting)

    by Hansele ( 579672 ) * on Friday June 19, 2009 @10:11PM (#28398495)
    Why on earth, if you're NVIDIA, do you make it harder to find mainboards that leverage your tech? I'd have expected this move from AMD first; you'd think NVIDIA would want their tech available everyplace possible.
  • Well... (Score:5, Insightful)

    by Evelas ( 1531407 ) on Friday June 19, 2009 @10:11PM (#28398497)
    Looks like no more NVIDIA for me, time to research what ATI has available. I like my AMD chips.
    • Re: (Score:3, Insightful)

      by XPeter ( 1429763 ) *

      I like my AMD chips.

      ATI is right up there in performance when compared to its rival Nvidia's GPUs. The problem is, Intel's Core i7 blows anything AMD has out of the water. Even the aging Intel quad-cores rival AMD's brand new Phenom IIs.

      • Re: (Score:3, Informative)

        by CastrTroy ( 595695 )
        Yeah, Intel does make faster processors, but you'll pay for them. The AMD chips cost about the same as the comparable Intel chips, so in the end the decision just comes down to religion. Unless you want to talk about spending $500 on a processor.
        • Re: (Score:2, Interesting)

          by Anonymous Coward

          Unless you want to talk about spending $500 on a processor.

          Well, the GP does claim to be the sort of person who buys two GFX cards in order to get a marginally faster frame rate; it would be reasonable to assume he'd spend a ton on his CPU too to make his e-penis bigger.

      • Re:Well... (Score:5, Informative)

        by SplashMyBandit ( 1543257 ) on Friday June 19, 2009 @10:41PM (#28398657)

        ATI is right up there in performance when compared to its rival Nvidia's GPUs. The problem is, Intel's Core i7 blows anything AMD has out of the water. Even the aging Intel quad-cores rival AMD's brand new Phenom IIs.

        True, but only if system cost is not factored into the equation.

        • At the same price-point the AMDs are actually better performers (which is why many are interested in them again). That is, the AMDs have better performance-per-dollar (or whatever your local currency is, clams anyone?).
        • The high-end i7s give impressive benchmarks, but they are in the same price range as Xeon or Opteron.
        • Same with the Nvidia cards, great benchmarks but for twice the price they might give as little as a 10% performance increase on the ATI 'equivalent'. If you are counting your pennies (or whatever your local currency sub-unit is, shiny beads anyone?) then the AMD and ATI actually give you better performance, which I found surprising when I started looking at the benchmarks and costs of getting a new system.

        Mandatory car analogy: Yes, the $500K Ferrari might win against my $100K Porsche, but how many people are gonna pay the extra megabucks for them (or whatever your local currency is, electrum pieces?).

        • Re:Well... (Score:5, Insightful)

          by hedwards ( 940851 ) on Friday June 19, 2009 @11:13PM (#28398823)
          That's more or less why I always buy AMD. The performance, except perhaps early on with FPU, has always been good and at a price I could afford. Well, that and my annoyance at the monopolistic behaviors of Intel.

          Additionally, I really like what I've seen from AMD recently. Sure, it probably isn't as good at the top end of the offerings, but my current setup cost me somewhat less than $500 and is able to handle things like VirtualBox quite well.
        • True, but only if system cost is not factored into the equation.

          ...True, but only if overclocking is not factored into the equation. :) Intel blows AMD out of the water once you start overclocking. Starting with default voltage and the stock boxed cooler you can easily hit ~3.6-3.7GHz on a 45nm Core. If you bump the voltage and provide a 10" case fan you can reach 4GHz. AMD tops out at about 3.6GHz, maybe 3.8GHz with expensive cooling solutions.

        • My problem is that every time I buy an ATI card it fails massively everywhere, and when I get an nVidia card it at least mostly works everywhere. The Intel vs. AMD comparison might welcome comparisons of Ferraris and Porsches, but the difference between ATI and nVidia is more like Toyota and Nissan. The Nissan has just a bit more performance and the overall implementation is a little nicer (even the driver install), and you will find that all the little clip-and-knob stuff will break a lot easier on the Toyota…

      • Re: (Score:2, Interesting)

        by Anonymous Coward

        ATI is right up there in performance when compared to its rival Nvidia's GPUs. The problem is, Intel's Core i7 blows anything AMD has out of the water. Even the aging Intel quad-cores rival AMD's brand new Phenom IIs.

        ATI isn't quite up there with nVidia for performance. ATI's 4950 and 4750x2 struggle to compete with nVidia's GTX285 & GTX295 offerings, but the ATI cards are cheaper. And while the Phenom II is behind the i7 in performance, it's also cheaper - especially when you factor the cost of the motherboard into the equation (i7 motherboards are almost exclusively expensive "enthusiast" setups).

        So... You get what you pay for really.

        • Re: (Score:3, Interesting)

          by tyrione ( 134248 )

          ATI is right up there in performance when compared to its rival Nvidia's GPUs. The problem is, Intel's Core i7 blows anything AMD has out of the water. Even the aging Intel quad-cores rival AMD's brand new Phenom IIs.

          ATI isn't quite up there with nVidia for performance. ATI's 4950 and 4750x2 struggle to compete with nVidia's GTX285 & GTX295 offerings, but the ATI cards are cheaper. And while the Phenom II is behind the i7 in performance, it's also cheaper - especially when you factor the cost of the motherboard into the equation (i7 motherboards are almost exclusively expensive "enthusiast" setups).

          So... You get what you pay for really.

          Nvidia is pushing CUDA first, OpenCL second. AMD is moving Streams to second and OpenCL to first. I find AMD using their brains and Nvidia pissin' up a rope.

        • I'm not sure why one would buy the models you suggest, when you can get a 4870 or 4890 at less cost and fairly comparable performance. The highest of the high-end ATIs are cheaper, and you're not going to complain much about performance.

          I've got a 4870, and with the price point they are at these days I'd assume they're doing pretty damned well.

        • When you're talking "up there with nVidia for performance", make sure you note that's only on 2560x1600 or so resolutions. AMD's cards pump way more frames than is needed for any sub-$300 monitor, which is what most people have. Unless you're trying to play Crysis on a 30" flat panel monitor, the top end card from either company will be more than adequate.
          • by Khyber ( 864651 )

            I play Crysis on a 32" monitor. My single 9800GTX+ pumps 1920x1080 with all the super details just fine. I might drop below 20 FPS in a scene with loads of energy weaponry firing, but otherwise I maintain around 45 FPS.

            • Ok, so make it a 36" flat panel... whatever those uber-resolution displays are at any more. My point remains: your older 9800GTX+ can do it, and the current ATI chips are faster. You only run into ATI's cards being noticeably slower than Nvidia's when you're talking about higher resolutions than the 1920x1080 you're running. Even 2048x1152, which is the highest resolution non-specialist display [newegg.com] I've seen, is fine, as compared to the uber-displays out there starting at $600.
      • Re:Well... (Score:5, Insightful)

        by bitrex ( 859228 ) on Friday June 19, 2009 @11:27PM (#28398903)

        I have a two-year-old AMD machine with an AM2 motherboard, which supports AM2+ processors with the latest BIOS. I was considering replacing the aging box with an Intel machine, or building a new AMD machine; I wasn't quite sure what to do.

        Then I found I could buy an AM2+ Phenom 2 triple-core and a Radeon HD4850 for just shy of $200. That pretty much ended the internal debate.

        • by bitrex ( 859228 )
          Sorry, I just looked at my receipt and it was more like $250. Don't want to get anyone's hopes up too much.
      • Maybe Intel has a CPU that magically blows away anything AMD has. I really doubt it, but maybe.

        Problem number one: speed isn't everything. The various comm channels can choke an impossibly fast CPU down to nothing, and they can make a moderately fast CPU look really, really fast. That is, after all, the reasoning behind AMD's rating system - the 2400+, 5400+, etc. My ancient 2400+ XP chip running at 2GHz benches alongside Intel's chips clocked at 2400MHz.

        Problem number two: Intel charges a premium for…

        • by ZosX ( 517789 )

          Actually, the initial speed rating on AMD chips was in direct comparison to a 1GHz Thunderbird. So a 2400 would be 2.4x as fast as a 1GHz Thunderbird. I'm still using an "ancient" Athlon64 3000, which Ubuntu no longer wants to boot on (unless they fixed that nasty "no PSS objects" bug). At 2GHz, it is somehow 3x as fast. The faster FSB certainly helps. For what I do (Photoshop, Lightroom, sound mixing) it is still plenty adequate, though I've been due for an upgrade to last year's technology for some time…

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      The ATI video cards have impressive hardware specs compared to Nvidia's. However, their drivers and driver support are shit.

  • What's the news? (Score:5, Insightful)

    by FutureDomain ( 1073116 ) on Friday June 19, 2009 @10:14PM (#28398511)
    NVIDIA tries to jinx AMD, but ends up jinxing themselves. This has been tried throughout the ages and often ends with the same result.
    Move on, nothing to see here.
  • Who cares? (Score:5, Insightful)

    by Winckle ( 870180 ) <`ku.oc.elkcniw' `ta' `kram'> on Friday June 19, 2009 @10:19PM (#28398539) Homepage

    Dual GPU solutions are pointless: a waste of money for little performance gain, and they don't even work in some games.

    • Re: (Score:3, Informative)

      by NFNNMIDATA ( 449069 )

      I'm with this guy. I have SLI and I have yet to find a single game that it actually improves. In fact, in most cases it cuts performance in half. As far as I can tell it's just a way to trick morons like me into buying twice as many video cards.

      • My SLI cards get me better performance. Of course, they are ancient cards; I think the second one cost $50 two years ago.

        What I don't understand is how you fit two cards in now that every card I see takes two slots all by itself.

        But I'm years behind.

    • Re: (Score:3, Insightful)

      Not only that, but SLI specifically is so bad that dual-card setups are one of the few places where you actually want ATI over nVidia.

    • Re:Who cares? (Score:4, Insightful)

      by tyrione ( 134248 ) on Friday June 19, 2009 @11:23PM (#28398881) Homepage

      Dual GPU solutions are pointless: a waste of money for little performance gain, and they don't even work in some games.

      Think OpenCL. I couldn't care less about Streams or CUDA. But I do care about OpenCL/OpenGL and the engineering world. Games will get it sooner rather than later, which is why OpenCL will thrive.

  • Who CARES about SLI? (Score:5, Interesting)

    by PrescriptionWarning ( 932687 ) on Friday June 19, 2009 @10:23PM (#28398569)
    The fact is that only a very small portion of people actually use more than one video card. And why should anyone, really? When modern day consoles cost about the same amount as one would spend on a moderately high end processor + video card, why the hell would most people want to spend an extra 300 bucks or so to have an extra video card at only 25% or less extra benefit in framerate? Only the hardcore ones with the extra wallet, that's who. As for me, I'm more than happy with my $1000 system with ONE video card, and I know it's going to last me at least an extra year or two anyway.

    Anyway, all I'm saying is AMD has the ability to tie in their own processor + GPU combo, plus let the consumer buy a separate GPU, thus getting their own "SLI". If they play their cards right, they can just give the finger to NVIDIA and provide some real competition, which this market really needs to keep us all from paying $200-300 for a decent GPU these days.
    • Re: (Score:2, Troll)

      by EdIII ( 1114411 ) *

      The fact is that only a very small portion of people actually use more than one video card.

      You're wrong on that. There are plenty of people I know that have and use more than one video card. Not necessarily with SLI though...

      And why should anyone, really? When modern day consoles cost about the same amount as one would spend on a moderately high end processor + video card, why the hell would most people want to spend an extra 300 bucks or so to have an extra video card at only 25% or less extra benefit…

      • Pretty sure that SLI requires the two video cards to be actually connected.

        That's SLI, which is NVidia's dual-card system. Doesn't ATI have their own? (Crossfire?) Maybe that's different.

        If you mean that AMD would put the GPU on the motherboard, I would still think you would need an SLI connector on that separate GPU, actually connected.

        So? Include the cable with the card and mandate a port on the motherboard. Or include a PCI-E 1x card with the processor.

      • On tearing apart this person's use of words and then basically agreeing with him.

        Perhaps you could just agree with his reasoning and point out the flaws in the fine points of his logic instead.

        The argument is that SLI is pointless, you both agree, and yet you MUST find ways to pick on him.

        Sheeesh.

        • by EdIII ( 1114411 ) *

          On tearing apart this person's use of words and then basically agreeing with him.

          I did not "tear him apart". You make it sound like my post was filled with vitriol and hyperbole. I only pointed out that having multiple video cards does not mean SLI is being used by default.

          Perhaps you could just agree with his reasoning and point out the flaws in the fine points of his logic instead.

          Hmmmm, I thought I did point out the flaws in his logic. I never said that I agreed with his reasoning on SLI being pointless either…

      • Re: (Score:2, Informative)

        by reub2000 ( 705806 )
        But you don't need any special support for running two video cards in a machine. A while back, before nVidia created their implementation of SLI, I had a GeForce 3 with one video output. In order to connect a second monitor, I had to install a GeForce 2. No special link. I'm not sure what any of this has to do with SLI and Crossfire other than the physical presence of two video cards in one machine.
    • by Gregg M ( 2076 )
      why the hell would most people want to spend an extra 300 bucks or so to have an extra video card at only 25% or less extra benefit in framerate?

      SLI will give you better than 25% improved performance. I hear the cards can double performance. Still, most games will stick to one video card. The Valve hardware survey has SLI users at 2% of Valve customers. That's 2% of gamers, not everyday PC users.

      You don't need to spend 300 bucks on any video card. Most $100 video cards give you all the performance…

    • You do know that ATI already has Crossfire, and even CrossfireX, which will let you mix cards of different speeds and still get a benefit? They don't have to be matched cards.
    • And why should anyone, really? When modern day consoles cost about the same amount as one would spend on a moderately high end processor + video card, why the hell would most people want to spend an extra 300 bucks or so to have an extra video card at only 25% or less extra benefit in framerate?

      Because they're unreliable, tend to overheat, and have expensive games that run at low framerates.

      If you build your own PC you can make it out of highly-rated (by other purchasers of course, don't trust "consumer reporting" agencies) parts and carefully cool it so it works nicely. Building it from scratch this way will be more expensive than getting a console, but if you're just upgrading it like you are saying in your post, it won't be.

  • Nvidia (Score:4, Interesting)

    by C_Kode ( 102755 ) on Friday June 19, 2009 @10:49PM (#28398707) Journal

    Asus has jumped into bed with Microsoft as of late. With AMD's purchase of ATI and promise of open source drivers, and Nvidia's failure to move forward on open source, Nvidia and Asus have seen the last dollar of mine.

  • 1: Cock gun.
    2: Aim at foot.
  • Unimportant. (Score:5, Informative)

    by Jartan ( 219704 ) on Saturday June 20, 2009 @12:15AM (#28399151)

    Really, the article makes it sound like Nvidia is abandoning AMD chipsets, but it's just SLI support. When they started making this decision it looked like AMD was totally dead in the enthusiast market. Even die-hards were switching to Intel chips. It seemed for a while there that the market for dual graphics cards on AMD was nearly dead. Now that AMD has a good chip again, Nvidia will probably be scrambling to get a new chipset out for enthusiasts.

  • by jollyreaper ( 513215 ) on Saturday June 20, 2009 @12:20AM (#28399173)

    As I understand it, you don't really double your performance by putting two cards in. How many people seriously drop the coin to do this? Everything I've read says you'll get better bang for the buck by buying one good card, saving the money you would have spent on the second, and then buying an equivalent card in three years' time that will kick the arse of the first card.

    • by Aladrin ( 926209 )

      And who buys AMD? People looking to get better bang for the buck. In other words, people who are unlikely to double the cost of the video card for only 50% more performance.

      While I think this is a silly move by nVidia (it makes them look bad to their customer base), it probably isn't nearly as dumb a move as it looks at first glance. They probably have a pretty good idea of what portion of their customers use AMD and SLI currently, and it's probably pretty low.

    • >>How many people seriously drop the coin to do this?

      My motherboard has SLI support (I bought it in December 2004, on the off chance the numbers would make sense in the future). But when it came time to replace my 6800, it made more sense to buy a 7900 (which was like 10x faster) rather than a second 6800, which probably would have entailed a PSU upgrade as well.

      When it came time to replace my 7900, it made more sense to get an 8800 than a second 7900. When it came time to replace the 8800, it…

    • Not necessarily.

      Not sure about NVidia's cards, but right now two of AMD's 4850s are cheaper than, and just as fast as, a single 4890. It's the best deal around the $220 price point.

      Source: http://www.tomshardware.com/reviews/radeon-geforce-price,2323-4.html [tomshardware.com]
  • by dr_wheel ( 671305 ) on Saturday June 20, 2009 @12:46AM (#28399291)
    There are single-slot dual GPU offerings from both camps. If you actually need/want SLI/CrossFire, what's the point of running 2 cards when you can have 1?
    • A two-card solution provides better cooling, and in the case of the Radeon 4670 it is possible to have a passive Crossfire setup.

  • Not a problem (Score:4, Insightful)

    by Deliveranc3 ( 629997 ) <deliverance@level4 . o rg> on Saturday June 20, 2009 @01:13AM (#28399393) Journal
    AMD = Value.

    SLI = Not Value.

    AMD has consistently shown that they want to put a computer in every set of hands on the planet. Geode, PIC [betanews.com], OLPC. Now it would be nice if those computers had fast 3D graphics or GPU parallel processing, but that really seems like an easy way to waste the real power of computers.

    I have loved many Nvidia products in the past, but stepping away from AMD seems like a poor choice on Nvidia's part.
  • A bit OT, but I'm curious as to why the best deals for half-decent motherboards around here seem to be NVidia chipsets and onboard graphics, with AMD processors... Should AMD/ATI be cranking out chipsets that allow board makers to do better/faster/cheaper boards with combos from the same manufacturer? Just seems odd. All the PCs in my house (one Linux, one Hackintosh, a couple of Windows ones for the kids) are all NVidia/AMD setups, bought over the past few years.
