SLI On Life Support For the AMD Platform
JoshMST writes "For years AMD and Nvidia were like peas and carrots, and their SNAP partnership proved to be quite successful for both companies. Things changed dramatically when AMD bought up ATI, and now it seems like Nvidia is pulling the plug on SLI support for the AMD platform. While the chipset division at AMD may be a bitter rival to Nvidia, the CPU guys there have had a long and prosperous relationship with the Green Machine. While declining chipset margins on the AMD side were attributed to AMD's lackluster processor offerings over the past several years, the Phenom II chips have reawakened interest in the platform, and AMD has found a place in enthusiasts' hearts again. Unfortunately for Nvidia, they are seemingly missing out on a significant revenue stream by not offering new chipsets to go with these processors. They have also curtailed SLI adoption on the AMD platform, which couldn't come at a worse time."
I don't know but... (Score:5, Insightful)
Always... (Score:5, Funny)
It is very important to always drink more beers before posting here. Otherwise, there is no chance of a +5 Insightful mod.
Re:I don't know but...ONE CRUCIAL WORD MISSING (Score:2)
You forgot one important thing: Larrabee.
Re: (Score:3, Insightful)
Larrabee doesn't change a damned thing. A beowulf cluster of shitty Intel GPUs doesn't magically remove the stench of failure. It's just a whole lotta more suckage on one die.
Re: (Score:2)
Shows you don't know much about Larrabee.
Larrabee has nothing to do with the current Intel GPU architecture.
Larrabee is a bunch of older Pentium cores re-engineered to be REALLY good at the kinds of floating-point operations 3D graphics need, combined with a really good software stack to actually provide 3D for the thing.
It's all x86.
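To make "the kinds of floating-point operations 3D graphics need" concrete, here's a toy C sketch of the sort of inner loop a software renderer lives in - one independent multiply-add per pixel, which is exactly what a wide vector unit chews through many lanes at a time. Purely illustrative: this is not Larrabee code and assumes nothing about Intel's actual ISA beyond the published "16 floats at a time" width.

    /* Toy example: shade a span of pixels with one multiply-add per
     * element. Every iteration is independent, so a compiler (or a
     * hand-vectorized version) can run 16 of these per instruction on
     * a 16-wide vector unit. Not actual Larrabee code. */
    #include <stddef.h>

    void shade_span(float *out, const float *base, const float *light,
                    float ambient, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            out[i] = base[i] * light[i] + ambient;
    }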
Re: (Score:2, Insightful)
Yeah, it's 32 cores of x86 overhead.
Why not use a small RISC core alongside the new 512-bit vector unit? No more x86 decoder overhead (non-trivial on a Pentium-level core replicated 32 times), remove the cruft, tighten up the ISA, etc.
Right now it looks like 2x the die area to achieve the same in 2010 as NVIDIA achieved in 2008, and rumoured power consumption figures that make a GT200 look lean and athletic.
However it is a major improvement for Intel, and Larrabee 2 or Larrabee 3 will get it right. Also the
Re: (Score:2)
The most interesting thing will be whether Intel follows its usual practice and releases the source code for all this stuff. And if so, whether it's the host side, the Larrabee side, or both.
Also, with Larrabee, it's possible to just add more cores and boost the horsepower, I believe. And new features (including new DirectX versions) can be added via updates to the software (host, Larrabee, or both).
Re: (Score:2)
Re: (Score:2)
I know the beowulf cluster joke is cool and all, but come on now, you're starting to lose geek cred for inaccuracy here. Beowulf clusters are whole machines clustered to act as one - so you can have a beowulf cluster of Wiis, PS3s, and Barbra Streisands, but not a beowulf cluster of Phenom IIs.
Besides, I thought it was an assload of CPUs in a GPU package, not the other way around? Intel makes damn good CPUs, no matter how much their GPUs suck (they are adequate, but only for very small values of adequate).
Re: (Score:2)
Re: (Score:3, Interesting)
That's not really an accurate portrayal of what's going on. In reality it's more like: Intel is against the CPU side of AMD, in a semi-cordial relationship with the graphics side of AMD (ATI), and swatting at Nvidia like an annoying bug... which is all that Nvidia is compared to Intel, despite Jen-Hsun Huang's delusions of grandeur.
Remember that Intel has supported ATI's crossfire configuration natively for a long time, and this support continues into the high-end X58 chipsets making Crossfire a very eas
Re: (Score:2)
This is pure conjecture, but to me it seemed as if, when AMD and ATI became one team and Nvidia and Intel became the other, it would make sense for each one to offer incentives (read: threats) so that their partner would not bend over for the competition. So it's not like it's completely up to Nvidia to start improving their standing with AMD, because of pressure from Intel. If that made any sense, then I'll drink a couple more beers before posting next time. Out
nVidia has made it quite clear on many occasions that they are not team players. They don't care about anyone else except themselves, will constantly shift blame onto anyone else they can, and only act in the interest of protecting their own 'precious' IP. Not that Intel is any better on that last part, though.
nVidia and Intel aren't a team. They're competitors. DAAMIT is only starting to piss off nVidia more because they can actually push a whole platform (CPU + Chipset + Graphics), something Intel's had since
Another weird Slashdot editor failure. (Score:3, Informative)
People are saying that Nvidia is not honest. (Score:5, Insightful)
There is a general impression now, apparently correct, that Nvidia is not honest [slashdot.org] and cannot be trusted. HP bought Nvidia graphics chips, and when they were found defective, neither company was completely honest about fixing the defects, articles say.
An Inquirer article, Nvidia cuts out reviewers for the GTS250 [theinquirer.net], says "IT IS ALWAYS funny when an unethical company turns on its own supporters as Nvidia did with the latest 'all new' GT250 cards. This time however, their PR stunts cross the line from unethical to purposely false, and hilarity ensues."
Another quote from the Inquirer story: "This time however, they crossed the line from plausible deniability to flat out deception. In the middle of last week we heard what Nvidia was up to this time around, but just couldn't believe they would be THAT sleazy."
Now that Intel is integrating faster GPUs into its chipsets, there is a perception that eventually there will be little room for Nvidia.
Re: (Score:2)
No shit nVidia isn't honest. I have to rely upon a modified .INF file to get my 8600M GS to work under XP, which is bullshit considering one driver is supposed to work with all the cards - the whole point of the Unified Driver Architecture that nVidia likes to talk about. And what's even worse is that the modified .INF install under XP gives me about 10 FPS over the same drivers using the official .INF under Vista. What's even better is I know the true amount of memory the video card came with - 512MB. There's
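(For the curious, the modded-.INF trick is just adding the laptop GPU's PCI ID to a desktop driver's device list. A rough sketch of the relevant fragment is below - the section and install-section names vary from driver release to release, so treat everything except the 10DE:0425 device ID for the 8600M GS as hypothetical:)

    [NVIDIA_Devices.NTx86.5.1]
    ; the install section named on the right differs per driver release
    %NVIDIA_DEV.0425.01% = Section012, PCI\VEN_10DE&DEV_0425

    [Strings]
    NVIDIA_DEV.0425.01 = "NVIDIA GeForce 8600M GS"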
Three links alleging Nvidia dishonesty: (Score:2)
Quote: "In July 2008, NVIDIA publicly acknowledged a higher than normal failure rate for some of their graphics processors due to a packaging defect. At that same time, NVIDIA assured Apple that Mac computers with these graphics processors were not affected. However, after an Apple-led investigation, Apple has determined that some MacBook Pro computers with the NVIDIA GeForce 8600M GT graphics processor may be affe
An explanation: (Score:2)
The Slashdot editor should also have mentioned that the article to which Slashdot linked apparently promotes Nvidia over ATI without good reason. See, for example, the explanation of ATI's Crossfire X [wikipedia.org], which competes with Nvidia's SLI. In that sense, the Slashdot story is effectively an advertisement.
It seems to me that Slashdot editors should make st
Comment removed (Score:4, Interesting)
Re: (Score:2)
amd (Score:2, Interesting)
Beginning of the end?
Re: (Score:3, Funny)
This has got to be at least the middle of the end.
Re:amd (Score:5, Insightful)
Good lord. The end of AMD started about 3 years ago. Where have you been? This has got to be at least the middle of the end.
I heard that about three years ago, and I've been right here, using an AMD Athlon XP that worked well for many years after it was built and still serves nicely as a server, while using my aging Athlon T-Bird as a fileserver, again with no issues other than one power supply replacement a couple years ago. I'm posting this on the AMD Phenom-based system I built about a month ago, and I couldn't be happier with it. Especially since the price I paid vs. the performance I got is absolutely amazing. I've built many AMD systems for others, and not had a single complaint yet. I will of course build you an Intel-based system if that's what you want, but it's going to cost you more, because the parts cost me more.
I've always personally used AMD systems, and have never found them lacking. Your mileage may vary, of course, but if nothing else it's a good thing there are two competitive forces in this market. It forces them both to innovate at a much faster rate than either one would if they were the only game in town.
Of course, I've always been happy with Nvidia as well, but if they decide not to support what I use, I'll just have to head across the street and check out their competitor who does. That tends to happen when you choose to engage in turf wars rather than providing your customers what they want.
Re: (Score:2)
I was interested until I saw that word: Phenom. I pronounce it "Phail".
It Phails to match Intel's offerings, and also Phails to compete on price. Perhaps most importantly, it Phails to offer a good selection of motherboards to put it on. Intel's entry-level quads are in the same price bracket yet are typically 10% to 20% faster for the same clock speed, under typical CPU-bound loads like media encoding and floating point math (graphics).
The reason AMD got so much love in the late 90's and early 00's is b
Re: (Score:2)
My neighbour's got a nice Rover 75, he's been using it for years after it was built, and it still serves nicely as a mid-sized full featured hothatch. That doesn't mean Rover didn't go bust.
I always tended to favour AMD, this computer I'm on now is some flavour of Duron, and I always preferred Athlons in their prime. But that doesn't stop me acknowledging that their offerings haven't been great in the last few years. As a consumer, I'm always going to buy whatever product is the best at any given time, and
Re: (Score:2)
You're calling a Rover 75 a hothatch? It's not exactly hot, and it's (to the best of my knowledge) purely a saloon. The "hothatch" definition tends to be small powerful hatchbacks. A Golf, for example.
Re: (Score:2)
Dang, knew I was getting confused somewhere there. I was assuming it was one of those hatchbacks-that-are-supposed-to-look-like-a-saloon hatchbacks.
Consider me happily educated.
Re: (Score:2)
"I've always personally used AMD systems, and have never found them lacking."
I'd be surprised if you did find them lacking if they're all you've used.
Only using one type of system doesn't really put you in the best of places to judge what you may or may not be missing out on, though.
Re: (Score:3)
Sorry, but I'll trust my own experience over your comment. But as you say, nice try.
Re: (Score:3, Informative)
I took the liberty of comparing some systems. A Phenom II 940 X4 is about $200 with a board. An Intel Core 2 Quad Q8200 is about the same. The Phenom is 2.5 GHz, the Core is 2.2 GHz. According to this page, http://www.dexternights.com/2009/05/29/intel-vs-amd-desktop-quad-core-processors-comparison-price-vs-performance-ratio/ [dexternights.com], they are pretty much equal. It is pretty amazing how much Intel has dropped pricing in comparison to AMD. Performance-wise, those two parts are pretty similar, so I don't think the perfo
Re: (Score:2)
Most benchmarks put the Phenom II's about equal to the Core 2 chips. The big advantage with going with a Socket AM3 Phenom II is that you'll be getting AMD's next generation socket so you should have plenty of future upgradability, whereas with LGA775 Intel is more or less done releasing new processors for that socket (save a few budget models until Core i5 comes out) so you're pretty much putting together a dead-end system. It is true that AMD really doesn't have an answer for Core i7, but that's also va
Re: (Score:2)
Either that or you're totally wrong. AMD's not the GM of the PC industry.
Re: (Score:2)
I hope this comment is in jest, pointing out that people have been claiming that AMD is in trouble for years.
I never quite figured that out. Sure AMD's had a rough patch in recent years, but Intel spent years churning out crap sub-standard processors. Intel tried to fix their problems, and have come back with great products; there's no reason to think AMD can't do the same -- and indeed the Phenom II seems to be excellent (it doesn't completely crush Intel's offerings like AMD's products did a few years ago, but Intel's not turning out complete crap these days).
What was particularly surprising to me, though, is
Talk about stupid (Score:4, Interesting)
Re:Talk about stupid (Score:4, Funny)
Yeah, but this must be "The Way It's Meant To Be Played".
Well... (Score:5, Insightful)
Re: (Score:3, Insightful)
I like my AMD chips.
ATI is right up there in performance when compared to its rival Nvidia's GPUs. The problem is, Intel's Core i7 blows anything AMD has out of the water. Even the aging Intel quad-cores rival AMD's brand-new Phenom IIs.
Re: (Score:3, Informative)
Re: (Score:2, Interesting)
Unless you want to talk about spending $500 on a processor.
Well, the GP does claim to be the sort of person who buys two GFX cards in order to get a marginally faster frame rate; it would be reasonable to assume he'd spend a ton on his CPU too to make his e-penis bigger.
Re:Well... (Score:5, Informative)
ATI is right up there in performance when compared to its rival Nvidia's GPUs. The problem is, Intel's Core i7 blows anything AMD has out of the water. Even the aging Intel quad-cores rival AMD's brand-new Phenom IIs.
True, but only if system cost is not factored into the equation.
Mandatory car analogy: yes, the $500K Ferrari might win against my $100K Porsche, but how many people are gonna pay the extra megabucks (or whatever your local currency is, electrum pieces?) for that.
Re:Well... (Score:5, Insightful)
Additionally, I really like what I've seen from AMD recently. Sure, it probably isn't as good at the top end of the offerings, but my current setup cost me somewhat less than $500 and is able to handle things like VirtualBox quite well.
Re: (Score:2)
True, but only if system cost is not factored into the equation.
True, but only if overclocking is not factored into the equation. :) Intel blows AMD out of the water once you start overclocking. Starting with default voltage and box cooling you easily hit ~3.6-3.7 GHz on a 45nm Core. If you bump the voltage and provide a 10" case fan you can reach 4 GHz. AMD tops out at about 3.6 GHz, maybe 3.8 GHz with expensive cooling solutions.
Re: (Score:2)
My problem is that every time I buy an ATI card it fails massively everywhere, and when I get an nVidia card it at least mostly works everywhere. The Intel vs. AMD comparison might welcome comparisons of Ferraris and Porsches, but the difference between ATI and nVidia is more like Toyota and Nissan. The Nissan has just a bit more performance and the overall implementation is a little nicer (even the driver install), and you will find that all the little clip-and-knob stuff will break a lot easier on the Toyot
Re: (Score:2, Interesting)
ATI is right up there in performance when compared to its rival Nvidia's GPUs. The problem is, Intel's Core i7 blows anything AMD has out of the water. Even the aging Intel quad-cores rival AMD's brand-new Phenom IIs.
ATI isn't quite up there with nVidia for performance. ATI's 4950 and 4750 X2 struggle to compete with nVidia's GTX 285 and GTX 295 offerings, but the ATI cards are cheaper. And while the Phenom II is behind the i7 in performance, it's also cheaper - especially when you factor the cost of the motherboard into the equation (i7 motherboards are almost exclusively expensive "enthusiast" setups).
So... You get what you pay for, really.
Re: (Score:3, Interesting)
ATI is right up there in performance when compared to its rival Nvidia's GPUs. The problem is, Intel's Core i7 blows anything AMD has out of the water. Even the aging Intel quad-cores rival AMD's brand-new Phenom IIs.
ATI isn't quite up there with nVidia for performance. ATI's 4950 and 4750 X2 struggle to compete with nVidia's GTX 285 and GTX 295 offerings, but the ATI cards are cheaper. And while the Phenom II is behind the i7 in performance, it's also cheaper - especially when you factor the cost of the motherboard into the equation (i7 motherboards are almost exclusively expensive "enthusiast" setups).
So... You get what you pay for, really.
Nvidia is pushing CUDA first, OpenCL second. AMD is moving Stream to second and OpenCL to first. I find AMD using their brains and Nvidia pissin' up a rope.
Re: (Score:2)
Yes an i7 975 will run circles around a Phenom II 955 in raw number crunching. But the i7 is $1k while the Phenom II is $250. Going with the Phenom, you have enough CPU to keep your GPU fed and $750 to keep yourself fed.
Re: (Score:2)
I'm not sure why one would buy the models you suggest, when you can get a 4870 or 4890 at less cost and fairly comparable performance. The highest of the high-end ATIs are cheaper, and you're not going to complain much about performance.
I've got a 4870, and with the price point they are at these days I'd assume they're doing pretty damned well.
Re: (Score:2)
Re: (Score:2)
I play Crysis on a 32" monitor. My single 9800GTX+ pumps 1920x1080 with all the super details just fine; I might drop below 20 FPS in a scene with loads of energy weaponry firing, but otherwise I maintain around 45 FPS.
Re: (Score:2)
Re:Well... (Score:5, Insightful)
I have a 2-year-old AMD machine with an AM2 motherboard, which supports AM2+ processors with the latest BIOS. I was considering replacing the aging box with an Intel machine, or building a new AMD machine; I wasn't quite sure what to do.
Then I found I could buy an AM2+ Phenom II triple-core and a Radeon HD4850 for just shy of $200. That pretty much ended the internal debate.
Re: (Score:2)
Re: (Score:2)
Maybe Intel has a CPU that magically blows away anything AMD has. I really doubt it, but maybe.
Problem number one: speed isn't everything. The various comm channels can choke an impossibly fast CPU down to nothing, and they can make a moderately fast CPU look really, really fast. That is, after all, the reasoning behind AMD's rating system - the 2400+, 5400+, etc. My ancient 2400+ XP chip running at 2 GHz benches alongside Intel's chips clocked at 2400 MHz.
Problem number two - Intel charges a premium for t
Re: (Score:2)
Actually, the initial speed rating on AMD chips was in direct comparison to a 1 GHz Thunderbird. So a 2400 would be 2.4x as fast as a 1 GHz Thunderbird. I'm still using an "ancient" Athlon64 3000, which Ubuntu no longer wants to boot on. (Unless they fixed that nasty bad no PSS objects bug.) At 2 GHz, it is somehow 3x as fast. The faster FSB certainly helps. For what I do (photoshop, lightroom, sound mixing) it is still plenty adequate, though I've been due for an upgrade to last year's technology for some time
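(The "somehow 3x as fast" arithmetic works out if you read the rating as clock speed times per-clock speedup over the 1 GHz T-bird baseline. A toy C model of that reading - my guess at the scheme, not AMD's published formula:

    /* Toy model: rating = clock (GHz) x work-per-clock vs. a 1 GHz
     * Thunderbird, x1000. A guess at the scheme, not AMD's formula. */
    int pr_rating(double clock_ghz, double work_per_clock_vs_tbird)
    {
        return (int)(clock_ghz * work_per_clock_vs_tbird * 1000.0 + 0.5);
    }
    /* pr_rating(2.0, 1.2) == 2400  -- Athlon XP 2400+ at 2 GHz */
    /* pr_rating(2.0, 1.5) == 3000  -- Athlon64 3000+ at 2 GHz */

So the newer core just does roughly 1.5x the work per clock of the old one.)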
Re: (Score:2, Insightful)
The ATI video cards have impressive hardware specs compared to Nvidia's. However, their drivers and driver support are shit.
Re: (Score:3, Informative)
Ever since AMD bought out ATI, driver support has drastically improved. AMD isolated the problem (drivers) and got it fixed, or prompted ATI to do it.
Re: (Score:2)
What's the news? (Score:5, Insightful)
Move on, nothing to see here.
Who cares? (Score:5, Insightful)
Dual-GPU solutions are pointless: a waste of money for little performance gain, and they don't even work in some games.
Re: (Score:3, Informative)
I'm with this guy. I have SLI and I have yet to find a single game that it actually improves. In fact, in most cases it cuts performance in half. As far as I can tell it's just a way to trick morons like me into buying twice as many video cards.
Re: (Score:2)
My SLI cards get me better performance. Of course they are ancient cards; I think the second one cost $50 two years ago.
What I don't understand is how you fit two cards in now that every card I see takes two slots all by itself.
But I'm years behind.
Re: (Score:3, Insightful)
Not only that, but SLI specifically is so bad that dual-card setups are one of the few places where you actually want ATI over nVidia.
Re:Who cares? (Score:4, Insightful)
Dual-GPU solutions are pointless: a waste of money for little performance gain, and they don't even work in some games.
Think OpenCL. I could careless about Stream or CUDA. But I do care about OpenCL/OpenGL and the Engineering worlds. Games will get it sooner rather than later, which is why OpenCL will thrive.
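For anyone who hasn't looked at what OpenCL code actually looks like, here's a stripped-to-the-bone sketch: one kernel, one buffer, one launch. It assumes an OpenCL 1.x runtime (link with -lOpenCL) and omits all error handling - an illustration of the API shape, not anyone's SDK sample.

    /* Minimal OpenCL 1.x sketch: double every element of an array on
     * whatever device the runtime offers first. Error checks omitted. */
    #include <stdio.h>
    #include <CL/cl.h>

    static const char *src =
        "__kernel void scale(__global float *buf, float k) {\n"
        "    size_t i = get_global_id(0);\n"
        "    buf[i] = buf[i] * k;\n"
        "}\n";

    int main(void)
    {
        float data[1024];
        size_t n = 1024;
        for (size_t i = 0; i < n; i++) data[i] = (float)i;

        cl_platform_id plat;
        cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel kern = clCreateKernel(prog, "scale", NULL);

        /* Ship the data over, run one work-item per element, read back. */
        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                    sizeof(data), data, NULL);
        float factor = 2.0f;
        clSetKernelArg(kern, 0, sizeof(buf), &buf);
        clSetKernelArg(kern, 1, sizeof(factor), &factor);
        clEnqueueNDRangeKernel(q, kern, 1, NULL, &n, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);

        printf("data[10] = %f\n", data[10]); /* expect 20.0 */
        return 0;
    }

The same kernel runs unmodified on an nVidia card, an ATI card, or a plain CPU, which is the whole point.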
Re:Who cares? (Score:4, Informative)
Maybe he actually meant "I could sloppy", although that sentence no verb.
Re: (Score:2)
Who CARES about SLI? (Score:5, Interesting)
Anyway, all I'm saying is AMD has the ability to tie in their own processor + GPU combo, plus let the consumer buy a separate GPU, thus getting their own "SLI". If they play their cards right, they can just give the finger to NVIDIA and provide some real competition, which this market really needs so we don't all keep paying $200-300 for a decent GPU these days.
Re: (Score:2, Troll)
You're wrong on that. There are plenty of people I know that have and use more than one video card. Not necessarily with SLI though...
Re: (Score:2)
Pretty sure that SLI requires the two video cards to be actually connected.
That's SLI, which is NVidia's dual-card system. Doesn't ATI have their own? (Crossfire?). Maybe that's different.
If you mean that AMD would put the GPU on the motherboard, I would still think you would need a SLI connector on that separate GPU actually connected.
So? Include the cable with the card and mandate a port on the motherboard. Or include a PCI-E 1x card with the processor.
Congratulations (Score:2)
On tearing apart this person's use of words and then basically agreeing with him.
Perhaps you could just agree with his reasoning and point out the flaws in the fine points of his logic instead.
The argument is that SLI is pointless, you both agree, and yet you MUST find ways to pick on him.
Sheeesh.
Re: (Score:2)
I did not "tear him apart". You make it sound like my post was filled with vitriol and hyperbole. I only pointed out that multiple video cards do not mean SLI is being used by default.
Hmmmm, I thought I did point out the flaws in his logic. I never said that I agreed with his reasoning on SLI being pointless eithe
Re: (Score:2, Informative)
Re: (Score:2)
Actually, I recall several times where an SLI solution would perform pretty damn close to the high-end card, for a lower price. In one case, the SLI cards even outperformed the super-expensive high-end card. Sometimes one has to step back and see if two (or more!) mid-range cards can outdo a single high-end card that costs 2-3x more money.
Re: (Score:2)
SLI will give you better than 25% improved performance. I hear the cards can double performance. Still, most games will stick to one video card. The Valve hardware survey has SLI users at 2% of Valve customers. That's 2% of gamers, not everyday PC users.
You don't need to spend 300 bucks on any video card. Most $100 video cards give you all the performance m
Re: (Score:2)
Re: (Score:2)
And why should anyone, really? When modern-day consoles cost about the same as one would spend on a moderately high-end processor + video card, why the hell would most people want to spend an extra 300 bucks or so on an extra video card for only 25% or less extra framerate?
Because they're unreliable, tend to overheat, and have expensive games that run at low framerates.
If you build your own PC you can make it out of highly-rated (by other purchasers of course, don't trust "consumer reporting" agencies) parts and carefully cool it so it works nicely. Building it from scratch this way will be more expensive than getting a console, but if you're just upgrading it like you are saying in your post, it won't be.
Nvidia (Score:4, Interesting)
Asus has jumped in bed with Microsoft as of late. With AMD's purchase of ATI and promise of open source drivers, and Nvidia's failure to move forward in open source, Nvidia and Asus have seen the last dollar of mine.
Ready...Aim...Fire (Score:2)
2: Aim at foot.
Re: (Score:2)
4. Profit!
Re: (Score:2)
Unimportant. (Score:5, Informative)
Really, the article makes it sound like Nvidia is abandoning AMD chipsets, but it's just SLI support. When they started making this decision it looked like AMD was totally dead in the enthusiast market. Even die-hards were switching to Intel chips. It seemed for a while there that the market for dual graphics cards on AMD was nearly dead. Now that AMD has a good chip again, Nvidia will probably be scrambling to get a new chipset out for enthusiasts.
isn't sli just bs tech designed to sell more cards (Score:5, Insightful)
As I understand it, you don't really double your performance by putting two cards in. How many people seriously drop the coin to do this? Everything I've read says you'll get better bang for the buck by buying one good card, saving the money you would have spent on the second, and then buying an equivalent card in three years' time that will kick the arse of the first card.
Re: (Score:2)
And who buys AMD? People looking to get better bang for the buck. In other words, people who are unlikely to double the cost of the video card for only 50% more performance.
While I think this is a silly move by nVidia (it makes them look bad to their customer base), it probably isn't nearly as dumb a move as it looks at first glance. They probably have a pretty good idea of what portion of their customers use AMD and SLI currently, and it's probably pretty low.
Re: (Score:2)
>>How many people seriously drop the coin to do this?
My motherboard has SLI support (I bought it in December 2004, on the off chance the numbers would make sense in the future). But when it came time to replace my 6800, it made more sense to buy a 7900 (which was like 10x faster) rather than a second 6800, which probably would have entailed a PSU upgrade as well.
When it came time to replace my 7900, it made more sense to get an 8800 than a second 7900. When it came time to replace the 8800, it
Re:isn't sli just bs tech designed to sell more ca (Score:2)
Not sure about NVidia's cards, but right now two of AMD's 4850s are cheaper than, and just as fast as, a single 4890. It's the best deal around the $220 price point.
Source: http://www.tomshardware.com/reviews/radeon-geforce-price,2323-4.html [tomshardware.com]
Re: (Score:2)
A few years later, you won't be able to find an identical match to your existing card. That doesn't work. I know, I tried after about a year.
Single card, Dual GPU (Score:4, Informative)
Re: (Score:2)
A two-card solution provides better cooling, and in the case of the Radeon 4670 it is possible to have a passive CrossFire solution.
Re: (Score:2)
I've been waiting for someone to chime in with this argument. I was going to mention it in my original post. While you do have a valid point, let's be honest... 3- and 4-way sli/crossfire is an even smaller niche market than dual cards... like, orders of magnitude smaller.
Also, I'm not trying to discredit those solutions, but 2-way sli seems to be the sweet spot in terms of performance vs. cost. As far as gaming is concerned, even 2-way SLI is rarely worth the c
Re: (Score:2)
This simply isn't true. Single-card sli/xfire solutions perform just as well as dual-card solutions across the board.
As an example, check out the benchmarks for the ATI HD 4870 X2 vs. two 4870s in CrossFire as tested by tomshardware:
http://www.tomshardware.com/charts/gaming-graphics-charts-q3-2008/compare,794.html?prod%5B2113%5D=on&prod%5B [tomshardware.com]
Not a problem (Score:4, Insightful)
SLI = Not Value.
AMD has consistently shown that they want to put a computer in every set of hands on the planet: Geode, PIC [betanews.com], OLPC. Now it would be nice if those computers had fast 3D graphics or GPU parallel processing, but that really seems like an easy way to waste the real power of computers.
I have loved many Nvidia products in the past, but stepping away from AMD seems like a poor choice on Nvidia's part.
NVidia and AMD (Score:2)
A bit OT, but I'm curious as to why the best deals on half-decent motherboards around here seem to be NVidia chipsets and onboard graphics with AMD processors... Shouldn't AMD/ATI be cranking out chipsets that allow board makers to do better/faster/cheaper boards with combos from the same manufacturer? Just seems odd. All the PCs in my house (one Linux, one Hackintosh, a couple of Windows ones for the kids) are NVidia/AMD setups, bought over the past few years.
Re:Oh silly hardware companies..NVIDIA HAS PROBS (Score:5, Insightful)
Until, that is, millions of their mobile GPU chips keel over from heat death due to improper package bump and underfill construction.
And their single GPU chips are so big that they're impossible to manufacture cost effectively.
And that they need expensive PCBs, because a 512-bit-wide memory bus is necessary when DDR3 has to go up against ATI's more advanced DDR5 boards, which get near-equivalent memory performance with half the bus width.
And when two small, cheap, easy to manufacture chips beat out the biggest chip every time.
And when you're trying to get DirectX 11 running for the first time while making a radical architecture shift all while going to a new chip making process against a rival who is already shipping 40nm chips and has essentially had DX11 running in their past three generations of chips.
Yeah, I'm not sure Nvidia has nearly all the goods right at this moment.
Re: (Score:2)
Mod this man(or woman) up!
Re: (Score:2)
Until, that is, millions of their mobile GPU chips keel over from heat death due to improper package bump and underfill construction.
That sounds like some wicked conjecture; have any evidence that this is impending?
And their single GPU chips are so big that they're impossible to manufacture cost effectively.
This again seems like conjecture unless you've got NVIDIA's manufacturing balance sheet handy.
And that they need expensive PCBs, because a 512-bit-wide memory bus is necessary when DDR3 has to go up against ATI's more advanced DDR5 boards, which get near-equivalent memory performance with half the bus width.
Don't mistake 'first to market' for 'eternal competitive advantage'. DDR5 isn't ATI's IP; they're just implementing it. NVIDIA can do the same. NVIDIA's DDR3 cards are still beating ATI's DDR5 cards in benchmarks. And while we're at it, do you think DDR5 is increasing or decreasing ATI's costs? Do you think those costs are more or less than
Re: (Score:2, Interesting)
Own NVIDIA shares, do you? Or just a fanboy?
Sorry, but the NVIDIA solder problems have been well documented, so you must have chosen to ignore this.
NVIDIA is going to use GDDR5 on their upcoming 40nm "midrange" parts. ATI actually did a lot of work on GDDR5 with the memory manufacturers, hence they got the technology early.
Quite simply, with die sizes, the bigger the die, the more costly it is. AMD/ATI and NVIDIA both use TSMC for their products, so it comes down to die size. The bigger the die, the bigger t
Re:Oh silly hardware companies..NVIDIA HAS PROBS (Score:4, Insightful)
Re: (Score:2)
My Radeon 9250 is at least 5 years old and it's 100% supported, and getting new features even in the latest drivers.
I guess that's the price you pay for choosing to run Windows.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
What the hell are you talking about? The things the Core i7 has, like an on-die memory controller, a QuickPath-style interconnect, and single-die quad-core chips, are the kind of stuff AMD has had for years.
Re: (Score:2)
I believe AMD wanted to bring one of the two chipset manufacturers in house so they could have better coupling between their processors and their chipsets.
That's probably part of the reason, but AMD is looking long term, and they see Larrabee on the horizon, so they're also working towards the eventual marriage of the CPU and GPU at some point in the future. In the short term, they're heading towards putting the CPU and GPU on the same die, last I read, they codenamed it 'Fusion'. Just google 'AMD Fusion', you should find references.