NVidia Reportedly Will Exit Chipset Business 173
xav_jones sends along a story from X-bit Labs claiming that NVidia is ready to quit making chipsets. That story links one from DigiTimes, which reports that NVidia has denied that it's getting out of the business. "[NVidia] is about to quit [the] chipset business, which automatically means that the company's much-hyped multi-GPU SLI technology is either in danger or re-considered. Moreover, several mainboard makers have already ceased making high-end NVidia-based mainboards. [NVidia has]... reportedly decided to quit [the] core-logic business to concentrate on development of graphics processors, following [its] failure to secure a license to build and sell chipsets compatible with Intel Corp.'s microprocessors that use [the] Quick-Path Interconnect bus."
Damn (Score:5, Insightful)
Nvidia released nice overclocking tools, had good BIOS options, nice features (such as a firewall built directly into the NIC), etc.
I always buy NForce chipsets.
AMD / ATI chipsets are good on board with ram (Score:1, Informative)
AMD / ATI chipsets are good on board, with side-port RAM, nice overclocking tools, HyperFlash, and PCI Express 2.0. CrossFire works on any chipset as well.
Re: (Score:2, Offtopic)
explain to me what side port ram is, and why it is necessary?
Re: (Score:3, Interesting)
You could connect memory to be used ONLY by the video card on a special slot - it was used for laptops, I think. Also, if I remember correctly, Intel had an option to add a special card with RAM in the AGP port, and the chipset would use it as video memory.
This was done because the integrated video can use more memory bandwidth than the main RAM allows it, and using integrated graphics with main memory lowers the bandwidth available to the processor.
Re: (Score:3, Interesting)
nice features (such a firewall built directly into the NIC)
Actually, NAM can be very buggy. My system was never really stable while it was installed; for example, I wasn't able to install a game without getting a BSOD. Just playing music in Winamp with nothing else open would crash my computer at times as well.
Re: (Score:3, Interesting)
Well, I think I'm about to get called a troll, but I've owned several systems over the last few years. Some have had nForce chipsets and others a mixture of AMD, Intel, and VIA. When I sit down and think about it, the systems that I have had the most trouble out of have had nForce chipsets.
My current system has an AMD chipset in it, and it has been the most stable and trouble-free system I've owned. I bought two systems a few years ago. One for a linux server and the other for a HTPC. The linux server was a VIA
It appears this story is bogus (Score:5, Interesting)
Link [google.com]
Re:It appears this story is bogus (Score:5, Insightful)
Where the hell did this rumor even start?
It's like IBM stopping all work with Java or Starbucks announcing it will no longer sell baked goods at its stores.
Who even comes up with this stuff? On your mark, get set, castrate the company of your choice...
Re:It appears this story is bogus (Score:5, Insightful)
Well, to be fair, this is about core logic chipsets (nForce). They aren't exactly core to NVIDIA's business. Besides, given how poorly AMD is faring in the enthusiast market, the merger of ATI/AMD making NVIDIA an AMD competitor (nForce originally made its splash, and had its "glory days", for AMD), and the desire of Intel to push its own chipsets (which have also been quite good recently, lessening the room for an "enthusiast" class third party) I wouldn't be incredibly surprised to see them make this move -- even though they apparently aren't doing it now.
According to Ars [arstechnica.com], the original source was one of the motherboard manufacturers. Aside from NVIDIA themselves, they'd be most likely to know. But again, according to NVIDIA, this is a load of crap.
Re: (Score:2)
I don't believe that this is the time to throw in the towel. Depending upon how the antitrust investigations against Intel go, they could very well be required to make it easier for third parties to provide chipsets for pentiums.
Even neglecting that specific possibility, people do still buy AMD based computers with nVidia chips in them, my new barebones has both an nForce chipset as well as an integrated nVidia video card onboard. I'll probably ditch the onboard, but it's a huge step up over any intel integ
Re: (Score:2)
Well, the rumour about their mobo partners dropping chipsets was probably started by the Inquirer (long history of fact-deficient articles): they were purporting that Gigabyte had dropped its 700i-series chipsets when, in fact, Gigabyte had never started making them (as I pointed out here [theinquirer.net] in the first comment).
Re: (Score:2)
Well, to be fair, this is about core logic chipsets (nForce). They aren't exactly core to NVIDIA's business
Based on their 2007 financial statements, core logic (what they call MCP) accounted for $660M of revenue, compared to $1990M for GPUs. I'd hardly call a clear quarter of their business "not exactly core".
Re:It appears this story is bogus (Score:5, Funny)
Apple announced today that it will stop selling actual products and will only sell hype, in pretty packages of course.
And so, what ?.... (Score:2)
Apple announced today that it {...} will only sell hype, in pretty packages of course.
I fail to see how this is any different from the current situation. :-P
Re:And so, what ?.... (Score:5, Funny)
Apple announced today that it {...} will only sell hype, in pretty packages of course.
I fail to see how this is any different from the current situation. :-P
BeOS lost. Get over it.
Re: (Score:3, Funny)
Apple announced today that it will stop selling actual products and will only sell hype, in pretty packages of course.
That's not funny. Their garbage will still sell to the mindless assholes who buy their current packages of hype.
Except for the iPod. You'll have to pry my shiny iPod from my cold, dead hands.
Re:It appears this story is bogus (Score:5, Funny)
It's like IBM stopping all work with Java or Starbucks announcing it will no longer sell baked goods at its stores
or, Slashdot will only post stories after they've been fact-checked.
Re: (Score:2, Funny)
Re: (Score:2)
If you think about it, Starbucks abandoning Java would make for interesting news as well...
You're saying you wouldn't be equally intrigued if you found out that IBM are leaving the baked goods market?!
Re: (Score:2)
Re: (Score:2)
Or slashdot readers insisting that this is a news site and not a blog built around discussing things that interest the editors.
Re: (Score:2)
I'd be shocked if slashdot even reached spell-checked.
But it sounds realistic (Score:2)
But given the mutual licensing problems (NVidia refusing to license SLI technology to Intel, and Intel retaliating by making it difficult for nVidia to license QuickPath), this could sound realistic. They could actually stop producing Intel chipsets for Nehalem.
It's a prank, but at least it's one which sounds realistic.
It's actually possible that they'll throw in the towel for Nehalems, and shift to only supporting AMDs and lower-end Intel Cores still running with an actual north bridge, and require that hardcore
Re: (Score:3, Insightful)
Who even comes up with this stuff?
Probably someone trying to make money on stock... [google.com]
Hopefully nVidia catches whoever started this one and successfully sues them for conspiracy to affect stock price, defamation, and a slew of other fun charges that I no doubt have never heard of...
Re: (Score:2)
Actually, they wouldn't have to, it's the domain of the SEC to handle these sorts of things.
nVidia could sue for trademark infringement and probably a few other things, but enforcing the stock market regulation portion would be the domain of the SEC.
Re: (Score:2)
Who even comes up with this stuff?
People going short on the stock? You can generate a pretty penny doing this stuff.
Re:If only (Score:4, Interesting)
Re: (Score:2)
It's a shame that the EEtimes reports such unreliable crap sometimes. When they run articles on solar cells, new products, and tech business they're usually pretty good. Whenever I see an EEtimes article about superconducting circuits, MRAM, FE-RAM, or any tech coming out of research labs, their articles SUCK. Classic case of 'the new journalism' - unless there's a press release they can quote from, they're lost.
Re: (Score:2)
EETimes did not report this story. Digitimes did.
Comment removed (Score:5, Interesting)
Re: (Score:2)
Well, they could get out of the chipset business and license SLI to AMD and Intel, increasing their market for multiple NVidia cards. AMD might be a little hesitant, but if Intel buys it, they would buy it too.
Re: (Score:2)
AMD already has CrossFire, and no desire to help NVidia sell more video cards, when AMD has to sell video cards of its own.
This would be great for Intel, but they might want to get the SLI technology so that they would be able to use it on their own (maybe) future video cards.
Re: (Score:2)
Re:It appears this story is bogus (Score:5, Informative)
Digitimes -- goes best with a pinch of salt. (Score:2)
I applied for a job there a few years ago. I live in Taipei where it's headquartered. They told me I wanted too much money so it never happened. The staff is almost entirely Taiwanese with just a couple of native English speakers and they pay a basic local salary which is not that much while expecting long hours in the office. Because of the low wages and long hours, they have high turnover and most of the staff aren't really all that geeky, they're just doing a job. The guy who interviewed me was hoping I
Re: (Score:2)
Re: (Score:3, Interesting)
I don't have much experience with ATI chipsets but what I can say about nvidia's chipsets is that they're usually HOT and consume a lot of power.
I had an Asus mainboard with an nForce2 chipset; it was great, with a great onboard soundcard (SoundStorm). Now, nVidia won't use a good soundcard anymore, to make their chips cheaper.
Now, I have an Asus mainboard with the nForce4-SLI chipset... you can fry eggs on it, that's how hot it is (see the P5ND2-SLI motherboard on Google if you want). It's good, it's sta
Re: (Score:3, Insightful)
I have an NForce5-SLI chipset with just air cooling. The whole system runs pretty cool, even with a 10% overclock.
Re: (Score:3, Funny)
Well, I think you can cook eggs at around 60 deg C (140 deg F), which isn't that much, considering that you're trying to convey the image of a hotplate (mine can hit around 200 deg C).
Re: (Score:2)
http://arstechnica.com/journals/hardware.ars/2008/08/01/nvidia-to-ars-were-not-leaving-the-chipset-market [arstechnica.com]
Re: (Score:2)
For the last few years, most benchmarking and review sites have suggested that unless you want SLI features, Intel's lineup has been a lot better: cheaper, faster, and with better and more effective features. The mainboards sold with Intel chipsets seem to be cheaper too (in general).
Re: (Score:2)
Do you inevitably start wondering what the hell is up with all of those TCP checksum errors when you run a packet sniffer on traffic handled by the onboard NIC?
It's an old bug that nVidia still has yet to fix; it started with the nForce4 and has remained in all nForce boards since. You have to disable TCP checksum offloading, otherwise the NIC will continuously discard otherwise perfectly good packets as failing their checksum (and of course request resends, which will also fail, ad infinitum).
And ActiveArmor
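On Linux, the workaround described above can be sketched with ethtool; the interface name eth0 is an assumption (substitute whatever your board's NIC is called), and whether this fully cures the nForce symptom may vary by board and driver version:

```shell
# Show the NIC's current offload settings (eth0 is an assumed name;
# find yours with `ip link`).
ethtool -k eth0

# Disable TX and RX checksum offloading so the kernel computes and
# verifies checksums in software instead of the buggy NIC hardware.
ethtool -K eth0 tx off rx off
```

Note that lowercase -k queries the settings while uppercase -K changes them, and the change does not persist across reboots unless you add it to your distribution's interface configuration.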
Re: (Score:2)
nice features (such a firewall built directly into the NIC)
Um, wasn't that so buggy they never actually hooked it up? Which isn't really a problem because it's about the most useless feature ever, is packet filtering on your Windows box really that big a CPU hog?
And their NICs are a fucking joke; no documentation, and awful binary blob drivers which barely worked and required Linux and OpenBSD to reverse engineer the damn thing to make working open source ones (which can still be wobbly). I'll never forgive Sun for replacing the Intel 1000/Pro NICs in their Galax
Re: (Score:2)
Old motherboard and new OS.
Not necessarily a situation where you can expect great support.
People always blast "driver" issues on Linux, which I find funny. Manufacturers rarely update old drivers on Windows, and sometimes finding old drivers is impossible.
You'll find that Nvidia wrote first-party Linux drivers for their motherboards which is nice. However, many people have complained that Vista is difficult to write good drivers for.
Their focus is likely newer chipsets, not getting old ones to work well w
Re: (Score:2)
Yeah but.... :)
Isn't the entire point of a PC to have decent hardware support, even for older stuff?
At the time, the nForce3 board wasn't that old. nForce4 had been out a little while, but not long enough to really make nForce3 old.
Especially when Vista came out.
Linux users would be furious if this were the case. Hell, the only reason we have driver support in Linux is the furious intent on making Linux compatible with all kinds of hardware, new and old... by the passionate user base. Often it is the
print page (Score:2, Informative)
Re: (Score:2, Informative)
In other news, (Score:2, Funny)
Does this mean that (Score:1, Offtopic)
Or are they still in the graphics game, but not in the chipset game (can you do that?)
Nvidia Says: Bullshit; Chipset Business Strong (Score:5, Informative)
Is the correct title to this story. See here [extremetech.com]. "The story on Digitimes is completely groundless. We have no intention of getting out of the chipset business."
"Mercury Research has reported that the Nvidia market share of AMD platforms in Q2'08 was 60%," Del Rizzo said. "We have been steady in this range for over two years."
"We're looking forward to bring new and very exciting MCP products to the market for both AMD and Intel platforms," Del Rizzo added.
What does Netcraft have to say? (Score:2)
Never mind the actual truth! Has Netcraft confirmed it? Or at least has Gartner predicted it will happen?
Re: (Score:2, Troll)
There, fixed that for you.
Re: (Score:2)
I doubt they'd leave (Score:2)
Especially since AMD still seems to suck at making chipsets for their own boards. I use Intel processors myself but all the AMD fans I know recommend nVidia chipsets. As long as AMD doesn't do a good job in that market, I can't see nVidia leaving entirely.
Also, as far as I know, nVidia has been able to get a QuickPath license. Basically Intel was annoyed that nVidia was playing hardball on SLI licenses. You may note that nearly all Intel boards are Crossfire only, despite ATi now being owned by AMD. Reason w
Re: (Score:2)
Maybe I'm behind the times, but I thought that AMD just didn't make chipsets at all.
Ars Technia says this isn't true... (Score:5, Informative)
Re: (Score:2)
"Hey, maybe people can actually hold Bush and MS to what they do/say..."
Whoa there, don't go rocking the boat there. Wouldn't want people to think you're a commie terrorist furinner or something like that.
i'll be missing you... not (Score:1)
I doubt this. Anyway, on one hand, this won't be good for the market - less competition.
On the other, no flame here, recent NVIDIA products that I've used (although this is graphics, not a chipset as mentioned in the article), like in T61p, were quite buggy. So I won't be missing NVIDIA products.
Nvidia is managing badly recently. (Score:2)
Nvidia is managing badly in other ways, too, beyond supplying poor-quality software; this Slashdot story is not an isolated occurrence. For example, All Nvidia G84 and G86s are bad [theinquirer.net]. Or see All Nvidia G84 and G86 chips faulty? [techspot.com]. Or, Nvidia Likely to Confirm Scale of Chip Troubles Soon [pcworld.com].
Dubious at best. (Score:5, Insightful)
Re: (Score:2)
It's in Intel's best interest to license QPI to nVidia, because it means more sales of Intel CPUs.
Is that really true? How many Intel CPU sales would be lost to AMD if there were no nVidia chipset for them? How many people that bought an nVidia-based board would just buy a board with Intel chips anyway?
I know that the SLI drivers do require nVidia boards, but I seem to hear that the installer or driver gets regularly hacked to remove that silly requirement.
Please note... (Score:2)
Re: (Score:2)
GeForce is a chip. nForce north/south bridges are motherboard chipSETS. No clarification is necessary.
Nforce was great (Score:4, Interesting)
I really like the Nforce chipset.
In my experience they have always been stable and well supported.
Where does this leave us AMD users... I'm still not quite the fan of the AMD chipsets as they haven't been around long enough... all of the "performance" boards were Nvidia based. My current board has an Nforce 570.
How does the AMD 780 compare? Anyone?
Re: (Score:3, Interesting)
Don't have a 780 chipset myself, I have a 790, but I figure this ought to be just as relevant. I felt the same way when I built my latest machine back in February. I didn't want to go with an AMD chipset and ATI cards. I've been an ardent AMD fan for CPUs, but for the last two builds, I went from ATI to nVidia for graphics...
Of course, when I looked into it, it turned out that the latest ATI offerings beat the pants off of nVidia's, and the new CrossfireX SLI system looked like they took nVidia SLI, and
Re: (Score:2)
Of course, when I looked into it, it turned out that the latest ATI offerings beat the pants off of nVidia's
I barely missed dual-NICs, the thing that got me most, was the lack of RAID5 support on the SATA chips...
I don't have personal anecdotal experience with recent NVIDIA and AMD/ATI chipsets, but I enjoy reading chipset reviews at sites like Tech Report, Ars Technica, and Anandtech. From the reviews, I've noticed that AMD/ATI might have significantly inferior "south bridge" performance (hard disk, USB, ethernet, etc) compared to NVIDIA.
AMD's most glaring problem might be flakey AHCI (SATA, NCQ, hot-swap) support. The Tech Report thinks it's so bad, AMD chipset SATA ports should be run in legacy IDE mode. They r
Re: (Score:2)
That's an interesting board. Thanks for the info. I've been planning on building a large multi-monitor config for some time, and had been thinking of nForce boards that would let me get 3 x16 graphics boards in, but if this one'll take four boards that's even better. I don't suppose you know if there's any issue using the boards independently, rather than joined for faster rendering...?
Re: (Score:2)
In my experience they have always been stable and well supported.
bah!!
as much as I like amd (as the little guy) their cpu's are not competitive with intel and the chipset story is even worse!
ethernet drivers are the biggest nightmare. 'force death' ??? huh? totally reverse engineered. and that's a bad thing.
I did not find this production quality on linux and VERY bad on freebsd.
lets not even talk about sata bugs and 'behavior' from nv chipsets.
please LET NVIDIA DIE, ENTIRELY. competition is good but th
Re: (Score:2)
Really? I have never, ever, in my entire life had an onboard ethernet port that didn't work (under Linux).
Can you list some boards so I know which ones to avoid?
Ties between chipset and CPU (Score:3, Interesting)
Although this story seems groundless, it does look like the ties between CPU and chip set are getting stronger. This seems to be one of the reasons for ATI to be taken over by AMD. Intel was already creating its own chip sets and has a monopoly on defining an interface between the two. This is an interesting relationship since it would seem that the CPU is only part of the machine nowadays. I'm expecting that this relationship will turn around somewhere in the distant future.
With Intel it was always hard to sell your own chip set against theirs (for the desktop market). Now it will get harder with AMD as well, since they have the ATI chip sets to think about. It would be strange if there would not be some casualties. Hopefully nVidia is big enough to keep some chip sets around for some time. VIA has already given up, I hope their gamble on the embedded market pays off, although they will have pretty strong competition there as well.
Re:Ties between chipset and CPU (Score:5, Interesting)
Notice the odd one out? What do you think the logical long-term plan should be, if you were nVidia?
Re: (Score:2)
Well, I heard they were trying to build their own CPU, but building an x86-compatible one - and more importantly one that can compete - will take some time and work. They had better be quick with it, though, otherwise they will miss the boat. AMD and Intel are both very well capable of building chipsets and video. The high-end video market alone just won't cut it.
Re: (Score:2, Insightful)
Re: (Score:2)
You would think so, wouldn't you?
But it's not that easy. For a few years, AMD beat Intel at the CPU business because Intel made some very, very stupid decisions, and AMD made some very smart ones. But assuming that a company doesn't make such stupid decisions, then the CPU game comes down mostly to "Who has the best fab technology?". Intel has some of the (if not THE) best fab tech in the world, and is thusly returning the favor to AMD. NVidia doesn't have any fabs at all, they would have to go to 3rd-p
Re:Ties between chipset and CPU (Score:4, Interesting)
Via CPUs are crap, VIA chipsets are extra crap, and VIA video blows chunks.
nVidia have stuck to the things they do well.
Re: (Score:2)
Well, which part should they have stuck to then? Or are you saying they are crap whichever way you look at it?
They've served quite a few of my computers with their chipsets, and pretty well, but they are in a corner. Mini-ITX and Nano-ITX are great ideas and they've got groovy things like crypto hardware acceleration and decent video decoding facilities.
And don't forget that their latest VIA Nano CPU is an interesting, "all new" design, which is faster than the Intel Atom.
Re: (Score:2)
But the VIAs try hard to be something they're not, so for the purpose they're intended for... they're not worthy.
Re: (Score:2)
Sell out to intel? :-)
The right thing to do... (Score:4, Insightful)
I was never happy that nVidia got into the chipset business in the first place. If any company has a talent to specialize and do one thing really really well (in a competitive environment), then that is what they should continue to concentrate on. nVidia seems to have talented people who can ultimately bring us photorealistic graphics at high performance for our games, as well as other engineering and creative needs. I really frown on companies that water down their core business by diversifying into areas which they shouldn't be messing about in.
This kind of thing seems to happen quite often and in other ways. For example, John Carmack seemed to really have a talent in producing great engines for games on the bleeding edge of what is possible with new PC technology. John drove PC gaming technology. But what does John do? John goes off to create rockets. And then he journeys off to work on pocket devices, which are basically PCs from 1995 running Win31 with 16 bit graphics. ;( John has allowed Crytek and other engine creators to walk all over id software. (Or maybe John and his company never really were that great to begin with?)
The whole nVidia chipset fiasco is what brought about the feud between Intel and nVidia so that we, the consumers, could not buy Intel motherboard chipsets with nVidia SLI graphics. Shame.
Focus! Focus! You will never be great at something unless you do it well and are the best at it. A jack-of-all-trades rambling about between different technologies will not make you great, or competitive.
Re: (Score:2)
If you look at the evolution of supercomputing, one of the fundamental components of such a system is the scalability from multi-core (many processors per single chip die) to multi-processor cards (many GPUs per card) to rack-mounted systems (multiple cards - SLI technology) and multiple racks (ultimately needing high-performance data networks).
As the PC can support multiple GPU cards, that forces nVidia to need some motherboard real estate to support SLI. I can hardly imagine the other manufacturers a
Re:The right thing to do... (Score:4, Interesting)
Simple: resources put to developing the chipsets are resources that are not being put to developing GPUs. I work for a company that up until a year and a half or so ago had a single product, which had then and has now the largest market share among vendors in that market. Then we launched a new product in a complementary market sector in which there is a dominant player. Sure, we hired new staff to work on this product and it has gained considerable traction in the last nine months and the dominant player in that market is probably starting to sweat :) However, some staff from our existing product line also moved onto that new product line and while development has continued on our core product and new features continue to be released and we continue to be number one in that market, we could have done more, faster if we were focusing on a single product line.
That doesn't mean nv was wrong to get into the chipset business. Certainly, my employer was not wrong to enter the complementary market sector we entered. Sales are going very well and there are great cross-sell opportunities between our two product lines. I can see us becoming the dominant player in this new market. However, that doesn't mean that entering a new market will not have an effect on your existing products, especially in the short term. I assure you it does.
The problem nv has in the chipset market, as I see it, is that they entered a very crowded market with a dominant player (Intel), which didn't really need another player, and they put themselves in direct competition with Intel, something they weren't when they only made GPUs. It got more complicated when AMD bought ATI, since that also put them in direct competition with AMD. If nv were to exit the chipset business they could make nice with Intel as a hedge against AMD. Thus, exiting the chipset business, even if they are profitable in it, could make business sense.
Sure, they deny it. Of course, a lot of these sorts of denied stories later turn out to be mostly or wholly true. Time will tell, but I shan't be surprised if we see an announcement from nv in 3 months that they are leaving the chipset business. Who knows? They might even be able to sell some of their IP to Intel and recover some of their initial investment.
Re: (Score:2)
I was never happy that nVidia got into the chipset business in the first place. If any company has a talent to specialize and do one thing really really well (in a competitive environment), then that is what they should continue to concentrate on.
The problem nv has in the chipset market, as I see it, is that they entered a very crowded market with a dominant player (Intel), which didn't really need another player, and they put themselves in direct competition with Intel, something they weren't when they only made GPUs. It got more complicated when AMD bought ATI, since that also put them in direct competition with AMD.
I freakin' celebrated (not literally) when NVIDIA entered the chipset business way back in 2001 [anandtech.com]. The way I remember it, AMD finally had a very nice CPU (the original Athlon) but, unlike Intel, AMD refused to get into the consumer chipset business and left that part to the three Taiwanese "cheapset" makers: VIA, ALi, and SiS.
I know Intel made a few blunders with their chipsets (think RAMBUS-to-SDRAM translator), but I trusted Intel's reliability way more than VIA, ALi, and SiS. NVIDIA, using their experien
Hah! (Score:2)
Didn't I just say the other day their chipset drivers - at least the IDE ones - were crap? I spent a fair amount of time reinstalling Windows XP for a client on a box with the NForce chipsets, and it was the IDE drivers that were hosing the install.
Stick to graphics, Nvidia, maybe you know how to do that.
Re: (Score:2)
Considering their driver issues for Vista, I wonder if they should stick to hardware (they seem to be good at that) and just outsource their driver development. Possibly just open the specs and let open source drivers get written.
Bogus, Nvidia denied this (Score:2)
The rest is here http://techreport.com/discussions.x/15240 [techreport.com]
At least the article isn't a dupe, but Slashdot found a way of making this news for nerds since we geeks have to fix ze mistakes
Re: (Score:2)
nvidia-to-ars-were-not-leaving-the-chipset-market [arstechnica.com]
1 nail in the coffin for GLSL (Score:2)
After 4 years, they're the only ones supporting GLSL. In today's inflation, you need to have 40% earning growth in dollars to break even. nVidia isn't doing that.
The word in the silicon valley... (Score:4, Interesting)
...is that NVidia is working on an Intel-compatible CPU.
My speculation:
They're probably shuffling resources internally; their chipset designers might be working on the chipset to interface with their own CPU.
You haven't heard this from me ;)
Re: (Score:2)
When INTC gloated that they were working on a GPU core for their CPU, NVDA shot back that customers don't really benefit from better CPUs anymore, only from faster GPUs.
NVDA's GPUs are way way way better at parallel tasks than INTC's CPUs. INTC's chips really are only good at serial tasks. Perhaps if NVDA released a massively parallel GPU, INTC really would be in trouble?
In Other Random Speculation... (Score:4, Funny)
Slashdot Founder CmdrTaco reportedly plans to shut down the popular Slashdot web destination. "People forget my website was once largely just a portal for enlightenment themes. The purpose of Slashdot is to report on news that matters."
Slashdot will be replaced with a new web portal, OMGPonies, which will focus on overreacting to all pony news that someone might have remembered reading on 4chan.
CmdrTaco has asked that the community assist him by submitting stories and drawings in the idiom of an otaku-catgirl jacked up on pocky. We were unable to substantiate the rumor that the site will feature bedazzled fan art of Hannah Montana, despite CmdrTaco's legendary collection.
(Man, I'm just picking on that guy today!)
NVIDIA probably won't leave chipsets...yet (Score:2)
This is a highly unlikely move for NVIDIA. Check out this article for good info on why not:
http://www.pcper.com/article.php?aid=601 [pcper.com]
Maybe out in 5 years, but not anytime soon.
this is good news. (Score:3, Informative)
The nForce 6xx-series chipsets were a striking failure. They did not work properly.
No motherboard manufacturer could claim its 6xx boards had few problems.
check those backups (Score:2)
I always shied away from using nForce RAID (instead using md-raid), since I could never be sure the array would work with newer chipset versions (though I'm led to believe that it actually would).
If this is even half true, I hope it scares nvidia raid users enough to check their backups (or start making some), or perhaps purchase a second motherboard to use as replacement, while they still can.
With md-raid, I've switched motherboards several times and the array just comes up without any trouble.
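For anyone weighing the same trade-off, here is a minimal sketch of bringing an md-raid array up after a motherboard swap on Linux (device names are assumptions; mdadm must be installed):

```shell
# On the new board, scan the attached disks for existing md superblocks;
# this prints ARRAY lines describing any arrays found.
mdadm --examine --scan

# Assemble every array whose member disks are present, using the
# on-disk superblock metadata rather than any chipset-specific layout.
mdadm --assemble --scan

# Confirm the array came up and is healthy.
cat /proc/mdstat
```

Because the array metadata lives on the disks themselves, it is independent of any particular motherboard's RAID BIOS, which is exactly the portability the comment above describes.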
Other Processors (Score:2)
Oh gee, darn (Score:2)
Re: (Score:2)
AMD / ATI chipsets are better, and Intel onboard video is dead last.
Re: (Score:3, Interesting)
Re: (Score:3, Informative)
Re: (Score:3, Insightful)
Re: (Score:2)