NVidia Reportedly Will Exit Chipset Business

xav_jones sends along a story from X-bit Labs claiming that NVidia is ready to quit making chipsets. That story links one from DigiTimes, which reports that NVidia has denied that it's getting out of the business. "[NVidia] is about to quit the chipset business, which automatically means that the company's much-hyped multi-GPU SLI technology is either in danger or being reconsidered. Moreover, several mainboard makers have already ceased making high-end NVidia-based mainboards. [NVidia has]... reportedly decided to quit the core-logic business to concentrate on development of graphics processors, following its failure to secure a license to build and sell chipsets compatible with Intel Corp.'s microprocessors that use the QuickPath Interconnect bus."
Comments Filter:
  • Damn (Score:5, Insightful)

    by Enderandrew ( 866215 ) <.moc.liamg. .ta. .werdnaredne.> on Saturday August 02, 2008 @04:50PM (#24450557) Homepage Journal

    Nvidia released nice overclocking tools, had good BIOS options, nice features (such as a firewall built directly into the NIC), etc.

    I always buy NForce chipsets.

    • AMD/ATI chipsets are good onboard options, with side-port RAM, nice overclocking tools, HyperFlash, and PCIe 2.0. CrossFire works on any chipset as well.

    • Re: (Score:3, Interesting)

      nice features (such as a firewall built directly into the NIC)

      Actually, NAM can be very buggy. My system was never really stable while it was installed; for example, I wasn't able to install a game without getting a BSOD. Just playing music in Winamp with nothing else open would crash my computer at times as well.

      • Re: (Score:3, Interesting)

        by Lord Apathy ( 584315 )

        Well, I think I'm about to get called a troll, but I've owned several systems over the last few years. Some have had nForce chipsets and others a mixture of AMD, Intel, and VIA. When I sit down and think about it, the systems that I have had the most trouble out of have nForce chipsets.

        My current system has an AMD chipset in it, and it has been the most stable and trouble-free system I've owned. I bought two systems a few years ago: one for a Linux server and the other for an HTPC. The Linux server was a VIA

    • by SpaceLifeForm ( 228190 ) on Saturday August 02, 2008 @05:39PM (#24450855)
      Link [google.com]

      Nvidia (NSDQ:NVDA) has asked Digitimes for "a full retraction" of a story appearing Friday in the tech journal that claims the Santa Clara, Calif.-based graphics chip maker "has decided to throw in the towel and quit the chipset business."

      Link [google.com]

      Nvidia said Friday that there's no truth to a Taiwan report that claims it's exiting the chipset business.

      That report was published by Digitimes, a normally fairly reputable IT publication that claimed that Nvidia met with its main motherboard clients this week and asked for support for its next-generation chipsets.

      The motherboard makers' response? Silence.

      Although such a withdrawal would be highly unlikely, ExtremeTech asked Nvidia for comment. "The story on Digitimes is completely groundless. We have no intention of getting out of the chipset business," said Bryan Del Rizzo, a company representative, in a statement. (The same statement was later resent as an official company statement.)

      • by eonlabs ( 921625 ) on Saturday August 02, 2008 @05:45PM (#24450893) Journal

        Where the hell did this rumor even start?

        It's like IBM stopping all work with Java or Starbucks announcing it will no longer sell baked goods at its stores.
        Who even comes up with this stuff? On your mark, get set, castrate the company of your choice...

        • by lewp ( 95638 ) on Saturday August 02, 2008 @05:57PM (#24450981) Journal

          Well, to be fair, this is about core logic chipsets (nForce). They aren't exactly core to NVIDIA's business. Besides, given how poorly AMD is faring in the enthusiast market, the ATI/AMD merger making NVIDIA an AMD competitor (nForce originally made its splash, and had its "glory days", on AMD platforms), and Intel's desire to push its own chipsets (which have also been quite good recently, leaving less room for an "enthusiast"-class third party), I wouldn't be incredibly surprised to see them make this move -- even though they apparently aren't doing it now.

          According to Ars [arstechnica.com], the original source was one of the motherboard manufacturers. Aside from NVIDIA themselves, they'd be most likely to know. But again, according to NVIDIA, this is a load of crap.

          • I don't believe that this is the time to throw in the towel. Depending upon how the antitrust investigations against Intel go, they could very well be required to make it easier for third parties to provide chipsets for pentiums.

            Even neglecting that specific possibility, people do still buy AMD-based computers with nVidia chips in them; my new barebones has both an nForce chipset and an integrated nVidia video card onboard. I'll probably ditch the onboard, but it's a huge step up over any Intel integ

          • by Barny ( 103770 )

            Well, the rumour about their mobo partners dropping chipsets was probably started by the Inquirer (which has a long history of fact-deficient articles); they purported that Gigabyte had dropped their 700i-series chipsets when, in fact, they had never started making them (as I pointed out here [theinquirer.net] in the first comment).

          • by julesh ( 229690 )

            Well, to be fair, this is about core logic chipsets (nForce). They aren't exactly core to NVIDIA's business

            Based on their 2007 financial statements, core logic (what they call MCP) accounted for $660M of revenue, compared to $1990M for GPUs. I'd hardly call a clear quarter of their business "not exactly core".
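As a quick sanity check of that fraction, using only the revenue figures quoted above:

```python
# Revenue figures quoted in the comment above (FY2007, millions of USD)
mcp_revenue = 660     # core logic / MCP chipsets
gpu_revenue = 1990    # GPUs

# MCP's share of the combined GPU + MCP revenue
share = mcp_revenue / (mcp_revenue + gpu_revenue)
print(f"MCP share: {share:.1%}")  # MCP share: 24.9%
```

which indeed works out to roughly a quarter of the combined total.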

        • by Anonymous Coward on Saturday August 02, 2008 @05:58PM (#24450995)

          Apple announced today that it will stop selling actual products and will only sell hype, in pretty packages of course.

          • Apple announced today that it {...} will only sell hype, in pretty packages of course.

            I fail to see how this is any different from the current situation. :-P

          • Re: (Score:3, Funny)

            by NitroWolf ( 72977 )

            Apple announced today that it will stop selling actual products and will only sell hype, in pretty packages of course.

            That's not funny. Their garbage will still sell to the mindless assholes who buy their current packages of hype.

            Except for the iPod. You'll have to pry my shiny iPod from my cold, dead hands.

        • by j01123 ( 1147715 ) on Saturday August 02, 2008 @06:03PM (#24451027)

          It's like IBM stopping all work with Java or Starbucks announcing it will no longer sell baked goods at its stores

          or Slashdot announcing it will only post stories after they've been fact-checked.

          • Re: (Score:2, Funny)

            by rarel ( 697734 )
            If you think about it, Starbucks abandoning Java would make for interesting news as well...
            • If you think about it, Starbucks abandoning Java would make for interesting news as well...

              You're saying you wouldn't be equally intrigued if you found out that IBM are leaving the baked goods market?!

              • by RMH101 ( 636144 )
                it would probably improve the quality of their enterprise support if they put those pastry chefs back to work on the phones...
          • Or slashdot readers insisting that this is a news site and not a blog built around discussing things that interest the editors.

          • by Surt ( 22457 )

            I'd be shocked if slashdot even reached spell-checked.

        • But given the mutual licensing problems (NVidia refusing to license SLI technology to Intel, and Intel paying them back by making it difficult for nVidia to license QuickPath), this could sound realistic. They could actually stop producing Intel chipsets for Nehalem.

          It's a prank, but at least it's one which sounds realistic.

          It's actually possible that they'll throw in the towel for Nehalem, and shift to supporting only AMDs and lower-end Intel Cores still running with an actual north bridge, and require that hardcore

        • Re: (Score:3, Insightful)

          by giminy ( 94188 )

          Who even comes up with this stuff?

          Probably someone trying to make money on stock... [google.com]

          Hopefully nVidia catches whoever started this one and successfully sues them for conspiracy to affect stock price, defamation, and a slew of other fun charges that I no doubt have never heard of...

          • Actually, they wouldn't have to, it's the domain of the SEC to handle these sorts of things.

            nVidia could sue for trademark infringement and probably a few other things, but enforcing the stock market regulation portion would be the domain of the SEC.

        • by p0tat03 ( 985078 )

          Who even comes up with this stuff?

          People going short on the stock? You can generate a pretty penny doing this stuff.

      • It's a shame that the EEtimes reports such unreliable crap sometimes. When they run articles on solar cells, new products, and tech business they're usually pretty good. Whenever I see an EEtimes article about superconducting circuits, MRAM, FE-RAM, or any tech coming out of research labs, their articles SUCK. Classic case of 'the new journalism' - unless there's a press release they can quote from, they're lost.

      • by hairyfeet ( 841228 ) <bassbeast1968 AT gmail DOT com> on Saturday August 02, 2008 @06:17PM (#24451115) Journal

        Let us look at the facts. 1. The Inquirer, in a story I submitted over a day ago (and which is still rotting in pending, BTW), reported that several vendors are dropping the 790i chipset due to data corruption issues. 2. Nvidia has had a bad run of chips lately, and since the 790i boards are being pulled, it looks like that bad run is across the board (no pun intended). 3. Nvidia makes a BUTTLOAD of money from SLI, as there are way too many gamers that like having dual Nvidia monsters, if nothing else for the bragging rights. 4. And finally, the ONLY way you can go Nvidia SLI is with an nForce board, which even with the economic downturn is still a VERY popular seller, at least it was last time I looked at the sales numbers.

        So I think it is quite safe to say that the article is BS. BTW, if you want to read the Inquirer story about the canceled 790i boards and which manufacturers are involved, it is right here [theinquirer.net]. But considering they had to set aside $150 million for repairs and replacements of mobile GPUs, killing the nForce, which is the only way they can sell two high-priced boards to the same customer, frankly would be suicide. The last thing they would do is make a move like this that would make them look weak after the mobile GPU fiasco. So even if they have come up with some way to add SLI to a single board by some kind of drop-in chip, most likely they'd wait until the whole mobile GPU mess has blown over. But as always, this is my $0.02; YMMV.

        • Well, they could get out of the chipset business and license SLI to AMD and Intel, increasing their market for multiple NVidia cards. AMD might be a little hesitant, but if Intel buys it, they would buy it too.

          • AMD already has CrossFire, and no desire to help NVidia sell more video cards when AMD has video cards of its own to sell.
            This would be great for Intel, but they might want to get the SLI technology so that they would be able to use it on their own (maybe) future video cards.

            • I seriously doubt Intel would want SLI, as they can simply tie their GPU straight into their CPU via the CSI interconnect. So Nvidia killing SLI would IMHO be slitting their own throat. It would give ATI a huge advantage, as they could say they had the only tech that can tame the most graphics-intensive games, thanks to Crossfire; and with Intel owning the integrated graphics market, I don't see them pushing hard for the discrete market. More likely they'll take their new chip and drop it into a slot connected to t

      • by Kamokazi ( 1080091 ) on Saturday August 02, 2008 @06:38PM (#24451249)
    Look at the editor who posted it... you surprised? Time to bust out the kdawsonfud tag.
  • I applied for a job there a few years ago. I live in Taipei, where it's headquartered. They told me I wanted too much money, so it never happened. The staff is almost entirely Taiwanese, with just a couple of native English speakers, and they pay a basic local salary, which is not that much, while expecting long hours in the office. Because of the low wages and long hours, they have high turnover, and most of the staff aren't really all that geeky; they're just doing a job. The guy who interviewed me was hoping I

    • by makomk ( 752139 )
      Did NVidia ever manage to get the hardware firewalling support to actually work properly? Last I heard, they ended up having to disable it because it was causing corruption of network data...
    • Re: (Score:3, Interesting)

      by mariushm ( 1022195 )

      I don't have much experience with ATI chipsets but what I can say about nvidia's chipsets is that they're usually HOT and consume a lot of power.

      I had an Asus mainboard with an nForce2 chipset; it was great, with a great onboard sound card (SoundStorm). Now nVidia won't use a good sound card anymore, to make their chips cheaper.

      Now I have an Asus mainboard with the nForce4-SLI chipset... you can make eggs on it, that's how hot it is (see the P5ND2-SLI motherboard on Google if you want). It's good, it's sta

      • Re: (Score:3, Insightful)

        by Enderandrew ( 866215 )

        I have an NForce5-SLI chipset with just air cooling. The whole system runs pretty cool, even with a 10% overclock.

      • Re: (Score:3, Funny)

        by Hektor_Troy ( 262592 )

        you can make eggs on it, that's how hot it is

        Well, I think you can cook eggs at around 60 deg C (140 deg F), which isn't that much, considering that you're trying to convey the image of a hotplate (mine can hit around 200 deg C).
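For anyone checking those numbers, the Celsius-to-Fahrenheit conversion is the standard F = C × 9/5 + 32:

```python
def c_to_f(celsius: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

print(c_to_f(60))   # 140.0 -- roughly where egg proteins set
print(c_to_f(200))  # 392.0 -- a typical hotplate temperature
```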

    • This news report is idiotic and old news, and has been proven to be false. Yesterday. Get with the program Kdawson. On second thought, don't. That might cause some sort of apocalyptic catastrophe, as it's never happened before.
      http://arstechnica.com/journals/hardware.ars/2008/08/01/nvidia-to-ars-were-not-leaving-the-chipset-market [arstechnica.com]
    • For the last few years, most benchmarking and review sites have suggested that unless you want SLI features, Intel's lineup has been a lot better: cheaper, faster, and with better, more effective features. The mainboards sold with Intel chipsets seem to be cheaper too (in general).

    • by Pyrion ( 525584 )

      Do you inevitably start wondering what the hell is up with all of those TCP checksum errors when you run a packet sniffer on traffic handled by the onboard NIC?

      It's an old bug that nVidia has yet to fix; it started with the nForce4 and has remained in all nForce boards since. You have to disable TCP checksum offloading, otherwise the NIC will continuously discard otherwise perfectly good packets as failing their checksum (and of course request resends, which will also fail, ad infinitum).

      And ActiveArmor
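For background: checksum offloading means the NIC, rather than the OS, computes the 16-bit ones'-complement Internet checksum (RFC 1071) over each packet, so a buggy offload engine can mis-verify otherwise good packets. A minimal sketch of the computation the hardware is supposed to perform, checked against the widely cited IPv4 header example (illustrative only, not nForce-specific code):

```python
def internet_checksum(data: bytes) -> int:
    """RFC 1071 Internet checksum: ones'-complement sum of 16-bit words, complemented."""
    if len(data) % 2:              # pad odd-length input with a zero byte
        data += b"\x00"
    total = 0
    for i in range(0, len(data), 2):
        total += (data[i] << 8) | data[i + 1]     # big-endian 16-bit word
        total = (total & 0xFFFF) + (total >> 16)  # fold any carry back in
    return ~total & 0xFFFF

# Well-known IPv4 header example, with the checksum field (bytes 10-11) zeroed:
header = bytes.fromhex("45000073000040004011" "0000" "c0a80001c0a800c7")
print(hex(internet_checksum(header)))  # 0xb861 -- the header's actual checksum
```

A receiver runs the same sum over the data including the stored checksum and expects zero; when the offload engine gets this wrong, the packet is discarded exactly as described above.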

    • by Fweeky ( 41046 )

      nice features (such as a firewall built directly into the NIC)

      Um, wasn't that so buggy they never actually hooked it up? Which isn't really a problem, because it's about the most useless feature ever; is packet filtering on your Windows box really that big a CPU hog?

      And their NICs are a fucking joke: no documentation, and awful binary-blob drivers which barely worked and required the Linux and OpenBSD folks to reverse engineer the damn thing to make working open-source ones (which can still be wobbly). I'll never forgive Sun for replacing the Intel 1000/Pro NICs in their Galax

  • Several senior personnel who worked for the foundries which provided the faulty chips have been found dead in their homes. Though the deaths appear to be suicides, foul play has not been ruled out.
    • So ... nVidia doesn't need to fix the thousands (possibly many many more) laptops that they've supplied bad parts for?

      Or are they still in the graphics game, but not in the chipset game (can you do that?)
  • by AbsoluteXyro ( 1048620 ) on Saturday August 02, 2008 @04:59PM (#24450623)

    Is the correct title to this story. See here [extremetech.com]. "The story on Digitimes is completely groundless. We have no intention of getting out of the chipset business."

    "Mercury Research has reported that the Nvidia market share of AMD platforms in Q2'08 was 60%," Del Rizzo said. "We have been steady in this range for over two years."

    "We're looking forward to bring new and very exciting MCP products to the market for both AMD and Intel platforms," Del Rizzo added.

    • Never mind the actual truth! Has Netcraft confirmed it? Or at least has Gartner predicted it will happen?

    • "in danger" and "reconsidered" aren't even the only two options. Even if Nvidia decided to stop working on chipsets themselves, they wouldn't just destroy all the work in progress. They'd sell it or license it or something. The work already done is still an asset.
    • Especially since AMD still seems to suck at making chipsets for their own boards. I use Intel processors myself but all the AMD fans I know recommend nVidia chipsets. As long as AMD doesn't do a good job in that market, I can't see nVidia leaving entirely.

      Also, as far as I know, nVidia hasn't been able to get a QuickPath license. Basically Intel was annoyed that nVidia was playing hardball on SLI licenses. You may note that nearly all Intel boards are Crossfire-only, despite ATI now being owned by AMD. Reason w

      • Especially since AMD still seems to suck at making chipsets for their own boards.

        Maybe I'm behind the times, but I thought that AMD just didn't make chipsets at all.

  • by metalcup ( 897029 ) <metalcup@gmELIOTail.com minus poet> on Saturday August 02, 2008 @05:02PM (#24450643)
    Joel Hruska at Ars Technica appears to have spoken to NVidia, and the article he's written says NVidia is not going to quit the chipset market anytime soon. Looks like it's just a rumor... http://arstechnica.com/journals/hardware.ars/2008/08/01/nvidia-to-ars-were-not-leaving-the-chipset-market [arstechnica.com]
  • I doubt this. Anyway, on one hand, this won't be good for the market: less competition.
    On the other (no flame here), recent NVIDIA products that I've used, like in the T61p, were quite buggy (although that's graphics, not a chipset as mentioned in the article). So I won't be missing NVIDIA products.

  • Dubious at best. (Score:5, Insightful)

    by bluephone ( 200451 ) * <grey@burntelectrOPENBSDons.org minus bsd> on Saturday August 02, 2008 @05:09PM (#24450679) Homepage Journal
    Successful companies don't usually just pack it in and go home in market segments they've been in for a long time. Sure there are issues with some chipsets and certain features, but they're not going to just call it a day. Also, "failure to secure license"? So, what, Intel said "You can only make one bid to get a license," and nVidia failed and now quits? What about "ongoing negotiations"? This is for some IP, this isn't like the Yahoo-MS deal. It's in Intel's best interest to license QPI to nVidia, because it means more sales of Intel CPUs.
    • It's in Intel's best interest to license QPI to nVidia, because it means more sales of Intel CPUs.

      Is that really true? How many Intel CPU sales would be lost to AMD if there were no nVidia chipset for them? How many people that bought an nVidia-based board would just buy a board with Intel chips anyway?

      I know that the SLI drivers do require nVidia boards, but I seem to hear that the installer or driver gets regularly hacked to remove that silly requirement.

  • When the article says chipset, they mean the nForce, not the GeForce.
    • GeForce is a chip. nForce north/south bridges are motherboard chipSETS. No clarification is necessary.

  • Nforce was great (Score:4, Interesting)

    by corychristison ( 951993 ) on Saturday August 02, 2008 @05:11PM (#24450703)

    I really like the Nforce chipset.

    In my experience they have always been stable and well supported.

    Where does this leave us AMD users... I'm still not quite a fan of the AMD chipsets, as they haven't been around long enough... all of the "performance" boards were Nvidia-based. My current board has an nForce 570.

    How does the AMD 780 compare? Anyone?

    • Re: (Score:3, Interesting)

      Don't have a 780 chipset myself, I have a 790, but I figure this ought to be just as relevant. I felt the same way when I built my latest machine back in February. I didn't want to go with an AMD chipset and ATI cards. I've been an ardent AMD fan for CPUs, but for the last two builds, I went from ATI to nVidia for graphics...

      Of course, when I looked into it, it turned out that the latest ATI offerings beat the pants off of nVidia's, and the new CrossfireX SLI system looked like they took nVidia SLI, and

      • Of course, when I looked into it, it turned out that the latest ATI offerings beat the pants off of nVidia's

        I barely missed dual NICs; the thing that got me most was the lack of RAID 5 support on the SATA chips...

        I don't have personal anecdotal experience with recent NVIDIA and AMD/ATI chipsets, but I enjoy reading chipset reviews at sites like Tech Report, Ars Technica, and Anandtech. From the reviews, I've noticed that AMD/ATI might have significantly inferior "south bridge" performance (hard disk, USB, ethernet, etc) compared to NVIDIA.

        AMD's most glaring problem might be flakey AHCI (SATA, NCQ, hot-swap) support. The Tech Report thinks it's so bad, AMD chipset SATA ports should be run in legacy IDE mode. They r

      • by julesh ( 229690 )

        That's an interesting board. Thanks for the info. I've been planning on building a large multi-monitor config for some time, and had been thinking of nForce boards that would let me get 3 x16 graphics boards in, but if this one'll take four boards that's even better. I don't suppose you know if there's any issue using the boards independently, rather than joined for faster rendering...?

    • In my experience they have always been stable and well supported.


      As much as I like AMD (as the little guy), their CPUs are not competitive with Intel's, and the chipset story is even worse!

      The Ethernet drivers are the biggest nightmare. 'force death'??? Huh? Totally reverse engineered. And that's a bad thing.

      I did not find them production quality on Linux, and VERY bad on FreeBSD.

      Let's not even talk about SATA bugs and 'behavior' from nv chipsets.

      Please LET NVIDIA DIE, ENTIRELY. Competition is good, but th

  • by owlstead ( 636356 ) on Saturday August 02, 2008 @05:15PM (#24450737)

    Although this story seems groundless, it does look like the ties between CPU and chipset are getting stronger. This seems to be one of the reasons ATI was taken over by AMD. Intel was already creating its own chipsets and has a monopoly on defining the interface between the two. This is an interesting relationship, since the CPU is only part of the machine nowadays. I'm expecting this relationship to turn around somewhere in the distant future.

    With Intel it was always hard to sell your own chipset against theirs (for the desktop market). Now it will get harder with AMD as well, since they have the ATI chipsets to think about. It would be strange if there were not some casualties. Hopefully nVidia is big enough to keep some chipsets around for some time. VIA has already given up; I hope their gamble on the embedded market pays off, although they will have pretty strong competition there as well.

    • by Vanders ( 110092 ) on Saturday August 02, 2008 @05:44PM (#24450885) Homepage
      If you notice, what you have is essentially four companies, who break down as:
      1. Intel, who have their own CPU, chipset & video
      2. AMD, who have their own CPU, chipset & video
      3. Via, who have their own CPU, chipset & video
      4. nVidia, who have their own chipset and video

      Notice the odd one out? What do you think the logical long-term plan should be, if you were nVidia?

      • Well, I heard they were trying to build their own CPU, but building an x86-compatible one - and more importantly, one that can compete - will take some time and work. They had better be quick with it, though, otherwise they will miss the boat. AMD and Intel are both very well capable of building chipsets and video. The high-end video market alone just won't cut it.

        • Re: (Score:2, Insightful)

          by Vanders ( 110092 )
          Yes, it's not like the good old days, when anyone with enough capital could just buy up the IP of an existing CPU like Cyrix or the IDT WinChip and rework the core until it was usable: there isn't anyone left. One route for them might be to license the IP for something like the Transmeta Efficeon and try to bring the core up to date, but the overhead and effort required may make it more trouble than it's worth.
      • You would think so, wouldn't you?

        But it's not that easy. For a few years, AMD beat Intel at the CPU business because Intel made some very, very stupid decisions, and AMD made some very smart ones. But assuming that a company doesn't make such stupid decisions, then the CPU game comes down mostly to "Who has the best fab technology?". Intel has some of the (if not THE) best fab tech in the world, and is thusly returning the favor to AMD. NVidia doesn't have any fabs at all, they would have to go to 3rd-p

      • by fostware ( 551290 ) on Saturday August 02, 2008 @09:42PM (#24452293) Homepage

        Via CPUs are crap, VIA chipsets are extra crap, and VIA video blows chunks.

        nVidia have stuck to the things they do well.
        • Well, which part should they have stuck to then? Or are you saying they are crap whichever way you look at it?

          They've served quite a few of my computers with their chipsets, and pretty well, but they are in a corner. Mini-ITX and Nano-ITX are great ideas, and they've got groovy things like crypto hardware acceleration and decent video-decoding facilities.

          And don't forget that the latest VIA Nano CPU is an interesting, "all new" design, which is faster than the Intel Atom.

      • Sell out to intel? :-)

  • by rmdyer ( 267137 ) on Saturday August 02, 2008 @05:20PM (#24450761)

    I was never happy that nVidia got into the chipset business in the first place. If any company has the talent to specialize and do one thing really, really well (in a competitive environment), then that is what it should continue to concentrate on. nVidia seems to have talented people who can ultimately bring us photorealistic graphics at high performance for our games, as well as other engineering and creative needs. I really frown on companies that water down their core business by diversifying into areas they shouldn't be messing about in.

    This kind of thing seems to happen quite often and in other ways. For example, John Carmack seemed to really have a talent in producing great engines for games on the bleeding edge of what is possible with new PC technology. John drove PC gaming technology. But what does John do? John goes off to create rockets. And then he journeys off to work on pocket devices, which are basically PCs from 1995 running Win31 with 16 bit graphics. ;( John has allowed Crytek and other engine creators to walk all over id software. (Or maybe John and his company never really were that great to begin with?)

    The whole nVidia chipset fiasco is what brought about the feud between Intel and nVidia, so that we, the consumers, could not buy Intel motherboard chipsets with nVidia SLI graphics. Shame.

    Focus! Focus! You will never be great at something unless you do it well and are the best at it. A jack-of-all-trades rambling about between different technologies will not make you great, or competitive.

    • by mikael ( 484 )

      If you look at the evolution of supercomputing, one of the fundamental components of such a system is scalability: from multi-core (many processors per single chip die) to multi-processor cards (many GPUs per card) to rack-mounted systems (multiple cards - SLI technology) and multiple racks (ultimately needing high-performance data networks).

      As the PC can support multiple GPU cards, that forces nVidia to need some motherboard real estate to support SLI. I can hardly imagine the other manufacturers a

  • Didn't I just say the other day their chipset drivers - at least the IDE ones - were crap? I spent a fair amount of time reinstalling Windows XP for a client on a box with the NForce chipsets, and it was the IDE drivers that were hosing the install.

    Stick to graphics, Nvidia, maybe you know how to do that.

    • Considering their driver issues for Vista, I wonder if they should stick to hardware (they seem to be good at that) and just outsource their driver development. Possibly just open the specs and let open source drivers get written.

  • "The story on Digitimes is completely groundless. We have no intention of getting out of the chipset business."

    The rest is here http://techreport.com/discussions.x/15240 [techreport.com]

    At least the article isn't a dupe, but Slashdot found a way of making this news for nerds since we geeks have to fix ze mistakes :X
  • After 4 years, they're the only ones supporting GLSL. With today's inflation, you need 40% earnings growth in dollars to break even. nVidia isn't doing that.

  • by Anonymous Coward on Saturday August 02, 2008 @06:49PM (#24451313)

    ...is that NVidia is working on an Intel-compatible CPU.

    My speculation:
    They're probably shuffling resources internally; their chipset designers might be working on the chipset to interface with their CPU.

    You haven't heard this from me ;)

    • When INTC gloated that they were working on a GPU core for their CPU, NVDA shot back that customers don't really benefit from better CPUs anymore, only from faster GPUs.

      NVDA's GPUs are way way way better at parallel tasks than INTC's CPUs. INTC's chips really are only good at serial tasks. Perhaps if NVDA released a massively parallel GPU, INTC really would be in trouble?

  • Slashdot Founder CmdrTaco reportedly plans to shut down the popular Slashdot web destination. "People forget my website was once largely just a portal for enlightenment themes. The purpose of Slashdot is to report on news that matters."

    Slashdot will be replaced with a new web portal, OMGPonies, which will focus on overreacting to all pony news that someone might have remembered reading on 4chan.

    CmdrTaco has asked that the community assist him by submitting stories and drawings in the idiom of an otaku catgirl jacked up on Pocky. We were unable to substantiate the rumor that the site will feature bedazzled fan art of Hannah Montana, despite CmdrTaco's legendary collection.

    (Man, I'm just picking on that guy today!)

  • This is a highly unlikely move for NVIDIA. Check out this article for good info on why not:

    http://www.pcper.com/article.php?aid=601 [pcper.com]

    Maybe out in 5 years, but not anytime soon.

  • this is good news. (Score:3, Informative)

    by DragonTHC ( 208439 ) <Dragon.gamerslastwill@com> on Sunday August 03, 2008 @01:13AM (#24453475) Homepage Journal

    The nForce 6xx-series chipsets were a striking failure; they did not work properly.

    No motherboard manufacturer can claim its 6xx boards had few problems.

  • I always shied away from using nforce raid (instead using md-raid), since I could never be sure the array would work on newer versions (though I'm led to believe that it would actually work).

    If this is even half true, I hope it scares nvidia raid users enough to check their backups (or start making some), or perhaps purchase a second motherboard to use as replacement, while they still can.

    With md-raid, I've switched motherboards several times and the array just comes up without any trouble.

  • Then forget Intel and their closed patented Quick-Path Interconnect and make chipsets for other processors that aren't so anal about people trying to improve their products. Hummm, who could that be?
  • Is anyone brought to tears by this announcement? Is anyone other than myself rejoicing? Every Nvidia POS I ever bought was a complete piece of junk. I'd buy something worthless like the ASUS VT333 w/ RAID and get screwed by the onboard Nvidia shit. I'd swear off Nvidia but forget about it in a year or two only to find myself pulling a dumbass move and buying another Nvidia piece of shit. Of course I'd get burnt again. Nvidia and their chipsets can rot in hell for all I care. There's no love lost here
