
Nvidia Waiting In the Wings In FTC-Intel Dispute

The NY Times has a Bits Blog piece speculating on some of the fallout if the FTC prevails in its anti-competition lawsuit against Intel. The Times picks out two among the 26 remedies proposed by the regulator, and concludes that they add up to Nvidia being able to license x86 technology. This could open up 3-way competition in the market for combined CPU-graphics chips. There is a good deal of circumstantial evidence pointing to the possibility that Nvidia has been working on x86 technology since 2007, including the presence on its employment rolls of more than 70 former Transmeta workers.
This discussion has been archived. No new comments can be posted.

  • Wow. (Score:5, Insightful)

    by zippthorne ( 748122 ) on Saturday December 19, 2009 @11:28PM (#30502160) Journal

    Why does the remedy appear to be more harmful to AMD - an Intel competitor - than to Intel themselves?

    • Re:Wow. (Score:5, Insightful)

      by sdnoob ( 917382 ) on Saturday December 19, 2009 @11:37PM (#30502180)

      I don't know, but you're right. Any increased competition from another manufacturer will hurt AMD much more than Intel. AMD already has the bulk of the business from those willing to purchase non-Intel chips and an additional competitor will draw its customers from that group, not from Intel (who enjoys a large loyal following of customers who won't even consider anything else).

      • Not necessarily. (Score:3, Insightful)

        by XanC ( 644172 )

        If AMD and Nvidia can truly make competitive products, then having more of a non-Intel option makes that option seem much more mainstream.

        • by Kemeno ( 984780 )
          Not only that, but what if Intel tries to leverage their monopoly to get Nvidia out of their graphics offerings, and instead tries to bundle their processors with their own integrated graphics chipsets? One of the FTC's complaints was that Intel was doing something pretty close to this on their netbook/atom platform. If they tried it on the higher end, I could see that backfiring for them.

          A good Nvidia or AMD offering combined with Intel's abuse of their monopoly could lead to their own demise...
          • Re:Not necessarily. (Score:5, Interesting)

            by servognome ( 738846 ) on Sunday December 20, 2009 @03:46AM (#30502822)

            Not only that, but what if Intel tries to leverage their monopoly to get Nvidia out of their graphics offerings, and instead tries to bundle their processors with their own integrated graphics chipsets? One of the FTC's complaints was that Intel was doing something pretty close to this on their netbook/atom platform. If they tried it on the higher end, I could see that backfiring for them.

            You mean the same way Nvidia has integrated PhysX into their hardware and gone so far as to disable that acceleration if any card from a competitor is present?
            The move to system-on-a-chip is not an anti-competitive practice; it's the way the entire industry works. Third-party hardware solutions have long been incorporated into mainstream designs as their silicon requirements decrease. Discrete math coprocessors and memory controllers were devoured by the CPU; video decoding and physics acceleration have been integrated into GPUs.
            Why would SoCs from Intel be considered anti-competitive, while AMD Fusion and Nvidia Tegra, which are essentially the same thing, are considered innovative?
            The FTC needs to consider whether consumers would really benefit from forcing chipmakers to keep the various pieces separate for the sake of competition. The continued decline in average selling prices, combined with the increasing capability of each new generation of microprocessors, indicates that consumers are not negatively impacted by this kind of design integration.

            • You mean the same way Nvidia has integrated PhysX into their hardware

              PhysX isn't all that "integrated". PhysX is just one of several physics-simulation middleware packages on the market. The original version was compiled to run on some peculiar accelerator boards (Ageia PhysX cards); Nvidia just ported and compiled it for a different platform (their CUDA-enabled GPUs). From a technical point of view, nothing prevents a port to and compile for OpenCL. From a legal point of view: Nvidia owns the code and can do pretty much anything they want with it.

              Now, the big difference with Intel, is t

              • Intel's situation is different. Modern commercial computer games run almost only on x86 hardware. Intel has a huge market share in x86 CPUs.

                Computer games do not constitute a market. I would have agreed with you 10 years ago when x86 was the leading force behind most consumer-level computers. However, the nature of the semiconductor industry is changing.
                The traditional desktop PC is being replaced by portable computing. The industry is clearly moving to low-cost, low-power, highly integrated solutions.
                W

                • Modern commercial computer games run almost only on x86 hardware.

                  Computer games do not constitute a market. I would have agreed with you 10 years ago when x86 was the leading force behind most consumer-level computers. {...} The traditional desktop PC is being replaced by portable computing.

                  Yes, I acknowledge that the real future in gaming is in the hands of ARM + PowerVR inside some i- / Google- / Whatever- Phone. And/or inside a virtual machine like Flash Player.

                  But that's not what WoW is running on, and that's not what the companies mentioned in this article and thread usually do.
                  This story is about a change in the equilibrium between Intel, AMD/ATI and Nvidia, and thus specifically affects the landscape of *computer* gaming (hence my emphasis). And given the number of WoW subscribers and such, e

            • by Anpheus ( 908711 )

              The issue is that Intel has been slowly but surely changing the licensing on everything so that no one else will have the IP rights to put something on a motherboard with an Intel CPU, or at the very least ensuring that they have to go through a route of Intel's choosing.

              For example, Nvidia can't use QPI or DMI, and so hasn't been able to make an enthusiast chipset for the Lynnfield, Bloomfield, or Clarkdale chips so far. And they won't be able to do so for Arrandale in Q1 2010, Gulftown in H1, or other new designs later on.

              The g

            • "The FTC needs to consider whether the consumer would really benefit by forcing chipmakers to keep various pieces seperate for the sake of competition. The continued decline in average selling price, combined with the increasing capability of each new generation of microprocessor indicates that consumers are not negatively impacted by such design integration."

              The problem is not integration as it is the liscensing issues, where Intel tweaked Nehalem chips to deny Nvidia a liscense to make motherboard chipset

          • Nvidia already bundles their graphics acceleration with their own processor [nvidia.com]. The Tegra processor has plenty of power for Internet tablets, with far lower power consumption and overall system cost. I think it would be a terrible mistake for Nvidia to enter the i386-compatible market. They should instead help us build a future without all that baggage.

      • Re: (Score:1, Troll)

        by Comatose51 ( 687974 )
        Well, a more viable competitor may be bad for AMD but good for the market. AMD has really stagnated in the last few years.
        • wha? neither intel nor AMD has stagnated

        • Comment removed (Score:5, Insightful)

          by account_deleted ( 4530225 ) on Sunday December 20, 2009 @03:18AM (#30502756)
          Comment removed based on user account deletion
          • Re: (Score:3, Interesting)

            by drinkypoo ( 153816 )

            When I recently went to price Intel and AMD solutions, I ended up getting a motherboard plus CPU for less than the price of an Intel CPU with similar performance alone. And we're talking a retail "black edition" overclockable CPU, not OEM, and a motherboard with every port I could want (well, OK, there's no FW800 on it) and support for overclocking, which I haven't even messed with yet. AMD's big problem is that they are fighting the perceived technology leader: not necessarily the real one, but the one the public believes is ahead. It

            • No, Intel has been the real CPU leader since Intel Core came out. They adopt more advanced manufacturing technology faster, doing their process shrinks a year before AMD. AMD still does not use high-k metal gates in their process. AMD's CPU design is also worse in many regards: less total cache, no macro-op or micro-op fusion, no hyperthreading, etc. Intel's processors are also 4-issue instead of 3-issue. AMD will only fix these design deficiencies in Bulldozer.
              • Re: (Score:3, Interesting)

                by drinkypoo ( 153816 )

                The Intel part is superior only where the Intel part costs twice as much or more. AMD is by far the leader in price:performance, regardless of what interconnect technology Intel is using. An Intel CPU twice as fast costs at least four times as much money; that's not a win!

              • by Khyber ( 864651 )

                What are you talking about? AMD pretty much owns the x64 market, since Itanium flopped hard. QPI is *JUST NOW* catching up with HyperTransport in bandwidth capability. AMD has been able to offer fewer features on their chips and STILL get the performance needed, simply because they had the superior design (on-die memory controller, moving away from the FSB). Intel has been playing catch-up. Shit, the IGP AMD/ATI offers blows away anything Intel can put out.

                More isn't necessarily better.

                Give me a Pentium 3, 512MB of PC

                • Give me a Pentium 3, 512MB of PC3200, a decent GPU, and some people that know how to write tight optimized assembly code, and I'd laugh at anything anybody puts out today.

                  As much of a fan of assembly code as I am (and I really am), if you think writing everything in assembly on a modern computer to do typical modern-computer things is feasible, you're kidding yourself.

                  Also, if you know assembly and know enough about compilers, you can use that knowledge to write C code that, when compiled, translates to the assembly you would have written anyway (a possible exception being the use of vector units for highly specialised applications); see the sketch below.

                  When you know how it works, the c language is li
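
                  A minimal sketch of that idea (my example, not the parent's; the function name is made up): written like this, gcc or clang at -O2 emits essentially the same tight loop you would hand-write, and at higher optimization levels usually a vectorized one. You can check the output with "gcc -O2 -S sum32.c".

                      #include <stddef.h>
                      #include <stdint.h>

                      /* Hypothetical example: plain C written with the target
                       * assembly in mind. No inline asm is needed; the compiler
                       * generates the add loop itself (and will typically
                       * vectorize it at higher optimization levels). */
                      uint64_t sum32(const uint32_t *a, size_t n)
                      {
                          uint64_t total = 0;
                          for (size_t i = 0; i < n; i++)
                              total += a[i];
                          return total;
                      }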

                • by bjb ( 3050 ) *

                  Give me a Pentium 3, 512MB of PC3200, a decent GPU, and some people that know how to write tight optimized assembly code, and I'd laugh at anything anybody puts out today. The only reason we need more of anything, is because most people simply don't know how to program and rely upon high-level abstracted development interfaces and languages to achieve the same performance on beastly hardware that we got using pure assembler and plug-in math co-processors a decade ago.

                  Now apply that to video encoding.

                  Ye

                  • by Khyber ( 864651 )

                    I was going to say, if you dared mention NetBurst I'd have to smack you silly.

                    BTW, video encoding has been done on much, much weaker systems. I've been doing HD encoding (frame by frame) on a Pentium 2 at 266MHz. Sure, it may take longer to get a finished result, but the results are, for the most part, one and the same on a slower computer or a faster one.

                    Most people just don't have patience, and that's how we end up in this particular situation.

              • by warrior ( 15708 )
                AMD's chips are on a silicon-on-insulator (SOI) technology, which comes with less parasitic source/drain capacitance and no junction leakage. Switching less capacitance means you switch faster. No junction leakage means your part uses less power when it's idle. High-k will be coming soon enough for AMD; the 45nm tech is competing pretty well without it at the moment.

                AMD uses an exclusive cache architecture; Intel's is inclusive. Exclusive means that a given line exists in only one cache at a given time. Inclusive means
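
                A rough way to see the practical difference (the sizes below are made-up examples, not actual AMD or Intel specs): in an exclusive hierarchy the levels hold different lines, so their capacities add up; in an inclusive one every L1 line is duplicated in L2, so the L2 size is effectively all you get.

                  #include <stdio.h>

                  /* Illustrative arithmetic only; the cache sizes are hypothetical. */
                  int main(void)
                  {
                      const int l1_kb = 64, l2_kb = 512;
                      printf("exclusive: %d KB of distinct data\n", l1_kb + l2_kb); /* 576 */
                      printf("inclusive: %d KB of distinct data\n", l2_kb);         /* 512 */
                      return 0;
                  }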
            • Re: (Score:3, Interesting)

              Comment removed based on user account deletion
              • A Quad 925 is about on par with a Q9400 ($140 vs $190), which is great if you want to save $50 when building a midrange system; but for high-end systems, the (on the verge of being discontinued) i7 and upper Q96xx models simply have no competition from AMD, and by the time they do, Intel will have 6-core i9s on the market.

          • Agreed. I wasn't talking about technical merits but the ability to compete in the marketplace. Technical merits alone won't win in the marketplace. We've all seen many technically superior products lose in the marketplace. Unless AMD can win in the marketplace, there won't be pressure on Intel to compete more aggressively, thus lowering prices and creating better products.
      • Don't worry... nVidia will never, or at least not in the near future, be able to make x86 chips that would come even close to Intel's and AMD's performance...

        I think nVidia wants to release its own x86+GPU Ion-style platform... They realise that Intel has its own CPUs and GPUs and AMD has its own CPUs and GPUs. AMD is their main competitor, and if the market is going to shift towards hybrid processing units then this is the only way nVidia can keep up with the competition...

      • I don't know, but you're right. Any increased competition from another manufacturer will hurt AMD much more than Intel. AMD already has the bulk of the business from those willing to purchase non-Intel chips and an additional competitor will draw its customers from that group, not from Intel (who enjoys a large loyal following of customers who won't even consider anything else).

        Nonsense. I'd bet less than half of Intel's customers are loyal. Most just want the most performance.

        If AMD starts pumping out x86 quad-core CPUs with 64 GPU shaders inside, I can see them taking over the HTPC, high-end laptop, and low-end gaming machine markets.

        They might even be popular for business machines, if the price is right. It all depends on how low they can get the power consumption.

        It'll hurt both AMD and Intel.

  • by xzvf ( 924443 ) on Saturday December 19, 2009 @11:38PM (#30502182)
    In the long run, getting multiple competitors into the CPU space is good. The problem is that trust-busting worked when the targets were slow-moving oil companies or railroads; by the time this gets through the court system, the market will be significantly different. What computer were you using at the turn of the century?
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      What computer were you using at the turn of the century?

      One with an x86 instruction set. Same as now.

      • Re: (Score:1, Funny)

        by Anonymous Coward

        What computer were you using at the turn of the century?

        One with an x86 instruction set. Same as now.

        Fuck you.

    • What computer were you using at the turn of the century?

      RFC 1149 - Standard for the transmission of IP datagrams on avian carriers.

      Silly Doncha member??

    • Re: (Score:2, Insightful)

      by coaxial ( 28297 )

      Intel x86. Serving all of us since 1978.

      There's no reason to believe that this is going to change. Motorola's 68k never went anywhere, and PowerPC is dead. IBM's Cell went nowhere. AMD? Well, they make a clone, and have 15% versus 83% market share, and one-fifth the revenue [robabdul.com]. Cyrix? Well, they went belly up and got bought by NS, then Via. We're talking scraps, less than 2% of the market here.

      Oh yeah, and AMD is teetering into bankruptcy. Primo competitive environment eh?

      • AMD is teetering out of bankruptcy, and between the settlement and the extra damage to Intel, expect AMD to leapfrog Intel over the next couple of generations of CPUs. ARM, if it gains traction, will kill Intel's market.

      • Re: (Score:3, Informative)

        by jonwil ( 467024 )

        Power is dead? Tell that to Nintendo, Microsoft and Sony, all of whom are using PPC chips of various kinds in their current generation consoles.

        Cell is basically a PPC core with a bunch of specialist number-crunching coprocessors attached. And it's by no means dead, unless you consider a CPU found in every one of the 27-million-and-counting PS3 systems out there to be dead.

        I will grant that PPC is dead as a desktop CPU with x86 being the only viable solution at this point for mainstre

    • Re: (Score:3, Insightful)

      by A12m0v ( 1315511 )

      Since we are still stuck with Unix 40 years later and will still be 40 years from now, I can see us still being stuck with x86 for a long time. To the computer science graduate, they are flawed designs, but in the real world they work, and work well enough not to merit a costly change.

      Yes, there are other CPU architectures, but are they enough better to warrant a change? Even Apple, after touting the merits of PowerPC, succumbed to the x86 train. Even Intel tried multiple times to bring an alternative

      • "stuck with Unix"

        Seems to me we're stuck with Windows

        "The X server has to be the biggest program I've ever seen that doesn't do anything for you." -- Ken Thompson

        I wonder if Ken has ever seen Vista?

        • by A12m0v ( 1315511 )

          Don't get me wrong, I love Unix. Maybe "stuck" was the wrong term. I never claimed to be an English major. If you read my whole post, you'd have got the message that I believe x86 and Unix are more than good enough and will continue to be for the foreseeable future.

          Seems to me we're stuck with Windows

          "The X server has to be the biggest program I've ever seen that doesn't do anything for you." -- Ken Thompson

          I wonder if Ken has ever seen Vista?

          Yeah, I'm stuck with Windows at work. I go home to Mac OS X and Linux/X/GNU goodness; my main machine runs Blackbox on top of Debian.

          My signature is a tongue-in-cheek comment; I don't expect people to take it too seriously. Just like x86 and Unix

  • by distantbody ( 852269 ) on Saturday December 19, 2009 @11:49PM (#30502220) Journal
    Intel effectively defrauded AMD of many billions of dollars in revenue. Intel should be forced to return those ill-gotten gains to AMD and THEN be fined.

    If AMD goes bankrupt in the near future (possible, given their current uncertain situation) and Intel's unlawful actions could reasonably be considered to have led to the demise of its main competitor, Intel shouldn't be allowed to keep the benefits of its wrongdoing, namely a monopoly; it should instead be forced to establish an equivalent competitor. The FTC may indeed be acting along these lines, as Nvidia could possibly be a capable CPU producer.
    • Re: (Score:3, Interesting)

      Intel will be screwed if AMD goes bankrupt and the patents on a large part of the x86 tech fall into the hands of someone who has no desire to make x86 chips.
      Currently they cross license to avoid a patent war. AMD going bankrupt will screw Intel over big time.
      • Yeah, right, because Intel won't simply buy up AMD's patents for cents on the dollar.

        • Yeah, right, because Intel won't simply buy up AMD's patents for cents on the dollar.

          Because there aren't other companies that specialize in purchasing the patents of companies in order to sue the dominant manufacturer?

    • We have legal precedent in this case: look up how the Feds went after the United Shoe Machinery Company in the first half of the 20th century. United Shoe was notorious for using its patent portfolio on shoe-making machinery to drive out competitors, just as Intel is using its CPU and motherboard chipset patents to keep AMD/ATI at bay.

      We could see Intel hit with a multi-billion dollar fine and be forced to share information on x86 CPU and motherboard chipset technology with AMD and nVidia.

      • One good thing is that motherboard chipsets are becoming irrelevant, at least in the desktop and mobile segments. Intel is increasingly bundling the northbridge into the CPU package. Next quarter you will see several 32nm processor releases [wikipedia.org] which will make this even more evident.

        Still, if I were the FTC, I would force Intel to do two things: license the x86 ISA and its extensions, plus the CPU bus interfaces, to all comers on a RAND basis.

        • One good thing is that motherboard chipsets are becoming irrelevant, at least in the desktop and mobile segments.
          Good for Intel, maybe, but I don't think it's good for those who want better 3D performance from their laptops and small-form-factor boxes.

          Intel is increasingly bundling the northbridge into the CPU package.
          Indeed they are, and things don't look good for anyone trying to compete with Intel in the onboard-graphics market.

          For LGA775 processors (late P4 and Core 2), Intel integrated the graphics with the

          • by Agripa ( 139780 )

            Current Nehalem-based processors don't seem to have any provisions for onboard graphics at all (other than using a PCIe solution with its own memory), which would seem to be a good thing for vendors of add-in cards. However, with the next shrink it seems they are going to put graphics in the package with the CPU. I'd bet that the vast majority of Nehalem-based laptops will be using that on-chip graphics.

            Many onboard graphics solutions have used a shared memory architecture without a separate frame buffer but

    • Didn't AMD just win a billion dollars from Intel in a lawsuit? Maybe it's not the billions they were 'defrauded' of, but it seems like that's separate from any fines from the FTC.
  • Ugg... (Score:3, Insightful)

    by g0dsp33d ( 849253 ) on Saturday December 19, 2009 @11:54PM (#30502238)
    I'm jaded enough to notice that every few months someone claims so-and-so will be getting into the CPU market soon. I've heard Creative and NVIDIA, and probably some others I've forgotten. The thing that stands out to me is that VIA gave up. IBM gave up. Motorola gave up. Maybe the FTC can change things, but if they do, it will probably mean breaking a few patent barriers apart or forcing some fairly broad cross-licensing agreements. Anything monetary is really just fodder for the bankers to burn.
    • Re:Ugg... (Score:5, Informative)

      by the linux geek ( 799780 ) on Sunday December 20, 2009 @12:00AM (#30502260)

      IBM gave up?

      16-core 4GHz processor modules would like to have a word with you.

      http://en.wikipedia.org/wiki/POWER7 [wikipedia.org]

    • Re:Ugg... (Score:4, Informative)

      by bhtooefr ( 649901 ) <bhtooefr@bhtoo[ ].org ['efr' in gap]> on Sunday December 20, 2009 @12:00AM (#30502262) Homepage Journal

      VIA is still at it; they're just attacking the Atom end of the market now. That's where they were before Atom came along, but they have been developing newer processors.

      • by A12m0v ( 1315511 )

        The market that could very well turn to ARM?
        I don't see much of a future for Via. I'm not being a troll, it is just my observation.

        • Re: (Score:2, Insightful)

          by bhtooefr ( 649901 )

          ARM's problem is, quite simply, they don't have Windows, and to get the desktop, they either have to wait 10 years (and pay Microsoft to maintain a Windows port for that entire time) for a Windows port to take root, or displace Windows, too.

          I don't see the latter happening.

          • Re: (Score:3, Insightful)

            by cheesybagel ( 670288 )
            Even if you port Windows, you still need applications. Otherwise you are better off using a Linux distro where you can recompile the apps most people use yourself.
          • by Big Jojo ( 50231 )

            ARM's problem is, quite simply, they don't have Windows

            I don't see a problem there....

            ... unless you're concerned with entering the market for Windows platforms? They aren't.

            • Except they're going after the netbook market, which didn't take off until it got Windows.

              And, back when the first netbooks came out, they didn't have Windows netbooks to compete against.

              ARM is going to have one hell of a fight on their hands, for Joe Sixpack, when $50 gets you a faster CPU, faster graphics (even the GMA500, which is worse than the GMA950, is slightly faster than the fastest stuff strapped to an ARM), Windows, twice the RAM, and 18.6-37.3 times the storage, albeit with a fan and 3 hours of battery li

        • by jonwil ( 467024 )

          The problem with the switch to ARM is that for the kind of grunt you need in these sorts of machines (vs. an MP3 player or a cellphone or whatever), the price difference between ARM and x86 is not big enough for consumers to pick the ARM option.

    • Re: (Score:3, Informative)

      by sznupi ( 719324 )

      It seems you forgot ARM processors...this tiny, insignificant part of the market which, by now, perhaps ships more CPUs annually than Intel has ever produced.

      • Re: (Score:3, Informative)

        by ShooterNeo ( 555040 )
        What's the profit margin on those ARM CPUs? How much does each individual chip sell for? Oh, right, there's very little profit and the chips are dirt cheap...
        • by sznupi ( 719324 )

          This only means ARM isn't ripping you off. Doesn't change the simple fact that they are hugely successful (and also profitable, of course)

          • Re: (Score:3, Informative)

            by ShooterNeo ( 555040 )
            ARM itself has a market cap of 3.5 billion. Intel is worth, according to the market, 108 billion. Relatively speaking, ARM is a failure and doesn't make much money compared to Intel.
            • by sznupi ( 719324 )

              That's a US CEO's definition of "failure" at most.

              Interestingly, it's a market that Intel is eager to get into (remember their claims about the future of Atom? Licensing its IP on terms similar to ARM cores?)

              • It is a reasonable definition. According to those numbers, Intel could purchase all of ARM from petty cash.
                • by sznupi ( 719324 )

                  Intel could purchase almost every individual company on the planet. It doesn't make most companies on the planet a failure in any way.

                  Arguing that ARM is not successful in light of their very positive situation - yes, that is unreasonable.

                  ARM and Intel are simply different, which also means you can't really compare their "market capitalizations" directly. ARM is an IP company. How much is the IP part of Intel worth? How should we include in the comparison the fabs, chip & OEM manufacturers that

                  • The market has decided that all of ARM is only worth 3.5 bil, including the IP. Now, that measures ARM's ability to extract wealth from its intellectual property. This does NOT mean that ARM's true worth to society as a whole is really 33 times less than Intel's. I think that's what you are getting at. And of course "the market" is not really the omniscient entity that economists like to model it as, but more like an unruly mob of sheeple. Still, investors do try to put their money where they
            • But unlike Intel, what are ARM's debts? Almost non-existent, AFAIK, because they don't produce much more than the actual design of the CPU and own the patents. What's the difference between ARM and x86? Simply put, it's who can produce the damn things. Unlike the current x86 situation (Intel/AMD), there are so many ARM producers that even if one of them fails and goes bankrupt, the market is hardly impacted, thanks to their numbers and the ease of gaining a license; but if either Intel or AMD goes bankrupt, you have the

        • What's the profit margin on those ARM CPUs? How much does each individual chip sell for? Oh, right, there's very little profit and the chips are dirt cheap...

          This is what I came to say.
          If you look at the stocks of ARM [google.com] & Intel [google.com], you'll notice a massive disparity in their trading volumes.
          Intel sometimes trades more stock in an hour than ARM does in a day.
          Yes ARM sells billions of chips, but the margins are barely there.

          • What exactly are you trying to infer from the trading volumes? It just looks like more people are trading Intel because Intel is a bigger company. Also, the link to ARM you posted is for an ADR [wikipedia.org], so Google might not even be including the numbers from the native exchange. And above all, the only thing a heavily traded stock should mean is a low bid-ask spread.

            Finally, ARM doesn't sell any chips. They design them, and license the cores to companies that fab them, ie TI, Nvidia, and even Intel.

        • Intel has less profit margin in their x86 processor division than IBM has in their S/390 mainframe division, as well.
  • The investment it takes to start up a new chip line is enormous. To some extent, CPU manufacturing is like the classic steel-mill example in economics: the start-up cost is so massive that monopolies become very hard to break once someone has most of the market. This is true not just for chip manufacturing overall but even for individual classes of chips (such as the x86 architecture). If I were running Nvidia right now I'd be very worried about entering a market with massive start-up costs and where most buyers will
  • Is x86 shit? (Score:5, Interesting)

    by some_guy_88 ( 1306769 ) on Sunday December 20, 2009 @01:36AM (#30502490) Homepage

    We've been using this instruction set for years and years now. There's gotta be something better around by now. Is it ARM? Cell?

    Are Microsoft and Windows the only reasons we haven't moved on? How hard would it be for them to target a different architecture? Linux seems to manage fine in this regard: rewrite a bit of assembly and choose a different C compiler. Shouldn't be too hard, right?
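
    To illustrate the "rewrite a bit of assembly" part, here is a minimal sketch (mine, not the poster's; the timestamp() helper and the choice of counters are just illustrative): portable code usually isolates its architecture-specific pieces behind exactly this kind of compile-time switch, and everything else is plain C that any target's compiler can build. Cross-building it is then mostly a matter of pointing at a different toolchain (e.g. an arm-linux-gnueabi-gcc cross compiler).

      #include <stdint.h>
      #include <stdio.h>

      /* Hypothetical helper: the only architecture-specific code in the
       * program. The GCC-style inline assembly and predefined macros are
       * real; the surrounding program is just an illustration. */
      static uint64_t timestamp(void)
      {
      #if defined(__x86_64__) || defined(__i386__)
          uint32_t lo, hi;
          __asm__ volatile("rdtsc" : "=a"(lo), "=d"(hi));   /* x86 time-stamp counter */
          return ((uint64_t)hi << 32) | lo;
      #elif defined(__aarch64__)
          uint64_t t;
          __asm__ volatile("mrs %0, cntvct_el0" : "=r"(t)); /* ARMv8 generic timer */
          return t;
      #else
          return 0; /* portable fallback: no per-arch assembly at all */
      #endif
      }

      int main(void)
      {
          printf("tick: %llu\n", (unsigned long long)timestamp());
          return 0;
      }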

    • Re: (Score:3, Interesting)

      by Rockoon ( 1252108 )
      No, there doesn't really have to be something better. There are many ways to view things, and certainly x86 is one of the ugliest instruction sets still in use.

      But the modern x86 architecture has almost all the key features that make processors faster, and x64 adds the one thing that x86 lacked (a generous supply of registers).
    • Re:Is x86 shit? (Score:5, Insightful)

      by SpazmodeusG ( 1334705 ) on Sunday December 20, 2009 @02:12AM (#30502580)
      It's just an instruction set.

      The modern CPUs you call x86s use a non-x86 core with an instruction decoder bolted on to make it run the x86 instruction set. It has been that way since the Pentium Pro, the NexGen chips, and the AMD K5.
      The AMD K5 in particular was pretty much identical to the Am29000 RISC processor; AMD just put a decoder on it and sold it as an x86.

      CISC-type instruction sets are considered optimal for code density (better cache and memory usage), so we pretty much have the best of both worlds: the instruction set is CISC, so we get the memory benefits, and the code is run as RISC-like operations via the instruction decoder, which makes pipelining and parallelism easier.
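
      As a toy illustration of that decode step (my sketch, not how Intel's or AMD's decoders actually work internally), a memory-operand CISC instruction such as "add [mem], reg" gets cracked into the separate RISC-like micro-ops that the core actually schedules and executes:

        #include <stdio.h>

        /* Toy model only: one x86-style read-modify-write instruction is
         * split into three simple micro-ops (load, ALU op, store), which is
         * the kind of stream a modern out-of-order core pipelines. */
        typedef enum { UOP_LOAD, UOP_ADD, UOP_STORE } uop_kind;
        typedef struct { uop_kind kind; const char *text; } uop;

        static int decode_add_mem_reg(uop out[3])
        {
            out[0] = (uop){ UOP_LOAD,  "load  tmp   <- [mem]" };
            out[1] = (uop){ UOP_ADD,   "add   tmp   <- tmp + reg" };
            out[2] = (uop){ UOP_STORE, "store [mem] <- tmp" };
            return 3;  /* number of micro-ops emitted */
        }

        int main(void)
        {
            uop uops[3];
            int n = decode_add_mem_reg(uops);
            printf("add [mem], reg -> %d micro-ops:\n", n);
            for (int i = 0; i < n; i++)
                printf("  %s\n", uops[i].text);
            return 0;
        }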
    • Re:Is x86 shit? (Score:5, Informative)

      by maccodemonkey ( 1438585 ) on Sunday December 20, 2009 @02:25AM (#30502622)

      There have been quite a few different architectures, all supported by Microsoft and Windows.

      http://en.wikipedia.org/wiki/PowerPC [wikipedia.org]
      http://en.wikipedia.org/wiki/IA64 [wikipedia.org]

      Even though Microsoft abandoned PowerPC long ago (Xbox excluded), they still support IA64 to this day.

      The biggest problem hasn't really been vendor support, but compatibility. PowerPC held Apple back for the longest time because users had no good solutions for running x86 Windows apps when needed, whereas now they have WINE and native booting. IA64, while having some x86 compatibility, does not have clear enough benefits for consumers, and generally runs existing apps slower.

      Ironically enough, AMD pretty much killed IA64 and gave x86 a longer life when they came out with x86-64, thus cutting off Intel's attempt to replace x86. It was a smart business decision for AMD, but it hampered efforts to move beyond the architecture.

    • Re: (Score:3, Insightful)

      by evilviper ( 135110 )

      We've been using this instruction set for years and years now. There's gotta be something better around by now. Is it ARM? Cell?

      Actually, it's just the opposite. There WERE plenty of better architectures in the early days of x86. Today, x86 is just simply THE chip. The one that's left, competing for the high-end, pushing economies of scale, being all things to all people, and most importantly, with a healthy ecosystem of competitors continually trying to one-up each other.

      Everything but the kitchen sink

    • Do you know how much Microsoft has invested in .net?

      Platforms that huge don't just rewrite themselves to run on ARM or another architecture.

      Switching would probably cost Microsoft 100 billion or more.

      • Clarification: Porting their OS, .NET platform, and Office software would cost that much, and take a lot of time.

  • I still think they're using Transmeta's engineers to run x86 code on their GPUs so they can get Windows to run on systems with other ISAs for their CPU. ARM and POWER, anyone? It sounds much cheaper and simpler than doing the insane amount of testing needed to roll out a new chip, and you'd get the added benefit of accelerating your everyday applications without needing to recompile them for CUDA. Plus NVIDIA will have the advantage of being the first ones out there with SSE5. So BAM!

  • by Vigile ( 99919 ) * on Sunday December 20, 2009 @02:12AM (#30502578)

    I posted some of my thoughts on this topic here:

    http://www.pcper.com/comments.php?nid=8143 [pcper.com]

    Why would NVIDIA want to dive into such a complex product line when the GPU is becoming more and more important in general-purpose computing anyway, and that is obviously where their expertise is?

    • by JDeane ( 1402533 )

      I agree, and Windows should support some sort of DirectX standard for accelerating the OS on graphics cards (and I mean more than the GUI in Vista/7, lol).

      It would be nice to put that power to use for things like video conversion (I know you can already do this in a limited fashion, but if it were built into some sort of framework, things would go better).

      I know Flash just got some beta support for hardware acceleration, so that's at least some improvement.

      By the way, awesome article. I am more of an ATI man myself, but

  • by rsmith-mac ( 639075 ) on Sunday December 20, 2009 @03:03AM (#30502720)

    I suppose the NYT could be right, in the sense that they see NVIDIA getting an x86 license out of this in the same way that conspiracy theorists see that the Apollo 11 landings were filmed on a soundstage.

    There's nothing about remedy 17 or remedy 18 that would lead to NVIDIA getting an x86 license directly from Intel. In short:

    17: Intel has to license its chipset buses to other companies (e.g. NVIDIA) so that they can make chipsets for Intel's newest CPUs. NVIDIA only has an AGTL+ license for older Core 2 CPUs; they don't have one for DMI (low-end and mid-range Core i3/i5/i7) or QPI (high-end Core i7).

    18: Intel can't get in the way of AMD's efforts to spin off their fabs into Global Foundries. Up until AMD and Intel inked their own settlement, Intel intended to enforce provisions of AMD's x86 license that required AMD to do the vast majority of its production in-house, which wasn't going to be possible once the fabs were spun off.

    The only way NVIDIA could end up with an x86 license out of this is if remedy 18 allowed VIA to transfer its x86 license, and in reality Intel has never fully acknowledged that VIA even has one. VIA only gets away with it because they have a couple of patents that are critical to Itanium, and those patents should be expiring soon.

    So I don't know why the NYT is claiming that NVIDIA is going to get an x86 license out of this. This seems to be wild dreaming, or an attempt to generate traffic with ridiculous claims.
