
Arch-rivals Intel and AMD Team Up on PC Chips To Battle Nvidia (pcworld.com) 169

Intel and AMD, arch-rivals for decades, are teaming up to thwart a common competitor, Nvidia. On Monday, the two companies said they are co-designing an Intel Core microprocessor with a custom AMD Radeon graphics core inside the processor package. The chip is intended for laptops that are thin and lightweight but powerful enough to run high-end videogames, the companies said. From a report: Executives from both AMD and Intel told PCWorld that the combined AMD-Intel chip will be an "evolution" of Intel's 8th-generation, H-series Core chips, with the ability to power-manage the entire module to preserve battery life. It's scheduled to ship as early as the first quarter of 2018. Though both companies helped engineer the new chip, this is Intel's project -- Intel first approached AMD, both companies confirmed. AMD, for its part, is treating the Radeon core as a single, semi-custom design, in the same vein as the chips it supplies to consoles like the Microsoft Xbox One X and Sony PlayStation 4. Some specifics, though, remain undisclosed: Intel refers to it as a single product, though it seems possible that it could eventually be offered at a range of clock speeds. [...] Shaking hands on this partnership represents a rare moment of harmony in an often bitter rivalry that began when AMD reverse-engineered the Intel 8080 microchip in 1975.
This discussion has been archived. No new comments can be posted.

Arch-rivals Intel and AMD Team Up on PC Chips To Battle Nvidia

  • There it is (Score:4, Insightful)

    by DontBeAMoran ( 4843879 ) on Monday November 06, 2017 @10:27AM (#55499113)

    This is what Apple should be using in future Macs. Maybe they knew of Intel's plans; that's why the MacBook Air and Mac mini haven't really been updated in such a long time. Those are the two Macs that will have this new CPU first.

    • Re: (Score:2, Interesting)

      by tsa ( 15680 )

      This is what Apple should use in future iPhones. A phone that runs an OS that is compatible with both macOS and iOS, that can connect to a keyboard and monitor and can be used as a PC in that way. They already working on that already and I wouldn't be surprised if some of Apple's money is quietly going towards AMD and Intel's new project.

      • by tsa ( 15680 )

        Argh, I meant 'They are probably working on that already.'

      • It would also explain why Tim Cook is so focused on the iPhone and iPad while almost dismissing the Macs.

        If the future is an iPad that runs macOS and is as powerful as a MacBook/Air/Pro, I can see the point.

        • by Anonymous Coward

          iOS will continue to evolve and acquire features that make it more suitable as a macOS replacement for some users, but don't hold your breath waiting for convergence. The two UI models are fundamentally incompatible. If you want a kludge that combines pointer and touch (but isn't particularly good at either), talk to Microsoft.

        • by Junta ( 36770 )

          Well also, financially the Macs are but a blip compared to iPhone sales. Even iPads haven't seen consistent investment compared to the iPhone in recent history (a reasonable reaction to iPad sales trends).

          In the wider market, the tablet form factor has in general tanked relative to the traditional laptop form factor. The 'two in one' form factor has a very vocal fanbase and logically would *seem* like the best of both worlds, but even there traditional laptops have higher sales (lower prices drive, tho

        • Apple has been bitten once before by using chips that are slow and power hungry. The G5 Macs were well known to run very hot, and I expect Apple sees the newer Intel chips in the same light. Bringing chip design in-house allows Apple to design its CPUs and GPUs as it sees fit, in the full knowledge that if its chips become slow and hot, it can only blame itself. And it's not like Apple doesn't have enough money to throw at the problem.
      • Re:There it is (Score:5, Insightful)

        by Kjella ( 173770 ) on Monday November 06, 2017 @11:05AM (#55499391) Homepage

        This is what Apple should use in future iPhones. A phone that runs an OS that is compatible with both macOS and iOS, that can connect to a keyboard and monitor and can be used as a PC in that way. They already working on that already and I wouldn't be surprised if some of Apple's money is quietly going towards AMD and Intel's new project.

        If you think Apple is walking away from their own in-house A11 chip you're nuts. They've consistently outperformed all other ARM chips in single-threaded performance, and in their power envelope they're class leading, while Intel repeatedly tried and failed, and eventually gave up on selling a compelling phone chip. The question is rather when they decide the as-yet unreleased A11X tablet version is ready to go in a convertible/laptop form factor. The connectors are no problem: the phone already talks USB over Lightning, and the wireless streaming is the same, only compressed and without a physical port. They probably have all the relevant bits merged to make an ARM laptop, though knowing Apple it'll probably be a walled garden.

        • Keep in mind the A11 is double the die size of most cellphone chips, giving Apple an advantage. Apple doesn't have some secret ARM sauce.
      • In short, "Macs" can be a "dock" providing keyboard, mouse, local storage, display and USB connectivity, with the iPhone/iPad providing the "CPU".
    • Asking Slashdot: should I care, at the software performance level, whether I have an AMD or an Intel?

      It's been over a decade since I bought a big AMD cluster. I regretted that because I found that, at that time, while some code did run equally well on those chips, in general the software libraries just weren't tuned as well for AMD. Many optimizations weren't taken.

      The main issue was that makefiles were just defaulting to generic i386 (this was pre-IA64) and not using special instructions. SIMD support w

      • by raymorris ( 2726007 ) on Monday November 06, 2017 @11:12AM (#55499445) Journal

        Putting aside AMD's very newest chip for a moment, there are basically three different kinds of use cases:

        A) I want the best performance I can get within my $X budget.

        B) it's a server serving many clients (lots of threads)

        C) It's a single thread and I don't care how much it costs because I'm spending taxpayer money, I want the very fastest single-thread performance, cost be damned

        Intel specializes in case C. Raw single-thread performance, cost be damned.

        AMD will give you more cores for the dollar, so it competes well in case B, servers running many threads. AMD also traditionally costs significantly less, so it fits case A, getting the best CPU you can within a certain price range.

        That's a generalization, though. It's best to compare one CPU model to another, evaluating based on the needs of your specific application and budget.

        • "I don't care how much it costs because I'm spending taxpayer money"

          And this ladies and gentlemen is precisely what is wrong with government.

        • Well, actually D), which is like C). I don't want to get into a situation where I have to carefully study all the arcane nuances to get the best results. My time has value, so I don't want to have to become an architecture wizard just to do the rest of my job. I don't have my own IT dept.

          • If you're not willing to spend a few seconds thinking about whether your workload is multithreaded, and you are willing to spend more money than needed, get Ryzen.

            A difference between C and D is that, in the case of D, while you're willing to spend 10 times as much money as you should, that still doesn't tell you whether you should spend lots of money on 16 AMD cores or on 4 Intel cores.

            If your workload is heavily multithreaded (servers), AMD will likely give you the best performance at *whatever* your budget is.

        • by MobyDisk ( 75490 )

          In B) they want price per watt. Intel has done much better than AMD on price per watt [cpubenchmark.net]. AMD hasn't had a server chip in 6 years. [extremetech.com]

          • You probably mean performance per watt. I was actually thinking of (D) best performance in a laptop, which is why I have Intel laptops and AMD everywhere else. Of course, there are other good reasons to want power efficiency besides mobile uses. (In my case, cooling/heat in laptop form factor is much more of an issue than battery life.)

            BTW, am I the only one who cringes at all these unmatched parentheses like "A)", "B)" and "C)"?

            • by MobyDisk ( 75490 )

              Yes, that is what I meant. Thank you for correcting that, rather than calling me an idiot which is what usually happens on Slashdot.
              P.S. Those aren't unmatched parens: they are closed in a post further down on the page. Just keep reading. I would close them myself right here, but then that will produce an error on that other person's post.

        • Putting aside AMD's very newest chip for a moment, there are basically three different kinds of use cases:

          A) I want the best performance I can get within my $X budget.

          B) it's a server serving many clients (lots of threads)

          C) It's a single thread and I don't care how much it costs because I'm spending taxpayer money, I want the very fastest single-thread performance, cost be damned

          Intel specializes in case C. Raw single-thread performance, cost be damned.

          AMD will give you more cores for the dollar, so it competes well in case B, servers running many threads. AMD also traditionally costs significantly less, so it fits case A, getting the best CPU you can within a certain price range.

          That's a generalization, though. It's best to compare one CPU model to another, evaluating based on the needs of your specific application and budget.

          Intel must have a second source for its products. AMD was the patent holder for the virtual logic that Intel licensed. AMD needed a second source for its chips. Intel is good at reducing chip lines and circuitry to nanometers. Intel is not a design king. They buy designs. AMD is a vendor and also a purchaser of designs.

      • by amorsen ( 7485 )

        And many libraries I had to use were pre-compiled to generic specs rather than optimized.

        Or they were compiled with icc, which adds a check for whether the emitted code is running on an Intel chip and jumps to generic unoptimized code if not. If it IS an Intel chip, the code does the usual flag checks to see which modern instructions are available and runs the appropriate code, but on non-Intel it doesn't get as far as checking the flags.

        • Intel gets a lot of bad press for this, but it's worth remembering why they did it: a bunch of x86 chips advertised the relevant CPUID feature flags and then either didn't implement the instructions correctly (IDT, I'm looking at you), or (in AMD's case) implemented them entirely in slow microcode so that the fast paths ended up being slower than the versions that used the older instructions. AMD complained when they emitted code that used the newer instructions that were much faster on Intel chips than
          • by amorsen ( 7485 )

            It definitely is worth remembering. I have never heard that side of the story. Do you have any references for this?

            I have read stuff like this one:

            "The CPU dispatch, coupled with optimizations, is designed to optimize performance across Intel and AMD processors to give the best results. This is clearly our goal and with one exception we believe we are there now. The one exception is that our 9.x compilers do not support SSE3 on AMD processors because of the timing of the release of AMD processors vs. our co

            • The compiler community is pretty small. I know a few people who worked for Intel's compiler teams until quite recently (I don't know anyone who still does - they've been scaling down their compiler teams for quite a while now). They got a lot of bad press back in the '90s for emitting instructions that were either much slower on non-Intel systems (IDT in particular, but also Cyrix implemented a few things that ICC liked to use in microcode, whereas on the Pentiums they were 1-2 cycle instructions), or hit
      • by UnknownSoldier ( 67820 ) on Monday November 06, 2017 @12:03PM (#55499819)

        > I regretted that because I found that, at that time, while some code did run equally well on those chips, in general the software libraries just weren't tuned as well for AMD. Many optimizations weren't taken.

        Part of that was due to Intel's shenanigans.

        Intel's "cripple AMD" function in their compiler [agner.org]

        Unfortunately, software compiled with the Intel compiler or the Intel function libraries has inferior performance on AMD and VIA processors. The reason is that the compiler or library can make multiple versions of a piece of code, each optimized for a certain processor and instruction set, for example SSE2, SSE3, etc. The system includes a function that detects which type of CPU it is running on and chooses the optimal code path for that CPU. This is called a CPU dispatcher. However, the Intel CPU dispatcher does not only check which instruction set is supported by the CPU, it also checks the vendor ID string. If the vendor string says "GenuineIntel" then it uses the optimal code path. If the CPU is not from Intel then, in most cases, it will run the slowest possible version of the code, even if the CPU is fully compatible with a better version.
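
        For illustration, here's a minimal C sketch of that kind of vendor-gated dispatch (using GCC/Clang's <cpuid.h>; the kernel names are made up and this is not Intel's actual code, just the pattern Agner describes):

            #include <cpuid.h>   /* GCC/Clang wrapper around the x86 CPUID instruction */
            #include <stdio.h>
            #include <string.h>

            /* Stand-in kernels; a real math library would have hand-tuned variants. */
            static void kernel_generic(void) { puts("generic x86 code path"); }
            static void kernel_sse3(void)    { puts("SSE3 code path"); }

            static int vendor_is_genuine_intel(void)
            {
                unsigned int eax, ebx, ecx, edx;
                char vendor[13] = {0};
                if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
                    return 0;
                memcpy(vendor + 0, &ebx, 4);  /* vendor string comes back in EBX, EDX, ECX */
                memcpy(vendor + 4, &edx, 4);
                memcpy(vendor + 8, &ecx, 4);
                return strcmp(vendor, "GenuineIntel") == 0;
            }

            static int cpu_has_sse3(void)
            {
                unsigned int eax, ebx, ecx, edx;
                if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
                    return 0;
                return (ecx & bit_SSE3) != 0;  /* bit_SSE3 is defined by cpuid.h */
            }

            int main(void)
            {
                /* Vendor-gated dispatch: the feature flag is only consulted after the
                   vendor check passes, so a non-Intel CPU that supports SSE3 still
                   falls through to the slowest generic path. */
                if (vendor_is_genuine_intel() && cpu_has_sse3())
                    kernel_sse3();
                else
                    kernel_generic();
                return 0;
            }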

        • by cfalcon ( 779563 )

          Even today, Intel admits to this when you look at icc. I think they only have that disclaimer because of the lawsuit: to my knowledge they are still running really slow code on AMD on purpose.

        • > I regretted that because I found that, at that time, while some code did run equally well on those chips, in general the software libraries just weren't tuned as well for AMD. Many optimizations weren't taken.

          Part of that was due to Intel's shenanigans.

          Intel's "cripple AMD" function in their compiler [agner.org]

          Unfortunately, software compiled with the Intel compiler or the Intel function libraries has inferior performance on AMD and VIA processors. The reason is that the compiler or library can make multiple versions of a piece of code, each optimized for a certain processor and instruction set, for example SSE2, SSE3, etc. The system includes a function that detects which type of CPU it is running on and chooses the optimal code path for that CPU. This is called a CPU dispatcher. However, the Intel CPU dispatcher does not only check which instruction set is supported by the CPU, it also checks the vendor ID string. If the vendor string says "GenuineIntel" then it uses the optimal code path. If the CPU is not from Intel then, in most cases, it will run the slowest possible version of the code, even if the CPU is fully compatible with a better version.

          Nobody's forcing you to use the Intel compiler though. Use the other well established standard compilers.
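
          For instance, GCC's function multi-versioning dispatches on CPUID feature flags rather than the vendor string. A small sketch (assuming GCC 6 or newer targeting x86 on Linux; dot() is just a stand-in function):

              #include <stdio.h>

              /* GCC builds one clone of the function per listed target and installs an
                 ifunc resolver that picks the best clone at load time based on feature
                 flags alone, so AMD and Intel chips with the same features get the same
                 code path. */
              __attribute__((target_clones("avx2", "sse4.2", "default")))
              double dot(const double *a, const double *b, int n)
              {
                  double s = 0.0;
                  for (int i = 0; i < n; i++)
                      s += a[i] * b[i];
                  return s;
              }

              int main(void)
              {
                  double a[4] = {1, 2, 3, 4}, b[4] = {4, 3, 2, 1};
                  printf("dot = %f\n", dot(a, b, 4));
                  return 0;
              }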

      • I jumped back up here from my answers further down in the thread to see if you have any hints about your use case.

        You can turn on GPU rendering in Blender, in which case your GPU becomes more important and your CPU becomes less important. This article is about Intel using graphics technology from AMD because AMD is so far ahead in GPUs, but you likely have a separate video card.

        Blender uses threads efficiently, meaning CPU cores, so for Blender you want a CPU with at least 8 cores. That favors AMD for Ble

        • Actually I'd love to hear your thoughts. I'm in the process of buying my first new desktop in years. Since I stopped configuring my own clusters I've mainly used laptops at home and whatever was on the purchase approved list at work. Now I'm buying a workstation class computer for home and don't know the best thing to do.

          On my list are Ryzen 7 and 5, and Intel i7s.
          Here's an example of a couple I'm considering:
          https://www.aliexpress.com/ite... [aliexpress.com]
          http://www.magicmicro.com/smor... [magicmicro.com]
          https://www.magicmicro.com/pr [magicmicro.com]

          • I would agree; I would probably avoid the AliExpress one because the GPU is important to you. "Lots of animation renders" sounds like you may want to look at whether the software you use does GPU rendering and, if so, via which API - OpenGL? If OpenGL, most of the major brands will have reasonably good support.

            Other than that, I'm more of a software guy; I don't stay up on the latest hardware. I just know, from a software perspective, that some software will take full advantage of multiple cores, some will not

            • oooh good tip on the socket. That would take some of the sting out of buying this just to try it out.

              I see that SSDs come in two flavors: one comes in a box package like a hard drive and tops out at 500MB/sec, and then there are ones in an M.2 format that go 2000MB/sec and look like they fit in a small slot. These aren't more expensive either. So I wonder why they aren't preferred (smaller and faster at the same price - duh?). What's the catch?

              The other thing I'm pondering is heatsinks. I'm not sure how one

              • Yeah, the M.2 PCIe drives are attractive. I would have bought one two days ago if it weren't for the fact that the machine I was putting the drive in only has two PCIe slots. You'll need a $12 adapter to put that drive in a PCIe slot (unless you get a mobo with an M.2 NVMe slot). M.2 has pins for PCIe, USB, and SATA; a particular device may use/require any of those interfaces.

                You may have a couple second delay at boot while interface is initialized, which doesn't seem like much but it may eat up the e

              • I forgot to comment about heat sinks. The engineers at AMD selected a heat sink (hunk of metal) that would make the CPU look good. From what I've seen, they weren't stupid enough to cheap out on that. I'd use what they selected and put in the box with the CPU (though you can also buy the CPU without a cooler included). AMD engineers spec'd the AMD Wraith Spire heat sink. It looks like a good option.

                Choices in heat sinks and fans can also be affected by size, fan noise, ambient temperature in the room, c

    • by ReneR ( 1057034 )
      maybe that is what Apple _demanded_, ..! ;-)
    • by bongey ( 974911 )
      I predicted this in another forum, down to the EMIB and the HBM2, and the second part: Apple is likely pushing them together, and an Apple laptop will ship first.
  • by Anonymous Coward

    This is like, "Hell has frozen over" kind of news

    Am I the only one who smells this as very "Apple"-wanted?

    I don't think AMD will be giving up any GFX secrets; more likely this is AMD shipping Intel a mobile GFX die to be integrated within the same CPU package.

    But in any case, why not just have Intel ship a mobile CPU without an iGPU and a separate GPU?

    And AMD, why now? When Zen is doing great, has a great roadmap and potential, along with much better GFX than Intel. Why?

    • by Anonymous Coward

      You might be too young to remember this, but Intel and AMD have a long history of working together. AMD used to fab chips for Intel.

      • by Anonymous Coward

        Intel licensed x86 technology to multiple companies to get IBM to use the 8088 in the original PC. By the time the 80386 came out, IBM was not in control of the PC platform, and Intel refused to license the 32-bit "parts" to anyone. This gave them years to pull ahead and eliminated all competition except AMD. If not for the commercial failure of their 64-bit extensions and the need to license AMD's, there would be no competition. They keep AMD alive but near death.

        • Itanium was a completely new chip design, with limited compatibility with x86. Intel never had plans to make any 64-bit extensions to x86, opting to design a completely different architecture as its entry into the 64-bit processor market. AMD saw this as an unbelievable oversight, and designed its own 64-bit architecture based heavily on (and fully compatible with) the x86 instruction set. This gave AMD a near-total monopoly on the 64-bit PC market for a year or so, whilst Intel scrambled to design its own c
          • Itanium was a completely new chip design, with limited compatibility with x86. Intel never had plans to make any 64-bit extensions to x86, opting to design a completely different architecture as its entry into the 64-bit processor market. AMD saw this as an unbelievable oversight, and designed its own 64-bit architecture based heavily on (and fully compatible with) the x86 instruction set. This gave AMD a near-total monopoly on the 64-bit PC market for a year or so, whilst Intel scrambled to design its own compatible architecture to compete.

            Intel was misstepping majorly on Itanium, and on x86 they were throwing money away on NetBurst (Pentium 4, D, etc.), which had terrible performance per clock compared to even a Pentium III. It also had terrible performance per watt.

            AMD happened to release x86-64 and a fairly decent K8 architecture that kept them as market leaders until Intel threw NetBurst wholesale into the garbage in favor of the Israeli-developed side project "Pentium M", which later developed into the Core and Core 2 platforms. Though in

    • Intel's graphics cores are way behind AMD's, and AMD's chip processes and CPU designs are not quite as good as Intel's. If they put their technologies together, we'll have the best of both worlds in one single chip, and the important part you're missing is that AMD *AND* INTEL would make money off of this combined chip. It's intended for low-power, high-performance gaming, so it's obviously not going to undercut the desktop CPU market that AMD has just made exciting again, and it won't replace the "cheap chips
      • by mark-t ( 151149 )

        The best of both worlds is not a big deal when neither of the worlds is really any better than an alternative in one of the categories.

        NVidia trounces both on graphics. Not to mention far more capable Linux support than either has (even if NVidia's offering is not open source, it's still pretty damn sweet).

    • This is like, "Hell has frozen over" kind of news

      I guess all industries consolidate once they mature. Less freezy than Micros~1 supporting Linux though!

      Am I the only one who smells this as very "Apple"-wanted?

      I doubt they're upset, but does Apple ship enough PC units to make a difference in this regard with the likes of Intel? Sure, Apple is huge, but most of the revenue is in iStuff and the store now (IIRC). They are big enough to make AMD bite, I expect (cf. custom dies for games consoles), but Intel typ

      • Looks great for desktop and server use. Does AMD have a good mobile offering at the moment?

        Yes, the recently released [anandtech.com] Raven Ridge [wikipedia.org] aka Ryzen Mobile.

        My personal hope is for AMD to release some low-power APUs that fit between mobile and 'classical' desktop applications. Say, AM4 socket parts with around a 30W TDP to go on affordable mini-ITX boards for SFF PCs, home theatre, all-in-ones and such. Not that I would mind even lower-power mobile parts, but those tend to be thin on the ground in terms of availability for DIY builds (e.g. a separate APU + motherboard purchase). And there's quite some s

    • "Why not just have Intel ship a Mobile CPU without iGPU and a Separate GPU."

      Because a GPU on die is much faster and more efficient. That external bus might be high bandwidth, but the latency is a killer compared with sitting on the same die. It's many of the same reasons AMD managed to coast along against Intel's much smaller and more efficient chips for a while by putting the memory controller on the die.

      "And AMD, why now? When Zen is doing great, has great roadmap and potential, along with much better GFx then I
  • Hell (Score:4, Insightful)

    by Luthair ( 847766 ) on Monday November 06, 2017 @10:29AM (#55499137)
    has frozen over.
  • The first two words of the article are really "Absolutely Incredible!"? It's news. It's interesting. Incredible? I don't know if I'd say it's absolutely incredible that two companies are working together to bring a product to the market. I think PCWorld might be a bit too excited about this and forgot about actual journalism.
  • by DontBeAMoran ( 4843879 ) on Monday November 06, 2017 @10:41AM (#55499245)

    What's next? Sonic on Nintendo consoles? Square-Enix games on computers?

  • by Anonymous Coward on Monday November 06, 2017 @10:51AM (#55499309)

    About 5 years ago INTC tried to buy NVDA. They had enough money to do it, and the offer was going to be reasonable, but there was a sticking point about who would become CEO of the combined company. Paul Otellini of Intel was about to step down, and the assumption from NVIDIA's Jensen Huang was he would become the CEO of the combined Intel-NVIDIA. But Intel's board wasn't going to have it and promoted Brian Krzanich to CEO instead. And that's the story of how Intel managed to lose a ton of money and missed opportunities in 3D graphics and Compute.

  • The linchpin of the Intel-AMD agreement is a tiny piece of silicon that Intel began talking up over the past year: the Embedded Multi-die Interconnect Bridge, or EMIB. Numerous EMIBs can connect silicon dies, routing the electrical traces through the substrate itself. The result is what Intel calls a System-in-Package module. In this case, EMIBs allowed Intel to construct the three-die module, which will tie together Intel's Core chip, the Radeon core, and next-generation high-bandwidth memory, or HBM2.

    AMD sells Intel bare dies that talk EMIB. The interesting thing is that Intel could do a deal with NVidia to supply GPU dies which use the same interface. Well, except that Intel pays NVidia licence fees, whereas the AMD-Intel patent licensing agreement is completely one-sided - AMD pays Intel but Intel gets IP rights to anything AMD invents for free.

    It's not like AMD is selling Intel a synthesizable core or even a hard macro. And Intel being Intel they probably pay people to do competitor analysis on AMD stuff an

    • Intel claim 20x better power efficiency for EMIB compared to PCIe chip to chip here.

      http://www.tomshardware.com/ne... [tomshardware.com]

      https://i.imgur.com/q4cxMtU.jp... [imgur.com]

    • by dfghjk ( 711126 )

      "Well except that Intel pays NVidia licence fees whereas the AMD Intel patent licensing agreement is completely one sided - AMD pays Intel but Intel gets IP rights to anything AMD invents for free."

      What a gross misrepresentation of a cross-license agreement. Intel doesn't get AMD IP "for free", it's CROSS-licensing. Many cross-license agreements include cash considerations in one direction, it would be surprising if it weren't so.

      Lower your fanboy ranting a level.

      • Lower your fanboy ranting a level.

        I prefer Intel CPUs and NVidia GPUs given a choice.

        What I said is true for x64.

        https://www.cnet.com/au/news/a... [cnet.com]

        The lawsuits started in 1987. Rich Lovgren, former assistant general counsel for AMD, recalled that AMD founder Jerry Sanders sat through "every second" of one of the trials. "There were certainly bridges that were burned," he said.

        Under the terms of the settlement, both companies gained free access to each other's patents in a cross-licensing agreement. AMD agreed to pay Intel royalties for making chips based on the x86 architecture, said Mulloy, who worked for AMD when the settlement was drafted. Royalties, he added, only go one way. AMD does not get to collect royalties from Intel for any patents Intel might adopt.

        AMD also agreed not to make any clones of Intel chips, but nothing bars Intel from doing a clone of an AMD chip, Mulloy added.

        While the terms may seem one-sided, AMD has benefited from the agreement as well. Without the clean and enforceable right to make x86 chips granted by the agreement, AMD would not have been able to produce the K6, K6 II, K6III, Athlon, Duron, Athlon 64 or Opteron chips without fear of incurring a lawsuit.

        Intel probably doesn't have access to the graphics patents AMD picked up when it bought ATI, though. There were rumours it would license them, which it denied.

        https://www.extremetech.com/ex... [extremetech.com]

        Intel, however, has reached out to put the kibosh on such rumors. In a statement sent to Barrons, Intel stated, "The recent rumors that Intel has licensed AMD's graphics technology are untrue." The company has said that further information will not be provided.

        Of course, if it buys AMD GPU dies and puts them in the same package as Intel chips, it doesn't need to license all AMD's graphics patents, just agree on a price for the dies. It also doesn't need to hire a bun

  • Makes sense. Intel graphics are still a failure.

    Remember when the industry panicked when Intel bought Chips & Technologies and the Real3D patents?

    That didn't go so well. Who else had a shoebox full of Intel i740 cards bought at fire-sale prices?

    • It depends what you mean by failure. For gaming, yeah. But then gamers are always going to have a discrete GPU.

      For people who don't game and want long battery life, onboard graphics are better because they're lower power.

      I suspect Intel know if they keep the non gamers happy with a good CPU with 'good enough' graphics they'll sell a lot of chips. And for non gaming 'good enough' graphics isn't that hard to do. Gamers will more than likely buy an Intel CPU and pair it with discrete graphics provided Intel's C

      • by Khyber ( 864651 )

        "And for non gaming 'good enough' graphics isn't that hard to do."

        You say this but on a brand new i3 with a GTX970 I get lag just scrolling most websites. People don't know how to code properly any longer.

    • by PRMan ( 959735 )
      I don't agree. We had laptops with AMD graphics. Even when we aren't using them (using Intel graphics only) the laptop still blue screens because of driver issues. No thanks. Don't want Radeon graphics anywhere near my laptop. Intel or Nvidia only.
    • Comment removed based on user account deletion
      • > ... but the "discrete graphics" offerings are generally no better than what's built into a modern Intel Core CPU unless you go for a laptop specifically aimed at gamers.
        > ... Games seem to have plateaued in terms of the GPU power they need, and Intel's graphics are, as a result, "good enough" for a higher and higher percentage of new games. I debated using the graphics built into my new i5 last year based on tests showing me that overall performance with both GTA V and Skyrim was no different to my

        • by MobyDisk ( 75490 )

          I'm not buying it.

          Not buying what?

          Your post claims that $700 discrete GPUs are much better than Intel integrated GPUs, especially in high-end scenarios. That's obvious, and nobody said otherwise.

    • by Kjella ( 173770 )

      Makes sense. Intel graphics are still a failure.

      From a business perspective? I doubt it. They basically drained the bottom out of the market by force-bundling it with their CPUs, weakening AMD and nVidia considerably. Will a 15-45W CPU+GPU ever "catch up" to a dGPU that draws a few hundred watts alone? Obviously not. But they took almost the whole non-gamer market (71% by units) and according to this article 22% [pcgamer.com] of Overwatch players use Intel integrated graphics. Consider then non-competitive games like Civilization etc. and you'll realize not everyone n

    • by mjwx ( 966435 )

      Makes sense. Intel graphics are still a failure.

      So much of a failure that they're in almost every laptop made. The last two laptops I bought had NVidia and Intel chips in them (Intel for low power consumption, NVidia for games). They aren't good for gaming but they're fine for everyday use, which is why no one cares that their laptop hasn't got an NVidia chip in it (and it's hard to find a reasonably priced laptop with an NVidia chip in it).

      Now in desktop gaming, NVidia pwns the graphics market, Intel pwns the CPU market.

  • I mean, now that they have a real chance of making a good Ryzen based APU they join forces with Intel? That surely doesn't make sense.

    I wonder what the price will be like. Will it be Intel like (read: too much) or AMD (eternal underdog) like?

    • I think it will be Mac-only like, at least at the beginning.

    • by Ramze ( 640788 )

      It's a brilliant move. If it's successful, they will control the GPU-on-CPU market for the 64-bit x86 platform globally. Intel still holds the majority of sales and it'll be painful to compete with them directly, as Intel has enough cash to match AMD at any price point as well as enough volume and customers to out-produce and out-market AMD.... but... if AMD gets a portion of every Intel sale because of the GPU, they get a lot of cash for just a bit of support work and help to raise brand awareness for

  • Didn't see that coming. I can think of a couple other tech companies I would like to see work together on projects. Not quite on topic, but I would still like to see Microsoft buy the BB10 OS, spend a year working with it, NOT fuck up the still awesome interface and bring me a phone I actually want. I use an S8+ now, but the BB Classic is still the best phone I have ever used. I never had the slightest problem running side loaded Android apps. If whatever framework was behind that can be maintained and deve
    • by sl3xd ( 111641 )

      The BB10 OS is derived from QNX, which BlackBerry bought when they realized their own OS was a POS that couldn't scale into the future. (Very much like Apple declaring Copland a lost cause [wikipedia.org] and deciding between BeOS and NeXTSTEP in 1996.)

      QNX is one of the premier embedded RTOS's, and is used in the majority of the automotive industry, including most automobile infotainment systems.

      Licensing fees from QNX are probably one of BlackBerry's largest remaining revenue sources at this point.

  • Title erroneously leads one to believe that Intel and AMD are so terrified of PC Chips (now ECS) that some teamwork is in order...
  • Lately, AMD made news for their well-received Zen CPUs and lackluster Vega GPUs.
    So... let's pair an Intel CPU with an AMD GPU...

    The AMD Raven Ridge GPU performance is said to be on the same level as Intel's current offering, which is pretty bad, but the CPU is quite good.

    TBH, it kind of makes sense: Intel CPUs have good single-thread performance and dedicated AMD GPUs are better than Intel's offering, so combining them can be good for mid-range gaming. But still, weirdest partnership ever...

  • Wouldn't something like this fall afoul of cartel laws?

    It's not like nVidia's putting out desktop processors or anything.

  • What about more PCIe, Intel???

  • Except it's not. This really is amazing news to me, and they definitely will blow a big hole in the budget gaming market.

  • Release your register level specs and get proper Linux kernel support, like a professional product.

  • What good is a top notch GPU if it shares memory access with the CPU? I would assume it matters in some cases, but I'm not sure to what extent. As I do a lot of iterated render to texture, I feel much safer with a fairly basic discrete GPU with its own fast RAM, than the latest and greatest integrated one.
