Arch-rivals Intel and AMD Team Up on PC Chips To Battle Nvidia (pcworld.com)
Intel and AMD, arch-rivals for decades, are teaming up to thwart a common competitor, Nvidia. On Monday, the two companies said they are co-designing an Intel Core microprocessor with a custom AMD Radeon graphics core inside the processor package. The chip is intended for laptops that are thin and lightweight but powerful enough to run high-end videogames, the companies said. From a report: Executives from both AMD and Intel told PCWorld that the combined AMD-Intel chip will be an "evolution" of Intel's 8th-generation, H-series Core chips, with the ability to power-manage the entire module to preserve battery life. It's scheduled to ship as early as the first quarter of 2018. Though both companies helped engineer the new chip, this is Intel's project -- Intel first approached AMD, both companies confirmed. AMD, for its part, is treating the Radeon core as a single, semi-custom design, in the same vein as the chips it supplies to consoles like the Microsoft Xbox One X and Sony PlayStation 4. Some specifics, though, remain undisclosed: Intel refers to it as a single product, though it seems possible that it could eventually be offered at a range of clock speeds. [...] Shaking hands on this partnership represents a rare moment of harmony in an often bitter rivalry that began when AMD reverse-engineered the Intel 8080 microchip in 1975.
There it is (Score:4, Insightful)
This is what Apple should be using in future Macs. Maybe they knew of Intel's plans; that could be why the MacBook Air and Mac mini haven't really been updated in such a long time. Those are the two Macs that will likely get this new CPU first.
Re: (Score:2, Interesting)
This is what Apple should use in future iPhones. A phone that runs an OS that is compatible with both macOS and iOS, that can connect to a keyboard and monitor and can be used as a PC in that way. They already working on that already and I wouldn't be surprised if some of Apple's money is quietly going towards AMD and Intel's new project.
Re: (Score:2)
Argh, I meant 'They are probably working on that already.'
Re: (Score:2)
It would also explain why Tim Cook is so focused on the iPhone and iPad while almost dismissing the Macs.
If the future is an iPad that runs macOS and is as powerful as a MacBook/Air/Pro, I can see the point.
Re: (Score:1)
iOS will continue to evolve and acquire features that make it more suitable as a macOS replacement for some users, but don't hold your breath waiting for convergence. The two UI models are fundamentally incompatible. If you want a kludge that combines pointer and touch (but isn't particularly good at either), talk to Microsoft.
Re: (Score:3)
Well also, financially the Macs are but a blip compared to iPhone sales. Even iPads haven't seen consistent investment compared to the iPhone in recent history (a reasonable reaction to iPad sales trends).
In the wider market, the tablet form factor has in general tanked relative to the traditional laptop form factor. The 'two in one' form factor has a very vocal fanbase and logically would *seem* like the best of both worlds, but even there traditional laptops have higher sales (lower prices drive, tho
Re:There it is (Score:5, Insightful)
This is what Apple should use in future iPhones. A phone that runs an OS that is compatible with both macOS and iOS, that can connect to a keyboard and monitor and can be used as a PC in that way. They already working on that already and I wouldn't be surprised if some of Apple's money is quietly going towards AMD and Intel's new project.
If you think Apple is walking away from their own in-house A11 chip you're nuts. They've consistently outperformed all other ARM chips in single-threaded performance, and within their power envelope they're class-leading, while Intel repeatedly tried, failed, and eventually gave up on selling a compelling phone chip. The question is rather when they decide the as-yet-unreleased A11X tablet version is ready to go in a convertible/laptop form factor. The connectors are no problem, the phone already talks USB over Lightning, and the wireless streaming is the same, only compressed and without a physical port. They probably have all the relevant bits needed to make an ARM laptop, though knowing Apple it'll probably be a walled garden.
are AMD and Intel CPUs interchangeable (Score:2)
Asking Slashdot: should I care at the software performance level whether I have an AMD or an Intel?
It's been over a decade since I bought a big AMD cluster. I regretted that because I found that, at the time, while some code did run equally well on those chips, in general the software libraries just weren't tuned as well for AMD. Many optimizations weren't taken.
The main issue was that makefiles were just defaulting to i386 (this was pre-IA-64) and not using special instructions. SIMD support w
Depends on the application (Score:5, Interesting)
Putting aside AMD's very newest chip for a moment, there are basically three different kinds of use cases:
A) I want the best performance I can get within my $X budget.
B) it's a server serving many clients (lots of threads)
C) It's a single thread and I don't care how much it costs because I'm spending taxpayer money, I want the very fastest single-thread performance, cost be damned
Intel specializes in case C. Raw single-thread performance, cost be damned.
AMD will give you more cores for the dollar, so it competes well in case B, servers running many threads. AMD also traditionally costs significantly less, so it fits case A, getting the best CPU you can within a certain price range.
That's a generalization, though. It's best to compare one CPU model to another, evaluating based on the needs of your specific application and budget.
Re: (Score:2)
"I don't care how much it costs because I'm spending taxpayer money"
And this ladies and gentlemen is precisely what is wrong with government.
Re: (Score:2)
No, I have actually witnessed first-hand FEMA offices stacked with Sun and SGI servers in boxes. And I heard them say they have to spend it or lose it.
It is common, less so in business, but very much so in government. Government automatically grows each budget every year, and if it doesn't grow they cry about a cut in funding.
Re: (Score:2)
Well, actually D), which is like C). I don't want to get into a situation where I have to carefully study all the arcane nuances to get the best results. My time has value, so I don't want to have to become an architecture wizard just to do the rest of my job. I don't have my own IT dept.
Impossible, but close enough (Score:2)
If you're not willing to spend a few seconds to think about whether your workload is multithreaded, and you are willing to spend more money than needed, get Ryzen.
A difference between C and D is that in the case of D, while you're willing to spend 10 times as much money as you should, that still doesn't tell you whether you should spend lots of money on 16 AMD cores or on 4 Intel cores.
If your workload is heavily multithreaded (servers), AMD will likely give you the best performance, at *whatever* your budget is.
Re: (Score:2)
In B) they want price per watt. Intel has done much better than AMD on price per watt [cpubenchmark.net]. AMD hasn't had a server chip in 6 years. [extremetech.com]
Re: (Score:2)
You probably mean performance per watt. I was actually thinking of (D) best performance in a laptop, which is why I have Intel laptops and AMD everywhere else. Of course, there are other good reasons to want power efficiency besides mobile uses. (In my case, cooling/heat in laptop form factor is much more of an issue than battery life.)
BTW, am I the only one who cringes at all these unmatched parentheses like "A)", "B)" and "C)"?
Re: (Score:2)
Yes, that is what I meant. Thank you for correcting that, rather than calling me an idiot which is what usually happens on Slashdot.
P.S. Those aren't unmatched parens: they are closed in a post further down on the page. Just keep reading. I would close them myself right here, but then that will produce an error on that other person's post.
Re: (Score:2)
Putting aside AMD's very newest chip for a moment, there are basically three different kinds of use cases:
A) I want the best performance I can get within my $X budget.
B) it's a server serving many clients (lots of threads)
C) It's a single thread and I don't care how much it costs because I'm spending taxpayer money, I want the very fastest single-thread performance, cost be damned
Intel specializes in case C. Raw single-thread performance, cost be damned.
AMD will give you more cores for the dollar, so it competes well in case B, servers running many threads. AMD also traditionally costs significantly less, so it fits case A, getting the best CPU you can within a certain price range.
That's a generalization, though. It's best to compare one CPU model to another, evaluating based on the needs of your specific application and budget.
Intel must have a second source for its products. AMD was the patent holder for the virtual logic that Intel licensed. AMD needed a second source for its chips. Intel is good at reducing chip lines and circuitry to nanometers. Intel is not a design king. They buy designs. AMD is a vendor and also a purchaser of designs.
Re: (Score:3)
And many libraries I had to use were pre-compiled to generic specs rather than optimized.
Or they were compiled with icc, which adds a check for whether the emitted code is running on an Intel chip and jumps to generic unoptimized code if not. If it IS an Intel chip, the code does the usual flag checks to see which modern instructions are available and runs the appropriate code, but on non-Intel it doesn't get as far as checking flags.
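To make the pattern concrete, here is a minimal, hypothetical sketch in C of that kind of vendor-gated dispatch -- not Intel's actual compiler output, just an illustration of the behaviour described: read the CPUID vendor string first, and only consult the feature flags when it says "GenuineIntel". The helper names are made up; only __get_cpuid() from <cpuid.h> (GCC/Clang on x86) is a real API.

```c
/* Hypothetical sketch of vendor-gated dispatch. Build: cc -O2 dispatch.c */
#include <cpuid.h>
#include <stdio.h>
#include <string.h>

#define SSE2_BIT (1u << 26)  /* CPUID.01H:EDX bit 26 indicates SSE2 support */

static int is_genuine_intel(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13] = {0};

    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 0;
    /* CPUID leaf 0 returns the vendor string in EBX, EDX, ECX order. */
    memcpy(vendor + 0, &ebx, 4);
    memcpy(vendor + 4, &edx, 4);
    memcpy(vendor + 8, &ecx, 4);
    return strcmp(vendor, "GenuineIntel") == 0;
}

static int cpu_has_sse2(void)
{
    unsigned int eax, ebx, ecx, edx;

    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return 0;
    return (edx & SSE2_BIT) != 0;
}

/* Stand-ins for the optimized and generic code paths. */
static void compute_fast(void)    { puts("running vectorized path"); }
static void compute_generic(void) { puts("running generic scalar path"); }

int main(void)
{
    /* The complained-about pattern: feature flags are only honoured on
     * "GenuineIntel" parts; any other vendor falls through to the slow
     * path even if it supports the very same instructions. */
    if (is_genuine_intel() && cpu_has_sse2())
        compute_fast();
    else
        compute_generic();
    return 0;
}
```

The fix AMD users wanted is equally visible in the sketch: drop the vendor check and dispatch purely on the feature flags.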
Re: (Score:2)
It definitely is worth remembering. I have never heard that side of the story. Do you have any references for this?
I have read stuff like this one:
"The CPU dispatch, coupled with optimizations, is designed to optimize performance across Intel and AMD processors to give the best results. This is clearly our goal and with one exception we believe we are there now. The one exception is that our 9.x compilers do not support SSE3 on AMD processors because of the timing of the release of AMD processors vs. our co
Re:are AMD and Intel CPUs interchangeable (Score:5, Interesting)
> I regretted that because I found that, at the time, while some code did run equally well on those chips, in general the software libraries just weren't tuned as well for AMD. Many optimizations weren't taken.
Part of that was due to Intel's shenanigans.
Intel's "cripple AMD" function in their compiler [agner.org]
Re: (Score:2)
Even today, Intel admits to this when you look at icc. I think they only have that disclaimer because of the lawsuit: to my knowledge they are still running really slow code on AMD on purpose.
Re: (Score:2)
> I regretted that because I found that, at the time, while some code did run equally well on those chips, in general the software libraries just weren't tuned as well for AMD. Many optimizations weren't taken.
Part of that was due to Intel's shenanigans.
Intel's "cripple AMD" function in their compiler [agner.org]
Nobody's forcing you to use the Intel compiler, though. Use the other well-established standard compilers.
Blender uses threads and GPU. AMD wins (Score:2)
I jumped back up here from my answers further down in the thread to see if you have any hints about your use case.
You can turn on GPU rendering in Blender, in which case your GPU becomes more important and your CPU becomes less important. This article is about Intel using graphics technology from AMD because AMD is so far ahead of Intel in GPUs, but you likely have a separate video card.
Blender uses threads efficiently, meaning CPU cores, so for Blender you want a CPU with at least 8 cores. That favors AMD for Ble
Re: (Score:2)
Actually I'd love to hear your thoughts. I'm in the process of buying my first new desktop in years. Since I stopped configuring my own clusters I've mainly used laptops at home and whatever was on the purchase approved list at work. Now I'm buying a workstation class computer for home and don't know the best thing to do.
On my list are Ryzen 7 and 5, and Intel i7s.
Here's an example of a couple I'm considering:
https://www.aliexpress.com/ite... [aliexpress.com]
http://www.magicmicro.com/smor... [magicmicro.com]
https://www.magicmicro.com/pr [magicmicro.com]
I can only agree with you (Score:2)
I would agree; I would probably avoid the AliExpress one because the GPU is important to you. "Lots of animation renders" sounds like you may want to look at whether the software you use does GPU rendering and, if so, via which API - OpenGL? If OpenGL, most of the major brands will have reasonably good support.
Other than that, I'm more of a software guy; I don't stay up on the latest hardware. I just know, from a software perspective, that some software will take full advantage of multiple cores, some will not
Re: (Score:2)
oooh good tip on the socket. That would take some of the sting out of buying this just to try it out.
I see that SSDs come in two flavors: one comes in a box package like a hard drive and tops out at 500MB/sec, and then there are ones in an M.2 format that go 2,000MB/sec and look like they fit in a small slot. These aren't more expensive either. So I wonder why they aren't preferred (smaller and faster at the same price -- duh?). What's the catch?
The other thing I'm pondering is heatsinks. I'm not sure how one
PCIe slot, stock cooler (Score:2)
Yeah, the M.2 PCIe drives are attractive. I would have bought one two days ago if it weren't for the fact that the machine I was putting the drive in only has two PCIe slots. You'll need a $12 adapter to put that drive in a PCIe slot. M.2 has pins for PCIe, USB, and SATA, and a particular device may use/require any of those interfaces. That is, unless you get a mobo with an M.2 NVMe slot.
You may have a couple-second delay at boot while the interface is initialized, which doesn't seem like much but it may eat up the e
Re: (Score:2)
I forgot to comment about heat sinks. The engineers at AMD selected a heat sink (hunk of metal) that would make the CPU look good. From what I've seen, they weren't stupid enough to cheap out on that. I'd use what they selected and put in the box with the CPU (though you can also buy the CPU without a cooler included). AMD engineers spec'd the AMD Wraith Spire heat sink. It looks like a good option.
Choices in heat sinks and fans can also be affected by size, fan noise, ambient temperature in the room, c
Re: (Score:2)
Thanks again!
calm ur tits (Score:1)
This is like, "Hell has frozen over" kind of news
Am I the only one who smells this as something very much "Apple" wanted?
I don't think AMD will be giving up any GFX secrets; more likely this is AMD shipping Intel a mobile GFX die to be integrated within the same CPU package.
But in any case, why not just have Intel ship a mobile CPU without an iGPU and a separate GPU?
And AMD, why now? When Zen is doing great, has a great roadmap and potential, along with much better GFX than Intel. Why?
Re: (Score:1)
You might be too young to remember this, but Intel and AMD have a long history of working together. AMD used to fab chips for Intel.
Re: calm ur tits (Score:1)
Intel licensed x86 technology to multiple companies to get IBM to use the 8088 in the original PC. By the time the 80386 came out, IBM was not in control of the PC platform, and Intel refused to license the 32-bit "parts" to anyone. This gave them years to pull ahead and eliminated all competition except AMD. If not for the commercial failure of their 64-bit extensions and the need to license AMD's, there would be no competition. They keep AMD alive but near death.
Re: (Score:2)
Itanium was a completely new chip design, with limited compatibility with x86. Intel never had plans to make any 64-bit extensions to x86, opting to design a completely different architecture as its entry into the 64-bit processor market. AMD saw this as an unbelievable oversight, and designed its own 64-bit architecture based heavily on (and fully compatible with) the x86 instruction set. This gave AMD a near-total monopoly on the 64-bit PC market for a year or so, whilst Intel scrambled to design its own compatible architecture to compete.
Intel was majorly misstepping on Itanium, and on x86 they were throwing money away on NetBurst (Pentium 4, D, etc.), which had terrible performance per clock compared to even a Pentium III. It also had terrible performance per watt.
AMD happened to release x86-64, and a fairly decent K8 architecture that kept them as market leaders until Intel threw NetBurst wholesale into the garbage in favor of the Israeli-developed side project "Pentium M", which later developed into the Core and Core 2 platforms. Though in
Re: (Score:2)
The 64-bit AMD Opterons quickly gained a foothold in the server space, e.g. running Solaris. Also, Intel's first x86-64 implementation, the NetBurst-based Pentium D, had terrible performance per watt. The initial Intel Core CPUs lacked x86-64 support, and it wasn't until the Core 2 that they had a competitive offering. By this time, Windows for x86-64 was widespread.
Re: (Score:2)
SPARC was losing badly in price/performance by the time the 64-bit Opterons came along. The SunPro compilers, while having excellent code generation for SPARC, were clearly showing their age and lagging GCC and MSVC in C++ features. GCC had pretty poor SPARC code generation, but did quite well for x86-64. This meant that with an x86-64 box running Solaris, you could get better performance and use a less frustrating compiler, without losing all your investment in the Solaris ecosystem. (Alpha was dead by tha
Re: (Score:2)
Bullshit. It was widely deployed in enterprise environments. Businesses were demanding higher-performance systems from Sun and Opteron 64 was all they could deliver. Businesses switched over as quickly as they could recompile in-house applications, and Java stuff just moved across.
Re: (Score:3)
The best of both worlds is not a big deal when neither of the worlds is really any better than an alternative in one of the categories.
NVidia trounces both on graphics. Not to mention far more capable Linux support than either has (even if NVidia's offering is not open source, it's still pretty damn sweet).
Re: (Score:3)
Not quite as good is still a fair assessment, though a bit complicated.
Intel has insisted on less scalable designs to avoid some dramatic latency penalties.
AMD has gotten bigger core count by going to a more scalable design, albeit with cross-complex latency penalties, which make things complicated.
Ignoring that, the AMD core isn't *quite* as good in single-threaded performance per watt or performance per clock, but has more cores to make up for it at the price point.
Things get more complex as you go to Epyc.
Re: (Score:1)
This is like, "Hell has frozen over" kind of news
I guess all industries consolidate once they mature. Less freezy than Micros~1 supporting Linux though!
Am I the only one who smells this as something very much "Apple" wanted?
I doubt they're upset, but do Apple ship enough PC units to make a difference in this regard with the likes of Intel? Sure, Apple are huge, but most of the revenue is in iStuff now and the store (IIRC). They are big enough to make AMD bite, I expect (cf. custom dies for games consoles), but Intel typ
Mobile offerings? (Score:1)
Looks great for desktop and server use. Does AMD have a good mobile offering at the moment?
Yes, the recently released [anandtech.com] Raven Ridge [wikipedia.org] aka Ryzen Mobile.
My personal hope is for AMD to release some low-power APUs that fit between mobile & 'classical' desktop applications. Say, AM4 socket parts with around ~30W TDP to go on affordable mini-ITX boards for SFF PCs, home theatre, all-in-ones and such. Not that I would mind even lower-power mobile parts, but those tend to be thin on the ground in terms of availability for DIY builds (e.g. separate APU + motherboard purchase). And there's quite some s
Re: (Score:2)
Because a GPU on die is much faster and more efficient. That external bus might be high bandwidth, but the latency is a killer compared with sitting on the same die. For many of the same reasons, AMD managed to coast along against Intel's much smaller and more efficient chips for a while by putting the memory controller on the die.
"And AMD, why now? When Zen is doing great, has a great roadmap and potential, along with much better GFX than I
Re: (Score:2)
Per transistor? No, they have not.
"Now, true, you have the problem of communication across the PCIe bus - but that can be resolved with a different system architecture (like OpenPower)."
No, it cannot. It's called the speed of light. It is physically impossible for any architecture to make an external bus as fast as communication can be on the same die.
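To put rough, illustrative numbers on that (back-of-the-envelope only, not measurements of any particular product): signals in PCB traces propagate at very roughly half the speed of light, so distance alone already costs

$$
t_{\text{prop}} \approx \frac{d}{0.5\,c}:\qquad
\frac{0.15\ \text{m}}{1.5\times 10^{8}\ \text{m/s}} = 1\ \text{ns per direction for ~15 cm of board trace,}\qquad
\frac{10^{-3}\ \text{m}}{1.5\times 10^{8}\ \text{m/s}} \approx 7\ \text{ps for ~1 mm on die.}
$$

And propagation is only the floor: a real chip-to-chip PCIe round trip is typically dominated by serialization and protocol overhead and lands in the hundreds of nanoseconds, while on-die interconnect hops are on the order of nanoseconds, so the gap is even wider than the speed-of-light argument alone suggests.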
Hell (Score:4, Insightful)
Re: (Score:2)
Which is why Apple's MBPs have a discrete video card as an extra option.
Re: (Score:2)
Hell has frozen over.
It won't be frozen for long since the overclockers in Hell can now turn on their gaming rigs. ;)
Re: (Score:2)
Dark Helmet: I am your father’s brother’s nephew’s cousin’s former roommate.
Lone Star: So what does that make us?
Dark Helmet: Absolutely nothing.
Absolutely Incredible? (Score:2)
Intel and AMD team up? (Score:5, Funny)
What's next? Sonic on Nintendo consoles? Square-Enix games on computers?
Re: (Score:2)
And why do you think I chose these examples?
It's because while Intel+AMD sounds weird today, the examples I mentioned are true today but sounded just as weird two decades ago.
Re: (Score:2)
Whoosh!
Also, this is why we need a sarcasm font.
Re: (Score:2)
Whooosh!!
You are welcome.
No like seriously: Whooooooooosh!
You are welcome.
Re: (Score:2)
Don't take it the right way, but you sound like an angry GLaDOS.
Intel tried to buy NVIDIA (Score:3, Interesting)
About 5 years ago INTC tried to buy NVDA. They had enough money to do it, and the offer was going to be reasonable, but there was a sticking point about who would become CEO of the combined company. Paul Otellini of Intel was about to step down, and the assumption from NVIDIA's Jensen Huang was that he would become the CEO of the combined Intel-NVIDIA. But Intel's board wasn't going to have it and promoted Brian Krzanich to CEO instead. And that's the story of how Intel managed to lose a ton of money and miss out on opportunities in 3D graphics and compute.
Not a monolithic chip (Score:2)
The linchpin of the Intel-AMD agreement is a tiny piece of silicon that Intel began talking up over the past year: the Embedded Multi-die Interconnect Bridge, or EMIB. Numerous EMIBs can connect silicon dies, routing the electrical traces through the substrate itself. The result is what Intel calls a System-in-Package module. In this case, EMIBs allowed Intel to construct the three-die module, which will tie together Intel's Core chip, the Radeon core, and next-generation high-bandwidth memory, or HBM2.
AMD sell Intel bare dies that talk EMIB. The interesting thing is that Intel could do a deal with NVidia to supply GPU dies which use the same interface. Well, except that Intel pays NVidia licence fees, whereas the AMD-Intel patent licensing agreement is completely one-sided - AMD pays Intel, but Intel gets IP rights to anything AMD invents for free.
It's not like AMD is selling Intel a synthesizable core or even a hard macro. And Intel being Intel they probably pay people to do competitor analysis on AMD stuff an
Re: (Score:2)
Intel claim 20x better power efficiency for EMIB compared to PCIe chip to chip here.
http://www.tomshardware.com/ne... [tomshardware.com]
https://i.imgur.com/q4cxMtU.jp... [imgur.com]
Re: (Score:3)
"Well except that Intel pays NVidia licence fees whereas the AMD Intel patent licensing agreement is completely one sided - AMD pays Intel but Intel gets IP rights to anything AMD invents for free."
What a gross misrepresentation of a cross-license agreement. Intel doesn't get AMD IP "for free", it's CROSS-licensing. Many cross-license agreements include cash considerations in one direction, it would be surprising if it weren't so.
Lower your fanboy ranting a level.
Re: (Score:2)
Lower your fanboy ranting a level.
I prefer Intel CPUs and NVidia GPUs given a choice.
What I said is true for x64.
https://www.cnet.com/au/news/a... [cnet.com]
The lawsuits started in 1987. Rich Lovgren, former assistant general counsel for AMD, recalled that AMD founder Jerry Sanders sat through "every second" of one of the trials. "There were certainly bridges that were burned," he said.
Under the terms of the settlement, both companies gained free access to each other's patents in a cross-licensing agreement. AMD agreed to pay Intel royalties for making chips based on the x86 architecture, said Mulloy, who worked for AMD when the settlement was drafted. Royalties, he added, only go one way. AMD doesn't get to collect royalties from Intel for any patents Intel might adopt.
AMD also agreed not to make any clones of Intel chips, but nothing bars Intel from doing a clone of an AMD chip, Mulloy added.
While the terms may seem one-sided, AMD has benefited from the agreement as well. Without the clean and enforceable right to make x86 chips granted by the agreement, AMD would not have been able to produce the K6, K6 II, K6III, Athlon, Duron, Athlon 64 or Opteron chips without fear of incurring a lawsuit.
Intel probably doesn't have access to the graphics patents AMD picked up when it bought ATI though. There were rumours it would licence them which it denied.
https://www.extremetech.com/ex... [extremetech.com]
Intel, however, has reached out to put the kibosh on such rumors. In a statement sent to Barron's, Intel stated, "The recent rumors that Intel has licensed AMD's graphics technology are untrue." The company has said that further information will not be provided.
Of course if it buys AMD GPU dies and puts them in the same package as Intel chips it doesn't need to licence all AMD's graphics patents, just agree on a price for the dies. It also doesn't need to hire a bun
Makes sense. Intel graphics are still a failure (Score:3)
Makes sense. Intel graphics are still a failure.
Remember when the industry panicked when Intel bought Chips & Technologies and the Real3D patents?
That didn't go so well. Who else had a shoebox full of Intel i740 cards bought at fire-sale prices?
Re: (Score:1)
It depends what you mean by failure. For gaming, yeah. But then gamers are always going to have a discrete GPU.
For people who don't game and want long battery life, onboard graphics are better because they're lower power.
I suspect Intel know that if they keep the non-gamers happy with a good CPU with 'good enough' graphics, they'll sell a lot of chips. And for non-gaming, 'good enough' graphics isn't that hard to do. Gamers will more than likely buy an Intel CPU and pair it with discrete graphics provided Intel's C
Re: (Score:2)
"And for non gaming 'good enough' graphics isn't that hard to do."
You say this, but on a brand-new i3 with a GTX 970 I get lag just scrolling most websites. People don't know how to code properly any longer.
Re: (Score:2)
> ... but the "discrete graphics" offerings are generally no better than what's built into a modern Intel Core CPU unless you go for a laptop specifically aimed at gamers. ... Games seem to have plateaued in terms of the GPU power they need, and Intel's graphics are, as a result, "good enough" for a higher and higher percentage of new games. I debated using the graphics built into my new i5 last year based on tests showing me that overall performance with both GTA V and Skyrim was no different to my
Re: (Score:2)
I'm not buying it.
Not buying what?
Your post claims that $700 discrete GPUs are much better than Intel integrated GPUs, especially in high-end scenarios. That's obvious, and nobody said otherwise.
Re: (Score:2)
Makes sense. Intel graphics are still a failure.
From a business perspective? I doubt it. They basically drained the bottom out of the market by force-bundling it with their CPUs, weakening AMD and nVidia considerably. Will a 15-45W CPU+GPU ever "catch up" to a dGPU that draws a few hundred watts alone? Obviously not. But they took almost the whole non-gamer market (71% by units) and according to this article 22% [pcgamer.com] of Overwatch players use Intel integrated graphics. Consider then non-competitive games like Civilization etc. and you'll realize not everyone n
Re: (Score:2)
Makes sense. Intel graphics are still a failure.
So much of a failure that they're in almost every laptop made. The last two laptops I bought had NVidia and Intel chips in them (Intel for low power consumption, NVidia for games). They aren't good for gaming, but they're fine for everyday use, which is why no one cares that their laptop hasn't got an NVidia chip in it (and it's hard to find a reasonably priced laptop with an NVidia chip in it).
Now in desktop gaming, NVidia pwns the graphics market, Intel pwns the CPU market.
To me AMD is shooting themselves in the foot (Score:1)
I mean, now that they have a real chance of making a good Ryzen-based APU, they join forces with Intel? That surely doesn't make sense.
I wonder what the price will be like. Will it be Intel like (read: too much) or AMD (eternal underdog) like?
Re: (Score:2)
I think it will be Mac-only like, at least at the beginning.
Re: (Score:3)
It's a brilliant move. If it's successful, they will control the GPU-on-CPU market for the 64-bit x86 platform globally. Intel still holds the majority of sales and it'll be painful to compete with them directly, as Intel has enough cash to match AMD at any price point as well as enough volume and customers to out-produce and out-market AMD.... but... if AMD gets a portion of every Intel sale because of the GPU, they get a lot of cash for just a bit of support work and help to raise brand awareness for
Re: (Score:2)
It's a good move for them, as they'll move more units.
People that want Ryzen will get Ryzen - Joe Consumer will get Intel with Radeon built in, because Joe Consumer buys Intel because GeekSquad told him to.
Good point -- it's about marketing awareness of AMD, in a way. Eventually the JCs might realize, if the integrated AMD GPU is good enough for them, then why not the entire AMD package deal, as it's cheaper than the Intel combo anyway.
For those who need discrete GPUs it won't matter either way. Frankly, my experience of AMD APUs hasn't been exactly stellar, and I'm not expecting a huge improvement this time. Integrated GPUs have their inherent limitations in power envelope and shared memory, so you won't
Well damn (Score:2)
Re: (Score:3)
The BB10 OS is derived from QNX, which BlackBerry bought when they realized their own OS was a POS that couldn't scale into the future. (Very much like Apple declaring Copland a lost cause [wikipedia.org] and then deciding between BeOS and NeXTSTEP in 1996.)
QNX is one of the premier embedded RTOSes, and is used across much of the automotive industry, including most automobile infotainment systems.
Licensing fees from QNX are probably one of BlackBerry's largest remaining revenue sources at this point.
Misleading title (Score:2)
The worst of both worlds? (Score:2)
Lately, AMD made news for their well-received Zen CPUs and lackluster Vega GPUs.
So... let's pair an Intel CPU with an AMD GPU...
The AMD Raven Ridge GPU performance is said to be on the same level as Intel's current offering, which is pretty bad, but the CPU is quite good.
TBH, it kind of makes sense: Intel CPUs have good single-thread performance, and dedicated AMD GPUs are better than Intel's offering; combining them can be good for mid-range gaming. But still, weirdest partnership ever...
Cartel laws (Score:2)
Wouldn't something like this fall afoul of cartel laws?
It's not like nVidia's putting out desktop processors or anything.
what about more PCIe, Intel??? (Score:2)
What about more PCIe, Intel???
Article read to me like an April Fool's send up (Score:1)
Except it's not. This really is amazing news to me, and they definitely will blow a big hole in the budget gaming market.
Step 1 (Score:2)
Release your register-level specs and get proper Linux kernel support, like a professional product.
Shared memory bottleneck? (Score:2)
What good is a top-notch GPU if it shares memory access with the CPU? I would assume it matters in some cases, but I'm not sure to what extent. As I do a lot of iterated render-to-texture, I feel much safer with a fairly basic discrete GPU with its own fast RAM than with the latest and greatest integrated one.
Re: (Score:2)
When a laptop is crap, you don't blame Intel, you blame the crap company that made your crappy laptop.
Maybe you just bought a 7970 from a crappy company.
Re: (Score:2)
I got a free nVidia GTX 650 that was headed for the scrapyard. Runs my games at 30+ FPS just fine.
Re: (Score:2)
The 7970 was widely considered a good card, but you're making decisions based on an experience you had 5-6 years ago. It's entirely possible that the slow framerate was a chipset compatibility issue, or your motherboard manufacturer screwed up, or you needed to install some updates to your BIOS to fix a well-known issue. Most of the cards have thermal sensors that shut the card off if the fan dies, because you won't find a 100% reliable fan on GeForce cards either.
Re: (Score:2)
https://www.youtube.com/watch?... [youtube.com]
Re: (Score:2)
With this new technology [techreport.com] announced less than two weeks ago?