Intel Boldly Claims Its 'Ice Lake' Integrated Graphics Are As Good as AMD's (pcworld.com)
While Intel is expected to detail its upcoming 10nm processor, Ice Lake, during its Tuesday keynote here at Computex, the company is already making one bold claim -- that Ice Lake's integrated Gen11 graphics engine is on par with or better than AMD's current Ryzen 7 graphics. From a report: It's a bold claim, and one that Ryan Shrout, a former journalist and now the chief performance strategist for Intel, said Intel doesn't make lightly. "I don't think we can overstate how important this is for us, to make this claim and this statement about the one area that people railed on us for in the mobile space," Shrout said shortly before Computex began. Though Intel actually supplies the largest number of integrated graphics chipsets in the PC space, it does so on the strength of its CPU performance (and also thanks to strong relationships with laptop makers). Historically, AMD has leveraged its Radeon "Vega" GPUs to attract buyers seeking a more powerful integrated graphics solution. But what Intel is trying to do now, with its Xe discrete graphics on the horizon, is let its GPUs stand on their own merits. Referencing a series of benchmarks and games, from the 3DMark Sky Diver test to Fortnite to Overwatch, Intel claims performance that's 3 to 15 percent faster than the Ryzen 7. Intel's argument is based on a comparison of a U-series Ice Lake part at 25 watts versus a Ryzen 7 3700U, also at 25 watts.
Re:It is meaningless. (Score:5, Interesting)
No, it's incredibly important. Intel has spent the last 15+ years working on improving its graphics offering. It's why AMD bought ATI, and why AMD/ATI has constantly pushed for its low-end integrated graphics to be substantially better than Intel's integrated offering. Even today, Intel's GPUs are treated as a failsafe, as you say; for a lot of applications they're even worse than a failsafe, and things simply won't work. AMD's low-end graphics, though, provide enough performance for people to buy into the AMD ecosystem and then later buy an AMD or Nvidia GPU. Often integrated AMD graphics are enough to play older (read: 5+ year old) games at lower settings at 60+ FPS.
Short term, if true, this cuts into the adoption rate of Ryzen systems, as people contemplating an expensive Intel/AMD CPU will consider taking the Intel option and living with merely okay graphics until they save up for a GPU. Long term, if true, this finally shows Intel may actually be able to move into the premium GPU space. That's a threat not only to AMD but to Nvidia as well, since the high-end GPU space is still largely unexplored when it comes to alternate GPU-compute applications.
Intel has always been about the long game. They've thoroughly saturated the high-end CPU market for many years. Ryzen changes this in the short term, but it's really unclear whether CPUs are the long-term ticket to dominance. Intel is in many ways both blessed and cursed with the x86 line. They can't move away from it -- they've tried -- and it would seem their fate is tied to it. Hence Intel really wants to diversify into another highly competitive computational space, GPUs, where its brains and foundries give it good odds of entering and possibly, long term, dominating.
It's a game changer if Intel finally shows that it can rise to the challenge and actually be competitive.
Re:It is meaningless. (Score:5, Insightful)
When the benchmarks are from an unbiased third party, run with open source drivers with a verifiable checksum, aren't cherry-picked to showcase the design of the hardware, and run on Linux, THEN I will give them a second consideration. Otherwise it's smoke.
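For what it's worth, the checksum part is the easy bit. A minimal Python sketch of verifying a driver archive against a published digest -- the file name and digest below are hypothetical placeholders, and of course this does nothing about cherry-picked workloads:

import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical driver archive and the digest published alongside the benchmark run.
DRIVER_ARCHIVE = "gfx-driver-used-for-benchmarks.tar.gz"
PUBLISHED_SHA256 = "<digest published with the benchmark>"

if __name__ == "__main__":
    digest = sha256_of(DRIVER_ARCHIVE)
    print("computed :", digest)
    print("published:", PUBLISHED_SHA256)
    print("MATCH" if digest == PUBLISHED_SHA256 else "MISMATCH")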
Re: It is meaningless. (Score:3, Interesting)
Have you read the news lately? You shouldn't trust them on the CPU side either. Seems like every week a new speculative execution attack is discovered.
Intel's model has always been to innovate as little as they have to in CPU architecture -- look at the early 2000s. They didn't go 64-bit until AMD forced them to, and only because they were starting to lose market share did they launch the wildly successful Core series. Once supremacy was reestablished, they promptly put innovation in park again until AMD came up
Re: It is meaningless. (Score:5, Insightful)
They didn't go 64-bit until AMD forced them to and, only because they were starting to lose market share
You need to brush up on your history. [wikipedia.org] Intel actually beat AMD to 64-bit by a couple of years. However, they did it by breaking x86 compatibility, meaning if you wanted to run your 32-bit x86 programs you would be doing so in an emulation layer or VM, which were few and far between in 2001. What AMD did was propose an extension to the x86 ISA [wikipedia.org] to add 64-bit instructions. Intel thought they were dominant enough to force people to switch, and they were hoping to lock AMD out of 64-bit altogether. It's similar to when IBM introduced their PS/2 [wikipedia.org] line of PCs that had a proprietary bus they demanded licensing to use. OEMs didn't take kindly to the move and subsequently developed their own EISA [wikipedia.org] bus which did not require licensing. Since AMD64 was backwards compatible, no updates were needed to run existing 32-bit programs.
Re: It is meaningless. (Score:5, Insightful)
Mod parent up.
Intel tried to push the industry to Itanium because only Intel could make Itanium. (I remember something about patents on the technology but a Google search didn't find anything so I don't have a supporting link.) AMD was able to make x86 chips, and Intel wanted that to stop.
So AMD just stretched the 32-bit x86 instruction set to 64-bit, and all the customers said "Yes, that's what we wanted all along" and voted with their money. Itanium became better known as "Itanic" (rhymes with Titanic [wikipedia.org]).
If Intel had gotten away with their Itanium plans, we would be paying several times more for CPU chips and those chips would be worse than what we have now. Competition is good for consumers... Intel hates it.
Re: (Score:2)
Whether Itanic was a play to lock x86 out of the market or simply an attempt to execute a new architecture for performance reasons is pure speculation. Had Intel's assumptions about compiler capabilities come to pass, Itanic would likely have had a different ending. And that's exactly what the whole Itanic debacle speaks to: Intel made an assumption about compiler technology that didn't come about, and their entire architecture was completely dependent on that compiler advancement. Intel learned a valuable les
Re: (Score:2)
Whether Itanic was a play to lock x86 out of the market or simply an attempt to execute a new architecture for performance reasons is pure speculation.
Fair enough... I have no special secret evidence to prove my theory.
But Intel's whole business strategy is to make expensive chips that you can only get from Intel.
Consider the mobile market... Intel has a huge amount of fab capacity, and nothing would stop them from licensing the ARM instruction set and selling ARM chips for mobile. But they would just be o
Re: (Score:2)
Whether Itanic was a play to lock x86 out of the market or simply an attempt to execute a new architecture for performance reasons is pure speculation.
Intel themselves said that was the case at the time and that the Pentium 4 (and Pentium 3 based low power processors) would be the last x86 processors developed by them.
It probably hurts Nvidia more than AMD (Score:5, Interesting)
And that's probably what they had in mind. The idea that this is a blow against AMD doesn't square.
It doesn't really make a great headline to say: hey, our newest graphics system sucks so much less than it used to that it's about as good as the competition's, maybe even slightly better, at least this month.
However, if instead you are thinking about buying a low-end graphics card from Nvidia, why bother now if it's only a bit better than Intel's integrated graphics? Save some dough and get a better CPU, or save up for a higher-end graphics card next time.
This cuts off Nvidia's base sales. Sure, Intel can't compete at the high end, but they don't have to. Anyone buying a high-end Nvidia card is going to be buying a high-ish end Intel CPU, not an AMD. If they were buying a high-end AMD, they would -- more often than not -- buy an AMD graphics card.
What Intel has to worry about is the possibility that, in the low-end machine regime, NVIDIA also starts competing against Intel's CPUs. This move forecloses NVIDIA from getting a foothold. If the integrated graphics is good enough, then it's going to be a better general-purpose computer if the CPU is Intel, not some crap Nvidia is trying to fob off as acceptable.
So I think this is a shot at NVIDIA, and only secondarily at AMD.
Re:It probably hurts Nvidia more than AMD (Score:5, Interesting)
Apple's new ARM processors are more than competitive with Intel CPUs. The next generation of ARMs will very likely be better power-to-performance than Intel. So Intel has a problem at the low end it needs to fix. If Nvidia gets a foothold in the ARM space, Intel has a big problem.
Re: (Score:3)
The next generation of ARMs will very likely be better power-to-performance than Intel.
That seems astonishingly unlikely. The place where ARM wins, and why you'll never see an Intel equivalent to a Cortex M0, is at the low end, due to the much simpler instruction decoder and in-order architecture.
Once you get to the high end, the decoder is a tiny fraction of the overall cost, and heat goes into fast, wide execution units, the out-of-order scheduler, the register renamer and cache. The other big bit is th
Re: (Score:2)
Haven't Apple's ARM chips already done that? They're already on par with Intel's mobile x86 CPUs in terms of performance, with the A12X SoC performing on par with a mobile i7, and while I don't have any specific numbers on power draw, they get a pretty big boost on account of being a full process node ahead of Intel (Apple is on TSMC's 7nm process, which is equivalent to Intel's 10nm, while Intel is still shipping almost all their chips on 14nm). There are also power savings to be had on account of the A12X
Re: (Score:3)
Haven't Apple's ARM chips already done that? They're already on par with Intel's mobile x86 CPUs in terms of performance, with the A12X SoC performing on par with a mobile i7,
There were some geekbench results but those have always been incredibly suspect: they always gave Apple devices incredibly high scores relative to even high end desktop CPUs, unlike just about every other benchmarking suite. These days there seem to be fewer and fewer different suites used, making geekbench dominate, but I remember how
Re: (Score:2)
They're already on par with Intel's mobile x86 CPUs in terms of performance, with the A12X SoC performing on par with a mobile i7
This isn't remotely true, though.
There were some silly benchmarks for a while peddling iPad Pro A12X's against some MacBook Pros, but they weren't even remotely on the high end of mobile i7s.
Then you get into talking about process size... The fact that TSMC 7nm isn't even competitive with Intel's 14nm(++++++?) means what to your argument?
Re: (Score:2)
Intel doesn't compete on the low end for fear of cannibalizing their $XXX-$XXXX sales with a low-double-digit sale. Cell phone processors are CHEAP; if Intel started selling performant silicon that cheap, they'd erode their entire product line and margins.
It's why when netbooks came out Intel put strict limitations on their use, including memory limits, clocking limits and even screen resolution limits to prevent them from eating into their higher margin sales.
Re: (Score:2)
ARM cores are great when power is key. That's why you find them in phones and mobile devices.
They excel at performance per Watt consumed. Not so much at performance in general when power is of minor importance.
Re: (Score:2)
Apple's new ARM processors are more than competitive with Intel CPUs.
No they aren't. They might be better for a phone application, where Intel failed because they didn't want to cannibalize desktop sales, but no ARM processor offers anywhere near the performance of a desktop processor.
No ARM processor currently produced can compete effectively in either the server or desktop market. That's just a fact.
Depends on Microsoft's deal with Qualcomm (Score:2)
At least for now, desktop applications, let alone an OS, can't run only on a GPU.
Which is why NVIDIA licenses a CPU design from ARM. Nintendo Switch, for example, contains an NVIDIA Tegra X1 with four ARM Cortex-A57 CPU cores and 256 Maxwell GPU cores. The biggest barrier I can see is legacy proprietary Windows applications and whether or not Microsoft's partnership with Qualcomm for Windows 10 on ARM is exclusive.
Re: (Score:2)
nVidia has their own CPU designs, as they shipped a version of the Tegra K1 with their "Denver" CPU. It ended up being overly power hungry, and X1 went back to an ARM-only design. They tried again with the X2, which has a mix of Denver 2 and Cortex A-57 CPUs, and the same is true for the following chip, the Tegra Xavier, which has an nVidia CPU named "Carmel". Who knows which chip Nintendo will go with in the future, though.
Re: (Score:3)
They are probably worried that Ryzen is really, really good now and combined with AMD's integrated graphics offers much better value and performance than Intel.
Re: (Score:2)
The Ryzens with integrated graphics have always offered better value and performance than Intel as far as I know.
AMD has always offered better value for money, but Intel has always made faster CPUs. We used to think that was due to their process technology, but now we know it was also about compromising security. The next run of AMD consumer CPUs should actually be more powerful than the Intel consumer CPUs they are competing against, for the first time since Athlon.
Re: (Score:2)
Intel already murdered the low-end dedicated graphics market a while ago. Prior to the GT graphics in the Core series, Intel graphics were in fact bad (or non-existent), so it was almost mandatory to get something else.
Nowadays of course they're still slower (depends on the model but like 50% speed of MX150 https://techreport.com/review/... [techreport.com]) but it's easily good enough for the vast majority of people. Serious gamers or CAD users would still get their GPUs.
And even if you do end up buying an nvidia GPU, why
Re: (Score:1)
No they won't, because expensive desktop Ryzens don't have integrated graphics.
Only the low-end parts do, and clearly some higher-end laptop ones, but there you typically won't be buying into some other graphics solution anyway. And performance will be similar enough to not really make a difference. Weak but possibly adequate.
Re: (Score:2)
Intel has spent the last 15+ years working on improving their graphic offering.
Yes, and so far they have always failed, kind of like how they failed at security. Without their anticompetitive advantage, Intel would have gone under a long time ago. Without their process advantage, Intel can only fail now.
Intel should have spent more effort on process technology. That was their only legitimate competitive advantage, and they have failed to maintain it.
Re: (Score:3, Informative)
If you do not do GPU-intensive work, the improvement hardly matters.
Nonsense. I bought an AMD A8-7600 APU years ago, one of the reasons being that its integrated GPU was capable enough to run many not-the-very-latest, not-the-most-demanding games out there, saving me from having to add a discrete graphics card for any casual gaming needs, should they arise. That kind of thing matters for a Mini-ITX build.
Was I alone in that? Sure as hell not. Similar Mini-ITX builds, home theatre PC's, budget gaming rigs, all have one thing in common: a more capable integrated GPU is welcome. On
Re: (Score:2)
A NUC-like device based on Ryzen mobile CPUs would be awesome. At least much more "wife compatible" than the case I have sitting next to the TV right now.
Re: (Score:2)
No, if meaningful it means you don't need a second GPU. Everyone does something GPU intensive sometimes.
Re: (Score:2)
These days, I'd rather have the extra cores than an on-board GPU. Initially I thought it was going to be a pain, booting with no onboard graphics, but it turned out to be no big deal, and now that's the way I always want it, with the possible exception of a super cheap build for a kid or granny or something, that is never going to have a GPU. Then I'd go for one of AMD's rather powerful APUs. So basically, really don't care about Intel's graphics. At all.
Re: It is meaningless. (Score:1)
Negative. The E3 also has ECC and virtualization support that the i7 lacks.
Re: (Score:2)
Reminded me how life is crap under the thumb of Intel.
Re: (Score:2)
I agree here - integrated graphics is great for "simple" applications and great for most office PCs.
But as soon as you want to do some serious stuff like 3D graphics you need a dedicated card. Unless you just do it once in a while and can accept that it's sluggish.
Re: (Score:2)
Rubbish. There's a whole spectrum of users in between the VIM editor and hardcore gaming.
eg. You don't need gaming-level performance to run a PCB designer that does 3D visualization of the boards.
Re: (Score:3)
If you do not do GPU-intensive work, the improvement hardly matters.
If you do GPU-intensive work, you are using a stand-alone GPU anyway.
A comment brought to you by the 90s, not 2019, where AMD's G-series APUs are able to play Battlefield 5 at usable framerates.
Unless you don't count gaming as GPU intensive, in which case maybe look at the benchmarks and see the difference between "usable" and "nice". There is definitely a lot of scope for improvement, and right now we're in a very difficult position where someone who casually games actually needs to have a long, hard think about whether to buy a GPU or just a better CPU.
Re: (Score:2)
"Whole shebang gets heat limited."
As if that isn't already a factor for discrete GPUs and CPUs without an iGPU.
Re: (Score:2)
Some games hit the CPU hard. Some hit the GPU hard, but usually configurable. Any fps is always an improvement though.
Put them both on the same die? Whole shebang gets heat limited.
With only one thing to cool, I can justify using liquid cooling. It never made sense to water cool only my CPU or GPU.
Re: (Score:1)
No, because this is laptops, and typically this is all you have.
Then again, it will be weak regardless, and 3-15% is unlikely to change whether you're OK running a title on it or not.
Re: (Score:2)
If you do not do GPU-intensive work, the improvement hardly matters.
If you do GPU-intensive work, you are using a stand-alone GPU anyway.
To some extent, Intel's GPU is more like a failsafe: it ensures that you can see something on the screen before installing your GPU driver.
Um, what about laptops?
Re: Alt headline : (Score:1)
Try severely hobbled. It took years to fix it.
"Intel Inside"
Re: (Score:1)
Are you looking at a different 3700U? Says here [amd.com] its TDP is 15W.
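That 15 W vs. 25 W gap matters because the headline percentages shift a lot once you normalize by power. A quick sketch of the arithmetic in Python, with purely hypothetical FPS numbers just to show the normalization:

# Purely hypothetical numbers -- the point is the normalization, not the values.
parts = {
    "Ice Lake U @ 25 W": {"fps": 60.0, "watts": 25.0},
    "Ryzen 7 3700U @ 15 W": {"fps": 55.0, "watts": 15.0},
}

for name, p in parts.items():
    print(f"{name}: {p['fps']:.1f} FPS, {p['fps'] / p['watts']:.2f} FPS/W")
# Even if the 25 W part posts higher raw FPS, the 15 W part can win on FPS per watt.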
Bullshit from Intel: AMD is far superior in this a (Score:2)
Re: (Score:1)
I can mine crypto on AMD's APUs and GPUs right out of the box. :)
Good luck getting Intel's to work right without crashing,
and have fun waiting for open source devs to support Intel's new lines.
Have fun with your Intel CPU exploit-of-the-month club.
What was the last one called... ZombieLoad... lol @Intel
And AMD gives me ECC support for reliable, bitrot-free memory in ALL Ryzen CPUs.
With Intel you have to pay out the ass even more, for a Xeon, to get that.
Intel is a huge nope.
AMD is getting better.
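If you do go the Ryzen-plus-ECC route, you can check that ECC is actually active (not just advertised) on Linux through the kernel's EDAC interface. A minimal Python sketch, assuming an EDAC driver for your memory controller is loaded:

from pathlib import Path

EDAC_ROOT = Path("/sys/devices/system/edac/mc")

def report_edac():
    """List EDAC memory controllers with their corrected/uncorrected error counts."""
    controllers = sorted(EDAC_ROOT.glob("mc[0-9]*"))
    if not controllers:
        print("No EDAC memory controllers found: ECC may be off, or no EDAC driver loaded.")
        return
    for mc in controllers:
        ce = (mc / "ce_count").read_text().strip()  # corrected (single-bit) errors
        ue = (mc / "ue_count").read_text().strip()  # uncorrected (multi-bit) errors
        print(f"{mc.name}: corrected={ce} uncorrected={ue}")

if __name__ == "__main__":
    report_edac()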
Intel claims ? (Score:1)
So now Intel claims it is designing better chips than AMD's? Read it again.
Intel is admitting that AMD is making better chips and is publicly playing catch-up. Who would have bet on this a couple of years ago?
In other news (Score:2)
Zen 2 seems to be a pretty strong contender to Intel's CPUs. So now if only Navi is able to go toe-to-toe with nVidia, we might actually see a highly competitive CPU, iGPU and dGPU market. Good times, at least if you're a consumer.
Re: (Score:3)
Radeon VII already matches Nvidia at the high end for everything below the 2080 Ti. If the RDNA specs are to be believed, Navi will clean up everything below the 2080. Plus, the top-selling card on Amazon today, and for the last many weeks, is the RX 580, if value is your thing. Not a great time to be a money-grubbing price gouger with a rep for fiddling benchmarks, methinks.
Re: (Score:2)
Now that I would believe.
Obviously! (Score:2)
They offer at least 20FPS more in Windows Desktop, have superior performance in MS Word, and offer unparalleled rendering capacities with aalib and ncurses.
Nobody cares after your price gouging (Score:2, Insightful)
It doesn't matter if it's equally good, or a bit faster. The fact that any comparable offering is 50% more expensive, or worse, and the fact that you price-gouged customers for a whole decade just because you could, means that people are slowly moving to AMD. They have always been better priced, more honest, and are now even beating you in this market.
"Intel Inside" means shit these days, and I would only put an Intel in my computer if I was forced to.
Re: (Score:2)
"Intel Inside" means shit these days, and I would only put an Intel in my computer if I was forced to.
It hasn't meant shit since FDIV. Before that, Intel could reasonably be considered to be about as competent and scrupulous as any other CPU manufacturer. That bug wasn't even spectacularly bad as CPU bugs go (although it was pretty bad) but Intel's response to the bug was intolerable. They initially tried to get out of doing anything for the affected customers. At that moment, everyone with one half of one clue stopped choosing Intel by default. Most people do not have even that much.
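For anyone too young to remember: FDIV was detectable with one line of arithmetic. The widely circulated check looked roughly like the sketch below; on a flawed Pentium the residue came out as 256, while a correctly working FPU (Intel's included, after the recall) leaves nothing beyond ordinary double-precision rounding noise:

# The classic FDIV sanity check: on an affected Pentium, x / y came back
# visibly wrong, so x - (x / y) * y printed 256 instead of ~0.
x = 4195835.0
y = 3145727.0
residue = x - (x / y) * y
print("x - (x / y) * y =", residue)  # ~0 on a correct FPU, 256 on a flawed one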
Our best is better than your worst. (Score:2)
Now if they had compared the Intel Gen11 with, say, an old but still good RX 580 and claimed it was as fast or faster than that, I'd have been interested, but oh well. At least Gen11 can run Word and Excel just dandy.
Ryzen 7 doesn't have integrated graphics. (Score:2)
Re: (Score:3)
Intel is comparing it to mobile versions of Ryzen 7.
Ryzen 7 (Pro) 2700U, Ryzen 7 3700U and Ryzen 7 3750H have integrated Vega 10 GPU. Ryzen 7 2800H has integrated Vega 11 GPU.
Re: (Score:2)
Intel is comparing it to mobile versions of Ryzen 7.
Meanwhile, GCN (Vega) just got superseded by RDNA with claimed 1.5x power efficiency improvement.
gg intel (Score:5, Insightful)
Re: gg intel (Score:1)
No doubt. Who the fuck brags about a GPU that hasn't been released yet being as good as something that's been out for a few years? That should show just how desperate Intel is.
I'm probably not in the majority here (Score:1)
I like Intel integrated graphics. Intel provides working open source drivers in the mainline Linux kernel. AMD's Vega graphics only became usable with Linux about a year after the Ryzen with integrated graphics came to market. That's a year of crashes and hangs in Linux which nudged people towards Windows 10. Not cool, AMD, not cool.
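If you want to check which kernel driver actually ended up bound to your GPU on a given Linux box (i915 for Intel, amdgpu or radeon for AMD), the DRM sysfs nodes tell you. A small Python sketch:

from pathlib import Path

def gpu_drivers():
    """Yield (DRM card, bound kernel driver) pairs from sysfs."""
    for card in sorted(Path("/sys/class/drm").glob("card[0-9]")):
        driver_link = card / "device" / "driver"
        if driver_link.exists():
            yield card.name, driver_link.resolve().name  # e.g. ('card0', 'i915')
        else:
            yield card.name, "(no driver bound)"

if __name__ == "__main__":
    for card, driver in gpu_drivers():
        print(f"{card}: {driver}")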
Intel's credibility (Score:3)
Dunno about Intel's credibility at the moment. Did Intel make that claim about 10nm being on track lightly or what?
hmmm.. (Score:4, Informative)
Is this the Intel chip (Score:2)
Just my 2 cents
Not credible (Score:2)
Intel has screwed up time and again in the graphics space. That they now claim their new product is as good or better than the one from somebody that has been active in this space for a long time is simply not credible.
Re: (Score:1)
Intel has screwed up time and again in the graphics space. That they now claim their new product is as good or better than the one from somebody that has been active in this space for a long time is simply not credible.
Intel's graphics have worked painlessly out of the box with Linux for whatever. I don't mind that they play games 3 times slower than anybody else's GPU if they don't crash, don't require proprietary drivers, don't interfere with power management, hibernating, or sleep, don't eat battery like anything and don't turn black after warranty is over. I need my laptop for getting work done, and Intel graphics have delivered for that. It's really a criterion for buying a used laptop whether they have Intel grap
Re: (Score:2)
AMD has screwed up time and again in the CPU space. That they now claim their new product is as good or better than one from somebody that has been active in this space for a long time is simply not credible.
New designs are often better than the old, and frankly, AMD has always been the middle road in GPUs.
Re: (Score:2)
AMD has screwed up time and again in the CPU space.
And there is your problem. Not true.
Have they fixed the trig errors yet? (Score:2)
I ran across sin/cos glitches like this [github.com] a couple of years ago on Sandy Bridge. I've since noticed similar issues on all my Intel systems, the latest being Kaby Lake. So I don't have my hopes up for them fixing this any time soon. I guess it's yet another case of favouring speed over correctness.
- How many Intel engineers does it take to change a light bulb?
- cos(2.0*pi)
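A cheap way to spot trig glitches without a high-precision reference is the sin^2 + cos^2 = 1 identity; bugs like the one linked above show up as residuals far larger than double-precision rounding. (This Python sketch exercises the CPU's libm rather than the GPU shader path where that issue lives, so treat it only as an illustration of the kind of check involved.)

import math

def worst_identity_error(samples=100_000, span=1000.0):
    """Largest |sin(x)^2 + cos(x)^2 - 1| over evenly spaced x in [-span, span]."""
    worst_x, worst_err = 0.0, 0.0
    for i in range(samples):
        x = -span + (2.0 * span) * i / (samples - 1)
        err = abs(math.sin(x) ** 2 + math.cos(x) ** 2 - 1.0)
        if err > worst_err:
            worst_x, worst_err = x, err
    return worst_x, worst_err

if __name__ == "__main__":
    x, err = worst_identity_error()
    # A sane implementation stays within a few ULPs (~1e-16); a sloppy
    # approximation produces errors orders of magnitude larger.
    print(f"worst |sin^2 + cos^2 - 1| = {err:.3e} at x = {x:.6f}")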
Intel has come a long way since the i740 Real3D (Score:2)
Intel has come a long way since the i740 Real3D semi-failure.
I installed lots of PCs with that card. They had great high-resolution 2D graphics with ironically lame 3D rendering performance.