Why is Intel's GPU Program Having Problems?
An anonymous reader shares a report: If you recall, DG2/Arc Alchemist was supposed to debut late last year as a holiday sales item. This would have been roughly 18 months late, something we won't defend. What was meant to be a device that nipped at the heels of the high-end market was now a solid mid-range device. Silicon ages far worse than fish, but it was finally coming out. That holiday release was delayed 4-6 weeks because the factory making the boards was hit by Covid and things obviously slowed down or stopped. SemiAccurate has confirmed this issue. If you are going to launch for holiday sales and you get delayed, it is probably a better idea to time it with the next obvious sales uplift than to launch it between, oh say, Christmas and New Year's Day. So that pushed DG2/AA into mid/late Q1. Fair enough. During the Q2/22 analyst call, Intel pointed out that the standalone cards were delayed again and the program wasn't exactly doing well. While the card is out now, reports of the drivers being, let's be kind and say, sub-optimal abounded. The power/performance ratio was way off too, but there aren't many saying the price is way off unless you are looking at Intel's margins to determine what to buy the kiddies.
[...] Intel is usually pretty good at drivers, but this time around things are quite uncharacteristic. Intel offered a few reasons for this on their Q2/22 analyst call, which boiled down to 'this is harder than we thought,' but that isn't actually the reason. If that were it, the SemiAccurate blamethrower would have been used and refueled several times already, so what really caused this mess? The short version is to look at where the drivers are being developed. In this case Intel is literally developing the DG2 drivers all over the world, as they do for many things, hardware and software. The problem this time is that key parts of the drivers for this GPU, specifically the shader compiler and related key performance pieces, were being done by the team in Russia. On February 24, Russia invaded Ukraine and the West put some rather stiff sanctions on the aggressor, essentially cutting off the ability to do business in the country. Even if businesses decided to stick with Russia, it would have been nearly impossible to pay the wages of their workers due to sanctions on financial institutions and related uses of foreign currencies. In short, Intel had a key development team cut off almost overnight with no warning. This is why SemiAccurate says it isn't their fault; even if they saw the war coming, they probably didn't see the sanctions coming.
One word... patents (Score:2)
The bulk of graphics card technology is covered by patents owned by Nvidia, AMD, or third parties. Some of these can be licensed, which only adds to the cost of the chips. Some, however, cannot, making it extremely difficult to build competing products.
Re:One word... patents (Score:4, Insightful)
Re: (Score:3)
^^ THIS 100%.
Intel has been making GPUs since 1998 [wikipedia.org].
Your note about discrete GPUs is a good one. Expecting Intel to match the performance of AMD or Nvidia is unrealistic. Their (lack of quality) drivers and (over) price are the main problems. (Also, sometimes you just need raw performance.)
If Intel continues to invest in their hardware (and driver team) for the next few years, they could have something competitive. We will see.
Re: (Score:2)
Re: (Score:2)
Their (lack of quality) drivers
A driver just sends (or receives) data between the OS and the GPU/device.
There is no quality to it. It's only a handful of lines of code.
In other words: either the driver works or it doesn't. There are no good/bad or fast/slow drivers.
Re: (Score:2)
This is their first discrete offering.
Not true, e.g. https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
I was going to say that. It was also possible to get the GMA950 on a PCIe card, although they were quite rare (GMA950 PCIe cards were included with the Pentium 4-based Apple Developer Transition Kit machines).
But strictly speaking, neither the i740 nor GMA950 came from Intel organically. The i740 was from Real3D, acquired from Martin Marietta (who were previously involved in development of the 3D hardware for Sega's Model 2 and Model 3 arcade platforms and various commercial and military flights simulator
Re: (Score:2)
This is their first discrete offering.
Not exactly. [wikipedia.org]
Re: (Score:3)
It's far from their first discrete offering ... they were offering DirectX-compatible cards back in the Windows 9x era and they sucked big. To your point, I don't think anyone expects Intel to come out swinging and matching the likes of AMD or Nvidia blow for blow ... but for fuck's sake, they could have a bit better driver support than Matrox ... they do have 20-something years of writing graphics drivers.
Re: (Score:1)
The bulk of graphics card technology is covered by patents owned by Nvidia, AMD, or third parties. Some of these can be licensed, which only adds to the cost of the chips. Some, however, cannot, making it extremely difficult to build competing products.
Those of us that read the article know that Semiaccurate's assessment is that the one word is: Russia. As in, driver work was being done in Russia, and those teams are no longer accessible to Intel due to Russia's invasion of Ukraine.
Who knows if their assessment is fully accurate.
Re:selective memory (Score:4, Insightful)
I think Intel's drivers have always been "good" in terms of being stable and doing what was expected of them. Of course, what was expected from an iGPU was to not crash and to keep displaying the basics of the OS and browsers on the screen. In my experience with Intel integrated systems, this has always been the case. They tend to install and work with little to no drama. The benefit of low expectations.
Of course now the expectation for these drivers is to provide competitive 3D rendering performance for games on par with AMD and Nvidia, who have a couple of decades' head start on the problem, a much taller order if there ever was one.
Re: (Score:2)
The reviews I have seen all say these cards are anything but that.
Re: (Score:2)
Well, it seems like the hardware is capable of competing, but drivers are more important than hardware, and they are struggling to compete on that front. It's hard, no doubt, but these things probably need to be in the cooker for another 6 months.
For the one GPU that's been released (Score:2)
Re: (Score:2)
Re:selective memory (Score:4, Interesting)
I've generally found Intel drivers to be "good" compared to many competing products' drivers. Graphics is the exception; like the GP post said, I too have run into problems there.
Intel chipset drivers were generally pretty rock-solid in the era when Via chipsets were utterly miserable and bluescreened my Win9x install regularly, needing a reinstall of the IRQ routing drivers, and often locked up under Win2k/XP too. The Nvidia NForce line was pretty competitive and offered AMD chip support, but tended to be slightly more expensive and a little flaky with some of the fancier tech it tended to support. A couple other chipsets I'm blanking on the name of now also tended to be a little questionable now and then.
Intel NIC drivers were always just fine. They were pretty standardized, had only a handful of drivers that supported every card they made (compared to, say, Realtek that has a giant list of drivers that only cover a very small range of chips each), and also supported a weirdly wide range of OSes. MS-DOS/WFW311 drivers for their then-latest gigabit ethernet? Sure, why not.
Graphics drivers are where things have been much more hit-and-miss for me. In my experience there's basically a "sweet spot" of driver versions, usually when a particular version of Windows has been out for a while but not replaced yet. For example, I had good luck with Intel graphics drivers on Win2k once it'd been out for a while but XP hadn't fully replaced it yet. The earliest versions were terrible and left me at 640x480 with 16 colors plenty of times, needing a reinstall. Eventually, as driver versions crawled on, later versions on Win2k got more and more unstable again as most focus was put on XP. The same thing happened with XP when it was new vs. after Vista came out, even for chipsets that wouldn't have supported WDDM/Aero anyway. It's a mystery. The real secret is to find a good version that works and not touch it. (I've also had more trouble with Intel graphics drivers than other chips when doing things like running Windows 7 on a laptop with an Intel chipset with no graphics drivers past official XP support, even in XPDM graphics driver modes, but that's hard to fault them too much for given the nature of mixing up versions like that, even if I generally have good luck with other vendors' graphics cards.)
Re: (Score:2)
Realtek is a bit odd because a lot of what they make is stuff that will be looked to as a replacement/up-integration option for connecting really old systems that run DOS, etc., to modern networks.
As far as talk about VIA drivers on Win9x and Intel's netbook-era display drivers for their IGP products goes - I know it's hard to accept, but we are getting old. These experiences are probably no longer representative. The people, technology, and processes that led to those conditions are long gone.
Intel is good at drivers? (Score:3)
Re:Intel is good at drivers? (Score:5, Interesting)
Gamers Nexus have videos about this.
https://www.youtube.com/watch?... [youtube.com]
https://www.youtube.com/watch?... [youtube.com]
They identified two problem areas. The first is that DX11 and earlier require game-specific optimizations that just take a lot of time to create and make stable. The other is a lack of QA for the driver UI. The first just needs time, on the order of many quarters. The second is getting fixed in a timely manner.
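To give a flavor of why those per-title optimizations take quarters rather than weeks, here's a hypothetical sketch of the kind of per-game profile table a driver stack might consult; the executable names, flags, and lookup function are all invented for illustration, not Intel's actual code:

# Hypothetical per-game tuning profiles; every entry has to be found,
# tuned, and regression-tested against that specific title by hand.
GAME_PROFILES = {
    "some_dx11_game.exe":   {"batch_small_draws": True,  "reorder_barriers": False},
    "another_old_game.exe": {"batch_small_draws": False, "reorder_barriers": True},
    # ...hundreds more entries accumulate over years of driver releases...
}

DEFAULTS = {"batch_small_draws": False, "reorder_barriers": False}

def profile_for(exe_name: str) -> dict:
    # Unknown titles fall back to safe, unoptimized defaults.
    return GAME_PROFILES.get(exe_name.lower(), DEFAULTS)

A new entrant starts with that table nearly empty, which is essentially the gap Gamers Nexus is describing.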
Re: (Score:2)
Um... it's a brand new GPU (Score:3)
It's genuinely silly to see people bitching about drivers on these. In a year or two, yeah, go ahead and bitch. But you're an early adopter. When did we forget what that was like? I suppose, to be fair, nobody makes new tech much anymore. But still, if you can find it (big if), the current Intel card hangs with the RX 6400 and has 2 gigs more RAM for the same price. Not all that bad (though I have to admit I think I'd still go with the 6400 or try to squeeze out the cash for a 6500 XT).
Re: (Score:2)
Intel have been clear that older games won't be supported, which is fair enough. Problem is that current games have issues, which suggests that they released the drivers before they were really ready.
That is backed up by many features simply not working in any game. Particularly shortly after launch a lot of stuff was simply non-functional. Being an early adopter is one thing, but this is more like alpha testing.
Re: (Score:2)
Re: (Score:2)
It would be nice if they could at least make drivers that would just be slow, with fewer supported games, instead of blowing up.
Not my problem though, I try to keep Intel outside. I use the occasional Intel NIC, but they've had some notable flubs there too.
Inconsistent excuses (Score:3)
A Note About Intel CPU/GPU chips. (Score:3, Insightful)
If you work in music production, you are best not to use combo versions of such chips. The GPU side introduces noise into the USB 5V power, and that can be heard, if you listen, just by moving your mouse around. Though you may be able to turn the GPU side off in the BIOS, it's not 100%. Intel does make non-combo versions of their CPUs.
Re: (Score:2)
That bad?
Re:A Note About Intel CPU/GPU chips. (Score:4, Informative)
It's power rail feedthrough into the output DAC and amplifiers... You're literally hearing the rails bop around ever so slightly as the switching regulators snap between speeds and continuous/discontinuous conduction mode. The switching is at hundreds of kHz, but the slow evolution is audible. The voltage tolerances required by modern processors and DRAM are incredibly stringent, but audio devices have little/no PSRR at the frequencies SMPSes now run at.
There's also magnetostriction noise, but that doesn't generally go into the line (though magnetic field pickup is _always_ a concern with non-toroidal magnetics); that's literally sound created by the alternating magnetization of the power inductors. This too can be subject to audio-range envelope modulation, which leads to lightly loaded SMPSes "squealing."
It's not impossible or even that hard to design a filter circuit that will kill it all, but hey, that's more components = more dollars to make.
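For a rough sense of scale, here's a back-of-the-envelope check, a minimal sketch with assumed component values (not from any real board design), of how much an ideal second-order LC low-pass knocks down ripple at the switching frequency:

import math

# Assumed example values for illustration only, not from any real design.
L_H = 10e-6          # 10 uH series inductor
C_F = 100e-6         # 100 uF shunt capacitor
f_switch = 500e3     # switching frequency, "100s of kHz" as described above

f_cutoff = 1 / (2 * math.pi * math.sqrt(L_H * C_F))    # ~5 kHz for these values
attenuation_db = 40 * math.log10(f_switch / f_cutoff)  # ideal 2nd order: 40 dB/decade

print(f"cutoff ~{f_cutoff/1e3:.1f} kHz, ~{attenuation_db:.0f} dB down at {f_switch/1e3:.0f} kHz")
# Real parts have ESR/damping, and the audio-band envelope wander sits below this
# cutoff, so fully killing it takes more than one LC stage (e.g. a post-regulator).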
Re: (Score:2)
Interesting. I do understand the comment about the regulators. Good for power-saving, bad because it produces low frequency (relatively speaking) effects. I also understand about ferrites producing mechanical noise under some conditions. Funnily, the only case I have observed in my computers so far was with toroid cores, which are much less likely to produce this effect. Guess the manufacturer went cheap there on core material and size.
I am not a musician in any way, but I have reasonably good hearing. The
Ugh... (Score:2)
"Silicon ages far worse than fish "
No it doesn't. It's the pumped-up demand for the latest happy shiny that causes silicon to "age".
The chips themselves will likely be lying around for centuries, many of them still perfectly fine after we die.
Games are garbage (Score:2)
It's hard to develop a new driver when other drivers only work because they contain years of patches and hacks to fix non-compliant games. You can do everything according to the specs, but there are always blind spots.
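As a toy illustration of the kind of hack that accumulates (entirely invented, not from any real driver): a strictly spec-compliant path rejects an out-of-range value, while a mature driver quietly clamps it because some shipped game depends on it working:

# Invented example: say the spec allows anisotropy 1..16, but some old title
# passes 0 or 32 and only "works" because incumbent drivers clamp it.
def set_max_anisotropy(requested: int, quirks: dict) -> int:
    if quirks.get("clamp_bad_anisotropy", False):
        # Years-old workaround carried forward release after release.
        return min(max(requested, 1), 16)
    if not 1 <= requested <= 16:
        # A by-the-book new driver fails here, and the driver gets the blame.
        raise ValueError("anisotropy out of range")
    return requested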
A good Intel display driver? (Score:1)
Intel ... Very Dumb (Score:2)
How could you not see the sanctions coming and not have a backup? WE SAW THAT COMING, and got our tech that was being developed in Russia out, along with the high-value devs working on it. I mean, WTF. How can giant Intel not see that coming when a small company with a sliver of a fraction of their revenue did?
simple enough (Score:1)
b) GPU drivers are hard