
Why is Intel's GPU Program Having Problems?

An anonymous reader shares a report: If you recall, DG2/Arc Alchemist was supposed to debut late last year as a holiday sales item. This would have been roughly 18 months late, something we won't defend. What was meant to be a device that nipped at the heels of the high-end market was now a solid mid-range device. Silicon ages far worse than fish, but it was finally coming out. That holiday release was delayed 4-6 weeks because the factory making the boards was hit by Covid and things obviously slowed down or stopped. SemiAccurate has confirmed this issue. If you are going to launch for holiday sales and you get delayed, it is probably a better idea to time it with the next obvious sales uplift than to launch it between, oh say, Christmas and New Year's Day. So that pushed DG2/AA into mid/late Q1. Fair enough. During the Q2/22 analyst call, Intel pointed out that the standalone cards were delayed again and the program wasn't exactly doing well. While the card is out now, reports of the drivers being, let's be kind and say sub-optimal, abounded. The power/performance ratio was way off too, but there aren't many saying the price is way off, unless you are looking at Intel's margins to determine what to buy the kiddies.

[...] Intel is usually pretty good at drivers, but this time around things are quite uncharacteristic. Intel offered a few reasons for this on their Q2/22 analyst call which boiled down to 'this is harder than we thought,' but that isn't actually the reason. If that were it, the SemiAccurate blamethrower would have been used and refueled several times already, so what really caused this mess? The short version is to look at where the drivers are being developed. In this case Intel is literally developing the DG2 drivers all over the world, as they do for many things, hardware and software. The problem this time is that key parts of the drivers for this GPU, specifically the shader compiler and related key performance pieces, were being done by the team in Russia. On February 24, Russia invaded Ukraine, and the West put some rather stiff sanctions on the aggressor, essentially cutting off the ability to do business in the country. Even if businesses had decided to stick with Russia, it would have been nearly impossible to pay their workers' wages due to sanctions on financial institutions and related uses of foreign currencies. In short, Intel had a key development team cut off almost overnight with no warning. This is why SemiAccurate says it isn't Intel's fault: even if they saw the war coming, they probably didn't see the sanctions coming.
Comments:
  • The bulk of graphics card technology is covered by patents owned by Nvidia, AMD, or third parties. Some of these can be licensed, which only adds to the cost of the chips. Some, however, cannot, making it extremely difficult to build competing products.

    • by UnknowingFool ( 672806 ) on Wednesday September 14, 2022 @02:24PM (#62881815)
      But it is not as if Intel has never made a GPU before. They have made integrated GPUs for generations of their chips; this is their first discrete offering. It is not surprising that the performance was not up to Radeon or Nvidia levels, as their integrated versions were not either. It was expected that it might take a few generations for them to get better. What is surprising, as noted, is that things like the drivers were so bad.
      • ^^ THIS 100%.

        Intel has been making GPUs since 1998 [wikipedia.org].

        Your note about discrete GPUs is a good one. Expecting Intel to match the performance of AMD or Nvidia is unrealistic. Their (lack of) driver quality and (over)pricing are the main problems. (Also, sometimes you just need raw performance.)

        If Intel continues to invest in their hardware (and driver team) for the next few years, they could have something competitive. We will see.

        • Also, this is not Intel's first attempt at discrete graphics. They tried around 15 years ago, but that product never launched.
        • Their (lack of quality) drivers
          A driver just sends (or receives) data between the OS and the GPU/device.

          There is no quality to it. It is a bunch of a few lines of code.

          In other words: either the driver works or it does not. There are no good/bad or fast/slow drivers.

      • This is their first discrete offering.

        Not true, e.g. https://en.wikipedia.org/wiki/... [wikipedia.org]

        • by _merlin ( 160982 )

          I was going to say that. It was also possible to get the GMA950 on a PCIe card, although they were quite rare (GMA950 PCIe cards were included with the Pentium 4-based Apple Developer Transition Kit machines).

          But strictly speaking, neither the i740 nor the GMA950 came from Intel organically. The i740 was from Real3D, acquired from Martin Marietta (who were previously involved in the development of the 3D hardware for Sega's Model 2 and Model 3 arcade platforms, and various commercial and military flight simulators).

      • by kriston ( 7886 )

        This is their first discrete offering.

        Not exactly. [wikipedia.org]

      • by Osgeld ( 1900440 )

        It's far from their first discrete offering... they were offering DirectX-compatible cards back in the Windows 9x era, and they sucked big time. To your point, I don't think anyone expects Intel to come out swinging and matching the likes of AMD or Nvidia blow for blow... but for fuck's sake, they could have a bit better driver support than Matrox... they do have 20-something years of writing graphics drivers.

    • The bulk of graphics card technology is covered by patents owned by Nvidia, AMD, or third parties. Some of these can be licensed, which only adds to the cost of the chips. Some, however, cannot, making it extremely difficult to build competing products.

      Those of us who read the article know that SemiAccurate's assessment is that the answer is one word: Russia. As in, the driver work was being done in Russia, and those teams are no longer accessible to Intel due to Russia's invasion of Ukraine.

      Who knows if their assessment is fully accurate.

  • by DrXym ( 126579 ) on Wednesday September 14, 2022 @02:42PM (#62881869)
    Maybe for their normal chipsets, but they've been absolutely abysmal for IGP GPUs. Typically you got the driver that came with your laptop and, fuck you, that's it. Maybe there were reasons for this (probably involving passing the buck to vendors), but as an end user it sucked.
    • by klui ( 457783 ) on Wednesday September 14, 2022 @03:13PM (#62881985)

      Gamers Nexus have videos about this.

      https://www.youtube.com/watch?... [youtube.com]
      https://www.youtube.com/watch?... [youtube.com]

      They identified two problem areas. The first is that DX11 and earlier APIs require game-specific optimizations that simply take a lot of time to create and make stable. The other is a lack of QA for the driver UI. The first just needs time, on the order of many quarters. The second is being fixed in a timely manner.
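
      To make "game-specific optimization" concrete, here is a toy sketch of the per-title profile idea (Python for brevity; real drivers do this in C/C++ inside the driver itself). The game names and tweak flags are invented for illustration:

        # Toy model of per-game driver profiles: the driver keys off the
        # executable name and applies workarounds accumulated over years
        # of testing. All names and flags below are invented.
        GAME_PROFILES = {
            "some_dx11_game.exe": {"reorder_draw_calls": True, "shader_cache": "aggressive"},
            "old_broken_game.exe": {"tolerate_invalid_barriers": True},
        }

        def profile_for(exe_name: str) -> dict:
            # Unknown titles get spec-compliant defaults, which is exactly
            # where a young driver has had no time to accumulate fixes.
            return GAME_PROFILES.get(exe_name.lower(), {})

        print(profile_for("Some_DX11_Game.exe"))

      The point is the long tail: each entry represents hand-tuned work that AMD and Nvidia have been amassing for two decades.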

      • by DrXym ( 126579 )
        I doubt it even matters whether older games are optimal - on a modern rig, even run through some translation thunk, they probably run fine. But Intel should have gotten the basics right, especially the drivers, and it looks like they haven't. Maybe in 12 months the picture will be different.
  • by rsilvergun ( 571051 ) on Wednesday September 14, 2022 @02:54PM (#62881917)
    of course they're having problems. Every time there's a new game, AMD and Nvidia release a driver built for it. Those have built up over time.

    It's genuinely silly to see people bitching about drivers on these. In a year or two, yeah, go ahead and bitch. But you're an early adopter. When did we forget what that was like? I suppose, to be fair, nobody makes new tech much anymore. But still, if you can find it (big if), the current Intel card hangs with the RX 6400 and has 2 GB more RAM for the same price. Not all that bad (though I have to admit I think I'd still go with the 6400 or try to squeeze out the cash for a 6500 XT).
    • by AmiMoJo ( 196126 )

      Intel have been clear that older games won't be supported, which is fair enough. The problem is that current games have issues too, which suggests they released the drivers before they were really ready.

      That is backed up by many features simply not working in any game. Particularly shortly after launch, a lot of stuff was simply non-functional. Being an early adopter is one thing, but this is more like alpha testing.

      • by BigZee ( 769371 )
        It's not fair enough. Most of us have games in our collections going back a couple of decades, if not more. I enjoy an occasional replay of Half-Life or Portal. The idea that these games are not supported is not reasonable. I'm glad to say that I do have a modern card that supports old games without any issues. Arc may be new, but Intel are not new to the GPU game. They need to do a lot better.
    • It would be nice if they could at least make drivers that would just be slow, with fewer supported games, instead of blowing up.

      Not my problem, though; I try to keep Intel outside. I use the occasional Intel NIC, but they've had some notable flubs there too.

  • by misnohmer ( 1636461 ) on Wednesday September 14, 2022 @03:21PM (#62882017)
    The timeline doesn't add up. If last year's release delay was caused solely by a board manufacturing issue (as the report claims), that would mean the drivers were ready before Russia invaded Ukraine. Obviously, had there not been a manufacturing issue, the drivers would have shipped in the same or worse shape than they are in today - unless the claim is that they messed up the drivers since last year, in which case the fix is trivial: just revert all changes made since the originally planned release (presumably Intel has a dev outside of Russia who knows how to revert code using source control).
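
    For what it's worth, that revert really is mechanical. A minimal sketch, assuming the originally planned release was tagged; the tag name here is made up:

      # Revert everything since a known-good tag, driven from Python.
      # Assumes a git checkout; "dg2-rc-2021q4" is a hypothetical tag
      # marking the originally planned release.
      import subprocess

      TAG = "dg2-rc-2021q4"

      def revert_since(tag: str) -> None:
          # Stage reverts of every commit since the tag (this can stop
          # on conflicts needing manual resolution), then commit once.
          subprocess.run(["git", "revert", "--no-commit", f"{tag}..HEAD"], check=True)
          subprocess.run(["git", "commit", "-m", f"Revert all changes since {tag}"], check=True)

      revert_since(TAG)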
  • by 3seas ( 184403 ) on Wednesday September 14, 2022 @04:17PM (#62882205) Homepage Journal

    If you work in music production, you are best off not using the combo versions of such chips. The GPU side introduces noise into the USB 5V power, and that can be heard, if you listen, by moving even just your mouse around. Though you may be able to turn the GPU side off in the BIOS, it's not 100%. Intel does make non-combo versions of their CPUs.

    • by gweihir ( 88907 )

      That bad?

      • by zeeky boogy doog ( 8381659 ) on Wednesday September 14, 2022 @08:32PM (#62882965)
        Even using my laptop's discrete GPU, I can easily hear the noise injected into the audio line out with my good headphones if the room is fairly quiet. Almost any sound being played will be louder, but it's there. Ears are incredibly sensitive when there's nothing else going on.

        It's power rail feedthrough into the output DAC and amplifiers... You're literally hearing the rails bop around ever so slightly as the switching regulators snap between speeds and between continuous and discontinuous conduction mode. The switching is at hundreds of kHz, but the slow evolution of its envelope is audible. The voltage tolerances required by modern processors and DRAM are incredibly stringent, but audio devices have little/no PSRR at the frequencies SMPSes now run at.
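
        To put rough numbers on that feedthrough, here's a back-of-envelope sketch; the 50 mV ripple and 30 dB PSRR figures are assumptions, not measurements:

          # Back-of-envelope: supply ripple reaching the audio output,
          # given the codec's PSRR at the ripple frequency. Both input
          # figures below are assumed, not measured.
          rail_ripple_mv = 50.0  # assumed ripple on the audio supply rail
          psrr_db = 30.0         # assumed PSRR at the offending frequency

          # PSRR is a voltage ratio in dB: attenuation = 10^(-PSRR/20)
          output_ripple_mv = rail_ripple_mv * 10 ** (-psrr_db / 20)
          print(f"{output_ripple_mv:.2f} mV reaches the output")  # ~1.58 mV

        A millivolt or two of audio-band garbage on a line out is easily audible in a quiet room on efficient headphones, consistent with the effect described above.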

        There's also magnetostriction noise, but that doesn't generally get into the line (though magnetic field pickup is _always_ a concern with non-toroidal magnetics); that's literally sound created by the alternating magnetization of the power inductors. This too can be subject to audio-range envelope modulation, which leads to lightly loaded SMPSes "squealing."

        It's not impossible or even that hard to design a filter circuit that will kill it all, but hey, that's more components = more dollars to make.
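
        The filter sizing is back-of-envelope too. A sketch with illustrative component values (assumptions, not a vetted design):

          import math

          # Cutoff of a simple second-order LC low-pass on the supply
          # rail feeding the audio section. Values are illustrative.
          L = 10e-6   # 10 uH inductor
          C = 100e-6  # 100 uF capacitor

          f_c = 1 / (2 * math.pi * math.sqrt(L * C))
          print(f"cutoff ~= {f_c / 1000:.1f} kHz")  # ~5.0 kHz

          # A second-order filter rolls off at 40 dB/decade, so switching
          # noise at 500 kHz (two decades above cutoff) is attenuated by
          # roughly 80 dB.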
        • by gweihir ( 88907 )

          Interesting. I do understand the comment about the regulators. Good for power-saving, bad because it produces low-frequency (relatively speaking) effects. I also understand about ferrites producing mechanical noise under some conditions. Funnily enough, the only case I have observed in my computers so far was with toroid cores, which are much less likely to produce this effect. I guess the manufacturer went cheap there on core material and size.

          I am not a musician in any way, but I have reasonably good hearing.

  • "Silicon ages far worse than fish "

    No it dosen't. It's the pumped up demand for the latest happy shiny that causes silicon to "age".

    The chips themselves will likely be laying around for centuries, many of them still perfectly fine after we die.

  • It's hard to develop a new driver when the incumbents' drivers only work because they contain years of patches and hacks to fix non-compliant games. You can do everything according to the specs, but there are always blind spots.

  • I wouldn't say Intel has a recent problem with GPU drivers. In my experience, Intel display drivers have historically suffered from poor implementation, ranging from 'passable' to 'awful'. Take the GMA500 chipset as an example, a design that was created outside of Intel. Its Intel production drivers offer to this day 10-20% of the performance of tweaked drivers released by enthusiasts, such as an Italian developer (https://gma500booster.blogspot.com/). These non-Intel optimised drivers lift performance several times over.
  • How could you not see the sanctions coming and not have a backup? WE SAW THAT COMING, and got our tech developed in Russia out, along with the high-value devs working on it. I mean, WTF. How can giant Intel not see that coming when a company with a sliver of a fraction of their revenue did?

  • a) it's Intel; and
    b) GPU drivers are hard
