Intel Graphics Chief Leaves After Five Years (theverge.com)
After five years attempting to make Intel into a competitor for Nvidia and AMD in the realm of discrete graphics for gamers and beyond -- with limited success -- Raja Koduri is leaving Intel to form his own generative AI startup. The Verge reports: Intel hired him away from AMD in 2017, where he was similarly in charge of the entire graphics division, and it was an exciting get at the time! Not only had Intel poached a chief architect who'd just gone on sabbatical but Intel also revealed that it did so because it wanted to build discrete graphics cards for the first time in (what would turn out to be) 20 years. Koduri had previously been poached for similarly exciting projects, too -- Apple hired him away from AMD ahead of an impressive string of graphics improvements, and then AMD brought him back again in 2013.
Intel has yet to bring real competition to the discrete graphics card space as of Koduri's departure. [...] But the company has a long GPU roadmap, so it's possible things get better and more competitive in subsequent gens. It took a lot longer than five years for Nvidia and AMD to make it that far. By the time Koduri left, he wasn't just in charge of graphics but also Intel's "accelerated computing" initiatives, including things like a crypto chip.
Too bad the ARC cards have to emulate older APIs (Score:2)
Too bad the Arc cards have to emulate older graphics APIs, which are used by both older and current games.
What were they thinking?
Re: (Score:3)
Re: (Score:2)
It's probably the case with all graphics cards, given that dxvk, for example, is able to match or sometimes even surpass the native drivers at running older DirectX versions.
Intel has actually improved old-game performance quite a lot since the launch drivers, through dxvk and plain optimization.
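For context on how that translation layer gets used in practice, here is a minimal sketch of enabling DXVK in an existing 64-bit Wine prefix. This assumes a DXVK release has already been unpacked into a local ./dxvk directory; the prefix path and game executable name are placeholders.

```shell
# Point at the Wine prefix to modify (placeholder path).
export WINEPREFIX="$HOME/.wine"

# Copy DXVK's Vulkan-based translation DLLs over Wine's built-in
# Direct3D implementations (assumes ./dxvk holds an unpacked release).
cp dxvk/x64/d3d9.dll dxvk/x64/d3d11.dll dxvk/x64/dxgi.dll \
   "$WINEPREFIX/drive_c/windows/system32/"

# Tell Wine to prefer the native (DXVK) DLLs over its built-ins.
export WINEDLLOVERRIDES="d3d9,d3d11,dxgi=n"

# Optional: overlay confirming DXVK is active, with FPS readout.
export DXVK_HUD=devinfo,fps

wine game.exe   # placeholder executable
```

This is a configuration sketch, not a definitive install guide; distributions and launchers (Steam's Proton, Lutris) usually wire DXVK up automatically.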
Re: (Score:3)
First thing you do when you start, prepare three envelopes...
Re: (Score:2)
What were they thinking?
They were probably thinking that, as a new entrant to the market, they shouldn't focus on something outdated that may well be irrelevant by the time they seriously become a market contender.
Re: (Score:2)
which may actually be irrelevant by the time they seriously become a market contender
That didn't work out that well.
Re: (Score:2)
You do realize that this is also EXACTLY how all pre-DX9 titles work, and have worked pretty much since the introduction of DX9, right? Old APIs are emulated in the new, modern, more robust API. That's how things evolve over time. There is absolutely nothing wrong with it, and nothing wrong with emulating DX9 inside of DX12. The utilities to do so have massively improved.
Or, put another way... this supposedly killed Intel Arc, but it's also the EXACT same thing that made the Steam Deck a success?
Re: (Score:2)
So it's the drivers to blame, right? And why did it take so long to get that right?
"Gone on sabbatical" (Score:4, Interesting)
Re:"Gone on sabbatical" (Score:5, Funny)
First off I find your statements unbelievable. I don't know this person specifically but I know many Indians in IT and none of them have ever exaggerated their abilities before. It's almost as if their culture forbids it.
Re: (Score:2)
It's weird that they fired him NOW, though, right after Intel fixed most of their serious Arc driver issues. It's finally acting like a competitive mid-range GPU.
Re: (Score:2)
Re: (Score:1)
Re: (Score:2)
Intel's discrete GPUs are not terrible. They had some severe driver issues early on, and support for older games is poor, but for certain market segments they are decent. Mostly due to low pricing.
For streaming, the option to have a separate AV1 encoder is nice. For laptops they offer a reasonable alternative to an AMD or Nvidia GPU for CAD, video encoding, and many esports titles. Even for light gaming they are okay.
For the budget end of the market that Intel is targeting, they have a decent product.
Exciting! (Score:2)
I'm excited to see someone go from failing upwards to failing sideways.
This guy is the Marissa Mayer of the GPU world.
We had a senior manager like that at our company (Score:2)
He was always boasting about his achievements in senior positions at previous companies, but he managed to almost wreck our company with his incompetence.
Leaving for AI startup (Score:2)
Re: (Score:1)
He went on sabbatical for a reason (Score:2)
Generative AI (Score:2)
Yup, generative AI is very good at generating money for copycat startups from naive FOMO investors.
They're selling Intel short (Score:2)
I wouldn't count Intel out of the game yet.