
Intel Graphics Chief Leaves After Five Years (theverge.com)

After five years attempting to make Intel a competitor to Nvidia and AMD in discrete graphics for gamers and beyond -- with limited success -- Raja Koduri is leaving Intel to form his own generative AI startup. The Verge reports: Intel hired him away from AMD in 2017, where he was similarly in charge of the entire graphics division, and it was an exciting get at the time! Not only had Intel poached a chief architect who'd just gone on sabbatical, but it also revealed that it did so because it wanted to build discrete graphics cards for the first time in (what would turn out to be) 20 years. Koduri had previously been poached for similarly exciting projects, too -- Apple hired him away from AMD ahead of an impressive string of graphics improvements, and then AMD brought him back again in 2013.

As of Koduri's departure, Intel has yet to bring real competition to the discrete graphics card space. [...] But the company has a long GPU roadmap, so it's possible things will get better and more competitive in subsequent generations. It took Nvidia and AMD a lot longer than five years to get that far. By the time Koduri left, he wasn't just in charge of graphics but also Intel's "accelerated computing" initiatives, including things like a crypto chip.


Comments:
  • Too bad the Arc cards have to emulate older graphics APIs, which are used by both older and current games.

    What were they thinking?

    • They were thinking it was a waste of time to build dedicated hardware or invest in legacy drivers for deprecated APIs. It's taken NVIDIA and AMD many years to polish their legacy drivers; Intel doesn't have that luxury.
    • by Z80a ( 971949 )

      It's probably the case with all graphics cards, given that, for example, DXVK is able to match or sometimes even surpass the native video drivers at running older DirectX versions.
      Intel actually improved old-game performance by quite a lot, compared to the launch drivers, by using DXVK and plain optimization.

    • Advice for every new Intel graphics head from the outgoing one, a tradition going back to the i740 a quarter of a century ago:

      First thing you do when you start, prepare three envelopes...

    • What were they thinking?

      Probably, as a new entrant to the market, they were thinking not to focus on something outdated which may actually be irrelevant by the time they seriously become a market contender.

      • by kriston ( 7886 )

        which may actually be irrelevant by the time they seriously become a market contender

        That didn't work out that well.

    • by darkain ( 749283 )

      You do realize that this is also EXACTLY how all pre-DX9 titles work, and have worked pretty much since the introduction of DX9, right? Old APIs are emulated in the new, modern, more robust API. That's how things evolve over time. There is absolutely nothing wrong with it, and nothing wrong with emulating DX9 inside of DX12. The utilities to do so have massively improved.

      Or, put another way... this is what killed Intel Arc, but is also the EXACT same thing that made the Steam Deck a success? (A conceptual sketch of this kind of translation layer follows at the end of the comments.)

      • by kriston ( 7886 )

        So it's the drivers to blame, right? And why did it take so long to get that right?

  • "Gone on sabatical" (Score:4, Interesting)

    by locater16 ( 2326718 ) on Tuesday March 21, 2023 @05:54PM (#63388969)
    AMD fired him, and Intel just discovered why. Koduri is a narcissistic idiot who talks up bullshit that doesn't deliver.
    • by Anonymous Coward on Tuesday March 21, 2023 @06:29PM (#63389079)

      First off, I find your statements unbelievable. I don't know this person specifically, but I know many Indians in IT and none of them have ever exaggerated their abilities before. It's almost as if their culture forbids it.

    • by leonbev ( 111395 )

      It's weird that they fired him NOW, though, right after Intel fixed most of their serious Arc driver issues. The Arc is finally acting like a competitive mid-range GPU.

      • by sbszine ( 633428 )
        Came here to say this. I think they've also had to drop the prices a bit though to shift units, so he might be copping the blame for that.
    • A new AI company... Elizabeth Holmes Mk II.
    • by AmiMoJo ( 196126 )

      Intel's discrete GPUs are not terrible. They had some severe driver issues early on, and support for older games is poor, but for certain market segments they are decent, mostly due to low pricing.

      For streaming, the option to have a separate AV1 encoder is nice. For laptops, they offer a reasonable alternative to an AMD or Nvidia GPU for CAD, video encoding, and many esports titles. Even for light gaming they are okay.

      For the budget end of the market that Intel is targeting, they have a decent product. I say

  • I'm excited to see someone go from failing upwards to failing sideways.

    This guy is the Marissa Mayer of the GPU world.

  • He boasted about all his achievements in senior positions at previous companies, but he managed to almost wreck our company with his incompetence.

  • And guess which GPUs he will not be using at his new AI startup: the ones he had a hand in creating.
  • He is not good at his job and decided to retire instead of getting fired, but Intel gave him a ton of money to do nothing for five years.
  • Yup, generative AI is very good at generating money for copycat startups from naive FOMO investors.

  • Arc is a long-term project. If they get the architecture right for the long term (for example, by getting rid of legacy blocks that support DirectX 9), they may find themselves in a really strong position in a couple of years, unhampered by legacy.

    I wouldn't count Intel out of the game yet.
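For readers wondering what "emulating an old API inside a new one" looks like in practice, here is a minimal conceptual sketch in Python. It is not based on DXVK's or Intel's actual driver internals; every class and method name below is invented for illustration. The general pattern is the one the comments describe, though: a shim exposes the legacy immediate-mode interface to the application while recording the calls into command buffers for a modern, deferred-execution backend.

from dataclasses import dataclass
from typing import List

@dataclass
class Command:
    """A call recorded for the modern, deferred-execution backend."""
    op: str
    args: tuple

class ModernDevice:
    """Stand-in for a modern API (think Vulkan or D3D12): work is
    recorded into command buffers and submitted explicitly."""
    def submit(self, commands: List[Command]) -> None:
        for cmd in commands:
            print(f"GPU executes {cmd.op}{cmd.args}")

class LegacyDeviceShim:
    """Stand-in for an old API (think D3D9): each call looks immediate
    to the application, but the shim just records it and batches the
    frame."""
    def __init__(self, backend: ModernDevice) -> None:
        self.backend = backend
        self.pending: List[Command] = []

    def set_texture(self, slot: int, name: str) -> None:
        self.pending.append(Command("bind_texture", (slot, name)))

    def draw_triangles(self, count: int) -> None:
        self.pending.append(Command("draw", (count,)))

    def present(self) -> None:
        # Flush the whole frame to the modern backend in one submission,
        # the way translation layers batch legacy immediate-mode calls.
        self.backend.submit(self.pending)
        self.pending.clear()

# An "old" application runs unmodified against the shim:
device = LegacyDeviceShim(ModernDevice())
device.set_texture(0, "bricks")
device.draw_triangles(1200)
device.present()

The hard part in real translation layers such as DXVK is, of course, translating shader bytecode and matching the old API's exact semantics, which goes some way toward explaining why getting the drivers right took so long.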
