
Intel Reportedly Designing Arctic Sound Discrete GPU For Gaming, Pro Graphics (hothardware.com)

MojoKid shares a report from HotHardware: When AMD's former graphics boss Raja Koduri landed at Intel after taking a much-earned hiatus from the company, it was seen as a major coup for the Santa Clara chip outfit, one that seemed to signal that Intel might be aiming to compete in the discrete graphics card market. While nothing has been announced in that regard, some analysts are claiming that there will indeed be a gaming variant of Intel's upcoming discrete "Arctic Sound" GPU. According to reports, Intel originally planned to build Arctic Sound graphics chips mainly for video streaming chores and data center activities. However, claims are surfacing that the company has since decided to build out a gaming variant at the behest of Koduri, who wants to "enter the market with a bang." Certainly a gaming GPU that could compete with AMD and NVIDIA would accomplish that goal. Reportedly, Intel could pull together two different versions of Arctic Sound: one an integrated chip package, like the Core i7-8809G (Kaby Lake-G) but with Intel's own discrete graphics, the other a standalone chip that will end up in traditional graphics cards. Likely both will have variants designed for gaming, just as AMD and NVIDIA build GPUs for both professional and gaming use.


  • by Anonymous Coward

    While they may be able to get away with it in the CPU market, unless they actually make these GPUs price-competitive it won't matter much. And since these GPUs will definitely have signed firmware and DRM, it isn't like they will be a compelling alternative for the Open Source crowd.

    Get back to me when they offer a version with unsigned firmware and methods for technically savvy end-users to prove their GPUs are secure, and we might have something to talk about. Otherwise it will be another has-been like the

    • by goombah99 ( 560566 ) on Wednesday April 11, 2018 @09:31AM (#56417763)

      Normally it would be hard to crack this market, but five stars are aligning right now that are going to make this easy.
      1. Cryptocurrency has made NVIDIA cards scarce and expensive. While that has probably peaked and will wane for Bitcoin and now Ethereum, new currencies are going to emerge for which GPUs will still matter.

      2. There's about to be a paradigm shift to real-time ray tracing. GPUs have just reached the critical level of performance, while new standards, drivers, and libraries to support them are emerging that will bring this into the next generation of games about to be written. (A sketch of the per-ray math follows at the end of this comment.)

      3. VR and augmented reality have not come and gone. Far from dead, they are just resting, like the parrot. Well, maybe not like a parrot. Vaporware like Magic Leap is about to become real ware, but the problem has been insufficient performance for real-time augmented graphics.

      4. And I saved the best market for last. Driverless cars and self-flying drones depend on GPUs. That market isn't even commercial yet. Time to leap.

      5. But none of the above matters. NVIDIA and AMD can and are expanding into all those niches, and they have the market channels and cost scales to do it. Competition can't get started. Unless, of course, you happen to have infinitely deep pockets, a known history of selling loss leaders to crush competitors, and a superior channel to mobo makers who already gulp down your chipsets.

      Intel's timing is pretty good: rising demand, shifting requirements, an actual need for improved performance, and deep pockets mean they can enter the market at the top tier of performance while competing on price without worrying about market share.
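
      A minimal sketch of why real-time ray tracing is such a GPU-hungry workload: a single ray-sphere intersection test, written in Python purely for readability (the scene and numbers are illustrative only, not from any real renderer).

          import math

          def ray_sphere_hit(origin, direction, center, radius):
              """Distance along the ray to the nearest sphere hit, or None.

              Solves |origin + t*direction - center|^2 = radius^2, a quadratic
              in t; a real, positive root means the ray hits the sphere.
              """
              oc = tuple(o - c for o, c in zip(origin, center))  # center -> origin
              a = sum(d * d for d in direction)
              b = 2.0 * sum(o * d for o, d in zip(oc, direction))
              c = sum(o * o for o in oc) - radius * radius
              disc = b * b - 4.0 * a * c
              if disc < 0.0:
                  return None                   # ray misses the sphere entirely
              t = (-b - math.sqrt(disc)) / (2.0 * a)
              return t if t > 0.0 else None     # hit must be in front of the origin

          print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # -> 4.0

      At 1920x1080, 60 frames per second, and a few bounces per pixel, that is billions of such tests per second: exactly the embarrassingly parallel arithmetic a GPU is built for.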

      • At first I wanted to make some joke about the sustained fury of crabs scuttling about and reaching long claws fruitlessly in the dark of the ocean floor. But all that actually came to mind was the sustained fury of the towering, boat-tossing waves of an Arctic storm, and in fact how a GPU could render that power and scale.

    • I wonder if this will go the way of Voodoo Graphics or Cyrix: the big two are just... too big!

    • by bspus ( 3656995 )

      Given the whole mining frenzy that is still driving GPU prices to the moon (unlike the coins they are used to mine), if there ever was a time for a third contestant in this market, now should be it.

  • by klingens ( 147173 ) on Wednesday April 11, 2018 @08:19AM (#56417473)

    Is it the third attempt, or are there even more failed ones?
    Intel 740, Larrabee, and now this. Even if they are successful and finally get a miracle where they produce hardware that is actually good enough, they won't beat NVIDIA or AMD unless they use their fab tech to build a much bigger, much more expensive-to-fab chip. But let's just say they pull the miracle rabbit out of the hat. Their drivers will still suck for games. To get a foot in this market you need several years/generations of competitive hardware, so that game engines are written with explicit thought for you, games do tests and fixes on your hardware, a driver team works with game makers for a long time, etc. I just don't see the Intel video card driver team being capable of this.

    The only chance Intel maybe has is to convince the console makers to use their chips instead of AMD's for the next consoles. With enough money/rebates and the great Intel sales magic to OEMs, this might even work. But for discrete PC gaming this is all DOA. I just don't see how Intel can make money on this, not with the rebates they would have to give the console makers to actually "succeed". This sounds like another Atom/mobile CPU/ARM competitor fiasco where they burn billions.

    • Meh, they just need to make sure it puts out a good hash rate on some random cryptocurrency and they will sell every single one they make.
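
      To put a number on "hash rate": a crude single-core probe using Python's standard hashlib (an illustration only; the one-second window and nonce encoding are arbitrary choices, and real mining hardware runs many orders of magnitude faster in parallel).

          import hashlib
          import time

          def hashes_per_second(seconds=1.0):
              """Count SHA-256 digests computed in a fixed time window."""
              count = 0
              deadline = time.perf_counter() + seconds
              while time.perf_counter() < deadline:
                  # Hash a changing input, the way a miner varies a nonce.
                  hashlib.sha256(count.to_bytes(8, "little")).digest()
                  count += 1
              return count / seconds

          print(f"~{hashes_per_second():,.0f} SHA-256 hashes/sec on one CPU core")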

    • Re: (Score:3, Insightful)

      by Anonymous Coward

      So they failed in the past, so what? Hopefully they are smart enough to learn from their mistakes and do it right the next time. When you have a duopoly like AMD/NVIDIA, more competition is a good thing for consumers.

      • Indeed, but it's only competition if you offer a competitive package, and the OP's point was that Intel doesn't have the capability to do this.

    • by Anonymous Coward

      I guess they stand a chance to capture the low-end market. Like it or not, 7th and 8th gen integrated graphics already killed the sub-US$60 GPU market.

      I would say they stand a chance to capture the sub, I don't know, maybe US$150 discrete GPU market? It may not seem like it, but there's money to be made in this market.

      • Certainly. Especially if their $150 discrete GPU outperforms their competitor's $200 GPU.

        (Not saying it will, but it would be interesting if it did.)

    • by Anonymous Coward

      Don't care. Still want to see them try.

      Competition is competition.

  • Typical Intel (Score:5, Insightful)

    by DontBeAMoran ( 4843879 ) on Wednesday April 11, 2018 @09:09AM (#56417647)

    They named a Graphics Processing Unit Arctic Sound.

    Dumbasses.

    • by vux984 ( 928602 )

      Yup. I had to read the headline 3 times, and then the summary, to finally be clear that it was strictly just a new graphics chipset, and that Intel wasn't trying to do something with audio as part of the project.

  • Obligatory: NVIDIA poking fun at Intel [dvhardware.net] a few years back.

    Maybe _this_ time will be different. We'll have to wait and see ...

  • by Kohath ( 38547 ) on Wednesday April 11, 2018 @09:35AM (#56417791)

    Intel has been half-assing GPU capabilities for decades now. Why would anyone believe they'll do a good job on this chip? Are they giving the new guy everything he wants?

    Intel has a history of programs that exist because "Intel needs to sell chips in [whatever market]," not because of what customers want. They tried it for phones. Customers continued not to want Intel's offerings. Intel eventually gave up trying to push on that rope.

    • Comment removed based on user account deletion
      • Comment removed based on user account deletion
    • by Anonymous Coward

      Yeah, it's not a technology question but a market planning and accounting question.

      Intel has excellent SIMD designs (see AVX and AVX2 in modern cores), excellent breadth for different types of core (see the latest i7 cores versus the simpler Atom and Knights Landing cores), and excellent silicon processes. They also seem to have reasonable interconnects and memory controllers, after AMD shamed them out of their fixation with plain old shared buses.

      The question is, can they form a business plan and get it t
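
      A rough illustration of that SIMD point, sketched with NumPy as a stand-in (its element-wise kernels dispatch to vectorized SSE/AVX loops on x86, though the measured gap here also reflects interpreter overhead): the same multiply-add done element by element versus as one vectorized call.

          import time
          import numpy as np

          N = 1_000_000
          a = np.random.rand(N).astype(np.float32)
          b = np.random.rand(N).astype(np.float32)

          # Scalar path: one multiply-add per interpreted loop iteration,
          # sampling 1% of the elements to keep the demo quick.
          t0 = time.perf_counter()
          for i in range(0, N, 100):
              _ = a[i] * b[i] + 1.0
          scalar_ns = (time.perf_counter() - t0) / (N // 100) * 1e9

          # Vectorized path: one call; compiled loops process many floats
          # per instruction on the CPU's SIMD units.
          t0 = time.perf_counter()
          _ = a * b + 1.0
          vector_ns = (time.perf_counter() - t0) / N * 1e9

          print(f"scalar:     {scalar_ns:.1f} ns/element")
          print(f"vectorized: {vector_ns:.1f} ns/element")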

    • Speaking as someone whose daily-use computers range from an Intel HD 4400 to a GeForce 1080 Ti:

      Intel's integrated graphics are perfectly fine for what they needed to be - something that can run a basic compositor and decode video. For those use cases, what they've got offers enough performance, and does so with the least power draw and the smallest die area. And, arguably, their drivers are the best from a standards-following standpoint. They don't do the "hand-write optimized shaders for specific high-profi

  • by Anonymous Coward

    Intel Reportedly Designing Arctic Sound Discrete GPU For Crypto Mining

    • by Anonymous Coward

      Intel Reportedly Designing Arctic Sound Discrete GPU For Crypto Mining

      Well, so now the name "Arctic Sound" makes sense: it would be the sound of the polar caps rupturing as Earth's temperature keeps rising from excessive crypto mining.
