Intel Unveils Arc Pro GPUs (tomshardware.com)
Intel's Arc graphics cards aren't just for gamers, it seems, as the company long known for its CPUs has taken the lid off a new line of professional GPUs to complement the existing Arc line -- well, existing in China, maybe. From a report: The new cards are called Arc Pro, and target those who use their graphics cards for more than shooting bad guys. Maybe they won't be among the best graphics cards for gaming, but the AV1 encoding at least might get some takers. Intel today unveiled one mobile professional GPU, the A30M, and two desktop models: the single-slot A40 and double-slot A50. Both desktop cards are described as being for small form-factor machines, which makes us suspect Intel may have some much larger cards up its sleeve.
All the newly announced GPUs feature built-in ray tracing hardware, machine learning capabilities and industry-first AV1 hardware encoding acceleration. AV1, the royalty-free, open-source alternative to HEVC developed by the Alliance for Open Media (of which Google is a founding member), hasn't gained much traction on the web so far despite promises from Netflix and YouTube; its main use has been in Google's Duo video calling, despite beating HEVC on compression quality. It has always been very slow to encode, however, so a good hardware accelerator and Intel's backing could see it take off.
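The encoding-speed gap described above is easy to see with ffmpeg. A rough sketch, assuming an ffmpeg build with both the libaom-av1 software encoder and Intel Quick Sync (av1_qsv) hardware support; the file names and quality settings are illustrative only:

```shell
# Software AV1 encode with libaom -- high quality but notoriously slow;
# -cpu-used trades speed for compression efficiency (0 = slowest/best).
ffmpeg -i input.mp4 -c:v libaom-av1 -crf 30 -cpu-used 4 output_sw.mkv

# Hardware AV1 encode via Intel Quick Sync Video on an Arc GPU --
# typically far faster, at some cost in quality per bit.
ffmpeg -i input.mp4 -c:v av1_qsv -global_quality 30 output_hw.mkv
```

Until Arc, no consumer GPU shipped with an AV1 hardware encoder (Nvidia and AMD at this point offered AV1 decode only), which is why the summary calls the feature industry-first.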
Yet the word on the grapevine... (Score:3, Interesting)
With crypto crashing (Score:4, Insightful)
That's the real barrier to entry. My RX 580 gets new driver updates every month, and the latest drivers are about 15% faster across the board than when I bought the card. Nvidia cards do about the same. That's what Intel would be competing with.
Re: (Score:3)
It's going to be a tough sell. I'm pretty sure they were after the huge price spikes, coupled with selling to a specialized market that isn't going to care about constant updates for new games.
I think there are a few places where Intel *could* compete if they went for a niche. "Most powerful card available without a power connector" would be helpful. AMD has a FirePro GPU with four display outputs in a half-height form factor, but there are very few of them. "Inexpensive card that fits in a PCIe x1 / x4 / x8" would be nice; a very small number exist.
Similarly, I'd love to have a card that could fit in an x16 slot and be told, ideally by DIP switches, whether to limit itself to using 8/4/1 PCIe la
Re: (Score:3)
Re: (Score:2)
Gamers obviously prefer getting improved performance and don't have unlimited patience; but they are comparatively cost-sensitive, will certainly overlook merely static performance levels if the price is right, and will potentially even forgive some glitches.
People running CAD packages that cost more than most gamers' computers, though, are a lot less forgiving of GPUs that don't cover e
Re: (Score:2)
As I understand it, it takes them years of design and prep before release, so presumably they started work on this before COVID was a thing, and the price spikes and other COVID-associated supply chain issues were not planned for.
So that means it was already worked on before Gelsinger became Intel CEO.
If they had started work a year earlier, it would have been a nice bonus to them to enter the market when there was huge demand and not enough supply.
Re: (Score:2)
Re: (Score:2)
Yeah, as I recall, the Xbox series lost billions in its first gen, before becoming profitable years later.
Too bad... (Score:3, Insightful)
...the drivers are completely unusable.
Re: (Score:2)
And they discontinued support for slightly older silicon with those.
Translation: if you buy Intel you're not going to get much longevity out of it, because they'll discontinue drivers rapidly. One hell of a bet during a chip shortage with no end in sight.
Re: (Score:3)
Re: (Score:2)
Re: (Score:3)
...the drivers are completely unusable.
Also, this makes me wonder if we are eventually going to get some kind of standard programming interface for GPU instead of proprietary ones like cuda for nvidia etc.
Even OpenGL for graphic rendering doesn't seem as efficient as it could be...
Re: (Score:2)
OpenGL isn't, that's why there's Vulkan now.
Re: (Score:2)
If you mean for general-purpose programming like CUDA, we have had OpenCL and SYCL for years. The latter is a core part of Intel's oneAPI work. It may not satisfy everything everyone wants from a proprietary solution like CUDA, but you can't really argue that an open standard doesn't exist.
Awesome.. (Score:3)
I'd love a GPU from a company that relegated GPUs that shipped as recently as last year to 'legacy support', and whose drivers are terrible at the best of times. From a company that frequently kicks off some 'this time it'll work' non-CPU product only to abort it shortly thereafter, often not even making it to market before giving up.
I wish for a more competitive GPU market, but Intel seems incapable of market-leading products except CPUs... sometimes...
Re: (Score:2)
Re: (Score:2)
This may be true, but it's not the first time Intel has hit 'reset', so it's hard to have confidence that *this* iteration is when Intel finally figured it out, rather than just another iteration for them to declare hopeless a couple of years from now in favor of the next reset.
Re: (Score:3)
Word is that the drivers are hot garbage. Unstable, glitchy mess. Many missing features, or features that just don't work.
Intel is only targeting current gen games, so a lot of older ones run really badly. Performance might be okay given the aggressive pricing, but it's only in some games and even the supported ones aren't well optimized for Intel GPUs.
Professional tier is kind of useless (Score:3)
These things have little application support for programs that professionals might care about, and their 3D performance is pretty terrible (even in the few places they perform decently, they have very poor consistency). You might as well just get the absolute cheapest consumer version they'll make (the A310) and use it as an add-in card on top of a real GPU to get access to its accelerated video encode/decode (like AV1).
The Achilles heel... (Score:3)
Intel i740 (Score:4, Interesting)