Intel Says AI is Overwhelming CPUs, GPUs, Even Clouds, So All Meteor Lakes Get a VPU (theregister.com)
Intel will add the "VPU" tech it acquired along with Movidius in 2016 to all models of its forthcoming Meteor Lake client CPUs. From a report: Chipzilla already offers VPUs in some 13th-gen Core silicon. Ahead of the Computex conference in Taiwan, the company briefed The Register on their inclusion in Meteor Lake. Curiously, Intel didn't elucidate the acronym, but has previously said it stands for Vision Processing Unit. Chipzilla is, however, clear about what it does and why it's needed -- and it's more than vision. Intel Veep and general manager of Client AI John Rayfield said dedicated AI silicon is needed because AI is now present in many PC workloads. Video conferences, he said, feature lots of AI enhancing video and making participants sound great -- and users now just expect that PCs do brilliantly when Zooming or WebExing or Teamising. Games use lots of AI. And GPT-like models, and tools like Stable Diffusion, are already popular on the PC and available as local executables.
CPUs and GPUs do the heavy lifting today, but Rayfield said they'll be overwhelmed by the demands of AI workloads. Shifting that work to the cloud is pricey, and also impractical because buyers want PCs to perform. Meteor Lake therefore gets VPUs and emerges as an SoC that uses Intel's Foveros packaging tech to combine the CPU, GPU, and VPU. The VPU gets to handle "sustained AI and AI offload." CPUs will still be asked to do simple inference jobs with low latency, usually when the cost of doing so is less than the overhead of working with a driver to shunt the workload elsewhere. GPUs will get to do jobs involving performance parallelism and throughput. Other AI-related work will be offloaded to VPUs.
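The CPU-vs-GPU-vs-VPU split Rayfield describes is essentially a cost model: keep small, latency-sensitive inference on the CPU when a driver round-trip would cost more than just doing the work, send big parallel jobs to the GPU, and park sustained AI on the VPU. A minimal sketch of that heuristic, with made-up overhead numbers (nothing here reflects Intel's actual scheduler):

```python
# Hypothetical dispatch heuristic -- NOT Intel's actual scheduler.
# Per the article: CPUs take small low-latency jobs (no driver
# overhead), GPUs take big parallel jobs, the VPU takes sustained AI.

def pick_engine(mops, sustained):
    """Pick an engine for an inference job.

    mops      -- rough job size in millions of multiply-accumulates
    sustained -- True for continuously running workloads (video-call AI)
    """
    if sustained:
        return "VPU"  # long-running AI goes to the low-power engine

    # Illustrative costs in microseconds: fixed driver overhead plus a
    # per-megaop rate. Real numbers are driver- and silicon-specific.
    costs = {
        "CPU": 0 + 10.0 * mops,    # no offload overhead, slow per op
        "VPU": 300 + 2.0 * mops,   # cheap offload, moderate throughput
        "GPU": 500 + 0.5 * mops,   # big setup cost, highest throughput
    }
    return min(costs, key=costs.get)
```

With these toy numbers a 1-megaop job stays on the CPU, a 1,000-megaop job goes to the GPU, and anything sustained lands on the VPU, mirroring the split described above.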
Re:Anyone believe that Intel is doing it to help u (Score:5, Interesting)
They probably developed the technology for enterprise customers, and putting a little bit of it in everything means that there might be more future demand.
Re: (Score:3)
I'd definitely rather just have more GPU cores and RAM instead of these VPU cores and RAM (because part of their function is having their own scratchpad memory.) But oh, Intel's GPU is only good for video encoding and screen savers, whoops! Guess they'll just have to throw this VPU thing in the system in an attempt both to make their purchasing it make sense, and also to try to distract from their crap GPU performance. Larrabee, where are youuuuu?
Re: (Score:2)
So I'd think this would cannibalize Arc sales for those who actually.... care about this.
Re: (Score:2)
They should have just called them IQ-cores and ridden the AI bandwagon all the way to the bank.
Re: (Score:3)
Probably just to make the usual idiots pay more for Intel CPUs than they are worth.
Re: (Score:3)
What's certain is that Intel has not suddenly become a pro-customer company.
Intel sells chips. In order to sell chips you need to make chips with the feature set expected by customers. You don't need to be pro-customer to create a product which meets the needs of customers. There's no underhanded conspiracy here, just another behemoth that is (as always) late to a party that other companies have already dominated.
Here's a hint:
NVIDIA and AMD GPUs dominate the workload.
AMD has already introduced Ryzen AI with its 7000 series.
ARM introduced AI co-processors with the Cortex-X4
Apple ke
Re: (Score:3)
Why would Intel be the only one to not introduce the VPU in their CPU?
Because Intel makes the majority of its CPU money in the server room and "business related" desktops. Intel also used to own the tiny workstation market outright, but my prediction is that Epyc eventually eats Xeon's dinner.
The only reason to put a VPU into their CPU is because a customer wants it. I don't see that. It's probably too early to be introducing it into Meteor Lake. Intel should be moving to an even more IPC-efficient architecture than hybrid cores. The only reason they lost the efficiency figh
Re: (Score:3)
Why would Intel be the only one to not introduce the VPU in their CPU?
Because Intel makes the majority of its CPU money in the server room and "business related" desktops.
Intel only gets those sales because they are perceived as being the industry choice, because they sell the bulk of the CPUs. If execs start seeing AMD stickers everywhere instead of Intel stickers, they are going to think that Intel is the also-ran and they won't want to be associated with it any more.
The only reason to put a VPU into their CPU is because a customer wants it.
Intel's plan is to tell the customer what they want. It's always worked for them before, and they are allergic to change. They have this product sitting around after an acquisition and they have to justify the
Re: (Score:2)
Think how hard it is to hire a competent programmer that is fluent with applying concurrency to code. You think AI programmers are just lying around waiting to be hired?
Exactly. If nerds don't smell the money on the AI Tree, they have only themselves to blame. This is great news for anyone with any experience even tangentially related to AI.
MBAs don't know what it is, but they know the boss wants it, so they're going to pay whatever it takes to beat the other guy.
Re: (Score:2)
Because Intel makes the majority of its CPU money in the server room and "business related" desktops.
Yes it does. You seem to not be understanding what business means these days. A staggering amount of business related desktops (and laptops) sit around pegging their CPUs processing video and audio. Teams already includes AI-based audio reverb reduction on by default. The requirement for this in business laptops is as prevalent as the need for a hardware video transcoder. This isn't a chicken-and-egg problem. We already have a shed full of chickens here. People don't seem to realise just how widespread the
helpful links in order (Score:3)
https://en.wikipedia.org/wiki/... [wikipedia.org]
https://combox.io/upload/combo... [combox.io]
https://www.mathworks.com/help... [mathworks.com]
Re:helpful links in order (Score:4, Interesting)
Re: (Score:2)
Quick matrix evaluators for running the networks on.
They're quite helpful in mobile applications (which is why every phone CPU has included them for a long time now), but I wouldn't have thought anyone really cared for PCs. But maybe more NNs are being run on PC hardware these days instead of at the "edge"
Re: (Score:2)
Re: (Score:2)
increasingly GeLU too (which is basically similar to ReLU but doesn't have ReLU's non-differentiability at zero), and something called Swish that I don't know much about except it's sigmoid-ish
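For reference, the activations mentioned above are all one-liners; a quick sketch (the GeLU here uses the common tanh approximation, and Swish is just x times a sigmoid):

```python
import math

def relu(x):
    # Rectified linear unit: kinked (non-differentiable) at zero.
    return max(0.0, x)

def gelu(x):
    # Gaussian error linear unit, common tanh approximation: smooth
    # everywhere, hence no kink at zero.
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi)
                                      * (x + 0.044715 * x ** 3)))

def swish(x, beta=1.0):
    # Swish: x * sigmoid(beta * x) -- the "sigmoid-ish" one.
    return x / (1.0 + math.exp(-beta * x))
```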
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
That's not an Intel problem then because my 10 year old PC plays youtube just fine.
So chipsets are locked down more like consoles? (Score:5, Interesting)
I'm not seeing the superior benefit of this. It sounds to me more like selling a more expensive chip with everything combined into one, so you have to replace the entire component if one part gets outdated, and likely replace the entire motherboard too.
So this is the netbook / console evolution, I'm guessing, where you constantly have to pay high prices to replace the entire device? I always found it really advantageous when I could upgrade whichever part of my system was the specific bottleneck.
In the past my choice for upgrade was my GPU as an example for some of the graphical improvements I wanted. Since it was a standard port, I could use the latest graphics card, even if my CPU won't let me get the most out of it at the time. Later when the pricing and product availability is right, I can replace the CPU, ram and motherboard, and keep the GPU. It makes it a lot more affordable to upgrade over time instead of each time I want an upgrade, it's all or nothing, drop 3000$ or spend nothing.
It sounds like this will lead to purchasing mid grade items and getting slight improvements at an expensive price tag. Maybe you buy the system for 1200$, then two years go down, and you can spend 1300$ for the current model which is 10% faster overall, versus spending 400$ on a new cpu and getting a large improvement depending on the bottleneck for your applications.
P.S - I like the idea of a VPU (Score:3)
I actually like specialized additional processors that add more capabilities than the CPU can reasonably handle, like back in the day you'd add a math-coprocessor and other items, I thought that was awesome. I think it'd be great if there was a component slot of motherboards that allowed additions of specialized processors like a VPU for specific tasks that people need them for, for those who can make use of it.
Re: (Score:3)
I bought a math coprocessor back in the 1980s for about $200 (I think) and came to find out it didn't do shit for shit.
There were like 3 programs that used it, none of which I owned, but the hyperbole about what it "could" do was too alluring not to blow a shitload of money on it.
It sped up Lotus123 which I didn't use, but I got mad bragging rights....for about an hour until we figured out how utterly useless it was in real life.
I wouldn't buy something like that today, not because I've gotten smarter, but be
Re: (Score:3)
FYI $200 was a ton of money back then. Gas was maybe $0.60/gal. You could feed a family of 6 at Pizza Hut for $16. Gum cost $0.10.
$200 seems like nothing now, but I remember I ran up a $100 phone bill as a kid when I first found BBS' and my father was about ready to murder me. Now my cell phone bill is $100 every month...
Re: (Score:2)
True
The last time gas was $0.60/gal. was in the mid to late 70s. Gas was over a dollar per gallon on average for most of the 80s. [creditdonkey.com]
Re: (Score:2)
Math coprocessors were unnecessary for most productivity applications. Back when you had to buy them separately they were only really relevant for graphics, science, and gaming. AutoCAD, FALCON, FORTRAN, MATLAB, MS Flight Sim, you get the idea [ctrl-alt-rees.com]. That's actually still true, in that our processors are so fast that most people wouldn't even notice if they had to emulate floats, but now lots of people do gaming and graphics.
Re: (Score:2)
You hit the nail on the head for me.
If I had applications that would benefit from the math-coprocessor, then it's awesome and I'd buy one. I don't want to pay more money for my CPU overall because it includes a VPU, which I do absolutely nothing with and don't need. Which is why I support the freedom for people who want one to buy items with specific chipsets or slots on their motherboard. The fact they are going hey, all our CPUS will support it..with the GPU...built in..since..uh..people don't want to pay
Re: (Score:2)
I'm not seeing the superior benefit of this. It sounds to me more like selling a most expensive chip
That's Intel's business model, yes.
Their "Why you need this new CPU!" press releases have always been comedy gold aimed at clueless middle-managers.
Case in point: "Chipzilla is, however, clear about what it does and why it's needed"
Fear, Uncertainty and Doubt if ever I heard it.
Re: (Score:2)
Re: (Score:2)
It allows for the creation of a new socket.
It does nothing of the sort. Intel don't need magic hardware as an excuse to change the socket, and we're not talking about a device with dedicated pins exposed to the motherboard. You were getting a new socket irrespective of what was included in the CPU.
Re: So chipsets are locked down more like consoles (Score:2)
Re: (Score:2)
I'm not seeing the superior benefit of this.
You're not seeing a benefit to hardware acceleration of common tasks? Everyone else has. For the record Intel is the last to the party here. AMD Ryzen AI, Apple Neural Engine, ARM's AI Coprocessor in the Cortex-X4, and that's before we discuss what GPU vendors have been doing all of which include hardware acceleration in their dedicated GPUs.
It sounds like this will lead to purchasing mid grade items and getting slight improvements at an expensive price tag.
And that is okay. You shouldn't need a high end CPU and dedicated GPU to run a Teams call simply because it uses AI and video transcoding. There's a world of tasks to
Re: (Score:2)
And that is okay. You shouldn't need a high end CPU and dedicated GPU to run a Teams call simply because it uses AI and video transcoding. There's a world of tasks to optimise in the low-mid range of PCs. I don't give a crap if my work PC is 10% faster, but sign me up for 10% better battery life any day.
What is this AI that Teams is using, and what does it do for me? Also, what is it transcoding? Transcoding implies taking one video format and converting it into another codec. It does nothing of the sort; it does basic video decoding, which all onboard GPUs and processors support with very little effort. What, do I need a VPU for advanced AI so they can get better telemetry from my app usage without straining my CPU with all the bloat?
I can take a basic, cheap laptop and run teams just fine wit
Re: (Score:1)
No, it's more like having the math coprocessor eliminated on the Pentium and putting it on-die, as opposed to having it be a separate socket on the 486 (if it was even available).
I.e., it's basic functionality which everyone will soon use, because everyone wants it -- but doesn't want to shell out $$$ to get in the door. It raises the bar, as it were, so that software which was not previously accessible is broadly accessible without discrete hardware for the task.
It will drive hardware sales in what has become a ve
Re: (Score:2)
Bonzi Buddy (Score:2)
Re: (Score:2)
So how soon can I get Bonzi Buddy back? Powered by local AI running on these VPUs. And maybe Microsoft can bring back Clippy.
Clippy is already here. It's baked into Windows 11 and getting an assist from Microsoft's version of AI.
ALL ABOARD THE HYPE TRAIN!!! (Score:1)
Re: (Score:2)
This hype train will quickly grind to a halt, because once AI has put everybody out of a job, nobody will have any money to buy what AI will be so much more efficient at producing.
Re: (Score:2)
I didn't think I'd ever live in a world where people did the hard, repetitive labor and AI wrote the books, music, and poetry.
But here we are, almost.
Re: (Score:2)
Re: (Score:2)
Still needs human in the loop.
Yes....but for how long?
I don't think it's unreasonable to project that eventually they'll be very, very capable at all those things.
And they'll have one AI checking the results of another, so yeah, I think it's not far-fetched to expect them to get to the point of "good enough", which is the point at which most people won't care, won't know, or where it won't make any difference.
Will it be 100% across every field? No, but again, at some point it won't matter- it'll be good enough.
Re: (Score:2)
Re: (Score:2)
I didn't know capability increases lead to lack of work.
If you're breaking rocks, and you go from hitting rocks with rocks to hitting rocks with hammers, you're probably not going to put anyone out of work. In this scenario you probably have other real dumb jobs for them to do.
If you're weaving fabric, and you go from hand looms to machine looms, what work do you have for those weavers to do? Their skills aren't relevant for anything else. Now you just need someone to load bobbins and hey, you can use child labor for that. (Everything old is new again [nytimes.com].)
The indu [localhistories.org]
Vapor processing unit? (Score:5, Funny)
Now there is something we've needed since the '80's.
Vector (Score:2)
Re: Vector (Score:2)
If it's raining you'll want a Vertically Propagated Umbrella.
Or an SoC with integrated 32 bit floating DAC: Volume Pumping Unit.
Re: (Score:2)
Re:Vector (Score:5, Informative)
VPU means Vision Processing Unit.
It's an admittedly stupid name... and one I haven't heard used since they were popular on old TI OMAP parts.
What it really is, is a chunk of dedicated MMA hardware that makes running NN inference engines really efficient and snappy.
These days, people like to call them NPUs (Apple), TPUs (Google), or Tensor Cores (Nvidia)
They're obscenely faster than a normal GPU shader core at this particular line of work, and much more efficient.
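To make "dedicated MMA hardware" concrete: the workhorse operation is a quantized multiply-accumulate over int8 values, which these units execute thousands of at a time. A toy sketch of one such dot product (the scale and the symmetric clamping scheme are illustrative, not any vendor's actual format):

```python
# Toy int8 multiply-accumulate, the core operation MMA/NPU blocks run
# in bulk. Scale and quantization scheme are made up for illustration.

def quantize(xs, scale):
    # Map floats into the int8 range [-128, 127] with a shared scale.
    return [max(-128, min(127, round(x / scale))) for x in xs]

def int8_dot(a, b):
    # Fused multiply-accumulate; accumulate in a wide integer so the
    # int8 products can't overflow.
    acc = 0
    for x, y in zip(a, b):
        acc += x * y
    return acc

SCALE = 0.05
weights = quantize([0.5, -1.0, 0.25], SCALE)   # [10, -20, 5]
activs  = quantize([1.0,  2.0, -4.0], SCALE)   # [20, 40, -80]
raw = int8_dot(weights, activs)                # -1000
approx = raw * SCALE * SCALE                   # back to float, about -2.5
```

The exact float answer is 0.5*1 - 1*2 + 0.25*(-4) = -2.5; the int8 path recovers it (to rounding) while only ever multiplying small integers, which is exactly what the dedicated hardware is built to do cheaply.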
Re: (Score:1)
Re: (Score:2)
For GPUs- ya, VPU (vector) would have been a much better name than GPGPU, which is as dumb a name as VPU (vision)
Companies always make a mess out of shit like that.
Re: (Score:2)
AI (Score:2)
Yes, just include the word AI somewhere and watch your stock value rise.
AI Crypto Blockchain Quantum LLM.
Please send your deposits in furtherance of my awesome and detailed 5-word business plan above to
SWIFT CHASUS33XXX ABA 12210024 ACCT 3733+
KTB
E
Re: (Score:2)
So wait... (Score:3)
I *SO* want AI on my conference calls (Score:2)
What could possibly go wrong?
I say: "Because lead times for key hardware components were unexpectedly short, we are able to deliver six weeks ahead of schedule." [1]
They hear: "Because [racial slur] for [biological function], we are [what AI thinks my mother did for a living]."
[1] That's a true statement for a project I'm currently working on.
Re: (Score:2)
Nothing can go wrong because literally no one is talking AI generated voice synthesis in this case. You very likely already have AI on your conference calls and don't even realise it. AI noise and reverb cancellation is on by default in Teams. The only thing this will do is offload the workload from the CPU giving you a bit more battery life on your work laptop.
Re: (Score:2)
Is that AI or just really good DSP?
VPU vs TPU (Score:1)
Microsoft + Arm (Score:2)