AI Technology

AI PCs Made Up 14% of Quarterly PC Shipments (reuters.com)

AI PCs accounted for 14% of all PCs shipped in the second quarter, with Apple leading the way, research firm Canalys said on Tuesday, as added AI capabilities help reinvigorate demand. From a report: PC providers and chipmakers have pinned high hopes on devices that can perform AI tasks directly on the system, bypassing the cloud, as the industry slowly emerges from its worst slump in years. These devices typically feature neural processing units dedicated to performing AI tasks.

Apple commands about 60% of the AI PC market, the research firm said in the report, pointing to its Mac portfolio incorporating M-series chips with a neural engine. Within Microsoft's Windows, AI PC shipments grew 127% sequentially in the quarter. The tech giant debuted its "Copilot+" AI PCs in May, with Qualcomm's Snapdragon PC chips based on Arm Holdings' architecture.

Comments:
  • What's AI PC now...? (Score:4, Interesting)

    by Parsiuk ( 2002994 ) on Tuesday August 13, 2024 @11:39AM (#64702324) Homepage

    Do you just pre-install an annoying "AI assistant" app and call it an "AI PC"? How far can this AI nonsense go?

    • Re: (Score:3, Insightful)

      by NettiWelho ( 1147351 )
      it's just marketing talk, the hoomans have not developed AI yet
    • by timeOday ( 582209 ) on Tuesday August 13, 2024 @12:02PM (#64702396)
      It's a PC with onboard hardware to do things like speech recognition and generation without going to the cloud. Which creates an interesting tension between PC makers, who want to sell the capabilities to end-users, vs. service providers, who want everything to be cloud-based.

      I don't rate the PC makers' odds very highly, outside perhaps gaming, which has been resistant to moving to the cloud due to latency. Generative capabilities are sure to become integral to gaming. Being limited to canned content is soon going to be very old-fashioned. NPCs will be much, much more varied and interesting. No more driving through LA with only 13 different vehicles repeated over and over.
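      To make "without going to the cloud" concrete: here's a minimal sketch of how an application might prefer on-device NPU inference today via ONNX Runtime. The model file name is hypothetical, and QNNExecutionProvider (Qualcomm's NPU backend) only exists on builds and hardware that support it:

      ```python
      # Minimal sketch: run an ONNX model on an NPU when one is exposed,
      # falling back to the CPU. The model path is hypothetical.
      import onnxruntime as ort

      available = ort.get_available_providers()
      print("Available execution providers:", available)

      # Prefer the Qualcomm NPU backend when present; otherwise use the CPU.
      preferred = [p for p in ("QNNExecutionProvider", "CPUExecutionProvider")
                   if p in available]

      session = ort.InferenceSession("speech_model.onnx", providers=preferred)
      print("Running on:", session.get_providers()[0])
      ```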

      • Speech generation? So my 1993 AST 386DX wintel box had AI? SICK I'M A PIONEER
        • You can consider it pioneering or AI or not. Did GPUs expand what a computer could do? Computationally, no. But eventually a GPU became a requirement to run games and certain other applications. 'AI' functionality is largely the same as the GPU, but some added functions like multiplying higher-dimensional tensors efficiently will eventually be required to run some applications.
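          As a toy illustration of "multiplying higher-dimensional tensors": the bread-and-butter operation is a batched contraction like the one below (shapes are arbitrary, and NumPy stands in for whatever accelerated backend the machine actually provides):

          ```python
          # Toy example: a batched matrix multiply, the core operation that
          # GPUs and NPUs accelerate. Shapes are arbitrary.
          import numpy as np

          batch, n, k, m = 32, 64, 128, 64
          a = np.random.rand(batch, n, k).astype(np.float32)
          b = np.random.rand(batch, k, m).astype(np.float32)

          # einsum spells out the higher-dimensional contraction explicitly.
          c = np.einsum("bnk,bkm->bnm", a, b)
          print(c.shape)  # (32, 64, 64)
          ```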
        • DECtalk or nothing, bitch.

      • No more driving through LA with only 13 different vehicles repeated over and over.

        Generative AI is dramatic overkill for that though. Just look at the "crypto kitties" for example. It doesn't take very many bits of "DNA" paired with a pseudo-random number generator to get lots of variety. I don't play those driving games, but I can't believe they're limiting the number of cars for technical reasons. Maybe it's because they get paid to promote particular makes and only the companies who make those 13 mod
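        To sketch the "DNA" idea (every attribute name and value below is invented for illustration): a few bits fed through a seeded PRNG already yield tens of thousands of reproducible variants, no generative model required:

        ```python
        # Sketch: derive a distinct-looking vehicle from a small integer "DNA"
        # via a seeded PRNG. All attributes are made up for illustration.
        import random

        BODIES = ["sedan", "coupe", "pickup", "van", "hatchback"]
        COLORS = ["red", "black", "white", "silver", "blue", "green"]

        def vehicle_from_dna(dna: int) -> dict:
            rng = random.Random(dna)  # deterministic per DNA value
            return {
                "body": rng.choice(BODIES),
                "color": rng.choice(COLORS),
                "wear": round(rng.uniform(0.0, 1.0), 2),      # dirt/scratches
                "ride_height": round(rng.uniform(0.9, 1.1), 3),
            }

        # 16 bits of "DNA" -> 65536 reproducible variants.
        print(vehicle_from_dna(0x2A5F))
        ```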

        • Procedurally-generated terrain or clouds are easy because they don't have to look like anything. Procedurally-generated cars with unique wear & tear are medium. But think about an open-world game where the interior of every house has a distinctive, "lived-in" feel down to unique family photos on the wall. And having sensible, unscripted conversations with those NPCs that advance the plot and reflect what's happened so far (or not) in the game. I remember buying Omikron - they tried, but the tech wa
          • I can see how this all sounds very cool but when I think about how it would actually work I'm not sure...

            Take the unique house interiors as an example. Wow! I can go through someone's house, all their stuff is unique!

            But does it actually have anything to do with the game? Would examining it and interacting with it just be a pointless dead-end waste of time? When playing the game, how do I know that I'm actually advancing the game instead of going off on wild goose chases with AI that lead nowhere?

            Ok, wild go

            • Project Zomboid does all that; it just does not need to stoop to marketing buzzword bingo by calling out how many advanced machine learning algorithms are involved in what is effectively a dice roll.
          • A game developer needs significant control over what's in the game at runtime, but generative AI generally doesn't provide that amount of control.

            So it won't be used at runtime. Maybe for text-to-voice, one of the few places where you do have significant control because of what those networks were trained on.

            It will be used during development. The game then ships with geometry and textures, not generative networks that make geometry and textures.

            Also, copyright. No matter where the copyright and AI issue ends
            • There could be games where the AI basically is the game. For example, along the lines of Tower Defense games or Diner Dash, you could be a restaurant manager and your job is to recruit a team of unique characters (given a tight budget) and give instructions to each employee, by telling them (or typing them out) in english. Then you watch it all play out during the lunch or dinner rush - wait times, food quality, customer satisfaction - and you can intervene to help steer your crew through emergencies lik
    • by Z00L00K ( 682162 )

      It all depends on how an AI PC is defined.

      At work we got a new laptop recently - a Dell 5450 - and Device Manager indicated that it had an "AI" component in the CPU.

      • Microsoft would like to define it with their own branding.

        They won't win this one. They have nothing to do with any of it. They are just desperately trying to be involved.
    • Do you just pre-install annoying "ai assistant" app, and can call it "AI PC"? How far this AI nonsense can go?

      It's just like the crypto thing. There are ten or twelve "enthusiasts" who jump on the bandwagon hard hoping to make a killing; they then stand around looking confused when the rest of the world just goes 'meh'.

    • Oh no, it's much more than that. AI PCs have an AI *button* on the keyboard!

      • by GoTeam ( 5042081 )

        Oh no, it's much more than that. AI PCs have an AI *button* on the keyboard!

        Nice, that's next level!

    • In general we've had AI systems for 3 years, since Apple switched Macs to Apple Silicon CPUs. Apple's CPUs include a "Neural Engine". Among other things, it can accelerate ML models.

      PCs now come with CPUs that have hardware acceleration for some AI tasks. That's it. Again, nothing new. Every Apple computer, phone, tablet and even watch ships with such hardware.
    • As far as I understood, it is all about an embedded accelerator called an NPU that accelerates some arithmetic operations. A proper GPU can do more than an NPU in many cases. So the NPU stands in that nowhere place where it does something, but it is not the best solution if you need the throughput. Just think of it like the Intel QuickSync accelerator for video encoding. It is on most if not all recent Intel CPUs, whether you use it or not. The NPU is a solution looking for a problem to solve. Whereas the MMX instruction set hel
      • Remember PhysX, the hardware acceleration standard that games don't use anymore because GPUs are all general-purpose compute now?

        Microsoft is trying to insert itself here, in a domain it can't force its way into, because GPU makers like their garden and they don't let third parties play in it. There will never be a Microsoft CUDA killer.
    • by jonadab ( 583620 )
      Do you remember "multimedia PCs"?

      It's like that.
  • by Improv ( 2467 ) <pgunn01@gmail.com> on Tuesday August 13, 2024 @11:43AM (#64702334) Homepage Journal

    Repeating a term some wide-eyed clueless tech reporter made up (or maybe repeated from a marketer) isn't a good idea. Particularly given that any midrange PC made now and many phones would probably meet whatever criteria someone came up with for the category.

    • The last time I looked - and it must be all of two weeks ago now - Microsoft had certified some ARM co-processors as acceptable for their AI processing, but not anything from Intel or AMD. Apparently they are not powerful enough for Microsoft.

      • Or Microsoft has intentionally left them out because they were never going to use Microsoft's AI branding anyway, and they already hit all the non-arbitrary goalposts - the ones related to actually doing the thing, not the ones related to the manner in which it's done.
    • by drnb ( 2434720 )
      Apple phones, tablets and watches would qualify. They all have Apple Silicon CPUs with Neural Engines of various sizes. They can all, even the watches, do interesting things on board that used to have to be sent to cloud servers for processing. It's faster, with greater privacy, doing it on device.
  • by liqu1d ( 4349325 ) on Tuesday August 13, 2024 @11:43AM (#64702338)
    Now the part that really matters is whether they were bought because of the AI or in spite of it.
    • Now the part that really matters is whether they were bought because of the AI or in spite of it.

      Yeah, it's like someone could wrongly conclude that I bought my car because of the heated steering wheel and seats features, even though I live in Florida and literally will never use them.

  • But is the AI really being used for anything, aside from front-ending the normal search engines?
  • by dmomo ( 256005 ) on Tuesday August 13, 2024 @11:48AM (#64702348)

    Because my PC came with Notepad installed on it.

  • I'm skeptical that a low-precision multiplication coprocessor is the key to Intelligence. The hype machine is in full swing.
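    For what "low precision multiplication" looks like in practice, here's a sketch of an int8 quantized dot product, the arithmetic these units are built around (the symmetric quantization scheme here is deliberately simplistic):

    ```python
    # Sketch: int8 quantized dot product -- the "low precision
    # multiplication" NPUs are built for. Scales are chosen naively.
    import numpy as np

    x = np.random.randn(256).astype(np.float32)
    w = np.random.randn(256).astype(np.float32)

    # Quantize float32 -> int8 with a simple symmetric scheme.
    sx, sw = np.abs(x).max() / 127.0, np.abs(w).max() / 127.0
    xq = np.clip(np.round(x / sx), -127, 127).astype(np.int8)
    wq = np.clip(np.round(w / sw), -127, 127).astype(np.int8)

    # Multiply-accumulate in int32, then rescale back to float.
    acc = np.dot(xq.astype(np.int32), wq.astype(np.int32))
    approx = acc * sx * sw

    print(float(np.dot(x, w)), float(approx))  # close, not identical
    ```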

    • by Improv ( 2467 )

      What's interesting is the higher-level organisation, not the low-level substrate.

      Although we probably agree that real AI isn't here yet.

  • by atomicalgebra ( 4566883 ) on Tuesday August 13, 2024 @12:10PM (#64702426)
    Unless there is an Nvidia chip in the computer, they aren't AI computers.
  • All I want to know is, will Excel go faster? I need Excel to go as fast as it can, 32 milliseconds is way too long for me to wait for a spreadsheet to recalculate. Will the AI chips make Excel faster?

    While we're at it, if the AI could help me with a pivot table I've been working on . . .

    • Forget calculating, it takes Excel long enough just to open and draw a blank window. Opening a spreadsheet? That might take 20 seconds easily - assuming it does indeed open, which is not guaranteed. Sometimes you have to kill the process and try again.

      Only after all that are you able to worry about the contents of the spreadsheet, or macros.

      We are running everything purely with Microsoft's cloud services. Office 365 for the client software, and all the Excel sheets are on MS' cloud services - Sharepoint/One

  • I've always been an early adopter of AI since the 90s, and every time it disappoints. It has no current use case; it's just another blast processing gimmick. For example: look at those articles about AI flooding the job market with crappy low-quality CVs. The part you should pay attention to is the "crappy low quality" part. Humans aren't even any good at writing CVs, yet the most advanced AIs in the world can't even write a good one. AI isn't just use
    • Hell, I'm between work right now. The fellow with Unemployment strongly suggested I rewrite my resume (a seven page long chronological CV), a task which I didn't want to do. I consider myself a rather decent amateur wordsmith, but not wanting to do this I asked ChatbotGPT to convert my chronological resume to a targeted one.

      The AI missed, it left my entire chronological work history in there . . . but it rewrote my existing summary and added a section describing the job I feel I want to do (describing my

      • Just FYI, if you apply for any government jobs you will want your entire work history from cradle to as close to the grave as possible on there and described in exhaustive detail.

        • Contract firms don't care how old I am, just can they plug me into their victims' infrastructure and not lose their contract due to incompetence. All government/civil service types are into that Joe Friday "Just the facts, Ma'am" thing (I'm male, that's really kinda awkward). I'm going to experiment with sending targeted resumes at direct-hire leads, see if I get a better response than my existing (very long) resume.
    • by q_e_t ( 5104099 )
      AI is very much in use and has numerous use cases. It might analyse your cancer scan, predict the structure of a new crystal, check your credit score, or monitor power generation equipment for predictive maintenance. That's just a few examples.
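      The predictive-maintenance case, for instance, can start as simply as flagging sensor readings that drift away from a healthy baseline; a minimal sketch with invented data:

      ```python
      # Minimal predictive-maintenance sketch: flag vibration readings more
      # than 3 standard deviations above the healthy baseline. The data and
      # the 3-sigma threshold are invented for illustration.
      import numpy as np

      rng = np.random.default_rng(42)
      healthy = rng.normal(1.0, 0.05, 500)   # normal operation
      failing = rng.normal(1.6, 0.05, 20)    # bearing starting to fail
      readings = np.concatenate([healthy, failing])

      mu, sigma = healthy.mean(), healthy.std()
      z = (readings - mu) / sigma            # standard score per sample
      alerts = np.flatnonzero(z > 3)
      print(f"{alerts.size} anomalous samples, first at index {alerts[0]}")
      ```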
  • I have picked up two recent-generation Dell workstation laptops this quarter, both of which are "AI PCs" based on their specs, and I bought neither for any sort of AI functionality, but rather plain old CAD workstation use.
  • Apple Intelligence =/= artificial intelligence? Therefore their product is in a league of its own? Harrrrr
  • What the hell is an AI Computer?
    • by leonbev ( 111395 )

      Probably a PC with Windows 11 and Copilot "AI" extensions pre-installed on it. If you're using that as the definition of an "AI PC", basically any PC built after 2018 would apply as they likely meet the hardware requirements for Windows 11. That doesn't make for a good press release, though.

      • by leonbev ( 111395 )

        Actually, after reading the "article", they're also counting Mac OS systems with AI extensions as well. Which is weird, because Apple isn't making their big "Apple Intelligence" push until their next OS release.

    • What the hell is an AI Computer?

      They are really just indicating CPU capabilities. Once upon a time we had plain vanilla CPUs.

      Then MMX instructions were added to support some image processing and graphics tasks. We called these machines "Multimedia PCs".

      Then GPUs were added to PCs to accelerate various graphics tasks, in particular 3D tasks. We called these machines "Gaming PCs".

      Now, to use Apple's terminology, "Neural Engines" are being added to CPUs. This is specialized hardware to accelerate some AI tasks. We are now ca

  • by byronivs ( 1626319 ) on Tuesday August 13, 2024 @12:50PM (#64702560) Journal

    Haven't been this excited since I got my 36-inch tube television with Betamax built right in! (It's a technically superior format, you know.) And AI is going to blow up! I really can't lose.

  • Also, does anybody else think there's a new niche for "regular" computers now? ... like a normal, regular, non-AI computer? Fit the price in below Celerons and above the weaker Chromebooks, say $200-250 USD?

    for "normal, regular" activities like email, office work, web browsing, and media playback?

    also... what are you doing on an *AI computer?*... if not email, web browsing, and media playback?
    • AI is like video cards and math coprocessors. It's a specialized capability that will make a popular task run faster.

      At some point, you're probably going to have an AI agent on your system for local search, speech recognition and synthesis, and improving game graphics and gameplay.

      Oh, and interactive porn, of course.
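      Local search, for instance, largely reduces to comparing embedding vectors on-device. A bare-bones sketch, with random vectors standing in for the output of a real on-device embedding model:

      ```python
      # Bare-bones local semantic search: cosine similarity over embeddings.
      # Random vectors stand in for a real on-device embedding model.
      import numpy as np

      rng = np.random.default_rng(0)
      docs = ["tax return 2023.pdf", "vacation photos", "meeting notes"]
      doc_vecs = rng.normal(size=(len(docs), 384))               # fake embeddings
      query_vec = doc_vecs[2] + rng.normal(scale=0.1, size=384)  # "notes"-like query

      def cosine(a, b):
          return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

      scores = [cosine(query_vec, v) for v in doc_vecs]
      print(docs[int(np.argmax(scores))])  # -> "meeting notes"
      ```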

      • Well, nice sell, but ... no thanks.
        I like to bang sticks and stones.
        i.e. I only use, urm... the nano text editor most of the time, for instance.
        Hey, it totally works, for editing text on a server.
        Don't want more. Fit for purpose, basically.

        Also, since you asked, I'll start a cult that will be left behind by computer-enhanced children... like Day of the Triffids with AI...
        I'm all in on privacy rights now, so encrypted and open source is the way to go. You can now see the cult-like reference. But here's the core
        • So you design your own ICs and build your own OS from machine code?

          • I was thinking of dusting off my assembler and changing some encryption algorithms, but I admit I'm spread a little thin.
            But the premise stands. Deep analysis has been replaced by reaching for the Microsoft or Apple "solution" and calling it a day.

            I'm by no means the first to say this... Moxie Marlinspike said approximately the same thing (article here about Black Hat conference just a few days ago). It's not hard to be cynical and say he was just playing to the house. But I do believe the tools are repl
  • AMD (or was it still ATI) sold graphics cards with "HD" in the name, and then they even named their CPUs after Windows XP.
    • HD, at least, is a real numerical specification. In the early 2000s, it was not guaranteed that your video card could push 1920x1080 pixels.

  • Been here before, done this before, Intel MMX.
    • by leonbev ( 111395 )

      It reminds me of the "Internet Ready" PCs sold in the late 1990s, which came pre-installed with a 56K modem.

      At least those came with some useful hardware, and not just a button that opens Microsoft Copilot in a browser window - something that will basically work out of the box on ANY modern PC made in the last 10 years.

  • Okay, I'll bite: What exactly is an "AI PC"?

    Is there some actual hardware difference or does it mean they just slapped an "AI Inside!" sticker on it?

    That is, is there hardware included that's specifically geared for AI?

  • Didn't they just announce their Apple Intelligence two months ago?

    https://www.apple.com/newsroom... [apple.com]

    • They say they do, but as the M4 is "the AI chip" and precisely zero Apple laptops or desktops ship with it, they don't. They do have the top-of-the-line tablet with an M4, though.

      Interestingly enough, there has been a rash of sales on M2 and M3 hardware now that those chips are obsolete as far as Marketing is concerned. Apple has committed to M4 everywhere ASAP. The real question is whether Tightwad Tim will finally make 16 GB the base RAM configuration.

      It's not just Apple either; AMD seems to be pushing the 8845HS as

  • Apple is shipping pretty much all of its products with ANE chips with anywhere from 2 to 32 cores. The watch has 2 neural cores, for example.
    Most of this simple genAI stuff with a few-billion-parameter language models can run on a modest machine. Windows PCs with AI chips as a selling point is pretty silly, considering their main competition includes dedicated cores on all their systems.

    • What I would like to know is how many people are specifically buying PCs with AI vs. buying a PC that happens to have AI.

  • the 80486 launched in 1989, in FPU (DX) and non-FPU (SX) variants

    Originally, even AMD, Cyrix and NextGen dismissed the FPU as "not useful for most computing users"

    Cue the headlines from late 1990 saying "FPU PCs made 14% of quarterly shipments"

    A few years later ... 100% of PCs every quarter were FPU PCs.

    Substitute FPU for AI and see what the future holds in store.

  • Sad thing is, an "AI PC" generally has to have a gimmick like an "NPU" that is actually only good for about 40-50 AI TOPS (tera operations per second). On the other hand, a desktop PC with an Nvidia RTX 4080 GPU would be good for about 800 TOPS, yet still wouldn't qualify as an "AI PC" if the CPU didn't *also* have a gimmick NPU. Even in laptops with discrete GPUs, chances are that the discrete GPU can do more AI TOPS than the NPU. The NPU is only really useful in laptops that don't have a discrete GPU or
  • As opposed to the buyers just buying a PC?
  • I'd also like to see somebody create a physics chip.

    I remember having an 8088 CPU and wishing I had an 8087 math coprocessor for the empty socket. I could never justify the expense of putting a math coprocessor in my XT; even while I was working with decimal fractions, all the math was happening with binary ints, and Visicalc was still faster and more accurate than me at math. Well, now the math coprocessor is built in. I could never justify upgrading my CGA card, now an accelerated GPU is pretty well exp

    • > I'd also like to see somebody create a physics chip.

      There is no market.

      Compute (whether running on CPUs or GPUs) such as CUDA or OpenCL already makes dedicated hardware redundant and obsolete.

      > Visicalc was still faster and more accurate than me at math.

      Ironically, Applesoft used five-byte floats, which have more precision than C's IEEE 754 four-byte floats, due to Microsoft's configuration of BASIC. (One could assemble BASIC with four-byte floats.) I'm not sure what math routines Visicalc used -- I'll
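      The precision gap is easy to demonstrate: IEEE 754 single precision carries a 24-bit significand, so integers above 2^24 start collapsing together, while a five-byte format with a roughly 32-bit significand holds out much further. A quick check (Python/NumPy used here purely for illustration):

      ```python
      # IEEE 754 single precision has a 24-bit significand, so 2**24 + 1
      # is the first integer it cannot represent exactly.
      import numpy as np

      print(np.float32(2**24) == np.float32(2**24 + 1))  # True: collapsed
      print(np.float64(2**24) == np.float64(2**24 + 1))  # False: still exact
      print("float32 is good to about",
            np.finfo(np.float32).precision, "decimal digits")
      ```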

"I got a question for ya. Ya got a minute?" -- two programmers passing in the hall

Working...