Apple's iOS 18 AI Will Be On-Device Preserving Privacy, and Not Server-Side (appleinsider.com) 59
According to Bloomberg's Mark Gurman, Apple's initial set of AI-related features in iOS 18 "will work entirely on device," and won't connect to cloud services. AppleInsider reports: In practice, these AI features would be able to function without an internet connection or any form of cloud-based processing. AppleInsider has received information from individuals familiar with the matter that suggests the report's claims are accurate. Apple is working on an in-house large language model, or LLM, known internally as "Ajax." While more advanced features will ultimately require an internet connection, basic text analysis and response generation features should be available offline. [...] Apple will reveal its AI plans during WWDC, which starts on June 10.
Ha! (Score:3, Funny)
I'm sure existing phones have plenty of extra memory, cpu cycles, and battery life to handle a full blown LLM engine. The *normal* apps aren't even local to the phone most of the time and instead rely upon a back office.
Re: (Score:2)
Querying is completely fine. Training is a different story.
Re: (Score:2)
Re: (Score:1)
Modern iPhones have AI cores, a.k.a. silicon-based artificial neural networks.
So you can have plenty of "AI"s as data, and just load the one you want to execute. Simple things will be lightning fast.
Re:Ha! (Score:5, Informative)
by default, everything is backed up to iCloud along with the key to decrypt it.
This is absurd and patently false. They give you a paltry 5 GB of free iCloud storage, so right away we know they aren’t backing up “everything” by default. More importantly, iCloud Backup, the feature you’re talking about, is disabled by default. You’re prompted to enable it during initial setup, but you can easily choose not to, just like with Siri and the rest of their opt-in features.
And their features that share data with Apple are opt-in, not opt-out like you’re claiming. They have a specific sharing screen with a specific sharing icon whenever you’re prompted to opt-in to a feature that will share any data with Apple. Simply say no. You can even skip setting up an Apple ID if you want.
I’d challenge you to name a single feature that shares user data with Apple that is enabled by default on a brand new iPhone.
That said, I agree that their privacy focus is a convenient way for them to spin a weakness as a strength. Apple tried to get into social networks (e.g. anyone remember iTunes Ping?) and failed, and likewise in most of the other areas where their competitors are strong. Having failed to break into those fields, they finally realized they could spin their inability to collect data en masse as a strength, then leverage the trust that spin builds to ask people to share their data anyway. People buy it and share the data.
Re: (Score:1)
Re: (Score:2)
everything is backed up to iCloud along with the key to decrypt it.
Bullcrap.
They hand that over to any law enforcement agency that asks.
Citation needed.
Here's a citation of Apple doing the opposite: Apple - FBI encryption dispute [wikipedia.org].
Re: (Score:2)
That proves the exact opposite. That's how we know, without any doubt, Apple has complete access to the backups: they handed them over to the FBI without complaint.
However, they were stale, as the phone hadn't been backed up for a while, and the FBI wanted them to unlock the phone so that they could get the more recent data. That's where Apple refused to help the FBI.
In the end, the FBI didn't need Apple's help, they were able to root the phone due to one of the many zero days in iOS.
Re: (Score:3)
I'm sure existing phones have plenty of extra memory, cpu cycles, and battery life to handle a full blown LLM engine. The *normal* apps aren't even local to the phone most of the time and instead rely upon a back office.
It actually isn’t as crazy as you think. In much the same way that there’s dedicated silicon in most CPUs for video encoding or encryption, Apple’s SoCs—both the M- and A-series—have for years been including dedicated chips for AI, tailored to their models. Hitting a general purpose CPU with an LLM is slow and power hungry, as you suggest, but these dedicated chips can do it far more efficiently, especially if the model they’re using is a lightweight (read: less capable,
Re: (Score:3)
The snag here isn't the chip, the snag is the immense amount of data required to operate. Terabytes worth.
Re: (Score:2)
If LLM models required terabytes of data to run, I fail to see how they could respond to any request in under a second...
Maybe you're confusing that with the requirements of the training phase?
Re: (Score:2)
Google has vastly more data than that, and responds quickly: fast back-office servers, cached results, etc. On a local phone you won't have that. E.g., ask your AI phone a question about Rust programming and you get an answer; that answer was not stored on the phone, so it had to have gone out to the internet.
Re: (Score:1)
They do not have to process a terabyte to answer simple questions; the words of the question just trickle down through some paths of the network. :D
But you are right about his confusion.
Re: (Score:2)
Not terabytes. A big chat-type system might be 100 GB, but you can do very well with a lot less. You can do very well with a LOT less if you stop trying to encode the world's knowledge into the thing and let it search the web.
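As a back-of-envelope check on those sizes (a hedged sketch; the parameter counts below are illustrative round numbers, not figures for any specific shipping model):

```python
# Approximate storage for LLM weights alone, ignoring KV cache and
# activations. Illustrates why "100 GB" is a big model and why smaller,
# lower-precision models can plausibly fit on a phone.

def weight_footprint_gb(n_params: float, bits_per_param: int) -> float:
    """Bytes for the weights, expressed in GB (decimal)."""
    return n_params * bits_per_param / 8 / 1e9

for n_params, label in [(7e9, "7B"), (1e9, "1B"), (100e6, "100M")]:
    for bits in (32, 16, 4):
        gb = weight_footprint_gb(n_params, bits)
        print(f"{label} params @ {bits:2d}-bit: {gb:7.2f} GB")
```

A 7B-parameter model at 16-bit is about 14 GB; at 4-bit, about 3.5 GB, which is why quantization matters so much for on-device use.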
Re: (Score:2)
Right, let it search the web. But the topic's title is "Will Be On-Device Preserving Privacy, and Not Server-Side". The strong implication is that your queries never go out to search engines. What they really mean is that some language processing will be local, but queries will still go out to the internet, as the article actually states. Which means the phrase "Preserving Privacy" is wrong. There's a lot of hand-waving to mislead here; AI will be on-chip, but the AI will be highly limited, mostly improving the sad state
Re: (Score:2)
There are lots of useful things you can do completely locally that would be improved by a decent language model. All of the "hey siri, set an alarm for me" would be more reliable, and you could expand them into more complicated requests. A decent language model with a reasonable amount of background knowledge could be very useful. Like an assistant.
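As a toy illustration of fully-local request understanding (a regex stand-in for a small on-device language model; the intents and phrasing here are invented for the example, not any real assistant API):

```python
import re

# Toy on-device "assistant" intent parser. A small local language model
# would do this more robustly, but the point is the same: nothing here
# needs a network connection.

def parse_intent(utterance: str):
    text = utterance.lower().strip()
    m = re.search(r"set an? alarm for (\d{1,2})(?::(\d{2}))?\s*(am|pm)?", text)
    if m:
        hour = int(m.group(1)) % 12
        if m.group(3) == "pm":
            hour += 12
        minute = int(m.group(2) or 0)
        return {"intent": "set_alarm", "hour": hour, "minute": minute}
    m = re.search(r"remind me to (.+?)(?: at| tomorrow|$)", text)
    if m:
        return {"intent": "reminder", "task": m.group(1)}
    return {"intent": "unknown"}

print(parse_intent("Hey, set an alarm for 7:30 am"))
print(parse_intent("remind me to water the plants tomorrow"))
```

The sports-score case from the next comment is exactly where this breaks down: parsing the request is local, but answering it needs data the phone doesn't have.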
If you want something like "hey siri, what was the final score for the local sportsball team last night?" you're going to have to look it up online, as would a h
Re: (Score:2)
Re: (Score:1)
This is the data to train an ANN, it is not the data in the end in the network.
And modern iPhones have 1TB or more storage anyway.
Re: (Score:2)
The snag here isn't the chip, the snag is the immense amount of data required to operate. Terabytes worth.
Not so. Training requires huge amounts of data to produce a model, but the resulting models can be tailored from large to small, with diminishing returns the larger you get. Some perfectly capable, though not state-of-the-art, LLMs (e.g. DLite) only need a few hundred MBs to exhibit ChatGPT-like behavior that would be sufficient for narrowly-focused tasks. I could easily imagine a lightweight AI model being used to make pretty much any of Apple's existing AI tools (e.g. autocorrect, on-device object identifica
Re: (Score:1)
Google's phones apparently do, they have had AI acceleration since they started using their own CPUs. Mostly used for image processing and audio processing, as they do it all on-device for privacy reasons.
Re: (Score:2)
I left the on device inference area a little over a year ago.
Unless anything has changed, apple are streets ahead of everyone else in this regard.
Basically across the board: in terms of ops/s, FP16 not int8 (or 4, thanks Samsung), bugginess, and of course everyone's favourite bugbear, fragmentation.
Now technically you don't neeeeed int8, in practice it's a pain in the arse. Most models trained as FP32 will just work when you chop off half the bits. Training for int8 is a bit of a black art. It's getting bette
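The FP32-to-int8 "chop off the bits" idea above can be sketched in a few lines. This is naive symmetric post-training quantization in pure Python, for illustration only; real toolchains add per-channel scales, calibration data, and quantization-aware training to recover accuracy:

```python
# Naive symmetric post-training quantization: FP32 weights -> int8 and back.

def quantize_int8(weights):
    """Map floats into [-127, 127] using a single shared scale."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

weights = [0.52, -1.27, 0.003, 0.98, -0.44]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, restored))
print("int8 codes:", q)
print("max reconstruction error:", max_err)  # bounded by scale / 2
```

With rounding, the per-weight error is bounded by half the scale step, which is why most FP32-trained models "just work" after conversion but small, outlier-heavy layers can suffer.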
Re: (Score:2)
It will be interesting to see what they do with it. Siri is notorious for being a bit thick, and they don't seem to have deployed AI tech similar to what Google has (on-device speech recognition and noise removal, sound removal from recordings, generative fill and object identification for photos etc.)
Maybe they are about to take a big step forward.
Re: (Score:2)
No idea what they do themselves, I don't have an iPhone and am not likely to get one any time soon
Talking from the perspective of third party app developers deploying stuff.
Re: (Score:3)
AI acceleration is just preliminary work, there is actual data that needs to be searched if there's a query. There will be local processing, but then the queries will be remote. You can't even stick the dumb non-AI version of Google on your phone. Anything practical beyond image cleanup or voice/face recognition has to go out to bigger servers. Massive amounts of data go into the training sets; it gets distilled down somewhat, but not nearly small enough to fit on today's phones. Maybe it stores answers
Re: (Score:2)
Google does phone hold and a lot of basic assistant tasks like setting reminders on the phone.
Re: (Score:2)
How do you think existing phones manage to listen constantly for you to talk to Siri, and to scan what is essentially a 3D video feed to recognize the owner and know when to unlock? Apple highly optimizes these processing pathways in its custom silicon. Any AI processing will be no different. I think people assume that because it takes some very serious GPU hardware to run inference and process tokens in real time at scale, it must also require a lot of processing power on the phone. In reality, the p
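A hedged sketch of the staged always-on design alluded to above (a toy energy gate in pure Python; the threshold and frame layout are invented for illustration — real wake-word pipelines use a small neural net as the cheap first stage, not a plain energy check):

```python
# Staged always-on listening: a cheap first pass runs constantly and only
# wakes the expensive recognizer when a frame looks interesting.

def frame_energy(samples):
    """Mean squared amplitude of one audio frame."""
    return sum(s * s for s in samples) / len(samples)

def should_wake(frames, threshold=0.01):
    """Indices of frames loud enough to hand to the full recognizer."""
    return [i for i, f in enumerate(frames) if frame_energy(f) > threshold]

quiet = [0.001] * 160          # near-silence
loud = [0.5, -0.5] * 80        # strong signal
print(should_wake([quiet, loud, quiet]))  # only the middle frame triggers
```

The design point is power: the always-on stage is tiny and runs on dedicated low-power silicon, and the heavyweight model only spins up on a likely hit.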
Re: (Score:1)
There is a button to press to activate Siri and Face recognition for unlocking.
Unless you are on bluetooth, then Siri is listening.
In other words: neither the camera is constantly scanning nor Siri constantly listening.
Gonna need a bigger iPhone… (Score:2)
I suspect that the newest iPhone will be required, or there will be a hybrid approach where the pro-phones can do it locally, and the lower-powered and older phones would need to use cloud resources.
Re: (Score:1)
Or do you need a bigger boat? I caught the reference. :)
Re: (Score:1)
Turn it off (Score:1)
As long as I can turn it off, I don't care where it resides.
I have no interest in using "AI". I don't need it.
Re: Turn it off (Score:2)
Now if you let your dog in your garden, wouldn't it be great if an app could detect dog poo everywhere in the grass, and not detect brown lea
Re: Turn it off (Score:2)
Re: (Score:2)
Re: (Score:1)
Or you rely on natural intelligence and just configure your keyboard correctly?
Re: (Score:2)
Or you rely on natural intelligence....
That's a recipe for failure for most people.
Re: (Score:2)
I'm not denying it can be useful. I'm just stating that I don't care to use it.
It's like alcohol. I don't deny that a lot of people get utility from it. I'm just not one of them.
ha ha, nice AI you got there... (Score:1)
It would be a shame if anything was to happen to your favorite AI, wouldn't it?
Don't worry one little bit!
We do everything, so we keep a copy in the cloud at all times... for your convenience and protection, of course.
Yeah, covers all bases. You'll hear whichever message you want to. Good marketing.
But this is Apple right? Relaaaax.
Zero control (Score:2)
Apple? You're using a device that Apple has 100% control over. It doesn't matter if the usage is local when Apple still has the keys to come in and get whatever they want.
Re: Zero control (Score:2)
Ajax on device so won't require an Ajax request? (Score:2)
So voice recognition and text to speech on-phone. (Score:2)
Likely true (Score:2)
"Apple's iOS 18 AI Will Be On-Device Preserving Privacy, and Not Server-Side"
This description is correct except for the 'privacy' part.
I Hope I get the option to opt-out (Score:1)
I might sound old-fashioned, but my phone has way more "features" than I ever use; why would I want AI?
And if I do want it, I'll install its app and watch my battery life plunge!
Re: I Hope I get the option to opt-out (Score:2)
Re: (Score:1)
I don't want to have to use RAM to store it.
I don't want to use CPU cycles to manage it.
I don't want to consume battery to support it.
Re: (Score:2)
I mean, without reference to AI, your phone having features you don't use doesn't imply it's missing features you would use.
So far though, I don't keep going back to AI systems.
Some questions. (Score:2)
Apple is working on an in-house large language model, or LLM, known internally as "Ajax." While more advanced features will ultimately require an internet connection, basic text analysis and response generation features should be available offline.
A) Will the user know when the phone has to hit the mothership? Or will this be another debacle like the cell-assist-on-spotty-wifi thing, where you don't find out it's been using the cell connection instead of the local wifi until the bill arrives at the end of the month? If it defaulted to off, and allowed you to turn the connection on, great. But that's not Apple's MO.
B) Will it store all the local, on-device usage, to upload all of it the second it needs the mothership power?
I'm genui
AI Has No Ethical Purpose but Games (Score:2)