Apple's AI Plans Include 'Black Box' For Cloud Data (appleinsider.com) 14
How will Apple protect user data while their requests are being processed by AI in applications like Siri?
Long-time Slashdot reader AmiMoJo shared this report from Apple Insider: According to The Information's sources [four former Apple employees who worked on the project], Apple intends to process data from AI applications inside a virtual black box.
The concept, known internally as "Apple Chips in Data Centers," would involve using only Apple's own hardware to perform AI processing in the cloud. The idea is that by controlling both the hardware and the software on its servers, Apple can design more secure systems. While on-device AI processing is highly private, the initiative could make cloud processing for Apple customers similarly secure... By taking control over how data is processed in the cloud, Apple would find it easier to put safeguards in place that make a breach much harder to pull off.
Furthermore, the black box approach would also prevent Apple itself from being able to see the data. As a byproduct, it would also be difficult for Apple to hand over any personal data in response to government or law enforcement requests.
Processed data from the servers would be stored in Apple's "Secure Enclave" (where the iPhone stores biometric data, encryption keys and passwords), according to the article.
"Doing so means the data can't be seen by other elements of the system, nor Apple itself."
Interesting Strategy (Score:2)
Re: (Score:1)
Yeah, this is the thing. I like the privacy-first approach in theory, but - as we've seen with Siri - there's no evidence Apple's team has the skills necessary to produce a quality product while following it.
Re: (Score:2)
Yeah, this is the thing. I like the privacy-first approach in theory, but - as we've seen with Siri - there's no evidence Apple's team has the skills necessary to produce a quality product while following it.
Apple didn't write Siri; they bought it. Unfortunately, it simply wasn't designed correctly for Extensibility.
And they are every bit as frustrated by Siri as everyone else is!
That's why Apple is finally Redesigning/Rebuilding "Siri" from the ground-up. . .
Re: (Score:1)
You have been spouting this nonsense for years.
Apple had 10 years to make Siri extensible; only Apple's inability to write quality code stopped them.
Re: (Score:1)
Anonymization doesn't work.
This is a failure. It's bad for privacy, and it means that Apple can't do the processing on-device like Google does. Apple was late to the AI game and seems to be several years behind, as Google started doing on-device processing of this stuff (voice recognition, image recognition and editing etc.) back in the Pixel 6 days when it introduced its first custom CPU.
Even if you ignore the privacy issues, having to send data to the cloud and back means latency will be higher. Google's
Re: Interesting Strategy (Score:2)
Yet another reason... (Score:2)
...to avoid Apple
The cloud is not a good thing
Re: (Score:2)
-IF- you expect to use AI, where/how do you expect your data to be stored? How would Apple be any different than any other company that is considering things like Large Language Model training? I know "hate for Apple" is A Thing on Slashdot, but in this case it seems to me that you could provide at least a little justification for this particular hate.
Re: (Score:2)
where/how do you expect your data to be stored?
As other posters have said, on my device. There are even some small-scale LLMs that run on a single GPU in a bog-standard x86 white-box machine if you need something more complicated. There's no reason for it to be cloud-only, beyond that "black box" really meaning "a black box to everyone but Apple." As is standard with Apple devices in general.
OpenAI won't let them (Score:2)
A Supreme Court decision on training data is still years out. Even if Apple could pay OpenAI enough to run OpenAI's models on Apple's own hardware, at that point Apple would be just as liable as if it let OpenAI do it.
For the smaller, less capable models they can train on public-domain and licensed content, they can run those on their own cloud, but the encryption is mostly smoke and mirrors. Yes, in theory the server hardware could create a public/secret key combo and then export the public key so you can't MitM c
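For what it's worth, the public/secret key flow the parent describes looks roughly like the following CryptoKit sketch. The key names, the "session" label, and the wire format are made up for illustration; the hard part the parent is pointing at is proving the published key really came from the hardware, which is what attestation would have to cover.

    import CryptoKit
    import Foundation

    do {
        // --- Inside the server hardware (the "black box") ---
        let serverPrivateKey = Curve25519.KeyAgreement.PrivateKey()  // never exported
        let publishedServerPublicKey = serverPrivateKey.publicKey    // handed out to clients

        // --- Client side ---
        let request = Data("user query".utf8)
        let clientEphemeral = Curve25519.KeyAgreement.PrivateKey()
        let clientSecret = try clientEphemeral.sharedSecretFromKeyAgreement(with: publishedServerPublicKey)
        let clientKey = clientSecret.hkdfDerivedSymmetricKey(using: SHA256.self, salt: Data(),
                                                             sharedInfo: Data("session".utf8),
                                                             outputByteCount: 32)
        let sealedRequest = try ChaChaPoly.seal(request, using: clientKey)
        // Over the wire: sealedRequest.combined plus clientEphemeral.publicKey.rawRepresentation.
        // A relay in the middle sees only ciphertext.

        // --- Back inside the server hardware ---
        let serverSecret = try serverPrivateKey.sharedSecretFromKeyAgreement(with: clientEphemeral.publicKey)
        let serverKey = serverSecret.hkdfDerivedSymmetricKey(using: SHA256.self, salt: Data(),
                                                             sharedInfo: Data("session".utf8),
                                                             outputByteCount: 32)
        let recovered = try ChaChaPoly.open(sealedRequest, using: serverKey)
        assert(recovered == request)
    } catch {
        print("sketch failed:", error)
    }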
Re: (Score:2)
I'm not really convinced that you *can't* train the AI on smaller datasets. And I rather think you must. Smaller networks can be trained faster and are easier to validate (i.e., it's easier to ensure that the data used to train the network is the right data). They *are* less capable, so you need a network of networks, and you've got to figure out how to train *that*. This probably repeats for several layers. I think people usually claim the brain uses seven layers, but I'm not sure that's a close an
Private-layer inside public-layer (Score:2)
It's been said and needs to be said again: Anonymization doesn't work.
Bayesian statistics means, with sufficient time, everything can be tracked.
If data doesn't go from the NIC directly to the NPU, then somewhere in the computer, it is plain-text and vulnerable to copying.
Of course, the NPU will be doing private-layer inside public-layer encryption to guarantee data can be decrypted only by a 'known' end-point: That requires a lot of speed.
It's bad that phone applets are always connected (to the inter
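To make the "private-layer inside public-layer" point above concrete, here is a small sketch, assuming symmetric keys that are somehow already distributed (how they get distributed is the hard part and is not shown): the payload is sealed first with a key only the end-point holds, then wrapped again with a transport key, so peeling the outer layer still leaves ciphertext.

    import CryptoKit
    import Foundation

    do {
        let endpointKey = SymmetricKey(size: .bits256)   // held only by the 'known' end-point
        let transportKey = SymmetricKey(size: .bits256)  // shared with the relaying infrastructure
        let payload = Data("user request".utf8)

        // Inner (private) layer: only the end-point can ever open this.
        let inner = try ChaChaPoly.seal(payload, using: endpointKey)
        // Outer (public) layer wraps the inner ciphertext for transport.
        let outer = try ChaChaPoly.seal(inner.combined, using: transportKey)

        // An intermediary holding only transportKey can strip the outer layer...
        let stripped = try ChaChaPoly.open(outer, using: transportKey)
        // ...but what it recovers is still ciphertext; only the end-point can finish the job.
        let recovered = try ChaChaPoly.open(ChaChaPoly.SealedBox(combined: stripped),
                                            using: endpointKey)
        assert(recovered == payload)
    } catch {
        print("sketch failed:", error)
    }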
Sounds like a plot line for Silicon Valley (Score:1)
Remember The Box? The fact that it was an example of what not to do didn't change the fact that it could sell.