Security Flaw Lets Attackers Recover Private Keys From Qualcomm Chips (zdnet.com) 44
Devices using Qualcomm chipsets, especially smartphones and tablets, are vulnerable to a new security bug that can let attackers retrieve private data and encryption keys stored in a secure area of the chipset known as the Qualcomm Secure Execution Environment (QSEE). From a report: Qualcomm deployed patches for this bug (CVE-2018-11976) earlier this month; however, given the sad state of Android OS updates, this will most likely leave many smartphones and tablets vulnerable for years to come. The vulnerability affects how Qualcomm chips (used in hundreds of millions of Android devices) handle data processed inside the QSEE.
Could private keys be separated from devices (Score:1)
Re:Could private keys be separated from devices (Score:4, Funny)
That seems a bit risky on the same device that you install God knows what from God knows who in the app store.
Re: Could private keys be separated from devices (Score:2)
The keys need to be written, read, and processed. In most of these exploits, the only reason to attack the chip is to get the keys faster, or simply to demonstrate that it's possible; in real life, an attacker with that level of access most likely could have just waited for, or forced, the OS/kernel to write or overwrite the keys there in the first place.
Any sort of security enclave is simply a set of bits that designate which areas of the memory contain the key and the processor has some software to limit access but i
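The parent's point can be sketched in a few lines: an "enclave" reduces to a protected address range plus an access check in front of it. This is a toy illustration only; the names and the world-switch flag are hypothetical and bear no relation to the real TrustZone/QSEE interfaces.

```python
# Toy model: an "enclave" as a protected address range plus an access check.
# All names here are illustrative, not any real TrustZone/QSEE API.
SECURE_START, SECURE_END = 0x1000, 0x2000  # hypothetical protected range

class TinyEnclave:
    def __init__(self):
        self.memory = {}           # address -> byte
        self.secure_world = False  # toggled by hypothetical trusted firmware

    def read(self, addr):
        # The "security" is just this branch: normal-world reads into the
        # protected range are refused by the access-control logic.
        if SECURE_START <= addr < SECURE_END and not self.secure_world:
            raise PermissionError("normal world may not read secure memory")
        return self.memory.get(addr, 0)

enclave = TinyEnclave()
enclave.memory[0x1800] = 0x42      # pretend a key byte lives here
try:
    enclave.read(0x1800)           # normal world: refused
except PermissionError:
    pass
enclave.secure_world = True        # secure world: allowed
assert enclave.read(0x1800) == 0x42
```

Which is exactly why side channels like this one are so damaging: they leak the protected bytes without ever taking the branch.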
Re: Could private keys be separated from devices (Score:1)
Don't load it to memory. Also only when encrypting (Score:3)
When you use an external key device, such as a Yubikey, you don't read the key into RAM.
Even if you did, it's quite a bit more secure to only have your GPG key present while you're encrypting a secure email than have it laying around on disk all the time. As another example, I could use a USB key to log in to my corporate network, which again has the key physically attached to the computer for only a few seconds, not constantly while I'm surfing the web or whatever I do all day.
Normally, though, the USB key
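The "key present for only a few seconds" pattern above can be sketched like so. This is purely illustrative: Python can't guarantee memory is actually wiped, and real hardware tokens (a YubiKey, a smart card) go further by never releasing the private key at all, doing the crypto internally. The key source and message here are made-up stand-ins.

```python
# Sketch: load a secret, use it, drop the reference immediately,
# instead of keeping it resident for the whole session.
import hashlib
import hmac

def sign_once(key_source, message: bytes) -> bytes:
    key = key_source()  # key exists in RAM only inside this call
    try:
        return hmac.new(key, message, hashlib.sha256).digest()
    finally:
        del key         # drop the reference as soon as the work is done

tag = sign_once(lambda: b"demo-secret", b"hello")
assert len(tag) == 32   # SHA-256 HMAC tag
```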
Solved problem with PUFs... (Score:2)
Some chips have a physically unclonable function (PUF), usually one that takes advantage of impurities in the silicon die. With this, a device wouldn't need to keep a private key around, other than for the time it takes to use it. Or it could generate a symmetric encryption key for storing data, holding it in a register or the chip's memory only for the time it takes to do the encryption/decryption.
Keeping keys in a "secure" area is yesterday's tech. A PUF allows you to tell a de
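A rough sketch of the PUF idea: the device never stores a key; it re-derives one on demand from a device-unique response plus public "helper data". This is a heavily simplified fuzzy-extractor toy under stated assumptions — the PUF response is simulated with random bytes, and the error correction a real (noisy) PUF needs is omitted entirely.

```python
# Simplified fuzzy-extractor sketch (no error correction; noise-free
# "PUF" simulated with random bytes).
import hashlib
import secrets

def enroll(puf_response: bytes):
    # Helper data = response XOR random codeword; safe to store publicly.
    codeword = secrets.token_bytes(len(puf_response))
    helper = bytes(a ^ b for a, b in zip(puf_response, codeword))
    key = hashlib.sha256(codeword).digest()
    return helper, key

def rederive(puf_response: bytes, helper: bytes) -> bytes:
    # Recover the codeword, hence the same key, without ever storing it.
    codeword = bytes(a ^ b for a, b in zip(puf_response, helper))
    return hashlib.sha256(codeword).digest()

response = secrets.token_bytes(16)  # stand-in for the chip's PUF output
helper, key = enroll(response)
assert rederive(response, helper) == key
```

The key only exists while `rederive` runs; compromising storage gets an attacker the helper data, which reveals nothing about the key without the physical chip.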
Is it a flaw or is it intentional? (Score:5, Interesting)
Re: (Score:3)
Is it a flaw or is it intentional?
It's definitely a flaw. It's another cache based attack that people haven't considered until recently.
We live in a time where governments, intelligence services, and law enforcement desperately want to be able to break encryption on any device at any time without delay.
If they had cut a deal with Qualcomm then they wouldn't have just made the private keys static and recorded them. Testing a billion keys until you find the right one is trivial,
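The "a billion keys is trivial" point is easy to check with back-of-envelope arithmetic, assuming (hypothetically) a rate of 10^9 guesses per second: a billion-key search space falls in a second, while a full 128-bit keyspace does not fall at all.

```python
# Back-of-envelope: billion-key search vs. full 128-bit keyspace,
# at an assumed 10**9 guesses per second.
rate = 10**9                                    # guesses/sec (assumption)
seconds_for_billion = 10**9 / rate              # 1 second
years_for_aes128 = 2**128 / rate / (3600 * 24 * 365)

assert seconds_for_billion == 1.0
assert years_for_aes128 > 10**22                # on the order of 1e22 years
```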
Re: (Score:3)
Re: (Score:2)
It's definitely a flaw. It's another cache based attack that people haven't considered until recently.
If they had cut a deal with Qualcomm then they wouldn't have just made the private keys static and recorded them. Testing a billion keys until you find the right one is trivial,
There is nothing definite about it. The deal the NSA cut with RSA over Dual_EC_DRBG involved an engineered exploit which had nothing to do with static keys and the same could have happened here.
Re: (Score:2)
The NSA and GCHQ always got their plain text in real time.
Re: (Score:2)
I'd also say part of it is that a lot of businesses just don't care about security, especially when the VC guys tell the business owner that they better start showing some better numbers or else funding will be yanked. So, just to get stuff out the door, any real semblance of security gets gutted.
There is just no real financial interest in having secure products. If a product is completely compromised, there is no liability for the company, and customers will just buy version 1.1 of the product that fixes
Re: (Score:2)
You probably meant "Fortunately, Apple uses their own secure enclaves built into their own CPUs."
Project Zero (Score:3)
Looks like a success story here (Score:2)
Let's summarize:
- The chipmaker designed an 'airtight' area that kept its cover for years and years
- then a researcher finds a way in, informs the chipmaker. This is March 2019.
- The chipmaker reproduces and fixes the vulnerability in no time and gets the fix included in the April 5th security update (the very next one)
- I read about the issue here, check my phone (which is among the affected ones), check the security patch level (05.04.19) and find the fix was installed like 10 days ago.
- my phone is 5 yea
Re:Looks like a success story here (Score:5, Informative)
- then a researcher finds a way in, informs the chipmaker. This is March 2019.
The actual paper [www.nccgroup.trust] says March 2018. Here's the full timeline:
We disclosed the key extraction vulnerability to Qualcomm in March 2018, and from then until October
2018, they developed, reviewed, and propagated a fix. Qualcomm notified affected OEMs and carriers at
this point, triggering the start of a six-month recertification process. Finally, the issue was publicly disclosed
in April 2019. This issue was assigned identifier CVE-2018-11976.
Re: (Score:2)
My mistake, I misread the fine article. Twelve months in between makes the flow of events less implausible indeed.
Re: (Score:2)
Is this why Lineage wiped out some Qualcomm keystores?
I recall Signal users being particularly pissed.
Can I use this to get into my *own* phone? (Score:5, Interesting)
Every time one of these "OMG huge flaw in smartphones!" stories comes up, my first thought isn't "OMG I'm vulnerable!" It's "Hey, I wonder if I could use that to get root access and full control of my own phone, maybe install some custom firmware or at least scrape out all of this horrible invasive bloatware."
I'm assuming it's not really relevant for that here, since it sounds like you'd need root access already to make use of it, and I'm assuming this only affects "user" data rather than, for example, bootloader locking.
But, still, it feels weird to me to be constantly rooting for more security flaws, on the off chance that it'll widen the range of mobile devices that I'm willing to buy (i.e. rootable/customizable devices).