Android | Security | Technology

Security Flaw Lets Attackers Recover Private Keys From Qualcomm Chips (zdnet.com)

Devices using Qualcomm chipsets, and especially smartphones and tablets, are vulnerable to a new security bug that can let attackers retrieve private data and encryption keys that are stored in a secure area of the chipset known as the Qualcomm Secure Execution Environment (QSEE). From a report: Qualcomm deployed patches for this bug (CVE-2018-11976) earlier this month; however, knowing the sad state of Android OS updates, this will most likely leave many smartphones and tablets vulnerable for years to come. The vulnerability impacts how Qualcomm chips (used in hundreds of millions of Android devices) handle data processed inside the QSEE.
Comments Filter:
  • It seems all too often that the "private," "secure" areas of such devices are compromised. Wouldn't it be better if private keys were kept only on secure removable media (say, a USB drive / YubiKey / Java ring / etc.), and when a computer or phone needed the private key it asked for it, with the device holding the key connected to it physically?
    • I'd also say your proposal only produces more attack vectors, and it doesn't solve the problem if the area being read is still just as [un]protected as before.

      Store my key in #aaa, which gets read maliciously
      vs.
      Store my key on my belt, insert it into the PC, which copies it to #aaa (memory), which gets read maliciously.

      Difference?
      • When you use an external key device, such as a Yubikey, you don't read the key into RAM.

        Even if you did, it's quite a bit more secure to only have your GPG key present while you're encrypting a secure email than to have it lying around on disk all the time. As another example, I could use a USB key to log in to my corporate network, which again has the key physically attached to the computer for only a few seconds, not constantly while I'm surfing the web or whatever I do all day.

        Normally, though, the USB key
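
        A minimal sketch of the "plug in, use briefly, unplug" pattern discussed above, in Python with the third-party cryptography package; the path and function name are made up for illustration, and the only point is that the key is read from the removable device and used transiently rather than ever being stored on the phone/PC:

          from cryptography.hazmat.primitives import hashes, serialization
          from cryptography.hazmat.primitives.asymmetric import ec

          def sign_with_removable_key(key_path, message, passphrase=None):
              # Read the PEM-encoded private key off the removable device
              # only at the moment it is needed.
              with open(key_path, "rb") as f:
                  pem = f.read()
              private_key = serialization.load_pem_private_key(pem, password=passphrase)
              # Use the key once and let it go out of scope; nothing is
              # written to local storage.
              return private_key.sign(message, ec.ECDSA(hashes.SHA256()))

          # Hypothetical usage, with the key on a USB stick mounted at /media/usbkey:
          signature = sign_with_removable_key("/media/usbkey/signing_key.pem", b"message to sign")

        As the comment above notes, a real YubiKey goes further and performs the signature on the token itself, so the private key never reaches host RAM at all; this sketch only covers the weaker removable-file variant.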

    • Some chips have a physically unclonable function (PUF), usually implemented by taking advantage of impurities in the silicon die. With this, a device wouldn't need to keep a private key around except for the time it takes to use it for a function. Or it could generate a symmetric encryption key for storing stuff, held in a register or other on-chip memory only for the time it takes to do the encryption/decryption.

      Keeping keys in a "secure" area is yesterday's tech. A PUF allows you to tell a de
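
      A rough sketch of that PUF-derived-key idea, in Python with the cryptography package; read_puf_response() is a made-up stand-in for querying the hardware, and the point is that the storage key is re-derived on demand and never persisted anywhere:

        import os
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.kdf.hkdf import HKDF
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        def read_puf_response():
            # Stand-in for the hardware PUF: a real chip would return a
            # device-unique, error-corrected bit string derived from silicon variation.
            return b"\x42" * 32

        def derive_storage_key():
            # HKDF turns the raw PUF response into a uniform 256-bit AES key;
            # a fresh HKDF instance is required per derivation.
            return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                        info=b"puf-storage-key").derive(read_puf_response())

        def seal(plaintext):
            key = derive_storage_key()   # exists only for the duration of this call
            nonce = os.urandom(12)
            return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

        def unseal(blob):
            key = derive_storage_key()   # re-derived on demand, never stored
            return AESGCM(key).decrypt(blob[:12], blob[12:], None)

      Because the same derivation can be repeated on every call, nothing key-shaped ever has to sit in flash or in a "secure" keystore between uses.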

  • by Rick Schumann ( 4662797 ) on Wednesday April 24, 2019 @05:51PM (#58486316) Journal
    We live in a time where governments, intelligence services, and law enforcement desperately want to be able to break encryption on any device at any time without delay. Furthermore, government agencies have strong-armed companies in the past, like Cisco, to intentionally install 'backdoors' into their technology so those services can access them freely. Who's to say that this 'security vulnerability' that's been discovered wasn't intentionally put there for similar reasons? There's so much more 'plausible deniability' if you disguise your 'backdoor' as an unintentional flaw.
    • Is it a flaw or is it intentional?

      It's definitely a flaw. It's another cache based attack that people haven't considered until recently.

      We live in a time where governments, intelligence services, and law enforcement desperately want to be able to break encryption on any device at any time without delay.

      If they had cut a deal with Qualcomm then they would have just made the private keys static and recorded them. Testing a billion keys until you find the right one is trivial,

      • It's a flaw, one of endless more still to be discovered when you have a "security" system that consists of a marketroid taking a block diagram of an ARM SoC, drawing a line around one part in magic marker, and labelling it "secure", which is what TrustZone and TEE are. No need to look for conspiracies here, it's secure by marketing fiat, not by actual practice.
      • by Agripa ( 139780 )

        It's definitely a flaw. It's another cache based attack that people haven't considered until recently.

        ...

        If they had cut a deal with Qualcomm then they would have just made the private keys static and recorded them. Testing a billion keys until you find the right one is trivial,

        There is nothing definite about it. The deal the NSA cut with RSA over Dual_EC_DRBG involved an engineered exploit that had nothing to do with static keys, and the same could have happened here.

    • by AHuxley ( 892839 )
      Read up on how the NSA and GCHQ won against every in-use crypto product sold from the 1950s through the 1990s.
      The NSA and GCHQ always got their plain text in real time.
    • I'd also say part of it is that a lot of businesses just don't care about security, especially when the VC guys tell the business owner that they better start showing some better numbers or else funding will be yanked. So, just to get stuff out the door, any real semblance of security gets gutted.

      There is just no real financial interest in having secure products. If a product is completely compromised, there is no liability for the company, and customers will just buy version 1.1 of the product that fixes

  • by SJ2000 ( 1128057 ) on Wednesday April 24, 2019 @06:21PM (#58486510) Homepage
    I found Gal Beniamini's work with Project Zero [blogspot.com] quite informative on the issues with these TEEs, especially how they handle memory, which appears to be directly related to this exploit of the QSEE ECDSA implementation.
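
    For context on why leaking anything about the QSEE ECDSA computation is fatal: the NCC Group paper cited further down the thread recovers bits of the per-signature nonce via a cache side channel and then uses lattice techniques to get the key. The toy Python below shows only the end algebra, with the nonce assumed fully known and r chosen arbitrarily, since just the modular arithmetic is being demonstrated (pow(x, -1, n) needs Python 3.8+):

      # Group order of secp256r1 (NIST P-256).
      n = 0xFFFFFFFF00000000FFFFFFFFFFFFFFFFBCE6FAADA7179E84F3B9CAC2FC632551

      def ecdsa_s(d, k, h, r):
          # Second half of an ECDSA signature: s = k^-1 * (h + r*d) mod n.
          # (In real ECDSA, r is the x-coordinate of k*G; an arbitrary nonzero
          # r is fine here because only the algebra is being shown.)
          return (pow(k, -1, n) * (h + r * d)) % n

      def recover_d(r, s, h, k):
          # If the nonce k leaks, the private key follows directly:
          # d = r^-1 * (s*k - h) mod n.
          return (pow(r, -1, n) * (s * k - h)) % n

      d = 0x1DEA7BEEF % n   # toy private key
      k = 0xC0FFEE % n      # toy per-signature nonce (the secret that leaks)
      h = 0xABCDEF % n      # toy message hash
      r = 0x123456 % n      # toy first signature half
      s = ecdsa_s(d, k, h, r)
      assert recover_d(r, s, h, k) == d

    The real attack reportedly needs only a few leaked nonce bits per signature across many signatures; the sketch collapses that to the fully-known-nonce case to keep the arithmetic visible.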
  • Let's summarize:

    - The chipmaker designed an 'airtight' area that kept its cover for years and years
    - then a researcher finds a way in, informs the chipmaker. This is March 2019.
    - The chipmaker reproduces and fixes the vulnerability in no time(?) and gets the fix included in the April 5th security update (the very next one)
    - I read about the issue here, check my phone (which is among the affected ones), check the security patch level (05.04.19) and find the fix was installed like 10 days ago.
    - my phone is 5 yea

    • by SlaveToTheGrind ( 546262 ) on Wednesday April 24, 2019 @06:43PM (#58486624)

      - then a researcher finds a way in, informs the chipmaker. This is March 2019.

      The actual paper [www.nccgroup.trust] says March 2018. Here's the full timeline:

      We disclosed the key extraction vulnerability to Qualcomm in March 2018, and from then until October
      2018, they developed, reviewed, and propagated a fix. Qualcomm notified affected OEMs and carriers at
      this point, triggering the start of a six-month recertification process. Finally, the issue was publicly disclosed
      in April 2019. This issue was assigned identifier CVE-2018-11976.

      • My mistake, I misread the fine article. Twelve months in between makes the flow of events less implausible indeed.

      • Is this why Lineage wiped out some Qualcomm keystores?

        I recall Signal users being particularly pissed.

  • by Dr.Dubious DDQ ( 11968 ) on Wednesday April 24, 2019 @08:26PM (#58487038) Homepage
    (Probably not in this case specifically, but...)

    Every time one of these "OMG huge flaw in smartphones!" stories comes up, my first thought isn't "OMG I'm vulnerable!", it's "Hey, I wonder if I could use that to get root access and full control of my own phone, maybe install some custom firmware or at least scrape out all of this horrible invasive bloatware."

    I'm assuming it's not really relevant for that here, since it sounds like you'd need root access already to make use of it, and I'm assuming this only affects "user" data rather than, for example, bootloader locking.

    But, still, it feels weird to me to be constantly rooting for more security flaws, on the off chance that it'll widen the range of mobile devices that I'm willing to buy (i.e. rootable/customizable devices).
