Google is Bringing Passkey Support To Android and Chrome (googleblog.com) 63

Android Developers Blog: Passkeys are a significantly safer replacement for passwords and other phishable authentication factors. They cannot be reused, don't leak in server breaches, and protect users from phishing attacks. Passkeys are built on industry standards and work across different operating systems and browser ecosystems, and can be used for both websites and apps. Passkeys follow already familiar UX patterns, and build on the existing experience of password autofill. For end-users, using one is similar to using a saved password today, where they simply confirm with their existing device screen lock such as their fingerprint. Passkeys on users' phones and computers are backed up and synced through the cloud to prevent lockouts in the case of device loss. Additionally, users can use passkeys stored on their phone to sign in to apps and websites on other nearby devices.

Today's announcement is a major milestone in our work with passkeys, and enables two key capabilities: Users can create and use passkeys on Android devices, which are securely synced through the Google Password Manager. Developers can build passkey support on their sites for end-users using Chrome via the WebAuthn API, on Android and other supported platforms. To try this today, developers can enroll in the Google Play Services beta and use Chrome Canary. Both features will be generally available on stable channels later this year. Our next milestone in 2022 will be an API for native Android apps. Passkeys created through the web API will work seamlessly with apps affiliated with the same domain, and vice versa. The native API will give apps a unified way to let the user pick either a passkey or a saved password. Seamless, familiar UX for both passwords and passkeys helps users and developers gradually transition to passkeys.
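For web developers, a minimal sketch of what passkey creation looks like with the standard WebAuthn API mentioned above. This is not Google's sample code; the challenge, relying-party, and user fields would normally come from your server, and all names and values here are illustrative.

```typescript
// Sketch: creating a passkey (a discoverable WebAuthn credential).
// In production the challenge and user handle are issued by the server.
async function createPasskey(): Promise<void> {
  const credential = (await navigator.credentials.create({
    publicKey: {
      challenge: crypto.getRandomValues(new Uint8Array(32)), // server-issued in practice
      rp: { name: "Example", id: "example.com" },            // the relying party (your site)
      user: {
        id: new TextEncoder().encode("user-handle-123"),     // opaque, server-assigned
        name: "alice@example.com",
        displayName: "Alice",
      },
      pubKeyCredParams: [
        { type: "public-key", alg: -7 },   // ES256
        { type: "public-key", alg: -257 }, // RS256
      ],
      authenticatorSelection: {
        residentKey: "required",      // discoverable credential, i.e. a passkey
        userVerification: "required", // triggers the screen-lock/biometric prompt
      },
    },
  })) as PublicKeyCredential;
  // Send credential.rawId and credential.response to the server to finish registration.
  console.log("registered passkey", credential.id);
}
```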

For the end-user, creating a passkey requires just two steps: (1) confirm the passkey account information, and (2) present their fingerprint, face, or screen lock when prompted. Signing in is just as simple: (1) The user selects the account they want to sign in to, and (2) presents their fingerprint, face, or screen lock when prompted. A passkey on a phone can also be used to sign in on a nearby device. For example, an Android user can now sign in to a passkey-enabled website using Safari on a Mac. Similarly, passkey support in Chrome means that a Chrome user, for example on Windows, can do the same using a passkey stored on their iOS device. Since passkeys are built on industry standards, this works across different platforms and browsers - including Windows, macOS and iOS, and ChromeOS, with a uniform user experience.
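Signing in is the same API in the other direction. Another illustrative sketch: leaving `allowCredentials` empty lets the browser offer any passkey discoverable for this site, including one on a nearby phone reached over the cross-device flow described above.

```typescript
// Sketch: signing in with a passkey via a WebAuthn assertion.
async function signInWithPasskey(): Promise<void> {
  const assertion = (await navigator.credentials.get({
    publicKey: {
      challenge: crypto.getRandomValues(new Uint8Array(32)), // server-issued in practice
      rpId: "example.com",
      allowCredentials: [],         // empty: let the user pick any passkey for this site
      userVerification: "required", // screen lock / biometric
    },
  })) as PublicKeyCredential;
  // The server verifies the signature in assertion.response against the stored public key.
  console.log("signed in with", assertion.id);
}
```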

This discussion has been archived. No new comments can be posted.


  • linked to biometrics??

and when someone copies that biometric, you may not be able to change it

    • by Junta ( 36770 )

Based on my experience with WebAuthn, I presume it's the same deal: biometrics only in the local-device context. The infrastructure would probably let a 'bad' device skip the biometric check, so they probably require device attestation to prove the hardware, the software load, and that the device isn't rooted. Presumably each device is given a key and is expected to protect that key locally using the Face ID/fingerprint/PIN facility already present on the device. The biometrics never leave the device.

      • by ArmoredDragon ( 3450605 ) on Wednesday October 12, 2022 @12:52PM (#62960537)

The private key never leaves the device either way; the biometrics just authorize using the key to sign messages. A simple PIN would do the job pretty well. The idea is that some random person can't steal your hardware and just go and use it willy-nilly. If they fail to enter the PIN x number of times, the device either locks or deletes the private key, depending on the policy.

The idea is you get more secure 2FA without needing to go through that verification crap that's prone to hacking anyways. Passwords can also be eliminated. Regardless of how strong or weak the other factor is, it's a hell of a lot more secure than passwords and SMS/email verification, because attackers can no longer rely solely on social engineering.

      • Biometrics are a concern because they're not as protected as passwords legally.

        Governments can compel or trick you to get a picture of your face or fingerprint or DNA --- but they can't compel you to give up a password without more due process.

    • Wouldn't the biometrics be used to unlock the private key which exists (and which exists in highly protected storage) on the device?

If that's the case, then if you don't have the private key, a copied fingerprint wouldn't do any good.

  • So we know that: 1. it will be discontinued in 6 months, unless 2. data about every use of your password brings them enough money.

Doubtful; passkeys basically supersede WebAuthn/FIDO2, so they'd be shooting themselves in the foot. It's an open standard that pretty much all of the big players are moving to. That would be like discontinuing support for TLS.

Exactly. And Google is a member of the consortium of companies that developed this technology. Apple was the first to announce it as part of iOS 16 and discussed how to use passkeys at their developer's conference.

        What I do wonder is if Google was able to add the additional features that Apple presented or if they are proprietary to Apple.

        I think I read that Amazon and some other key players are jumping onboard as well.

        This is good news.

  • by Locke2005 ( 849178 ) on Wednesday October 12, 2022 @12:29PM (#62960437)
    I'm tired of having to do 2FA to access my bank account while I'm driving!
  • by ArmoredDragon ( 3450605 ) on Wednesday October 12, 2022 @12:30PM (#62960443)

Recent versions of iOS basically deprecated hardware keys in favor of passkeys. But Apple, in their infinite wisdom, decided that the passkeys have to be shareable, making the whole thing fucking pointless. Although Google is doing shareable passkeys too, they can also be provisioned as non-exportable so the private key never leaves the device. Why Apple wouldn't do that as well, only Apple knows. Why they also had to cripple the fuck out of hardware keys for no good reason, only they know as well.

  • Passkeys on users' phones and computers are backed up and synced through the cloud to prevent lockouts in the case of device loss.

Is that backup encrypted such that it is not recoverable without a passcode known only to the user? Either it's encrypted with a key only the user holds, or it's stored in a manner that makes it susceptible to being retrieved by Google or its employees.

    • Hey now, I am sure that the Google employees would need a valid* request from a law enforcement authority in order to unlock it...

      *valid requests can be bypassed if the requester is in a hurry

      • They probably wouldn't need to do anything other than inject their own signing key/certificate somewhere into the process so that any auth requests signed with their key would let you in to whichever account you want.

That's the Apple approach. The Google approach allows both that and non-exportable keys. The app/website can request device attestation (signed by a vendor-issued key) that the key is non-exportable, if desired.

  • They want to use my music and porn device as a security key? This doesn't seem like a good idea.
  • If my password gets leaked, I can change it. I can't, however, get a new fingerprint. This makes biometrics an inherently bad choice for any form of authentication.

Now, when combined with passkeys, your biometrics become the only barrier to full access to your online accounts; if they leak, the malicious actor gets access to your entire online identity in one go. Forever.

    This may also affect any future accounts that you create, unless you fancy logging in with a new finger. You'd get 10 lives, ca

    • by ebh ( 116526 )

      That's why biometrics should be user IDs, not passwords. Otherwise your IT department's paleolithic password-aging rules come into conflict with the cost of organ transplants.

      • by ArmoredDragon ( 3450605 ) on Wednesday October 12, 2022 @02:46PM (#62960913)

        This is a horrible, horrible misunderstanding of how this works. Your biometric information never leaves your device. It's stored in a secure enclave, and its only purpose in this case is to give another hardware component permission to cryptographically sign messages. This proves two things:

        - You have the private key, which another hardware component can testify (via vendor issued certs) is non-exportable, meaning the signer physically has the original device that generated the trusted public key.
        - Your biometric information was used to authorize that physical device to cryptographically sign a message.

        Can this be hacked? Yes, but it requires one of a few very non-trivial things:
- The hacker somehow knows the public key (feasible) and uses some serious compute effort to derive the private key from it. So long as the algorithm is reasonably up to date, this would be extremely costly; think multiple millions of dollars.
- The hacker physically obtained the device and somehow extracted the private key. So long as the implementation is reasonably secure, this will also be outrageously expensive, not to mention some novel hardware hacking techniques will need to be employed.
        - Some technical means of extracting the key via software is discovered. This would likely be just as costly as the above mentioned methods, and would rely on a long chain of vulnerabilities in both the software AND the hardware.

        The endgame here is that this makes social engineering attacks far less likely to succeed, perhaps even impossible to succeed without also bringing some very hard technical skills into the mix.

        • Can this be hacked? Yes, and with the simple trivial case of:
          1) The hacker runs some code on the device that triggers an authentication request / waits for one to occur.
          2) The code then gets the token they need, and spoofs an "error has occurred please try again" message.
          3) User performs authentication again, gets what they want, and is none the wiser.
          ???
          Profit!

          Note: 2 and 3 need not occur in that order.
          Authentication fatigue is a thing. Just as much as confirmation prompt fatigue. Most victims won'
          • 1) The hacker runs some code on the device that triggers an authentication request / waits for one to occur.

You can't trigger authentication requests from random devices on the internet; that is one of the big differences compared with other solutions. And if you mean the attacker already controls the device that holds the key, and/or the device you're using to access the resource, then you've already lost, and there's no point in extra steps or in discussing authentication fatigue.

          • Note: 2 and 3 need not occur in that order.

            Depends on the implementation, but you're assuming that step 1 would be trivial. It won't be trivial.

            While nobody is claiming that this will be hack proof, the barrier to entry is really high, even in your scenario here. For step 1 to work, you'd again need a chain of hardware and software vulnerabilities that you've exploited. The implementation will need hardware attestation that the authorized person has interacted in some way to approve the sign request.

For those unaware, hardware attestation is done with a unique vendor issued certificate that is provisioned/signed by the vendor's issuing keys at the time the device was created at the factory, and is also non-exportable. The owner of the device can't even extract its private keys without some kind of exploit.

            • While nobody is claiming that this will be hack proof, the barrier to entry is really high, even in your scenario here. For step 1 to work, you'd again need a chain of hardware and software vulnerabilities that you've exploited. The implementation will need hardware attestation that the authorized person has interacted in some way to approve the sign request.

              Step 1 doesn't require a massive exploit chain. Only that they can execute a request to the legit APIs that provide access to the enclave. (I.e. an Oracle. [wikipedia.org]) FYI, a simple browser request could do that. Run a captive portal, wait for the user to access a site that requires the use of the enclave, then provide a spoofed page. Step 2 and 3 are then invoked. You don't need to exploit anything other than the weariness of the user. AKA. Social Engineering, the greatest threat to IT Security by far.

              For those unaware, hardware attestation is done with a unique vendor issued certificate that is provisioned/signed by the vendor's issuing keys at the time the device was created at the factory, and is also non-exportable. The owner of the device can't even extract its private keys without some kind of exploit.

For those unaware, any such device should be immediately considered untrustworthy, as its keys could be held in escrow, with or without the user's knowledge, and used against them.

              • Step 1 doesn't require a massive exploit chain. Only that they can execute a request to the legit APIs that provide access to the enclave. (I.e. an Oracle.) FYI, a simple browser request could do that. Run a captive portal, wait for the user to access a site that requires the use of the enclave, then provide a spoofed page. Step 2 and 3 are then invoked. You don't need to exploit anything other than the weariness of the user. AKA. Social Engineering, the greatest threat to IT Security by far.

You think this because you don't understand the protocol. The original domain that the user authenticated against is hashed into the authentication request along with many other bits. Because the spoofed request will come from an invalid domain, the authenticator's response won't verify. So yeah, they can try fatiguing the user, and the user could approve a spoofed page, but it's not going to do the attacker any favors. (A server-side sketch of this origin check appears after the thread.)

              • Oh and this bit:

For those unaware, any such device should be immediately considered untrustworthy, as its keys could be held in escrow, with or without the user's knowledge, and used against them.

                Yeah, that's why it's an attestation key. Any competent vendor is going to warn you against using that key for any purpose other than attestation, for exactly this reason. Yubikey offers exactly such a warning in all of its documentation. So don't use it for authentication or for signing documents. Only use it for its intended purpose.

                People much smarter than you and much more experienced in both engineering and breaking authentication schemes have put a lot more thought into this than your

Why is everyone going on a rant about biometrics?! The word doesn't even appear in TFA or the summary! You aren't "getting full access to your online accounts" with the biometrics; you're just using them to unlock the local device. It's like ssh-ing with key-based authentication instead of a password. It's safer in many ways (no weak passwords, no password reuse, no way to fill it in on the wrong site, and so on), and you can still change anything you like. The biometrics are used j

  • Why not use client certs instead of the Google controlled crap? That's all this is except with less security and unnecessary complexity.

    Why settle for the inferior solution?

Another step in the ongoing quest to tie people to specific devices to access online resources. So much for borrowing your friend's PC/laptop to do something while you're visiting, etc. Also, I don't really want my login info backed up to Google's cloud. I get the supposedly increased security, but it's not something I need over passwords and current 2FA.

    • So much for borrowing your friend's PC/laptop to do something while you're visiting, etc...

Actually it makes using "guest PCs" easier AND SAFER than ever. There's nothing to sniff (like your password), there's no way to enter a password or OTP into a fake site, and there's no way to mistakenly approve some remote attacker (as happened recently in some high-profile attacks).

      Sure, you still need the phone to approve the login (the link between the guest PC and the phone are either bluetooth or a QR code from

  • The summary states "Passkeys on users' phones and computers are backed up and synced through the cloud to prevent lockouts in the case of device loss." Doesn't this also introduce an attack vector where my passkeys could be stolen and used to auth as me?
    • by HiThere ( 15173 )

      Yes. The summary was clearly written, or at least edited, by the PR department to remove any trace of a reason why this is a bad idea.

      I don't know whether it's a good idea, or a bad one, but I sure wouldn't decide based on this article.

Google assumes that everyone's only got one Google account. Meanwhile, putting your main email account on your phone is a horrendously stupid idea for a zillion different reasons: phone theft, malware, tracking... the list goes on and on. That's why many people, including myself, use a separate phone-only account, which completely undermines the usefulness of using your phone for authentication on other devices.

    • That's why many people, including myself, use a separate phone-only account which completely undermines the usefulness of using your phone for authentication on other devices.

Why? The phone just acts like a FIDO2 key; the account on the phone is never exposed to the sites you log in to. You just register that phone like you'd register a (much dumber and simpler) Yubico YubiKey or similar. The sites won't know your Google account (be it main or phone-only).

The problem is not with the sites seeing my account. The problem is that account A on my phone is separate from account B that I use on my laptop. If I auth on the web, it will ask me to authenticate with B, whereas my phone is linked to A.

        • There is no "account" involved (except for the purposes of moving your key to another phone at some other time if needed/desired)! The phone has basically a fancy crypto key that can be plugged into any site that supports it. "plugging in" means when using the laptop and trying to log in to (for example) github it'll ask for the hardware key (if you configured it previously) and it'll either find the phone over bluetooth or present a QR code you need to scan with the phone. Then you tap ok that you want to

  • by Jiro ( 131519 ) on Wednesday October 12, 2022 @02:31PM (#62960871)

    There have been a number of infamous cases of Google closing people's accounts for spurious reasons and allowing them no way to appeal. What's going to happen if you have passkey support that works like this, and then Google locks you out of your account?

    • by AmiMoJo ( 196126 )

      Your Google account is only used for sync. If you can't access your Google account your devices still have a local copy of the key. After all, if they didn't you wouldn't be able to use it to log into your Google account.

Decent implementations of WebAuthn support multiple keys, so for example I have both my phone and a YubiKey on GitHub. Sites that use WebAuthn also provide backup codes that you are supposed to store securely yourself (I use KeePass), which can get you into the account if you lose all your keys.

  • If we are going to move away from passwords to something involving cryptography, a hardware key (u2f/fido/whatever it's called these days) would be better than something relying on a phone.

    Much less likely that a hardware key can be hacked or has a flaw that allows the secrets to be exposed. You can use a hardware key on any device that supports the necessary APIs and has an appropriate interface to allow the hardware key to talk to it. And a hardware key is less likely to be out of battery at the time you

Yes, you CAN use a hardware key! Are you using one? Are the people around you using one? Surely not. Do they have smartphones? Probably yes. This is for those people (which really covers almost everyone).

  • So how do you log into your phone if you have no internet connection?
This isn't about some fancy thing that replaces the lock screen on the phone (although detailing how you log into your phone is probably what got everyone up in arms about biometrics and stuff). You log in to your phone as you normally would: biometrics (buhuhu, biometrics bad because you can't change your face or fingerprint) or, if you don't like that, a PIN or password; whatever. NOTHING changes. This isn't the discussion.

      The phone is basically used as a fancy crypto hardware key. That's all. It's like you'd buy

  • (1)Google creates a new "one ring to rule them all" authentication scheme ...

    (2)Once a large number of people have signed up and Google has gobbled up a lot of meta-data [call me credulous, but it's possible Google won't go after the security mechanism itself, rather your history of where, when and with whom you logged in] ...

    (3)Google gets bored and drops the product - it's got what it wants and who cares about loads of users who now have a lot of security housekeeping to do?

    I may be cynical, but I wouldn'

  • ... existing device screen-lock ...

This is two 'something you have' factors: the authentication string and the device. Screen locks tend to be low security (i.e., stopping crimes of opportunity) and thus don't count when the authentication string is plaintext. My gut reaction is: put the phone in debug mode and copy the plaintext authentication string and username to another device. Access granted.

Isn't this like TOTP 2FA, where you sync your clocks well enough, share a starting key, and then generate a time-based code periodically? (See the TOTP sketch after the thread for contrast.)

Or is this just a FIDO-like thing? Didn't they support that already?
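On the phishing-resistance point argued above: the origin binding is something the relying party's server actually checks. A minimal server-side sketch; the field names (`type`, `origin`, `challenge`) come from the WebAuthn spec, while the origin value and function name are illustrative assumptions.

```typescript
// Sketch: the relying party's first checks on a WebAuthn assertion.
// The browser writes the origin and challenge into clientDataJSON, and the
// authenticator signs over authenticatorData || SHA-256(clientDataJSON),
// so an assertion phished on a look-alike domain fails the origin check.
function checkClientData(clientDataJSON: ArrayBuffer,
                         expectedChallengeB64url: string): void {
  const clientData = JSON.parse(new TextDecoder().decode(clientDataJSON));
  if (clientData.type !== "webauthn.get")
    throw new Error("not an authentication assertion");
  if (clientData.origin !== "https://example.com") // your real origin
    throw new Error("assertion was produced for a different origin");
  if (clientData.challenge !== expectedChallengeB64url)
    throw new Error("stale or replayed challenge");
  // Next step (omitted): verify the signature with the registered public key.
}
```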
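And on the TOTP question just above: time-based codes are derived from a secret that both the user's device and the server hold, which is exactly the property passkeys avoid. A minimal RFC 6238 sketch in Node-flavored TypeScript for contrast; the helper name is mine.

```typescript
import { createHmac } from "node:crypto";

// Sketch: TOTP (RFC 6238). Both parties share `secret`, so a server breach
// leaks it; a passkey's private key, by contrast, never leaves the device.
function totp(secret: Buffer, step = 30, digits = 6, now = Date.now()): string {
  const counter = Buffer.alloc(8);
  counter.writeBigUInt64BE(BigInt(Math.floor(now / 1000 / step)));
  const mac = createHmac("sha1", secret).update(counter).digest();
  const offset = mac[mac.length - 1] & 0x0f; // dynamic truncation (RFC 4226)
  const code = (mac.readUInt32BE(offset) & 0x7fffffff) % 10 ** digits;
  return code.toString().padStart(digits, "0");
}
```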
