Hacker News
Google Chrome U2F API decommission: What the change means (yubico.com)
112 points by vmoore on Feb 5, 2022 | hide | past | favorite | 46 comments


A few things to note because this can be confusing. Old u2f tokens are still supported by webauthn. The change here is specifically about what javascript APIs are available to websites.

If you are wondering if a site is using the webauthn or u2f api: the webauthn flow involves the browser showing a modal dialog. The u2f flow allowed for javascript to interact with your token without the browser itself showing a dialog.

You can see what the webauthn flow is with one of the many test sites, like https://webauthn.io/
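The difference in page code can be sketched roughly like this (the helper name `buildGetOptions` is illustrative, not a standard API):

```javascript
// Hedged sketch: how a page asks the browser for a WebAuthn assertion.
// The browser itself (not the page) then shows its modal security-key prompt.
function buildGetOptions(challengeBytes, rpId) {
  return {
    challenge: challengeBytes,    // bytes from the server, never reused
    rpId: rpId,                   // must be a registrable suffix of the origin
    userVerification: "preferred",
    timeout: 60000,
  };
}

// In a real page (browser only):
//   const cred = await navigator.credentials.get({
//     publicKey: buildGetOptions(challenge, "example.com"),
//   });
//
// The deprecated u2f.sign() API, by contrast, let page JavaScript talk to
// the token with no browser-drawn dialog in between.
```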


Very fortunately, USB smartcards still work, work across platforms, have worked unchanged since v1 of Chrome, and are really out of the Chrome devs' reach, since they operate at the OS platform library level on Windows.


Every device works by that definition. The question is about browser support without installing additional software.

Here's my anecdote. In my country we use certificates to access government services. There are USB smartcards for securely storing and using those certificates. As there's no browser API to interact with smartcards, I'm forced to install their software. That software works only with installed root certificates (it uses an https websocket on 127.0.0.1 to interact with the browser). A few weeks ago they released a new version, and for some reason they decided to host it on a completely unrelated site owned by some private person. This is complete security madness, not even theater.

So no, smartcards don't work. They're terrible from a security perspective compared to native browser support, as they force users to install untrusted software with full access to the system and terrible security practices.


> The question is about browser support without installing additional software.

It's there. The SSL library interfaces with pkcs11 without browser involvement (and yes, browser makers sabotaged the PKCS #12 API to push webauthn, which is not even a direct replacement).

I repeat, it's not signing/encrypting with an API, but providing authentication/client-side encryption at the TLS layer, which works flawlessly.


A smartcard access API should have been made a decade ago, rather than the Web* (Bluetooth, MIDI, whatever) APIs. It's fundamental.


A smartcard access API was present in all browsers decades ago, then purposefully sabotaged to make way for webauthn - which still doesn't work.


Webauthn works great.


Are you talking about Java applets? That's overkill.


What's the user experience like? Last time I saw the state of things, you had to run low level CLI tools at least on Ubuntu [1] [2]. It wasn't something that you could assume nontechnical end users can handle.

[1] https://help.ubuntu.com/community/CommonAccessCard#Google_Ch...

[2] https://unix.stackexchange.com/questions/302115/installing-s...


It seems they removed the systemwide p11 setup in Ubuntu, requiring a few extra clicks. On Windows, it works without any hassle.


I think the main thing with Chrome/Chromium was the browser's lack of a GUI for enabling the pkcs11 plugin for your smartcard reader, necessitating the use of the modutil CLI. Firefox has this in the GUI (Settings -> Privacy and Security -> Security Devices).


If you're curious why it has taken Google accounts so long to switch over to webauthn, there was a fairly interesting discussion about this on the mozilla-dev mailing list back in 2019[0]. The tldr is: some Android OEMs baked U2F into their OS images that were then never updated. Switching to webauthn could have effectively locked users out of these devices.

[0]: https://groups.google.com/g/mozilla.dev.platform/c/q5cj38hGT...


I always thought that standards are there to be stable and well documented. This whole fido mess shows how IMHO it should not be done. Anyone who e.g. build stuff on top of CTAP on windows saw their stuff break because windows proxies the whole webauthn stuff through their windows hello APIs and made sure that USB access to FIDO devices is blocked on OS level. What is left is a totally underdocumented webauthn.h header file from Microsoft on GitHub. I am confident that they will break things over and over again, so that it best fits them.


It is kind of unrealistic to read every mailing list your work depends on. Many people won't know about this until the deprecation note appears on their desktop.


This mailing list post has nothing to do with the u2f api being deprecated in chrome.

Edit: correct which api is deprecated.


We just migrated our web service from u2f api to webauthn and it was a massive pain. It’s true the old keys worked but it relies on using an extensions api within webauthn.
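For context, the extension in question is WebAuthn's `appid` extension, which lets credentials registered under the old U2F API (keyed to an AppID URL) keep working. A hedged sketch, with an illustrative AppID URL and helper name:

```javascript
// Hedged sketch of requesting an assertion with the WebAuthn `appid`
// extension, so keys registered via the legacy U2F API still work.
// The AppID URL below is a placeholder, not a real endpoint.
function buildLegacyGetOptions(challengeBytes, allowedCredentialIds) {
  return {
    challenge: challengeBytes,
    allowCredentials: allowedCredentialIds.map((id) => ({
      type: "public-key",
      id: id, // credential ID bytes saved at U2F registration time
    })),
    extensions: {
      appid: "https://example.com/u2f-app-id.json", // the old U2F AppID
    },
  };
}

// After navigator.credentials.get(), the page can check
// cred.getClientExtensionResults().appid to learn whether the legacy
// AppID (rather than the RP ID) was used for this assertion.
```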

Even without backwards compatibility, there are a lot of pieces that have to be set properly and rely heavily on third-party libraries to be written correctly for things to work.

I get the security benefits over something like OTP, but OTP is vastly simpler to set up. Getting an RP ID to work across multiple subdomains, legacy trusted facets, origin checks, and proper marshaling and unmarshaling of binary / base64 / base64url-safe data are real pain points.
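The marshaling pain point is concrete: WebAuthn hands the browser raw bytes, but those bytes usually travel to the server as base64url (RFC 4648 §5). A minimal Node-flavored sketch of the round trip (a browser would use `Uint8Array`/`atob` instead of `Buffer`):

```javascript
// Hedged sketch of base64url conversion, the encoding WebAuthn payloads
// typically use in JSON: "-" and "_" instead of "+" and "/", no padding.
function bufferToBase64url(buf) {
  return Buffer.from(buf)
    .toString("base64")
    .replace(/\+/g, "-")
    .replace(/\//g, "_")
    .replace(/=+$/, ""); // strip padding
}

function base64urlToBuffer(s) {
  // restore the standard alphabet; Buffer tolerates missing padding
  const b64 = s.replace(/-/g, "+").replace(/_/g, "/");
  return Buffer.from(b64, "base64");
}
```

Mixing up plain base64 and base64url anywhere along the path is one of the classic "works locally, fails in production" failure modes mentioned above.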

Things work with localhost, but then when you deploy they don't with the real domains / subdomains / origin / RP ID.

Not fun and the documentation is at times rather vague. For example, what is a registrable domain? I spent hours and hours trying to figure this out by reading the specification. I don’t know why but no matter what I tried I couldn’t get webauthn to recognize ngrok.io as one so things would fail. Probably my misinterpretation but still frustrating nonetheless.

We also have a cli counterpart and libfido2 has virtually no documentation so I’m traversing C code to figure out vague error messages like “err invalid length.”


From your description it sounds like you were fighting very hard to make this into a box ticking exercise which didn't deliver any security, and unsurprisingly WebAuthn wasn't interested in helping you achieve that. I found it very easy to deploy WebAuthn securely, writing everything from scratch as a toy project to understand how it works, but of course I wasn't fighting to mark the checkbox while not delivering security.

> what is a registrable domain? [...] I couldn't get webauthn to recognize ngrok.io

A registrable domain is a domain that "you" aren't sharing with somebody else. For example this site is sharing .com with millions of other separate entities, so WebAuthn is not interested in having credentials for "com" because that wouldn't mean anything and introduces needless privacy risk, but Y Combinator owns all of ycombinator.com so that would constitute a registrable domain.

ngrok.io, as you presumably knew but omitted to mention, is used by large numbers of different entities to build network tunnels, so it doesn't mean anything to have "ngrok.io" credentials any more than for "com", and WebAuthn won't allow that.

For now, for want of anything better, the PSL is used in browsers to figure out what counts and what doesn't, but the rule of thumb I described above will serve you very well.
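The rule of thumb above can be sketched as a toy check against a hardcoded mini suffix list (real browsers consult the full PSL; this is only illustrative):

```javascript
// Toy sketch of the registrable-domain rule. A hardcoded mini
// public-suffix list stands in for the real PSL.
const PUBLIC_SUFFIXES = new Set(["com", "io", "co.uk", "ngrok.io"]);

// A host qualifies only if at least one label sits above a public
// suffix, i.e. some single entity actually owns it.
function isRegistrableDomain(host) {
  if (PUBLIC_SUFFIXES.has(host)) return false; // the suffix itself: rejected
  const labels = host.split(".");
  for (let i = 1; i < labels.length; i++) {
    if (PUBLIC_SUFFIXES.has(labels.slice(i).join("."))) return true;
  }
  return false;
}

// isRegistrableDomain("ycombinator.com") → true
// isRegistrableDomain("ngrok.io")        → false (itself a public suffix)
// isRegistrableDomain("myapp.ngrok.io")  → true under this toy list
```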


> so it doesn't mean anything to have "ngrok.io" credentials any more than for "com" and WebAuthn won't allow that.

Nowhere in the spec did I see that ngrok.io should be excluded as a valid RP ID; I looked pretty intently but could have missed something. Just like with localhost, this was purely a way to cheaply get TLS to simulate a more realistic deployment.

> I found it very easy to deploy WebAuthn securely, writing everything from scratch as a toy project to understand how it works

I’m glad you found it easy but judging by the fact that you implemented webauthn by yourself, your skill level is probably advanced. Furthermore, in my particular case I needed to support legacy u2f-api keys, which added to the complexity. The APIs are different, what they expect are different, and the third party libraries are different.


Not an expert in this topic, but:

ngrok.io is on the Public Suffix List: https://publicsuffix.org/list/

This other comment says that WebAuthn uses the PSL in its definition of "registrable domains": https://news.ycombinator.com/item?id=24444476

Edit: yes, here is the part of the spec that says where you can't register public suffixes: https://html.spec.whatwg.org/multipage/origin.html#is-a-regi...


> Nowhere in the spec did I see where ngrok.io should be excluded as a valid rp id, I looked pretty intently but could have missed something. Just like with localhost this was purely a way to cheaply get tls to simulate a more real implementation.

It's in the spec, though you need to click a few links to get to the details. The WebAuthn spec (https://www.w3.org/TR/webauthn-2/) has the following in the terms defined by reference:

> is a registrable domain suffix of or is equal to

This references the following page: https://html.spec.whatwg.org/multipage/origin.html#is-a-regi...

Which contains a reference to a "public suffix": https://url.spec.whatwg.org/#host-public-suffix

Which references the Public Suffix List: https://publicsuffix.org/list/

Which ngrok.io is on: https://publicsuffix.org/list/public_suffix_list.dat

So, this limitation is in the spec, it's just not immediately obvious if you skim through the spec to find the problem.

The public suffix list ensures that you cannot accidentally share your authentication domain with others for all kinds of security measures, and WebAuthn is just one of those. Learning about the PSL and its uses (and applied limitations) is imperative if you write any kind of security-related web content, IMO.


I must have missed that! I did meander my way to the public suffix list at some point but I must not have searched it properly. Thanks!


Adam Langley's blog about webauthn is still the best resource that explains how everything fits together: https://www.imperialviolet.org/2018/03/27/webauthn.html


IIRC the only reliable way to check if a domain is registrable is to rely on a public suffix list. ICANN's TLD binge has made keeping those PSLs up to date a challenge.


We used USB smartcards, and they were plug-and-play for the last n years, despite Google's repeated attempts to axe/stealth-break them (and they were returned to working order every time due to corporate client pressure).


I just migrated our auth service for my employer and wrote up a guide on what I think you need to know and should do if you're going through it. I think it was overall incredibly straightforward, and the Yubico maintainers even added some additional functionality to their Java server-side library as I corresponded with them.

To plug myself a bit, hopefully this can be helpful to others, and I'd appreciate feedback if you find it unclear: https://www.jacobcasper.com/u2f2webauthn.html


This is the first I'm hearing of this change, and considering that this is likely true for many people the migration timeline seems awfully short.


I think we found out about it in October or November from a pre-release Chromium Edge user? I definitely was surprised to see as short a turnaround as 5-6 months, and if you're only hearing about it now I can't imagine the stress.

I was lucky to have some time to work on it and get it done before the holidays so I didn't have to worry too much. I wrote up a migration blog post and commented it here as well because I know I wanted that kind of resource when I was working on it, so if anyone needs that I hope it helps.


If you're using the U2F API you've been getting a warning about it since November. I can't imagine developers that are using it are still unaware at this point, particularly when it's a user-facing warning and not just buried in the developer tools.


3 months is not a long time at all.


It's three months to become aware of the problem, and maybe migrate. If you cannot complete the migration in the window you can opt into the deprecation trial and you'll have until around August.


Especially not across the Holiday period.


It's not, but as someone who has to deal with multi-year deprecations, setting the expectation that 3-4 month deprecations are something you'll have to deal with is probably healthier for the ecosystem as a whole. It sets up positive incentives around automating testing and deployments. Anything that's valuable enough that you'd use u2f to auth to needs to be in this mode for general security updates regardless. Granted, moving to a new API is almost certainly more work than a normal security upgrade, but the point is that these types of websites are not something that you can set and forget. They need dedicated, ongoing maintenance.


The migration period since the warnings is awfully short (~3 months, over Christmas and New Year too).

But as far as I understand, you were supposed to start migrating once WebAuthn was ready. Not sure when that was, but it was quite a while ago.


The deprecation and accompanying warning was in November but Google publicly stated their intention to deprecate in November and remove at the end of February at least as far back as June.

Edit: The intent to deprecate and remove was sent to the blink-dev mailing list on June 11th. It can be found on row 2731 of the Blink Intents spreadsheet, https://bit.ly/blinkintents.


It's three months from notification until it's disabled by default. You can opt into the deprecation trial and you'll have until early August to migrate.


I have been hearing about this for a few months now since logging in to Vanguard (the brokerage) throws a Chrome warning about using the deprecated API. I don't think Vanguard has migrated yet.


As a YubiKey user, is this something I need to worry about?


Most likely not.

There was a U2F API, which was later superseded by WebAuthn (which internally uses the U2F protocol) and which is now being shut down.

There is a good chance all your services are using WebAuthn.

I'm not sure but I think Firefox never supported the API which is now deprecated.


No. Webauthn is backwards compatible with old u2f tokens, so regardless of how old your YubiKey is, it will continue to work.


Only site I use that uses U2F (and I enroll my keys everywhere I can) is Vanguard; every other site I use supports WebAuthn.


The blog post is misconstruing things; WebAuthn is not totally backwards compatible with U2F. Specifically, WebAuthn doesn't support the U2F trusted facets functionality, which allowed for a U2F credential to be used on a cross-origin basis. https://fidoalliance.org/specs/fido-u2f-v1.2-ps-20170411/fid...


Is anybody actually doing that? To me it looks like something that's astoundingly fragile and unwieldy and I'd expect that most "users" of this feature were either opening themselves up to a vulnerability they hadn't considered or were in fact attackers targeting such a vulnerability. Yuck.


Oh god, that scared me for a moment until I realized what's going on ...


Atlassian is the only site where I’ve gotten warnings about the old API being used.


Salesforce is another one.


Very unrelated: why is the favourite stock photo for "Google Chrome" a 2014-era Firefox on a Windows 7 computer displaying the Chrome download webpage? I'm not just talking about this particular post; rather, it seems there's no incentive for stock photo companies to commission an update.



