Author of the original medium post here. I had simply never heard of COSE at the time of writing this. There was no conspiracy to bury the spec.
There are a bunch of vague accusations that I'm trying to profit or rent seek off of one of the specs I did write about. I didn't create and I don't maintain any of those. I also wouldn't trust any crypto designed by myself.
The original context for writing this post was discussions around using JOSE in the context of signing container images [1]. I was against it and preferred something simpler.
This is the same guy who cribbed MessagePack for his CBOR standard, and now it looks like he wants to jam it onto JOSE. The likely thought process here is "CBOR is MessagePack, MessagePack is binary JSON, the J in JOSE is JSON..... profit."
Disclaimer: I wrote and maintain a MessagePack implementation.
I read both this post and yours; I didn't see anything inflammatory, and your company seems orthogonally related at best.
If anything this comes across poorly for the IETF; while I understand the main point (they identify marketing/updates as a weakness), it feels like a personal rant against you.
Even more so, there is no organization. The IETF is an activity, like dancing, which humans can participate in or not as they please. You can't "join", there's nobody "in charge" and it can't believe anything.
It's true, mostly, but it's worth knowing that different IETF working groups are, in fact, organized groups, with a set process and all the path dependency you get from any planned human enterprise. There are stakeholders in those groups that have greater and lesser influence.
In the spirit of nand-gate: "An IETF[1] is angry and shocked that an internet does not want to use their crypto dumpsterfire"
Case in point, the mentioned COSE format makes signatures by defining protected and unprotected attributes of the payload and then computing signatures over the Canonical CBOR encoding of the protected attributes. All this is done so that you can have parts inside your payload that can be changed without invalidating the signature, and it of course requires two-phase canonicalization. Yes, PASETO is absolutely, unequivocally a better choice than J/COSE.
Sign. The. Goddarn. Bytes.
[1] Actually not just any IETF but the dude who made an incompatible fork of msgpack named after himself.
I had not known about JOSE and the other subsequent “standards” until this article. Everyone knows JWS is a dumpster fire, so why keep packing additional “standards” on top of it???
The original medium blog post seems fine to me.
Sounds like a salty standards author, who worked on something nobody actually wants…
Unfortunately, some have used "JWS" to refer to JOSE, because that's all they personally used. JOSE is the standard. JWS is a subset. Same with JWK, JWE, JWT, etc. It's all JOSE.
Ha that's funny. I remember looking at CBOR and wondered why anyone would use it instead of MessagePack. Looking at it again, it seems different for the sake of being different.
From the CBOR wikipedia page:
> CBOR was inspired by MessagePack, which was developed and promoted by Sadayuki Furuhashi. CBOR extended MessagePack, particularly by allowing to distinguish text strings from byte strings, which was implemented in 2013 in MessagePack.
> Integers (types 0 and 1): For integers, the count field is the value; there is no payload. Type 0 encodes positive or unsigned integers, with values up to 2^64−1. Type 1 encodes negative integers, with a value of −1−count, for values from −2^64 to −1.
WTF? JavaScript got it wrong by not being able to represent 64-bit integers. CBOR wants to make 65-bit signed integers(!) by combining a 64-bit positive range (type 0) and a 64-bit negative range (type 1). A rational person would define 64-bit unsigned and 64-bit signed types, like most languages and hardware. Only an academic would propose something so disconnected. To fully support a CBOR negative (type 1) integer you'd have to use a 128-bit signed type or something like that.
Back then msgpack didn’t distinguish between binary and UTF-8 strings, though that was added later. But yeah, even if you needed to create your own format to change that, you don’t need to turn your integer support into a trash fire while you’re at it.
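The integer complaint above is easy to make concrete. Here is a minimal, illustrative decoder for CBOR integers (major types 0 and 1, following the RFC 8949 encoding rules quoted above), not a full CBOR implementation; it shows that the most negative encodable value, −2^64, overflows a 64-bit signed integer.

```python
import struct

def decode_cbor_int(buf: bytes) -> int:
    """Decode a single CBOR integer (major types 0 and 1) per RFC 8949."""
    initial = buf[0]
    major, info = initial >> 5, initial & 0x1F
    if info < 24:
        count = info
    elif info == 24:
        count = buf[1]
    elif info == 25:
        count = struct.unpack(">H", buf[1:3])[0]
    elif info == 26:
        count = struct.unpack(">I", buf[1:5])[0]
    elif info == 27:
        count = struct.unpack(">Q", buf[1:9])[0]
    else:
        raise ValueError("indefinite/reserved additional info")
    if major == 0:
        return count          # unsigned: 0 .. 2**64 - 1
    if major == 1:
        return -1 - count     # negative: -(2**64) .. -1
    raise ValueError("not an integer")

# Most negative CBOR integer: major type 1, 8-byte count of 2**64 - 1
most_negative = decode_cbor_int(b"\x3b" + b"\xff" * 8)
# This value, -(2**64), does not fit in an int64 (min is -(2**63))
```

Python's arbitrary-precision integers paper over the problem; a decoder in a language with only fixed-width integers has to special-case that last bit of range.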
I am not up to speed on this spec. Why on earth would they not just create a format that signs all of the bytes and makes that easy? It’s not expensive. Virtually all of the web runs on TLS… and it does fine.
I don't know anything about this spec, but typically the reason is that you want to be able to generate and verify signatures in a place and at a time where the transport isn't known. Therefore there are no "bytes" to sign. In essence the idea is to define an abstract transport (e.g. encode as JSON) and sign that. Then subsequently it doesn't matter how the bits are sent from A -> B -> C, you can always verify the signature by recreating that abstract encoding.
Obviously this is less efficient than signing the transport payload, but that doesn't help if you just don't have access to the transport payload, or when there are N different kinds of encoding used in different places in the system.
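The idea can be sketched in a few lines. Here a deterministic JSON encoding stands in for the "abstract transport", and an HMAC with a hypothetical shared key stands in for a real signature (an actual system would use an asymmetric scheme):

```python
import hashlib
import hmac
import json

SECRET = b"demo-key"  # hypothetical shared key, for illustration only

def canonical(obj) -> bytes:
    # Deterministic encoding: sorted keys, no insignificant whitespace
    return json.dumps(obj, sort_keys=True, separators=(",", ":")).encode()

def sign(obj) -> str:
    # HMAC stands in for a real signature scheme in this sketch
    return hmac.new(SECRET, canonical(obj), hashlib.sha256).hexdigest()

def verify(obj, sig: str) -> bool:
    return hmac.compare_digest(sign(obj), sig)

sig = sign({"b": 2, "a": 1})
# The same logical object, re-encoded elsewhere with a different key order,
# still verifies because the signature covers the canonical form.
ok = verify({"a": 1, "b": 2}, sig)
```

The cost is exactly the canonicalization step, which is also where such schemes historically go wrong: signer and verifier must agree byte-for-byte on the canonical form.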
Here's an example I've encountered with a user/server system.
A user signs a message with a key and uses a thumbprint to refer to the public key. The server system needs a public key, not just the thumbprint, to verify the message. The server does not accept full public keys in the signed message since public keys are large and thumbprints are sufficient.
One design is to transport the public key along with the signed message. This allows the server to verify and store the whole signed message even if the server doesn't store the public key.
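As an illustration, a thumbprint here could simply be a hash of the key bytes (in the spirit of RFC 7638's JWK thumbprints, though the exact hash input differs there), with the full key carried alongside the signed message so the server can check and store it:

```python
import base64
import hashlib

def thumbprint(public_key_bytes: bytes) -> str:
    """base64url-encoded SHA-256 of the key bytes (illustrative only)."""
    digest = hashlib.sha256(public_key_bytes).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode()

def key_matches(claimed_thumbprint: str, public_key_bytes: bytes) -> bool:
    # The server recomputes the thumbprint from the transported key and
    # compares it to the one named inside the signed message.
    return claimed_thumbprint == thumbprint(public_key_bytes)
```

The thumbprint alone never substitutes for verifying the signature; it only lets the message refer to a key compactly.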
Maybe I just talk with a weird subset of cryptography engineers, but I'd have trouble finding any that think JOSE (or "COSE") is a good message signing format. The argument that people should use JOSE or COSE simply because they're standards is, of course, risible; the track record on IETF cryptography standards is miserable. I'm not sure why this person's random mailing list comment merits this kind of attention, but if you need to hear another person's response to it, sure, here it is: "no".
It's not the best we have. The best we have in that category is any reasonable standardized signature format. Seriously, there is no need for a high-level message serialization format and a signature scheme to be related in any way.
Want an ancient but still okay standard (as long as the implementation used is decent)? Use PKCS #1 signatures. Want something spiffy and modern? Use Ed25519 or similar, standardized as RFC 8032. Want a scheme based on old and very well-tested technology? Use one of many hash-based signature schemes. (Be careful -- many are stateful and you can easily shoot yourself in the foot when signing a message.)
Okay, so you need an actual written standard for how to shove the signed data and the signature into a single blob. Fine, use PKCS #7, which has been published as an RFC since before a respectable fraction of HN readers were born. Or you could just encode a tuple of (data, signature) however you like, keeping in mind that the data needs to be just plain bytes at least until the signature is verified.
Need the header to encode which key to use to verify it? Fine, add a description of the key to the tuple and make absolutely certain when verifying that you have some reason to trust the key in question. Need algorithm agility? Fine, treat the algorithm as part of the public key.
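A (data, signature, key description) tuple really can be that simple. A hypothetical length-prefixed encoding, sketched here, is unambiguous to parse and keeps the data as plain bytes until the signature is verified:

```python
import struct

def pack(fields: list) -> bytes:
    # Length-prefix each field (4-byte big-endian) so splitting is unambiguous
    return b"".join(struct.pack(">I", len(f)) + f for f in fields)

def unpack(blob: bytes) -> list:
    fields, i = [], 0
    while i < len(blob):
        (n,) = struct.unpack(">I", blob[i:i + 4])
        fields.append(blob[i + 4:i + 4 + n])
        i += 4 + n
    return fields

# Hypothetical field contents; the key description could carry the algorithm
blob = pack([b"the actual message", b"<signature bytes>", b"ed25519:key-id-1"])
data, sig, keydesc = unpack(blob)
```

No canonicalization, no header parsing before verification, and the verifier never interprets `data` as anything but bytes until the signature checks out.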
Need all the other crap in JOSE? No, actually you don't need it because it's actively harmful.
(If you don't need an interoperable standard per se and just want code, use libsodium's crypto_sign_open. It works quite well and is pretty much foolproof, which is the whole point.)
We're in agreement that JOSE could have been much simpler. Again, I'm not a fan of JOSE, but I definitely see a gaping need for something like it.
Defining the "however you like" parts is the chief role of something like JOSE. Cryptographic standards like Ed25519 or ECDSA are not messaging formats. There needs to be an agreed way to send messages. That's the point of JOSE.
> so you need an actual written standard for how to shove the signed data and the signature into a single [...] you could just encode a tuple of (data, signature) [...] Need the header to encode which key to use to verify it? Fine, add a description of the key to the tuple [...] Need algorithm agility? Fine, treat the algorithm as part of the public key.
So does RFC 8032. So do PKCS #1 and #7. It is every bit as easy and well defined to say “put a PKCS #7 message in PEM form in a header” or “put a base64-encoded RFC 8032 message in a header” as it is to say “put a JOSE message in a header”.
There is no need for a special standard explaining with great verbosity how to glue existing standards that already fit together just fine. Especially when the standard tells you how to do it wrong as JOSE does.
JOSE is "the best we have"? The best what, RFC we can pull off the shelf? The cart is dragging the horse in that logic. The point of these systems is to solve problems (interoperably, you hope). If they don't solve problems --- and JOSE creates more problems than it solves --- we should throw them into a bonfire.
Maybe there’s context I’m missing here - my only context is as a (reluctant) user of JWT, but I don’t see how the opening claim is substantiated:
> But I can’t help seeing a whole little industry creep up that is interested in creating alternative building blocks that appear to be of interest to the creators so they can attain control over them and perform rent seeking from that control.
So someone wrote an article that was negative on JOSE/JWT, and is now starting a company, which seems to have little to do with PASETO or JWT. So what? Where is the rent seeking?
Maybe I’m missing the bigger picture, as is sometimes the case when a post intended for an audience who are members of a mailing list shows up here without more context. But as it is it just reads like a jealous rant.
I think CBOR could be one of the reasons why WebAuthn unfortunately hasn’t gained more popularity. Would have been much easier for all parties if they would have simply used JSON and base64 or hex to encode/decode binary data.
I implemented the server side of WebAuthn from scratch, and CBOR felt unnecessary; the added value of encoding binary data slightly more efficiently seems a small win, given the small data size transmitted/received in a WebAuthn authentication.
FWIW if you don't care about attestation, Webauthn-L2 has client-side helper functions like getPublicKey() that allow you to do the handshake without parsing any CBOR https://www.w3.org/TR/webauthn-2/#sctn-public-key-easy
If you want to check attestation you still need to parse CBOR (and whatever attestation format is inside.)
However, only Chrome seems to implement the L2 spec so far. WebAuthn feels basically abandoned on Mozilla's side of things: they still haven't finished implementing L1 (it's missing all the CTAP2 stuff) even though it has been out for more than a year, and there have barely been any WebAuthn-related commits in recent years.
But yeah, in general WebAuthn is a design-by-committee dumpster fire, unfortunately. U2F was conceptually way simpler.
FIDO1 was mostly just a pile of bytes. Sadly, some silly ASN.1 encoding for signatures was used that treated the big numbers as numbers instead of byte strings, causing a variable length encoding, but this wasn’t a big deal. If you wanted attestation, you had to deal with X.509, but that’s par for the course. Other than that, the spec was delightfully boring.
Can I get a simple explanation? Who is this person? What is JOSE? What is COSE? What is JWS? What is JWT? Does "crypto" here stand for "cryptography" or "cryptocurrency"? Why is this person angry?
In inverse order:
- Some people do not like/use the standards he co-wrote
- Cryptography
- JWT is a standard for giving someone a token for them to later do something on your service (and signing them so you can be sure the token's all right when it comes back to you) -- you get a demo at https://jwt.io/
- JWS is the part of JOSE that deals with signing (JWE is its encryption counterpart)
- COSE is a CBOR-based reworking of JOSE aimed at constrained devices, meant to fix some of its footguns
- JOSE is the framework for how to use JWT, JWS and related standards, which has some obvious footguns
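To make the token format concrete: a JWT in compact serialization is three base64url segments joined by dots (header.payload.signature). This sketch splits and decodes the well-known example token from jwt.io; signature checking is omitted, and a decoded token must never be trusted without verifying it.

```python
import base64
import json

def b64url_decode(part: str) -> bytes:
    # base64url without padding; restore the padding before decoding
    return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))

token = ("eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9."
         "eyJzdWIiOiIxMjM0NTY3ODkwIn0."
         "sig-goes-here")

header_b64, payload_b64, _sig = token.split(".")
header = json.loads(b64url_decode(header_b64))    # e.g. {"alg": "HS256", "typ": "JWT"}
payload = json.loads(b64url_decode(payload_b64))  # e.g. {"sub": "1234567890"}
```

The `alg` header is one of the famous footguns: a verifier that lets the attacker-controlled token pick the algorithm (including "none") is broken by design.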
While I can't comment on the specific case or tech, and there are better examples than the one mentioned here, the practice of companies trying to insert themselves into the building blocks of standards as a rent-seeking strategy, especially when it comes to signing things, must be absolutely annoying.
I've been drawn into discussions about alternative security techs like white-box cryptography, physically unclonable functions (PUFs), a variety of novel payments and data privacy technologies, and, perhaps ironically, some very uniquely sensible applications of blockchain-based protocols compared to those other things.
I think we could put a lot of the hustles to bed if, from a product and investment perspective, we used the razor that betting you can outsmart your customers or market and seize some rent-collecting position in it by becoming "the standard" is a poor product strategy. Having your technology mandated isn't a product, since a standard isn't anything someone wants; it's what they must use. This makes it an anti-product.
I have seen it with a certain archetype (not referencing parties here) who think they have companies and products because they have a proof of some assertion, pedigree, and credentials, and therefore you must invest with or buy from them because they are right. To decline is to not be aligned with their beliefs, which raises the question of where, specifically, your PhD in the field is from and whether you are even qualified to decline their offer. The implication is that you're stupid, and you should give them your money and thank them for not exposing your ignorance. "Fund this or risk humiliation" must work in institutions, but it's not a product, which means it's not going to get traction, and without that user traction it's not going to register as a candidate for a standards track.
In some very ancient words, "nothing forced is beautiful." Not referencing the parties to the original discussion specifically, but I wanted to say there is a sympathetic case to be made for standards bodies, given the tension between them and the chancers using all kinds of institutional shenanigans to have a go at them, and perhaps this underlying dynamic is the source of the frustration that bubbles up and was expressed in the post.
Er... no? Except maybe if you are part of a specific cryptocurrency "bubble", but I'd say that for the typical HN reader, "crypto" still means cryptography in general, not its specific blockchain-related applications.
If that were true for the "typical HN reader", then most articles submitted with "crypto" in the title would be referring to cryptography, not cryptocurrency.
Well yeah, as we all know the title of a submitted article has to be left unchanged, and due to the cryptocurrency hype most articles referring to "crypto" will be actually cryptocurrency articles. But I still trust the "typical HN user" to be able to tell the two apart using just the title. And if anyone should stop using the word "crypto", it's the cryptocurrency guys, not the cryptography guys mumble mumble
https://github.com/notaryproject/notaryproject/pull/93