KeyUsage encoding

My question is about encoding, but I'm including it here because there is some praise to be shared. As a total newbie to the whole Certificates world, I emailed support and JC Jones redirected me here. Instead of chastising me for barking up the wrong tree at 4.30 AM (I guess it's Pacific Time), he was exceedingly supportive and encouraging. @jcjones, you're a gentleman!

Here goes the transcript of my email:
---//---
Dear sirs,

Intro

I'm writing to you because I've been using your article A Warm Welcome to ASN.1 and DER as the backbone of my initial journey into Digital Certificates. From my position of almost three decades as a teacher / technical trainer, I highly commend you for the clarity and ease of language in this article. That's a fine balance there.

I'm also using this address because there is no mention of an author in the article. I like the "collective" vibe, but I apologise in advance for possibly abusing this mailbox.

Scenario

I'm using Python's asn1crypto to decode / encode certificates. As a test, I'm decoding and re-encoding all 148 certificates I could find on my Ubuntu 24.04 system. Only two of them are not being re-encoded exactly as per the original:

/usr/share/ca-certificates/mozilla/Trustwave_Global_ECC_P256_Certification_Authority.crt
/usr/share/ca-certificates/mozilla/Trustwave_Global_ECC_P384_Certification_Authority.crt
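
Here's a minimal sketch of that round-trip test, assuming the .crt files are PEM-armored (the helper name is mine):

from asn1crypto import pem, x509

def roundtrip_matches(path):
    # Read the PEM file and strip the armor to recover the original DER.
    with open(path, "rb") as f:
        _, _, der = pem.unarmor(f.read())
    cert = x509.Certificate.load(der)
    # force=True makes asn1crypto re-encode from the parsed values
    # instead of returning the cached original bytes.
    return cert.dump(force=True) == der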

In both cases, the DER encoding of the original certificate differs from the one I generate. Here's the relevant part in clear text (identical for both), from openssl:

$ openssl x509 -noout -text -in /usr/share/ca-certificates/mozilla/Trustwave_Global_ECC_P256_Certification_Authority.crt
            X509v3 Key Usage: critical
                Certificate Sign, CRL Sign

These are bits 5 and 6 from the KeyUsage BIT STRING.
The original tree is (from ASN.1 JavaScript decoder):

Extension SEQUENCE @479+15 (constructed): (3 elem)
  extnID OBJECT_IDENTIFIER @481+3: 2.5.29.15|keyUsage|X.509 extension
  critical BOOLEAN @486+1: true
  extnValue OCTET_STRING @489+5 (encapsulates): (5 byte)|0303070600
    BIT_STRING @491+3: (9 bit)|000001100

and the bytes for the BIT STRING (NINE BITS) are:

03 03 07 06 00

Upon reencoding, I get:

Extension SEQUENCE @479+14 (constructed): (3 elem)
  extnID OBJECT_IDENTIFIER @481+3: 2.5.29.15|keyUsage|X.509 extension
  critical BOOLEAN @486+1: true
  extnValue OCTET_STRING @489+4 (encapsulates): (4 byte)|03020106
    BIT_STRING @491+2: (7 bit)|0000011

and the bytes for the BIT STRING (SEVEN BITS) are:

03 02 01 06
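
For reference, here's a tiny hand-decoder that makes the byte layout of both encodings explicit (plain Python, no libraries; the function name is mine):

def decode_bit_string(raw):
    # DER BIT STRING: tag byte, length byte, unused-bit count, bit bytes.
    tag, length, unused = raw[0], raw[1], raw[2]
    assert tag == 0x03
    content = raw[3:2 + length]   # length covers the unused-bit byte too
    n_bits = len(content) * 8 - unused
    return n_bits, "".join(f"{b:08b}" for b in content)[:n_bits]

print(decode_bit_string(bytes.fromhex("0303070600")))  # (9, '000001100')
print(decode_bit_string(bytes.fromhex("03020106")))    # (7, '0000011')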

In your article, on BIT STRING encoding, you mention very clearly that all bits should be accounted for:

A BIT STRING of N bits is encoded as N/8 bytes (rounded up)

However, what I'm getting is that since the trailing bits (bits 7 and 8, encipherOnly and decipherOnly, the last two of the nine) are zero, they're simply being dropped.
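
Worked out, the rule fits both encodings: 9 bits need ceil(9/8) = 2 content bytes (06 00), leaving 16 - 9 = 7 unused bits, hence 03 03 07 06 00; 7 bits need ceil(7/8) = 1 content byte (06), leaving 8 - 7 = 1 unused bit, hence 03 02 01 06.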

Question (*NOT* an asn1crypto.py issue!)

This is where you politely invite me to take it up with asn1crypto's devs. Except that I have 99 other certificates with the exact same openssl output, and all 99 of them encode KeyUsage exactly like my generated certificates. One of these is:

/usr/share/ca-certificates/mozilla/ACCVRAIZ1.crt

Its bytes for the BIT STRING (SEVEN BITS) are:

03 02 01 06

Question: what is the correct way to encode KeyUsage when the trailing bits, encipherOnly (bit 7) and decipherOnly (bit 8), are zero?

Again, my apologies if this is totally the wrong place to ask. If that's the case, I beg you to point me in the right direction.

Best regards
Ricardo
(Digital Certificate transparent belt)

---//---


My suggestion is to set up a GitHub repo with a reproducible test case:

  • a cert you see this issue with
  • a cert that doesn't trigger this issue
  • a Python script that covers both the good cert and the bad cert

If you can share that here, most people will be happy to take a look and see if anything jumps out as incorrect. A few of us are Python developers and can try to debug it to see if this is worth escalating to the asn1crypto project.

If your code is working on 99 certificates and failing on two, there might be some bug in the underlying code that is somehow being triggered. Stuff like that happens often.


Hi Ricardo,

@aarongable wrote up an explainer for the encoding of those two certificates a couple of years ago on mozilla-dev-security-policy: https://groups.google.com/a/mozilla.org/g/dev-security-policy/c/EKAIB01lvlo/m/OJ10fvGMAwAJ

The short of it is that these certificates don't follow the Distinguished Encoding Rules, but since they're roots, they are rarely parsed.

As noted in the last message, ZLint now catches this encoding error [1][2] so future certs (including roots) won't have this problem. But as-is, they were accepted into the various root programs and exist in this form until they're removed.
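
For anyone who wants to see the DER-correct form directly, here's a quick sketch with asn1crypto, whose KeyUsage type is a named BIT STRING that trims trailing zero bits when encoding (that trimming is exactly what the original post observed):

from asn1crypto import x509

ku = x509.KeyUsage({'key_cert_sign', 'crl_sign'})
print(ku.dump().hex())  # expect 03020106: 7 bits, trailing zeros dropped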

(There used to be all sorts of incorrectly-encoded roots. Not to mention intermediates and end-entities. It's getting much better!)


It's fun how the Let's Encrypt forum is sometimes also the "everything you always wanted to know about PKI (but were afraid to ask)" venue. :smile_cat:


Dear friends and neighbours,

Thank you all so much for the assistance, both here and via email.

@griffin, I'm feeling very welcome, and that's the most PHP I've seen since I left it back in 2011.

@jvanasco, I would, but I believe my question has been answered. Still, if I find the time, I will do what you asked. Maybe it will help someone else in the future.

@schoen, if there is a more proper venue for this kind of question, please let me know. As I mentioned, I had to post this under "Praise", which feels a bit like cheating. In any case, I am never afraid to ask. :wink:

Finally, @jcjones, your repost of @aarongable's explainer kind of closed it for me. My takeaways are:

  1. those two certificates are ill-encoded; there's a gigantic "But WHY?!?" attached, but that's for Mozilla to answer;
  2. whenever KeyUsage has trailing zero bits, DER requires them to be dropped, so here it occupies only 7 bits;
  3. all my tests are passing;
  4. I'm feeling encouraged to write my own encoder.

Number 4 is a newbie thing and will pass as soon as I'm done climbing that Dunning–Kruger "Mount Stupid".
