My question is about encoding, but I'm including it here because there is some praise to be shared. As a total newbie to the whole Certificates world, I emailed support and JC Jones redirected me here. Instead of chastising me for barking up the wrong tree at 4:30 AM (Pacific Time, I guess), he was exceedingly supportive and encouraging. @jcjones, you're a gentleman!
Here goes the transcript of my email:
---//---
Dear sirs,
Intro
I'm writing to you because I've been using your article A Warm Welcome to ASN.1 and DER as the backbone of my initial journey into Digital Certificates. Speaking from almost three decades as a teacher / technical trainer, I highly commend you for the clarity and ease of language in this article. That's a fine balance to strike.
I'm also using this address because there is no mention of an author in the article. I like the "collective" vibe, but I apologise in advance if I end up abusing this mailbox.
Scenario
I'm using Python's asn1crypto to decode / encode certificates. As a test, I'm round-tripping all 148 certificates I could find on my Linux 24.04 system. Only two of them are not re-encoded exactly as per the original:
/usr/share/ca-certificates/mozilla/Trustwave_Global_ECC_P256_Certification_Authority.crt
/usr/share/ca-certificates/mozilla/Trustwave_Global_ECC_P384_Certification_Authority.crt
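My test loop is roughly the following (a sketch of what I'm doing; as far as I understand, force=True makes asn1crypto re-encode instead of returning its cached original bytes):

```python
from pathlib import Path
from asn1crypto import pem, x509

# Round-trip every CA certificate: unarmor PEM, parse, re-encode, compare.
mismatches = []
for path in Path('/usr/share/ca-certificates/mozilla').glob('*.crt'):
    _, _, der = pem.unarmor(path.read_bytes())
    cert = x509.Certificate.load(der)
    if cert.dump(force=True) != der:   # force=True: actually re-encode
        mismatches.append(path.name)

print(mismatches)  # on my machine: the two Trustwave ECC certificates
```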
In both cases, the DER encoding of the original certificate differs from the one I generate. Here is the relevant part of both, in clear text, from openssl:
$ openssl x509 -noout -text -in /usr/share/ca-certificates/mozilla/Trustwave_Global_ECC_P256_Certification_Authority.crt
X509v3 Key Usage: critical
Certificate Sign, CRL Sign
These are bits 5 (keyCertSign) and 6 (cRLSign) of the KeyUsage BIT STRING.
The original tree is (from ASN.1 JavaScript decoder):
Extension SEQUENCE @479+15 (constructed): (3 elem)
extnID OBJECT_IDENTIFIER @481+3: 2.5.29.15|keyUsage|X.509 extension
critical BOOLEAN @486+1: true
extnValue OCTET_STRING @489+5 (encapsulates): (5 byte)|0303070600
BIT_STRING @491+3: (9 bit)|000001100
and the bytes for the BIT STRING (NINE BITS) are:
03 03 07 06 00
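For my own sanity, I decoded those bytes by hand (a minimal sketch; the comments are my interpretation of the layout):

```python
# Manual read of the original KeyUsage BIT STRING: 03 03 07 06 00
raw = bytes.fromhex('0303070600')

tag = raw[0]       # 0x03: BIT STRING
length = raw[1]    # 0x03: three content bytes follow
unused = raw[2]    # 0x07: seven unused (padding) bits in the last byte
content = raw[3:2 + length]            # 0x06 0x00

nbits = 8 * len(content) - unused      # 16 - 7 = 9 bits
bits = bin(int.from_bytes(content, 'big') >> unused)[2:].zfill(nbits)
print(nbits, bits)                     # 9 000001100 -> bits 5 and 6 set
```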
Upon re-encoding, I get:
Extension SEQUENCE @479+14 (constructed): (3 elem)
extnID OBJECT_IDENTIFIER @481+3: 2.5.29.15|keyUsage|X.509 extension
critical BOOLEAN @486+1: true
extnValue OCTET_STRING @489+4 (encapsulates): (4 byte)|03020106
BIT_STRING @491+2: (7 bit)|0000011
and the bytes for the BIT STRING (SEVEN BITS) are:
03 02 01 06
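I can reproduce this 7-bit form directly (a sketch; I believe asn1crypto's x509.KeyUsage accepts a set of named bits, so do correct me if I'm misusing it):

```python
from asn1crypto.x509 import KeyUsage

# Only keyCertSign (bit 5) and cRLSign (bit 6) are set.
ku = KeyUsage({'key_cert_sign', 'crl_sign'})
print(ku.dump().hex())  # 03020106 -> 7 bits, trailing zero bits gone
```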
In your article, on BIT STRING encoding, you state very clearly that all bits should be accounted for:
A BIT STRING of N bits is encoded as N/8 bytes (rounded up)
However, what I'm seeing is that the trailing zero bits (bits 7 and 8, that is, encipherOnly and decipherOnly) are simply being dropped.
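To illustrate what I think is happening, here is my own toy encoder for that behaviour (just a sketch, not asn1crypto's code; I'm assuming the trimming of trailing zero bits is deliberate for named-bit types):

```python
def der_named_bitstring(bits: str) -> bytes:
    """Encode a BIT STRING given as e.g. '000001100', dropping trailing zero bits."""
    bits = bits.rstrip('0')                      # drop trailing zero bits
    unused = -len(bits) % 8                      # padding bits in the final byte
    content = (int(bits, 2) << unused) if bits else 0
    nbytes = (len(bits) + unused) // 8
    body = bytes([unused]) + content.to_bytes(nbytes, 'big')
    return bytes([0x03, len(body)]) + body       # tag, then length, then body

print(der_named_bitstring('000001100').hex())    # 03020106 -> the form I generate
```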
Question (*NOT* an asn1crypto issue!)
This is the point where you politely invite me to take it up with asn1crypto's devs. Except that I have 99 other certificates with the exact same openssl output, and all of them (except the two aforementioned) encode KeyUsage exactly like my generated certificates. One of these is:
/usr/share/ca-certificates/mozilla/ACCVRAIZ1.crt
Its bytes for the BIT STRING (SEVEN BITS) are:
03 02 01 06
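For completeness, this is roughly how I pull the extension bytes out of each certificate (again a sketch; the path is from my machine):

```python
from asn1crypto import pem, x509

# Locate the keyUsage extension and dump its original BIT STRING bytes.
with open('/usr/share/ca-certificates/mozilla/ACCVRAIZ1.crt', 'rb') as f:
    _, _, der = pem.unarmor(f.read())

cert = x509.Certificate.load(der)
for ext in cert['tbs_certificate']['extensions']:
    if ext['extn_id'].native == 'key_usage':
        print(ext['extn_value'].parsed.dump().hex())  # 03020106
```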
Question: what is the correct way to encode KeyUsage when the trailing bits (here, encipherOnly and decipherOnly) are zero?
Again, my apologies if this is totally the wrong place to ask. If it is, I beg you to point me in the right direction.
Best regards
Ricardo
(Digital Certificate transparent belt)
---//---