Congrats on the INT overflow!

So we crushed INT, and the new target is BIGINT. Just a small increase from about 2.1 billion rows to about 9.2 quintillion rows. Challenge accepted?
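
For anyone counting, here's what those two ceilings actually are, assuming standard signed 32-bit and 64-bit integer columns as in MySQL/MariaDB (a quick illustrative query, not anything from Let's Encrypt's actual systems):

```sql
-- Signed integer ceilings in MySQL/MariaDB:
SELECT 2147483647          AS int_max,     -- 2^31 - 1, ~2.1 billion
       9223372036854775807 AS bigint_max;  -- 2^63 - 1, ~9.2 quintillion
```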

#HugOps to all the folks behind the scenes who busted their bums getting this incident resolved. You're all awesome.

19 Likes

The good news: 2 billion Let's Encrypt certificates have been created :+1:

4 Likes

I wonder who that last one was… the 2 billionth cert?
[the one that consumed the last INT, leaving nothing for the rest of us]

1 Like

This could make a good trivia question: “What does Let’s Encrypt have in common with ‘Gangnam Style’?” (YouTube famously had to move its view counter to a 64-bit integer in 2014 when that video approached the signed 32-bit maximum.)

6 Likes

I don’t think we’re at 2B certs yet. I’m not sure what each row corresponds to, but here’s the query I use to see how many certificates Let’s Encrypt has issued: https://censys.io/certificates?q=parsed.issuer.organization.raw%3A+“Let’s+Encrypt”+and+NOT+tags.raw%3A+“precert”

6 Likes

Hehehe. Thanks for the kind note @rmbolger :slight_smile: We were already using BIGINT on the most important tables (certificates, etc.), but we’d missed one table that tracks names we’ve issued for (used principally for statistics), which still had an INT ID column :sob:
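
For anyone curious how you might hunt for stragglers like that, here's a rough sketch of a schema audit, assuming MySQL/MariaDB (this is illustrative, not Let's Encrypt's actual tooling):

```sql
-- Sketch: find AUTO_INCREMENT columns typed as INT and report how close
-- the next ID is to the signed-INT ceiling (assumes MySQL/MariaDB).
SELECT t.TABLE_SCHEMA,
       t.TABLE_NAME,
       c.COLUMN_NAME,
       c.COLUMN_TYPE,
       t.AUTO_INCREMENT AS next_id,
       ROUND(t.AUTO_INCREMENT / 2147483647 * 100, 2) AS pct_of_int_max
FROM information_schema.TABLES  t
JOIN information_schema.COLUMNS c
  ON  c.TABLE_SCHEMA = t.TABLE_SCHEMA
  AND c.TABLE_NAME   = t.TABLE_NAME
WHERE c.EXTRA LIKE '%auto_increment%'
  AND c.DATA_TYPE = 'int'
  AND t.AUTO_INCREMENT IS NOT NULL
ORDER BY pct_of_int_max DESC;
```

The eventual fix is the obvious `ALTER TABLE … MODIFY` to BIGINT, though on a table with ~2 billion rows that's anything but a quick operation.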

4 Likes

@jple is correct: We haven’t issued 2B certificates (yet). The affected table has a row for every subjectAltName, so there are more rows than certificates. But we’re on our way. :slight_smile: Thanks for the kind words!
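
To make the row-per-SAN point concrete, here's a hypothetical sketch; the table and column names are made up for illustration and aren't Boulder's actual schema:

```sql
-- Hypothetical illustration: one row per subjectAltName, so a certificate
-- covering example.com and www.example.com contributes two rows, which is
-- why this table hit 2^31 rows well before 2 billion certificates existed.
CREATE TABLE issued_names (
  id        BIGINT       NOT NULL AUTO_INCREMENT PRIMARY KEY,  -- formerly INT
  name      VARCHAR(255) NOT NULL,  -- one SAN per row
  issued_at DATETIME     NOT NULL
);
```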

6 Likes

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.