Rate limits and account creation for 100K domains on Caddy


We manage 100K domains, all of them our customers' domains.
Until now we have sent all requests under the same account ID, but from time to time we hit the rate limit (300 new orders per account per 3 hours).

I just saw on Let's Encrypt's rate limit page (Rate Limits - Let's Encrypt) that I can also create 10 accounts per 3 hours, or up to 500 accounts per 3 hours if the requests come from different servers.
So I think a good solution would be to split my customers into groups of accounts, so that I never hit the rate limit again.
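A minimal sketch of the grouping idea, assuming a deterministic hash so that every server agrees which account handles which domain (the bucket count of 10 here is a made-up illustration, not a recommended number):

```python
import hashlib

NUM_ACCOUNTS = 10  # hypothetical number of ACME accounts


def account_for(domain: str) -> int:
    """Deterministically map a domain to one of NUM_ACCOUNTS buckets.

    Using a stable hash (not Python's randomized hash()) means every
    server computes the same bucket for the same domain, so a given
    customer's certificate is always requested under the same account.
    """
    digest = hashlib.sha256(domain.lower().encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % NUM_ACCOUNTS
```

The same domain always lands in the same bucket, so renewals stay on the account that issued the original certificate.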

We're using CaddyServer, and this is what I'm proposing to Matt:

By the way, I know about the rate limit adjustment form (https://isrg.formstack.com/forms/rate_limit_adjustment_request), but over the past two years I've filled it out four times and never received a response. So I get the impression it's not something they focus on.
And if they think I can solve my issues within the existing rate limits, maybe I don't need a rate increase at all.

Anyway, I would like to hear what you guys think about this.


We always reply to these when an email address is given, even if it's sometimes a template reply without details, and typically assess them within a few weeks. You might want to check your spam/junk folder.


I'm a little bit confused.
You manage 100K domains/certs.
And you use:

  • a single system
  • a single IP
  • a single LE account

Talk about putting ALL your eggs into one very small basket!
IMHO, you should have:

  • multiple/redundant systems
  • multiple IPs
  • multiple LE accounts
  • multiple accounts with other free CAs as well
  1. We manage it with 20 servers (with the option to scale when needed) that all share the same Caddy config.
  2. Yes, we use one LE account, because that is how CaddyServer works, and it is also Let's Encrypt's recommendation.
  3. We use other CAs, such as ZeroSSL and Google (a newer option), but they all have rate limits too, or respond very slowly after too many requests.

Our system has been working amazingly well for years. However, we are currently facing an issue with the rate limit imposed by Let's Encrypt. I am exploring the possibility of resolving this with changes to CaddyServer, or alternatively, it would be great if Let's Encrypt could offer an option to pay for a higher rate, as I have suggested previously.


We would prefer that you request a rate limit exemption instead of trying to work around rate limits.

As my coworker has already said, we look at rate limit requests every week and our software sends a response.

If you have a registration ID you’ve used for a past request and an approximate date, I can look into what happened with your previous requests.


Just so it's clear for me: Do you consider rate limits as speed limits (i.e. it's OK to go up to the speed limit but not more), or only safety nets to protect against abuse/buggy code (i.e. even getting close to the limit is bad)?


Both: we set our rate limits to protect ourselves, to minimize the chances that a bad client (either buggy or malicious) could cause us to fall over. But rate limits have to exist in reality, and therefore aren't perfect -- if every client started hitting, but not exceeding, the new-orders rate limit all at the same time, we'd probably start throwing errors even though technically everyone is within the limits. So while we have the technical ability to enforce the "letter of the law", and have multiple layers of limits (e.g. the new-accounts-per-IP limit) to discourage clients (again, either buggy or malicious) from circumventing our limits, we also ask that site operators respect the "spirit of the law".

It is definitely okay for a client to hit a rate limit. In fact, hitting a limit and then backing off from there is my preferred strategy, so that the client doesn't have to have a priori knowledge of what the limits are. But that's in the control of client authors, not site operators, so it's a different concern.
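The hit-then-back-off strategy described above could be sketched roughly like this (the `submit_order` callable, the exception type, and the delay values are placeholders for illustration, not part of any real ACME client):

```python
import time


class RateLimited(Exception):
    """Raised by a hypothetical ACME client when the CA responds with
    urn:ietf:params:acme:error:rateLimited."""


def order_with_backoff(submit_order, max_retries=5, base_delay=1.0):
    """Retry an order, doubling the wait after each rate-limit hit.

    The client needs no a-priori knowledge of the server's limits:
    it simply tries, and backs off whenever the server says no.
    """
    for attempt in range(max_retries):
        try:
            return submit_order()
        except RateLimited:
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
    raise RuntimeError("still rate-limited after %d attempts" % max_retries)
```

A real client would also honor any Retry-After header the server sends rather than guessing delays.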


@noamway I'm also interested in the topic of large-scale renewal management. Based on the current LE rate limits, it should be possible to spread renewals of 100K certificates without hitting the limits, because theoretically the ceiling is 100 orders/hour * 24 hours * 90 days (so 216K) if renewals are batched appropriately. If you never let certs get beyond 75% of their lifetime (67.5 days), your ceiling is 162K.
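The arithmetic above, made explicit (numbers assume the 300-new-orders-per-3-hours limit and a 90-day certificate lifetime):

```python
ORDERS_PER_HOUR = 300 / 3   # LE: 300 new orders per account per 3 hours
LIFETIME_DAYS = 90
RENEW_AT = 0.75             # renew at 75% of lifetime, i.e. 67.5 days

# Ceiling if renewals use the full 90-day lifetime:
max_certs_full_lifetime = ORDERS_PER_HOUR * 24 * LIFETIME_DAYS            # 216,000

# Ceiling if every cert is renewed by 75% of its lifetime:
max_certs_renew_early = ORDERS_PER_HOUR * 24 * LIFETIME_DAYS * RENEW_AT   # 162,000

# Renewals per hour needed to spread a 100K-cert fleet over 67.5 days:
fleet = 100_000
renewals_per_hour = fleet / (24 * LIFETIME_DAYS * RENEW_AT)  # ~61.7, under the 100/hour budget
```

So on paper a 100K fleet fits comfortably under a single account's limit, provided the renewals are spread evenly rather than clustered.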

One concept I've been looking at including in Certify The Web is renewal spreading, where the planned renewals are periodically re-scheduled relative to each other instead of relying solely on certificate lifetime. Clients that implement ARI already include some concept of targeted renewal dates/times.

One option you could consider is proxying the ACME API so that you control your own internal rate limits (e.g. 100 overall orders per hour). That way you can stop your installs from hitting the real LE rate limits, and it would also give you more observability into the rate of calls.
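A sketch of that internal-rate-limit idea: a sliding-window counter that a proxy could consult before forwarding a new-order request to the real ACME endpoint. The 100-per-hour budget is the example figure from above, chosen to stay under LE's 300-per-3-hours limit; the class and its API are invented for illustration:

```python
import time
from collections import deque


class SlidingWindowLimiter:
    """Allow at most `limit` events per `window` seconds, fleet-wide."""

    def __init__(self, limit=100, window=3600.0):
        self.limit = limit
        self.window = window
        self.events = deque()  # timestamps of recently allowed orders

    def allow(self, now=None):
        """Return True and record the event if under budget, else False."""
        now = time.monotonic() if now is None else now
        # Drop events that have aged out of the window.
        while self.events and now - self.events[0] >= self.window:
            self.events.popleft()
        if len(self.events) < self.limit:
            self.events.append(now)
            return True
        return False  # caller should queue/retry, not hit the real CA
```

Each install would ask this shared limiter (e.g. behind a small HTTP service) before creating an order; denied requests go back into a queue, so the real CA never sees the burst.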


Maybe I'm just hard-of-reading... but I can't tell whether renewals are or are not counted in the 300 new orders per account per 3 hours limit. It's just not clear to me.


Thanks; makes sense.

I am just considering whether a respectful client that carefully registers multiple ACME accounts and spreads the new orders across them could get more certificates in a shorter timeframe, without pushing aggressively up against the rate limits. It could still back off and settle into the rate accepted by the server. As opposed to "working around" rate limits, we would simply be conforming to them in a respectful manner.


I hope these servers only handle ACME operations/traffic and are otherwise sufficiently firewalled away from the public internet.

My concern isn't about your choice of server, but the deployment strategy. I'm reading what you shared as 20 different opportunities for malicious actors to compromise a single account key.

Most large integrations I know of have either developed a custom client, or do aggressive traffic shaping/routing to securely leverage existing clients.

Every now and then someone shares an anti-pattern in large-scale deployments here.


Registering multiple accounts, all of which represent the same entity, so that you can get higher issuance rates than would be allowed for a single account, is definitely "working around" rate limits.

There may be circumstances where that's the right thing to do! We are aware of several large integrators who have and use multiple accounts. I'm not going to tell someone to never do that. But it's not "conforming to rate limits in a respectful manner", it's taking advantage of rate limit implementation details to get higher request volumes than would naively be allowed.


What are the circumstances then? What defines an "entity"?


What defines an "entity"?

I can't presume to guess Let's Encrypt's position, but a strawman proposal that makes sense to me:

  • Start with "entity" is the legal entity (company/organization/individual) that owns the servers
  • If the organization is large enough that the above definition is impractical, the entity could potentially be as small as a single independently run product line or service

The major assumption I'm making is: even if you have more than one thing running on a (cluster of) server(s), the fact that the things share computing infrastructure implies they can share an ACME account too.

What are the circumstances then?

I suspect there are no known circumstances that are compelling enough to mention in a public setting.

Exceptions, of course, may exist, but as someone who has had to deal with similar asks, I'd suggest it is up to the site operator to make the case that multiple accounts are needed, and it is not Let's Encrypt's responsibility to help make the case for them.

I've personally ended up allowing things as one-offs that would kill our service if they were generally used, but I quickly learned to never mention them, because even mentioning the one-off sets an implied precedent, and encourages exactly this sort of probing follow-up question that has no safe answer.

In the end, do feel free to ignore this message though, as it is just the opinion of a passer-by.
