Public beta rate limits


@jcjones Is there any API call to list all certificates issued for the registration?


This does not really help.

When I start a new project, I need a certificate that includes the new subdomain. That means a new certificate either way, whether I add the name to the common certificate or request a certificate just for the new subdomain. So I would have to request a new cert for the new subdomain now, and at the next renewal remember to add the new subdomain to the common certificate.

But neither helps when I hit the API limit and now want to add a new subdomain (either as a single cert or as a new common cert with one name more), which will only work after 60 days. Effectively, my new project is available only unencrypted for 60 days.

And on the other hand, it may not be desirable to have all subdomains on the same cert.

It is also impractical for configuring the subdomains independently.
For example, I use a webserver configuration where I can do `touch sites/newdomain` and everything needed (webroot, log directory, server configuration) is created when the webserver reloads. I thought about adding “request a Let's Encrypt certificate” to that step, but that will not work with the limit.

I see the point of some limit, but I think it should be more like 50-100. Anyone who needs more than 100 subdomains may want to buy a wildcard certificate somewhere anyway, but a moderate number of subdomains should be possible. So better a global limit than a certificates-per-domain limit: it should not make any difference whether I request certificates for 100 domains or for 100 subdomains.


Re-reading further down: @jcjones made a correction to the wording and a clarification at Public beta rate limits. It’s not per domain but certificates per domain, meaning you can issue a single cert for each of 50 subdomains, giving you 50 SSL certs, or one big 50-name multi-domain SAN cert, which counts as 1 certificate. So right now you’re limited to reissuing those 50 certs, or that one big 50-name SAN cert, 5 times per 7 days.


Reading @jcjones’ post, I don’t think the first part of this is correct. 50 single-domain certs, all for subdomains of the same root, would count as 50 certificates and exceed the current limits. I think your second statement is accurate.
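The two counting interpretations can be pictured in a short sketch. The limit of 5 per registered domain per 7 days is taken from this thread; the function names and the toy two-label notion of “registered domain” are made up for illustration:

```python
from collections import Counter

LIMIT_PER_DOMAIN = 5  # certificates per registered domain per 7 days

def registered_domain(name):
    """Toy version: treat the last two labels as the registered domain."""
    return ".".join(name.split(".")[-2:])

def certs_counted(issued_certs):
    """Each certificate counts once per registered domain it contains,
    no matter how many names (SANs) are on it."""
    counts = Counter()
    for san_list in issued_certs:
        for d in {registered_domain(n) for n in san_list}:
            counts[d] += 1
    return counts

# 50 single-name certs for subdomains of example.com: 50 against the limit.
singles = [["sub%d.example.com" % i] for i in range(50)]
print(certs_counted(singles)["example.com"])  # 50

# One 50-name SAN cert: counts as 1.
big = [["sub%d.example.com" % i for i in range(50)]]
print(certs_counted(big)["example.com"])  # 1
```

This matches the correction above: the per-SAN variant burns through the limit immediately, while the single big SAN cert only counts once.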


I guess it depends on whether you define “domain name” as including subdomains? @jcjones, maybe some clarification :slight_smile:


I read the same stuff.

What I REALLY want: 32 subdomain certificates, with www.sub.domain and sub.domain on each of them (or only www for the root domain; I currently have www as a SAN, but do not use such URLs).
(I want to separate most of my stuff cleanly.)

What currently would work: 32 names divided over up to 5 certificates, planning all names ahead before requesting the 5th certificate.
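Dividing the 32 names over at most 5 certificates is just a chunking problem. A sketch, with illustrative made-up names:

```python
def split_into_certs(names, max_certs=5):
    """Divide names as evenly as possible over at most max_certs SAN certs."""
    per_cert = -(-len(names) // max_certs)  # ceiling division
    return [names[i:i + per_cert] for i in range(0, len(names), per_cert)]

names = ["sub%d.example.com" % i for i in range(32)]
certs = split_into_certs(names)
print(len(certs))                  # 5 certificates
print([len(c) for c in certs])     # [7, 7, 7, 7, 4]
```

The catch, as described above, is that this only works if all 32 names are known up front; any name added after the 5th certificate has to wait out the window.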

What can cause problems:

  • Start with one domain, request a certificate.
  • Build some stuff (add mail, ownCloud, …) and incrementally add subdomains for it.
  • You either need to request a new “subX.domain” certificate for each, or a new “domain, sub1.domain, sub2.domain, …, subX.domain” certificate covering them all.
  • When requesting the certificate for sub5.domain, or a certificate with the main domain and the 5 subdomains, you hit the API limit.
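The incremental scenario above can be simulated with a sliding window. The 5-per-7-days numbers come from this thread; everything else (issuing one cert per day, the helper name) is a made-up illustration:

```python
LIMIT, WINDOW_DAYS = 5, 7

def may_issue(issue_days, today):
    """True if fewer than LIMIT certs were issued in the last WINDOW_DAYS."""
    recent = [d for d in issue_days if today - d < WINDOW_DAYS]
    return len(recent) < LIMIT

issued = []
for day in range(6):  # one new subdomain cert per day
    if may_issue(issued, day):
        issued.append(day)
    else:
        print("day %d: rate limited, retry on day %d" % (day, issued[0] + WINDOW_DAYS))
```

The 6th request (day 5) is refused, and the new subdomain stays uncovered until day 7, when the first issuance falls out of the window.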

What you can do: register more second-level domains and use them.

The resource usage for LE stays the same whether I request 100 second-level domain certificates or 100 third-level ones. So a global limit would be more useful to limit the resource usage (I guess mostly OCSP) for LE.


Having so many subdomains increases the number of roundtrips in the SSL/TLS handshake, since the certificate size grows with each subdomain. In that case it’s much better to have at most a handful of names.


I want to obtain a certificate for a rented server. Since the server is part of a pool of one of the big German hosters, the hostname ends with “”. When trying to get a certificate, I get this message: “There were too many requests of a given type :: Error creating new cert :: Too many certificates already issued for:”.

Are there any plans to lift the limit for hand-picked “pool host names”?


I’d very much appreciate a significant increase of that limit. I created a few certificates for a domain yesterday. Then I saw I had forgotten to add one other name to my SMTP cert, tried to fix it by creating a new one, and got the message that I’m not allowed to.

I understand there have to be limits, especially this early. But 5 per domain per 7 days makes Let’s Encrypt unusable. I think most people could live with, for example, 25 per domain per day (not the ones on shared domains, of course). Even if you make a mistake, you can fix it the next day and don’t have to wait a week. Or 5 per domain per some hours.

What I also don’t understand: if I create 5 certs, wait a week and then generate the next 5, how am I supposed to renew/regenerate the first ones? Wouldn’t that also fail?


You can ask Strato to submit a PR to the Public Suffix List; then each hostname would be counted as a separate domain.
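A sketch of why a Public Suffix List entry changes the counting. A tiny hardcoded suffix set stands in for the real list, and `hoster.de` is a hypothetical stand-in for the hoster's pool domain:

```python
# Toy stand-in for the Public Suffix List; the real one is much larger.
# "hoster.de" is a hypothetical hosting pool, not a real operator.
public_suffixes = {"de", "com"}

def registered_domain(hostname):
    """Return the name the per-domain rate limit is counted against:
    one label more than the longest matching public suffix."""
    labels = hostname.split(".")
    for i in range(1, len(labels)):
        if ".".join(labels[i:]) in public_suffixes:
            return ".".join(labels[i - 1:])
    return hostname

# Without a PSL entry, every pool host shares one rate-limit bucket:
print(registered_domain("h123.hoster.de"))  # hoster.de

# After the hoster's PR is merged, each hostname gets its own bucket:
public_suffixes.add("hoster.de")
print(registered_domain("h123.hoster.de"))  # h123.hoster.de
```

In other words, the PSL entry moves the rate-limit bucket one label down, so the customers in the pool no longer compete for the same 5 certificates.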


Thx, I’ll send a mail to Strato’s customer support. Hopefully they will submit a PR, but I wouldn’t bet on it.



The right fix for this would be for the DDNS providers to add themselves to the Public Suffix List.
Actually, it is highly surprising they haven’t already done so, given the security implications: for instance, by default would be able to read/set cookies used by

Ah, jhass was faster.


Is it 5 per 7 days? That could (with some otherwise useless effort to pick the right days) help for a moderate number of domains. I thought it was 5 per domain per 60 days, with up to 100 names on a cert.

I still vote for a global limit. Give us 200 certs per 60 days; whether they are for the same domain, for subdomains, or for 200 different domains should not matter.
And grant some bonus to first-time domains, so that a few failed attempts are allowed. Renewal then needs fewer certs for people who, for example, forgot the www subdomain on their first try at a certificate.


I agree with @allo. Today I started using Let’s Encrypt, and after a few tests to really understand how it works, I reached the limit of 5 per domain … As a result, no certificate that matches my needs and no more tries for … 7 days!
This limitation really needs to be extended/changed, so we can really “beta test”.


It should probably be better documented, but if you’re testing, you should use the staging server for issuing certificates, so you don’t run into the limits when you want to get something publicly trusted.


I was bitten by the 5 certificates per domain per 7 days rate limit too while testing.
At Sovereign, we want to use Let’s Encrypt certs and completely drop self-signed certs.
Everything is provisioned automatically through Ansible, including the certs, and we’d like to issue one per service because it is much cleaner and more isolated (from a provisioning standpoint).
We already have more than 5 subdomains in a full provision, so the rate limit will kick in during provisioning and fail.

Are there plans to drop or increase the limit?


Probably the question is not if but when :smile:

Hopefully the LE folks give us a nice holiday gift and raise the limit, e.g. to 100 per domain per 7 days :slight_smile:


It would be nice to have a word from someone official. These rate limits are too limiting.

For example, I use Docker to deploy my websites automatically, and sometimes I need to restart the containers and therefore renew the certificates… which I can’t do anymore, since I have already issued too many.

That means I need to revert my websites from HTTPS to HTTP… and this switch becomes really, really painful when servers use HSTS (because it prevents users from visiting a website that supported HTTPS in the past).


I just got rate limited, and this is purely from an hour’s tweaking of an existing configuration: adding a few (1-3) subdomains to an existing hostname and trying to reissue. I understand the instinct towards an overabundance of caution in the beta stage, but having to wait a week to change certs is a bit excessive.


Another vote for increasing the rate limit. Even raising it a little, to say 20, would go a long way.