Rate limiting at an educational institution

I’m trying to issue a certificate for a server running on my university’s network (Rochester Institute of Technology) and am running into the rate limiting issue, likely because of everyone on campus trying to do the same.

I know this is a consequence of the public beta and I think it’s a perfectly reasonable way to make sure people don’t take down the ACME server, but on a campus (and a tech school at that) with tens of thousands of students, having a limit of 5 certificates per week is pretty limiting. Would it be possible to raise this limit for situations like this where the root FQDN will not be unique across hundreds of people and machines?

Thank you!

Well, this is a little trickier. The base answer is the same as for, say, dynamic DNS providers: add your domain to the public suffix list. However, I don’t know whether that’s actually desirable in your case, since a few more implications come with it. For example, foo.rit.edu would be unable to set cookies readable by bar.rit.edu, which might very well break some of your systems.

I am hitting the same situation with another .edu domain. However, I don't think the suggested solution of adding the domain to the public suffix list is at all helpful. I don't have control over the second-level domain, and I seriously doubt that the university's central IT will see any advantage in adding their domain to the public suffix list (and, in fact, will have the perfectly reasonable objection that it isn't a "public suffix" at all).

I strongly recommend changing the rate-limit policy for all second-level domains under .edu. Otherwise, academic departments and personal servers for faculty, staff, and students won't have Let's Encrypt available at all.


freifunk.net suffers from the same limitation, so adding .edu will not help us at all.
It’s a big community, so adding our domain to the public suffix list would not work.

Greetings! Now that the service is out of beta, are there any new accommodations for universities?

We are currently evaluating use of Let's Encrypt at the University of Alabama, and the default implementation of domain detection, while undoubtedly instituted for good reasons, does render this service fundamentally useless in a situation where a large constellation of distinct organizations lives on the same second-level domain.

And I agree with my colleague's point above - unfortunately, most educational institutions are unlikely to see being added to the public suffix list as appropriate. :slight_smile:

The rate limit was changed to 20 certificates per TLD+1 a few weeks ago. In addition, the CA server now skips the rate limit check completely if you’re requesting a certificate for a list of domains for which a certificate has previously been issued, in order to ensure that you can always renew a certificate. This also allows you to gradually increase the number of certificates under your domain - you could essentially add 20 new certificates per week.
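The renewal exemption described above can be thought of as a set comparison: a request skips the rate limit only if its exact set of FQDNs matches a previously issued certificate. A minimal sketch of that idea (an illustration only, not Boulder's actual implementation):

```python
# Illustration of the renewal-exemption logic described above.
# This is a simplified sketch, NOT Boulder's real code: issued_certs is
# a list of name-lists from previously issued certificates.

def is_renewal(requested_names, issued_certs):
    """True if this exact set of names was already issued together."""
    wanted = frozenset(n.lower() for n in requested_names)
    return any(frozenset(n.lower() for n in cert) == wanted
               for cert in issued_certs)

def rate_limit_allows(requested_names, issued_certs, recent_count, limit=20):
    """Renewals bypass the weekly per-domain limit; new certs count against it."""
    if is_renewal(requested_names, issued_certs):
        return True
    return recent_count < limit
```

Note that the name *set* must match exactly (order doesn't matter), so adding or dropping a single SAN turns a renewal into a new, rate-limited request.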

Is that something you could work with - e.g. by doing a gradual rollout (one department per week or something like that)?

Thank you for the quick response - love the community here!

That would definitely work for smaller, more localized units, but the issue is that at most universities, many colleges and divisions will have their own IT organizations, servers, deployment schedules, and so on. Furthermore, each college will have their own departments which will have their own IT resources, service requirements, etc. This doesn’t even get to the level where you begin to have individual faculty researchers, who have their own labs and facilities for which they are responsible.

In short, 20 certificates for such a diverse ecosystem simply doesn't work - which is quite unfortunate given the huge asset Let's Encrypt would be to research faculty, students, and ordinary users of web properties who would benefit from an easily deployable certificate scheme.


An example, I think, may prove helpful.

I am the sysadmin for a single college that supports approximately 20 departments comprising 225 distinct web domains. Rolling out certificates for each of these supported sites would certainly be doable under the current ruleset - if a little time consuming initially.

The wrinkle begins. Each of our departments has its own evolving ecosystem of servers and network technologies which do not always fall under our direct support umbrella. In this case, it gets a bit more complicated. If a faculty member installs certificates for a number of machines in her lab and burns through our allowance - that's inconvenient, but still probably not a deal breaker, since that admittedly does not happen every day (or week, in this case).

The wrinkle wrinkles more. Additionally, we have 12 sister colleges/schools, each comprising its own departments, sub-units, and research areas. If one of these units decides to create certificates for a battery of course WordPress blogs, our potential access to the 20 certificates gets a bit dicier.

The wrinkle wrinkles wrinklier. Not only are there 13 primary academic units, but you also must consider administrative divisions which cover everything from financial administration to athletics programs to student support services. Many of these areas will have evolving marketing campaigns, academic initiatives, and so on which will have web systems requiring certificates.

In short, if even a small, itsy, teeny number of the sysadmins in this population decide to tinker with Let's Encrypt on an ongoing basis, there is a virtually zero percent chance that any specific unit or area can have reliable access to certificate generation. Let's not even start thinking about the impact that students - working on computer science projects, for example - will inevitably have on the quota.

I'm not saying all of this to be contrary, for the record. I certainly understand the reasons for the limits and the rationale behind making sure this service is not abused. Our assertion is simply that it would be wonderful to see higher ed taken into account in a specialized way, given the unique nature of our support ecosystem.


Hi @jhawkins! Yep, the university model of deeply nested and highly delegated subdomains is definitely something we’ve thought about and would like to figure out the best approach to. Maybe I can flip the question around and ask: What would your ideal way to handle the situation be? At any given threshold, it’s likely that some of the time, someone in one of the departments will burn through it. Consider, for instance, someone in the CS department experimenting with Let’s Encrypt who sets up a client to issue certificates for each of 1-1000.mydomain.uni.edu. Is there a good, scalable way of “walling off” different entities within the university so they can’t chew up each others’ rate limits? I’ve been trying to think of some way to use CAA DNS records to indicate boundaries, but haven’t come up with anything conclusive yet.
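For context on the CAA idea, this is what issuance-control records look like today under RFC 6844; the zone names below are purely illustrative. Note that CAA currently only declares *which* CA may issue for a subtree - it has no standard field for expressing rate-limit boundaries, which is exactly the open question:

```
; Illustrative zone fragment (RFC 6844 CAA syntax); names are hypothetical.
; Each department could publish its own CAA records, but nothing in the
; record format today conveys "this subtree is a separate rate-limit bucket".
cs.uni.edu.    IN CAA 0 issue "letsencrypt.org"
cs.uni.edu.    IN CAA 0 iodef "mailto:hostmaster@cs.uni.edu"
math.uni.edu.  IN CAA 0 issue "letsencrypt.org"
```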


What about having a much higher limit for .edu domains (since there are so few and they are more important than your average .com) and a set limit for each subdomain of a .edu domain (including subdomains of that)?


@jsha Indeed! It is a difficult puzzle, and it's wonderful to hear the team is working on this.

After thinking about this for a number of hours, the low-hanging-fruit options do appear to be either radically upping the limit for .edu domains (although ‘radical’ for one institution may be a drop in the bucket for another), as @xxjfe mentions, or whitelisting .edu addresses altogether. I will say there's some merit to the latter approach: .edu domains are neither distributed nor administered like traditional TLDs, so having a more open approach to them would have a certain logic.

Assuming a rate-limit option stays in place, I do have one question. Is there a way to run a query in the Let's Encrypt client to produce a log or manifest of the certificates requested against a given second-level domain? If it falls to the campus IT administrators of university units to come up with accountability policies, it would be difficult to make headway without a toolset to see where the quota is being diverted. Even then, I don't know how you would handle students.

Really, the quota system just doesn’t work for our situation - but that may just be the way the cookie crumbles, as they say. :wink:


Not in the client, but all certificates issued are logged to certificate transparency logs which can be searched at e.g. https://crt.sh/ or Google Transparency Report. There's also this unofficial script which scrapes crt.sh to provide a useful analysis.
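To make the CT-log approach concrete, here is a small sketch that pulls crt.sh's JSON output and counts how many certificates were logged for a domain in the past week. The `?q=%.domain&output=json` endpoint and the `not_before` field are crt.sh conventions rather than a stable, documented API, so treat this as an unofficial example:

```python
# Sketch: count certificates logged to CT for a domain in the last N days,
# via crt.sh's JSON endpoint. The query format and field names ("not_before")
# are crt.sh conventions, not a guaranteed API.
import json
import urllib.request
from datetime import datetime, timedelta

def recent_entries(entries, days=7, now=None):
    """Filter crt.sh JSON entries to those issued within the last `days`."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=days)
    recent = []
    for entry in entries:
        issued = datetime.strptime(entry["not_before"], "%Y-%m-%dT%H:%M:%S")
        if issued >= cutoff:
            recent.append(entry)
    return recent

def fetch_ct_entries(domain):
    """Fetch all logged certs matching *.domain (slow/large for busy domains)."""
    url = f"https://crt.sh/?q=%25.{domain}&output=json"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Usage (network required; the response can be very large for a big campus):
#   entries = fetch_ct_entries("ox.ac.uk")
#   print(len(recent_entries(entries)), "certs logged in the last 7 days")
```

This only shows what was *logged*, of course; it can't tell you which campus unit requested each certificate, but it at least reveals where the weekly quota is going.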


@jmorahan Great info, appreciate the share!

Just to add: we have also hit this amusing yet frustrating situation, as I started adding Let's Encrypt SSL certificates to a bunch of our public web servers.

> There were too many requests of a given type :: Error creating new cert :: Too many certificates already issued for: ox.ac.uk

There must be hundreds of subdomains here, all under slightly different control by departments, colleges, and central IT, and yet no one in the whole of Oxford University can now register any more SSL certificates for 7 days. It would be nice to see a different approach for educational institutions, both under .edu and under .ac.uk here in the UK.


Hello all, just to add a “me too” here from the University of Edinburgh (ed.ac.uk) - School of Informatics (inf.ed.ac.uk), in my case.

This issue is likely to stop us from using Let's Encrypt in any meaningful way, as we could very well be prevented from obtaining certificates for crucial public-facing websites due to the entirely legitimate behaviour of other parts of the University, over which we have no control.



@jsha Hello once again! Just wanted to report back our latest finding - it appears, at least in our case, that renewals are being counted against our rate limit, so we have begun to see certificate renewals fail. I was under the impression renewals were exempted from the 20/wk. limit?

Also, we are very eager to hear if any progress has been made on making new policy regarding university institutions.


We’ve seen one or two other examples on this site of people saying their renewal tripped rate limits that shouldn’t apply to a renewal. If that’s true, it’s likely a bug in Boulder (the Let’s Encrypt ACME server).

Hopefully, unlike some of those posters, you’d be willing to share the affected FQDN(s) and the output of a failed run? With luck, that ought to make it possible either to pin down the bug in Boulder (hence you’d get a fix, hopefully sooner rather than later) or to figure out what caused your certs not to count as renewals.

Thanks for reporting! We have an open issue and a pull request to fix the behavior: https://github.com/letsencrypt/boulder/issues/1925.

I’m afraid we don’t have any news about rate limiting policy for universities.

Thank you for the quick response - fantastic to hear news that a fix is in the works!!

Hello all! Wanted to provide a quick update based on our experiences.

If you happen to run a cPanel server shop, the latest release (58) includes support for what they are calling AutoSSL, which enables automatic, free provisioning of DV certificates via a provider partnership they have set up with Comodo. The key size is 2048 bits, but depending on your use case that may not be a huge issue. Just wanted to share, as they have not instituted Let's Encrypt-style domain rate limits.

The cPanel AutoSSL service does allow adding additional certificate providers, so support for Let's Encrypt is rumored to be coming soon. That said, the baked-in cPanel/Comodo default provider is still going to be the only real option for institutional academic use for now.

Hope this helps someone out there!