When will rate limits increase, what can we expect in future rate limits?

That’s basically it: When will the rate limits increase and what can we expect as future rate limits?


Could somebody from LetsEncrypt respond to this?

I suspect you won’t get an answer because it’s not agreed yet - so it’s not possible to answer definitively when, or what any modified rate limits will be.

I hit the 5 certificates/week limit while testing a CloudFront plugin for the command-line client. By the time I had fixed a problem with a certificate that wasn’t deleted before being re-uploaded, I was getting “Too many certificates already issued” errors from the letsencrypt server, and now it seems I have to wait a week, during which CloudFront resources won’t be loaded on an HTTPS page by Chrome and probably other browsers.

I now know that we’re supposed to use ‘--staging’ while testing, but the limit is ridiculously low since it applies to a domain and all its subdomains. Even the FAQ is misleading when it says “Hopefully wildcards aren’t necessary for the vast majority of our potential subscribers because it should be easy to get and manage certificates for all subdomains.”
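For reference, the flag just points the client at the test CA, so test certificates don’t count against the production limits. A minimal sketch (the domain is a placeholder, and the exact plugin flags depend on your setup):

    ./letsencrypt-auto certonly --staging -d test.example.com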

No, letsencrypt team, it’s not easy at all. And to make things worse, I can’t go back to HTTP and wait a week because I made the mistake of using a 301 redirect to HTTPS and Chrome caches that forever. So now I’m looking at StartSSL and a much more complicated procedure.

Please avoid this problem for future users by documenting the “--staging” option clearly in the main README and by increasing the limits and/or making them per subdomain.


Just echoing the frustration here.

Well, since I had issued 3 and then fumbled around renewing the 2 old ones, I hit the rate limit. Now one of those two old ones is going to expire before the rate limit lifts.

It wouldn’t be so bad if we had wildcard certs, but with every sub-domain being a separate cert, five per week is just not sufficient, especially for renewals. Now I have to remember to log in again in 7 days to attempt to renew my certs manually :frowning:


No, you can have 100 names on a single cert. Just pass multiple -d flags.
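For example, something like this (placeholder names) puts three names on one certificate and counts as a single issuance against the limit:

    ./letsencrypt-auto --apache -d example.com -d www.example.com -d blog.example.com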


Also echoing the frustration here: we have 11 sub-domains, used to be on a wildcard cert, switched to LetsEncrypt, and are now stuck in this 7-day cycle. The LetsEncrypt concept and implementation are brilliant and easy to follow, but I’d suggest a better help/roll-out guide to highlight how to use LetsEncrypt properly and efficiently.

Running on AWS, I have a single AMI for all 11 subdomains. Normally I could launch this master AMI 11 times, once per subdomain, log in to each machine, change a parameter to control that subdomain’s requirements, attach an Elastic IP and update Route53. This one AMI used to contain the wildcard cert, so SSL was sorted regardless of how I had the subdomain configured.
Deciding to use LetsEncrypt, I was very willing to accept a slower rebuild process whereby I’d launch an AMI, SSH into it, change the subdomain parameter, sort out the Elastic IP and Route53, and now also get a LetsEncrypt cert for that particular subdomain, i.e. ./letsencrypt-auto --apache -d sub-domain-N.example.com
This approach was working fine and only went pear-shaped when I started building subdomain 6 and hit the 5 certs per 7 days limit.

So my point/question is: should or could I have built out my master AMI with a LetsEncrypt cert covering all subdomains in one cert request, i.e. using the -d switch for each subdomain?
If so can I now go back to my master AMI and follow this process or will I have to wait the 7 days to get this 1 cert for all subdomains?
Also, if this process works, how then do I renew the single master cert across all subdomain servers once they are up and running as separate machines?

Yes, you could do it with a single certificate covering all your subdomains (by adding -d for each subdomain).
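A rough sketch of what that single request could look like, reusing the sub-domain-N.example.com naming from above (as far as I know the first -d becomes the certificate’s common name, and all the names need to validate from the machine you run this on, otherwise see the DNS / ssh options below):

    ./letsencrypt-auto --apache \
      -d sub-domain-1.example.com \
      -d sub-domain-2.example.com \
      -d sub-domain-3.example.com
    # ...continue with one -d per subdomain, up to all 11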

The alternative (which I use with a number of domains/subdomains) is just to spread them out: get 3 every week (which allows for the occasional mistake) and then spread the renewals over the same period (it’s automated, so not really a problem).
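As a rough illustration of spreading them out (paths, dates and flags are assumptions, so adjust for your client version; --renew-by-default was the force-renewal option in the official client at the time):

    # one group re-issued on the 1st of the month, another on the 15th, so no
    # single 7-day window needs more than a couple of certificates
    15 3 1 * * /opt/letsencrypt/letsencrypt-auto certonly --apache --renew-by-default -d a.example.com -d www.a.example.com
    15 3 15 * * /opt/letsencrypt/letsencrypt-auto certonly --apache --renew-by-default -d b.example.com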

If you have already hit the limit, then yes, you would need to wait until 7 days after the first cert to get another one (containing all of them).

You can either use DNS validation (since that can all be done from one machine), or, if you have ssh access to all of them, you can use getssl (GitHub - srvrco/getssl: obtain free SSL certificates from the letsencrypt ACME server, suitable for automating the process on remote servers), which is a simple bash script that will upload the various verification files via ssh and can also upload the cert afterwards and reload the service.
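As a sketch of how that looks in practice (commands as I recall them from the getssl README, so verify the exact options there):

    ./getssl -c sub-domain-1.example.com   # write template config files for the domain
    # edit ~/.getssl/sub-domain-1.example.com/getssl.cfg: set SANS, point the ACL at the
    # remote server (ssh: locations are supported) and set the reload command
    ./getssl sub-domain-1.example.com      # obtain the cert and push it to the server
    ./getssl -a                            # later, from cron: renew anything close to expiry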

Paul

I understand your frustration but there is another way: Don’t learn on your production systems!

LE can be run on a laptop/desktop with a little care. All you need is an external IP address, DNS access and external firewall access. Dynamic DNS can help if your external IP moves around a bit sometimes. So why not get to grips with LE by running up a little webserver on your PC of choice and forwarding 443/tcp to it?

If you don’t have an external DNS domain to play with then I believe there are free ones available, or at worst you will need to cough up around £20 for a year. Another option is running your own DNS server for a testing sub-domain and adding glue records to your main domain. Yet another option is to note that DNS does not really distinguish between hosts and domains as such, so if you register host1.testing in your example.co.uk domain then you have effectively created testing.example.co.uk as a domain. FHM/WHM supports this in its DNS manager; I don’t know how AWS works in this regard.

Anyway, with a bit of imagination you can play with LE to your heart’s content until you have everything tested for your use case. Also note there is a testing environment you can play with as well to gain some experience.
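A minimal sketch of that kind of desktop experiment, assuming 443/tcp is forwarded to the machine and reusing the host1.testing name above (the standalone plugin runs its own temporary webserver, and --staging keeps the experiments off the production rate limits):

    ./letsencrypt-auto certonly --standalone --staging -d host1.testing.example.co.uk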

Then you go to production.


Hi all,

We definitely hear your pain and are working on ways to improve the rate limiting situation. Sorry we haven’t been more communicative on the topic, but we’re still working out exactly what steps we’re going to take, and don’t want to over-promise and then change our mind.

Definitely we need to improve the documentation in the official client, and an official “guide to integrating with Let’s Encrypt” is high on my personal to-do list, which should help a bit.

From a technical side, in Boulder we’re implementing a new rate limit that matches on an exact set of names. This will be in addition to the “certificates per domain” rate limit, and will allow us to increase that one. Hopefully it will catch both misconfigured cron jobs and people testing out clients, letting folks know they need to make a change.

Thanks,
Jacob


I’ve done my testing with letsencrypt and I’ve configured my automation infrastructure, I even have the default configuration set to staging to prevent accidentally burning precious certificates.
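For anyone wanting to do the same, a minimal sketch of such a default (the config path and staging URL are the ones I believe the client uses; check the docs for your version):

    # /etc/letsencrypt/cli.ini
    # send everything to the staging CA unless overridden on the command line
    server = https://acme-staging.api.letsencrypt.org/directory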

But with the current limits, a few things are true.

  • I can’t deploy an application that has more than five components; it takes multiple weeks to do so.
  • I can’t have redundant components broker their own certificates without reducing the overall size of my infrastructure. At best, I can have one unit in a cluster obtain a certificate and then I need some other manual process for distributing certs. This asynchronicity completely negates the benefit of letsencrypt (namely automation), especially when it has to be done every 60-90 days.
  • My infrastructure has a maximum size of 60 entities (i.e., certs). After 12 weeks at 5 certs a week I can have 60 certs issued, with eight days left before my first five certs expire; week 13 must be used to renew the certificates obtained in week 1.

This is a thimble that grows at a snail’s pace which makes it unusable for all but the smallest of domains and certainly too small to be used by anything you’ve heard of.


Maybe we should try to remember a few things after all.

  1. Let’s Encrypt is still in a state of Public Beta. To me, that means the Let’s Encrypt organization is officially saying that the Let’s Encrypt software, server(s) and client(s), including the managing infrastructure, is definitely not production-ready.

  2. When Let’s Encrypt one day does go Full Release, it will still be a service provided as Free Beer. (When did I last get a free beer?) We can provide feedback and ideas for improvement, but we’ll have to choose our approach carefully if we feel that we want to or need to criticize the Let’s Encrypt software or the management of it.

  3. It’s probably risky (at best) to promote a Public Beta program to the management of our Business Corporations for full deployment in their Production Systems, especially if they’re facing a world-wide public audience, including both the clients and the competition.

  4. Risk Management can be very good for your health. :wink:

Personally, I’ve set up LE certificates on two of my public servers. One is my own personal, fun-to-play-with Web server. That’s where I started. After having gone through the initial learning curve, I did the same on a server that holds information for a Social Club where I’m a member (and “Webmaster”). Only people I know. Very low risk. Hardly anything to lose in case of failure. A lot of experience to gain.

Personally, I wouldn’t dare push a Public Beta system onto my corporate public servers, with maybe hundreds of server machines, just yet, because in my own experience Management has very little understanding of the IT folks’ initiatives if IT does something that just might put the Shareholders’ benefit at risk. (Been there, done that. And if I need a T-shirt, I’ll have to buy it myself.) :slight_smile:

Just a thought.


Yep, this is a good point. We're working on increasing the certificates per domain limit, which is what it sounds like you're running into.

Thanks for the reply. Are you able to share the new cap limits and a timeline (even if just rough estimates)?