How to use Let's Encrypt with multiple servers

I've read through a number of topics but can't decide on the best approach when Let's Encrypt is to be used with multiple servers, for example in an AWS Auto Scaling group.

I was thinking of mounting an Amazon Elastic File System (EFS) volume on each server instance and storing the certificates on that shared file system. I've not heard of this approach being used in the wild, hence my post here. Can you please tell me why this would be a good or bad idea? I'd be grateful to hear alternative ideas as well.

Thanks in advance

2 Likes

Welcome to the Let's Encrypt Community, Henry 🙂

If the servers are acting as part of a load-balancing scheme (responding identically for the same domain name), it is usually best to have the load-balancer acquire and maintain the certificates. Whether the server group or the load-balancer terminates SSL (by serving the certificate and encrypting/decrypting the content) will determine whether you want to distribute copies of the certificate and its private key to the servers or keep those centralized.

4 Likes

I personally quite like the idea of using secrets/key vaults - these are generally integrated into the respective cloud services and let you push a certificate update to the vault, then your apps/services etc can pickup the latest version.
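For example (AWS-flavoured, since that's the context here), a rough sketch of that pattern with AWS Secrets Manager and boto3 might look like the following. The secret name and file paths are placeholders I made up, and the secret is assumed to already exist:

```python
# Sketch: push a renewed cert/key pair into AWS Secrets Manager, and have
# each instance pull the latest version at startup or on a timer.
# The secret name and file paths are hypothetical placeholders.
import json

import boto3

SECRET_NAME = "prod/example.com/tls"  # hypothetical, pre-created secret


def push_certificate(cert_pem: str, key_pem: str) -> None:
    """Store the current certificate and private key as one secret value."""
    client = boto3.client("secretsmanager")
    client.put_secret_value(
        SecretId=SECRET_NAME,
        SecretString=json.dumps({"cert": cert_pem, "key": key_pem}),
    )


def pull_certificate(cert_path: str, key_path: str) -> None:
    """Fetch the newest version and write it where the web server expects it."""
    client = boto3.client("secretsmanager")
    value = json.loads(
        client.get_secret_value(SecretId=SECRET_NAME)["SecretString"]
    )
    with open(cert_path, "w") as f:
        f.write(value["cert"])
    with open(key_path, "w") as f:
        f.write(value["key"])
```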

5 Likes

To me, it really depends mostly on the number of certs being "shared".
In the smallest example (a single cert with only a few names), it makes little difference how the certs are obtained or shared. But not all methods scale equally...
The larger the number of certs (and names within each cert), the less practical the schemes become that rely on each node independently repeating the same "shared" work.

2 Likes

Yeah, there are a lot of ways one could do this, and like all interesting engineering questions it's more about understanding the tradeoffs than about one solution being clearly "right". For stuff on AWS, I'd recommend staying within the Amazon ecosystem as much as possible, so I'd use their Certificate Manager (which issues free, automatically-renewing certificates from their own public CA, though those can only be used with Amazon services and you don't get direct access to the private keys) for whatever I could. So if your servers are web servers, you could put their load balancer in front of the auto scaling group, and I believe the load balancer can terminate TLS for you. (I've not used their load balancer myself, so perhaps I'm mistaken, though.)

If you're using nginx on the servers, and you also need to secure the connection between the load balancer and the server (or if you're not using the load balancer to terminate TLS), you might want to look at AWS Certificate Manager for Nitro Enclaves, which lets nginx on the EC2 instance use their free Certificate Manager certs without ever having access to the private key.
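If it helps, here is roughly what the "ACM cert on a load balancer listener" part looks like with boto3. The domain and ARNs are placeholders, and the DNS validation records have to be published (and the cert issued) before the HTTPS listener will actually work:

```python
# Sketch of the "ACM certificate terminated on the ALB" idea.
# Domain and ARNs are placeholders; validation must complete before use.
import boto3

acm = boto3.client("acm")
elbv2 = boto3.client("elbv2")

# 1. Ask ACM for a free, auto-renewing certificate (DNS-validated).
cert = acm.request_certificate(
    DomainName="www.example.com",
    ValidationMethod="DNS",
)
cert_arn = cert["CertificateArn"]

# 2. Once the certificate is issued, terminate TLS at the load balancer by
#    attaching it to an HTTPS listener in front of the auto scaling group.
elbv2.create_listener(
    LoadBalancerArn="arn:aws:elasticloadbalancing:...:loadbalancer/app/my-alb/...",
    Protocol="HTTPS",
    Port=443,
    Certificates=[{"CertificateArn": cert_arn}],
    DefaultActions=[{
        "Type": "forward",
        "TargetGroupArn": "arn:aws:elasticloadbalancing:...:targetgroup/my-tg/...",
    }],
)
```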

If it's not a web server (like a mail server using TLS), then yeah, probably something like Let's Encrypt is the way to go. What I do is have a Lambda that gets the certificate and stores it in S3, and then my server loads the certificate from there on startup. That made more sense to me, but I suspect storing it in EFS would be roughly the same level of complexity and security. It may be most "correct" to use AWS Secrets Manager, but if the permissions are carefully managed I think EFS or S3 probably work just as well. I'm only using a one-server auto scaling group myself, but I think the principle would be the same for multiple servers. And especially with multiple servers, I would encourage some sort of centralized system rather than many servers each getting their own certificates. See the Storing and Reusing Certificates and Keys section of the Integration Guide for the "official" guidance.
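For what it's worth, the "load the certificate from S3 on startup" step can be as small as something like this. The bucket, object keys and destination paths are made up, and IAM needs to restrict who can read the private-key object:

```python
# Sketch of the startup step: pull the current cert/key from S3 and drop
# them where the TLS server expects them. Bucket, keys and paths are
# hypothetical placeholders.
import boto3

BUCKET = "my-cert-bucket"  # hypothetical bucket written to by the renewal Lambda
OBJECTS = {
    "certs/fullchain.pem": "/etc/ssl/myservice/fullchain.pem",
    "certs/privkey.pem": "/etc/ssl/myservice/privkey.pem",
}


def fetch_certificates() -> None:
    s3 = boto3.client("s3")
    for key, dest in OBJECTS.items():
        s3.download_file(BUCKET, key, dest)


if __name__ == "__main__":
    fetch_certificates()
```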

I did write up the approach I took, but it's more an example you could use for your own development than a real out-of-the-box solution.

5 Likes

There are a handful of ways this is typically done. Off the top of my head:

  1. Cloud/Vault-Based Certificate Storage. Some servers have this built in - IIRC Caddy is one and uses S3. I've seen some integrations (via webserver plugins) for various secrets/Vault providers.

  2. Shared Storage. EFS, NFS, S3, etc.

  3. Local repository, Pull. A "main" node handles LetsEncrypt provisioning; the others rsync/etc. from it nightly or on build/startup.

  4. Local repository, Push. A "main" node handles LetsEncrypt provisioning and pushes certs to all other machines when they are obtained. Sometimes this runs on an "office" devops machine that uses the DNS-01 challenge and pushes the certs out from there.

Whichever option you pick, the best model is to have one node serve as the "main" that runs LetsEncrypt, and to HTTP-redirect or proxy the challenges onto that node (or have it handle DNS-01 challenges). Virtually everyone who tries to use multiple nodes as LE clients, or to automatically elect a node to be the LE client, will mess up their logic once or twice - and that often means rate-limiting themselves into downtime.
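To make option 4 above concrete, here's a toy sketch of the push model - something the "main" node could run as a certbot deploy hook after a successful renewal. The peer hostnames, paths and reload command are all placeholders:

```python
#!/usr/bin/env python3
# Toy sketch of the "push" model: run on the "main" node (e.g. via
# certbot --deploy-hook), copying the renewed lineage to the other nodes
# and reloading their web server. Hosts, paths and commands are made up.
import subprocess

PEERS = ["web2.internal", "web3.internal"]        # hypothetical peer nodes
LINEAGE = "/etc/letsencrypt/live/example.com/"    # renewed cert directory
REMOTE_DIR = "/etc/ssl/example.com/"              # where peers expect the files


def push_and_reload() -> None:
    for host in PEERS:
        # Copy the cert material (-L follows symlinks so real files land remotely).
        subprocess.run(
            ["rsync", "-aL", LINEAGE, f"root@{host}:{REMOTE_DIR}"],
            check=True,
        )
        # Tell the peer to pick up the new files.
        subprocess.run(
            ["ssh", f"root@{host}", "systemctl", "reload", "nginx"],
            check=True,
        )


if __name__ == "__main__":
    push_and_reload()
```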

I open sourced our solution, GitHub - aptise/peter_sslers: or how i stopped worrying and learned to love the ssl certificate, which is focused on scalable nodes running scalable domains (and is overkill for finite sets of domains). It's an API-based Certificate Manager and LetsEncrypt client that uses a multi-level failover caching strategy: nginx worker memory, nginx master memory, Redis, then a web-based Python API.

6 Likes

Since you asked ... have you considered using CloudFront (CF), a CDN, to front your server? I mention it in case you serve mostly static content that the CDN could off-load - it might absorb so much traffic that you would only need one server.

Getting a Let's Encrypt cert for this (now single) server, for the https connection between it and the CF edge, is routine. Your server's TLS setup is simpler in that it only needs to cope with a single client - CloudFront.

CF manages the TLS connection between clients and its edge, and it is trivial to get AWS certs that auto-renew using the CF console. The console also controls the http->https redirect at the edge, as well as protocols, ciphers, and security headers (and other extensions) - that is, all the variety needed to deal with myriad clients.

Mind, I present this as a concept - not a recommendation. I know nothing of your application.

3 Likes

Thanks everyone for your replies. You have certainly given me a lot of useful ideas. I'll explore the AWS Certificate Manager approach because in my case it does make sense to keep everything in the AWS ecosystem.

3 Likes

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.