I have about 100 websites, all served from Docker containers. Each container has its own Docker volume for persistent storage, where the site's data and its certificate are stored. The entire stack runs on AWS ECS, using Rex-Ray to handle creating each volume a site needs when it's deployed into the cluster. Today, when a container is deployed, the Docker entrypoint script handles certificate setup via DNS validation, using certbot with the Route53 plugin.
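For context, the certificate step in my entrypoint boils down to something like the sketch below (simplified; the function name, paths, and the `ADMIN_EMAIL` variable are just illustrative, not my exact script). AWS credentials come from the task's IAM role, so nothing sensitive is baked into the image:

```shell
#!/bin/sh
# Sketch of the per-container entrypoint certificate step.
ensure_cert() {
  domain="$1"
  live_dir="/etc/letsencrypt/live/${domain}"
  # Only request a cert if the volume doesn't already hold one.
  if [ ! -f "${live_dir}/fullchain.pem" ]; then
    # DNS-01 validation via certbot's Route53 plugin; no inbound
    # HTTP challenge needed, so this works on private hosts too.
    certbot certonly --non-interactive --agree-tos \
      --dns-route53 -d "${domain}" -m "${ADMIN_EMAIL:-admin@example.com}"
  fi
}
```

The entrypoint calls that before handing off to the site's main process, so a fresh deployment bootstraps its own cert and an existing volume just reuses the one already on disk.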
This means each container manages its own certificate, and if a site does happen to get compromised in some fashion, exposure is limited to that container. However, it does make handling certificates somewhat of a chore, even though I've automated much of it. I've thought about centralizing the certificate process, but wasn't keen on having all the certificates in one place for security reasons. I guess since I'm only doing DNS validation, I could run the centralized store on a private host instead.
What are your experiences with centralized vs. local certificate management? Are you concerned about having the certs for all your sites stored in a single location?