Semi-Automatically create certs

Hi there,

Following situation: I have a server and a webspace from my hosting provider. The webspace does not allow me to auto-create certs for subdomains, so I have to use my server for that with the command:

./certbot-auto certonly -a manual --rsa-key-size 4096

This works fine, but I have a lot of subdomains, so I want to automate parts of it.

Once every three months I want to run a script which does the following (rough sketch after the list):

  1. Run the above command for all my domains
  2. Automatically answer the prompts
  3. Upload the verification file (.well-known/acme-challenge) automatically to the webspace via FTP (with its own FTP server settings and path for every domain)
  4. When finished, also upload the cert files via SFTP to my webspace, so I can activate them manually in my webhosting panel (Parallels Panel)
  5. After that I will manually delete the cert files for security reasons.
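
Roughly, I picture something like this - completely untested, the hosts, paths and credentials are placeholders, and it assumes a certbot version that already supports --manual-auth-hook:

#!/bin/bash
# Placeholder list of the subdomains I need certs for
DOMAINS="sub1.example.com sub2.example.com"

for d in $DOMAINS; do
    # auth hook uploads the challenge token for $d via FTP, cleanup hook removes it again
    ./certbot-auto certonly -a manual --rsa-key-size 4096 \
        --preferred-challenges http \
        --manual-auth-hook /root/upload-challenge.sh \
        --manual-cleanup-hook /root/cleanup-challenge.sh \
        --agree-tos -m mail@example.com -n \
        -d "$d"

    # then push the resulting cert files to the webspace via SFTP (placeholder host and path)
    sftp sftpuser@example.com <<EOF
put /etc/letsencrypt/live/$d/fullchain.pem /certs/
put /etc/letsencrypt/live/$d/privkey.pem /certs/
EOF
done

where upload-challenge.sh would use the environment variables certbot passes to the hook, roughly:

#!/bin/bash
# certbot sets CERTBOT_TOKEN and CERTBOT_VALIDATION for the domain it is currently validating
printf '%s' "$CERTBOT_VALIDATION" > "/tmp/$CERTBOT_TOKEN"
# upload it into .well-known/acme-challenge on the webspace (placeholder credentials, host and path)
curl -T "/tmp/$CERTBOT_TOKEN" "ftp://ftpuser:ftppass@ftp.example.com/web/.well-known/acme-challenge/"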

So my question is: how can I create such a script? Is there already a solution like this?

Thank you!
Zoker

Have a look at GetSSL. I'm slightly biased, since I wrote it, but it's designed to upload tokens to your server via FTP (or SFTP, or SSH) and automate things pretty much exactly as you suggest (including uploading the certs via SFTP or SSH).
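
To give you an idea, the per-domain config ends up looking roughly like this - names, paths and credentials below are placeholders, and the README describes the exact syntax for each location type:

# ~/.getssl/example.com/getssl.cfg (sketch)
SANS="www.example.com"
# where to upload the ACME tokens (FTP in this example)
ACL=('ftp:ftpuser:ftppassword:ftp.example.com:/web/.well-known/acme-challenge')
# where to push the finished cert and key (SFTP in this example)
DOMAIN_CERT_LOCATION="sftp:sftpuser:sftppassword:example.com:/home/user/certs/example.com.crt"
DOMAIN_KEY_LOCATION="sftp:sftpuser:sftppassword:example.com:/home/user/certs/example.com.key"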


Sounds very good; I'm currently trying to set it up. I set up a main domain and a bunch of other domains within the cfg of the main one (like maindomain.com plus otherdomain1.com, sub.otherdomain2.com, etc.).

But it seems that the script only checks the main domain, and I get this output:

Check all certificates
Certificate on remote domain does not match domain, ignoring remote certificate
certificate for maindomain.com is still valid for more than 30 days (until Mar 10 17:20:00 2017 GMT)

You can use -f to force a new cert.
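
i.e. something like (with your actual domain in place of maindomain.com):

./getssl -f maindomain.com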

I'd suggest starting with a staging cert (as the staging server has much more lenient rate limits). Once you have everything working, you can switch to the main Let's Encrypt server.
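
In getssl.cfg that's just the CA line - the URLs below are the ones from the default config template at the moment, so check yours in case they change:

# use the staging server while testing
CA="https://acme-staging.api.letsencrypt.org"
# then switch back to the live server once everything works
# CA="https://acme-v01.api.letsencrypt.org"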


Ah perfect, works fine 🙂

But when using SFTP, I get this error:

./getssl: line 395: sshpass: command not found

And also, the script does not try to create the folder if it does not exist:

.well-known/acme-challenge: No such file or directory

Did you use this format:

sftp:ftpuserid:ftppassword:domain.com:/web/.well-known/acme-challenge

Also, it does use sshpass if it's installed - what OS are you running the script on?

Yes, with SFTP it currently expects the location to already exist (although I could potentially modify it to create the location if it doesn't exist).
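
If sshpass is simply missing, installing it should get the password-based SFTP upload working - on Debian/Ubuntu that's roughly:

sudo apt-get install sshpass

(On CentOS/RHEL it's in the EPEL repository.) Alternatively, set up SSH keys and use an ssh: ACL instead, which doesn't need a password at all.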


Yes, I'm using this format and FTP works fine, but SFTP does not.

Where can I find the certs for the additional domains? I can only see the one for the main domain.

What OS are you running on?

By default it will place all the certs in .getssl/domain/, where "domain" is the domain name you created the cert for (assuming you created separate certs).

If you added the other domains as SANs (additional names on the same cert), then it's all included in one cert that is valid for all the domains.
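
For reference, that's controlled by the SANS line in the domain's getssl.cfg, e.g. (placeholder names):

SANS="www.example.com,sub1.example.com,sub2.example.com"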


Ah OK, that makes sense. Is there a way I can create multiple certs but renew all of them with one command? (Is that the -a parameter?)

Is it recommended to use one cert for all domains or to split them into different certs?

Yes, -a 🙂 which will check "all" certs and renew those that need it.

That depends on your requirements. I tend to put associated domains together (e.g. example.com and www.example.com on one cert) and completely separate subdomains/domains on different certs.

Just be aware that there are rate limits (see Rate Limits - Let's Encrypt), which limit you to 20 certificates per registered domain per week ... so if your "lot of subdomains" is more than 20, you may want to put some on the same cert. You can have up to 100 names on a single cert.
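
So once each domain's config works, a single cron entry along these lines handles all the renewals (path and schedule below are just examples):

# check all certs daily and renew whatever is close to expiry
23 5 * * * /home/user/getssl -a -q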

