That means the web server needs to be reloaded or restarted so that it can use the latest cert files.
Not in my case. That's one thing I always make sure to do after a cert update: restart the httpd service! And for checking purposes, the other thing I always do is clear the cache in whatever browser I'm using, to verify things are working properly after such an update or modification. But I know some people forget to do it.
Thanks
Sometimes an Apache worker process will get stuck and not pick up new certs and config. In these cases the server may need a reboot. While unusual, we see it regularly here.
If this repeats consistently, you have something else wrong with your config or Apache.
The certs are just files. Normally, Apache should pick up new configs and cert files with a graceful reload.
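For reference, a graceful reload looks something like this on a typical setup (the service name varies by distro, e.g. apache2 vs httpd, so treat this as a sketch rather than exact commands for your system):

```
# Gracefully reload Apache so it re-reads config and cert files without dropping connections
sudo apachectl -k graceful

# Or via the service manager (service may be named apache2 or httpd depending on distro)
sudo systemctl reload apache2
```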
One other reason could be that your web server is not pointing to the latest cert files.
In the case of certbot, there is a /live/ directory [which contains symlinks to the latest files] and an /archive/ directory which contains the actual cert files [and some previously issued cert files].
If the symlinks have been tampered with / manually replaced, OR the web server is pointing directly to specific archive files, then the web server won't get the latest cert [even after a reboot].
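To illustrate the difference, an Apache vhost that keeps up with renewals references the live/ symlinks rather than a specific archive/ file. This is just a sketch assuming the default certbot layout, with example.com as a placeholder:

```
<VirtualHost *:443>
    ServerName example.com
    SSLEngine on
    # The live/ paths are symlinks that certbot repoints on every renewal:
    SSLCertificateFile    /etc/letsencrypt/live/example.com/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/example.com/privkey.pem
    # Pointing at something like /etc/letsencrypt/archive/example.com/fullchain1.pem
    # instead would pin the server to that one issuance, even after renewals and reboots.
</VirtualHost>
```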
What I usually do in those cases is simply list all the running apache processes and kill them (with 'kill -9 PID') and then start apache again. I find it's a lot faster... also, other important services won't then have to lapse during the reboot.
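For anyone wanting to see that spelled out, it amounts to roughly the following (process and service names vary by distro, and the graceful reload mentioned above is usually sufficient and gentler):

```
# Forcefully kill all Apache processes, then start the service again
sudo pkill -9 -x apache2      # or: sudo pkill -9 -x httpd
sudo systemctl start apache2
```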
I don't use symlinks for pointing to certs. Nor do I leave old cert files on my servers. My old certs are always immediately 'rm' deleted once the new certs have been installed and tested. This is why I'm somewhat puzzled by Mike indicating there's evidence of apache on the eccentric server boasting multiple certs for the same domain. The cert files are placed in one subdirectory structure that the vhost.conf file points to directly. And there's only one cert file per validation period per domain and each such file is pointed to only once. I'm hoping the timestamp letsencrypt gave me denoting when I can try requesting another cert will pan out. We'll know soon enough.
Testing and debugging are best done using the Staging Environment, as the Rate Limits are much higher.
And to assist with debugging, a great place to start is Let's Debug.
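For example, with certbot you can exercise the Staging Environment without touching production Rate Limits. These are standard certbot flags, but adjust to whichever client you use (example.com is a placeholder):

```
# Simulate renewal against the staging environment (no impact on production rate limits)
sudo certbot renew --dry-run

# Or explicitly request a throwaway test certificate from staging
sudo certbot certonly --staging -d example.com
```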
OK; but do you keep the private keys?
The Certificates one can always retrieve from https://crt.sh/ or elsewhere.
Care to show the "fullchain" cert file?
[or whatever cert file is being used - NOT the private key]
I change them often enough. I won't say how frequently. What about you?
I'll pass for now. Thanks though.
Since the Certificates can always be retrieved, one would still need the associated Private Key to make use of them.
But if you destroy the Private Keys as well during your 'rm' deletion phase, then retrieval of those Certificates is a moot point.
Oh, every few Planck time units.
I won't delete a private key until I've replaced it. And while I'm retaining them, I'm sure to keep them safe and inaccessible to anyone but myself. Permissions can be a wonderful thing.
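In practice that usually comes down to something like the following. The path here is purely illustrative, not where certbot or any particular setup keeps its keys:

```
# Restrict a private key so only root can read it (illustrative path)
sudo chown root:root /etc/ssl/private/example.com.key
sudo chmod 600 /etc/ssl/private/example.com.key
```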
So you will never have the issue of running into the Rate Limits,
since you have the Private Key and can retrieve the Certificate.
Dude, the public cert [and chain] is PUBLIC information.
Why so coy with the public Certificates?
More than just coy.
It's like going to the Doctor and saying...
It hurts, but I can't say where, nor am I willing to show you any part of me.
Diagnosis: Paranoid ...
So what's your personal view on the overall internet security of folks who come here for assistance (or to similar message boards and discussion forums) and share various information about their domains, their networks, their certificate acquisition and maintenance habits, their various security protocols, and the like? It can be said that much of this information is publicly available already...especially when a personal information seeker is familiar with various tools and procedures. But arguably, there's something to be said for laying relatively low and thus not calling attention to oneself. Wouldn't you agree?
I agree that revealing unnecessary information is ... unnecessary.
Being able to see the big picture and discerning which information is necessary is the key.
I, personally, would never ask for anything that is unnecessary or that would divulge too much security-related information for such a public forum.
I am a security person ... when I'm not here.
[and, also, when I am here]
To give you a small insight into my paranoid world...
In order to reach my web server, one has to cross a transparent firewall, then another firewall, then hits a proxy which has to traverse another firewall ... before reaching a web server [that is completely isolated in a lonely DMZ].
[3 firewalls and a proxy]
Fair enough.
That indeed sounds quite secure. Whenever I look at security logs on an internet server, it's still fairly amazing to me how much malicious noise is out there on a continual basis. From persistent brute-force crack attempts (even distributed ones - which are more difficult to filter), to packet flooding efforts (targeted DDoS or just the random crap), to various other exploitative activities. It's astonishing how much pounding many of these servers endure on a daily basis. DDoS protection is almost a must anymore.