I keep getting the ERR_SSL_VERSION_OR_CIPHER_MISMATCH error in Chrome (v55 and older). I’ve got three servers that are configured exactly the same, but one of them keeps giving this error. It works fine in Firefox, Opera and IE.
Server version: Apache/2.4.7 (Ubuntu)
I’m using the modern profile from the Mozilla config generator, here:
I’m really stumped as to why this is still happening. From what I can read, this error is caused by supporting SSLv3, but that is disabled here. Also, https://www.ssllabs.com gives the site an A rating with no issues.
Anyone got any advice as to what I can do about this? It really makes no sense whatsoever!
I can’t share that publicly, it’s a CRM server.
But the hostname is correct, and I use -d www.thehostname.com for the extra domain when issuing the cert from certbot.
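For context, a certbot invocation covering both names would look roughly like this (a sketch using the placeholder hostname from above, not the real domain):

```shell
# Sketch only: the placeholder hostname stands in for the real domain.
# Issues a single certificate covering the bare and www hostnames.
certbot --apache -d thehostname.com -d www.thehostname.com
```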
Issuing a Let's Encrypt certificate means the domain is published to the Certificate Transparency logs. It's out there already, and sharing it would definitely make debugging much easier.
Ruling out the simple things first, have you reloaded/restarted this server since updating the configuration with a modern profile?
Of course; the only thing I haven’t done so far is a full server reboot.
I plan on doing that later today though to rule it out.
Where can I find these logs?
There’s a difference between “out there” and “Indexed by every search engine”.
When looking through the certificate request log for Let's Encrypt, I see that this section is wrong:
That is the old IP of the server, not the new one.
Why and how can it validate using the wrong IP?!
(All the other servers have the correct IP here, so this could be the culprit?)
This is likely the validation record for an authorization that was created in the past, hasn't expired yet, and is being reused. Once this authorization has expired, future attempts to issue for the name will use the correct IP.
The ERR_SSL_VERSION_OR_CIPHER_MISMATCH error won't have been caused by the certificate, so changing anything related to your ACME client or the Let's Encrypt certificate is the wrong approach.
https://crt.sh is one view into the Certificate Transparency logs. I think operating on the assumption that your domain name is secret is the wrong approach, and it definitely makes it much harder to help you diagnose your server!
This would not cause the error mentioned in your first post, but SSL Labs reports a missing intermediate certificate. You’ll probably want to add a SSLCertificateChainFile /etc/letsencrypt/live/www.fyndjobb.se/chain.pem directive to your Apache vhost. On browsers that do not have the intermediate certificate used by Let’s Encrypt in their cache, the certificate will show up as untrusted.
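For concreteness, a minimal sketch of the relevant vhost section, assuming the standard certbot live-directory layout:

```apache
<VirtualHost *:443>
    ServerName www.fyndjobb.se
    SSLEngine on
    # Leaf certificate and private key from certbot's default layout
    SSLCertificateFile      /etc/letsencrypt/live/www.fyndjobb.se/cert.pem
    SSLCertificateKeyFile   /etc/letsencrypt/live/www.fyndjobb.se/privkey.pem
    # Intermediate chain; a separate directive is required on Apache < 2.4.8
    SSLCertificateChainFile /etc/letsencrypt/live/www.fyndjobb.se/chain.pem
</VirtualHost>
```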
Based on the cipher suites offered by the server, I don’t think Apache is configured to use the “modern” cipher suite from the Mozilla Config Generator.
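For comparison, the “modern” profile of that era pinned Apache to TLS 1.2 with ECDHE-only suites. Roughly, as an approximation from memory (generate the authoritative version from the Mozilla SSL Configuration Generator rather than copying this):

```apache
# Approximation of the era's "modern" profile: TLS 1.2 only,
# ECDHE-only AEAD cipher suites, server-side cipher ordering.
SSLProtocol         all -SSLv3 -TLSv1 -TLSv1.1
SSLCipherSuite      ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256
SSLHonorCipherOrder on
```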
I should note that my version of Chrome (57.0.2970.0, macOS Sierra) was able to connect to your site despite all this. If you’re still getting that error with your version of Chrome, it might be worth checking if there are any SSL interception proxies on your network or device (for example as part of your anti-virus software) and whether disabling those helps. These proxies are known to cause issues like this one occasionally.
I’m already using the fullchain with:
Yeah, I changed the cipher suite from “modern” to “intermediate” just in case it was causing the issue.
Support for passing the intermediate certificate via SSLCertificateFile was added in Apache 2.4.8. With 2.4.7, you’ll want:
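A sketch of the difference, assuming certbot's standard live paths:

```apache
# Apache >= 2.4.8: leaf + intermediates can go in one file
SSLCertificateFile      /etc/letsencrypt/live/www.fyndjobb.se/fullchain.pem
SSLCertificateKeyFile   /etc/letsencrypt/live/www.fyndjobb.se/privkey.pem

# Apache 2.4.7: the leaf and the chain must be split into two directives
SSLCertificateFile      /etc/letsencrypt/live/www.fyndjobb.se/cert.pem
SSLCertificateKeyFile   /etc/letsencrypt/live/www.fyndjobb.se/privkey.pem
SSLCertificateChainFile /etc/letsencrypt/live/www.fyndjobb.se/chain.pem
```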
OK, thanks for pointing that out; I’ve updated and restarted the server.
Did you look into whether any SSL proxies are enabled on your device? These are often responsible for intermittent or site-specific TLS errors, as recently demonstrated by Kaspersky.
It seems the issue is gone now that I load the chain separately. It’s odd, because that’s how the configuration was for weeks while the issue still occurred. I only changed it to the newer config an hour ago, because I was preparing an upgrade to Ubuntu 16.04 to see if that would help resolve the issue.
There’s a difference between “out there” and “Indexed by every search engine”
You’ve probably already done the following: make sure your robots.txt file denies access to [all] robots.
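A minimal robots.txt that denies access to all compliant crawlers:

```
User-agent: *
Disallow: /
```

Note that this only affects well-behaved crawlers; it doesn't remove the name from the Certificate Transparency logs themselves.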
This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.