Should I disable TLS 1.0 and TLS 1.1 support and get an A at SSL Labs?

Yes.
And in your case, all of the vhost configs override that default setting with the LE included file.
So it really serves no purpose (in your case).
But who is to say that you won’t add another site in the future and use certbot to create the TLS vhost config…
So it is difficult to say what is best.
I do know that keeping a good document/notes on what is done / when / and why can make life easier when trouble arises (which it will - sooner or later).
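For reference, disabling TLS 1.0/1.1 in Apache is a one-line change. This is only a sketch, assuming a typical certbot-managed setup where the Let's Encrypt include file (options-ssl-apache.conf) sets the protocol for every vhost that includes it:

```apache
# In the included options-ssl-apache.conf (or in a vhost, to override it):
# allow only TLS 1.2 and TLS 1.3 (TLSv1.3 needs Apache 2.4.x + OpenSSL 1.1.1)
SSLProtocol all -SSLv3 -TLSv1 -TLSv1.1
```

After editing, run `apachectl configtest` and reload; any vhost still including the old file will pick up the change automatically.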

1 Like

In the end, this fundamentally depends on you and on the clients you want to support.

Let me give you some examples:

But:

1 Like

I am tempted to remove the ciphers (?) that PayPal doesn’t support. Most visitors to my site will use PayPal.

However, it does raise the question of what the risks are of someone attacking my site via an obscure flaw. It is not exactly a website that I think anyone would particularly want to attack. The business risk of losing a customer whose browser doesn’t work is probably more significant.

If I do mess around, I will try to do it on a virtual host for my amateur radio club. It doesn’t matter so much if I screw that up.
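Before touching the live configs, you can also check locally what a modern client stack offers by default. This Python sketch just inspects a default ssl context restricted to TLS 1.2+ (the suite names come from your local OpenSSL build, not from any PayPal documentation):

```python
import ssl

# Build a default client context and restrict it to TLS 1.2+,
# mirroring what disabling TLS 1.0/1.1 on the server achieves.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# List the cipher suites this context would offer in a handshake.
suites = [c["name"] for c in ctx.get_ciphers()]
print(f"{len(suites)} suites offered, e.g. {suites[0]}")
```

Comparing that list against what a payment provider's servers accept (e.g. with `openssl s_client`) is a low-risk way to experiment before changing a production vhost.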

1 Like

Probably the biggest risk here is a downgrade attack.
Which can easily be mitigated by reducing the "downgradability attack surface area".
[patent pending - LOL]

1 Like

If you look at my website,

https://www.kirkbymicrowave.co.uk/

I think you would agree it’s not a particularly good one for someone to waste their time attacking.

  1. Nowhere to log in
  2. Nowhere asking for credit card details
  3. No JavaScript or cgi scripts
  4. Very small company
  5. Doesn’t have a huge number of backlinks
  6. No cookies

Dave

2 Likes

That's always wrong.

Every server is an interesting target. The attacker doesn't need to target the users of that site.

A hacked server can be used:

  • to hack other servers
  • to test things
  • to host phishing pages: check your spam mail and the phishing links; they often point to small, unknown sites, in a subdirectory
  • to send spam mail

And your main page isn't the only thing that matters: there are a lot of possible backends, or test installations with bugs / known problems.

3 Likes

On the other hand, someone attacking your site for the purposes @JuergenAuer mentions will probably not use a cryptographic attack that can be mitigated through ciphersuite changes, but rather a vulnerability in out-of-date software that can be mitigated through applying software updates on the server.

1 Like

Yes, I suspect so too. I am not convinced that for the majority of websites, there are any significant advantages to having SSL configured as strong as possible. It is very different if confidential information is passed through the website.

1 Like

I’m happy that the protocols are improving, that the web server and browser implementations are improving, that the defaults are improving, and that clients like Certbot can give you good configurations when turning on TLS. We didn’t have anywhere near this kind of agility (or understanding of cryptographic algorithms’ and protocols’ failures) 10 and 15 years ago.

But…

I’m not thrilled about users’ eagerness to get the A or A+ at SSL Labs, especially for personal and small business websites, especially with static content, when they don’t have an understanding of what the threats and goals are.

If you’re new to creating and running a web site, you have greater risks from things like not updating your software (we regularly get people on the forum who are using OSes that are out of official support, sometimes years later), or not mitigating web application-layer issues like cross-site scripting, content injection, etc. These problems are subtle, easy to get wrong, and often easy to exploit. Unsophisticated attackers have great motivation to exploit them in order to send spam, create botnets, try to steal credit card numbers if your site processes them, or try to steal hashed passwords in order to crack them and exploit other sites where your users are using the same passwords.

If your site only has static content, there’s probably still a greater risk from traffic analysis, which can often reveal which resources a particular person is accessing because they are different sizes. That’s a lot faster and cheaper than trying to mount a complex cryptographic attack, especially an active attack, especially an active attack that requires a lot of resources. Traffic timing and volume may already reveal the information an attacker might want to know about people’s behavior on the site, without having to crack any encryption at all. (“Cryptography is typically bypassed, not penetrated.” — Adi Shamir.)

Also, if your adversary is an intelligence agency, or sometimes even regular law enforcement, and you’re using a commercial VPS, the adversary might well be able to get the VPS operator to dump your VMs, because the VPS operator itself might be relatively easy to coerce or penetrate.

Yes, the TLS vulnerabilities that have been mitigated over time are real information-disclosure threats and some of them are exploitable in somewhat realistic threat models. But for application-layer vulnerabilities, the attacker doesn’t even need to be on-path.

We shouldn’t create the impression that having HTTPS makes your site secure in every sense, or that your SSL Labs score is an assessment of how secure your site is overall. The browser folks have been doing a nice job of presenting TLS as a bare minimum for Internet security (without it, a network attacker always wins with no significant effort). That doesn’t mean that fixing it up and modernizing it is also the low-hanging fruit for making your site more secure against realistic threats.

People, including me, used to have such a huge focus on key lengths, like, why would you ever not use the largest possible key length for everything!? An alternative framing might be: do you know if that key is even the weakest cryptographic link in your encryption? Are your systems getting software updates? Is some human being responsible for learning about and responding to threats to them in general? If best-practice recommendations change in any regard, will you hear about it? Are you also thinking about mitigating threats at other layers?

Maybe another take on this would be that systems like the SSL Labs rating have had a tremendously good effect because they’ve made people care about getting a good grade, where otherwise they wouldn’t have responded in the same way to the knowledge that there were problems with their systems. When I worked at EFF, I remember hearing that EFF’s periodic ratings of companies’ data protection practices with gold stars in reports like “Who Has Your Back?” had a dramatically bigger impact compared to just writing criticism of a particular company’s practices in a more abstract way. And it’s good to give people who care about best practices an opportunity to push them internally, for example by telling management that a site is publicly being ranked as not following them! But the moving target of SSL Labs ratings—while it’s informed by careful research and reasoning about cryptographic best practices—is probably not a very important target for small sites to keep up with, in comparison to other highly important and consequential best practices that they’re not getting ranked on.

I particularly remember thinking about this for HPKP because we would get people here on the forum talking about how important it was to deploy HPKP in order to get a better security ranking. But many of them didn’t know what it was or what its consequences for their sites were. As people who are familiar with it will recognize, that wasn’t a great idea because making a mistake with HPKP could easily have made your site completely inaccessible! Furthermore, the main threat that it mitigates is certificate misissuance by a trusted CA, but the interesting examples of this threat seemed to involve countries like Iran or the United States trying to take over CAs in order to target individual sites’ users with clandestine attacks. If you think that you’d like to mitigate that threat, it might help if you first know that you’re trying to do that and how HPKP attempted to deal with it. :slight_smile:

Making individual small site operators stressed because they didn’t adopt security measures whose purposes they don’t understand, or proud because they adopted them without understanding them, doesn’t seem like an effective way to materially advance web security. Perhaps there’s a way for SSL Labs to calibrate this better in terms of addressing different audiences. (I think the Mozilla configuration generator has a nice concept in explicitly presenting the cryptographic security vs. client compatibility trade-offs. If you choose a point on that continuum, it’s not that you have “bad security”, it’s that you made a particular trade-off. Yes, it would be cool to have everyone move toward that frontier so as not to have gratuitously weak security that doesn’t also make for improved compatibility!)

3 Likes

I updated all my sites to get an A, then changed the CAA on my radio club site

https://www.ssllabs.com/ssltest/analyze.html?d=www.dhars.org.uk

so that only Let's Encrypt could issue a certificate. (I always use the radio club domain for test purposes, as it's less critical if things go wrong. I don't charge the club any money for hosting it.) Then I thought of doing the same for my company, gave it some thought, and concluded it was really not worth the bother - especially since it would have required me to move from 123-reg to Godaddy or similar that supports CAA. Really, who is going to try to issue a fake certificate for my domain? I find it a bit hard to believe anyone would bother.
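For anyone curious what that restriction looks like: CAA is a single DNS record at the apex of the domain. A sketch, using the club domain from the post:

```
dhars.org.uk.    IN    CAA    0 issue "letsencrypt.org"
```

Your DNS host (not your registrar as such) just needs to support publishing CAA records; CAs are required to check them before issuance.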

So although I have an A for all my sites, I have decided not to bother trying to get an A+, as I don't see it is important to running the website. Far more important are the security patches and keeping the content up to date.

I expect some on the list will consider this bad, but as the owner of a small business, where I am director, web master, chief scientist, loo cleaner, general dogsbody, etc. etc., the SSL issues don't seem that important. The content of my site is static - I don't even have cookies.

1 Like

CAA isn't relevant to getting an A+.

HSTS is relevant. And HSTS is very important; preload is better.

2 Likes

Yes, I did realise CAA was not relevant, as some sites are A+ without it. I just thought I would do it, then started to think whether any of this was really necessary.

I just googled HSTS. Would I need a certificate with both www.mydomain.com and mydomain.com in the one certificate? Currently I have one with the www, and one without, so two certificates per domain name.

What is preload? I have used LD_PRELOAD to preload shared libraries and change a program's behaviour, but I suspect you are talking of something else.

1 Like

That's not relevant. What matters is that you use HSTS with includeSubDomains, so every subdomain uses https.

See your last check - https://check-your-website.server-daten.de/?q=kirkbymicrowave.co.uk - there is a link to the Google page https://hstspreload.org/. HSTS: one connection is required first; if that connection is http, it's critical. Preload: https is hardcoded, always.

But read the "Short FAQ" from "check-your-website":

  • If it is your first certificate: Grade B, without HSTS and without cookie errors.
  • If your certificate renewal works: Grade A with HSTS. If your certificate renewal really works, you may add the domain to the Google preload list. Then browsers always use https to connect to your domain. That's Grade A+.

You need an always-valid certificate.

If you use HSTS and your certificate is invalid (wrong domain name, expired, revoked), visitors can't create an exception in their browser, so it's impossible to visit your site. HSTS requires an always-valid certificate, so you shouldn't add HSTS unless you know your certificate renewal works.
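In Apache terms, the HSTS header discussed above is one directive (this is a sketch, assuming mod_headers is enabled via `a2enmod headers`):

```apache
# Conservative start while you verify that renewal works reliably:
Header always set Strict-Transport-Security "max-age=300"

# Final form required for submission to hstspreload.org
# (commit to this only once you are confident):
# Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
```

Starting with a short max-age limits the damage if renewal breaks; the long max-age, includeSubDomains, and preload tokens are what the preload list checks for.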

2 Likes

@drkirkby: there is an important difference in what a domain is for HSTS and what a domain is for HSTS preload, though.

For HSTS, each subdomain decides for itself.

For HSTS preload, only the apex domain has control, and every subdomain must abide.

So, preload is something you need to think really hard about, before you do it. And it's not easily or quickly undone.

6 Likes

Yep, that's an important difference. Thanks!

6 Likes

There seems a good argument for the HSTS / preload discussion to be another thread, as it’s quite different from my original question. I am interested in what people have to say on this, but it is a separate topic.

1 Like

I would agree if it had nothing to do with “getting an A at SSL Labs” - but I think it does weigh in on that grade.
[Yes, it has nothing to do with disabling TLS 1.0 and TLS 1.1]

1 Like

Now for my 2 cents on this subject:

Should you disable TLS 1.0 & 1.1 = YES [that goes for everyone who doesn’t absolutely require them]
Should you do it just to get an “A” = NO [you should do it because it is the right thing to do]
Should I use HSTS = YES [if you know what you are doing]
Should I try to preload my HSTS = Is that even an option anymore?

1 Like

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.