Actually, no. While your answer possibly satisfies david7634, I personally thought it unresponsive. The "Principles" state what LE is intended to do, not why.
I can't remember at the moment exactly where, but it prominently says on the LE website that a major goal is to encrypt 100% of the web. I have looked high and low but can't find a single word as to why this is, or should be, desirable. The founders and/or the implementors seem to have the peculiar notion that this is some universal truth that needs no explanation.
I registered my first domain in 1998, and built my first http site. As of a week ago, the latest successor to that site was still unencrypted. I would wager that there are millions of unencrypted web sites whose owners would be entirely mystified as to why (given that there is absolutely nothing confidential on any page of their site) they would be asked to spend even ten seconds of their time for a domain certificate.
I spent nearly three days last week puzzling over this and finally decided it was a good idea whose time had come, and my domains are now 100% encrypted. Before sharing my logic, I would love to hear the "official" reason of LE for this 100% goal. It seems to me that this statement of underlying logic should be in the first paragraph of the first page a visitor to LE would see.
ocahui, Thanks for your great posting. However, it belongs in a topic of its own, rather than on a thread discussing mostly documentation.
Your final request will not likely get the attention it deserves, posted as it is on a tiny byway of this forum. I am very curious about why you changed your mind about the value of secure websites. While I do not know the official reasoning behind wanting the web 100% secure,
I think that one motivation is spamming/malware/viruses. When the Web is 100% secure (including all Internet services, presumably), then there will be very few places for spammers and malicious people to hide or to work.
How frustrating it will be for people who drive around in a van sniffing packets from homes and businesses when all the content they will see is encrypted? And similarly for people who scrape websites for email addresses, put viruses into sites that use WordPress, etc. Anyway, that might be the hope.
Anyway, I recommend that you make your identical post in places where the core people of this project will see it. Then you might get an important discussion going.
SSL/TLS is not really intended to solve the spam or malware problem. The goal is to provide privacy/confidentiality (i.e. the packet sniffing problem you mentioned) and identity authentication (i.e.: yes, this page is actually what the domain you're currently visiting sent). Neither of these properties is particularly relevant when it comes to spam or malware prevention.
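To make those two properties concrete: a stock TLS client already enforces both of them by default. This is a minimal sketch using Python's standard-library `ssl` module (no network needed); it just inspects the defaults rather than performing a real handshake:

```python
import ssl

# The two guarantees TLS is actually designed for:
ctx = ssl.create_default_context()

# 1. Identity authentication: the server must present a certificate
#    that chains to a trusted root...
assert ctx.verify_mode == ssl.CERT_REQUIRED
# ...and whose name must match the host we asked to talk to.
assert ctx.check_hostname is True

# 2. Confidentiality: cipher suites that do no encryption ("NULL")
#    are excluded from the default cipher list.
assert not any("NULL" in c["name"] for c in ctx.get_ciphers())

print("default TLS context: authentication + confidentiality enforced")
```

Note what is absent: nothing here inspects the *content* being served, which is why TLS by itself says nothing about spam or malware.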
There's a small role that CAs are supposed to play with regard to malware, and that's related to revocation in case a CA is made aware that a domain is being used for things like malware distribution or phishing. A lot of people consider this scope creep. It's not particularly effective either, given that revocation is not reliable (soft fail means that a certificate is considered valid if the OCSP server is currently not reachable), far from instantaneous (long OCSP lifetimes and CRLs that rarely update) and often disabled by default anyway (Chrome). All mainstream browsers have features that are significantly better and faster at this job (Google/Mozilla with Safe Browsing, Microsoft with SmartScreen). Malware authors are used to switching domains every couple of hours already, and HTTPS everywhere isn't going to change that.
I think we should generally be careful about pushing for a 100% HTTPS web where we change the roles of CAs to be some sort of central authority that decides who's honest enough to be eligible for a certificate. I certainly don't want the internet to be full of malware, but having a bunch of private organizations be some sort of internet police is even more dangerous than that, in my opinion. That's a dangerous path to walk on.
I don't see any correlation between website scraping/malware distribution via hacked WordPress sites and HTTPS.
Fair enough, David. I will consider doing just that.
I was attracted to this thread by googling site:letsencrypt.org justify 100%. I too had found the documentation extremely frustrating. After reading almost the entirety of this very long thread, and skimming the rest, I reached some conclusions about the whole issue that I may expound on in another thread.
[quote="david7364, post:138, topic:15724"]I am very curious about why you changed your mind about the value of secure websites. While I do not know the official reasoning behind wanting the web 100% secure,
I think that one motivation is spamming/malware/viruses. When the Web is 100% secure (including all Internet services, presumably), then there will be very few places for spammers and malicious people to hide or to work.
How frustrating it will be for people who drive around in a van sniffing packets from homes and businesses when all the content they will see is encrypted? And similarly for people who scrape websites for email addresses, put viruses into sites that use WordPress, etc. Anyway, that might be the hope.
[/quote]
Actually, it will accomplish virtually none of that. HTTPS (TLS/SSL) does absolutely nothing to protect a web site. All it does is establish a (hopefully) secure communication channel between a client agent and a server agent. Sniffing packets of an http channel offers little of value since those packets are easily obtained using an http browser. (My point about the http sites)
That's a great side plot for a "hacker" B movie. It has nothing to do with reality, though.
The greatest knowledge is to know when you don't know something. You need to work on that.
For example, has it never occurred to you that using HTTPS changes absolutely nothing if your threat is your visitor attacking your insecure WordPress installation? HTTPS only protects a web site's communication with its visitors from other people who have the capability to acquire the traffic - which is not the malware mafia or spammers either.
Unfortunately, you need an understanding about how the Internet works, which can't really be taught in a handful of posts on an Internet forum. I don't mean this in an insulting way, but recognize what you don't know and stop pretending you know.
To reply to the topic: The main cause for the desire to encrypt everything has been the Snowden revelations, which uncovered the massive global-scale surveillance of Internet traffic. If one has been following that and the articles about LE in the tech press, then its purpose would be obvious, but maybe it really should be spelled out for the people who missed the most important scandal of the past 3 years.
pfg, Thank you for taking the time to describe what secure sites can do and can’t do for us. “Malware authors are used to switching domains every couple of hours already, and HTTPS everywhere isn’t going to change that.” You echo thoughts I have had. I have one website that has been a particular target of Russian visitors. You can watch them allocating new domain names in batches. Creating SSL credentials would be just as easy for them, particularly if they can get them automatically and for free.
Okay, then, can you tell me what is the LE motivation for making the Web 100% secure? There is some price to that–at least the overhead for doing all that encryption all the time for all the websites.
Even sites that wouldn’t seem to need encryption benefit from a security standpoint. Malicious code can be injected into websites via a man in the middle, or even by any routing device along the path.
I hope someone posts an honest comparison page soon, so we can measure the actual difference.
Transforming the Web to 100% secure will have a large cost for almost every part of the Web due to the encryption alone. So, what is Let’s Encrypt’s reason for pushing 100% secure?
Added 7/8/16: No, it’s not a hoax, it’s just missing the proper explanation. See the corrections by others below.
vizzaccaro, Please explain both claims. I don’t believe either one. If either were true, we would have heard about them as serious problems. MITM attacks only work when encryption is used, and require someone to have access to both sides of the conversation, which is not normally possible. Many routing or WiFi devices are insecure out of the box, but can be secured easily. I have posted a security advisory about one wireless router (the one I bought) to the main USA security laboratory and mailing list. Of all the security problems I reported, none affect me, because I have configured it properly, using AES, disabling certain features, providing an SSID, etc.
Insecure sites can be read by anyone, but they are not specially vulnerable to malicious code. The real vulnerability for insecure sites is WordPress, because malicious code can be inserted into WP sites run without attention to security issues.
It’s not a hoax. Nobody is interested in building new insecure protocols in this space. So there is no “unencrypted” SPDY or HTTP2. The standard says what it could look like if it existed, but nobody has implemented it nor do they plan to.
Thus the actual choice people really have is between plaintext HTTP 1.1 and HTTP2 which is always encrypted. It makes no sense to insist on an “actual difference” that doesn’t relate to a choice anybody actually has. If you want to go from Grand Central to the Empire State Building you could walk, or catch a cab, or use the subway, or do some combination. But “travel by space shuttle” isn’t an option, insisting that “space shuttle would be faster so that should be included” is nonsense because that’s not actually an option.
tialaramex, Interesting. I hadn’t known about this, or found it in my quick searches. However, I challenge you to show that most secure websites use either HTTP/2 or SPDY. SPDY, at least, is a proprietary product of Google, and I doubt that webmasters of secure sites want to give Google a monopoly on speeding up secure websites. I would hazard the guess that 80% or more of current secure websites do not use either technology.
Almost everybody on the network between the two victims has "access to both sides of the conversation", there are very few unidirectional links on the Internet, and almost all of them are vulnerable to at least the owner of that link and usually also all other users of the link intercepting or changing the contents of messages. If you own an old-fashioned land line telephone you have probably discovered that you can overhear conversations by simply picking up another extension elsewhere in the house. If you could do a convincing impersonation of one of the parties on the call you could probably make your own call and pretend to "be" them most effectively. Most of the Internet works like that.
Man in the Middle attacks work when neither side is able to tell the other from a stranger. Far from being impossible without encryption, it's much easier without encryption. In fact it's so trivial that you can download an Android phone app that pranks people using the same network as you to access HTTP web sites, flipping photographs upside down, replacing words, that sort of thing.
Further away on the network, we know that several major ISPs have been caught replacing or adding advertisements in web pages using a Man in the Middle attack on plain HTTP websites, and that China is regularly implicated in the "Great Cannon", an attack in which unencrypted content is replaced with scripts directing all the visitors to flood some seemingly unrelated web site, usually in the West, again using Man in the Middle.
I appreciate that your narrow personal focus means you probably don't care about any of this. But globally we DO care and we're trying to fix the problem rather than just shrugging it off. Having people insist that without any expertise in the area they've decided it must be a "hoax" and there's no reason to do anything is unhelpful.
The site is comparing worst-case usage of HTTP (many small resources and no domain sharding, etc.) with a use case where HTTP/2 performs at its best. It's definitely not a real-world benchmark, but I wouldn't call it a hoax either.
Most realistic benchmarks determine that HTTP/2 (which implies HTTPS in practice) performs (at least) on-par with sites using HTTP with typical optimizations, like domain sharding when the number of resources required to render the page is big, etc. It performs significantly better than non-optimized HTTP pages in many use-cases. To put it simply, you get the encryption and authentication for free, and you get to avoid having to do some ugly things to work around the shortcomings of HTTP.
Google provided some numbers in the past that show their CPU usage increased by about 1% and the network overhead by 2% when they enabled SSL for Gmail. Modern CPUs (especially server CPUs) are capable of handling thousands of handshakes per second, and the symmetric encryption used by the most common ciphers can largely be offloaded to things like AES-NI. If you're YouTube or Netflix, you might have to put some thought into this, but otherwise it doesn't really matter. https://istlsfastyet.com/ provides a very good summary of this topic. Things are only going to get better with TLS 1.3 which includes Zero-RTT handshakes.
This statement doesn't make any sense. A MitM attack is useful any time you can manipulate the content one of the two parties receives, while making it look like it came from the other party. This affects unencrypted communication and communication that uses broken crypto, but if you're using modern TLS, you're fine.
Insecure routers are only a small part of the problem. There are any number of network hops between you and the website you're visiting, and each is capable of reading the traffic, modifying the content, adding code (as in JavaScript), malware, ads, etc. This could be anyone from your ISP, the public WiFi you're on, any of the upstream network providers your ISP or the website's ISP use, state actors, etc. This is not just theory - many ISPs have started injecting tracking code or even ads into plaintext traffic. Modern TLS mitigates these issues.
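Those injection incidents require no exotic capability; a few dozen lines on any on-path box suffice. Here's a hedged, self-contained Python sketch (all names invented, everything runs on localhost): a toy plaintext "server", a "middlebox" relaying the bytes between client and server, and a client that never notices the swap. With TLS, the tampered bytes would simply fail authentication.

```python
import socket
import threading

def toy_server(conn):
    # a plaintext "web server" that answers every request the same way
    conn.recv(1024)
    conn.sendall(b"HTTP/1.0 200 OK\r\n\r\n<p>Hello, visitor!</p>")
    conn.close()

def middlebox(client_conn, server_addr):
    # an on-path node: forwards the request, silently rewrites the response
    upstream = socket.create_connection(server_addr)
    upstream.sendall(client_conn.recv(1024))
    body = b""
    while chunk := upstream.recv(1024):
        body += chunk
    client_conn.sendall(body.replace(b"Hello, visitor!", b"Buy these ads!"))
    upstream.close()
    client_conn.close()

# wire everything up on localhost
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
mbx = socket.socket()
mbx.bind(("127.0.0.1", 0))
mbx.listen(1)
threading.Thread(target=lambda: toy_server(srv.accept()[0])).start()
threading.Thread(target=lambda: middlebox(mbx.accept()[0], srv.getsockname())).start()

# the client talks "to the server" through the middlebox
client = socket.create_connection(mbx.getsockname())
client.sendall(b"GET / HTTP/1.0\r\n\r\n")
page = b""
while chunk := client.recv(1024):
    page += chunk
client.close()
print(page.decode())  # the client receives the tampered page, none the wiser
```

The client has no way to tell the modified response from the real one; that indistinguishability, not content filtering, is the problem TLS's integrity guarantee solves.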
HTTP/2 supersedes SPDY; it's essentially the Internet-standard version of SPDY. There aren't too many recent market share statistics for HTTP/2. Netcraft published this in October 2015:
However, 29% of SSL sites within the thousand most popular sites currently support SPDY or HTTP/2, while 8% of those within the top million sites do.
Wikipedia enabled HTTP/2 quite recently, and many CDNs have started supporting it recently as well, so I imagine there'll be quite an uptick soon.
The benefit of encrypting traffic isn’t limited to a website that implements HTTPS, it also benefits the website’s visitors.
Encryption means both sides of a conversation are relatively confident that no “man in the middle” is able to listen in to their conversation or alter it. This has obvious security benefits (preventing session hijacking, content tampering or forging; protecting the transmission of personal/confidential information) but more importantly brings forth a concept of confidentiality by limiting eavesdropping.
Without the ability to decrypt the traffic, a man in the middle can only determine the HOST, not the full URL, of a requested resource. This is an incredibly important feature for many parts of the world where there are no free-speech rights and people are routinely jailed for dissenting opinion or thought.
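This is easy to check directly. The sketch below (Python stdlib only; the hostname and path are made up for the demonstration) captures the raw TLS ClientHello a client emits and shows that the hostname appears in cleartext (the SNI extension) while the requested path never does:

```python
import socket
import ssl
import threading

HOSTNAME = "example.com"          # made-up host for the demonstration
SECRET_PATH = b"/dissident-blog"  # made-up path the client intends to request

# a plain TCP "eavesdropper" that records the first bytes the client sends
listener = socket.socket()
listener.bind(("127.0.0.1", 0))
listener.listen(1)
captured = {}

def eavesdrop():
    conn, _ = listener.accept()
    captured["client_hello"] = conn.recv(8192)  # raw TLS ClientHello bytes
    conn.close()

t = threading.Thread(target=eavesdrop)
t.start()

ctx = ssl.create_default_context()
raw = socket.create_connection(listener.getsockname())
try:
    # The handshake fails (our listener speaks no TLS), but the
    # ClientHello -- including SNI -- has already left the client.
    ctx.wrap_socket(raw, server_hostname=HOSTNAME)
except (ssl.SSLError, OSError):
    pass
finally:
    raw.close()
t.join()

hello = captured["client_hello"]
print(HOSTNAME.encode() in hello)  # True: the host is visible on the wire
print(SECRET_PATH in hello)        # False: the URL path never appears
```

The path, query string, cookies, and page content only ever travel inside the encrypted channel; an observer on the wire sees which site you contacted, but not which page you read.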
tialaramex, Thank you so much for your in-depth analysis. You are right, I could scarcely have been more incorrect. I don’t know how I didn’t realize all these points before. I especially enjoyed your examples and your analogy of the extension telephone.
pfg, Thank you so much for correcting my mistakes and providing deeper understanding. I’m glad that Google’s proprietary solution will be replaced by the HTTP/2 standard, and this does seem a powerful motivation (among the others) for using HTTPS instead of HTTP. I like the articles you linked to. I hope everyone else who doesn’t know this stuff yet will find out about it soon. Too many important topics are only learned through random chance these days.