If non-secure content hasn’t been removed before LE is installed on a website, users will remain unprotected and browsers will warn about mixed content (https://developer.mozilla.org/en-US/docs/Security/MixedContent), and in some cases block all non-encrypted content – presumably including ads. If not addressed by LE, this naturally could lead to lots of grief and quite possibly a very significant loss of revenue. Can the process of changing all links to HTTPS be automated and then verified by LE as part of its installation? If not, can LE by default be set to verify all links and refuse to install itself on sites containing non-secure items?
Well, there’s the draft for the Content Security Policy Upgrade Insecure Requests header (http://www.w3.org/TR/upgrade-insecure-requests/), but I’m not sure which browsers support it yet. It was mentioned in the Let’s Encrypt video at https://www.youtube.com/watch?v=0JioB7rNpvI
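For anyone curious what that header actually looks like on the wire, here’s a minimal sketch: a WSGI app that sends the `upgrade-insecure-requests` CSP directive from that draft, which asks supporting browsers to rewrite `http://` subresource URLs to `https://` before fetching them. The example page body and app name are just illustrative.

```python
# Minimal sketch: a WSGI app that sends the CSP directive from the
# upgrade-insecure-requests draft. Supporting browsers will upgrade
# the http:// image URL below to https:// before requesting it.
def app(environ, start_response):
    headers = [
        ("Content-Type", "text/html; charset=utf-8"),
        ("Content-Security-Policy", "upgrade-insecure-requests"),
    ]
    start_response("200 OK", headers)
    return [b"<img src='http://example.com/pic.png'>"]
```

Browsers that don’t implement the draft simply ignore the header, so sending it is harmless either way.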
Pretty sure the onus is still on the website owner to ensure their web apps handle mixed content warnings prior to implementing HTTPS site-wide. Mixed content warnings and ad networks’ support for HTTPS are pretty much what I see as the top barriers to HTTPS and SSL certificate adoption, especially if your web app/site relies on user-generated, uploaded, linked or posted content…
Upgrade-insecure-requests is already supported by Chrome and Firefox. Out of the box, the official Let’s Encrypt client will send that header to browsers which indicate that they support it, and will give site admins the choice of (a) no redirect from HTTP; (b) a conditional redirect for clients that support upgrade-insecure; and (c) redirects for all clients. Either (a) or (b) will be the default, marked as “easy” mode; (c) will be “secure” mode.
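To make the three options concrete, here’s an illustrative sketch (not the actual Let’s Encrypt client code) of the redirect decision. It relies on the fact that browsers supporting the draft send an `Upgrade-Insecure-Requests: 1` request header on navigations, which is what makes option (b) possible; the function name and `mode` values are made up for the example.

```python
# Illustrative sketch of the three redirect modes described above.
# Not from the actual Let's Encrypt client; names are invented here.
def should_redirect(request_headers, mode="conditional"):
    """Decide whether to redirect an HTTP request to HTTPS."""
    if mode == "always":      # option (c), "secure" mode
        return True
    if mode == "never":       # option (a), no redirect from HTTP
        return False
    # option (b): redirect only clients that advertise support for
    # upgrading their own subresource requests to HTTPS.
    return request_headers.get("Upgrade-Insecure-Requests") == "1"
```

The appeal of (b) is that clients which would choke on mixed content after a forced redirect are simply left on HTTP, while capable browsers get HTTPS plus automatic subresource upgrading.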
Though it helps enormously, upgrade-insecure won’t eliminate all mixed content situations. To handle mixed content really thoroughly, we probably need to go further than that; in particular, we should consider (1) standing up CSP report-mode servers that detect mixed content situations from the client side and (2) offering to install rewriting proxies that apply HTTPS Everywhere rulesets on the server side.
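Idea (1) can be sketched with CSP’s report-only mode: the server sends a `Content-Security-Policy-Report-Only` header, so browsers report (but don’t block) insecure subresource loads, letting the operator find mixed content from real client traffic. The policy string and `/csp-report` endpoint here are just assumptions for illustration, not a recommended production policy.

```python
# Hedged sketch of idea (1): a report-only CSP that asks browsers to
# report any non-HTTPS subresource loads to a collection endpoint.
# The endpoint path and the permissive policy are illustrative only.
def report_only_header(report_endpoint="/csp-report"):
    policy = (
        "default-src https: 'unsafe-inline' 'unsafe-eval'; "
        "report-uri " + report_endpoint
    )
    return ("Content-Security-Policy-Report-Only", policy)
```

Because the policy is report-only, nothing breaks for visitors while the reports accumulate; once the violations are fixed, the same policy could be promoted to an enforcing `Content-Security-Policy` header.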
But both (1) and (2) are complex tasks that aren’t on our definite roadmap.
Thanks @pde. Is there any indication that upgrade-insecure-requests is on the roadmaps of other web browsers?
My GA stats show Chrome and Firefox account for most of my visitor sessions, but that still leaves other web browsers heh
Have you considered proxying and caching user-uploaded content? The site I work for does that to avoid warnings. Modern forum software often supports it out of the box.
@Zenexer yeah Xenforo is what I use for my main forums and that has inbuilt image proxy to handle https mixed content. However, for other forum or web apps it’s still a concern.
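For forums or web apps without a built-in proxy, the core of a camo-style image proxy is just a URL rewrite: plain-HTTP image URLs in user posts get replaced with links to an HTTPS proxy endpoint, signed with an HMAC so the proxy can’t be abused as an open relay. The `/img-proxy` path and the secret below are made-up examples, not from Xenforo or any real app.

```python
import hashlib
import hmac
from urllib.parse import quote

# Hedged sketch of a camo-style image proxy rewrite. The secret and
# endpoint path are illustrative; a real deployment would use its own.
SECRET = b"change-me"

def proxy_url(image_url):
    """Rewrite an http:// image URL to point at an HTTPS proxy."""
    if not image_url.startswith("http://"):
        return image_url  # already HTTPS (or relative): leave it alone
    sig = hmac.new(SECRET, image_url.encode(), hashlib.sha256).hexdigest()
    return "/img-proxy/%s?url=%s" % (sig, quote(image_url, safe=""))
```

The proxy endpoint then verifies the signature, fetches the original image server-side, and serves it over the site’s own HTTPS connection, so posts with hotlinked HTTP images no longer trigger mixed content warnings.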
It’s worth noting that those warnings are there for a reason: mixed content can render HTTPS completely pointless, especially when the insecure content is a script or frame. Images are a bit more acceptable, but they still leak information about visitors’ browsing habits.
Yeah, I understand that.
For me, I’ve been using forums most of the time, and for many of them it’s user-linked or embedded non-HTTPS external image URLs in posts that are troublesome on an HTTPS-enabled forum.
Then again, if every site online used HTTPS, mixed content would be a thing of the past.
My fear is always that sites ignoring or disabling the warnings either aren’t smart enough to avoid compromising security entirely, or will eventually slip up.
The only way there would be a combination of education and marketing.
I know Firefox and Chrome both block “active” mixed content – active content being scripts, frames, CSS, etc.
Other browsers may as well, but I don’t use them.
You have to choose to allow it by clicking an icon in the address bar (on the left-hand side in Firefox, and on the right-hand side in Chrome).
You can test that here: https://mixed.badssl.com/mixed/script/
Thanks @rugk for that link to http://caniuse.com/#search=upgradeinsecurerequests