This is simply a compromise inside of the CA/Browser Forum which has not yet been revisited.
LE simply needs a leg inside the Tor network by running a Tor client.
I also don’t see any necessity for EV certs. Ownership for .onion names is even harder to spoof than for regular names.
The argument for EV is that unlike Internet domain names, the .onion Tor names are gibberish. So whereas knowing you’re really connected to eBay.com or Google.com is meaningful, we have to expect that facebookhomepage.onion seems like it might be what you wanted as much as facebookcorewwwi.onion does, and a certificate isn’t helping you know which one is the famous social network.
But knowing that DV certs simply certify a name and not organizations, I don’t see a problem using them for .onion. It’s really no different than normal TLDs. A DV cert is “DNS name <-> public key” and that doesn’t change just because we’re below .onion.
we have to expect that facebookhomepage.onion seems like it might be what you wanted
There’s basically the same problem there with normal DNS names - many (most?) people wouldn’t notice that a certificate wasn’t EV if they stumbled upon facebookhomepage.ws and it looked like the Facebook page.
The bigger, seemingly unstated problem is that many .onion operators are running sites that explicitly need privacy (political dissidents, etc.). Acquiring an EV cert is the absolute last thing that those site operators are going to do, yet we need defense in depth for the potentially vulnerable users of those sites in case the Tor protocol gets compromised, either directly or with traffic correlation attacks. It’s hard to imagine that the privacy conflict isn’t apparent to anybody familiar with both Tor and EV certs which just makes it appear to be “a nice way of saying ‘no’”.
I did read that there’s a concern in the CA/B Forum that the current 80/128-bit .onion addresses could be brute-forced with tens of millions of dollars of hardware. I think it’s important to remember that ten million dollars can also buy you lots of access on the social side; technical attacks and social attacks are equally important in the security ecosystem.
I hope this gets revisited soon, as Let’s Encrypt certs would be a useful addition to the Tor ecosystem.
Hidden Services (v2) are clearly weaker than their v3 counterparts, which are currently in alpha and will become available in future releases of Tor. Since the new services are available now (in alpha), what process needs to be started to evaluate them, identify the areas for improvement, and then bring that feedback to the Tor Project folks and get them to address those concerns?
The question of whether or not we should use PKI-backed TLS in Tor has already been answered: it’s a big yes – and Facebook is pretty much the only TLS interaction most folks have on Tor, because EV certs aren’t cheap (and in most cases go against the nature of Tor/hidden services in general).
The question now is: is a 256-bit ed25519-keypair-based cryptosystem enough to root trust in? If not, I’ve got a story to tell you about BGP and SHA-1, or DNS and a phone call to a registrar… I think the question of how you can prove ownership of a hidden-service-backed system is as simple as making sure the LE registration authority can speak Tor.
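To make that concrete: the validation itself wouldn’t need to change much. For an HTTP-01 challenge the CA fetches a token-derived file from the host and checks the key authorization; for a .onion host it would simply do that fetch through a Tor client. A minimal sketch of the HTTP-01 pieces per RFC 8555 and RFC 7638 (the hostname and token below are hypothetical placeholders, and the thumbprint helper assumes the dict contains exactly the required JWK members):

```python
import base64
import hashlib
import json

def jwk_thumbprint(jwk: dict) -> str:
    # RFC 7638: SHA-256 over the JWK's required members,
    # serialized with lexicographically ordered keys and no whitespace.
    canonical = json.dumps(jwk, sort_keys=True, separators=(",", ":"))
    digest = hashlib.sha256(canonical.encode()).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode()

def http01_challenge(onion_host: str, token: str, account_jwk: dict):
    # The URL the CA's validator must fetch -- via a Tor SOCKS
    # proxy when the host is a .onion name -- and the key
    # authorization the server must publish there.
    url = f"http://{onion_host}/.well-known/acme-challenge/{token}"
    key_authorization = f"{token}.{jwk_thumbprint(account_jwk)}"
    return url, key_authorization
```

Nothing here is .onion-specific except how the validator reaches the host, which is the point: the CA only needs a leg inside the Tor network.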
The last piece, which I think is relevant for lots of reasons, is: will .onion be designated a reservation/exception by ICANN, so that we can make sure there will never be .onion addresses on the clearnet? Facebook tried to make this a thing, and one of their own published RFC 7686, which expands on the IANA rules regarding special-use domains to make sure it’s agreed that this is the case.
And if you ask ICANN about it, they believe in the idea but want to make sure the process of reserving names through RFCs is sound. As of August the question is outstanding:
Question 2.2.3: Do you think Special Use Domain Names should be added to the Applicant Guidebook section on reserved names at the top level to prevent applicants applying for such labels?
– A good example would be .onion. Most people would like special use domain names to be kept reserved.
– Question: How did the IETF RFC 6761 come into being? Response: Understanding that it would have been approved by the IETF. The bottom-up consensus building process is extremely robust. The process for all of the special use names in IETF is going under review. There are two ways that things go through: 1) through the working group and if it is judged not to be an end run it goes into a last call as an Internet draft in the IETF community before becoming an RFC. What is being talked about is how to do better coordination. Also, what is being discussed is why people need top-level names for their special use names, rather than second-level. It is quite a lengthy process. We were notified when .onion was going through.
What we are left with is: do the CAs now have enough confidence in the system that this is acceptable for DV certs? Do they want to wait for ICANN to officially sanction the IETF RFC process? Is there something more that Tor developers, or the community at large, can do to make this happen?
A thread on allowing DV issuance for next-generation hidden services was recently started by @schoen on the cabfpub mailing list. There was some discussion on validation methods, but no one seems to have argued against allowing DV issuance, so hopefully a future ballot will pass.
I wasn’t aware of the thread! Looks like there’s been some progress, though I’m not sure how to tell if there’s been a ballot proposed for this yet.
There hasn’t been a ballot proposed. When there is, it will be proposed by a member of the Forum and will have a ballot number.
Isn’t it kinda pointless?
Onion addresses already are key fingerprints. What more validation shall you need?
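Right – a v3 onion address essentially *is* the public key. Per Tor’s rend-spec-v3, the address is the 32-byte ed25519 public key plus a 2-byte checksum and a version byte, base32-encoded. A sketch of the derivation (the all-zero key in any example usage is only a placeholder):

```python
import base64
import hashlib

def onion_v3_address(pubkey: bytes) -> str:
    """Derive a v3 .onion address from a 32-byte ed25519 public key.

    Per Tor's rend-spec-v3:
      address  = base32(pubkey || checksum || version) + ".onion"
      checksum = SHA3-256(".onion checksum" || pubkey || version)[:2]
    """
    assert len(pubkey) == 32
    version = b"\x03"
    checksum = hashlib.sha3_256(b".onion checksum" + pubkey + version).digest()[:2]
    return base64.b32encode(pubkey + checksum + version).decode().lower() + ".onion"
```

So the name is self-authenticating: anyone who knows the address can verify the service’s key, which is why a DV cert adds little beyond browser-compatibility plumbing.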
(That is, unless you want to prove your real identity)
What about when you want the HTTP/2 speed upgrade? Browsers don’t negotiate HTTP/2 over plain HTTP, only over TLS.
For sites that use HSTS, it’s important to have their .onion address on the certificate as well, so that when users visit their site via tor, their browser doesn’t show a big, scary warning (as is currently happening with my site: priveasy6qxoehbhq5nxcxv35y6el73hpzpda7wgtnfe5qaspemtl6qd.onion). It’s true that it’ll be redundant in terms of end-to-end encryption, but it’s still important in order to maintain security on the clearnet site, and “validates” that the darknet site isn’t a phishing site or some clone, and is actually associated with the trusted clearnet site.
@NTroy Aren’t HSTS headers host-dependent? I.e., when you surf to the .onion site, which is a different hostname than the clearnet site, it shouldn’t generate a warning?
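Right – HSTS policy is keyed on the exact host name (RFC 6797), so at the web-server layer it can be scoped per vhost. An illustrative nginx fragment, with placeholder server names:

```nginx
# Clearnet vhost: serve HSTS as usual.
server {
    listen 443 ssl;
    server_name example.com;
    add_header Strict-Transport-Security "max-age=63072000" always;
}

# Onion vhost: same site, but no HSTS header. Because HSTS is
# per-host, the clearnet policy never attaches to the .onion name.
server {
    listen 80;
    server_name youronionaddressplaceholder.onion;
}
```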
Yes! You are absolutely right. The only problem, however, is that many web application frameworks do not allow for a distinction to be made between hosts when it comes to serving headers. cough Django cough Therefore, most projects will be stuck between a rock and a hard place with this issue, and have to choose between serving their entire site insecurely, or using an entirely new backend instance to serve requests made over Tor, which is simply not practical for small/medium sites. (They could also attempt to spoof data through uWSGI, Django modifications, etc… but we’ll ignore that for now.)
Since I made the original comment, there have been two new developments worth mentioning:
1.) Tor Project introduced the Onion-Location header, which helps to reduce some of the worries I’d previously commented about (although it does not entirely eliminate them)
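For reference, advertising the onion service that way is a single response header on the clearnet site; Tor Browser then shows its “.onion available” prompt. An nginx example with a placeholder onion hostname:

```nginx
# Inside the clearnet HTTPS server block:
add_header Onion-Location http://youronionaddressplaceholder.onion$request_uri;
```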
2.) Personally, I was able to customize my Django installation to spoof the request scheme when serving the onion site. For obvious reasons, although functional, this is most certainly not preferable.
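One framework-agnostic way to do that scheme spoofing, rather than patching Django itself, is a small WSGI middleware that marks .onion requests as already secure, so the app’s http -> https redirect never fires. This is a hypothetical sketch of my own, not a standard Django or uWSGI feature:

```python
class OnionSchemeMiddleware:
    """Mark WSGI requests arriving on a .onion Host as already-secure.

    Frameworks that decide on http->https redirects by consulting
    wsgi.url_scheme (e.g. via Django's request.is_secure()) will then
    leave Tor traffic alone. Only safe if the .onion vhost is really
    reachable solely through the onion service.
    """

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        host = environ.get("HTTP_HOST", "").split(":")[0]
        if host.endswith(".onion"):
            environ["wsgi.url_scheme"] = "https"
        return self.app(environ, start_response)
```

You would wrap the framework’s WSGI application with this class in the onion deployment only; clearnet traffic keeps its real scheme.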
I need a (stupid) SSL certificate for a hidden service to be able to send push notifications (https://developer.mozilla.org/en-US/docs/Web/API/notification). As simple as that. It is technically easy to do, although unsupported.
Why doesn’t Tor Browser itself insert its own CA (name-constrained to .onion domains) and automatically issue a certificate for any onion domain with a matching public key?
That would involve putting the certificate authority’s private key in the browser. If I were an attacker, I’d just download the browser source code, extract the private key, and sign certs for whatever .onion domains I’d like.
At that point, we could ask why browsers don’t just treat onion (v3) services as secure contexts? Or even just the “Tor Browser” alone could do so, if it is the de facto browser for the product.
It’s not clear to me why, if onion services make the same confidentiality/authenticity/integrity promises as TLS, layering TLS over it is considered productive or beneficial:
All traffic between Tor users and onion services is end-to-end encrypted, so you do not need to worry about connecting over HTTPS.
Because this is a common perspective, I want to make it very clear why supporting https for .onion domains is important: it makes sense to add an https cert for Onion Services when adding a .onion domain as an alias to an existing, complex site that has http -> https redirects buried everywhere.

Personally, I'm trying to add a .onion as a secondary domain for all my existing wordpress sites. Years ago, I migrated from http to https when Let's Encrypt first came out. When I made that transition, I checked all the boxes to "redirect http to https" everywhere I could: in my web server's config, in the caching reverse proxies, in the CMS core config, in various plugins, themes, etc. Now that I'm trying to add a .onion to my existing websites, I'm finding that some of my sites work OK, but others stubbornly refuse to serve traffic over http. They just 301 redirect to https://xyz.onion (which of course doesn't work). Isolating and changing this behavior is non-trivial, especially for large sites.

There are huge privacy benefits to be gained by site admins making their existing websites accessible to Tor users through onion services, but (after all our great efforts to migrate from http to https in past years) it's not always trivial to just point a .onion at a website and have all the infrastructure we hardened accept serving over http again.
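At the web-server layer, at least, the redirect can usually be scoped away from the onion vhost even when the application layer is stubborn. An illustrative nginx sketch, with placeholder hostnames:

```nginx
server {
    listen 80;
    server_name example.com;
    # Clearnet http -> https redirect stays in place:
    return 301 https://$host$request_uri;
}

server {
    listen 80;
    server_name youronionaddressplaceholder.onion;
    # No redirect here: serve the app directly over http,
    # relying on the onion service's own end-to-end encryption.
    root /var/www/site;
}
```

The harder cases are the 301s issued by the CMS or its plugins, which a server-level split like this doesn’t touch.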
If ACME could support issuing certs for .onion sites, this would lower the barrier of entry for sysadmins to be able to bring their existing websites onto the tor network, which would be another huge benefit for the privacy of Internet users everywhere.
I definitely agree – I’m currently running into this issue trying to make my Mastodon instance available to the Tor network…