Letsencrypt for .onion?

Hi,

DigiCert issues certs for .onion. I would like to know if Let’s Encrypt also issues certificates for .onion.

1 Like

.onion is not in the Internet DNS zone and it’s not recognized as an official TLD. As far as I know, DigiCert initially supported .onion, but as a result of the CA/B Forum’s deprecation of internal server names, they announced they would revoke those certificates by November 2015.

For the time being, issuing certificates for .onion is implicitly discouraged by the CA/B Forum, so I don’t think LE supports it (considering they don’t support internal server names, and that validation would require a proper proxy to connect to a .onion site).

2 Likes

Since October 2015, .onion has been an official special-use domain name: https://tools.ietf.org/rfc/rfc7686.txt

1 Like

Thanks @mrtux, I definitely missed it.

1 Like

Here is another update from DigiCert:
https://blog.digicert.com/onion-officially-recognized-special-use-domain/

Sounds as if this is all a little too early to be useful to the bulk of users. Also, it sounds as if, for the time being, .onion requires EV certificates, which Let’s Encrypt by definition cannot issue. I am sure that, over time, the rules will be worked out and certificates for .onion will become more readily available; maybe at that point Let’s Encrypt can join the action.

I’m not sure I understand why you would absolutely need an EV certificate for .onion?

The complicated part would definitely be the ownership verification. Any idea how this could work?

This is simply a compromise inside the CA/Browser Forum that has not yet been revisited.

1 Like

LE simply needs a leg inside the Tor network by running a Tor client.
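
As a rough illustration (this is not Let’s Encrypt’s actual validation code; the challenge token is made up and the onion name is just the example mentioned elsewhere in this thread), fetching an HTTP-01-style challenge file through a local Tor client’s SOCKS proxy is about this much work:

```python
# Rough sketch: fetch an HTTP-01-style challenge from a .onion service
# through a local Tor client's SOCKS proxy (Tor's default is 127.0.0.1:9050).
# Requires `pip install requests[socks]`.
import requests

ONION = "facebookcorewwwi.onion"      # example address mentioned in this thread
TOKEN = "example-challenge-token"     # hypothetical token, for illustration only

proxies = {
    "http": "socks5h://127.0.0.1:9050",   # socks5h = resolve the name inside Tor
    "https": "socks5h://127.0.0.1:9050",
}
url = f"http://{ONION}/.well-known/acme-challenge/{TOKEN}"
resp = requests.get(url, proxies=proxies, timeout=60)
print(resp.status_code, resp.text[:120])
```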

I also don’t see any necessity for EV certs. Ownership for .onion names is even harder to spoof than for regular names.

1 Like

The argument for EV is that unlike Internet domain names, the .onion Tor names are gibberish. So whereas knowing you’re really connected to eBay.com or Google.com is meaningful, we have to expect that facebookhomepage.onion seems like it might be what you wanted as much as facebookcorewwwi.onion does, and a certificate isn’t helping you know which one is the famous social network.

But given that DV certs simply certify a name and not an organization, I don’t see a problem using them for .onion. It’s really no different from normal TLDs. A DV cert is “DNS name <-> public key”, and that doesn’t change just because we’re below .onion.
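
To make that binding concrete, here’s a small sketch (assuming the Python `cryptography` package; "cert.pem" is a placeholder path) that prints exactly what a DV cert asserts: the SAN DNS names and the public key they’re bound to.

```python
# Sketch: the only binding a DV certificate asserts is SAN DNS names <-> key.
from cryptography import x509
from cryptography.hazmat.primitives import serialization

with open("cert.pem", "rb") as f:                # placeholder path
    cert = x509.load_pem_x509_certificate(f.read())

names = cert.extensions.get_extension_for_class(
    x509.SubjectAlternativeName).value.get_values_for_type(x509.DNSName)
spki_pem = cert.public_key().public_bytes(
    serialization.Encoding.PEM, serialization.PublicFormat.SubjectPublicKeyInfo)

print(names)              # e.g. ['facebookcorewwwi.onion', ...]
print(spki_pem.decode())  # the public key those names are bound to
```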

we have to expect that facebookhomepage.onion seems like it might be what you wanted

There's basically the same problem there with normal DNS names - many (most?) people wouldn't notice that a certificate wasn't EV if they stumbled upon facebookhomepage.ws and it looked like the Facebook page.

The bigger, seemingly unstated problem is that many .onion operators are running sites that explicitly need privacy (political dissidents, etc.). Acquiring an EV cert is the absolute last thing those site operators are going to do, yet we need defense in depth for the potentially vulnerable users of those sites in case the Tor protocol gets compromised, either directly or with traffic correlation attacks. It's hard to imagine that the privacy conflict isn't apparent to anybody familiar with both Tor and EV certs, which just makes it appear to be "a nice way of saying 'no'".

I did read that there's a concern within the CA/B Forum that the current 80/128-bit .onion addresses could be brute-forced with tens of millions of dollars of hardware. I think it's important to remember that ten million dollars can buy you lots of access on the social side, and that technical and social attacks are equally important parts of the security ecosystem.
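
For context on that figure: a v2 onion address is just the first 80 bits of a SHA-1 hash of the service's public key, base32-encoded, so the address is far shorter than the key it identifies. A quick sketch with a throwaway key (per my reading of Tor's v2 rend-spec; uses the Python `cryptography` package):

```python
# Sketch: a v2 onion address = base32(first 10 bytes of SHA-1(DER public key)).
import base64
import hashlib

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

key = rsa.generate_private_key(public_exponent=65537, key_size=1024)  # throwaway
der = key.public_key().public_bytes(
    serialization.Encoding.DER, serialization.PublicFormat.PKCS1)
address = base64.b32encode(hashlib.sha1(der).digest()[:10]).decode().lower()
print(address + ".onion")  # 16 base32 characters, i.e. only 80 bits of hash
```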

I hope this gets revisited soon, as Let's Encrypt certs would be a useful addition to the Tor ecosystem.

Hidden Services (v2) are arguably weaker than their next-generation counterparts, which are currently in alpha and will become available in future releases of Tor. Since those are available now, what process needs to be started to evaluate them -- identify the areas for improvement -- and then bring that back to the Tor Project folks and get them to address those concerns?

The question of whether or not we should use PKI-backed TLS in Tor has already been answered: it's a big yes -- and Facebook is pretty much the only TLS interaction most folks have on Tor, because EVs aren't cheap (and in most cases go against the nature of Tor/hidden services in general).

The question now is: is a 256-bit ed25519 keypair-based cryptosystem enough to root trust in? If not, I've got a story to tell you about BGP and SHA1, or DNS and a phone call to a registrar... I think the question of how you can prove ownership of a hidden-service-backed system is answered as simply as making sure the LE registration authority can speak Tor.
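
For reference, a next-generation (v3) address encodes the whole 256-bit ed25519 public key plus a two-byte checksum and a version byte, so the address literally is the key. A sketch with a throwaway key (following rend-spec-v3; uses the Python `cryptography` package):

```python
# Sketch: a v3 onion address = base32(pubkey || checksum || version) + ".onion"
import base64
import hashlib

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

pubkey = Ed25519PrivateKey.generate().public_key().public_bytes(
    serialization.Encoding.Raw, serialization.PublicFormat.Raw)  # 32 bytes

version = b"\x03"
checksum = hashlib.sha3_256(b".onion checksum" + pubkey + version).digest()[:2]
address = base64.b32encode(pubkey + checksum + version).decode().lower()
print(address + ".onion")  # 56 base32 characters: the full public key is in the name
```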

The last piece, which I think is relevant for lots of reasons, is: will .onion be designated a reservation/exception by ICANN, so that we can make sure there won't ever be .onion addresses on the clearnet? Facebook tried to make this a thing, and one of their own published RFC 7686, which expands on IANA rules regarding special-use domain names to make sure it's agreed that this is the case.

And if you ask ICANN about it, they believe in the idea, but want to make sure the process of reserving names through RFCs is sound. As of August, the question is still outstanding:

Question 2.2.3: Do you think Special Use Domain Names should be added to the Applicant Guidebook section on reserved names at the top level to prevent applicants applying for such labels?
-- A good example would be .onion. Most people would like special use domain names to be reserved.
-- Question: How did the IETF RFC 6761 come into being? Response: Understanding that it would have been approved by the IETF. The bottom-up consensus building process is extremely robust. The process for all of the special use names in IETF is going under review. There are two ways that things go through: 1) through the working group and if it is judged not to be an end run it goes into a last call as an Internet draft in the IETF community before becoming an RFC. What is being talked about is how to do better coordination. Also, what is being discussed is why people need top-level names for their special use names, rather than second-level. It is quite a lengthy process. We were notified when .onion was going through.

What we are left with is: do the CAs now have enough confidence in the system that this is acceptable for DV? Do they want to wait for ICANN to officially sanction the IETF RFC process? Is there something more that Tor developers, or the community at large, can do to make this happen?

A thread on allowing DV issuance for next-generation hidden services was recently started by @schoen on the cabfpub mailing list. There was some discussion on validation methods, but no one seems to have argued against allowing DV issuance, so hopefully a future ballot will pass.

4 Likes

I wasn’t aware of the thread! Looks like there’s been some progress, though I’m not sure how to tell if there’s been a ballot proposed for this yet.

There hasn’t been a ballot proposed. When there is, it will be proposed by a member of the Forum and will have a ballot number. :slight_smile:

1 Like

Isn’t it kinda pointless?

Onion addresses are already key fingerprints. What more validation would you need?

(That is, unless you want to prove your real identity)

When you want the HTTP/2 speed upgrade, since browsers don’t support the HTTP/2 handshake over plain HTTP?

2 Likes

For sites that use HSTS, it’s important to have their .onion address on the certificate as well, so that when users visit their site via Tor, their browser doesn’t show a big, scary warning (as is currently happening with my site: priveasy6qxoehbhq5nxcxv35y6el73hpzpda7wgtnfe5qaspemtl6qd.onion). It’s true that it’ll be redundant in terms of end-to-end encryption, but it’s still important in order to maintain security on the clearnet site, and it “validates” that the darknet site isn’t a phishing site or some clone and is actually associated with the trusted clearnet site.

1 Like

@NTroy Aren’t HSTS headers host-dependent? I.e., when you surf to the .onion site, which is a different hostname than the clearnet site, it shouldn’t generate a warning?

4 Likes

Yes! You are absolutely right. The only problem, however, is that many web application frameworks do not allow for a distinction to be made between hosts when it comes to serving headers. cough Django cough Therefore, most projects are stuck between a rock and a hard place with this issue and have to choose between serving their entire site insecurely, or using an entirely new backend instance to serve requests made over Tor, which is simply not practical for small/medium sites. (They could also attempt to spoof data through uwsgi, Django modifications, etc… but we’ll ignore that for now.) A sketch of one host-aware middleware workaround is at the end of this post.

Since I made the original comment, there have been two new developments worth mentioning:

1.) The Tor Project introduced the Onion-Location header, which helps to reduce some of the worries I’d previously commented about (although it does not entirely eliminate them)
2.) Personally, I was able to customize my Django installation to spoof the request scheme when serving the onion site. For obvious reasons, although functional, this is most certainly not preferable.
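
For anyone hitting the same wall, here’s a minimal sketch of the host-aware approach (placeholder hostnames; assumes Django’s standard middleware API instead of the global SECURE_HSTS_SECONDS setting): send HSTS only to clearnet visitors and advertise the onion service via Onion-Location.

```python
# Add the dotted path to this class to MIDDLEWARE in settings.py, and leave
# SECURE_HSTS_SECONDS unset so Django doesn't add HSTS globally.

ONION_HOST = "youronionaddressgoeshere.onion"  # placeholder


class HostAwareSecurityHeadersMiddleware:
    """Send HSTS and Onion-Location only on the clearnet hostname."""

    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        response = self.get_response(request)
        host = request.get_host().partition(":")[0]
        if not host.endswith(".onion"):
            # Clearnet visitors get HSTS...
            response["Strict-Transport-Security"] = "max-age=31536000; includeSubDomains"
            # ...and a pointer that lets Tor Browser offer its ".onion available" prompt.
            response["Onion-Location"] = f"http://{ONION_HOST}{request.get_full_path()}"
        return response
```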

1 Like