Certificates/Domain Limit Problem


#1

Hello,
I’m in the process of integrating Let’s Encrypt into our CMS. So far so good - except for one major problem regarding the certificates-per-domain rate limit. In our CMS, every client can be accessed at realdomainname.customers.futureweb.at for testing/staging purposes. As we have several thousand clients in our CMS, I immediately hit the rate limit … :-/
As all content is staged to those realdomainname.customers.futureweb.at domains, they also need SSL certificates so customers can verify that everything works as expected … (no non-HTTPS links, …)

Any chance of getting such a scenario working? (unfortunately Let’s Encrypt doesn’t offer wildcard certificates)

Thx
Andreas Schnederle-Wagner


#2

Considering that it’s likely a dynamic pool of clients - not just a fixed set of names - I’d really go for a wildcard. It will cost you some money, but I believe it will give you less of a headache in terms of management.

P.S. If you are adamant about not using a wildcard cert, then technically you could switch that scheme to one hostname with multiple paths under it, which get proxy-passed to the appropriate client backends. That might look a bit strange from the client’s point of view, though.


#3

Yes - new clients come, old ones go … it’s dynamic. I was hoping for an LE solution, as I’ve already fully automated LE certificate management within our CMS: as soon as a new client is created, the certificate is generated, the Apache VHost configured, and so on - same for domain modifications, deletions, etc.
So it was nice to just add realdomainname.customers.futureweb.at to the certificate signing request and be done … until I got errors about the rate limit … :wink:
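The kind of automation described here could be sketched roughly as follows - a hedged illustration only; the helper names, webroot path, and certbot flags are assumptions, not the actual CMS setup:

```python
import subprocess

def staging_alias(realdomain: str) -> str:
    # Every client is also reachable under the staging suffix.
    return f"{realdomain}.customers.futureweb.at"

def issue_cert_for_client(realdomain: str, webroot: str) -> None:
    """Request one certificate covering both the real domain and its
    staging alias (a plausible certbot invocation, not the poster's
    actual configuration)."""
    subprocess.run(
        ["certbot", "certonly", "--webroot", "-w", webroot,
         "-d", realdomain, "-d", staging_alias(realdomain)],
        check=True,
    )
```

With one staging alias added per client certificate like this, each issuance counts against the rate limit for customers.futureweb.at, which is why the limit is reached after a few dozen clients.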

As every customer also gets their own Apache VHost, working with a separate wildcard cert would double the number of Apache VHosts! :-/


#4

If the alternate domain is for staging/testing, I don’t see why you can’t use a self-signed certificate and expect clients to trust it.


#5

Well - because one would need a dedicated employee to convince thousands of (not computer-savvy) customers that it’s SAFE to accept a self-signed cert, even when their browser tells them it isn’t … :wink:
(apart from the fact that this wouldn’t be a clean solution …)

Anyway - I just wanted to know whether there is some way to solve this specific problem with LE that I was overlooking … it seems there isn’t … so I guess I’ll have to go with the wildcard approach, even if it doubles the number of Apache VHosts … (I guess Apache can handle it …)

Thx


#6

Your domain might belong on the Public Suffix List, which is basically a collection of domains under which other domains can be registered. Just like “official” TLDs such as .com, private suffixes such as kunden.futureweb.at can also be added to this list. Let’s Encrypt uses this list to determine rate limit scope, so if you register kunden.futureweb.at as a public suffix, there would be no rate limit for that domain itself, and the rate limit would apply separately to every subdomain. More precisely: to one DNS label + kunden.futureweb.at. I mention that because this could still be a problem if “realdomain” can be something like example.com - in that case you’d run into the rate limit for com.kunden.futureweb.at quite quickly.
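To illustrate the scoping rule described above, here is a simplified sketch of how a rate limiter might derive the domain a certificate counts against. The real Public Suffix List has thousands of entries; this toy suffix set is an assumption for illustration only:

```python
# Toy stand-in for the Public Suffix List (assumption, not the real list).
PUBLIC_SUFFIXES = {"at", "com", "io", "kunden.futureweb.at", "github.io"}

def rate_limit_scope(hostname: str) -> str:
    """Return the longest matching public suffix plus one extra DNS label."""
    labels = hostname.lower().split(".")
    # Scan candidates from longest to shortest so the longest suffix wins.
    for i in range(len(labels)):
        if ".".join(labels[i:]) in PUBLIC_SUFFIXES:
            return ".".join(labels[max(i - 1, 0):])
    return hostname

# Once kunden.futureweb.at is a listed suffix, each client is its own scope:
print(rate_limit_scope("shop.kunden.futureweb.at"))        # shop.kunden.futureweb.at
# ... unless the client label itself contains dots, as warned above:
print(rate_limit_scope("example.com.kunden.futureweb.at")) # com.kunden.futureweb.at
```

The second call shows the edge case from the paragraph above: if client names can themselves contain dots, many clients collapse into the same scope (here com.kunden.futureweb.at) and share one rate limit.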

If you want to explore this option, the PSL site has instructions on opening a pull request that would add your domain. The final decision on whether the domain belongs on the PSL will be up to the maintainers (I think it does, as your use-case is similar to things like github.io). Note that this process might take a couple of weeks or even months, and if it gets added you’ll still need to wait until Let’s Encrypt updates their copy of the PSL (which could take a month or two as well). This would definitely be a long-term approach, so I’m not sure if that’s viable for you.


#7

I have a similar problem with the domain dy.fi. I have reserved a subdomain, and now my certificate expires tomorrow. When trying to renew it, I get this error:

There were too many requests of a given type :: Error creating new cert :: Too many certificates already issued for: dy.fi. Skipping.

I just checked that the domain is on the Public Suffix List, and the FAQ there states that Let’s Encrypt uses it - so is there some other limit I’m running into? I have only one certificate for my domain.


#8

If you’re renewing an existing certificate, you should not be subject to the limit for which you’ve quoted an error. A renewal (issuing new certificates for the same names, to the same subscriber) counts against a separate limit.

The most likely cause is that your “renewal” is actually a new issuance for some reason - e.g. you added more names, or you created a new account rather than keeping the one that issued the original certificate - but it’s also possible there is a bug, of course. Can you share the exact name you’re trying?


#9

Hi pfg,
thank you for pointing me in this direction - that might be the way to go for the long term! :slight_smile:
Andreas


#10

I’m suggesting this because, for over a decade, Dreamhost has used a self-signed cert or other-domain cert to offer https webmail for their customers. A warning box and an email will accomplish a lot. These same non-savvy people are the ones who willingly install malware.


#11

Just because other companies solve it that way doesn’t mean it’s a good solution! :wink:

I’ve already made a request to get those domains listed on the Public Suffix List, as proposed by pfg - it will most likely take some months, but then it’s a very good solution!


#12

I tried the renewal again today, using the same certbot renew command, and this time the renewal was successful. The only “difference” was that this time the certificate had already expired.


#13

It turns out that there is in fact a bug, although a fix is due to land later this week so that your renewals shouldn’t be affected by the limit in future, as was intended.


#14

I’m with Neocities, and I wanted to chime in here quickly.

Public Suffix List inclusion for rate limit scoping is great in general, but unfortunately mandating it as a requirement is not going to work for every scenario. We considered adding Neocities to the Public Suffix List, but were forced to walk away after discovering it would break our ability to use cookies on our root site, breaking the trust model we have with our users (main site neocities.org, user sites *.neocities.org). On top of the multiple-month timeframe for getting an update in place, putting our domain on that list would quite literally destroy our site!

Let’s Encrypt really does need a mechanism for outliers here. It could be a public document if transparency is needed, and I don’t think fairness will be an issue, since the primary concern is quotas that prevent abuse from causing problems for larger organizations.


#15

It would be nice if LE had its own (additional) “public suffix” list for rate limits.

Would have several benefits:

  • no side effects on Cookie Privacy
  • hopefully faster updates than with the current list

The only downside: someone would need to maintain the list and accept pull requests (if they fit the rules, which would still have to be defined).


#16

This may be causing a security problem for you. Any of your subdomains can set (and sometimes read) cookies from your main domain. So someone can create foo.neocities.org and configure it to clear the neocities login cookie from visitors, or set cookie values that can trigger malicious behavior, or perform a session fixation attack. I’d recommend you do a security review of what’s possible with your current setup, and make plans to move management functions to a domain that is not shared with user-controlled sites.
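To make the risk concrete: because neocities.org is not a public suffix, browsers accept a cookie from any subdomain whose Domain attribute points at the parent domain. A minimal sketch of the header involved, with hypothetical cookie names and values:

```python
from http.cookies import SimpleCookie

# Hypothetical response header from a page on attacker-controlled
# foo.neocities.org: since neocities.org is not on the Public Suffix
# List, browsers will scope this cookie to the parent domain and send
# it to neocities.org itself - the session fixation vector described above.
cookie = SimpleCookie()
cookie["session"] = "attacker-chosen-value"
cookie["session"]["domain"] = "neocities.org"  # parent, not foo.neocities.org
cookie["session"]["path"] = "/"
print(cookie.output())
```

If neocities.org were on the PSL, compliant browsers would reject the parent-scoped Domain attribute, which is exactly the cookie behavior the next post says Neocities relies on keeping.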


#17

Thanks for bringing this up, it’s always good to mention potential security issues! Long story short, it’s not causing a security problem for us. We have code and mechanisms (prefixing, auto-detection, encrypted cookies, header overrides, etc.) for preventing session attack issues. The theoretical future risks due to accidental leakage by browsers at this point are not concerning enough (yet) to justify a mass migration of all our users to a different top level domain.

If I were to make a change here, it would probably be to move the main Neocities site to www.neocities.org, but that is also not an ideal solution, because it breaks the trust model our users are familiar with. I may decide to do this in the future, but doing it solely to satisfy an ambiguity issue with the PSL would be pretty difficult for me to justify.

If the PSL is being used as the sole mandated requirement for removing quota limits, I’d like to discuss that a bit more. If it’s not, you can safely ignore the following text. :slight_smile:

I’m not opposed to the PSL existing, and I think it provides a good service, but I am worried about relying on it as the sole requirement for larger organizations to be able to use Let’s Encrypt. In a nutshell, the PSL breaks many legitimate scenarios where a root domain should be allowed to function while still providing an explicit barrier between it and its subdomains:

  • As mentioned, the update process takes a long time. A several-month waiting period before launching a new hosting service is a significant barrier to entry that will prevent many new adopters from considering both the PSL and Let’s Encrypt.
  • Maintenance on the PSL seems to be based on occasional developer availability, further complicating the issue. I’m certain it’s because the maintainers are busy at their primary jobs and that there is no ill will, but it’s worth noting.
  • There is no specification for how domains will behave once they are added to the list; it’s left as an exercise to the implementers (which is why I had no idea it would break our cookies with some implementations). I’m not convinced we’ve hit “peak unintended consequences” yet, and the potential repercussions are devastating, since removal from the PSL is also very difficult and time-consuming.
  • There are (I believe) legitimate pull requests in that list that have been sitting there for long periods of time (what’s the process for escalation/rejection?).
  • It’s IMHO still abusable by attackers, if that’s Let’s Encrypt’s primary concern (there are no business verification or notary requirements).

So I do believe that a complementary PSL-like system is needed for Let’s Encrypt if it does not yet exist.

In the interim - I’m not a fan of arbitrary decision-making either, and I understand where you’re coming from - but I’d love it if you could make an exception for Neocities for the moment. We’re ready to go with Let’s Encrypt as soon as we can safely launch with support for it.


#18

We do provide rate limit overrides for some hosting providers, but for those based on subdomains we definitely recommend PSL as the most correct choice, for the security and privacy reasons talked about previously.

If you’ll PM me your email address I’ll put you in touch with Josh regarding rate limit overrides.


#19

+1 on the recommendation. I would definitely design Neocities differently from scratch the second time around.

I’m not able to send you a PM here for some reason (new account restrictions?), but you can just use kyle |AT| kyledrake.net. Thank you for your assistance on this, I wish I wasn’t an outlier on this one.


#20

Have I understood correctly that if the parent domain is on the Public Suffix List, there should be no restrictions or rate limiting?

Now that I got my first “test” domain, “piwik.kettu.dy.fi”, working, I tried to switch another domain, “pilvi.kettu.dy.fi”, to LE. When I try to get a certificate, I get this:

certbot certonly --webroot -w /path/to/app/ -d pilvi.kettu.dy.fi

An unexpected error occurred: There were too many requests of a given type :: Error creating new cert :: Too many certificates already issued for: dy.fi

Does this mean that even though “dy.fi” is on the PSL there are still some restrictions, and that I should try again tomorrow or next week?