Hitting Rate Limit for swedencentral.cloudapp.azure.com

I wanted to generate a certificate for wim.swedencentral.cloudapp.azure.com, but I've hit the Rate Limit:

There were too many requests of a given type :: Error creating new order :: too many certificates already issued for "swedencentral.cloudapp.azure.com". Retry after 2024-05-07T01:00:00Z

The domain name "swedencentral.cloudapp.azure.com" is part of Microsoft's public cloud (Azure),
so I have no control at all over other people requesting certificates for "something.swedencentral.cloudapp.azure.com".

I've seen similar problems with "westeurope.cloudapp.azure.com" and "francecentral.cloudapp.azure.com".
Is there an option to increase the limits for cloud providers? (Azure, AWS, ...)

(Using the staging server works fine, but I would like to get just one "real" certificate for my site.)

Hi @goedertw, and welcome to the LE community forum :slight_smile:

I think you've confused the rate limit you exceeded with another one.
It looks to me like you've hit the limit of 5 certs [for that specific name (FQDN)] - not the limit on too many certs issued for that domain.

Which ACME client are you using?

3 Likes

There is a Duplicate Certificate Limit in addition to the other Rate Limits.

3 Likes

I can't seem to find any certs having ever been issued containing "swedencentral.cloudapp.azure.com".
So... I'm :confused:

2 Likes

Personally I'd try to use Azure's own free certificates for this anyway, or use a custom domain.

When requesting certs from LE it's important to preserve them, especially if you're using ephemeral servers or restoring snapshots etc., as they are a rate-limited resource. One option is to store your cert in Azure Key Vault and pull it from there when you need it.
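As a rough sketch of that workflow (the vault and certificate names below are placeholders; run the import once after issuance, and the download on any fresh or restored server):

```shell
# Bundle the issued cert and key into a PFX (Key Vault imports PKCS#12)
openssl pkcs12 -export -out mycert.pfx \
  -inkey privkey.pem -in fullchain.pem -passout pass:

# Store it in Key Vault (placeholder vault/cert names)
az keyvault certificate import \
  --vault-name my-vault --name my-le-cert --file mycert.pfx

# Later, pull it back instead of re-issuing; downloading the backing
# secret by the certificate's name returns the full PFX
az keyvault secret download \
  --vault-name my-vault --name my-le-cert \
  --file mycert.pfx --encoding base64
```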

6 Likes

crt.sh times out for me, there are so many. Entrust's search shows a lot:
https://ui.ctsearch.entrust.com/ui/ctsearchui

The rate limit shown isn't the 5 dupes/week limit but the limit on certificates for one "registered" domain (a default of 50/week). I'm not sure why that is considered a registered domain, but I saw a similar thread in recent days. It is not in the PSL, so LE must treat it specially.
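For anyone curious how the "registered" domain is derived: the rate limiter computes the eTLD+1 from the Public Suffix List. Here's a rough stdlib-only sketch (simplified: no exception rules, and only an illustrative two-rule list, not the real PSL):

```python
def registrable_domain(fqdn, rules):
    """Simplified eTLD+1 lookup: find the longest matching PSL rule,
    then keep one extra label. Exception ('!') rules are omitted."""
    labels = fqdn.lower().split(".")
    best = 0  # label-length of the prevailing public suffix
    for rule in rules:
        rlabels = rule.split(".")
        if len(rlabels) > len(labels):
            continue
        # Match right-to-left; '*' matches exactly one label.
        if all(r in ("*", l) for r, l in zip(reversed(rlabels), reversed(labels))):
            best = max(best, len(rlabels))
    best = best or 1  # unlisted TLDs count as a suffix on their own
    if len(labels) <= best:
        return None  # the name itself is a public suffix
    return ".".join(labels[-(best + 1):])

# With a plain "cloudapp.azure.com" entry in the list, every cert in a
# region counts against the same registered domain:
print(registrable_domain("wim.swedencentral.cloudapp.azure.com",
                         ["com", "cloudapp.azure.com"]))
# → swedencentral.cloudapp.azure.com
```

So all of "swedencentral" shares one 50/week bucket under that state of the list.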

I don't know Azure very well, but @webprofusion's advice is sound.

4 Likes

That's exactly the solution I used with a former employer. :slightly_smiling_face:

5 Likes

Yes, quite possibly—but the cloud provider would have to request it; normally Let's Encrypt expects rate limit requests to come from the responsible party for the domain itself rather than sua sponte from Let's Encrypt itself or from an end user. And both Let's Encrypt and the cloud provider might have some concerns about the concept of name ownership or persistence for such domains (whether the same customer consistently uses an individual name over time).

We've seen that kind of issue with Salesforce subdomains and some other cases like AWS subdomains that encode an IP address, and it didn't seem like the domain owners were very interested in pursuing rate limit exemptions. (Universities, on the other hand, usually are interested and usually do receive such adjustments.) I'd encourage anybody faced with this (not just you, @goedertw) to try asking the cloud provider what it thinks is a best practice in these cases. In some cases, it will definitely be to use the cloud provider's own certificate authority!

4 Likes

Thx for all the replies.

  • Using Azure's free certificates sounds good (I was not aware they existed), but after a quick investigation it looks like they only apply to their "App Services" (not usable for "Virtual Machines")
  • Azure Key Vault lets you generate self-signed certificates or store existing certificates. So, I still need to generate one proper certificate.

This morning, I could generate my one LE-certificate. I'll store it and take care of it as if it were my baby :slight_smile:
(and I'll use the staging-server whenever possible)

5 Likes

The issue is that Microsoft made the following change to the PSL: Removing wildcard for cloudapp.azure.com by edwa001 · Pull Request #1944 · publicsuffix/list · GitHub

They later fixed this with Removing cloudapp.azure.com from PSL by edwa001 · Pull Request #1966 · publicsuffix/list · GitHub, but this change has not propagated to Let's Encrypt yet.

In theory, once they pick up the latest list, certificate generation will work again.

2 Likes

I'm a little confused, those pull requests look like they're removing the names from the PSL entirely. If *.cloudapp.azure.com isn't on the PSL, then they would need to request a rate limit adjustment from Let's Encrypt directly if they expect users of those names to be able to get certificates from Let's Encrypt, since the default limits per "base" domain name wouldn't be enough.

5 Likes

That's not a fix: it only makes things worse.

3 Likes

Are you sure? Because before Azure updates for Microsoft Corporate Domains by edwa001 · Pull Request #1891 · publicsuffix/list · GitHub, it was the same situation: it was not on the list at all, and we were using it well before that without any issues.

I'm pretty sure all requests for foo.bar.cloudapp.azure.com are now counted against just the azure.com domain, making it even harder to get a certificate for any subdomain.

Although:

NXDOMAIN looking up A for foo.bar.cloudapp.azure.com

I'm getting the above error, not a rate limit..? Maybe they already have a rate limit exemption in place? I dunno :man_shrugging:t2:

3 Likes

That's my guess, that they have an exemption already, because it was somehow working for years before the Feb PR, when it was not on the PSL.

And based on this comment in Removing wildcard for cloudapp.azure.com by edwa001 · Pull Request #1944 · publicsuffix/list · GitHub, it's supposed to fix it, at least according to Microsoft.

But then we wouldn't have seen the error quoted in the first post. Unless the exemption limit itself was reached, perhaps.

Good find about the PSL PR. This still does not feel well explained.

3 Likes

Agreed..

Which is weird, as that domain name would suggest the PSL removals weren't active in Boulder at that time. But Boulder currently runs commit 939ac1be (see https://acme-v02.api.letsencrypt.org/build), which I thought already contained those adaptations from at least March on.

3 Likes

As I mentioned, the issue is that the second PR has still not propagated to Let's Encrypt, as it's using the build from 04.11: boulder/go.mod at main · letsencrypt/boulder · GitHub
which still has the wrong entry:
publicsuffix-go/publicsuffix/rules.go at 21202160c2ed9b468febed8dd5a2ca18d6a988b7 · weppos/publicsuffix-go · GitHub

Hm, that wildcard and non-wildcard PSL entry stuff is causing me a headache :stuck_out_tongue:

With the wildcard (as it was before the removal[s]), would the error message show foo.swedencentral.cloudapp.azure.com by any chance?

3 Likes

I'm not even sure the wildcard state of the PSL was ever picked up by Boulder :smile:

Currently, what we have in the PSL used by Boulder is the cloudapp.azure.com entry, which causes the error from the first post, until they update to a newer build of publicsuffix-go. Then I expect the exemption for azure.com will apply and we will have no errors.
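To untangle the wildcard vs. plain-entry confusion, here's a rough stdlib-only sketch of the eTLD+1 computation (simplified: no exception rules, illustrative rule lists only) showing what each PSL state implies for the name in question:

```python
def registrable_domain(fqdn, rules):
    """Simplified eTLD+1 lookup: longest matching PSL rule plus one label."""
    labels = fqdn.lower().split(".")
    best = 0  # label-length of the prevailing public suffix
    for rule in rules:
        rlabels = rule.split(".")
        if len(rlabels) > len(labels):
            continue
        # Match right-to-left; '*' matches exactly one label.
        if all(r in ("*", l) for r, l in zip(reversed(rlabels), reversed(labels))):
            best = max(best, len(rlabels))
    best = best or 1  # unlisted TLDs count as a suffix on their own
    if len(labels) <= best:
        return None
    return ".".join(labels[-(best + 1):])

name = "wim.swedencentral.cloudapp.azure.com"

# Wildcard entry (before PR #1944): each customer name is its own domain.
print(registrable_domain(name, ["com", "*.cloudapp.azure.com"]))
# → wim.swedencentral.cloudapp.azure.com

# Plain entry (current Boulder state): a whole region shares one domain.
print(registrable_domain(name, ["com", "cloudapp.azure.com"]))
# → swedencentral.cloudapp.azure.com

# No entry (after PR #1966): everything counts against azure.com.
print(registrable_domain(name, ["com"]))
# → azure.com
```

So the plain entry explains the error in the first post, and the full removal only helps if an azure.com exemption exists.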

1 Like