Failed to connect to let's encrypt, confirm domain

Please fill out the fields below so we can help you better. Note: you must provide your domain name to get help. Domain names for issued certificates are all made public in Certificate Transparency logs (e.g. https://crt.sh/?q=example.com), so withholding your domain name here does not increase secrecy, but only makes it harder for us to provide help.

My domain is: cloudrancho.kittenmcnuggets.com

I ran this command: replace cert

It produced this output: failed to connect to Let's Encrypt, confirm domain is valid

My web server is (include version): Synology, no website set up

The operating system my web server runs on is (include version): DSM 6

My hosting provider, if applicable, is: not hosted, parked at Namecheap

I can login to a root shell on my machine (yes or no, or I don't know): don't know

I'm using a control panel to manage my site (no, or provide the name and version of the control panel): no

The version of my client is (e.g. output of certbot --version or certbot-auto --version if you're using Certbot): ?

The basics:

I have ports 80 and 443 forwarded to 5000/5001 in my router. (They are off now; I open them when I'm trying to get this to work.)

I have a CNAME record, cloudrancho, at my paid domain kittenmcnuggets.com, pointed to my xxx.synology.me DDNS name (a quick way to verify that resolution is sketched after these basics).

With those items in place, I can log into my Synology DSM from any PC just fine with QuickConnect turned off.

MXtools says my domain checks and email redirect policy are in place.
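For anyone checking the same setup, the DNS side can be verified by resolving both names and comparing the addresses. This is a minimal Python sketch using only the standard library; it shows the final A record rather than the CNAME chain itself, and xxx.synology.me stands in for the real (masked) DDNS name:

```python
import socket

# Hostnames from this thread; "xxx.synology.me" is the masked DDNS name
# and would need to be replaced with the real one.
CNAME_HOST = "cloudrancho.kittenmcnuggets.com"
DDNS_HOST = "xxx.synology.me"

def resolve(host):
    """Return the set of IPv4 addresses a hostname resolves to."""
    try:
        infos = socket.getaddrinfo(host, None, family=socket.AF_INET)
        return {info[4][0] for info in infos}
    except socket.gaierror as exc:
        return f"lookup failed: {exc}"

print(CNAME_HOST, "->", resolve(CNAME_HOST))
print(DDNS_HOST, "->", resolve(DDNS_HOST))
# If the CNAME is in place, both names should resolve to the same
# public (home) IP address.
```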

After some failed attempts, I tried to generate the cert using my Synology DDNS name (xxx.synology.me), and Let's Encrypt created the cert. My understanding is that since this is not a paid domain, the DDNS cert will not function the same way.

I have since tried to replace the DDNS cert using the cloudrancho.kittenmcnuggets.com domain multiple times, all failing. I have tried while logged in locally, while logged in remotely by accessing it via the cloudrancho.kittenmcnuggets.com link, and after installing the web package and setting up a virtual host (not sure if I completed that correctly), but all attempts give the same error in the title.

I need another suggestion to start researching…

thanks all

Hi @mcnugget

Your domain is invisible (checked with https://check-your-website.server-daten.de/?q=cloudrancho.kittenmcnuggets.com ):

Domainname (IP) / Http-Status / Sec. / Grade:

http://cloudrancho.kittenmcnuggets.com/ (47.156.80.55): -14, 10.027 s, T
Timeout - The operation has timed out

https://cloudrancho.kittenmcnuggets.com/ (47.156.80.55): -14, 12.377 s, T
Timeout - The operation has timed out

http://cloudrancho.kittenmcnuggets.com/.well-known/acme-challenge/check-your-website-dot-server-daten-dot-de (47.156.80.55): -14, 10.027 s, T
Timeout - The operation has timed out

Visible Content:

The details are relevant: Let's Encrypt doesn't use non-standard ports.

So open your ports and recheck your domain to see the result.
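Before re-running the check, it can help to verify that the standard ports are actually reachable from the internet. A minimal Python sketch (hostname taken from this thread) simply attempts a TCP connection on ports 80 and 443; the HTTP-01 validation typically used here needs port 80 to be open:

```python
import socket

HOST = "cloudrancho.kittenmcnuggets.com"  # domain from this thread

# Let's Encrypt's HTTP-01 validation connects on port 80, so that port
# has to be reachable from the internet; 443 is checked here as well.
for port in (80, 443):
    try:
        with socket.create_connection((HOST, port), timeout=10):
            print(f"port {port}: open")
    except OSError as exc:
        print(f"port {port}: not reachable ({exc})")
```

Run it from a machine outside the home network; testing from inside the LAN can succeed even when the router is not forwarding the ports.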

I don't know if you can use a "parked domain" like a regular domain. Is that

Host / Type / IP-Address / is auth. / Queries / Timeouts:

cloudrancho.kittenmcnuggets.com: CNAME cloudrancho.synology.me (authoritative: yes, 1 query, 0 timeouts); A 47.156.80.55 (authoritative: yes)

www.cloudrancho.kittenmcnuggets.com: Name Error (authoritative: yes, 1 query, 0 timeouts)

the IP address of your Synology?

Yes, 47.156.80.55 is my current home IP. Ports 80 and 443 should point to 5000/5001 at the internal IP of my NAS.

Domainname / result:

http://cloudrancho.kittenmcnuggets.com/
47.156.80.55

https://cloudrancho.kittenmcnuggets.com/
47.156.80.55, Certificate error: RemoteCertificateNameMismatch

http://cloudrancho.kittenmcnuggets.com/.well-known/acme-challenge/check-your-website-dot-server-daten-dot-de
47.156.80.55, 403 Forbidden

Visible Content: © 2018 Synology Inc.

When I run that tool, it does get in there, and it shows the cert for the synology.me DDNS name:
CN=cloudrancho.synology.me
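For reference, one way to see which certificate the NAS actually serves for a given name is a short Python check like the sketch below. It assumes the served certificate chains to a trusted root (a Let's Encrypt certificate does) and only skips the hostname check, so the mismatched certificate can still be inspected:

```python
import socket
import ssl

HOST = "cloudrancho.kittenmcnuggets.com"  # name from this thread

# Keep chain verification, but skip the hostname check so the
# mismatched certificate can still be read.
ctx = ssl.create_default_context()
ctx.check_hostname = False

with socket.create_connection((HOST, 443), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        cert = tls.getpeercert()

print("subject:", cert.get("subject"))
print("subjectAltName:", cert.get("subjectAltName"))
# A subject of CN=cloudrancho.synology.me means the NAS is still
# serving the old DDNS certificate for this name.
```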

That may be the problem. There should be an HTTP status 404 - Not Found, not 403 - Forbidden. So your directory /.well-known/acme-challenge has the wrong permissions.

PS: But then use

https://cloudrancho.synology.me/

Perhaps your Synology doesn't know the other domain name.

But the autoupdate works.
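To see what status the challenge path returns, a small request against a made-up file name is enough. This Python sketch uses a hypothetical token name; a 404 is the healthy answer (the path is reachable and the file simply doesn't exist), while a 403 points at the permissions problem described above:

```python
import urllib.error
import urllib.request

# Hypothetical test file under the ACME challenge path; any name works,
# since only the status code matters here.
URL = ("http://cloudrancho.kittenmcnuggets.com"
       "/.well-known/acme-challenge/test-token")

try:
    with urllib.request.urlopen(URL, timeout=10) as resp:
        print("status:", resp.status)
except urllib.error.HTTPError as exc:
    # 404 Not Found is fine (path served, file absent);
    # 403 Forbidden suggests wrong directory permissions.
    print("status:", exc.code, exc.reason)
except urllib.error.URLError as exc:
    print("request failed:", exc.reason)
```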


So I have read a couple of posts that said they had to properly set up the web package in Synology, add a virtual host, and actively add those folders. I didn't think that was necessary just to pull a cert, and it doesn't seem to be included in any official instructions.

I did run the web host package, install PHP 7.2, and create a virtual host. I built a default root folder, but did not actively create the acme folder… is that needed just to pull a cert?

I was under the impression that an SSL cert generated from the synology.me domain (free DDNS) did not provide all of the coverage of one generated from a paid and registered domain.

Also, what do you mean by "the autoupdate works"?

thanks

My tool found that certificate ( https://check-your-website.server-daten.de/?q=cloudrancho.kittenmcnuggets.com ):

CN=cloudrancho.synology.me
valid from 22.03.2019 to 20.06.2019 (expires in 90 days)
1 entry: cloudrancho.synology.me

That's a new Let's Encrypt certificate.

But then I tried to check the domain cloudrancho.synology.me - again only timeouts.

Oh, sorry, I probably closed the ports on the router. I'll leave them open for the rest of the day.

I was just concerned, as I had read many posts saying that the DDNS cert was basically only good inside the home network and that a paid domain should really be used.

Overall, the synology.me ping looks like the kittenmcnuggets ping…

That's not really relevant; it's a Disk, not a website.

The non-www version works, so you can ignore the www-version.

But the certificate with the non-www domain name is new, created today.

Yes, I got the DDNS/Synology cert created this morning. I have been trying to replace it with one based on the paid kmn.com domain… I'm not sure why one went so easily and the other refuses…

That's an amazing tool, by the way.

This couldn’t have anything to do with it?
“Note: According to Let’s Encrypt policies, the number of email addresses for certificates registrations and the number of certificate requests for a domain are both limited.”
limits

Wow, this was a rookie mistake, I guess.

This was a port forwarding problem, apparently. I had only 80>5000 and 443>5001 set up, which allowed me to log in remotely to my NAS just fine, so I thought that should also allow Let's Encrypt to communicate to get the cert. I'm not sure why it had no problem connecting for the DDNS cert.

When I changed the rules to 80>80, 443>443, 5000>5000, and 5001>5001, it pulled the paid-domain cert just fine. I do have redirect-to-HTTPS set in the NAS.

After getting it set up, I went back to only having 443>5001, and it seems to be functioning fine. Will leaving it at this setting prevent the cert from renewing? Which of the above rules is the one I have to leave on indefinitely?

Also, as an FYI for others: I'm not sure why there are so many posts saying you have to have a paid, registered domain that CNAMEs to your free synology.me DDNS name. It seems to me that both Let's Encrypt certs function the same, and the Synology cert and address work OK. Thanks for the exercise.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.