Ubuntu nginx certbot verification fails

My site is https://gnuorleans.org, which I'm running on a home Ubuntu 18.04 machine with ports 80 and 443 forwarded on my router. I want to serve RStudio Server (mainly for my own remote access) via https. By default, RStudio Server listens on port 8787 without encryption.

I managed to get https set up with nginx and a self-signed ssl cert, and it appropriately redirects both http://gnuorleans.org and https://gnuorleans.org to the rstudio application, albeit with the browser warning. So basically I just want to get a signed cert from Let's Encrypt in place.

Supposedly 'sudo certbot --nginx' should do this, but it fails with the following:

  • The following errors were reported by the server:

    Domain: gnuorleans.org
    Type: connection
    Detail: Fetching
    http://gnuorleans.org/.well-known/acme-challenge/PR5TfKSMrIPb2JCACMWieV77VRYAExjT0pIgCd4eMnQ:
    Timeout during connect (likely firewall problem)

    To fix these errors, please make sure that your domain name was
    entered correctly and the DNS A/AAAA record(s) for that domain
    contain(s) the right IP address. Additionally, please check that
    your computer has a publicly routable IP address and that no
    firewalls are preventing the server from communicating with the
    client. If you're using the webroot plugin, you should also verify
    that you are serving files from the webroot path you provided.

My /etc/nginx/sites-enabled/test:

server {
    listen 80;
    listen [::]:80;
    server_name gnuorleans.org;
    return 301 https://gnuorleans.org$request_uri;
}

server {
    listen 443 ssl;
    server_name gnuorleans.org;

    ssl_certificate         /etc/nginx/cert.crt;
    ssl_certificate_key     /etc/nginx/cert.key;

    ssl_session_cache   builtin:1000    shared:SSL:10m;
    ssl_protocols   TLSv1 TLSv1.1 TLSv1.2;
    ssl_ciphers HIGH:!aNULL:!eNULL:!EXPORT:!CAMELLIA:!DES:!MD5:!PSK:!RC4;
    ssl_prefer_server_ciphers on;

    access_log              /var/log/nginx/rstudio.access.log;

    location / {
        proxy_set_header        Host $host;
        proxy_set_header        X-Real-IP $remote_addr;
        proxy_set_header        X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header        X-Forwarded-Proto $scheme;

        proxy_pass              http://localhost:8787;
        proxy_read_timeout      90;
        proxy_redirect          http://localhost:8787   https://gnuorleans.org;
    }
}

Also, the Ubuntu firewall (ufw) is inactive, and I tried putting my machine in my router's DMZ; neither changed the behavior.

How can I get a signed cert in place?

Thanks - David

Hi,

I think the reason you experience this “not found” is due to the fact that all requests are being proxied to the backend server. Please add an exception to your Nginx virtual host file to exempt that directory.

Thank you

Hi Steven - I apologize but I’m not clear on what I have to do. From what I’ve found, I assume I have to do something to the server block listening on port 80 to get it to handle the verification instead of returning the https version of rstudio. But I’m not having any luck pinning down exactly what needs to go in the file.

(I’m a surgeon by profession, not a server admin, but I’m trying to learn)

David

Hi @davidstuartbruce,

Before trying to modify your nginx conf, you should fix the connection issue: the http-01 challenge must use port 80, but your domain can’t be reached on port 80, only on port 443.

$ curl -IkL -m10 http://gnuorleans.org/.well-known/acme-challenge/test
curl: (28) Connection timed out after 10001 milliseconds

$ curl -IkL -m10 https://gnuorleans.org/.well-known/acme-challenge/test
HTTP/1.1 404 Not Found
Server: nginx/1.14.0 (Ubuntu)
Date: Tue, 11 Sep 2018 16:31:21 GMT
Content-Type: text/html
Content-Length: 57
Connection: keep-alive

Cheers,
sahsanu


OK, this is what I now get with the curl command you provided:

DSBMacBook:~ dbruce$ curl -IkL -m10 http://gnuorleans.org/.well-known/acme-challenge/test
HTTP/1.1 301 Moved Permanently
Server: nginx/1.14.0 (Ubuntu)
Date: Tue, 11 Sep 2018 22:27:22 GMT
Content-Type: text/html
Content-Length: 194
Connection: keep-alive
Location: https://gnuorleans.org/.well-known/acme-challenge/test

HTTP/1.1 404 Not Found
Server: nginx/1.14.0 (Ubuntu)
Date: Tue, 11 Sep 2018 22:27:22 GMT
Content-Type: text/html
Content-Length: 57
Connection: keep-alive

DSBMacBook:~ dbruce$

So I think nginx is receiving the challenge and handling it by serving up rstudio server over https, rather than acting as a static http file server as the challenge requires. Though this is what I ultimately want, I think that in order to get certbot to work I need to modify the server block for port 80. I'll search along those lines but any more specific tips are greatly appreciated.

David

Hi @davidstuartbruce,

I can’t reach your server on port 80 yet…

To avoid the challenge going through your reverse proxy, you can add two locations in your server block for port 80: one using a specific root path for the LE challenge, and the other redirecting the rest of the requests to your reverse proxy configured on port 443.

server {
    listen 80;
    listen [::]:80;
    server_name gnuorleans.org;
    
    location '/.well-known/acme-challenge' { 
        default_type "text/plain";
        root /var/www/letsencrypt;
    }
        
    location / {
         return 301 https://gnuorleans.org$request_uri;
    }	
}

Note: If you change the nginx conf, remember to restart/reload it.
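
For example, to check the syntax and reload (assuming the usual systemd setup on Ubuntu 18.04):

sudo nginx -t && sudo systemctl reload nginx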

As this is a “special” conf, I wouldn’t use certbot with the nginx plugin; I would use the certonly and webroot options, something like this:

certbot certonly --webroot -w /var/www/letsencrypt/ -d gnuorleans.org
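
The webroot path must exist before certbot tries to write the challenge file into it, so create it first if you haven't already:

sudo mkdir -p /var/www/letsencrypt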

Once you get your certificate you only need to edit your conf and change these two directives:

ssl_certificate         /etc/nginx/cert.crt;
ssl_certificate_key     /etc/nginx/cert.key;

to these:

ssl_certificate      /etc/letsencrypt/live/gnuorleans.org/fullchain.pem;
ssl_certificate_key  /etc/letsencrypt/live/gnuorleans.org/privkey.pem;

Anyway, as I said, I can’t connect to your server on port 80 (tested from 5 different countries)… you should double-check this; maybe it is a firewall issue, a port forwarding issue, or your ISP is blocking it.
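
If you want to rule out nginx itself, you can confirm it is listening on port 80 locally, for example:

sudo ss -tlnp | grep ':80'

If nginx shows up there but the port is still unreachable from outside, the block is upstream of your machine (router port forwarding or the ISP).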

Good luck,
sahsanu

Hi Sahsanu,

I incorporated your code into a file in my /etc/nginx/sites-enabled, and as far as I can tell it works as intended.

I think the problem though is that my ISP (Cox) blocks port 80, and from what I’ve googled they are very rigid about never unblocking it for home accounts, even temporarily. I tried using certonly and webroot as you suggested, and it failed as before.

Is there any way I can get a cert from Letsencrypt with port 80 blocked by my ISP?

Thanks - David

In this case, you need to use the DNS challenge method. This doesn't work with certbot --nginx; ideally it requires a DNS provider with an API that lets you create DNS records automatically from software. You can use a Certbot DNS plugin, or the acme.sh client, which has broader DNS provider API support.
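
If your provider has no API, you can still validate manually; for example, certbot's manual mode will prompt you to create the required TXT record by hand (something you'd have to repeat at every renewal):

sudo certbot certonly --manual --preferred-challenges dns -d gnuorleans.org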

Success!

I was able to run acme.sh and validate a cert manually by creating a TXT record for my domain, which is just a DDNS from Google Domains. I realize I will need to renew it every 90 days.

This was quite the pain, but very educational. I’m a liver transplant surgeon in “real life”, not an IT professional, so I feel this was a pretty decent accomplishment as a hobbyist.

Thanks to all involved!

David


Ideally the process can be automated, but you would need a DNS provider with an API that lets you create DNS records from software, so that Certbot or another client can create the appropriate TXT records for you (the required TXT record content is different every time).

According to previous forum threads, Google Domains doesn't provide a suitable API, but if you don't want to switch providers, one thing you can do is create a CNAME record (a kind of DNS alias telling clients that the record in question can be found somewhere else) that redirects the _acme-challenge record to a different provider that does offer an API. You could then keep using Google Domains for everything else, while the _acme-challenge lookups are delegated to the API-capable provider, where the TXT records can be created automatically.
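
As a sketch, the record at Google Domains would look something like this (the target zone here is a placeholder for whichever API-capable provider you pick):

_acme-challenge.gnuorleans.org.  CNAME  _acme-challenge.example-acme-host.net.

acme.sh supports this delegation directly through its DNS alias mode, e.g.:

acme.sh --issue -d gnuorleans.org --dns dns_cf --challenge-alias example-acme-host.net

(dns_cf is its Cloudflare hook; example-acme-host.net is the placeholder zone.)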

I promise not to attempt any liver transplants on the basis of online tutorials and forum posts! 🙂
