My site is https://gnuorleans.org, which I'm running on a home ubuntu 18.04 machine with ports 80 and 443 forwarded on my router. I want to serve rstudio server (mainly for my own remote access) via https. By default, rstudio server listens on port 8787 without encryption.
I managed to get https set up with nginx with a self-signed ssl cert, and it appropriately redirects both http://gnuorleans.org and https://gnuorleans.org to the rstudio application, albeit with the browser warning. So basically I just want to get a signed cert from letsencrypt in place.
Supposedly 'sudo certbot --nginx' should do this, but it fails with the following:
To fix these errors, please make sure that your domain name was
entered correctly and the DNS A/AAAA record(s) for that domain
contain(s) the right IP address. Additionally, please check that
your computer has a publicly routable IP address and that no
firewalls are preventing the server from communicating with the
client. If you're using the webroot plugin, you should also verify
that you are serving files from the webroot path you provided.
I think the reason you see this “not found” is due to the fact that all requests are being proxied to the backend server. Please add an exception to your nginx virtual host file to exempt that directory.
Hi Steven - I apologize but I’m not clear on what I have to do. From what I’ve found, I assume I have to do something to the server block listening on port 80 to get it to handle the verification instead of returning the https version of rstudio. But I’m not having any luck pinning down exactly what needs to go in the file.
(I’m a surgeon by profession, not a server admin, but I’m trying to learn)
Before trying to modify your nginx conf, you should fix the connection issue: the http-01 challenge must use port 80, but your domain can’t be reached on port 80, only on port 443.
$ curl -IkL -m10 http://gnuorleans.org/.well-known/acme-challenge/test
curl: (28) Connection timed out after 10001 milliseconds
$ curl -IkL -m10 https://gnuorleans.org/.well-known/acme-challenge/test
HTTP/1.1 404 Not Found
Server: nginx/1.14.0 (Ubuntu)
Date: Tue, 11 Sep 2018 16:31:21 GMT
Content-Type: text/html
Content-Length: 57
Connection: keep-alive
HTTP/1.1 404 Not Found
Server: nginx/1.14.0 (Ubuntu)
Date: Tue, 11 Sep 2018 22:27:22 GMT
Content-Type: text/html
Content-Length: 57
Connection: keep-alive
So I think nginx is receiving the challenge and handling it by serving up rstudio server over https, rather than acting as a static http file server as the challenge requires. Though this is what I ultimately want, I think that in order to get certbot to work I need to modify the server block for port 80. I'll search along those lines but any more specific tips are greatly appreciated.
To avoid the challenge going through your reverse proxy, you can add two location blocks to your server block for port 80: one that serves the LE challenge from a specific root path, and another that redirects all other requests to your reverse proxy on port 443.
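A minimal sketch of such a server block follows; the webroot path /var/www/letsencrypt is an assumption (any directory nginx can read and certbot can write to works), and you would pass the same path to certbot with --webroot:

```nginx
server {
    listen 80;
    server_name gnuorleans.org;

    # Serve ACME http-01 challenge files directly from disk,
    # bypassing the reverse proxy
    location /.well-known/acme-challenge/ {
        root /var/www/letsencrypt;
    }

    # Redirect everything else to the HTTPS site
    location / {
        return 301 https://$host$request_uri;
    }
}
```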
Anyway, as I said, I can’t connect to your server on port 80 (tested from 5 different countries)… you should double-check this; it could be a firewall issue, a port forwarding issue, or your ISP blocking the port.
I incorporated your code into a file in my /etc/nginx/sites-enabled, and as far as I can tell it works as intended.
I think the problem though is that my ISP (Cox) blocks port 80, and from what I’ve googled they are very rigid about never unblocking it for home accounts, even temporarily. I tried using certonly and webroot as you suggested, and it failed as before.
Is there any way I can get a cert from Letsencrypt with port 80 blocked by my ISP?
In this case, you need to use the DNS challenge method. This doesn’t work with certbot --nginx; ideally it requires a DNS provider with an API that lets software create DNS records automatically. You can use a Certbot DNS plugin, or the acme.sh client, which has broader DNS provider API support.
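For a one-off issuance without any provider API, acme.sh also has a manual DNS mode. Roughly, the flow looks like this (double-check the flags against the current acme.sh documentation; the client prints the TXT record you must create by hand between the two commands):

```
# Start issuance; acme.sh prints the TXT record to add at your DNS provider
acme.sh --issue --dns -d gnuorleans.org \
    --yes-I-know-dns-manual-mode-enough-go-ahead-please

# After the TXT record is in place and has propagated, complete validation
acme.sh --renew -d gnuorleans.org \
    --yes-I-know-dns-manual-mode-enough-go-ahead-please
```

The trade-off of manual mode is that renewal can’t be automated, since a fresh TXT record is needed each time.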
I was able to run acme.sh and validate a cert manually by creating a TXT record for my domain, which is just a DDNS from Google Domains. I realize I will need to renew it every 90 days.
This was quite the pain, but very educational. I’m a liver transplant surgeon in “real life”, not an IT professional, so I feel this was a pretty decent accomplishment as a hobbyist.
Ideally the process can be automated, but you would need a DNS provider that has an API that lets you create DNS records from software (so that Certbot or another client can create the appropriate TXT records for you—the required TXT record content is different every time).
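To illustrate why the content differs every time: for the dns-01 challenge, the TXT record value is the base64url-encoded SHA-256 digest of the key authorization (the server’s fresh token joined with your account key’s thumbprint), so each new order produces a new value. A sketch in Python; the token and thumbprint below are made-up placeholders, not real ACME values:

```python
import base64
import hashlib

def dns01_txt_value(key_authorization: str) -> str:
    """TXT record value for a dns-01 challenge:
    base64url(SHA-256(key authorization)) with padding stripped (RFC 8555, s. 8.4)."""
    digest = hashlib.sha256(key_authorization.encode("ascii")).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")

# Hypothetical token and account-key thumbprint, for illustration only
token = "evaGxfADs6pSRb2LAv9IZf17Dt3juxGJ-PCt92wr-oA"
thumbprint = "9jg46WB3rR_AHD-EBXdN7cBkH1WOu0tA3M9fm21mqTI"
print(dns01_txt_value(token + "." + thumbprint))
```

Since the token changes on every order, so does the digest, which is why the client must be able to create the record programmatically for unattended renewal.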
According to previous forum threads, Google Domains doesn’t provide a suitable API, but if you don’t want to switch providers, you can create a CNAME record (a kind of DNS alias that tells clients the record in question can be found at a different place) that redirects the _acme-challenge name to a provider that does offer an API. You would keep Google Domains for everything else, while the CNAME delegates _acme-challenge to the other provider, where the TXT records can then be created automatically.
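As a sketch in zone-file notation; the right-hand name is a made-up placeholder for whatever name the other provider assigns you:

```
; At Google Domains (zone gnuorleans.org):
_acme-challenge.gnuorleans.org.  300  IN  CNAME  challenge.example-dns-provider.net.

; At the other provider, the ACME client creates and updates:
challenge.example-dns-provider.net.  120  IN  TXT  "<fresh validation value each renewal>"
```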
I promise not to attempt any liver transplants on the basis of online tutorials and forum posts!