Cannot get certificate 404 error [Solved]

Please fill out the fields below so we can help you better. Note: you must provide your domain name to get help. Domain names for issued certificates are all made public in Certificate Transparency logs (e.g. https://crt.sh/?q=example.com), so withholding your domain name here does not increase secrecy, but only makes it harder for us to provide help.

My domain is: remote.gscomputing.co.uk

I ran this command: sudo certbot --apache --agree-tos --redirect --hsts --staple-ocsp --email siv@gscomputing.co.uk -d remote.gscomputing.co.uk

It produced this output: Challenge failed for domain remote.gscomputing.co.uk
IMPORTANT NOTES:

My web server is (include version): Apache2

The operating system my web server runs on is (include version): Ubuntu V19.01

My hosting provider, if applicable, is: N/A

I can login to a root shell on my machine (yes or no, or I don’t know): Yes

I’m using a control panel to manage my site (no, or provide the name and version of the control panel): No

The version of my client is (e.g. output of certbot --version or certbot-auto --version if you’re using Certbot): certbot 0.36.0

Hi,

Do you have any existing configuration for your domain?
It looks like your domain is currently being served by the default virtual host (hence the Apache default page).
Could you please try configuring your website first before applying with the Apache plugin? Or try the webroot method and specify your folder.
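The webroot method mentioned above can be invoked like this (a sketch; the `-w` path is an assumption and should point at whatever directory your vHost actually serves):

```shell
# Hypothetical webroot invocation: certbot writes the challenge file under
# <webroot>/.well-known/acme-challenge/ and Let's Encrypt fetches it over HTTP,
# so the -w path must match the DocumentRoot that answers for this domain.
sudo certbot certonly --webroot \
  -w /var/www/remote.gscomputing.co.uk/public_html \
  -d remote.gscomputing.co.uk
```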

Thank you

Hi @Siv

checking your domain there is a Letsencrypt certificate, 9 days old - https://check-your-website.server-daten.de/?q=remote.gscomputing.co.uk#ct-logs

Issuer: Let’s Encrypt Authority X3
Not before: 2019-11-06
Not after: 2020-02-04
Domain names: remote.gscomputing.co.uk - 1 entries

How did you create that certificate? What did you change so that your configuration no longer works?

Juergen,
I am switching from a ClearOS server to my new Ubuntu server, so the other certificate is from the old one.
I am configuring the Ubuntu one to be pretty much the same.
What I am getting slightly confused about is how the config file and the physical web root interact. The procedure I am following to set up email assumes you are hosting on an ISP-provided server and that your domain is a virtual one. My DNS records point the remote.gscomputing.co.uk domain to the root website, which is in /var/www/html; on older Ubuntu servers this was just /var/www, I think.
I created the folder /var/www/remote.gscomputing.co.uk, and when I created the site it added public_html and the usual default index.html page.

It seems to be working because if you browse to http://remote.gscomputing.co.uk you do see the default Apache page.

I then added /.well-known/acme-challenge/.
Because it wasn’t working, I tried changing the /etc/apache2/sites-available/remote.gscomputing.co.uk.conf file to this:

<VirtualHost *:80>
	ServerName remote.gscomputing.co.uk
	DocumentRoot /var/www/html/
</VirtualHost>

Originally it was:

<VirtualHost *:80>
	ServerName remote.gscomputing.co.uk
	DocumentRoot /var/www/remote.gscomputing.co.uk
</VirtualHost>

So I am confused as to what is going on.
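For what it's worth, Apache matches the request's Host header against each vHost's ServerName and then serves files from that vHost's DocumentRoot, so the challenge file must sit under whichever root actually answers for the domain. A minimal sketch combining the two configs above (the public_html path is an assumption based on the folder layout described earlier):

```apache
<VirtualHost *:80>
    # Requests whose Host header is this name...
    ServerName remote.gscomputing.co.uk
    # ...are served from this directory, so the ACME challenge must be
    # reachable here under /.well-known/acme-challenge/
    DocumentRoot /var/www/remote.gscomputing.co.uk/public_html
</VirtualHost>
```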

Following some of the other support messages on here, I tested being able to access a file in /.well-known/acme-challenge/: I created a file named 1234 in there, and if you browse to

http://remote.gscomputing.co.uk/.well-known/acme-challenge/1234 it displays a blank page, so I know it is reachable. So I am baffled as to why Let’s Encrypt is getting a 404 error.
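The same browser test can be done from the command line, which also shows the HTTP status code (a sketch; the file name 1234 follows the test above):

```shell
# -i prints the response headers: a 200 status means the path is reachable,
# while a 404 means Apache is serving the request from a different root.
curl -i http://remote.gscomputing.co.uk/.well-known/acme-challenge/1234
```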

Is it some sort of permissions issue? I have set the ownership of the whole tree as follows:
sudo chown www-data:www-data /var/www/remote.gscomputing.co.uk -R

In the procedure I am following, I am told that www-data is the Apache user, hence why it needs permissions.
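Strictly, Apache only needs read access, not ownership. A small sketch of the permission rule (directories need the execute bit, files need the read bit), using a temporary directory rather than the real webroot:

```shell
# Simulate a webroot with a challenge file, then set permissions so that
# a non-owning user such as www-data could read it.
webroot=$(mktemp -d)
mkdir -p "$webroot/.well-known/acme-challenge"
echo "token" > "$webroot/.well-known/acme-challenge/1234"
# 755 on every directory in the path: others get read + traverse
chmod 755 "$webroot" "$webroot/.well-known" "$webroot/.well-known/acme-challenge"
# 644 on the file itself: others get read
chmod 644 "$webroot/.well-known/acme-challenge/1234"
# Show the resulting mode of the challenge file
stat -c '%a' "$webroot/.well-known/acme-challenge/1234"
```

With that in place, `chown www-data:www-data` is not strictly required for serving the files, though it does no harm here.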

Thanks for your help.

What is the output of

apachectl -S
VirtualHost configuration:
*:80                   is a NameVirtualHost
     default server remote.gscomputing.co.uk (/etc/apache2/sites-enabled/000-default.conf:1)
     port 80 namevhost remote.gscomputing.co.uk (/etc/apache2/sites-enabled/000-default.conf:1)
     port 80 namevhost remote.gscomputing.co.uk (/etc/apache2/sites-enabled/remote.gscomputing.co.uk.conf:1)
ServerRoot: "/etc/apache2"
Main DocumentRoot: "/var/www/html"
Main ErrorLog: "/var/log/apache2/error.log"
Mutex default: dir="/var/run/apache2/" mechanism=default 
Mutex mpm-accept: using_defaults
Mutex watchdog-callback: using_defaults
PidFile: "/var/run/apache2/apache2.pid"
Define: DUMP_VHOSTS
Define: DUMP_RUN_CFG
User: name="www-data" id=33 not_used
Group: name="www-data" id=33 not_used

There you see the problem: you have multiple port 80 vHosts with the same domain name, and that’s wrong.

Every combination of port and domain name must be unique.

Rename the default server, restart, then run apachectl -S again to see if the output is better.
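One way to do that renaming (a sketch; the localhost ServerName is an assumption, any name other than the real domain works, since the point is that each port/name combination must be unique):

```apache
# /etc/apache2/sites-enabled/000-default.conf
<VirtualHost *:80>
    # Must differ from remote.gscomputing.co.uk so that only one
    # port-80 vHost answers for that name
    ServerName localhost
    DocumentRoot /var/www/html
</VirtualHost>
```

Then reload with `sudo systemctl reload apache2` and re-check with `apachectl -S`.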

OK, I removed the file I created and now we get:

apachectl -S
VirtualHost configuration:
*:80                   remote.gscomputing.co.uk (/etc/apache2/sites-enabled/000-default.conf:1)
ServerRoot: "/etc/apache2"
Main DocumentRoot: "/var/www/html"
Main ErrorLog: "/var/log/apache2/error.log"
Mutex default: dir="/var/run/apache2/" mechanism=default 
Mutex mpm-accept: using_defaults
Mutex watchdog-callback: using_defaults
PidFile: "/var/run/apache2/apache2.pid"
Define: DUMP_VHOSTS
Define: DUMP_RUN_CFG
User: name="www-data" id=33 not_used
Group: name="www-data" id=33 not_used

OK, since making that change I re-ran:

sudo certbot --apache --agree-tos --redirect --hsts --staple-ocsp --email siv@gscomputing.co.uk -d remote.gscomputing.co.uk

I have now got my Certificate.
Thanks very much for your help.


Yep, now it looks good: no duplicated vHosts, and the certificate creation works :+1:

Yes things seem to be working properly now.
Thanks for all your help, much appreciated!


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.