Hello. Just like many times before, I ran this command on my Ubuntu server:
sudo certbot certonly --webroot -w /var/www/rgarza/mywebsite.mx -d www.mywebsite -d mywebsite.mx
It gave me the usual congratulations output stating that the certificates were successfully generated.
So that part is fine. But when I then run the usual sudo nginx -t
command to check for errors before reloading nginx, I get this unusual error:
[emerg] BIO_new_file("/etc/letsencrypt/live/mywebsite.mx/fullchain.pem")
failed (SSL: error:02001002:system library:fopen:No such file or directory:fopen('/etc/letsencrypt/live/mywebsite.mx/fullchain.pem','r')
error:2006D080:BIO routines:BIO_new_file:no such file)
nginx: configuration file /etc/nginx/nginx.conf test failed
This sounds to me like a permission problem. I don't know why I'm getting this error now, because this is the exact procedure I follow every time I need to generate a certificate for any given website on this server.
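For what it's worth, the fopen error says "No such file or directory" rather than "Permission denied", so I tried to tell the two cases apart with a little check script (just a sketch I put together; the path is the one from the nginx -t error, adjust it if yours differs):

```shell
#!/bin/sh
# Distinguish "file missing" from "file exists but unreadable".
# Run with sudo, otherwise the 0700 permissions on /etc/letsencrypt
# make everything below it look missing to a normal user.
check_cert() {
  if [ ! -e "$1" ]; then
    echo "missing: $1"
  elif [ ! -r "$1" ]; then
    echo "unreadable: $1"
  else
    echo "readable: $1"
  fi
}

check_cert /etc/letsencrypt/live/mywebsite.mx/fullchain.pem
```

Running it as sudo sh check_cert.sh reports "missing" for this file, which is why I'm confused about it being a permission issue.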
The first three lines of my nginx.conf are:
user www-data;
worker_processes auto;
pid /run/nginx.pid;
And the relevant server blocks for this particular website, stored in the sites-available folder, are these:
server {
    listen 80;
    server_name mywebsite.mx www.mywebsite.mx;
    rewrite ^/(.*) https://mywebsite.mx/$1 permanent;
}

server {
    listen 443 ssl http2;
    server_name www.mywebsite.mx;

    ssl_certificate /etc/letsencrypt/live/mywebsite.mx/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/mywebsite.mx/privkey.pem;
    ssl_prefer_server_ciphers on;

    return 301 https://mywebsite.mx$request_uri;
}

server {
    listen 443 ssl http2;
    server_name mywebsite.mx;

    ssl_certificate /etc/letsencrypt/live/mywebsite.mx/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/mywebsite.mx/privkey.pem;
    ssl_prefer_server_ciphers on;

    root /var/www/rgarza/mywebsite.mx;
    index index.php index.html index.htm;

    location / {
        try_files $uri $uri/ /index.php?q=$request_uri;
    }

    # deny running scripts inside writable directories
    location ~* /(images|cache|media|logs|tmp)/.*\.(php|pl|py|jsp|asp|sh|cgi)$ {
        return 403;
        error_page 403 /403_error.html;
    }

    location ~ \.php$ {
        try_files $uri =404;
        fastcgi_split_path_info ^(.+\.php)(/.+)$;
        fastcgi_pass unix:/run/php/php7.0-fpm.sock;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_index index.php;
        include fastcgi_params;
    }

    location ~* \.(jpg|jpeg|svg|png|gif|ico|css|js|woff)$ {
        expires 20d;
    }

    location ~* \.(pdf)$ {
        expires 30d;
    }

    location ~ /\.ht {
        deny all;
    }
}
I have many certificates generated this way and already working; this is the first time I've had this problem. I guess it has something to do with the apt update and apt upgrade I ran last week; maybe something changed because of that upgrade. Anyway, nginx does not complain about the old certificates from the other sites, just this one, and it looks like a permission problem, as I said before.
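In case it helps, certbot has a subcommand that lists every certificate it manages together with the exact live/ path each one was stored under (this is the stock certbot certificates subcommand, no extra flags; I haven't fully dug through its output yet):

```shell
# List all certificates certbot manages on this machine, with the
# live/ directory each one was saved under and its expiry date.
# Needs root because /etc/letsencrypt is only readable by root.
sudo certbot certificates
```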
I have no experience changing permissions. Can someone point me in the right direction here? By the way, I am NOT using Docker.
Thanks in advance.