Webroot plugin: The client lacks sufficient authorization

Thanks anyway.

Anyone else?

@billp: I think the files in acme-challenge are created and then destroyed during the process. Check the log.

And I really think this is a chmod issue. If the validation files are created as root, "other" has to be able to read them; otherwise Apache can't serve the validation files in acme-challenge.
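A quick way to check and fix that by hand (the webroot path here is only an example, adjust it to your vhost):

# see who owns the challenge files and whether "other" can read them
ls -la /var/www/example.com/.well-known/acme-challenge
# make the directories traversable and the files world-readable
find /var/www/example.com/.well-known -type d -exec chmod 755 {} \;
find /var/www/example.com/.well-known -type f -exec chmod 644 {} \;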

@lsccommunication I've been looking through the logs, but I'm not at all clear what LE does to create a certificate. I mentioned that there are no files in acme-challenge to point out that it hadn't got as far as creating files there. I don't know if the files it creates there are temporary or if they persist. Having looked at the logs, I see that LE removes the files it has created there as part of "Cleaning up challenges". I don't know if it cleaned them up because of the error it found, or because they are only a temporary requirement for the setup process. If they are temporary, then I assume it would remove the folders too, but perhaps it needs those later.

So I'm thrashing around in the dark, really. There are several instances in the log like this:
{"type":"urn:acme:error:unauthorized","detail":"Invalid response from http://www.domain.com/.well-known/acme-challenge/OQmJaWYAdyznW-oAIgtbZbpIgMr9795wEr4ocI60QDQ [176.227.202.115]: 403"}
but these only confirm the 'what', rather than the 'why'.

If it is a chmod issue, I can't yet find the answer that gets me moving.
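One thing I can still try to narrow down the 'what' versus the 'why' (the domain and webroot below are placeholders for my actual vhost): drop a test file into the challenge directory and fetch it over plain HTTP.

echo test > /var/www/example.com/.well-known/acme-challenge/test.txt
curl -i http://www.example.com/.well-known/acme-challenge/test.txt
# a 403 here means the web server or its config is blocking the path on its own,
# independent of anything the LE client does
rm /var/www/example.com/.well-known/acme-challenge/test.txt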

I don't think we can solve it ourselves.
I think the LE script creates the files and folders with the wrong permissions.

So the solution would be an update of the LE script which corrects this issue.

I use a simple bash script for a single domain that checks the domain and creates the /var/www/${domain}/web/.well-known/acme-challenge folders. (I use ISPConfig, and the le2ispc script rather than letsencrypt-auto directly, but it should be easy to modify the script.)

#!/bin/bash
# Ensure the ISPConfig webroot has a .well-known/acme-challenge folder,
# then run le2ispc for the given domain.

if [ -z "$1" ]; then
    echo "needs a domain name"
    exit 1
fi

domain="$1"
webroot="/var/www/${domain}/web"

if [ ! -d "${webroot}" ]; then
    echo "Domain name seems incorrect"
    exit 1
fi

if [ ! -d "${webroot}/.well-known/acme-challenge" ]; then
    echo "creating ${webroot}/.well-known/acme-challenge"
    mkdir -p "${webroot}/.well-known/acme-challenge"
fi

cd /home/atbsaa/letsencrypt || exit 1
./le2ispc "${domain}"

which was my temporary workaround, if that helps.
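If you save it as, say, le-challenge.sh (the name is arbitrary), usage is just:

chmod +x le-challenge.sh
sudo ./le-challenge.sh example.com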

Thanks to all, but I've found a problem that seems to be one for the LE guys. I was attempting to create a certificate for a domain with one of the new TLDs (.domains).

I tried again with the exact same setup (same server, same WHM) but with a .net domain. Voila!

It seems that, for some reason, LE can't work with just any TLD at the moment.


I have the same problem and cannot work around it by changing the permissions with chmod -R 777 .well-known. I can't confirm that this is domain-specific; I tried with .de, .org and others, but still get the error.

I have a similar problem, and in my case the cause was a redirect on the root directory of my web server.

Hi,

I just struggled with the same problems as the original poster @lsccommunication and seem to have solved them. Some of you may have the same setup as I do, so here is the root cause and the resolution:

In my basic nginx configuration, which is global and applies to all of my sites, I deny access to any hidden files. My configuration snippet looks like

location ~ /\. { deny all; }

That was the reason why I got the unauthorized error. My resolution was to add another directive to my nginx configuration like

location ~ /\.well-known { allow all; }

in front of

location ~ /\. { deny all; }
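To double-check that the ordering works, you can reload nginx and test both paths from the shell (the domain, webroot, and file name below are just placeholders):

sudo nginx -t && sudo systemctl reload nginx   # or "service nginx reload" on non-systemd setups
echo ok > /path/to/webroot/.well-known/acme-challenge/test.txt
curl -i http://example.com/.well-known/acme-challenge/test.txt   # should now return 200
curl -i http://example.com/.hidden                               # should still be denied (403)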

Hope this may be of interest to you.


I have the same issue with insufficient authorization while trying to update certs.
But I have proof that the Let's Encrypt server was able to access the files:

66.133.109.36 - - [10/Mar/2016:21:45:30 +0200] "GET /.well-known/acme-challenge/9zltwhE1Ivhsp9FvrYqOG1BIzBUP24a8gKnje64u1ik HTTP/1.0" 200 360 "-" "Mozilla/5.0 (compatible; Let's Encrypt validation server; +https://www.letsencrypt.org)"

I'm trying to create certs for a subdomain (sub.domain.tld, for example) on an Ubuntu 14.04 server with the Vesta control panel installed.

Update: I found the reason why that happened.
I was trying to create a cert for a subdomain of a domain which already had a certificate.
Letsencrypt was trying to reach the .well-known/…/… files on both the main domain and its subdomain, but they are physically in different webroots.
I had to create a symlink of .well-known from the webroot of domain.tld into the subdomain's webroot; after that I managed to create the certificate with
./letsencrypt-auto certonly --webroot -w /home/admin/web/domain.tld/public_html/ -d sub.domain.tld -d www.sub.domain.tld
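For reference, the symlink itself was along these lines (the subdomain path is my guess at the Vesta layout and may differ on your setup):

# make sub.domain.tld's /.well-known resolve to the parent domain's copy
ln -s /home/admin/web/domain.tld/public_html/.well-known \
      /home/admin/web/sub.domain.tld/public_html/.well-known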

Same here: urn:acme:error:unauthorized with a 403 code.

The strange part is that, on the same cPanel server, I can successfully create the certs for all domains except one.

It turned out that my .htaccess security rules were blocking dotted files. Just rename your .htaccess file to htaccess.txt and try again.
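In shell terms the temporary dance is just (the webroot path is only an example):

cd /home/user/public_html       # your site's document root
mv .htaccess htaccess.txt       # disable the rules
# ... run the letsencrypt webroot command here ...
mv htaccess.txt .htaccess       # put the rules back afterwards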


Has anyone found a resolution for letsencrypt writing the files as root:root? My server will not read these files at all.

Is there a way to have letsencrypt write the .well-known folders and files as www-data?

You can set up certbot to run as a non-root user, but it may be easier to use one of the alternative clients, which will happily run as a different user by default.
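As a rough sketch of the first option (the user name, group, and directories are arbitrary examples, not an official recipe): give a dedicated user write access to the challenge directory and point certbot's state directories somewhere that user can write.

# one-time setup as root: hypothetical user "acme" in the www-data group
mkdir -p /var/www/example.com/.well-known/acme-challenge
chown -R acme:www-data /var/www/example.com/.well-known
# then, as the "acme" user:
certbot certonly --webroot -w /var/www/example.com -d example.com \
  --config-dir ~/le/config --work-dir ~/le/work --logs-dir ~/le/logs

The challenge files end up owned by whichever user runs the client, so running it as a member of www-data gets you readable files without any chmod afterwards.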

Thanks so much! This finally solved the permission problem on my nginx/Ghost blog.
I didn't know what to do, but I first added this block to my nginx conf before I issued the generate command.

location ~ ^/.well-known {
        root /var/www/ghost;
    }

But still no luck until I added your line. I did not have any rule for hidden files like you do. The block below worked perfectly and the certificate was issued:

location ~ ^/.well-known {
        root /var/www/ghost;
        allow all;
    }

I was running into this issue as well. In my case, it was a Drupal 7 site, and the problem turned out to be that Drupal's .htaccess file was preventing the .well-known path from being accessed. I temporarily removed my .htaccess file and was able to generate the certificate.

This is a known problem with Drupal 7; a fix is being worked on here.

DUH. The owner/group was wrong.