(Solved) Invalid response


#1

Hello! First time poster and user of this software.

I am running a DigitalOcean droplet with Ubuntu 16.04.3 LTS. All of my domains have A records pointing to the server IP. However, when I try to make a certificate for them, hyperdefined.wtf fails. The other two work perfectly fine.

Domains:
hyper.rip
hyperdefined.space
hyperdefined.wtf

I ran this command:
sudo certbot --nginx -d hyper.rip -d hyperdefined.space -d hyperdefined.wtf

It produced this output:
IMPORTANT NOTES:

Like I said, that domain has the correct A records. I can visit the site and it loads just fine. Strangely, I just can't make a certificate for it.


#2

Hi @hyperdefined,

What version of Certbot are you running?

It looks like Certbot failed to correctly reconfigure your nginx to pass the challenge. This might be a bug related to Certbot’s ability to parse nginx configurations, particularly if there’s something slightly unusual about your nginx configuration.

Certbot would have tried to temporarily create a configuration allowing http://hyperdefined.wtf/.well-known/acme-challenge/wHoV8ozcCbEESjS3zDpCrNPNw3xgE7hABVYW_1NdBe0 to serve a static file whose contents were specified by the CA as part of the challenge. Yet instead of that file's contents being served, we apparently just see a generic WordPress error.
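To illustrate, the temporary rule would be something to this effect (a rough sketch, not Certbot's literal output; the response body is a placeholder for the key authorization the CA expects):

location = /.well-known/acme-challenge/wHoV8ozcCbEESjS3zDpCrNPNw3xgE7hABVYW_1NdBe0 {
    default_type text/plain;
    # illustrative placeholder; Certbot fills in the real key authorization
    return 200 "<key authorization supplied by the ACME client>";
}

If a request for that URL reaches a server block Certbot didn't modify, you get the site's normal response (here, a WordPress error) instead of the challenge file.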

I noticed that your sites are behind CloudFlare and have IPv6 addresses (two things that often cause trouble when obtaining certificates), but on further examination, neither of these appears to be related to the problem.

Do all three of your sites have essentially the same nginx configuration, created in the same way, or is there something customized about the nginx configurations?


#3

This is what I believe the version is: certbot/xenial,now 0.21.1-1+ubuntu16.04.1+certbot+0.2 all [installed,automatic]. Sorry if this is wrong; I'm a first-time user.

This is probably wrong, but here is my /etc/nginx/sites-available/default file.

The format might be messed up; I'm away right now and don't have SFTP access.

Again, the config is probably wrong since I am new to all of this.


#4

I read online that I should set up virtual hosts. Would that be the right approach for serving my 3 domains from the same server?


#5

I think the separate server blocks for your sites basically already correspond to what Apache calls virtual hosts. There is a style where you put each one in a separate file (typically in the sites-available directory), but I don’t think that’s related to your problem in this case.
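For example, a minimal per-domain file such as /etc/nginx/sites-available/hyperdefined.wtf.conf might look something like this (a sketch only; the root path is a placeholder for your actual web root):

server {
    listen 80;
    listen [::]:80;

    server_name hyperdefined.wtf;

    # placeholder path; point this at wherever the site actually lives
    root /var/www/hyperdefined.wtf;
    index index.html index.php;

    location / {
        try_files $uri $uri/ =404;
    }
}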


#6

Yeah, I just set up the virtual hosts for my domains, and the .wtf domain still does not work. You are correct.

In the sites-available folder, I have the default config with the domain hyper.rip.
Then I have two files, hyperdefined.space.conf and hyperdefined.wtf.conf, which are identical except that I changed the server_name in each to match the domain.

It is just confusing that this one domain will not work.


#7

Are they both linked from sites-enabled?
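You can check with something like this (assuming the standard Debian/Ubuntu layout):

ls -l /etc/nginx/sites-enabled/    # each entry should be a symlink into sites-available
sudo nginx -t                      # confirm the configuration parses cleanly
sudo systemctl reload nginx        # reload so new server blocks take effect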


#8

Yep, I used this command to link it:
ln -s /etc/nginx/sites-available/hyperdefined.wtf.conf /etc/nginx/sites-enabled/hyperdefined.wtf.conf

Also did the same with the .space one.

Here is the config I am using: https://gist.github.com/hyperdefined/c15d258299dd15997bc48296b6ff6870


#9

Solved. I messed around with Cloudflare settings and now it works.


#10

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.