Remote API calls with certificate failing

My domain is:

I ran this command:

import requests
import json

url = ""
headers = {'Content-Type':'application/json'}
payload = {}  # request body omitted in the original post

cert_file = 'chux-info-cert.pem'

r = requests.post(url, data=payload, headers=headers, verify=cert_file)

It produced this output:

ssl.SSLError: ("bad handshake: Error([('SSL routines',
'tls_process_server_certificate', 'certificate verify failed')])",)

My web server is (include version): this docker, latest version

The operating system my web server runs on is (include version):

My hosting provider, if applicable, is: N/A

I can login to a root shell on my machine (yes or no, or I don’t know): yes

I’m using a control panel to manage my site (no, or provide the name and version of the control panel): N/A

The version of my client is (e.g. output of certbot --version or certbot-auto --version if you’re using Certbot): 1.3.0

My issue is that I have a remote API I want to use, and its administrators required me to get a static IP and provide them that IP along with a single-domain certificate (i.e. not a wildcard). So on my home server (I run Unraid on bare metal and run Docker containers on top of it) I set up the letsencrypt docker linked above. The cert generation worked great, and I ended up with a directory of .pem files: cert, chain, fullchain, privkey, and priv-fullchain-bundle. I sent the fullchain one to the remote API administrators along with my IP, and they claim to have set it up on their end.

However, every time I run the Python script above I get "certificate verify failed". I have tried each of the cert files in that directory, and I have also tried passing the file as the cert= parameter instead of the verify= parameter. Any thoughts would be greatly appreciated. I'm a long-time software developer but fairly ignorant of how certs work.
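For what it's worth, in requests the two parameters mentioned above do different jobs: verify= controls how the *remote server's* certificate is checked (it takes True, False, or a path to a CA bundle), while cert= presents *your own* certificate to the server for client authentication. A minimal sketch of the distinction, assuming the .pem filenames below stand in for the real ones (the URL is a placeholder, not the actual API):

```python
import requests

# verify= : how to check the SERVER's certificate.
# cert=   : YOUR certificate + key, presented to the server (client auth).
session = requests.Session()

# Verify the remote server against the default CA bundle shipped with
# requests (certifi) -- usually what you want for a public API:
session.verify = True

# Present your own Let's Encrypt cert and private key as a client
# certificate (filenames here are assumptions, not from the post):
session.cert = ("fullchain.pem", "privkey.pem")

# A request through this session would then look like:
# session.post("https://api.example.com/endpoint", json={"key": "value"})
```

Passing your own certificate to verify=, as in the script above, tells requests to trust only servers signed by *that* certificate, which the remote API's server is not.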


Hi @impala454

your code sends something to


  • that url doesn’t answer
  • you can’t use your own Letsencrypt certificate as the verify= value: that parameter expects the CA bundle that signed the *remote* server’s certificate, not a certificate for your own domain
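To see which certificate the remote API actually presents (and therefore which CA would need to be in your verify= bundle), you can pull it down directly. A diagnostic sketch, assuming the hostname below is a placeholder for the real API host; verification is deliberately disabled here because the point is to inspect an untrusted certificate, never do this for real traffic:

```python
import socket
import ssl


def fetch_server_cert_pem(host: str, port: int = 443) -> str:
    """Return the PEM text of the certificate a server presents."""
    # Unverified context: for diagnosis only, so the handshake succeeds
    # even though we don't trust the server's CA yet.
    ctx = ssl._create_unverified_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
            return ssl.DER_cert_to_PEM_cert(der)


# Example use against the real API host (placeholder shown):
# print(fetch_server_cert_pem("api.example.com"))
```

Feeding the printed certificate's issuing CA chain to verify= (or simply using verify=True if it is a publicly trusted CA) is what makes the handshake succeed.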