My domain is: chux.info
I ran this command:

```python
import json
import requests

url = "https://services.smartmetertexas.net/15minintervalreads/"
headers = {"Content-Type": "application/json"}
payload = {
    <redacted>
}
cert_file = "chux-info-cert.pem"
r = requests.post(url, data=payload, headers=headers, verify=cert_file)
print(r)
```
It produced this output:

```
ssl.SSLError: ("bad handshake: Error([('SSL routines',
'tls_process_server_certificate', 'certificate verify failed')])",)
```
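Since the error is about verifying the *server's* certificate, a snippet like this (my own sketch; `show_server_cert` is a helper name I made up) should dump the certificate the server actually presents, with verification deliberately disabled so it can be inspected:

```python
import socket
import ssl


def show_server_cert(host: str, port: int = 443) -> str:
    """Return the PEM of whatever certificate the server presents.

    Verification is disabled on purpose: the point is to look at the
    cert that is failing to verify, not to trust it.
    """
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE  # inspect only, do not validate
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)  # raw DER bytes
    return ssl.DER_cert_to_PEM_cert(der)


# Example (not run here):
# print(show_server_cert("services.smartmetertexas.net"))
```

The returned PEM can then be decoded (e.g. with `openssl x509 -text`) to see who issued the server's cert and why the default trust store might reject it.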
My web server is (include version): this Docker image, latest version: https://hub.docker.com/r/linuxserver/letsencrypt/
The operating system my web server runs on is (include version): whatever the linuxserver/letsencrypt Docker image above ships (the host is Unraid)
My hosting provider, if applicable, is: N/A
I can login to a root shell on my machine (yes or no, or I don’t know): yes
I’m using a control panel to manage my site (no, or provide the name and version of the control panel): N/A
The version of my client is (e.g. output of `certbot --version` or `certbot-auto --version` if you're using Certbot): 1.3.0
My issue is: I have a remote API I want to use, and they required me to get a static IP and provide them that IP plus a single-domain (i.e. not wildcard) certificate. So on my home server (I run Unraid on bare metal and then run Docker containers) I set up the letsencrypt Docker image linked above. Certificate generation worked great; I ended up with a directory of .pem files: cert, chain, fullchain, privkey, and priv-fullchain-bundle. I sent the fullchain one to the remote API administrators along with my IP, and they claim to have set it up on their end.

However, every time I run the Python script above I get "certificate verify failed". I have tried each of the cert files, and I have also tried passing the file as the `cert=` parameter instead of the `verify=` parameter. Any thoughts would be greatly appreciated. I'm a long-time software developer but fairly ignorant of how certs work.