Issue certificate for locally hosted webserver

Hi everyone,

I am trying to create a globally accessible HTTPS (CA-issued) webserver hosted from within my local network via a Raspberry Pi 4B running Ubuntu Server 21.04. It comprises a Node.js/Parcel-based front-end UI which is accessible internally via port 1234, a Node.js/Express-based back-end API which the front-end accesses over port 3000, and a MongoDB database. It is accessible via my paid (cPanel-managed) subdomain as follows:

  • Subdomain => frontend via its forwarded port
  • Frontend => backend API via its forwarded port
  • Backend API => database via its local connection string

I managed to get it all working using self-signed certificates – one each for the front-end and back-end. What I would really like is to have it all working using CA-issued certificates, so that neither I nor other device/platform users are required to manually approve the self-signed certificates for the front-end and back-end, and to remove that ugly-looking red padlock in Chrome which warns users that the connection may not be secure.

I have seen a few videos indicating that this is possible using DuckDNS (e.g., this YouTube link), so I am sure it must also be doable with a private domain. However, despite my attempts to learn more and implement it, it seems beyond my current level of understanding. I have also looked at tools such as certbot but wasn't able to get a working solution.

Ultimately, I would like to be able to have a perpetually signed/re-issued CA certificate. I understand that this could be achievable in some way by creating a crontab-managed script.
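(For reference, if an ACME client such as certbot ends up being the tool here, the cron part is usually a one-liner. The entry below is only a sketch – the schedule and the reload command are placeholders, and it assumes certbot already manages the certificate:)

```shell
# Illustrative crontab entry: ask certbot twice a day to renew any
# certificate nearing expiry, and reload the web service if it did.
17 3,15 * * * certbot renew --quiet --deploy-hook "systemctl reload nginx"
```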

I would be very grateful for any information or guidance on IF/HOW this is achievable. Happy to provide any further information that I may have missed.

Many thanks in advance.


Hi @sslNoob, and welcome to the LE community forum :slight_smile:

I guess the hardest part has already been done: you've managed to put all the pieces together into a setup that works.

As for switching to a globally trusted CA, that part requires an ACME client that can validate control of your domain via either:

  • HTTP validation (HTTP-01) against an FQDN that can be resolved and reached via the Internet.
    [that may pose a problem for the backend system - but may go unnoticed by Internet users (hard to say from the details provided thus far)]
  • DNS validation (DNS-01) against an FQDN in an Internet DNS zone that can be updated via API.
    [in order to meet your "perpetual" requirement, renewal would have to be automated]

So, which of the two systems (front/back) has an Internet-resolvable FQDN that can be reached via HTTP?
[you don't need to have a service running on port 80 - it merely needs to be reachable via port 80 during validation]
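To make the HTTP route concrete, a typical certbot invocation could look like the following. This is only a sketch: `app.example.com` and the webroot path are placeholders, and it assumes certbot is installed and port 80 is forwarded to this host:

```shell
# HTTP-01 validation: Let's Encrypt fetches a token over port 80,
# so the FQDN must resolve to this host and port 80 must reach it.
sudo certbot certonly --standalone -d app.example.com

# Or, if something is already listening on port 80, use webroot mode
# so certbot serves the token through the existing web server:
# sudo certbot certonly --webroot -w /var/www/html -d app.example.com
```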


Hi rg305,

Firstly, thank you for the warm welcome and the quick response!

Please forgive my lack of understanding of this; I will do my best to keep up.
The front end resolves via my subdomain and is accessible via port 443 using my self-signed certificate.
Regarding port 80, do you mean I should just forward port 80 to the front end's IP address?

Thank you very much again!

Update: An interesting thing just happened... I copied the certificate and key from my hosted subdomain and used them in both the front-end and back-end servers. The front end now appears to be trusted, but the back end still requires manual approval before it is added to the trusted certificates list...

This is the message that Firefox gives me:

Websites prove their identity via certificates. Firefox does not trust this site because it uses a certificate that is not valid for MY.PUBLIC.IP.BACKEND:FORWARDED_PORT. The certificate is only valid for the following names: […]
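(That message means the name being visited – here the bare IP:port – is not among the certificate's Subject Alternative Names. One way to see exactly which names a certificate covers is with OpenSSL. The block below is a demo with illustrative names: it builds a throwaway self-signed cert and prints its SAN list, assuming OpenSSL 1.1.1 or newer for `-addext`:)

```shell
# Create a demo self-signed cert whose SAN covers a DNS name and an IP.
openssl req -x509 -newkey rsa:2048 -nodes -keyout demo.key -out demo.crt \
  -days 1 -subj "/CN=frontend.example.com" \
  -addext "subjectAltName=DNS:frontend.example.com,IP:203.0.113.10"

# Print the names the cert is valid for -- the list browsers check.
openssl x509 -in demo.crt -noout -text | grep -A1 "Subject Alternative Name"
```

A request to any host or IP not in that printed list produces exactly the warning quoted above.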


I would be very grateful for any help on this.


Here is a link on setting up your own CA / server / client certs: OpenSSL Certificate Authority — Jamie Nguyen
I use LE for public sites and my own certificates (with client certificates) for private ones.
See the doc: you first create a CA cert, then use it to sign the server/client certs.
Then import the CA cert on all computers (the procedure differs per OS/browser); after that, all browsers accept your own certs with no bad-certificate warnings.
With a script I can switch between LE and my own certs.
It's easy.
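To sketch that flow (all file names and the hostname below are illustrative, and this assumes a reasonably recent OpenSSL; the linked guide covers the full, more careful setup):

```shell
# 1. Create the private CA: a key plus a long-lived self-signed CA cert.
openssl genrsa -out myca.key 4096
openssl req -x509 -new -key myca.key -sha256 -days 3650 \
  -subj "/CN=My Private CA" -out myca.crt

# 2. Create a key and a certificate signing request (CSR) for the server.
openssl genrsa -out server.key 2048
openssl req -new -key server.key -subj "/CN=myhost.home.lan" -out server.csr

# 3. Sign the CSR with the CA, adding the SAN entry browsers require.
printf "subjectAltName=DNS:myhost.home.lan\n" > server.ext
openssl x509 -req -in server.csr -CA myca.crt -CAkey myca.key \
  -CAcreateserial -days 825 -sha256 -extfile server.ext -out server.crt
```

Importing `myca.crt` into each client's trust store then makes every certificate signed in step 3 trusted on that device.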


@sslNoob @jens_hb brings up a good point (that we haven't yet covered):
Will you be serving any unknown clients from the Internet - or is this level of security only for your controlled devices?
If you are serving the Internet, then you will need a valid cert from a trusted CA.
In order to get one you will first need an FQDN that can be resolved via the Internet (i.e., preferably from a domain you own/control).
That said, which of the two systems will be accessed directly from the Internet?
[the front-end/back-end scenario is typical with web front-ends and database back-ends - where the client only really sees the web front-end]

If you are only going to serve systems under your control, then you can use your own private CA and issue certs that last 100 years if you like.


Thanks for your help guys, I really appreciate it!

Initially, the app will be used by a variety of known clients. While the userbase will not be massive, it will be spread across the world and across a variety of devices/browsers. For ease of setup and explanation, it would be ideal if they didn't have to muck around adding exceptions or face scary-looking security warnings in their browsers. The user age range and technical abilities are quite broad as well, which is something I am trying to keep in consideration. And to top it off, it's a matter of personal pride and a valuable skill I would like to develop. :slight_smile: Plus, who doesn't love a secure site! :grinning_face_with_smiling_eyes:

With that said, I managed to have a breakthrough overnight and got both the front end and back end working: I shared my web host's subdomain certificate and key between the front end and back end, closed off the back-end API to the public, and had the front end access it internally.
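(For anyone replicating this, the "closed off to the public" part can be done at the firewall instead of, or in addition to, removing the port forward. A hypothetical ufw sketch – the port and LAN subnet are placeholders for your own values:)

```shell
# Allow the backend port only from the local subnet; drop it otherwise.
sudo ufw allow from 192.168.1.0/24 to any port 3000 proto tcp
sudo ufw deny 3000/tcp
```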

Thank you for your help!


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.