Successfully deployed certificate for wgbis.ces.iisc.ernet.in to /etc/apache2/sites-available/000-default-le-ssl.conf
Redirecting vhost in /etc/apache2/sites-enabled/000-default.conf to ssl vhost in /etc/apache2/sites-available/000-default-le-ssl.conf
Congratulations! You have successfully enabled HTTPS on https://wgbis.ces.iisc.ernet.in
My web server is (include version): Apache/2.4.41 (Ubuntu)
The operating system my web server runs on is (include version): Ubuntu 20.04
My hosting provider, if applicable, is:
I can login to a root shell on my machine (yes or no, or I don't know): yes
I'm using a control panel to manage my site (no, or provide the name and version of the control panel):
The version of my client is (e.g. output of certbot --version or certbot-auto --version if you're using Certbot): certbot 1.31.0
I got some suggestions that self-signed certificates will work on subdomains, but in my case the URL has two subdomain levels and was failing. Is that the cause?
Are there limits on the number of subdomain levels in a URL?
Thanks, everyone, for the responses. The problem was that after the certificate was generated successfully, I had to pass it to the system admin who handles the domain address and related things. After I passed the certificate along, it is now working. But I don't know what happened behind the scenes.
Now there is a new problem: since the secure connection was enabled, all of the sites with login forms are showing a 403 Forbidden error.
You'd need to point us to an example for more help there, but the thing to investigate is how the web application accepts and processes the supplied login information, since that is the part returning the 403 error. I assume this isn't the first time this site has been HTTPS-enabled?
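One common cause worth checking (this is an assumption, not a diagnosis): when Certbot creates a new SSL vhost, any `<Directory>` access rules from the original HTTP vhost must also be present in it, or Apache 2.4 falls back to its default deny and returns 403. A rough sketch of what the SSL vhost might need, with the `DocumentRoot` and directory path being hypothetical placeholders:

```apache
# /etc/apache2/sites-available/000-default-le-ssl.conf (sketch)
<VirtualHost *:443>
    ServerName wgbis.ces.iisc.ernet.in
    DocumentRoot /var/www/html   # placeholder; use your real path

    # Without an explicit grant, Apache 2.4 denies requests (403)
    <Directory /var/www/html>
        Require all granted
        AllowOverride All
    </Directory>

    SSLEngine on
    SSLCertificateFile /etc/letsencrypt/live/wgbis.ces.iisc.ernet.in/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/wgbis.ces.iisc.ernet.in/privkey.pem
</VirtualHost>
```

Comparing the `<Directory>` blocks in 000-default.conf against 000-default-le-ssl.conf side by side is a quick first check.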
You may also want to know that some of the other links don't work: https://ces.iisc.ernet.in/hpg/envis/ (https://ces.iisc.ernet.in/ is using the cert for *.iisc.ac.in which obviously doesn't match ces.iisc.ernet.in, so it won't work).
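The mismatch above follows from how wildcard certificate names match: a `*` covers exactly one DNS label on the same base domain, never multiple labels or a different domain. A rough sketch of that single-label rule (per RFC 6125) — the function name is ours for illustration, not a library API:

```python
def wildcard_matches(pattern: str, hostname: str) -> bool:
    """Illustrative single-label wildcard matching, roughly per RFC 6125."""
    p = pattern.lower().split(".")
    h = hostname.lower().split(".")
    # A wildcard is only valid as the entire left-most label.
    if p[0] == "*":
        # '*' stands for exactly one label; the rest must match exactly.
        return len(p) == len(h) and p[1:] == h[1:]
    return p == h

print(wildcard_matches("*.iisc.ac.in", "ces.iisc.ac.in"))        # True: one label under the base
print(wildcard_matches("*.iisc.ac.in", "ces.iisc.ernet.in"))     # False: different base domain
print(wildcard_matches("*.iisc.ac.in", "wgbis.ces.iisc.ac.in"))  # False: two labels under the base
```

This is also why the earlier "two subdomains" question matters: `wgbis.ces.iisc.ernet.in` would need either its own certificate or a wildcard issued for `*.ces.iisc.ernet.in` specifically.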
I recommend having a test environment for your systems, so you can recreate your entire live website on a set of test servers. This lets you test changes, improve your system administration practices, and work out configuration problems before they become problems on your live servers.
It seems like you have a central Squid server (which, incidentally, is a very dated version) handling traffic to your websites and then passing it to the internal servers that actually serve each site. This means the Squid server is the one terminating TLS and presenting all of the HTTPS connections/certificates. It may be time to modernise that approach.
As an aside, I'd also be very confident that some of your website content dates back about 25 years (.htm files and table based layouts are the clue), so it might be worth having a project to modernise everything (content included) if these websites are important. In doing so you could also consider moving to modern webservers, content management platforms etc. This has the advantage of up-skilling the current people who support the system, rather than having them maintain old systems they don't understand.