HTTPS on localhost not working - apache

I just followed the steps for using SSL on localhost: https://www.digitalocean.com/community/tutorials/how-to-create-a-ssl-certificate-on-apache-for-ubuntu-14-04
But when I access https://localhost, I get this message:
This web page is not available
ERR_CONNECTION_REFUSED
I'm using Apache2 with Ubuntu Trusty on Vagrant.
Let me know if you want more information.
Thanks

There are three possible causes for this message:
Your self-signed certificate is invalid for some reason. Check your Apache error log.
Your Apache SSL/TLS protocols do not match those of your browser. Try something like the following from a command prompt: openssl s_client -connect localhost:443 to test the SSL/TLS connection, and update your question with the output.
There may be a firewall between your browser and your Apache server.
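To rule out the first and third possibilities quickly, here is a minimal sketch, assuming a stock Ubuntu 14.04 Apache install as in the linked tutorial (default-ssl is the tutorial's site name and may differ on your machine):
# enable mod_ssl and the SSL virtual host, then restart Apache
sudo a2enmod ssl
sudo a2ensite default-ssl
sudo service apache2 restart
# verify something is actually listening on port 443
sudo netstat -tlnp | grep :443
If nothing is listening on 443, the browser reports ERR_CONNECTION_REFUSED before any certificate is even involved. Since you are on Vagrant, also make sure port 443 of the guest is forwarded to (or reachable from) wherever your browser runs.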

The problem could be that Apache performs a reverse DNS lookup on the URL, and the result does not match any of your PC's aliases.
Try https://127.0.0.1 or https://<hostname> instead.
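To see which names your machine actually answers to, a small sketch (all standard tools):
hostname -f                  # the machine's fully-qualified hostname
getent hosts 127.0.0.1       # aliases mapped to the loopback address
curl -vk https://127.0.0.1/  # bypasses name resolution; -k ignores the name mismatch on the self-signed cert
If curl connects here but https://localhost does not, the problem is name resolution rather than Apache itself.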

Related

Enable HTTPS on thingsboard

I'm trying to enable HTTPS using this guide (https://thingsboard.io/docs/user-guide/install/pe/add-haproxy-ubuntu/#step-10-refresh-haproxy-configuration), but I got stuck on step 9, I believe.
sudo certbot-certonly --domain your_domain --email your_email
I get the following error
certbot: error: unrecognized arguments: --tls-sni-01-port 8443
As far as I can tell, Let's Encrypt no longer supports this argument (tls-sni-01-port) or ports other than 80 and 443. I got this from https://serverfault.com/questions/805666/certbot-letsencrypt-on-different-port-than-443.
I am uncertain how to solve this problem.
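Since the tls-sni-01 challenge was removed from certbot, one possible workaround is to bypass the guide's wrapper script and call certbot directly with the http-01 challenge. A sketch, assuming port 80 on the host is free and reachable from the internet (your_domain and your_email are the placeholders from the original command):
sudo certbot certonly --standalone --preferred-challenges http --domain your_domain --email your_email
Note that HAProxy expects the certificate and private key combined into one .pem file, something the guide's wrapper script otherwise handles for you.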
Here is my docker-compose.yml for Thingsboard + HTTPS through Nginx reverse proxy with automatic Let's Encrypt certificates: https://github.com/michalfapso/thingsboard_docker_https/
It uses linuxserver/swag which takes care of the certificates and is kept in sync with Let's Encrypt requirements by the linuxserver.io community.

Firefox can find certificate, but curl cannot (while tunneling https through ssh)

Background:
I have an app running on port 8080 on the remote server and an HTTPS ingress proxy on port 443 on the same server, which forwards everything to the app on 8080 after handling the SSL.
What I want to do:
I want to communicate with the app over SSL remotely, even though I cannot access its domain directly (it is on a local network; I can reach the server remotely via a different domain).
What I did:
I tunneled port 443 from the remote server with ssh -L 3001:0.0.0.0:443 user@example.com. I then added 127.0.0.1 example.com to my /etc/hosts to make sure the domain resolves properly on my system.
Now I can enter https://example.com:3001/some/thing/ in Firefox and get a proper response from the server, with everything running over SSL without any problems. I am also able to use curl without checking the certificate: curl --insecure https://example.com:3001/some/thing works fine.
At the same time, the secure curl call fails: curl https://example.com:3001/some/thing gives the error:
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.haxx.se/docs/sslcerts.html
Just to make sure both are using the same certificates, I used this tool: https://curl.haxx.se/docs/mk-ca-bundle.html to create a ca-bundle.crt from the most recent Firefox certificates and passed it to curl with --cacert ca-bundle.crt. No luck; the same error. (I also tried following another curl tutorial on using the local Firefox installation's certs; also no luck.)
Question
What is going on? Why does curl's output differ from Firefox's even though I seem to be using the same certificates? How can I debug this?
Side note
The real reason I am concerned is that with normal (local) access to the server I observed the same behaviour: I could connect to the server over HTTPS through Chrome, but my React Native app could not. I suspect the app uses libcurl under the hood, or something similar, and I believe debugging this problem could help me understand what is wrong with the app.
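One way to start debugging (a sketch; example.com:3001 is the tunnel endpoint from above): inspect the exact chain the server sends. If it presents only the leaf certificate with no intermediate, Firefox can still succeed, because it ships with and caches intermediate certificates, while curl verifies strictly against the bundle given via --cacert and fails with exactly this "unable to get local issuer certificate" error.
# print every certificate the server presents
openssl s_client -connect example.com:3001 -servername example.com -showcerts </dev/null
# compare with curl's verbose verification output
curl -v --cacert ca-bundle.crt https://example.com:3001/some/thing
If the s_client output shows a single certificate, the fix is server-side: configure the ingress proxy to serve the full chain (fullchain rather than just the leaf).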

Browsers refuse to connect to websites without url scheme provided

I recently updated my web server to Ubuntu 16.04, and after the update I'm getting issues with browsers refusing to connect when the URL doesn't include https://.
I made sure to check ufw to verify 'Apache Full' was allowed, and it was; I'm not sure what to check from here. Any help is greatly appreciated! :)
Unless someone else here has already solved the same problem on the same version of Ubuntu, you will probably have to debug this yourself. I cannot debug it for you because I am not at your keyboard, but I can get you started.
From a machine other than the web server, try the command
openssl s_client -connect HOSTNAME:80
Replace HOSTNAME with the web server's hostname. If it complains, "Connection refused," then your new web server is no longer serving HTTP. On the other hand, if OpenSSL connects, then your new web server is at least trying to serve HTTP. (Note that OpenSSL, called as above, won't do anything useful once connected; it should just drop the connection after a few seconds, but the point is that it connects.)
If, for the sake of comparison, you wish to see what a good HTTP connection looks like, then try
openssl s_client -connect stackoverflow.com:80
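The same checks can be done with curl, if you prefer (HOSTNAME is the same placeholder):
curl -v http://HOSTNAME/     # "Connection refused" here means nothing is listening on port 80
curl -vk https://HOSTNAME/   # checks HTTPS while ignoring certificate errors
If HTTP is refused but HTTPS answers, a redirect-to-HTTPS setup was likely lost in the upgrade, and re-enabling a port-80 VirtualHost that redirects would restore the old behaviour.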

Does cURL to https URLs require SSL certificate installed on the server?

I am trying to test a cURL command from my server. The command is requesting a JSON response from a URL that is https.
The cURL request seems to be stuck, with no response, and it doesn't time out.
However, I have tested it from my local machine and the cURL command works fine.
Do I need an SSL certificate installed on the server for it be able to send cURL commands to an https URL?
No, you don't need an SSL cert to send a request via curl to an https URL (unless the server uses two-way SSL, i.e. mutual TLS).
You DO need one at the other end, on the https server that receives the request from curl.
However, if it's a self-signed cert, curl may not recognise it, and you may get an error instead of a successful connection.
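If it does turn out to be a self-signed certificate, the two usual curl options look like this (the URL and path are placeholders):
curl -k https://your.server.example/endpoint                                  # skip verification entirely; fine for testing only
curl --cacert /path/to/server-cert.pem https://your.server.example/endpoint   # trust that one certificate explicitly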
The fact that the process hangs suggests a network/connectivity issue rather than an SSL issue. Can you telnet to the machine on port 443, or does that hang too?
telnet www.example.com 443
Should respond.
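If telnet is not installed, equivalent connectivity checks (assuming curl or netcat is available on the server):
nc -vz www.example.com 443                              # raw TCP check of the port
curl -v --connect-timeout 10 https://www.example.com/   # fails fast instead of hanging
A timeout or hang on either points to a firewall or routing problem on the server's network, which would match the symptoms you describe.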

unable to ssl connect to chef-server from chef-workstation

I have 2 different ubuntu VPS instances each with different ip addresses.
One is assigned as a chef-server and the other acts as a workstation.
When I use the command
knife configure -i
I do get options to locate the admin.pem and chef-validator.pem files locally.
I am also able to create knife.rb file locally.
While setting up knife, I am asked to enter the 'chef-server url', so I enter 'https://ip_address' of the VPS instance.
But in the end I get an error message
ERROR: SSL Validation failure connecting to host: "ip_address of my server host"- hostname "ip_address of my host" does not match the server certificate
ERROR: Could not establish a secure connection to the server.
Use knife ssl check to troubleshoot your SSL configuration.
If your Chef Server uses a self-signed certificate, you can use
knife ssl fetch to make knife trust the server's certificates.
I used 'knife ssl fetch' to fetch the trusted certs from the chef-server, but it still doesn't work.
Chef experts, please help.
Your chef-server has a hostname, and the self-signed certificate was issued for that hostname.
The error occurs because you are connecting to an IP address while the certificate was issued for a hostname.
There are two ways around this: disable SSL validation (you'll get a warning, but it will work), or configure your workstation (using your hosts file, for example) to use the chef-server's hostname instead of its IP address.
This is an SSL configuration point you may run into with other servers too.
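A sketch of the second option (CHEF_IP and chef-server.example.com are placeholders; replace them with your server's IP and the hostname the certificate was actually issued for):
CHEF_IP=203.0.113.10
# read the subject of the server certificate to learn its hostname
openssl s_client -connect "$CHEF_IP:443" </dev/null 2>/dev/null | openssl x509 -noout -subject
# map that hostname to the IP on the workstation
echo "$CHEF_IP chef-server.example.com" | sudo tee -a /etc/hosts
# then set chef_server_url in knife.rb to https://chef-server.example.com and re-check
knife ssl check
After that, knife ssl fetch followed by knife ssl check should validate cleanly.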