Firefox can find the certificate, but curl cannot (while tunneling HTTPS through SSH) - ssl

Background:
I have an app running on port 8080 on the remote server and an HTTPS ingress proxy on port 443 on the same server, which forwards everything to the app on 8080 after terminating SSL.
What I want to do:
I want to communicate with the app over SSL remotely, while not having direct access to its domain (it is on a local network; I can access the server remotely via a different domain).
What I did:
I tunneled port 443 from my remote server with ssh -L 3001:0.0.0.0:443 user@example.com. I then added 127.0.0.1 example.com to my /etc/hosts to make sure the domain resolves properly on my system.
Now, what I can do is enter https://example.com:3001/some/thing/ in Firefox and I get a proper response from the server, with everything running over SSL without any problems. I am also able to use curl without checking the certificate: curl --insecure https://example.com:3001/some/thing works fine.
At the same time, a secure curl call, curl https://example.com:3001/some/thing, fails with the error:
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.haxx.se/docs/sslcerts.html
Just to make sure both are using the same certificates, I used this tool: https://curl.haxx.se/docs/mk-ca-bundle.html to create a ca-bundle.crt from the most recent Firefox certificates and passed it to curl with --cacert ca-bundle.crt. No luck - the same error. (I also tried following another curl tutorial on extracting the certificates from my local Firefox installation; also no luck.)
Question
What is going on? Why does curl behave differently from Firefox even though I seem to be using the same certificates? How can I debug this?
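One way to debug this is to look at the exact chain the server presents through the tunnel (a debugging sketch; Firefox can complete a chain from cached or preloaded intermediate certificates, while curl only uses what the server sends plus the CA bundle):

openssl s_client -connect example.com:3001 -servername example.com -showcerts </dev/null

If the -showcerts output lists only the leaf certificate and no intermediate, the proxy on port 443 is serving an incomplete chain, and no CA bundle on the curl side will fix that.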
Side note
The real reason I am concerned about this is that with normal (local) access to the server I observed the same behaviour: I could connect to the server over HTTPS through Chrome, but my React Native app could not. I suspect the app uses libcurl (or something similar) under the hood, and I believe debugging this problem could help me understand what is wrong with the app.

Related

Why are we getting "tls handshake error using curl"?

I'm trying to use curl to access a URL of an app we've developed internally, and on the server I'm seeing
http: TLS handshake error from 1.2.3.4 remote error: tls: unknown certificate authority.
This only happens when we hit the endpoint using curl (inside Git Bash) or wget. When we use IE on Windows it works just fine. I've even tried re-installing Git Bash with the native SSL library (which should be the same one IE uses), but I still get the same error message.
I've even tried downloading the curl-ca-bundle.crt file and saving it in the same place as the curl binary, and directly telling curl to use this file with the --cacert option, but still no joy.
I've compared the root certs that IE reports with the ones in that curl-ca-bundle.crt, and they look the same (they don't line up exactly, but they contain the same text between the BEGIN and END markers; one is just wrapped wider on the screen and therefore uses fewer lines, if that makes sense).
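A more reliable check than eyeballing the PEM text is to compare fingerprints (a sketch; the file names are placeholders for a root exported from IE and one taken out of the bundle):

openssl x509 -inform der -in root-from-ie.cer -noout -fingerprint -sha256
openssl x509 -in root-from-bundle.pem -noout -fingerprint -sha256

If the two fingerprints match, the roots really are identical, and the problem is more likely a missing intermediate certificate on the server than a wrong root on the client.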
Hopefully someone has ideas about what to try next, as two of us have been tearing our hair out all afternoon over this.
wget also gives an error message:
$ wget https://bler.com/admin/user
--2018-09-03 15:53:43-- https://bler.com/admin/user
Connecting to 132.146.1.142:8090... connected.
ERROR: cannot verify oss.dns.networks.bt.com's certificate, issued by 'CN=DigiCert SHA2 Secure Server CA,O=DigiCert Inc,C=US':
Unable to locally verify the issuer's authority.
To connect to oss.dns.networks.bt.com insecurely, use '--no-check-certificate'.
We're using a local proxy server, and HTTP_PROXY is set. It must be using the proxy, as we can see we're hitting the endpoint.
I had the same error. This problem occurs when you install only the server certificate (without the intermediates) on a Golang web server. You must use the full certificate chain instead.
For example, Let's Encrypt gives you "cert.pem" and "fullchain.pem". "cert.pem" works in browsers, but curl cannot work with it (curl: (60) SSL certificate problem: unable to get local issuer certificate). "fullchain.pem" works fine in both browsers and curl.
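As a quick sanity check (not part of the original answer; certbot file names assumed), you can count how many certificates each file contains; with certbot, fullchain.pem is simply cert.pem followed by chain.pem:

grep -c 'BEGIN CERTIFICATE' cert.pem        # usually 1: the leaf only
grep -c 'BEGIN CERTIFICATE' fullchain.pem   # usually 2 or more: leaf plus intermediates

Browsers can often fetch or reuse the missing intermediates on their own, which is why cert.pem alone appears to work there while curl fails.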

Mattermost TLS issue

I'm having issues enabling TLS in Mattermost. On my server I have configured a lot of Apache virtual hosts in addition to the Mattermost files. Over HTTP everything was working fine.
Today I tried to set up TLS and HTTPS. I followed the instructions at https://docs.mattermost.com/install/config-tls-mattermost.html. Now I get this:
Please notice the error: I'm trying to access domain1.mywebsite.com and the error is "its security certificate is signed by domain2.mywebsite.com". domain2.mywebsite.com is one of the websites configured as a virtual host in Apache.
I did not configure any virtual host for Mattermost, since I don't think any is needed (and it worked flawlessly without one, and without TLS). But how can I tell Mattermost (or the browser?) that the server for domain2.mywebsite.com is the same as the one for domain1.mywebsite.com?
I generated the certificates using Let's Encrypt with the standalone option (sudo certbot certonly --standalone -d domain1.mywebsite.com) and didn't move any files; I just enabled "UseLetsEncrypt": true in the config.json file.
Do you happen to have any idea about how I could fix this?
Thank you
Marco
You'll need to configure TLS on Apache, and you'll need to use a separate certificate for each virtual host.
Here is information that might help you: https://httpd.apache.org/docs/2.4/ssl/ssl_howto.html
Don't configure TLS on Mattermost if TLS is being handled by the proxy.
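To see which certificate Apache actually hands out for each name (a debugging sketch using the domains from the question), connect with explicit SNI and print the subject of the returned certificate:

openssl s_client -connect domain1.mywebsite.com:443 -servername domain1.mywebsite.com </dev/null 2>/dev/null | openssl x509 -noout -subject -issuer
openssl s_client -connect domain1.mywebsite.com:443 -servername domain2.mywebsite.com </dev/null 2>/dev/null | openssl x509 -noout -subject -issuer

If the first command returns domain2's certificate, Apache has no SSL virtual host matching domain1 and is falling back to its default one, which matches the browser error described above.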

Does cURL to https URLs require an SSL certificate installed on the server?

I am trying to test a cURL command from my server. The command requests a JSON response from an https URL.
The cURL call seems to be stuck: there is no response and it doesn't time out.
However, I have tested it from my local machine and the cURL command works fine.
Do I need an SSL certificate installed on the server for it to be able to send cURL commands to an https URL?
No, you don't need an SSL cert to send a request like that with curl over https (unless the remote server uses two-way SSL, in which case the client needs its own certificate).
You DO need one at the other end, on the https server that receives the request from curl.
However, if it's a self-signed cert it may not be recognised by curl, and you may get an error instead of a successful connection.
The fact that the process is hanging suggests a network/connectivity issue rather than an SSL issue. Can you telnet to the machine on port 443, or does that hang too?
telnet www.example.com 443
Should respond.
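curl itself can also help separate a connectivity problem from a TLS problem (a sketch; the URL is a placeholder): with verbose output and timeouts you can see whether it stalls while opening the TCP connection or later during the TLS handshake:

curl -v --connect-timeout 10 --max-time 30 https://example.com/some/endpoint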

What is Apache's OpenSSL, and why does one server have it but another doesn't?

I'm working with two servers: one on localhost and one on the web. Both serve plain HTTP; I don't have an SSL certificate installed on either.
When I try to make a cURL request to an https URL (in this case the Facebook API), one of the servers works and the other doesn't. The cURL error is "SSL certificate problem: unable to get local issuer certificate." Upon investigation, I noticed that $_SERVER["SERVER_SOFTWARE"] outputs something different on the two servers.
Server 1, which works with CURL to https
$_SERVER["SERVER_SOFTWARE"] = Apache/2.4.10 (Win32) OpenSSL/1.0.1i PHP/5.6.3
Server 2, which doesn't work with CURL to https
$_SERVER["SERVER_SOFTWARE"] = Apache
I'm guessing that the fact that the second server makes no mention of OpenSSL may have something to do with the error? Is that possible? What would I need to do to get OpenSSL on that server? And why is the first server able to "find the issuer certificate" when I don't have an SSL cert installed on it?
Since you are making a request with curl to an external server, the problem is completely unrelated to the web server software you are running locally; you don't even need to run a local web server at all. It only depends on the certificate the external server sends back to curl and on whether the necessary root CA can be found in curl's trust store.
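To confirm it is just a missing or outdated CA bundle on the failing server (a sketch; the paths and URL are examples), test against the Mozilla CA bundle published by the curl project:

curl -O https://curl.haxx.se/ca/cacert.pem
curl --cacert cacert.pem https://graph.facebook.com/

If the download itself fails with the same error, fetch cacert.pem on the working server and copy it over. For PHP's curl extension, the equivalent permanent fix is pointing the curl.cainfo directive in php.ini at that bundle.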

Downloading build files from Nexus: certificate error when running wget, but success with a browser (Firefox, Chrome)

I've already purchased an SSL certificate from DigiCert and installed it on my Nexus server (running in Tomcat, JKS keystore).
It works well in Firefox and Chrome (the green address bar indicates that a valid certificate was received), and builds can be downloaded from the Nexus WebUI too.
But wget cannot get the result without --no-check-certificate; it fails with something like:
ERROR: cannot verify mydomain.com's certificate, issued by `/C=US/O=DigiCert Inc/OU=www.digicert.com/CN=DigiCert High Assurance CA-3':
Unable to locally verify the issuer's authority.
To connect to mydomain.com insecurely, use `--no-check-certificate'.
Unable to establish SSL connection.
I found some related posts:
SSL connection fails with wget, curl, but succeed with firefox and lynx
linux wget not certified?
But neither of them gives a final solution. I want to know whether some (special) configuration is needed on Nexus, or whether this is a bug in the wget command?
Google returns many results for "digicert wget", but I cannot find a clue there either. Thank you!
You need to add the DigiCert root certificate to a store accessible by wget:
http://wiki.openwrt.org/doc/howto/wget-ssl-certs
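For reference (a sketch with placeholder paths), wget can be pointed at the CA certificate either per invocation or permanently via ~/.wgetrc:

wget --ca-certificate=/path/to/digicert-root.pem https://mydomain.com/some/artifact
echo 'ca_certificate = /path/to/digicert-root.pem' >> ~/.wgetrc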