Error while curling an HTTPS website using a self-signed certificate - ssl

I am trying to curl a tool's REST API, which uses a self-signed certificate.
curl -D- -u user:pass -X GET -H "Content-Type: application/json" https://server:8006/api2/json/nodes
It gives the following error:
"curl: (60) SSL certificate problem: unable to get local issuer certificate"
When using the --insecure option, I get the following output:
HTTP/1.1 401 No ticket
Cache-Control: max-age=0
Connection: close
Date: Mon, 22 Aug 2016 15:25:18 GMT
Pragma: no-cache
Server: pve-api-daemon/3.0
Expires: Mon, 22 Aug 2016 15:25:18 GMT
I tried extracting the server certificate using:
echo "" | openssl s_client -connect server:8006 -prexit 2>/dev/null | sed -n -e '/BEGIN\ CERTIFICATE/,/END\ CERTIFICATE/ p' > cacert.pem
However, when I pass this certificate with the --cacert option, curl still gives the same "curl: (60) SSL certificate problem: unable to get local issuer certificate" error.
What am I missing?

A self-signed certificate is untrusted by default, so curl rejects it unless you either pass the certificate itself with --cacert or add it to your system's trust store. If --cacert with the extracted certificate still fails with "unable to get local issuer certificate", the certificate is probably not truly self-signed but issued by the server's own CA (Proxmox, for example, signs each node's certificate with a cluster CA); in that case you need to pass that CA certificate instead of the leaf.
If you want the certificate to be trusted everywhere without such steps, you need to purchase/request one from a publicly trusted Certificate Authority.
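For what it's worth, whether --cacert can succeed here depends on whether the extracted PEM is genuinely self-signed (issuer equals subject). That is easy to check offline; a sketch using a throwaway certificate, so the CN and file names below are made up:

```shell
# Generate a throwaway self-signed certificate (CN and file names hypothetical):
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=server.example" \
  -keyout demo.key -out demo.pem -days 1 2>/dev/null

# A truly self-signed cert (issuer == subject) validates against itself:
openssl verify -CAfile demo.pem demo.pem   # prints "demo.pem: OK"
# If the PEM you extracted from the server fails this self-check, it was issued
# by some CA, and --cacert needs that CA's certificate instead of the leaf.
```

Running the same `openssl verify -CAfile cacert.pem cacert.pem` against the file from the question tells you immediately which of the two situations you are in.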

Related

curl failing with --insecure(-k) and missing --cacert

I have a client server interface.
The case is unusual, because the server authenticates client and not vice versa.
The client sends its client cert to the server, and the server authenticates it against its root CA cert. There is also an intermediate cert in the chain, but I don't need that on the server to authenticate the client.
The commands go as follows.
"curl -k --cert client-cert.pem --key client-key.pem https://server.com:443/endpoint" (FAIL)
But if I pass the intermediate CA cert as an argument to curl, it succeeds.
"curl -k --cert client-cert.pem --key client-key.pem --cacert intermediates.pem https://server.com:443/endpoint" (PASS)
Even if I pass in the root CA (the same on both client and server), it fails.
"curl -k --cert client-cert.pem --key client-key.pem --cacert root-ca.pem https://server.com:443/endpoint" (FAIL)
PS:
(FAIL) means the server reports an authentication failure.
My Question is:
Why should the server care about the --cacert option at all? curl shouldn't be sending the CA cert over anyway, correct?
Why do things work when I only pass the intermediates file?
Thanks in advance for the response.
PS:
The client-cert.pem is signed by the intermediate CA, and the intermediate CA is signed by the root CA.
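A likely explanation for the PASS/FAIL pattern: the server, which trusts only the root, can verify client-cert.pem only if it also sees the intermediate, and with the OpenSSL backend the certificates supplied via --cacert can also be used to assemble the chain curl sends, so adding intermediates.pem lets the intermediate travel along with the client cert (the file is not "sent" as such). The chain gap itself is easy to reproduce offline; a sketch with throwaway certificates, where every file name is hypothetical:

```shell
# Build a demo chain: root CA -> intermediate CA -> leaf (client) cert.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=Demo Root" \
  -keyout root.key -out root.pem -days 2 2>/dev/null
openssl req -newkey rsa:2048 -nodes -subj "/CN=Demo Intermediate" \
  -keyout int.key -out int.csr 2>/dev/null
printf 'basicConstraints=CA:TRUE\n' > ca.ext
openssl x509 -req -in int.csr -CA root.pem -CAkey root.key -CAcreateserial \
  -extfile ca.ext -out int.pem -days 2 2>/dev/null
openssl req -newkey rsa:2048 -nodes -subj "/CN=demo-client" \
  -keyout leaf.key -out leaf.csr 2>/dev/null
openssl x509 -req -in leaf.csr -CA int.pem -CAkey int.key -CAcreateserial \
  -out leaf.pem -days 2 2>/dev/null

# The root alone cannot validate the leaf -- the chain has a gap:
openssl verify -CAfile root.pem leaf.pem                      # fails
# Supplying the intermediate closes the gap:
openssl verify -CAfile root.pem -untrusted int.pem leaf.pem   # prints "leaf.pem: OK"
```

This mirrors the server's position: it holds only root-ca.pem, so authentication can succeed only when the intermediate arrives together with the client certificate.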

MISSING_AUTHORIZATION_HEADER while using certificate to authenticate

I am trying to authenticate against an API using certs.
I have a pfx file that I need to convert to required certs.
To generate the public cert, I used the command:
openssl pkcs12 -in ~/Downloads/file.pfx -nodes -clcerts -nokeys | openssl x509 -out public.crt
This public cert was uploaded on the API side.
Now, from the client side, I need to connect to the API using curl.
So first I generated the private key using the command:
openssl pkcs12 -in ~/Downloads/file.pfx -nodes -nocerts | openssl rsa -out private_new.key
And now I'm trying to connect to the API using the command:
curl -I -k --key ./private_new.key --cert ./public.crt https://<API-END-POINT>/foo/bar
But in the response I get :
HTTP/1.1 401 Unauthorized
Date: Thu, 31 Jan 2019 00:24:43 GMT
Server: Foo
X-IDS-ID: 4E178F65-78F2-4CB9-B31A-8D6288F854C5
WWW-Authenticate: Basic realm=CPS Rest Services
X-message-code: MISSING_AUTHORIZATION_HEADER
Content-Type: text/html
Vary: X-CSP-STRIP
X-IDS-Node: idp15
X-IDS-Pool: green
X-IDS-Project: prod
X-IDS-Landscape: eu-nl-1
X-Content-Type-Options: nosniff
Strict-Transport-Security: max-age=31536000;includeSubDomains;preload
Cache-Control: private,no-cache,no-store
Is there something I have missed? I'm new to this, so I'm not sure what went wrong.
HTTP/1.1 401 Unauthorized
...
WWW-Authenticate: Basic realm=CPS Rest Services
X-message-code: MISSING_AUTHORIZATION_HEADER
The API requires a proper Authorization header in your request, but you don't send one. It is unclear what its contents should be, but usually it is a username and password encoded together, as the Basic scheme in the WWW-Authenticate header suggests.
Nothing is known about the API you are trying to access, so no specific help is possible on how to use it properly:
Specifically, it is not known whether the site supports authentication with a client certificate at all. If it doesn't, you can't force it to.
If it does support client certificates, it might consider your self-generated certificate insufficient, since there is no way to verify who is behind it. In that case it might fall back to Basic authentication after the client certificate fails.
It might also be that the site requires Basic authentication in addition to a client certificate.
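If the API indeed wants Basic authentication, the missing header is straightforward to supply; curl's -u option builds it for you. A sketch of what is inside it, with hypothetical credentials:

```shell
# What `curl -u user:pass` adds to the request (credentials hypothetical):
printf 'user:pass' | base64    # -> dXNlcjpwYXNz
# i.e. the header:  Authorization: Basic dXNlcjpwYXNz
# Combined with the client cert from the question, the attempt would look like:
#   curl -I -u user:pass --key ./private_new.key --cert ./public.crt https://<API-END-POINT>/foo/bar
```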

Display received cert with curl?

With a slightly older version of curl, I had a handy batch file:
curl --verbose -k https://%1 2>&1 |grep -E "Connected to|subject|expire"
This would show me the IP connected to, with the subject and expiration date of the actual certificate negotiated, even if that was not the correct certificate for that domain name -- which is sometimes a problem for our hosting (we host literally thousands of domains on our multitenant application, about half with their own certs).
Specifically, I would see something like this in the stderr output before grep filtered it:
* Server certificate:
* subject: CN=academy-fran.chi.v6.pressero.com
* start date: Feb 22 04:55:00 2017 GMT
* expire date: May 23 04:55:00 2017 GMT
* issuer: C=US; O=Let's Encrypt; CN=Let's Encrypt Authority X3
* SSL certificate verify ok.
Today I had to reinstall the OS on my machine and reinstalled curl. It is now at version 7.52.1 (x86_64-w64-mingw32); the previous one seems to have been 7.49.1 (i686-pc-cygwin). curl no longer displays ANY certificate information, regardless of whether -k is used and whether or not the TLS connection succeeds.
Is there an option that will give it back to me?
For anyone else on OSX or Linux, you can add this to your ~/.zshrc file:
function seecert () {
nslookup $1
(openssl s_client -showcerts -servername $1 -connect $1:443 <<< "Q" | openssl x509 -text | grep -iA2 "Validity")
}
Example usage, after you have run a source ~/.zshrc after the above additions:
% seecert www.google.com
Server: 1.1.1.1
Address: 1.1.1.1#53
Non-authoritative answer:
Name: www.google.com
Address: 172.217.10.100
depth=2 OU = GlobalSign Root CA - R2, O = GlobalSign, CN = GlobalSign
verify return:1
depth=1 C = US, O = Google Trust Services, CN = GTS CA 1O1
verify return:1
depth=0 C = US, ST = California, L = Mountain View, O = Google LLC, CN = www.google.com
verify return:1
DONE
Validity
Not Before: Nov 3 07:39:18 2020 GMT
Not After : Jan 26 07:39:18 2021 GMT
Thanks go to @ross-presser and his answer for the inspiration for this function.
Here is my replacement batch file, using openssl instead of curl:
@echo off
nslookup %1
(openssl s_client -showcerts -servername %1 -connect %1:443 <nul |openssl x509 -text |findstr /I "DNS After") 2>nul
This gives me this output:
C:\>seecert www.google.com
Server: 192.168.1.1
Address: 192.168.1.1#53
Non-authoritative answer:
Name: www.google.com
Address: 172.217.10.228
Name: www.google.com
Address: 2607:f8b0:4006:813::2004
Not After : Aug 16 09:49:00 2018 GMT
DNS:www.google.com

Browser, s_client without SNI and expired certificate

When I access one of my subdomains: say https://foo.example.com in a browser and inspect the certificates, the certificate looks great. When I use openssl from a remote computer it shows an expired certificate. How can this be?
I tried to reproduce what was found in this question, but my scenario is different. When I run
echo | openssl s_client -showcerts -connect foo.example.com:443 2>&1 | grep Verify
I see:
Verify return code: 10 (certificate has expired)
When I run:
echo | openssl s_client -showcerts -connect foo.example.com:443 2>&1 | openssl x509 -noout -dates
I get:
notBefore=Sep 27 15:10:20 2014 GMT
notAfter=Sep 27 15:10:20 2015 GMT
It looks expired, but the browser doesn't show it as expired.
See the first comment by @jww. He pointed out that I needed to add -tls1 -servername foo.example.com to my openssl command. His comment:
Try adding -tls1 -servername foo.example.com. I'm guessing you have a front-end server that's providing a default domain for requests without SNI, and the default domain is routed to an internal server with the old certificate. When the browsers connect, they use SNI and get the server for which you have updated the certificate. Or, there could be an intermediate with an expired certificate in the chain that's being served. If you provide real information, its easier for us to help you with problems like this.
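Once -servername is in the command, the certificate the browser actually negotiates can also be expiry-checked in a scriptable way with openssl x509 -checkend. An offline sketch using a throwaway one-day certificate (file names and CN hypothetical):

```shell
# Throwaway one-day certificate (file names hypothetical):
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=foo.example.com" \
  -keyout t.key -out t.pem -days 1 2>/dev/null

openssl x509 -checkend 0 -in t.pem        # "Certificate will not expire" (exit 0)
openssl x509 -checkend 172800 -in t.pem   # "Certificate will expire" (exit 1): gone within 2 days
# Against the live host, pipe the SNI-selected cert into the same check:
#   echo | openssl s_client -connect foo.example.com:443 -servername foo.example.com 2>/dev/null \
#     | openssl x509 -checkend 0
```

The exit code makes this convenient for monitoring scripts, where grepping human-readable dates would be fragile.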

cURL for Windows can't make a secure connection to the Stack API

I am using cURL for Windows.
I'm following separate advice and have just about no idea what I'm doing, but I know what I've done:
C:\...> curl http://api.stackexchange.com/2.1/sites
<garbage, probably encrypted>
C:\...> curl https://api.stackexchange.com/2.1/sites
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: http://curl.haxx.se/docs/sslcerts.html
curl performs SSL certificate verification by default, using a "bundle"
of Certificate Authority (CA) public keys (CA certs). If the default
bundle file isn't adequate, you can specify an alternate file
using the --cacert option.
If this HTTPS server uses a certificate signed by a CA represented in
the bundle, the certificate verification probably failed due to a
problem with the certificate (it might be expired, or the name might
not match the domain name in the URL).
If you'd like to turn off curl's verification of the certificate, use
the -k (or --insecure) option.
I haven't added any frills, but I do not have what you'd call a 'normal' installation; I've temporarily added the unzipped folder to my PATH, but it's on my desktop. I don't think this makes a difference but, again, I have no idea what I'm doing.
(If you're interested, I'm working on a Stack Exchange mode for Emacs.)
As you can see, the response is gzip-compressed:
$ curl -I api.stackexchange.com/2.1/sites
HTTP/1.1 400 Bad Request
Cache-Control: private
Content-Length: 95
Content-Type: application/json; charset=utf-8
Content-Encoding: gzip
So
$ curl -s --compressed api.stackexchange.com/2.1/sites
{"items":[{"site_type":"main_site","name":"Stack Overflow","logo_url":"http://cd
n.sstatic.net/stackoverflow/img/logo.png","api_site_parameter":"stackoverflow","
site_url":"http://stackoverflow.com","audience":"professional and enthusiast pro
[...]
Tricky Tricky
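As an aside, the "garbage, probably encrypted" output from the plain-HTTP attempt was this same gzip encoding; --compressed simply asks curl to undo it. A minimal offline illustration of that round trip:

```shell
# The body travels gzip-encoded; curl's --compressed transparently gunzips it.
printf '{"items":[]}' | gzip -c | gunzip -c   # -> {"items":[]}
```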