What Encryption Method Does Curl Use By Default?

Full disclosure, I have very little idea what I'm doing.
I'm doing some troubleshooting with Curl and encryption, and I don't understand why this works for a certain website I'm testing against:
curl -v https://website
but none of these options work:
curl -v -1 https://website
curl -v -2 https://website
curl -v -3 https://website
The error I get back with all three options is:
error:14094410:SSL routines:SSL3_READ_BYTES:sslv3 alert handshake failure
I've Googled the heck out of this error, and it seems like there are a million reasons for Curl to return this.
I know that the -2 option uses super old and busted SSL (SSLv2), -3 uses less old (but still busted) SSL (SSLv3), and -1 uses TLS. The version of Curl I'm using doesn't seem to work if I try to get granular with --tlsv1.0, etc., and I don't have permission to install a newer version of Curl on the machines I'm testing on.
So, my question is this: How do I know what method Curl is using to connect to https:// sites if I don't explicitly tell it what to use?

It depends entirely on what is negotiated with the peer. You would need to examine the handshake trace in each specific case.
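For example, curl's verbose trace reports the negotiated protocol and cipher on its "SSL connection using" line (the exact wording varies by curl version and TLS backend), so you can read off what was chosen:
curl -v https://website 2>&1 | grep "SSL connection"
# typical line from an OpenSSL-backed curl; your protocol/cipher will differ:
# * SSL connection using TLSv1.2 / ECDHE-RSA-AES128-GCM-SHA256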

Related

curl command works but C program fails NSS: client certificate not found (nickname not specified)

There are a number of similar posts, but I am trying to understand a little more than what those offer.
My curl command line works fine and I am able to talk to the server and get the data I want. The command looks like:
curl -v --tlsv1.2 --cert ./service_cert.pem --key ./service_private.key "https://myserver"
But when I try to run my C program and examine the HTTP client object, I see this:
errorBuffer = "NSS: client certificate not found (nickname not specified)
Reading further, I realized I have libcurl built with NSS, which doesn't support reading a cert from a flat file (.pem).
How then is the command-line curl utility able to read the .pem file?
You need to import your client certificate into an NSS database, using certutil, and have your code use this database.
Reference:
https://developer.mozilla.org/en-US/docs/Mozilla/Projects/NSS/tools/NSS_Tools_certutil
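A minimal sketch of that import, assuming the service_cert.pem/service_private.key pair from the question, a local ./nssdb database directory, and "my-client-cert" as a freely chosen nickname (all placeholders):
# create an empty NSS database
mkdir -p nssdb
certutil -N -d sql:./nssdb
# bundle the PEM cert and key as PKCS#12, since client certs are imported with pk12util
openssl pkcs12 -export -in service_cert.pem -inkey service_private.key -out service.p12 -name "my-client-cert"
pk12util -i service.p12 -d sql:./nssdb
# list the database to confirm the nickname
certutil -L -d sql:./nssdb
Your code would then pass the nickname (not a file path) via CURLOPT_SSLCERT and point the NSS-built libcurl at the database, for example through the SSL_DIR environment variable.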

Jmeter Distributed Testing Not working with two way SSL Handshake

I have tried to do distributed testing with two servers for a request which requires two-way SSL handshaking. This works fine when we are not using remote hosts for testing:
sh jmeter.sh -n -t sample_Load_Test/sample_test.jmx -l sample_report/Log/results.jtl -e -o sample_report/Dashboard/
But on trying to use the remote hosts for the same .jmx file, the SSL handshake is failing. I have put the same jmeter.p12 and truststore.jks in all the servers which are used for distributed testing.
Command used:
sh jmeter.sh -n -t sample_test/sample_load_test.jmx -l sample_report/Log/results.jtl -e -o sample_report/Dashboard/ -r -Jserver.rmi.ssl.disable=true
Please see the error that I am getting:
<httpSample t="20" it="0" lt="0" ct="20" ts="1545068074631" s="false"
lb="HTTP Request" rc="Non HTTP response code:
javax.net.ssl.SSLHandshakeException" rm="Non HTTP response message:
Received fatal alert: handshake_failure"
Does anyone know what I am doing wrong here?
I can think of 2 possible causes:
You use different JRE versions on master and slaves, and they have different SSL configurations in terms of storing certificates. Make sure you use exactly the same Java runtime everywhere and that the configuration is the same.
Your test relies on client certificates, and on one of the slaves you don't have them defined in the system.properties file or in the SSL Manager. Make sure to use the same JMeter version and the same set of config files and external data files on each slave (see the sketch below).
Get used to looking into the jmeter.log and/or jmeter-server.log files; in the majority of cases you will find the reason for the failure or unexpected behavior in the log.
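If cause 2 applies, each slave needs the same keystore settings; a minimal sketch of the relevant system.properties entries, assuming the jmeter.p12 and truststore.jks files from the question and placeholder passwords:
# system.properties on every slave (passwords are placeholders)
javax.net.ssl.keyStore=jmeter.p12
javax.net.ssl.keyStoreType=pkcs12
javax.net.ssl.keyStorePassword=changeit
javax.net.ssl.trustStore=truststore.jks
javax.net.ssl.trustStorePassword=changeit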

Disable cert revocation check in unix/linux using curl command

I am using curl command to invoke a rest service. It is as follows:
curl -X POST --ssl-no-revoke --cacert xyz.pem -K urls.txt -H "Authorization:Basic XYZ" -H "Content-Type:application/json" -d @data.json
The above command is used to hit the service using one-way SSL and Basic authorization. The data to be passed is enclosed in the data.json file and the URLs to be hit are listed in the urls.txt file.
The above command works perfectly in Windows, but when executed from Linux it says:
curl: option --ssl-no-revoke: is unknown
curl: try 'curl --help' or 'curl --manual' for more information
I want to disable certificate revocation checks altogether. It looks like --ssl-no-revoke works on Windows but not Unix/Linux.
I would like to know if there is any alternative.
--ssl-no-revoke is Windows-only (it applies to curl's Schannel backend). The only alternative I'm aware of is to have a valid certificate or not use SSL.
https://curl.haxx.se/docs/manpage.html
Using a valid certificate is not always a solution as revocation checks will fail with a valid certificate too when there is no Internet connection (for example, in the presence of a captive portal).
One way is to disable certificate checking altogether, i.e.:
curl --insecure https://www.example.com
Note that this will greatly reduce security, as self-signed certificates will be accepted as well as revoked ones!
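Depending on how the Linux curl was built, a less drastic option may exist: with the OpenSSL backend, curl validates the certificate chain but does not attempt CRL/OCSP revocation checks by default (you would have to opt in with options such as --crlfile), so simply dropping the Windows-only flag can be enough. A sketch reusing the files from the question:
# assumption: an OpenSSL-based curl build; the chain is still validated
# against xyz.pem, but no revocation check is attempted by default
curl -X POST --cacert xyz.pem -K urls.txt -H "Authorization:Basic XYZ" -H "Content-Type:application/json" -d @data.json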

Curl request with ntlm authentication fails if password is set

I try to fetch some data from a Microsoft Dynamics Nav WebService.
This service uses NTLM authentication.
If I open the web service URL in a browser and use the given credentials, everything works fine.
While setting up the environment for the WebService client, I used the command line to check whether everything was working; at a specific point, I was unable to authenticate.
That's the command I am using:
curl --ntlm -u "DOMAIN\USERNAME" -k -v "http://hostname:port/instance/Odata/Company('CompanyName')/Customer"
The command will prompt for the password.
I copy in the password and everything is doing fine.
But when I use this command, with the password already included, it stops working and the authentication fails:
curl --ntlm -u "DOMAIN\USERNAME:PASSWORD" -k -v "http://hostname:port/instance/Odata/Company('CompanyName')/Customer"
The password contains some special chars, so I tried to use the percent encoding, which had no effect at all.
It is very difficult to research this kind of issue. Searching for curl + ntlm authentication issues provides a lot of results, but nothing is related to this specific kind of issue.
Have any of you guys already had experience with this kind of issue?
I had a problem with authentication because of cookies. I solved it by keeping the cookies in a txt file and using exactly this file through all requests. For example, after the login request I saved these cookies:
curl -X POST -u username:password https://mysite/login -c cookies.txt
And with the next request I used this file like this:
curl -X POST -u username:password https://mysite/link -b cookies.txt
This solution worked for me. I don't know if your problem is similar, but I think you may try this.
I was struggling with a similar issue for a long time and finally I found this curl bug report: #1253 NTLM authentication fails when password contains special characters (British pound symbol £).
NTLM authentication in cURL supports only ASCII characters in passwords! This is still the case in version 7.50.1 on Ubuntu, but I tested this on many different distributions and it is always the same. This bug will also break curl_init() in PHP (tested on PHP 7). The only way to solve it is to avoid non-ASCII characters in NTLM authentication passwords.
If you are using Python then you are lucky. Apparently the Python developers rewrote the cURL implementation and it works with non-ASCII characters if you use the HttpNtlmAuth package.
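Separately from the ASCII limitation, when the prompted password works but the inline one fails, it is worth ruling out plain shell expansion first; a quick check, assuming a password containing shell-special characters such as $ or !:
# single quotes stop the shell from expanding $, !, or backslashes before
# curl sees them (DOMAIN, USERNAME, and the URL are placeholders from the question)
curl --ntlm -u 'DOMAIN\USERNAME:PA$$word!' -k -v "http://hostname:port/instance/Odata/Company('CompanyName')/Customer"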
Try with the NTLM proxy flag.
Something like this:
curl -v --proxy-ntlm --proxy-user 'username:password' -x youproxy.com:8080 someURL
From curl --help:
-x, --proxy [PROTOCOL://]HOST[:PORT] Use proxy on given port
--proxy-anyauth Pick "any" proxy authentication method (H)
--proxy-basic Use Basic authentication on the proxy (H)
--proxy-digest Use Digest authentication on the proxy (H)
--proxy-negotiate Use Negotiate authentication on the proxy (H)
--proxy-ntlm Use NTLM authentication on the proxy (H)

wget ssl alert handshake failure

I am trying to download files from an https site and keep getting the following error:
OpenSSL: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
Unable to establish SSL connection.
From reading blogs online I gather I have to provide the server cert and the client cert. I have found steps on how to download the server cert but not the client cert. Does anyone have a complete set of steps to use wget with SSL? I also tried the --no-check-certificate option but that did not work.
wget version: wget-1.13.4
openssl version: OpenSSL 1.0.1f 6 Jan 2014
I am trying to download all lecture resources from a course's webpage on coursera.org. So, the URL would look something like this: https://class.coursera.org/matrix-002/lecture
Accessing this webpage online requires form authentication; I am not sure if that is causing the failure.
It works from here with the same OpenSSL version, but a newer version of wget (1.15). Looking at the changelog, there is the following significant change regarding your problem:
1.14: Add support for TLS Server Name Indication.
Note that class.coursera.org itself does not require SNI, but www.coursera.org requires it.
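If you want to confirm that SNI is what matters, openssl s_client can reproduce both handshakes (on old OpenSSL releases like the 1.0.1f in question, s_client only sends SNI when -servername is given):
# with SNI: the handshake should succeed
openssl s_client -connect www.coursera.org:443 -servername www.coursera.org
# without SNI: an SNI-requiring host may answer with a fatal handshake alert
openssl s_client -connect www.coursera.org:443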
And if you call wget with -v --debug (as I explicitly recommended in my comment!) you will see:
$ wget https://class.coursera.org
...
HTTP request sent, awaiting response...
HTTP/1.1 302 Found
...
Location: https://www.coursera.org/ [following]
...
Connecting to www.coursera.org (www.coursera.org)|54.230.46.78|:443... connected.
OpenSSL: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
Unable to establish SSL connection.
So the error actually happens with www.coursera.org and the reason is missing support for SNI. You need to upgrade your version of wget.
You probably have an old version of wget. I suggest installing wget using Chocolatey, the package manager for Windows. This should give you a more recent version (if not the latest).
Run this command after having installed Chocolatey (as Administrator):
choco install wget
I was on SLES 12, and for me it worked after upgrading to wget 1.14, using --secure-protocol=TLSv1.2 and using --auth-no-challenge.
wget --no-check-certificate --secure-protocol=TLSv1.2 --user=satul --password=xxx --auth-no-challenge -v --debug https://jenkins-server/artifact/build.x86_64.tgz
One alternative is to replace the "https" with "http" in the URL that you're trying to download from, to just circumvent the SSL connection. Not the most secure solution, but this worked in my case.
I was having this problem on Ubuntu 12.04.3 LTS (well beyond EOL, I know...) and got around it with:
sudo apt-get update && sudo apt-get install ca-certificates
Basically your OpenSSL uses SSLv3 and the site you are accessing does not support that protocol.
Just update your wget:
sudo apt-get install wget
Or if it already supports another secure protocol, just add it as an argument:
wget https://example.com --secure-protocol=TLSv1
(valid values include auto, SSLv3, TLSv1 and, on newer wget versions, TLSv1_1 and TLSv1_2)
The command below downloads files from a TLSv1.2 website:
curl -v --tlsv1.2 https://example.com/filename.zip
It worked!
Otherwise it might be simpler to just use curl instead. There is no particular need to specify any TLS option; it can simply be:
curl -O https://example.com/filename.zip
With curl there is no need to add the -v or --tlsv1.2 options when facing the wget SSL error.