wget and curl stopped working with HTTPS and wrongly complain about an expired certificate

I have a script that runs every day on an Ubuntu 14.04 server. The script is a simple wget command that downloads a file from a remote server and saves it to the local file system:
wget https://example.com/resources/scripts/myfile.php -O myfile.php
It has worked fine for months, until this morning when I suddenly started getting:
--2020-05-30 11:57:16-- https://example.com/resources/scripts/myfile.php
Resolving example.com (example.com)... xx.xx.xx.xx
Connecting to example.com (example.com)|xx.xx.xx.xx|:443... connected.
ERROR: cannot verify example.com's certificate, issued by ‘/C=GB/ST=Greater Manchester/L=Salford/O=Sectigo Limited/CN=Sectigo RSA Domain Validation Secure Server CA’:
Issued certificate has expired.
To connect to example.com insecurely, use `--no-check-certificate'.
The SSL certificate for the domain is valid and does not expire until Jan 2022. Nothing has changed on that front, and yet somehow wget no longer sees that.
Here is another interesting fact. If I run this same exact command on an Ubuntu 18 box, it works like a charm without any complaints. This tells me something is wrong with my Ubuntu 14.04 machine.
Curl produces the same error:
curl https://example.com
curl: (60) SSL certificate problem: certificate has expired
This post suggests that the certificate bundle is out of date. I downloaded the suggested PEM file and tried running wget with the --ca-certificate=cacert.pem option, but to no avail.
I have also tried running: apt install ca-certificates and update-ca-certificates, but that did not work either.
Again, everything works great on an Ubuntu 18 box, but not on Ubuntu 14 or 16. Also, why did it work fine until this morning, when I know nobody has touched the box? Clearly something is out of date, but I can't seem to figure out how to fix it.
Does anybody have any suggestions?

I had the same error two days ago with a Comodo certificate and Ubuntu 16.04.
The problem is the AddTrust External Root expiry, as mrmuggles said: https://support.sectigo.com/Com_KnowledgeDetailPage?Id=kA03l00000117LT
I fixed it with these steps:
vi /etc/ca-certificates.conf
Remove (or comment out) the line specifying AddTrust_External_Root.crt
apt update && apt install ca-certificates
update-ca-certificates -f -v
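The steps above can also be scripted non-interactively. A minimal sketch, operating on a throwaway copy of the file rather than the real /etc/ca-certificates.conf (on a real system you would edit that file as root and then rerun update-ca-certificates -f):

```shell
# Scratch copy standing in for /etc/ca-certificates.conf.
conf=$(mktemp)
printf '%s\n' \
  'mozilla/ACCVRAIZ1.crt' \
  'mozilla/AddTrust_External_Root.crt' \
  'mozilla/USERTrust_RSA_Certification_Authority.crt' > "$conf"

# ca-certificates treats a leading '!' as "deselected", which has the
# same effect as removing the line from the file.
sed -i 's|^mozilla/AddTrust_External_Root\.crt$|!mozilla/AddTrust_External_Root.crt|' "$conf"
cat "$conf"
```

After the real edit, update-ca-certificates -f rebuilds /etc/ssl/certs/ca-certificates.crt without the expired root.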

https://askubuntu.com/questions/440580/how-does-one-remove-a-certificate-authoritys-certificate-from-a-system
Like the original poster, the method of editing ca-certificates.conf did not work for me on Ubuntu 14.04.
What did work:
Run sudo dpkg-reconfigure ca-certificates
Deselect the problem CA: AddTrust_External_Root
Press OK
My understanding is that this deletes the expired AddTrust_External_Root CA, so that the newer USERTrust_RSA_Certification_Authority CA is used instead.

For wget, add --no-check-certificate. Note that this disables certificate verification entirely, so treat it as a last resort.
Example: wget https://example.com/resources/scripts/myfile.php --no-check-certificate -O myfile.php

Related

Where do I find the ca certificates for mosquitto_sub and pub?

From the article "mosquitto_sub with TLS enabled" I understand that you need to provide a --capath or --cafile option to mosquitto_sub (and mosquitto_pub), but I am having trouble figuring out where those files/paths come from.
Back in October I was able to run mosquitto_sub -h mymosquitto.com -p 8883 -v -t 'jim/#' -u <u> -P <pw> --capath ssl/certs from my desktop computer (running Mint 19). That no longer works. I ran apt install ca-certificates and found the .crt files in /usr/share/ca-certificates/mozilla/, but when I used that path it still gave me: Error: A TLS error occurred.
This is an Ubuntu 18.04 server using Let's Encrypt. I tried to point --cafile at the chain.pem file referenced in my mosquitto configuration:
allow_anonymous false
password_file /etc/mosquitto/pwfile
listener 1883
listener 8883
certfile /etc/letsencrypt/live/mymosquitto.com/cert.pem
cafile /etc/letsencrypt/live/mymosquitto.com/chain.pem
keyfile /etc/letsencrypt/live/mymosquitto.com/privkey.pem
But that didn't work either. Can someone please help me understand what I should be doing?
From the mosquitto_sub man page:
--capath
Define the path to a directory containing PEM encoded CA certificates that are trusted. Used to enable SSL communication.
For --capath to work correctly, the certificate files must have ".crt" as the file ending and you must run "openssl rehash <path to capath>" each time you add/remove a certificate.
If you want to use a directory of certs you will have to make sure the openssl rehash command mentioned has been run on that directory.
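A minimal sketch of that rehash step, using a throwaway directory and a self-signed demo certificate (the real directory would be whatever you pass to --capath; this assumes OpenSSL 1.1.0+, which provides the rehash subcommand — older versions ship a c_rehash script instead):

```shell
# Build a scratch --capath directory.
certdir=$(mktemp -d)

# Generate a throwaway self-signed cert purely for demonstration.
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo" \
  -keyout "$certdir/demo.key" -out "$certdir/demo.crt" -days 1 2>/dev/null

# Create the hash symlinks that OpenSSL's directory lookup requires.
openssl rehash "$certdir"
ls "$certdir"
```

The directory now contains a hash-named symlink (eight hex digits plus ".0") pointing at demo.crt; without those symlinks, OpenSSL cannot find any certificate in a --capath directory and the TLS handshake fails.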
If you want to use a single file instead, point --cafile at the Let's Encrypt fullchain.pem file.
I have rethought my situation. Since my certs get regenerated every 3 months or so, I'm going to have to redo my apps with the new files anyway, so I decided to just go back to rolling my own. I did that using this site: http://www.steves-internet-guide.com/mosquitto-tls/ and I'm back to where I was in October. Thanks to hardillb for the advice.
Jim.

Why isn't certbot writing the verification file?

I am trying to install a certificate using certbot on Ubuntu Xenial by using the below command:
sudo certbot run -a webroot -i apache -w /var/www/mydomain/public/.well-known/acme-challenge/ -d "example.com"
I get a challenge failed error with the following notes:
Domain: mydomain.com
Type: unauthorized
Detail: Invalid response from
http://example.com/.well-known/acme-challenge/lvJ9RbuDyoPn4NXnxPpjOYpsGHZb6ZYdDoBWW-6JC1k
I created the /.well-known/acme-challenge directory myself thinking this might help, but it didn't. I tried putting a file into the acme-challenge directory and browsing to it through Chrome, and this worked without an issue. Therefore I know the Apache host is set up correctly.
I'm now at a loss of what to try.

Problem getting an SSL certificate for my domain on a DigitalOcean droplet through Let's Encrypt

I was trying to get an SSL certificate for my domain on a phpMyAdmin droplet by following the steps at https://www.digitalocean.com/community/tutorials/how-to-secure-apache-with-let-s-encrypt-on-ubuntu-18-04. My server is OK. I have a DNS A record for my domain.com and a CNAME record for my www.domain.com.
When I executed "sudo certbot --apache -d your_domain -d www.your_domain"
it asked me to enter an email address, and after that it gave me the following error:
"An unexpected error occurred:
The client lacks sufficient authorization :: Account creation on ACMEv1 is disabled. Please upgrade your ACME client to a version that supports ACMEv2 / RFC 8555. See https://community.letsencrypt.org/t/end-of-life-plan-for-acmev1/88430 for details.
"
I tried with root and with a non-root admin user via sudo, but got the same result. Any help is appreciated.
Best
I got it resolved. So first run
sudo apt update
sudo apt install --only-upgrade certbot
and then
sudo certbot --apache -d your_domain -d www.your_domain
worked for me

Cannot connect to "0.0.0.2 - Published app name" in Citrix Receiver (ICA Client 13.10.x)

When you try to establish a connection with Citrix Receiver to various Citrix-based VPN services, an error message appears: "Connection with 0.0.0.2 ... cannot connect, no such file or directory"
Current configuration:
Fedora 28 with all updates
Browser Firefox 63.0
ICAClient-13.10.0.20-0.x86_64
The suggested fix from the Ubuntu forum (Citrix receiver error 1000119) did not help me:
cd /opt/Citrix/ICAClient/keystore/
sudo rm -rf cacerts
sudo ln -s /etc/ssl/certs cacerts
My problem is also with certificates, specifically the DigiCert and Comodo root certificates.
The technical solution to my problem is as follows; run it in the terminal as root:
su
cd /opt/Citrix/ICAClient/keystore/cacerts/
wget https://dl.cacerts.digicert.com/DigiCertHighAssuranceEVRootCA.crt
curl "https://support.comodo.com/index.php?/Knowledgebase/Article/GetAttachment/969/821026" > comodorsacertificationauthority.crt
exit
Then connect to the VPN services again.
It worked for me on Ubuntu by copying the certs (the entire contents of /etc/ssl/certs) into /opt/Citrix/ICAClient/keystore/cacerts.
I just installed the newest Receiver (now called Workspace), but it had the same SSL certificate issue.
I recalled a 2013 fix I had used several years ago from AskUbuntu: make Firefox's certificates accessible to Citrix.
I suppose it has to do with certificates that my institution uses that aren't included in the usual Citrix downloads, since most people are not screaming for a fix.
All you have to do is set a symbolic link:
sudo ln -s /usr/share/ca-certificates/mozilla/* /opt/Citrix/ICAClient/keystore/cacerts
or wherever your flavor's Citrix and mozilla certificates are stored...YMMV
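The same idea, sketched on scratch directories so nothing on the system is touched (the certificate filenames here are illustrative; on a real system the source would be /usr/share/ca-certificates/mozilla or /etc/ssl/certs and the target /opt/Citrix/ICAClient/keystore/cacerts, run with sudo):

```shell
# Stand-ins for the system cert directory and the Citrix keystore.
src=$(mktemp -d)
dst=$(mktemp -d)

# Two dummy files standing in for the system CA certificates.
touch "$src/DigiCert_Global_Root_CA.crt" "$src/COMODO_RSA_Certification_Authority.crt"

# Symlink every cert into the keystore so Citrix can see them;
# the originals stay owned and updated by the distro's ca-certificates.
ln -s "$src"/* "$dst"/
ls -l "$dst"
```

Using symlinks rather than copies means the keystore automatically picks up updates when the distro refreshes its CA bundle.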
I faced the same issue on Ubuntu. In my case I just changed my connection type from ipv4/ipv6 to ipv4, and it worked.

wget ssl alert handshake failure

I am trying to download files from an https site and keep getting the following error:
OpenSSL: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
Unable to establish SSL connection.
From reading blogs online, I gather I have to provide the server cert and the client cert. I have found steps for downloading the server cert, but not the client cert. Does anyone have a complete set of steps for using wget with SSL? I also tried the --no-check-certificate option, but that did not work.
wget version: wget-1.13.4
openssl version: OpenSSL 1.0.1f 6 Jan 2014
I'm trying to download all lecture resources from a course's webpage on coursera.org, so the URL looks something like this: https://class.coursera.org/matrix-002/lecture
Accessing this webpage online requires form authentication; I'm not sure if that is causing the failure.
It works from here with the same OpenSSL version but a newer version of wget (1.15). Looking at the changelog, there is the following significant change relevant to your problem:
1.14: Add support for TLS Server Name Indication.
Note that this site (class.coursera.org) does not require SNI, but www.coursera.org does.
And if you call wget with -v --debug (as I explicitly recommended in my comment!), you will see:
$ wget https://class.coursera.org
...
HTTP request sent, awaiting response...
HTTP/1.1 302 Found
...
Location: https://www.coursera.org/ [following]
...
Connecting to www.coursera.org (www.coursera.org)|54.230.46.78|:443... connected.
OpenSSL: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
Unable to establish SSL connection.
So the error actually happens with www.coursera.org and the reason is missing support for SNI. You need to upgrade your version of wget.
You probably have an old version of wget. On Windows, I suggest installing wget using Chocolatey, the package manager for Windows; this should give you a more recent version (if not the latest).
Run this command after having installed Chocolatey (as Administrator):
choco install wget
I was on SLES12, and for me it worked after upgrading to wget 1.14, using --secure-protocol=TLSv1.2, and using --auth-no-challenge.
wget --no-check-certificate --secure-protocol=TLSv1.2 --user=satul --password=xxx --auth-no-challenge -v --debug https://jenkins-server/artifact/build.x86_64.tgz
One alternative is to replace "https" with "http" in the URL you're trying to download from, simply circumventing the SSL connection. Not the most secure solution, but it worked in my case.
I was having this problem on Ubuntu 12.04.3 LTS (well beyond EOL, I know...) and got around it with:
sudo apt-get update && sudo apt-get install ca-certificates
Basically your OpenSSL uses SSLv3 and the site you are accessing does not support that protocol.
Just update your wget:
sudo apt-get install wget
Or, if your wget already supports a newer secure protocol, pass it as an argument (wget accepts values such as TLSv1, TLSv1_1, and TLSv1_2; see wget --help for the list your build supports):
wget https://example.com --secure-protocol=TLSv1_2
The command below downloads a file from a website that requires TLSv1.2:
curl -v --tlsv1.2 https://example.com/filename.zip
It worked!
Otherwise it might be simpler to just use curl instead. There is no particular need to specify any option; it can simply be:
curl https://example.com/filename.zip
With curl there is no need to add any extra option when facing the wget SSL error.