I am running the Docker CLI for Wappalyzer from a RHEL box. We have a number of sites that we scan by IP address, where certificate verification prevents Wappalyzer from returning results. Is there a way to disable the certificate check for Wappalyzer or RHEL, like you can for wget with --no-check-certificate?
The GitHub repo now includes a commit (strictSSL: false) that resolves the issue:
https://github.com/AliasIO/Wappalyzer/commit/09ebc4aa486b5151ab05344e7611006a877408c2
I am using the Google Cloud Deployment Manager WordPress click-to-deploy solution.
I installed a certificate through the virtual machine SSH on the Compute Engine page using Certbot. Immediately after I installed the certificate, the page started showing
"ssl_error_bad_cert_domain" and didn't open.
I went back to the SSH session and deleted the certificate with the command sudo certbot delete. Since that didn't solve the error, I tried turning the VM off and on and also restarting it, but neither resolved the issue.
I could see in the Logs Explorer an error coming from the VM saying: Invalid ssh key entry - expired key:[expired_key], so I requested a new one through the VM SSH using:
ssh-keygen -t rsa -f ~/.ssh/gcloud_instance1 -C username
then printed its contents with cd ~/.ssh && cat gcloud_instance1.pub and added that to the VM's ssh-keys text area. That did stop the errors in the Logs Explorer, but it didn't solve the issue: the WordPress installation still doesn't open.
Another thing to add: when the VM was turned off and on, its external IP address changed, and I'm not sure whether that is also contributing to the problem.
This is how the page currently looks: webpage failing
This is the logs explorer : https://docs.google.com/spreadsheets/d/1cKXWkfaFbmUFakomwM_-TtoSelxWKBDxNBK7fnMy_PA/edit?usp=sharing
Any ideas what could be happening?
Thanks
The "ssl_error_bad_cert_domain" error can happen when the SSL certificate is not properly installed or is not issued for the correct domain.
My guess is that you had a typo in the domain name, or you need a static IP.
You can run sudo certbot certificates to verify the certificate after the installation.
It is recommended to have a static IP for better stability, especially if you are using SSL certificates.
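A sketch of a possible fix, assuming the Apache-based click-to-deploy image and a placeholder domain example.com: reserve the VM's current external IP as a static address so it stops changing across restarts, point the domain's DNS A record at it, and then reissue the certificate for the correct domain.
# Promote the VM's current ephemeral external IP to a static address (name, IP and region are placeholders)
gcloud compute addresses create wordpress-ip --addresses=203.0.113.10 --region=us-central1
# Once the DNS A record for example.com points at that IP, reissue the certificate
sudo certbot --apache -d example.com -d www.example.com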
As part of an automated build, we download some code from GitHub. Minimal example:
wget github.com
Recently, the command started failing with a certificate error:
URL transformed to HTTPS due to an HSTS policy
--2017-10-05 11:43:45-- https://github.com/
Resolving github.com (github.com)... 192.30.253.112, 192.30.253.113
Connecting to github.com (github.com)|192.30.253.112|:443... connected.
ERROR: cannot verify github.com's certificate, issued by 'CN=DigiCert SHA2 Extended Validation Server CA,OU=www.digicert.com,O=DigiCert Inc,C=US':
Unable to locally verify the issuer's authority.
I tried updating the certificate store, and wget itself:
update-ca-certificates
apt-get install wget
The error is still the same.
My wget version is GNU Wget 1.17.1, and the OS is Ubuntu 16.04.3.
You can skip checking the validity of the certificate by adding the --no-check-certificate option to the wget command line.
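For example, with the URL from the question (note that this disables protection against man-in-the-middle attacks, so only use it where that risk is acceptable):
wget --no-check-certificate https://github.com/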
The answer turned out to lie somewhere in package configuration. Unfortunately, I am unable to tell exactly why. The suspicion is that a Mono version installed from a PPA was interfering with our certificate store.
I've been trying off and on to get a LAMP development server operational behind my corporate firewall (McAfee Web Gateway). I have an Ubuntu/Trusty64 image on a VirtualBox VM provisioned through Vagrant. I cannot get "some" {most} repositories to load for a proper sudo apt-get update. I'm getting a 401 authentication required error on all 'security.ubuntu.com trusty-security/*' sources and 'archive.ubuntu.com trusty/*' sources, and all of them fail to fetch. Therefore almost every sudo apt-get install {whatever} fails, and I cannot add the necessary PPA repository to install the LAMP environment I want.
I can turn off SSL verification for some things and can get many things installed - but I need SSL working correctly within this environment.
Digging deeper, I find that if I curl -v https://url.com:443, I get the
curl(60): ssl certificate error: unable to get local issuer certificate.
I have the generic bundle 'ca-bundle.crt' installed locally in /usr/local/share/ca-certificates/ and ran sudo update-ca-certificates, which seemed to update ca-certificates.crt in /etc/ssl/certs/.
I ran strace -o stracker.out curl -v https://url.com:443 and searched for the failing stat() as suggested here by No-Bugs_Hare, and found that curl was looking for 'c099e901.0' in /etc/ssl/certs/ and it isn't there. Googling that particular hex ID turned up nothing, and I am stuck at this step.
Next I tried strace -o traceOppenSSL.out openssl s_client -connect url.com:443 to see if I can get more detail but can't see what causes the
verify error:num=20:unable to get local issuer certificate
followed by two other errors (I'm sure all relating to the first one), then it displays the "Server Certificate" within a BEGIN/END block, followed by a bunch of other metadata. The entire session ends with
Verify return code: 21 (unable to verify the first certificate).
So, this is not my forte and I'm doing what I can to try and get this VM operational. Like I said earlier, I've been trying many things and understand most of the issue is the fact that I'm behind a McAfee firewall within my corporate structure. I don't know how to troubleshoot more than what I've explained above but I'm willing to dig deeper.
I have a few questions. Why is curl looking for that particular hex ID, and where would I find or generate the beast? Are there other troubleshooting steps I should try? The VM is a server-class Ubuntu install, so I only have an SSH CLI terminal and no window-manager GUI to work with.
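For background on the hex ID: OpenSSL, which curl is using here, looks up CAs in a certificate directory by a hash of the issuing CA's subject name with a .0 suffix, so c099e901.0 is just the hash-named file for whatever CA signed the certificate the proxy presented. A sketch of how to check and fix this, assuming you can export the McAfee Web Gateway root CA as mcafee-root.crt (a placeholder filename):
# Show the hash OpenSSL derives for this CA; it should match the name curl was stat()ing
openssl x509 -hash -noout -in mcafee-root.crt
# Install it where update-ca-certificates will pick it up; this also refreshes the hash-named links in /etc/ssl/certs/
sudo cp mcafee-root.crt /usr/local/share/ca-certificates/
sudo update-ca-certificates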
I've already purchased an SSL certificate from DigiCert and installed it into my Nexus server (running in Tomcat, JKS keystore).
It works well in Firefox and Chrome (the green address bar indicates that a valid certificate was received), and builds can be downloaded from the Nexus web UI too.
But wget could not get the result without --no-check-certificate; I get something like:
ERROR: cannot verify mydomain.com's certificate, issued by `/C=US/O=DigiCert Inc/OU=www.digicert.com/CN=DigiCert High Assurance CA-3':
Unable to locally verify the issuer's authority.
To connect to mydomain.com insecurely, use `--no-check-certificate'.
Unable to establish SSL connection.
Found something,
SSL connection fails with wget, curl, but succeed with firefox and lynx
linux wget not certified?
But neither of them gives a final solution. I want to know whether some special configuration is needed on Nexus, or whether this is a bug in the wget command.
Google returns many results for "digicert wget", but I cannot find a clue there either. Thank you!
You need to add the DigiCert root certificate to a store accessible by wget:
http://wiki.openwrt.org/doc/howto/wget-ssl-certs
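A minimal sketch of that approach, assuming you have already downloaded the appropriate DigiCert root/intermediate certificate from DigiCert (saved here as digicert-root.pem, a placeholder filename):
# One-off: point wget at the certificate explicitly
wget --ca-certificate=digicert-root.pem https://mydomain.com/
# Or make it permanent for every wget invocation
echo "ca_certificate = /path/to/digicert-root.pem" >> ~/.wgetrc
A likely reason browsers succeed while wget fails is that the server is not sending the intermediate ("DigiCert High Assurance CA-3") in its chain; importing the full chain into the Tomcat/Nexus keystore may also fix it.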
I am getting the below error while making an SSL connection with a self-signed certificate:
"Peer certificate cannot be authenticated with known CA certificates"
It works fine with a CA-signed certificate.
I am setting the below using curl_easy_setopt():
curl_easy_setopt(MyContext, CURLOPT_CAPATH, CA_CERTIFICATE_PATH);
curl_easy_setopt(MyContext, CURLOPT_SSL_VERIFYPEER, TRUE);
The curl version: libcurl-7.19.7-26
The OpenSSL version: 0.9.8u
Please let me know how to solve this issue.
By default, curl will verify the SSL certificate to see if it's valid and issued by an accepted CA. To do this, curl uses a bundled set of CA certificates.
If you'd like to turn off curl's verification of the certificate, use the -k (or --insecure) option. Here's an example:
curl --noproxy '*' -k -D - https://127.0.0.1:443/some-secure-endpoint
Security issue: This answer disables a security feature. Do not use this in production!
For PHP, it is possible to switch off curl's verification of the certificate (see the warning below), e.g. before calling curl_exec:
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE);
http://php.net/manual/en/function.curl-setopt.php
(Evaluate the security risk yourself; in my case it was a partner company's server and the required file contained no sensitive information, it just happened to be hosted on a secure server.)
We fixed a similar issue on CentOS 6 by updating curl to the latest version available in the standard repositories and installing the newest ca-certificates bundle:
yum update curl
yum install ca-certificates
libcurl performs peer SSL certificate verification by default. This is done
by using a CA cert bundle that the SSL library can use to make sure the peer's
server certificate is valid.
If you communicate with HTTPS or FTPS servers using certificates that are
signed by CAs present in the bundle, you can be sure that the remote server
really is the one it claims to be.
Until 7.18.0, curl bundled a severely outdated ca bundle file that was
installed by default. These days, the curl archives include no ca certs at
all. You need to get them elsewhere. See below for example.
For more about peer SSL certificate verification, visit http://curl.haxx.se/docs/sslcerts.html
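As a concrete illustration of "get them elsewhere", one common approach is to download the Mozilla-derived CA bundle that the curl project publishes (the URL below is the one the curl documentation has historically pointed to) and reference it explicitly:
# Download the CA bundle
curl -o cacert.pem http://curl.haxx.se/ca/cacert.pem
# Use it for a single request
curl --cacert cacert.pem https://mydomain.com/
With libcurl, the equivalent is setting CURLOPT_CAINFO to the path of that bundle file.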
Though this error happened when using git clone rather than curl, I recently stumbled across an identical error message:
Peer certificate cannot be authenticated with known CA certificates
Similar to Arth's findings, something that worked for CentOS 6 (in order to successfully use HTTPS URLs with git clone for related GitLab repositories) involved updating the trusted certificates on the server (i.e., the server that is using HTTPS), using the following steps:
sudo yum install ca-certificates
sudo update-ca-trust enable
sudo cp /path/to/your_new_cert.crt /etc/pki/ca-trust/source/anchors/
sudo update-ca-trust extract
Perhaps the same certificate steps can be applied for the case of curl (or other similar scenarios) for users on CentOS in the future.
Security issue: This answer disables a security feature. Do not use this in production!
In C,
curl_easy_setopt(curl_handle, CURLOPT_SSL_VERIFYPEER, 0L);
worked for me.
As we checked and observed on CentOS 8:
Due to a proxy issue, the package manager was not able to reach the repositories to update or download any packages.
Try adding sslverify=0 to /etc/dnf/dnf.conf.
It worked for me.
Also make sure your server has proper internet access.
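If the root cause really is the proxy, a less drastic sketch (with a hypothetical proxy address) is to configure the proxy for dnf instead of disabling TLS verification, by adding a line like this to /etc/dnf/dnf.conf:
proxy=http://proxy.example.com:3128
sslverify=0 turns off certificate checking for every repository, so treat it as a last resort.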