How to set validate_certs to False in an OpenStack-Ansible deployment (SSL)

I am currently deploying OpenStack using OSAD and the following error occurs:
Failed to validate the SSL certificate for raw.githubusercontent.com:443. Make sure your managed systems have a valid CA certificate installed. You can use validate_certs=False if you do not need to confirm the servers identity but this is unsafe and not recommended
I have no idea which file I should update with validate_certs=False.

This only applies if you are using an https URL as the source of the keys. If set to no, the SSL certificates will not be validated.
It should only be set to no on personally controlled sites using self-signed certificates, since it skips verifying the source site.
Prior to Ansible 2.1 the code behaved as if this was set to yes.
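For example, on a single task the option is set like this (a minimal sketch with a placeholder URL and destination; the option is documented for modules such as get_url and apt_key):

# Minimal sketch: fetch a file over HTTPS without verifying the certificate (unsafe, test use only)
- name: Download a file without certificate validation
  get_url:
    url: https://raw.githubusercontent.com/example/project/master/file.txt
    dest: /tmp/file.txt
    validate_certs: no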

It depends on the role that raised the error. For example, for an SSL error in the haproxy_server role, you would edit this file:
/etc/ansible/roles/haproxy_server/defaults/main.yml
and set:
haproxy_hatop_download_validate_certs: no
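Rather than editing the role defaults in place, the usual OpenStack-Ansible convention is to override the variable in your user variables file so it survives role updates (the path below assumes a standard OSA install):

# /etc/openstack_deploy/user_variables.yml
# Assumed override file; the haproxy_server role should pick this up on the next playbook run
haproxy_hatop_download_validate_certs: no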

Related

How to disable 'Your connection is not private' screen in Chrome?

I'm working on automating a web application (F# and Canopy). I get the 'Your connection is not private' screen upon launching the website / after providing login credentials. I tried a few workarounds to disable it, but none did the job. Please help.
The best approach here is not to try to hide or cover up the problem, but to fix it properly so you don't have to. Solutions that involve hiding the issue will necessarily hurt your security.
Note the wording of the error code: ERR_CERT_AUTHORITY_INVALID. That tells us that the certificate for the site is signed by a non-standard or unknown certificate authority.
You mentioned localhost in your comment; you're not going to be able to get a CA-issued certificate for that, but you could create a self-signed one. However, if you've enabled the localhost exemption and you're still getting the error, it suggests that you may not actually be using localhost after all.
So, if you have a certificate signed by a real CA and you're seeing this error, it's likely that your local OS or browser has an outdated CA root certificate bundle. You can usually get the latest one by making sure your OS packages are up to date.
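On a Debian/Ubuntu system, for example, refreshing the root bundle typically looks like this (package and command names assumed for that distro family):

# Update the CA root certificate bundle shipped by the distribution
sudo apt-get update
sudo apt-get install --only-upgrade ca-certificates
sudo update-ca-certificates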
If your certificate is self-signed, then the 'advanced' button will allow you to add an exemption. If you have set up your own CA and signed the certificate with that, you need to add that CA's public certificate to your OS or browser trust store.
If you've got a "regular" commercial certificate from Verisign, Let's Encrypt, Comodo or whoever, then a run through a testing tool like testssl.sh or Qualys SSL Labs will tell you more about what's going wrong. Without knowing the actual domain we can't test anything for you.
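A basic run looks like this (the host name is a placeholder; testssl.sh prints the certificate chain, issuer and handshake details):

./testssl.sh https://your-domain.example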
Added the following argument and it did the job:
options.AddArguments("--ignore-certificate-errors")
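For context, this is roughly where that argument goes when the driver is created through Selenium's .NET bindings (a sketch only; a Canopy setup would need to pass the options through however it constructs its ChromeDriver):

using OpenQA.Selenium.Chrome;

var options = new ChromeOptions();
// Skip certificate errors such as ERR_CERT_AUTHORITY_INVALID (test environments only)
options.AddArguments("--ignore-certificate-errors");
var driver = new ChromeDriver(options);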

Self-signed *.dev cert untrusted using Firefox 59 on Ubuntu

I am using Firefox 59.0.1 on Ubuntu and I am seeing the following error when accessing my development environment which is behind a self-signed SSL cert.
Your connection is not secure
The owner of crmpicco.dev has configured their website improperly. To
protect your information from being stolen, Firefox has not connected
to this website.
This site uses HTTP Strict Transport Security (HSTS) to specify that
Firefox may only connect to it securely. As a result, it is not
possible to add an exception for this certificate.
crmpicco.dev uses an invalid security certificate.
The certificate is not trusted because it is self-signed.
Error code: SEC_ERROR_UNKNOWN_ISSUER
I have added "crmpicco.dev" to security.tls.insecure_fallback_hosts and set security.enterprise_roots.enabled to true, restarted Firefox but this has had no effect.
I know Chrome has its "badidea"/"thisisnotsafe" workaround, which I know isn't ideal but at least works, whereas I have yet to find a Firefox equivalent.
What is the solution for this? Do I need to generate new self-signed certs, even though the cert I have is from Feb 2018?
I have tried the suggestions from numerous questions on here and on Mozilla support, to no effect.
The top-level domain *.dev is owned by Google. For some time Chrome has shipped a preloaded HSTS policy for it, which makes it impossible to use self-signed certificates for this domain. Firefox recently added such a policy too, so you now get the same behavior there.
There are several ways to deal with this. The best way is not to use any current or future public top-level domains for your private purposes. By using such domains you risk getting into conflict with usage policies enforced by the domain owner, like the enforced HSTS in the case of *.dev, and it might even cause security problems. Instead, use either domains you actually own or top-level domains which are reserved for internal and test use, like *.test, *.invalid or *.example.
If you really want to use *.dev internally (again, a bad idea) you can do it by following the policy of this domain: don't use a self-signed certificate, but use a certificate issued by a CA trusted by your browser. This means creating your own CA, adding it as trusted to the browser, and then issuing the certificates you want from this CA. But again, using public domains you don't own (no matter if top-level or not) is a recipe for trouble.
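A minimal sketch of that own-CA approach with openssl (the file names, the 825-day lifetime and the switch to a crmpicco.test host name are all assumptions, adjust to your environment):

# 1. Create a private CA key and a self-signed CA certificate
openssl genrsa -out devca.key 2048
openssl req -x509 -new -key devca.key -sha256 -days 825 -subj "/CN=Local Dev CA" -out devca.crt

# 2. Create a key and CSR for the site, then sign the CSR with the CA;
#    include a subjectAltName, since browsers no longer accept the CN alone
openssl genrsa -out crmpicco.key 2048
openssl req -new -key crmpicco.key -subj "/CN=crmpicco.test" -out crmpicco.csr
openssl x509 -req -in crmpicco.csr -CA devca.crt -CAkey devca.key -CAcreateserial \
  -days 825 -sha256 -extfile <(printf "subjectAltName=DNS:crmpicco.test") -out crmpicco.crt

# 3. Import devca.crt into Firefox (Preferences > Privacy & Security > View Certificates >
#    Authorities > Import) and serve crmpicco.crt/crmpicco.key from the web server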

OpenSSL error when authenticating user for DocuSign API

We are trying to use composite templates (fillable PDFs) and embedded signing using the REST API. We are using the docusign_rest gem in conjunction with our custom code to create composite templates and embedded signing. The docusign_rest gem is used for authentication and is giving the following error:
OpenSSL::SSL::SSLError (SSL_connect returned=1 errno=0 state=SSLv3 read server certificate B: certificate verify failed)
On the local dev machine, we simply provided path to a certificate file at the time of starting the dev server, but on a remote machine this is not feasible.
Is it possible to skip the SSL check for demo purposes? This SO link seems to suggest that it is possible. If yes, then how can we achieve that?
If not, then is there a quick way to fix it, or do we have to install SSL certificates and configure the server to read them?
We are using Ruby 1.9.3, Rails 3.2.11 and Apache2 (so that would mean enabling the SSL module).
I believe for demo (demo.docusign.net) you can use https OR http. What happens if you simply use http? Does that resolve your SSL error?
In either case, you'll eventually need to resolve this though because for production (www.docusign.net) you need to use https. The problem is most likely in your Ruby code or with your certificate. For testing purposes I'd try making a cURL request through the command line to see if that works.
See here for some examples of making DocuSign REST API calls using cURL
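For example, a plain curl call against the demo host exercises the same TLS handshake and shows whether verification succeeds outside of Ruby (the endpoint path is just an illustration; --cacert is only needed if your system bundle is missing or outdated):

# -v prints the certificate chain and the verification result
curl -v https://demo.docusign.net/restapi/v2/login_information

# Point curl at an explicit CA bundle if the default one is not usable
curl -v --cacert /path/to/ca-bundle.crt https://demo.docusign.net/restapi/v2/login_information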

TrustStore and recurring "unable to find valid certification path to requested target"

I am trying to use Spring Security to authenticate users against Active Directory. So far I was using LDAP protocol, but now I would like to use LDAPS.
I followed this article http://blogs.oracle.com/gc/entry/unable_to_find_valid_certification and it works. I was able to bind user against AD successfully using LDAPS.
But after a while (15 - 30min), when I try to log in, I get this exception again:
sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.provider.certpath.SunCertPathBuilder.engineBuild(SunCertPathBuilder.java:174)
at java.security.cert.CertPathBuilder.build(CertPathBuilder.java:238)
at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:318)
and then I am no longer able to use LDAPS.
I tried to:
restart Tomcat
add the certificate directly to cacerts
start Tomcat with a path to the trustStore via the JVM system property -Djavax.net.ssl.trustStore (see the sketch below for how that is typically passed)
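For reference, the trustStore property is usually handed to Tomcat like this (the path and the default changeit password are assumptions):

# e.g. in $CATALINA_HOME/bin/setenv.sh
export CATALINA_OPTS="$CATALINA_OPTS -Djavax.net.ssl.trustStore=/path/to/jssecacerts -Djavax.net.ssl.trustStorePassword=changeit"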
The only thing that works is to recreate jssecacerts completely. It is not enough to just copy the existing jssecacerts to jre/lib/security; it MUST be a new file. I just do not understand...
My environment is: Java 1.6.0_26, Tomcat 7.0.20, Spring 3.0.5, Spring Security 3.1 RC2
Am I doing something wrong?
Thanks
OK, so I probably found the solution. I did not know that there are many physical machines behind one Active Directory URL :) When I used InstallCert it overwrote the keystore and generated a new one containing only the current certificate, which is why it sometimes worked and sometimes did not. I also found that all of the certificates are signed by one CA. After adding that CA's certificate to the trustStore it finally started to work.
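In case it helps anyone else, importing the AD root CA into the JVM trust store looks roughly like this (the alias, file name and default changeit password are assumptions):

# Export the AD root CA certificate to ad-root-ca.crt first, then:
keytool -importcert -trustcacerts -alias ad-root-ca -file ad-root-ca.crt \
  -keystore "$JAVA_HOME/jre/lib/security/cacerts" -storepass changeit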

Only sometimes, for the past two days: CurlException: 60: SSL certificate problem

Sometimes this appears, sometimes not. It started two days ago in apps that previously ran fine.
CurlException: 60: SSL certificate problem, verify that the CA cert is OK. Details: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed thrown in
With a former version of the PHP SDK, I disabled CURLOPT_SSL_VERIFYPEER altogether because verification never worked. But the last two versions, including the newest one I'm running now, worked until yesterday. Should I disable something again? Is it the same method in the current SDK? I'm writing from home, so I can't look at the code right now.
Is it a problem with the certificate bundled with the SDK, or a problem with the HTTPS certificate on my server?
You shouldn't disable CURLOPT_SSL_VERIFYPEER because of the security implications. The PHP SDK usually contains the needed certificate, but in your case it seems to have problems.
The best way to solve it is:
Download the Facebook SSL certificate here
Put it somewhere accessible by PHP
Tell the Facebook PHP SDK to use it:
Facebook::$CURL_OPTS[CURLOPT_CAINFO] = '/path/to/fb_ca_chain_bundle.crt';
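In context that ends up looking something like this (paths and app credentials are placeholders; assumes the v3.x SDK, where $CURL_OPTS is a public static array):

<?php
require_once '/path/to/facebook-php-sdk/src/facebook.php';

// Point cURL at the downloaded CA bundle instead of disabling CURLOPT_SSL_VERIFYPEER
Facebook::$CURL_OPTS[CURLOPT_CAINFO] = '/path/to/fb_ca_chain_bundle.crt';

$facebook = new Facebook(array(
  'appId'  => 'YOUR_APP_ID',
  'secret' => 'YOUR_APP_SECRET',
));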
I just ran into this same error (coworkers didn't have it) and the solution was to download a new copy of the Facebook API SDK from https://github.com/facebook/facebook-php-sdk. Apparently my version (and the certificate) was outdated.