Terraform Init/apply/destroy - SSL Connection Problems

Our company proxy breaks SSL connections and re-signs them with our own CA.
So I always have to tell the applications I use (RubyGems, Python pip, Azure CLI, ...) to use our company CA certificate.
Does anyone know how I can use our CA certificate with a local Terraform installation?

Is the CA deployed to your OS's certificate store, or can you import it? If so, Terraform (and probably other tools) should just work with a proxy like this with no other configuration. If you need further direction, tell us what operating system you are on and what access you have to the CA.
Edit:
@Kreikeneka, have you put the cert in the location CentOS expects and run the command that actually imports it into the store, update-ca-trust? If the cert is just being used for SSL and you only need to trust it when going through your proxy, that is all you should need to do. You shouldn't need to tell your tools (Terraform, pip, etc.) to trust it for SSL with the proxy: once the cert is imported into the certificate store, it should be usable by any process on the machine.
If you are using the cert for client authentication to the proxy, then just trusting the cert by placing it in the certificate store probably won't work.
It isn't clear from your comments whether you need the cert for SSL or for client authentication to the proxy. Check with your IT department what it is really used for if you aren't sure, and get back to us.
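
If importing the CA into the OS store isn't an option for some reason, another route is to point Terraform at the CA bundle directly. This is a minimal sketch, assuming a Linux shell and a PEM copy of your company CA at a path of your choosing (the path below is hypothetical); Terraform is a Go program, and Go honors the SSL_CERT_FILE / SSL_CERT_DIR environment variables on Linux:

    # Hypothetical location of the company CA certificate (PEM format)
    export SSL_CERT_FILE=/etc/pki/tls/certs/company-ca.pem

    # Terraform's TLS connections through the proxy should now verify against that CA
    terraform init
    terraform plan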
As of CentOS 6+, there is a tool for this. Per this guide, certificates can be installed first by enabling the system shared CA store:
update-ca-trust enable
Then place the certificates to trust as CAs in /etc/pki/ca-trust/source/anchors/ for high priority (non-overridable), or /usr/share/pki/ca-trust-source/ (lower priority, overridable), and finally update the system store with:
update-ca-trust extract
Et voilà, system tools will now trust those certificates when making secure connections!
Source:
https://serverfault.com/questions/511812/how-does-one-install-a-custom-ca-certificate-on-centos
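
Putting those steps together, a minimal sketch for CentOS (company-ca.crt is a hypothetical file name for your proxy's CA certificate) looks like this:

    # Enable the shared system CA store (as described in the guide above)
    update-ca-trust enable

    # Drop the company CA into the high-priority anchors directory
    cp company-ca.crt /etc/pki/ca-trust/source/anchors/

    # Rebuild the consolidated trust store
    update-ca-trust extract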

Related

SSL certificate in local IIS

Hello everyone,
Is it possible to upload an SSL certificate in (local) IIS?
If so, please provide me the steps.
If it is not possible, how can I make it run globally?
Sorry for the stupid questions, but I am really a newb in IIS.
I would like to know more details about "upload SSL certificate in IIS". In my opinion, IIS can create a self-signed certificate using the Server Certificates tool.
It also supports importing a PFX file into the certificate store.
We can also import it into the local machine certificate store by clicking the certificate file to install it, so that every user account on the machine can refer to the certificate when binding a certificate to a port.
Finally, a certificate is commonly bound to a port in order to secure the communication. We can refer to this link:
https://learn.microsoft.com/en-us/dotnet/framework/wcf/feature-details/how-to-configure-an-iis-hosted-wcf-service-with-ssl
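
For reference, here is a rough command-line sketch of that import-and-bind flow (an elevated prompt is assumed; the PFX name, password, certificate thumbprint and appid GUID are all placeholders, and the binding can equally be created in IIS Manager):

    rem Import the PFX into the Local Machine certificate store
    certutil -f -p "PfxPassword" -importPFX contoso.pfx

    rem Associate the certificate with port 443 at the HTTP.SYS level (placeholders shown)
    netsh http add sslcert ipport=0.0.0.0:443 certhash=0123456789abcdef0123456789abcdef01234567 appid={4dc3e181-e14b-4a21-b022-59fc669b0914}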
Feel free to let me know if there is anything I can help with.

Valid certificate signed by an intermediate CA (GoDaddy) doesn't work with a few clients (Docker Alpine)

I have bought a domain and a certificate from Azure. The certificate is issued by GoDaddy as an Azure partner and is signed by GoDaddy's intermediate certificate, so it always needs the chain of certificates up to the root CA.
Our website is used as a REST interface by our customers, so clients use a Java SDK or plain scripts. In our case that is specifically the Zulu JDK image (11u5-zulu-alpine) from Microsoft, and I even tried Ubuntu 16.04 LTS and got the same error.
If we even try to curl our site with proper certificates, we get the error below, and it's because the intermediate CA is missing:
curl: (60) server certificate verification failed. CAfile: /etc/ssl/certs/ca-certificates.crt CRLfile: none
With the Java SDK we get javax.net.ssl.SSLHandshakeException.
I have 3 questions.
Do we need to bundle the intermediate and root certificates with our domain certificate and deploy it? (The certificate is in PFX format.)
Is it good practice to tell our clients to install the bundled certificates (root and intermediate) in order to get this working?
Does GoDaddy need to update the bundled certificate in the packaging repositories of Ubuntu and Alpine, or is my understanding wrong?
And lastly, I have a fair idea of how certificates work between client and server, but with the intermediate CA in the picture I am unable to understand exactly where the intermediate CA should be put. I read a few posts on SO but it's still unclear. Please bear with me; if someone can explain the general approach and what good practice would be, that would help.
Do we need to bundle the intermediate and root certificates with our domain certificate and deploy it? (The certificate is in PFX format.)
You should definitely configure the server to send all required intermediate certs; this is required by the TLS standards. (Although if you don't, clients have the option to try to obtain them by other means, such as a cache, a repository, or AIA, and sometimes they do.) Whether the server sends the root is optional; the standards actually state this in reverse, by saying the server MAY omit the root, where the all-caps 'MAY' invokes the meaning defined in RFC 2119. E.g. for TLS 1.2, RFC 5246 section 7.4.2:
This is a sequence (chain) of certificates. The sender's
certificate MUST come first in the list. Each following
certificate MUST directly certify the one preceding it. Because
certificate validation requires that root keys be distributed
independently, the self-signed certificate that specifies the root
certificate authority MAY be omitted from the chain, under the
assumption that the remote end must already possess it in order to
validate it in any case.
How you do this depends on what web-server software you are using, which you didn't identify. From the fact that you specify a Java version, I can speculate it might be Tomcat, or something based on Tomcat like JBoss/WildFly. Even then, Tomcat's SSL/TLS configuration varies substantially depending on the version and which type of connector 'stack' you use (the pure-Java JSSE, or Tomcat Native, aka APR, the Apache Portable Runtime, which is actually OpenSSL). However, a 'pfx' (PKCS12) file can definitely contain both a private key and the matching (EE) certificate PLUS the chain cert(s) it needs, and is a convenient way to deal with the whole kaboodle at once.
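
As a concrete illustration, the full chain can be packed into a single PKCS12/PFX file with OpenSSL (a sketch; the file names below are placeholders for your own private key, leaf certificate, and the GoDaddy intermediate bundle):

    # Bundle private key, leaf certificate and intermediate chain into one PFX
    openssl pkcs12 -export \
        -inkey yourdomain.key \
        -in yourdomain.crt \
        -certfile gd_bundle.crt \
        -out yourdomain-fullchain.pfx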
For a cert obtained directly from GoDaddy, they provide instructions linked from https://www.godaddy.com/help/install-ssl-certificates-16623 for many common servers. I don't know if for Azure they use any different chaining that would alter these instructions.
If your server is publicly accessible, at port 443, https://www.ssllabs.com/ssltest will check if it is correctly handling the chain certs, as well as many other things. There are other tools as well but I am not familiar with them; for non-public servers I usually just look manually.
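
For a quick manual look, one option (assuming OpenSSL is installed; substitute your own host name) is to inspect exactly what the server actually sends and how it verifies:

    # List every certificate the server presents; a missing intermediate shows up here
    openssl s_client -connect yourdomain.example.com:443 -showcerts </dev/null

    # Verify against the system CA bundle (path shown is the Debian/Ubuntu default)
    openssl s_client -connect yourdomain.example.com:443 -CAfile /etc/ssl/certs/ca-certificates.crt </dev/null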
Is it good practice to tell our clients to install the bundled certificates (root and intermediate) in order to get this working?
Clients should not install intermediate cert(s) because as above the server should send them. The GoDaddy roots have been accepted in most official truststores for several years now, so most clients using default settings should not need to add them. However, some might; in particular Ubuntu 16.04 might be old enough that it doesn't have GoDaddy preinstalled. And any client(s) that wishes to use a customized truststore, and/or a pin, must ensure that it is set to include/allow your cert's trust chain.
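
If one of your Java clients genuinely does use its own truststore, adding the missing root is a one-liner with keytool (a sketch; the alias, file names and store password are placeholders):

    # Import the GoDaddy root CA into a client-side Java truststore
    keytool -importcert -trustcacerts \
        -alias godaddy-root \
        -file gdroot.crt \
        -keystore client-truststore.jks \
        -storepass changeit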
Does GoDaddy need to update the bundled certificate in the packaging repositories of Ubuntu and Alpine, or is my understanding wrong?
GoDaddy has supplied its roots to (AFAIK all) the major truststore programs, as above. IINM Ubuntu uses the Mozilla/NSS list, which definitely includes GoDaddy today, but as above I can't be sure about 16.04. I don't know for alpine. CAs do not request truststore programs to include intermediates (although a program or user may be able to add selected intermediate(s) as trusted, depending on the software used).

RabbitMQ SSL Guide... What changes are needed for production?

I followed the RabbitMQ SSL guide meticulously.
https://www.rabbitmq.com/ssl.html
And it works, of course. But I've had to install the certificate authority on the computer running the RabbitMQ server and on the PC connecting to the server.
What was not clear to me in the guide is what happens next. My self-created certificates are just for development. What do I do next to make this a production system?
Normally in production systems you have an internal CA maintained by your organization. Get your certificates signed by that CA.
Also make sure to use TLS v1.2.
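
As a rough sketch of what that looks like in the new-style rabbitmq.conf once you have CA-signed certificates (the file paths below are placeholders), the TLS listener and the minimum protocol version can be pinned like this:

    listeners.ssl.default            = 5671

    ssl_options.cacertfile           = /etc/rabbitmq/tls/ca_certificate.pem
    ssl_options.certfile             = /etc/rabbitmq/tls/server_certificate.pem
    ssl_options.keyfile              = /etc/rabbitmq/tls/server_key.pem
    ssl_options.verify               = verify_peer
    ssl_options.fail_if_no_peer_cert = true

    # Allow only TLS v1.2
    ssl_options.versions.1           = tlsv1.2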

TLS/SSL certificate verification

I am new to TLS/SSL so this might be a very basic question, but I've been searching a lot and could not find an answer.
I am trying to implement a TLS/SSL client. This client will run on an embedded unit with Windows OS on it (XPe or WES7). My implementation uses GnuTLS.
How do I get the list of trusted authorities to my unit so my client can verify the server's certificate? Is it supposed to be a file stored on the client side, that the client is responsible for keeping up to date? Or can my client somehow get this list from the internet each time it is needed and not maintain it locally?
The certificate authority (CA) root certificates are stored client side, and the client is responsible for keeping them up to date. Keeping them up to date isn't as hard as it sounds, as CA certificates aren't changed that often - most are valid for 5-10 years at least.
Client side storage is necessary because any given internet site your application uses might be compromised.
To get a list, you might start by looking at the CA certificates distributed with a browser or at the cacerts file distributed with Java. Before releasing your code, you'll probably want to check that the certs you use are genuine by checking them against information provided by the CA.
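
Once you have assembled such a CA bundle, you can sanity-check it from the command line with the gnutls-cli tool that ships with GnuTLS (a sketch; the bundle path and host name are placeholders):

    # Verify a server's certificate chain against the locally maintained CA bundle
    gnutls-cli --x509cafile /path/to/ca-bundle.pem -p 443 example.com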
Go to any Windows machine and run "certmgr.msc" from the command line. Export each CA certificate (Intermediate Certification Authorities, Trusted Root Certification Authorities) to a file (DER, PEM), then import these certificates into your embedded software. Now you can validate certificates the same way Windows does (i.e. using the various X.509-related RFCs and CRLs).

distributed PKI Certificate handling- making it user friendly

I have a server and N clients installed on different hosts. Each host has its own self-signed certificate, generated during install. Client authentication is turned on at this point, which means they can't communicate with each other until these certs are properly imported as described below.
Now, the server needs to import all of the clients' certificates, and every client needs to import the server's certificate. This part is really not user friendly to do during install, since the client and the server can be installed independently of each other at any time.
What is a better way to import certs between the clients and the server without the user having to perform some kind of out-of-band manual steps?
PS: The PKI tool I am using can only import/export certificates on the local machine. Assume I can't change this tool at this time.
In general, this is one of the problems with PKI: it is a pain to securely distribute certificates in an automated fashion.
In an Active Directory domain environment you already have a Kerberos trust in place, so you can use Group Policy to securely distribute certificates automatically. I don't know if that applies to you, because you haven't given any information about your environment/OS, etc.