Using private_pub with SSL

I have set up private_pub with SSL according to https://github.com/ryanb/private_pub#serving-faye-over-https-with-thin, also adding in daemonize: true (tested with and without).
I can browse to https://mydomain.com:4443/faye.js and that loads.
There are no errors on the page.
However, nothing is actually working, i.e. no real-time events trigger. When trying PrivatePub.publish_to in the console I get:
OpenSSL::SSL::SSLError: SSL_connect returned=1 errno=0 state=SSLv3 read server certificate B: certificate verify failed
When I run the thin server un-daemonized I can see it returns <SSL_incomp> when trying to publish_to.
The SSL on the server is working correctly, so how do I go about fixing this?

I managed to solve this by appending the contents of the ca-bundle to the crt file specified in the thin config.
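In concrete terms (the filenames are placeholders, and the next answer walks through the same idea in more detail):
# serve the full chain by appending the CA bundle to the certificate thin uses
cat mydomain.com.crt mydomain.com.ca-bundle > mydomain.com-chained.crt
Then point ssl_cert_file in the thin config at the combined mydomain.com-chained.crt file.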

Here is the proper approach to resolve this issue.
When you use only the yourdomain.crt file, private_pub won't work while it's doing the SSL handshake with the Rails server.
Your SSL certificate provider will have given you either an intermediate.crt or a ca-bundle file.
Just do the following.
If you have a ca-bundle file provided by the CA:
*cat yourdomain.crt whatever.ca-bundle > yourdomainfinal.crt*
If you have an intermediate certificate:
*cat yourdomain.crt intermediate.crt > yourdomainfinal.crt*
Then point the server's SSL settings at yourdomainfinal.crt and your private key yourdomain.key when running the server.
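You can sanity-check that the concatenation gives a complete chain before wiring it into the server config; a quick sketch, assuming your system trust store already contains the root and using the placeholder filenames from above:
# builds the chain: leaf cert, intermediates from the bundle, root from the system store
openssl verify -untrusted whatever.ca-bundle yourdomain.crt
If this prints "yourdomain.crt: OK", the concatenated yourdomainfinal.crt should satisfy clients that previously failed with "certificate verify failed".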
Here is the config block for the thin server:
---
chdir: "/home/your/project/path"
environment: "your environment"
timeout: 30
log: "/home/your/project/path/log/thin.log"
pid: /home/your/project/path/tmp/pids/thin.pid
max_conns: 1024
require: []
max_persistent_conns: 1000
wait: 30
threadpool_size: 20
servers: 1
threaded: true
socket: /tmp/thin.sock
ssl: true
ssl_key_file: /home/your/project/path/ssl/yourdomain.key
ssl_cert_file: /home/your/project/path/ssl/yourdomainfinal.crt
For private_pub
To use private_pub over SSL, use the configuration below in private_pub_thin.yml:
---
port: 4443
ssl: true
ssl_key_file: /path/to/yourdomain.key
ssl_cert_file: /path/to/yourdomainfinal.crt
environment: "your environment"
rackup: private_pub.ru
And then run the server with the following command
*thin -C config/private_pub_thin.yml start*
If you are using bundler, please don't forget to use
*RAILS_ENV="your environment" bundle exec thin -C config/private_pub_thin.yml start*
The above command is important when you are using bundler; if you skip it, private_pub will start and the server will run without issues, but it won't publish messages. That's what I observed.
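A quick way to confirm publishing works end to end is to trigger it from a one-off runner in the app directory (the channel name here is just an example; use one your pages actually subscribe to):
RAILS_ENV="your environment" bundle exec rails runner 'PrivatePub.publish_to("/messages/new", text: "ssl test")'
If this raises the OpenSSL verify error again, the Faye thin instance is still serving an incomplete certificate chain.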
Also note: please check whether port 4443 is allowed in your server's firewall settings using **sudo ufw status**.
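If the port turns out to be blocked, opening it (assuming ufw is the firewall in use) is just:
sudo ufw allow 4443/tcp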
That's it! If you followed all the steps above, you should have private_pub working over SSL in production or UAT.

Related

Can't connect Filebeat to Logstash

I am new to elasticsearch and I am following the tutorial here:
I have hit a stumbling block, as I can't connect the server with the ELK stack configured to the server that is logging activity via Filebeat.
I have narrowed it down to an issue with the SSL certificates copied from the ELK server, as when I check /var/log/messages I get the following error:
usr/bin/filebeat[13730]: transport.go:125: SSL client failed to
connect with: x509: certificate signed by unknown authority (possibly
because of "crypto/rsa: verification error" while trying to verify
candidate authority certificate "serial:16193853809450343771")
However, the keys have been copied over and these files are the same on both servers:
cat /etc/pki/tls/certs/logstash-forwarder.crt
When I try to read the syslogs, I get the following message :
sudo tail /var/log/syslog | grep filebeat:
tail: cannot open ‘/var/log/syslog’ for reading: No such file or directory.
I would appreciate any pointers on this.
I found a similar issue on the Elastic forum at the following link.
In summary, you should add to your Filebeat config:
insecure: true
And then see if you manage to connect. If you do, you can use these guidelines for how to configure your SSL connection.
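Before (or instead of) turning verification off, it is worth checking whether the copied certificate actually verifies the Logstash endpoint. A quick diagnostic from the Filebeat host, assuming the common Beats input port 5044 (adjust the host and port to yours):
# "Verify return code: 0 (ok)" means the copied file really does vouch for the Logstash cert
echo | openssl s_client -connect elk-server:5044 -CAfile /etc/pki/tls/certs/logstash-forwarder.crt 2>/dev/null | grep "Verify return code"
Anything else points at a mismatch between the certificate Logstash serves and the one that was copied over.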

How do I have Apache2 httpd use the ubuntu's CA cert for outbound SSL connections from Apache?

Note this is not a question about having apache accept inbound SSL connections.
I have an apache module that needs to make outbound SSL connections. When it attempts to, it gets this error:
Failed to send events: The OpenSSL library reported an error: error:14090086:SSL routines:ssl3_get_server_certificate:certificate verify failed:s3_clnt.c:1269:
This indicates that the SSL library Apache is using doesn't know about the (valid) certificate of the server my module is trying to connect to.
The CA store on my Ubuntu system where this is running is fine and knows about this downstream cert; openssl s_client tells me everything is OK.
How do I tell Apache2 to use Ubuntu's system CA certs so that outbound connections work?
update - I did an strace -e open httpd -X to see where it was trying to load certificates from. I see apache opening libssl.so, but then I don't see it even trying to open up the usual ssl.cnf or any certificates file.
snipped useless strace output
update2: As to how I'm creating the https request - I'm making the request from inside my custom apache module. My module .so is written in Rust, so the connection code looks basically like:
in mod_mine.so:
use hyper::Client;
use hyper_tls::HttpsConnector;
use tokio_core::reactor::Core;
let mut core = Core::new()?;
let handle = core.handle();
let client = Client::configure()
.connector(HttpsConnector::new(4, &handle)?)
.build(&handle);
//actually a POST, but this gets the same error
let request = client.get("https://saas.mycompany.io".parse()?);
let result = core.run(request)?;
... //process result
I found a solution that works, though I'm not sure it is optimal.
OpenSSL takes the environment variable SSL_CERT_FILE. I can set this in my Apache module source code.
use std::env;
let cert_file = figure_out_cert_path(); //on ubuntu: /etc/ssl/certs/ca-certificates.crt
env::set_var("SSL_CERT_FILE", cert_file);
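If you would rather not hard-code this in the module, the same variable can be set in Apache's environment instead. A sketch assuming a stock Ubuntu apache2 layout, where /etc/apache2/envvars is sourced when the service starts (paths may differ on other distros):
# make OpenSSL inside httpd pick up the system CA bundle
echo 'export SSL_CERT_FILE=/etc/ssl/certs/ca-certificates.crt' | sudo tee -a /etc/apache2/envvars
sudo systemctl restart apache2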

Chef ssl validation failure

I have one Chef server, version 12.0.1, and can connect Linux (RHEL/CentOS) systems to it with knife bootstrap, but I cannot with Windows, and locally on my RHEL client knife ssl check fails.
I have two problems but I think they are both related.
Problem 1 - knife ssl check fails:
Connecting to host chef-server:443
ERROR: The SSL certificate of chef-server could not be verified
Problem 2 - bootstrap windows server fails:
ERROR: SSL Validation failure connecting to host: chef-server - SSL_connect returned=1 errno=0 state=SSLv3 read server certificate B: certificate verify failed
Chef encountered an error attempting to create the client "desktop"
I have tried a number of things:
1) knife ssl fetch - no changes
2) I have a signed DigiCert crt on the server which is accepted by the management console and the Chrome web browser
3) I have set this in chef-server.rb:
nginx['ssl_certificate'] = "/var/opt/opscode/nginx/ca/hostname.crt"
nginx['ssl_certificate_key'] = "/var/opt/opscode/nginx/ca/hostname.key"
which point to the signed certs.
Anything else I should be trying or am I being a plank?
Try running these commands on your Chef server:
mkdir /root/.chef/trusted_certs
cp /var/opt/chef-server/nginx/ca/YOUR_SERVER'S_HOSTNAME.crt /root/.chef/trusted_certs/
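After copying the cert you can confirm it is picked up; assuming chef_server_url in your knife.rb points at the same hostname, this should now pass:
knife ssl check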
I was having the same problem and it was fixed after I looked through this article, and tried out the steps it gave: http://jtimberman.housepub.org/blog/2014/12/11/chef-12-fix-untrusted-self-sign-certs/
I was having the same issue using a valid wildcard certificate, although it was Linux rather than Windows. It looks like the issue is that chef-client uses OpenSSL and didn't have the CA and root certificates. I was getting errors when I ran the following from the chef client server:
openssl s_client -connect chef_server_url:443 -showcerts
I solved my issue by browsing to the chef server, inspecting the certs and exporting each cert in the chain to a single file, ordered with the issued certificate at the top, and the root at the bottom. I then used this bundled-cert as the certificate file in the chef server config file and reconfigured chef.
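If you would rather script that export than click through a browser, something along these lines works (the hostname is a placeholder); note that servers usually send the leaf and intermediates but not the root, so you may still need to append the root CA manually:
# dump every PEM block the server presents, leaf first, into one bundle file
echo | openssl s_client -connect chefserver.example.com:443 -showcerts 2>/dev/null | awk '/BEGIN CERTIFICATE/,/END CERTIFICATE/' > chefserver-bundle.crt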

Chef SSL verification failed while setting workstation

I am setting up a Chef workstation by configuring knife.rb with the "knife configure -i" command. After PROPERLY answering all questions, I get the following error:
ERROR: SSL Validation failure connecting to host: 172.xx.x.xx - SSL_connect returned=1 errno=0 state=SSLv3 read server certificate B: certificate verify failed
ERROR: SSL_connect returned=1 errno=0 state=SSLv3 read server certificate B: certificate verify failed
My goal is to disable this SSL certificate verification forever and use the knife utility to bootstrap all my nodes.
I had the same issue running chef-client after upgrading to version 12.x. Steps to solve:
Pull the crt from the server. Run on the node:
knife ssl fetch -s https://yourchefserver01.com:443
Note: If fetch doesn't work, copy yourchefserver01.com:/var/opt/chef-server/nginx/ca/yourchefserver01.com.crt to client:/root/.chef/trusted_certs/yourchefserver01.com.crt (a scp sketch follows these steps).
Verify it pulled:
knife ssl check -s https://yourchefserver01.com:443
export SSL_CERT_FILE="/root/.chef/trusted_certs/yourchefserver01.com.crt"
Run chef-client
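A minimal sketch of that manual copy from the note above, using the same placeholder hostnames and paths:
# run on the node/workstation; assumes root SSH access to the chef server
scp root@yourchefserver01.com:/var/opt/chef-server/nginx/ca/yourchefserver01.com.crt /root/.chef/trusted_certs/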
Your problem is the validation of the chef server certificate.
Install a proper certificate on the chef server
or add your chef server certificate (located in /etc/chef-server/hostname.crt) to your workstation cacert.pem (located by default in <install path>/opscode/chef/embedded/ssl/certs).
With Chef 12 you'll have to distribute it to your nodes too, so they can validate the Chef API server, or you'll get a warning about it at the start of each chef-client run.
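A sketch of that append, run on the workstation after copying hostname.crt over from the Chef server; the /opt/chef path is an assumption for a Linux Omnibus install, so substitute your actual install path from above:
# keep a backup of the bundled CA file, then append the chef server cert to it
cp /opt/chef/embedded/ssl/certs/cacert.pem /opt/chef/embedded/ssl/certs/cacert.pem.bak
cat hostname.crt >> /opt/chef/embedded/ssl/certs/cacert.pem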
The issue seems to be with the .pem validator; your validation key is misconfigured. Try creating a new validation key from the Chef server and placing it on the node.
If you are running Chef Server on-premise, it will be easier in the long run to install a third-party SSL cert, e.g. VeriSign, on the Chef Server (or load balancer). chef-client and knife come with OpenSSL, which will trust a valid third-party cert automatically with no configuration required on each node.
Please don't turn off SSL cert validation. SSL validation is additional protection that the server you are trusting with root access to your Chef nodes is the real Chef server, not a man-in-the-middle attack.

Why does Chef throw SSL error when using knife Command on Chef-Workstation?

An SSL error occurs when we use the knife command to verify successful setup of the Chef-Workstation or when we try to upload a Chef-Cookbook. Using the following commands:
knife client list
knife node list
knife cookbook upload cookbookname
we get the following error on the Chef-Workstation:
OpenSSL::SSL::SSLError: SSL_connect returned=1 errno=0 state=SSLv2/v3 read server hello A: unknown protocol
To resolve this error we tried using rackfile software to create the following 3 files:
hostname.key
hostname.pem
hostname.crt
on the Chef-Server.
We placed hostname.pem inside the chef folder on the server itself and inside the certs folder on the workstation. Finally we tried to run the commands once again but did not succeed. Any help to resolve the SSL error would be sincerely appreciated.
The Chef Server certificate has not yet been pulled into the workstation's trusted_certs directory.
Run the command
knife ssl fetch
from your Chef Workstation.
This will pull the certificate from the Chef Server and place it in the Workstation's trusted_certs directory. The default location of the trusted_certs is in your .chef/trusted_certs directory within your chef-repo directory.
Then run
knife ssl check
to verify the certificate.
Certificates that are in the trusted_certs directory will be trusted by any execution of the knife command.
https://docs.chef.io/workstation/getting_started/#get-ssl-certificates
You need to register that certificate on each workstation. Also, make sure the certificate matches the correct URL (i.e. the API endpoint, not the web interface).
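A quick way to check what name the served certificate actually carries, so you can compare it against the host part of chef_server_url in knife.rb (the hostname below is a placeholder):
# prints the subject of the certificate the server presents on port 443
echo | openssl s_client -connect chefserver.example.com:443 2>/dev/null | openssl x509 -noout -subject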