Docker private registry | TLS certificate issue

I've tried to follow a tutorial to set up our own private registry (v2) on an AWS CentOS machine.
I've self-signed a TLS certificate and placed it in /etc/docker/certs.d/MACHINE_STATIC_IP:5000/
When trying to log in to the registry (docker login MACHINE_IP:5000) or push a tagged repository (MACHINE_IP:5000/ubuntu:latest) I get the following error:
Error response from daemon: Get https://MACHINE_IP:5000/v1/users/: x509: cannot validate certificate for MACHINE_IP because it doesn't contain any IP SANs
I've searched for an answer for two days, but couldn't find one.
I've set the certificate CN (common name) to MACHINE_STATIC_IP:5000

When using a self-signed TLS certificate, the Docker daemon requires you to add the certificate to its known certificates.
Use the keytool command to grab the certificate:
keytool -printcert -sslserver ${NEXUS_DOMAIN}:${SSL_PORT} -rfc > ${NEXUS_DOMAIN}.crt
And copy it to your client machine's SSL certificates directory (in my case, Ubuntu):
sudo cp ${NEXUS_DOMAIN}.crt /usr/local/share/ca-certificates/${NEXUS_DOMAIN}.crt && sudo update-ca-certificates
Now restart the Docker daemon and you're good to go:
sudo systemctl restart docker
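Note that the error above complains specifically about missing IP SANs: since Docker connects by IP, the certificate must carry that IP in a subjectAltName extension (the CN alone is not checked, and the CN should never include a port). A minimal sketch for regenerating the certificate, using the documentation address 203.0.113.10 as a stand-in for the machine's static IP (`-addext` needs OpenSSL 1.1.1 or newer):

```shell
# Generate a self-signed cert whose SAN contains the registry's IP
# (203.0.113.10 is a placeholder; substitute your machine's static IP).
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -subj "/CN=203.0.113.10" \
  -addext "subjectAltName=IP:203.0.113.10" \
  -keyout registry.key -out registry.crt
# Confirm the IP SAN is present:
openssl x509 -in registry.crt -noout -text | grep -A1 'Subject Alternative Name'
```

The resulting registry.crt is what goes into /etc/docker/certs.d/ (and, for a self-signed cert, into the client's trust store as described above).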

You can also use the following command to temporarily trust the certificate without adding it to your system certificates.
docker --tlscert <the downloaded tls cert> pull <whatever you want to pull>

Related

Troubleshooting - Setting up private GitLab server and connecting Gitlab Runners

I have a GitLab instance running in Docker on a dedicated private server (accessible only from within our VPC). We want to start doing CI using GitLab runners, so I spun up another server to host our runners.
Now that GitLab Runner has been configured, I try to register a runner with the private IP of the GitLab server and the registration token:
Enter the GitLab instance URL (for example, https://gitlab.com/):
$GITLAB_PRIVATE_IP
Enter the registration token:
$TOKEN
Enter a description for the runner:
[BEG-GITLAB-RUNNER]: default
Enter tags for the runner (comma-separated):
default
ERROR: Registering runner... failed runner=m616FJy- status=couldn't execute POST against https://$GITLAB_PRIVATE_IP/api/v4/runners: Post "https://$GITLAB_PRIVATE_IP/api/v4/runners": x509: certificate has expired or is not yet valid: current time 2022-02-06T20:00:35Z is after 2021-12-24T04:54:28Z
It looks like our certs have expired; to verify:
echo | openssl s_client -showcerts -connect $GITLAB_PRIVATE_IP:443 2>&1 | openssl x509 -noout -dates
notBefore=Nov 24 04:54:28 2021 GMT
notAfter=Dec 24 04:54:28 2021 GMT
GitLab ships with Let's Encrypt support, so I decided to enable Let's Encrypt and cert auto-renewal in the GitLab configuration; however, when I try to reconfigure I get the error message:
There was an error running gitlab-ctl reconfigure:
letsencrypt_certificate[$GITLAB_PRIVATE_IP] (letsencrypt::http_authorization line 6) had an error: Acme::Client::Error::RejectedIdentifier: acme_certificate[staging] (/opt/gitlab/embedded/cookbooks/cache/cookbooks/letsencrypt/resources/certificate.rb line 41) had an error: Acme::Client::Error::RejectedIdentifier: Error creating new order :: Cannot issue for "$GITLAB_PRIVATE_IP": The ACME server can not issue a certificate for an IP address
So it looks like I can't use the Let's Encrypt option packaged with GitLab to renew the certs, since the ACME server won't issue certificates for an IP address.
How can I create/renew SSL certs on a private Linux server without a domain?
If you've set up GitLab + runners on private servers, what does your Rails configuration look like?
Is there a way to enable DNS on a private server for the sole purpose of a certificate authority granting certs?
I would suggest using a self-signed certificate. I have tested this before and it works fine, but it requires some work. I will try to summarize the steps needed:
1- generate a self-signed certificate for the domain you choose and make sure to keep it in /etc/gitlab-runner/certs/
2- add the domain and cert paths to /etc/gitlab/gitlab.rb
3- reconfigure GitLab
4- when connecting the runner, make sure to manually copy the certs to the runner server and activate them.
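The steps above can be sketched as follows. gitlab.internal.example is an assumed hostname (it must resolve on both servers, e.g. via /etc/hosts or internal DNS), the /etc/gitlab paths are Omnibus defaults, and `-addext` needs OpenSSL 1.1.1+:

```shell
# 1) Self-signed cert for an assumed internal hostname.
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -subj "/CN=gitlab.internal.example" \
  -addext "subjectAltName=DNS:gitlab.internal.example" \
  -keyout gitlab.internal.example.key -out gitlab.internal.example.crt

# 2) On the GitLab server (commented out; requires root):
#      cp gitlab.internal.example.crt gitlab.internal.example.key /etc/gitlab/ssl/
#    and in /etc/gitlab/gitlab.rb:
#      external_url "https://gitlab.internal.example"
#      letsencrypt['enable'] = false
# 3) sudo gitlab-ctl reconfigure
# 4) On the runner server:
#      sudo cp gitlab.internal.example.crt /etc/gitlab-runner/certs/
```

With the cert in /etc/gitlab-runner/certs/, registering against https://gitlab.internal.example should then pass TLS verification.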

Use certificates from host inside ddev environment to connect a remote system

I'm trying to connect to a remote Elasticsearch cluster that is reachable from the host (Windows 10 Enterprise) system.
I tested the host's connection via curl https://url.to.target:443 and got the expected 'You Know, for Search' response.
When I try the same from inside the web server container (Debian GNU/Linux 10 (buster)), it fails with:
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.haxx.se/docs/sslcerts.html
curl failed to verify the legitimacy of the server and therefore could not
establish a secure connection to it.
Is there a simple way to use the host's certificate store?
Copy yourcert.crt to the .ddev/web-build folder.
Create a custom .ddev/web-build/Dockerfile, for example:
ARG BASE_IMAGE
FROM $BASE_IMAGE
COPY ./yourcert.crt /usr/local/share/ca-certificates/
RUN update-ca-certificates --fresh
When referencing the cert in your code use:
$myCert='/usr/local/share/ca-certificates/yourcert.crt';
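One common failure mode with this setup is an exported certificate that is DER-encoded rather than PEM, which update-ca-certificates will not pick up. A quick sanity check, sketched here with a throwaway self-signed certificate standing in for yourcert.crt:

```shell
# Stand-in for yourcert.crt (replace with your real exported cert).
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=url.to.target" -keyout demo.key -out yourcert.crt 2>/dev/null
# A PEM cert prints its subject; a DER-encoded file fails here with
# "unable to load certificate" (re-export it as PEM, or convert with
# openssl x509 -inform DER before copying it into .ddev/web-build).
openssl x509 -in yourcert.crt -noout -subject
```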
Have you tried adding the insecure option to the .curlrc file in your home dir?
echo insecure >> $HOME/.curlrc
Shouldn't be used in production!

Using SSL with docker containers

I am having trouble with SSL certificates.
I have a server running a service in a Docker container. I installed Caddy and obtained an SSL certificate for the site. Now, from another server, I want to consume the service over HTTPS, but I get:
x509: certificate signed by unknown authority exit status 1
It seems to be a common issue when using Docker + SSL. What should I do? Thanks.
Install the ca-certificates package on the client making the request; if that client runs in a container, minimal base images often ship without any CA root store at all, which produces exactly this "unknown authority" error.
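If the consuming client itself runs in a container, a sketch of installing the root store at image build time (the base images here are examples, not taken from the question):

```dockerfile
# Alpine-based image:
FROM alpine:3.19
RUN apk add --no-cache ca-certificates

# Debian/Ubuntu-based image (alternative):
# FROM debian:bookworm-slim
# RUN apt-get update \
#  && apt-get install -y --no-install-recommends ca-certificates \
#  && rm -rf /var/lib/apt/lists/*
```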

Chef ssl validation failure

I have one Chef server, version 12.0.1. I can connect Linux (RHEL/CentOS) systems to the Chef server with knife bootstrap, but not Windows ones, and locally on my RHEL client knife ssl check fails.
I have two problems, but I think they are related.
Problem 1 - knife ssl check fails:
Connecting to host chef-server:443
ERROR: The SSL certificate of chef-server could not be verified
Problem 2 - bootstrap windows server fails:
ERROR: SSL Validation failure connecting to host: chef-server - SSL_connect returned=1 errno=0 state=SSLv3 read server certificate B: certificate verify failed
Chef encountered an error attempting to create the client "desktop"
I have tried a number of things:
1) knife ssl fetch - no changes
2) I have a signed DigiCert crt on the server which is accepted by the management console and the Chrome web browser
3) I have set this in chef-server.rb:
nginx['ssl_certificate'] = "/var/opt/opscode/nginx/ca/hostname.crt"
nginx['ssl_certificate_key'] = "/var/opt/opscode/nginx/ca/hostname.key"
which point to the signed certs.
Anything else I should be trying or am I being a plank?
Try running these commands on your Chef server:
mkdir /root/.chef/trusted_certs
cp /var/opt/chef-server/nginx/ca/YOUR_SERVER'S_HOSTNAME.crt /root/.chef/trusted_certs/
I was having the same problem and it was fixed after I looked through this article, and tried out the steps it gave: http://jtimberman.housepub.org/blog/2014/12/11/chef-12-fix-untrusted-self-sign-certs/
I was having the same issue with a valid wildcard certificate, although on Linux rather than Windows. It looks like the problem is that the Chef client uses OpenSSL and didn't have the CA and root certificates. I was getting errors when I ran the following from the Chef client server:
openssl s_client -connect chef_server_url:443 -showcerts
I solved my issue by browsing to the Chef server, inspecting the certs, and exporting each cert in the chain to a single file, ordered with the issued certificate at the top and the root at the bottom. I then used this bundled cert as the certificate file in the Chef server config file and reconfigured Chef.
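The browser export above can also be scripted: `openssl s_client -showcerts` prints the whole chain, and an awk range keeps just the PEM blocks. A sketch of the bundling step, demonstrated on two locally generated certificates standing in for the leaf and root (against a live server you would pipe `openssl s_client -connect chef-server:443 -showcerts </dev/null` into the same awk filter; `chef-server` is a placeholder name):

```shell
# Stand-ins for the issued cert and the root (placeholder CNs).
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=leaf.example" -keyout leaf.key -out leaf.crt 2>/dev/null
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=root.example" -keyout root.key -out root.crt 2>/dev/null
# Keep only the PEM certificate blocks, issued cert first, root last.
cat leaf.crt root.crt \
  | awk '/BEGIN CERTIFICATE/,/END CERTIFICATE/' > bundle.crt
grep -c 'BEGIN CERTIFICATE' bundle.crt   # prints 2
```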

How to use wget with ssl certificate

I am using wget in my program to fetch a file over HTTP. I need to add security, so we moved from HTTP to HTTPS.
After changing to HTTPS, how do I perform the wget? I mean, how do I make a trusted connection between the two machines and then perform the wget?
I want to make sure that wget can be performed from certain systems only.
Step 1: SSL Certificates
First things first, if this machine is on the internet and the SSL certificate is signed by a trusted source, there is no need to specify a certificate.
However, if there is a self-signed certificate involved, things get a little more interesting.
For example:
if this machine uses a self-signed certificate, or
if you are on a network with a proxy that re-encrypts all https connections
Then you need to trust the public key of the self-signed certificate. You will need to export the public key as a .CER file. How you got the SSL certificate will determine how you get the public key as a .CER.
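A .CER exported from a Windows certificate dialog is often DER-encoded, while wget's --ca-certificate option expects PEM, so a conversion step may be needed. A sketch using openssl, with a throwaway self-signed certificate standing in for the real exported file (proxy.example is a made-up name):

```shell
# Throwaway cert standing in for the server/proxy certificate.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
  -subj "/CN=proxy.example" -keyout tmp.key -out tmp.pem 2>/dev/null
openssl x509 -in tmp.pem -outform DER -out exported.cer   # like a Windows export
# Convert DER -> PEM so wget can consume it:
openssl x509 -in exported.cer -inform DER -out trusted.pem
grep -q 'BEGIN CERTIFICATE' trusted.pem && echo "PEM ready"
```

Step 2 below then takes trusted.pem as {the_cert_file_path}.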
Once you have the .CER then...
Step 2: Trust the Certificate
I suggest two options:
option one
wget --ca-certificate={the_cert_file_path} https://www.google.com
option two
set the option in ~/.wgetrc:
ca_certificate={the_cert_file_path}
Additional resources
Blog post about this wget and ssl certificates
wget manual
macOS users can use the cert.pem file:
wget --ca-certificate=/etc/ssl/cert.pem
or set in your ~/.wgetrc:
ca_certificate = /etc/ssl/cert.pem
On Linux (at least on my Debian and Ubuntu distributions), you can do the following to install your cert so it is trusted system-wide.
Assuming your certificate is ~/tmp/foo.pem, do the following:
Install the ca-certificates package, if it is not already present, then do the following to install foo.pem:
$ cd ~/tmp
$ chmod 444 foo.pem
$ sudo cp foo.pem /usr/local/share/ca-certificates/foo.crt
$ sudo update-ca-certificates
Once this is done, most apps (including wget, Python and others) should automatically use it when it is required by the remote site.
The only exception I've found has been the Firefox web browser. It has its own private store of certificates, so you need to manually install the cert via its Settings interface if you need it there.
At least this has always worked for me (to install a corporate certificate needed for Internet access into the Linux VMs I create).