SSL Certificate apache2 - certificate trust issues when website loads - apache

So I started working on generating the certificate for the website that I'm working on. I installed apache2 and looked into a few sites - drissamri.be, linode.com and akadia.com - and some SO questions. From these sites, I was able to generate the certificate: the crt, csr and key files. After completing all the steps mentioned in these sites, the website opens in the browser, but the 'https://' gets a red slash through it.
What am I missing here?
Also, when I open the website on one machine it opens with the following view, and when I open it on another machine the website doesn't open at all, showing the error - Your connection is not private.
For reference, here's the screenshot:
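For context, the guides listed above generally boil down to something like the following self-signed setup; the file names, paths and validity period here are placeholders of mine, not necessarily what those guides use:

# Generate a private key and a CSR, then self-sign the CSR for one year
openssl genrsa -out server.key 2048
openssl req -new -key server.key -out server.csr
openssl x509 -req -days 365 -in server.csr -signkey server.key -out server.crt

# Point the Apache SSL virtual host at the pair, e.g. in
# /etc/apache2/sites-available/default-ssl.conf:
#     SSLEngine on
#     SSLCertificateFile    /etc/apache2/ssl/server.crt
#     SSLCertificateKeyFile /etc/apache2/ssl/server.key

If the certificate is self-signed like this, the red slash through 'https://' is expected behaviour: the connection is encrypted, but the browser does not trust the issuer, and every new machine will warn ('Your connection is not private') until the certificate is imported into that machine's trust store or replaced with one issued by a trusted CA.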

SSL error on Magento 2 Sign In for marketplace

I am posting this question on SO instead of ServerFault, because all my previous efforts to get Magento 2 issues sorted out ended up with me hacking some code or other in the Magento or template source.
I have configured a basic install of Magento 2 with a theme for a client.
Magento is running on IIS on Windows (not WAMP) - shared IIS hosting on Windows, on my own server.
I configured the shop to use SSL, and the complete shop runs over SSL without any issues.
However, when trying to use the market place, I get a weird SSL issue:
"SSL certificate problem: unable to get local issuer certificate"
This error is shown on the Magento shop (which is currently running over SSL) when trying to sign in to the marketplace.
I have found lots of hits on this issue, but all answers seem to lead to a self-signed certificate that isn't trusted, or to adding intermediate and/or root certificates. And all of it is based on XAMPP, WAMP or native 'nix installations.
I do not understand what the exact issue is. I also do not know how to troubleshoot this further as the error description is very vague.
I would appreciate some feedback.
Thanks
This error happens because cURL cannot find a cacert.pem file from which to take the trusted root certificates.
There are some ways to set this file in cURL:
• Pass the cacert.pem file path directly to cURL when making the call;
• Set the path to the cacert.pem file in php.ini (sketched below).
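A minimal sketch of the php.ini approach, assuming PHP on Windows; the path is a placeholder for wherever you keep the CA bundle (for example the one published at https://curl.se/docs/caextract.html):

; php.ini - point both cURL and OpenSSL at the CA bundle
curl.cainfo = "C:\php\extras\ssl\cacert.pem"
openssl.cafile = "C:\php\extras\ssl\cacert.pem"

Restart IIS (or the PHP FastCGI process) after changing php.ini so the new setting is picked up.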
You could follow the posts below:
• https://serverfault.com/questions/633644/adding-a-self-signed-cert-to-the-trusted-certs-within-curl-in-windows
• https://magento.stackexchange.com/questions/97036/magento-component-manager-ssl-certificate-problem-unable-to-get-local-issuer-c
• https://mage2.pro/t/topic/988
Regards,
Jalpa.

Recently can't connect to my NAS via HTTPS

I have a Synology NAS DiskStation DS2415+. When I bought it several years ago, I followed the setup instructions and created a self-signed certificate, which worked and even allowed me to connect to my NAS remotely via HTTPS.
Recently I changed some settings following Synology's "Security Advisor", an automatic tool which scans all settings and recommends changes to secure the NAS.
Following the recommendations of the said tool, I made the required changes, mostly in the Network Settings and Security Settings, but now I can't use QuickConnect without getting a warning. In case any of you is familiar with this issue, I do hope there is a way to use HTTPS and not HTTP, either with a self-signed certificate or a purchased one. When I inquired about purchasing a certificate, I was told that it would be impossible to use one without a dedicated domain, but that's a side issue, because originally my NAS worked and was remotely accessed via a self-signed certificate.
I managed to fix it by the following steps:
Creating a self-signed certificate (done in 2 steps):
1. Go to Control Panel -> Security -> Certificate -> CSR. Generate the CSR and download it. Use your user name as the Common Name.
2. Go to Certificate -> CSR again, and this time select Sign Certificate Signing Request. You will then be asked to select the .csr file from before, and as a result the signed certificate will be downloaded.
Then go to your browser and import this certificate. (Thanks @Matt Clark!)
In my case, it only worked after going to Chrome -> Settings -> Advanced and selecting Reset Settings to Default.
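If you want to sanity-check the certificate the NAS hands back before importing it, something like this works from any machine with OpenSSL installed (the file name is a placeholder for whatever the download is called):

# Show the certificate's subject, issuer and validity window
openssl x509 -in synology_cert.crt -noout -subject -issuer -dates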
I can now connect to my NAS using QuickConnect and using HTTPS.

Create an HTTPS server app using its own certificate

We are developing a local server app (written in nodejs for now), used by our web site to manipulate local files and folders (browse, upload, download...).
Basically, the customer installs the nodejs app, which starts a local server listening on 127.0.0.1.
Then, when (for instance) a list of local folders is needed on the web site, a JS script queries the local server, which returns the local folders, and they are displayed on the web site.
The problem is that when the web site is configured in HTTPS, the web site's JS refuses to communicate with the HTTP-non-S nodejs app.
We are exploring various options:
• using self-signed certificates deployed with the app, and trusting them on the machine during install (see the sketch after this list), but I feel there will be a LOT of times when it won't work;
• using "proper" certificates for local.example.com, with a DNS entry where local.example.com points to 127.0.0.1 - but it seems that distributing private keys to the general public is prohibited by the terms of service of most (if not all) certificate authorities.
Now I thought of maybe another means. Can a "packaged" HTTPS server (written in any language, I don't care), "living" inside an exe file which is signed with a proper SSL certificate, use the certificate of the app?
I'm not sure if I'm making any sense, I don't know certificates very well...
Thanks!
We ended up adding a self-signed root CA using certutil:
certutil.exe -user -addstore Root "mycert\rootca.cer"
Since we're adding a root CA, it generates a warning popup that the user has to accept, but it has been deemed acceptable by the powers that be.
There is a "check config" screen that can try to add the certificate again if it hasn't been properly added the first time.
There is a case when the group policies (GPO) prevent trusting self-signed certificates. In this case, certutil has a return code of 0 (the certificate is added) but the root CA is not trusted, so the local server does not work. So, after install, we have to check that the certificate is trusted using:
certutil.exe -user -verifystore Root xxx
(xxx being the certificate serial number). This command does not exit with an error even if the certificate is untrusted, so we parse the output for CERT_TRUST_IS_UNTRUSTED_ROOT or 0x800b0109.
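As a sketch, the post-install check ends up looking something like this batch fragment; the serial number is a placeholder, and the strings we match are the ones from the certutil output mentioned above:

rem Verify the root CA is actually trusted, not just present in the store
certutil.exe -user -verifystore Root 1234abcd > verify_out.txt
findstr /C:"CERT_TRUST_IS_UNTRUSTED_ROOT" /C:"0x800b0109" verify_out.txt >nul
if %errorlevel%==0 (echo Root CA present but NOT trusted - GPO?) else (echo Root CA trusted)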

New SHA-2 Certificate Key on Domino 9.0.1 not loading

My old Live system (Domino 8.5.3 / Windows 2003) is out in the DMZ and needs to be upgraded to a SHA-2 certificate. So we have built a new Test server (Domino 9.0.1 FP6 / Windows 2008), also out in the DMZ, to move the site to.
I copied the entire Data directory from the Live over the top of the Test 9.0.1 folder to bring across all the databases and jQuery files etc...
I then followed this procedure to create the new certificate:
https://www-10.lotus.com/ldd/dominowiki.nsf/dx/3rd_Party_SHA-2_with_OpenSSL_and_kyrtool?open
I used the procedure to generate a new CSR which we sent to GoDaddy to have them reKey the SHA-2 for the new Test system.
They returned two CRT files:
1) gd_bundle-g2-g1.crt - I believe this holds the root and intermediate certificates, but I only found two certificates in it.
2) 8e0702e83bd035e9.crt - This holds the site certificate.
I extracted the two GoDaddy certificates:
godaddy_root_Base64_x509.cer
GoDaddy_Secure_CA-G2_Base64_X509.cer
Then used the following command to join them all together:
type server.key 8e0702e83bd035e9.crt GoDaddy_Secure_CA-G2_Base64_X509.cer godaddy_root_Base64_x509.cer > hbcln04_server.txt
I followed all the steps in the procedure above. The only difference is that the procedure shows 2 intermediate certificates, but GoDaddy only sent me one.
But, I was able to verify both the Keys and the Certificates as the procedure said.
There were no errors in the process.
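For reference, the verification and import steps from the linked procedure look roughly like this; the notes.ini path and file names are placeholders from my setup:

REM Sanity-check the concatenated key + certificate chain, then import it
kyrtool =C:\Domino\notes.ini verify C:\ssl\hbcln04_server.txt
kyrtool =C:\Domino\notes.ini import all -k C:\Domino\Data\hbcln04_server.kyr -i C:\ssl\hbcln04_server.txt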
I put the new kyr file down in the Data directory with the others and then went to the Website document and changed the reference there to the new kyr filename.
Note, this is a Website document not the Server document.
I even went to the Server document and followed a procedure to disable and enable the Website documents, just in case the path to the keyring.kyr file was corrupted.
However, because the new Test box is in the DMZ it is very difficult to test.
So, I have modified the server's Hosts file to map the certificate's domain back to the same box. (Otherwise DNS would keep taking it back to the Live system.)
There is a question as to whether mapping the domain to the IP of the Test box will work with HTTPS. But, I don't see why not.
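For what it's worth, certificate validation is done against the host name in the URL, not against the resolved IP address, so a Hosts override like the one below (the IP and domain are placeholders for the Test box and the site) behaves the same as real DNS for HTTPS purposes:

# C:\Windows\System32\drivers\etc\hosts on the machine doing the testing
10.0.0.44    www.example.com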
But no matter what I do, I can't get the certificate to take hold.
I put in the URL for the site and if it is HTTP it works. But as soon as I change it to HTTPS I get this:
This page can’t be displayed
• Make sure the web address https:_Link_to_site is correct.
• Look for the page with your search engine.
• Refresh the page in a few minutes.
I then refresh the page and I get this:
This page can’t be displayed
Turn on TLS 1.0, TLS 1.1, and TLS 1.2 in Advanced settings and try connecting to https:_Link_to_site again. If this error persists, it is possible that this site uses an unsupported protocol or cipher suite such as RC4 (link for the details), which is not considered secure. Please contact your site administrator.
Well unfortunately, I'm the site administrator!
The only things I have seen that differ from the procedure are:
1) that I only had 1 intermediate cert and not 2 as in the example.
2) I'm using a Hosts file to map the domain to the server so it doesn't follow its usual DNS.
Also note that there are no errors in the log. We did have a few earlier around access to the key files - the kyr file was fine, but the sth file had restricted access. This has been corrected now.
At the moment, I don't know where to even look for an error or what to turn on to see the error.
It seems the certificate just doesn't load.
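One way to see what (if anything) the server is actually presenting on port 443, assuming OpenSSL is available on a machine that can reach the Test box (the host name is a placeholder):

# Dump the certificate chain and negotiated protocol for the site
openssl s_client -connect www.example.com:443 -servername www.example.com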
Please help.

Why does Internet Explorer cache expired SSL certificates (and can I do anything about it)?

I am using a Debian/Apache web server with up-to-date software and an SSL certificate to encrypt the communication via HTTPS. In February the old certificate expired and I got a new one (CA GeoTrust via CA RapidSSL), like the one before.
In Firefox (Chrome, ...) everything works fine. But after the old certificate finally expired two weeks later, Internet Explorer says the certificate has expired - leave the page? Apparently the old certificate is stuck in the browser cache and has not been updated since.
And clearing the browser cache alone didn't do it - I actually had to reset the IE settings to make it reload the new certificate. Since it works now, I assume the server delivers the correct certificate. But other users still report the same problem, so it wasn't my browser alone.
My best guess is that something in the old certificate or my caching headers told IE to store the certificate for a long while. But I have no clue how to solve this - or even what to change so I don't run into the same problem again next year.
Thanks for any ideas!
BurninLeo
I had a similar problem. In fact, it is IE under XP that doesn't support several HTTPS subdomains on a single IP address (no SNI support):
http://nginx.org/en/docs/http/configuring_https_servers.html#sni
So if you also have several domains or subdomains on the same IP, you can't solve this on XP/IE; you can only choose which certificate XP/IE gets, and it will be the same one for all subdomains.
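To illustrate what SNI enables (and what XP-era IE can't do), a name-based HTTPS setup in nginx looks roughly like this; the domains and certificate paths are placeholders:

# Two certificates on one IP - the client's SNI host name picks the server block
server {
    listen 443 ssl;
    server_name www.example.com;
    ssl_certificate www.example.com.crt;
    ssl_certificate_key www.example.com.key;
}
server {
    listen 443 ssl;
    server_name shop.example.com;
    ssl_certificate shop.example.com.crt;
    ssl_certificate_key shop.example.com.key;
}

Without SNI, the client is handed whichever certificate the default server block presents, regardless of the requested host name.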
PiR