What is Apache's OpenSSL, and why does one server have it but another doesn't?

I'm working with two servers: one on localhost and one on the web. Both are HTTP; I don't have an SSL certificate installed on either.
When I try to make a curl request to an HTTPS URL (in this case the Facebook API), one of the servers works and the other doesn't. The curl error is "SSL certificate problem: unable to get local issuer certificate." Upon investigation, I noticed that $_SERVER["SERVER_SOFTWARE"] outputs something different on the two servers.
Server 1, which works with curl to HTTPS:
$_SERVER["SERVER_SOFTWARE"] = Apache/2.4.10 (Win32) OpenSSL/1.0.1i PHP/5.6.3
Server 2, which doesn't work with curl to HTTPS:
$_SERVER["SERVER_SOFTWARE"] = Apache
I'm guessing that the second server's lack of any mention of OpenSSL may have something to do with the error? Is that possible? What would I need to do to get OpenSSL on that server? And why would the first server be able to "find the issuer certificate" when I don't have an SSL cert installed on it?

Since you are making a request with curl to an external server, the problem is completely unrelated to the web server software you are running locally; you don't even need to run a local web server at all. It only depends on the certificate the external server sends back to curl and on whether the necessary root CA can be found in curl's trust store.
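For illustration, here is a minimal Python sketch of the same kind of check curl performs; the hostname graph.facebook.com is only an assumption based on "the Facebook API". If the root CA for the server's chain is missing from the local trust store, the handshake fails with a verification error regardless of what web server (if any) runs locally.

import socket
import ssl

# Uses the system trust store; load_verify_locations() could instead point
# at an explicit CA bundle, e.g. one exported for use with curl.
context = ssl.create_default_context()

with socket.create_connection(("graph.facebook.com", 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname="graph.facebook.com") as tls:
        # Reaching this point means the chain verified against the trust store.
        print("issuer:", tls.getpeercert()["issuer"])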

Related

Using and then removing self-signed certificate localhost

Problem Background:
As part of the Computer Networking course assignment, I have been given the task of implementing a proxy server (using Python's socket and ssl modules) that handles HTTPS communication between the browser and the origin server (the real server that my browser wants to talk to).
What I have done so far:
I have implemented the above requirement using SSL sockets and also generated self-signed 'cert.pem' and 'key.pem' files.
What I need to do:
Now I just need to tell my browser (Chrome 89 on Kubuntu 20.04) to accept this self-signed certificate and then test the working of my proxy server.
Reading this Stack Overflow question, I can see that I have to:
(1) become my own CA, (2) sign my SSL certificate as that CA, and (3) import the CA certificate (not the SSL certificate, which goes onto my server) into Chrome.
My confusion/question:
So if I do this, when I am eventually done with this assignment, how do I reverse all these steps and get my browser back to the state it was in before I made these changes? Also, how do I reverse the "become your own CA" step and delete the SSL certificates signed by my CA?
Basically, I want my system to return to the state it was in before I made any of these changes.
UPDATE:
I have done the previously outlined steps, but now I get an error.
Here is a snippet of my code:
from socket import socket, AF_INET, SOCK_STREAM, SOL_SOCKET, SO_REUSEADDR
import ssl

serv_socket = socket(AF_INET, SOCK_STREAM)
serv_socket.setsockopt(SOL_SOCKET, SO_REUSEADDR, 1)  # set before bind() so it takes effect
serv_socket.bind(('', serv_port))                    # serv_port is defined elsewhere
context = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
context.load_cert_chain('cert.pem', 'key.pem')       # returns None, so don't reassign context
context.set_ciphers('EECDH+AESGCM:EDH+AESGCM:AES256+EECDH:AES256+EDH')
serv_socket.listen(10)
socket_to_browser, addr = serv_socket.accept()
conn_socket_to_browser = context.wrap_socket(socket_to_browser, server_side=True)
At the last line, conn_socket_to_browser = context.wrap_socket(socket_to_browser, server_side=True), an exception is thrown: [SSL: HTTPS_PROXY_REQUEST] https proxy request (_ssl.c:1123)
What am I doing wrong?
As glamorous as "becoming your own CA" sounds, with openssl it basically comes down to creating a self-signed certificate and a directory where some CA-specific configuration is stored (I don't fully remember the specifics, but I think it was just some files related to CNs and serial numbers). So reversing the "become your own CA" step is something as mundane as deleting that directory along with the private key and self-signed certificate you were using for the CA. That's it; the CA is no more.
And to return Chrome to its previous state, you would just go to the CA list where you added the CA certificate, select it, and delete it. Chrome will then stop accepting certificates signed by your CA.
Regarding your new problem: in my opinion, you have developed a kind of reverse proxy (meaning that you expect normal HTTPS requests, which you then forward to the real server) but you have configured Chrome to use it as a forward proxy. In that case, Chrome does not send it a normal HTTPS request; it sends a special non-encrypted CONNECT command, and only after receiving the non-encrypted response does it negotiate the TLS connection. That's why OpenSSL says "https proxy request": it has detected an HTTPS proxy request (a CONNECT command) instead of the expected TLS handshake.
You can take a look at How can a Python proxy server (using SSL socket) pretend to be an HTTPS server and specify my own keys to get decrypted data?
It's Python, but I think you'll get the idea.
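For a rough idea of what handling that CONNECT step usually looks like, here is a minimal, self-contained sketch (the listening port 8888 and the buffer size are just assumptions, not taken from your code): read the plaintext CONNECT request first, acknowledge it, and only then wrap the browser socket in TLS.

import socket
import ssl

# Assumptions: cert.pem/key.pem exist and the proxy listens on port 8888.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain('cert.pem', 'key.pem')

serv_socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
serv_socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
serv_socket.bind(('', 8888))
serv_socket.listen(10)

socket_to_browser, addr = serv_socket.accept()

# 1. A browser using a forward proxy first sends a plaintext CONNECT request.
request = socket_to_browser.recv(4096).decode('ascii', errors='replace')
if request.startswith('CONNECT'):
    target = request.split()[1]  # e.g. "example.com:443", the origin server to reach later
    # 2. Acknowledge the tunnel in plaintext.
    socket_to_browser.sendall(b'HTTP/1.1 200 Connection Established\r\n\r\n')
    # 3. Only now does the browser start the TLS handshake, so wrap_socket()
    #    no longer chokes on the CONNECT bytes.
    conn_socket_to_browser = context.wrap_socket(socket_to_browser, server_side=True)

From there, a forward proxy would typically open its own TLS connection to the requested target and relay data in both directions.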

Firefox can find certificate, but curl cannot (while tunneling https through ssh)

Background:
I have an app running on port 8080 on the remote server and an HTTPS ingress proxy on port 443 on the same server, which forwards everything to the app on 8080 after handling the SSL.
What I want to do:
I want to communicate with the app over SSL remotely, while not having direct access to this domain (it is on a local network; I can access the server remotely via a different domain).
What I did:
I tunneled port 443 from my remote server with ssh -L 3001:0.0.0.0:443 user@example.com. I then added 127.0.0.1 example.com to my /etc/hosts to make sure that the domain resolves properly on my system.
Now, what I can do is enter https://example.com:3001/some/thing/ in Firefox and it gets a proper response from the server, with everything running over SSL without any problems. I am also able to use curl without checking the certificate: curl --insecure https://example.com:3001/some/thing works fine.
At the same time, a secure curl call fails: curl https://example.com:3001/some/thing with the error:
curl: (60) SSL certificate problem: unable to get local issuer certificate
More details here: https://curl.haxx.se/docs/sslcerts.html
Just to make sure both are using the same certificates, I actually used this tool: https://curl.haxx.se/docs/mk-ca-bundle.html to create a ca-bundle.crt from the most recent Firefox certificates and passed it to curl with --cacert ca-bundle.crt. No luck, the same error. (I also tried following another curl tutorial on using the local Firefox installation's certs; also no luck.)
Question
What is going on? Why is curl's output different from Firefox's even though I seem to use the same certificates? How can I debug this?
Side note
The real reason I am concerned about this is that with normal (local) access to the server I observed the same behaviour: I could connect to the server over HTTPS through Chrome, but my React Native app could not. I suspect the app uses libcurl under the hood or something similar, and I believe debugging this problem could help me understand what's wrong with the app.
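For what it's worth, the verification curl performs can be reproduced outside of curl. Here is a minimal Python sketch (assuming the tunnel on local port 3001 and the ca-bundle.crt described above) that either prints the issuer or fails with the exact verification error:

import socket
import ssl

# Verify against the same bundle that was passed to curl with --cacert.
context = ssl.create_default_context(cafile='ca-bundle.crt')

with socket.create_connection(('127.0.0.1', 3001), timeout=10) as sock:
    # server_hostname must match the name in the certificate, just as the
    # /etc/hosts entry makes curl and Firefox use example.com.
    with context.wrap_socket(sock, server_hostname='example.com') as tls:
        print('issuer:', tls.getpeercert()['issuer'])

If the bundle really cannot verify what the server sends, wrap_socket() raises an SSLCertVerificationError that says which link in the chain is missing.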

Why are we getting "tls handshake error using curl"?

I'm trying to use curl to access a URL of an app we've developed internally, and on the server I'm seeing
http: TLS handshake error from 1.2.3.4 remote error: tls: unknown certificate authority.
This only happens when we hit the endpoint using curl (inside Git Bash) or wget. When we use IE on Windows it works just fine. I've even tried re-installing Git Bash using the native Windows SSL library (which should be the same one IE uses), but I'm still getting the same error message.
I've even tried downloading the curl-ca-bundle.crt file and saving it in the same place as the curl binary, and even telling curl directly to use this file with the --cacert option, but still no joy.
I've compared the root certs that IE reports with the ones in that curl-ca-bundle.crt, and they look the same (they don't line up exactly, but they have the same text between the BEGIN and END markers; one is just wider on the screen and therefore uses fewer lines, if that makes sense).
Hopefully someone has ideas on what to try next, as two of us have been tearing our hair out all afternoon over this.
wget also gives an error message:
$ wget https://bler.com/admin/user
--2018-09-03 15:53:43-- https://bler.com/admin/user
Connecting to 132.146.1.142:8090... connected.
ERROR: cannot verify oss.dns.networks.bt.com's certificate, issued by 'CN=DigiCert SHA2 Secure Server CA,O=DigiCert Inc,C=US':
Unable to locally verify the issuer's authority.
To connect to oss.dns.networks.bt.com insecurely, use '--no-check-certificate'.
We're using a local proxy server, and HTTP_PROXY is set. It must be using the proxy, as we can see we're hitting the endpoint.
I also had the same error. This problem occurs when you install only the server certificate on your web server (a Golang web server, in my case). You must serve the full certificate chain instead.
For example, Let's Encrypt gives you "cert.pem" and "fullchain.pem". "cert.pem" works in browsers (which can often fetch or cache the missing intermediate certificate), but curl cannot work with this file (curl: (60) SSL certificate problem: unable to get local issuer certificate). "fullchain.pem" works fine in both browsers and curl.
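The same rule applies regardless of the server software. As a sketch in Python terms (an illustration, not the Go setup from the answer; the file names are the ones Let's Encrypt's certbot produces), the server context should be handed the full chain:

import ssl

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)

# Leaf certificate only: browsers may still work, but curl fails with
# "unable to get local issuer certificate".
# context.load_cert_chain(certfile='cert.pem', keyfile='privkey.pem')

# Full chain (leaf + intermediates): works for browsers and curl alike.
context.load_cert_chain(certfile='fullchain.pem', keyfile='privkey.pem')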

Let's Encrypt SSL on Ubuntu 16 Apache server returns 404 error

I have a DigitalOcean droplet where I have hosted a Laravel application (App Url). I added SSL using Tutorial Link, but when I run the application over HTTPS it returns a 404 page not found error. Can anyone check the issue? The config file (assamgas.tk.conf) is below.
I'm seeing two things wrong here.
1) The web server is not redirecting to port 443 (SSL/HTTPS)
2) The certificate is not present.
I could not find any certs through HTTPS on your server.
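One way to check this yourself is a small Python sketch (the hostname is taken from the config file name in the question) that prints whatever certificate, if any, is served on port 443:

import ssl

# Fetches whatever certificate is presented on port 443, without verifying it.
try:
    print(ssl.get_server_certificate(('assamgas.tk', 443)))
except OSError as exc:
    print('No certificate could be retrieved on port 443:', exc)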
I suggest running through the tutorial again, or trying this DigitalOcean tutorial.
Don't generate too many production certificates while you test; use the Let's Encrypt staging server for your testing instead. Once the test certificate is issued correctly, switch over to the Let's Encrypt production server, otherwise you will get rate-limited for a week.

Does cURL to HTTPS URLs require an SSL certificate installed on the server?

I am trying to test a cURL command from my server. The command requests a JSON response from an HTTPS URL.
The cURL command seems to be stuck: there is no response, and it doesn't time out.
However, I have tested it from my local machine and the cURL command works fine.
Do I need an SSL certificate installed on the server for it to be able to send cURL requests to an HTTPS URL?
No, you don't need an SSL cert to send a request like this over HTTPS with curl (unless your server uses two-way SSL, i.e. mutual TLS).
You DO need one at the other end, on the HTTPS server that receives the request from curl.
However, if it's a self-signed cert, it may not be recognised by curl and you may get an error instead of a successful connection.
The fact that the process is hanging suggests a network/connectivity issue rather than an SSL issue. Can you telnet to the machine on port 443, or does that hang too?
telnet www.example.com 443
Should respond.
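If telnet isn't available on the server, a rough equivalent of the same connectivity check (just a sketch, using an example hostname) is:

import socket

# Rough equivalent of "telnet www.example.com 443": checks only whether a
# TCP connection to port 443 can be established at all.
try:
    with socket.create_connection(('www.example.com', 443), timeout=5):
        print('TCP connection to port 443 succeeded')
except OSError as exc:
    print('Connection failed or timed out:', exc)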