I recently had a need to upgrade an old server. The server fulfills a very specific purpose and as such has not been kept up to date. With the recent push for SSL certificates to use SHA-256, I needed to upgrade a few packages.
Short Background
The server is RHEL3 (yes, that is correct).
I downloaded and built OpenSSL 0.9.8q and ensured it was the only instance of OpenSSL on the server (moving the old instance to a backup directory). I then downloaded and built cURL 7.15.5 with ./configure --with-ssl=/usr/local/ssl, pointing --with-ssl at my new OpenSSL install.
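For reference, the build went roughly like this (OpenSSL's ./config defaults to installing under /usr/local/ssl; adjust paths if yours differ):
# build and install OpenSSL 0.9.8q (defaults to /usr/local/ssl)
cd openssl-0.9.8q
./config
make && make install
# build cURL 7.15.5 against the new OpenSSL
cd ../curl-7.15.5
./configure --with-ssl=/usr/local/ssl
make && make install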
Once cURL was built, I tested my connection to the resource that requires SHA-256, and the test was successful.
On to my problem and question
I downloaded httpd 2.0.59 and built it with --enable-ssl and --enable-so, but my tests did not work.
I also tried to download and build httpd 2.0.63, but I had trouble getting 2.0.63 working at all. I then took the mod_ssl built from 2.0.63 and put it into the 2.0.59 directory...no luck there either.
I feel I am missing some element that connects httpd to my newly installed OpenSSL. What do I need to do to ensure mod_ssl is using my new version of OpenSSL on the server?
I understand I am quite a few releases behind with my httpd instance, but again, this is an old server with a specific purpose. My only goal is to get it working with SHA-256, not buy a new server with the latest RHEL, etc.
Thanks for any input/assistance.
Running
./configure --help | grep ssl
gives
--with-ssl=DIR SSL/TLS toolkit (OpenSSL)
So, just like the cURL build, you could try adding that flag.
Assuming you are not going to do the sensible thing and upgrade the OS.
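For the httpd 2.0.59 source tree that would look something like this (reusing the OpenSSL path from your cURL build; untested on my side):
./configure --enable-ssl --enable-so --with-ssl=/usr/local/ssl
make && make install
After that, the OpenSSL version mod_ssl was built against should show up in the startup line in error_log (something like "Apache/2.0.59 (Unix) mod_ssl/2.0.59 OpenSSL/0.9.8q"), which is a quick way to confirm it picked up the new library.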
Related
I'm using Netbeans with automatic upload to server each time I save a file locally. I suddenly started running into this error:
Cannot connect to server xxx.xxx.xxx
(Cause: java.security.cert.CertificateExpiredException: NotAfter: Sat May 30 12:34:56 CEST 2020)
I've checked my server (running Apache with cPanel/WHM on AWS EC2), and all SSL certificates seem to be updated and valid. I can connect to the same server using FileZilla. I'm using FTP with explicit TLS in both FileZilla and NetBeans.
I first got this error on my legacy Netbeans 8.2 installation, so I tried updating to 11.2, but I get the same error. Possibly because it duplicated my settings from 8.2?
(If I connect without encryption, it works.)
Hopefully my own experience with that issue helps you along, though I have not fixed it for myself, yet.
It seems that it is not the server's certificate that is invalid; rather, the root certificate against which it is checked by the Java JRE has expired. See https://www.ssl.com/blogs/addtrust-external-ca-root-expired-may-30-2020/ - These root certificates are normally stored locally with the OS, but some applications come with their own keystore.
And since the JRE apparently does not use the OS's certificate store, this might explain why FileZilla behaves differently.
I tried updating my local Java installation, to no avail. I also tried to find the out-of-date root certificate in the Java configuration, and indeed it is listed there with the "appropriate" valid-until date. But temporarily removing it did not help. No luck there, either.
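For reference, this is roughly how I looked for and removed it with keytool (the cacerts path varies by JDK layout, and the alias below is only a placeholder; take the real alias from the list output):
# list the CA certificates bundled with the JRE (default keystore password is "changeit")
keytool -list -v -keystore "$JAVA_HOME/jre/lib/security/cacerts" -storepass changeit | grep -i -B2 -A8 addtrust
# remove the expired root by its alias (back up cacerts first; replace the alias with whatever the list shows)
keytool -delete -alias addtrustexternalca -keystore "$JAVA_HOME/jre/lib/security/cacerts" -storepass changeit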
I installed youtube-dl on my local machine using curl as mentioned in the official README here.
sudo curl -L https://yt-dl.org/downloads/latest/youtube-dl -o /usr/local/bin/youtube-dl
sudo chmod a+rx /usr/local/bin/youtube-dl
Now when I run the command below on my local machine
youtube-dl --cookies cookie.txt https://www.youtube.com/watch?v=x-5V_RS3Q48 -u my_account@gmail.com -p my_pass_word
I am able to download the video without any hassle.
But when I try to download the same video on one of my EC2 instances, it fails with an exception.
The installation procedure on both machines is exactly the same, the youtube-dl version is exactly the same (2017.08.18), and the Python version is the same (2.7.6).
The only difference I could figure out is the kernel version on the two machines:
On my local machine: Linux-3.19.0-25-generic-x86_64-with-Ubuntu-14.04-trusty
On the EC2 instance: Linux-3.13.0-74-generic-x86_64-with-Ubuntu-14.04-trusty
Also, the video I am trying to download is private and was uploaded by the same user whose credentials I am providing.
One important point to note is that the EC2 machine is able to download a video without any trouble if I am not using the username and password (which is only possible for videos that are not private).
Thanks
Posting the answer in case someone else is stuck with a similar issue.
The issue was that the cookie file was not being generated on the server-edition Linux OS of the EC2 instance provided by AWS.
According to what I learnt recently, these machines don't have the Firefox browser (at least by default), and that's why it was failing to create the cookie file.
Solution
I created the cookie file locally, set its expiry time to 20 years in the future, moved that cookie file to the EC2 server instance, and used it to sign in rather than creating one there (roughly as sketched below).
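The steps looked something like this (the user and host below are placeholders for my EC2 instance):
# on the local machine: copy the locally generated cookie file to the instance
scp cookie.txt ubuntu@my-ec2-host:~/cookie.txt
# on the EC2 instance: reuse the copied cookie file instead of having youtube-dl log in there
youtube-dl --cookies ~/cookie.txt https://www.youtube.com/watch?v=x-5V_RS3Q48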
Thanks
So with the help of Graham I realize I need to rebuild mod_ssl.so to point at the new OpenSSL version.
I found the following post with a similar problem, but not much was suggested: https://stackoverflow.com/questions/36756641/rebuild-mod-ssl-so-on-apache2-on-macosx
Is it possible to rebuild only mod_ssl.so, or do I need to rebuild all of Apache?
Any specific flags to use?
Is Homebrew the way to go, and how do I avoid having two installations of Apache?
I am on 10.11.6 and using macOS Server 5.2 (if that has an impact).
I have integrated the following framework into a Flask app and made it work: https://github.com/playingmedia/swish-python
So basically it makes a request with pyOpenSSL using the included certificates.
This is working fine in my Flask app, but when I move it to my Apache server (configured to be accessed through TLS; not sure if that is relevant) it gives me the following error: SSLError: [SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:590)
I am wondering if there is a mod_wsgi setting I need to manipulate, or whether there could be any permission issues...
I included another framework using Suds with TLS on the Apache server without any problem, so I am wondering if there are any known issues with the Requests library and pyOpenSSL under mod_wsgi.
I have tried to Google quite a lot, but perhaps I am not typing in the right keywords.
Thx
Recently my coworkers installed a new OS on our SFTP server, and I have a bunch of scripts that use curl to access it.
The problem is that I'm getting CURLE_PEER_FAILED_VERIFICATION (51).
I believe it is because my local fingerprint/certificate no longer matches the one on the SFTP server.
For the sftp client I fixed this simply by removing the host's line from the known_hosts file, but that didn't work for curl.
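(For the record, what I did by hand for the sftp client is equivalent to something like this; the hostname is a placeholder:)
# drop the stale host key for that server from known_hosts
ssh-keygen -R my-sftp-host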
PS: Searching Google didn't give me any meaningful information, so maybe I'm wrong and curl does not store any certificates?
So I have partly managed it: I can clone Mercurial repositories remotely over HTTP from my Windows Server 2003 machine, using that machine's IP address. I did deactivate IIS 6 and am using Apache 2.2.x now. But not everything works yet...darn! Here's the thing:
Cloning goes smoothly! But when I want to push my changes back to the original repository, I get the message "cannot lock static-http repository". On the internet I found several explanations saying that Mercurial wasn't designed to push over plain (static) HTTP. Still, the Mercurial website has something about configuring an hgrc file.
There's also the possibility of configuring Apache to serve via HTTPS (SSL). For this you have to load the module that enables OpenSSL and generate keys.
Configuring the hgrc file
Just add "push_ssl = false" under the [web] line. But where to put this file when pushing your changes back?! Because I placed it in the root of the server, in the ".hg" directory, nothing works.
Using SSL/HTTPS with Apache
When I try to access "https://myipaddress" it fails, displaying a Dutch message that translates to something like "server taking too long to respond". Trying to push also gives me a Dutch error message that means about the same. It cannot connect to my server via HTTPS, although I followed the steps at this blog exactly.
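As far as I understand it, the setup amounts to roughly this (file names and paths are just examples, not necessarily what the blog uses):
# httpd.conf: load mod_ssl and pull in the SSL vhost config shipped with Apache 2.2
LoadModule ssl_module modules/mod_ssl.so
Include conf/extra/httpd-ssl.conf
# generate a self-signed key and certificate for the SSL vhost to reference
openssl req -new -x509 -nodes -days 365 -newkey rsa:2048 -keyout server.key -out server.crt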
I don't care which of the above solutions ends up working for me; so far none of them do. So please, can anyone help me with one of the solutions above? Pick the easiest! Help will be greatly appreciated, and not only by me.
Summary
- Windows Server 2003
- Apache 2.2 with OpenSSL
- Mercurial 1.8.2
- I can clone, but not push!
Thank you!
Maarten Baar(s)
It seems like you might have Apache configured incorrectly for what you want it to do. Based on your question, it sounds like you have a path (maybe the root of the server) pointing directly at the repository you want to serve.
Mercurial comes with a script for this exact purpose, in the latest version it is hgweb.cgi. There are reasonably good instructions for setting it up on the mercurial site. It should allow both cloning and pushing. You will need the push_ssl=false if you will not be configuring https and also an allow_push line which will let certain users, or all (*) push to the repository. But all that should be part of the setup docs.