perl cgi shows plain text in https only? - apache

I have an odd issue with my webserver (Red Hat/Apache). There are two sites on the server, each with their own VirtualHost section in httpd-vhosts.conf and ssl.conf.
One site is primarily Perl/CGI and works fine. I am able to execute Perl/CGI scripts in the root folder as well as in cgi-bin and in subfolders of both areas, and I can access the CGI files from both http and https URLs.
If I put a CGI file in my other site, it executes when I access it via http but not via https (over https it just displays the script source as plain text).
As far as I can tell both sites are configured identically in both .conf files.
Any idea why it might be doing this?

I tracked down the issue. There was another VirtualHost entry in ssl.conf with the same URL as the site that wasn't working. That one was apparently overriding the one farther down, which had the correct CGI config.
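For anyone hitting the same thing: for a given address and port, Apache uses the first VirtualHost whose ServerName matches, so a stripped-down sketch of the broken ssl.conf layout would look something like this (www.example.com and the paths are placeholders):

# Matched first for this name, but has no CGI handler,
# so .cgi files come back as plain text over https
<VirtualHost *:443>
    ServerName www.example.com
    DocumentRoot /var/www/site2
</VirtualHost>

# The intended block farther down is never reached for this name
<VirtualHost *:443>
    ServerName www.example.com
    DocumentRoot /var/www/site2
    <Directory /var/www/site2>
        Options +ExecCGI
        AddHandler cgi-script .cgi .pl
    </Directory>
</VirtualHost>

Running apachectl -S (or httpd -S) lists which VirtualHost wins for each name and port, which makes duplicates like this easy to spot.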

Related

Correct Apache Configuration And Htaccess

I've just reset my Ubuntu 14.04 LAMP server hosted with DigitalOcean. Could someone tell me the 'proper' way to do server configuration? My goal is to do everything as cleanly as possible (and hopefully well structured).
I intend on using the server mainly for programming and data analytics, however I do plan on hosting my website in /var/www/html. I also plan on using letsencrypt/certbot to get an easy SSL. With this in mind, these are the main goals I would like to accomplish:
1) Redirect the website to ALWAYS be served through https AND www.
2) Enable HSTS for the entire website.
3) Enable clean URLs (remove .php extensions and so on).
Since I would like all of these properties to be used across the entire website, should the configuration be done inside of the /etc/apache2/ folder? Or should it be done inside of .htaccess?
And if it should be done inside of apache2 configuration, which file should I add it to? And finally, how exactly should it be added? (for example vhost 80/443, inside of a mod_something section, etc).
Thank you in advance, I would appreciate and consider any advice about Apache and htaccess!
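No answer was posted here, but as a rough sketch of how those three goals are commonly handled in the vhost files under /etc/apache2/sites-available/ rather than in .htaccess - example.com, the paths and the certificate locations are placeholders, certbot normally writes the 443 vhost and certificate lines for you, and the Header and Rewrite directives assume a2enmod headers rewrite has been run:

<VirtualHost *:80>
    ServerName example.com
    ServerAlias www.example.com
    # 1) everything on http goes to the canonical https://www host
    Redirect permanent / https://www.example.com/
</VirtualHost>

<VirtualHost *:443>
    ServerName www.example.com
    ServerAlias example.com
    DocumentRoot /var/www/html

    SSLEngine on
    SSLCertificateFile    /etc/letsencrypt/live/example.com/fullchain.pem
    SSLCertificateKeyFile /etc/letsencrypt/live/example.com/privkey.pem

    # 1) canonicalise https://example.com to https://www.example.com
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^www\. [NC]
    RewriteRule ^ https://www.example.com%{REQUEST_URI} [R=301,L]

    # 2) HSTS for the whole site
    Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"

    # 3) clean URLs: serve /page from page.php when that file exists
    <Directory /var/www/html>
        AllowOverride None
        Require all granted
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME}.php -f
        RewriteRule ^(.*)$ $1.php [L]
    </Directory>
</VirtualHost>

Doing it in the vhost with AllowOverride None avoids per-request .htaccess lookups; .htaccess mainly earns its keep when you cannot edit the server config.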

Apache routing without htaccess

I am working with a custom website built in PHP running on an Apache server. The client wants to move it to a new server. I moved everything including the .htaccess file; the homepage loads fine, but all the other URLs like site.com/register aren't working. I'm sure this is not handled by code on the old server, because I renamed everything (including .htaccess) and it still works. If I create a file like test.php on the old server, I can access it as site.com/test. It doesn't even hit the index.php file. Also, not all the URLs work like this; some are loaded through files in other folders.
So my question is: what are the possible ways that Apache can let a user access site.com/test without the .php extension, given that it must not be using .htaccess? We should also be able to add exceptions so that some URLs can be loaded differently.
You can achieve the same thing in the virtual hosts configuration file if you manage the Linux server yourself: define the same rules there instead of in .htaccess.
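Beyond moving rules into the server config, one Apache feature that produces exactly this behaviour with no .htaccess at all is mod_negotiation's MultiViews option, so it is worth checking whether the old server has it enabled. A sketch under that assumption, with the name and paths as placeholders:

<VirtualHost *:80>
    ServerName site.com
    DocumentRoot /var/www/site

    <Directory /var/www/site>
        # MultiViews (mod_negotiation) lets a request for /test be answered
        # by test.php, test.html, etc. via content negotiation
        Options +MultiViews
        AllowOverride None
        Require all granted
    </Directory>

    # Alternatively, the rewrite rules normally kept in .htaccess can live
    # here in the vhost, which also gives a single place to add exceptions:
    # RewriteEngine On
    # RewriteCond %{REQUEST_URI} !^/some-exception
    # RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI}.php -f
    # RewriteRule ^(.*)$ $1.php [L]
</VirtualHost>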

htaccess - simulating local page as web page

Is there a way to rewrite a URL in a local project so it looks like a real website URL?
For example, I have a project with the URL
localhost/site
I'm trying to rewrite this to:
www.site.com
or
site.com
That project has subpages, and it would be good if it worked like
site.com/subpage.php
I've been trying for an hour but I'm really an htaccess noob.
I work with VertrigoServ and mod_rewrite works fine with some examples which I tried in other projects.
The issue is not with rewrite. To make a local page appear to have a more conventional domain you need to tell your browser that the conventional domain is found on your local webserver.
The simplest way to do this is by editing the hosts file on your OS. On *nix-based systems it's usually /etc/hosts; on Windows it's usually C:\Windows\System32\drivers\etc\hosts. You will need elevated privileges to edit the file.
Add a line to the end of the file that maps the domain name to your local IP address, like this:
127.0.0.1 www.site.com
Close and reopen your browser and visit www.site.com; you should see that it is loading the page from your local webserver.
Chances are you are still seeing the same page as when you visit the localhost site. To make it load your code, you need to change the DocumentRoot in httpd.conf for your Apache install so that it matches the directory where your code is.
A preferable solution may be to use name-based virtual hosting, which allows multiple websites to share the same IP address. Searching for 'apache VirtualHost examples' should give you plenty of resources for this. Make sure that NameVirtualHost *:80 is also enabled for this to work on Apache 2.2 (the directive is no longer needed on 2.4).
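A minimal sketch of such a vhost, assuming the project lives in a site subfolder of whatever www folder your VertrigoServ install uses (the C:/... path is a placeholder; the Order/Allow lines are Apache 2.2 syntax, on 2.4 use Require all granted instead):

NameVirtualHost *:80

<VirtualHost *:80>
    ServerName www.site.com
    ServerAlias site.com
    DocumentRoot "C:/vertrigo/www/site"

    <Directory "C:/vertrigo/www/site">
        # let the project's .htaccess rewrite rules keep working
        AllowOverride All
        Order allow,deny
        Allow from all
    </Directory>
</VirtualHost>

Combined with the 127.0.0.1 www.site.com hosts entry above, a request for www.site.com/subpage.php is then served from the project folder rather than from localhost/site.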

.htaccess works on Apache server, but not on FTP directory

I recently found the use of a .htaccess file to edit the URL of my webpages. This is done with mod_rewrite (Apache). I use XAMPP and the working files are inside of the appropriate htdocs folder. While in the local directory, the .htaccess file does the job and it edits the URL. I have a domain name that I've been working on and periodically update the working files to that. When I upload these files to the domain through FTP, the .htaccess file doesn't work correctly, as you can imagine since Apache modules have no way of working on a web directory. So my question is, how do I make a .htaccess file work in a web directory without Apache's mod_rewrite module?
Your question is not sufficiently clear. URL rewriting won't work if you're just accessing the static files (i.e. file:///home/user/www/index.html) rather than going through the Apache server (http://localhost/~user/index.html) since Apache will never process the request.
Perhaps your .htaccess file is not being uploaded properly? Some programs will complain a bit when you try to upload strangely named files, such as those beginning with a period.

Apache redirect for single XML file

I have a number of subdomains which use a crossdomain.xml file, and I'm looking for a simple way of managing them all - they get semi-regularly updated. One way I've thought of is a PHP script which pushes and overwrites the xml file. The other, which I much prefer, is an Apache redirect on a single file.
So, the question is: how would I, across multiple domains, point the xml on dom1.domain.com and dom2.wirewax.com at the same crossdomain.xml file without Flash getting upset about it, i.e. not a 302 HTTP redirect, but internal file fetching?
You can write a PHP script that fetches the content from a single location (database or text file) and sends it as-is to Flash. Yes, the script itself needs to be copied to all hosts.
If you have all the websites hosted on the same webserver, perhaps mod_alias could help:
Alias /crossdomain.xml /path/to/shared/crossdomain.xml
I have not personally tested this. The reference page includes instructions to setup the shared directory so that it can be read by multiple hosts.
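As a rough sketch of what that looks like once the directory permissions are added (the shared path is the same placeholder as above; the Require line is Apache 2.4 syntax, a 2.2 setup would use Order allow,deny / Allow from all):

# in each <VirtualHost>, or in a .conf file that every vhost Includes
Alias /crossdomain.xml /path/to/shared/crossdomain.xml

<Directory /path/to/shared>
    Require all granted
</Directory>

Because the Alias is resolved internally by Apache, each subdomain still answers /crossdomain.xml with a plain 200 response, so Flash never sees a redirect.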