Apache serving files that should not be served

Today I discovered that my fresh installation of Apache HTTP Server is able to serve files from my C:\uploads\ directory.
I have two folders in C:\uploads:
C:\uploads\templates
C:\uploads\sites
Both folders contain testimage.jpg.
I found that Apache will serve the files from the templates folder if I request:
http://localhost/templates/testimage.jpg
However, http://localhost/sites/testimage.jpg returns a 404!
OMG: firstly, why does Apache serve the templates folder in the first place? Is it special?
Secondly, by what arbitrary set of rules does Apache disallow access to other folders, such as sites?
I'm so confused. Perhaps I've taken a wrong turn somewhere during the installation.

Did you look through your httpd.conf file to see what rules are in place for what is being served? Alternatively, are there .htaccess files that may be changing what is being served? You might have templates exposed in one or the other, but not sites... that's the first thing that comes to mind.
I would suggest going through these configuration files with a fine-toothed comb to see what may be causing the behavior you see.
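For example, a stray Alias in httpd.conf could produce exactly the behavior described. The paths below are hypothetical, and Require all granted is Apache 2.4 syntax (on 2.2 the equivalent is Order allow,deny / Allow from all):

    # Hypothetical httpd.conf fragment: only the templates folder is mapped
    Alias /templates "C:/uploads/templates"
    <Directory "C:/uploads/templates">
        Require all granted
    </Directory>
    # No Alias or <Directory> entry exists for C:/uploads/sites, so a
    # request for /sites/... falls through to the DocumentRoot and 404s

Searching httpd.conf (and any included conf files) for "templates" would quickly confirm or rule this out.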

Related

Is there any way in Apache to specify a single .htaccess file?

My hosting allows use of .htaccess files as the configuration files are not available.
I'm aware of the performance hit that override files incur, though. So I was thinking: if Apache provided a mode for having a single .htaccess file, wouldn't that be faster than having to check for multiple .htaccess files, whilst still maintaining the convenience?
If Apache provided a mode for having a single .htaccess file
Well, not a "mode" as such, but you could achieve this by allowing .htaccess for the parent directory (the root directory, or even one above the document root) and disabling the use of .htaccess files in all subdirectories. The .htaccess file in the parent directory will still apply.
Realistically (if indeed this is at all "realistic"), you would probably need to enable .htaccess for the directory above the document root and disable .htaccess in the document root and below, rather than enabling .htaccess for the document root itself. Otherwise, if you enable .htaccess for the document root, you will have to disable .htaccess for each subdirectory individually, and if you add more subdirectories, the server config will need to be updated accordingly. (This is because the AllowOverride directive is only permitted in <Directory> containers without regex, not in <DirectoryMatch> containers.) However, this might not be possible on some shared hosting environments (there might not be an "above the document root"), and it could impact the installation of some CMSs.
Note that you obviously need access to the server config (or VirtualHost) in order to implement this, so it is hypothetical in this instance.
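As a sketch of what that server config might look like (paths assumed):

    # Hypothetical server config: allow one .htaccess above the document root
    <Directory "/home/example">
        AllowOverride All
    </Directory>
    # Disable .htaccess lookups for the document root and everything below it;
    # the single /home/example/.htaccess still applies to all requests.
    <Directory "/home/example/public_html">
        AllowOverride None
    </Directory>

The more specific <Directory> container wins, so Apache reads /home/example/.htaccess on every request but never looks for .htaccess files inside public_html.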
wouldn't that be faster than having to check for multiple .htaccess files
Possibly. But you are only talking about a micro-optimisation at best in real terms. On most sites, even enabling .htaccess files at all will hardly be noticeable, if at all. The "performance hit" you speak of is not as big as you might think. To put it another way, if you are finding that .htaccess is proving to be a bottleneck, then you've either done something wrong or you have far more serious problems to address.
Note, however, that you are generally only using .htaccess files on smaller sites anyway. On larger, high-traffic sites you will have your own VPS/server and access to the server config, so there wouldn't be any need to use .htaccess (or, importantly, to have it enabled).
whilst still maintaining the convenience?
Not exactly. Part of the "convenience" is being able to put the .htaccess file in any directory you like, overriding parent directives, and having it apply to just that directory tree. (It is the userland equivalent of the <Directory> container in the server config.)
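To illustrate that equivalence (directory names hypothetical), these two forms do the same thing:

    # In /var/www/example/blog/.htaccess:
    Options -Indexes

    # The equivalent <Directory> container in the server config:
    <Directory "/var/www/example/blog">
        Options -Indexes
    </Directory>

The .htaccess form just lets users without server-config access scope directives to their own directory tree.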

.htaccess works on Apache server, but not on FTP directory

I recently found the use of a .htaccess file to edit the URL of my webpages. This is done with mod_rewrite (Apache). I use XAMPP, and the working files are inside the appropriate htdocs folder. In the local directory, the .htaccess file does the job and edits the URL. I have a domain name that I've been working on, and I periodically upload the working files to it. When I upload these files to the domain through FTP, the .htaccess file doesn't work correctly; as you can imagine, since Apache modules have no way of working on a web directory. So my question is, how do I make a .htaccess file work in a web directory without Apache's mod_rewrite module?
Your question is not sufficiently clear. URL rewriting won't work if you're just accessing the static files (e.g. file:///home/user/www/index.html) rather than going through the Apache server (http://localhost/~user/index.html), since Apache will never process the request.
Perhaps your .htaccess file is not being uploaded properly? Some programs will complain a bit when you try to upload strangely named files, such as those beginning with a period.
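One quick way to check whether the uploaded file is being read at all (a common trick, not specific to this case) is to break it on purpose:

    # Temporarily add a deliberately invalid line to the uploaded .htaccess:
    ThisIsNotARealDirective
    # A "500 Internal Server Error" on the next request means the file IS
    # being read (so the problem is in the rules themselves); if pages still
    # load normally, the file is being ignored (wrong filename, wrong
    # directory, or AllowOverride None on the host).

Remember to remove the bogus line once you know the answer.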

Magento: .htaccess files

I am running Magento Community Edition version 1.7.0.2.
I would like to know how come there are two .htaccess files in my installation: one in the Magento root directory, and another one in the app directory just beneath the Magento root directory.
On my system the first one is 209 lines long, whereas the second one only contains two directives.
Can anyone please explain why there are two files instead of one? Are both parsed, or just one of them?
Normally every .htaccess file along the request path is parsed, because they apply cumulatively.
The last (deepest) .htaccess file may overwrite or extend the previous ones.
The .htaccess file in app/ is used to deny all access to any file under app/. Without this, someone could access http://yourdomain.com/app/etc/local.xml and see your database credentials, among other bad things. A similar file should be present in var/ as well (to prevent viewing logs, etc.).
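For reference, the two directives in app/.htaccess are typically the classic Apache 2.2-style deny-all pair:

    # Typical contents of app/.htaccess in Magento 1.x (Apache 2.2 syntax):
    Order deny,allow
    Deny from all

Since app/ is only ever read by PHP through the filesystem, never served directly, blocking all HTTP access to it costs nothing.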
Delete the existing file and try adding the default new .htaccess file:
Magento default htaccess file

Popular techniques to debug .htaccess

I'm a self-taught coder and I like to debug by echoing suspicious variables and commenting out code.
Lately, I've had to learn more about the .htaccess file. I need it to do things like interpret PHP scripts as PHP5, URL rewriting, limiting file upload size, etc. I have a lot of trouble debugging a .htaccess file. I often have to migrate PHP applications from one shared hosting environment to another. Sometimes this breaks the .htaccess file (or instead, something in the .htaccess file breaks the site). I check to make sure domain names are updated.
Are there popular techniques for debugging a .htaccess file? Is it just look in the apache logs? Anything else?
Looking in the Apache logs is the easiest way to debug .htaccess, IMHO (adding the RewriteLog directive if necessary).
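For reference, the logging directives depend on the Apache version (the log path below is just an example):

    # Apache 2.2: RewriteLog is only valid in the server config, not .htaccess
    RewriteLog "/var/log/apache2/rewrite.log"
    RewriteLogLevel 3

    # Apache 2.4: RewriteLog was removed; use per-module log levels instead
    LogLevel alert rewrite:trace3

On shared hosting without server-config access, you may have to fall back on watching the error log and testing rules one at a time.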
About migrating: if you are not using any physical file paths inside .htaccess (e.g. /var/www/site/script.php), it should work without problems. If it does not, first try removing all options and leaving only the redirect directives; that way you can see whether the problem is a server configuration that denies overriding the default settings.
Some reference

conf or .htaccess-- when to use which?

Apache URL rewrite logic can be written either in a conf file or in a .htaccess file. Which one is more suitable for which occasion? And let's say I have a .htaccess file in my web root directory and a conf file defined in the Apache/conf directory as well: which file will kick in first?
To address just one part of your question: if you don't need the increased flexibility of a .htaccess file in different directories, don't use them; they slow down processing of requests.
The .htaccess file resides in the root folder and can easily be copied to another destination together with the web pages. The conf file may not always be accessible in a hosted environment.
The mod_rewrite directives in the .conf file will be processed first, before the URL-to-filename translation phase. But for most cases, that difference will be minimal. There are performance considerations, but they only really affect very heavily trafficked sites or underpowered hardware.
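One practical difference to keep in mind when moving rules between the two contexts (URLs hypothetical): in per-directory .htaccess context, mod_rewrite strips the directory prefix before matching.

    # In httpd.conf / <VirtualHost> context, the pattern matches the full
    # URL-path, including the leading slash:
    RewriteRule ^/old-page$ /new-page [R=301,L]

    # The same rule in a document-root .htaccess: the per-directory prefix
    # is stripped before matching, so there is no leading slash:
    RewriteRule ^old-page$ /new-page [R=301,L]

Forgetting to adjust that leading slash is a common reason rules silently stop matching after a migration between the two.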