My hosting allows the use of .htaccess files, as the server configuration files are not available to me.
I'm aware of the performance hit that override files incur, though, so I was thinking: if Apache provided a mode for having a single .htaccess file, wouldn't that be faster than having to check for multiple .htaccess files, whilst still maintaining the convenience?
If Apache provided a mode for having a single .htaccess file
Well, not a "mode" as such, but you could achieve this by allowing .htaccess for the parent directory (the root directory, or even one above the document root) and disabling the use of .htaccess files in all subdirectories. The .htaccess file in the parent directory will still apply.
Realistically (if indeed this is at all "realistic"), you would probably need to enable .htaccess for the directory above the document root and disable .htaccess for the document root and below, rather than enabling .htaccess for the document root itself. Otherwise, if you enable .htaccess for the document root, you will have to disable .htaccess for each subdirectory individually, and if you add more subdirectories, the server config will need to be updated accordingly. (The AllowOverride directive is only allowed in <Directory> containers without regex, not in <DirectoryMatch> containers.) However, this might not be possible on some shared hosting environments (there might not be an "above the document root"), and it could impact the installation of some CMSs.
Note that you obviously need access to the server config (or VirtualHost) in order to implement this, so it is hypothetical in this instance.
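As a sketch of that hypothetical setup, assuming a document root of /var/www/example (a made-up path), the server config might look like:

```apache
# Sketch only; paths are assumptions.
# Allow a single .htaccess in the directory above the document root...
<Directory "/var/www">
    AllowOverride All
</Directory>

# ...and disable .htaccess lookups for the document root and below.
# The parent's .htaccess still applies, since per-directory sections
# are merged from the top down.
<Directory "/var/www/example">
    AllowOverride None
</Directory>
```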
wouldn't that be faster than having to check for multiple .htaccess files
Possibly. But you are only talking about a micro-optimisation at best in real terms. On most sites, even enabling .htaccess files at all will hardly be noticeable, if at all. The "performance hit" you speak of is not as big as you might think. To put it another way, if you find that .htaccess is proving to be a bottleneck, then you've either done something wrong or you have far more serious problems to address.
Note, however, that you generally only use .htaccess files on smaller sites anyway. On larger / high-traffic sites you will have your own VPS / server and access to the server config, so there wouldn't be any need to use .htaccess (or, importantly, to have it enabled).
whilst still maintaining the convenience?
Not exactly. Part of the "convenience" is being able to put the .htaccess file in any directory you like, overriding parent directives, and having it apply to just that directory tree. (It is the userland equivalent of the <Directory> container in the server config.)
Related
I am trying to decide whether to use .htaccess files in each sub-directory to deny all requests for specific files (while also denying directory indexes), or whether it is more security conscious to move all files except for essential files (index.php, .htaccess, robots.txt) outside the root directory and call them from the index file.
Are there any critical differences in security between these two methods for securing files in my web application?
Here is a view of what the .htaccess looks like in the root directory.
# set the default character set
AddDefaultCharset utf-8
# disable the server signature
ServerSignature Off
<FilesMatch "\.(htaccess|htpasswd|ini|phps|fla|psd|log|sh|lock|DS_Store|json)$">
order allow,deny
deny from all
</FilesMatch>
# disable directory browsing
Options All -Indexes
# prevent display of select file types
IndexIgnore *.wmv *.mp4 *.avi *.etc
However, this would not stop someone from accessing a file if they knew the directory structure, such as https://www.example.com/security/please_dont_look.cfg
Although that file does not print anything, I don't want anyone to know it exists, and I don't want a site-specific solution like using mod_rewrite to redirect calls to specific files.
I could use a .htaccess file in each directory such as this:
order deny,allow
deny from all
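If the server runs Apache 2.4, note that Order / Deny are provided only by the compatibility module mod_access_compat there; the native 2.4 equivalent of the per-directory denial above is:

```apache
# Apache 2.4 equivalent of "order deny,allow" + "deny from all"
Require all denied
```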
From this question and reply (Prevent access to files in a certain folder)
Is one solution more bullet-proof than the other?
As always in such complex systems, security here is about having several lines of defense, keeping things simple and attempting to prevent as many attack vectors as possible.
Theoretically both solutions should provide you with the exact same level of security - the files would not be accessible in either case.
I'd recommend moving files that should not be accessed directly into a directory outside of the web root. It is quite easy to screw up .htaccess files, and that's just not possible when you move the files outside of your web root. This will also prevent timing attacks against the directory structure of your server: reading .htaccess files comes with a time penalty, and that might be measurable, especially if your .htaccess files get big and you have plenty of them for each subdirectory. Actually, I'd recommend skipping .htaccess entirely and just disabling indexes directly in your vhost configuration, so that Apache does not have to look for .htaccess files at all, speeding up your website.
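A vhost sketch of that recommendation (the ServerName and all paths are placeholders):

```apache
<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/example/public

    <Directory "/var/www/example/public">
        Options -Indexes      # disable directory listings in the vhost itself
        AllowOverride None    # Apache skips .htaccess lookups entirely
        Require all granted
    </Directory>
</VirtualHost>
```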
Additionally, in case you run PHP via FastCGI, you should disallow file access at the file-system level for Apache and only allow access from PHP. With this setup it should be outright impossible to access your files by attacking the web server (excluding PHP) unless you have some privilege-escalation vulnerability (in which case you are screwed anyway).
The only way to access your confidential files in this setup would be to convince PHP to read the file, or to mess with the file system, e.g. by creating a hard link from your web root into your "confidential files outside web root" directory. Protecting against that boils down to ensuring your PHP configuration is as restrictive as possible, that file creation inside the web root is disallowed and, most importantly, that the PHP application itself is not vulnerable.
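A minimal php.ini sketch of that kind of lockdown, assuming the confidential files live in /srv/app/private (a made-up path):

```ini
; Sketch only; all paths are assumptions. Restrict PHP's file access to
; known directories so a compromised script cannot roam the filesystem.
open_basedir = /var/www/example:/srv/app/private:/tmp

; Disable functions that help an attacker mess with the filesystem
; (e.g. creating the hard links mentioned above) or spawn processes.
disable_functions = link, symlink, exec, shell_exec, system, passthru
```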
Are there any options in Apache 2.4 that enable the .htaccess to be cached, and not having the web server looking for it at each request?
I know that using AllowOverride None will do this, but then, for most popular scripts, the required settings will need to be made in the vhosts file for each folder.
If somehow we could just tell Apache to remember the htaccess files that it has already accessed since previous restart, this would be a great tool for speed improvement.
A script that recursively reads the contents of all .htaccess files and builds a vhosts file would not be a bad idea either. Does anyone use this approach?
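A rough sketch of such a script, assuming a POSIX shell and a document root you pass in as the first argument (the output still needs a manual review before pasting it into a vhost, since not every .htaccess directive is valid in every context):

```shell
# Sketch: collect every .htaccess under a document root and emit
# equivalent <Directory> blocks for pasting into a vhost config.
htaccess_to_directory_blocks() {
  find "$1" -name .htaccess | sort | while read -r f; do
    dir=$(dirname "$f")
    printf '<Directory "%s">\n' "$dir"
    cat "$f"
    printf '</Directory>\n\n'
  done
}
```

Usage: `htaccess_to_directory_blocks /var/www/example >> mysite.conf`, then set AllowOverride None and reload Apache.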
I'm looking through my server because I want to restrict access to some specific folders, and I've noticed I have several .htaccess files. One is in the root, the directory before public_html - is that the root, or is public_html the root? That file enables PHP 5 as the default. I then have a .htaccess doing some URL rewriting in the public_html folder, and another one in the WordPress directory.
Is there a need for them to be spread out?
Do I have one .htaccess for each folder I want affected, or does the .htaccess affect a folder plus all of its subdirectories?
Thanks
Edit: I also have another .htaccess in my WordPress theme folder.
Apache has two ways of storing configuration options:
The central configuration (the .conf files) - when you change these, you need to restart the server.
.htaccess files, whose settings apply only to their own directory and all child directories. Changing these does not require a server restart.
If you're on a dedicated server, you could theoretically migrate all .htaccess files into the central configuration, but WordPress writes to a .htaccess file when updating its permalink structure, so you'll always have at least that one.
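For reference, the block WordPress writes for "pretty" permalinks looks like this in current versions (it may differ slightly on your install):

```apache
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
```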
In my experience, keeping individual .htaccess files is relatively practical in everyday maintenance work, as long as there aren't too many of them. I would leave things as they are.
I'm a self-taught coder and I like to debug by echoing suspicious variables and commenting out code.
Lately, I've had to learn more about the .htaccess file. I need it to do things like interpreting PHP scripts as PHP 5, URL rewriting, limiting file upload size, etc. I have a lot of trouble debugging a .htaccess file. I often have to migrate PHP applications from one shared hosting environment to another. Sometimes this breaks the .htaccess file (or rather, something in the .htaccess file breaks the site). I check to make sure domain names are updated.
Are there popular techniques for debugging a .htaccess file? Is it just looking in the Apache logs? Anything else?
Looking in the Apache logs is the easiest way to debug .htaccess, IMHO (adding the RewriteLog directive if necessary).
About migrating: if you are not using any physical file paths inside .htaccess (e.g. /var/www/site/script.php), it should work without problems. If this is not the case, first try removing all options and leaving only the redirect directives; this way you can see whether the problem is a server configuration that denies the overriding of default settings.
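To turn on detailed rewrite logging, something like this in the server config or vhost (note that RewriteLog exists only in Apache 2.2; in 2.4 it was replaced by per-module LogLevel, and trace messages go to the error log):

```apache
# Apache 2.2 (log path is an example)
RewriteLog "/var/log/apache2/rewrite.log"
RewriteLogLevel 3

# Apache 2.4 replacement: per-module log level
LogLevel alert rewrite:trace3
```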
Apache URL-rewrite logic can be written either in a conf file or in a .htaccess file. Which one is more suitable, and for which occasion? And say I have a .htaccess file in my web root directory and a conf file in the Apache conf directory as well - which file kicks in first?
To address just one part of your question: if you don't need the increased flexibility of .htaccess files in different directories, don't use them; they slow down the processing of requests.
The .htaccess file resides in the root folder and can easily be copied to another destination together with the web pages. The conf file may not always be accessible in a hosted environment.
The mod_rewrite directives in the .conf file will be processed first, before the URL-to-filename translation phase. For most cases, though, that difference will be minimal. There are performance considerations, but they only really affect very heavily trafficked sites or underpowered hardware.
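One practical consequence of that ordering difference: in server (or vhost) context, mod_rewrite patterns match the full URL-path including the leading slash, while in per-directory .htaccess context the per-directory prefix is stripped first. The same rule therefore looks slightly different in each place (the paths here are placeholders):

```apache
# In httpd.conf / vhost context: the URL-path includes the leading slash
RewriteRule ^/old-page$ /new-page [R=301,L]

# In a document-root .htaccess: the leading slash is stripped first
RewriteRule ^old-page$ /new-page [R=301,L]
```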