Can a .htaccess file malfunction? - Apache

In some CMS backup systems I see they use the following .htaccess code in the backup folder:
order deny,allow
deny from all
allow from none
I was wondering if this is fail-safe, or if it is possible to annoy the server so badly that the .htaccess file is ignored.
The reason I ask is that, if that is possible, I would rather put the files outside the httpdocs folder so they are not accessible from a browser. That, though, requires quite a bit of extra work to show or use those files when I want to.
Anyone have an idea or tips?

Related

Is there any security differences between using .htaccess and moving all files outside web root directory?

I am trying to decide whether to use .htaccess files in each sub-directory to deny all requests for specific files (while also denying directory indexes), or whether it is more security conscious to move all files except for essential files (index.php, .htaccess, robots.txt) outside the root directory and call them from the index file.
Are there any critical differences in security between these two methods for securing files in my web application?
Here is a view of what the .htaccess looks like in the root directory.
# pass the default character set
AddDefaultCharset utf-8
# disable the server signature
ServerSignature Off
<FilesMatch "\.(htaccess|htpasswd|ini|phps|fla|psd|log|sh|lock|DS_Store|json)$">
order allow,deny
deny from all
</FilesMatch>
# disable directory browsing
Options All -Indexes
# prevent display of select file types
IndexIgnore *.wmv *.mp4 *.avi *.etc
However, this would not stop someone from accessing a file if they knew the directory structure, such as https://www.example.com/security/please_dont_look.cfg
Although that file does not print anything, I don't want anyone to know it exists, and I don't want a site-specific solution like using mod_rewrite to redirect calls to specific files.
I could use a .htaccess file in each directory such as this:
order deny,allow
deny from all
From this question and reply (Prevent access to files in a certain folder)
Is one solution more bullet-proof than the other?
As always in such complex systems, security here is about having several lines of defense, keeping things simple and attempting to prevent as many attack vectors as possible.
Theoretically both solutions should provide you with the exact same level of security - the files would not be accessible in either case.
I'd recommend moving files that should not be accessed directly into a directory outside of the web root. It is quite easy to screw up .htaccess files, and that is simply not possible when the files live outside your web root. This also prevents timing attacks against the directory structure of your server: reading .htaccess files comes with a time penalty, and that might be measurable, especially if your .htaccess files get big and you have plenty of them, one for each subdirectory. Actually, I'd recommend skipping .htaccess entirely and disabling indexes directly in your vhost configuration, so that Apache does not have to look for .htaccess files at all, speeding up your website.
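A minimal sketch of such a vhost, with placeholder paths and hostname:
<VirtualHost *:80>
    ServerName example.com
    DocumentRoot /var/www/example/htdocs
    <Directory "/var/www/example/htdocs">
        # disable directory listings and stop Apache from reading .htaccess files
        Options -Indexes
        AllowOverride None
        Require all granted
    </Directory>
</VirtualHost>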
Additionally, in case you run PHP via FastCGI, you should disallow file access at the file-system level for Apache and only allow access from PHP. With this setup it should be outright impossible to access your files by attacking the web server (excluding PHP) unless you have some privilege escalation vulnerability (in which case you are screwed anyway).
The only way to access your confidential files in this setup would be to convince PHP to read the file, or to mess with the file system, e.g. by creating a hard link from your web root into your "confidential files outside the web root" directory. Preventing that boils down to ensuring your PHP configuration is as restrictive as possible, that file creation inside the web root is disallowed and, most importantly, that the PHP application itself is not vulnerable.
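One concrete way to make PHP's configuration more restrictive is open_basedir. A minimal sketch for a mod_php vhost (the paths are placeholders; under FastCGI/PHP-FPM you would set open_basedir in the pool configuration or php.ini instead):
# inside the vhost: confine PHP's file access to the app and data directories
php_admin_value open_basedir "/var/www/example/htdocs:/var/www/example/private"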

Is there any way in Apache to specify a single .htaccess file?

My hosting allows the use of .htaccess files, as the server configuration files are not available to me.
I'm aware of the performance hit that override files incur, though - so I was thinking: if Apache provided a mode for having a single .htaccess file, wouldn't that be faster than having to check for multiple .htaccess files, whilst still maintaining the convenience?
If Apache provided a mode for having a single .htaccess file
Well, not a "mode" as such, but you could achieve this by allowing .htaccess for the parent directory (the root directory, or even one above the document root) and disabling the use of .htaccess files in all subdirectories. The .htaccess file in the parent directory will still apply.
Realistically (if indeed this is at all "realistic"), you would probably need to enable .htaccess for the directory above the document root and disable .htaccess in the document root and below, rather than enabling .htaccess for the document root itself. Otherwise, if you enable .htaccess for the document root, you will have to disable .htaccess for each subdirectory individually, and if you add more subdirectories, the server config will need to be updated accordingly. (The AllowOverride directive is only permitted in <Directory> containers without regex, not in <DirectoryMatch> containers.) However, this might not be possible in some shared hosting environments (there might not be an "above the document root"), and it could impact the installation of some CMSs.
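A minimal sketch of that server config, assuming /var/www/html is the document root and /var/www is the directory above it (both paths are placeholders):
<Directory "/var/www">
    # the .htaccess here is read and applies to everything below
    AllowOverride All
</Directory>
<Directory "/var/www/html">
    # Apache stops looking for .htaccess from the document root down
    AllowOverride None
</Directory>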
Note that you obviously need access to the server config (or VirtualHost) in order to implement this, so it is hypothetical in this instance.
wouldn't that be faster than having to check for multiple .htaccess files
Possibly. But you are only talking about a micro-optimisation at best in real terms. On most sites, merely enabling .htaccess files will hardly be noticeable - if at all. The "performance hit" you speak of is not as big as you might think. To put it another way, if you find that .htaccess is proving to be a bottleneck, then you've either done something wrong or you have far more serious problems to address.
Note, however, that you generally only use .htaccess files on smaller sites anyway. On larger / high-traffic sites you will have your own VPS / server and access to the server config, so there wouldn't be any need to use .htaccess (or, importantly, to have it enabled).
whilst still maintaining the convenience?
Not exactly. Part of the "convenience" is being able to put the .htaccess file in any directory you like, overriding parent directives and have it apply to just that directory tree. (It is the userland equivalent of the <Directory> container in the server config.)

How to avoid .htaccess constant seek?

Are there any options in Apache 2.4 that enable the .htaccess file to be cached, so the web server does not have to look for it on every request?
I know that using AllowOverride None will do this, but then, for most popular scripts, the required settings will need to be made in the vhost file for each folder.
If we could somehow just tell Apache to remember the .htaccess files it has already read since the previous restart, that would be a great tool for speed improvement.
A script that recursively reads all the .htaccess files under the base directory and builds a vhost file would not be a bad idea either - does anyone use this approach?
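As far as I know, Apache 2.4 has no option to cache .htaccess files; the usual approach is exactly what you describe - copy each .htaccess into a matching <Directory> block in the vhost and then turn overrides off. A minimal sketch with placeholder paths:
<Directory "/var/www/site">
    Options -Indexes
    # stop Apache from looking for .htaccess on every request
    AllowOverride None
</Directory>
<Directory "/var/www/site/uploads">
    # formerly the contents of /var/www/site/uploads/.htaccess
    Require all denied
</Directory>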

Why is .htaccess insecure by default to prevent unauthorized access?

I was browsing the web and came across the following:
Source code, including config files, are stored in publicly accessible directories along with files that are meant to be downloaded (such as static assets). [...] You can use .htaccess to limit access. This is not ideal, because it is insecure by default, but there is no other alternative.
Source: owasp.org
Sometimes I use the following code to prevent access from a specific directory:
# contents of .htaccess
order deny,allow
deny from all
allow from none
On servers where you have access outside of the web root, there is obviously less need to prevent access to folders/files with .htaccess.
Can someone explain why they write ".htaccess is insecure by default" and what are alternative ways to prevent access to certain files on a regular LAMP-stack?
.htaccess is not a complete security solution. It doesn't protect you from DDoS, sniffing, or man-in-the-middle attacks (when using auth without SSL).
As far as denying access to specific files, it's generally fine. The scenarios under which it would fail to do so are scenarios where there has already been a successful exploit somewhere else. Since any files in the directory have to be readable by the process owner, the files are only superficially secured by .htaccess.
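Note that the order / deny / allow directives quoted above are the old Apache 2.2 access-control syntax; in Apache 2.4 they are deprecated and only work through mod_access_compat. The modern equivalent of that deny-all .htaccess is a single directive (it needs AllowOverride AuthConfig or All to be honoured):
# contents of .htaccess (Apache 2.4 syntax)
Require all denied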

How to prevent server files from being viewable online?

I have a website running on an Amazon EC2 Linux server, and everything works fine, but when I point the address bar at something like mydomain.com/css or mydomain.com/images, it prints out a list of all the files in that directory to the browser, and they're all readable and viewable. I tried chmod'ing some of the folders to have fewer permissions, which prevented viewing of these files, but it also made them not appear on the site at all. Is there a way I can protect my documents and server files while also keeping full functionality?
You can prevent the directory listing by disabling it in the Apache config. Just remove "Indexes" from whatever lines it appears on. For example, change from:
Options Indexes FollowSymLinks
To:
Options FollowSymLinks
Edit: Note, you can also add (or edit) the .htaccess in those directories, explicitly disabling indexing for that directory:
Options -Indexes
That's the nature of the web: these files are downloaded to the user's computer so the browser can display them. You cannot prevent them from being fetched via your own site / URL, but you can put rules in place that prevent "hotlinking", that is, rules that stop people from embedding an image hosted at your URL in their own website. But even then, they could download the file(s), upload them to their own space, and carry on.
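A common set of hotlinking rules uses mod_rewrite in the .htaccess of the image directories; this is a sketch, with example.com as a placeholder for your own domain:
RewriteEngine On
# let through requests with no referer (direct visits, privacy tools)
RewriteCond %{HTTP_REFERER} !^$
# let through requests coming from your own pages
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# refuse image requests embedded from anywhere else
RewriteRule \.(gif|jpe?g|png)$ - [F,NC]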
Sorry I don't have better news, hope this helps!