I was browsing the web and came across the following:
Source code, including config files, are stored in publicly accessible directories along with files that are meant to be downloaded (such as static assets). [...] You can use .htaccess to limit access. This is not ideal, because it is insecure by default, but there is no other alternative.
Source: owasp.org
Sometimes I use the following code to prevent access from a specific directory:
# contents of .htaccess
Order deny,allow
Deny from all
On servers where you have access to directories outside of the webroot, there is obviously less need to block access to folders/files with .htaccess.
Can someone explain why they write ".htaccess is insecure by default", and what alternative ways are there to prevent access to certain files on a regular LAMP stack?
.htaccess is not a complete security solution. It doesn't protect you from DDoS, sniffing, or man-in-the-middle attacks (when using auth) without SSL/TLS.
As far as denying access to specific files goes, it's generally fine. The scenarios under which it would fail to do so are scenarios where there has already been a successful exploit somewhere else. And since any files in the directory have to be readable by the process owner anyway, the files are only superficially secured by .htaccess.
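For reference, the Order/Deny/Allow directives shown in the question are Apache 2.2 syntax. On Apache 2.4 (mod_authz_core) the same per-directory denial is a single line:
# Apache 2.4 equivalent of "Order deny,allow" + "Deny from all"
Require all denied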
Related
I am trying to decide whether to use .htaccess files in each sub-directory to deny all requests for specific files (while also denying directory indexes), or whether it is more security-conscious to move all files except the essential ones (index.php, .htaccess, robots.txt) outside the root directory and call them from the index file.
Are there any critical differences in security between these two methods for securing files in my web application?
Here is a view of what the .htaccess looks like in the root directory.
# pass the default character set
AddDefaultCharset utf-8
# disable the server signature
ServerSignature Off
# deny direct access to sensitive file types
<FilesMatch "\.(htaccess|htpasswd|ini|phps|fla|psd|log|sh|lock|DS_Store|json)$">
Order allow,deny
Deny from all
</FilesMatch>
# disable directory browsing
Options All -Indexes
# prevent display of select file types
IndexIgnore *.wmv *.mp4 *.avi *.etc
However, this would not stop someone from accessing a file if they knew the directory structure, such as https://www.example.com/security/please_dont_look.cfg
Although that file does not print anything, I don't want anyone to know it exists, and I don't want a site-specific solution like using mod_rewrite to redirect calls to specific files.
I could use a .htaccess file in each directory such as this:
Order deny,allow
Deny from all
From this question and reply (Prevent access to files in a certain folder)
Is one solution more bullet-proof than the other?
As always in such complex systems, security here is about having several lines of defense, keeping things simple and attempting to prevent as many attack vectors as possible.
Theoretically both solutions should provide you with the exact same level of security - the files would not be accessible in either case.
I'd recommend moving files that should not be accessed directly into a directory outside of the web root. It is quite easy to screw up .htaccess files, and that's simply not possible once the files live outside your webroot. This also prevents timing attacks against the directory structure of your server: reading .htaccess files comes with a time penalty, and that penalty might be measurable, especially if your .htaccess files get big and you have plenty of them, one per subdirectory. Actually, I'd recommend skipping .htaccess entirely and disabling indexes directly in your vhost configuration, so that Apache does not have to look for .htaccess files at all, which also speeds up your website.
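To illustrate, a minimal sketch of such a vhost section (assuming Apache 2.4; the path is a placeholder for your actual document root):
# serve the docroot with no .htaccess lookups and no directory indexes
<Directory "/var/www/example/htdocs">
    AllowOverride None
    Options -Indexes
    Require all granted
</Directory>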
Additionally, if you run PHP via FastCGI, you should disallow file access at the file-system level for the Apache user and allow it only for the PHP user. With this setup it should be outright impossible to access your files by attacking the webserver (excluding PHP), unless you have some privilege escalation vulnerability (in which case you are screwed anyway).
The only way to access your confidential files in this setup would be to convince PHP to read the file, or to mess with the file system, e.g. by creating a hard link from your web root into your "confidential files outside web root" directory. Preventing that boils down to ensuring that your PHP configuration is as restrictive as possible, that file creation inside the webroot is disallowed and, most importantly, that the PHP application itself is not vulnerable.
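As a hedged example of what a restrictive PHP configuration can mean with mod_php (the paths are hypothetical; with PHP-FPM the same value would go into the pool configuration instead):
# confine PHP file access to the application and its data directories
php_admin_value open_basedir "/var/www/example/htdocs/:/var/www/example/data/"
Note that open_basedir resolves symbolic links before checking, but it cannot help against hard links (the file genuinely exists at the linked path), which is one more reason to prevent file creation inside the webroot.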
My hosting allows the use of .htaccess files, as the main server configuration files are not available to me.
I'm aware of the performance hit that override files incur, though. So I was thinking: if Apache provided a mode for having a single .htaccess file, wouldn't that be faster than having to check for multiple .htaccess files, whilst still maintaining the convenience?
If Apache provided a mode for having a single .htaccess file
Well, not a "mode" as such, but you could achieve this by allowing .htaccess for the parent directory (the root directory, or even one above the document root) and disabling the use of .htaccess files in all subdirectories. The .htaccess file in the parent directory will still apply.
Realistically (if indeed this is at all "realistic"), you would probably need to enable .htaccess for the directory above the document root and disable .htaccess in the document root and below, rather than enabling .htaccess for the document root itself. Otherwise, if you enable .htaccess for the document root, you will have to disable .htaccess for each subdirectory individually, and whenever you add more subdirectories the server config will need to be updated accordingly (since the AllowOverride directive is only allowed in <Directory> containers without regex, not in <DirectoryMatch> containers). However, this might not be possible in some shared hosting environments (there might not be an "above the document root"), and it could impact the installation of some CMSs.
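A sketch of what that could look like in the server config (paths are placeholders, with /var/www/example/htdocs assumed to be the document root):
# one .htaccess, read from the directory above the document root ...
<Directory "/var/www/example">
    AllowOverride All
</Directory>
# ... and none anywhere below it
<Directory "/var/www/example/htdocs">
    AllowOverride None
</Directory>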
Note that you obviously need access to the server config (or VirtualHost) in order to implement this, so it is hypothetical in this instance.
wouldn't that be faster than having to check for multiple .htaccess files
Possibly. But in real terms you are only talking about a micro-optimisation at best. On most sites, even enabling .htaccess files at all will hardly be noticeable, if at all. The "performance hit" you speak of is not as big as you might think. To put it another way: if you find that .htaccess is proving to be a bottleneck, then you've either done something wrong, or you have far more serious problems to address.
Note, however, that you generally only use .htaccess files on smaller sites anyway. On larger / high-traffic sites you will have your own VPS / server and access to the server config, so there wouldn't be any need to use .htaccess (or, importantly, to have it enabled).
whilst still maintaining the convenience?
Not exactly. Part of the "convenience" is being able to put the .htaccess file in any directory you like, overriding parent directives and have it apply to just that directory tree. (It is the userland equivalent of the <Directory> container in the server config.)
I'm building an MP3 store with Drupal and Ubercart. I would like to implement the best security measures to protect the content from hackers etc. I have a file directory with a .htaccess file.
Contents of the .htaccess file
SetHandler Drupal_Security_Do_Not_Remove_See_SA_2006_006
Deny from all
Options None
Options +FollowSymLinks
Is this enough or should the mp3 files be stored outside of the webroot?
Does VPS Hosting provide better security than shared hosting?
It appears that you have set the file system to Private, so files will be transferred via Drupal.
From my experience, it works and it's almost secure, unless:
A third party can access your server via FTP or a higher protocol.
A user can gain access to execute PHP.
Make sure that, if you have IMCE or another file browser module enabled, these secured folders are not accessible through it.
Whatever plan you have, the hosting company has access to your files. But usually a correctly configured VPS can be more secure than a shared host, because you can use private temporary folders and you have more control over who can access your server, including banning the bad guys.
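For context, a "Private" file system like this serves every download through a script instead of letting Apache serve the files directly. A minimal, framework-free sketch of that pattern in PHP (the script name, paths and the omitted permission check are hypothetical; this is not Drupal's actual implementation):
<?php
// serve.php - stream a protected file stored outside the webroot
$base = '/var/www/example/private/';      // storage directory, not web-accessible
$name = basename($_GET['file'] ?? '');    // basename() strips any path components
$path = $base . $name;

// a real application would also check the user's session/permissions here
if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit;
}

header('Content-Type: audio/mpeg');
header('Content-Length: ' . filesize($path));
readfile($path);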
I'm using this for one of my applications:
Options +FollowSymLinks -SymLinksIfOwnerMatch
And I worry about the security problems this may bring. Any idea what measures I can take to make this approach as secure as possible?
There's nothing specific you can do to make using those options as secure as possible. The risk in using them is that a user, or a process running under a user, can disclose information or even hijack content by creating symlinks. For example, if an unprivileged user (who may have been compromised) wants to read a file that they normally can't, they can escalate by creating a symlink from their public_html directory to it; if Apache can read the target, they can then simply open their own webpage and read the file. There's nothing specific you can do to prevent something like that from happening, except making sure your system is properly patched and configured.
Note that this threat isn't just from users on your system. If you are running a web app in, say, PHP, and it got compromised somehow, an attacker can upload a PHP file browser and create symlinks to content outside of your document root (like /etc/passwd or some other file you don't want exposed to the web).
If you're worried about stuff like that, it's better not to use these options.
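If you do need symlinks somewhere, the conventional compromise is the inverse of the options in the question: follow a link only when it has the same owner as its target. This costs an ownership check per request and is still subject to a race condition (Apache's documentation notes that symlink testing is circumventable for that reason), but it raises the bar considerably:
# follow symlinks only when link and target share an owner
Options -FollowSymLinks +SymLinksIfOwnerMatch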
I have a directory on my website specifically for JavaScript files. I want these JavaScript files to be hidden, so that if I type the URL to one of them it says Forbidden or disallows access, but my front-end website files can still load and execute them when needed. Is there a way to do this through an FTP client?
Cheers,
Dan
You can't do this through an FTP client. It is the task of your webserver to forbid access to certain files.
If you change the file permissions, the webserver won't have access to the files anymore, so this is not the way to go.
You must configure your webserver to restrict the access. If you're using Apache, you can use an .htaccess file. There are different ways of doing this, and much depends on the way the webserver is configured.
The easiest is to put an .htaccess file in your scripts folder which contains only this one line:
Deny from all
However, like Peeter said, there's a good chance this will break your site, since the browser must access these files, so you can't restrict access.
Put a .htaccess file in your scripts folder containing Deny from all, but this will also stop your pages from accessing the scripts (though not if you pass them through the PHP engine first).
You're trying to hide JavaScript files that are executed on the client side. If a client (browser) cannot access the files, none of your JavaScript code is executed.
If I understood your question correctly then you cannot achieve what you're trying to achieve.