How to prevent server files from being viewable online? - apache

I have a website running on an Amazon EC2 Linux server, and everything works fine, but when I point the address bar to something like mydomain.com/css or mydomain.com/images, it prints out a list of all the files in that directory to the browser, and they're all readable and viewable. I tried chmod'ing some of the folders to have fewer permissions, and that prevented viewing of these files, but it also made them not appear on the site at all. Is there a way I can protect my documents and server files while also keeping full functionality?

You can prevent the directory listing by disabling it in the Apache config. Just remove "Indexes" from any line where it appears. For example, change from:
Options Indexes FollowSymLinks
To:
Options FollowSymLinks
Edit: Note that you can also add (or edit) a .htaccess file in those directories, explicitly disabling indexing for that directory:
Options -Indexes
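If you'd rather not touch the global Options line, a minimal sketch (the paths below are placeholders for wherever your EC2 document root actually lives) is to turn listings off only for the directories in question in the Apache config:
<Directory /var/www/html/css>
Options -Indexes
</Directory>
<Directory /var/www/html/images>
Options -Indexes
</Directory>
With listings off, a request for the directory itself returns 403 Forbidden, while the individual CSS and image files keep being served to the pages that reference them.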

That's the nature of the web: these files are downloaded to the user's computer so the browser can display them. You cannot stop them from being requested through your own site/URL, but you can put rules in place that prevent "hotlinking," that is, stop other people from embedding your images in their websites directly from your URL. Even then, they could download the file(s), upload them to their own space, and carry on.
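For the hotlinking part, a rough .htaccess sketch (assuming mod_rewrite is enabled; replace mydomain.com with your own domain) could look like this:
# allow empty referers (direct visits), block image requests referred by other sites
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?mydomain\.com/ [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F,NC]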
Sorry I don't have better news, hope this helps!

Related

How to disable the download of files in an Apache2 webserver?

I took over a website which I'm supposed to admin, and somebody brought to my attention that certain indexes and files are available that shouldn't be. I will be using dummy names.
You were able to access example.com/intern before, but I changed a line in /etc/apache2/apache2.conf according to this answer: https://stackoverflow.com/a/31445273 . This worked partly, as I now get a 403 Forbidden when I navigate to example.com/intern, and that's basically what I want.
However, the intern directory contains a file called file.php.bak as well as file.php. When I navigate to example.com/intern/file.php I get a blank white page. I'm not sure whether file.php can be accessed some other way, because the page does load and I don't get a 403 like before. What is far worse, and the reason I'm struggling with this: if I go to example.com/intern/file.php.bak, my browser (Firefox) offers to download file.php.bak, which I can then read in plaintext. I want none of the files in intern to be accessible via the website, but I have no idea how to do this. Can anybody help?
Things I've tried:
1) Removing Indexes from the apache2.conf file as mentioned above. It only puts the 403 on the directory itself, not recursively on all the files in it.
2) Writing a .htaccess file as described here: https://fedingo.com/how-to-prevent-direct-file-download-in-apache-server/ and putting it in intern, with the same result as in 1).
3) Putting an empty index.html file in the intern directory. This removes the 403 at example.com/intern, but the download at example.com/intern/file.php.bak is still possible. I've also tried index.php, with the same result.
File System:
The application runs from /var/www/application, which also contains the /var/www/application/index.php I want to use. The /var/www/application/intern directory is there as well. While it isn't browsable anymore, the files in it are still accessible. /var/www/application/intern/file.php can be reached via example.com/intern/file.php, but it seems it can't be downloaded or read, as it just results in a white page. /var/www/application/intern/file.php.bak, however, can be downloaded via example.com/intern/file.php.bak.
Let's say the Apache document root is set to DocumentRoot "/folder_one/folder_two".
Placing files in folder_one will prevent people from browsing your Apache server and requesting those files directly.
Place an index file in folder_two and use some code, such as PHP, to tell Apache to include whatever files you want from folder_one.
In this manner Apache will still be able to serve whatever files you want from folder_one, and people will not be able to request the files directly, as they are located in a directory above the Apache document root.
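A rough vhost sketch of that layout (the folder names are just the placeholders from above):
DocumentRoot "/folder_one/folder_two"
<Directory "/folder_one/folder_two">
Options -Indexes FollowSymLinks
AllowOverride None
Order allow,deny
Allow from all
</Directory>
Only /folder_one/folder_two is reachable over HTTP; anything kept directly in /folder_one (such as a .bak file) has no URL at all, yet a script inside folder_two can still read it from disk and hand the contents to the client.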

Is there any security differences between using .htaccess and moving all files outside web root directory?

I am trying to decide whether to use .htaccess files in each sub-directory to deny all requests for specific files (while also denying directory indexes), or whether it is more security conscious to move all files except for essential files (index.php, .htaccess, robots.txt) outside the root directory and call them from the index file.
Are there any critical differences in security between these two methods for securing files in my web application?
Here is a view of what the .htaccess looks like in the root directory.
# pass the default character set
AddDefaultCharset utf-8
# disable the server signature
ServerSignature Off
<FilesMatch "\.(htaccess|htpasswd|ini|phps|fla|psd|log|sh|lock|DS_Store|json)$">
order allow,deny
deny from all
</FilesMatch>
# disable directory browsing
Options All -Indexes
# prevent display of select file types
IndexIgnore *.wmv *.mp4 *.avi *.etc
However, this would not stop someone from accessing a file if they knew the directory structure such as https://www.example.com/security/please_dont_look.cfg
Although that file does not print anything, I don't want anyone to know it exists, and don't want a site-specific solution like using modredirect to redirect calls to specific files.
I could use a .htaccess file in each directory such as this:
order deny,allow
deny from all
From this question and reply (Prevent access to files in a certain folder)
Is one solution more bullet-proof than the other?
As always in such complex systems, security here is about having several lines of defense, keeping things simple and attempting to prevent as many attack vectors as possible.
Theoretically both solutions should provide you with the exact same level of security - the files would not be accessible in either case.
I'd recommend moving files that should not be accessed directly into a directory outside of the web root. It is quite easy to screw up .htaccess files, and that's just not possible when you move the files outside of your web root. This will also prevent timing attacks against the directory structure of your server: reading .htaccess files comes with a time penalty, and that might be measurable, especially if your .htaccess files get big and you have plenty of them, one per sub-directory. Actually, I'd recommend skipping .htaccess entirely and just disabling indexes directly in your vhost configuration, so that Apache does not have to look for .htaccess files at all, which also speeds up your website.
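A minimal sketch of that vhost configuration (the directory path is a placeholder for your actual web root):
<Directory /var/www/example>
# no directory listings, and don't even look for .htaccess files
Options -Indexes
AllowOverride None
</Directory>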
Additionally, in case you run PHP via FastCGI, you should disallow file access at the file-system level for Apache and only allow access from PHP. With this setup it should be outright impossible to access your files by attacking the web server (excluding PHP) unless you have a privilege-escalation vulnerability (in which case you are screwed anyway).
The only way to access your confidential files in this setup would be to convince PHP to read the file or to mess with the file system, e.g. by creating a hard link from your web root into your "confidential files outside web root" directory. Protecting against that boils down to ensuring your PHP configuration is as restrictive as possible, that file creation inside the web root is disallowed and, most importantly, that the PHP application itself is not vulnerable.

Can a .htaccess file malfunction?

In some CMS backup systems I see they use the following .htaccess code in the backup folder:
order deny,allow
deny from all
allow from none
I was wondering whether this is fail-safe, or whether it is possible to disrupt the server so badly that the .htaccess file ends up being ignored.
The reason I ask is that, if that is possible, I would rather put the files outside the httpdocs folder so they are not accessible from a browser. That, though, requires quite a bit of extra work to show or use those files when I want to.
Anyone have an idea or tips?

Can someone look into a web server's folders?

More a web server security question.
Is it possible for someone to "probe" into a folder on a web server, even if an index file is in place?
I assume they can't, but if I want to store .pdf applications under random names (93fe3509edif094.pdf), I want to make sure there's no way to list all the PDFs in the folder.
Thank you.
Just disable the directory listing in your web server
Generally, no. Instead of creating an "index" file, you can also unset Apache's "Options Indexes" setting.
Generally speaking, no. Especially if you explicitly turn off the directory listing for that specific directory.
<Directory /path/to/directory>
Options -Indexes
</Directory>
Source: http://httpd.apache.org/docs/1.3/misc/FAQ.html
However, you should be securing files through some sort of authentication process rather than relying on file names alone. What you propose can be defeated by simply brute-forcing the file name. Also, people can share URLs, others can sniff traffic and find the URL, and so on. Use a better method.
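For example, a minimal HTTP Basic Auth sketch for the directory holding the PDFs (the realm name and password-file path are placeholders, the password file is created with the htpasswd utility, and AllowOverride must permit AuthConfig if you put this in a .htaccess file):
AuthType Basic
AuthName "Restricted downloads"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user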
Web servers have a setting that controls whether or not the directory listing can be browsed. Apache's is called Options Indexes:
Indexes
If a URL which maps to a directory is requested, and there is no DirectoryIndex (e.g., index.html) in that directory, then the server will return a formatted listing of the directory.
However, if anyone knows the URL in advance, or can easily guess the filename, they can still load the pdf.
Depends on the server. The server always decides what the client may and may not see. In your case (Apache), see Mitro's answer.

How Come Everybody Can See My Private Files?

Sorry for the newbie question...
When I go to http://www.plans4boats.com/scripts/youtubeplayer/ in Google Chrome, I can see a full listing of the files there. What should I do if I don't want any old hacker to just come in and view/copy my source code? Does it have something to do with .htaccess?
I discovered that putting a blank index.html file in the folder helps for THAT folder, but it still leaves all subfolders vulnerable.
What should I google for more information on how to set up my server to prevent this?
Just set Options -Indexes for those particular directories, either in a .htaccess file or in a <Directory> or <Location> container.
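Applied to the folder from the question, that could be as small as this (the filesystem path is a guess at where /scripts/youtubeplayer lives under your document root):
<Directory /var/www/plans4boats/scripts/youtubeplayer>
Options -Indexes
</Directory>
Because settings in a <Directory> block also apply to its subdirectories, this covers the subfolders that a blank index.html would otherwise leave exposed.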
What you need to do is turn off directory listing for your specific server. I don't know what server you're using, so I can't walk you through it, but just google your server name and how to disable directory listing.
I created a file called .htaccess and put the following contents:
IndexIgnore /