How to restrict file access on an Apache server? - apache

If I allow users to upload files, an uploaded file ends up at
www.someplace.com/public_file/... ....
so everybody can access it. I would like to put some restrictions on that. For example, I want to reduce the download speed for non-logged-in users. How can I do that? Also, I want to stop users from getting files they don't have the rights for... ... For example, if a user uploads to
www.someplace.com/secret_place/... ...
only users with certain rights should be able to access this place... ... How can I do that? Should I do this in the web application, in the Apache server config, or both? Thank you.

For users you can split this between Apache and your application; as far as I know, most servers support using a database for authentication. Apache certainly supports many methods of authentication; you should find some useful info here:
http://httpd.apache.org/docs/current/howto/auth.html
One thing to note is that if you were to do this exclusively in the application, it would be easy to bypass. You can restrict download speeds for non-logged-in users with traffic shaping; a sketch of both pieces follows.
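As a minimal sketch, assuming Apache 2.4 with mod_ratelimit and file-based Basic auth enabled (the paths, the realm name, and the 100 KiB/s figure are illustrative, not from the question):

# Throttle downloads from the public upload area (mod_ratelimit, rate in KiB/s)
<Location "/public_file">
    SetOutputFilter RATE_LIMIT
    SetEnv rate-limit 100
</Location>

# Require a valid login for the restricted area
<Directory "/var/www/html/secret_place">
    AuthType Basic
    AuthName "Restricted uploads"
    AuthUserFile /etc/apache2/.htpasswd
    Require valid-user
</Directory>

Note that mod_ratelimit throttles every request matching the <Location>, logged in or not; throttling only anonymous users generally means serving the files through the application, where you can check the session.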

Let's consider that we want to deny access to all files with the extension .inc. To achieve this we add the following configuration lines in the appropriate context (either the global config, a vhost/directory block, or an .htaccess file):
<FilesMatch "\.inc$">
    Order allow,deny
    Deny from all
</FilesMatch>
Without the <FilesMatch> container the Deny would apply to everything in that context, not just .inc files. In the same way we can deny access to whatever files we need.
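On Apache 2.4 and later the Order/Deny directives above only work when mod_access_compat is loaded; the native 2.4 equivalent is (same hypothetical .inc pattern):

<FilesMatch "\.inc$">
    Require all denied
</FilesMatch>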

Related

Make server directory inaccessible to all external connections

I'm creating a very simple login system where my PHP code stores and reads file names (account names) and file contents (passwords), but anyone could simply navigate to that folder and read everything. What I want is for a folder to be "locked" against any outside connections while PHP can still read from and write to it. I use XAMPP to host, if that has any relevance.
Is this possible? If not, are there other simple ways of storing and reading account names and passwords?
Found my solution, by putting
order deny,allow
deny from all
inside an .htaccess file in the folder I want private.
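A safer variant of the same idea, as a minimal PHP sketch: keep the credential files in the .htaccess-protected folder, but store password hashes instead of plain text (the accounts/ folder and the function names are illustrative, not from the question):

<?php
// Hypothetical folder protected by the "deny from all" .htaccess above.
$dir = __DIR__ . '/accounts';

// Register: store a salted hash, never the plain-text password.
function register(string $dir, string $user, string $pass): void {
    // basename() strips any path parts a user might sneak into the name.
    file_put_contents($dir . '/' . basename($user), password_hash($pass, PASSWORD_DEFAULT));
}

// Login: compare the submitted password against the stored hash.
function login(string $dir, string $user, string $pass): bool {
    $file = $dir . '/' . basename($user);
    return is_file($file) && password_verify($pass, file_get_contents($file));
}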

How to limit access to an FTP folder?

I'm using a shared web server on which I can manage FTP accounts. I'm wondering how to set access restrictions on a folder.
Say I have a file in:
www.somepage.com/ftp/import/
which is named someData.txt
Why is it that I can access this file by hitting:
www.somepage.com/ftp/import/someData.txt
If the file is just lying there, why make an FTP user with a login/password to access it? If I check the file's permissions, public has read/write/execute/sticky, so I don't understand why I can just pull the file by hitting its URL.
Try using chmod to change the permissions on your files/folders on the server through the command line/terminal. It seems like your permissions are currently set to 777, which allows anyone to do anything:
7 - owner can read/write/execute
7 - group can read/write/execute
7 - world can read/write/execute
You would probably want something like 770, which prevents anyone except the owner and the group from doing anything to the files and folders. These permissions are important for security; you could be the victim of an attack if they aren't set properly.
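For example, over SSH on the server (the path is hypothetical; adjust it to wherever your FTP folder actually lives):

chmod -R 770 /home/youruser/ftp/import

This only locks the web server out if it runs as neither the owner nor a member of the file's group; on shared hosting the more reliable fix is usually to move private files outside the web root entirely.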

.htaccess to restrict access from all except one specific url

I'm using a membership script to allow access to private files by a group of people. The problem is that anyone who knows the link to the files can easily bypass the membership script and access them directly. My goal is to prevent direct URL access through http://domain.com/folder1/folder2/index.php and to only accept access from http://domain.com/folder1/file_with_link_to_index.php. The membership script is useless unless I can prevent people typing the URL of the file locations into the browser.
I'm having a hard time even beginning to understand Apache, but what I was attempting to do was to use "Deny All" and then "Allow from http://domain.com/folder1/file_with_link_to_index.php" so that the files could only be accessed through the "file_with_link~" page. It didn't work; I was still able to paste the URL to the files into the browser and access the directory.
I found similar questions on Stack Overflow, but I don't have enough experience to understand them and take the little pieces that were similar to my problem and actually use them. It's probably something simple, but I'm so frustrated with trying to figure this out that I can't see it.
Here's a quick example of what I was trying to use:
# deny all except those indicated here
<Limit GET POST PUT>
order deny,allow
deny from all
allow from 12.345.67.890
allow from http://domain.com/fold1/LINKING_FILE/program_index.php
</Limit>
I'm pretty sure I'm failing on the way the domain is supposed to be written and maybe that's why it's not working?
I would use another approach than .htaccess: put the files in a directory unreachable from the public web, then use your membership PHP script to retrieve them.
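A minimal sketch of that approach, assuming your membership script sets $_SESSION['member'] on login (the storage path, the session key, and the file name download.php are illustrative):

<?php
// download.php - hands files stored outside the web root to logged-in members only.
session_start();
if (empty($_SESSION['member'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Members only');
}

// basename() strips directory parts, preventing path traversal like ../../etc/passwd.
$name = basename(isset($_GET['file']) ? $_GET['file'] : '');
$path = '/var/private_files/' . $name;
if ($name === '' || !is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit();
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $name . '"');
readfile($path);

Because the files live outside the document root, no direct URL reaches them; every download has to pass the session check.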
jTheMan was right: there was a better way of tackling this using PHP. My fix for my problem is below, just in case it's useful to anyone else. The only awesome addition would be to make the code below allow direct URL access from just one IP, like my own :0)
I added the following to the "head" or "index", i.e. to the very top of the page that is called first in each application:
<?php
// Begin: refuse direct URL connections.
// Note: HTTP_REFERER is supplied by the client, so it can be empty or spoofed;
// this deters casual direct access but is not real access control.
$referer = isset($_SERVER['HTTP_REFERER']) ? parse_url($_SERVER['HTTP_REFERER']) : false;
if (!is_array($referer) || !isset($referer['host']) || $referer['host'] !== $_SERVER['HTTP_HOST']) {
    header('Location: http://www.yourdomain.com');
    exit();
}
// End: refuse direct URL connections.
Thank you!

Security problems regarding +FollowSymLinks and -SymLinksIfOwnerMatch?

I'm using this for one of my applications:
Options +FollowSymLinks -SymLinksIfOwnerMatch
And I worry about the security problems this may bring. Any idea what measures I can take to make this approach as secure as possible?
There's nothing specific you can do to make using those options secure. The risk is that a user, or a process running as that user, can disclose information or even hijack content by creating symlinks. For example, if an unprivileged user (who may have been compromised) wants to read a file they normally can't, they can escalate by creating a symlink to it from their public_html directory; if Apache can read the target, they can then just load their own web page and read the file. There's nothing specific you can do to prevent that except to make sure your system is properly patched and configured.
Note that this threat isn't just from users on your system. If you are running a web app in, say, PHP, and it gets compromised somehow, an attacker can upload a PHP file browser and create symlinks to content outside of your document root (like /etc/passwd or some other file you don't want exposed to the web).
If you're worried about stuff like that, it's better not to use these options.
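For reference, the more conservative arrangement inverts both options, so Apache follows a symlink only when the link and its target have the same owner (a sketch; whether this is workable depends on whether your application relies on symlinks at all):

Options -FollowSymLinks +SymLinksIfOwnerMatch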

How do I hide my Scripts folder?

I have a directory on my website specifically for JavaScript files. I want these JavaScript files to be hidden, so that if I type the URL to one it says Forbidden or disallows access, but my front-end website files can still access and execute them when needed. Is there a way to do this through an FTP client?
Cheers,
Dan
You can't do this through an FTP client. It is the task of your web server to forbid access to certain files.
If you change the permissions, the web server won't have access to the files anymore either, so that is not the way to go.
You must configure your web server to restrict the access. If you're using Apache, you can use an .htaccess file. There are different ways of doing this, and much depends on how the web server is configured.
The easiest is to put an .htaccess file in your Scripts folder which contains only this one line:
deny from all
However, like peeter said, there's a good chance this will break your site, since the browser must be able to fetch these files, so you can't restrict access to them.
Put an .htaccess file in your Scripts folder containing deny from all, but this will stop your pages from accessing the scripts too (though not if you pass them through the PHP engine first).
You're trying to hide JavaScript files that are executed on the client side. If a client (browser) cannot access the files, none of your JavaScript code is executed.
If I understood your question correctly, you cannot achieve what you're trying to achieve.