Should I protect server scripts/files from users? Is there a security risk if users can see server files, and how do I do it? - apache

I know .htaccess is not the best way to do this, but I don't have access to other server settings.
- index.php (root)
-- scripts (folder)
--- someScript.php (one of the scripts)
So let's say I have an index.php file that lives at the root of the server. That file makes AJAX requests to a script in a folder called scripts. If a user types domain.com/scripts/ into the address bar, he now has access to that folder (I don't know whether the scripts can be downloaded from there or not).
I know I can use Options -Indexes, but this still allows users to go directly to a script if they know its name, which is not hard to find or even guess.
The second option I know about is
<Files ~ "\.txt$">
Order allow,deny
Deny from all
</Files>
But this stops everything from accessing the files, even the AJAX requests.
So, my question is, should I protect these files somehow? Can the user see their content or download them? Are there security risks?

should I protect these files somehow?
Well, you can't really, not if they are to be requested by the client (browser AJAX request).
It's usual to send a custom HTTP request header when calling a script via AJAX (client-side), so the script knows how to respond to such requests and return the appropriate response. Whilst this provides no "security", it does prevent casual requests to that script from doing anything.
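For example, a common convention (used by jQuery and similar libraries) is to send an X-Requested-With: XMLHttpRequest request header, which the PHP script can check before doing anything. This is only a sketch and, as noted, provides no real security, since any client can send that header:
<?php
// someScript.php - only respond to requests that look like AJAX calls.
// Note: X-Requested-With is a convention (jQuery etc.), not security; any client can fake it.
$isAjax = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest';

if (!$isAjax) {
    http_response_code(403);   // casual/direct requests get nothing useful
    exit;
}

// ...normal AJAX handling continues here...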
Can the user see their content or download them, are there security risks ?
The example you gave is of a PHP script. Any direct request will only see its output, not its source code (provided PHP is actually being executed on the server).
The only security risks are what you make. If an arbitrary request to that script returns a list of all active users and personal information then yes, that's obviously a security risk. But if the response is empty and no harmful event happens as a result of calling that script then it's a non-issue.

Related

Only allow access to files in directory from the website they are a part of

I know there are a lot of similar questions out there, and I've trawled them all, but I can't seem to get any of the solutions to work.
I have a folder on the root of my website containing uploaded files that can be viewed and downloaded from the site when a user is logged in. They are here: https://example.com/uploads (for example). I need the site to continue to be able to access them, to display them (some are images) and provide links for download (PDFs etc.) so the user can download them, but I want to prevent anyone who gets hold of the URL of a particular file from downloading it directly, like this: https://example.com/uploads/2020/02/myfile.pdf, or these URLs getting into search engines (or, if they do, to have the server prevent them from being accessed directly).
I've tried adding an .htaccess file in the uploads directory with the following content:
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
And I've tried
Order Allow,Deny
Deny from all
Allow from 127.0.0.1
...as I read that might allow HTTPS calls from the site itself as well as local URLs.
But it forbids both the site and direct URL requests, which is no good.
Is there a way to do this?
The user interface that provides the ‘official’ access to the files has user authentication, yes, but the files still exist in a directory that won’t stop anyone getting to them if they know the URL.
You need to protect the files using the same authentication system that you are using to protect access to the user interface. The only way you could protect these resources by IP address (the client IP address) - as you are currently attempting in .htaccess - is if the client's IP is fixed and known in advance (but if this was the case then you wouldn't need another form of authentication to begin with).
So, this will primarily be an exercise in whatever scripting language/CMS is being used to authenticate the "user interface".
What you can use .htaccess for is to rewrite requests for these files to your server-side script that handles the authentication and then serves the file to the client once authenticated.
For example:
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} -f
RewriteRule ^uploads/. /serve-protected-file.php [L]
Any request for /uploads/<something> (e.g. /uploads/2020/02/myfile.pdf) that maps to a valid file is routed to your script: /serve-protected-file.php.
/serve-protected-file.php would then need to do something like the following:
// 1. Parse the file being requested from REQUEST_URI
// 2. Is the requested file "protected"?
// (Serving both protected and public files from the same directory?)
// 3. If not protected then serve/stream the resource to the client. END
// 4. If protected then authenticate the user...
// 5. If user authenticated then serve/stream the resource to the client. END
// 6. Resource is protected and user not authenticated...
// Serve a 403 Forbidden. END
(Ideally, the location of these "protected" resources would be entirely outside of the document root - so they are "private" by default - and the URL the user uses to access these resources is entirely virtual - then you probably wouldn't need any additional coding in .htaccess and everything would be implemented by your front-controller - but that all depends on how your site is implemented and the way in which URLs are routed.)
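As a rough illustration only, here is what serve-protected-file.php could look like, assuming a session-based login (the $_SESSION['user_id'] key, the /uploads location under the document root and the "everything in /uploads is protected" rule are all assumptions to be adapted):
<?php
// serve-protected-file.php - sketch only; plug in your real authentication.
session_start();

// 1. Parse the file being requested from REQUEST_URI (strip any query string).
$uri  = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);   // e.g. /uploads/2020/02/myfile.pdf
$base = realpath($_SERVER['DOCUMENT_ROOT'] . '/uploads');
$file = realpath($_SERVER['DOCUMENT_ROOT'] . $uri);

// Reject anything missing or outside /uploads (guards against path traversal).
if ($file === false || strpos($file, $base) !== 0 || !is_file($file)) {
    http_response_code(404);
    exit;
}

// 2./4. Here everything under /uploads is treated as protected; authenticate the user.
if (empty($_SESSION['user_id'])) {    // assumption: your login code sets this
    http_response_code(403);
    exit;
}

// 5. Authenticated - serve/stream the resource to the client.
header('Content-Type: ' . (mime_content_type($file) ?: 'application/octet-stream'));  // needs the fileinfo extension
header('Content-Length: ' . filesize($file));
readfile($file);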

Prevent direct file access

I have several audio files that I don't want anyone else to gain access to. Each file is in a separate folder inside a main folder, that I'll call "download" for now. So "download" has several other directories, and inside each directory are audio files. Those audio files are played within a web app on the system.
The problem is that right now anyone can type in the full address of the file, localhost/download/dir/sound.wav, and play the audio file. This is what I want to prevent: I want those files to stream only when they are accessed or streamed from our application.
I tried the following in the .htaccess file:
deny from all
This just returned a 403 Forbidden page, but I was then unable to stream the file from within the application. I also tried:
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)localhost.com/.*$ [NC]
RewriteRule \.(mp3|wav)$ - [F]
This just disabled the stream altogether; it did not return a 403 or anything, it simply did not stream from either the application or via direct access.
Finally, I'm using AJAX to call the script that holds the files to be streamed; are there any options I can use?
It is impossible to prevent the user from accessing those files
In order to hear them they have to be downloaded to the user's computer and that means that they have to be accessible!
The best you can do is encrypt the files and decrypt them in the player. But even then the player could be reverse-engineered and someone could discover the encryption key and algorithm. In the end you are going to find that you have just wasted a whole lot of processing time and in fact slowed down your application!
There is just one problem: how is the server supposed to detect who has requested your media - your application, or some other system just using a similar protocol?
But if you just want to prevent the simplest HTTP requests to your media, you could introduce a token exchange system: e.g. your application sends a request for media in a certain format, the server sends back a token for accessing a certain file, and then your application accesses a special (say, PHP) script, supplying it with the token; that script returns your sound stream. This way the media can be forbidden to the outside world and will only ever be accessed through your own server-side PHP script.
Then, in order to gain access to a media file, a user would need to know a valid token or your exchange protocol, which eliminates random users accessing your media at will. However, as has already been said, there is probably no way to protect against 'educated' users.
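A very rough sketch of the token half of that idea, using PHP sessions (the file name get-token.php and the media_token key are invented for illustration; a real implementation might store short-lived tokens in a database instead):
<?php
// get-token.php - the application calls this first, then passes the token
// back when it requests the actual media (sketch only).
session_start();

$token = bin2hex(random_bytes(16));     // unguessable per-session token
$_SESSION['media_token'] = $token;

header('Content-Type: application/json');
echo json_encode(['token' => $token]);
The media-serving script would then compare the token supplied with the request against $_SESSION['media_token'] (e.g. with hash_equals()) before streaming anything, along the lines of the sound.php sketch below.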
One possibility would be to:
Add an Apache rewrite directive on that download folder to route all requests to a PHP script instead, which takes the requested file as a parameter.
Create this script (say, sound.php) in your application, taking that file path as a GET parameter. The script can output the correct HTTP headers to indicate that the data type is WAV or whatever you want, then check some cookies or a token or similar, and output the content of the restricted file directly (see readfile) only if the user is valid - a rough sketch follows.
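A hedged sketch of those two steps (the rewrite rule, the $_SESSION['user_id'] check and the directory layout are assumptions; substitute whatever cookie or token validation your application actually uses):
<?php
// sound.php - streams audio from the download folder only to authorised requests.
// Assumed .htaccess inside /download:
//   RewriteEngine On
//   RewriteRule ^(.+\.(mp3|wav))$ /sound.php?file=$1 [L,QSA]
session_start();

// Placeholder check - replace with your real cookie/token/session validation.
if (empty($_SESSION['user_id'])) {
    http_response_code(403);
    exit;
}

// Resolve the requested file and make sure it stays inside /download.
$base = realpath(__DIR__ . '/download');
$file = realpath($base . '/' . ($_GET['file'] ?? ''));
if ($file === false || strpos($file, $base) !== 0 || !is_file($file)) {
    http_response_code(404);
    exit;
}

// Send the appropriate headers and stream the audio.
$types = ['wav' => 'audio/wav', 'mp3' => 'audio/mpeg'];
$ext   = strtolower(pathinfo($file, PATHINFO_EXTENSION));
header('Content-Type: ' . ($types[$ext] ?? 'application/octet-stream'));
header('Content-Length: ' . filesize($file));
readfile($file);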

how to solve anti-leech in a better way?

As I came across the hot-leeching problem, I searched the web and found two ways to solve it.
The first is an easier and simpler way, with the code shown below:
RewriteEngine On
Options +FollowSymlinks
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?mydomain.com(/)?.*$ [NC]
RewriteRule .*\.(gif|jpg|jpeg|png|swf)$ [mydomain.com...] [R,NC]
This can only prevent some simple leeching, but can do nothing against a determined person.
The other way is a better, script-and-cookies-based approach. They said "You set a cookie on an 'authorized' page of your site, and then use a script to serve images only if the correct cookie is present in the image request. Images are kept in a directory accessible only to the script, and not via the Web. So, the script acts as an 'image server' on your site." I understand this principle but have no idea how to realize it. Does anyone know how to implement this?
Any help appreciated.
I can't really give any implementation, but only some idea of how it can be achieved:
You will need a "portal" page, where you set the cookie for the user. Any request for resources that does not carry your site's cookie should be redirected here. There may or may not be a login mechanism here, depending on the purpose of your site, but usually you will set the cookie after the user has logged in.
All resource links will point to the same "script" page. The difference is that each resource will have a different identifier (it can be some sort of id, if you maintain a database of id-to-file-path mappings). The identifier must be included in the query string of the URL. The "script" will find the resource on the server based on the identifier (in the case of id-to-file mapping, you will obtain the file path and go retrieve the file).
There will be a "script" page, which can be PHP code, for example. It will check for the cookie, then check the identifier, then load the resource accordingly. You may also want to check the Referer header to restrict access a bit more (without that check, hotlinking will still work for any logged-in user).
In this implementation, sharing a hotlink to a resource will not work for any user who hasn't visited the "portal" page (or hasn't logged in, depending on your website). It will also not work even for a logged-in user if they click the link from somewhere else.
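A bare-bones sketch of that "script" page (the cookie name, image.php and the id-to-path array are invented for illustration; in practice the mapping would live in a database and the cookie would be set by your portal/login page):
<?php
// image.php?id=42 - serves an image only if the visitor carries the site cookie.
// The cookie is assumed to be set on the "portal" page, e.g. setcookie('site_visitor', $value, 0, '/');

// 1. Must have visited the portal page (or be logged in).
if (empty($_COOKIE['site_visitor'])) {
    http_response_code(403);
    exit;
}

// 2. Optional, weaker check: require an empty Referer or one from your own domain.
$referer = $_SERVER['HTTP_REFERER'] ?? '';
if ($referer !== '' && parse_url($referer, PHP_URL_HOST) !== 'www.mydomain.com') {
    http_response_code(403);
    exit;
}

// 3. Map the public id to a file kept outside the web root.
$images = [                              // stand-in for a database table
    '42' => '/var/www/private-images/photo1.jpg',
];
$file = $images[$_GET['id'] ?? ''] ?? null;
if ($file === null || !is_file($file)) {
    http_response_code(404);
    exit;
}

header('Content-Type: image/jpeg');      // or store/look up the type per image
header('Content-Length: ' . filesize($file));
readfile($file);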
However, scraping your website for resources is simple with both implementations mentioned in your question, since a scraper can freely adjust its HTTP headers.

non-browsable URLs?

How can one make URLs on their site non-browsable?
Example:
http://mydomain.com/files/file1.txt
If a user hits it directly, don't allow it.
If I call it inside an href on MY site then it would work.
Would one use a URL rewrite to accomplish this?
Or how else?
Apache, CentOS 5.5
You can check the Referer header.
Note that not all browsers send Referer headers, so you'll be completely locking out some users.
Also note that the Referer header is trivially spoofable.
Alternatively, and more securely, you can protect the files with a server-side script.
Change your links to point to a server-side script and include a randomly-generated one-time passcode in the querystring.
The server-side script should verify the one-time passcode (use a database), then send the file to the client.
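A rough sketch of the one-time passcode check, with an SQLite table standing in for "use a database" (download.php, the table layout and the file paths are assumptions; this also presumes your page-rendering code inserts a row with a random code for each link it prints):
<?php
// download.php?code=... - one-time passcode check (sketch only).
// Assumed table, populated when links are generated:
//   CREATE TABLE passcodes (code TEXT PRIMARY KEY, file TEXT);
//   e.g. $code = bin2hex(random_bytes(16)); INSERT INTO passcodes VALUES ($code, '/var/www/private/file1.txt');
$db = new PDO('sqlite:' . __DIR__ . '/passcodes.sqlite');

// Look the code up; unknown (or already used) codes are rejected.
$stmt = $db->prepare('SELECT file FROM passcodes WHERE code = ?');
$stmt->execute([$_GET['code'] ?? '']);
$file = $stmt->fetchColumn();

if ($file === false || !is_file($file)) {
    http_response_code(403);
    exit;
}

// One-time: delete the code so the link cannot be replayed.
$db->prepare('DELETE FROM passcodes WHERE code = ?')->execute([$_GET['code']]);

header('Content-Type: text/plain');          // the example in the question is a .txt file
header('Content-Length: ' . filesize($file));
readfile($file);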
Depending on your application, you can also use an ordinary password-based authentication system (if you have user accounts).

Custom .htaccess password protection handler

I need to protect a site that has a ton of static .html files. The standard .htaccess scheme doesn't meet the requirements.
Is there a way to specify an .htaccess style of password protection with a custom handler? That is, I need to write the code that determines whether the user is allowed or not, but I don't want to modify a million .html files all over the place.
Thanks!
Maybe. It depends on what modules are loaded on your web server. Your options will range from keeping a simple list of users in a flat file, to keeping them in a database and customizing the queries.
http://httpd.apache.org/docs/2.2/howto/auth.html
Another option - just brainstorming here - is to use something like mod_rewrite to redirect the calls for the physical file to something like a PHP script that can manage the user/password authentication for you and, if authenticated, go out and load the HTML file that was requested. So calls to www.some.com/10203.html actually get directed to www.some.com/auth.php?10203.html, which would control access to that underlying HTML file. That would of course require mod_rewrite to be installed, which is pretty common even for shared hosting environments.
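As a sketch only, assuming mod_rewrite is available (auth.php, the page= parameter, login.php and the session check are all placeholders; the suggestion above passes the file name as a bare query string, which works the same way):
<?php
// auth.php - gatekeeper for the static .html files (sketch only).
// Assumed .htaccess at the site root:
//   RewriteEngine On
//   RewriteRule ^(.+\.html)$ /auth.php?page=$1 [L,QSA]
session_start();

// Replace this with your own login logic (flat file, database lookup, SSO, ...).
if (empty($_SESSION['authenticated'])) {
    header('Location: /login.php?return=' . urlencode($_SERVER['REQUEST_URI']));  // hypothetical login page
    exit;
}

// Serve the originally requested .html file, refusing anything outside the docroot.
$docroot = realpath($_SERVER['DOCUMENT_ROOT']);
$page    = realpath($docroot . '/' . ($_GET['page'] ?? ''));
if ($page === false || strpos($page, $docroot) !== 0
    || substr($page, -5) !== '.html' || !is_file($page)) {
    http_response_code(404);
    exit;
}

header('Content-Type: text/html; charset=utf-8');
readfile($page);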