Non-browsable URLs? - Apache

How can one make URLs on their site non-browsable?
Example:
http://mydomain.com/files/file1.txt
If a user hits it directly, don't allow it.
If I call it inside an href on MY site then it would work.
Would a URL rewrite accomplish this, or is there another way?
Apache, CentOS 5.5

You can check the Referer header.
Note that not all browsers send Referer headers, so you'll be completely locking out some users.
Also note that the Referer header is trivially spoofable.
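A minimal sketch of that check in .htaccess, assuming Apache 2.2-style access directives (as used elsewhere in this thread) and the domain from the question - adjust the pattern and file extension to your setup:

# Mark requests whose Referer points at this site, then only serve the
# files to requests that carry that mark
SetEnvIfNoCase Referer "^https?://(www\.)?mydomain\.com/" local_referer
<FilesMatch "\.txt$">
Order Deny,Allow
Deny from all
Allow from env=local_referer
</FilesMatch>

With Order Deny,Allow the Allow line wins for requests that arrived with a matching Referer, so direct hits get a 403 while links clicked on your own pages still work - subject to the missing/spoofed Referer caveats above.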
Alternatively, and more securely, you can protect the files with a server-side script.
Change your links to point to a server-side script and include a randomly-generated one-time passcode in the querystring.
The server-side script should verify the one-time passcode (use a database), then send the file to the client.
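A rough sketch of such a script in PHP, assuming a hypothetical download_tokens table with token, file and used columns, and files stored outside the document root (every name here is invented for illustration):

<?php
// download.php?token=... - verify a one-time passcode, then stream the file
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'dbuser', 'dbpass');

$token = isset($_GET['token']) ? $_GET['token'] : '';

// Look up an unused token and the file it grants access to
$stmt = $pdo->prepare('SELECT file FROM download_tokens WHERE token = ? AND used = 0');
$stmt->execute(array($token));
$file = $stmt->fetchColumn();

if ($file === false) {
    header('HTTP/1.1 403 Forbidden');
    exit('Invalid or expired link');
}

// Burn the token so the link only works once
$pdo->prepare('UPDATE download_tokens SET used = 1 WHERE token = ?')->execute(array($token));

// Send the file; keeping it outside the document root stops direct requests entirely
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($file) . '"');
readfile('/path/outside/docroot/' . basename($file));

Your pages would then link to download.php?token=<random value>, inserting that value into the table when the page is rendered.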
Depending on your application, you can also use an ordinary password-based authentication system (if you have user accounts).


Should I protect server scripts/files from users? Is there a security risk if users can see server files, and how do I prevent it?

I know .htaccess is not the best way to do this, but I don't have access to other server settings.
- index.php (root)
-- scripts (folder)
--- someScript.php (one of the scripts)
So let's say I have an index.php file that lives at the root of the server, and that file makes AJAX requests to a script in a folder called scripts. If a user types domain.com/scripts/ into the address bar, he now has access to that folder (I don't know whether the scripts can be downloaded from there or not).
I know I can use Options -Indexes, but this still allows users to go directly to a script if they know its name, which is not hard to find or even guess.
The second option I know about is
<Files ~ "\.txt$">
Order allow,deny
Deny from all
</Files>
But this stops everything from accessing the file, even the ajax requests.
So, my question is: should I protect these files somehow? Can the user see their content or download them? Are there security risks?
should I protect these files somehow?
Well, you can't really, not if they are to be requested by the client (browser AJAX request).
It's usual to send a custom HTTP request header when calling a script via AJAX (client-side), so the script knows how to respond to such requests and return the appropriate response. Whilst this provides no "security", it does prevent casual requests to that script from doing anything.
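For example, jQuery and most JavaScript libraries add an X-Requested-With: XMLHttpRequest header to their AJAX calls, so the script can simply bail out when it is missing (a sketch only - anyone can forge the header, so treat it as a courtesy filter, not security):

<?php
// someScript.php - only respond to requests that carry the AJAX marker header
$isAjax = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest';

if (!$isAjax) {
    header('HTTP/1.1 403 Forbidden');
    exit; // a casual direct request gets nothing useful
}

// ... build and return the normal AJAX response here ...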
Can the user see their content or download them? Are there security risks?
The example you gave is of a PHP script. Any direct request will only see its output, not necessarily its contents.
The only security risks are what you make. If an arbitrary request to that script returns a list of all active users and personal information then yes, that's obviously a security risk. But if the response is empty and no harmful event happens as a result of calling that script then it's a non-issue.

How can I find out who is hotlinking my content?

I have lots of videos on my website, and I am curious to know which websites are hotlinking to them.
I am using cPanel with AWStats, and I have Google Analytics too.
The server is running Apache.
You can check the Referer header.
If you want to block all requests coming from outside your domain, here is an example for an Apache server.
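(A sketch in .htaccess with mod_rewrite; example.com and the file extensions are placeholders.)

# Allow requests with an empty Referer, block non-empty Referers
# that do not come from this site
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(mp4|flv|jpe?g|png)$ - [F,NC]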
But this technique has two disadvantages:
It is very easy to send a faked Referer header.
In rare cases, some browsers may not send a Referer header at all.
The most common way to prevent content from being hotlinked is to generate dynamic temporary links with a limited lifetime.
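A stateless way to build such links is to embed an expiry time and an HMAC signature, so the serving script can validate them without a database lookup - a rough PHP sketch, with the secret, file name and parameters all invented for illustration:

<?php
$secret = 'replace-with-a-long-random-secret';

// When rendering the page: build a link that stays valid for one hour
$file    = 'videos/clip1.mp4';
$expires = time() + 3600;
$sig     = hash_hmac('sha256', $file . '|' . $expires, $secret);
$url     = '/serve.php?file=' . urlencode($file) . '&expires=' . $expires . '&sig=' . $sig;

// In serve.php: recompute the signature and refuse expired or tampered links
$ok = isset($_GET['file'], $_GET['expires'], $_GET['sig'])
    && $_GET['expires'] > time()
    && hash_hmac('sha256', $_GET['file'] . '|' . $_GET['expires'], $secret) === $_GET['sig'];
if (!$ok) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
// ...otherwise stream the video as usual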

How does Apache match authentication/authorization information with subsequent HTTP requests from the same user?

When you protect an area of your document root using either the server configuration or .htaccess, the server prompts for a username and password when someone requests those files from a browser. If the password matches the one from the authentication provider for that user, the documentation at http://httpd.apache.org/docs/2.2/howto/auth.html says that Apache will set environment variables for that user. In my case I'm building a PHP app, and using phpinfo() I gather that the environment variables set are REDIRECT_AUTHENTICATE_SAMACCOUNTNAME, AUTHENTICATE_SAMACCOUNTNAME (using Active Directory as the authentication provider), and REMOTE_USER. I believe this is what prevents the user from being prompted again and again on each subsequent request.
What I don't understand is how Apache matches requests from a user with the environment variables set for that user, and also when and how it knows to clear those variables. It doesn't appear to use cookies, because I cleared all the cookies for the domain in question, and still it doesn't ask me to reauthenticate unless I actually close the browser.
Ultimately I'm going to be working with PHP to get the user ID and to maintain state, but since PHP is getting this information from Apache, I'd like to understand that context, and I don't seem to be able to find these details. Thanks in advance.
Look at the HTTP headers your browser is sending. After you have supplied a username and password, your browser will continue sending those credentials to that site until your browser session ends, or longer if you tell your browser to remember the credentials.
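For HTTP Basic authentication there is no server-side session to match at all: the browser caches the credentials and repeats them in an Authorization header on every request to that protection space, and Apache re-validates the header and re-populates REMOTE_USER and the other variables on each request. A small illustration with made-up credentials:

<?php
// Every request to the protected area arrives with a header like
//   Authorization: Basic am9objpzM2NyZXQ=   (base64 of "john:s3cret")
// Apache checks it before PHP runs, so this value is recomputed per request:
echo $_SERVER['REMOTE_USER'];   // e.g. "john"

So nothing is remembered between requests on the server side; the variables disappear as soon as the browser stops sending the header, which is typically when it is closed.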

Changing request and response with an Apache Proxy Server

I want to use an Apache proxy server (mod_proxy) to intercept all requests and responses to a web server. However I want to change requests and responses before redirecting them. Simply rewriting URLs is easy and documented, but the changes I want to make are more sophisticated, namely they need to inspect the request for user credentials as well as conditionally make redirects.
Is this possible in Apache's mod_rewrite, possibly in combination with other modules?
While the main goal is to implement this in Apache, I would also be happy with an alternative solution which doesn't necessarily use Apache.
Here is a more precise explanation of what I want to achieve, to give a little more context:
Check each incoming request for user credentials. If credentials are present, they are replaced by user information which the web server can use to identify the user (ideally in the Authorization header).
For example, let's assume a request contains a cookie which authenticates the request as being sent from the user "John"; this cookie is removed, and the Authorization header is changed to Authorization: Authenticated_by_proxy {"id":12345,"name":"John"}
Check each response to see whether it is a 403 error. If this is the case and the user is not logged in, redirect the user to a login page instead of forwarding the error.
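In Apache terms I imagine the static pieces would look roughly like the sketch below (mod_proxy plus mod_headers; the backend URL and the header value are placeholders) - the per-user, conditional parts are exactly what I don't know how to express:

# Reverse-proxy everything to the backend (URL is a placeholder)
ProxyPass        / http://backend.internal/
ProxyPassReverse / http://backend.internal/

# Static header manipulation is straightforward with mod_headers...
RequestHeader unset Cookie
RequestHeader set Authorization "Authenticated_by_proxy {\"id\":12345,\"name\":\"John\"}"

# ...and backend 403 pages can be replaced with a local page
ProxyErrorOverride On
ErrorDocument 403 /login

# But looking up the user behind the cookie, and redirecting only when the
# user is not logged in, seems to need something programmable (e.g. mod_lua
# or an external RewriteMap program) rather than plain directives.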

Xenu Link checker

I want to use an application that checks for broken links, and I have learned that Xenu is one such tool. I do not have access to the internal aspx/http files on a drive. The problem I am facing is that the website requires the user to be authenticated; after logging in, I need to crawl the site to determine which links are broken.
As an example, I'll kick off with mail.google.com. We end up typing the username and password, after which we are served different URLs. If I give Xenu (or a similar program) a link such as mail.google.com, it will not be able to fetch URLs inside mail.google.com, which are of the form /mail/u/0/?shva=1#inbox/ etc. There lies the problem.
With minimal scripting, how can I give Xenu (or another similar app) the ability to log in, given only the external URL (mail.google.com in this example), so that it can then do whatever Xenu has to do?
Thanks
Balaji S
Xenu can be used with an authenticated user as long as the cookies are persistent. You will need to enable cookies in Xenu and log in once yourself using IE.
From their FAQ:
By default, cookies are disabled, and Xenu rejects all cookies. If you need cookies because
- you have used Internet Explorer to authenticate yourself before starting a run
- you want to prevent the server from delivering URLs with a session ID
then you can enable the cookies in the advanced options dialog. (This has been available since Version 1.2g)
Warning: You should not use this option if you have links that delete data, e.g. a database or a shop - you are risking data loss!!!
You can enable cookies in the Options menu. Click Preferences and switch to the Advanced tab.
For single-page applications (like Gmail) you will also need to configure Xenu to parse JavaScript.
This is done by modifying the ini file (traditionally at C:\Program Files (x86)\Xenu135\Xenu.ini) and adding a line under [Options]:
Javascript=[Jj]ava[Ss]cript: *[_a-zA-Z0-9]+ *\( *['"]((/|ftp://|https?://)[^'"]+)['"]
There are several variations provided in their FAQ, but I didn't get them to work perfectly.