Protect folder on server, only allow requests from allowed sites - Apache

Is there a way to block access to the contents of a folder unless the visitor was referred by a certain site? So if someone tries to load music.mp3 directly, can I redirect them, but if example.com referred them, allow them to see it? Would this be done through .htaccess?
Thanks!

Something like this should work:
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^http://(www\.)?example\.com [NC]
RewriteRule \.mp3$ - [F]
If the referer is not example.com (with or without www.), this returns a 403 status code.
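If you would rather send stray visitors somewhere useful than show a bare 403, here is a minimal sketch of that variant (the folder placement, file extension, and redirect target are assumptions; anyone whose browser sends no Referer header at all is also caught):
RewriteEngine on
# requests not referred by example.com are bounced to the home page
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.mp3$ http://www.example.com/ [NC,R=302,L]
Bear in mind the Referer header is optional and trivial to spoof, so treat this as a convenience, not real access control.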

Related

Forbidden access to files in a specific folder

I have a directory, files, that contains files I don't want to be accessible directly.
If you go to www.mywebiste.com/files/myfile.pdf, you should be redirected.
However, I want the files to be accessible from the rest of the site.
i.e. I may have a page www.mywebiste.com/dave/page/ that needs to be able to display files from within the files dir.
I have tried the following .htaccess in the files dir, but it's not working:
RewriteEngine ON
RewriteCond %{HTTP_REFERER} ^http://www.mywebiste.com/files/ [NC]
RewriteRule ^.*$ - [R=404,L]
If I could get this working, I assume that this would also prevent the files from being indexed by robots, as they too would be redirected?
Options +FollowSymLinks -MultiViews
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_REFERER} ^$
RewriteRule ^files/.*$ - [F]
You may also use:
RewriteCond %{HTTP_REFERER} !^http://(www\.)?mywebiste\.com/.*$ [NC]
RewriteRule ^files/.*$ - [F]
Since the referer comes from the page the file was requested on, and that page could be anything within your domain, I simply use ^http://(www\.)?mywebiste\.com/.*$; you may as well use ^http://(www\.)?mywebiste\.com if you feel more comfortable with that.
The first rule says: if the referer is empty and the folder is files, deny access to it.
PS: http://(www\.)?mywebiste\.com matches the site address with or without the www. prefix.
As for robots, you could check the user agent to allow access even when the referer is empty, as in the sketch below.
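A rough sketch of that idea, placed in the site root alongside the rules above; the crawler names are just examples, and user-agent strings can be spoofed, so don't treat this as a security boundary:
RewriteEngine On
# let the (example) crawlers through regardless of referer
RewriteCond %{HTTP_USER_AGENT} !(Googlebot|bingbot) [NC]
# everyone else must be referred from the site itself
RewriteCond %{HTTP_REFERER} !^http://(www\.)?mywebiste\.com/ [NC]
RewriteRule ^files/.*$ - [F]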
robots.txt example:
User-agent: *
Disallow: /files/
For more information about robots.txt, see here.

Sending an access denied error using .htaccess on Apache for all files but a certain one

I have a folder containing various .php files, and I want to prevent direct access to all of them except index.php.
This is what I have so far, and it appears to be working:
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond $1 !^(index\.php)
RewriteRule ^(.*)$ /403.php/$1 [R=403]
</IfModule>
Is this the correct way to do it? Also note that 403.php doesn't actually exist among the files I have in the folder.
EDIT: to better clarify what I'm trying to do -- I have a folder (we can assume named "includes") containing an index.php file, and various other files which are included by index.php.
I don't want users / malicious bots / whoever to be able to directly access anything in "includes" other than index.php.
In case they reach anything else (regardless whether the file exists or not), I want to send to the browser a 403 - Access Denied HTTP response code.
The correct way is to use the F flag, which simply returns a 403 Forbidden. You can use - as the target, which just means "do nothing and let the URI pass through unchanged":
RewriteEngine on
RewriteCond $1 !^(index\.php)
RewriteRule ^(.*)$ - [L,F]
Or you can try combining the condition with the rule:
RewriteEngine on
RewriteRule !index\.php$ - [L,F]
Alternatively, you can create an error page. Some control panels have an application that lets the user create an SSI-enabled 403 (Forbidden) page with an .shtml file extension. In cPanel that app is called "Error Pages" and is found in the "Advanced" section; the 403 page is saved with the basename 403.shtml. If you can't find such an app, you can create an SSI-enabled HTML file manually, provided your server is configured to allow it. If that isn't possible, you can still use another extension.
So, the more correct way is to remap the existing error page, like this:
RewriteCond %{REQUEST_URI} !^/index\.php
RewriteRule ^(.*) /403.shtml
But anyway, what are you trying to do?
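If you want both a real 403 status and a friendly page, one sketch (assuming a 403.shtml error page exists in the document root and that ErrorDocument is allowed in .htaccess on your server) is to keep the F flag and let ErrorDocument supply the body:
ErrorDocument 403 /403.shtml
RewriteEngine on
# everything in this folder except index.php gets a 403 response,
# and Apache serves /403.shtml as the body of that response
RewriteRule !^index\.php$ - [F]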

Limiting access to one directory when on subdomain

Howdy, I need to do the following:
allow access to sub.domain.com/directory from www.domain.com (linking to an asset)
prevent access to sub.domain.com/* (return a 404 page when a user hits the subdomain directly)
Is this possible using htaccess and, if so, any pointers on how to accomplish it?
Thanks
Yes, this can be done with .htaccess; you might want to look up .htaccess and regex tutorials, as they are a great help.
Stopping sites from hotlinking is what I assume you are referring to, so you need to enable mod_rewrite and add this to the .htaccess file in the domain's root directory:
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^http://(.+\.)?domain\.com/ [NC]
RewriteCond %{HTTP_REFERER} !^$
# skip the placeholder image itself, otherwise the redirect would loop
RewriteCond %{REQUEST_URI} !noHotlink\.gif$ [NC]
RewriteRule .*\.(jpe?g|gif|bmp|png)$ http://www.domain.com/noHotlink.gif [NC,R,L]
This should point you in the right direction towards how to do it.
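That covers the hotlinking side; for the second requirement (a 404 when the subdomain is hit directly), here is a sketch for an .htaccess in the subdomain's document root, assuming Apache 2.4 or later so that R= accepts non-redirect codes like 404:
RewriteEngine On
# any request that was not referred by domain.com is answered with a 404
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?domain\.com/ [NC]
RewriteRule ^ - [R=404,L]
As above, the Referer header is optional and easy to fake, so this only deters casual direct access.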

Using .htaccess to reroute all requests through index.php EXCEPT a certain set of requests

So I just inherited a site. The first thing I want to do is build a nice little standard, easy-peasy CMS that allows for creating a page with any URL (for example: whatever.html).
Therefore, if a user hits example.com/whatever.html, it should get any db info for whatever.html and display it. This is run-of-the-mill stuff.
My problem is that there are quite a few pages on the site (all listed in the .htaccess) that need to continue to be accessible. For instance, /Promotions is linked to promotions.php via .htaccess, and I need it to stay that way.
Anyone know how I can construct the .htaccess file to allow specific rewrites to still work but to reroute all other requests through index.php?
Currently, I just have .htaccess show a custom 404 page which in turn checks the db for the url and displays it if it exists. This is an easy solution, but I know that some people have weird browser toolbars (dumb or not, they exist :) ) that autoredirect 404s, and I'd hate to annoy my users with these toolbars by not allowing access to certain pages.
Thanks so much for your help!
The RewriteRule for promotions should still work as it's not 404ing.
If the 404 handler is showing the page because it exists in the database then it should really be returning a 200 OK status (overriding the 404 one), so you should not get any issues with browser toolbars.
As for doing the rerouting, you can do something like this:
RewriteEngine On
RewriteCond %{REQUEST_URI} !^.*/(promotions|anotherone|somethingelse)($|/.*$) [NC]
RewriteRule ^(.*)$ /index.php?p=$1
Here is another variant:
RewriteEngine on
# in a per-directory .htaccess the leading slash is stripped before matching
RewriteRule ^i/(.*)$ - [L]
RewriteRule ^css/(.*)$ - [L]
RewriteRule ^index\.php$ - [L]
RewriteRule ^(.*)$ index.php?p=$1 [L,QSA]
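A further variant, sketched on the assumption that the special mappings (e.g. /Promotions to promotions.php) keep their own rules above this block and end with [L], is to hand a request to index.php only when it doesn't correspond to a real file or directory:
RewriteEngine On
# existing special-case rules go here first, each ending in [L]
# anything that is not an existing file or directory falls through to index.php
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php?p=$1 [L,QSA]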

.htaccess redirect doesn't hide url

My .htaccess in the root folder includes the following lines:
Options +FollowSymlinks
RewriteEngine on
RewriteRule ^(.*)\.htm$ http://example.com/?city=$1 [NC]
When I open the address http://example.com/bla.htm, my browser doesn't hide the GET values specified in the .htaccess; it redirects me to ?city=bla, even though I'm not using the [R] switch. This has always worked for me before (as far as I remember; I haven't dealt with .htaccess in a while). What's wrong here?
When you rewrite to a full URL, mod_rewrite treats it as a redirect rather than an internal rewrite (you can't exactly rewrite URLs on someone else's website).
Assuming both URLs are on the same server, you need to do something like
RewriteRule ^(.*)\.htm$ index.php?city=$1 [NC]
Also, I'd recommend getting into the habit of using the [L] switch whenever you can - it helps avoid bugs when you have a lot of URLs to rewrite.
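Putting both suggestions together, a sketch of the corrected rule (assuming index.php on the same site handles the city parameter) might look like this:
Options +FollowSymlinks
RewriteEngine on
# internal rewrite: the address bar keeps showing /bla.htm
RewriteRule ^(.*)\.htm$ index.php?city=$1 [NC,L]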