IP restriction with htaccess - apache

I want to restrict an entire site so that only two IPs can access it. I have the following in my .htaccess (at the root of the site):
ErrorDocument 403 http://www.example.com/views/error403.html
Order Deny,Allow
Deny from all
Allow from 311.311.311 322.322.322.322
ErrorDocument 404 /views/error404.html
ErrorDocument 500 views/error500.html
(Obviously, these are fake IPs; in my actual .htaccess they are the right ones.)
As you can see, I allow just 322.322.322.322 and all IPs from 311.311.311.0/24, and deny everyone else. What I want is that when anybody enters the site from another IP, they see the error403.html page.
The filter is working fine, but the redirection is not. When I try to enter the site from a denied IP, I see an Apache message:
Found
The document has moved here
Where "here" is a link to error403.html.
I think I'm restricting even the error403.html page.
How can I keep this restriction but still allow the error page to be viewed? Should I move the error403.html page to another directory (e.g., /views/error/) and put another .htaccess in it, allowing all IPs in that file?
Thank you in advance!

Yes, you have answered your own question. :) Move the error pages into their own directory with its own .htaccess containing the proper Allow. One more detail: because you gave ErrorDocument 403 a full URL (http://...), Apache answers with an external 302 redirect, which is exactly the "Found / The document has moved here" message you are seeing. Use a local path instead and Apache will serve the page internally while keeping the 403 status.
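A minimal sketch of the two files, reusing the paths from the question (adjust to your own layout):
# .htaccess in the document root
ErrorDocument 403 /views/error/error403.html
ErrorDocument 404 /views/error/error404.html
ErrorDocument 500 /views/error/error500.html
Order Deny,Allow
Deny from all
Allow from 311.311.311 322.322.322.322
# .htaccess in /views/error/ so denied visitors can still fetch the error pages
Order Allow,Deny
Allow from all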

Related

How can I restrict access to a folder or a file to other users in htaccess?

I want to restrict access to some folders and files, but only when a user tries to access them directly through the URL, not when the website itself accesses these files. For example, restrict the folders images, javascript,...
I tried different ways, but I always got a 500 error.
Basically, I don't want external users to be able to list my website's directories and open the files in them, if that is possible to accomplish.
Thanks in advance
This is a pure mod_rewrite-based solution:
RewriteEngine On
RewriteRule ^(includes/|submit\.php) - [F,L,NC]
This will show a Forbidden error to the user if the request URI starts with one of the listed paths.
You are getting a 500 error because the <Directory> container cannot be used in an .htaccess file (an .htaccess file is essentially already inside a directory container for the directory it's in). What you need to do is remove the <Directory> container from your .htaccess file and leave the Deny from all bit:
htaccess file in your document root:
# Refuse direct access to all files
Order deny,allow
Deny from all
Allow from 127.0.0.1
Then create an .htaccess file in uploads/files/, uploads/images/pages/ and uploads/images/store/ (and whatever other directories you want to allow access to):
Allow from all
Put a .htaccess file in the directory you want to protect and add Deny from all to it. That's it.
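Note that Order, Deny and Allow come from mod_access_compat and are deprecated on Apache 2.4+. If those directives are rejected on your server, a 2.4-style equivalent of the setup above would look like this:
# .htaccess in the document root: refuse everyone except localhost
Require ip 127.0.0.1
# .htaccess in each directory that should stay publicly reachable
Require all granted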

403 forbidden response still sends body

I set up my .htaccess file so that only certain IP ranges can access the /admin portion of my site, as asked in this question: Deny access to URI
That works... in testing. When I tried this on my live, HTTPS-enabled site, something strange happened:
When I GET the /admin page, I receive a 403 Forbidden status code but I also get the body as if nothing happened.
How is that possible, and how do I fix it?
Here's the eventual .htaccess:
SetEnvIf Request_URI ^(?!/admin) not_admin_uri
Order deny,allow
Deny from all
Allow from 127.0.0.1
allow from 366.241.93.
allow from env=not_admin_uri
Also: if I remove the last allow rule, it actually does block the request (though then, of course, it blocks all requests).
It turned out that the document configured for the 403 status code (which was 403.shtml) did not exist, in which case Apache apparently just serves the requested resource anyway.
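So the fix is simply to make sure the configured error document actually exists. A minimal sketch, assuming a static page at the document root (the not_admin_uri trick above already lets denied clients fetch anything outside /admin, including the error page):
ErrorDocument 403 /403.html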

Block downloading of files, but show on my own site pages

I want to block downloading of images from a directory, but allow them to be displayed on my own blog's pages (on the same domain).
I created the following .htaccess file:
Order deny,allow
deny from all
allow from mydomain.ru
It blocks downloading AND blocks showing images on my blog's pages.
What am I missing?
Shared hosting, Ubuntu Linux, Apache. I don't have access to httpd.conf.
Allow from mydomain.ru blocks all requests that do not come from an address associated with mydomain.ru (Apache checks the client's IP with a reverse-DNS lookup; it doesn't care which page the client is viewing). Since your blog's visitors are not coming from that address, the images are blocked for everyone.
I don't know how your images are being served, but you may be able to block requests whose Referer does not match your domain name. The Referer header can easily be forged, so it's by no means foolproof.
If your HTML links to the images, something like the following would work:
RewriteEngine On
# let empty referers through (direct visits; some proxies strip the header)
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(.*\.)?mydomain\.ru [NC]
RewriteRule ^path/to/directory - [F]

Apache cross domain 404

I want to make Apache always serve a single page for 404 errors from all subdomains.
The problem is that my subdomains are located in subfolders of public_html, and thus have different document roots.
For example, for the main domain this works quite well:
ErrorDocument 404 /Error/404.html
The Error folder and the main domain's files are both located in public_html.
However, for the forum subdomain, located in public_html/forum/, the above root-relative path does not work; Apache actually looks for public_html/forum/Error/404.html, which doesn't exist.
I tried to rewrite the rule for the forum folder, but it didn't work out either:
ErrorDocument 404 /../Error/404.html
It seems it cannot go above the document root for some reason.
Any ideas how I can refer to the same page from the main domain and the subdomain alike, without triggering redirects? (E.g., http://mysite/Error/404.html would accomplish this, but it would also change the URL of the page, which I don't want.)
It seems it cannot go above the document root for some reason.
Because being able to traverse above the document root is a very, very serious security risk. If your webserver gets compromised, people would be able to serve all kinds of files anywhere on your entire server.
If you have access to the server config, you can set up an alias for the /Error folder. For example, in your forum subdomain's vhost config, you can add:
Alias /Error /path/to/public_html/Error/
This way, when you go to http://forum.yourdomain.com/Error/404.html you'd actually be looking at the same file as http://main.yourdomain.com/Error/404.html. Then you can just use:
ErrorDocument 404 /Error/404.html
like normal in your forum subdomain.
But if you don't have access to your server/vhost config, you'll need to use mod_proxy and mod_rewrite instead. In the .htaccess file in public_html/forum/, add these to the top:
RewriteEngine On
RewriteRule ^Error/(.*)$ http://main.yourdomain.com/Error/$1 [L,P]

How can I block mp3 crawlers from my website under Apache?

Is there some way to block access from a referrer using a .htaccess file or similar? My bandwidth is being eaten up by people referred from http://www.dizzler.com, which is a Flash-based site that lets you browse a library of crawled, publicly available mp3s.
Edit: Dizzler was still getting in (it probably wasn't sending a referrer in all cases), so instead I moved all my mp3s to a new folder, disabled directory browsing, and created a robots.txt file to (hopefully) keep it from being indexed again. Accepted answer changed to reflect the futility of my previous attempt :P
That's like saying you want to stop spam-bots from harvesting emails on your publicly visible page - it's very tough to tell the difference between users and bots without forcing your viewers to log in to confirm their identity.
You could use robots.txt to disallow the spiders that actually follow those rules, but that's on their side, not your server's. There's a page that explains how to catch the ones that break the rules and explicitly ban them: Using Apache to stop bad robots [evolt.org]
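For the robots.txt route, a minimal sketch for well-behaved crawlers (the /mp3/ folder name is just a placeholder for wherever you moved the files):
User-agent: *
Disallow: /mp3/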
If you want an easy way to stop Dizzler in particular, you should be able to pop open the .htaccess in the mp3 directory and add the following (no <Directory> container; that isn't allowed in .htaccess files and would trigger a 500 error):
Order Allow,Deny
Allow from all
Deny from 66.232.150.219
From this site: (put this in your .htaccess file)
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://(www\.)?dizzler\.com [NC]
RewriteRule .* - [F]
You could use something like
SetEnvIfNoCase Referer dizzler.com spammer=yes
Order allow,deny
allow from all
deny from env=spammer
Source: http://codex.wordpress.org/Combating_Comment_Spam/Denying_Access
It's not a very elegant solution, but you could block the site's crawler bot, then rename your mp3 files to break the links already on the site.