I have a server running Joomla on an AWS EC2 instance (64-bit Amazon Linux v2.1.3, PHP 5.6), and we would like to prevent access to the PHP files in the /administrator folder, with the exception of our office IP and the IP of the server itself, since the folder contains libraries used by scripts located outside that folder.
I put together the following using LocationMatch, but it is not working: access to the server is not restricted.
I am not very familiar with Apache, and especially not with SetEnvIf. Is the configuration below setting env=allow no matter what? Is there a way to test that? Is anything else wrong?
<LocationMatch "/(administrator|tmpl)">
SetEnvIf Request_URI "\.(css|js|html|htm|gif|jpg|png|jpeg)$" allow
Deny from all
##except if either of these are satisfied
Satisfy any
##1. a valid authenticated user
Allow from ip1 ip2
## or 2. allow is set
Allow from env=allow
</LocationMatch>
The Satisfy directive is only useful if access to a particular area is being restricted by both username/password and client host address: with the Any option, the client is granted access if they either pass the host restriction or enter a valid username and password. Since you don't have any user restrictions, Satisfy any effectively grants access to everyone.
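For contrast, here is a minimal 2.2-style sketch in which Satisfy Any actually does something useful, because both a user restriction and a host restriction are present (the auth file path and the IP are placeholders, not taken from your setup):
AuthType Basic
AuthName "Restricted area"
AuthUserFile /path/to/.htpasswd
Require valid-user
Order deny,allow
Deny from all
Allow from 203.0.113.10
# With both checks in place, Satisfy Any grants access if EITHER the
# client IP matches or the user authenticates successfully.
Satisfy Any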
Since you are running 2.4, this should work:
<LocationMatch "/(administrator|tmpl)">
SetEnvIf Request_URI "\.(css|js|html|htm|gif|jpg|png|jpeg)$" allow
<RequireAny>
Require env allow
Require ip 10.0.2.2 10.0.2.3
</RequireAny>
</LocationMatch>
If your server is behind an ELB, the connection to Apache comes from the load balancer and not directly from the client, so the client's IP address cannot be used in Require ip. But the ELB adds several request headers in order to pass this information on to the origin server, one of them being X-Forwarded-For, which contains the IP address of the client. If the original request already contained this header (which is not unusual at all), the ELB appends the client IP address to the existing value(s), so you get a comma-plus-space separated list of IP addresses. The last (rightmost) IP address is always the address that connected to the last proxy (your ELB), which means it is the one you want to test against, so try:
<LocationMatch "/(administrator|tmpl)">
SetEnvIf Request_URI "\.(css|js|html|htm|gif|jpg|png|jpeg)$" allow
SetEnvIf X-Forwarded-For x.x.x.x$ office
SetEnvIf X-Forwarded-For y.y.y.y$ bar
<RequireAny>
Require env allow
Require env office
Require env bar
</RequireAny>
</LocationMatch>
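As for your question about how to test whether SetEnvIf is setting the variable: a minimal sketch, assuming mod_headers is enabled, is to emit a marker response header only when the variable is set, and then check for it with curl -I or the browser's network tab (the header name X-Debug-Allow is just an arbitrary choice for this example):
<LocationMatch "/(administrator|tmpl)">
    # This header is added to the response only when SetEnvIf has set "allow"
    Header set X-Debug-Allow "matched" env=allow
</LocationMatch>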
Related
OK - so I have a developer who does not want our REST endpoints to be accessible externally; the only access allowed should be localhost and the internal network. Our internal network scheme is 10.10.x.x.
The way we did this is with the <LocationMatch> directive in the .conf file, as follows:
<LocationMatch "/foo/bar/*">
Order deny,allow
Deny from all
Allow from 10.10
Allow from 127
</LocationMatch>
Now, the challenge we are having is that the AWS Load Balancer forwards the original source IPs in an X-Forwarded-For header, while the connection Apache actually sees comes from the load balancer's own 10.x address; so if I do Allow from 10, that will obviously allow access to all endpoints externally.
As stated before, our internal range is 10.10.x.x, so I can do Allow from 10.10 and that would resolve it. But if I add more regions, the network scheme could become 10.20.x.x, 10.30.x.x, 10.40.x.x, and then it becomes a bit of an administrative nightmare.
So, someone mentioned that what makes sense is to do something at the httpd.conf level:
<Directory />
#Example..
SetEnvIF X-Forwarded-For "(,| |^)192\.168\.1\.1(,| |$)" DenyIP
SetEnvIF X-Forwarded-For "(,| |^)10\.1\.1\.1(,| |$)" DenyIP
Order allow,deny
Deny from env=DenyIP
Allow from all
</Directory>
(I found this example in a blog post.)
So, I am unsure how to adapt this format to ensure that it denies all external IPs access to these directories.
Would the httpd.conf file have something like:
<VirtualHost>
#Example..
SetEnvIF X-Forwarded-For "(,| |^)*\.*\.*\.*(,| |$)" DenyIP
</VirtualHost>
and my other .conf file with the <LocationMatch> rules would have:
<LocationMatch "/foo/bar/*">
Order deny,allow
Deny from env=DenyIP
Allow from 10.
Allow from 127
</LocationMatch>
Thanks for your help!
Rather than modifying Apache, use Security Groups!
Create a security group for your Elastic Load Balancer. Allow in-bound access from 0.0.0.0/0 for ports 80 & 443.
Create a security group for your Apache server(s). Allow in-bound access from the ELB Security Group (a security group can reference another security group). Also add access so you can SSH into the server(s).
That's it! The security groups will block traffic that attempts to access your apache server(s) without passing through the Load Balancer.
See:
Amazon EC2 Security Groups for Linux Instances
Configure Security Groups for Your Load Balancer
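If you still want a second layer of checks inside Apache itself, here is a hedged sketch of the SetEnvIf idea from the question. The 10. and 127. prefixes are taken from the question; everything else is an assumption. Note that clients can spoof X-Forwarded-For, but the load balancer appends the real client IP as the last entry, which is what the pattern below anchors on:
<LocationMatch "/foo/bar/">
    # Flag requests whose LAST X-Forwarded-For entry is a 10.x.x.x or 127.x.x.x address
    SetEnvIf X-Forwarded-For "(,| |^)(10|127)\.[0-9.]+$" internal
    Order deny,allow
    Deny from all
    # localhost talking to Apache directly, without the load balancer
    Allow from 127
    # anything arriving via the load balancer with an internal client IP
    Allow from env=internal
</LocationMatch>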
I have a very simple .htaccess file:
<RequireAll>
Require all granted
# require localhost
Require ip 127.0.0.1
</RequireAll>
and it works... sometimes!
Sometimes, it will throw me a 403, and the error.log explains:
[client ::1:65443] AH01630: client denied by server configuration
Why won't it match that local client to the Require ip 127.0.0.1 rule?
As it turns out, Apache 2.4's Require matches the IP exactly. If you have multiple IP addresses aliasing localhost, you need to list all of them (or use a special alias, if one exists, as explained below).
In this particular case, the error.log entry reveals it all: The client connected through the IPv6 interface (ip == ::1). That needs to be white-listed as well:
<RequireAll>
Require all granted
# require localhost
<RequireAny>
Require ip 127.0.0.1
Require ip ::1
</RequireAny>
</RequireAll>
Any suggestions as to whether there is a simpler/safer method to get this done are very welcome!
Update
As Helge Klein suggests, Require local is a more concise alternative:
<RequireAll>
Require all granted
# require localhost
Require local
</RequireAll>
All you really need here is:
Require ip 127.0.0.1
Require ip ::1
Require all granted is equivalent to:
Order allow,deny
Allow from all
from earlier Apache versions, which opens the site to everyone. If your intention is to block the site for everyone except certain IPs, you should start with:
Require all denied
You can find more info here: Upgrading to 2.4 from 2.2
I don't use .htaccess since I have Apache installed on my workstation and have full access to the httpd.conf file. But for a site like phpMyAdmin, where I want to limit where people can log in from, I have this:
Require all denied
Require ip 127.0.0.1
The first line denies access to everyone, including my own workstation.
The second line adds my workstation's localhost IP as the only allowed connection.
No RequireAll or RequireAny tags are needed here; in .htaccess those tags may be needed.
I'm testing out Amazon CloudFront in our dev environment, which is protected by .htaccess/.htpasswd. The password protection on the dev server is causing all of the cloudfront.net assets to be password protected as well, and no username/password combination works.
What I need to do is allow cloudfront to access the dev server by poking some holes in the .htaccess protection.
I can get a list of IP addresses here but since they are subject to change, I was wondering if anyone knew of a better way.
I cannot remove the password protection unfortunately.
Here's what I've been using:
SetEnvIf User-Agent "^Amazon CloudFront$" cdn
# The Host below could also be a cloudfront.net host
SetEnvIf Host "^static\.dev\.website\.com$" cdn
AuthType Basic
AuthName "Dev Site"
AuthUserFile /directory/location/.htpasswd
Require valid-user
Order Deny,Allow
Deny from all
Allow from env=cdn
Satisfy Any
Essentially, you need to exclude CloudFront-related requests, which the first two lines handle, in conjunction with the Allow from env=cdn line.
Via Google Analytics I noticed that there is a website which is scraping my content automatically; its content matches mine 100%. Is there any way I could block that website's host from accessing my server at all? Any solutions for what I could do about this?
I'm running a LAMP web host on CentOS.
If the IP address of the scraping host is static, you can use .htaccess to block this IP, like:
order allow,deny
deny from 111.111.111.111
allow from all
If the IP address is variable, but the user agent is constant, you can use agent blocking:
BrowserMatchNoCase SpammerRobot bad_bot
BrowserMatchNoCase SecurityHoleRobot bad_bot
Order Deny,Allow
Deny from env=bad_bot
I am trying to open my host address using my host name, but I am getting the following error:
You can see the host URL above.
Can someone help me to resolve this issue?
As the error states, you cannot access it from outside your local network. In other words, your Apache in XAMPP is configured to accept calls only from 127.0.0.1 or localhost. For XAMPP this is defined in a LocationMatch directive; have a look at the following topic, it covers the most common cases. Note, however, that opening this up might be a security issue.
http://www.apachefriends.org/f/viewtopic.php?p=185823
This will provide you with more info on how Allow and Deny can be configured:
http://httpd.apache.org/docs/2.2/mod/mod_authz_host.html#allow
Something like this should be in your httpd-xampp.conf
<LocationMatch "^/(?i:(?:xampp|licenses|phpmyadmin|webalizer|server-status|server-info))">
Order deny,allow
Deny from all
Allow from ::1 127.0.0.0/8 your.local.ip.address
ErrorDocument 403 /error/HTTP_XAMPP_FORBIDDEN.html.var
</LocationMatch>
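To actually allow access from other machines, the idea is to add your own address or subnet to that Allow line; a minimal sketch, assuming a hypothetical LAN range of 192.168.1.0/24:
# 192.168.1.0/24 is a made-up example range; replace it with your own network
Allow from ::1 127.0.0.0/8 192.168.1.0/24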