.htaccess rule FilesMatch to all subfolders (subfolders were randomly created) - apache

I tried to find a solution here, but couldn't. I work with .htaccess every day, but I have one problem I can't solve.
I have directory structure:
/cmd/user_files/**[random_folder]**/avatars/pic.jpg
In /cmd/user_files/ I have many folders that are randomly created by an app. Under each of those randomly created folders there are files and pictures that I don't want to be publicly accessible, plus one subfolder "avatars" from which I want to serve only .jpg files.
I created .htaccess in /cmd/user_files/ and I added this:
order allow,deny
deny from all
allow from 127.0.0.1
With this I deny access to every file and subfolder under /cmd/user_files/.
Now, what do I need to add to this .htaccess so that I can serve .jpg files from
/cmd/user_files/[random_folder]/avatars/*.jpg

Instead of using mod_auth..., it may be easier to use mod_rewrite. For example, in the /cmd/user_files/.htaccess file:
RewriteEngine On
RewriteRule !^[^/]+/avatars/.*\.jpg$ - [F]
This blocks (403 Forbidden) access to all files/folders except the URL-path that matches the regex ^[^/]+/avatars/.*\.jpg$ (relative to the /cmd/user_files/ subdirectory).
UPDATE: Note the ! prefix on the RewriteRule pattern - this negates the regex. This is an Apache operator, it's not part of the regex itself.
Side note... Order, Allow and Deny are Apache 2.2 directives. If you are on Apache 2.4+ then there is a different syntax.
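For reference, a rough Apache 2.4 equivalent of the snippet in the question (assuming the intent is to allow access only from 127.0.0.1) would be:
Require ip 127.0.0.1
(or Require local, which also covers ::1 and same-host requests).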

Related

Apache .htaccess <FilesMatch> // setting subfolder files as forbidden

I'm going mad over Apache .htaccess.
I'm trying to protect my subfolders using a relative path, but it seems impossible.
The Apache directory is structured like this:
/var/www/apachedir
now I want to protect
/var/www/apachedir/subfolder/*
What I tried is putting an .htaccess file like this in /var/www/apachedir/:
<FilesMatch "subfolder\/.*">
Order Allow,Deny
Deny from all
</FilesMatch>
but it doesn't seem to work.
I don't want to use ModRewrite, and I want to make this .htaccess reusable.
So, if I put the site on another server with a directory structure like /var/www/zzz, it has to protect the files in /var/www/zzz/subfolder/*.
Also, the .htaccess file has to stay in the root folder /var/www/apachedir.
Is there a way to do it?
Edit:
I don't want to use ModRewrite, but I also don't want to use RedirectMatch.
I want to know if there's a way to set it up with FilesMatch, without ModRewrite or RedirectMatch.
I don't want to use ModRewrite.
You can use RedirectMatch to block access to a known path:
RedirectMatch 403 ^/subfolder/
I want to know if there's a way to set it up with FilesMatch
No, because the FilesMatch (and the non-regex Files) directive(s) literally match against files only, not directories, e.g. <Files "*.jpg"> matches all .jpg files in any subdirectory.
There are various methods to block access to that subdirectory...
Use a <Directory> section in the server config
If you have access to the server (virtual host) config then you can use the <Directory> (and <DirectoryMatch>) directive(s) to target specific directories. But this is not permitted in .htaccess. For example:
<Directory "/var/www/apachedir/subfolder">
Require all denied
</Directory>
Create an additional .htaccess file in that subdirectory
The equivalent userland .htaccess way of doing this is to create an additional .htaccess file in that subdirectory (ie. at /subfolder/.htaccess) with a single Require all denied directive. The .htaccess file itself is equivalent to the <Directory> directive in the server config.
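For example, /var/www/apachedir/subfolder/.htaccess would contain nothing more than:
Require all denied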
Aside: Order, Deny and Allow are Apache 2.2 directives and are formally deprecated on Apache 2.4 (which you are far more likely to be using). You should be using the equivalent Require (mod_authz_core) directives instead, as used above.
Use Redirect 403 (mod_alias) - not a "redirect"
I don't want to use ModRewrite, but I also don't want to use RedirectMatch
RedirectMatch (and Redirect) are part of mod_alias - this is a base module and compiled into Apache by default (unlike mod_rewrite), so using the prefix-matching Redirect directive (no need for the regex variant RedirectMatch) is a reasonable solution as #anubhava suggests in his answer, depending on the scenario and existing directives. For example:
Redirect 403 /subfolder/
Despite the use of the Redirect directive, this is not an external (HTTP) redirect. The 403 response is served via an internal subrequest.
Set an environment variable and check with mod_authz_....
Alternatively, you can set an environment variable when the /subfolder is requested (using SetEnvIf) and check for this using the Require directive. This allows you to keep the condition separate from the directives that actually permit access. For example (using Apache 2.4 mod_authz_core):
SetEnvIf Request_URI "^/subfolder/" BLOCK_ACCESS
<RequireAll>
Require all granted
Require not env BLOCK_ACCESS
</RequireAll>
NB: If you are doing any URL-rewriting with mod_rewrite then you might need to check for REDIRECT_BLOCK_ACCESS instead in the above Require directive.
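If that is the case, the Require line can simply list both variable names (access is denied if either one is set):
Require not env BLOCK_ACCESS REDIRECT_BLOCK_ACCESS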
<If> expression (Apache 2.4)
On Apache 2.4 you can also use an <If> expression to target that specific subfolder with a containing mod_authz_core directive. For example:
<If "%{REQUEST_URI} =~ m#^/subfolder/#">
Require all denied
</If>
Although, strictly speaking, these methods target the URL-path, not the file-path.

Generic .htaccess for multiple websites stored in subdirectories

My development environment is set up for using a single host (localhost). I am developing multiple websites on my machine, each stored under its own directory like this:
/var/www/site1
/var/www/site2
...
The document root is set to /var/www on my machine.
I am using URL rewriting for most of these websites, and most of the .htaccess files rewrite a sub-directory path to GET parameters in different ways, like this:
http://localhost/site1/home/red -> http://localhost/site1/index.php?page=home&p1=red
http://localhost/site2/index/param1/param2/param3 -> http://localhost/site2/index.php?page=index&p1=param1&p2=param2&p3=param3
I also tend to copy some of these websites under different directories and, when I do that, I have to make a lot of changes in the .htaccess files for the website that I'm copying.
I would like to know if there is a way to define a constant that contains the website's root directory (not the host's document root) and how can that be used with the rewrite rule so that I would need to change only one line of code (setting this constant to a different value) when copying a website.
Putting this in a different form, is there a way to perform rewrites that relate to a website root instead of a host / %{HTTP_HOST} (i.e. the "host" for the website being localhost/site1 instead of localhost) and how can this be done?
I have tried removing the host from each request at the beginning of the script and prepending it back at the end of the script, but this does not work with rewrite rules that use the [L] option.
Thank you!
Regards,
Lucian
You could make an htaccess file with rules like this:
RewriteEngine On
RewriteBase /site1/
RewriteRule ^([^/]+)/([^/]+)/([^/]+)/([^/]+) index.php?page=$1&p1=$2&p2=$3&p3=$4 [L,QSA]
Put this in the directory /var/www/site1; if you want it to apply to site2, change the RewriteBase and put the rules in /var/www/site2.
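For instance, the copy under /var/www/site2 would only need a different RewriteBase (a sketch, assuming the same URL scheme as site1):
RewriteEngine On
RewriteBase /site2/
RewriteRule ^([^/]+)/([^/]+)/([^/]+)/([^/]+) index.php?page=$1&p1=$2&p2=$3&p3=$4 [L,QSA]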

mod_rewrite rule broken if PHP file is named as the rewrite pattern

I want to redirect all requests like:
/xml/doSomething?arg=value
To:
/xml.php?action=doSomething&arg=value
I tried this simple rule:
RewriteEngine On
RewriteBase /
RewriteRule ^xml/([^/]+)[/]? xml.php?action=$1 [R=302,L,QSA]
But it does not work: xml.php is executed with only the arg=value parameter. The problem is that a file named xml.php exists, matching the first part of my rule. In fact, if I change the rule to:
RewriteRule ^asd/([^/]+)[/]? xml.php?action=$1 [R=302,L,QSA]
And I point to:
/asd/doSomething?arg=value
I'm correctly redirected to:
/xml.php?action=doSomething&arg=value
How can the presence of a PHP file named like the first part of my rewrite pattern break it all?
I solved it. The problem is that I had Options MultiViews in my virtual host configuration. I did not know about this option:
A MultiViews search is enabled by the MultiViews Options. If the server receives a request for /some/dir/foo and /some/dir/foo does not exist, then the server reads the directory looking for all files named foo.*
That simply explains why my rule did not work.
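For anyone who needs to keep both the rule and the xml.php filename, disabling content negotiation for that directory should also work (an untested sketch, assuming Options overrides are permitted in .htaccess):
# Stop MultiViews from mapping /xml/... to xml.php before mod_rewrite runs
Options -MultiViews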

Restrict access to a directory by file type

My Google-fu is failing me on this one...
I'm trying to create an Apache config that will only allow access to image, js, and css files in a specific directory.
For example, the following URL should work:
mysite.com/dir/image.gif
but this should be blocked:
mysite.com/dir/page.php
The part I'm struggling with is getting it working only for /dir/. The rest of the directories outside of /dir/ shouldn't be impacted by this directive.
This is what I have so far, which isn't doing what I need (it seems to apply to all directories).
<FilesMatch "\.(gif|jpe?g|jpg|png|js|css)$">
Order deny,allow
Allow from all
</FilesMatch>
How do I only allow access to certain file types within /dir/ but not affect the rest of my directories?
I recently used this:
Options -ExecCGI -Indexes
<FilesMatch "\.*$">
deny from all
</FilesMatch>
<FilesMatch "\.(png|jpg|gif|css)$">
allow from all
</FilesMatch>
I could not find explicit documentation on this, but for FilesMatch it appears Apache does not short-circuit at the first match; it processes all of the rules in the .htaccess file.
So the first rule blocks access to all file types and the second then allows the selected types.
This probably needs more testing, but I had to give a client something easy to implement to deal with a web exploit that their developers are struggling to fix.
For simplicity, when I do this I usually put all the media files in their own directory. However, if this isn't an option, you might try the FilesMatch directive:
http://httpd.apache.org/docs/2.2/mod/core.html#filesmatch
You can put a FilesMatch inside a Directory.
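A sketch of what that could look like in the server/vhost config (Apache 2.4 syntax; <Directory> is not allowed in .htaccess, and the path is only an example):
<Directory "/var/www/mysite/dir">
# Deny everything in this directory by default...
Require all denied
# ...then re-allow the listed static asset types
<FilesMatch "\.(gif|jpe?g|png|js|css)$">
Require all granted
</FilesMatch>
</Directory>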
I'd generally use mod_rewrite for that
RewriteEngine On
# Match against the URL path (in .htaccess context REQUEST_FILENAME is the mapped filesystem path)
RewriteCond %{REQUEST_URI} ^/my_dir/[^/]+\.php$
RewriteRule .* - [F]
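And if the goal is the whitelist described in the question (only images, JS and CSS under /dir/), the same idea can be inverted; a sketch, using /dir/ as the example path:
RewriteEngine On
# Inside /dir/, forbid anything that is not one of the allowed static file types
RewriteCond %{REQUEST_URI} ^/dir/
RewriteCond %{REQUEST_URI} !\.(gif|jpe?g|png|js|css)$ [NC]
RewriteRule .* - [F]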

Excluding one directory in .htaccess (not just rewrite rules)

Excluding one or more directories from rewrite rules in .htaccess files seems to be a common question. However, my .htaccess does more than just set rewrite rules. I've also made some server changes (we don't have suPHP on this server) as well as set up prepending of some PHP files. Here are a few examples:
# Make files ending in .php, .html, .xml, etc. be parsed by PHP.
AddType application/x-httpd-php .php .html .xml .css .js .le .txt
<FilesMatch "\.html$">
php_value auto_prepend_file "/home/2427/spwebsites/www.spwebsites.co.uk/incs/phps/config.php"
</FilesMatch>
# Internal Server Error
ErrorDocument 500 /admin/errors.html?code=500
RewriteEngine On
RewriteRule ^([a-zA-Z0-9-]+)/$ $1.html [L]
I don't want any of these set for one directory (where my WordPress installation is). Is there a way I can do this? Can I set a conditional statement for the whole .htaccess file?
Adding a blank .htaccess file in the WordPress directory won't work, because it won't undo the settings in the parent directory.
I was just looking into your dilemma and it is a tricky one. It would be nice to be able to have the DirectoryMatch directive available in .htaccess ...
What you can try is to reset your values in the specific directory via another .htaccess file.
So in the case of the AddType, perhaps resetting it back to just ".php" might work (assuming it doesn't inherit the other values). Definitely not an ideal solution without access to the main config file / virtual host.
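One concrete thing to test along those lines: mod_mime's RemoveType directive undoes extension-to-type mappings for a subdirectory, so a .htaccess in the WordPress directory could start with something like this (an untested sketch):
# Undo the parent AddType so only .php is parsed as PHP here
RemoveType .html .xml .css .js .le .txt
# ErrorDocument can simply be redefined in this file if needed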
Here is a weird idea that you can try/test ... place the "wordpress" dir outside of the main root (or wherever you have the offending .htaccess file). Now route all requests for the wordpress (inner) dir to the outer dir. I wonder if Apache would then skip the offending .htaccess, given that the requests are being routed?