Secure underlying directory with .htaccess - Apache

I have created an extra FTP account for someone else so he can upload files (tournament results, about 20-30 HTM files and images).
I am also very paranoid, so in case he uploads "possibly dangerous" files, I do not want those files to be accessible via an HTTP request. With the help of PHP I want to grab the content of those files. (I do not expect trouble with that yet.)
Problem:
My hoster does not allow extra FTP accounts to have access outside public_html.
So I thought .htaccess should solve my problem, just with a Deny from all rule.
But with FTP access, that .htaccess file can be deleted or changed.
So I tried to add the following code to the main .htaccess file in the root of my site:
<Directory "/home/xxxx.nl/public_html/xxxxxxxx.nl/onzetoernooien/swissmaster_ftp">
deny from all
</Directory>
My site went down with an internal server error.
I have no access to the httpd.conf file.
My next idea was to use a .htaccess file above this directory.
If the absolute path was incorrect, could I use some kind of wildcard, like *swissmaster?
I have searched on the Apache website, but I get lost in the overwhelming amount of information.
Thanks in advance for any help!

Unfortunately you can't use a <Directory> section in .htaccess, only in the server configuration file; that is what causes the internal server error (check your error logs and you'll see the message). We can't secure a subdirectory with a <FilesMatch "subdir/.*$"> section either, as FilesMatch examines only the filename part of the requested URI.
You can, however, use mod_rewrite, along these lines:
RewriteEngine on
RewriteRule ^subdir.*$ - [NC,F]
If the requested URI matches the regex pattern subdir.* ("subdir" followed by anything else), then mod_rewrite's F flag will return a 403 Forbidden status, and the NC flag makes the pattern case-insensitive (No-Case). You may need to tweak the pattern, as it happily catches subdir_new/something.txt too -- I'm sure you get the idea.
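If you want to forbid only that directory and not siblings such as subdir_new, a stricter pattern helps. A minimal sketch, assuming the directory is named subdir and the rules sit in the .htaccess directly above it:
RewriteEngine on
# Match "subdir" itself or anything under "subdir/", but not "subdir_new"
RewriteRule ^subdir(/|$) - [NC,F]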

Related

Can a .htaccess file be hacked?

On a subdomain I want to use only a .htaccess file for redirects. No PHP, no database, nothing else will be used. Can the .htaccess file still be hacked? What should I do to protect it?
The apache2.conf file has the following lines by default, which prevent viewing of .htaccess files:
#
# The following lines prevent .htaccess and .htpasswd files from being
# viewed by Web clients.
#
<FilesMatch "^\.ht">
Require all denied
</FilesMatch>
It will not be visible under a standard Apache setup, which blocks all files starting with .ht from being served. So nobody will be able to view its contents or get at it through the Apache front end. Take the usual precaution of giving it 644 permissions and not having it owned by the user that Apache runs as. No extra security is needed beyond protecting your server generally.
Check that the standard protection is in place, so it can't be viewed. The easiest way is just to try visiting it in a web browser: you should get a 403 Forbidden.
If you're worried, you could put the rules in the main server config instead. I wouldn't worry as long as the above is in place.
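If that block is ever missing (for instance on an older Apache 2.2 setup), you could add the equivalent protection yourself. A sketch using the pre-2.4 access-control syntax, matching what the 2.2 default configs shipped:
# Deny web access to .htaccess, .htpasswd and similar files (Apache 2.2 syntax)
<FilesMatch "^\.ht">
Order allow,deny
Deny from all
</FilesMatch>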

Shutting down a website using .htaccess does not work

OK, it's very simple, but it does not work. I have a wiki site where the root contains an index.php file and the subdirectories contain the content of the wiki (I use PmWiki, so no database is required).
I want to temporarily shut down the website and make it inaccessible, using a nice HTML page to display the shutdown message. I could rename the index.php file, but the rest of the files in the subfolders would remain accessible.
The first thing that worked, but which is not elegant, is restricting the whole site with a password in the .htaccess using "Require valid-user" and all its related commands. The problem is that I cannot display a nice shutdown message as an HTML file.
I also tried renaming the index.php file to something else like site.php, creating an index.html file with the message, and using a script like this:
Order deny,allow
Deny from all
<Files "index.html">
Allow from all
</Files>
In that case, the index.html file is accessible, but it must be typed into the URL manually; it will not be served by default. I tried adding a DirectoryIndex directive like this:
DirectoryIndex index.html
But it still does not work.
So first, is there a way to make the user see only one particular page and block everything else?
Second, doing so makes the site inaccessible to me as well. So is there a way to password-restrict the whole directory structure except for a specific index.html file, so that I could type url/site.php and enter my website using an .htaccess password?
Thanks for any help
Just this rule in root .htaccess should be able to handle this:
RewriteEngine On
RewriteBase /
RewriteRule !^shutdown\.html$ shutdown.html [L,NC]
Now you can keep custom HTML content in /shutdown.html. Keep in mind that you need to use inline CSS/JS, since this rule will also rewrite CSS/JS requests to the /shutdown.html file.
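For the second part of the question (staying able to reach the site yourself), a RewriteCond on your own IP address can exempt you from the rule. A sketch, where 203.0.113.42 is a placeholder for your real address:
RewriteEngine On
RewriteBase /
# Everyone except the placeholder IP below is rewritten to the shutdown page
RewriteCond %{REMOTE_ADDR} !^203\.0\.113\.42$
RewriteRule !^shutdown\.html$ shutdown.html [L,NC]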

What is a .htaccess file?

I am a beginner with the Zend Framework and I want to know more about the .htaccess file and its uses. Can somebody help me?
I found an example like this:
.htaccess file
AuthName "Member's Area Name"
AuthUserFile /path/to/password/file/.htpasswd
AuthType Basic
require valid-user
ErrorDocument 401 /error_pages/401.html
AddHandler server-parsed .html
It's not part of PHP; it's part of Apache.
http://httpd.apache.org/docs/2.2/howto/htaccess.html
.htaccess files provide a way to make configuration changes on a per-directory basis.
Essentially, it allows you to take directives that would normally be put in Apache's main configuration files, and put them in a directory-specific configuration file instead. They're mostly used in cases where you don't have access to the main configuration files (e.g. a shared host).
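For instance, a short .htaccess like the following (the two directives are just common illustrations, and they only take effect if the host's AllowOverride settings permit them) changes the configuration for its directory alone:
# Disable directory listings and prefer index.php as the index page
Options -Indexes
DirectoryIndex index.php index.html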
.htaccess is a configuration file for use on web servers running the Apache Web Server software.
When a .htaccess file is placed in a directory which is in turn 'loaded via the Apache Web Server', then the .htaccess file is detected and executed by the Apache Web Server software.
These .htaccess files can be used to alter the configuration of the Apache Web Server software to enable/disable additional functionality and features that the Apache Web Server software has to offer.
These facilities include basic redirect functionality, for instance when a 404 File Not Found error occurs, as well as more advanced functions such as content password protection or image hot-link prevention.
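As a concrete illustration of image hot-link prevention, a common mod_rewrite recipe looks like this; example.com stands in for your own domain:
RewriteEngine on
# Allow empty referers (direct visits), but forbid image requests referred from other sites
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F]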
Whenever a request is sent to the server, it passes through the .htaccess file, where rules can be defined to instruct how the request is handled.
Below are some common uses of .htaccess files on a server:
1) AUTHORIZATION, AUTHENTICATION: .htaccess files are often used to specify the security restrictions for the particular directory, hence the filename "access". The .htaccess file is often accompanied by an .htpasswd file which stores valid usernames and their passwords.
2) CUSTOMIZED ERROR RESPONSES: Changing the page that is shown when a server-side error occurs, for example HTTP 404 Not Found.
Example : ErrorDocument 404 /notfound.html
3) REWRITING URLS: Servers often use .htaccess files to rewrite "ugly" URLs to shorter and prettier ones (see the sketch below).
4) CACHE CONTROL: .htaccess files allow a server to control User agent caching used by web browsers to reduce bandwidth usage, server load, and perceived lag.
More info : http://en.wikipedia.org/wiki/Htaccess
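As a sketch of item 3, a single rewrite rule can map a pretty URL onto the real script; product.php and the URL layout here are hypothetical:
RewriteEngine on
# Serve /product/42 from product.php?id=42 without changing the visible URL
RewriteRule ^product/([0-9]+)$ product.php?id=$1 [L,QSA]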
You are allowed to use php_value to change PHP settings in a .htaccess file, the same way php.ini does.
Example:
php_value date.timezone Asia/Kuala_Lumpur
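Boolean PHP settings use php_flag rather than php_value; for example:
# Turn off on-screen error display for this directory only
php_flag display_errors off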
For other PHP settings, please read http://www.php.net/manual/en/ini.list.php
.htaccess is an Apache configuration file used to change the configuration on a per-directory basis.
A .htaccess file is used to change the functions and features of the Apache server:
It is used to rewrite URLs.
It is used to password-protect a site address.
It can also restrict IP addresses, so that the site will not open for particular IP addresses.
You can think of it as a per-directory counterpart to php.ini: the php.ini file stores most of the configuration of PHP (such as enabling or disabling curl) for every directory on the server, whereas .htaccess applies such settings only to a particular directory.
It also makes it easier to give out specific addresses to people, say for a conference or a specific project or product.
It can help make a site more secure, for example against attacks such as SQL injection.
Create the .htaccess file in your web root, e.g. /var/www/html/.htaccess:
<IfModule mod_rewrite.c>
RewriteEngine on
# If the request does not map to an existing file...
RewriteCond %{REQUEST_FILENAME} !-f
# ...send it to the front controller, preserving the query string
RewriteRule ^ index.php [QSA,L]
</IfModule>
What it is:
- A settings file for the server
- It cannot be accessed by the end user
- There is no need to reboot the server; changes take effect immediately
- It can serve as a bridge between your code and the server
What we can do with it:
- URL rewriting
- Custom error pages
- Caching
- Redirections
- Blocking IPs
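Putting a few of those together, a small illustrative .htaccess might look like this; all paths and the IP address are placeholders, and the directives need a host that allows these overrides:
# Custom error page
ErrorDocument 404 /404.html
# Permanent redirect from an old URL to a new one
Redirect 301 /old-page.html /new-page.html
# Block a single IP address (Apache 2.4 syntax)
<RequireAll>
Require all granted
Require not ip 203.0.113.7
</RequireAll>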

mod_rewrite to absolute path in .htaccess - turning up 404

I want to map a number of directories in a URL:
www.example.com/manual
www.example.com/login
to directories outside the web root.
My web root is
/www/htdocs/customername/site
the manual I want to redirect to is in
/www/customer/some_other_dir/manual
In mod_alias, this would be equal to
Alias /manual /www/customer/some_other_dir/manual
but as I have access only to .htaccess, I can't use Alias, so I have to use mod_rewrite.
What I have got right now after this question is the following:
RewriteRule ^manual(/(.*))?$ /www/htdocs/customername/manual/$2 [L]
this works in the sense that requests are recognized and redirected properly, but I get a 404 that looks like this (note the absolute path):
The requested URL /www/htdocs/customername/manual/resourcename.htm
was not found on this server.
However, I have checked with PHP (echo file_exists(...)) and that file definitely exists.
Why would this be? According to the mod_rewrite docs, this is possible, even in a .htaccess file. I understand that when doing mod_rewrite in .htaccess an automatic prefix is added, but surely not to absolute paths?
It shouldn't be a permissions problem either: the directory is not in the web root, but it is within the FTP tree to which only one user, the main FTP account, has access.
I can change the web root in the control panel anytime, but I want this to work the way I described.
This is shared hosting, so I have no access to the error logs.
I just checked, this is not a wrongful 301 redirection, just an internal rewrite.
In .htaccess, you cannot rewrite to files outside the wwwroot.
You need to have a symbolic link within the webroot that points to the location of the manual.
Then in your .htaccess you need the line:
Options +SymLinksIfOwnerMatch
or maybe a little more blindly
Options +FollowSymlinks
Then you can
RewriteRule ^manual(/(.*))?$ /manual/$2 [L]
where manual under site is a symlink to /www/customer/some_other_dir/manual.
You create the symlink on the command line with:
ln -s /www/customer/some_other_dir/manual /www/htdocs/customername/site/manual
But I imagine you're on shared hosting without shell access, so look into creating symbolic links within cPanel, Webmin, or whatever your admin interface is. There are PHP/CGI scripts that do it as well. Of course, you're still limited to the permissions that the host has given you: if they don't allow you to follow symlinks as a policy, you cannot override that within your .htaccess.
AFAIK mod_rewrite works at the 'protocol' level (meaning on-the-wire HTTP), so I suspect you are getting an HTTP 302 with your directory path in the Location header.
So I'm afraid you might be stuck, unless your hosting lets you follow symbolic links; then you can link to that location under your current document root (assuming you have shell access, or that this is possible via FTP or your control panel).
Edit: the docs actually mention the URL-to-file phase hook, so now I suspect the directory directives aren't allowing enough permissions.
This tells you what you need to know.
The requested URL /www/htdocs/customername/manual/resourcename.htm
was not found on this server.
It interprets RewriteRule ^manual(/(.*))?$ /www/htdocs/customername/manual/$2 [L] to mean rewrite example.com/manual/ as if it were example.com/www/htdocs/customername/manual/.
Try
RewriteRule ^manual(/(.*))?$ /customername/manual/$2 [L]
instead.

How can I redirect requests to specific files above the site root?

I'm starting up a new website, and I'm having difficulty enforcing my desired file/folder organization:
For argument's sake, let's say that my website will be hosted at:
http://mywebsite.com/
I'd like to have (and have set up) Apache's virtual host map http://mywebsite.com/ to the /fileserver/mywebsite_com/www folder.
The problem arises now that I've decided I'd like to put a few files (favicon.ico and robots.txt) into a folder ABOVE the /www that Apache mounts http://mywebsite.com/ onto:
robots.txt + favicon.ico go into => /fileserver/files/mywebsite_com/stuff
So, when people go to http://mywebsite.com/robots.txt, Apache would serve them the file from /fileserver/files/mywebsite_com/stuff/robots.txt.
I've tried to setup a redirection via mod_rewrite, but alas:
RewriteRule ^(robots\.txt|favicon\.ico)$ ../stuff/$1 [L]
did me no good, because basically I was telling Apache to serve something that is above its mounted root.
Is it somehow possible to achieve the desired functionality by setting up Apache's (2.2.9) Virtual Hosts differently, or defining a RewriteMap of some kind that would rewrite the URLs in question not into other URLs, but into system file paths instead?
If not, what would be the preferred course of action for the desired organization (if any)?
I know that I can access the aforementioned files via PHP and then stream them, say with readfile(..), but I'd like to have Apache do as much of the work as possible; it's bound to be faster than doing I/O through PHP.
Thanks a lot, this has deprived me of hours of constructive work already. Not to mention poor Apache getting restarted every few minutes. Think of the poor Apache :)
It seems you are set on using a RewriteRule. However, I suggest you use an Alias:
Alias /robots.txt /fileserver/files/mywebsite_com/stuff/robots.txt
Additionally, you will have to tell Apache about the restrictions on that file. If you have more than one file treated this way, do it for the complete directory:
<Directory /fileserver/files/mywebsite_com/stuff>
Order allow,deny
Allow from all
</Directory>
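Since the question mentions favicon.ico as well, the same approach covers it with one more line:
Alias /favicon.ico /fileserver/files/mywebsite_com/stuff/favicon.ico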
Can you use symlinks?
ln -s /fileserver/files/mywebsite_com/stuff/robots.txt /fileserver/files/mywebsite_com/stuff/favicon.ico /fileserver/mywebsite_com/www/
(ln is like cp, but creates symlinks instead of copies with -s.)