Block downloading of files, but show on my own site pages - apache

I want to block downloading of images from a directory, but allow them to be displayed on my own blog's pages (on same domain).
I created the following .htaccess file:
order deny,allow
deny from all
allow from mydomain.ru
It blocks downloading AND blocks showing images on my blog's pages.
What am I missing?
Shared hosting, Ubuntu Linux, Apache. I don't have access to httpd.conf.

allow from mydomain.ru only permits requests whose client host resolves to mydomain.ru; it blocks everything else. It looks at the connecting IP address, not at the page the image is embedded in, so unless your visitors are browsing from that server, the images are blocked for everyone.
I don't know how your images are being served, but you may be able to block requests whose Referer header does not match your domain. The Referer can easily be forged, so it's by no means foolproof.
If your HTML pages link to the images, something like the following should work (in .htaccess context the RewriteRule pattern must not start with a slash, and the first condition lets through clients that send no Referer at all):
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://([^/]+\.)?mydomain\.ru(/|$) [NC]
RewriteRule ^path/to/directory/ - [F]

Related

What are ways to block direct access to files on apache but allow them through scripts?

If I type the direct path to any file in my server directory, I can see and download it without being logged in. (For example, I have a directory foo with a file bar.jpg in it; if I type "ip:port/foo/bar.jpg" into the address bar, I can see the picture without going through the pages I created.) I know it would be hard to guess, but if it can happen it eventually will. Is there a way to let my PHP script access files and display them in a web page, while denying any direct access when the path is typed in? I have tried .htaccess files and directly altering the Apache server config; my access rule looks like this:
<Directory "C:\xampp\htdocs\RootFolder\Login System">
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?Family [NC]
RewriteRule \.(jpg|jpeg|png|gif)$ - [NC,F,L]
</Directory>
The problem here is that it randomly blocks some images in my page while allowing others through:
GET http://localhost/Family/IMG_2436.jpg 403 (Forbidden)
I have been banging my head against this, trying to get it so that anyone who types in a direct link gets an access-denied error, while access through my PHP page just shows the picture or file. Is there any way to do this? P.S. I'm on a Windows setup (XAMPP).

apache: blocking users from viewing content by hitting the URL directly

I have disabled directory browsing using
Options -Indexes
Now I want to disable direct access to assets (images and videos) via their URLs, for any project present in the htdocs folder.
My web server is Apache Tomcat. I am accessing the content from a project in Apache using a URL. The URL should serve the content to the JSP/HTML pages deployed in Apache Tomcat, but it should be blocked if an attacker finds the URL in the page source and pastes it into the browser.
Thanks in advance.
You have two solutions:
1) create a .htaccess file and set rules
2) store your protected files somewhere outside the web root folder and access them using PHP or any other server-side language.
The 2nd one is recommended: the files can then only be reached through server-side code. Create a file-handling script and check permissions before serving anything; a minimal sketch of such a script follows.
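For illustration, here is a minimal sketch of such a file-handling script in PHP. The ../protected_files folder, the $_SESSION['logged_in'] flag and the extension list are assumptions made for this example, not anything prescribed by Apache; adapt them to your own layout and login system.
<?php
// download.php - minimal sketch of a file-handling script.
// ASSUMPTIONS: the protected files live OUTSIDE the web root in ../protected_files
// and your login system sets $_SESSION['logged_in']; adjust both to your setup.
session_start();

// 1) Permission check: refuse anonymous users.
if (empty($_SESSION['logged_in'])) {
    http_response_code(403);
    exit('Access denied');
}

// 2) Sanitize the requested name so "../" tricks cannot escape the folder.
$name = basename($_GET['file'] ?? '');
$base = realpath(__DIR__ . '/../protected_files');
$path = ($base !== false && $name !== '') ? realpath($base . '/' . $name) : false;

if ($path === false || strpos($path, $base . DIRECTORY_SEPARATOR) !== 0 || !is_file($path)) {
    http_response_code(404);
    exit('Not found');
}

// 3) Send the file with a sensible content type.
$types = ['jpg' => 'image/jpeg', 'jpeg' => 'image/jpeg',
          'png' => 'image/png', 'gif' => 'image/gif'];
$ext   = strtolower(pathinfo($path, PATHINFO_EXTENSION));
header('Content-Type: ' . ($types[$ext] ?? 'application/octet-stream'));
header('Content-Length: ' . filesize($path));
readfile($path);
An image tag or download link on your pages would then point at download.php?file=bar.jpg instead of the file's real path.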
To go the .htaccess route instead, try the following:
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^http://(www\.)?localhost [NC]
RewriteRule \.(gif|jpg)$ - [F]
This returns 403 if you access an image directly, but allows it to be displayed on the site's own pages. Change localhost to your own server/domain name.
NOTE: if you open a page that contains an image and then copy that image's path into the address bar, you may still see the image. That is only the browser's cache at work; the image has not actually been reloaded from the server.
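If you want to check the rule without the browser cache getting in the way, you can request the image from the command line with and without a Referer header; the image path and page URL below are only placeholders for your own:
curl -I http://localhost/images/pic.jpg
# direct request, no Referer: should come back 403 Forbidden
curl -I -e "http://localhost/gallery.html" http://localhost/images/pic.jpg
# Referer matches the site: should come back 200 OK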

How to allow server to access files but not user?

I have a directory with a bunch of files in it, and I don't want anybody to be able to access those files, either by getting a directory listing or by guessing the file location and typing it in; it should NOT be possible to download them directly.
I accomplished this by putting the below in my .htaccess file:
Options -Indexes
Order Allow,Deny
Deny from all
However, I want the user to be able to download the file ONLY IF they access it via a script (which is in a different directory) which will give them the download. At the moment with the above settings it doesn't work.
I thought of putting something like..
Allow from domain.com
But I'm not 100% sure what that means. Does it check where the REQUEST is coming from, so it would work when the server itself requests access to that directory? Or would it still fail, since the user is still coming in via the domain (through the other script) to reach the directory?
If you deliver the files with a script, you can store them outside the document root, so you don't need an .htaccess file at all. That is probably the better approach.
One way is to redirect the user, say to your home page, when they try to access the downloadable files directly; in this example they sit inside the folder sec_files.
I researched this when one of my clients, who had purchased Secure Download Links (a CodeCanyon product), asked for a way to protect a folder containing images and other downloadables.
The .htaccess code is below; this .htaccess file is placed inside sec_files, i.e. the downloadable-files folder.
RewriteEngine on
RewriteCond %{REQUEST_URI} ^/~sec_files/ [OR]
RewriteCond %{HTTP_HOST} ^www.satyamtechnologies.net$
RewriteRule ^(.*)$ http://www.satyamtechnologies.net [R,L]
When you access the folder directly it redirects you to the home page, but a PHP script on the same server can still read the same files and serve them for download.

mod_rewrite to absolute path in .htaccess - turning up 404

I want to map a number of directories in a URL:
www.example.com/manual
www.example.com/login
to directories outside the web root.
My web root is
/www/htdocs/customername/site
the manual I want to redirect to is in
/www/customer/some_other_dir/manual
In mod_alias, this would be equal to
Alias /manual /www/customer/some_other_dir/manual
but as I have access only to .htaccess, I can't use Alias, so I have to use mod_rewrite.
What I have got right now after this question is the following:
RewriteRule ^manual(/(.*))?$ /www/htdocs/customername/manual/$2 [L]
this works in the sense that requests are recognized and redirected properly, but I get a 404 that looks like this (note the absolute path):
The requested URL /www/htdocs/customername/manual/resourcename.htm
was not found on this server.
However, I have checked with PHP: echo file_exists(...) and that file definitely exists.
Why would this be? According to the mod_rewrite docs this is possible, even in a .htaccess file. I understand that mod_rewrite in .htaccess adds an automatic per-directory prefix, but surely not to absolute paths?
It shouldn't be a rights problem either: It's not in the web root, but within the FTP tree to which only one user, the main FTP account, has access.
I can change the web root in the control panel anytime, but I want this to work the way I described.
This is shared hosting, so I have no access to the error logs.
I just checked, this is not a wrongful 301 redirection, just an internal rewrite.
In .htaccess, you cannot rewrite to files outside the wwwroot.
You need to have a symbolic link within the webroot that points to the location of the manual.
Then in your .htaccess you need the line:
Options +SymLinksIfOwnerMatch
or maybe a little more blindly
Options +FollowSymlinks
Then you can
RewriteRule ^manual(/(.*))?$ /www/htdocs/customername/site/manual/$2 [L]
where manual under site is a link to /www/customer/some_other_dir/manual
You create the symlink on the command line (target first, then the link name):
ln -s /www/customer/some_other_dir/manual /www/htdocs/customername/site/manual
But I imagine you're on shared hosting without shell access, so look into creating symbolic links within cPanel, Webmin, or whatever your admin interface is. There are PHP/CGI scripts that do it as well. Of course, you're still limited to the permissions the host has given you: if they don't allow following symlinks as a policy, you cannot override that from your .htaccess.
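If you end up doing it from PHP, a throwaway script along these lines could create the link. The paths are the ones from this question, and symlink() may be disabled on some shared hosts, so treat this as a sketch rather than a guaranteed solution:
<?php
// make_manual_link.php - one-off sketch: upload, run once, then delete it.
$target = '/www/customer/some_other_dir/manual';   // the real manual directory
$link   = '/www/htdocs/customername/site/manual';  // the link inside the web root
if (file_exists($link)) {
    echo "Something already exists at $link";
} elseif (symlink($target, $link)) {
    echo "Symlink created";
} else {
    echo "Could not create symlink (function disabled or permission denied)";
}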
AFAIK mod_rewrite works at the 'protocol' level (meaning on-the-wire HTTP), so I suspect you are getting an HTTP 302 with your directory path in the Location header.
So I'm afraid you might be stuck, unless your hosting lets you follow symbolic links; then you can link to that location from under your current document root (assuming you have shell access, or can create the link via FTP or your control panel).
Edit: the docs actually mention a URL-to-filename phase hook, so now I suspect the directory directives aren't allowing enough permissions.
This tells you what you need to know.
The requested URL /www/htdocs/customername/manual/resourcename.htm
was not found on this server.
It interprets RewriteRule ^manual(/(.*))?$ /www/htdocs/customername/manual/$2 [L] to mean rewrite example.com/manual/ as if it were example.com/www/htdocs/customername/manual/.
Try
RewriteRule ^manual(/(.*))?$ /customername/manual/$2 [L]
instead.

How can I block mp3 crawlers from my website under Apache?

Is there some way to block access from a particular referrer using a .htaccess file or similar? My bandwidth is being eaten up by people referred from http://www.dizzler.com, which is a Flash-based site that lets you browse a library of crawled, publicly available MP3s.
Edit: Dizzler was still getting in (probably wasn't indicating referrer in all cases) so instead I moved all my mp3s to a new folder, disabled directory browsing, and created a robots.txt file to (hopefully) keep it from being indexed again. Accepted answer changed to reflect futility of my previous attempt :P
That's like saying you want to stop spam-bots from harvesting emails on your publicly visible page - it's very tough to tell the difference between users and bots without forcing your viewers to log in to confirm their identity.
You could use robots.txt to disallow the spiders that actually follow those rules, but that's on their side, not your server's. There's a page that explains how to catch the ones that break the rules and explicitly ban them: Using Apache to stop bad robots [evolt.org]
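For the crawlers that do follow the rules, a robots.txt in the web root could be as simple as this (the /mp3/ path is just a placeholder for wherever the files live):
User-agent: *
Disallow: /mp3/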
If you want an easy way to stop dizzler in particular, you should be able to pop open the .htaccess file in the directory you want to protect and add:
Order Allow,Deny
Allow from all
Deny from 66.232.150.219
(Directory sections can't be used inside .htaccess; the file already applies to the directory it sits in, so the directives go in bare.)
From this site (put this in your .htaccess file):
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://(www\.)?dizzler\.com [NC]
RewriteRule .* - [F]
You could use something like
SetEnvIfNoCase Referer dizzler.com spammer=yes
Order allow,deny
allow from all
deny from env=spammer
Source: http://codex.wordpress.org/Combating_Comment_Spam/Denying_Access
It's not a very elegant solution, but you could block the site's crawler bot, then rename your mp3 files to break the links already on the site.