I have an Apache web server running in a CentOS environment. There is a folder, and in that folder there is a file with an .exe extension; let's call it x.exe.
When I try to download this file using http://mysite.com/folder/x.exe, I get a 403 error.
But if I add a GIF to that folder, it works: http://mysite.com/folder/pic.gif
I don't have SSH access to this server, but I need some clue as to why this is happening; the file permissions are correct too.
Any help is appreciated.
Within Apache's httpd.conf, it is possible to specify default handling actions for certain file types or paths. It may be that your server is configured to block executable files altogether. Similar blocking can also occur in an .htaccess file. There are a few ways to do it... here's one:
<Files ~ "\.exe$">
Order allow,deny
Deny from all
</Files>
That little snippet could be in the main .conf file, an included .conf file, OR an .htaccess file (or all three!), and again, that is just one possibility. Your best bet is to check out the server logs. They will indicate why a given request was denied, in a form similar to this:
[Wed Oct 11 14:32:52 2000] [error] [client 127.0.0.1] client denied by
server configuration: /www/root
Take a look at this document for information about server logs (including default paths to the logs themselves).
As I mentioned, there are a few other ways to block access to certain file types, certain files, certain folders, etc. Without looking at the error logs, it is very difficult to determine the cause. Further, without full access to the server, it may not be possible to alter this behavior. This blockage could be in place as a matter of policy for your web host.
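For example (purely illustrative, and assuming a newer Apache 2.4 setup where the Require directive replaces the older Order/Deny pair), an equivalent block could look like this:
# Apache 2.4 style: refuse to serve .exe files (illustrative sketch)
<FilesMatch "\.exe$">
Require all denied
</FilesMatch>
Either form produces exactly the 403 you are seeing, which is why the error log is the quickest way to confirm which rule is actually firing.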
I'd like to add that I spent about two hours trying this over and over again, only to discover that SELinux was denying specific file types for httpd (this was on Fedora 16).
Try:
setenforce Permissive
and see if that corrects the error.
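If switching to permissive mode makes the download work, the longer-term fix is usually to repair the file's SELinux context rather than leave enforcement off; a rough sketch, assuming the file lives under the standard web root (adjust the paths to your layout):
# See what SELinux denied recently
ausearch -m avc -ts recent
# Restore the default httpd context on the web tree
restorecon -Rv /var/www/html
# Turn enforcing mode back on once the file is served correctly
setenforce Enforcing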
Well, the answer was that I had this in that folder, and it is what forbids the exe:
Deny from all
<FilesMatch "\.(html|HTML|htm|HTM|xhtml|XHTML|js|JS|css|CSS|bmp|BMP|png|PNG|gif|GIF|jpg|JPG|jpeg|JPEG|ico|ICO|pcx|PCX|tif|TIF|tiff|TIFF|au|AU|mid|MID|midi|MIDI|mpa|MPA|mp3|MP3|ogg|OGG|m4a|M4A|ra|RA|wma|WMA|wav|WAV|cda|CDA|avi|AVI|mpg|MPG|mpeg|MPEG|asf|ASF|wmv|WMV|m4v|M4V|mov|MOV|mkv|MKV|mp4|MP4|swf|SWF|flv|FLV|ram|RAM|rm|RM|doc|DOC|docx|DOCX|txt|TXT|rtf|RTF|xls|XLS|xlsx|XLSX|pages|PAGES|ppt|PPT|pptx|PPTX|pps|PPS|csv|CSV|cab|CAB|arj|ARJ|tar|TAR|zip|ZIP|zipx|ZIPX|sit|SIT|sitx|SITX|gz|GZ|tgz|TGZ|bz2|BZ2|ace|ACE|arc|ARC|pkg|PKG|dmg|DMG|hqx|HQX|jar|JAR|xml|XML|pdf|PDF)$">
Allow from all
</FilesMatch>
I added exe there and it worked fine.
Also, a note: this was a SilverStripe CMS powered site, and the file was in SilverStripe's assets folder.
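For anyone hitting the same whitelist, the change is simply to add the exe extension to the pattern; abbreviated here for readability (the real pattern keeps all of the other extensions listed above):
Deny from all
<FilesMatch "\.(html|HTML|gif|GIF|jpg|JPG|pdf|PDF|exe|EXE)$">
Allow from all
</FilesMatch>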
On a subdomain I want to use only a .htaccess file for redirects. No PHP, no database, nothing else will be used. Can the .htaccess file still be hacked? What should I do to protect it?
The apache2.conf file has the following lines by default, which prevent viewing of .htaccess files:
#
# The following lines prevent .htaccess and .htpasswd files from being
# viewed by Web clients.
#
<FilesMatch "^\.ht">
Require all denied
</FilesMatch>
It will not be visible under a standard Apache setup, which blocks all files starting with .ht from being served. So nobody will be able to view the contents or get at it through the Apache front end. Take the usual precautions of giving it 644 permissions and not having it owned by the user that Apache runs as. No extra security is needed beyond protecting your server generally.
Check that the standard protection is in place, so it can't be viewed. The easiest way is just to try visiting it in a web browser; you should get a 403 Forbidden.
If you're worried, you could put the rules in the main server config instead, but I wouldn't worry as long as the above is in place.
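A quick way to verify from the command line (the subdomain name here is just a placeholder; substitute your own):
curl -I http://sub.example.com/.htaccess
# Expect a status line like: HTTP/1.1 403 Forbidden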
I'm getting a weird problem here.
I have made at least 30 localhost WordPress installs before, and every one of them went fine.
Now (after re-installing Win7 and XAMPP) I can't access ANY install.php file (whether it's WP's own, a dummy empty one, etc.). It also doesn't matter where the file is located (the wp-admin folder, a random place outside WP, the htdocs root folder, etc.).
This is not an antivirus or Windows firewall problem.
It seems to be coming from apache itself.
The Apache error log says this:
[Fri Nov 30 16:46:40.223524 2012] [access_compat:error] [pid 5876:tid 1604] [client ::1:59365] AH01797: client denied by server configuration: D:/xampp/htdocs/vmf05/wp-admin/install.php
Does anyone have a clue on this?
I've gone through all the normal steps and haven't found a solution yet.
Also, if I disable the access_compat module, apache won't start.
Thanks in advance for your help.
I can't believe what the problem was!
I moved a .htaccess file that had the WP Better Security directives to the root of XAMPP's site as a backup, and forgot to move it back afterwards.
No install.php file ANYWHERE could be accessed, since that file had the following:
<files install.php>
Order allow,deny
Deny from all
</files>
And, since the file was on the root, it was affecting the whole server.
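As a side note (an observation, not part of the original fix): the AH01797 tag and the access_compat module in that log line indicate Apache 2.4, where the native way to write the same block would be:
<Files install.php>
Require all denied
</Files>
Either form yields the same "client denied by server configuration" error once the file sits at the server root.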
I installed Apache2, PHP, and MySQL onto my Linux Mint machine with the hope of continuing a website I had built. After copying and pasting all of the code I had, I noticed a problem with one of my include statements:
<?php include("./dir/file1.html");
That wasn't working. Originally I thought the issue was with PHP, but after a lot of trial and error I've concluded it's Apache not allowing access to subdirectories of the /var/www/ directory.
Since I'm new to editing Apache configuration files, I'm not really sure what to change to allow access to all subdirectories within /var/www/ on localhost. I've tried adding:
<Directory /var/www/*>
order allow,deny
allow from all
</Directory>
to my httpd.conf file (which was blank, which I learned has something to do with Linux Mint being Debian based) and confirmed that the default file in /sites-available has similar code. I'll post that if it's requested.
I'm unsure what else I can do to get Apache to allow access to subdirectories of /var/www/ for localhost, and none of my previous attempts have worked.
UPDATE:
I believe it's an Apache issue because when I try to go to a subdirectory through the browser (like localhost/dir/), I get a 403 error. I don't have to be requesting an actual webpage for that to happen. Also, include statements that include files in the current directory have no problem; it's only subdirectories.
The include statement above gives no errors or any other useful messages; whatever it is including is simply not there. I've tried require, but that gives me a 500 server error: the server may be down for maintenance (paraphrased).
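For anyone debugging a similarly silent include failure, a minimal sketch (using the path from the question, and intended only as a debugging aid) is to turn PHP error display on so the underlying warning becomes visible:
<?php
// Show all PHP errors while debugging (do not leave this enabled in production)
error_reporting(E_ALL);
ini_set('display_errors', '1');

// Build the path relative to this script instead of the current working directory
include __DIR__ . '/dir/file1.html';
With display_errors on, a failed include prints a "failed to open stream" warning that usually names the exact path or permission problem.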
How can I get Apache to display the contents of my folder and provide links to them? Similar to http://www.kernel.org/pub/linux/?
I don't have access to the Apache configuration, so I'm looking for something in the way of .htaccess or something I can do in just my home folder.
You have to use the Indexes option in the Apache configuration.
For instance, in the .htaccess file for your directory (if you can use those, and have sufficient privileges), you could put:
Options +Indexes
This functionality is provided by mod_autoindex, which has lots of options for fine-tuning the generated output, by the way.
To get this working, that module must be loaded, and to be able to activate the option in an .htaccess file you will need the server's admin to give you some privileges (with the AllowOverride directive in Apache's main config file, if I remember correctly; well, your admin should know that better than me anyway).
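As a rough illustration of that fine-tuning (the directive names are standard mod_autoindex options; the specific values are just examples):
# Enable listings and tweak how they are rendered
Options +Indexes
IndexOptions FancyIndexing HTMLTable NameWidth=*
# Hide backup files and dotfiles from the generated listing
IndexIgnore *.bak .??*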
I've written a script which takes the summary of an order and stores it in an XML file; the problem is that, obviously, I don't want people to be able to open the XML file in their browser.
I'm hosted on a very dodgy shared server with limited abilities: no SSH, for starters.
Is there a place I can put this file so that PHP will still be able to read/write to it, but web browsers won't be able to get to it?
Ordinarily, I'd create a folder outside the document root and put it there, but I get a "Permission denied" message when I try that.
The folders which are there are:
anon_ftp
bin
cert
cgi-bin
conf
error_docs
etc
httpdocs
httpsdocs
pd
private
statistics
subdomains
web_users
PHP can't access the file when it's in the private folder. Would this be possible using .htaccess?
You could create a directory containing a .htaccess file that looks something like the following:
Deny from all
This will instruct Apache not to serve files from that directory; any attempts to access the directory or its contents will be met with a "403 Forbidden" response from the server.
Note: This depends upon the host not having removed Limit from the list of options in their AllowOverride directive; most shared hosts shouldn't have a reason to do this.
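To make the idea concrete, here is a small sketch (the directory and file names are made up for illustration), assuming the protected directory sits inside httpdocs next to the script:
<?php
// protected-data/.htaccess contains only: Deny from all
$path = __DIR__ . '/protected-data/order-summary.xml';
$orderXml = '<order><total>42.00</total></order>';

// PHP reads and writes through the filesystem, so the Apache deny rule
// does not affect it; only direct browser requests receive the 403.
file_put_contents($path, $orderXml);
$stored = file_get_contents($path);
Browsers requesting http://example.com/protected-data/order-summary.xml get 403 Forbidden, while the script keeps full read/write access.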
I worked around it by putting the XML file in my httpdocs folder, but added a .htaccess file with this in it:
<Files ~ "myfile.xml">
Order allow,deny
Deny from all
</Files>
Couldn't you ask the shared-hosting provider to create an outside-web-root folder for you? I've certainly done this in the past.