I am working on a website on a low-memory server. I edit/update the website in a CMS and then do a nightly wget to another folder/vhost that serves fast static pages to the users.
It works well and is fast, but I have now found that some pages don't work.
Wget downloads index.php?id=10 and creates the file index.html?id=10.html
One specific folder of my website has the files index.html, index.html?id=1, index.html?id=2, index.html?id=3 and so on.
When I open www.mysite.com/myfolder/index.html?id=1.html in my browser I get index.html, even though the file index.html?id=1.html exists.
Does anybody have a solution for this? Maybe an .htaccess rule or something else?
I am looking forward to any ideas.
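What happens here is that everything after the ? is treated as a query string, so the request never matches the file whose name literally contains ?id=1.html. One way around that (a sketch, assuming you can re-run the nightly mirror; host names and paths below are placeholders) is to stop wget from putting ? into file names at all and let it rewrite the links inside the mirrored pages to match:

    # --restrict-file-names=windows replaces the "?" with "@" in saved names,
    # so index.php?id=10 becomes index.php@id=10.html (-E appends .html);
    # -k rewrites the links inside the downloaded pages to the new names.
    wget --mirror -E -k --restrict-file-names=windows \
         -P /var/www/static http://cms.mysite.com/

If you also want the old dynamic URLs to keep working on the static vhost, a rewrite along these lines in the mirrored folder's .htaccess could map them to the saved files (again only a sketch):

    RewriteEngine On
    RewriteCond %{QUERY_STRING} ^id=([0-9]+)$
    RewriteRule ^index\.(?:php|html)$ index.php@id=%1.html [L]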
I took over a website that I'm supposed to administer, and somebody brought to my attention that certain indexes and files are accessible which shouldn't be. I will be using dummy names.
It was possible to access example.com/intern before, but I changed a line in /etc/apache2/apache2.conf according to this answer: https://stackoverflow.com/a/31445273 . That partly worked: I now get a 403 Forbidden when I navigate to example.com/intern, and that's basically what I want.
However, the intern directory contains a file called file.php.bak as well as file.php. When I navigate to example.com/intern/file.php I get a blank page. I am not sure whether file.php can be accessed some other way, because the page does load and I don't get a 403 like before. What is far worse, and the reason I am struggling with this: if I go to example.com/intern/file.php.bak, my browser (Firefox) offers to download file.php.bak, which I can then read in plain text. I want all files in intern to be inaccessible via the website, but I have no idea how to do this. Can anybody help?
Things I've tried:
1. Removing Indexes from the apache2.conf file as mentioned above. This only puts the 403 on the directory itself, not recursively on all the files in it.
2. Writing a .htaccess file as described here: https://fedingo.com/how-to-prevent-direct-file-download-in-apache-server/ and putting it in intern, with the same result as in 1).
3. Putting an empty index.html file in the intern directory. This removes the 403 on example.com/intern, but the download of example.com/intern/file.php.bak is still possible. I've also tried index.php, with the same result.
File System:
The application runs from /var/www/application, which is also the folder for the /var/www/application/index.php I want to use. The /var/www/application/intern directory is also there. While it isn't browsable anymore, the files in it are still accessible: /var/www/application/intern/file.php can be navigated to via example.com/intern/file.php, but it seems it can't be downloaded or read, as it results in a blank page. /var/www/application/intern/file.php.bak, however, can be downloaded via example.com/intern/file.php.bak.
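One way to get what the question asks for, sketched here on the assumption of Apache 2.4 (the /etc/apache2 layout suggests it), is a per-directory deny; unlike removing Indexes, it applies to every file under intern:

    # Either drop a .htaccess with this single line into intern
    # (requires AllowOverride to permit authorization directives):
    Require all denied

    # ...or, preferably, put the equivalent block into the vhost/apache2.conf:
    <Directory /var/www/application/intern>
        Require all denied
    </Directory>

    # Independently of that, backup files can be blocked site-wide:
    <FilesMatch "\.bak$">
        Require all denied
    </FilesMatch>

On Apache 2.2 the old Order deny,allow / Deny from all syntax would be needed instead. Blocking HTTP access this way does not affect PHP code that includes these files from disk, since includes go through the filesystem, not through Apache.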
Let's say the Apache document root is set to DocumentRoot "/folder_one/folder_two".
Placing files in folder_one will prevent people from browsing your Apache server and requesting those files directly.
Place an index file in folder_two and add some code, such as PHP, to include whatever files you want from folder_one.
In this manner Apache will still be able to serve whatever content you want from folder_one, and people will not be able to request the files directly, as they are located in a directory above the Apache document root.
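To make that concrete (the folder names are just the placeholders used above, not a recommendation), the only part Apache maps to URLs is folder_two:

    # httpd.conf / vhost fragment, illustrative paths only
    DocumentRoot "/folder_one/folder_two"
    <Directory "/folder_one/folder_two">
        Require all granted
    </Directory>
    # No URL maps to /folder_one itself, so its files cannot be requested
    # over HTTP; a script in folder_two can still read them from disk,
    # e.g. an include with the absolute path /folder_one/some_file.php.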
I keep getting a 403 error on my homepage, despite having all my permissions set to allow public to read. I'm not using any plugins, I'm not using Wordpress, and though my site is routed through Cloudflare it goes through to my hosting provider's 403 page (I haven't created my own). I've tried 755 and 644 and I keep getting the same thing. How can I fix this?
(The website is alexbelman.com)
There are several possible reasons depending on your hosting environment, and a bit more detailed info from you about your hosting setup would help. For example: is your site on a server running Linux? cPanel?
But maybe this info can help you in the meantime:
If you're sure you have the correct permissions on folders (in most cases your public_html folder should be 750, the folders within it 755, and files 644), then the first two things I would check are:
1. Make sure you have an index file in your public_html folder. Some examples would be index.html, index.htm, index.php, or on some hosts a default.html is used. If there is no index file in the folder that serves your site, server security configs will often present a 403 while protecting the contents of your hosting account from being viewed.
2. If you're sure you have an index file in your public-facing folder, then check the contents of the .htaccess file in your public_html folder, since an errant rule or line of code in your .htaccess is a common cause of a 403.
If you can post the contents of your .htaccess file here, someone (possibly myself) can spot anything that shouldn't be there or is incorrect.
Also, since you're using Cloudflare you should take a look at their quick-fix suggestions here: https://community.cloudflare.com/t/community-tip-fixing-error-403-forbidden/53308
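If it does turn out to be permissions and you have SSH access, something along these lines (a sketch assuming the usual cPanel-style layout with a public_html folder in your home directory) resets them to the values above:

    # Directories 755, files 644, then tighten public_html itself to 750
    find ~/public_html -type d -exec chmod 755 {} \;
    find ~/public_html -type f -exec chmod 644 {} \;
    chmod 750 ~/public_html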
I have a Sites folder inside my user directory where I put all of my Apache project files. The weird thing is that I can access all of my folders except the main route "localhost/", which of course has no index.html document in it, just folders. But I know I should be able to see something like this:
Index of Sites
    project1
    project2
    project3
Instead, I get this:
403 Forbidden
At my new job I was assigned a Mac that belonged to someone else, and that person of course needed the same tools that I have been asked to install. They told me to uninstall all of that software and install it all over again (which I did). The main software I'm using is an Apache server installed with Homebrew.
I have always had this problem, but I ignored it because I thought: do I really need to see an "Index of Sites" page when I can manually change to whatever folder I want? My answer was: not really.
But yesterday they asked me to install webpack and Node.js, and I did, so I made a dummy project with webpack that contained all of the JS and config files, but it didn't have an index.html file. And surprise, surprise: I got a 403 Forbidden error when opening the dummy folder on localhost.
So I'm guessing that for some reason my Apache gives me 403 errors when there is no index.html or index.php file to show.
Have you ever experienced something like this?
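That guess matches how Apache behaves: when a directory has no DirectoryIndex file and directory listings are disabled, the request is refused with a 403 Forbidden. A minimal sketch of the relevant httpd.conf fragment (the path and user name are placeholders for a Homebrew-style setup, adjust to yours):

    DocumentRoot "/Users/yourname/Sites"
    <Directory "/Users/yourname/Sites">
        # "Indexes" lets Apache generate the "Index of ..." listing when no
        # DirectoryIndex file (index.html, index.php, ...) is present;
        # without it, such a directory returns 403 Forbidden instead.
        Options Indexes FollowSymLinks
        AllowOverride All
        Require all granted
    </Directory>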
I recently discovered the use of a .htaccess file to edit the URLs of my web pages. This is done with mod_rewrite (Apache). I use XAMPP, and the working files are inside the appropriate htdocs folder. In the local directory, the .htaccess file does the job and edits the URL. I have a domain that I've been working on, and I periodically upload the working files to it. When I upload these files to the domain through FTP, the .htaccess file doesn't work correctly, as you can imagine, since Apache modules have no way of working on a web directory. So my question is: how do I make a .htaccess file work in a web directory without Apache's mod_rewrite module?
Your question is not sufficiently clear. URL rewriting won't work if you're just accessing the static files (i.e. file:///home/user/www/index.html) rather than going through the Apache server (http://localhost/~user/index.html) since Apache will never process the request.
Perhaps your .htaccess file is not being uploaded properly? Some programs will complain a bit when you try to upload strangely named files, such as those beginning with a period.
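A quick way to check whether the remote server reads your .htaccess at all is a throwaway rule like the one below (only a test sketch; the rewrite-test name is made up):

    # .htaccess in the web root of the remote host
    RewriteEngine On
    # If http://yourdomain/rewrite-test shows your homepage,
    # mod_rewrite and AllowOverride are working there.
    RewriteRule ^rewrite-test$ index.html [L]

A 500 error on every page usually means the host doesn't allow these directives (AllowOverride) or mod_rewrite isn't loaded; a plain 404 on /rewrite-test usually means the .htaccess wasn't uploaded or isn't being read at all, which would fit the hidden-file upload issue mentioned above.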
I am trying to upload two files to a web server so my teacher can see them. I am using WinSCP since my FileZilla doesn't work. But for some reason it is telling me that I don't have access to that page. Can anyone tell me why it is doing that? Here is a picture of my screen.
I just don't understand why it is telling me that I don't have access to it.
If I had to take a guess, that public_html folder is your public directory where you should put things that anybody can get to (like through a browser). You have your files outside of that directory, so your page can't access them.
Edit:
It's an educated guess, as I have seen a fair number of server configurations that name the public web folder like that (other common names are "www" and "httpdocs").
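For illustration, on hosts laid out like that the mapping is roughly as follows (the folder names are typical but not guaranteed for your host):

    /home/youruser/notes.html                ->  not reachable from a browser
    /home/youruser/public_html/index.html    ->  http://yourdomain.com/index.html
    /home/youruser/public_html/hw/page.html  ->  http://yourdomain.com/hw/page.html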
The problem definitely isn't in the code; there is an error while uploading the files. Can you connect to the FTP server at all? If you can, check whether the client is set to active or passive file transfer. Also, once you can upload files, they must be in the public_html folder to be visible from the browser.
First, read Neal's comment.
Second, you should probably copy the files into the /public_html folder instead of the / (root) folder.