.htaccess works on Apache server, but not on FTP directory - apache

I recently discovered using a .htaccess file to rewrite the URLs of my webpages. This is done with mod_rewrite (Apache). I use XAMPP and the working files are inside the appropriate htdocs folder. Locally, the .htaccess file does the job and rewrites the URLs. I also have a domain name I've been working on, and I periodically upload the working files to it. When I upload these files to the domain through FTP, the .htaccess file doesn't work correctly, which I assume is because Apache modules have no way of working on a web directory. So my question is: how do I make a .htaccess file work in a web directory without Apache's mod_rewrite module?

Your question is not sufficiently clear. URL rewriting won't work if you're just accessing the static files (i.e. file:///home/user/www/index.html) rather than going through the Apache server (http://localhost/~user/index.html) since Apache will never process the request.
Perhaps your .htaccess file is not being uploaded properly? Some FTP clients hide or complain about strangely named files, such as those beginning with a period, so check that the file actually arrived on the server.
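One quick way to check is to upload a deliberately minimal .htaccess and see whether it does anything at all. This is just a sketch: the hello path and the index.html target are placeholders, and the host must allow .htaccess overrides for it to work.

    # Minimal test .htaccess - "hello" and index.html are placeholders
    RewriteEngine On
    RewriteRule ^hello$ index.html [L]

If requesting /hello on the live site still returns a 404 even though index.html sits next to this file, the server is either not receiving the .htaccess or not reading it (AllowOverride None, for example); if it serves index.html, mod_rewrite works there and the problem is in your own rules.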

Related

How to disable the download of files in an Apache2 webserver?

I took over a website which I'm supposed to admin and somebody brought to my attention that certain Indexes and Files are available, which shouldn't be. I will be using dummy names.
You were able to access example.com/intern before, but I changed a line in /etc/apache2/apache2.conf according to this answer: https://stackoverflow.com/a/31445273. This worked partly: I now get a 403 Forbidden when I navigate to example.com/intern, and that's basically what I want.
However, the intern directory also contains a file called file.php.bak as well as file.php. When I navigate to example.com/intern/file.php, I get a white page. I'm not sure, however, whether file.php can be accessed some other way, because the page does load and I don't get a 403 like before. What is far worse, and the reason I'm struggling with this: if I go to example.com/intern/file.php.bak, my browser (Firefox) offers to download file.php.bak, which I can then read in plaintext. I want all files in intern to be inaccessible via the website, but I have no idea how to do this. Can anybody help?
Things I've tried:
1) Removing Indexes from the apache2.conf file, as mentioned above. This only puts the 403 on the directory itself, not recursively on all the files in it.
2) Writing a .htaccess file as described here: https://fedingo.com/how-to-prevent-direct-file-download-in-apache-server/ and putting it in intern, with the same result as in 1).
3) Putting an empty index.html file in the intern directory. This removes the 403 on example.com/intern, but the download of example.com/intern/file.php.bak is still possible. I've also tried index.php, with the same result.
File System:
The application runs from /var/www/application which is also the folder for the /var/www/application/index.php I want to use. The /var/www/application/intern directory is also there. While it isn't browsable anymore, the files in it still are accessible. /var/www/application/intern/file.php can be navigated to via example.com/intern/file.php, but it seems like it can't be downloaded or read as it results in a white page. /var/www/application/intern/file.php.bak can however be downloaded via example.com/intern/file.php.bak.
Let's say the Apache document root is set to DocumentRoot "/folder_one/folder_two".
Placing files in folder_one will prevent people from browsing your Apache server and requesting those files directly.
Place an index file in folder_two and use some code, such as PHP, to tell Apache to include whatever files you want from folder_one.
In this manner, Apache will still be able to serve whatever files you want from folder_one, and people will not be able to request the files directly, as they are located in a directory above the Apache document root.
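If moving the files above the document root is not practical for the intern directory from the question, a .htaccess inside that directory can achieve a similar effect from the web side. This is a sketch assuming Apache 2.4 and that AllowOverride permits it; PHP code elsewhere on the server can still include the files from the filesystem.

    # intern/.htaccess (Apache 2.4 syntax; assumes AllowOverride allows it)
    # Every direct HTTP request into this directory, including file.php
    # and file.php.bak, gets a 403. Scripts elsewhere can still read or
    # include these files from disk.
    Require all denied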

How to avoid displaying directory path in url?

I set up a little Apache2 server on a Raspberry Pi 4. Now I'm looking for a way to hide the real directory path displayed in the URL. I read around that you should deal with a file called .htaccess, but I don't even know what to actually look for on the internet. How can I display an arbitrary URL in the address bar of the browser, hiding the file path and file extensions like .php?
One way to achieve this is to create rewrite rules in an Apache config file, for example a .htaccess file. Use the link below to test your rewrite rules, then once you have that part working, implement them on your live Apache installation.
https://htaccess.madewithlove.be/
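For the extension part, the kind of rule you would end up with looks roughly like this; it's a sketch that assumes the target .php file actually exists on disk, so /about would serve about.php.

    RewriteEngine On
    # If the requested name plus ".php" exists as a file, serve that
    # file while the URL in the address bar stays extension-free
    RewriteCond %{REQUEST_FILENAME}.php -f
    RewriteRule ^([^.]+)$ $1.php [L]

Hiding a directory path works the same way: a rule that maps the short public URL onto the real internal path.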

Apache routing without htaccess

I am working with a custom website built in PHP running on an Apache server. The client wants to move it to a new server. I moved everything, including the .htaccess file; the homepage loads fine, but all the other URLs like site.com/register aren't working. I'm sure this is not handled by code on the old server, because I renamed everything (including .htaccess) and it still works. If I create a file like test.php on the old server, I can access it as site.com/test. It doesn't even hit the index.php file. Also, not all the URLs work like this; some are loaded through files in other folders.
So my question is: what are the possible ways that Apache can let a user access site.com/test without the .php extension? It must not be using .htaccess. Also, we should be able to add exceptions so that some URLs can be loaded differently.
You can achieve the same thing in the hosts file if you are using a Linux server. You need to define the same rules in the hosts configuration file.
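If that means the Apache virtual host configuration rather than /etc/hosts, a sketch of what it can look like there (ServerName and paths are assumptions; MultiViews is one common mechanism that serves site.com/test from test.php without any .htaccess, though it is not necessarily what the old server uses):

    <VirtualHost *:80>
        # ServerName and paths below are assumptions for illustration
        ServerName site.com
        DocumentRoot /var/www/site

        <Directory /var/www/site>
            # MultiViews lets Apache answer site.com/test with test.php
            # (content negotiation) without any .htaccess or rewrite rule
            Options +MultiViews
            # With AllowOverride None, .htaccess files are ignored entirely
            AllowOverride None
            Require all granted
        </Directory>
    </VirtualHost>

Exceptions for particular URLs can then live in the same block, for example as Alias or RewriteRule directives.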

.htaccess file with Ember

So I have this Ember.js project, using Node for a back-end. I am trying to add a .htaccess file so I can set expiration headers for my CSS, JS, etc. I tried putting it in the root folder of the Ember project, but it does not get detected. I have a robots.txt file in the same place and it is detected fine.
I am also using Apache to redirect the domain to work with my Node backend. I don't think this should be a problem, as the robots.txt file is still detected this way.
Where is the best place to put the .htaccess file, or is there another solution that does the same thing and works with Ember?
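For reference, the kind of directives I'm trying to get applied looks roughly like this; the concrete lifetimes are just examples, and mod_expires has to be enabled.

    # Sketch of the expiration headers I mean; lifetimes are just examples
    <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType text/css "access plus 1 month"
        ExpiresByType application/javascript "access plus 1 month"
        ExpiresByType image/png "access plus 1 month"
        ExpiresByType image/jpeg "access plus 1 month"
    </IfModule>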

Why do I have .htaccess files in multiple directories?

I'm looking through my server because I want to restrict access to some specific folders, and I've noticed I have several .htaccess files. One is in the root, the directory above public_html (is that actually the root, or is public_html the root?), and that file enables PHP 5 as the default. I then have a .htaccess doing some URL rewriting in the public_html folder, and then another one in the WordPress directory.
Is there a need for them to be spread out?
Do I need one .htaccess for each folder I want affected, or does a .htaccess affect a folder plus all of its subdirectories?
Thanks
Edit: I also have another .htaccess in my WordPress theme folder?
Apache has two ways of storing configuration options:
The central configuration (the .conf files) - when you change these, you need to restart the server.
.htaccess files, whose settings apply only to their own directory and all of its child directories. Changing these does not require a server restart.
If you're on a dedicated server, you could theoretically migrate all .htaccess files into the central configuration, but WordPress will write into a .htaccess file when updating its permalink structure, so you'll always have at least that.
In my experience, keeping individual .htaccess files is relatively practical in everyday maintenance work, as long as there aren't too many of them. I would leave things as they are.
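If you did want to consolidate, a sketch of what moving the usual WordPress permalink rules into the central configuration could look like (the path is an assumption, and the rules shown are the stock WordPress ones, so compare them with your actual file):

    # Somewhere in the central configuration, e.g. inside the site's
    # <VirtualHost> block; the path below is an assumption
    <Directory /home/user/public_html>
        # Ignore any .htaccess files from this directory down
        AllowOverride None

        # The stock WordPress permalink rules that normally live in
        # public_html/.htaccess - check them against your actual file
        <IfModule mod_rewrite.c>
            RewriteEngine On
            RewriteBase /
            RewriteRule ^index\.php$ - [L]
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteCond %{REQUEST_FILENAME} !-d
            RewriteRule . /index.php [L]
        </IfModule>
    </Directory>

With AllowOverride None in place, WordPress's own .htaccess updates would simply be ignored, which is part of why leaving things as they are is often the simpler choice.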