I am new to web hosting and building a very small PHP website as part of a project. It will not be used for practical purposes for now, but I still want to make sure that it is not TOO insecure.
I have a few files (some text and CSV files) which I don't want users to be able to access by URL, but my PHP code should be able to use them. How can I achieve something like this?
If you don't want them accessed by the web server but just by PHP, the best thing is to just keep them outside the webroot.
You can block access using .htaccess, but that will prevent you from using pretty much any web server other than Apache, and it adds unnecessary overhead (and a possible vulnerability if the .htaccess is accidentally removed or misconfigured).
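A minimal sketch, assuming a hypothetical layout where the webroot is /var/www/example/public_html and the protected files live in /var/www/example/data, which the web server never serves directly:

<?php
// public_html/report.php -- reads a CSV that sits outside the webroot.
$path = __DIR__ . '/../data/records.csv';      // hypothetical file

$handle = fopen($path, 'r');
if ($handle === false) {
    die('Could not open data file');
}
while (($row = fgetcsv($handle)) !== false) {
    if (!isset($row[0])) {
        continue;                              // skip blank lines
    }
    echo htmlspecialchars($row[0]), "<br>\n";  // print the first column
}
fclose($handle);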
Related
I am currently hosting the contents of a site with ProviderA. I have a domain registered with ProviderB. I want users to access the contents (www.providerA.com/sub/content) by visiting www.providerB.com. A domain forward is easy enough and works as intended, however, unless I embed the site in a frame (which is a big no-no), the actual URL reads www.providerA.com/sub/content despite the user inputting www.providerB.com.
I really need a solution for this: domain masking without the use of a frame. I'm sure this has been done before. An .htaccess domain rewrite?
Your help would be hugely appreciated! I'm going nuts trying to find a solution.
For Apache
Usual way: set up mod_proxy. The Apache on providerB becomes a client to providerA's Apache. It gets the content and sends it back to the client.
But it looks like you only have .htaccess, and you need full configuration access to set up a proxy. So you cannot; see: How to set up proxy in .htaccess.
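For reference, with full configuration access on providerB, the mod_proxy setup would look roughly like this (a sketch only; the exact upstream path is assumed from the question):

# Virtual host on providerB (needs mod_proxy and mod_proxy_http enabled)
<VirtualHost *:80>
    ServerName www.providerB.com

    ProxyRequests Off            # reverse proxy only, never an open forward proxy
    ProxyPass        / http://www.providerA.com/sub/content/
    ProxyPassReverse / http://www.providerA.com/sub/content/
</VirtualHost>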
If you have PHP on providerB
Set up a proxy written in PHP. All requests to providerB are intercepted by that PHP proxy. It gets the content from providerA and sends it back, so it does the same thing as the Apache module. However, depending on the quality of the implementation, it might fail on some requests, content types, sizes, timeouts, ...
Search for "php proxy" on the web and you will see a couple available on GitHub and elsewhere. YMMV as to how difficult they are to set up and how reliable they are.
No PHP, but some other server-side language
Obviously this could be done in another language as well; I described the PHP option because that is what I use the most.
The best solution would be to transfer the content to providerB :-)
I am currently migrating my website from Apache to nginx, but my .htaccess file is not working. My website is inside the /usr/share/nginx/html/mywebsite folder. How can I use .htaccess in my nginx server?
This is my .htaccess file:
RewriteEngine on
RewriteRule video/watch/([a-zA-Z0-9_#$*-]+)/?$ "videos-single.php?id=$1" [NC]
Nginx doesn't support .htaccess (see here: "You can’t do this. You shouldn’t. If you need .htaccess, you’re probably doing it wrong.").
You have two choices (as far as I know):
import your .htaccess rules into nginx.conf (maybe the htaccess to nginx converter helps you; a converted version of your rule is sketched after this list)
use authd-htpasswd (I didn't try it)
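For example, your RewriteRule above would convert to something like this inside the server block (a sketch; the (?i) reproduces Apache's [NC] case-insensitive flag, and you still need your usual PHP/FastCGI location for videos-single.php):

server {
    root /usr/share/nginx/html/mywebsite;

    # Equivalent of the Apache RewriteRule; quoted so the "#" inside the
    # character class is not read as a config comment.
    rewrite "(?i)^/video/watch/([a-zA-Z0-9_#$*-]+)/?$" /videos-single.php?id=$1 last;
}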
Disclosure: I am the author of htaccess for nginx, which is now open source software.
Over the past years, I created a plugin which implements .htaccess behaviour in nginx, especially things like RewriteRule, Allow and Deny, which can be crucial for web security. The plugin is used in my own production environments without a problem.
I totally get the point about efficiency and speed in nginx, and why they didn't implement .htaccess.
However, think about it: you cannot make things worse by using nginx plus .htaccess. You still keep the great performance of nginx, plus you can run your legacy applications effortlessly on one web server.
This is not supported officially in nginx. If you need this kind of functionality, you will need to use Apache or some other HTTP server which supports it.
That said, the official nginx reasoning is flawed, because it conflates what users want to do with the way it is done. For example, nginx could easily check the directories only every 10 seconds / minute or so, or it could use inotify and similar mechanisms. That would avoid the need to check on every request... But knowing that doesn't help you. :)
You could get around this limitation by writing a script that would wait for nginx config files to appear and then copy them to /etc/nginx/conf.d/. However, there might be some security implications: as there is no native support for .htaccess in nginx, there is also no support for limiting the allowed configuration directives in such config files. YMMV.
Using the config file is one option, but the cool thing about the .htaccess file is that it provides a way for a web developer to have some control over server settings without having root access to the server. There doesn't seem to be anything like this in nginx, which is a real bummer.
I understand how the way it's set up on Apache slows down response times, but I hoped there could be an nginx way to do the same thing without the performance hit... At least a way to do rewrites with regex on URLs, if nothing else.
"Is there no nginx way to do bulk redirects using regular expressions that doesn't slow down response times."
Just edit your database with phpMyAdmin.
Open phpMyAdmin, select your database, then find your "yourprefix_posts" table.
Open it, then click the "Search" tab, then "Find and Replace".
Select "post_content" in the dropdown.
In the "Find" field, type the URL you want to change: "website.com/oldURL".
In the "Replace" field, type the new URL: "website.com/newURL".
(To use a regular expression, tick the "Regular Expression" box.)
NOTE: You can test this out by simply leaving the "Replace" field blank.
ALWAYS BACK UP the database before making changes. This might sound scary, but it's really not. It's super simple and can be used to quickly replace just about anything.
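Under the hood, that find-and-replace amounts to SQL like the following (a sketch, assuming the default wp_ table prefix):

UPDATE wp_posts
SET post_content = REPLACE(post_content, 'website.com/oldURL', 'website.com/newURL');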
Sorry about my English level.
I have researched a lot, and I found that I can use ".htaccess" to redirect to a subdomain folder, and this is OK.
In Drupal I need to create a folder for each subdomain in "/sites/sub.example.com/", copy "default.settings.php" from the default folder "/sites/default/default.settings.php", rename it to "settings.php", and after that enable the "$databases" variable in the same file; when that's done, I need to add a wildcard and modify the "hosts" file.
Well, I should "automate" all this, but I don't know whether that is harder, because it's important to keep the server safe with regard to write permissions, or whether I should try another way. Could someone advise me?
I'm working on OS X and Drupal 7.x (recent release).
Thank you very much.
For each site that should use a separate database, create its own sites/ directory with a settings.php. For example, if you want to have one database for example.com, another one for sub1.example.com and a third one for sub2.example.com, all using the same code base, set up your files like this:
sites/example.com/settings.php
sites/sub1.example.com/settings.php
sites/sub2.example.com/settings.php
with each settings.php using different database credentials.
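For example, a sketch of sites/sub1.example.com/settings.php for Drupal 7, with placeholder credentials:

<?php
// sites/sub1.example.com/settings.php -- placeholder credentials, adjust to taste.
$databases = array(
  'default' => array(
    'default' => array(
      'driver'   => 'mysql',
      'database' => 'sub1_example',   // this site's own database
      'username' => 'dbuser',
      'password' => 'dbpass',
      'host'     => 'localhost',
      'prefix'   => '',
    ),
  ),
);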
Read more here - https://drupal.org/documentation/install/multi-site
Also, if you want to automate this and there is supposed to be a bigger number of sites to manage, consider deploying Aegir - http://www.aegirproject.org.
I hope I understood your question correctly.
(LAMP server configuration)
As a workaround for another problem, I need PHP to be able to access local files, but prevent these files from being served over HTTP by Apache.
Normally, I would just use .htaccess to accomplish this; however, due to institutional restrictions, I cannot. I also can't touch php.ini, although I can use ini_set within PHP.
As a creative solution, I thought that if PHP executes as its own Linux user (not as apache), I could use normal chowns and chmods to accomplish this.
Again, the goal is simply to have a directory of files that Apache will not serve, but PHP can access.
I'm open to any suggestions.
Put the files outside of your web accessible root (DocumentRoot), but keep them accessible via PHP.
Suggestion:
/sites
/sites/my.site.com
/sites/my.site.com/data // <-- data goes here
/sites/my.site.com/web // <-- web root is here
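PHP then reaches the data directory with a path relative to the running script, e.g. (a sketch, assuming the layout above and a hypothetical records.txt):

<?php
// Running from a script inside /sites/my.site.com/web
$dataDir  = dirname(__DIR__) . '/data';              // /sites/my.site.com/data
$contents = file_get_contents($dataDir . '/records.txt');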
Here's a thought: set the permissions on the files to be inaccessible even to the owner; then, when PHP needs them, chmod() them, read them, then chmod() them back to inaccessible.
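A sketch of that trick, assuming PHP runs as the file's owner (e.g. via suPHP or PHP-FPM) while Apache runs as a different user; note that only the owner or root may chmod() a file, whatever its current mode bits:

<?php
$file = '/var/www/private/data.csv';   // hypothetical path

chmod($file, 0600);                    // unlock: owner may read/write
$contents = file_get_contents($file);
chmod($file, 0000);                    // lock again: nobody may read

Note that if PHP really does run as its own user, leaving the files at 0600 permanently already keeps Apache out, and the toggling becomes unnecessary (and racy under concurrent requests).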
I have a directory on my website specifically for JavaScript files. I want these JavaScript files to be hidden, so if I type the URL to one of them it says Forbidden or disallows access, but my front-end website files can still load them when needed. Is there a way to do this through an FTP client?
Cheers,
Dan
You can't do this through an FTP client. It is the task of your web server to forbid access to certain files.
If you change permissions, the web server won't have access to them anymore, so this is not the way to go.
You must configure your web server to restrict the access. If you're using Apache, you can use an .htaccess file. There are different ways of doing this, and much depends on the way the web server is configured.
The easiest is to put an .htaccess file in your scripts folder which contains only this one line:
deny from all
However, like peeter said, there's a good chance this will break your site, since the browser must access these files, so you can't restrict access.
Put an .htaccess file in your scripts folder containing deny from all, but this will stop your pages from accessing the scripts also (though not if you pass them through the PHP engine first).
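That "pass them through the PHP engine" idea looks roughly like this, with a hypothetical js.php gatekeeper (a sketch only; the browser still receives the full script text, so nothing is truly hidden from users):

<?php
// js.php?file=app.js -- serve whitelisted scripts from a blocked folder.
$whitelist = array('app.js', 'menu.js');        // files we agree to serve
$file = isset($_GET['file']) ? basename($_GET['file']) : '';

if (!in_array($file, $whitelist, true)) {
    http_response_code(403);                    // Forbidden
    exit;
}
header('Content-Type: application/javascript');
readfile(__DIR__ . '/scripts/' . $file);

Pages would then load scripts via <script src="js.php?file=app.js"></script>.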
You're trying to hide JavaScript files that are executed on the client side. If a client (browser) cannot access the files, none of your JavaScript code will be executed.
If I understood your question correctly, then you cannot achieve what you're trying to achieve.