I hope I didn't miss an existing question about this while searching the site. I am completing a small project written in PHP. My core files are in a single directory (includes) where I want to prevent directory listing and direct user access. A Deny from all rule in an .htaccess file (placed in the root of the directory concerned) works for disabling direct access, but my forms also stop working when I add an .htaccess file with this rule.
Please note that I have several files in there. Some of these files process the data received via $_POST and $_GET, so I want to allow my own site's requests to files in the protected directory while still blocking direct access. How can I get this result? Please help.
This link might help you:
http://httpd.apache.org/docs/2.2/mod/mod_authz_host.html
To quote from the top of the page:
In general, access restriction directives apply to all access methods (GET, PUT, POST, etc). This is the desired behavior in most cases. However, it is possible to restrict some methods, while leaving other methods unrestricted, by enclosing the directives in a <Limit> section.
So basically the final answer that will help you is found here:
http://httpd.apache.org/docs/2.2/mod/core.html#limit
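For illustration, here is a minimal sketch only, using the Apache 2.2 Order/Deny syntax from the linked docs and assuming your forms submit via POST to the scripts in includes. An .htaccess file in that directory could deny direct GET requests while leaving POST alone:

<Limit GET HEAD>
    Order Deny,Allow
    Deny from all
</Limit>

Note that PHP include/require calls are filesystem operations, not HTTP requests, so they are never affected by .htaccess; only requests coming from the browser (such as a form submitting directly to a script in that directory) are.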
Hope this helps.
I know .htaccess is not the best way to do this, but I don't have access to other server settings.
- index.php (root)
-- scripts (folder)
--- someScript.php (one of the scripts)
So let's say I have an index.php file that lives at the root of the server, and that file makes AJAX requests to a script in a folder called scripts. If a user types domain.com/scripts/ into the address bar, they now have access to that folder (I don't know whether the files can be downloaded from there or not).
I know I can use Options -Indexes, but this still allows users to go directly to a script if they know its name, which is not hard to find or even guess.
The second option I know about is
<Files ~ "\.txt$">
Order allow,deny
Deny from all
</Files>
But this stops everything from accessing the files, even the AJAX requests.
So, my question is: should I protect these files somehow? Can the user see their content or download them? Are there security risks?
should I protect these files somehow?
Well, you can't really, not if they are to be requested by the client (browser AJAX request).
It's usual to send a custom HTTP request header when calling a script via AJAX (client-side), so the script knows how to respond to such requests and return the appropriate response. Whilst this provides no "security", it does prevent casual requests to that script from doing anything.
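For example (a sketch only; X-Requested-With is the header jQuery and many other libraries send automatically, but any custom header name of your own works the same way), the script can simply refuse to do anything when the header is missing:

<?php
// someScript.php - respond only to requests carrying the AJAX marker header.
// Note: any client can forge this header, so it is a convenience filter,
// not a security boundary.
$isAjax = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest';

if (!$isAjax) {
    http_response_code(403); // casual direct requests get an empty 403
    exit;
}

// ... normal AJAX handling continues here ...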
Can the user see their content or download them? Are there security risks?
The example you gave is of a PHP script. Any direct request will only see its output, not its source code (assuming the server is configured to execute PHP as normal).
The only security risks are what you make. If an arbitrary request to that script returns a list of all active users and personal information then yes, that's obviously a security risk. But if the response is empty and no harmful event happens as a result of calling that script then it's a non-issue.
I know there are a lot of similar questions out there, and I've trawled them all, but I can't seem to get any of the solutions to work.
I have a folder in the root of my website containing uploaded files that can be viewed and downloaded from the site when a user is logged in. They are here: https://example.com/uploads (for example). I need the site to continue to be able to access them, to display them (some are images) and to provide download links (PDFs etc.) so the user can download them, but I want to prevent anyone who gets hold of the URL of a particular file from downloading it directly, like this: https://example.com/uploads/2020/02/myfile.pdf. I also want to stop these URLs from getting into search engines (or, if they do, have the server prevent them from being accessed directly).
I've tried adding an .htaccess file in the uploads directory with the following content:
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
And I've tried
Order Allow,Deny
Deny from all
Allow from 127.0.0.1
...as I read that might allow requests from the site itself as well as local URLs.
But both variants block the site's own access as well as direct URL requests, which is no good.
Is there a way to do this?
The user interface that provides the 'official' access to the files has user authentication, yes, but the files still live in a directory that won't stop anyone getting to them if they know the URL.
You need to protect the files using the same authentication system that you are using to protect access to the user interface. The only way you could protect these resources by IP address (the client IP address) - as you are currently attempting in .htaccess - is if the client's IP is fixed and known in advance (but if this was the case then you wouldn't need another form of authentication to begin with).
So, this will primarily be an exercise in whatever scripting language/CMS is being used to authenticate the "user interface".
What you can use .htaccess for is to rewrite requests for these files to your server-side script that handles the authentication and then serves the file to the client once authenticated.
For example:
RewriteEngine On
# Only rewrite requests that map to an existing file
RewriteCond %{REQUEST_FILENAME} -f
# Route anything under /uploads/ through the authentication script
RewriteRule ^uploads/. /serve-protected-file.php [L]
Any request for /uploads/<something> (e.g. /uploads/2020/02/myfile.pdf) that maps to a valid file is routed to your script: /serve-protected-file.php.
/serve-protected-file.php would then need to do something like the following:
// 1. Parse the file being requested from REQUEST_URI
// 2. Is the requested file "protected"?
// (Serving both protected and public files from the same directory?)
// 3. If not protected then serve/stream the resource to the client. END
// 4. If protected then authenticate the user...
// 5. If user authenticated then serve/stream the resource to the client. END
// 6. Resource is protected and user not authenticated...
// Serve a 403 Forbidden. END
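A minimal PHP sketch of those steps (the $_SESSION['user_id'] check is an assumption standing in for whatever your own login system sets; adjust it to match):

<?php
// serve-protected-file.php - sketch only, not production-ready.
session_start();

// 1. Work out which file was requested, e.g. /uploads/2020/02/myfile.pdf
$uri  = rawurldecode(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH));
$path = realpath($_SERVER['DOCUMENT_ROOT'] . $uri);

// Refuse anything that escapes the uploads directory (../ tricks etc.)
$base = realpath($_SERVER['DOCUMENT_ROOT'] . '/uploads');
if ($path === false || strpos($path, $base . '/') !== 0 || !is_file($path)) {
    http_response_code(404);
    exit;
}

// 2./4. Authenticate - assumption: your login code sets $_SESSION['user_id']
if (empty($_SESSION['user_id'])) {
    http_response_code(403); // 6. protected and not authenticated
    exit;
}

// 3./5. Serve/stream the resource to the client
header('Content-Type: ' . (mime_content_type($path) ?: 'application/octet-stream'));
header('Content-Length: ' . filesize($path));
readfile($path);
exit;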
(Ideally, the location of these "protected" resources would be entirely outside of the document root - so they are "private" by default - and the URL the user uses to access these resources is entirely virtual - then you probably wouldn't need any additional coding in .htaccess and everything would be implemented by your front-controller - but that all depends on how your site is implemented and the way in which URLs are routed.)
I am able to reach my website at a certain IP address and I am going to implement a REST service. I have some PHP files that perform actions on a database and I am calling them from the client. I am using Ubuntu Linux as the server, and so far I can do this:
http://xxx.xxx.xxx.xxx/api/create/?id=someId&val=someValue
http://xxx.xxx.xxx.xxx/api/delete/?id=someId
I can do the above because inside /var/www/html I have a folder called api that contains another folder called create. The create folder contains the file index.php, so I can omit it and use the URLs you see above.
This works fine, but I don't think this is the proper way to do it. I am new to this, so I don't know what to do. After some research I found that my goal can probably be achieved with an .htaccess file and URL rewriting, but I am not sure.
How can I do this? Do I have to place all the PHP files in a single folder and then use an .htaccess file? (^)
(^) To be more precise: instead of having this
http://xxx.xxx.xxx.xxx/api/create/index.php?id=someId&val=someValue
http://xxx.xxx.xxx.xxx/api/delete/index.php?id=someId
//and so on with other actions...
Do I have to create a folder like
http://xxx.xxx.xxx.xxx/files/
containing all my PHP files (create.php, delete.php, view.php...) and then use an .htaccess file to redirect?
I see that websites offer their APIs using www.domain.com/api/something/?data=Value or www.domain.com/api/something/dataAbout/. Are they doing what I described with the .htaccess? I hope I have explained my problem well.
htaccess:
RewriteEngine On
RewriteRule ^api/([\w-]+)/?$ files/$1.php [L,NC]
This is inside /var/www/html, and I have the api directory inside /home/username/api.
Thanks Emma
Do it like this:
Create your PHP files in a files/ subdirectory as create.php, delete.php, view.php etc. (by renaming each of the individual index.php files you mentioned).
Move the api directory somewhere outside the site root.
Once that is done, use the following .htaccess file in /var/www/html/:
RewriteEngine On
# Map /api/<action> (with or without a trailing slash) to files/<action>.php
RewriteRule ^api/([\w-]+)/?$ files/$1.php [L,NC]
Then use new URLs as:
http://xxx.xxx.xxx.xxx/api/create?id=someId&val=someValue
http://xxx.xxx.xxx.xxx/api/delete?id=someId
First of all, this is not the right way to create a RESTful API. My suggestion is that you read a best-practices article.
You shouldn't create create and delete folders. You should use HTTP methods.
To create a new record you should use POST. For example, POST /user, passing the user's information in the body.
In another example, you could use the same route with different HTTP methods: DELETE /user/1 to delete a user and PATCH /user/1 to edit some of an existing user's information.
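A very small front-controller sketch of that idea (only to show dispatching on the HTTP method instead of on folder names; it assumes every /user request is rewritten to this single index.php, and the JSON responses are placeholders):

<?php
// index.php - dispatch on the HTTP method instead of create/ and delete/ folders.
$method = $_SERVER['REQUEST_METHOD'];
$path   = trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/'); // e.g. "user/1"
$parts  = explode('/', $path);                                         // ["user", "1"]

header('Content-Type: application/json');

if ($parts[0] === 'user') {
    $id = $parts[1] ?? null;
    switch ($method) {
        case 'POST':   // create: POST /user with the new user's data in the body
            echo json_encode(['created' => $_POST]);
            break;
        case 'DELETE': // delete: DELETE /user/1
            echo json_encode(['deleted' => $id]);
            break;
        case 'PATCH':  // edit:   PATCH /user/1 with the changed fields in the body
            parse_str(file_get_contents('php://input'), $fields);
            echo json_encode(['updated' => $id, 'fields' => $fields]);
            break;
        default:
            http_response_code(405); // method not allowed on this route
    }
} else {
    http_response_code(404);
}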
Hope this helps you.
I have come across the hotlinking ("leeching") problem, so I searched the web and found two ways to solve it.
The first is the easier and simpler way, with the code shown below:
Options +FollowSymlinks
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?mydomain\.com(/)?.*$ [NC]
RewriteRule .*\.(gif|jpg|jpeg|png|swf)$ http://mydomain.com/ [R,NC]
This can only prevent some simple leeching, but can do nothing against a determined person.
The other way is better: a script-and-cookie-based approach. They said: "You set a cookie on an 'authorized' page of your site, and then use a script to serve images only if the correct cookie is present in the image request. Images are kept in a directory accessible only to the script, and not via the Web. So, the script acts as an 'image server' on your site." I understand the principle but don't have any idea how to implement it. Does anyone know how?
Any help appreciated.
I can't really give any implementation, but only some idea of how it can be achieved:
You will need a "portal" page, where you set the cookie for the user. Any request for resources without having a cookie of your site should be redirected here. There may not may not be a login mechanism here, depending on the purpose of your site, but usually you will set the cookie, after the user is logged in.
All resource links will point to the same "script" page. The difference is that each resource will have a different identifier (this can be some sort of id, if you maintain a database mapping ids to file paths). The identifier must be included in the query string of the URL. The "script" will find the resource on the server based on the identifier (in the case of an id-to-file mapping, you obtain the file path and then retrieve the file).
There will be a "script" page, which can be php code, for example. It will check for the cookie, then check for the identifier, then load the resource accordingly. You may also want to check for Referer to restrict the access a bit more (without checking, hot linking will work for any logged in user).
In this implementation, sharing a hot link to a resource will not work for any user who hasn't visited the "portal" page (or hasn't logged in, depending on your web site). It will also not work even for a logged-in user if they click the link from somewhere else.
However, scraping your website for resources is easy under both implementations mentioned in your question, since a scraper can freely adjust its HTTP headers.
I need to protect a site that has a ton of static .html files. The standard .htaccess scheme doesn't meet the requirements.
Is there a way to specify an .htaccess style of password protection with a custom handler? That is, I need to write the code that determines whether the user is allowed or not, but I don't want to modify a million .html files all over the place.
Thanks!
Maybe. It depends on what modules are loaded on your web server. Your options will range from keeping a simple list of users in a flat file, to keeping them in a database and customizing the queries.
http://httpd.apache.org/docs/2.2/howto/auth.html
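If the built-in options do turn out to be enough, the simplest flat-file variant from that page looks roughly like this (a sketch; the .htpasswd path is an assumption):

AuthType Basic
AuthName "Restricted area"
# Flat file of users created with: htpasswd -c /home/user/.htpasswd someuser
AuthUserFile /home/user/.htpasswd
Require valid-user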
Another option - just brainstorming here - is to use something like mod_rewrite to redirect calls for the physical files to something like a PHP script that can handle the user/password authentication for you and, if the user is authenticated, go and load the HTML file that was requested. So calls to www.some.com/10203.html actually get directed to www.some.com/auth.php?10203.html, which would control access to that underlying HTML file. That would of course require mod_rewrite to be installed, which is pretty common even for shared hosting environments.
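A rough sketch of that second option (the auth.php name comes from the example above; the named file query parameter, the login.php redirect and the $_SESSION['user_id'] check are assumptions to be replaced by your own logic):

RewriteEngine On
# Send every request for an existing .html file through the auth script,
# passing the original file name in the query string.
RewriteCond %{REQUEST_FILENAME} -f
RewriteRule ^(.+\.html)$ /auth.php?file=$1 [L,QSA]

And auth.php itself:

<?php
// auth.php - sketch only: check the user, then output the requested page.
session_start();

if (empty($_SESSION['user_id'])) {   // assumption: your login code sets this
    header('Location: /login.php');  // or render a login form right here
    exit;
}

$file = $_SERVER['DOCUMENT_ROOT'] . '/' . ($_GET['file'] ?? '');
$real = realpath($file);

// Only serve .html files that really live under the document root.
if ($real === false
    || strpos($real, realpath($_SERVER['DOCUMENT_ROOT'])) !== 0
    || substr($real, -5) !== '.html') {
    http_response_code(404);
    exit;
}

header('Content-Type: text/html');
readfile($real);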