OpenBSD's httpd daemon {block} directives not working - apache

I'm trying to restrict access to some subfolders of a simple website hosted on OpenBSD's native httpd server. The config is rather simple; it is for testing purposes:
server "10.0.1.222" {
    listen on 10.0.1.222 port 80
    log style combined
    location "/*php*" {
        root "/FOLDER"
        fastcgi socket "/run/php-fpm.sock"
    }
    directory {
        index "index.php"
    }
    location "/*" {
        root "/FOLDER"
    }
    location "/SUBFOLDER/*" {block}
}
Inside SUBFOLDER I placed some HTML files not intended for direct viewing.
With the last location directive I expect requests like http://10.0.1.222/SUBFOLDER/01.html to be blocked with a 403 code, but I can't achieve that.
While http://10.0.1.222/SUBFOLDER/ returns access denied, requesting any valid HTML document name within SUBFOLDER is served without any complaint.
If the string /SUBFOLDER/* is (as I suppose) a proper shell glob, it should match the string /SUBFOLDER/ itself plus anything after it, so requests like http://10.0.1.222/SUBFOLDER/01.html should be answered with code 403. But it isn't working.
I tried many combinations: "/SUBFOLDER/*", "/SUBFOLDER/*.html" and so on, with and without the leading /. No effect.
There is probably something I do not understand, but I can't debug my mistake.
What am I missing?

Quick answer to my own question, obtained from misc@openbsd.org: according to the manual (man httpd.conf), for location statements the first match wins. To avoid more specific rules being ignored, they must be placed before more general ones.
In my case, putting the blocking directive just after log style combined solved the problem.
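For reference, a minimal sketch of the reordered config (same paths as above, with the block rule moved up so it matches first):

server "10.0.1.222" {
    listen on 10.0.1.222 port 80
    log style combined
    location "/SUBFOLDER/*" {block}
    location "/*php*" {
        root "/FOLDER"
        fastcgi socket "/run/php-fpm.sock"
    }
    directory {
        index "index.php"
    }
    location "/*" {
        root "/FOLDER"
    }
}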

Related

Need to configure .htaccess, so multiple folders will act as if they are their own separate root folders - for the code running on them

For example:
mydomain.com/site1
mydomain.com/site2
I need to install an application on /site1 that will think it is in the root folder. (In this case PHP, JS, CodeIgniter, but it could be anything.)
So, for example, links/references to files such as "/file.jpg" (in code that is in the site1 folder, such as at mydomain.com/site1/code.js) will really load from mydomain.com/site1/file.jpg.
Also, the code would not be able to see anything outside of site1, so that is basically its root folder. A similar thing would apply to site2, so the two are separate root folders.
I thought this would be some kind of simple .htaccess file installed at mydomain.com/site1 with a redirect, or some kind of reverse proxy, but so far nothing I have tried has worked.
I can't seem to find any such example, even on Stack Overflow.
Any ideas?
The easiest way to do this would be to create an additional VirtualHost, for internal use, called internal1, whose DocumentRoot is, you guessed it, /var/www/mydomain.com/htdocs/site1, where the main site is in /var/www/mydomain.com/htdocs.
Then in mydomain.com you reverse proxy /site1 to internal1 (you'll have to put it into /etc/hosts as an alias for localhost). The second request will have its DOCUMENT_ROOT pointing to site1, as requested (and its ServerName changed to internal1):
ProxyPass /site1/ http://internal1/
ProxyPassReverse /site1/ http://internal1/
(Not sure about the trailing slashes)
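A minimal sketch of that setup, assuming the paths above and name-based vhosts on port 80 (untested; names are illustrative):

# /etc/hosts on the server
127.0.0.1    internal1

# Internal-only vhost whose document root is the site1 subfolder
<VirtualHost *:80>
    ServerName internal1
    DocumentRoot /var/www/mydomain.com/htdocs/site1
</VirtualHost>

# In the main mydomain.com vhost
ProxyPass        /site1/ http://internal1/
ProxyPassReverse /site1/ http://internal1/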
Now, accessing yourdomain.com/site1/joe.html will trigger a second, internal connection to internal1/joe.html, which will contain, say, 'src="/joe.jpg"'; and here is where ProxyPassReverse would come into play, rewriting this into 'src="yourdomain.com/site1/joe.jpg"' so that everything works.
Correction
The above is not correct; thanks @MrWhite for pointing this out. ProxyPassReverse is not enough, as it only rewrites headers. From the Apache documentation (emphasis mine):
Only the HTTP response headers specifically mentioned above will be rewritten. Apache httpd will not rewrite other response headers, nor will it by default rewrite URL references inside HTML pages. This means that if the proxied content contains absolute URL references, they will bypass the proxy. To rewrite HTML content to match the proxy, you must load and enable mod_proxy_html.
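For completeness, a rough sketch of what loading mod_proxy_html in the main vhost might look like (module paths and the mappings are assumptions to adapt to your build):

LoadModule xml2enc_module    modules/mod_xml2enc.so
LoadModule proxy_html_module modules/mod_proxy_html.so

# Rewrite links in the proxied HTML back to the /site1/ prefix
ProxyHTMLEnable On
ProxyHTMLURLMap http://internal1/ /site1/
ProxyHTMLURLMap / /site1/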
(The method is dirty as all Hell: every HTTP call incurs one extra connection and two rewrites, one going in, a larger one going out).
Of course, if a link is built using e.g. JavaScript, it may well be that the proxy code will not recognize it as a link and will leave it unchanged, maybe with the "internal1" name somewhere inside, and the app will break.
However, @arkascha has the right of it: you should cure the cause, not the symptom. You can maybe rewrite the environment of the apps so that they run without trouble even if they are in a subdirectory. Or you could try injecting <base href="https://example.com/site1"> into the output HTML.

use Apache Alias instead of RewriteRule to serve HTML page

A simple Alias in the Apache configuration is not working:
Alias /url/path/some-deleted-page.html /url/path-modified/new-avatar-of-some-deleted-page.html
It gives "page not found".
However, RewriteRule works as expected, but it sends a redirect status to the browser. I want the browser/user not to be aware of the redirect; hence I want to use Alias instead of RewriteRule. I want to confirm whether mod_alias can be used to map an individual URL.
I also use ProxyPassMatch, which executes all HTML pages as PHP scripts. Adding a ProxyPass exclusion makes no difference either:
ProxyPass /url/path/some-deleted-page.html !
Please help so that I can map individual URLs (a bunch of them) with Alias instead of RewriteRule.
The purpose of mod_alias is to map requested URLs to a directory on the system running your httpd instance. It does not return anything to the browser (i.e. no redirection code, nothing); it is all done internally. Hence your client does not even know it is there.
Request: http://www.example.com/someurl/index.html
Configuration
[...]
DocumentRoot "/opt/apache/htdocs"
Alias "/someurl/" "/opt/other_path/someurl_files/"
[...]
In this scenario, users asking for any URL besides /someurl/ would receive files from /opt/apache/htdocs.
If a user asks for /someurl/, files from /opt/other_path/someurl_files/ will be used.
Still missing in this example is a <Directory> definition for securing the Alias directory.
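For instance, a minimal <Directory> block for Apache 2.4 (adjust the access rules to taste; 2.2 would use Order/Allow instead):

<Directory "/opt/other_path/someurl_files">
    Require all granted
</Directory>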
You should read: https://httpd.apache.org/docs/2.4/mod/mod_alias.html
Alias will cover the case where you need to point a certain URL to a particular directory on the file system.
If you need to modify the filename (i.e. the client asks for file A and you send back page B), you should use RewriteRule, and to hide the fact that you changed the filename, use the [P] flag.
This directive allows you to use a regex, yet still use a proxy mechanism, so your client does not know what went on, as the address in the address bar does not change.
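A rough sketch of that approach, using the URLs from the question (untested; [P] requires mod_proxy, and proxying back to localhost is an assumption):

RewriteEngine On
# Serve the new page under the old URL, without the client seeing a redirect
RewriteRule ^/url/path/some-deleted-page\.html$ http://localhost/url/path-modified/new-avatar-of-some-deleted-page.html [P,L]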

secure underlying directory with htaccess

I have created an extra FTP account for someone else so he can upload files (tournament results, about 20-30 HTM files and images).
I am also very paranoid, so in case he uploads "possibly dangerous" files, I do not want those files to be accessible via an HTTP request. With the help of PHP I want to grab the content of those files (I do not expect trouble with that yet).
Problem:
My hoster does not allow extra FTP accounts to have access outside public_html.
So I thought .htaccess should solve my problem, just with a deny from all rule.
But with FTP access that .htaccess file can be deleted or changed.
So I tried to add the following code to the main .htaccess file in the root of my site:
<Directory "/home/xxxx.nl/public_html/xxxxxxxx.nl/onzetoernooien/swissmaster_ftp">
deny from all
</Directory>
My site failed with an internal server error.
I have no access to the httpd configuration file.
My idea was to use an .htaccess file above this directory.
If the absolute path is incorrect, could I use some kind of wildcard, like *swissmaster?
I have searched the Apache website, but I get lost in the overwhelming amount of information.
Thanks in advance for any help!
Unfortunately you can't use a <Directory> section in .htaccess, only in the server configuration file. That is what causes the internal server error (check your error logs and you'll see the error message). We can't secure a subdirectory with a <FilesMatch "subdir/.*$"> either, as FilesMatch examines only the filename part of the requested URI.
You can, however, use mod_rewrite, along these lines:
RewriteEngine on
RewriteRule ^subdir.*$ - [NC,F]
If the requested URI matches the regex pattern subdir.* (so "subdir" followed by anything else; you may need to tweak the pattern, as it happily catches subdir_new/something.txt too -- I'm sure you get the idea), then mod_rewrite's F flag will return a 403 Forbidden status (the NC stands for No-Case, making the pattern case-insensitive).
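One possible tweak, if you only want to catch the subdirectory itself and its contents (the pattern is a suggestion, not tested against your layout):

RewriteEngine on
# Match "subdir" only as a whole path segment, so subdir_new/... is not caught
RewriteRule ^subdir(/.*)?$ - [NC,F]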

RewriteLock hangs Apache on restart when added to an otherwise working RewriteRule / RewriteMap

I am on a Network Solutions VPS; four domain names share the IP. I have a RewriteRule / RewriteMap set up that works. The rewrite is in the file for the example.com web address at /var/www/vhosts/example.com/conf/vhost.conf, and it is the only thing in that vhost.conf file. It would not work in the main httpd.conf file for the server.
The RewriteMap uses a couple of things from the URL typed in by the user (http://example.com/bb/cc) to get a third piece of info (aa) from the matching database record, uses that third piece of info as the query string to load a file, and leaves the originally typed URL in the address bar while showing the file based on the query string aa.
Here is the Rewrite:
Options +FollowSymlinks
RewriteEngine on
RewriteMap newurl "prg://var/www/cgi-bin/examplemap.php"
RewriteRule ^/(Example/.*) ${newurl:$1} [L]
When I add the following either above or below the RewriteMap line:
RewriteLock /var/lock/mapexamplelock
and try to restart Apache, it hangs and Apache will not restart. I have tried different file paths (thinking it might be a permissions issue and just hoping it would work, of course), removing the initial /, putting it in quotes, different file extensions (e.g. .txt at the end), different file names, just about anything, and every time it hangs Apache on restart. The RewriteRule / RewriteMap works without it, but I have read a lot about the importance of RewriteLock, and Apache is logging warnings (ending in DANGEROUS) about not using RewriteLock.
Here is the map program (located where the RewriteMap line says):
#!/usr/bin/php
<?php
include '/pathtodatabase';
set_time_limit(0);

// Read one lookup key per line from stdin, as mod_rewrite feeds a prg: map
$keyboard = fopen("php://stdin", "r");
while (1) {
    $line = fgets($keyboard);
    if (preg_match('/(.*)\/(.*)/', $line, $igot)) {
        $getalias = mysql_query("select aa FROM `table`.`dbase` WHERE bb = '$igot[1]' && cc = '$igot[2]'");
        while ($row = mysql_fetch_array($getalias)) {
            $arid = $row['aa'];
        }
        // Answer with the rewritten target for mod_rewrite
        print "/file-to-take-load.php?aa=$arid\n";
    } else {
        print "$line\n";
    }
}
?>
I looked in the main httpd.conf file and there is nothing I can find about RewriteLock that might be interfering; it's just the standard one that came with the setup of the VPS.
If anyone has an idea why this only works without RewriteLock, and what the possible fix is, it would be greatly appreciated.
Thanks, Greg
Apache hangs if you define more than one RewriteLock directive or if you use it in a vhost config.
RewriteLock should be specified at the server config level and ONLY ONCE. That lock file will be used by all prg type maps. So if you want to use multiple prg maps, I suggest using an internal locking mechanism (for example, PHP has the flock function) and simply ignoring the warning Apache writes to the error log.
See here for more info:
http://books.google.com/books?id=HUpTYMf8-aEC&lpg=PP1&pg=PA298#v=onepage&q&f=false
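A rough sketch of that flock approach inside the map program (the lock-file path and the lookup are placeholders):

#!/usr/bin/php
<?php
// Placeholder lock file; any path writable by the map program works
$lock  = fopen('/tmp/examplemap.lock', 'c');
$stdin = fopen('php://stdin', 'r');
while (($line = fgets($stdin)) !== false) {
    flock($lock, LOCK_EX);      // serialize concurrent lookups ourselves
    $answer = trim($line);      // ...do the real database lookup here...
    print "$answer\n";
    flock($lock, LOCK_UN);
}
?>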

mod_wsgi and static pages (no django)

On the page http://code.google.com/p/modwsgi/wiki/FileWrapperExtension, Graham Dumpleton recommends the following:
"Do note however that for the best performance, static files should always be served by a web server. In the case of mod_wsgi this means by Apache itself rather than mod_wsgi or the WSGI application."
I'd like to pre-build a large number of static pages, then have a Python program (running under Apache/mod_wsgi 3.3/Python 3.1, daemon mode, no Django involved) decide which of them to serve to each user. I'd like the Python program to decide, for example, that this guy needs "12345.html" and have it tell Apache, "please serve static file '12345.html' to this guy", rather than having to use Python to open the file, read the contents, turn it into a Python string, and return it to mod_wsgi as "[output]".
Is this possible? If so, how?
If not, what's the best way to do this?
There are numerous ways one could do it.
1. X-Sendfile, implemented by mod_xsendfile and Apache.
2. Location/mod_rewrite tricks using mod_wsgi daemon mode.
3. X-Accel-Redirect if also using nginx as a front end to Apache.
Read up on (1) and (3) as the more widely used options.
Update, with instructions for (2):
Have the WSGI application return a 200 response with an empty body and a 'Location' response header containing the URL path to a local resource hosted on the same Apache server; when daemon mode is being used, mod_wsgi will trigger an internal redirect to that URL.
Thus if your Apache has:
Alias /generated-files/ /some/path/
<Directory /some/path>
    Order allow,deny
    Allow from all
</Directory>
then generate your file as /some/path/foo.txt in the file system and have the 'Location' response header contain the value '/generated-files/foo.txt', and it will be served up.
Note that anything under '/generated-files' is publicly accessible. If you don't want that, and want it to be private so it is only returnable via the specific request which generated the 'Location' response header, you need to add mod_rewrite magic that blocks access to that URL except for an internally generated subrequest. From memory, that needs to be something like:
RewriteCond %{IS_SUBREQ} false
RewriteRule ^/generated-files/ - [F]