Apache 2.4 - transparently redirect to subfolder

I am transparently redirecting every request (html, css, js, ...) to a subfolder _site (where my static Jekyll site sits):
RewriteCond %{REQUEST_URI} !^/_site.*
RewriteRule ^(.*)$ _site/$1 [L]
That basically works, although I find it rather strange that I have to use the RewriteCond at all, just to block iterations:
AH00124: Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if ...
After all, I am using [L] for "last", so how come I end up in recursion?!
If I try to prevent direct access to the _site folder (like example.com/_site/favicon.ico) by prepending this rule as the first line:
RewriteRule ^_site.* - [R=403,NC,L]
RewriteCond %{REQUEST_URI} !^/_site.*
RewriteRule ^(.*)$ _site/$1 [L]
Then I also get a 403 on legitimate access (example.com/favicon.ico)!
Apparently every request takes another round, and the legitimate ones then match the forbidden rule in "round 2". So what's wrong here?
Why doesn't "last" mean "last"? (Did something change, e.g. between Apache 2.2 and 2.4? Quite a few things changed around .htaccess there...)

Cough, "last" doesn't quite mean "last". It only means
"quit rewriting within this file now, ignore the rest of the lines".
But it also means
"re-run the whole .htaccess thing with that new (rewritten) URL".
(Something you might not notice for YEARS if your rewrite rules are too specific to trigger recursion... even though, in all honesty, it's in the docs.)
So, happy recursion, everyone. To truly put an end to things, guess what:
RewriteRule ^(.*)$ _site/$1 [END]
... or, in my extended version:
RewriteRule ^_site.* - [R=403,NC,END]
RewriteCond %{REQUEST_URI} !^/_site.*
RewriteRule ^(.*)$ _site/$1 [END]
(Adding both L and END doesn't hurt, but END implies L.)
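If you are ever stuck on Apache older than 2.3.9, where END is not available, a rough equivalent (just a sketch) is to guard the rule so it only fires on the first pass, using the REDIRECT_STATUS environment variable, which is empty on the initial request and set on internal-redirect passes:
# only rewrite on the very first pass through this file
RewriteCond %{ENV:REDIRECT_STATUS} ^$
RewriteCond %{REQUEST_URI} !^/_site
RewriteRule ^(.*)$ _site/$1 [L]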

Related

mod_rewrite 301 redirect from old urls to new

Website has changed its url names due to SEO reasons, e.g. it was:
/category/filter1/f00/filter2/123/filter3/100-500/filter4/36.html
now:
/category/color/red/size/big/price/100-500/style/classic.html
I know the old and new names; they're fixed. Please help me build a rewrite rule that results in a 301 redirect from the old URLs to the new ones. I did some research and it seems I cannot do it with RewriteMap, for example, so I ended up with something like RewriteRule (.*)filter1(.*) $1color$2 [L], etc. Not only do I not like the way it looks, it also doesn't give me a 301 redirect.
UPDATE: Note that at the moment I have several rules, one per filter name/value, e.g.:
RewriteEngine on
# make sure it's a catalog URL, not anything else
RewriteCond %{REQUEST_URI} !^/(category1|category2|category3|category4)
RewriteRule .* - [L]
# rewrite filter names
RewriteRule (.*)filter1(.*) $1color$2 [L]
RewriteRule (.*)filter2(.*) $1price$2 [L]
...etc...
It works as expected, changing all the names in the URL, but setting the R flag causes it to stop at the first rule and redirect to a URL like:
/var/www/vhosts/site/htdocs/category/color/red/filter2/123/ etc...
I separated the rules because any of the filters may or may not be present in the URL. I would greatly appreciate a better solution.
Here is my own answer: it is possible to do this with environment variables. We need to replace the old filter names and values with new ones, then make a single 301 redirect to the new URL. Here is what I've done using mod_rewrite and environment variables:
RewriteEngine on
RewriteRule /filter1/ - [E=filters:/color/]
RewriteRule /f00[.\/] - [E=filters:%{ENV:filters}red]
RewriteRule /0f0[.\/] - [E=filters:%{ENV:filters}green]
RewriteRule /00f[.\/] - [E=filters:%{ENV:filters}blue]
RewriteRule /filter2/ - [E=filters:%{ENV:filters}/size/]
RewriteRule /123[.\/] - [E=filters:%{ENV:filters}big]
RewriteRule /32[.\/] - [E=filters:%{ENV:filters}small]
RewriteRule /filter3/([^/^\.]+) - [E=filters:%{ENV:filters}/price/$1]
RewriteRule /filter4/ - [E=filters:%{ENV:filters}/style/]
RewriteRule /36[.\/] - [E=filters:%{ENV:filters}classic]
RewriteRule /37[.\/] - [E=filters:%{ENV:filters}urban]
RewriteCond %{REQUEST_URI} ^/(category1|category2|category3|category4)/
RewriteCond %{ENV:filters} !^$
RewriteRule ^([^/]+)/ /$1%{ENV:filters}.html [L,R=301]
Basically, I've rebuilt the whole URL in the environment variable filters, then checked that it is a category and not some other part of the website, and finally redirected to this category plus the filters variable, with .html appended at the end.
Even though the new URL looks prettier to a human, I'm not sure if there's a need to change the existing URL for SEO reasons.
To get a redirect instead of a rewrite, you must use the R|redirect flag. So your rule would look like
RewriteRule (.*)filter1(.*) $1color$2 [R,L]
But if you have multiple redirects, this might impact your SEO results negatively; see "Chained 301 redirects should be avoided for SEO, but Google will follow 2 or 3 stacked redirects":
Remember that ideally you shouldn’t have any stacked redirects or even a single redirect if you can help it, but if required Google will follow chained redirects
But every additional redirect will make it more likely that Google won’t follow the redirects and pass PageRank
For Google keep it to two and at a maximum three redirects if you have to
Bing may not support chained redirects at all
This means you should try to replace multiple filters at once:
RewriteRule ^(.*)/filter1/(.*)/filter2/(.*)$ $1/color/$2/size/$3 [R,L]
and so on.
When the filters may come in an arbitrary order, you can use several rules and do a single redirect at the end:
RewriteRule ^(.*)filter1(.*)$ $1color$2 [L]
RewriteRule ^(.*)filter2(.*)$ $1price$2 [L]
RewriteRule ^(.*)filter3(.*)$ $1size$2 [L]
RewriteCond %{ENV:REDIRECT_STATUS} 200
RewriteRule ^ %{REQUEST_URI} [R,L]
RewriteCond with REDIRECT_STATUS is there to prevent an endless loop.
When it works as it should, you may replace R with R=301. Never test with R=301, though: browsers cache permanent redirects, so you would keep seeing stale behaviour even after you fix the rule.
A final note: be very careful with these experiments. I managed to kill my machine twice during tests (it became unresponsive and I had to switch it off).

How to add "everything else" rule to mod_rewrite

How can I make mod_rewrite redirect to a certain page or probably just throw 404 if no other rules have been satisfied? Here's what I have in my .htaccess file:
RewriteEngine on
RewriteRule ^\. / [F,QSA,L]
RewriteRule ^3rdparty(/.*)$ / [F,QSA,L]
RewriteCond %{REQUEST_FILENAME} -f
RewriteRule ^((images|upload)/.+|style.css)$ $1 [L]
RewriteRule ^$ special [QSA]
RewriteRule ^(special|ready|building|feedback)/?$ $1.php [QSA,L]
RewriteRule ^(ready|building)/(\d+)/?$ show_property.php?type=$1&property_id=$2 [QSA,L]
RewriteRule . error.php?code=404 [QSA,L]
This is supposed, among other things, to send the user to error.php if they try to access anything that was not explicitly specified here (by the way, what is the proper way to throw a 404?). However, it instead sends the user from every page to error.php. If I remove the last rule, everything else works.
What am I doing wrong?
What is happening is that when a rule rewrites the URL, the request is evaluated against these rewrite rules again with the new URL. Eventually no other rule is triggered, so the request reaches the final rule and is always rewritten to the error.php page.
So you need to put some rewrite conditions in place to prevent this.
The rewrite engine loops, so you need to pass through successful rewrites before finally rewriting to error.php. Maybe something like:
RewriteCond %{REQUEST_URI} !^/$
RewriteCond %{REQUEST_URI} !^/(special|ready|building|feedback|show_property)\.php
RewriteCond %{REQUEST_URI} !^/((images|upload)/.+|style.css)$
RewriteRule ^ error.php?code=404 [QSA,L,R=404]
Each condition makes sure the URI isn't one of the ones your other rules have rewritten to.
The R=404 will redirect to the error.php page as a "404 Not Found".
Unfortunately, it didn't work: it allows access to all files on the server (presumably because all conditions need to be satisfied). I tried an alternate solution:
Something else must be slipping through, even though when I tested your rules plus these at the end in a blank .htaccess file, it seemed to work. Something else you can try is a little less nice, but since you don't actually redirect the browser anywhere, it stays hidden from clients.
You have a QSA flag at the end of all your rules, so you could add a unique parameter to the query string after a rule has been applied, then just check against that. Example:
RewriteCond %{REQUEST_FILENAME} -f
RewriteRule ^((images|upload)/.+|style.css)$ $1?_ok [L,QSA]
then at the end:
RewriteCond %{QUERY_STRING} !_ok
RewriteRule ^ error.php?code=404&_ok [QSA,L,R=404]
In theory if none of the rules are matched (and the requested URL does not exist), it's already a 404. So I think the simplest solution is to use an ErrorDocument, then rewrite it:
RewriteEngine On
ErrorDocument 404 /404.php
RewriteRule ^404.php$ error.php?code=404 [L]
# All your other rules here...
You can do the same for any other HTTP error code.
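For example, the same pattern for a 403 (the /403.php path is just a hypothetical placeholder that the rule intercepts):
ErrorDocument 403 /403.php
RewriteRule ^403\.php$ error.php?code=403 [L]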
The problem here is that after mod_rewrite finishes rewriting the URL, the result is resubmitted to mod_rewrite for another pass, so the [L] flag only makes the rule the last one of the current pass. As explained much better in this question, since Apache 2.3.9 mod_rewrite supports another flag, [END], which makes the current pass the last one. For Apache 2.2 a number of solutions are offered, but since one of them was a bit clumsy and another didn't work, my current solution is to add two more rules that allow a specific set of files to be accessed while sending a 404 for everything else:
RewriteRule ^((images|upload)/.+|style.css|(special|ready|building|feedback|property).php)$ - [QSA,L]
RewriteRule .* - [QSA,L,R=404]
I think your last rule should be
RewriteRule ^(.*)$ error.php?code=404&query=$1 [QSA,L]
You could leave out the parentheses and the $1 parameter, but maybe it's useful to know what the user tried to achieve.
Hope this does the trick!

Apache URL Rewriting

I am trying to get URL rewriting to work on my website. Here is the contents of my .htaccess:
RewriteEngine On
RewriteRule ^blog/?$ index.php?page=blog [L]
RewriteRule ^about/?$ index.php?page=about [L]
RewriteRule ^portfolio/?$ index.php?page=portfolio [L]
#RewriteRule ^.*$ index.php?page=blog [L]
Now the three uncommented rewrite rules work perfectly: if I try http://www.mysite.com/blog/, I get redirected to http://www.mysite.com/index.php?page=blog, and the same for "about" and "portfolio". However, if I mistype blog, say I try http://www.mysite.com/bloh/, then obviously I get a 404 error. The last rule, the commented one, was meant to help prevent that: any URL should get redirected to the blog. But of course this rule is still parsed even if a previous one has already matched, so I used the "last" flag ([L]). If I uncomment my last rule, everything, including blog, about, and portfolio, redirects to blog. Shouldn't the "last" flag stop the execution as soon as it finds a matching rule?
Thanks.
Yes, the last flag means it won't apply any of the rules following this rule during this pass through the ruleset.
After rewriting the URL, Apache makes an internal request using the new, rewritten URL, which again matches your last RewriteRule, and so your rewrites go into a loop.
Use the RewriteCond directive to limit rewriting to URLs that don't start with index.php, and you should be fine.
You could add a condition like:
RewriteCond %{REQUEST_URI} !^/index\.php
I'll also mention that using RewriteRule ^.*$ is a good way to break all of your media requests (css, js, images) as well. You might want to add some conditions like:
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-f
These make sure you're not trying to rewrite actual files or directories that exist on your server. Otherwise they'll be unreachable unless index.php serves those too!
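Putting those pieces together, a rough sketch of the whole .htaccess could look like this (untested, same page names as in the question):
RewriteEngine On
RewriteRule ^blog/?$ index.php?page=blog [L]
RewriteRule ^about/?$ index.php?page=about [L]
RewriteRule ^portfolio/?$ index.php?page=portfolio [L]
# catch-all: skip index.php itself and anything that actually exists on disk
RewriteCond %{REQUEST_URI} !^/index\.php
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^.*$ index.php?page=blog [L]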
From Apache's mod_rewrite docs:
'last|L' (last rule)
Stop the rewriting process here and don't apply any more rewrite rules. This corresponds to the Perl last command or the break command in C. Use this flag to prevent the currently rewritten URL from being rewritten further by following rules. Remember, however, that if the RewriteRule generates an internal redirect (which frequently occurs when rewriting in a per-directory context), this will reinject the request and will cause processing to be repeated starting from the first RewriteRule.
You could use
ErrorDocument 404 /index.php?page=blog
but you should be aware that it doesn't return a 404 status code but a redirect, and I don't know if that is such good practice.
After you [L]eave processing for the request, the whole processing runs again for the new (rewritten) URL. You could get out of that loop by using this before your other rules:
RewriteRule ^index.php - [L]
which means "for index.php, don't rewrite and leave processing."

Why would mod_rewrite rewrite twice?

I only recently found out about URL rewriting, so I've still got a lot to learn.
While following the Easy Mod Rewrite tutorial, the results of one of their examples is really confusing me.
RewriteBase /
RewriteRule (.*) index.php?page=$1 [QSA,L]
Rewrites /home as /index.php?page=index.php&page=home.
I thought the duplicates might have been caused by something in my host's configs, but a clean install of XAMPP does the same.
So, does anyone know why this seems to parse twice?
And to me it seems like, if it's going to do this, it would be an infinite loop; why does it stop at two cycles?
From Example 1 on this page, which is part of the tutorial linked in your question:
Assume you are using a CMS system that rewrites requests for everything to a single index.php script.
RewriteRule ^(.*)$ index.php?PAGE=$1 [L,QSA]
Yet every time you run that, regardless of which file you request, the PAGE variable always contains "index.php".
Why? You will end up doing two rewrites. Firstly, you request test.php. This gets rewritten to index.php?PAGE=test.php. A second request is now made for index.php?PAGE=test.php. This still matches your rewrite pattern, and in turn gets rewritten to index.php?PAGE=index.php.
One solution would be to add a RewriteCond that checks if the file is already "index.php". A better solution that also allows you to keep images and CSS files in the same directory is to use a RewriteCond that checks if the file exists, using -f.
(The link is to the Internet Archive, since the tutorial website appears to be offline.)
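For illustration, a minimal sketch of that second approach (the PAGE parameter name is taken from the quoted tutorial):
# only rewrite requests that don't map to a real file or directory
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php?PAGE=$1 [L,QSA]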
From the Apache Module mod_rewrite documentation:
'last|L' (last rule)
[…] if the RewriteRule generates an internal redirect […] this will reinject the request and will cause processing to be repeated starting from the first RewriteRule.
To prevent this you could either use an additional RewriteCond directive:
RewriteCond %{REQUEST_URI} !^/index\.php$
RewriteRule (.*) index.php?page=$1 [QSA,L]
Or you alter the pattern to not match index.php and use the REQUEST_URI variable, either in the redirect or later in PHP ($_SERVER['REQUEST_URI']).
RewriteRule !^index\.php$ index.php?page=%{REQUEST_URI} [QSA,L]

Hidden features of mod_rewrite

There seem to be a decent number of mod_rewrite threads floating around lately with a bit of confusion over how certain aspects of it work. As a result I've compiled a few notes on common functionality, and perhaps a few annoying nuances.
What other features / common issues have you run across using mod_rewrite?
Where to place mod_rewrite rules
mod_rewrite rules may be placed within the httpd.conf file, or within the .htaccess file. If you have access to httpd.conf, placing rules there offers a performance benefit, as the rules are processed once at startup, as opposed to each time the .htaccess file is consulted.
Logging mod_rewrite requests
Logging may be enabled from within the httpd.conf file (including <VirtualHost> sections):
# logs can't be enabled from .htaccess
# loglevel > 2 is really spammy!
RewriteLog /path/to/rewrite.log
RewriteLogLevel 2
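Note that RewriteLog and RewriteLogLevel were removed in Apache 2.4; there the trace output goes to the normal error log via a per-module LogLevel instead, for example:
# Apache 2.4+ (still httpd.conf / vhost only, not .htaccess)
LogLevel alert rewrite:trace3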
Common use cases
To funnel all requests to a single point:
RewriteEngine on
# ignore existing files
RewriteCond %{REQUEST_FILENAME} !-f
# ignore existing directories
RewriteCond %{REQUEST_FILENAME} !-d
# map requests to index.php and append as a query string
RewriteRule ^(.*)$ index.php?query=$1
Since Apache 2.2.16 you can also use FallbackResource.
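A minimal sketch of that alternative (assuming a front controller at /index.php):
# mod_dir: anything that doesn't map to an existing file or directory is handled by /index.php
FallbackResource /index.php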
Handling 301/302 redirects:
RewriteEngine on
# 302 Temporary Redirect (302 is the default, but can be specified for clarity)
RewriteRule ^oldpage\.html$ /newpage.html [R=302]
# 301 Permanent Redirect
RewriteRule ^oldpage2\.html$ /newpage.html [R=301]
Note: external redirects are implicitly 302 redirects:
# this rule:
RewriteRule ^somepage\.html$ http://google.com
# is equivalent to:
RewriteRule ^somepage\.html$ http://google.com [R]
# and:
RewriteRule ^somepage\.html$ http://google.com [R=302]
Forcing SSL
RewriteEngine on
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://example.com/$1 [R,L]
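A host-agnostic variant keeps whatever hostname was requested instead of hard-coding it (switch R to R=301 once you have verified it works):
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R,L]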
Common flags:
[R] or [redirect] - force a redirect (defaults to a 302 temporary redirect)
[R=301] or [redirect=301] - force a 301 permanent redirect
[L] or [last] - stop rewriting process (see note below in common pitfalls)
[NC] or [nocase] - specify that matching should be case insensitive
Using the long form of the flags is often more readable and will help others who come to read your code later.
You can separate multiple flags with a comma:
RewriteRule ^olddir(.*)$ /newdir$1 [L,NC]
Common pitfalls
Mixing mod_alias style redirects with mod_rewrite
# Bad
Redirect 302 /somepage.html http://example.com/otherpage.html
RewriteEngine on
RewriteRule ^(.*)$ index.php?query=$1
# Good (use mod_rewrite for both)
RewriteEngine on
# 302 redirect and stop processing
RewriteRule ^somepage.html$ /otherpage.html [R=302,L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# handle other redirects
RewriteRule ^(.*)$ index.php?query=$1
Note: you can mix mod_alias with mod_rewrite, but it involves more work than just handling basic redirects as above.
Context affects syntax
Within .htaccess files, a leading slash is not used in the RewriteRule pattern:
# given: GET /directory/file.html
# .htaccess
# result: /newdirectory/file.html
RewriteRule ^directory(.*)$ /newdirectory$1
# .htaccess
# result: no match!
RewriteRule ^/directory(.*)$ /newdirectory$1
# httpd.conf
# result: /newdirectory/file.html
RewriteRule ^/directory(.*)$ /newdirectory$1
# Putting a "?" after the slash will allow it to work in both contexts:
RewriteRule ^/?directory(.*)$ /newdirectory$1
[L] is not last! (sometimes)
The [L] flag stops processing any further rewrite rules for that pass through the rule set. However, if the URL was modified in that pass and you're in the .htaccess context or the <Directory> section, then your modified request is going to be passed back through the URL parsing engine again. And on the next pass, it may match a different rule this time. If you don't understand this, it often looks like your [L] flag had no effect.
# processing does not stop here
RewriteRule ^dirA$ /dirB [L]
# /dirC will be the final result
RewriteRule ^dirB$ /dirC
Our rewrite log shows that the rules are run twice and the URL is updated twice:
rewrite 'dirA' -> '/dirB'
internal redirect with /dirB [INTERNAL REDIRECT]
rewrite 'dirB' -> '/dirC'
The best way around this is to use the [END] flag (see Apache docs) instead of the [L] flag, if you truly want to stop all further processing of rules (and subsequent passes). However, the [END] flag is only available for Apache v2.3.9+, so if you have v2.2 or lower, you're stuck with just the [L] flag.
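For the example above, the END version would be (a sketch):
# for a request to /dirA, /dirB is now the final result - the ruleset is not re-run
RewriteRule ^dirA$ /dirB [END]
RewriteRule ^dirB$ /dirC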
For earlier versions, you must rely on RewriteCond statements to prevent matching of rules on subsequent passes of the URL parsing engine.
# Only process the following RewriteRule if on the first pass
RewriteCond %{ENV:REDIRECT_STATUS} ^$
RewriteRule ...
Or you must ensure that your RewriteRules are in a context (e.g. httpd.conf) that will not cause your request to be re-parsed.
If you need to "block" internal redirects/rewrites from happening in the .htaccess, take a look at the
RewriteCond %{ENV:REDIRECT_STATUS} ^$
condition, as discussed here.
The deal with RewriteBase:
You almost always need to set RewriteBase. If you don't, Apache guesses that your base is the physical disk path to your directory. So start with this:
RewriteBase /
Other Pitfalls:
1- Sometimes it's a good idea to disable MultiViews
Options -MultiViews
I'm not well versed in all of MultiViews' capabilities, but I know that it messes up my mod_rewrite rules when active, because one of its properties is to try and "guess" an extension for a file that it thinks I'm looking for.
I'll explain:
Suppose you have two PHP files in your web dir, file1.php and file2.php, and you add these conditions and rule to your .htaccess:
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ file1.php/$1
You assume that all URLs that do not match a file or a directory will be grabbed by file1.php. Surprise! This rule is not honored for the URL http://myhost/file2/somepath. Instead you're taken inside file2.php.
What's going on is that MultiViews automagically guessed that the URL you actually wanted was http://myhost/file2.php/somepath and gladly took you there.
Now you have no clue what just happened, and at that point you're questioning everything you thought you knew about mod_rewrite. You then start playing around with rules to try to make sense of the logic behind this new situation, but the more you test, the less sense it makes.
OK, in short: if you want mod_rewrite to work in a way that approximates logic, turning off MultiViews is a step in the right direction.
2- Enable FollowSymLinks
Options +FollowSymLinks
That one I don't really know the details of, but I've seen it mentioned many times, so just do it. (For what it's worth, mod_rewrite in per-directory/.htaccess context requires FollowSymLinks or SymLinksIfOwnerMatch to be enabled for that directory; without it, the RewriteRule directive is forbidden there and requests fail with a 403.)
Comparing one variable with another (an equality check) can be done with the following example:
RewriteCond %{REQUEST_URI} ^/(server0|server1).*$ [NC]
# %1 is the string that was found above
# %1<>%{HTTP_COOKIE} concatenates the first match with a mod_rewrite variable -> "server0<>foo=bar;"
# the RewriteCond captures (.*) before the <> separator; \1 is a back-reference to that capture
# <> is used as a string separator/marker and can be replaced by any other character
RewriteCond %1<>%{HTTP_COOKIE} !^(.*)<>.*stickysession=\1.*$ [NC]
RewriteRule ^(.*)$ https://notmatch.domain.com/ [R=301,L]
Dynamic Load Balancing:
If you use mod_proxy to balance your system, it's possible to add a dynamic range of worker servers.
RewriteCond %{HTTP_COOKIE} ^.*stickysession=route\.server([0-9]{1,2}).*$ [NC]
RewriteRule (.*) https://worker%1.internal.com/$1 [P,L]
A better understanding of the [L] flag is in order. The [L] flag is last; you just have to understand what will cause your request to be routed through the URL parsing engine again. From the docs (http://httpd.apache.org/docs/2.2/rewrite/flags.html#flag_l) (emphasis mine):
The [L] flag causes mod_rewrite to stop processing the rule set. In most contexts, this means that if the rule matches, no further rules will be processed. This corresponds to the last command in Perl, or the break command in C. Use this flag to indicate that the current rule should be applied immediately without considering further rules.
If you are using RewriteRule in either .htaccess files or in <Directory> sections, it is important to have some understanding of how the rules are processed. The simplified form of this is that once the rules have been processed, the rewritten request is handed back to the URL parsing engine to do what it may with it. It is possible that as the rewritten request is handled, the .htaccess file or <Directory> section may be encountered again, and thus the ruleset may be run again from the start. Most commonly this will happen if one of the rules causes a redirect - either internal or external - causing the request process to start over.
So the [L] flag does stop processing any further rewrite rules for that pass through the rule set. However, if your rule marked with [L] modified the request, and you're in the .htaccess context or the <Directory> section, then your modified request is going to be passed back through the URL parsing engine again. And on the next pass, it may match a different rule this time. If you don't understand what happened, it looks like your first rewrite rule with the [L] flag had no effect.
The best way around this is to use the [END] flag (http://httpd.apache.org/docs/current/rewrite/flags.html#flag_end) instead of the [L] flag, if you truly want to stop all further processing of rules (and subsequent reparsing). However, the [END] flag is only available for Apache v2.3.9+, so if you have v2.2 or lower, you're stuck with just the [L] flag. In this case, you must rely on RewriteCond statements to prevent matching of rules on subsequent passes of the URL parsing engine. Or you must ensure that your RewriteRules are in a context (e.g. httpd.conf) that will not cause your request to be re-parsed.
Another great feature is rewrite map expansion. It's especially useful if you have a massive amount of hosts/rewrites to handle.
A map works like a key-value replacement:
RewriteMap examplemap txt:/path/to/file/map.txt
Then you can use a mapping in your rules like:
RewriteRule ^/ex/(.*) ${examplemap:$1}
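The text map file itself is just whitespace-separated key/value pairs, one per line (hypothetical keys shown):
# map.txt - key on the left, replacement on the right
oldproduct   products/new-product.html
archive      legacy/archive.html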
More information on this topic can be found here:
http://httpd.apache.org/docs/2.0/mod/mod_rewrite.html#mapfunc
mod_rewrite can modify aspects of request handling without altering the URL, e.g. setting environment variables, setting cookies, etc. This is incredibly useful.
Conditionally set an environment variable:
RewriteCond %{HTTP_COOKIE} myCookie=(a|b) [NC]
RewriteRule .* - [E=MY_ENV_VAR:%1]
Return a 503 response:
RewriteRule's [R] flag can take a non-3xx value and return a non-redirecting response, e.g. for managed downtime/maintenance:
RewriteRule .* - [R=503,L]
will return a 503 response (not a redirect per se).
Also, mod_rewrite can act like a super-powered interface to mod_proxy, so you can do this instead of writing ProxyPass directives:
RewriteRule ^/(.*)$ balancer://cluster%{REQUEST_URI} [P,QSA,L]
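For that to work, the balancer itself has to be defined in the server config, roughly like this (hypothetical backends):
# httpd.conf (mod_proxy + mod_proxy_balancer)
<Proxy balancer://cluster>
    BalancerMember http://app1.internal:8080
    BalancerMember http://app2.internal:8080
</Proxy>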
Opinion:
Using RewriteRules and RewriteConds to route requests to different applications or load balancers based on virtually any conceivable aspect of the request is just immensely powerful. Controlling requests on their way to the backend, and being able to modify the responses on their way back out, makes mod_rewrite the ideal place to centralize all routing-related config.
Take the time to learn it, it's well worth it! :)