Ktor: define route for URL with and without trailing slash

I am trying to learn Ktor. When responding to a GET request, I found that a URL with and without a trailing slash is treated as two different routes, for example:
/greet?name=john
/greet/?name=john
Is it possible to define a single route for the above two URLs, i.e. to handle the trailing slash automatically?

You can use the IgnoreTrailingSlash plugin to solve your problem.

In your Application.module() or Application.configureRouting() add this:
install(IgnoreTrailingSlash)
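With the plugin installed, both URLs from the question resolve to the same handler. As a minimal, untested sketch (assuming Ktor 2.x with the Netty engine; the port and greeting are just placeholders):

import io.ktor.server.application.*
import io.ktor.server.engine.*
import io.ktor.server.netty.*
import io.ktor.server.response.*
import io.ktor.server.routing.*

fun main() {
    embeddedServer(Netty, port = 8080) {
        // With this plugin, /greet and /greet/ hit the same route.
        install(IgnoreTrailingSlash)
        routing {
            get("/greet") {
                val name = call.request.queryParameters["name"] ?: "stranger"
                call.respondText("Hello, $name!")
            }
        }
    }.start(wait = true)
}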

Related

Using regex URI in location with proxy and sending an API as part of result

My problem is this: the URI contains a changing variable in the middle. I would like to match and store this variable and use it to send an additional API call to a separate service. I am trying to request a stream and change TV channels in a single request; basically, I am becoming the man-in-the-middle (or an API gateway). I am not new to nginx, but not well versed either.
Example URL: http://10.0.0.1:1234/auto/v002?transcode=heavy
Where "v002" is the channel that I want to match and capture the "002" as a variable to use later in an API call to a remote IR blaster.
location ^~ "/auto/v(\d{3})\?transcode=heavy" {
    rewrite "/auto/v((\d{3}))$\?transcode=heavy" /$1 break;
    proxy_pass http://10.0.0.5:1234/auto/v002?transcode=heavy; # note the trailing slash!
}
I encounter different issues with using URIs and regex, and my trial and error isn't paying off.
Even if I try to match explicitly,
location = "/auto/v002?transcode=heavy"{
I get no matches. I feel like the "?" in the URI might be an issue.
The second part of my question is how to send a simple REST API call as part of the flow using nginx.
Thanks in advance!
I have tried trial and error as well as searching the web with not much luck.
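For reference, one thing that trips people up here: an nginx location (and the URI that rewrite matches against) only ever sees the path; the query string is never part of it, and is read through variables such as $arg_transcode instead. A rough, untested sketch along those lines, reusing the upstream address from the question (the 404 fallback is just an illustrative choice):

# location matches the path only; "?transcode=heavy" is checked via $arg_transcode.
location ~ ^/auto/v(?<channel>\d{3})$ {
    # $channel now holds e.g. "002", ready to be used in the IR-blaster call.
    if ($arg_transcode != "heavy") {
        return 404;
    }
    # With no URI part, proxy_pass forwards the original path and query string as-is.
    proxy_pass http://10.0.0.5:1234;
}

For the second part of the question, nginx's mirror directive can send a copy of each matching request to an internal location, which could in turn proxy_pass to the IR blaster's REST endpoint; whether that fits this flow would need testing.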

Using an expression in the mod_ext_filter module

I need to configure an Apache reverse proxy to return a JS script on each page. For that, I use a Perl script that injects the JavaScript into the pages.
To do this, I used the following directive:
ExtFilterDefine fixtext mode=output intype=text/html cmd="/home/eloi/leanovia/observability/webserver-script/ttk-js-rum-injector.perl ${cookies_retrieved}"
I succeeded until this step.
However, I need to get the cookies from the request to use them in the JavaScript. I know I need to use expr=%{req:Cookie}.
However, the expression does not evaluate when I use it like this:
ExtFilterDefine fixtext mode=output intype=text/html cmd="/home/eloi/leanovia/observability/webserver-script/ttk-js-rum-injector.perl expr=%{req:Cookie}"
I know I'm really close to the solution, but I've been struggling for hours trying to get the expression evaluated, without success. I know that it is possible to send a response header that evaluates this expression, since I could test it with this directive:
Header set eloi "${cookies_retrieved}"
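One avenue worth trying, as an untested sketch: cmd= arguments are not expression-evaluated, but the Cookie request header can be copied into an environment variable with SetEnvIf, and the filter script could read it from its environment instead of from a command-line argument. Whether the external filter's child process inherits this variable is an assumption to verify against the mod_ext_filter documentation; the variable name is carried over from the question.

# Copy the raw Cookie request header into an environment variable.
SetEnvIf Cookie "(.+)" cookies_retrieved=$1
# The Perl script would then read cookies_retrieved from its environment.
ExtFilterDefine fixtext mode=output intype=text/html cmd="/home/eloi/leanovia/observability/webserver-script/ttk-js-rum-injector.perl"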

Using route parameters with CGI?

I'm converting an API that uses the Slim PHP framework to CGI. Slim handles route parameters very nicely (e.g. www.mysite.com/processorder/38929/w1?a1=test, where 38929 and w1 are route parameters, alongside additional GET parameters).
What is the correct or best way to do this? Do I need to configure httpd.conf to somehow convert the route parameters to POST parameters or additional GET parameters before it calls my CGI program?
I'm using the CGIC C library at boutell.com as a starting point, if that matters.
Thanks!
Well I feel a little foolish. The answer was obvious if I had taken my head out of CGI for a second.
Since the site is served with Apache, I just needed to create an .htaccess file to convert the "route" parameters to GET parameters, including the QSA flag to retain the original GET parameters in the URL (a1 in my example).
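As a sketch of that idea (the parameter names orderid and step and the CGI path are made up for illustration):

RewriteEngine On
# Turn the two path segments into GET parameters; [QSA] appends the original
# query string, so a1=test survives the rewrite.
RewriteRule ^processorder/([0-9]+)/([a-z0-9]+)$ /cgi-bin/processorder.cgi?orderid=$1&step=$2 [QSA,L]
# /processorder/38929/w1?a1=test -> /cgi-bin/processorder.cgi?orderid=38929&step=w1&a1=test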

Could a manipulated URL cause security issues inside a RewriteRule?

I'm brand new to Apache.
I have the following .htaccess file
Options -Indexes
RewriteBase /
RewriteEngine on
RewriteRule ^([a-zA-Z0-9_]+)*$ redirect.php?uniqueID=$1 [QSA,L]
so that going to: mySite.com/242skl2j
loads the page: mySite.com/redirect.php?uniqueID=242skl2j
But let's say I didn't have this regex ([a-zA-Z0-9_]) in my Apache config and I just allowed all characters... could someone load Apache code directly into this by navigating to something like mySite.com/2%20[R]%20reWriteRule%20^([a-zA-Z0-9_]+)*$%20anything.html#www.aDifferentSite.com/index.html
Like SQL injection, but for Apache? (I'm not even sure %20 would convert to a space in my Apache config, but there might be something that can?)
Or is this not really a concern because they can't do any real "harm" to the site, only to their own unique navigation?
As far as I know, there is no known security hole in Apache where something like this could slip through. Whatever is in your URL gets escaped before it's used inside Apache's engine.
Also, unlike those in the central server config, rewrites and redirections defined in .htaccess cannot "break out" of the current web root*, so even an accidentally mis-written (or exploited) RewriteRule could not be used to get hold of something that isn't supposed to be served publicly.
* = see the description of RewriteRule's behaviour in the docs.
There are no big risks, but there are a few things to be a little careful about.
Usually the things handled by mod_rewrite are already URL-decoded, so you can manipulate the HTTP_REFERER or the query path without any character-decoding considerations.
Also, the rewrite rules do not suffer from rule injection. But you could try: if you found a way to inject something into a rule that would be interpreted as RewriteRule code, you would become a rich guy :-). Seriously, I don't think you could; these rules manage arguments the way an SQL server manages arguments in parameterized queries: parameters cannot be read as code.
But mod_rewrite also receives the query string part of the request before any URL-decoding is applied to it, so this is a point where you should be cautious.
Any rule that applies an access restriction based on query string arguments (everything after the ?), like this "Access Control by Query String" example from http://wiki.apache.org, is wrong:
Deny access to http://example.com/page?query_string if query_string contains the string foo.
RewriteCond %{QUERY_STRING} foo
RewriteRule ^/page - [F]
Using http://example.com/page?toto=%66oo, with %66 for f, would not be detected by mod_rewrite. Managing rules on the query_string part of the request is very hard stuff. You could check this answer for examples; usually it means checking for combinations of both encoded and decoded characters in the string. But the simple rule is: avoid any access control by mod_rewrite based on the query string; work only on the query path. And even with paths, double-check that using // instead of / still works.
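To illustrate what checking both forms means (illustration only, and deliberately incomplete, since any character of "foo" could be percent-encoded):

RewriteCond %{QUERY_STRING} foo [OR]
# Also catch an encoded f; f%6f%6f, %66%6f%6f, etc. would still slip through.
RewriteCond %{QUERY_STRING} %66oo [NC]
RewriteRule ^/?page - [F]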
In the past some mod_rewrite exploits have existed:
http://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2003-0542 (2003, buffer overflow)
http://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2006-3747 (2006, buffer overflow)
http://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2013-1862 (2013, bad log escaping)
No, that's not possible. The only thing you need to worry about is how you handle the uniqueID parameter.
Short answer: no.
Long answer: a rewrite rule is nothing more than a replace function; it just replaces what it finds with what you have given it. But since you're passing that value to a web application, it may cause a problem if you haven't taken care of the incoming data!

Double request from mod_rewrite

I've written a module that sets Apache environment variables to be used by mod_rewrite. It hooks into ap_hook_post_read_request(), and that works fine, but if mod_rewrite matches a RewriteRule then it makes a second call to my request handler with the rewritten URL. This looks like a new request to me, in that the environment variables are no longer set, so I have to execute my (expensive) code twice for each hit.
What am I doing wrong, or is there a work around for this?
Thanks
You can use the [NS] modifier on a rule to cause it to not be processed for internal subrequests (the second pass you're seeing is an internal subrequest).
As I understand it, the NS flag (suggested in another answer) on a rule makes it evaluate as "if I am being called a second time, ignore me". The trouble is, by then it's too late, since the hook has already been called. I believe this will be a problem no matter what you do in mod_rewrite. You can detect the second request, but I don't know of any way to prevent it.
My best suggestion is to put the detection in your handler before your (expensive) code and exit if it's being run a second time. You could have mod_rewrite append something to the URL so you'd know when it's being called a second time.
However...
If your (expensive) code is being called on every request, it's also being called on images, css files, favicons, etc. Do you really want that? Or is that possibly what you are seeing as the second call?
Thanks a bunch, I did something similar to what bmb suggested and it works! But rather than involving mod_rewrite in this at all, I added a "fake" request header in my module's request handler, like so:
apr_table_set(r->headers_in, "HTTP_MY_MODULE", "yes");
Then I could detect it at the top of my handler on the second, rewritten request. It turns out that even though mod_rewrite (or Apache?) doesn't preserve added env or notes variables (r->subprocess_env, r->notes) in a subrequest, it does preserve added headers.
As for my expensive code getting called on every request, I have a configurable URL suffix/extension filter in the handler to ignore image requests, etc.
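For completeness, a minimal sketch of that detection logic, with illustrative names (the hook registration and the expensive work are elided; the marker header name is the one from the answer above):

#include "httpd.h"
#include "apr_tables.h"

static int my_post_read_request(request_rec *r)
{
    /* Second, rewritten pass: the marker header added below survives the
     * subrequest, so skip the expensive work. */
    if (apr_table_get(r->headers_in, "HTTP_MY_MODULE") != NULL) {
        return DECLINED;
    }

    /* ... expensive per-request work, e.g. setting vars for mod_rewrite ... */

    /* Mark the request so the second pass can be detected. */
    apr_table_set(r->headers_in, "HTTP_MY_MODULE", "yes");
    return DECLINED;
}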