Can traefik's forwardAuth middleware be used to secure a browser page (not an api)?

I need to secure a web page with a token stored in a cookie or URL param. All the examples I can find for using forwardAuth middleware seem to be for securing an API, where it's easy to supply headers in the request. Sending custom headers isn't an option with the browser, so I need to use cookies.
I would like to have the auth token passed in through a query string arg, e.g. ?token=ABCDEFG, then stored in a cookie for future requests. The workflow looks like this: the first request carries the token in the query string, the auth service validates it, sets a cookie, and redirects to the same URL with the token stripped; subsequent requests are authorized by the cookie alone.
I've tried experimenting with forwardAuth to see how I can do this. The auth endpoint reads the Authorization header, but I need something that reads the cookie in the request and transforms that to an Authorization header.
Is there any way this can be done with Traefik?

It looks like the answer is yes. Originally I had thought Traefik wouldn't forward cookies to the auth endpoint, but it does in fact forward them.
I ended up creating a "sidecar" auth container on the same host as traefik so that auth requests would be faster.
The auth function looks like this (node/express):
const express = require('express');
const cookieParser = require('cookie-parser');

const app = express();
app.use(cookieParser()); // populates req.cookies
const logger = console;  // stand-in for the app's logger

// Domain the auth cookie should be scoped to
const BASE_DOMAIN = process.env.BASE_DOMAIN;

app.get('/auth', (req, res) => {
  logger.info('CHECKING AUTH');
  // Rebuild the original request URL from the headers Traefik forwards
  const url = new URL(`${req.headers['x-forwarded-proto']}://` +
    `${req.headers['x-forwarded-host']}` +
    `${req.headers['x-forwarded-uri']}`);
  const urlAuthToken = url.searchParams.get('token');
  if (urlAuthToken) {
    // Token arrived as ?token=...: store it in a cookie and redirect to the cleaned URL
    url.searchParams.delete('token');
    const domain = BASE_DOMAIN;
    const sameSite = false;
    const secure = url.protocol === 'https:';
    return res
      .cookie('auth-token', urlAuthToken, { domain, sameSite, secure })
      .redirect(url.toString());
  }
  // Simulate credentials check
  if (req.cookies['auth-token'] === 'my-little-secret') {
    return res.status(200).send();
  }
  return res.status(401).send('<h1>401: Unauthorized</h1>');
});

app.listen(3000);
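On the Traefik side, the protected router just needs a forwardAuth middleware pointing at that sidecar. Here is a minimal sketch of the dynamic configuration (file provider); the sidecar address http://auth:3000 and the router and service names are assumptions, not the original setup:
# Traefik dynamic configuration, minimal sketch
http:
  middlewares:
    cookie-auth:
      forwardAuth:
        address: "http://auth:3000/auth"
  routers:
    my-app:
      rule: "Host(`app.example.com`)"
      middlewares:
        - cookie-auth
      service: my-app-service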

Related

How to validate a token that is sent by socket.io using passport? I am using the passport-azure-ad strategy

I have an application that is using the passport-azure-ad strategy to authenticate users. When the client sends a POST or GET request, I have a middleware that checks whether the request is valid with passport.authenticate('oauth-bearer', { session: false })(req, res, next), and this works perfectly fine.
But in this same application I am also using socket.io for uploading images. When the client tries to establish a socket connection with the server, it sends a token in the handshake like this - const socket = io('http://localhost:3000', { auth: { token: 'eyhadjhad...' } }). I have access to this token on the server side like this - const token = socket.handshake.auth.token. Now I am having trouble authenticating this token.
Is there a way I can add a middleware on namespaces like the one I have for routes? for example like this -
io.of('/fileUpload')
  .use((socket, next) =>
    passport.authenticate(token)
  )
  .on('connection', (socket) => {
    console.log('user authenticated, allow upload')
  })
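One commonly used workaround (a sketch, not from this thread) is to adapt passport's connect-style middleware inside a socket.io namespace middleware by handing it a minimal request-like object whose Authorization header carries the handshake token. This assumes the 'oauth-bearer' strategy from passport-azure-ad is already registered and that io and passport are set up elsewhere:
io.of('/fileUpload').use((socket, next) => {
  // Build a minimal request-like object so the bearer strategy can find the token
  const fakeReq = {
    headers: { authorization: `Bearer ${socket.handshake.auth.token}` },
  };
  passport.authenticate('oauth-bearer', { session: false }, (err, user) => {
    if (err || !user) return next(new Error('unauthorized'));
    socket.user = user; // make the validated user available to connection handlers
    next();
  })(fakeReq, {}, next);
});

io.of('/fileUpload').on('connection', (socket) => {
  console.log('user authenticated, allow upload');
});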

Implement authentication in Next.js, GraphQL, Apollo Client

I'm trying to build an SSR application using Next.js and apollo-client on the frontend, and GraphQL with Express (using GraphQL Yoga) on the backend.
I come from a client-side rendering background, and things there are simpler than SSR when it comes to authentication. In regular client-side rendering my approach to authenticating a user was:
1- once the user logs in and passes server validation, sign a JWT with the current user data, then send it to the client side and save it in localStorage or cookies, etc.
2- implement a loadUser() function and call it in the (root) App component's useEffect hook to load the user in every component (page) if the JWT in localStorage is valid.
3- if the JWT isn't there or is invalid, just return the user as null and redirect to the login page.
In Next.js I know we can't access localStorage because the code also runs server side, so we just save the token in a cookie. The approach I implemented is painful and I was wondering if there is a simpler way. My approach is:
1- once the user logs in, he calls the login mutation, which sets a cookie (so it comes back in the request headers) and returns the user and any data I want.
2- in each page that requires authentication I need to get the token from the cookie and send it back in the header. I did that in getInitialProps() or getServerSideProps(), because both run server side and have access to the request cookies in the header, like so:
export const getServerSideProps = async ctx => {
  const apolloClient = initializeApollo();
  // get the cookies from the headers in the request object
  const token = ctx.req.headers.cookie ? ctx.req.headers.cookie : null;
  return {
    props: {
      initialApolloState: apolloClient.cache.extract(),
      token: token
    }
  };
};
Now I have access to the token in the page props, and I can send it back in the request header with my Apollo client like so:
let getUserQuery = await apolloClient.query({
  query: GET_USER_QUERY,
  variables: { id: ctx.params.id },
  context: { headers: { token: token } }
});
Now I have access to the token in the server-side request as req.headers.token.
What I want to achieve:
1- Is there an easier way to implement a loadUser() function that loads the user on every page render, which I can put in the Next.js custom _app? I found this answer, but it doesn't return the auth object or user in all components as mentioned in that answer.
2- I read that if I set the cookie to httpOnly and use credentials: "include" I have access to the cookie in every request, but it doesn't seem to work with Apollo client; it would be awesome if there is an alternative approach.
3- There is apollo-link-context, provided by the Apollo team, where I can send a token or any value in every request's header using setContext(), like so:
const authLink = setContext((_, { headers }) => {
  // get the authentication token from local storage if it exists
  const token = localStorage.getItem('token');
  // return the headers to the context so httpLink can read them
  return {
    headers: {
      ...headers,
      authorization: token ? `Bearer ${token}` : "",
    }
  }
});
But since I don't have access to localStorage I can't implement it, because Next.js also runs server side, so if anyone has an implementation for this please consider sharing.
PS: I made this thread after searching and reading for about a week and it's my last resort to ask you guys. Thanks in advance.
Get the token from the store:
store.getState().path.to.your.token
The problem is that the token doesn't completely update when it changes, and I'm looking for a solution.
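Regarding point 3 above, one workable pattern (a sketch, not from the original answers) is to build the Apollo client per request and let setContext read the token from the incoming request's cookies on the server, falling back to document.cookie in the browser. Assumptions: the JWT lives in a cookie named token, the cookie npm package is used for parsing, and the createApolloClient helper and GraphQL URI are illustrative names:
import { ApolloClient } from 'apollo-client';
import { createHttpLink } from 'apollo-link-http';
import { setContext } from 'apollo-link-context';
import { InMemoryCache } from 'apollo-cache-inmemory';
import cookie from 'cookie';

export function createApolloClient(ctx) {
  const httpLink = createHttpLink({ uri: 'http://localhost:4000/graphql' }); // illustrative URI

  const authLink = setContext((_, { headers }) => {
    // On the server, read cookies from the incoming request (ctx.req);
    // in the browser, fall back to document.cookie.
    const rawCookies =
      ctx && ctx.req
        ? ctx.req.headers.cookie || ''
        : typeof document !== 'undefined' ? document.cookie : '';
    const { token } = cookie.parse(rawCookies);
    return {
      headers: {
        ...headers,
        authorization: token ? `Bearer ${token}` : '',
      },
    };
  });

  return new ApolloClient({
    link: authLink.concat(httpLink),
    cache: new InMemoryCache(),
  });
}
Calling createApolloClient(ctx) inside getServerSideProps gives each server-side render a client that already carries the cookie's token, and calling it with no argument in the browser keeps the same behavior client side.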

What is Code Challenge in the query param in an authorization server like IdentityServer (from a JS SPA client pov)?

When I do a manual redirect, I'm getting an error from IdentityServer:
invalid_request, code challenge required
However, when I use the oidc-client-js library for the same authorization request, I do not get that error. The library somehow sets the code challenge under the hood.
Here is my JS code.
Set up:
const config = {
  authority: "https://demo.identityserver.io",
  client_id: "interactive.confidential",
  redirect_uri: "http://localhost:3000/callback",
  response_type: "code",
  scope: "openid profile email api offline_access",
  post_logout_redirect_uri: "http://localhost:3000/post_logout",
};
const url = 'https://demo.identityserver.io/connect/authorize' +
  `?client_id=${config.client_id}` +
  `&redirect_uri=${config.redirect_uri}` +
  `&response_type=${config.response_type}` +
  `&scope=${config.scope}`;
My manual authorization redirect request that throws:
const onFormSubmit = async (ev: React.FormEvent) => {
  ev.preventDefault();
  window.location.replace(url); // I simply do replace
};
Code with the library that doesn't throw:
import Oidc from 'oidc-client';

const onFormSubmit = async (ev: React.FormEvent) => {
  ev.preventDefault();
  const mgr = new Oidc.UserManager(config);
  mgr.signinRedirect(); // login redirect here, no errors
};
I want to understand what a code challenge is and how it gets generated. Give me a hint on what to read about it.
I can go on with the library, but I'd prefer not to import third-party libs into my app where possible.
The authorize endpoint handles multiple grant types; the way you are sending your request matched the Authorization Code grant, which requires the code_challenge parameter in the request.
Try something simpler, making a request like:
GET /connect/authorize?
    client_id=client1&
    scope=openid email api1&
    response_type=id_token token&
    redirect_uri=https://myapp/callback&
    state=abc&
    nonce=xyz
Read Authorize Endpoint for more information.
Here's an example of generating a code challenge:
private string CreateCodeChallenge()
{
    _codeVerifier = RandomNumberGenerator.CreateUniqueId();
    var sha256 = HashAlgorithmProvider.OpenAlgorithm(HashAlgorithm.Sha256);
    var challengeBuffer = sha256.HashData(
        CryptographicBuffer.CreateFromByteArray(Encoding.UTF8.GetBytes(_codeVerifier)));
    byte[] challengeBytes;
    CryptographicBuffer.CopyToByteArray(challengeBuffer, out challengeBytes);
    return Base64Url.Encode(challengeBytes);
}
Include the code_challenge and the code_challenge_method in the request query string.
You can generate codes for testing here: https://tonyxu-io.github.io/pkce-generator/
That's as far as I've gotten with it, but I am shown the login screen.
It's a parameter required by the Proof Key for Code Exchange standard.
OAuth 2.0 public clients utilizing the Authorization Code Grant are susceptible to the authorization code interception attack. This specification describes the attack as well as a technique to mitigate against the threat through the use of Proof Key for Code Exchange (PKCE, pronounced "pixy").
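Since the question is from a JS SPA that would rather avoid third-party libraries, here is a minimal browser sketch of generating the code_verifier and code_challenge with the Web Crypto API, following RFC 7636 (the S256 method); how you persist the verifier between the redirect and the token request (for example sessionStorage) is up to you:
// Browser-only sketch using the Web Crypto API; no third-party libraries.
function base64UrlEncode(bytes) {
  return btoa(String.fromCharCode(...bytes))
    .replace(/\+/g, '-')
    .replace(/\//g, '_')
    .replace(/=+$/, '');
}

async function createPkcePair() {
  // code_verifier: 32 random bytes, base64url-encoded (43 chars, within the allowed 43-128)
  const codeVerifier = base64UrlEncode(crypto.getRandomValues(new Uint8Array(32)));

  // code_challenge = BASE64URL(SHA-256(code_verifier))
  const digest = await crypto.subtle.digest('SHA-256', new TextEncoder().encode(codeVerifier));
  const codeChallenge = base64UrlEncode(new Uint8Array(digest));

  return { codeVerifier, codeChallenge };
}

// Usage: append &code_challenge=...&code_challenge_method=S256 to the authorize URL,
// keep codeVerifier, and send it as code_verifier in the follow-up token request.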

ValidateAntiForgeryToken in an ASP.NET Core React SPA Application

I'm trying to use the framework's tools to add some simple CSRF validation to an ASP.NET Core React SPA. The application itself is essentially a create-react-app setup (a single index.html with a root element and everything else is loaded in from bundled JavaScript).
Tinkering with some information found on links such as this one, I've set the following in my Startup.ConfigureServices:
services.AddAntiforgery(options => options.Cookie.Name = "X-CSRF-TOKEN");
And confirmed in my Chrome tools that the cookie is being set. If I omit the above line, a cookie is still set with a partially randomized name, such as .AspNetCore.Antiforgery.RAtR0X9F8_w. Either way the cookie is being set. I've also confirmed that any time I restart the whole application the cookie value is updated, so the framework is actively setting this cookie.
Observing network requests in my Chrome tools, I can confirm that the cookie is being sent to the server on AJAX requests. Placing a breakpoint on the server and observing the Request.Cookies value in a controller action also confirms this.
However, if I decorate any such AJAX requested action with [ValidateAntiForgeryToken] then the response is always an empty 400.
Is there a configuration step I've missed somewhere? Perhaps the action attribute is looking in the wrong place and I need to use a different validation?
I just inspected the log and found out there's an exception:
Microsoft.AspNetCore.Antiforgery.AntiforgeryValidationException: The required antiforgery cookie ".AspNetCore.Antiforgery.HPE6W9qucDc" is not present.
   at Microsoft.AspNetCore.Antiforgery.Internal.DefaultAntiforgery.ValidateRequestAsync(HttpContext httpContext)
   at Microsoft.AspNetCore.Mvc.ViewFeatures.Internal.ValidateAntiforgeryTokenAuthorizationFilter.OnAuthorizationAsync(AuthorizationFilterContext context)
It indicates that you forgot to configure the cookie name:
public void ConfigureServices(IServiceCollection services)
{
    //services.AddAntiforgery();
    services.AddMvc().SetCompatibilityVersion(CompatibilityVersion.Version_2_1);
    // In production, the React files will be served from this directory
    services.AddSpaStaticFiles(configuration =>
    {
        configuration.RootPath = "ClientApp/build";
    });
}
So I just add a configuration as below:
public void ConfigureServices(IServiceCollection services)
{
    services.AddAntiforgery(o => {
        o.Cookie.Name = "X-CSRF-TOKEN";
    });
    // ...
}
and it works now.
Also, if you would like to omit the services.AddAntiforgery(options => options.Cookie.Name = "X-CSRF-TOKEN"); line, you can use the built-in antiforgery.GetAndStoreTokens(context) method to send the cookies:
app.Use(next => context =>
{
    if (context.Request.Path == "/")
    {
        //var tokens = antiforgery.GetTokens(context);
        var tokens = antiforgery.GetAndStoreTokens(context);
        context.Response.Cookies.Append("X-CSRF-TOKEN", tokens.CookieToken, new CookieOptions { HttpOnly = false });
        context.Response.Cookies.Append("X-CSRF-FORM-TOKEN", tokens.RequestToken, new CookieOptions { HttpOnly = false });
    }
    return next(context);
});
Both should work as expected.
The accepted answer here is extremely incorrect when it suggests sending both tokens via JS-readable cookies:
// do not do this
context.Response.Cookies.Append("X-CSRF-TOKEN", tokens.CookieToken, new CookieOptions { HttpOnly = false });
context.Response.Cookies.Append("X-CSRF-FORM-TOKEN", tokens.RequestToken, new CookieOptions { HttpOnly = false });
If you send both the Cookie token and the Request token in a Cookie that is readable by JS, you are defeating the purpose of having a Cookie token and a Request token.
The purpose of using both tokens is to make sure that
you have a valid session (the HTTP-only Cookie proves this),
you have requested a form from the site using this valid session (the HTTP-readable Cookie or another method can prove this), and
you are submitting the form from the same valid session
Why It's Wrong.
The Request Token
The Request Token ensures that you have actually loaded a page (example.com/example-page). Think about this: if you are logged in to example.com as an administrator, a request from anywhere in your browser (where CORS allows the necessary properties) can successfully validate against Cookie-based CSRF validation and your authentication.
However, by adding the Request Token, you are confirming that your browser also actually loaded the form (or at least the site) before submitting it. This is usually done with a hidden input. This is automatically done by using the Form Tag Helper in Asp.Net.
<form action="/myEndpoint" method="POST">
    <input name="__RequestVerificationToken" type="hidden" value="@antiforgery.GetAndStoreTokens(context).RequestToken" />
    <button type="submit">Submit</button>
</form>
It can also be set anywhere, like window.CSRFRequestToken, and manually added to a POST request, like in this fetch example:
fetch('/myEndpoint', { method: 'POST', headers: { 'X-XSRF-Token': window.myCSRFRequestToken, 'Bearer': window.mySuperSecretBearerToken } });
The Cookie Token
In the contrived example above, the user is logged in with a bearer token obtained via OAuth or something similar (not recommended; use HTTP-only cookies in a browser environment).
The Cookie Token ensures that a malicious script cannot exfiltrate your Request Token and send requests on your behalf. Without it, in a supply chain attack, a malicious user can send your secrets to a malicious actor:
window.addEventListener('load', () => sendMySuperSecretInfoToTheShadowRealm(window.CSRFRequestToken, window.mySuperSecretBearerToken));
Now the malicious user could send a request from wherever they want using your CSRF and bearer token to authenticate. BUT! Not if you have your good friend HTTP-only Cookie-based CSRF Validation -- because JavaScript cannot read HTTP-only cookies.
The Solution
Asp.Net combines these solutions by setting both a Cookie Token and a Request Token. Therefore, when you are sending a request to AspNet you send both:
The cookie:
Cookies.Append("X-CSRF-Token", antiforgery.GetAndStoreTokens(context).CookieToken);
and either the ASP.NET form tag helper (which adds the hidden request token automatically):
<form action="myEndpoint" method="post"></form>
or manually print the token:
<form action="myEndpoint" asp-antiforgery="false">
    @Html.AntiForgeryToken()
</form>
or provide the token manually to your scripts:
window.myCSRFRequestToken = "@antiforgery.GetAndStoreTokens(context).RequestToken";
fetch('/myEndpoint', { method: 'POST', headers: { 'X-CSRF-Token': window.myCSRFRequestToken } });
Don't take my word for it
Please please read this page fully in case I didn't explain anything clearly:
https://learn.microsoft.com/en-us/aspnet/core/security/anti-request-forgery?view=aspnetcore-6.0
A final note:
In the documentation above, the very last example uses a cookie to send the request token. This is subtly but importantly different from the accepted answer here. The accepted answer sends both tokens as JavaScript-readable cookies ({ HttpOnly = false }). This means JavaScript can read both, so a malicious user can read both and craft a special request themselves that will validate against both the Cookie and Request CSRF validations (where CORS allows).
In the documentation, one is sent via an HTTP only cookie (this cannot be read by JS, only used for Cookie-based CSRF validation) and the other is sent via an HTTP-readable cookie. This HTTP-readable cookie MUST be read by JavaScript and used with one of the above methods (form input, header) in order to validate CSRF Request Token Validation.
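To make that difference concrete, here is a hedged client-side sketch of the documented pattern: the request token arrives in a JS-readable cookie, JavaScript reads it and echoes it back in a header, and the antiforgery cookie itself stays HTTP-only and travels automatically. The cookie name XSRF-TOKEN and header name X-XSRF-TOKEN are assumptions that have to match whatever the server configured (options.HeaderName in AddAntiforgery and the cookie you append):
// Client-side sketch; cookie and header names are assumptions, see above.
function getCookie(name) {
  const row = document.cookie.split('; ').find((c) => c.startsWith(name + '='));
  return row ? decodeURIComponent(row.slice(name.length + 1)) : null;
}

async function postJson(url, body) {
  return fetch(url, {
    method: 'POST',
    credentials: 'same-origin', // the HTTP-only antiforgery cookie rides along automatically
    headers: {
      'Content-Type': 'application/json',
      'X-XSRF-TOKEN': getCookie('XSRF-TOKEN'), // request token, JS-readable by design
    },
    body: JSON.stringify(body),
  });
}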

Safely passing bearer tokens to the header from a server only cookie

Is there a way to "safely" store a bearer token in cookies (server side from an Express server running a Next.js app) and then provide it as part of the header so that it's included with every Apollo request? The Apollo team has an example using localStorage, but nothing about grabbing it from a cookie vs localStorage to set in the header. I'm looking into this in order to mitigate XSS. Is there a way to safely provide a token to this code without exposing the token in the browser? This has seemingly been covered in parts in multiple tutorials, but I can't seem to find any definitive code based on this example.
import { ApolloClient } from 'apollo-client';
import { createHttpLink } from 'apollo-link-http';
import { setContext } from 'apollo-link-context';
import { InMemoryCache } from 'apollo-cache-inmemory';

const httpLink = createHttpLink({
  uri: '/graphql',
});

const authLink = setContext((_, { headers }) => {
  // get the authentication token from local storage if it exists
  const token = localStorage.getItem('token');
  // return the headers to the context so httpLink can read them
  return {
    headers: {
      ...headers,
      authorization: token ? `Bearer ${token}` : "",
    }
  }
});

const client = new ApolloClient({
  link: authLink.concat(httpLink),
  cache: new InMemoryCache()
});
I have an OAuth2 server that is expecting a bearer token sent in the header in the form authorization: 'Bearer ****'. I'm just looking for the most secure way to do this, and feel like I'm getting conflicting information from the majority of tutorials.
Making my comments into an answer since it helped you figure things out:
Browser Javascript cannot access a server-only cookie at all. So, if the server is putting the token in a server-only cookie, then you can't access that from browser Javascript. The server could make it just a regular cookie and then you could get to it from browser Javascript. Or, if the client isn't actually in a browser, it could get the cookie.
Is there a way to safely provide a token to this code without exposing the token in the browser?
If this code is in a browser (which it looks like it is because it's accessing localStorage), then "no", there is no way to let this code have access to the token, but not let other code in the browser have access to it. Anything you access with browser Javascript is accessible to the world. Usually, you maintain security by having your server keep track of auth tokens and having your server access the privileged server on behalf of the client and just send results to the client.
What I'm asking is. Is there a way to attach the token cookie via node.js so that it gets attached to all Apollo requests without adding it via the client.
Are Apollo requests being made direct from the client or only from your server? Sorry, but I don't understand your architecture here. If only from the server, then you can create a client session on the server (see express-session) and you can store things on the server in that session that belong to a specific client and you can then retrieve those things on future requests.
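Building on that last point, one way to keep the token out of browser JavaScript entirely is a small server-side proxy: the browser's Apollo client points at /graphql on your own Express server, which reads the httpOnly cookie and forwards the request upstream with the Authorization header. A minimal sketch; the cookie name token, the GRAPHQL_URL value, and Node 18+ global fetch are assumptions:
const express = require('express');
const cookieParser = require('cookie-parser');

const app = express();
app.use(cookieParser());
app.use(express.json());

// Upstream GraphQL API (illustrative)
const GRAPHQL_URL = process.env.GRAPHQL_URL || 'http://localhost:4000/graphql';

// The token only ever exists in the httpOnly cookie and in the header added here,
// server side; browser JavaScript never sees it.
app.post('/graphql', async (req, res) => {
  const token = req.cookies['token'];
  const upstream = await fetch(GRAPHQL_URL, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      ...(token ? { authorization: `Bearer ${token}` } : {}),
    },
    body: JSON.stringify(req.body),
  });
  res.status(upstream.status).json(await upstream.json());
});

app.listen(3000);
The Apollo client in the browser then just uses createHttpLink({ uri: '/graphql', credentials: 'same-origin' }) with no auth link at all.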