I have a Fastify server that sets a cookie to maintain the session.
When I test on localhost, it works as expected. I get a cookie on localhost:3000.
When I host my server as a container image through Cloud Run, however, I can't see any cookies inside the route. I can see that the request has a cookie when it is sent to the server, but it isn't being parsed. This makes the session plugin create a new session on every request, so I lose all context.
Are there any special considerations regarding Cloud Run and cookies?
This only applies if you are using Firebase Hosting + Cloud Run: __session is the only cookie you can store, by design.
This is necessary for Google to be able to efficiently cache content on the CDN -- Google strips all cookies from the request other than __session.
This is documented here.
If you are using a Load Balancer, or other means like a custom domain, to connect to your Cloud Run service, there is no restriction on cookies, and you receive all of them.
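If you do stay behind Firebase Hosting, one workaround is to name your session cookie __session. A minimal sketch with @fastify/cookie and @fastify/session, assuming those plugins and a placeholder secret (your actual config will differ):

```js
// Minimal sketch, assuming Firebase Hosting in front of Cloud Run: name the
// session cookie __session so the CDN doesn't strip it.
const fastify = require('fastify')({ trustProxy: true }); // Cloud Run sits behind a proxy

fastify.register(require('@fastify/cookie'));
fastify.register(require('@fastify/session'), {
  cookieName: '__session', // the only cookie Firebase Hosting forwards
  secret: 'a-placeholder-secret-at-least-32-chars!!', // placeholder, use a real secret
  cookie: { secure: true } // Cloud Run traffic is HTTPS
});

fastify.get('/', (request, reply) => {
  request.session.visits = (request.session.visits || 0) + 1;
  reply.send({ visits: request.session.visits });
});

fastify.listen({ port: Number(process.env.PORT) || 8080, host: '0.0.0.0' });
```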
I have been scratching my head for days on this issue, so I thought I would try to seek some help here.
I have a Wildfly server and an external Keycloak server used for authentication.
My Keycloak server uses OIDC and a public client.
Usually the flow works fine: the user logs in, and subsequent resources authenticate properly with Keycloak. The issue comes when I deploy my custom plugins to my Wildfly server. Each plugin needs to authenticate to the Keycloak server, and usually this happens without issue on the first request. However, some of my plugins are REST-only, so their first request is an XHR request, and that request fails because it gets redirected to Keycloak's login page, which the XHR client does not understand. I am unsure how this flow is meant to work.
If I set withCredentials to true on my XHR request, it triggers CORS on my Keycloak server, which is fine, as I have that set up correctly. What is baffling to me is that the return request from Keycloak then triggers CORS on my Wildfly server, and because Keycloak's redirects use a no-referrer policy, the origin is null! Setting my Wildfly server to accept a null origin would not be acceptable.
Ideally, I would like a solution where each plugin does not need to authenticate against Keycloak if the user has already completed the login process, but I cannot find any way to enable session sharing between deployed plugins.
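For reference, this is the kind of call that fails; a hypothetical sketch (the URL is a placeholder, not my real endpoint):

```js
// Hypothetical sketch of the failing call: a credentialed XHR to a REST
// endpoint on the Wildfly server (URL is a placeholder).
const xhr = new XMLHttpRequest();
xhr.open('GET', 'https://wildfly.example.com/my-plugin/api/items');
xhr.withCredentials = true; // send cookies cross-origin, which triggers the CORS checks
xhr.onload = () => {
  // Without an established session, Keycloak answers with a redirect to its
  // HTML login page; the browser follows it, and because the redirect carries
  // no referrer the Origin header ends up null, failing CORS on the way back.
  console.log(xhr.status, xhr.getResponseHeader('Content-Type'));
};
xhr.send();
```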
I currently have an API running on Node.js Express where you can get or upload all kinds of files (images, videos, ...) as well as simple JSON responses.
I would like to connect SvelteKit to this API, but it is secured with an SSO, so I need to provide an access token with each request.
I already get the access token from the SSO (OIDC) in SvelteKit.
Solution 1:
a service worker intercepts requests to the API and adds the access token.
Problems: I don't want to rebuild every time, but as the documentation says, service workers only work in the production build, not in development.
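A sketch of what Solution 1 can look like (the API origin and the token hand-off are assumptions; a service worker can't read localStorage, so the page has to pass the token in, e.g. via postMessage):

```js
// src/service-worker.js — minimal sketch of Solution 1.
// API_ORIGIN and the token hand-off are placeholders.
const API_ORIGIN = 'https://api.example.com';
let token;

self.addEventListener('message', (event) => {
  // The page sends the token in after obtaining it from the SSO.
  if (event.data?.type === 'SET_TOKEN') token = event.data.token;
});

self.addEventListener('fetch', (event) => {
  const url = new URL(event.request.url);
  if (url.origin !== API_ORIGIN || !token) return; // let the request pass through

  // Clone the request with an added Authorization header.
  const headers = new Headers(event.request.headers);
  headers.set('Authorization', `Bearer ${token}`);
  event.respondWith(fetch(new Request(event.request, { headers })));
});
```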
Solution 2:
send requests to the SvelteKit backend and then pipe them to the API with the access token.
Problems: works only for basic requests but not for streams; it seems stream support landed recently (https://github.com/sveltejs/kit/issues/5344), but there is no documentation or example, and this solution requires more resources (requests should go straight from the browser to the API).
Solution 3:
Hooks externalFetch
This function allows you to modify (or replace) a fetch request for an external resource that happens inside a load function that runs on the server (or during pre-rendering).
Problems: It doesn't work for requests like the src of an image
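For completeness, a sketch of Solution 3 using externalFetch as it existed in older SvelteKit versions (it was later renamed handleFetch); the API origin and getToken() are placeholders:

```js
// src/hooks.js — sketch of Solution 3 (externalFetch, since renamed handleFetch).
function getToken() {
  return 'user-access-token'; // placeholder: however the server retrieves the user's token
}

export async function externalFetch(request) {
  if (request.url.startsWith('https://api.example.com/')) {
    const headers = new Headers(request.headers);
    headers.set('Authorization', `Bearer ${getToken()}`);
    request = new Request(request, { headers });
  }
  // Only fetches made inside `load` during SSR pass through here, which is why
  // this never sees requests the browser makes on its own, like an <img src>.
  return fetch(request);
}
```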
Any ideas?
edit: Solution: in the new version of SvelteKit, node-fetch has been replaced by Undici and streams are functional, so it is possible to pipe requests from the backend.
It works well for dev, but it's not the best solution for production, so you can use both approaches depending on the environment.
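A sketch of what that backend pipe can look like in current SvelteKit (the upstream URL, route path, and locals.token are assumptions):

```js
// src/routes/api/[...path]/+server.js — sketch of the backend pipe.
// The upstream URL and locals.token are placeholders.
const API_URL = 'https://api.example.com';

export async function GET({ params, locals }) {
  const upstream = await fetch(`${API_URL}/${params.path}`, {
    headers: { Authorization: `Bearer ${locals.token}` } // token attached server-side
  });

  // With undici underneath, upstream.body is a ReadableStream, so files and
  // videos stream through instead of being buffered in memory.
  return new Response(upstream.body, {
    status: upstream.status,
    headers: {
      'content-type': upstream.headers.get('content-type') ?? 'application/octet-stream'
    }
  });
}
```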
Here is what I'm trying to achieve:
Right now, the Desktop App, Auth server, and API are working correctly; I can get my JWT and use it to call the API.
Both web apps are already in use: subdomain1.domain.com uses NGINX auth_request, with cookies and sessions on an old auth server, to grant access.
The web app on subdomain2.domain.com uses sessions and connects to the API with an app token.
And all these servers are part of the same domain.
So, is it possible to share the JWT from my Desktop app with browsers? We generally use Chrome.
The desktop app uses Python 3, and most users will be on Windows.
If I can't get that working, my other concern is: can my browser use that JWT on all web apps once it has obtained it from the auth server? All servers share the same main domain.
Our web servers can be running Apache2, Nginx, Node.js, or Flask (Python), which is kind of annoying when trying to make things like this work.
I could use a cookie for .domain.com and store the JWT inside it, am I right?
If yes, is this really the best idea?
The idea behind it is:
The user logs in via the app or the browser
A JWT is generated
The JWT is shared between the app and the browser (not sure about this one)
The JWT is used on all subdomains by the browser
What is your advice on this?
I think you can use a cookie in that case with no regrets. Just configure it correctly so that every domain that needs access to the token can read it.
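For example, when the auth server issues the JWT, it can scope the cookie to the parent domain so every subdomain receives it. A sketch with Express (issueJwt() and the cookie options are placeholders):

```js
// Sketch: after login, scope the cookie to the parent domain so
// subdomain1.domain.com and subdomain2.domain.com both receive it.
const express = require('express');
const app = express();
app.use(express.json());

app.post('/login', (req, res) => {
  const token = issueJwt(req.body); // placeholder: however your auth server creates the JWT
  res.cookie('jwt', token, {
    domain: '.domain.com', // shared across all subdomains
    httpOnly: true,        // not readable from page JavaScript
    secure: true,          // HTTPS only
    sameSite: 'lax',
    maxAge: 60 * 60 * 1000 // match the JWT's expiry
  });
  res.sendStatus(204);
});
```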
I'm writing a web app with a separate frontend and backend. The frontend is written in React, and the backend is a Node.js server running an Express endpoint. How do I ensure that only my frontend can access the API, and not anyone else? My API URL is exposed in my frontend client-side code, so anyone can see that.
I added JWT authentication to my API, but I still need an unprotected /login endpoint in order to generate the JWT, and to log in and generate the token I must post both a username and a password from my frontend, which other users can see, since it's done client side.
What is the proper way of securing an API that is hosted on a separate backend like this, so that only my frontend can access it, in a way where nobody can see what credentials are being used to access the endpoint?
You can't. Your API is on the internet. Anyone can access it. You can require an account and login credentials for the account before allowing access to the API, but once someone has an account and credentials, they can access the API from their own script rather than via your web page. This is how the web works. Not much you can do about it. And credentials being used by the client cannot be hidden. All data that is EVER on the client can be looked at by a hacker on the client. This is the way of the web.
Larger companies will typically monitor their API usage to look for inappropriate use. This includes rate limiting and detecting behaviors and sequences that are not typical of a regular human user. When they detect inappropriate use, they will often disable that action or ban the offending account, either temporarily or permanently. This is also why some pages use techniques such as reCAPTCHA to verify that an actual human is individually causing the operation. For example, on Stack Overflow, when editing comments or posts, I often run into rate limiting that tells me I have to wait a bit before it will accept my edit.
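In Express, that kind of throttling can be bolted on with middleware, for instance express-rate-limit (the numbers here are arbitrary):

```js
// Sketch: per-IP throttling with express-rate-limit (the limits are arbitrary).
const express = require('express');
const rateLimit = require('express-rate-limit');

const app = express();
app.use('/api/', rateLimit({
  windowMs: 15 * 60 * 1000, // 15-minute window
  max: 100                  // at most 100 requests per IP per window
}));
```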
There is no absolutely secure way to store credentials in a client. The most common scheme for credentials is to require username and password (securely over https) and then when that is accepted on the server as legit credentials, some sort of token is issued to the client which can be used for future API calls. That token may be in a cookie or may need to be manually included with each subsequent API call (the advantage of a cookie when using APIs from a browser is that the cookie is automatically sent with each subsequent request).
If the token is a cookie, then the cookie is stored in the browser's cookie storage and an expiration can be set for it. The browser's cookie storage is protected from access by web pages from other sites, but can be accessed by someone on the local computer (it's stored in the file system).
If the token is not a cookie, just returned as a token, and the client wishes to store it, there are a few other places that JavaScript provides access to for storing it. Local storage has similar security to cookie storage: it is protected from access by other web sites, but can be accessed by a person on the local computer.
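Putting that together, a minimal sketch of the usual flow with Express and jsonwebtoken (the secret and the verifyCredentials() helper are placeholders):

```js
// Sketch: /login verifies credentials over HTTPS and hands back a short-lived
// JWT in an httpOnly cookie; protected routes verify it on each request.
const express = require('express');
const jwt = require('jsonwebtoken');
const cookieParser = require('cookie-parser');

const app = express();
app.use(express.json());
app.use(cookieParser());

const SECRET = process.env.JWT_SECRET; // placeholder secret

app.post('/login', async (req, res) => {
  // verifyCredentials() is a placeholder for your own user lookup.
  const user = await verifyCredentials(req.body.username, req.body.password);
  if (!user) return res.sendStatus(401);

  const token = jwt.sign({ sub: user.id }, SECRET, { expiresIn: '1h' });
  res.cookie('token', token, { httpOnly: true, secure: true, sameSite: 'strict' });
  res.sendStatus(204);
});

app.get('/api/data', (req, res) => {
  try {
    const payload = jwt.verify(req.cookies.token, SECRET);
    res.json({ hello: payload.sub });
  } catch {
    res.sendStatus(401); // missing, expired, or tampered token
  }
});
```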
I have a Drupal site deployed on a Bitnami image containing some content that I use in a mobile application. For that, I set up a REST API to query Drupal and get the data.
Using Postman and the basic_auth credentials, I am able to get my API responses properly with no problems.
After that, I deployed Azure API Management and tried to pass the calls to the Drupal API through it, providing the credentials via an authentication-basic policy. Everything worked perfectly for a while, then suddenly I started getting a 403 Forbidden response.
I can still request the Drupal API directly via Postman using the same credentials with no problems.
Thanks
Actually, when I was creating the API in API Management, I used the wrong authentication credentials many times. Because of that, Drupal's flood control had blocked the API client from accessing the data.
I fixed it by truncating the flood table in my database.
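For reference, on a stock Drupal schema that is a one-liner run against the Drupal database (the table is simply called flood):

```sql
-- Clears Drupal's flood-control records, lifting the temporary block.
TRUNCATE TABLE flood;
```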