Amplify gives "No current user" only in Safari?

Context: I'm using Amplify for authentication on a static site that is really two sites frankensteined together: Website A is built and placed in an S3 bucket, then Website B is built and its build files are placed in a subfolder of that same bucket. This hosts perfectly fine; the only hitch is that to navigate between the sites I can't use Website A's router, because at build time it doesn't know Website B exists, so to direct users toward Website B I have to use window.location.assign(SAME_DOMAIN/v2/website_B).
Expected Result: I can navigate to the other site in Chrome, Safari, Firefox, etc.
Actual Result: I can navigate to the other site in Chrome, Firefox and Opera. In Safari, when I try to navigate to the other site, Amplify.currentSession() returns the error "Can't get current user", so the website naturally redirects me back to sign-in. Having checked storage, Safari seems to be wiping the Amplify credentials (whether I keep them in localStorage or cookies). Another odd behaviour: if I type the URL in manually, it navigates there no problem, but I've tried window.location.replace, window.location.assign, document.location.replace, etc. and nothing works.
Amplify Config
auth: {
  region: "eu-west-2",
  userPoolId: "eu-west-2_XXXXXXXX",
  userPoolWebClientId: "XXXXXXXXXXXXXXXXXXXXXX",
  cookieStorage: {
    domain: "XXX.XXXXX.com",
    path: "/",
    expires: 365,
    secure: true,
  },
},

Safari is stricter than other browsers when handling cookies: whenever Amplify auth is used with secure: true on a page served over HTTP (as is often the case on a dev machine), it fails with "No current user". Served over HTTPS, all is fine.
See this GH issue on the same subject.
The solution we settled on is to tie the secure property to the development environment:
secure: process.env.NODE_ENV !== "development"
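
Applied to the config from the question, that looks like:

auth: {
  region: "eu-west-2",
  userPoolId: "eu-west-2_XXXXXXXX",
  userPoolWebClientId: "XXXXXXXXXXXXXXXXXXXXXX",
  cookieStorage: {
    domain: "XXX.XXXXX.com",
    path: "/",
    expires: 365,
    // Safari refuses Secure cookies on a plain-HTTP dev server, so only require them outside development
    secure: process.env.NODE_ENV !== "development",
  },
},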

Related

How to configure Content-Security-Policy of Helmet package in express to allow cross site iframe and cross site scripting?

Updating details for clarity: in my project, users upload HTML themes. For each user, once they authenticate, I create a public static folder for that user inside the theme folder they are requesting. There is then an editor on the front end where they can edit the HTML theme contents. I am trying to display HTML themes in that editor in an iframe, using the static link from the backend. The problem is that I can't add a script to the HTML theme in the iframe; it says permission denied. How can I solve this?
I am using Express in the backend and Next.js in the frontend. I have added this code to the Helmet middleware:
app.use(
  helmet({
    contentSecurityPolicy: {
      directives: {
        'connect-src': ["'self'", 'http://localhost:3000'],
        'default-src': "'self'",
        'frame-ancestors': ["'self'", 'http://localhost:3000'],
        sandbox: ['allow-forms', 'allow-scripts'],
        'script-src': ["'self'", 'http://localhost:3000'],
      },
    },
  })
);
For cross-site scripting:
app.use(xss())
But I am still getting the error in the iframe.
From the backend I am trying to allow a route to be used in an iframe in the frontend. Since both servers run on different ports on localhost, it violates the cross-site embedding and scripting rules, so I am using the helmet and xss packages. I need help configuring them.
I am using the iframe's onload attribute to check when it has loaded and then injecting another script into the iframe from the frontend.
You have an issue with the Same-Origin Policy, not with Content Security Policy. The Helmet package can't help you here.
Set document.domain = 'example.com'; (where example.com is 'localhost' in your case) both in the iframe and in the main page. It resets the port number to null and maps any subdomain any.example.com to the domain example.com.
If both the iframe and the main page are already on the same domain, you can simply set document.domain = document.domain;.
Both variants reset the port number to null, so you'll be able to access the iframe even though it is served from a different port.
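
A minimal sketch of that approach with the localhost setup from the question (the backend port below is just an example):

// In the main page, served from http://localhost:3000 (the Next.js front end):
document.domain = 'localhost';

// Inside the HTML theme returned by the Express server (e.g. http://localhost:5000/...):
document.domain = 'localhost';

// Once both have run, the parent page can script the frame despite the port mismatch:
const frameDoc = document.querySelector('iframe').contentDocument;
const injected = frameDoc.createElement('script');
injected.textContent = 'console.log("injected from the parent page")';
frameDoc.body.appendChild(injected);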

Next-auth Receiving 404 after login attempt in deployed Vercel application

I'm developing a Next.js application that uses next-auth for authentication. It is currently set up with GitHub as the only provider.
In development, the authentication works just fine.
In production, after I click "Sign in with GitHub", I am directed to a 404.
I'm 99% sure this has to do with the callback URL I have set up in my GitHub OAuth app. For dev purposes it is set to http://localhost:3000/api/auth/callback/github. Obviously this is no good for a deployed app, but I don't know what to set it to. I've tried a couple of different URLs with no luck.
Other than the callback URL, is there anything else I need to set up in my code to get this working in production?
Other than the callback URL, this is from the docs:
https://next-auth.js.org/getting-started/example#deploying-to-production
When deploying your site set the NEXTAUTH_URL environment variable to the canonical URL of the website.
NEXTAUTH_URL=https://example.com
You should set this as a production environment variable on the Vercel dashboard to link to the URL where Vercel deployed your site.
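In the GitHub OAuth app settings, the callback URL then follows the same pattern as the local one, just on the deployed domain (the domain below is only a placeholder):
https://your-app.vercel.app/api/auth/callback/github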
It is possible that this is because you changed your .env.local file to contain your GITHUB_ID value, but did not restart next.js to pick up the new value.
Try stopping your dev server and restarting it (with npm run dev).
The clue is that the GitHub authentication URL ends with &client_id= and no GitHub client ID after it.
I ran into the same issue; I had been using next export for static HTML export. Dropping next export and simply using next build worked for me on Vercel.
I had the same 404 problem; it was due to the changes in version 4 of NextAuth. I was getting a 404 page when logging in, and changing the code in nextauth.js fixed it for me. You could try the following:
import NextAuth from 'next-auth'
import GitHubProvider from 'next-auth/providers/github'

export default NextAuth({
  providers: [
    GitHubProvider({
      clientId: process.env.GITHUB_CLIENT_ID,
      clientSecret: process.env.GITHUB_CLIENT_SECRET,
    }),
  ],
})
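Note that this file conventionally lives at pages/api/auth/[...nextauth].js, and GITHUB_CLIENT_ID / GITHUB_CLIENT_SECRET (or whatever names you use) also need to be set as environment variables in the Vercel project settings, since local .env files normally aren't deployed.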

Safari Cross Site Ajax call does not store cookie

I have a website built with Vue.js and a backend on AWS.
Let's say the website is at www.mywebsite.com, on a hosting server with cPanel, and my backend on AWS runs under www.mybackend.com.
When the user logs in through the website, it makes an axios/fetch call to the backend, and the backend returns a Set-Cookie for the www.mywebsite.com domain.
Chrome and Firefox work fine, but Safari does not store the cookie because it is a cross-site cookie.
Is there any easy way to make Safari store the cookie and send it on calls to the backend? Can I mask the backend URL with a subdomain of my main domain? Any ideas?
Safari does behave differently from those other browsers. It will only allow cross-origin cookies if they are from the same cookie domain.
So you can get this to work but only if you're in a position to change the URL so that the domains match.
So if you have a website at:
www.mywebsite.com
and the backend at:
backend.mywebsite.com
You can then share the cookie by setting the Domain:
Set-Cookie: my-cookie=value; Domain=mywebsite.com
If the two sites are on totally unrelated domains and you can't change that then I'm not aware of any way to make that work with Safari.
I did a more complete write-up of using cookies with CORS (including the quirks with Safari) at https://cors-errors.info/faq#cdc8
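As a rough sketch of the subdomain setup described above, assuming an Express-style backend at backend.mywebsite.com (the question's backend runs on AWS, so the exact mechanism may differ):

// backend.mywebsite.com — illustration only
const express = require('express');
const cors = require('cors');
const app = express();

app.use(cors({
  origin: 'https://www.mywebsite.com', // must be explicit, not '*', when credentials are used
  credentials: true,                   // adds Access-Control-Allow-Credentials: true
}));

app.post('/login', (req, res) => {
  // Scope the cookie to the parent domain so both subdomains share it
  res.cookie('session', 'value', { domain: 'mywebsite.com', secure: true, httpOnly: true });
  res.sendStatus(204);
});

app.listen(process.env.PORT || 8080);

On the front end, the axios/fetch call has to opt in to cookies as well, e.g. axios.post(url, data, { withCredentials: true }) or fetch(url, { method: 'POST', credentials: 'include' }).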

Error 404 on a page that exists and works fine through an internal link

I created a website with several pages on Vue.js.
Everything works fine locally, but when I deploy to Heroku, pages only work when I click an internal link in my menu that redirects to the corresponding page (using router push).
When I try to access /any-page directly from the browser, I get a 404 with the message "Cannot GET /any-page", whereas the same page displays correctly when reached via a link.
As I mentioned, when I serve my app locally I don't have this problem.
I really can't see where this comes from; thanks in advance for your help.
There's a deployment guide specifically for Heroku in the official Vue CLI documentation.
You'll quickly notice the relevant information:
static.json
{
  "root": "dist",
  "clean_urls": true,
  "routes": {
    "/**": "index.html"
  }
}
For SPAs (Single Page Applications), you'll want to point every route to the index. Vue Router will take care of navigating to the proper page.
Heroku is serving the contents of your Vue build folder. Since Vue builds the app as a single index.html file, only the main route works.
Vue doesn't actually navigate to the route; rather, it rewrites the browser URL using the History API and handles loading the new route itself.
You could use one of these options:
OPTION 1
You could use mode: "hash" to fix routes when reloading the page. However this will add a # before every route.
const router = new VueRouter({
  mode: "hash",
  routes: [...]
})
OPTION 2
Write a Node.js (e.g. Express) app that routes every request to your index.html file. This is done with a catch-all middleware; see the sketch after the reference below.
Reference: https://router.vuejs.org/guide/essentials/history-mode.html#example-server-configurations
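
A minimal sketch of that catch-all middleware, assuming the Vue build output is in dist/ as in the static.json above:

// server.js — serve the built SPA and fall back to index.html for every other route
const express = require('express');
const path = require('path');
const app = express();

app.use(express.static(path.join(__dirname, 'dist')));

// Any route the static middleware didn't match gets index.html,
// and Vue Router resolves the actual page on the client
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, 'dist', 'index.html'));
});

app.listen(process.env.PORT || 3000);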

Upload to Google Cloud Storage using a signed URL returns "origin not allowed" on Safari only

I'm using the Node.js library to generate signed URLs and have no problem uploading files from Chrome, both on my local machine and in production. But a CORS issue appears when sending the PUT request from Safari, both desktop (v13.0.5) and iOS; so far there is no issue with Chrome on Mac. It says Origin https://website.com is not allowed by Access-Control-Allow-Origin.
I am pretty sure it is somehow related to how Safari sends the request. I have double-checked my API that generates the URL; it has all the proper params (the content type matches the client's), and it does work in Chrome. The PUT request is sent using fetch().
I have tried updating the GCS CORS config using gsutil, but Safari still complains that the origin is not allowed, and now it does not even work in Chrome without a wildcard on origin and responseHeader. Someone on the internet mentioned that Chrome is fine with wildcards but Safari expects the headers/origin to be explicit, yet I can't figure out which headers are required. I have tried different variations for responseHeader such as Access-Control-Allow-Origin, origin, Origin, and x-goog-resumable.
[
  {
    "origin": ["https://website.com"],
    "responseHeader": ["Content-Length", "Content-Type", "Date", "Server", "Transfer-Encoding", "X-GUploader-UploadID", "X-Google-Trace"],
    "method": ["GET", "HEAD", "POST", "PUT"],
    "maxAgeSeconds": 3000
  }
]
I have another project that runs the same setup and has no problem; the only differences are probably the @google-cloud/storage npm version and using version: 'v4' when calling getSignedUrl().
I found a few posts on the internet saying to use https://bucket.storage.googleapis.com instead of https://storage.googleapis.com/bucket, still to no avail.
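
For reference, a minimal sketch of the kind of v4 signed-URL generation the question describes (bucket, object name, and content type are placeholders; the important detail is that contentType must match the Content-Type header the browser sends with the PUT):

// Node.js backend, @google-cloud/storage
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

async function getUploadUrl() {
  const [url] = await storage
    .bucket('my-bucket')
    .file('uploads/my-file.png')
    .getSignedUrl({
      version: 'v4',
      action: 'write',
      expires: Date.now() + 15 * 60 * 1000, // 15 minutes
      contentType: 'image/png', // must match the client's Content-Type header exactly
    });
  return url;
}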