How to authenticate inside an iframe when it is blocked by X-Frame-Options - authentication

I am working with an application where another application is embedded inside an iframe.
The embedded application uses a Google service that requires authentication.
Of course, Google is not happy about this:
Refused to display 'https://accounts.google.com/signin/oauth/error?authError=[...characters...].apps.googleusercontent.com' in a frame because it set 'X-Frame-Options' to 'deny'
Is there any alternative for authenticating and connecting to this service within the iframe?

If X-Frame-Options is set to DENY then supporting browsers will refuse to display the site in an iframe regardless of site settings or domain.
This behavior is intended to prevent clickjacking attacks.
If you have control over the browser your users are using, you could pick one that does not support X-Frame-Options.
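For reference, a quick way to confirm why the frame is refused is to inspect the response headers of the URL you are embedding. A minimal sketch (TypeScript on Node 18+ with the built-in fetch; the URL is just a placeholder for whatever your iframe points at):

    // Print the framing-related response headers of a URL so you can see whether
    // X-Frame-Options (or a CSP frame-ancestors directive) blocks embedding.
    async function checkFrameHeaders(url: string): Promise<void> {
      const res = await fetch(url, { redirect: "follow" });
      console.log("X-Frame-Options:", res.headers.get("x-frame-options"));
      console.log("Content-Security-Policy:", res.headers.get("content-security-policy"));
    }

    checkFrameHeaders("https://accounts.google.com/").catch(console.error);

If DENY comes back (as it does here), no iframe-side setting will change the outcome; the restriction is enforced by the browser on behalf of the framed site.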

Related

Is there a way to add an additional Authentication Token when using a secure Apache website

We have our Apache websites secured using password security, which works well. I've noticed that if I add the following header to my browser requests I can bypass the security: Authorization: Basic xxxxxxxxx. However, many other websites I visit use this same header, which requires me to always disable it before visiting other sites.
Is there a way to configure Apache to recognize an additional header to bypass security, so I can store this header in my browser settings and be able to visit other sites without having to disable it?

Safari Cross Site Ajax call does not store cookie

I have a website on VueJS and a backend on AWS.
Let's say the website is at www.mywebsite.com, on a hosting server with cPanel, and my backend on AWS runs under www.mybackend.com.
When the user logs in through the website, it makes an axios/fetch call to the backend. The backend returns a Set-Cookie header for the www.mywebsite.com domain.
Chrome and Firefox work fine, but Safari does not store the cookie because it is a cross-site cookie.
Is there any easy way to make Safari store the cookie and send it on calls to the backend? Can I mask the backend URL with a subdomain of my main domain? Any ideas?
Safari does behave differently from those other browsers. It will only allow cross-origin cookies if they are from the same cookie domain.
So you can get this to work but only if you're in a position to change the URL so that the domains match.
So if you have a website at:
www.mywebsite.com
and the backend at:
backend.mywebsite.com
You can then share the cookie by setting the Domain:
Set-Cookie: my-cookie=value; Domain=mywebsite.com
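As a rough sketch of what the rest of the setup would look like (assuming the backend really is reachable at backend.mywebsite.com; the /login path, payload, and header values are illustrative, not taken from the question):

    // Browser side (TypeScript): the request must opt in to cookies, otherwise
    // Safari (and other browsers) will ignore the Set-Cookie on the response.
    //
    // Assumed server response headers:
    //   Access-Control-Allow-Origin: https://www.mywebsite.com   (an explicit origin, not *)
    //   Access-Control-Allow-Credentials: true
    //   Set-Cookie: my-cookie=value; Domain=mywebsite.com; Secure
    async function login(user: string, password: string): Promise<void> {
      const res = await fetch("https://backend.mywebsite.com/login", {
        method: "POST",
        credentials: "include", // send and store cookies on this cross-origin call
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ user, password }),
      });
      if (!res.ok) throw new Error(`Login failed: ${res.status}`);
    }

With axios, the equivalent option is withCredentials: true on the request config.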
If the two sites are on totally unrelated domains and you can't change that then I'm not aware of any way to make that work with Safari.
I did a more complete write-up of using cookies with CORS (including the quirks with Safari) at https://cors-errors.info/faq#cdc8

Third Party Cookies - Cross Domain APIs w/ Session Tracking

Given a CORS API that requires a session cookie to track users as they move through a checkout process, there are issues in multiple browsers where the cookie is not set until after the user visits the site the API is hosted on.
For example:
johnny.com uses a CORS JSON API from jacob.com. jacob.com sets a cookie after the first AJAX call is made, but some browsers will not set the cookie for subsequent calls. Therefore the API will not function as expected.
Browser Behavior:
Chrome seems to function fine unless "Third-Party cookies" are deliberately disabled. There doesn't seem to be a workaround for this.
IE does not allow the cookie to be set initially unless there is a P3P privacy policy header returned with the initial call.
Safari does not allow the cookie to be set initially unless a hack is used (see: http://measurablewins.gregjxn.com/2014/02/safari-setting-third-party-iframe.html)
Any insight on how to work around these issues is greatly appreciated.
Unfortunately, it seems there is no option that makes this work across all browsers.
Safari now restricts third-party use of cookies.
It seems the best approach is to evaluate alternatives:
Set up a proxy server that forwards calls to the different services (for example, when you hit johnny.com/jacob/abc, act as a proxy and retrieve jacob.com/abc); a minimal sketch is shown below
Use OAuth login on the API (it might be impractical)
Move the API under johnny.com/api/...
PayPal has also created several JS-based solutions to try to work around this kind of problem: https://medium.com/@bluepnume/introducing-paypals-open-source-cross-domain-javascript-suite-95f991b2731d
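As a rough illustration of the proxy idea from the first alternative (a sketch only: the /jacob prefix, the port, and the GET-only handling are assumptions, and a real proxy would also forward request bodies and session cookies):

    // TypeScript, Node 18+: serve this from johnny.com so the browser only ever talks
    // to johnny.com; requests under /jacob/ are forwarded server-side to jacob.com,
    // so no third-party cookie is needed in the browser.
    import { createServer } from "node:http";

    createServer(async (req, res) => {
      if (req.method === "GET" && req.url?.startsWith("/jacob/")) {
        // johnny.com/jacob/abc -> jacob.com/abc
        const upstream = await fetch("https://jacob.com" + req.url.slice("/jacob".length));
        const body = await upstream.text();
        res.writeHead(upstream.status, {
          "content-type": upstream.headers.get("content-type") ?? "application/json",
        });
        res.end(body);
      } else {
        res.writeHead(404);
        res.end();
      }
    }).listen(3000);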

Google Chrome Forces HTTPS

I am developing a Rails application that uses an SSL connection. I am currently using third-party resources, js and css files, for implementing a map (OpenStreetMap). I have already tried to import these resources (js and css) into my application, but the JavaScript code tries to access an external WMS via HTTP.
The problem is that Google Chrome blocks access to third-party HTTP resources when the application is on HTTPS.
So I disabled SSL on certain pages of the application and tried to force HTTP or HTTPS as desired.
I followed this blog post: http://www.simonecarletti.com/blog/2011/05/configuring-rails-3-https-ssl/ and it works.
But when I force HTTP on the page where these resources are used, Google Chrome forces an HTTPS connection, causing an infinite loop.
If I clear the Chrome cache (having already accessed the same page over HTTPS) and then access it via HTTP, it works. But if I have accessed an HTTPS page and then try to access it via HTTP, Chrome forces the HTTPS connection, resulting in an infinite loop.
The question is: Is there something I can set in the request that causes Chrome to accept the connection?
Regards
I've been doing some research on this, and it turns out that turning on force_ssl = true on Rails 3 causes the app to send an HSTS header. There's a bit of information about it here: How to disable HTTP Strict Transport Security?
Essentially, the HSTS header tells Chrome (and Firefox) to access your site only through HTTPS for a specific amount of time.
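For reference, the header that force_ssl causes Rails to send looks roughly like this (the exact max-age depends on the Rails version and configuration):
Strict-Transport-Security: max-age=31536000
Until that max-age expires (or the entry is cleared, as described below), Chrome will keep rewriting plain HTTP requests for the domain to HTTPS on its own.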
So... the answer I have for you now is that you can clear your own HSTS setting by going to chrome://net-internals/#hsts in your Chrome browser and deleting the domain's HSTS entry.
I think the answers here can help you: Rails: activating SSL support gets Chrome confused

How do I get placemark icons to load over SSL?

I'm working on a web application that uses the Google Earth plugin. Recently, a new requirement was added for non-public users to log on, which meant that some users were now using the site over HTTPS. Among the things that broke in testing were the custom placemark icons (they were working over HTTP).
The icons are hosted on the same server that serves the page.
Here are the urls for each of the protocols.
http - http://localhost/Images/yellow.png
https - https://localhost/Images/yellow.png
I can follow that link and the image will appear as you would expect.
The image hrefs are declared as icon styles in dynamically generated KML.
I want to avoid loading the images over HTTP because I think that will cause Internet Explorer to present the user with a mixed content warning.
How do I get the images to load properly while using https?
I have been wrestling with this myself -- the short answer is that this won't work. If the content is served off of an HTTPS site that generates any kind of error/prompt (authentication, invalid certificate, etc.) the plugin will simply not load the content.
Interestingly, the desktop client works fine and prompts the user for credentials if necessary. However, neither client will allow content to be served off of site with an untrusted certificate.
The only workaround that I have found is:
Use a trusted HTTPS certificate on the server hosting the content (either trust the certificate on the client systems or just use a real certificate.)
Do not use HTTPS basic auth as that will always generate 401/Challenge responses which the web browser client will simply ignore
If authentication is a requirement, use NTLM authentication and common (e.g., domain) logins. If you load the plugin in Internet Explorer (or in a .NET WebBrowserControl), the authentication will be handled properly and the images will show up.
I was at a Google Earth administrator's training last week and the trainer confirmed this "bug". It is supposed to be fixed in the next version of the plugin (it may actually be fixed already -- what version of the plugin are you using?)