Does anyone know how to get Safari to display CORS pre-flight OPTIONS requests in the dev tools network tab?
IIRC they used to show up, and I know the requests are being made as we can see them logging on the server.
We had the same problem suddenly appear in Chrome in the last few weeks (around Chrome 79/80) and had to work around it by setting chrome://flags/#out-of-blink-cors to Disabled.
Related
I have a very weird problem with Safari opening my web app.
The setup: I am running a Vue.js application stored in an S3 bucket on AWS. The app is exposed through an API Gateway.
The problem: When opening the app, only index.html and the favicon are loaded, but not the other assets. Sometimes they appear in the Network tab of the dev tools with the message "Failed to load resource", but sometimes not.
"Solution": When I open the app over http (which doesn't work) and then over https again, the resources somehow load, and the app works fine even when reloading with the cache cleared.
Does anyone know how to overcome this problem? 🤷‍♂️
I have a similar problem (not related to vue.js and amazon, but just to Subject: Safari selectively loads resources):
macOS Safari 14 (the desktop version; I don't know about the mobile one) doesn't load all of the referenced CSS, JS, and image files.
Safari 13 performed well, and other browsers (Chrome, Firefox) perform well. All files are referenced the same way, using relative URLs.
When using Safari 14, some of them are loaded, some are not. Caching is not an issue (I use "empty caches" before loading a page).
It looks like successfully loaded files are randomly chosen (the same file is sometimes loaded, sometimes it isn't).
In the Network tab of the Develop menu, for the files that fail to load, Preview says: "An error occurred trying to load the resource",
and in Headers, under "Request", only the Accept, Referer, and User-Agent lines appear, while
the GET, Cookie, Accept-Encoding, Host, Accept-Language, and Connection lines are missing.
Response says: "No response headers".
In the server's access log, there are no entries for the files that failed to load.
EDIT:
The cause: TLS 1.0
The solution: TLS 1.2
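For reference, if the server happens to be Apache with mod_ssl (an assumption; equivalent settings exist for nginx and most load balancers), restricting the protocol to TLS 1.2 and newer looks like:

```apache
# Allow only modern protocol versions; the poster traced
# Safari 14's selective load failures to TLS 1.0.
SSLProtocol -all +TLSv1.2 +TLSv1.3
```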
My predecessor at my workplace built a website (mahlerclean.com) for a client that allows job applicants to log onto another site (joblinkapply.com) via an iframe. The client has recently gotten complaints from applicants who are not able to log into the site via the iframe.
I am able to reproduce the issue in Safari. When I go to https://www.mahlerclean.com/career-center/job-openings it does not let me log into https://www.joblinkapply.com/company/6435 from there, and I see this message in the Safari web console:
Blocked a frame with origin "https://www.joblinkapply.com" from accessing a frame with origin "https://www.mahlerclean.com". Protocols, domains, and ports must match.
I have not been able to reproduce the issue in Firefox or Chrome though, and of course, if you navigate directly to https://www.joblinkapply.com/company/6435 (rather than through the iframe), it works fine in all browsers.
I control mahlerclean.com, but do not have any control over joblinkapply.com.
My questions are:
Is there anything I can do to the site at mahlerclean.com that would allow the iframe to joblinkapply.com to work in all browsers?
Why am I only seeing the issue in Safari? Are the other browsers likely to get more strict (i.e. behave like Safari) in the future?
Is it even reasonable to try to support logins to a remote site through an iframe, or should I tell the client to ditch the iframe, and just link out to https://www.joblinkapply.com/company/6435?
I am running plain-http URLs inside my tests (so migrating to https won't be easy), and because of that I am getting a warning in the browser.
How do I disable the "Not Secure" warning in Chrome during Selenium tests?
I've tried playing with arguments, but nothing works:
args: [
'start-maximized',
'disable-webgl',
'blacklist-webgl',
'blacklist-accelerated-compositing',
'disable-accelerated-2d-canvas',
'disable-accelerated-compositing',
'disable-accelerated-layers',
'disable-accelerated-plugins',
'disable-accelerated-video',
'disable-accelerated-video-decode',
'disable-gpu',
'disable-infobars',
'test-type',
'disable-extensions',
'allow-running-insecure-content',
'disable-web-security',
'ignore-certificate-errors',
'ignore-gpu-blacklist',
'no-default-browser-check',
'no-first-run',
'disable-default-apps'
]
The issue is that I need to resize the window to 420x800, but because of the warning the browser can't do that.
"Not Secure" SSL Error
As per Fix "Not Secure" SSL Error on Chrome Browser | Remove Warning, with the release of Chrome 68 Google started showing all HTTP sites as Not Secure in the Chrome browser.
Treatment of HTTP pages
This feature can be turned On / Off by accessing the page at chrome://flags/#enable-mark-http-as and setting the following attribute:
Mark non-secure origins as non-secure: Changes the UI treatment for HTTP pages on Mac, Windows, Linux, Chrome OS, Android
Default
Enabled
Enabled (mark as actively dangerous)
Enabled (mark with a Non Secure warning and dangerous on form edits)
Disabled
To disable this feature through Selenium, you need to use the ChromeOption --allow-running-insecure-content as follows:
Python:
from selenium import webdriver

chrome_options = webdriver.ChromeOptions()
chrome_options.add_argument("start-maximized")
chrome_options.add_argument("disable-infobars")
chrome_options.add_argument("--allow-running-insecure-content")
driver = webdriver.Chrome(chrome_options=chrome_options, executable_path=r'C:\Utility\BrowserDrivers\chromedriver.exe')
driver.get("http://www.legislation.vic.gov.au/")
This does not work for Chrome on Android devices. It's a bad idea for companies to tell users what they can and cannot look at. Tech giants like Google have gone too far, and the government is letting it happen.
Problem
Sometimes important HTTP POST requests sent with AJAX get duplicated, so several entries of the same data end up created in the production database, which of course is not what users intend.
What is important is that users have a poor internet connection, and this request takes a long time (9-20 seconds). We can't reduce this time because the business logic requires it.
Requests are sent with http, not https.
Details
We have Apache/2.4.18 (Ubuntu) with the PHP module loaded and two frontends: one for desktop (AngularJS) and one for mobile (React) devices. AngularJS sends requests with the $http service, and React uses whatwg-fetch (we tried whatwg-fetch-timeout as well).
We know from the Apache access.log and PHP logs that the same request arrives from the client several times and PHP processes each one without errors. But these requests get a 200 response with %b > 0 and %O = 0, which means the connection was aborted before the response was sent (see the Apache logging format docs).
Reproduce
So we tried to reproduce it, and the same thing sometimes happens. The following is just one reproduced case, but this happens on mobile devices (iPhones and Android phones) with different browsers installed. We have also reproduced it in Firefox under Windows.
Environment: Windows; both Chrome and Firefox; the React frontend; no proxy used.
Here's what we found: Google Chrome marks the request as "Stalled" but is internally retrying it multiple times (and each attempt is actually received and processed on the server), because it gets a network error (ERR_CONNECTION_CLOSED). Only when the browser succeeds in fetching the response does it stop resending the request.
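Given that the browser itself retries on ERR_CONNECTION_CLOSED, a common mitigation for retry-induced duplicates (not something described in the post; a sketch assuming both the client and the backend can be changed, shown in Python for brevity even though the backend here is PHP) is to tag each logical submission with an idempotency key so the server ignores repeats:

```python
import uuid

# Server-side dedup store; in production this would be a database table
# or a cache (e.g. Redis) with a TTL, not an in-memory set.
_seen_keys = set()

def handle_post(idempotency_key, payload):
    """Process a POST once per idempotency key; repeated keys are ignored."""
    if idempotency_key in _seen_keys:
        return {"status": "duplicate", "created": False}
    _seen_keys.add(idempotency_key)
    # ... insert payload into the database here ...
    return {"status": "ok", "created": True}

# The client generates one key per user action (not per HTTP attempt),
# so browser-level retries of the same request reuse the same key.
key = str(uuid.uuid4())
print(handle_post(key, {"amount": 42}))  # first attempt creates the entry
print(handle_post(key, {"amount": 42}))  # retry is recognized and skipped
```

The key point is that the key identifies the user's action, so however many times the browser re-sends the request, only one database entry is created.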
Gathered info
URL_REQUEST event log from chrome://net-internals/help.html#events (headers are also available there)
Google Chrome dev tools request screenshots:
Headers tab
Timeline tab
I personally can't reproduce this even once, and I suppose a good internet connection is the reason.
I have googled a lot and even found some similar Chromium bugs, but nothing exactly about this problem.
Thank you in advance for any useful information.
I am also not quite sure which tags I should set for this question, so if I should add or remove any, please tell me.
When opening some web pages in Safari (on iOS; a CMS website hosted on an Apache server), it shows the following message.
I tried removing all scripts from the page, and it didn't work.
I checked the Apache access log, and none of the requests were logged.
I checked the Apache error log, and no errors are logged.
I've tried a lot of methods to figure it out (technically and logically). Has anyone experienced the same issue?
If none of your requests are being logged, then your client isn't getting through, and you have a problem outside of your application's scope (such as a network connectivity or firewall issue).
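To help distinguish a network/firewall problem from an application problem, a minimal reachability check can be run from the affected client. A sketch using Python's standard library (the host and port you test against are up to you):

```python
import socket

def check_tcp(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Run it against the server's hostname on ports 80 and 443; if it returns False from the affected device but True from elsewhere, the problem is in the network path rather than in Apache or the CMS.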