Uploaded file is not visible in the browser unless I force a no-cache reload - Apache

I am facing a weird issue with file uploads. When I upload a new file to a publicly visible folder, I can see it instantly in anonymous mode. But if I try to access it in non-anonymous mode, the server responds with 404 unless I do a hard refresh (i.e. Ctrl+F5 in Firefox).
I have already disabled cache-control headers for that folder in Apache, but that did not seem to resolve the issue. It seems to me that Apache is storing the information that "there is actually no file at the requested URL" and keeps serving it to the user unless the user clears the cache, even though the file has since been uploaded to that location. Has anyone run into a similar issue in the past?

By default, most browsers cache images, styles and scripts automatically. The easiest way to bypass this in development environments is to set the caching headers detailed here.
Another common way to bypass caching is to append a random query parameter to the URL (usually ?v=<random value here>).
Chromium-based browsers also have a "Disable cache" option in the dev tools.
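For example, a minimal cache-busting request from the page's own JavaScript could look like this (the v parameter name and the upload path are just placeholders):

// append a throwaway query string so the browser treats every request as a fresh URL
var url = "/uploads/report.pdf?v=" + Date.now();
fetch(url)
  .then(function (response) {
    if (!response.ok) throw new Error("HTTP " + response.status);
    return response.blob();
  })
  .then(function (blob) {
    console.log("fetched " + blob.size + " bytes, bypassing any cached 404");
  });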

Chrome extension that changes the wallpaper of a new tab - NOT in Chrome OS [duplicate]

I am not able to run my content script on the new tab page (where the tab is not assigned to any URL).
I looked at various posts on the subject, e.g. Does content script have access to newtab page?
and What is the URL of the google chrome new tab page and how to exclude it from manifest.json
which seem to suggest it is possible.
I enabled chrome://flags/#extensions-on-chrome-urls
I have:
"permissions": [
"http://*/*",
"https://*/*",
"chrome://*/*"
],
(also tried "*://*/_/chrome/newtab*")
Still no luck... what am I missing?
This answer, Can you access chrome:// pages from an extension?, mentions that "wildcards are not accepted". Is this true? And if so, how do I specify the newtab page?
The problem is that Chrome 61 and newer explicitly forbid access to the contents of the built-in new tab page (NTP) via content scripts or any other API.
The solution is to create the entire replacement page as an HTML file in your extension and specify it in chrome_url_overrides.
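A minimal manifest entry for that might look like the following (newtab.html is a placeholder name for whatever file you bundle as the replacement page):

"chrome_url_overrides": {
    "newtab": "newtab.html"
}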
As for why, here is rdevlin, one of the developers of the Chrome extensions API, quoted from [source]:
There's a few reasons for this change. One is to enforce policy,
the other is for consistency.
We've had a public policy for awhile now that states that modification of
the NTP through anything other than Chrome URL overrides isn't allowed (though
we didn't begin enforcing this policy in many cases until July 1st). This is
merely bringing chrome code more inline with that same policy to help prevent
surprise if an extension is modifying the NTP and is taken down for policy
violations.
This is also for consistency, since we've actually treated scripts on the NTP
differently for years now, due to certain NTP magic. For example, the URL seen
by the browser on the NTP is chrome://newtab, but the url in the renderer is
https://www.google.com/_/chrome/newtab. Since chrome.tabs.executeScript checks
the URL in the browser, the script would be denied, even though content scripts
(checked in the renderer) would be allowed. In theory, these permissions should
not be different. Similarly odd, if the user is using the local ntp
(chrome-search://local-ntp/local-ntp.html), injection would already be
disallowed in both the renderer and the browser. And, if we go waaaaay back,
the NTP used to be pure WebUI with an URL of chrome://newtab, where injections
were again disallowed. Rather than have inconsistent behavior depending on the
type of script injection the extension uses, we want to have consistency
throughout the system.
P.S. Please don't edit the quoted text.

Teams desktop client sometimes caches my tab application and I can't clear it

I've built a Microsoft Teams channel tab with SSO and I'm hosting the tab application which I've built with React via create-react-app.
The auth works well, and the app loads and runs.
But when I update my app on the web site, the Teams desktop client (Mac and PC) will sometimes cache the old app and will not pick up the changes. But then sometimes it will.
If I run the web client, it usually picks up the changes.
I've verified that I'm serving up new bundles with different names each time I update. But running the Teams desktop devtools I can see that Teams is asking for the old bundle, every time, so it's definitely caching the response from my app's URL.
I've read about the problems people have with the Teams desktop client caching SharePoint content and not picking up content changes. I've tried the cache-clearing techniques but they don't seem to work for this issue. And I can't reasonably have users do crazy cache clearing every time I make an update to the tab app.
What should I do? Some have suggested I need to update my version in the app manifest and redeploy to Teams -- that seems really brutal. Do I need to set some cache headers in a certain way to force the Teams client to pick up the new code?
Solution
Set a Cache-Control response header to no-cache (or must-revalidate) for your build/index.html.
Explanation
We had the exact same issue. It turns out it was because we cached our build/index.html.
According to the create-react-app docs, only the content of build/static/ can safely be cached, meaning build/index.html shouldn't be cached.
Why? Because files in build/static/ have uniquely hashed names and are therefore cache-busted on deployment. index.html is not.
What's happening is since Teams uses your old index.html, it tries to load the old /static/js/main.[hash].js defined in it, instead of your new JS bundle.
It works properly in the Teams web client because most browsers send a Cache-Control: max-age=0 request header when requesting your index.html, ignoring any cache set for the file. Teams desktop doesn't as of today.
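For instance, if you happen to serve the build from a small Node/Express server (just one possible setup; adapt the idea to whatever actually hosts your tab), the headers could be set roughly like this:

const express = require("express");
const path = require("path");

const app = express();

// hashed bundles under build/static/ are safe to cache for a long time
app.use("/static", express.static(path.join(__dirname, "build/static"), {
  maxAge: "1y",
  immutable: true
}));

// index.html must always be revalidated so clients pick up new bundle names
app.get("*", function (req, res) {
  res.set("Cache-Control", "no-cache");
  res.sendFile(path.join(__dirname, "build", "index.html"));
});

app.listen(3000);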
This seems like an issue with the way your app is managing the default browser caching logic. Are service workers enabled for your app? What cache control headers is your web server returning?
There are some great articles that describe all the cache controls available to you; for example:
https://medium.com/@codebyamir/a-web-developers-guide-to-browser-caching-cc41f3b73e7c
Have you tried doing something like this to prevent caching of your page? (Do note that, long term, you might want to use something like ETags, which is a more performant option.)
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control#preventing_caching
P.S. You can also follow the instructions here to open the dev tools in the Desktop Client to debug all this:
(How) can I open the dev tools in the Microsoft Teams desktop client?
And even force clear any cached data/resources for your app:

MS Edge: Opening the developer tools panel causes all http requests to occur twice

Using MS Edge and Apache with PHP, I just discovered via access.log that when I have the JavaScript debug panel (i.e. the developer panel) open, the browser makes every HTTP call twice. Closing this panel fixed the issue of all insert statements getting called twice.
Question: Does this doubling of HTTP calls happen in every (or most) browsers, so that I need to look out for it, or is this something special/unique to MS Edge?
I can't speak for all browsers and all developer tools. But for IE and Edge, the first time you open the tools and then open a JS file in the sources view, the browser will try to request the file again. That request will sometimes be served from the local browser cache and sometimes not, depending on the cache settings for the file being requested.
The reason the browser tools need to make this request is that browsers will often throw out the original source file, as they don't need it to execute the page once the source has been parsed into something else they can work with.
However, after you've opened the developer tools, the browser will keep sources around for future navigations, either in the tools front end or elsewhere. Not keeping sources is an optimization for the first-time use case, to save browsers from keeping source around on the very low odds of the tools being used on any given navigation.
Of course some files are never cached by the browser and will need to be downloaded when requested by the tools, for example source-mapped files.
In general, any resources on your site that can be accessed by HTTP GET should be idempotent. That is, a GET shouldn't change the resource being requested (or, more generally, the state of your site), so hopefully making additional requests shouldn't be an issue.
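As a rough illustration of that last point (the route names are invented, and Express is used here only as an example; the same principle applies to a PHP app behind Apache):

const express = require("express");
const app = express();
app.use(express.json());

const orders = [];

// GET only reads state, so an extra request from the dev tools is harmless
app.get("/orders", function (req, res) {
  res.json(orders);
});

// anything that inserts or modifies state goes through POST instead of GET,
// so a duplicated GET can never create a row twice
app.post("/orders", function (req, res) {
  orders.push(req.body);
  res.status(201).json(req.body);
});

app.listen(3000);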

Preventing direct access to files in IIS 7

I have a PHP application running on a Microsoft IIS 7 server. The application shows PDF files in an iframe; these files contain users' sensitive data, and I wouldn't like them to be directly accessible by anyone who knows the file address.
So basically, I'm looking for a way to protect the files from direct browser access or download, but still be able to show them in the application's iframe.
I did some research with rewrite rules, but since the "HTTP_REFERER" of an iframe is empty, I couldn't find a good solution.
Any suggestions for this?
Thanks in advance
Without seeing any of your code, or how your application works, I can only give suggestions based on how I think your app works.
Rather than showing the files themselves, with links directly to those files, you should consider changing your application so that the PHP reads the directory and displays the file names (however you want them to appear), with links that go to a download.php page. The download page (after checking whether the user has permission to download the file) then loads the file into memory and serves it out as the response (with appropriate Content-Disposition and Content-Type headers).
Since your PHP application can read the files directly within the web directory, you can set up rewrite rules to prevent accessing those files from the web; that way, the files can only be accessed by the PHP application, which doesn't rely on rewrite rules to access the drive.
This is how places like SourceForge can display an advertisement with a countdown before your file download begins.
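A minimal sketch of that gatekeeper pattern, written here in Node/Express purely for illustration (a PHP version would do the same permission check and then stream the file with readfile(); the file location and the hasAccess check are invented placeholders):

const express = require("express");
const path = require("path");

const app = express();

// hypothetical permission check; replace with your real session/ACL logic
function hasAccess(req, fileName) {
  return Boolean(req.headers["x-user-id"]);
}

// files live outside the web root, so they can't be fetched directly by URL
const PDF_DIR = path.join(__dirname, "protected-pdfs");

app.get("/download/:name", function (req, res) {
  const fileName = path.basename(req.params.name); // strip any path traversal
  if (!hasAccess(req, fileName)) {
    return res.status(403).send("Forbidden");
  }
  res.set("Content-Type", "application/pdf");
  res.set("Content-Disposition", 'inline; filename="' + fileName + '"');
  res.sendFile(path.join(PDF_DIR, fileName));
});

app.listen(3000);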

How do I communicate to the outside world from a Safari extension?

How would I let a running process know that a context menu has been clicked in Safari?
I've read that this is not possible due to security, but that seems wrong because 1Password somehow pulls all of the information from the desktop app's database into the Safari extension. I wrote the extension to display the context menu and was trying to send an XMLRPC request to localhost, but couldn't get it to work.
I'm not certain of this, but I think 1Password does what it does by having a background process (1PasswordAgent) constantly polling for certain changes in the extension's local database and/or config files. For example, to initially get your passwords into the extension, the extension could set a certain flag in its localStorage db, which would get written (by Safari, not by the extension) to a file. The agent would then notice the flag in the file and copy your passwords from the main 1Password database into the extension's local database. Similarly, when the extension creates a new password entry, the agent would notice the change in the extension's database and mirror it to the 1Password database.
Perhaps you could do something similar?
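If you want to experiment with that approach, the extension-side half might be as small as this (the key name is invented; the desktop process would have to watch Safari's persisted localStorage database for the change):

// record the context-menu click in the extension's localStorage;
// Safari persists this to disk, where a companion desktop process can poll for it
localStorage.setItem("contextMenuClicked", JSON.stringify({ at: Date.now() }));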
Although I have no idea about the implementation of 1Password, LiveReload achieves the same thing by using a WebSocket to connect to a localhost URL (handled by the application). If you do it from the global page, cross-domain limitations do not apply, so you are free to connect to any URL:
var ws = new WebSocket("ws://localhost:35729"); // any free local port works; 35729 happens to be LiveReload's default
...
(Be careful with that localhost thing, though, Chrome on Linux wants 0.0.0.0 instead of 127.0.0.1 or localhost. At least it used to want it.)