In our project, we follow a frequent deployment process. With every deployment, our clients have to refresh the browser to see the changes because of the browser cache. I want to know how we can avoid this situation, i.e., when we deploy, the changes should appear directly in all users' browsers without the need for a refresh. Please help me resolve this issue.
There's a solution called Hot Module Replacement (HMR). It seamlessly replaces modules in your webpack bundle (assuming you're already using webpack).
See What exactly is Hot Module Replacement in Webpack? for some references. HMR setup is quite complicated and differs for each framework.
webpack-dev-server also depends on webpack, but it's much simpler: it automatically pushes the refresh button for you.
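As a rough sketch, enabling HMR with webpack-dev-server might look like this (webpack 5 syntax assumed; this config is illustrative, not taken from the linked question):

// webpack.config.js
module.exports = {
  mode: 'development',
  devServer: {
    // apply module updates in place without a full page reload
    hot: true,
    // when an update can't be applied via HMR, webpack-dev-server
    // falls back to a full page refresh (liveReload is on by default)
  },
};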
You can prevent stale caches by cache busting from the front end.
You can append a version query string such as ?v={{version-number}} to your asset URLs, and then you can always be sure that the loaded version is the one you want.
There is a gulp plugin that takes care of versioning your files during the build phase, so you don't have to do it manually. It's handy and easy to integrate into your build process. Here's the link: gulp-annotate
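As a minimal sketch of the query-string idea (APP_VERSION is a hypothetical constant you would bump on each release):

// hypothetical version constant, updated on every deployment
var APP_VERSION = '1.4.2';

// append it to asset URLs so each release is fetched as a new resource
var script = document.createElement('script');
script.src = '/js/app.js?v=' + APP_VERSION;
document.head.appendChild(script);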
Consider updating the cache directives in your response headers to reduce or remove the duration allowed for browser caching.
To see what your website is currently using, open a page in Chrome with the developer tools ("Inspect") open and the "Network" tab selected. Then select a downloaded file to see its "Response Headers".
As an example, this SO page downloads a script file called ados.js with a "max-age" value of 604800 seconds (7 days). That is the amount of time the browser can wait before checking whether a newer version of the file is available from the web server. If the file almost never changes, then 7 days may be tolerable.
However, a max-age of 7 days may be too long of a duration for dynamic situations in which browser cached files are changed frequently.
Refer to this MDN page on HTTP caching for more information on this.
Changing the cache-control directives depends on your web server.
For ASP.NET, it can be done in the web.config file.
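For example, a minimal web.config sketch that sets a five-minute max-age on static content (the duration is illustrative):

<configuration>
  <system.webServer>
    <staticContent>
      <!-- cacheControlMaxAge is a timespan; 5 minutes here -->
      <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="00:05:00" />
    </staticContent>
  </system.webServer>
</configuration>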
One option to consider is a two phased approach for releases.
Phase 1: prior to the release, change the cache-control directives to no-cache for any cacheable file that is about to be updated by the pending release.
Then wait for the longest max-age duration currently set on those files. This allows the browsers of current website users to pick up the no-cache directives. At that point, all current browsers have been primed to pick up the updates immediately once the files are released.
Phase 2: release the file updates to the website with the desired max-age cache-control directive. These updates should be picked up immediately thanks to what was done in Phase 1.
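Concretely, the response header across the two phases might look like this (the seven-day value is just the earlier example):

Phase 1 (pre-release):  Cache-Control: no-cache
Phase 2 (at release):   Cache-Control: max-age=604800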
Another option is to shorten the current max-age value to something smaller, like 5 minutes. This option is simpler, but there may be problems during that 5-minute window after a release. However, the shorter time span reduces the likelihood and/or impact of stale browser-cached files.
I have a Blazor WASM application using .NET 6 and all of the latest NuGet packages, hosted using Azure App Services.
As I understand it, the browser caches the application DLLs, keyed by a SHA-256 hash appended to the filename. The browser keeps a DLL cached until it reads a new SHA-256 hash for it in blazor.boot.json, at which point it replaces the old DLL in the cache with the new one.
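For reference, the relevant part of blazor.boot.json looks roughly like this (the assembly name and hash below are made up for illustration):

{
  "resources": {
    "assembly": {
      "MyApp.dll": "sha256-47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU="
    }
  }
}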
As per my observations, this works most of the time. If I inspect my own browser's application cache using the dev tools, I see the same version of the DLL today, tomorrow, and the next day, as long as there haven't been any deployments. Then, after a deployment, I open the application in my browser, check the cache again, and without needing to manually clear anything or do anything out of the ordinary, I see the DLL with an updated SHA-256 hash.
However, every once in a while, this isn't the case. For example, I have a user who rarely uses the development environment. The version of the application DLL has been cached since August (two months ago). The contents of blazor.boot.json correctly indicate the new version of the DLL, but the old version, with a different SHA hash, remains in her cache.
I've seen this behavior before. I can't quite put my finger on what differs between the application changes that lead to successful automatic cache busting and those where the old cache persists.
The simple solution is to have the user delete the entire blazor-resources group of DLLs to force a new download. However, as we expand the footprint of the app to new users, I don't want to have to ask them to clear their application cache every time we deploy a new version of the code.
Given this example, it seems the failure is somewhere between the retrieval of the updated blazor.boot.json and whatever standard JavaScript runs to retrieve the new DLLs, which I believe is in service-worker.js.
Any assistance regarding the following would be appreciated:
An explanation of why this might be happening (so that I can reliably reproduce the issue and have a valid test case after it's been fixed)
What code/configuration I could use to force the DLL download, not on every refresh of the app, but at least when the SHA hash has changed.
Thanks,
Mike
I've built a Microsoft Teams channel tab with SSO and I'm hosting the tab application which I've built with React via create-react-app.
The auth works well, and the app loads and runs.
But when I update my app on the website, the Teams desktop client (Mac and PC) will sometimes cache the old app and not pick up the changes. But sometimes it will.
If I run the web client, it usually picks up the changes.
I've verified that I'm serving up new bundles with different names each time I update. But running the Teams desktop devtools I can see that Teams is asking for the old bundle, every time, so it's definitely caching the response from my app's URL.
I've read about the problems the Teams desktop client has with caching SharePoint content and not picking up content changes. I've tried the cache-clearing techniques, but they don't seem to work for this issue. And I can't reasonably have users do elaborate cache clearing every time I make an update to the tab app.
What should I do? Some have suggested I need to update my version in the app manifest and redeploy to Teams -- that seems really brutal. Do I need to set some cache headers in a certain way to force the Teams client to pick up the new code?
Solution
Set a Cache-Control response header to no-cache (or must-revalidate) for your build/index.html.
Explanation
We had the exact same issue. Turns out it was because we cached our build/index.html.
According to the create-react-app doc, only the content of build/static/ can safely be cached, meaning build/index.html shouldn't be cached.
Why? Because files in build/static/ have uniquely hashed names and are therefore cache-busted on deployment. index.html is not.
What's happening is that since Teams uses your old index.html, it tries to load the old /static/js/main.[hash].js referenced in it, instead of your new JS bundle.
It works properly in the Teams web client because most browsers send a Cache-Control: max-age=0 request header when requesting your index.html, ignoring any cache set for the file. Teams desktop doesn't as of today.
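For illustration only, here is a minimal sketch of serving a create-react-app build with those headers, assuming a Node/Express server (the answer above doesn't depend on any particular server):

const express = require('express');
const path = require('path');

const app = express();

// hashed files under build/static/ are safe to cache aggressively
app.use('/static', express.static(path.join(__dirname, 'build/static'), {
  maxAge: '1y',
  immutable: true,
}));

// index.html (and anything else unhashed) must always be revalidated
app.use(express.static(path.join(__dirname, 'build'), {
  setHeaders: (res) => res.set('Cache-Control', 'no-cache'),
}));

app.listen(3000);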
This seems like an issue with the way your app is managing the default browser caching logic. Are service workers enabled for your app? What cache control headers is your web server returning?
There are some great articles that describe all the cache controls available to you; for example:
https://medium.com/@codebyamir/a-web-developers-guide-to-browser-caching-cc41f3b73e7c
Have you tried doing something like this to prevent caching of your page? (Do note that long term you might want to use something like ETags, which is a more performant option.)
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Cache-Control#preventing_caching
P.S. You can also follow the instructions here to open the dev tools in the Desktop Client to debug all this:
(How) can I open the dev tools in the Microsoft Teams desktop client?
And even force clear any cached data/resources for your app:
I have a VueJS application that is deployed to a local IIS 10 server for intranet use.
Trouble is, the index.html file is getting cached, and a forced, manual clearing of the browser cache is needed to see updates. I understand there are ways on the server side to prevent this, but based on what I've read so far, I'm unclear on the proper way to make sure the HTML file isn't cached. (JS, CSS, and the like are, of course, not a problem, since they have a hash value appended to their file names during the build.)
I'm very much a novice when it comes to the server side of things, so any insight would be greatly appreciated. Thanks!
Which packaging tool do you use in your project? Generally speaking, webpack/Vue CLI has settings to prevent stale files from being served from the browser cache on the client side. In other words, it adds a hash to the output file names that identifies each build, which forces the client browser to request the new version of the file.
In webpack.config.js:
output: {
  filename: '[name].[contenthash].bundle.js'
}
https://webpack.js.org/guides/caching/
See these links for more details:
Browser cache problems with webpack chunks and vue.js components
how to force clearing cache in chrome when release new Vue app version
VueJS/browser caching production builds
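Since this question involves IIS specifically, a hedged sketch of a web.config override that keeps index.html itself from being cached might look like this (the path and header value are illustrative, not from the original answer):

<location path="index.html">
  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <!-- force browsers to revalidate index.html on every load -->
        <add name="Cache-Control" value="no-cache" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>
</location>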
Using MS Edge and Apache with PHP, I just discovered via access.log that when I have the JavaScript debug panel (i.e., developer panel) open, it makes every HTTP call twice. When I closed this panel, it fixed the issue of all insert statements being called twice.
Question: Does this doubling of HTTP calls happen on every/most browsers that I need to look out for, or is it something special/unique to MS Edge?
I can't speak for all browsers and all developer tools. But for IE and Edge, the first time you open the tools and then open a JS file in the sources view, the tools will try to request the file again. That request may or may not be served from the local browser cache, depending on the cache settings for the file being requested.
The reason the browser tools need to make this request is that browsers will often throw out the original source file, as it isn't needed to execute the page once the source has been parsed into something else the browser can work with.
However, after you've opened the developer tools, the browser will keep sources around for future navigations, either in the tools front end or elsewhere. Not keeping sources is an optimization for the first-use case, to save browsers from keeping source around given the very low odds of the tools being used on any given navigation.
Of course some files are never cached by the browser and will need to be downloaded when requested by the tools, for example sourcemapped files.
In general, any resource on your site that can be accessed by HTTP GET should be safe and idempotent. That is, a GET shouldn't change the resource being requested (or, more generally, the state of your site), so extra requests hopefully shouldn't be an issue.
Prior to migrating from Mobile Services to App Services, I could change Node.js APIs in real time. Now changes seem to take an indeterminate amount of time to go live. I don't know whether they're being compiled or cached anywhere along the way. Ideally I would like to regain the ability to make changes take effect immediately.
Technically, there is a file watcher that watches a subset of the files in your site; when you change one of those files, the site is meant to restart, thus making your change go live. This is configured in the web.config file, which is part of your site.
Make sure that the web.config is configured to watch the files you are interested in.
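As a sketch, assuming the site runs Node via iisnode, the watcher list lives on the iisnode element (the file patterns below are illustrative):

<configuration>
  <system.webServer>
    <!-- restart the site when any of these files change -->
    <iisnode watchedFiles="web.config;*.js;routes\*.js" />
  </system.webServer>
</configuration>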
Manually restarting the site is an effective backup step.