browser caching feature vs asp.net caching feature - browser-cache

By default, browsers will cache static files like image, JS, and CSS files, and they also cache HTTP GET requests. If this feature is already there, then why do we need the ASP.NET output caching feature?
Thanks.

ASP.NET caching is for caching the output sent to multiple clients; the browser cache is a single client caching for itself.
ASP.NET caching can cache individual parts of a larger output and just change the bits that are required to service a particular client, e.g. changing the greeting at the top of the page, or making the "Top sellers" region relative.
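As a rough sketch of that idea (classic ASP.NET MVC is assumed here, and the _catalog service is hypothetical), the shared "Top sellers" region can be rendered by a child action whose output is cached on the server and reused for every client, while the personalised greeting is generated on every request:
public class HomeController : Controller
{
    private readonly ICatalogService _catalog; // hypothetical service used for illustration

    public HomeController(ICatalogService catalog)
    {
        _catalog = catalog;
    }

    // Not cached: the greeting is personalised, so it is rebuilt for each request.
    public ActionResult Index()
    {
        ViewBag.Greeting = "Hello, " + User.Identity.Name;
        return View();
    }

    // Cached once on the server and reused for every client for 10 minutes.
    [ChildActionOnly]
    [OutputCache(Duration = 600)]
    public ActionResult TopSellers()
    {
        return PartialView(_catalog.GetTopSellers());
    }
}
In the Index view the cached region is pulled in with @Html.Action("TopSellers"), so only the uncached, per-client parts of the page are regenerated; the browser cache cannot do this kind of partial reuse.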

Related

PWA Caching Issue

I have a PWA which has been developed in ASP.net Core and hosted on an Azure App Service (Linux).
When a new version of the PWA was released, I found that devices failed to update without clearing the browser cache.
To resolve this, I discovered a tag helper called asp-append-version that will clear the cache for a specific file. I also discovered that I can append a version to the src attribute that specifies the URL of a file to trigger the browser to retrieve the latest file. For example, src="/scripts/pwa.js?v=1". Each time I update the pwa.js file I would also change the version, i.e. v=2.
I've now discovered that my PWA is caching other JavaScript files in my application, which results in the app not working on devices that have been updated to the new version but failed to clear the cache for specific files.
I believed that if I didn't specify any cache control headers such as Cache-Control, the browser would not cache any files; however, this appears not to be the case.
To resolve this issue, is the recommended approach to add the appropriate cache control headers (Cache-Control, Pragma, and Expires) to prevent browser caching, or should I only add the asp-append-version tag helper to, for example, script tags to automatically clear the cache for that specific file?
I would prefer the browser to store, for example, images rather than going to the server each time to retrieve them. I believe setting the header Cache-Control: no-cache would work, as this would check whether the file has changed before retrieving the updated version?
Thanks.
Thanks @SteveSandersonMS for your insights. If your web server returns correct HTTP cache control headers, browsers will know not to re-use cached resources.
Refer to link 1 & link 2 for cache control headers on a Linux App Service.
For example, if you use the "ASP.NET Core hosted" version of the Blazor WebAssembly template, the server will return Cache-Control: no-cache headers, which means the browser will always check with the server whether updated content is present (this uses ETags, so the server will return 304, meaning "keep using your cached content", if nothing has changed since the browser last updated its content).
If you use a different web server or service, you need to configure the web server to return correct caching headers. Blazor WebAssembly can't control or even influence that.
Refer here
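If you are hosting static files yourself in ASP.NET Core rather than relying on the template defaults, one way to get that behaviour is to set the headers when the static file middleware serves each file. A minimal sketch, assuming the extension checks and max-age values below suit your app:
app.UseStaticFiles(new StaticFileOptions
{
    OnPrepareResponse = ctx =>
    {
        var name = ctx.File.Name;

        // Let images be reused straight from the browser cache for a week.
        if (name.EndsWith(".png") || name.EndsWith(".jpg") || name.EndsWith(".svg"))
        {
            ctx.Context.Response.Headers["Cache-Control"] = "public, max-age=604800";
        }
        else
        {
            // Everything else (e.g. scripts) must be revalidated with the server,
            // which can answer 304 Not Modified when the file is unchanged.
            ctx.Context.Response.Headers["Cache-Control"] = "no-cache";
        }
    }
});
With no-cache the browser still stores the file but asks the server whether it changed before reusing it, which matches the image-versus-script behaviour asked about above; asp-append-version remains useful for busting long-lived caches on individual files.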

dojo app - caching static content such as imgs/css

I'm working on an existing Dojo application which is requesting the same static content on every page change.
Is there a way to configure the application so this content is cached, i.e. so every HTTP response has cache control headers?
This is not related to Dojo or any other JavaScript framework.
You can define the cache control configuration either in the content server (Apache / Nginx) or in the application server (if the content is created by the backend server).

ResponseCache attribute does not cache data on client side

In an ASP.NET Core application I have an action method that returns some data. I want to cache this data on the client side. Based on the documentation here, I can use the ResponseCache attribute on the action method. This attribute adds the Cache-Control header to the response:
Response caching refers to specifying cache-related headers on HTTP
responses made by ASP.NET Core MVC actions. These headers specify how
you want client and intermediate (proxy) machines to cache responses
to certain requests (if at all). This can reduce the number of
requests a client or proxy makes to the web server, since future
requests for the same action may be served from the client or proxy’s
cache.
also
Response caching does not cache responses on the web server. It
differs from output caching, which would cache responses in memory on
the server in earlier versions of ASP.NET and ASP.NET MVC.
So this is how my action method looks
public class LookupController : Controller
{
    [HttpGet]
    [ResponseCache(Duration = 120)]
    public IEnumerable<StateProvinceLookupModel> GetStateProvinces()
    {
        return _domain.GetStateProvinces();
    }
}
Then I call the method from the browser as http://localhost:40004/lookup/getstateprovinces
Here are the request and response headers
Notice that the response headers include Cache-Control: public,max-age=120 as expected.
However, if I refresh the page using F5 (before 120 seconds), the debugger breakpoint inside the GetStateProvinces action method always hits. That means it's not caching the data on the client side.
Is there anything else I need to do to enable client-side caching?
Update
I have tried using IE, Chrome, and also Postman with no luck. Every time I type the URL in the address bar or hit refresh, the client (that is, the browser or Postman) makes a call to the action method.
Actually, the ResponseCache attribute works as intended.
The difference is that the response is cached if you navigate through your website pages (case 1), or use back and forward buttons (not when refreshing the page).
As an example of case 1, I have the following:
you're on page http://localhost:65060/Home/Index
type another URL and press Enter, or click a link in your webpage: http://localhost:65060/Home/Users
type the URL http://localhost:65060/Home/Index again (you will see that this time the response for this URL is fetched from the disk cache)
As you will see in the article Response Caching in ASP.Net Core 1.1, the following is stated:
During a browser session, browsing multiple pages within the website or using the back and forward buttons to visit the pages, content will be served from the local browser cache (if not expired).
But when the page is refreshed via F5, the request will go to the server and the page content will get refreshed. You can verify it by refreshing the contact page using F5.
So when you hit F5, the response caching expiration value plays no role in serving the content. You should see a 200 response for the contact request.
References:
[1] ASP.NET Core Response Caching Sample
[2] ResponseCache attribute sample
[3] How to control web page caching, across all browsers?
Long story short, using the ResponseCache attribute like the following is sufficient to get expiration-based client-side caching to work in a brand new, default dotnet core project (including async methods):
[HttpGet]
[ResponseCache(Duration = 120)]
public IEnumerable<StateProvinceLookupModel> GetStateProvinces()
{
    return _domain.GetStateProvinces();
}
This is working correctly in the screenshot above, as the Cache-Control: public,max-age=120 is visible there. In most cases, browsers won't send subsequent requests before the expiration (i.e. for the next 120 seconds or 2 minutes), but this is a decision of the browser (or other client).
If the request is sent regardless, you either have some middleware or server configuration overwriting your response headers, or your client ignores the caching directive. In the screenshot above, the client ignores caching, because the cache control header is there.
Common cases where the client cache is ignored and the request is sent:
Chrome prevents any kind of caching when using HTTPS without a certificate (or with an invalid certificate); this is common for local development, so make sure to use HTTP when testing your cache, or trust a self-signed cert
Most browser dev tools disable caching by default while they are open; this can be turned off
Refreshing the page (e.g. F5) makes most browsers send the request regardless of age; browsers usually send additional headers, and Chrome sends Cache-Control: max-age=0 (this is visible in your screenshot)
Force refreshing (i.e. Ctrl+F5) instructs most browsers to bypass their cache entirely; Chrome sends Cache-Control: no-cache
Postman sends the Cache-Control: no-cache header, which makes it bypass the local cache, resulting in requests being sent; you can disable this in the settings dialog, in which case requests will no longer be sent with the above client cache configuration
At this point we are beyond expiration-based client caching: the server will receive the request one way or another, and another layer of caching can occur. You may make the server respond with a 304 Not Modified code (which is again up to the client to interpret however it wants), use a server-side cache and respond with the full content, or skip any further caching and simply process the entire request again on the server.
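As a hypothetical sketch of the 304 path (this is not the original poster's code; the IStateProvinceDomain dependency and the version token are illustrative assumptions), the action can compare the client's If-None-Match header against an ETag for the current data and skip the body when they match:
public class LookupController : Controller
{
    private readonly IStateProvinceDomain _domain; // hypothetical dependency, as in the question

    public LookupController(IStateProvinceDomain domain)
    {
        _domain = domain;
    }

    [HttpGet]
    [ResponseCache(Duration = 120)]
    public IActionResult GetStateProvinces()
    {
        var etag = "\"stateprovinces-v1\""; // illustrative version token

        // The client already holds the current version: answer 304 with no body.
        if (Request.Headers["If-None-Match"] == etag)
        {
            return StatusCode(StatusCodes.Status304NotModified);
        }

        Response.Headers["ETag"] = etag;
        return Ok(_domain.GetStateProvinces());
    }
}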
Note: the ResponseCache attribute is not to be confused with the services.AddResponseCaching() & app.UseResponseCaching() middleware in the startup configuration, because that is for server-side caching (which by default uses an in-memory cache when the middleware is used). The middleware is not required for client-side caching to work; the attribute by itself is enough.
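For completeness, a minimal sketch of what that server-side registration looks like in an ASP.NET Core 1.1-era Startup class (unrelated configuration omitted):
public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddResponseCaching(); // server-side, in-memory response cache
        services.AddMvc();
    }

    public void Configure(IApplicationBuilder app)
    {
        // Register before MVC so cached responses can be served
        // without re-running the action.
        app.UseResponseCaching();
        app.UseMvc();
    }
}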
First of all, I want to clarify a few things that I am sure you already know.
ResponseCache is not the same as OutputCache in any way.
As I understand it, ResponseCache only sets headers; it does not cache anything on the server side.
Now, if you want caching similar to OutputCache, you might have to use the 1.1 preview release that was just released.
ASP.net core 1.1 preview release
https://blogs.msdn.microsoft.com/webdev/2016/10/25/announcing-asp-net-core-1-1-preview-1/
They introduced new Response Caching Middleware (see: Response Caching Middleware).
A demo of it is available here: https://github.com/aspnet/ResponseCaching/blob/dev/samples/ResponseCachingSample/Startup.cs

Setting up the dispatcher and CDN integration with AEM 6.x - Steps and Best Practices?

We need to set up a new AEM 6.x project that, in production, makes use of the benefits of a CDN (like Akamai) and a dispatcher module within an Apache HTTP Web Server.
So this question is about where to begin and what steps are involved. Also, what best practices should be taken into consideration?
It entirely depends on how you want to configure your systems; both the dispatcher and the CDN cache have their own best practices outlined in documentation available online.
There are two types of setup I have seen so far -
Cache everything on dispatcher as well as CDN
Cache everything on dispatcher but do not cache HTML on CDN (so effectively you are caching images, CSS, JS but no HTML)
Cache everything on dispatcher as well as CDN
After the first hit, everything gets cached
Simple setup
Cache cleanup is complex; you will need your own logic, tied to the dispatcher flush, to also flush the CDN cache. Refer to the Akamai Connector
There are complexities around related-content flushes: when publishing content from author to publish, AEM identifies the related content and sends activations for it as well. The same needs to happen for the CDN flush.
A complete flush of the CDN cache is not an option; it takes a lot of time to complete.
Not caching HTML on CDN
Has all the advantages of the above approach
For libraries and image assets, implement selector-based versioning (AEM ACS Commons provides this for client libraries; you could implement your own logic for an assets URL rewriter that adds the last-modified date as a selector to the asset call, with your rendering servlet taking care of selector handling)
With proper Expires headers set on assets and client libraries, you will not have to worry about explicit CDN cache management
Pages, when activated with new assets and/or libraries, will refer to the updated selectors and get cached on the dispatcher. When a call is made to that page, the CDN caches the libraries and assets, and the page refers to the CDN version of them. Assets and libraries are independent and are reflected independently of the pages.
Based on the TTL, outdated resources get cleared from the CDN
There may be additional steps required to get the above working; what I have outlined is the high-level approach. You will need to follow the security, SSL, domain modeling, and other configuration guidelines as specified in the dispatcher documentation and CDN setup. For a few of these you could refer to the Akamai blog here

jsessionid overridden with tomcat and web-logic as backend servers

For a web application, we depend on a CMS deployed on WebLogic and a web app deployed on Tomcat. When a user accesses a page, dynamic content is rendered from Tomcat (sticky sessions are enabled) and static content (JS, CSS, etc.) is rendered from the CMS (on WebLogic). This is leading to a conflict on the JSESSIONID cookie. The WebLogic JSESSIONID is overriding the Tomcat JSESSIONID, and the user is losing the contents saved in the session when moving to and from various parts of the site.
The request flow is as below
[1]: http://i.stack.imgur.com/17Ft5.png
As a band-aid, we wrote a rule on the load balancer to drop JSESSIONID from all responses coming from the CMS.
Though it worked, we are looking for a better way to handle this.
Why is your CMS setting a cookie? Does it need sessions to serve those files?
Usually static files do not need a session. One should allow them to be cached on proxies and on the client.
Configure your CMS appropriately. If it is a web application, you may add a Filter that removes the Set-Cookie header from its responses (like you are doing on your LB).
It is possible to change the name of a session cookie. This is configurable using <session-config>/<cookie-config>/<name> element in web.xml in web applications that adhere to Servlet 3.0 (or later) specification.
(It is also configurable via the sessionCookieName attribute on the Context element in META-INF/context.xml, but using web.xml is the recommended way.)
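For example, on the Tomcat side the web.xml entry would look roughly like this (the cookie name APPSESSIONID is just an illustrative choice):
<session-config>
    <cookie-config>
        <name>APPSESSIONID</name>
    </cookie-config>
</session-config>
With a distinct name, the Tomcat session cookie no longer collides with WebLogic's JSESSIONID.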
Note that cookies can have a Path attribute. A browser won't send a cookie if its Path does not match the URL of the request. Cookies with Path=/web and Path=/content can happily co-exist.
Tomcat supports requests that have several JSESSIONID cookies. It just chooses the one that matches an existing session. All the others are ignored.