Some of our links are wrapped by PJAX. When a user clicks a PJAX link, the server returns only the required part of the HTML.
If I do the following:
Click a PJAX link
Click a regular link
Press the back button
the browser will display the content that was returned by the PJAX request. The HTML will be broken, because it is only the part of the HTML to be displayed (check this question).
We tried to fix this by not caching PJAX responses (via the Cache-Control header). This fixed our problem but raised another one:
When the user presses the back button, WebKit (Chrome 20.0) loads the full content from the server, then fires a popstate event that causes an unnecessary PJAX request.
Is it possible to recreate the correct back button behaviour?
To make the browser aware of the different versions of an HTTP resource depending on the request headers, I added a Vary HTTP header.
With Vary, you no longer need to send no-cache headers, so your pages load fast again.
In PHP this would look like:
header("Vary: X-PJAX");
Since we sometimes serve three representations per URL (regular HTTP, PJAX, and AJAX) - because we are migrating to a PJAX approach in an already partially AJAXified app - we actually use:
header("Vary: X-PJAX,X-Requested-With");
In case you need to support old IE versions (older than IE9), make sure the Vary header is stripped by your webserver, because otherwise old IE will disable caching for all resources that send a Vary header.
This can be achieved with the following setting in your .htaccess/vhost config:
BrowserMatch "MSIE" force-no-vary
Edit:
Underlying Chrome bug: https://code.google.com/p/chromium/issues/detail?id=94369
This all depends on the server's caching settings. Your browser caches the AJAX response from the server, and when you click the Back button it uses the cached version.
To prevent caching, set the following headers on the server:
'Cache-Control' => 'no-cache, no-store, max-age=0, must-revalidate'
'Pragma' => 'no-cache'
If you are using Rails, then definitely try Wiselinks https://github.com/igor-alexandrov/wiselinks. It is a Swiss Army knife for browser state management. Here are some details: http://igor-alexandrov.github.io/blog/2013/07/11/the-way-to-wiselinks-1-dot-0/.
Related
I have a PWA that was developed in ASP.NET Core and is hosted on an Azure App Service (Linux).
When a new version of the PWA was released, I found that devices failed to update without clearing the browser cache.
To resolve this, I discovered a tag helper called asp-append-version that busts the cache for a specific file. I also discovered that I can append a version to the src attribute that specifies the URL of a file, to trigger the browser to retrieve the latest file. For example, src="/scripts/pwa.js?v=1". Each time I update the pwa.js file, I also change the version, i.e. v=2.
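For reference, the tag helper usage looks roughly like this (a minimal sketch based on my pwa.js example; the helper appends a version query string computed from a hash of the file contents, so the URL changes automatically whenever the file changes):

<script src="~/scripts/pwa.js" asp-append-version="true"></script>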
I've now discovered that my PWA is caching other JavaScript files in my application, which results in the app not working on devices that have updated to the new version but failed to clear the cache for specific files.
I believed that if I didn't specify any cache control headers such as Cache-Control, the browser would not cache any files; however, this appears not to be the case.
To resolve this issue, is the recommended approach to add the appropriate cache headers (Cache-Control, Pragma, and Expires) to prevent browser caching, or should I only add the asp-append-version tag helper to, for example, script tags to bust the cache for those specific files?
I would prefer the browser to store, for example, images rather than going to the server each time to retrieve them. I believe setting the header Cache-Control: no-cache would work, as this would check whether the file has changed before retrieving the updated version?
Thanks.
Thanks @SteveSandersonMS for your insights. If your web server returns correct HTTP cache control headers, browsers will know not to re-use cached resources.
Refer to link 1 & link 2 for cache control headers on a Linux App Service.
For example, if you use the "ASP.NET Core hosted" version of the Blazor WebAssembly template, the server will return Cache-Control: no-cache headers, which means the browser will always check with the server whether updated content is present. (This uses ETags, so the server will return 304, meaning "keep using your cached content", if nothing has changed since the browser last updated its content.)
If you use a different web server or service, you need to configure the web server to return correct caching headers. Blazor WebAssembly can't control or even influence that.
Refer here
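To make that concrete, in ASP.NET Core itself the caching headers for static files can be set via the static files middleware. A minimal sketch (the split between .js files and other assets is an illustrative assumption, not the asker's actual setup):

app.UseStaticFiles(new StaticFileOptions
{
    OnPrepareResponse = ctx =>
    {
        var headers = ctx.Context.Response.Headers;
        if (ctx.File.Name.EndsWith(".js", StringComparison.OrdinalIgnoreCase))
        {
            // no-cache: the browser may store the file but must revalidate it
            // (via ETag / If-None-Match) before reusing it.
            headers["Cache-Control"] = "no-cache";
        }
        else
        {
            // Other static assets (e.g. images): reuse for 30 days without asking.
            headers["Cache-Control"] = "public,max-age=2592000";
        }
    }
});

With something like this, images are reused from the browser cache, while scripts trigger a conditional request that returns 304 Not Modified when unchanged.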
In an ASP.NET Core application I have an action method that returns some data. I wanted to cache this data on the client side. Based on the documentation here, I can use the ResponseCache attribute on the action method. This attribute adds a Cache-Control header to the response.
Response caching refers to specifying cache-related headers on HTTP responses made by ASP.NET Core MVC actions. These headers specify how you want client and intermediate (proxy) machines to cache responses to certain requests (if at all). This can reduce the number of requests a client or proxy makes to the web server, since future requests for the same action may be served from the client or proxy's cache.
also
Response caching does not cache responses on the web server. It differs from output caching, which would cache responses in memory on the server in earlier versions of ASP.NET and ASP.NET MVC.
So this is how my action method looks
public class LookupController : Controller
{
    [HttpGet]
    [ResponseCache(Duration = 120)]
    public IEnumerable<StateProvinceLookupModel> GetStateProvinces()
    {
        return _domain.GetStateProvinces();
    }
}
Then I call the method from the browser at http://localhost:40004/lookup/getstateprovinces
Here are the request and response headers:
Notice that the response headers include Cache-Control: public,max-age=120, as expected.
However, if I refresh the page using F5 (before 120 seconds have passed), the debugger breakpoint inside the GetStateProvinces action method always hits. That means it is not caching the data on the client side.
Is there anything else I need to do to enable client-side caching?
Update
I have tried IE, Chrome, and also Postman, with no luck. Every time I type the URL in the address bar or hit refresh, the client (that is, the browser or Postman) makes a call to the action method.
Actually, the ResponseCache attribute works as intended.
The difference is that the response is cached when you navigate through your website's pages (case 1) or use the back and forward buttons, but not when refreshing the page.
As an example of case 1, I have the following:
you're on page http://localhost:65060/Home/Index
type another URL and press Enter, or click a link in your webpage: http://localhost:65060/Home/Users
type the URL http://localhost:65060/Home/Index again (you will see that this time the response for this URL gets fetched from the disk cache)
As you will see in the article Response Caching in ASP.Net Core 1.1, the following is stated:
During a browser session, browsing multiple pages within the website or using back and forward button to visit the pages, content will be served from the local browser cache (if not expired).
But when the page is refreshed via F5, the request will go to the server and the page content will get refreshed. You can verify it by refreshing the contact page using F5.
So when you hit F5, the response caching expiration value plays no role in serving the content. You should see a 200 response for the contact request.
References:
[1] ASP.NET Core Response Caching Sample
[2] ResponseCache attribute sample
[3] How to control web page caching, across all browsers?
Long story short, using the ResponseCache attribute as follows is sufficient to get expiration-based client-side caching working in a brand new, default .NET Core project (including async methods; a sketch of an async variant follows the code below):
[HttpGet]
[ResponseCache(Duration = 120)]
public IEnumerable<StateProvinceLookupModel> GetStateProvinces()
{
    return _domain.GetStateProvinces();
}
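As an aside on the "(including async methods)" point, an async action behaves the same way. A sketch, assuming an async counterpart exists on the domain service (GetStateProvincesAsync is hypothetical):

[HttpGet]
[ResponseCache(Duration = 120)]
public async Task<IEnumerable<StateProvinceLookupModel>> GetStateProvinces()
{
    // The Cache-Control header is applied to the response exactly as in the sync case.
    return await _domain.GetStateProvincesAsync();
}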
This is working correctly in the screenshot above, as Cache-Control: public,max-age=120 is visible there. In most cases, browsers won't send subsequent requests before expiration (i.e. for the next 120 seconds), but this is a decision of the browser (or other client).
If the request is sent regardless, you either have some middleware or server configuration overwriting your response headers, or your client is ignoring the caching directive. In the screenshot above, the client is ignoring the cache, because the cache control header is correctly present.
Common cases where the client cache is ignored and the request is sent anyway:
Chrome prevents any kind of caching when using HTTPS without a valid certificate (common in local development, so make sure to test your cache over HTTP, or trust a self-signed cert).
Most browser dev tools disable caching by default while open (this can be turned off); with caching disabled, Chrome sends a Cache-Control: no-cache request header.
Refreshing directly (e.g. F5 or Ctrl+F5) instructs most browsers to skip the cache and make the request regardless of age; on refresh, Chrome sends a Cache-Control: max-age=0 request header (this is visible in your screenshot).
Postman sends a Cache-Control: no-cache header, which makes it bypass the local cache and send the request; you can disable it in the settings dialog, after which requests will no longer be sent with the above client cache configuration.
At this point we are beyond expiration-based client caching: the server will receive the request one way or another, and another layer of caching comes into play. You may make the server respond with a 304 Not Modified code (which is again up to the client to interpret however it wants), use a server-side cache and respond with the full content, or skip any further caching and simply process the entire request again on the server.
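To illustrate that last layer, here is a minimal, hypothetical sketch of conditional request handling in a controller action (ComputeETag is an assumed helper that derives a stable hash from the data; it is not part of the original question):

[HttpGet]
public IActionResult GetStateProvinces()
{
    var data = _domain.GetStateProvinces();
    var etag = ComputeETag(data); // hypothetical helper

    // If the client sent back the same ETag, tell it to keep its cached copy.
    if (Request.Headers["If-None-Match"] == etag)
    {
        return StatusCode(StatusCodes.Status304NotModified);
    }

    Response.Headers["ETag"] = etag;
    return Ok(data);
}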
Note: the ResponseCache attribute is not to be confused with the services.AddResponseCaching() and app.UseResponseCaching() middleware in the startup configuration, because that is for server-side caching (which, when using the middleware, defaults to an in-memory cache). The middleware is not required for client-side caching to work; the attribute by itself is enough.
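For contrast, a minimal sketch of wiring up that server-side middleware (assuming a 1.1-era Startup class; again, not needed for the attribute alone):

public void ConfigureServices(IServiceCollection services)
{
    // Registers the server-side response cache (in-memory by default).
    services.AddResponseCaching();
    services.AddMvc();
}

public void Configure(IApplicationBuilder app)
{
    // Serves eligible responses from the server-side cache.
    app.UseResponseCaching();
    app.UseMvc();
}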
First of all, I want to clarify a few things that I am sure you already know.
ResponseCache is not the same as OutputCache.
As I understand it, ResponseCache sets response headers, but it does not cache anything on the server side.
Now, if you want caching like OutputCache, you might have to use the 1.1 preview release that was just released:
ASP.NET Core 1.1 preview release
https://blogs.msdn.microsoft.com/webdev/2016/10/25/announcing-asp-net-core-1-1-preview-1/
It introduces new Response Caching Middleware.
A demo of it is available here: https://github.com/aspnet/ResponseCaching/blob/dev/samples/ResponseCachingSample/Startup.cs
Environment:
Firefox with the Firebug and JSONView add-ons installed and verified to work,
A web service that claims to send the response with "application/json" as the MIME type; the service URL does NOT end in ".json".
Problem: Firefox puts up the open/save dialog instead of letting JSONView display the content.
Question: How do I see the response headers/content in Firebug for attachments or other responses that otherwise trigger an open/save dialog?
Related question: Is it possible to see headers/content in Firebug for responses that are targeted for popup windows?
Would like to avoid installing other plugins or network monitors if at all possible. Any help is greatly appreciated.
I've noticed that Firebug needs to be "refreshed" when you want it to display a request with Content-Disposition: attachment:
Try switching to another Firefox tab, where Firebug is not activated, and then switch back to your current tab. Firebug will be refreshed and will then also show the previously invisible request in its Net panel.
EDIT
This is now known as issue 7264.
I can do this in FF and IE, and I know it doesn't exist in Chrome yet. Anybody know if you can do this in a Safari plugin? I can't find anything that says one way or another in the documentation.
Edit (November 2021): as pointed out in the comments, ParosProxy seems to no longer exist (and was last released ~2006 from what I can see). There are more modern options for debugging on Mac (outside of browser plugins on non-Safari browsers) like Proxyman. Rather than adding another list of links that might expire, I'll instead advise people to search for "debugging proxy" on their platform of choice instead.
Original Answer (2012):
The Safari "Develop" menu in advanced preferences allows you to partially customize headers (like the user agent), but it is quite limited.
However, if a particular browser or app does not allow you to alter the headers, just take it out of the equation. You can use things like Fiddler or ParosProxy (and many others) to alter the requests regardless of the application sending the request.
They also have the advantage of allowing you to make sure that you are sending the same headers regardless of the application in question and (depending on your requirements) potentially work across multiple browsers and apps without modification.
Safari has added extension support, but its APIs don't give you granular control over requests & responses compared to Chrome/Firefox/Edge.
To have granular control over your requests and responses, you need to set up a system-wide proxy instead.
The Requestly Desktop App automatically does this for you, and on top of that you can make various types of modifications too, like:
Modify Request/Response Headers
Redirect URLs
Modify Response
Delay Network request
Insert Custom Scripts
Change User-Agent
Here's an article about header modification using Requestly:
https://requestly.io/feature/modify-request-response-headers/
Disclaimer: I work at Requestly
I'm working on a webserver that I didn't totally set up and I'm trying to figure out which parts of a web page are being sent encrypted and which aren't. Firefox tells me that parts of the page are encrypted, but I want to know what, specifically, is encrypted.
The problem is not always bad links in your page.
If you link to resources at an external site using https://, and the external site then does its own HTTP redirect to non-SSL pages, that will break the SSL lock on your page.
BUT when you view the source or the information in the Media tab, you will not see any http://, because your page properly uses only https:// links.
As suggested above, the Firebug Net tab will show this and any other problems. Follow these steps:
Install the Firebug add-on in Firefox if you don't already have it, and restart FF when prompted.
Open Firebug (F12, or the little insect menu to the right of your search box).
In Firebug, choose the "Net" tab. Hit "Enable" (text link) to turn it on.
Refresh your problem page without using the cache by hitting Ctrl-Shift-R (or Command-Shift-R in OSX). You will see the "Net" tab in Firefox fill up with a list of each HTTP request made.
Once the page is done loading, hover your mouse over the left column of each HTTP request shown in the Net tab. A tooltip will appear showing you the actual link used. It will be easy to spot any that are http:// instead of https://.
If any of your links resulted in an HTTP redirect, you will see "301 Moved Permanently" in the HTTP status column, and another HTTP request just below it for the new location. If the problem was due to an external redirect, that's where the evidence will be: the new location's request will be http://.
Expand any of those 301 relocations with the plus sign at the left and review the response headers to see what is going on. The Location: header will tell you the new location the external server is asking browsers to use.
Make note of this info in the redirect, then send a friendly, polite email to the external site in question and ask them to remove the https:// -> http:// redirects for you. Explain how it's breaking the SSL on your site, and ideally include a link to the broken page if possible, so that they can see the error for themselves. (This will spur faster action than if you just tell them about the error.)
Here is sample output from Firebug for the external redirect issue. In my case I found a page calling https:// data feeds that was getting the feeds rewritten by the external server to http://.
I've renamed my site to "mysite.example.com" and the external site to "external.example.com", but otherwise left the headers intact. The request headers are shown at the bottom, below the response headers. Note that I'm requesting an https:// link from my site, but getting redirected to an http:// link, which is what was breaking my SSL lock:
Response Headers
Server nginx/0.8.54
Date Fri, 07 Oct 2011 17:35:16 GMT
Content-Type text/html
Content-Length 185
Connection keep-alive
Location http://external.example.com/embed/?key=t6Qu2&width=940&height=300&interval=week&baseAtZero=false
Request Headers
Host external.example.com
User-Agent Mozilla/5.0 (Windows NT 6.1; WOW64; rv:7.0.1) Gecko/20100101 Firefox/7.0.1
Accept */*
Accept-Language en-gb,en;q=0.5
Accept-Encoding gzip, deflate
Accept-Charset ISO-8859-1,utf-8;q=0.7,*;q=0.7
Connection keep-alive
Referer https://mysite.example.com/real-time-data
Cookie JSESSIONID=B33FF1C1F1B732E7F05A547A9CB76ED3
Pragma no-cache
Cache-Control no-cache
So, the important thing to note is that in the Response Headers (above), the Location: starts with http://, not https://. Your browser takes this into account when determining whether the lock is valid, and reports only partially encrypted content! (This is actually an important browser security feature that alerts users to potential XSRF and/or phishing attacks.)
The solution in this case is not something you can fix on your site: you have to ask the external site to stop redirecting to http. Often the redirect was set up on their side for convenience, without realizing this consequence, and a well-written, polite email can get it fixed.
For each element loaded in the page, check its scheme:
If it starts with https://, it is encrypted.
If it starts with http://, it is not encrypted.
(You can see a relatively complete list in Firefox by right-clicking on the page and selecting "View Page Info", then the "Media" tab.)
EDIT: FF only shows images and multimedia elements. There are also JavaScript and CSS files which have to be checked. Firebug is a good tool to find what you need.
Some elements may not list http or https; in this case, whichever scheme was used for the page will be used for those items, i.e. if the page request is under SSL then these images will come encrypted, while if the page request is not under SSL then they will come unencrypted. Fiddler in Internet Explorer may also be useful in tracking down some of this information.
Sniff the packets - that'll tell you really quickly. Wireshark is a good program for such a task.
Can Firebug do this?
Edit: Looks like Firebug will also do this using the "Net" panel, which also gives you some other interesting statistics.
The best tool I have found for detecting http links on a https connection is Fiddler. It's also great for many other troubleshooting efforts.
I use the FF plugin HTTPFox for this.
https://addons.mozilla.org/en-us/firefox/addon/httpfox/