Add Expires headers for Cloudflare (Google, Disqus)

I am using Cloudflare for DNS management.
My site is built with Jekyll and hosted on GitHub Pages.
When I analyze my site with GTmetrix, I get the "Add Expires headers" warning for the following resources.
How can I fix this?
https://www.googletagmanager.com/gtm.js?id=GTM-KX5WC3P
https://cse.google.com/cse.js?cx=partner
https://ahmet123.disqus.com/
https://www.google-analytics.com/analytics.js
https://www.google.com/cse/static/style/look/v2/default.css
https://cse.google.com/adsense/search/async-ads.js
https://www.google.com/uds/css/v2/clear.png

Check out the first and second answers here: GitHub Pages, HTTP headers
You can't change the HTTP headers that GitHub Pages sends, but you can do something like:
<meta http-equiv="Expires" content="600" />
inside the <head> section of the layout you use for your pages.
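For a Jekyll site that means editing the layout's head. A minimal sketch, assuming a default layout at _layouts/default.html (the file name and structure are illustrative, not from the question):

<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <!-- Hint to browsers that this page may be cached for 600 seconds.
       Note: meta "headers" are a weak substitute for real HTTP headers,
       and they do nothing for the third-party scripts listed above,
       whose headers are controlled by Google and Disqus. -->
  <meta http-equiv="Expires" content="600" />
  <title>{{ page.title }}</title>
</head>
<body>
  {{ content }}
</body>
</html>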

Hello, how can I fix this problem? After running a GTmetrix performance scan I get:
Add Expires headers
There are 9 static components without a far-future expiration date.
https://www.googletagmanager.com/gtag/js?id=UA-195633608-1
https://fonts.googleapis.com/css?family=Roboto+Slab%3A100%2C100italic%2C200%2C200italic%2C300%2C300italic%2C400%2C400italic%2C500%2C500italic%2C600%2C600italic%2C700%2C700italic%2C800%2C800italic%2C900%2C900italic%7CRoboto%3A100%2C100italic%2C200%2C200italic%2C300%2C300italic%2C400%2C400italic%2C500%2C500italic%2C600%2C600italic%2C700%2C700italic%2C800%2C800italic%2C900%2C900italic&display=auto&ver=5.9.2
https://www.googletagmanager.com/gtag/js?id=G-69EDJ9C5J7
https://static.cloudflareinsights.com/beacon.min.js
https://harhalakis.net/cdn-cgi/scripts/5c5dd728/cloudflare-static/email-decode.min.js
https://www.google-analytics.com/analytics.js
https://www.googletagmanager.com/gtag/js?id=G-69EDJ9C5J7&l=dataLayer&cx=c
https://www.googletagmanager.com/gtm.js?id=GTM-W3NQ5KW
https://www.google-analytics.com/plugins/ua/linkid.js
Thank you.
Konstantinos Harhalakis

Related

How can I force a hard refresh if page has been visited before

Is it possible to check if the client has a cached version of a website, and if so, force his browser to apply a hard refresh once?
You can't force a browser to do anything, because you don't know how rigidly a remote client is observing the rules of HTTP.
However you can set HTTP headers which the browser is supposed to obey.
One such header is Cache-Control. Several of its values may meet your needs, including no-cache and max-age. There is also the Expires header, which specifies a wall-clock expiration time.
It is not readily apparent whether the client has a cached version. To tell the client not to use its cache you can use these meta tags:
<HEAD>
<TITLE>---</TITLE>
<META HTTP-EQUIV="Pragma" CONTENT="no-cache">
<META HTTP-EQUIV="Expires" CONTENT="-1">
</HEAD>
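Meta tags are only a fallback; real HTTP headers are more reliable. A minimal sketch of sending them from PHP (the header values are illustrative; pick ones that match your caching policy):

<?php
// Send caching directives as real HTTP headers, before any output.
header('Cache-Control: no-cache, must-revalidate, max-age=0');
// An already-past Expires date tells caches the response is stale.
header('Expires: Thu, 01 Jan 1970 00:00:00 GMT');
?>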

Can Cloudflare cache HTML for anonymous users but not logged in users?

We have a website with anonymous user content generally static (updated once every hour), and content for logged in users different for every user, updated frequently.
Is it possible to configure cloudflare so that HTML is cached for anonymous users but not logged in users, given the same URL for both?
Are there any cache headers we can set that are relevant?
I have been struggling with the same scenario, and as far as I have seen there's no way.
In Cloudflare you can set up Page Rules, but those only look at the URL itself.
So the only solution I see would be having different URLs (some extra parameter, or maybe a subdomain) for each kind of user; then you can easily set up the rules to manually force the cache expiration time.
I am not sure if this would work, but the only way I can think of doing this is via HTTP/HTTPS, since you can use the scheme in Cloudflare Page Rules.
So if you set the WordPress FORCE_SSL_ADMIN constant to true, and then redirect all logged-in users' URL requests to the HTTPS version of the page, this might do the trick and make it possible to bypass the Cloudflare cache.
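A minimal sketch of that idea, assuming WordPress (this line goes in wp-config.php, above the "stop editing" comment):

// Serve the login page and admin over HTTPS, so logged-in traffic
// arrives on a different scheme than anonymous HTTP traffic.
define( 'FORCE_SSL_ADMIN', true );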
This is not possible unless you are on the Enterprise plan level.
https://www.cloudflare.com/plans/
A solution from here:
Add a private Cache-Control header to the page:
// recommended by Cloudflare for logged-in users
if ( is_user_logged_in() ) {
    header( 'Cache-Control: private, max-age=3600' );
}
Yes. Cloudflare says it will respect meta tags requesting no cache, so simply add those tags to the <head> section of the page for logged-in users.
Example (using Smarty):
{if $loggedin}
<meta http-equiv="cache-control" content="no-cache" />
{/if}
(using PHP):
if ($loggedin) {
    echo "<meta http-equiv=\"cache-control\" content=\"no-cache\" />";
}
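Putting the two approaches together, a hedged sketch of per-user-state headers in PHP (WordPress-flavored; is_user_logged_in() is assumed, and note that Cloudflare only caches HTML at all if you add a "Cache Everything" Page Rule):

<?php
if ( is_user_logged_in() ) {
    // Per-user content: only the browser may cache it, and it must revalidate.
    header( 'Cache-Control: private, no-cache, max-age=0' );
} else {
    // Anonymous content changes roughly hourly: let the edge cache it for
    // an hour (s-maxage targets shared caches such as Cloudflare's edge).
    header( 'Cache-Control: public, max-age=0, s-maxage=3600' );
}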

Internet Explorer 9 ignoring no cache headers

I'm tearing my hair out over Internet Explorer 9's caching.
I set a series of cookies from a perl script depending on a query string value. These cookies hold information about various things on the page like banners and colours.
The problem I'm having is that in IE9 it will always, ALWAYS, use the cache instead of using the new values. The sequence of events runs like this:
Visit www.example.com/?color=blue
Perl script sets cookies, I am redirected back to www.example.com
Colours are blue, everything is as expected.
Visit www.example.com/?color=red
Cookies set, redirected, colours set to red, all is normal
Re-visit www.example.com/?color=blue
Perl script runs, cookies are re-set (I have confirmed this), but IE9 retrieves all resources from the cache, so on redirect all my colours stay red.
So, every time I visit a new URL it gets the resources fresh, but each time I visit a previously visited URL it retrieves them from the cache.
The following meta tags are in the <head> of example.com, which I thought would prevent the cache from being used:
<META HTTP-EQUIV="CACHE-CONTROL" CONTENT="NO-CACHE">
<META HTTP-EQUIV="PRAGMA" CONTENT="NO-CACHE">
<META HTTP-EQUIV="EXPIRES" CONTENT="0">
For what it's worth, I've also tried <META HTTP-EQUIV="EXPIRES" CONTENT="-1">.
IE9 seems to ignore ALL these directives. The only time I've had success so far in that browser is by using developer tools and ensuring that it is manually set to "Always refresh from server"
Why is IE ignoring my headers, and how can I force it to check the server each time?
Those are not headers. They are <meta> elements, which are an extremely poor substitute for HTTP headers. I suggest you read Mark Nottingham's caching tutorial, it goes into detail about this and about what caching directives are appropriate to use.
Also, ignore anybody telling you to set the caching to private. That enables caching in the browser - it says "this is okay to cache as long as you don't forward it on to another client".
Try sending the following as HTTP headers (not meta tags):
Cache-Control: private, must-revalidate, max-age=0
Expires: Thu, 01 Jan 1970 00:00:00 GMT
I don't know if this will be useful to anybody, but I had a similar problem on my movies website (crosstastemovies.com). Whenever I clicked on the button "get more movies" (which retrieves a new random batch of movies to rate), IE9 would return the exact same page and ignore the server's response... :P
I had to append a random query-string parameter to keep IE9 from reusing the cache. So instead of calling "index.php?location=rate_movies" I changed it to "index.php?location=rate_movies&rand=RANDOMSTRING".
Everything is OK now.
Cheers
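A minimal sketch of that cache-busting trick in PHP (the parameter name rand is arbitrary and ignored by the server):

<?php
// uniqid() yields a different value on every request, so the URL never
// matches a previously cached one and IE9 is forced to refetch it.
$url = 'index.php?location=rate_movies&rand=' . uniqid();
echo '<a href="' . htmlspecialchars($url) . '">get more movies</a>';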
I'll just mention that I had a problem that looked very much like this, but when I tried IE9 on a different computer there was no issue. Going to Internet Options -> General -> Delete and deleting everything restored correct behaviour; deleting the cache alone was not sufficient.
The only items that HTML5 specifies are content-type, default-style and refresh. See the spec.
Anything else that seems to work is only by the grace of the browser and you can't depend on it.
johnstok is correct: these belong in real HTTP headers. If you must approximate this in markup (for example in your ASP code), at least use separate, well-formed meta tags rather than cramming Cache-Control into the Content-Type value:
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<meta http-equiv="Cache-Control" content="no-cache" />
Put these inside your <head> section; browsers that honour them will fetch fresh content from the server rather than just refreshing the page.

iweb pages are downloading rather than viewing

I'm having an issue with a friend's iWeb website - http://www.africanhopecrafts.org. Rather than the pages displaying, they download instead, even though they're all HTML files. I've tried messing with my .htaccess file to see if that was affecting it, but nothing is working.
Thanks so much
Most likely your friend's web site is dishing up the wrong MIME type. The web server might be misconfigured, but the page can override the Content-Type response header by adding a <meta> tag to the page's <head> like this:
<meta http-equiv="content-type" content="text/html; charset=ISO-8859-1" />
(where the charset in use reflects that of the actual web page.)
If the page is being served up with the correct content type, the browser might be misconfigured to not handle that content type. Does the problem occur for everybody, or just you? Is the problem dependent on the browser in use?
You can sniff the content type by installing Firefox's Tamper Data plug-in. Fire up Firefox, start Tamper Data, and fetch the errant web page via Firefox. Examining the response headers for the request should tell you what content type the page is being served up with.
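Since the question already mentions .htaccess, here is a minimal sketch of forcing the correct MIME type under Apache (assuming mod_mime is available and the files really are HTML):

# Serve .html/.htm files as text/html instead of whatever the server
# currently sends (e.g. application/octet-stream, which triggers a download).
AddType text/html .html .htm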

Why do I have both HTTPS and HTTP links on site, need them all secure!

I am getting the security alert "You are about to be directed to a connection that is not secure. The information you are sending to the current site might be transmitted to a non-secure site. Do you wish to continue?" when I try to log in as a customer on my client's osCommerce website. I noticed the link in the status bar goes from an https:// prefix to a non-secure http:// prefix. The site has an SSL certificate, so how do I ensure the entire store portion of the site stays on the secured site?
It is likely that some parts of the page, most often images or scripts, are loaded over a non-secure connection. You'll need to go through them in the browser's "view page source" view one by one and eliminate the cause (most often a configuration setting pointing to http://).
Some external tools that you may be embedding on your site, like Google Analytics, can be included through https://; some can't. In the latter case, you may have to remove those tools from your secure site.
If you can't switch all the settings, try using relative paths
<img src="/images/shop/xyz.gif">
but the first thing is to identify the non-secure elements using the source code view of your browser.
An immediate redirection from an https:// page to an http:// one would not result in a warning like the one you describe. Can you clarify what's happening there?
Use Fiddler and browse your site, in the listing it should become evident what is using HTTP and HTTPS.
Ensure that the following are included over https:
css files
js files
embedded media (images, videos)
If you're confident none of your own stuff is included over http, check things like tracking pixels and other third-party gadgets.
Edit: Now that you've linked your page, I see that your <base> tag is the problem:
<base href="http://balancedecosolutions.com/products//catalog/">
Change to:
<base href="https://balancedecosolutions.com/products//catalog/">
If the suggestion from Pekka doesn't suit your needs, you can try using links that are relative to the schema (http or https), e.g.:
<a href="//www.example.com/page.html">I am a 100% valid link!</a>
The only problem with this technique is that it doesn't work with CSS files in all browsers, though it does work within JavaScript and inline CSS. (I could be wrong here; anyone want to check?)
e.g., the following:
<link rel="stylesheet" href="/css/mycss.css" />
<!-- mycss.css contents: -->
body {
    background-image: url(//static.example.com/background.png);
}
...might fail.
A simple find/replace on your source code should make this easy.
It sounds to me like the HTML form you are submitting is hardcoded to post to a non-secure page.
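For example, a hedged sketch of that fix (the form and URL are hypothetical, not taken from the site in question). If the template hardcodes:
<form action="http://www.example.com/login.php" method="post">
change it to a relative or explicit https URL so the POST stays on the secure origin:
<form action="https://www.example.com/login.php" method="post">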