Leverage Browser Caching for WOFF fonts - seo

Google PageSpeed Insights still tells me to leverage browser caching for some .woff fonts:
/fontawesome-webfont.woff?v=4.6.1
/rssocial-font.woff?13037212
In my .htaccess file I have:
# Web Open Font Format (WOFF) 1.0
ExpiresByType application/font-woff "access plus 1 month"
ExpiresByType application/x-font-woff "access plus 1 month"
ExpiresByType font/woff "access plus 1 month"
but it looks like they are ignored...

Your syntax looks correct, but the fonts are actually being served with "access plus 2 days".
This suggests your .htaccess is either being ignored entirely or being overridden (either later in the same .htaccess or by other configuration).
I would say, though, that when loading 200 resources (70 of which come from an advertising domain), caching "only" 2 fonts out of those 200 for just two days isn't that big a deal or that big a performance drag.
Reduce your number of resources and domains for a much bigger impact.
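One thing worth checking is that the rules are guarded and complete. This is a sketch only: it assumes mod_expires is enabled and that the host's AllowOverride setting actually lets .htaccess set these directives; the Content-Types listed are the common ones for WOFF, and your server may be sending a different one (check the response's Content-Type header).

```
# Sketch: run the rules only when mod_expires is loaded, and cover
# the common WOFF types plus WOFF2. Verify the actual Content-Type
# your server sends; these are assumptions.
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType application/font-woff   "access plus 1 month"
    ExpiresByType application/x-font-woff "access plus 1 month"
    ExpiresByType font/woff               "access plus 1 month"
    ExpiresByType font/woff2              "access plus 1 month"
</IfModule>
```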

Related

Apache returns old ETag and Last-Modified

I have a website with several pages (for example 1.htm and 2.htm) and some script files referenced from these pages.
My .htaccess file contains this code:
FileETag MTime Size
<ifModule mod_expires.c>
ExpiresActive On
ExpiresByType text/html "access plus 1 day"
ExpiresByType text/css "access plus 1 week"
ExpiresByType text/javascript "access plus 1 week"
ExpiresByType application/javascript "access plus 1 week"
ExpiresByType application/x-javascript "access plus 1 week"
ExpiresByType image/gif "access plus 1 year"
ExpiresByType image/jpeg "access plus 1 year"
ExpiresByType image/png "access plus 1 year"
</ifModule>
I visit page 1.htm, then change the script file and navigate to page 2.htm. I expect that Apache will return new ETag/Last-Modified values and the script file will be updated. But it returns the old values. What is wrong?
When I refresh the page I get new ETag/Last-Modified values.
The point of caching is that you don't have to download the file again for the specified time.
So this line:
ExpiresByType text/html "access plus 1 day"
This means that if you revisit the page within a day, it will be served from the cache rather than from the server. Therefore you will not get the new page, nor new ETag/Expires headers. If you open developer tools in Chrome, for example, you'll see the page is loaded "from cache".
If the page is still in the cache and you refresh it, then the browser does double-check with the server whether the file has changed, getting a 304 Not Modified response if it hasn't, and re-downloading the page (including new headers) if it has. But on a normal page load it doesn't even make this check with the server and serves straight from your cache. This is the way it's supposed to work.
So with the above setting, visitors to your site could still be seeing the old version of a page for up to 1 day after you change it.
You could add no-cache to the Cache-Control header so the browser still caches the file but revalidates it with the server on every request (must-revalidate alone only forces a check once the cached copy is stale), but that loses most of the benefit of caching.
As an aside, you shouldn't use ETags with Apache, as they don't work correctly when you're also using gzip. More details here: https://www.tunetheweb.com/performance/http-performance-headers/etag/
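If you do decide to drop ETags, a sketch of the usual Apache approach looks like this (it assumes mod_headers is enabled for the Header directive):

```
# Sketch: stop Apache generating ETags, and strip any that other
# modules (e.g. mod_deflate) have already attached to the response.
FileETag None
<IfModule mod_headers.c>
    Header unset ETag
</IfModule>
```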

How to keep the browser from requesting a file for a second time

I'm working on a web site. Once it's finished, the JS files and pictures may remain unchanged for weeks, months or even years, who knows. So I thought I could get a tremendous performance boost if I could instruct the browser to download and forget, that is: if the file is already in the local cache, retrieve and use that version regardless of anything else, so the only way a file would be downloaded twice would be if the user clears the browser's cache. This obviously implies a commitment, because if something on my web site changes, I would be forced to change the name of the affected file(s) too, but with the huge advantage of being able to override the browser's local cache whenever I need or want, gaining full control of it: fast, simple, and wasting zero bandwidth and CPU cycles.
I'm trying to achieve this by adding the proper cache commands to the HTTP response header.
Here is a sample:
HTTP/1.1 200 OK
Connection: Keep-Alive
Date: Sat, 27 Jul 2013 02:24:05 GMT
Content-Type: application/x-javascript
Content-Encoding: gzip
Content-Length: 13728
Last-Modified: Sat, 27 Jul 2013 00:37:59 GMT
ETag: "20130726193759"
Expires: Sat, 26 Jul 2014 05:00:00 GMT
Cache-Control: public, max-age=31536000
As you may realize, I'm instructing the browser to keep the file in its cache for one full year. But I don't know if I'm doing something wrong, because desktop browsers still make the request with an If-None-Match header, in which case I just tell them the file hasn't changed, which is suboptimal. The case of the Android browser is even worse, because it makes the request as if it were the first time.
Can anyone tell me if I'm doing something wrong?
You need to turn on browser caching using a .htaccess file.
To make files such as images, stylesheets and JavaScript cacheable by the browser, you can set Expires headers in a .htaccess file, which tells the browser that specific content won't change for a specified period: a week, a month, a year. Because the browser will not repeatedly request that content, this can drastically improve your page loading speed for future visits from the same browser.
In your .htaccess file, you can add the following (for files that won't change)
ExpiresActive On
ExpiresByType image/jpg "access plus 1 year"
ExpiresByType image/jpeg "access plus 1 year"
ExpiresByType image/gif "access plus 1 year"
ExpiresByType image/png "access plus 1 year"
ExpiresByType text/css "access plus 1 month"
ExpiresByType application/pdf "access plus 1 month"
ExpiresByType text/x-javascript "access plus 1 month"
ExpiresByType application/x-shockwave-flash "access plus 1 month"
ExpiresByType image/x-icon "access plus 1 year"
ExpiresDefault "access plus 2 days"
This way you send HTTP headers that set the expiry dates for the different file types.
You can refer to an article on leveraging browser caching, which will teach you in detail how to turn it on.

Stuck with Images/CSS cached by browser

There are some images on my web page that we update from the back end. But those changes are not reflected directly: users need to press Ctrl+F5. That means the client's browser has cached those images. Is there any way to reload updated images, JS and CSS once they are updated?
Here is what I have tried from my side.
ETag
Which didn't work for me.
Added the following tags to my page header:
<meta http-equiv="Expires" content="-1">
<meta http-equiv="Cache-Control" content="no-cache">
<meta http-equiv="Pragma" content="no-cache">
But same result.
Added the following configuration in Apache:
ExpiresActive on
ExpiresByType application/javascript "access plus 0 seconds"
ExpiresByType image/jpg "access plus 0 seconds"
ExpiresByType image/jpeg "access plus 0 seconds"
ExpiresByType image/gif "access plus 0 seconds"
ExpiresByType image/png "access plus 0 seconds"
ExpiresByType text/css "access plus 0 seconds"
The above configuration works for JS, but I'm still facing the problem with images.
I can't add a token or version number at the end of the URL/image name; that would require too much code change and testing.
Please guide me: is there any other centralized way to keep images and CSS from being cached?
Thanks in advance.
You can use JavaScript to change the src of any image; this changes the URL and forces the browser to reload it.
Example:
document.getElementById('myimg').src = document.getElementById('myimg').src + '?r=' + Math.random();
Not the cleanest solution but it seems your options are limited.
To reload the entire pages images with jQuery you could try:
$('img').each(function() {
$(this).attr('src', $(this).attr('src') + '?r=' + Math.random())
});
By modifying the initial selector you can limit it to one area so you don't end up reloading everything. This will not affect images specified by CSS styling.
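One caveat with appending `'?r=' + Math.random()` directly: a URL that already has a query string would end up with two `?` characters. A small helper (a sketch; the function name is mine, not from the original answer) handles both cases:

```javascript
// Sketch: cache-busting helper that respects an existing query string,
// so "a.png?x=1" becomes "a.png?x=1&r=..." rather than "a.png?x=1?r=...".
function bustCache(url) {
  const sep = url.includes('?') ? '&' : '?';
  return url + sep + 'r=' + Date.now();
}
```

You would then use `$(this).attr('src', bustCache($(this).attr('src')))` inside the jQuery loop.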

How to give different expires headers for files in different folders with one .htaccess in root?

I want to give different expiry headers to different images on my site. They are contained in different folders right now. What I want to do is give them different Expires headers with one main .htaccess file. I know this can be done with multiple .htaccess files in those folders, but I don't want to implement it that way; it would clearly be tough to manage.
Try using the FilesMatch directive in your .htaccess file.
Eg:
# cache most product images at client side
ExpiresActive on
<FilesMatch "^images/products/[^\.]*\.(gif|jpe?g|png)$">
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
Header append Cache-Control "public"
</FilesMatch>
EDIT: This seems to be wrong! FilesMatch only matches file names, not paths, so you cannot match directories this way. Sorry for the bad post.
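One pattern that does work from a single root .htaccess is to tag requests by path with SetEnvIf and set Cache-Control conditionally. This is a sketch only: it assumes mod_setenvif and mod_headers are enabled, and the folder names are placeholders for your own.

```
# Sketch: per-folder caching from one root .htaccess.
# Requires mod_setenvif and mod_headers; folder names are placeholders.
<IfModule mod_headers.c>
    SetEnvIf Request_URI "^/images/products/" CACHE_1M
    SetEnvIf Request_URI "^/images/banners/"  CACHE_1W
    Header set Cache-Control "public, max-age=2592000" env=CACHE_1M
    Header set Cache-Control "public, max-age=604800"  env=CACHE_1W
</IfModule>
```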

YSlow still not giving me an A for Expires headers in Apache httpd even though I added them

I'm trying to add ExpiresDefault and ExpiresByType directives for content on my website so that it is cached. I use cache busting in the URL (a revision number in the path) for JavaScript, CSS and images, so I can set the expiry to "forever" for these MIME types. I have the following rules set up in Apache httpd:
ExpiresActive On
ExpiresDefault "access plus 1 minutes"
ExpiresByType image/gif "access plus 10 years"
ExpiresByType image/png "access plus 10 years"
ExpiresByType image/jpeg "access plus 10 years"
ExpiresByType image/jpg "access plus 10 years"
ExpiresByType text/javascript "access plus 10 years"
ExpiresByType text/css "access plus 10 years"
Then when I go to my website http://karmerd.com and use the Live HTTP Headers extension to look at the headers, I get what I think should be the correct Expires for CSS: Expires: Sun, 03 Feb 2019 17:52:48 GMT
But Yahoo's YSlow Firebug extension is still giving me an F for not using Expires! Am I doing something wrong? I'm also using gravatars on my site, but they have Expires set. Everything seems to have an Expires header. Is it YSlow or me?
Your JavaScript files are being sent out as application/x-javascript, so they aren't getting a far-future Expires header (your rule only covers text/javascript).
Don't rely on that tool alone to judge whether your site is running fast. I've had it do many quirky things (just like YUI), and if it's giving you a false positive, your site is running fine, and no one is complaining about speed, you most likely do not have a speed issue. The best way to see whether things are being cached is to watch the requests in Firebug or another tool as they go out; if the browser isn't requesting and retrieving a file, then it hasn't expired.
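Given the Content-Type mismatch above, a sketch of the fix is simply to cover the JavaScript MIME types Apache commonly uses, so the rule applies whichever one is sent (check with Live HTTP Headers which type your server actually emits):

```
# Sketch: cover the common JavaScript Content-Types alongside the
# existing text/javascript rule, so the far-future Expires always applies.
ExpiresByType application/x-javascript "access plus 10 years"
ExpiresByType application/javascript   "access plus 10 years"
```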