I'm trying to enable gzip compression for the JavaScript files (jQuery libraries) on my site.
I have enabled mod_deflate in Apache's httpd.conf file, and I have added the following lines to .htaccess:
RewriteEngine On
AddOutputFilterByType DEFLATE text/html text/plain text/javascript
But when I check with Google's Page Speed web performance tool, it reports that the JS is not compressed.
Can you tell me what I'm doing wrong and how I can enable gzip compression for my web app?
Thank you in advance!
JavaScript isn't text/javascript; it's application/javascript.
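So the filter line from the question could be broadened to cover the types servers actually emit for JS. A sketch, assuming mod_deflate is loaded (some older servers still send application/x-javascript):

```apache
<IfModule mod_deflate.c>
    # Keep the original types, and add the JS types Apache commonly uses
    AddOutputFilterByType DEFLATE text/html text/plain text/javascript
    AddOutputFilterByType DEFLATE application/javascript application/x-javascript
</IfModule>
```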
Related
We've currently set up gzip compression using the .htaccess DEFLATE directive. I wondered if anybody would kindly help us understand the following...
Are there any potential problems with using .htaccess to deflate, such as additional server strain from compressing? And is this suitable for a site with 1,200 daily page views that pulls in several JS/CSS files?
We've considered hosting gzipped files alongside our content and creating a script to update the zipped files whenever the unzipped file changes. However, a simple dump of the following code seems much easier, provided it doesn't bring its own issues...
# compress text, html, javascript, css, xml:
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript
AddOutputFilterByType DEFLATE font/ttf font/otf
Thanks in advance for any advice.
Dave
It really depends on what you are deflating. If you have a decent CPU, the performance hit for text-based files (JS, CSS, etc.) is almost nothing. I would also include mod_expires for cache control of static files. If you are compressing large dynamic files you might run into a performance hit, but the text files covered by your current rules should not have a large impact.
So yes, you should be compressing your files for a good user experience, which you can read about here:
https://developers.google.com/web/fundamentals/performance/optimizing-content-efficiency/optimize-encoding-and-transfer
And depending on what you are serving, you can find some performance info on mod_deflate here:
http://www.webperformance.com/library/reports/moddeflate/
I personally use mod_deflate on my static content and my sites load very fast. I don't think you will have an issue with what you are currently using it for.
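The mod_expires suggestion above can sit alongside mod_deflate in the same .htaccess. A minimal sketch, assuming the module is available; the lifetimes are illustrative, not recommendations:

```apache
<IfModule mod_expires.c>
    ExpiresActive On
    # Let browsers cache static assets instead of re-requesting them
    ExpiresByType text/css "access plus 1 month"
    ExpiresByType application/javascript "access plus 1 month"
    ExpiresByType image/png "access plus 1 month"
</IfModule>
```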
Let's say, however, that you have 20 external JS and 10 CSS files embedded in the same page (as many sites do these days), making a total of 31 files (including the .html file itself), averaging say 20KB each. I'm guessing that would equate to 31 compression operations on the server side, and 31 decompression operations on the client side (someone please confirm).
With 1,200 requests a day, that'd equate to 37,200 compression operations on the server side. I'd be looking at CPU usage for each one. On a busy server you may run into trouble. For static text files like CSS, HTML and JS, it may be better to retain a copy of the gzips and use something like this in your .htaccess file:
<FilesMatch "\.css\.gz$">
ForceType text/css
AddEncoding gzip .gz
</FilesMatch>
<FilesMatch "\.html\.gz$">
ForceType text/html
AddEncoding gzip .gz
</FilesMatch>
RewriteEngine On
RewriteCond %{HTTP:Accept-Encoding} gzip
RewriteCond %{REQUEST_FILENAME}.gz -f
RewriteRule (.*\.(html|css))$ $1.gz [L]
This serves the pre-existing gzip rather than creating thousands of gzips on the fly every day. With this method, though, you have to create the gzips yourself.
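Those gzips can be generated ahead of time with a small shell script. A sketch, assuming a POSIX shell and the `gzip` tool; the docroot here is a throwaway demo directory, not a real site:

```shell
# Pre-compress CSS/HTML/JS once so the rewrite rules above can serve the
# stored .gz instead of deflating on every request.
# DOCROOT points at a demo directory for illustration only.
DOCROOT="$(mktemp -d)"
printf 'body { color: red; }\n' > "$DOCROOT/style.css"

find "$DOCROOT" -type f \( -name '*.css' -o -name '*.html' -o -name '*.js' \) |
while read -r f; do
    # Rebuild the .gz only when the source is newer than the stored copy
    if [ ! -f "$f.gz" ] || [ "$f" -nt "$f.gz" ]; then
        gzip -9 -c "$f" > "$f.gz"
    fi
done
```

Run it from cron or a deploy hook so the .gz copies never go stale.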
I am using mod_deflate, like so:
<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE image/svg+xml
AddOutputFilterByType DEFLATE image/x-icon
</IfModule>
I am wondering, but cannot find the answer: does the above mean all content matching those rules will be gzipped for every request? Or does mod_deflate only compress when the HTTP request states it can accept gzip?
Further, I am reading some posts where people disable compression for certain browsers with bugs in their gzip implementation, but with no explanation. Does anyone have a definitive set of rules for this, or is it not needed?
mod_deflate is capable of compressing using gzip encoding.
Sometimes the module skips certain files because they are either too small or thought to have no significant gain.
The client's Accept-Encoding request header tells the server whether to compress or not.
Most of the bugs relate to proxy servers on the client side: gzipped content gets cached because a browser that accepts the encoding requested the resource first, but other browsers behind the same cache can't handle it. This is the reason to use the Vary header.
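That Vary header can be set with mod_headers. A sketch, assuming the module is loaded; note that mod_deflate normally appends Vary itself, so this mainly matters for pre-compressed .gz files served via rewrite rules:

```apache
<IfModule mod_headers.c>
    # Tell shared caches that responses differ by Accept-Encoding,
    # so a gzipped copy is never served to a client that can't decode it
    Header append Vary Accept-Encoding
</IfModule>
```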
When I check this website: http://www.tropicbreeze.co.uk/ with, for example, http://checkgzipcompression.com/, it reports that the site is using gzip. But YSlow disagrees.
I have this in my .htaccess file:
AddOutputFilterByType DEFLATE text/html text/php text/x-php application/php application/x-php application/x-httpd-php-source text/plain text/xml text/css text/javascript application/x-javascript application/xhtml+xml application/javascript application/x-httpd-php
Checking the headers in the Net tab of Firebug, I can see that the various .css and .js files on that page have Content-Encoding: gzip as expected - but the PHP files do not.
YSlow tells me that the homepage is not using gzip, and the Firebug Net tab confirms that the home page (and other PHP files on the server) is not being sent with Content-Encoding: gzip.
I've tried adding all the mimetype filters I could find suggested, and so far as I can see, DEFLATE text/html should cover it anyway, but still no joy.
I have cleared my cache and am not using a proxy.
Can anyone suggest what I've missed? Why are the PHP files not being gzipped when the other files are? Or, if they are being gzipped, why do Firebug/YSlow think they aren't?
Not sure exactly what the issue might be, but perhaps an easy fix would be to try something like...
<IfModule mod_deflate.c>
<FilesMatch "\.(css|htm|html|js|php|txt|xml)$">
SetOutputFilter DEFLATE
</FilesMatch>
</IfModule>
...which should set up compression on popular file extensions, including PHP.
I personally think this format is much easier for a human to read, too. Bonus!
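Browser tools can disagree with each other, so it helps to check the raw response headers directly. A sketch using curl; `check_gzip` is a helper name introduced here for illustration, not an existing tool:

```shell
# Print the Content-Encoding header (if any) a server returns for a URL
# when the client advertises gzip support.
check_gzip() {
    curl -sI -H 'Accept-Encoding: gzip' "$1" | tr -d '\r' |
        grep -i '^content-encoding'
}

# Example: check_gzip http://www.tropicbreeze.co.uk/
```

If the function prints nothing, the server is not compressing that response.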
I'm thinking of enabling deflate on Apache. I'm curious how Apache deals with images or Flash files when deflate is on. Does it try to compress images and Flash files as well?
Sure. If you don't set any filters, the server will compress everything, provided the client supports the compression.
If you use the line from the documentation:
AddOutputFilterByType DEFLATE text/html text/plain text/xml
That would compress only text files with those content types.
"Image file formats supported by the web, as well as videos, PDFs and
other binary formats, are already compressed; using gzip on them won't
provide any additional benefit, and can actually make them larger. To
compress images, see Optimize images."
https://developers.google.com/speed/docs/best-practices/payload#GzipCompression
What is the recommended way to configure Apache to enable HTTP compression for CSS and JS files, using .htaccess? I have seen several different directives used in examples on the web. I have tried some of them, yet I have not been able to get it to work. I am checking the HTTP headers using Firebug.
Ensure that mod_deflate and mod_mime are enabled and add something like:
AddOutputFilterByType DEFLATE application/x-javascript text/javascript text/css
to your .htaccess.
See also: http://brightscape.net/blog/compress-your-web-pages-with-mod_deflate/