How to add gzip compression to a Hypercorn server serving a Quart app

I have a Quart app running on a Hypercorn (0.6) server (Amazon EC2, Ubuntu 18.04).
The page loads too slowly, and one of the recommendations is to add gzip compression.
Does anyone have experience with this?
kind regards,
alex

You should probably use a reverse proxy server in front of Hypercorn to do this job. But you can also have Quart serve its static files compressed. See this link
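
For illustration, a minimal sketch of doing the compression inside the app itself with an after_request hook (the 500-byte threshold is an arbitrary placeholder; note that Quart's get_data/set_data are coroutines, unlike Flask's):

import gzip

from quart import Quart, request

app = Quart(__name__)

@app.after_request
async def compress_response(response):
    # Only compress when the client advertises gzip support.
    if "gzip" not in request.headers.get("Accept-Encoding", "").lower():
        return response
    body = await response.get_data()
    if len(body) < 500:  # skip tiny payloads; gzip overhead isn't worth it
        return response
    await response.set_data(gzip.compress(body))
    response.headers["Content-Encoding"] = "gzip"
    response.headers["Vary"] = "Accept-Encoding"
    return response

In production, though, letting the reverse proxy (e.g. nginx with gzip on) compress responses in front of Hypercorn avoids spending Python CPU time on it.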

Related

AWS S3 and AjaXplorer

I'm using AjaXplorer to give my clients access to a shared directory stored in Amazon S3. I installed the SDK, configured the plugin (http://ajaxplorer.info/plugins/access/s3/), and could upload and download files, but the upload size is limited by my host's PHP limit, which is 64 MB.
Is there a way I can upload directly to S3 without going through my host, to improve speed and get S3's upload limit instead of PHP's?
Thanks
I think that is not possible, because the server will first receive the file via the PHP script and only then transfer it to the bucket.
The only way around this is to use some jQuery or JavaScript that bypasses your server/PHP entirely and streams directly into S3. This involves enabling CORS on the bucket and creating a signed policy on the fly to authorize your uploads, but it can be done!
I ran into just this issue with some inordinately large media files for our website users that I no longer wanted to host on the web servers themselves.
The best place to start, IMHO is here:
https://github.com/blueimp/jQuery-File-Upload
A demo is here:
https://blueimp.github.io/jQuery-File-Upload/
This was written to upload and write files to a variety of locations, including S3. The only tricky bits are getting the MIME type correct for each particular upload and getting your bucket policy the way you need it.
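
For the signed-policy part, a hedged server-side sketch in Python with boto3 (bucket name, key prefix, and limits are placeholders; the bucket also needs a CORS rule allowing POST from your origin):

import boto3

s3 = boto3.client("s3")

# Create a presigned POST policy the browser can use to upload directly to S3.
post = s3.generate_presigned_post(
    Bucket="my-upload-bucket",                # hypothetical bucket
    Key="uploads/${filename}",                # S3 substitutes the real file name
    Conditions=[
        ["content-length-range", 0, 5 * 1024 ** 3],  # allow up to 5 GB
        ["starts-with", "$Content-Type", ""],         # accept any MIME type
    ],
    ExpiresIn=3600,  # policy valid for one hour
)

# post["url"] and post["fields"] go into the browser's multipart form,
# which then POSTs the file straight to S3, bypassing the PHP host entirely.
print(post["url"], post["fields"])

The jQuery-File-Upload project ships its own signing examples in several languages; this is just the general shape of the policy.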

Why is mod_deflate not supported by my hosting company?

I was just doing some testing with YSlow and it's telling me:
Grade F on Compress components with gzip: There are 10 plain text components that should be sent compressed
I know that Apache 1.3 uses mod_gzip while Apache 2.x uses mod_deflate, and so the easiest solution to remedy this is to use mod_deflate on an Apache 2 server.
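The usual remedy, for reference, would be something like this in .htaccess (assuming the host loads the module, which is exactly the problem here):

<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
</IfModule>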
However, I've checked with two shared hosting companies and one local company and they've all told me that they don't support mod_deflate.
I know that some older browsers have trouble accepting gzipped / deflated content, and I'm not suggesting it be enabled by default, but are there any negatives for making mod_deflate available? Is it just extra load on the server's processors?
Also, are there any alternatives? I saw that if you are using a CMS like WordPress you could potentially install a caching plugin which would serve out gzipped cached versions of the pages initially generated via PHP.
Compression takes CPU time. Maybe the hosting company decided they care more about CPU than network traffic. Maybe they offer it with a more expensive package. Maybe they simply didn't add it. Only your hosting company would know.
When using PHP, you can check whether your PHP setup has zlib support enabled. If it does, you can call ob_start("ob_gzhandler"); in code to enable an output buffer that compresses your data, or set zlib.output_compression in your PHP configuration, for instance with php_flag zlib.output_compression on in your .htaccess file.
http://php.net/ob_gzhandler
http://php.net/zlib.output-compression
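
A minimal sketch of the in-code approach (assuming zlib support is compiled in):

<?php
// Compress this script's output if zlib is available and
// zlib.output_compression isn't already doing it for us.
if (extension_loaded('zlib') && !ini_get('zlib.output_compression')) {
    ob_start('ob_gzhandler');  // buffers output, gzips it when the client accepts gzip
}

echo str_repeat('This response will be sent compressed. ', 100);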

Using mod_disk_cache in Apache?

I want to use mod_disk_cache in Apache to cache my XML feeds to a folder and serve them directly from that folder.
These feeds are dynamically created by PHP but don't change very often.
I want the caching at the .htaccess level to avoid any strain on, or calls to, PHP and keep server stress to a minimum.
http://httpd.apache.org/docs/2.2/mod/mod_cache.html
http://httpd.apache.org/docs/2.2/mod/mod_disk_cache.html
Has anyone done this before? Did it work for you?
I'm getting my server company to install the modules I need and can then have a go myself.
I'm hoping to use something similar to:
<IfModule mod_cache.c>
    <IfModule mod_disk_cache.c>
        CacheRoot c:/cacheroot
        CacheEnable disk /
        CacheDirLevels 5
        CacheDirLength 3
    </IfModule>
</IfModule>
I'll be sending Expires: and Last-Modified: headers with the XML too.
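For example, via mod_expires, if available (the one-hour window is just an example):

<IfModule mod_expires.c>
    ExpiresActive On
    # Let the cached feeds be considered fresh for an hour
    ExpiresByType application/xml "access plus 1 hour"
</IfModule>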
Do you think this will give me the desired solution, filling that cache folder and avoiding calls to PHP?
Or is this approach all wrong?
Thanks in advance for any guidance
In the past I used Apache with mod_cache in a Unix environment. It worked fine under low user load, but on days with heavy load the system kept going down all day.
After some tests we moved to Varnish Cache and now everything works better.
The problem is that only Unix environments are supported. A new Cygwin-based Windows version of Varnish exists, but I don't know if it is suitable for a production environment:
http://varnish-cache.org/trac/wiki/VarnishOnCygwinWindows
It's not a bad approach; I used it a long time ago, and it works.
But you should know there are now much better alternatives for handling caches in front of an Apache server. One of these nice tools is Varnish, which gives you very fine-grained tuning.
Here's a deep explanation of why Varnish is a modern tool and why its way of using the OS (not separating memory and disk, in spirit) is good: http://www.varnish-cache.org/trac/wiki/ArchitectNotes
Regarding the headers: use headers (or other things, like URLs) to communicate with Varnish, and let the cache tool handle the final headers sent to clients.
If you have direct access to your server, and not just restricted Apache access, try it. Now, if you can only touch the Apache configuration... but wait: c:/cacheroot? You're using a Windows server in production? You'll need a Unix-like system for Varnish, preferably 64-bit.
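
As an illustration of that header-driven approach, the PHP feed script might emit something like this (the file name is hypothetical; Varnish honors Cache-Control max-age by default when computing the TTL):

<?php
// Allow any shared cache (Varnish included) to store this feed for 5 minutes.
header('Cache-Control: public, max-age=300');
header('Content-Type: application/xml');

// Serve a pre-built feed file; in reality this would be the generated XML.
readfile('feed.xml');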

Apache, lighttpd, nginx, cherokee, what's the best combination?

I have a blog with dynamic (PHP) and static content (images, CSS, JS). I googled a lot to find benchmarks on each server and figured out that there's actually no single best server, so I'm looking for feedback from experience to choose a good combination.
Update in response to wheaties: well, my needs are, I think, the same as everyone's; I need all my pages to load quickly, including static content, and I need the highest HTTP requests-per-second rate possible. Also, if it helps, I'm using MongoDB. By the way, do I still need to cache my DB queries with this?
Regarding Apache and Nginx:
I used Apache for almost 10 years. Then I discovered Nginx.
I quickly found Nginx appealing:
- simple and powerful C code
- intuitive and elegant configuration syntax
- built with performance and efficiency in mind; it is incredibly efficient, even with thousands of connections
- php-fpm works well with Nginx
So, between the two (Apache and Nginx), I would recommend Nginx.
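
As an illustration, a minimal Nginx + php-fpm server block (the root, socket path, and cache lifetime are assumptions):

server {
    listen 80;
    root /var/www/blog;
    index index.php;

    # Static assets served directly by Nginx
    location ~* \.(css|js|png|jpg|gif)$ {
        expires 7d;
    }

    # Dynamic pages handed off to php-fpm
    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/var/run/php-fpm.sock;
    }
}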
Lighttpd is well known for serving static content, while Nginx is a good option for dynamic (PHP) pages. I've heard of a few sites which use lighttpd only for serving static content.
Lighttpd for static content and some caching scripts for dynamic (PHP).

Speeding up site using gzip and far-future expiration dates

I recently deployed a site, http://boardlite.com. One of the tester websites, http://www.gidnetwork.com/tools/gzip-test.php, suggests that gzip is not enabled for my site. YSlow gives an A grade for gzip but does not mention that gzip is on.
How do I make sure the site properly implements gzip? I am also going to enable far-future expiry dates for static media, and I would like to know if there are any best practices for setting the expiry date.
Static media on the site is served by an Nginx server, while the site itself runs on top of Apache, in case this information is required.
I'd advise against going too far into the future or you'll make site upgrades a nightmare. I believe a week should be enough, since after that you'll still only be serving 304 (Not Modified) responses, not the whole image.
It looks like Gzip is enabled on your server. You can tell by checking the HTTP response headers for 'Content-Encoding: gzip'.
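A quick way to verify from the command line (any URL on the site works; -D - dumps the response headers while the body is discarded):

curl -s -o /dev/null -D - -H 'Accept-Encoding: gzip' http://boardlite.com/ | grep -i content-encoding

If the output includes Content-Encoding: gzip, compression is active for that URL.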
I can't think of any "best practices" for future expiry dates - other than to make sure they're not in the past ;)
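That said, in the Nginx config serving the static media, gzip plus expiry might look like this sketch (the location path and one-week lifetime are illustrative):

gzip on;
gzip_types text/css application/javascript application/xml text/plain;
gzip_min_length 1024;  # skip tiny files where gzip doesn't pay off

location /static/ {
    expires 7d;  # modest far-future window; eases upgrades, per the advice above
    add_header Vary Accept-Encoding;
}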
There are many other ways you can optimize your web site. Spriting your CSS background images and using a content delivery network for your static content are a few.
Andrew