Will Gzip compression affect my db queries? - gzip

Maybe it is a stupid question, but will gzip compression affect my db queries (I mean, will they be cached), or is gzip related only to the HTML and CSS output delivered to the browser, while the PHP code will still be executed on the server with every page load?

If you're using HTTP server compression then, as you guessed, it's a way to optimize transmission from server to client, and it (generally) works transparently. Compression is applied to the response body only, after your PHP code has already run, so your database queries are still executed on every page load and nothing is cached by gzip itself.
Browser extensions such as Firebug or monitoring tools such as Fiddler can help you see what really happens behind the scenes.
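For example, one quick way to watch that negotiation yourself is to send a request that advertises gzip support and look at the Content-Encoding response header. A minimal sketch using Java's built-in HttpClient (the language is incidental, any HTTP client that lets you set headers works; the URL is just a placeholder and the server may or may not choose to compress):

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class GzipCheck {
        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(URI.create("https://example.com/"))
                    .header("Accept-Encoding", "gzip")   // tell the server we can handle gzip
                    .build();
            HttpResponse<byte[]> response = client.send(request, HttpResponse.BodyHandlers.ofByteArray());
            // If the server compressed the body, it says so here; either way the work done
            // to generate the page (PHP, database queries) is exactly the same.
            System.out.println("Content-Encoding: "
                    + response.headers().firstValue("Content-Encoding").orElse("(none)"));
            System.out.println("Bytes on the wire: " + response.body().length);
        }
    }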

Related

Optimize build size in Vue.js

When I build the production build, the combined size of the CSS and JS comes to about 3.8 MB.
The biggest contributor I can see is Bootstrap, which accounts for roughly half of that 3.8 MB.
The app has CRUD functionality in an admin module, where I have used Bootstrap heavily; the other module is a set of static pages where I only use Bootstrap's grid.
How can I optimize this further?
This is expected when using Bootstrap, and there is not much you can do about it as such. If you had instead used bootstrap-vue, you could import only the specific parts of the modules that you need (JavaScript), which would significantly reduce the size of your bundle.
With that said, there's nothing really wrong here. The gzipped size of these files is 252 KB at most, and that's quite cheap.
If you serve your site over HTTP/2 and the browser supports it, your requests will be multiplexed over a single connection to load the assets. This is a huge improvement over HTTP/1.x in that:
the connection to your server is opened once, over a single TCP socket
requests and responses on that socket are broken into frames and interleaved asynchronously, whereas HTTP/1.x serves one response at a time per connection and browsers only open a small, fixed number of parallel connections per host
a slow asset does not block the others, so requests keep flowing and page load improves vastly.
So to summarize: serve your assets gzipped and make sure your web server uses HTTP/2, and your issue is a trivial one at this point.
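If you want to confirm that your server actually negotiates HTTP/2 (and therefore multiplexes those asset requests over one connection), here is a minimal sketch using Java's built-in HttpClient; the host and asset paths are placeholders for your own deployment:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.List;
    import java.util.concurrent.CompletableFuture;
    import java.util.stream.Collectors;

    public class Http2Check {
        public static void main(String[] args) {
            HttpClient client = HttpClient.newBuilder()
                    .version(HttpClient.Version.HTTP_2)   // prefer h2, falls back to HTTP/1.1
                    .build();
            // Several asset requests fired at once; over HTTP/2 they share one connection.
            List<String> assets = List.of("/js/app.js", "/css/app.css", "/js/chunk-vendors.js");
            List<CompletableFuture<HttpResponse<Void>>> inFlight = assets.stream()
                    .map(path -> HttpRequest.newBuilder(URI.create("https://example.com" + path)).build())
                    .map(req -> client.sendAsync(req, HttpResponse.BodyHandlers.discarding()))
                    .collect(Collectors.toList());
            inFlight.forEach(f -> {
                HttpResponse<Void> r = f.join();
                // version() reports the protocol the server actually spoke for this response
                System.out.println(r.uri() + " -> " + r.version() + " " + r.statusCode());
            });
        }
    }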
Consider using the purgecss plugin to get rid of all unused Bootstrap classes: https://www.purgecss.com/guides/vue

HTTP benchmark tool for testing with a different POST payload each time

Can you please recommend an HTTP benchmark tool that can use a dynamic POST payload on each request?
I want it to be able to make concurrent requests while each request carries a different payload.
I've tested with ApacheBench (ab) but couldn't get it to do that; I also tried curl_multi, but the statistics were awful to gather.
Is there any other solution?
If the objective is to push load onto an Apache web server, then Apache JMeter is a very good tool, although it is not ideal for precise performance measurement. JMeter can read from a CSV file, so you can build your POST data dynamically for every request.
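If you would rather script it yourself, the core requirement (concurrent requests, each with its own body) is not much code. A rough sketch; the URL and the hard-coded payloads are purely illustrative, and in practice the payloads would come from your CSV file:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.List;
    import java.util.concurrent.CompletableFuture;
    import java.util.stream.Collectors;

    public class DynamicPostBench {
        public static void main(String[] args) {
            HttpClient client = HttpClient.newHttpClient();
            // One payload per request; read these from a CSV file in a real run.
            List<String> payloads = List.of("{\"id\":1}", "{\"id\":2}", "{\"id\":3}");
            long start = System.nanoTime();
            List<CompletableFuture<HttpResponse<String>>> inFlight = payloads.stream()
                    .map(body -> HttpRequest.newBuilder(URI.create("https://example.com/api"))  // placeholder URL
                            .header("Content-Type", "application/json")
                            .POST(HttpRequest.BodyPublishers.ofString(body))
                            .build())
                    .map(req -> client.sendAsync(req, HttpResponse.BodyHandlers.ofString()))
                    .collect(Collectors.toList());
            inFlight.forEach(f -> System.out.println(f.join().statusCode()));
            System.out.printf("total: %.1f ms%n", (System.nanoTime() - start) / 1e6);
        }
    }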

Gzip compression over SSL with Safari?

I ran into a really weird issue this morning when trying to access a web app I'm building using an iPad (Safari Mobile/Webkit). On the front end, the web app relies heavily on XHR/Ajax requests. On the back end, the server is configured to gzip compress responses if the "Accept-Encoding" includes "gzip".
Everything was working great until I flipped the server to SSL. Then I started getting intermittent "CFURLErrorDomain:303" errors in Safari.
After a quick search I found this link:
http://beyondrelational.com/modules/2/blogs/45/posts/12034/failed-to-load-resource-safari-issue.aspx
According to the link, Safari requires a Content-Length header when making XHR (Ajax) requests over SSL/HTTPS. In my case, the server gzips content directly to the output stream, so I have no way of knowing what the final content length will be.
As a workaround, I added the following logic on the server:
if (request.isEncrypted())
    gzip = !request.getHeader("User-Agent").toLowerCase().contains("webkit");
In other words, if the connection is encrypted via SSL, and the browser is some webkit derivative (e.g. Safari, Chrome, etc), then don't compress the output. This seems to work but it really slows things down.
So my question is this:
Does Safari support gzip compressed responses over SSL or am I barking up the wrong tree?
Turns out the error I was seeing was a bug in the server and had nothing to do with Safari. The server was relying on chunked transfer encoding when compressing large byte arrays: individual "chunks" were broken up into pieces (header, body, trailer) and sent to the client in separate messages. The SSL client (Safari) was expecting one contiguous chunk, so it didn't know what to do when it saw an incomplete one. The server has been patched and the issue is now resolved.
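For reference, if you ever do need a Content-Length on a compressed response, one option is to buffer the gzipped bytes before writing them instead of streaming straight to the output. A rough sketch assuming the javax.servlet API; the helper is illustrative, not the original server's code:

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.util.zip.GZIPOutputStream;
    import javax.servlet.http.HttpServletResponse;

    public final class GzipBuffering {
        static void writeGzipped(HttpServletResponse response, byte[] body) throws IOException {
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            try (GZIPOutputStream gzip = new GZIPOutputStream(buffer)) {
                gzip.write(body);  // compress into memory rather than directly to the output stream
            }
            byte[] compressed = buffer.toByteArray();
            response.setHeader("Content-Encoding", "gzip");
            response.setContentLength(compressed.length);  // known length, so no chunked transfer encoding
            response.getOutputStream().write(compressed);
        }
    }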

Http Live Streaming with the Apache web server

Is it possible to do HLS with an Apache web server? Would it be enough to just put the playlist and the data chunks there? Is it that simple? Or is there some module that can be used for that purpose?
Thanks a lot for the reply
Yes, it's sufficient to merely have the m3u8 playlist and the segmented ts files available. The benefit of HLS is that it is plain, dead-simple HTTP.
It's possible that you'll have to set up the MIME types in Apache, but they are probably correct by default.
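If they do need to be added, the two types HLS clients expect look like this in httpd.conf or an .htaccess file (a minimal sketch):

    AddType application/vnd.apple.mpegurl .m3u8
    AddType video/mp2t .ts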
There are surely Apache 2 modules for doing that as well.
My personal choice for streaming audio and video, and especially video on demand, however, is VLC. It has great functionality for transcoding and adapting your output to whatever client wants to view it, and so on.
Maybe worth a look.

Is there an HTTP proxy tool that can substitute browsed content?

What I'm looking for is some sort of proxy tool that will allow me to specify a local file to load instead of one referenced in the web page being browsed. I have tried Burp Suite, which almost works: it lets us intercept a file and replace it by pasting the replacement file's contents into an input field. The file content is compiled code (Flash content), so we are pasting in bytecode, and something isn't working.
The reason is that we are a third-party software developer without access to our client's development or testing environments. Our content must interact correctly with the rest of the content on their web page (there are elements on their page that communicate with our content), and testing any change we make means a several-hour turnaround to get our files uploaded to their servers. So what we need is some sort of hacking tool that lets us test our work against their web pages, hence the requirement to swap a file referenced in a web page with a local version.
The autoresponder feature in Fiddler Web Debugging Proxy might do what you need, if it's only static content.
I've been using HTTP::Proxy for a long time, and it has always helped me fiddle with things on the fly.
You might be able to do this with Greasemonkey but I'm not sure if the tests will be totally reliable.
http://diveintogreasemonkey.org/patterns/replace-element.html
And if Greasemonkey seems plain wrong for you, I would take it as the perfect excuse to try out mouseHole. I have to admit that I've never tried it, but since _why also made Hpricot, I expect it to be fun, productive, and different.