Browser is sending requests for the same content (PNG image) multiple times

I have a PNG file named icons.png hosted on an Apache server. This file is a combination of other small image elements (a CSS sprite). It loads with a normal 200 OK response when the page loads for the first time.
After the page loads, hovering over certain links triggers a custom tooltip. The tooltip displays part of the image from icons.png as the background image of some HTML elements.
For example, the tooltip is initialized like this:
jQuery(".profile").tipTip({delay: 200, defaultPosition: "top", content: "<a href='#' style='width: 32px; height: 32px; display: block; background: url(images/icons.png) no-repeat -200px -64px'></a>"});
[icons.png is also referenced in a few other places in the HTML file.]
Now every time I hover over the link, the tooltip shows up, but at the same time the browser sends an HTTP request to the server for icons.png, and the server responds with 304 Not Modified.
Although the content of the file is not re-fetched, the overhead of sending the headers (around 166 bytes) is still there every time, which in turn causes a latency of about 1.5 s (I'm on a very slow connection). During this 1.5 s the tooltip element has no background image, and then the image suddenly shows up out of nowhere.
[Screenshots: Chrome network panel, Firebug net panel, HTTP headers]
As far as I know, once a resource has been fetched the browser should hold it in its cache and serve it from there whenever necessary, instead of requesting it from the server multiple times.
I've found that the server is not sending any "Expires" or "Cache-Control" header along with the content. I'm not sure whether this could be the reason for such behavior in Chrome. Any suggestion is highly appreciated.
P.S.: The application is hosted on Heroku's shared hosting environment. I'm using Firefox 15.0 and Chrome 21.0.1180.89 on Ubuntu 12.04 x86_64.

Every time you show an element for the first time, that is the point at which the browser downloads any associated background images, in modern browsers at least.
Your multiple requests most likely correspond to the times you hovered over a new tooltip, bringing it into visibility and thus prompting the image to be requested.
Your instincts are right, though: if no caching headers are configured directly on your server or through .htaccess files, the browser will keep sending HTTP requests to check whether it needs to download a newer version. As soon as you sort out your Expires headers, which can be set through mod_expires, it will start returning a locally cached version of the file each time instead.
Source: http://httpd.apache.org/docs/2.2/mod/mod_expires.html
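For example, a minimal .htaccess sketch, assuming mod_expires is enabled and your hosting honours .htaccess files (the one-month lifetime is an arbitrary choice):
<IfModule mod_expires.c>
    ExpiresActive On
    # Let browsers reuse PNGs (such as the sprite) for a month before revalidating
    ExpiresByType image/png "access plus 1 month"
</IfModule>
With something like this in place, repeat uses of icons.png should be served from the browser cache without the extra 304 round trip.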

I recently came across this behaviour as well while developing locally. An element with a sprite background had a :hover state in the CSS file that pointed to the same sprite image with a different background position, and that caused a very small but nevertheless noticeable latency when switching the background of the element.
.class {
    background: transparent url('sprite.png') 0 0 scroll;
}
.class:hover {
    background: transparent url('sprite.png') -50px 0 scroll;
}
One way of making sure this doesn't happen is to override only the background-position CSS property in the :hover state.
.class {
    background: transparent url('sprite.png') 0 0 scroll;
}
.class:hover {
    background-position: -50px 0;
}
Of course cache control is still necessary, but this coding approach can save you some headache.

Related

TinyMCE 5 - large images pasted via Safari do not render correctly

We are running TinyMCE version 5.4.1 with various options including:
paste_data_images: true
powerpaste_allow_local_image: true
When we drag and drop (or paste) in smaller images (400px x 400px), everything seems to work fine. The Base64 encoding is saved to the database and the image renders in all browsers: Chrome, Firefox and Safari.
However, when we paste in a larger image (1920px x 1081px), the image is only saved and rendered correctly in Chrome and Firefox. In Safari the Base64 encoding is saved with all lowercase characters, so it doesn't render when we attempt to view it. Has anyone else experienced this?
I have searched here as well as on the TinyMCE website but don't see anything mentioning this behavior. We will eventually move away from this Base64 implementation, as it's no longer recommended, but it's what we have for the time being, so I'm just trying to address this issue.
When the page loads, its elements can load in parallel, but when the browser sees a base64 image it blocks the page from loading until that image is rendered. Thus, inserting large images into the page as base64 is certainly not good practice: it may slow down page loads and worsen the UX.
To fix this problem, and possibly several other issues, using the automatic_uploads option is highly recommended. It uploads pasted images to the server instead of converting them to base64. There is an example PHP upload handler in the TinyMCE documentation that accepts the images and returns their URLs to TinyMCE.
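A minimal init sketch of that setup, assuming an upload endpoint at /upload.php (a placeholder path):
tinymce.init({
    selector: 'textarea',
    paste_data_images: true,
    // Upload pasted/dropped images instead of embedding them as base64
    automatic_uploads: true,
    // Placeholder endpoint; point this at your own server-side handler
    images_upload_url: '/upload.php'
});
The handler just needs to store the posted file and respond with JSON containing the new image location, which TinyMCE then uses in place of the base64 data.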
Concerning the issue with Safari, a minimal reproducible example would be very useful.
I should also mention that PowerPaste is a premium feature that does not work with the open source version of TinyMCE. If you are using the paid version of TinyMCE, you can create a support ticket.

Random high content download time in Chrome?

We have an API which randomly shows a high content download time in Chrome. It always works fine in Firefox and takes only a few ms. The response size is 20 kB uncompressed and 4 kB compressed. The same request also works fine using curl.
Things that we have tried:
Disabling the If-None-Match header to disable cached responses from the browser.
Trying various compressions (gzip, deflate, br).
Disabling compression.
Disabling all chrome extensions.
The same request sometimes works fine in Chrome but randomly shows a very high content download time.
We are unable to understand the root cause of this issue. What other things can we try to minimize this time?
I made three requests here and the third one took the most time (before the last spike). The CPU does not seem to be maxed out for any long period; most of the time is idle time.
Also, when replaying the call using the Replay XHR menu, the content download period drops from 2 s to 200 ms.
Are you by chance trying to implement infinite scrolling? If you are, try dragging the scroll bar instead of using the mouse wheel. For some reason, Chrome seems to struggle with mouse scroll events. If the scroll bar works just fine, keep reading.
This post provides a detailed walkthrough of someone experiencing something similar - https://github.com/TryGhost/Ghost/issues/7934
I had attached a watcher on the scroll event which would trigger an AJAX request. I had throttled the request and could see that only one was being sent. I watched my dev server return the response within a few ms, but there would be a 2-second delay in Chrome: no rendering, no API calls, no scripts executing, yet the "Content Download" would take 3 seconds for 14 kB. No other browser had this issue.
I stumbled upon suggestions that using requestAnimationFrame instead of setTimeout would solve the problem. That approach seems to work when the "Waiting" (green) portion is significant, not so much for the "Content Download" (blue) portion.
After hours of digging, I tried conditionally calling e.preventDefault() on the mousewheel event and to my amazement, it worked.
A few things to note:
1) I did not use the mousewheel event to make the api call. I used the scroll event along with throttling.
2) The mousewheel event is non-standard and should not be used. See https://developer.mozilla.org/en-US/docs/Web/Events/mousewheel
3) BUT in this case, you have to watch and handle the mousewheel event because of Chrome. Other browsers ignore the event if they don't support it, and I have yet to see it cause an issue in another browser.
4) You don't want to call preventDefault() every time, because that disables scrolling with the mouse :) You only want to call it when deltaY is 1 if you are using vertical scroll. You can see from the attached image that deltaY is 1 when you basically can't scroll any more: the mousewheel event is fired even though the page cannot scroll. As a side note, deltaX is -0 when you are scrolling vertically and deltaY is -0 when scrolling horizontally.
My solution:
window.addEventListener("mousewheel", (e) => {
    // deltaY === 1 is the case described above: the event fires even though
    // the page cannot scroll any further, so only then do we suppress it
    if (e.deltaY === 1) {
        e.preventDefault();
    }
});
That has been the only solution that I've seen work and I haven't seen it mentioned or discussed elsewhere. I hope that helps.
[Screenshot: console log of the mousewheel event]
I think you may be doing it wrong.™
Fundamentally, if this really only happens with Chrome, then perhaps the client-side code is to blame, of which you don't reveal any details.
Otherwise, you are trying to debug what you present as a backend condition (based on the choice of the nginx tag) with front-end tools:
Have you tried using tcpdump(8) to troubleshoot the issue? What packets get exchanged, and at what times?
Have you tried logging the times of the request being received and processed by nginx, e.g., $request_time? (A minimal log_format sketch follows this list.)
Where is the server located? Perhaps you're experiencing packet loss, which may require timeouts and retransmission of some TCP packets, which invariably will introduce a random delay?
Finally, the last possibility is that the field doesn't mean what you think it does -- it sounds like it may take a hit from CPU load, as this is the result of the XMLHttpRequest (XHR) processing -- perhaps you run some advertising with user tracking that randomly consumes a significant amount of CPU, slowing down your metrics?
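For the nginx logging suggestion above, a hypothetical log_format sketch (placed in the http block of nginx.conf; the log path and format name are placeholders) could look like:
log_format timings '$remote_addr "$request" status=$status '
                   'req=$request_time upstream=$upstream_response_time';
access_log /var/log/nginx/timings.log timings;
Comparing $request_time there against Chrome's "Content Download" figure would show whether the delay is server-side at all.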

PhantomJS render fails for a big page

When I take a screenshot of a webpage with PhantomJS 1.9.8, I have a test case where the output is always a zero-size file. I tried several debugging options with page.onError; I see some errors related to Facebook plugins and scripts, but nothing very helpful...
So when PhantomJS fails to render a page, is there a way to know what's going on beyond the status of the render() function?
URL: http://www.santenatureinnovation.com/verrues-un-nombre-incroyable-de-solutions/
The page is so big that it uses between 600 and 700 MB of RAM to render the image. The dimensions of the resulting image are 960 x 141524 (sic!). Make sure you have enough RAM and wait a little; it takes several seconds for the image to be rendered. The good thing is that JavaScript is single-threaded, so you don't have to add anything to wait for the rendering to finish: everything else freezes.
I tried it successfully with PhantomJS 1.9.7 and 1.9.8 (on Windows) without special care to viewportSize or the user agent string.
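For reference, a rough capture sketch with the error hooks the question mentions (the URL, output filename and timeout are placeholders, not values from the question):
var page = require('webpage').create();

page.onError = function (msg, trace) {
    console.log('Page error: ' + msg);
};
page.onResourceError = function (resourceError) {
    console.log('Resource failed: ' + resourceError.url);
};

page.open('http://www.example.com/', function (status) {
    console.log('open status: ' + status);
    // A very large page can take several seconds (and a lot of RAM) to lay out,
    // so wait a little before rendering
    window.setTimeout(function () {
        page.render('output.png');
        phantom.exit();
    }, 5000);
});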

Setting Device Width for Remote Webdriver in Selenium

I am using Selenium Grid 40 with a remote Firefox driver running on Windows 7. I also use the C# API. I was wondering how to set the device width on a headless Firefox browser. The device width is less than the iPad's max width, which causes it to pick up the iPad-specific CSS defined like below:
@media only screen and (min-device-width: 481px) and (max-device-width: 1024px) and (orientation: portrait) {
    /* For portrait layouts only */
}
@media only screen and (min-device-width: 481px) and (max-device-width: 1024px) and (orientation: landscape) {
    /* For landscape layouts only */
}
I have already changed the window size using:
driver.Manage().Window.Size = new System.Drawing.Size(1290, 900);
But it still picks up those CSS directives.
Other information: my Grid node is a virtual machine that nobody actually logs into. I run the Selenium Grid remotely, and that might be the reason why the device width is small; it might default to the smallest resolution for Windows. If there is a way to change that, it might help me, but I am not aware of it.
Update: I tried setting all instances of DefaultSettings.XResolution, DefaultSettings.YResolution and DefaultSettings.BitsPerPel in the registry to 1290, 900 and 16 through a PowerShell script and restarting the computer, but it didn't work.
I don't know how to set the device width using Selenium, but I figured out how to set it using the remote frame buffer, which for me was Xvfb. I pass in the screen resolution when I start the service.
Below is an example of an Xvfb service with a resolution of 1024x768 and a depth of 24.
https://gist.github.com/dloman/931d65cbb79b00593ac3dd5d0cdf37d9
My experience is limited to the Python implementation of Selenium, so this may not work for you, but there you can use driver.set_window_size(width, height) before performing the driver.get() action that loads the desired page.
The key is the order in which you perform these actions: you want to tell Selenium to change the dimensions of the browser before you load the page, because otherwise you are simulating the same behaviour as resizing a browser window after the initial page load. As you may know, when resizing a browser window, a page reload is needed for some 'responsive' features to be activated.
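A minimal Python sketch of that order, using a plain local Firefox driver for simplicity (with Grid you would construct the driver via webdriver.Remote instead; the URL is a placeholder):
from selenium import webdriver

driver = webdriver.Firefox()
driver.set_window_size(1290, 900)    # resize first...
driver.get("http://example.com/")    # ...then load, so the CSS is evaluated at the new size
driver.quit()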

EasySlider 1.7 - IE9 breaks it by blocking scripts

I've been playing around with EasySlider and everything was working perfectly - until I viewed my site in IE9.
Instead of displaying the slider with 3 images sliding across, it displayed all 3 images stacked one underneath the other, plus a message at the bottom of the screen saying that scripts etc. had been blocked and asking whether I wanted to allow the blocked content.
As I was using this in the header of my website, it pushed my whole site down the page and just looked stupid with the 3 banners on top of each other.
I realise I can get rid of this by unblocking the content, but that's not the point. I think this is the default security setting for IE, so everyone who visits my site will see it like this the first time (or every time, if they don't unblock the content).
So is there a way around this? Or at least a way so that, if the script is blocked, only the first image is shown instead of all of them? This seems a pretty big flaw!
I had the same issue as you and found a very simple solution for it using CSS. All you need to do is copy and paste the CSS rule below into the screen.css stylesheet that came with the EasySlider 1.7 plugin. I hope this helps.
#slider {
    position: relative;
}
I found this solution on the "ClickNathan Handmade Websites" site.