I ran PageSpeed Insights against my website, which was built with WordPress.
It reports "Eliminate render-blocking JavaScript and CSS in above-the-fold content".
The warning disappears when I remove style.css, my core CSS file, which contains about 7,000 lines of code.
Following the Optimize CSS Delivery guide, I applied this snippet:
<script>
var cb = function() {
  // Create a <link> for the stylesheet and insert it into the document
  var l = document.createElement('link'); l.rel = 'stylesheet';
  l.href = 'small.css';
  var h = document.getElementsByTagName('head')[0]; h.parentNode.insertBefore(l, h);
};
// Look the prefixed variants up on window so a missing one yields undefined
// instead of throwing a ReferenceError
var raf = window.requestAnimationFrame || window.mozRequestAnimationFrame ||
    window.webkitRequestAnimationFrame || window.msRequestAnimationFrame;
if (raf) raf(cb);
else window.addEventListener('load', cb);
</script>
Applying this method leads to even more blocking resources being reported.
May I know how to overcome this issue? Thanks.
P.S. Please provide a reason if you downvote, so I can improve the quality of my questions next time.
Can you load the CSS asynchronously, or are there media queries that are necessary?
The best thing to do is inline some of the media-query CSS so that each query is resolved from the HTML itself; this will eliminate the render blocking.
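A minimal sketch of that approach (the selectors and the split between critical and deferred rules are illustrative assumptions, not your actual stylesheet):
<head>
  <!-- Critical above-the-fold rules inlined: no network request blocks first paint -->
  <style>
    .header { background: #fff; padding: 10px; }
    @media (max-width: 600px) {
      .header { font-size: 14px; }
    }
  </style>
  <!-- The full style.css is then loaded asynchronously, e.g. with the snippet above -->
</head>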
I will say, however, that PageSpeed Insights can be ignored if your page load speeds are showing lower in Chrome. It's not 100% accurate. That is where you really want to focus, because Google gives you more credit for user feedback than for its bot's quality score. And users WILL bounce less if you can shave the load time down to 0.5 seconds.
I'm generating an HTML table with data from a GeoJSON file using Leaflet. It works fine, but only if I do not delete the "alert" from the following code; otherwise the data are displayed without the table. How can I solve this?
$.ajax({url: "wind.geojson"}).done(function(data) {
    // This callback runs asynchronously, after the request completes
    var data = JSON.parse(data);
    L.geoJson(data,
        {pointToLayer: MarkerStyle1});
});
alert();
function MarkerStyle1(feature, latlng) {
    ...
    document.writeln("<td width='40'><div align='center'>", feature.properties.title, "</div></td>\n");
    ...
}
I do not see any possibility to upload a file here; it's 200 KB. It can be found here: 1. The file works well when I show the objects on a map.
2 shows an older version of the site, where the table is generated with PHP (done by a friend; I do not use PHP).
Perhaps the problem is that I do not have a "map", no "map.addLayer()", in this code?
I now realize that the problem is caused by the asynchronous behavior of AJAX. I changed the code to
$.ajax({url:"wind.geojson", async: false})
Now it works without the "alert()" line!
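Note that async: false blocks the browser and synchronous XHR is deprecated. The underlying issue is that document.writeln, called after the page has finished loading, implicitly opens a new document and wipes the page, which is why the table only survives while alert() blocks parsing. An alternative sketch that keeps the request asynchronous builds the table inside the done callback via DOM insertion (the #wind-table id is a hypothetical placeholder for your table element):
$.ajax({url: "wind.geojson"}).done(function(data) {
    var parsed = (typeof data === "string") ? JSON.parse(data) : data;
    L.geoJson(parsed, {pointToLayer: MarkerStyle1});
    // Append rows with DOM methods instead of document.writeln
    $.each(parsed.features, function(i, feature) {
        $("#wind-table").append(
            "<tr><td width='40'><div align='center'>" +
            feature.properties.title + "</div></td></tr>");
    });
});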
I am using the Web Polygraph load-testing tool to make rapid HTTP requests, as it is reliable, consumes few resources, and has good reporting. However, I cannot find any setting that tells Web Polygraph to run for a certain amount of time. I want accurate reporting rather than potential misses caused by sending a kill signal to the process.
I have been reading through Web Polygraph's help pages and can see that the request rate is configurable, but I am not seeing support for a run-duration setting.
My config file is as follows (I think this is where the option would go, likely in the Robot configuration):
Content SimpleContent = {
    size = exp(1KB); // response sizes distributed exponentially
    cachable = 100%;
};

Server S1 = {
    kind = "S101";
    contents = [ SimpleContent ];
    direct_access = contents;
    addresses = ['X.X.X.X' ];
};

// a primitive robot
Robot R1 = {
    kind = "R101";
    req_rate = 100/sec;
    interests = [ "foreign" ];
    foreign_trace = "/home/x/trace.urls";
    pop_model = { pop_distr = popUnif(); };
    recurrence = 100% / SimpleContent.cachable;
    origins = S1.addresses;
    addresses = ['X.X.X.X' ];
};
I expect to be able to set some duration, say 40 minutes, during which the R1 robot requests 100 pages per second.
I got an answer from Web Polygraph support. For future reference, this can be accomplished through the Phase and Goal objects, together with the schedule() function. Here is a snippet of the email I got back:
See the goal field inside the Phase object:
http://www.web-polygraph.org/docs/reference/pgl/types.html#type:docs/reference/pgl/types/Goal
http://www.web-polygraph.org/docs/reference/pgl/types.html#type:docs/reference/pgl/types/Phase
Do not forget to schedule() your phases:
http://www.web-polygraph.org/docs/reference/pgl/calls.html
Many workloads that are distributed with Polygraph include Phase
schedules. To see examples, search for "goal" in workloads/
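Based on those docs, a sketch of what the phase definitions might look like in the config above (the phase names and the warm-up split are my assumptions, not from the support email):
Phase phWarm = {
    name = "warm";
    goal.duration = 5min;   // ramp up before measuring
    load_factor_beg = 0.1;
    load_factor_end = 1.0;
};

Phase phMeas = {
    name = "meas";
    goal.duration = 40min;  // the 40-minute measurement run
};

schedule(phWarm, phMeas);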
I'm using Aurelia with http-server, as described in the Aurelia getting-started docs. I'm unable to see changes, because every browser seems to cache the entire page at first load. When I use F5, Ctrl+F5, or Ctrl+R, the page refreshes but nothing changes; none of my modifications are visible. Then I can use another browser, and on the first visit the changes are visible, but any subsequent visit always shows the first one. It occurs in every browser I use (Chrome and Firefox, even in private mode). I'm certain it is not a bug in Aurelia itself.
I tried to change the port and use http-server with the -c parameter. Nothing changed. Any ideas?
If you want to force your Aurelia app to be constantly refreshed after you modify it, you can have a look at the following thread:
https://github.com/aurelia/framework/issues/94
Aaike commented on 8 May 2015:
change your index.html to add the extension right before you import the aurelia-bootstrapper
<script>
// Wrap System.locate so every module URL gets a cache-busting query string
var systemLocate = System.locate;
System.locate = function(load) {
    var System = this;
    return Promise.resolve(systemLocate.call(this, load)).then(function(address) {
        // Leave compiled view/style modules untouched
        if (address.lastIndexOf("html.js") > -1) return address;
        if (address.lastIndexOf("css.js") > -1) return address;
        return address + System.cacheBust;
    });
};
System.cacheBust = '?bust=' + Date.now();
System.import('aurelia-bootstrapper');
</script>
This is a caching issue that pops up sometimes with http-server. I don't know exactly what causes it, but the -c option sets the Cache-Control max-age, so I would set that to 0 or 1 instead of assigning a port that way.
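For reference, http-server's documentation describes -c as the cache time in seconds, and notes that -1 disables caching entirely, which is handy during development:
http-server -c-1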
ASP.NET MVC - Is it possible to upload only the first 10 lines of a file? Basically, we have files that can range from 1-10 GB, but the data we need is present only in the first 10 rows of the file. Using the typical web-development approach, we'd upload the whole file to the server and then read the first 10 rows, but uploading a 10 GB file just to read a few bytes of data seems a big waste of resources. Is it possible to read such a file without uploading all of it to the web server?
Solution - the File API's slice function solved this problem (thanks to Chris below). The simplified code is below for anyone interested -
var sampleFile = document.getElementById('yourfileelement').files[0];
var reader = new FileReader();
var fileData = sampleFile.slice(0, 500000); // read the top 500,000 bytes only
reader.onprogress = function (evt) {
    // show a progress bar, etc.
};
reader.onloadend = function (evt) {
    alert(evt.target.result); // evt.target.result contains the file data that was read
};
reader.readAsText(fileData); // was: fileClientReadData, an undefined variable
No, but you may be able to accomplish it client-side by using the File API to read just the first 10 lines and send them to the server via AJAX. However, note that the File API is only supported in modern browsers, so this won't work with IE 9 or less. You might be able to create a more comprehensive solution using a Flash or Java applet, but ugh.
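A sketch of that client-side read-and-send step (the /Home/UploadHeader endpoint and the 64 KB slice size are illustrative assumptions):
var file = document.getElementById('yourfileelement').files[0];
var reader = new FileReader();
reader.onloadend = function (evt) {
    // Keep only the first 10 lines of the bytes we read
    var firstLines = evt.target.result.split('\n').slice(0, 10).join('\n');
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/Home/UploadHeader'); // hypothetical MVC action
    xhr.setRequestHeader('Content-Type', 'text/plain');
    xhr.send(firstLines);
};
reader.readAsText(file.slice(0, 65536)); // 64 KB should comfortably cover 10 lines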
So our app is set up with the standard layout: left frame with the tree, right frame with the main content (loaded by clicking the tree).
Our web app inconsistently displays a blank page in the main frame in Firefox. By inconsistent I mean every day for a few users, rarely for others, never for most. Once we hit this, going to any other page through our tree results in a blank page. We found that deleting the "aTreeSaveStateCookie" restores normal operation. "aTree" is the name of our div. I found "SaveStateCookie" strings in dijit/Tree.js.
This also happens in IE, except there I get a browser error page which I can't recall right now. I then delete the only cookie I can find for our app (I'm not sure how to do the Firefox steps in IE).
Any ideas on why this would happen?
Thanks
Dojo 1.3 through http://ajax.googleapis.com/ajax/libs/dojo/1.3/dojo/dojo.xd.js
Firefox 3.1x
IE 8
Windows XP
In my case, I don't recall ever changing browser settings around Private Data.
Please check whether the response code is 413 (Request Entity Too Large). Usually this happens when the cookie used to store the tree's expansion state (aTreeSaveStateCookie) exceeds the maximum request size for your server.
You could try increasing the maximum request size (follow the instructions for your specific web/app server), or at least display a meaningful error message like "please clear your browser cookies" when the 413 error code is encountered.
If the persist property is set to a truthy value, dijit.Tree persists its state to remember which nodes were expanded, and expands them again after a page reload. If you need to persist the tree state for a very large data structure, I recommend overriding Tree to use localStorage instead of dojo.cookie.
This is Dojo 1.9, but similar changes can be made to the non-AMD version 1.3:
_saveExpandedNodes: function(){
    if(this.persist && this.cookieName){
        var ary = [];
        for(var id in this._openedNodes){
            ary.push(id);
        }
        // Was:
        // cookie(this.cookieName, ary.join(","), {expires: 365});
        localStorage.setItem(this.cookieName, ary.join(","));
    }
},
And:
_initState: function(){
    // summary:
    //      Load in which nodes should be opened automatically
    this._openedNodes = {};
    if(this.persist && this.cookieName){
        // Was:
        // var oreo = cookie(this.cookieName);
        var oreo = localStorage.getItem(this.cookieName);
        if(oreo){
            array.forEach(oreo.split(','), function(item){
                this._openedNodes[item] = true;
            }, this);
        }
    }
},
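A sketch of how those two overrides might be wired up as a subclass in Dojo 1.9 AMD (the LocalStorageTree class name is a hypothetical choice; the method bodies are the ones above):
require(["dojo/_base/declare", "dojo/_base/array", "dijit/Tree"],
function(declare, array, Tree){
    // Subclass dijit.Tree so expansion state goes to localStorage, not a cookie
    var LocalStorageTree = declare([Tree], {
        _saveExpandedNodes: function(){
            if(this.persist && this.cookieName){
                var ary = [];
                for(var id in this._openedNodes){
                    ary.push(id);
                }
                localStorage.setItem(this.cookieName, ary.join(","));
            }
        },
        _initState: function(){
            this._openedNodes = {};
            if(this.persist && this.cookieName){
                var oreo = localStorage.getItem(this.cookieName);
                if(oreo){
                    array.forEach(oreo.split(','), function(item){
                        this._openedNodes[item] = true;
                    }, this);
                }
            }
        }
    });
    // Use LocalStorageTree wherever you previously constructed a dijit.Tree, e.g.
    // new LocalStorageTree({model: myModel, persist: true, cookieName: "aTreeSaveStateCookie"}, "aTree");
});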