The best way to expedite the loading of this e-commerce website

The website is http://www.kikbo.com
Is it slow enough to be costing me conversions (maybe for visitors in Europe)?
Here's the pingdom load time test result: http://tools.pingdom.com/fpt/#!/t71Fj5LGf/http://kikbo.com
The biggest "offenders" seem to be the fading JavaScript, the pre-loaded images, and the Like button.
Suggestions?
What I did to make it go faster
http://www.webpagetest.org/result/120103_H6_2QFXT/1/details/
Thanks. Was able to shave off some seconds here and there.
In order of page-load speed improvement:
Moving most of the JS to the end of the body and the CSS to the head
Not pre-loading the images in the rollover image gallery
Loading prototype.js from Google's servers
Making the gallery of rollovers a CSS sprite
Minifying
Gzipping is basically impossible on GoDaddy; putting my files on Google's CDN would be a further improvement.
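For readers on an Apache host that does allow .htaccess overrides (unlike the GoDaddy setup above), enabling gzip is usually a small config fragment like this; the exact MIME-type list is an assumption, adjust it to what your site serves:

```apacheconf
# Hypothetical .htaccess fragment: compress text-based responses.
# Requires mod_deflate to be enabled on the server.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/plain
  AddOutputFilterByType DEFLATE application/javascript application/json
</IfModule>
```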

Right then...
For starters here's a page load waterfall generated using the Amsterdam instance of webpagetest.org http://www.webpagetest.org/result/111230_1A_2PFQ6/1/details/
Your server looks like it's pretty slow at generating/serving out the base page and static assets but there are also other issues.
You want to load the CSS as soon as possible but delay the JS as long as possible.
I would try to simplify the CSS files then merge and minify them.
For the JS, work out what you really need to render the page and delay everything else until later, either by including it at the end of the body or by loading it asynchronously. See Stoyan's article for how you should load the social media buttons: http://www.phpied.com/social-button-bffs/
You also need to turn on gzip for the text-based content, e.g. HTML, CSS, and JS. I suspect your images can be compressed further too.
Based on what I saw in the waterfall the way you're using JS is a big part of the slowness.
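As a sketch of that "CSS early, JS late" ordering (the file names here are placeholders, not from the original site):

```html
<head>
  <!-- CSS first, so the page can render progressively -->
  <link rel="stylesheet" href="/css/site.min.css">
</head>
<body>
  <!-- ...page content... -->
  <!-- non-critical JS last, and marked async so it doesn't block parsing -->
  <script src="/js/site.min.js" async></script>
</body>
```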

Your images should be optimized and compressed. The CSS and JavaScript should be minified and combined into fewer files if possible.

In short, there are a few basic steps suggested by the Yahoo team which have really helped people with this kind of problem. They are simple steps we generally forget, and in 70-80% of cases slowness comes down to these few reasons:
Proper use of JavaScript.
Proper use of CSS.
Optimizing images for the best loading time.
Proper use of Expires headers.
I suggest you go through the detailed blog written by the Yahoo team; a really good place to start is
Web Performance Best Practices and Rules

Optimize images with www.smushit.com/ysmush.it/, which helps to improve website load time.


How to mix SSR and static content generation in NuxtJS 3?

I have read for some months that Nuxt 3 will bring a new Nitro engine able to handle both SSR and static generation.
I am unable to find any document, tutorial, or video about this feature.
But it is really what we need.
Imagine being able to do SSR for a catalogue of products while having the single datasheet pages, which are very expensive to generate and always static in our case, be statically generated.
How can we achieve this?
Briefly, how do we use both SSR and static generation, and in what sense can we use both with Nitro? What are the limits, and what problems can it lead to?
You're looking for this one: https://github.com/nuxt/framework/discussions/560#discussion-3589420
This is planned, with no date yet, and it's pretty much what you need. Expect it maybe by the end of the summer.
It should not lead to any particular problems.

Make PDF viewable online in the browser

Note: before asking I searched a bit on embedding but couldn't find exactly what I wanted.
I have a resume file in PDF format that I would like to display on my website without storing it anywhere like Google or other services, but hosting it on my own. I have a static website (which I made using Jekyll), let's say https://www.example.com, and what I actually need is to make my resume accessible at the following link: https://www.example.com/resume
Some of those services have long permalinks, and I actually hate them. (Just saying)
Upload the PDF into the website's / or assets/ directory.
To make a link in HTML:
<a href="<PATH>/cv.pdf">CV</a>
To make a link in Markdown:
[CV](<PATH>/cv.pdf)
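If you also want the PDF rendered inline at https://www.example.com/resume rather than just linked, one Jekyll-flavoured sketch is a resume.html page with a permalink in its front matter and an embed tag (the asset path is a placeholder):

```html
---
permalink: /resume/
---
<!-- Browsers with a built-in PDF viewer render this inline;
     the link below is a fallback for those that don't. -->
<embed src="/assets/cv.pdf" type="application/pdf" width="100%" height="800px">
<p><a href="/assets/cv.pdf">Download the CV</a></p>
```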
On Chrome, this behaviour has been around for a long time and plagues web devs to this day. There seem to be no plans to change it anytime soon, because that's just the way it was built. Chrome behaves slightly differently when not online, so offline/local testing will not always produce the expected result.
My answer to you, for this question, is a suggestion. In order to make it cross-browser compatible, your mode of implementation should be:
Modal or,
Lightbox
Whether or not you are using a SSG should not matter here. Look for a bootstrap or material implementation.
On the client side, it is possible with an extension. I reckon this isn't helpful to you, but I'm including this information for future readers.

Grade E on Make fewer HTTP requests in YSlow for my magento website

Grade E on Make fewer HTTP requests
This page has 3 external stylesheets. Try combining them into one.
This page has 19 external background images. Try combining them with CSS sprites.
What should I do to improve it to Grade A?
Is there anything I should do in the .htaccess file or anywhere else to improve this?
I have already done a lot of things and got a score of 89, but I want to improve it to Grade A. I am using an Apache server.
How would I do this? Please, can someone suggest something?
Thanks
Here's a really simple way to fix your external stylesheet problem: open up your Magento admin, go to System>Configuration>Advanced>Developer and under CSS Settings set "Merge CSS Files" to Yes.
To resolve the second issue, creating a CSS sprite sheet is a good idea (though it can be a bit of a time sink unless you did it from the start). Loading your theme graphics independently causes your response time to take a pretty big hit, so the general idea is to have your site icons and background images load in a single file, then use some CSS trickery to only show them as you need them. This article on Smashing Magazine should get you started with CSS sprites: http://coding.smashingmagazine.com/2012/04/11/css-sprites-revisited/
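The sprite technique boils down to one shared background image plus a per-icon offset; a minimal sketch (the class names, image path, and offsets are made up for illustration):

```css
/* All icons live in one image; each class shows a different 32x32 slice. */
.icon {
  background-image: url('/skin/frontend/mytheme/images/sprite.png'); /* hypothetical path */
  background-repeat: no-repeat;
  display: inline-block;
  width: 32px;
  height: 32px;
}
.icon-cart   { background-position: 0 0; }
.icon-search { background-position: -32px 0; } /* second icon, shifted left */
```

The browser then makes a single HTTP request for the sprite instead of 19 separate image requests.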
As for overall speed optimisation, there are numerous (extensive) blog posts on this subject, just have a search for them. Make sure you know what you're doing before making changes to your server configuration, or else a slow response will be the least of your troubles!

Using RefineryCMS and the Theming Gem

I am in the beginning stages of diving into the world of RefineryCMS and am having an issue with the theming Engine.
I was able to customize the look of the home and blog pages, but individual pages I create outside of the home and blog engines are not falling under the custom theme I created. I used the override method to copy all of the necessary files to my custom theme, but, like I said, the page is using the default /pages/show.html.erb file instead of the theme/mytheme/pages/show.html.erb file.
Any ideas on why this would be happening?
The use of the theming engine is not recommended anymore:
USE OF THE REFINERYCMS-THEMING GEM IS NO LONGER RECOMMENDED. Why?
Theming performs some strange code hacks in order to get it to work.
Therefore, it makes it difficult to keep it compatible with other
engines. Also, many people have reported over 15 second load times
with theming, whereas regularly you would get 3 second load times.
Finally, Resolve Digital no longer uses nor supports this method.
https://github.com/resolve/refinerycms-theming

Command-line web browser that outputs the DOM

I'm looking for a way to process a web page and associated Javascript from the command-line, so that the resulting DOM model can be outputted.
The purpose of this is to identify forms within the page without doing any nasty HTML (and JavaScript) parsing with regular expressions.
Are there any command-line tools that will do this? So hypothetically speaking, a command-line web browser that downloads the content and outputs the DOM as text rather than producing a pretty page.
I don't know of any, but I wanted to highlight one difficulty with what you've suggested:
process a web page and associated Javascript
When would the output be taken? Many web pages have time-sensitive JavaScript, or onclick/onhover scripts, which would affect the DOM. Would you want these to be executed? All of them, or only some? It's not trivial to decide when the page is "finished" and ready for the DOM to be output after JavaScript manipulation. (Before JavaScript manipulation it's an easier problem; just wait for the DOMContentLoaded event...)
Edit: I'm not saying that you don't need JavaScript execution at all: you might want to handle any document.write sections during loading, as they might write out a form... I'm saying it's hard to know when you've run "enough" JavaScript...
For java, I've had fairly good experiences with htmlunit.
I've also used the BeautifulSoup python library to parse forms and formdata. No need to specify regexps, as it'll let you traverse the DOM tree without much effort.
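For the no-JavaScript case, even Python's standard library can pull form definitions out of a page without regexes. A minimal sketch (the sample markup is invented for illustration):

```python
from html.parser import HTMLParser


class FormFinder(HTMLParser):
    """Collects the attributes of every <form> and of the <input>s inside it."""

    def __init__(self):
        super().__init__()
        self.forms = []

    def handle_starttag(self, tag, attrs):
        if tag == "form":
            # Start a new form record.
            self.forms.append({"attrs": dict(attrs), "inputs": []})
        elif tag == "input" and self.forms:
            # Attach the input to the most recently opened form.
            self.forms[-1]["inputs"].append(dict(attrs))


page = """<html><body>
<form action="/search" method="get"><input name="q" type="text"></form>
</body></html>"""

finder = FormFinder()
finder.feed(page)
print(finder.forms[0]["attrs"]["action"])  # the form's action URL
print(finder.forms[0]["inputs"][0]["name"])  # the first input's name
```

This won't execute any scripts, of course, but for static markup it sidesteps regex parsing entirely.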