I am working on an application that loads roughly the same 10 JS files on each page. This makes the website's performance too slow. Is there a way I can change the configuration in Apache so that all my JS files are loaded into the cache on the home page itself?
No; each resource - images, CSS, JS, etc. - is served by Apache individually.
The way around this is to combine and minify your JS into one file using a tool. But you'll have to rewrite your HTML pages to point to the new, combined file.
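A rough sketch of that build step, assuming Node.js is available and using the terser package (the file names here are placeholders):

    # Hypothetical build step: combine and minify the page scripts into one bundle
    npx terser page1.js page2.js page3.js --compress --mangle -o bundle.min.js

Your pages would then load only bundle.min.js instead of the ten separate files.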
I cannot figure out how to make Nuxt generate a fully static website. It makes the API calls static, and that is awesome, but all images and download links are still making requests to a remote server.
Is it possible to generate a fully static website where all links to external files (<img src="remote.jpg">, <a href="remote.pdf">, background-image: url('remote.jpg')) are downloaded and placed in a local folder, with every URL then replaced to point to the local files? Or does Nuxt do SSG only for API calls?
You could indeed optimize this by putting all of your assets into the /static directory.
It will require some CI or some kind of build step to keep them properly updated and organized, but nothing impossible (this will keep everything in the same place). That said, hosting resources outside of your server is not a bad thing in principle either.
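A minimal sketch of such a build step, assuming Nuxt's default static/ directory and placeholder URLs and file names:

    # Hypothetical pre-build step: copy remote assets into the static/ folder
    mkdir -p static/files static/images
    curl -o static/images/remote.jpg https://example.com/remote.jpg
    curl -o static/files/remote.pdf https://example.com/remote.pdf
    # The markup then points at the local copies, e.g.
    # <img src="/images/remote.jpg"> and <a href="/files/remote.pdf">

Files in static/ are copied as-is into the generated output, so those local links keep working after nuxt generate.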
When I check my website URL in Google's URL Inspection tool, it shows that page resources could not be loaded, i.e. image, stylesheet, and script files. However, my website works perfectly on the live server, yet the page is not rendered properly by Googlebot smartphone. I have tried everything to remove these errors, but nothing has helped. I have also checked that these resources are not blocked in the robots.txt file.
[Screenshot of the page resources error]
I've been struggling with this for a couple of days now, and finally reached the only solution that has worked for me. In my case, it wasn't a robots.txt problem, as I believe that you've already checked before posting this.
The problem has to do with the number of resources Googlebot is willing to fetch before giving up. If your CSS and JS files are too many, or too big, Googlebot gives up before fetching all of the resources needed to render the page properly.
You can solve it by minifying your files via a server mod, or via plugins like WP Rocket or Autoptimize. If you have too many CSS and JS files and the problem persists after minifying, try combining these files as well by using the same plugins.
I am new to using GZIP.
I'm using a WordPress plugin that gzips the website, but I ran the Google speed test and it says that the website is not gzipping the Bootstrap .less files and the JavaScript files that I call.
I'm not sure how to call gzip or even how to set it up; can somebody help me?
JavaScript should be served minified, and after minification both JS and CSS should be served compressed. Have a look at How to 'minify' Javascript code.
If you are using Apache 2.* as your web server, you can enable mod_deflate. You can then configure the module to compress .less files and any other static content you want to compress.
Here is a link to the documentation for the module: http://httpd.apache.org/docs/2.2/mod/mod_deflate.html
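For example, something along these lines in the Apache config or an .htaccess file (a sketch only; adjust the MIME type mapping to whatever your server actually uses for .less):

    <IfModule mod_deflate.c>
        # Compress the common text-based content types
        AddOutputFilterByType DEFLATE text/html text/plain text/css text/javascript application/javascript
        # Give .less files a text MIME type so the filter above picks them up
        AddType text/css .less
    </IfModule>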
I'm working on a small website that I would like to use LESS CSS with, but I am having trouble getting the .less file to become available. If I go through FTP to the path on the server where the .less file is, the file is there and I can read it in the browser. However, the <link> tag in the HTML brings me to a 404 page. If I manually type in the location over HTTP, it does not work. Why might this be happening?
I'm not sure if I understood you correctly, but: LESS is not a replacement for CSS. You must generate a CSS file from your LESS file to be able to serve it to the browser.
The reason you can't view it in your browser is that the web server has no MIME type for LESS files, and it shouldn't have one.
You can, however, view it through FTP because it's a "normal" text document.
Edit: You can also process the .less file with JavaScript in the browser, but that's not recommended for production use...
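A minimal sketch of that compile step, assuming the lessc compiler from the less npm package and placeholder file names:

    # Compile the LESS source into a plain CSS file the web server can serve
    npx lessc styles.less styles.css
    # Then link the generated CSS in the HTML instead of the .less file:
    # <link rel="stylesheet" href="styles.css">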
Is there a setting for Apache or .htaccess to not open images in the browser, but instead force the user to download them to their computer? E.g. when they navigate to http://site.com/image.jpg, this would make them download the file. The only time I want images loaded in the browser is when they're embedded in an HTML page, e.g. http://site.com/mypage.html.
If that is not possible, can we at least block it completely, so that if they go to http://site.com/image.jpg they get a 403 error or something similar for any file other than HTML and PHP?
There would be a bit of a performance overhead, but you could make a page (PHP or whatever language) whose only job is to pull up images from a directory that otherwise isn't web accessible. You could then make all image links go to that page, and make them still look like image URLs using rewrites.
Page: /images/25.jpg => /images.php?id=25&type=jpg or something similar
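A rough .htaccess sketch of that rewrite (images.php and the URL scheme are the hypothetical names from above):

    RewriteEngine On
    # Serve /images/25.jpg through the script as /images.php?id=25&type=jpg
    RewriteRule ^images/(\d+)\.(jpe?g|png|gif)$ images.php?id=$1&type=$2 [L,QSA]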
Not sure exactly what you are trying to do, but you might want to read this:
http://michael.theirwinfamily.net/articles/csshtml/protecting-images-using-php-and-htaccess