I am using WP Minify and a CSS/JavaScript file aggregation plugin for website optimization.
Can anybody answer the following?
Which plugin is considered good for database optimization?
How do I add expiry headers to JavaScript, images and stylesheets?
Which plugin is good for enabling a cache to improve the website's performance?
There are several plugins that help to create a cache so that, instead of fetching content from the database, the content is loaded from the cache, i.e. an HTML version of the page.
Moreover, you should follow the few SEO guidelines that I have mentioned below to optimize your WordPress website for search engines like Google.
http://wordpress-tuts.blogspot.com/2017/05/best-ways-to-optimize-and-boost-seo-of.html
For expiry headers you can modify the .htaccess file in your website's root folder. You can set the expiry time for all files served from that root.
Here is a simple guide:
http://softstribe.com/wordpress/how-to-add-expires-headers-in-wordpress/
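As a minimal sketch, assuming mod_expires is enabled on the server (the one-month lifetimes are arbitrary examples), the .htaccess rules look something like this:

<IfModule mod_expires.c>
  ExpiresActive On
  # Cache images, CSS and JavaScript for a month; adjust lifetimes to taste.
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>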
I have a website on Sitefinity 4.4. I need to make a document available at a very specific URL, i.e.
http://www.example.com/reports/the-report.pdf
If I just create a directory in the root of the site it does not work (503 error). Also when I try to use the 302Redirect.xml file to redirect the URL to the PDF it does not work either (same error). The link has already been published and has to be exactly as specified. How do I solve this?
Any help would be greatly appreciated.
Sitefinity wouldn't block a folder. Adding a physical folder and dropping the report in the proper place should work, so this probably means you'll have to check your server configuration.
Anyway, the fastest way outside Sitefinity would be to just create an IIS rewrite rule. Make http://www.example.com/reports/the-report.pdf the pattern and redirect it to the URL of the document in the Sitefinity library.
When you upload a document to the library in Sitefinity it gives you a direct URL, something like /docs/defaultlibrary/document. You can verify the URL by going to Content >> Documents and Files and choosing "Embed link to this file". That gives you a pop-up with the URL.
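As a rough sketch of that rewrite rule, assuming the IIS URL Rewrite module is installed (the target path is just the example library URL from above), the rule inside <system.webServer> in web.config could look like:

<rewrite>
  <rules>
    <!-- Redirect the published link to the document's Sitefinity library URL -->
    <rule name="ReportRedirect" stopProcessing="true">
      <match url="^reports/the-report\.pdf$" />
      <action type="Redirect" url="/docs/defaultlibrary/document" redirectType="Permanent" />
    </rule>
  </rules>
</rewrite>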
I'm trying to understand how dynamic caching is done in Apache.
I read the Caching Guide of Apache and an article about dynamic caching, but I still don't understand exactly how dynamic caching works internally.
Say for example I have a PHP page that serves content through reading from a database according to parameters in the user's URL query-string (or parameters specified in POST).
e.g. www.mySite.com?articleID=31
How is that cached then?
Does mod_cache keep the content retrieved from the database for this specific article?
Any sources or suggestions are welcomed.
It caches the output of your script, i.e. the HTML, keyed by the request URL, so each distinct query string (such as ?articleID=31) gets its own cache entry. What I'm not sure about is whether CacheStorePrivate On will cache PHP scripts that send no cache headers. I'm using Apache 2.4.17 and it looks like it doesn't.
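If you want the output to be cacheable in the first place, the safest route is to send explicit cache headers from the script itself. A minimal PHP sketch (the five-minute lifetime is an arbitrary example):

<?php
// Tell mod_cache (and browsers) this response may be cached for 5 minutes.
header('Cache-Control: public, max-age=300');
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 300) . ' GMT');

// ... fetch the article from the database and render it as usual ...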
I have our basic corporate static HTML website installed in our web root directory and our billing software installed in /portal. I have integrated the two to look like a single site by including the /menu.tpl Smarty template file in the /portal/header.tpl file. However, if I use relative URLs, the menu system doesn't work, because the base URL for the billing script is /portal. For example, if I create a link to faq.php in menu.tpl and load a page on the portal site, the link in the menu back to the FAQ page becomes /portal/faq.php, whereas if I load a page off the root site the link is just /faq.php, as it should be.
The obvious answer is to just use absolute URLs, but I need the site to be portable as I have many developers who need to install and test it.
I can't find any way to resolve this. Any ideas?
I ran into the same problem as you a while ago, and after trying a lot of dead ends, I finally ended up with the following solution:
For any URL that needs to be a chameleon, i.e. change its path depending on the environment, insert a PHP function that writes out the correct URL.
If you include the PHP function from a single central file, then you can change all of the URLs in the entire site automatically, based on a setting, or some pre-detected switch such as the current domain name, etc.
Example:
<?php print_base_url_plus("/menu.php"); ?>
... where print_base_url_plus() is a function which appends the base URL onto the output.
You may find that you have to change some of the URLs to point at .php files, so they are preprocessed by the PHP engine, or you can alter the web server settings so that standard .htm files are piped through the PHP engine, just like .php files.
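A minimal sketch of such a helper, assuming a single central config file that each page includes (the setting-based approach is one of the options mentioned above):

<?php
// config.php -- included by every page; one place to change per install.
$GLOBALS['site_base_url'] = '';   // e.g. '' in production, '/~dev/site' on a dev box

function print_base_url_plus($path) {
    // Emit the configured base URL plus the given path, so links resolve
    // correctly no matter which directory the including page lives in.
    echo $GLOBALS['site_base_url'] . $path;
}

A link in menu.tpl then becomes <a href="<?php print_base_url_plus('/faq.php'); ?>">FAQ</a>, which renders the same correct URL whether the menu is included from the root site or from /portal.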
Is there any way to automatically minify static content and then serve it from a cache? Similar to how mod_compress/mod_deflate work? Preferably something I could use in combination with compression (since compression has a more noticeable benefit).
My preference is something that works with lighttpd but I haven't been able to find anything, so any web server that can do it would be interesting.
You can try nginx's third party Strip module:
http://wiki.nginx.org/NginxHttpStripModule
Any module you use is just going to remove whitespace. You'll get a better result by using a minifier that understands whatever you're minifying, e.g. Google's Closure JavaScript compiler.
It's smart enough to know what a variable is and make its name shorter. A whitespace remover can't do that.
I'd recommend minifying offline unless your site is very low traffic. But if you want to minify in your live environment I recommend using nginx's proxy cache.
Or you can look into memcached for an in-memory cache or Redis for the same thing but with disk backup.
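As an illustrative sketch of the in-memory approach, assuming PHP's Memcached extension and a hypothetical minify_js() wrapper around your minifier of choice:

<?php
// Serve a minified script, caching the minified bundle in Memcached.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

$key = 'minified:site.js';
$js  = $mc->get($key);
if ($js === false) {                                   // cache miss: minify once
    $js = minify_js(file_get_contents(__DIR__ . '/site.js'));
    $mc->set($key, $js, 3600);                         // keep it for an hour
}
header('Content-Type: application/javascript');
echo $js;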
I decided to do this through PHP (mostly because I didn't feel like writing a lighttpd module).
My script takes in a query string specifying the type of the files requested (js or css), and then the names of those files. For example, on my site the CSS is added like this:
<link rel="stylesheet" href="concat.php?type=css&style&blue" ... />
This minifies and concatenates style.css and blue.css.
It uses JSMin-PHP and cssmin.
It also caches the files using XCache if it's available (since minifying is expensive). I actually plan to change the script so it doesn't minify if XCache isn't available, but I have XCache and I got bored.
Anyway, if anyone else wants it, it's here. If you use mine you'll need to change the isAllowed() function to list your files (it may be safe to make it just return true, but it was easy to just list the ones I want to allow).
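For reference, a stripped-down sketch of how such a script can be structured (isAllowed() and minify() here are placeholders; the real script delegates to JSMin-PHP and cssmin):

<?php
// concat.php -- concatenate, minify and cache the requested files.

function isAllowed($name) {
    // Whitelist the files you are willing to serve.
    return in_array($name, array('style', 'blue'), true);
}

function minify($code) {
    // Placeholder only: collapse runs of whitespace. Use a real minifier here.
    return preg_replace('/\s+/', ' ', $code);
}

// "concat.php?type=css&style&blue" parses into the keys: type, style, blue.
$type  = (isset($_GET['type']) && $_GET['type'] === 'js') ? 'js' : 'css';
$names = array_diff(array_keys($_GET), array('type'));

header('Content-Type: ' . ($type === 'js' ? 'application/javascript' : 'text/css'));

$key = "concat:$type:" . implode(',', $names);
if (function_exists('xcache_isset') && xcache_isset($key)) {
    echo xcache_get($key);                    // cached bundle: serve and stop
    exit;
}

$out = '';
foreach ($names as $name) {
    if (isAllowed($name)) {
        $out .= minify(file_get_contents(__DIR__ . "/$name.$type"));
    }
}
if (function_exists('xcache_set')) {
    xcache_set($key, $out, 3600);             // keep the bundle for an hour
}
echo $out;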
I use Microsoft Ajax Minifier which comes with a C# library to minify js files. I use that on the server and serve up a maximum of two minified .js files per page (one "static" one that is the same across the whole site, and one "dynamic" one that is specific to just that page).
Yahoo's YUI compressor is also a simple Java .jar file that you could use as well.
The important thing, I think, is not to do it on a file-by-file basis. You really do need to combine the .js files to get the most benefit. For that reason, an "automatic" solution is not really going to work, because it will necessarily only operate on a file-by-file basis.
If you use Nginx instead of lighttpd then you can take advantage of Nginx's embedded Perl support to leverage the Perl module JavaScript::Minifier to minify and cache JS server-side.
Here are the details on how to achieve this: wiki.nginx.org/NginxEmbeddedPerlMinifyJS
I have read about a technique that involves writing a rendered dynamic page to disk and, using mod_rewrite, serving that copy when it exists.
I was wondering if this was a viable option or if there were better alternatives that I am not aware of.
(Note that I'm on a shared machine and mod_cache is not an option.)
You could use your cron job to run the script and redirect the output to a file.
If you had a PHP file index.php, all you would have to do is run
php index.php > (location of static file)
You just have to make sure that your script runs the same on the command line as it does when served by Apache.
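Putting the two pieces together, a hedged sketch (the paths and the ten-minute interval are assumptions):

# crontab: rebuild the static copy every ten minutes
*/10 * * * * php /var/www/index.php > /var/www/cache/index.html

# .htaccess: serve the static copy when it exists, else fall through to PHP
RewriteEngine On
RewriteCond %{DOCUMENT_ROOT}/cache/index.html -f
RewriteRule ^(index\.php)?$ /cache/index.html [L]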
I would use a cache at the application level, because the application knows best when the cached version is out of date, and it is more flexible and powerful in the matter of cache invalidation.
Does the page need to be regenerated every so often simply because time has passed? Or should a static version be regenerated whenever the page is updated?
If the latter, you could write a script that makes a copy of the just-edited page and saves it under its static filename. That should lighten the write load, since in that scenario you wouldn't need a fresh static copy unless a change was actually made.
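A minimal sketch of that regenerate-on-update idea (render_page() and the cache path are assumptions):

<?php
// Called right after an edit is saved: render the page once and
// write the result out under its static filename.
function publish_static_copy($slug) {
    $html = render_page($slug);               // render_page() is assumed
    file_put_contents(__DIR__ . "/cache/$slug.html", $html, LOCK_EX);
}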