Does VWO (Visual Website Optimizer) slow down website load time in any manner?

Visual Website Optimizer is an A/B testing tool that helps a site owner compare his site against a modified version of it. It places a simple snippet of code on your website and creates a new version of your web page. It then shows one version of your page to 50% of your visitors and the other version to the remaining 50%. This way the owner can see which version of the site generates more revenue and drop the other one.
So my question is: can VWO slow down the site's loading time somehow? And what are the drawbacks of using VWO on a website?

Yes, there's a little bit of additional lag in load time, as the script that makes the decision has to call home to the VWO servers, see what variations should be served, then serve that particular page.
The trick to minimising that loading lag is to put the script absolutely first on the target page, so that nothing else is happening before the script fires (but you'll always have lag).
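If you want to put a number on that lag for your own site, a minimal sketch along these lines can help; the snippet URL below is a placeholder, so substitute the actual script src from your own VWO snippet:

```python
import time
import requests  # third-party: pip install requests

# Placeholder URL -- replace with the script src from your own VWO snippet.
SNIPPET_URL = "https://example.com/path/to/vwo-snippet.js"

def time_fetch(url, runs=10):
    """Fetch a URL several times and report min/avg latency in milliseconds."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        requests.get(url, timeout=10)
        timings.append((time.perf_counter() - start) * 1000)
    return min(timings), sum(timings) / len(timings)

fastest, average = time_fetch(SNIPPET_URL)
print(f"fastest: {fastest:.0f} ms, average: {average:.0f} ms")
```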

This blog post by VWO sums everything up: https://vwo.com/blog/how-vwo-affects-site-speed/
They write in that post:
Having said all this, we are confident that VWO’s best-in-class technology coupled with optimal campaign settings will ensure that your website never slows down
However, I would suggest testing it out on your own page and seeing whether it works for you or not.

Related

Why is it better to work on a local website copy than live on the website?

I used to work directly on the live website when editing a site (I work alone), but some people have told me "that's the old way". I'm willing to change how I work, but how can I avoid losing time doing this?
First, that means I need to get a copy of the website onto my computer. I have to copy the files and dump and restore the database: a first waste of time. If my customer adds an extension to the website in the meantime (in WordPress, for example), my modifications could be affected, so I need to add it to my local copy too. If I need to modify the DB, I will need to do it on the local copy as well.
Secondly, if I want to show work in progress to my customer, I need to apply all modifications to the live website and check that everything works: still a waste of time.
And finally, when everything is OK, I need to update the live website again, both files and DB.
So, there are two options:
this is not the correct way to do it, and there are tools that handle all of that transparently (I hope so)
this is not a waste of time, but time needed to work properly (then I understand why agencies charge high prices, and I'll keep my method)
It depends on the complexity of the project and the size of your team.
One of the major risks of working on a live site is introducing bugs into production. You also want confirmation of newly developed functionality from QA or your customer before your users can access it.
Basically, you want to make sure your new code does not break the live site, so working on a local instance helps with that, and you could also deploy your changes to a test site for approval and QA.
Also, if you are working with a larger team, working on the live site just won't scale, and the risk of introducing bugs is even higher.
You could consider using Docker to simplify development on your local machine.
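If the dump-and-restore step described in the question is what feels like wasted time, it can largely be scripted away. A minimal sketch, assuming a WordPress-style MySQL setup with mysqldump and mysql on your PATH; every host name, user and database name below is a placeholder:

```python
import subprocess

# All connection details below are placeholders -- fill in your own.
LIVE = {"host": "live.example.com", "user": "deploy", "db": "wordpress"}
LOCAL = {"host": "127.0.0.1", "user": "root", "db": "wordpress_dev"}
DUMP_FILE = "live_dump.sql"

def refresh_local_copy():
    """Dump the live database and load it into the local development copy."""
    with open(DUMP_FILE, "w") as dump:
        subprocess.run(
            ["mysqldump", "-h", LIVE["host"], "-u", LIVE["user"], "-p", LIVE["db"]],
            stdout=dump, check=True,  # -p prompts for the password interactively
        )
    with open(DUMP_FILE) as dump:
        subprocess.run(
            ["mysql", "-h", LOCAL["host"], "-u", LOCAL["user"], "-p", LOCAL["db"]],
            stdin=dump, check=True,
        )

if __name__ == "__main__":
    refresh_local_copy()
```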

Performance testing of a Sitecore website

My client gave me a Sitecore website to do some performance testing. I really don't have any experience with Sitecore websites or Sitecore itself (which I am working on now). I have some idea of performance testing a website and also got additional info from Stack Overflow. But I am curious to know whether there is any difference when testing a Sitecore website. What is the best practice for testing a Sitecore website? A little bit about the performance testing scope:
The website handles different kinds of enrollment paths for students. There are a couple of enrollment paths, all of which end with a payment made by the customer. More than one student can enroll at a time (say, six together). Performance testing will include enrollments for all of these paths.
A lot of customers may try to enroll at the same time, both in the same enrollment path and in different enrollment paths.
Also keep in mind that since this is a customer-facing website, the images/text/files hosted in Sitecore should be displayed on the website quite quickly.
Any help is appreciated. Thanks!
Typically there are three ways to come at performance testing for Sitecore.
The first is that it's basically just a web application, so most tools you'd use to test those are valid. Load testing tools like JMeter (or Windows equivalents) that simulate requests to pages and measure response times can give you an idea of how your Sitecore application behaves under load. Alternatively, the developer tools in browsers can show you how long individual requests take, and what resources are being downloaded. Both can help you form a picture of the site's overall performance levels.
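As an illustration of that first approach, here is a minimal sketch of what load testing tools like JMeter do at their core: fire concurrent requests at a page and summarize the response times. The URL and the numbers are placeholders to adjust for your own site:

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor
import requests  # third-party: pip install requests

URL = "https://example.com/some-sitecore-page"  # placeholder
CONCURRENCY = 20
REQUESTS = 200

def timed_get(_):
    """Fetch the page once and return (status code, elapsed seconds)."""
    start = time.perf_counter()
    response = requests.get(URL, timeout=30)
    return response.status_code, time.perf_counter() - start

with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    results = list(pool.map(timed_get, range(REQUESTS)))

latencies = sorted(elapsed for _, elapsed in results)
errors = sum(1 for status, _ in results if status >= 400)
print(f"median: {statistics.median(latencies) * 1000:.0f} ms")
print(f"95th percentile: {latencies[int(len(latencies) * 0.95)] * 1000:.0f} ms")
print(f"errors: {errors}/{REQUESTS}")
```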
The second is that Sitecore includes some tools for measuring how hard Sitecore itself is working to render pages. The "Experience Editor" (the WYSIWYG view for editing web page content) has a "debug" mode which can tell you how many content items are being read to render a page, what UI components are being run and how long these things are taking. This can help you tweak how code queries Sitecore's databases, and how components are cached in order to increase performance.
Thirdly, any ASP.Net application can have low-level performance tracing done with standard .Net tools. Visual Studio's performance tracing tools, or 3rd party stuff like dotTrace can all give you a detailed view of how long IIS spends working on individual pages, and what parts of the code are taking the most time or memory.
I wrote up a user-group presentation I did on this topic a while back:
https://jermdavis.wordpress.com/2017/10/02/measure-if-you-want-to-go-faster/
and more recently I wrote about some general patterns you might see when doing low-level performance traces:
https://jermdavis.wordpress.com/2018/02/05/spotting-common-challenges-when-youre-doing-performance-tracing/
Sitecore is basically a .NET-based content management system, so there should not be any difference from performance testing any other web application; the same approach applies.
The best entry-level document I've seen on the web so far is Performance Testing Guidance for Web Applications; it will quickly get you familiar with the concepts of load testing, how to implement it, what metrics need to be considered, etc.
With regard to load testing tools, the most natural choice would be the Microsoft Visual Studio Load Testing Framework; however, it assumes having a relevant license and some C# coding skills. If you don't have either of these, you can consider one of the free and open source load testing tools.
While creating your script, keep in mind that each virtual user needs to represent a real user as closely as possible, so mind cookies, headers, cache, think times, distribution of virtual user groups, etc.
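As a sketch of what representing a real user can look like in practice, here is a minimal Python virtual user that keeps a cookie session, sends browser-like headers, and pauses for think time between steps; the base URL and the journey paths are placeholders:

```python
import random
import time
import requests  # third-party: pip install requests

BASE = "https://example.com"  # placeholder
# A plausible user journey -- replace with real paths from your application.
JOURNEY = ["/", "/courses", "/enroll", "/payment"]

def run_virtual_user():
    # A Session keeps cookies across requests, like a real browser would.
    session = requests.Session()
    session.headers.update({
        "User-Agent": "Mozilla/5.0 (load-test virtual user)",
        "Accept-Language": "en-US,en;q=0.9",
    })
    for path in JOURNEY:
        session.get(BASE + path, timeout=30)
        time.sleep(random.uniform(2, 8))  # think time between page views

run_virtual_user()
```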

How to do load testing in XPages

I am facing an issue with my XPages application. It works perfectly fine with a small number of concurrent users, but when more concurrent users, say more than 1000, try to access the XPages application, it becomes very slow. I have looked through the code and corrected some redundant code.
But I am not sure this is the issue. Given that, is there any way in Lotus Notes to simulate load testing with 1000 users?
Please help me if there is any workaround.
Agree with Oliver about using JMeter.
But then what you really want is to find out where you have "expensive" code. For an agent you can just "profile" it. However, that is a little less straightforward for an XPage. You can try the XPages Toolbox from OpenNTF.org. I have not tried it on Domino 9.0.x, but I would think you could use it.
Another simple (and quick) way to get an idea is to print some timing info to the server console when you load the pages in your application. You can use a phase listener to add this information, or put it in another more specific location; it really depends on the way your application is structured. But this way you can get a very quick idea of where the bottlenecks are before you dive into something like the toolbox :-)
/John
We used JMeter to get an idea of what will happen if X users access our app in Y threads, etc.: http://jmeter.apache.org/
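If you want a rough first read on where the slowdown starts before setting up JMeter properly, a minimal sketch like the following ramps up the number of concurrent simulated users and reports the median response time at each step; the URL is a placeholder for one of your XPages:

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor
import requests  # third-party: pip install requests

URL = "https://example.com/app.nsf/page.xsp"  # placeholder XPage URL

def timed_get(_):
    """Fetch the page once and return the elapsed time in seconds."""
    start = time.perf_counter()
    requests.get(URL, timeout=60)
    return time.perf_counter() - start

# Ramp up concurrency and watch where the median response time climbs.
for users in (10, 50, 100, 250, 500, 1000):
    with ThreadPoolExecutor(max_workers=users) as pool:
        latencies = list(pool.map(timed_get, range(users)))
    print(f"{users:5d} users -> median {statistics.median(latencies) * 1000:.0f} ms")
```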

How to compare test website and live website

We have our production server running our website. Then we have a test server which has exactly the same data but with code changes for some new functionality. This web app has over 500 pages.
Is there any program that can:
Log in to the test site
Crawl through each page and then save the page as HTML
Compare it with the same page saved from the live site?
This way we can make sure that new features that we add to our test site will not break the live site when code updates are applied to production.
I am currently trying to use the WinHTTrack website copier and then comparing the test and live folders with a code comparison tool like Beyond Compare. This works OK, but a lot of files show up as changed simply because of the domain name differences.
Looking forward to ideas / solutions for this problem.
Regards
Have you looked at using Watir for this? It's not exactly what you are looking for, but it might allow you more granularity in your tests and ensure the site is functionally identical, rather than getting caught up on changing GUIDs, timestamps and all the other things that tend to change across any significantly sized website from day to day as part of its standard functionality.
Apparently you can't make consistent, reproducible builds of your project, can you? I would recommend moving towards that in the long run; it will save you a lot of headaches. That way you would know exactly what was deployed to which server and when, so there would be no more need to bend over backwards to get the deployed sources back like this...
I know this is not a direct solution to your problem... but maybe it is worth comparing whether you would save more in the long run by investing the effort into your build process now, instead of implementing this workaround (and then improving your build process anyway, because one day you will almost surely need to do that).
wget has a --convert-links option; there are also some options to preserve cookies that might let you do it logged in: http://drupal.org/node/118759#comment-664498
Use an offline downloader to download all files to your computer from both sources, then compare the folder contents using a free tool like Total Commander.
EDIT
Alternatively, load both of your sources into a version control system (e.g. CVS) and compare them there.
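The domain-name noise described in the question can also be filtered out with a small script: fetch the same page from both servers, rewrite the domain in one copy, and diff what remains. A minimal sketch; the host names and paths are placeholders:

```python
import difflib
import requests  # third-party: pip install requests

LIVE = "https://www.example.com"      # placeholder
TEST = "https://test.example.com"     # placeholder
PATHS = ["/", "/about", "/contact"]   # extend with your crawled page list

for path in PATHS:
    live_html = requests.get(LIVE + path, timeout=30).text
    test_html = requests.get(TEST + path, timeout=30).text
    # Normalize the domain so it does not show up as a spurious change.
    test_html = test_html.replace(TEST, LIVE)
    diff = list(difflib.unified_diff(
        live_html.splitlines(), test_html.splitlines(),
        fromfile="live" + path, tofile="test" + path, lineterm="",
    ))
    print(f"{path}: {'identical' if not diff else f'{len(diff)} diff lines'}")
```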

Browser plugin to test a site's look when migrating

I'm thinking I need a browser plugin that does the following, and if it doesn't exist, it should. I may as well say FF for now, but it could be any browser.
The problem: when moving a website from one server to another, you need migration testing. It is a pain to click on every link by hand and compare it to the old host. You really need two machines, or you have to constantly thrash your hosts file.
The plugin:
It would allow you to specify an alternate hosts entry for a website. Two entries would make it clear: one for live, one for test.
The plugin would crawl every link on the site, render each page in the browser, and save an image of the entire page.
It would then switch hosts and repeat, saving the images in a second folder. Since the rendering engines match, the images should match. We need to switch hosts (e.g. via /etc/hosts) so that all absolute links are the same for the site.
This could be part of the plugin or external: now that we have two folders of identically named images, we run an image-diff program on the whole batch. A quick test would be a binary diff or hash, or we could get more sophisticated and determine how different each image is.
This would save so much time. So can it be done with existing tools, or do I need to go write it?
Have a look at Selenium, it allows you to script interactions with the browser and verify content.
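As a rough sketch of the screenshot-and-compare idea using Selenium: render the same pages from both hosts and compare hashes of the screenshots (the byte-for-byte hash is the "quick test" from the question, so any dynamic content will register as a mismatch). Both base URLs and the path list are placeholders:

```python
import hashlib
from selenium import webdriver  # third-party: pip install selenium

LIVE = "https://www.example.com"        # placeholder
TEST = "https://test.example.com"       # placeholder
PATHS = ["/", "/products", "/contact"]  # replace with your crawled link list

def screenshot_hash(driver, url):
    """Render a page and return a hash of its screenshot bytes."""
    driver.get(url)
    return hashlib.sha256(driver.get_screenshot_as_png()).hexdigest()

driver = webdriver.Firefox()  # requires geckodriver to be installed
try:
    for path in PATHS:
        match = screenshot_hash(driver, LIVE + path) == screenshot_hash(driver, TEST + path)
        print(f"{path}: {'match' if match else 'DIFFERS'}")
finally:
    driver.quit()
```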
That is overengineered. What kind of website is it? How big? Which framework (PHP, JSP, Rails, etc.)? Why not copy the website onto the new server and grep the code for specific ties to the old server?
I'd concentrate on why you think the site would differ between two servers, and focus on testing those specific cases rather than the whole site. When a site is moved to a new machine the issues are generally very obvious from looking at a couple of pages.
Presumably they are both looking at the same data source (assuming there is a data source); otherwise a folder diff of the two installations would suffice. That being the case, it should be a simple task to identify which areas of the site are likely to be affected by a server migration.
Also, I wouldn't personally trust a machine matching two images to sign off a system as ready to go live. There just isn't a substitute for real human testing. Yes, it's time-consuming, but how important is your site?
Try http://www.browsercam.com/ - the free trial should allow you to specify a main page and follow links to automatically make screenshots of the sub-pages as well.