My client gave me a Sitecore website to do some performance testing. I really don't have any experience with Sitecore websites or Sitecore itself (which I am working on now). I have some idea of performance testing of a website and also got additional info from Stack Overflow. But I am curious to know if there is any difference in testing a Sitecore website? What is the best practice to test a Sitecore website? A little bit about the performance testing scope:
The website handles different kinds of enrollment paths for students. So there are a couple of enrollment paths, all of which end with a payment made by the customer. There can be more than one student enrolling at a time (like 6 together). Performance testing will include enrollments for all of these paths.
A lot of customers may try to enroll at the same time, both in the same enrollment path and in different enrollment paths.
Also keep in mind that since this is a customer-facing website, the images/text/files hosted in Sitecore should be shown on the website quite quickly.
Any help is appreciated. Thanks!
Typically there are three ways to come at performance testing for Sitecore.
The first is that it's basically just a web application, so most tools you'd use to test those are valid. Load testing tools like JMeter (or Windows equivalents) that simulate requests to pages and measure response times can give you an idea of how your Sitecore application behaves under load. Or the developer tools in browsers can show you how long individual requests take, and what resources are being downloaded. Both can help you form a picture of the site's overall performance levels.
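If you just want a quick feel for response times before you build out a full JMeter test plan, even a small script that fires concurrent requests will do. A rough sketch (the URLs and the number of simulated users are placeholders, not taken from the question):

```python
# Rough sketch of a "poor man's" load test: hit a few pages concurrently and
# record response times. URLs and thread counts are placeholders - adjust to
# your own enrollment paths and expected concurrency.
import time
from concurrent.futures import ThreadPoolExecutor

import requests

PAGES = [
    "https://example-sitecore-site.com/",
    "https://example-sitecore-site.com/enroll/path-a",
    "https://example-sitecore-site.com/enroll/path-b",
]

def timed_get(url):
    start = time.perf_counter()
    response = requests.get(url, timeout=30)
    return url, response.status_code, time.perf_counter() - start

# Simulate 20 concurrent users, each requesting every page once.
with ThreadPoolExecutor(max_workers=20) as pool:
    results = list(pool.map(timed_get, PAGES * 20))

# Print the ten slowest responses.
for url, status, elapsed in sorted(results, key=lambda r: r[2], reverse=True)[:10]:
    print(f"{status} {elapsed:.2f}s {url}")
```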
The second is that Sitecore includes some tools for measuring how hard Sitecore itself is working to render pages. The "Experience Editor" (the WYSIWYG view for editing web page content) has a "debug" mode which can tell you how many content items are being read to render a page, what UI components are being run and how long these things are taking. This can help you tweak how code queries Sitecore's databases, and how components are cached in order to increase performance.
Thirdly, any ASP.Net application can have low-level performance tracing done with standard .Net tools. Visual Studio's performance tracing tools, or 3rd party stuff like dotTrace can all give you a detailed view of how long IIS spends working on individual pages, and what parts of the code are taking the most time or memory.
I wrote up a user-group presentation I did on this topic a while back:
https://jermdavis.wordpress.com/2017/10/02/measure-if-you-want-to-go-faster/
and more recently I wrote about some general patterns you might see when doing low-level performance traces:
https://jermdavis.wordpress.com/2018/02/05/spotting-common-challenges-when-youre-doing-performance-tracing/
Sitecore is basically a .NET-based content management system, so performance testing it should not differ from performance testing other web applications; the same approach applies.
The best entry-level document I've seen on the web so far is Performance Testing Guidance for Web Applications; it will quickly get you familiar with the concepts of load testing, how to implement it, what metrics need to be considered, etc.
With regard to a load testing tool, the most natural choice would be the Microsoft Visual Studio Load Testing Framework; however, it assumes you have the relevant license and some C# coding skills. If you don't have either, you can consider one of the free and open-source load testing tools.
While creating your script, keep in mind that each virtual user needs to represent a real user as closely as possible, so mind cookies, headers, cache, think times, distribution of virtual user groups, etc.
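As an illustration only, here is a minimal Locust-style virtual user with think times, headers and session cookies; the host, paths and form fields are placeholders rather than anything from the question:

```python
# Hedged sketch of one "realistic" virtual user using Locust (https://locust.io).
# Run with: locust -f enrollment_load_test.py --users 50 --spawn-rate 5
# The host, paths and form fields below are placeholders.
from locust import HttpUser, task, between

class EnrollmentUser(HttpUser):
    host = "https://example-sitecore-site.com"
    wait_time = between(2, 8)  # "think time" between actions, in seconds

    def on_start(self):
        # Load the landing page first so the client picks up session cookies,
        # just like a real browser would.
        self.client.get("/", headers={"Accept-Language": "en-US"})

    @task(3)
    def browse_enrollment_path(self):
        self.client.get("/enroll/path-a")

    @task(1)
    def submit_enrollment(self):
        # Hypothetical form field, for illustration only.
        self.client.post("/enroll/path-a/submit", data={"students": "2"})
```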
We are in the process of rebuilding a website from scratch based on an existing website. The new site is meant to be an identical copy, and as it contains many pages we need a way to compare content between the sites. It is of course possible to do this manually, but that takes a lot of time and carries a risk of human error.
I have seen that there are services that offer this by inputting two URLs which are then analyzed and where discrepancies are presented. However, these cannot be used as our test environment is local (built in Sitecore).
Is there a way to solve this without making our test environment available online (which is not possible)? For example, does software exist for this, or alternatively some service where you can compare a web page that is online with one that is local?
Note that we're only looking for content comparison (not visual).
(Un)fortunately there are many ways to do this, but fortunately there are some simple ones.
What I would do is:
Get a list of URLs for each site. If the sitemap is exhaustive you could use that; if it's not, you might want to run some Sitecore PowerShell to get the lists.
Given the lists (from files, the Sitecore API, or similar), write a program to visit each URL, get the text of the page after it has finished rendering, and save it to disk (something like Selenium is good for this, and you can use any language; see the sketch below). You'll want a folder structure like host/urlpart/urlpart/pagename.txt, basically mirroring your content tree.
Use a filesystem diff program like WinMerge to compare the two folders.
This is quick and dirty, but a good place to start.
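To make step 2 concrete, here is a rough sketch of that crawler; the URL list file names, output folders and browser options are assumptions for illustration:

```python
# Quick-and-dirty sketch of step 2: visit each URL with Selenium, grab the
# rendered page text, and mirror it into a folder structure that matches the
# URL. The URL list files and output root are hypothetical names.
from pathlib import Path
from urllib.parse import urlparse

from selenium import webdriver
from selenium.webdriver.common.by import By

def dump_site(url_list_file, output_root):
    options = webdriver.ChromeOptions()
    options.add_argument("--headless")
    driver = webdriver.Chrome(options=options)
    try:
        for url in Path(url_list_file).read_text().splitlines():
            url = url.strip()
            if not url:
                continue
            driver.get(url)
            text = driver.find_element(By.TAG_NAME, "body").text
            parsed = urlparse(url)
            rel = parsed.path.strip("/") or "home"
            out_path = Path(output_root, parsed.netloc, rel + ".txt")
            out_path.parent.mkdir(parents=True, exist_ok=True)
            out_path.write_text(text, encoding="utf-8")
    finally:
        driver.quit()

dump_site("test_urls.txt", "dumps/test")
dump_site("live_urls.txt", "dumps/live")
# Then point WinMerge (or any folder diff tool) at dumps/test and dumps/live.
```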
Visual Website Optimizer (VWO) is an A/B testing tool which can help a site owner compare his site with a modified version of it. It puts a simple piece of code in your website and creates a new version of your web page. Then it shows one version of your web page to 50% of your visitors and the other version to the remaining 50%. This way the owner can analyze which version of the site is generating more revenue and dump the other one.
So my question is: can VWO reduce the site loading time somehow? Or what are the drawbacks of using VWO on a website?
Yes, there's a little bit of additional lag in load time, as the script that makes the decision has to call home to the VWO servers, see what variations should be served, then serve that particular page.
The trick to minimising that loading lag is to put the script absolutely first on the target page, so that nothing else is happening before the script fires (but you'll always have lag).
This blog post by VWO sums everything up: https://vwo.com/blog/how-vwo-affects-site-speed/
They write in that post:
Having said all this, we are confident that VWO’s best-in-class technology coupled with optimal campaign settings will ensure that your website never slows down
However, I would suggest testing it out on your page to see whether it works for you or not.
Question
Is it worth building a web application front-end for my department's automated regression tests? I've searched quite a bit and I don't think anything like this exists. Basically, the web application would allow a user to specify a URL, expected inputs, expected outputs, and an expected return URL. On the back-end, a headless browser running on the server would test the scenario just defined by the user... I've searched quite a bit to see if something as simple as this exists, but I haven't had any luck. I've found lots of tools for programmatic operation of browser commands, but I have not found a web front-end for testing another web application.
Background
My team has dedicated automated regression tests that the testers run on their local machines. The tests are written in Python, utilize some Selenium integration plugins, and use an Excel spreadsheet as input for what to test. They are maintained by the QA department.
Problem
Nobody outside the QA team knows how extensive these regression tests are, because they exist only on individual laptops.
They have no central repository, and the dev team has no means of actively updating these tests as we build new features. We must leave it 100% up to the QA department.
The business analysts don't have access to the results of these tests. Because of all this, a lot of uncertainty exists around our automated testing, increasing reluctance to change things without instructing the QA team to perform full-scale manual regression tests...
This has led me to consider putting all of our Selenium tests in the cloud behind a user-friendly web front-end that anyone can use and access from anywhere. They could then easily create new tests using dropdown menus. Everyone (developers, testers, and business analysts) could see what's covered in a test sequence and update it as we add new features. I believe this would also make it easier to have Jenkins jobs trigger tests at timed intervals if the web application exposed web service hooks for Jenkins... But I feel like perhaps I'm re-inventing the wheel. Is what I'm proposing to build worth it?
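What I have in mind is roughly something like this, just as a sketch of the idea - every route, field name and selector below is invented for illustration:

```python
# Sketch of the proposed service: a tiny Flask endpoint that accepts a test
# definition and runs it with a headless Selenium browser. All route names,
# JSON fields and selectors are hypothetical.
from flask import Flask, jsonify, request
from selenium import webdriver
from selenium.webdriver.common.by import By

app = Flask(__name__)

@app.route("/run-test", methods=["POST"])
def run_test():
    # Expected payload, e.g. {"url": ..., "inputs": {...}, "submit": ..., "expected_url": ...}
    spec = request.get_json()
    options = webdriver.ChromeOptions()
    options.add_argument("--headless")
    driver = webdriver.Chrome(options=options)
    try:
        driver.get(spec["url"])
        for css_selector, value in spec.get("inputs", {}).items():
            driver.find_element(By.CSS_SELECTOR, css_selector).send_keys(value)
        driver.find_element(By.CSS_SELECTOR, spec["submit"]).click()
        passed = driver.current_url == spec["expected_url"]
        return jsonify({"passed": passed, "final_url": driver.current_url})
    finally:
        driver.quit()

if __name__ == "__main__":
    app.run(port=5000)
```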
Personally I would not spend too much time creating a website to accept user input for creating a test script. Instead, I would spend that time creating a solid test framework and use Jenkins to trigger the tests.
You also need to consider the website's maintenance in the future. What will happen if some new feature has to be included in the website? The QA/BA team will depend on the developers to add the feature.
I think it is better to use a keyword-driven framework, where you can write your entire test in a spreadsheet. (In my project, QA people who are not familiar with programming create test scripts with this approach.)
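A minimal sketch of what that could look like (the spreadsheet layout, keywords and locators here are assumptions, not any particular framework):

```python
# Hedged sketch of a keyword-driven approach: each spreadsheet row is
# (keyword, locator, value) and is dispatched to a Selenium action.
# The file name, sheet layout and locators are assumptions for illustration.
from openpyxl import load_workbook
from selenium import webdriver
from selenium.webdriver.common.by import By

def run_keyword_test(xlsx_path):
    driver = webdriver.Chrome()
    try:
        sheet = load_workbook(xlsx_path).active
        # Assumes three columns with a header row: keyword, locator, value.
        for keyword, locator, value in sheet.iter_rows(min_row=2, values_only=True):
            if keyword == "open":
                driver.get(value)
            elif keyword == "type":
                driver.find_element(By.CSS_SELECTOR, locator).send_keys(value)
            elif keyword == "click":
                driver.find_element(By.CSS_SELECTOR, locator).click()
            elif keyword == "assert_title":
                assert driver.title == value, f"expected title {value!r}"
            else:
                raise ValueError(f"Unknown keyword: {keyword}")
    finally:
        driver.quit()

run_keyword_test("regression_tests.xlsx")
```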
As Jenkins is a web-based application, anyone can trigger your automated regression tests, even the BAs (in my project, that is what I have done). No technical skill is required. We can also pass parameters through Jenkins. A parameter can be anything from text to a file, so you can upload a file containing the steps to be executed to the Jenkins job, and the rest should be taken care of by your test framework.
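Triggering such a parameterised job remotely is just an HTTP call; for example, a sketch where the Jenkins URL, job name, credentials and parameter names are all placeholders:

```python
# Small sketch of triggering a parameterised Jenkins job remotely; the Jenkins
# URL, job name, credentials and parameter names are placeholders.
import requests

JENKINS = "https://jenkins.example.com"
AUTH = ("ba_user", "api-token")  # a Jenkins API token, not a password

response = requests.post(
    f"{JENKINS}/job/regression-suite/buildWithParameters",
    auth=AUTH,
    params={"ENVIRONMENT": "staging", "SUITE": "enrollment"},
)
response.raise_for_status()
print("Triggered build, queue item:", response.headers.get("Location"))
```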
You would definitely need a central repository. It is a must-have. You can take a look at VisualSVN Server. It is easy and FREE.
Keyword Driven framework using Selenium:
http://www.testautomationguru.com/keyword-driven-framework-for-localization-testing-using-selenium-webdriver/
Continuous regression & results:
http://www.testautomationguru.com/continuous-regression-testing-best-practises/
Smoke Test after each build:
http://www.testautomationguru.com/automated-smoke-test-best-practises/
We have our production server running our website. Then we have a test server which has the exact same data but with code changes for some new functionality. This web app has over 500 pages.
Is there any program that can:
Log in to the test site
Crawl through each page and then save it as HTML
Compare it with the same page saved from the live site?
This way we can make sure that new features that we add to our test site will not break the live site when code updates are applied to production.
I am currently trying to use the WinHTTrack website copier and then comparing the test and live folders with a code comparison tool like Beyond Compare. This works OK, but a lot of files show up as changed simply because of the domain name differences.
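One thing I have been considering to cut down that noise is normalising the host name in the saved files before comparing, roughly like this (the folder path and domain names are just placeholders):

```python
# Rough sketch: rewrite the test-site domain to the live-site domain in every
# saved HTML file so the folder diff only shows real content changes.
# The folder path and domain names are placeholders.
from pathlib import Path

TEST_DOMAIN = "test.example.com"
LIVE_DOMAIN = "www.example.com"

for html_file in Path("mirror/test").rglob("*.html"):
    text = html_file.read_text(encoding="utf-8", errors="ignore")
    html_file.write_text(text.replace(TEST_DOMAIN, LIVE_DOMAIN), encoding="utf-8")
```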
Looking forward to ideas / solutions for this problem.
Regards
Have you looked at using Watir for this? It's not exactly what you are looking for, but it might allow you some more granularity in your tests and ensure the site is functionally identical, rather than getting caught up on changing GUIDs, timestamps and all the other things that tend to change across any significantly sized website from day to day as part of its standard functionality.
Apparently you can't make consistent, reproducible builds in your project, can you? I would recommend moving towards that in the long run; it will save you a lot of headaches. That way you would know exactly what was deployed to which server and when, so there would be no more need to bend over backwards to get the deployed sources back like this...
I know this is not a direct solution to your problem... but maybe it is worth weighing whether you would save more in the long run by investing the effort into your build process now, instead of implementing this workaround (and then improving your build process anyway - because one day you will almost surely need to do that).
wget has a --convert-links option; there are also some options to preserve cookies that might let you do it while logged in: http://drupal.org/node/118759#comment-664498
Use an offline downloader, download all files to your computer from both sources, then compare the folder contents using a free tool like Total Commander.
EDIT
Load both of your sources into a CVS, and compare it there.
A recent article has prompted me to pick up a project I have been working on for a while. I want to create a web service front-end for a number of sites to allow automated completion of forms and data retrieval from the results, and other areas of the site. I have achieved a degree of success using Selenium and custom code; however, I am looking to extend this to a stage where adding additional sites is a trivial task (maybe even one which doesn't require a developer).
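The custom code I have today is essentially along these lines - a simplified sketch where the URL, form fields and result selector are placeholders:

```python
# Simplified sketch of the Selenium-based form automation described above;
# the URL, field names and result selector are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example-provider.com/quote")
    driver.find_element(By.NAME, "postcode").send_keys("AB1 2CD")
    driver.find_element(By.NAME, "age").send_keys("30")
    driver.find_element(By.CSS_SELECTOR, "button[type=submit]").click()
    price = driver.find_element(By.CSS_SELECTOR, ".quote-price").text
    print("Retrieved price:", price)
finally:
    driver.quit()
```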
The Kapow web data server looks to achieve a lot of this; however, I am told it is quite expensive (currently awaiting a quote). Has anyone had experience with it, or can anyone suggest any alternatives (open source ideally)?
Disclaimer: I realise the potential legal issues around automating data retrieval from 3rd-party websites - this tool is designed to be used in a price comparison system, and all of the websites integrated with it will be done so with the express permission of the owners. Where the sites provide an API, this will clearly be the favoured approach.
Thanks
I realise it's been a while since I posted this; however, should anyone come across it, I have had lots of success in using the WSO2 framework (particularly the Mashup Server) for this. For data mining tasks I have also used a Java library that this wraps - webharvest - which has achieved everything I needed.