How to do load testing in XPages

I am facing an issue with my XPages application. It works perfectly fine with a small number of concurrent users, but when more concurrent users (say more than 1000) try to access the application, it becomes very slow. I have looked through the code and removed some redundant code, but I am not sure that was the real issue.
Is there any way in Lotus Notes/Domino to simulate a load test with 1000 users?
Please help me if there is any workaround.

I agree with Oliver about using JMeter.
But what you really want is to find out where you have "expensive" code. For an agent you can just profile it; however, that is a little less straightforward for an XPage. You can try the XPages Toolbox from OpenNTF.org. I have not tried it on Domino 9.0.x, but I would think you could use it.
Another simple (and quick) way to get an idea is to print some timing info to the server console when you load the pages in your application. You can use a phase listener to add this information - or put it in another, more specific location - it really depends on how your application is structured. But this way you can get a very quick idea of where the bottlenecks are before you dive into something like the toolbox :-)
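As an illustration, a minimal JSF phase listener that prints how long each lifecycle phase takes could look roughly like this (package and class names are just placeholders):

```java
// Minimal JSF phase listener that prints phase timings to the server console.
// Register the class in faces-config.xml:
//   <lifecycle><phase-listener>com.example.perf.TimingPhaseListener</phase-listener></lifecycle>
package com.example.perf;

import javax.faces.event.PhaseEvent;
import javax.faces.event.PhaseId;
import javax.faces.event.PhaseListener;

public class TimingPhaseListener implements PhaseListener {

    private static final long serialVersionUID = 1L;

    // one start time per request thread
    private final ThreadLocal<Long> start = new ThreadLocal<Long>();

    public PhaseId getPhaseId() {
        return PhaseId.ANY_PHASE; // listen to every lifecycle phase
    }

    public void beforePhase(PhaseEvent event) {
        start.set(System.currentTimeMillis());
    }

    public void afterPhase(PhaseEvent event) {
        Long began = start.get();
        if (began != null) {
            System.out.println(event.getPhaseId() + " took "
                    + (System.currentTimeMillis() - began) + " ms");
        }
    }
}
```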
/John

We used JMeter to get an idea of what will happen if X users access our app in Y threads, etc. http://jmeter.apache.org/
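If you just want a rough feel for what concurrent requests do to the server before building a proper JMeter test plan, a throwaway sketch along these lines can fire many requests in parallel (plain JDK classes; the URL and user count are placeholders - JMeter does all of this properly, with ramp-up, assertions and reporting):

```java
// Very rough sketch of hitting one page from many threads at once.
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class NaiveLoadTest {
    public static void main(String[] args) throws Exception {
        final String target = "http://yourserver/yourapp.nsf/page.xsp"; // placeholder
        final int users = 100;

        ExecutorService pool = Executors.newFixedThreadPool(users);
        for (int i = 0; i < users; i++) {
            pool.submit(new Runnable() {
                public void run() {
                    try {
                        long start = System.currentTimeMillis();
                        HttpURLConnection con =
                                (HttpURLConnection) new URL(target).openConnection();
                        int status = con.getResponseCode(); // forces the request
                        long elapsed = System.currentTimeMillis() - start;
                        System.out.println("HTTP " + status + " in " + elapsed + " ms");
                        con.disconnect();
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.MINUTES);
    }
}
```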

Related

Does VWO (Visual Website Optimizer) slow down website load time in any manner?

Visual Website Optimizer is an A/B testing tool which helps a site owner compare his site with a modified version of it. It puts a simple piece of code in your website and creates a new version of your web page. Then it shows one version of your webpage to 50% of your visitors and the other version to the remaining 50%. This way the owner can analyze which version of the site is generating more revenue and dump the other one.
So my question is: can VWO slow down the site loading somehow? And what are the drawbacks of using VWO on a website?
Yes, there's a little bit of additional lag in load time, as the script that makes the decision has to call home to the VWO servers, see which variation should be served, and then serve that particular page.
The trick to minimising that loading lag is to put the script absolutely first on the target page, so that nothing else happens before the script fires (but you'll always have some lag).
This blog post by VWO sums everything up: https://vwo.com/blog/how-vwo-affects-site-speed/
They write in that post:
Having said all this, we are confident that VWO’s best-in-class technology coupled with optimal campaign settings will ensure that your website never slows down
However, I would suggest testing it out on your page and seeing whether it works for you or not.

Many users using one program (.exe) that includes datasets

I created a time recording program in VB.NET with SQL Server as the backend. Users can send their time entries into the database (I used the typed dataset functionality) and run different queries to get overviews of their working time.
My plan was to put that exe in a folder on our network and let the users make a link on their desktops. Every user writes into the same table but can only see his own entries, so there is no possibility that two users manipulate the same dataset.
During my research I found a warning that "write contentions between the different users" can occur. Is that the case here?
Does anyone have experience with many users using the same exe where that exe is using datasets, and could you advise whether this works or what I should do instead?
SQL Server will handle all of your multi-user DB access concerns.
Multiple users accessing the same exe from a network location can work, but it's kind of a hack. Let's say you wanted to update that exe with a few bug fixes: you would have to ensure that all users close the application before you could release the update. To answer your question though, the application will be isolated to each user running it. You won't have any contention issues when it comes to CRUD operations on the database due to the network deployment.
You might consider something other than a copy/paste style publishing of your application. Visual Studio has a few simple tools you can use to publish your application to a central location using ClickOnce deployment.
http://msdn.microsoft.com/en-us/library/31kztyey(v=vs.110).aspx
My solution was to add a simple shutdown timer to the form, which alerts users to save their data before the program closes at 4 AM.
If I need to upgrade, I just replace the .exe on the network.
Ugly and dirty, yes... but it has worked like a charm for the past 2 years.
Good luck!

How to verify lots of events in a reasonable way

I am new to software testing. Currently I need to test a medium-sized web application. We have just refactored our codebase and added a lot of event logging logic to the existing code. The event logging code writes to both the Windows event log and a SQL database table.
There are about 200 events. What approach should I take to test/verify this code refactoring effectively and efficiently?
Thanks.
I would be tempted to implement unit tests for each of the events to make sure when an event occurs the correct information is passed into your event logging logic.
This would mean that you can trigger one event on the deployed site and verify the data is written to the database and event log, and then have an acceptable level of confidence that the remaining events will be recorded correctly.
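For instance (all class and method names here are hypothetical, using JUnit 4 and Mockito), a test per event could simply verify that the code under test hands the right data to the logging abstraction:

```java
// Hypothetical example: the service under test should pass the right
// event details to the logging abstraction.
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.junit.Test;

public class OrderServiceLoggingTest {

    // hypothetical logging abstraction the application writes through
    public interface EventLogger {
        void logEvent(String eventId, String message);
    }

    // hypothetical class under test
    static class OrderService {
        private final EventLogger logger;

        OrderService(EventLogger logger) {
            this.logger = logger;
        }

        void placeOrder(String orderId) {
            // ...business logic...
            logger.logEvent("ORDER_PLACED", "Order " + orderId + " placed");
        }
    }

    @Test
    public void placingAnOrderLogsTheExpectedEvent() {
        EventLogger logger = mock(EventLogger.class);

        new OrderService(logger).placeOrder("42");

        // verify the event reached the logging layer with the right data
        verify(logger).logEvent("ORDER_PLACED", "Order 42 placed");
    }
}
```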
If unit testing isn't an option then you will need to verify each event manually. I would alternate between checking the database and the event log, as there should be little risk of this area failing; that way you would have 200 checks rather than 400.
You could also partition the application into sensible sections and trigger a few events for each section to give you a reasonable level of confidence in the application.
The approach you take will really be determined by how long you have to test, what the cost would be if an event didn't get logged, and how well developed the logging logic is.
Hope this helps
I would have added tests before you did the refactoring - you don't know where you may have broken it already :).
You say that it logs to the Event Viewer and the DB. I hope you have exposed the logging feature as an interface (see the sketch below) so that you can:
Extend it to log to some other device if needed
Mock it a lot more easily
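A rough sketch of what that abstraction might look like (all names are hypothetical) - the application only depends on the interface, and the concrete sinks can be swapped, extended or mocked:

```java
// Hypothetical logging abstraction with two sinks and a composite that
// fans a single call out to both the Windows event log and the SQL table.
public interface EventLogger {
    void logEvent(String eventId, String message);
}

// writes to the Windows event log (implementation details omitted)
class WindowsEventLogLogger implements EventLogger {
    public void logEvent(String eventId, String message) { /* ... */ }
}

// writes to the SQL table (implementation details omitted)
class DatabaseEventLogger implements EventLogger {
    public void logEvent(String eventId, String message) { /* ... */ }
}

// forwards every event to all configured sinks
class CompositeEventLogger implements EventLogger {
    private final EventLogger[] sinks;

    CompositeEventLogger(EventLogger... sinks) {
        this.sinks = sinks;
    }

    public void logEvent(String eventId, String message) {
        for (EventLogger sink : sinks) {
            sink.logEvent(eventId, message);
        }
    }
}
```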
If you have 200 events to test, that's not going to be easy, to be honest. I don't think you can escape creating an equal number of tests for your 200 events.
I would do it this way:
I would search for all places where my logging interface is used, note all the classes, and start with the critical paths first (that way you at least cover the critical ones).
Or you could start from the end, i.e. note down all possible combinations of logs you are getting, and maybe point to stale data so that you know that if the input is the same, the output should be the same too. Then, every time, regression test your new binaries against this data and you should get a similar number/level of logs.
This shouldn't be too difficult.
Pick a free automated web test tool like Watir (Ruby) or WatiN (.NET), or VS UI Test if you have it.
Create tests that cover the areas of the web application you expect/need to fire events. Examine the SQL DB after each test to see which events fired.
If those event streams are correct for the test, add a step to the test to verify that exactly that event stream was created in the DB.
This will give you a set of tests that will validate the eventing from any portion of your web site in a repeatable fashion.
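The same idea sketched in Java with Selenium WebDriver and JDBC instead of Watir/WatiN - the URLs, element ids, table and column names below are all made up:

```java
// Drive the site through the browser, then assert the expected event row landed in SQL.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import org.junit.Assert;
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class CheckoutEventLoggingTest {

    @Test
    public void checkoutWritesTheExpectedEventRow() throws Exception {
        WebDriver driver = new ChromeDriver();
        try {
            // drive the part of the site that should fire the event
            driver.get("http://test-server/app/checkout");
            driver.findElement(By.id("confirmOrder")).click();

            // then check the event actually landed in the log table
            Connection con = DriverManager.getConnection(
                    "jdbc:sqlserver://dbserver;databaseName=AppLog", "user", "password");
            PreparedStatement ps = con.prepareStatement(
                    "SELECT COUNT(*) FROM EventLog WHERE EventId = ?");
            ps.setString(1, "ORDER_CONFIRMED");
            ResultSet rs = ps.executeQuery();
            rs.next();
            Assert.assertEquals(1, rs.getInt(1));
            con.close();
        } finally {
            driver.quit();
        }
    }
}
```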
The efficient & effective part of this approach is that it allows you to create only as many tests as you need to verify the app. Also, you do not need to recreate a unit test approach with one test per event.
Automating the tests will allow you to re-execute them without additional effort, and this will really add up over the long haul.
This approach can also be taken with manual testing, but it will be tricky to get consistent & repeatable results. Also, re-testing will take nearly as long each time, as the testing uncovers defects that need to be fixed.
Note: while this will be the most effective & efficient way, it will not be exhaustive. There will likely be edge cases that get missed, but that can be said of nearly any test approach. Just add test cases until you get the coverage you need.
Hope this helps,
Chris

How to compare test website and live website

We have our production server running our website. Then we have a test server which has exactly the same data but with code changes for some new functionality. This web app has over 500 pages.
Is there any program that can
Log in to the test site
Crawl through each page and save it as HTML
Compare it with the same page saved from the live site?
This way we can make sure that the new features we add to our test site will not break the live site when the code updates are applied to production.
I am currently trying to use the WinHTTrack website copier and then comparing the test and live folders with a code comparison tool like Beyond Compare. This works OK, but a lot of files show up as changed simply because of the domain name differences.
Looking forward to ideas / solutions for this problem.
Regards
Have you looked at using Watir for this? It's not exactly what you are looking for, but it might allow you more granularity in your tests and ensure the site is functionally identical, rather than getting caught up on changing GUIDs, timestamps and all the other things that tend to change across any significantly sized website from day to day as part of its standard functionality.
Apparently you can't make consistent, reproducible builds in your project, can you? I would recommend moving towards that in the long run; it will save you a lot of headaches. That way you would know exactly what was deployed to which server and when, so there would be no more need to bend over backwards to get the deployed sources back like this...
I know this is not a direct solution to your problem... but maybe it is worth weighing whether you would save more in the long run by investing the effort into your build process now, instead of implementing this workaround (and then improving your build process anyway - because one day you will almost surely need to).
wget has a --convert-links option, and there are also some options to preserve cookies that might let you do it while logged in: http://drupal.org/node/118759#comment-664498
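If you end up scripting the comparison yourself rather than diffing downloaded folders, normalising the domain name before comparing avoids those spurious differences. A minimal sketch (the URLs and page path are placeholders):

```java
// Fetch the same path from live and test, strip the host name out of the
// HTML, then compare the results.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;

public class PageCompare {

    static String fetch(String url, String host) throws Exception {
        BufferedReader in = new BufferedReader(
                new InputStreamReader(new URL(url).openStream(), "UTF-8"));
        StringBuilder sb = new StringBuilder();
        String line;
        while ((line = in.readLine()) != null) {
            // neutralise the only expected difference: the domain name
            sb.append(line.replace(host, "HOST")).append('\n');
        }
        in.close();
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        String path = "/products/list.aspx"; // one of the ~500 pages
        String live = fetch("http://www.example.com" + path, "www.example.com");
        String test = fetch("http://test.example.com" + path, "test.example.com");
        System.out.println(path + (live.equals(test) ? " : identical" : " : DIFFERENT"));
    }
}
```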
Use an offline downloader to download all files to your computer from both sources, then compare the folder contents using a free tool like Total Commander.
EDIT
Load both of your sources into a CVS and compare them there.

Website data retrieval

A recent article has prompted me to pick up a project I have been working on for a while. I want to create a web service front end for a number of sites to allow automated completion of forms and data retrieval from the results, and from other areas of the site. I have achieved a degree of success using Selenium and custom code; however, I am looking to extend this to a stage where adding additional sites is a trivial task (maybe even one which doesn't require a developer).
The Kapow web data server looks like it achieves a lot of this; however, I am told it is quite expensive (I am currently awaiting a quote). Has anyone had experience with it, or can anyone suggest alternatives (open source ideally)?
Disclaimer: I realise the potential legal issues around automating data retrieval from 3rd party websites - this tool is designed to be used in a price comparison system, and every website integrated with it will be done with the express permission of the owners. Where a site provides an API, that will clearly be the favoured approach.
Thanks
I realise it's been a while since I posted this; however, should anyone come across it, I have had a lot of success using the WSO2 framework (particularly the mashup server) for this. For data mining tasks I have also used a Java library that it wraps - webharvest - which has achieved everything I needed.
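For anyone wanting a feel for the general form-submit-and-scrape idea, here is a small sketch using jsoup rather than webharvest (the URL, form fields and CSS selectors are all hypothetical):

```java
// Submit a search form as a browser would, then pull values out of the result page.
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

public class PriceScraper {
    public static void main(String[] args) throws Exception {
        // post the form fields the target site expects
        Document results = Jsoup.connect("http://www.example-retailer.com/search")
                .data("product", "widget")
                .data("postcode", "AB1 2CD")
                .userAgent("price-comparison-bot (with site owner's permission)")
                .post();

        // extract the figures we care about from the result page
        for (Element row : results.select("table.results tr.offer")) {
            String name = row.select("td.name").text();
            String price = row.select("td.price").text();
            System.out.println(name + " -> " + price);
        }
    }
}
```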