Asynchronous WCF Web Service Load Testing

I see several other questions about load testing web services. But as far as I can tell those are all synchronous load testing tools. (Meaning they send a ton of requests, but they go one at a time.)
I am looking for a tool where I can say, "I want 100 requests to be launched at the exact same time".
Now, I am new to the whole load testing thing, so it is possible that those tools are asynchronous and I am just missing it.
Anyway, in short my question is: is there a good tool for load testing WCF Web Services asynchronously (i.e. with lots of threads)?

In general, I recommend you look at soapUI for anything to do with testing web services. They do have load testing features in the Professional edition (I haven't used these yet).
In addition, they've just entered beta with a loadUI product. If it's anywhere near as good as the parent product, then it's worth a hard look.

You can use the Visual Studio load testing agent components to run the test from multiple client machines, which lets you generate as much concurrent load as you have machines available.
There is a licence requirement for using this feature.
There are no tools that will allow you to apply a load at exactly the same instant (i.e. within milliseconds), but this is not necessary to load test an application correctly.
For most needs a single load test server running Visual Studio Ultimate edition will be more than enough to get an understanding of how your web service performs under load.
Visual Studio, and I imagine most other tools, will apply load in an asynchronous manner, but I think what you want is to apply a set load all at once.
This is not really necessary as in practice load is not applied to a service in this manner.
The best bet for services expecting high load is to load your service until a given number of "requests per second" is reached. Finding what level your application should expect is a bit trickier, but involves figuring out roughly how many users you would expect and the amount they will be using it over a given period. For example, 1,000 active users each making around 30 requests during a peak hour works out to roughly 8 requests per second.
The other test to do is to setup a load test harness and run the load up until either the webservice starts to perform badly or the test harness runs out of "oomph" and cannot create any more load.

For development time you can use NLoad (http://nload.github.io)
to run load tests on your development machine or testing environment.
For example:

public class MyTest : ITest
{
    public void Initialize()
    {
        // Initialize your test, e.g., create a WCF client, load files, etc.
    }

    public void Execute()
    {
        // Send http request, invoke a WCF service or whatever you want to load test.
    }
}
Then create, configure and run a load test:
var loadTest = NLoad.Test<MyTest>()
                    .WithNumberOfThreads(100)
                    .WithDurationOf(TimeSpan.FromMinutes(5))
                    .WithDeleyBetweenThreadStart(TimeSpan.Zero)
                    .OnHeartbeat((s, e) => Console.WriteLine(e.Throughput))
                    .Build();

var result = loadTest.Run();

Related

How to test SAP .Net Connector 3 Client / Server without SAP System

I want to write some code using the SAP .Net Connector 3 to receive and send data to a SAP System using RFC and iDoc.
How can I set up a simple SAP test system with RFC to test my code?
Is there a way to mock the SAP System or do I have to install a SAP System?
If so, is there any simple tutorial on how to set up an SAP system with a simple "Hello World" RFC?
I was originally going to post a comment. But it was too long.
This isn't a solution, it's a warning.
I think you have placed too much emphasis on unit tests for this type of solution. Mock the rest of the code all you like, but mocking an interface that may/will behave differently is false confidence.
By all means abstract the infrastructure layer and push dummy data into it to test the rest of the app. But don't plan on mocking the interface in any way that is relevant to stability.
How do you plan to mock:
The sign on process
single sign on, SNC...
gateway connection
connection specific settings
authorizations
load balancing
connection pooling
timeout
Test it against the DEV system, then test again in the QA system,
and get ready for unexpected issues in PROD.
You can write code to generate TABLE/STRUCTURE content, so you can easily mock what you expect to receive from or send to the SAP system. Write a dummy that returns that data and mock the call. Don't bother with mocking the infrastructure; that achieves nothing.
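If you go down that route, a rough sketch of the abstraction might look like the code below. All of the type and interface names are made up for illustration, and it deliberately avoids any NCo types so the rest of the app can be tested without a connection:

    using System.Collections.Generic;

    // Hypothetical gateway that hides the actual RFC call.
    public interface ISapMaterialGateway
    {
        // Returns rows shaped like the TABLE parameter you expect back from SAP.
        IList<MaterialRow> GetMaterials(string plant);
    }

    public class MaterialRow
    {
        public string MaterialNumber { get; set; }
        public string Description { get; set; }
    }

    // Dummy used by the rest of the application in tests; it just returns canned data.
    public class FakeSapMaterialGateway : ISapMaterialGateway
    {
        public IList<MaterialRow> GetMaterials(string plant)
        {
            return new List<MaterialRow>
            {
                new MaterialRow { MaterialNumber = "100-100", Description = "Test material" }
            };
        }
    }

The real implementation of the gateway is then the only piece you test against DEV and QA.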

Best way to run scheduled tasks in ASP.NET CORE [duplicate]

Today we have built a console application for running the scheduled tasks for our ASP.NET website. But I think this approach is a bit error-prone and difficult to maintain. How do you execute your scheduled tasks (in a Windows/IIS/ASP.NET environment)?
Update:
Examples of tasks:
Sending email from an email-queue in the database
Removing outdated objects from the database
Retrieving stats from Google AdWords and filling a table in the database.
This technique by Jeff Atwood for Stack Overflow is the simplest method I've come across. It relies on the "cache item removed" callback mechanism built into ASP.NET's cache system.
Update: Stack Overflow has outgrown this method. It only works while the website is running, but it's a very simple technique that is useful for many people.
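The gist of the technique is roughly the following, placed in Global.asax.cs (a sketch only; the key name and the 60-second interval are made up):

    private const string DummyCacheKey = "ScheduledTaskTrigger";

    protected void Application_Start()
    {
        AddTask();
    }

    private void AddTask()
    {
        // The cached value is irrelevant; we only care about the expiration callback.
        HttpRuntime.Cache.Insert(
            DummyCacheKey,
            "dummy",
            null,
            DateTime.Now.AddSeconds(60),                        // fires roughly every 60 seconds
            System.Web.Caching.Cache.NoSlidingExpiration,
            System.Web.Caching.CacheItemPriority.NotRemovable,
            CacheItemRemoved);
    }

    private void CacheItemRemoved(string key, object value, System.Web.Caching.CacheItemRemovedReason reason)
    {
        // Do the scheduled work here, then re-add the item so the cycle repeats.
        AddTask();
    }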
Also check out Quartz.NET
All of my tasks (which need to be scheduled) for a website are kept within the website and called from a special page. I then wrote a simple Windows service which calls this page every so often. Once the page runs it returns a value. If I know there is more work to be done, I run the page again, right away, otherwise I run it in a little while. This has worked really well for me and keeps all my task logic with the web code. Before writing the simple Windows service, I used Windows scheduler to call the page every x minutes.
Another convenient way to run this is to use a monitoring service like Pingdom. Point their http check to the page which runs your service code. Have the page return results which then can be used to trigger Pingdom to send alert messages when something isn't right.
Create a custom Windows Service.
I had some mission-critical tasks set up as scheduled console apps and found them difficult to maintain. I created a Windows Service with a 'heartbeat' that would check a schedule in my DB every couple of minutes. It's worked out really well.
Having said that, I still use scheduled console apps for most of my non-critical maintenance tasks. If it ain't broke, don't fix it.
I've found this to be easy for all involved:
Create a webservice method such as DoSuchAndSuchProcess
Create a console app that calls this webmethod.
Schedule the console app in the task scheduler.
Using this methodology, all of the business logic is contained in your web app, but you have the reliability of the Windows Task Scheduler, or any other commercial task scheduler, to kick it off and record any return information such as an execution report. Using a web service instead of posting to a page has a bit of an advantage because it's easier to get return data from a web service.
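For illustration, the console side of this can be as small as the sketch below (the URL and method name are placeholders for your own service endpoint):

    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    // Minimal console caller; Task Scheduler runs this on whatever schedule you define.
    class Program
    {
        static async Task Main()
        {
            using (var client = new HttpClient())
            {
                var response = await client.GetAsync("https://example.com/TaskService.asmx/DoSuchAndSuchProcess");
                var report = await response.Content.ReadAsStringAsync();

                // Echo the execution report so the scheduler's history/log captures it.
                Console.WriteLine(report);
                Environment.ExitCode = response.IsSuccessStatusCode ? 0 : 1;
            }
        }
    }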
Why reinvent the wheel? Use the Thread and Timer classes.
private static System.Timers.Timer _timer;   // keep a reference so the timer is not garbage collected

protected void Application_Start()
{
    Thread thread = new Thread(new ThreadStart(ThreadFunc));
    thread.IsBackground = true;
    thread.Name = "ThreadFunc";
    thread.Start();
}

protected void ThreadFunc()
{
    _timer = new System.Timers.Timer();
    _timer.Elapsed += new System.Timers.ElapsedEventHandler(TimerWorker);
    _timer.Interval = 10000;   // milliseconds
    _timer.AutoReset = true;
    _timer.Start();
}

protected void TimerWorker(object sender, System.Timers.ElapsedEventArgs e)
{
    // Do the scheduled work here.
}
Use Windows Scheduler to run a web page.
To prevent malicious users or search engine spiders from running it, when you set up the scheduled task, simply call the web page with a query string, i.e.: mypage.aspx?from=scheduledtask
Then in the page load, simply use a condition:
if (Request.QueryString["from"] == "scheduledtask")
{
    // execute task
}
This way no search engine spider or malicious user will be able to execute your scheduled task.
This library works like a charm
http://www.codeproject.com/KB/cs/tsnewlib.aspx
It allows you to manage Windows scheduled tasks directly through your .NET code.
Additionally, if your application uses SQL Server you can use SQL Server Agent to schedule your tasks. This is where we commonly put recurring code that is data-driven (email reminders, scheduled maintenance, purges, etc.). A great feature built into the SQL Agent is its failure notification options, which can alert you if a critical task fails.
I'm not sure what kind of scheduled tasks you mean. If you mean stuff like "every hour, refresh foo.xml" type tasks, then use the Windows Scheduled Tasks system. (The "at" command, or via the controller.) Have it either run a console app or request a special page that kicks off the process.
Edit: I should add, this is an OK way to get your IIS app running at scheduled points too. So suppose you want to check your DB every 30 minutes and email reminders to users about some data, you can use scheduled tasks to request this page and hence get IIS processing things.
If your needs are more complex, you might consider creating a Windows Service and having it run a loop to do whatever processing you need. This also has the benefit of separating out the code for scaling or management purposes. On the downside, you need to deal with Windows services.
If you own the server you should use the windows task scheduler. Use AT /? from the command line to see the options.
Otherwise, from a web based environment, you might have to do something nasty like set up a different machine to make requests to a certain page on a timed interval.
I've used Abidar successfully in an ASP.NET project (here's some background information).
The only problem with this method is that the tasks won't run if the ASP.NET web application is unloaded from memory (i.e. due to low usage). One thing I tried is creating a task to hit the web application every 5 minutes, keeping it alive, but this didn't seem to work reliably, so now I'm using the Windows scheduler and a basic console application to do this instead.
The ideal solution is creating a Windows service, though this might not be possible (i.e. if you're using a shared hosting environment). It also makes things a little easier from a maintenance perspective to keep things within the web application.
Here's another way:
1) Create a "heartbeat" web script that is responsible for launching the tasks if they are DUE or overdue to be launched.
2) Create a scheduled process somewhere (preferably on the same web server) that hits the web script and forces it to run at a regular interval (e.g. a Windows scheduled task that quietly launches the heartbeat script using IE or what have you).
The fact that the task code is contained within a web script is purely for the sake of keeping the code within the web application code-base (the assumption is that both are dependent on each other), which would be easier for web developers to manage.
The alternate approach is to create an executable server script / program that does all the schedule work itself and to run that executable as a scheduled task. This allows for fundamental decoupling between the web application and the scheduled task. Hence, if you need your scheduled tasks to run even in the event that the web app / database might be down or inaccessible, you should go with this approach.
You can easily create a Windows Service that runs code on an interval using the 'ThreadPool.RegisterWaitForSingleObject' method. It is really slick and quite easy to get set up. This method is a more streamlined approach than using any of the Timers in the Framework.
Have a look at the link below for more information:
Running a Periodic Process in .NET using a Windows Service:
http://allen-conway-dotnet.blogspot.com/2009/12/running-periodic-process-in-net-using.html
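The core of that approach looks roughly like the sketch below (not the article's exact code; the service name and the five-minute interval are made up). The callback fires every time the timeout elapses, because the wait handle is only signalled on shutdown:

    using System;
    using System.ServiceProcess;
    using System.Threading;

    public class PeriodicService : ServiceBase
    {
        private ManualResetEvent _stopSignal;
        private RegisteredWaitHandle _registeredWait;

        protected override void OnStart(string[] args)
        {
            _stopSignal = new ManualResetEvent(false);

            // executeOnlyOnce: false => the callback runs on every timeout, i.e. every 5 minutes.
            _registeredWait = ThreadPool.RegisterWaitForSingleObject(
                _stopSignal,
                DoWork,
                null,
                TimeSpan.FromMinutes(5),
                executeOnlyOnce: false);
        }

        private void DoWork(object state, bool timedOut)
        {
            if (!timedOut) return; // the handle was signalled, so the service is stopping

            // Periodic work goes here.
        }

        protected override void OnStop()
        {
            _stopSignal.Set();
            _registeredWait.Unregister(null);
        }
    }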
We use console applications also. If you use logging tools like Log4net you can properly monitor their execution. Also, I'm not sure how they are more difficult to maintain than a web page, given you may be sharing some of the same code libraries between the two if it is designed properly.
If you are against having those tasks run on a timed basis, you could have a web page in the administrative section of your website that acts as a queue. The user puts in a request to run the task, which in turn inserts a blank datestamp record into the MyProcessQueue table, and your scheduled task checks every X minutes for a new record in MyProcessQueue. That way, it only runs when the customer wants it to run.
Hope those suggestions help.
One option would be to set up a windows service and get that to call your scheduled task.
In WinForms I've used Timers, but I don't think this would work well in ASP.NET.
A New Task Scheduler Class Library for .NET
Note: Since this library was created, Microsoft has introduced a new task scheduler (Task Scheduler 2.0) for Windows Vista. This library is a wrapper for the Task Scheduler 1.0 interface, which is still available in Vista and is compatible with Windows XP, Windows Server 2003 and Windows 2000.
http://www.codeproject.com/KB/cs/tsnewlib.aspx

Run automated tests on a schedule to serve as a health check

I was tasked with creating a health check for our production site. It is a .NET MVC web application. There are a lot of dependencies and therefore points of failure e.g. a document repository, Java Web services, Site Minder policy server etc.
Management wants us to be the first to know if ever any point fails. Currently we are playing catch-up if a problem arises, because it is the client that informs us. I have written a suite of simple Selenium WebDriver based integration tests that test the sign in and a few light operations, e.g. retrieving documents via the document API. I am happy with the result but need to be able to run them on a loop and notify IT when any fails.
We have a TFS build server but I'm not sure if it is the right tool for the job. I don't want to continuously build the tests, just run them. Also it looks like I can't define a build schedule more frequently than on a daily basis.
I would appreciate any ideas on how best to achieve this. Thanks in advance.
What you want to do is called a suite of "Smoke Tests". Smoke Tests are basically very short and sweet, independent tests that test various pieces of the app to make sure it's production ready, just as you say.
I am unfamiliar with TFS, but I'm sure the information I can provide you will be useful and transferable.
When you say "I don't want to build the tests, just run them": any CI that you use NEEDS to build them TO run them. Basically "building" equates to "compiling"; in order for your CI to actually run the tests, it needs to compile them first.
As far as running them, if the TFS build system has any use whatsoever, it will have a periodic build option. In Jenkins, I can specify a cron time to run. For example:
0 0 * * *
means "run at 00:00 every day (midnight)"
or,
30 5 * * 1-5
which means, "run at 5:30 every week day"
Since you are making Smoke Tests, it's important to remember to keep them short and sweet. Smoke tests should test one thing at a time, for example:
testLogin()
testLogout()
testAddSomething()
testRemoveSomething()
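Each of those can stay very small. Here is an illustrative shape of testLogin() in C# with Selenium WebDriver and NUnit; the URL, element locators and the dashboard assertion are placeholders for your own application:

    using NUnit.Framework;
    using OpenQA.Selenium;
    using OpenQA.Selenium.Chrome;

    [TestFixture]
    public class SmokeTests
    {
        private IWebDriver _driver;

        [SetUp]
        public void SetUp()
        {
            _driver = new ChromeDriver();
        }

        [Test]
        public void testLogin()
        {
            _driver.Navigate().GoToUrl("https://example.com/login");
            _driver.FindElement(By.Id("username")).SendKeys("healthcheck-user");
            _driver.FindElement(By.Id("password")).SendKeys("not-a-real-password");
            _driver.FindElement(By.Id("signIn")).Click();

            Assert.IsTrue(_driver.Url.Contains("/dashboard"), "Sign in did not land on the dashboard");
        }

        [TearDown]
        public void TearDown()
        {
            _driver.Quit();
        }
    }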
A web application health check is a very important feature. Smoke tests can be very useful in working out whether your website is running or not, and these can be automated to run at intervals to notify you that there is something wrong with your site, preferably before the customer notices.
However, where smoke tests fall short is that they only tell you that the website does not work; they do not tell you why. That is because you are making external calls as the client would, so you cannot see the internals of the application: is it the database that is down, a network issue, disk space, a remote endpoint that is not functioning correctly?
Now some of these things should be identifiable from other monitoring, and you should definitely have an error log, but sometimes you want to hear it from the horse's mouth, and the best thing that can tell you how your application is behaving is the application itself. That is why a number of applications have a baked-in health check that can be called on demand.
Health Check as a Service
The health check services I have implemented in the past are all very similar and they do the following:
Expose an endpoint that can be called on demand, i.e. /api/healthcheck. Normally this is private and is not accessible externally.
It returns a JSON response containing:
the overall state
the host that returned the result (if behind a load balancer)
The application version
A set of sub system states (these will indicate which component is not performing)
The service should be resilient; any exception thrown whilst checking should still end with a health check result being returned.
Some sort of aggregate that can present a number of health check endpoints into one view
Here is one I made earlier
After doing this a number of times I have started a library to take care of the main wire-up of the health check and expose it as a service. Feel free to use it as an example or use the NuGet packages.
https://github.com/bronumski/HealthNet
https://www.nuget.org/packages/HealthNet.WebApi
https://www.nuget.org/packages/HealthNet.Owin
https://www.nuget.org/packages/HealthNet.Nancy
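To make the shape of such an endpoint concrete, here is a minimal sketch in ASP.NET Web API 2. This is not HealthNet's API; the route, the sub-system checks and the property names are placeholders:

    using System;
    using System.Collections.Generic;
    using System.Web.Http;

    [RoutePrefix("api/healthcheck")]
    public class HealthCheckController : ApiController
    {
        [Route(""), HttpGet]
        public IHttpActionResult Get()
        {
            var subSystems = new Dictionary<string, string>();

            subSystems["Database"] = Check(() => { /* open a connection, run SELECT 1 */ });
            subSystems["DocumentRepository"] = Check(() => { /* ping the repository endpoint */ });

            return Ok(new
            {
                Status = subSystems.ContainsValue("Unhealthy") ? "Unhealthy" : "Healthy",
                Host = Environment.MachineName,
                Version = typeof(HealthCheckController).Assembly.GetName().Version.ToString(),
                SubSystems = subSystems
            });
        }

        // Resilient: a failing check is reported in the result, never thrown back to the caller.
        private static string Check(Action check)
        {
            try { check(); return "Healthy"; }
            catch { return "Unhealthy"; }
        }
    }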

How to apply TDD to web API development

I'm designing a web service running on Google App Engine that scrapes a number of websites and presents their data via a RESTful interface. Based on some background reading, I think I'd like to attempt Test Driven Development (TDD) and develop my tests before I write any business code.
My problem is caused by the fact that my list of scraped elements includes timetables and other records that change quite frequently. The limit of my knowledge on TDD is that you write tests that examine the results of code execution and compare these results to a hardcoded result set. Seeing as the data set changes frequently, this method seems impossible. Assuming that this is true, what would be the best approach to test such an API? How would a large-scale web API be tested (Twitter, Google, Netflix etc.)?
You have to choose the type of test:
Unit tests just test proper operation of your modules (units). You provide input data and test that the code outputs proper results. If there are system-dependent classes you try to mock them, or in the case of GAE services, you use the Google-provided local services. Unit tests can be run locally on your machine or on CI servers. There are two popular unit test libraries for Java: JUnit & TestNG.
Integration tests check that various modules (internal & external) work together - they basically check that APIs between modules are working. They are usually run on real servers and call real external services. They are technology specific and are harder to run.
In your case, I'd go with unit tests and provide sets of different input data which your logic should parse and act upon. Since your flow is pretty simple (load data from a fixed URL, parse it), you could also embed loading of real data into unit tests (we do this when we parse external sources).
From what you are describing you could easily find yourself writing integration tests. If your aim is to test the logic for processing what is returned from the scraped data (e.g. you know that you are going to get a timetable in a specific format coming in and you now have logic to process that data) you will need to create a SEAM between your web services logic and your processing logic. Once you have done this you should be able to mock the data that is returned from the web service call to always return the same table data and then you can write consistent unit tests against it.
public class ScrapingService : IScrapingService
{
    public string Scrape(string url)
    {
        // scraping logic goes here
        throw new NotImplementedException();
    }
}

public interface IScrapingService
{
    string Scrape(string url);
}

public class ScrapingProcessor
{
    private readonly IScrapingService _scrapingService;

    // inject the dependency
    public ScrapingProcessor(IScrapingService scrapingService)
    {
        _scrapingService = scrapingService;
    }

    public void Process(string url)
    {
        var scrapedData = _scrapingService.Scrape(url);

        // now process the scrapedData
    }
}
To test, you can now create a FakeScrapingService that implements the IScrapingService interface and then return whatever data you like from the Scrape method. There are some very good mocking frameworks out there that make this type of thing easy. My personal favorite is NSubstitute.
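For instance, with NSubstitute a first test against the processor might look like the sketch below (the canned markup and the final assertion are placeholders, since what Process produces depends on your own logic):

    using NSubstitute;
    using NUnit.Framework;

    [TestFixture]
    public class ScrapingProcessorTests
    {
        [Test]
        public void Process_HandlesCannedTimetable()
        {
            // Canned data stands in for the live site, so the test is repeatable.
            var scrapingService = Substitute.For<IScrapingService>();
            scrapingService.Scrape(Arg.Any<string>())
                           .Returns("<table><tr><td>09:00</td><td>Route 1</td></tr></table>");

            var processor = new ScrapingProcessor(scrapingService);

            processor.Process("http://example.com/timetable");

            // Assert on whatever observable result your processing logic produces,
            // e.g. what it wrote to a repository or returned to the caller.
        }
    }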
I hope this explanation helps.

Advice on correctly managing threads

I have a big Domino web application, which makes numerous "OpenAgent" calls to Java agents to retrieve data via AJAX. The application is used by several users.
What are the main parameters that you advise me to check and adjust on server, in order to avoid HTTP hang or performance issues?
There is quite an overhead in calling an agent, be it LotusScript or Java. So if your AJAX calls are quite frequent, you are going to overload the server easily.
Domino comes with a test tool for this called Server.Load. It will allow you to emulate a heavily loaded server, and you will see how your code performs under that. Another I've used is Rational Functional Tester (trial version), but there are probably free ones out there as well (e.g. JMeter or LoadRunner, which I haven't used).
So if you are doing infrequent complex actions that may take time and don't need a quick response to the user, I would recommend to continue with the web agent.
If it is simple lookup calls, I would recommend using alternative methods. For example, XPages has AJAX functionality built into it with scaling in mind. Or if it is JSON data, then look into Domino Data Service or Domino URL commands.