How to simulate a large number of clients using UCMA for load testing? - ucma

I have created an application using the Lync 2013 client-side SDK and UCMA 4.0. Now I want to test my application with a large number of users. How can I simulate a large number of clients using UCMA or the Lync client-side SDK?

It depends on exactly what you want to "simulate".
If you just want call traffic there is SIPp, but that generates only simple SIP calls and doesn't really reflect an actual Microsoft Lync client.
As far as I know, Microsoft doesn't provide any load-testing tools for Lync. You will have to generate the load yourself, based on what exactly you want to "simulate".
With a UCMA trusted application, you should be able to start up and use a large number of user endpoints to "simulate" common Lync activity (randomly changing presence, making calls, sending IMs, etc.). You would have to create such an app yourself.

I created a tool in UCMA to stress-test all the applications I have made.
It is simple to build, and it is composed of two parts.
This example is a stress tester for calls; you can easily adapt it to other scenarios.
We create our platform, following our Set-CsTrustedApplication configuration.
var platformSettings = new ProvisionedApplicationPlatformSettings("InnixiTester", "urn:application:innixitester");
var collabPlatform = new CollaborationPlatform(platformSettings);
collabPlatform.EndStartup(collabPlatform.BeginStartup(null, null));
Ok, I know that chaining the Begin and End calls together on one line like this is wrong. However, this is just a code example. I invite you to read Tom Morgan's article, where he explains why it is not good to do it the way I do here.
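A minimal sketch of a non-blocking alternative (my own, not from the original post): wrap the APM Begin/End pair with Task.Factory.FromAsync and await the resulting Task from an async method, instead of chaining End(Begin(...)).
// Instead of collabPlatform.EndStartup(collabPlatform.BeginStartup(null, null)):
await Task.Factory.FromAsync(collabPlatform.BeginStartup, collabPlatform.EndStartup, state: null);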
Here we use a Parallel loop to create all our user endpoints; that way, it goes faster.
/*
* Proprieties of the class
*/
private AutoResetEvent _waitForStressTestToFinish = new AutoResetEvent(false);
private List<UserEndpoint> _listUserEndpoints = new List<UserEndpoint>();
private int _maxUsers = 200;
private int _tickTotal;
private int _tickCount;
private int _nbrCallsByIntervall;
private Timer _timer; // System.Timers.Timer used to pace the calls
/*
* End
*/
_maxUsers = 200; // Max number of users
const int callsTotal = 200; // Total number of calls
const int timeToTest = 30; // Total test duration, in seconds
const int intervalOfCalls = 5; // We want to place our calls at specific intervals (seconds)
Parallel.For(0, _maxUsers, i =>
{
CreateUserEndpoint(collabPlatform, i.ToString());
});
You simply create your UserEndpoint here. The scenario is that my users in Active Directory are stressuser0 to stressuser200, with extensions running from +14250 up to +1425200.
private void CreateUserEndpoint(CollaborationPlatform cp, string iteration)
{
try
{
UserEndpointSettings settingsUser = new UserEndpointSettings($"sip:stressuser{iteration}@pferde.net", "pool2010.pferde.net", 5061);
settingsUser = InitializePublishAlwaysOnlineSettings(settingsUser);
var userEndpoint = new UserEndpoint(cp, settingsUser);
userEndpoint.EndEstablish(userEndpoint.BeginEstablish(null, null));
PublishOnline(userEndpoint);
lock (_listUserEndpoints) _listUserEndpoints.Add(userEndpoint); // List<T> is not thread-safe and this runs inside Parallel.For
Console.WriteLine($"The User Endpoint owned by URI: {userEndpoint.OwnerUri} was created\n");
}
catch (Exception)
{
Console.WriteLine($"failed to create for --> sip:stressuser{iteration}#pferde.net");
throw;
}
}
private UserEndpointSettings InitializePublishAlwaysOnlineSettings(UserEndpointSettings settings)
{
settings.AutomaticPresencePublicationEnabled = true;
settings.Presence.PreferredServiceCapabilities.AudioSupport = CapabilitySupport.Supported;
return (settings);
}
Now it's time to place the calls! We are going to code a simple algorithm with a timer: it calculates how many calls it needs to make given a total time X, a total number of calls Y, and an interval Z. For example, 200 calls over 30 seconds at 5-second intervals gives 6 ticks of roughly 33 calls each.
Console.WriteLine("Tape a key to place calls...");
Console.ReadKey();
PlaceCalls(callsTotal, timeToTest, intervalOfCalls);
_waitForStressTestToFinish.WaitOne();
}
catch (Exception ex)
{
Console.WriteLine($"Shutting down platform due to error {ex}");
ShutdownPlatform(collabPlatform);
}
ShutdownPlatform(collabPlatform);
}
private void PlaceCalls(int callsMax, int timeMax, int timeIntervall)
{
_tickTotal = timeMax / timeIntervall;
_nbrCallsByIntervall= callsMax / _tickTotal;
Console.WriteLine($"_nbrCallsByIntervall --> {_nbrCallsByIntervall}");
var timeIntervalTimespan = new TimeSpan(0, 0, 0, timeIntervall);
_timer = new Timer(timeIntervalTimespan.TotalMilliseconds);
_timer.Elapsed += new ElapsedEventHandler(_timer_Elapsed);
_timer.Enabled = true;
}
void _timer_Elapsed(object sender, ElapsedEventArgs e)
{
if (_tickCount < _tickTotal)
{
Console.WriteLine($"\n Pause Timer | On {_tickCount} to {_tickTotal}\n");
_timer.Enabled = false;
for (var i = 0; i <= _nbrCallsByIntervall - 1; ++i)
{
ConversationSettings convSettings = new ConversationSettings();
Conversation conversation = new Conversation(_listUserEndpoints[generateNumber(0, _listUserEndpoints.Count)], convSettings);
var audioVideoCall = new AudioVideoCall(conversation);
CallEstablishOptions options = new CallEstablishOptions();
var gNbr = generateNumber(0, _listUserEndpoints.Count);
try
{
// Here I'm calling a single phone number. You could use generateNumber to have the stress users call each other, but then you would have to extend the code to accept the incoming calls.
audioVideoCall.BeginEstablish("3322", options, null, audioVideoCall);
}
catch (Exception)
{
Console.WriteLine("Fail to Call the remote user...");
throw;
}
Console.WriteLine($"Call--> +1425{gNbr}.Counter--> {_tickCount} Ticket--> {_tickTotal} and thread id {Thread.CurrentThread.ManagedThreadId}");
}
_tickCount++;
_timer.Enabled = true;
Console.WriteLine("\n reStart Timer \n");
}
else
{
Console.WriteLine("\n!!! END Stress test !!!\n");
_timer.Enabled = false;
_waitForStressTestToFinish.Set();
}
}
private int generateNumber(int min, int max)
{
var r = new Random();
Thread.Sleep(200);
return (r.Next(min, max));
}
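As an aside, a small alternative sketch (mine, not part of the original tool; the method name is just illustrative): Random instances created in quick succession share the same time-based seed, which is what the Sleep(200) works around, so a lazily seeded Random per thread avoids both the duplicate seeds and the sleep.
private static readonly ThreadLocal<Random> _rng =
    new ThreadLocal<Random>(() => new Random(Guid.NewGuid().GetHashCode()));
private int generateNumberWithoutSleep(int min, int max)
{
    // Each thread gets its own differently seeded Random, so no Sleep is needed.
    return _rng.Value.Next(min, max);
}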

Related

Blazor: Poll a server's object in an interval?

It seems that Blazor makes it easy for the client to call a server's method using something called SignalR underneath. But when I searched whether Blazor does the same in the other direction, the answer was that it does not, and I have to implement it myself using SignalR. So I thought about polling.
That is, I'd like to read a property or call a method on the server's object at an interval to determine whether something has changed. To test this, I added this property to the WeatherForecastService class. The property increments every time it is read.
int _Value = 0;
public int Value
{
get
{
_Value++;
return _Value;
}
}
In the weather table sample page, I removed the code that displays the weather and added this.
<p>This component demonstrates fetching data from a service.</p>
<div>Value = @this.Value</div>
@code {
private int Value = 0;
protected override async Task OnInitializedAsync()
{
System.Timers.Timer t = new System.Timers.Timer();
t.Elapsed += (s, e) => { Value = ForecastService.Value; };
t.Interval = 1000;
t.Start();
}
}
And it did not work. I found an existing question, but it uses F#, which I don't understand, and I don't know what ClientTimer is, so the answer was not helpful to me.
The UI never refreshes because the timer's Elapsed handler runs outside Blazor's rendering context and nothing tells the component to re-render; marshal the update back and trigger a render with InvokeAsync(StateHasChanged):
t.Elapsed += async (s, e) =>
{
Value = ForecastService.Value;
await InvokeAsync(StateHasChanged);
};
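For completeness, a fuller sketch of the polling component (my own, assuming ForecastService is injected and the component declares @implements IDisposable): the timer lives in a field, every tick re-reads the value and re-renders via InvokeAsync, and the timer is disposed with the component.
@code {
    private int Value;
    private System.Timers.Timer _timer;

    protected override void OnInitialized()
    {
        _timer = new System.Timers.Timer(1000);
        _timer.Elapsed += async (s, e) =>
        {
            Value = ForecastService.Value;      // poll the server-side object
            await InvokeAsync(StateHasChanged); // re-render on Blazor's sync context
        };
        _timer.Start();
    }

    public void Dispose() => _timer?.Dispose();
}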

Real time GPS UWP

I really want to know how I can update the user's position on the map while the UWP app is running in the background.
Here is my code right now:
private async void PinPoints()
{
//Pin point to the map
Windows.Devices.Geolocation.Geopoint position = await Library.Position();
double lat = position.Position.Latitude;
double lon = position.Position.Longitude;
//Geoposition alttest = await Library.Temp();
//alt = alttest.Coordinate.Altitude;
DependencyObject marker = Library.Marker(""
//+ Environment.NewLine + "Altitude " + alt
);
Display.Children.Add(marker);
Windows.UI.Xaml.Controls.Maps.MapControl.SetLocation(marker, position);
Windows.UI.Xaml.Controls.Maps.MapControl.SetNormalizedAnchorPoint(marker, new Point(0.5, 0.5));
Display.LandmarksVisible = true;
Display.ZoomLevel = 16;
Display.Center = position;
}
This function pinpoints the current location for me, but it only does so when the user opens this page, because I've put it in the public Map() {} constructor.
Current: the location is fetched when the map page opens, and as I walk around the map still shows the same place.
What I want: the position keeps updating while I move, and it also runs in the background (if the application is closed, the location data still updates).
Is there any code to solve this location problem? If I have to add code, where should I put it and what should I do?
Additionally, I now attempt the background part (not sure whether it works or not) by creating a Windows Runtime Component (Universal) with a class like this.
*I have already added this project as a reference in the main one.
namespace BackgroundRunning
{
public sealed class TaskBG : IBackgroundTask
{
BackgroundTaskDeferral _deferral = null;
Accelerometer _accelerometer = null;
Geolocator _locator = new Geolocator();
public void Run(IBackgroundTaskInstance taskInstance)
{
_deferral = taskInstance.GetDeferral();
try
{
// force gps quality readings
_locator.DesiredAccuracy = PositionAccuracy.High;
taskInstance.Canceled += taskInstance_Canceled;
_accelerometer = Windows.Devices.Sensors.Accelerometer.GetDefault();
_accelerometer.ReportInterval = _accelerometer.MinimumReportInterval > 5000 ? _accelerometer.MinimumReportInterval : 5000;
_accelerometer.ReadingChanged += accelerometer_ReadingChanged;
}
catch (Exception ex)
{
// Add your chosen analytics here
System.Diagnostics.Debug.WriteLine(ex);
}
}
void taskInstance_Canceled(IBackgroundTaskInstance sender, BackgroundTaskCancellationReason reason)
{
_deferral.Complete();
}
async void accelerometer_ReadingChanged(Windows.Devices.Sensors.Accelerometer sender, Windows.Devices.Sensors.AccelerometerReadingChangedEventArgs args)
{
try
{
if (_locator.LocationStatus != PositionStatus.Disabled)
{
try
{
Geoposition pos = await _locator.GetGeopositionAsync();
}
catch (Exception ex)
{
if (ex.HResult != unchecked((int)0x800705b4))
{
System.Diagnostics.Debug.WriteLine(ex);
}
}
}
}
catch (Exception ex)
{
System.Diagnostics.Debug.WriteLine(ex);
}
}
public void Dispose()
{
if (_accelerometer != null)
{
_accelerometer.ReadingChanged -= accelerometer_ReadingChanged;
_accelerometer.ReportInterval = 0;
}
}
}
}
Your solution:
Make 3 projects in your solution.
1> Background Task "references App_Code"
2> App_Code "contains calculations, mostly backend code"
3> Map (main project) "references App_Code"
Register a background task in your project and specify the time interval after which it should run again (see the registration sketch below).
Scenario 1> App open, user requests an update
Trigger your background task from code-behind.
Scenario 2> App closed, not being used
Run your background task!
So basically keep your background task simple (make it a class whose Run method just calls the proper App_Code class methods) and keep all the calculations you want to happen in the background in App_Code. Also, if I am not wrong, the minimum interval at which a background task triggers itself cannot be set below 15 minutes.
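For reference, a minimal registration sketch (my own, not from the answer) that hooks up the TaskBG class above with a 15-minute TimeTrigger; the task name "MapLocationTask" is just an example.
using Windows.ApplicationModel.Background;

// Ask for background access once, then register the task with a TimeTrigger.
// 15 minutes is the smallest freshness time the platform accepts.
await BackgroundExecutionManager.RequestAccessAsync();
var builder = new BackgroundTaskBuilder
{
    Name = "MapLocationTask",                    // example name
    TaskEntryPoint = "BackgroundRunning.TaskBG"  // namespace.class of the component above
};
builder.SetTrigger(new TimeTrigger(15, false));
// (In a real app, check BackgroundTaskRegistration.AllTasks first to avoid double registration.)
BackgroundTaskRegistration registration = builder.Register();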
For real-time you could look at SignalR ( can't help any further here)

My Akka.Net Demo is incredibly slow

I am trying to get a proof of concept running with akka.net. I am sure that I am doing something terribly wrong, but I can't figure out what it is.
I want my actors to form a graph of nodes. Later, this will be a complex graph of business objects, but for now I want to try a simple linear structure: a chain of nodes #0 through #9, where each node knows its predecessor.
I want to ask a node for a neighbour that is 9 steps away. I am trying to implement this in a recursive manner. I ask node #9 for a neighbour that is 9 steps away, then I ask node #8 for a neighbour that is 8 steps away and so on. Finally, this should return node #0 as an answer.
Well, my code works, but it takes more than 4 seconds to execute. Why is that?
This is my full code listing:
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using Akka;
using Akka.Actor;
namespace AkkaTest
{
class Program
{
public static Stopwatch stopwatch = new Stopwatch();
static void Main(string[] args)
{
var system = ActorSystem.Create("MySystem");
IActorRef[] current = new IActorRef[0];
Console.WriteLine("Initializing actors...");
for (int i = 0; i < 10; i++)
{
var current1 = current;
var props = Props.Create<Obj>(() => new Obj(current1, Guid.NewGuid()));
var actorRef = system.ActorOf(props, i.ToString());
current = new[] { actorRef };
}
Console.WriteLine("actors initialized.");
FindNeighboursRequest r = new FindNeighboursRequest(9);
stopwatch.Start();
var response = current[0].Ask(r);
FindNeighboursResponse result = (FindNeighboursResponse)response.Result;
stopwatch.Stop();
foreach (var d in result.FoundNeighbours)
{
Console.WriteLine(d);
}
Console.WriteLine("Search took " + stopwatch.ElapsedMilliseconds + "ms.");
Console.ReadLine();
}
}
public class FindNeighboursRequest
{
public FindNeighboursRequest(int distance)
{
this.Distance = distance;
}
public int Distance { get; private set; }
}
public class FindNeighboursResponse
{
private IActorRef[] foundNeighbours;
public FindNeighboursResponse(IEnumerable<IActorRef> descendants)
{
this.foundNeighbours = descendants.ToArray();
}
public IActorRef[] FoundNeighbours
{
get { return this.foundNeighbours; }
}
}
public class Obj : ReceiveActor
{
private Guid objGuid;
readonly List<IActorRef> neighbours = new List<IActorRef>();
public Obj(IEnumerable<IActorRef> otherObjs, Guid objGuid)
{
this.neighbours.AddRange(otherObjs);
this.objGuid = objGuid;
Receive<FindNeighboursRequest>(r => handleFindNeighbourRequest(r));
}
public Obj()
{
}
private async void handleFindNeighbourRequest (FindNeighboursRequest r)
{
if (r.Distance == 0)
{
FindNeighboursResponse response = new FindNeighboursResponse(new IActorRef[] { Self });
Sender.Tell(response, Self);
return;
}
List<FindNeighboursResponse> responses = new List<FindNeighboursResponse>();
foreach (var actorRef in neighbours)
{
FindNeighboursRequest req = new FindNeighboursRequest(r.Distance - 1);
var response2 = actorRef.Ask(req);
responses.Add((FindNeighboursResponse)response2.Result);
}
FindNeighboursResponse response3 = new FindNeighboursResponse(responses.SelectMany(rx => rx.FoundNeighbours));
Sender.Tell(response3, Self);
}
}
}
The reason for such slow behavior is the way you use Ask (and the fact that you use it at all, but I'll cover that later). In your example, you're asking each neighbour in a loop and then immediately reading response2.Result, which actively blocks the current actor (and the thread it resides on). So you're essentially making the flow synchronous and blocking.
The easiest way to fix that is to collect all the tasks returned from Ask and await them together with Task.WhenAll, instead of waiting for each one inside the loop. For example:
public class Obj : ReceiveActor
{
private readonly IActorRef[] _neighbours;
private readonly Guid _id;
public Obj(IActorRef[] neighbours, Guid id)
{
_neighbours = neighbours;
_id = id;
Receive<FindNeighboursRequest>(async r =>
{
if (r.Distance == 0) Sender.Tell(new FindNeighboursResponse(new[] {Self}));
else
{
var request = new FindNeighboursRequest(r.Distance - 1);
var replies = _neighbours.Select(neighbour => neighbour.Ask<FindNeighboursResponse>(request));
var ready = await Task.WhenAll(replies);
var responses = ready.SelectMany(x => x.FoundNeighbours);
Sender.Tell(new FindNeighboursResponse(responses.ToArray()));
}
});
}
}
This one is much faster.
NOTE: In general you shouldn't use Ask inside of an actor:
Each Ask allocates a listener for the reply, so in general using Ask is A LOT heavier than passing messages with Tell.
When sending messages through a chain of actors, the cost of Ask additionally includes transporting each message twice (once for the request and once for the reply) through every actor in the chain. A popular pattern is that, when you send a request from A⇒B⇒C⇒D and respond from D back to A, you reply directly D⇒A instead of passing the message back through the whole chain. Usually a combination of Forward/Tell works better (see the sketch after this note).
In general, don't use the async version of Receive if it's not necessary; at the moment, it's slower for an actor compared to the sync version.
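A minimal sketch of that Forward pattern (my own illustration, reusing the FindNeighboursRequest type from above; RelayNode is a made-up name): intermediate actors forward the request, so the final actor sees the original asker as Sender and can reply to it directly.
// Forward preserves the original Sender, so the final node's Sender.Tell(...)
// goes straight back to the original asker without retracing the chain.
public class RelayNode : ReceiveActor
{
    private readonly IActorRef _next;
    public RelayNode(IActorRef next)
    {
        _next = next;
        Receive<FindNeighboursRequest>(r =>
            _next.Forward(new FindNeighboursRequest(r.Distance - 1)));
    }
}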

Database query load balancing

I have several instances of an application running the same query against several SQL Server databases. There is a manual load-balancing mechanism in place: each instance uses an algorithm to asymmetrically decide which server to query at a given time.
The processing time of the query, and thus the resource consumption on the server, varies greatly depending on the input parameters, which are different each time that the queries are fired.
The current implementation of the balancing algorithm causes, from time to time, one of the servers to end up as the target of several "long/heavy" queries while the other servers are left underused.
Since I know beforehand whether a query is a heavy one, how could I improve the algorithm to prevent that server overload?
Right now each application instance decides how to do the balancing independently of the others, so I guess I should share the load information between all instances, right?
Thanks!
Answering myself:
Initially I was looking for something able to implement different types of balancing (http://www.peplink.com/technology/load-balancing-algorithms/), and I even found a nice article (http://www.codeproject.com/Articles/3338/NET-Dynamic-Software-Load-Balancing).
Finally I decided to keep it simple: since I know the weight of each request, I'll just pick the server with the least load.
EDIT: Added a base weight to each server. Now the balancing is done by weight and load. The code can be tested in LINQPad.
// Define other methods and classes here
public class LoadBalancer{
class Server{
public long Weight{ get; set;}
public long Load{ get; set;}
public string ConnectionString{ get; set;}
public long RequestCount;
// Return a weight based on the Load
public long CurrentWeight {
get{
var w = Weight - Load;
return w <= 0 ? 1: w;
}
}
}
List<Server> Servers;
public LoadBalancer(){
Servers = new List<Server>();
}
public void AddServer(long weight, string connection){
lock(Servers)
Servers.Add(new Server { ConnectionString = connection, Weight = weight });
}
static Random rnd = new Random();
// Picks a server by weighted random selection; current load reduces a server's effective weight
public string GetServer(int expectedLoad){
var hit = rnd.Next(0, (int)Servers.Sum(s => s.CurrentWeight));
long acc = 0;
foreach(var server in Servers){
if(hit < server.CurrentWeight + acc){
// Update load
lock(server){
server.Load += expectedLoad;
server.RequestCount++;
}
return server.ConnectionString;
}
acc += server.CurrentWeight;
}
throw new Exception("No servers available");
}
public void ReleaseServer(string conn, int expectedLoad){
var server = Servers.First(s => s.ConnectionString == conn);
// Update load
lock(server){
server.Load -= expectedLoad;
server.RequestCount--;
}
}
public IEnumerable<string> Dump(){
return Servers.Select(s => string.Format("Server: {0}, Requests: {1}, Weight: {2}, CurrentWeight: {3}",
s.ConnectionString, s.RequestCount, s.Weight, s.CurrentWeight));
}
}
void Main()
{
var balancer = new LoadBalancer();
// Add servers
balancer.AddServer(100, "Server1");
balancer.AddServer(100, "Server2");
balancer.AddServer(800, "Server3");
balancer.AddServer(200, "Server4");
balancer.AddServer(1000, "Server5");
var rnd = new Random();
var servers = new List<dynamic>();
Enumerable.Range(0, 100)
.All(i => {
var load = rnd.Next(1, 10);
var server = balancer.GetServer(load);
servers.Add(new {Server = server, Load = load});
if(i % 10 == 0){
balancer.Dump().Dump();
// Remove some load
var items = rnd.Next(0, servers.Count);
servers.Take(items)
.All(s => {
balancer.ReleaseServer(s.Server, s.Load);
return true;
});
servers = servers.Skip(items).ToList();
}
return true;
});
servers.All(s => {
balancer.ReleaseServer(s.Server, s.Load);
return true;
});
balancer.Dump().Dump();
}

Pattern for limiting number of simultaneous asynchronous calls

I need to retrieve multiple objects from an external system. The external system supports multiple simultaneous requests (i.e. threads), but it is possible to flood the external system - therefore I want to be able to retrieve multiple objects asynchronously, but I want to be able to throttle the number of simultaneous async requests. i.e. I need to retrieve 100 items, but don't want to be retrieving more than 25 of them at once. When each request of the 25 completes, I want to trigger another retrieval, and once they are all complete I want to return all of the results in the order they were requested (i.e. there is no point returning the results until the entire call is returned). Are there any recommended patterns for this sort of thing?
Would something like this be appropriate (pseudocode, obviously)?
private List<externalSystemObjects> returnedObjects = new List<externalSystemObjects>;
public List<externalSystemObjects> GetObjects(List<string> ids)
{
int callCount = 0;
int maxCallCount = 25;
WaitHandle[] handles;
foreach(id in itemIds to get)
{
if(callCount < maxCallCount)
{
WaitHandle handle = executeCall(id, callback);
addWaitHandleToWaitArray(handle)
}
else
{
int returnedCallId = WaitHandle.WaitAny(handles);
removeReturnedCallFromWaitHandles(handles);
}
}
WaitHandle.WaitAll(handles);
return returnedObjects;
}
public void callback(object result)
{
returnedObjects.Add(result);
}
Consider the list of items to process as a queue from which 25 processing threads dequeue tasks, process a task, add the result then repeat until the queue is empty:
class Program
{
class State
{
public EventWaitHandle Done;
public int runningThreads;
public List<string> itemsToProcess;
public List<string> itemsResponses;
}
static void Main(string[] args)
{
State state = new State();
state.itemsResponses = new List<string>(1000);
state.itemsToProcess = new List<string>(1000);
for (int i = 0; i < 1000; ++i)
{
state.itemsToProcess.Add(String.Format("Request {0}", i));
}
state.runningThreads = 25;
state.Done = new AutoResetEvent(false);
for (int i = 0; i < 25; ++i)
{
Thread t =new Thread(new ParameterizedThreadStart(Processing));
t.Start(state);
}
state.Done.WaitOne();
foreach (string s in state.itemsResponses)
{
Console.WriteLine("{0}", s);
}
}
private static void Processing(object param)
{
Debug.Assert(param is State);
State state = param as State;
try
{
do
{
string item = null;
lock (state.itemsToProcess)
{
if (state.itemsToProcess.Count > 0)
{
item = state.itemsToProcess[0];
state.itemsToProcess.RemoveAt(0);
}
}
if (null == item)
{
break;
}
// Simulate some processing
Thread.Sleep(10);
string response = String.Format("Response for {0} on thread: {1}", item, Thread.CurrentThread.ManagedThreadId);
lock (state.itemsResponses)
{
state.itemsResponses.Add(response);
}
} while (true);
}
catch (Exception)
{
// ...
}
finally
{
int threadsLeft = Interlocked.Decrement(ref state.runningThreads);
if (0 == threadsLeft)
{
state.Done.Set();
}
}
}
}
You can do the same using asynchronous callbacks; there is no need to use threads.
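A minimal sketch of that idea in modern C# (my own; getObjectAsync stands in for the external-system call): a SemaphoreSlim caps the number of in-flight requests at 25, and Task.WhenAll returns the results in the order the ids were supplied.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

static async Task<TResult[]> GetObjectsAsync<TResult>(
    IEnumerable<string> ids,
    Func<string, Task<TResult>> getObjectAsync,   // placeholder for the external-system call
    int maxConcurrency = 25)
{
    using var throttle = new SemaphoreSlim(maxConcurrency);
    var tasks = ids.Select(async id =>
    {
        await throttle.WaitAsync();               // wait for a free slot
        try { return await getObjectAsync(id); }
        finally { throttle.Release(); }           // free the slot for the next request
    }).ToList();
    return await Task.WhenAll(tasks);             // results come back in request order
}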
Having some queue-like structure to hold the pending requests is a pretty common pattern. In web apps where there may be several layers of processing, you see a "funnel"-style approach, with the early parts of the processing chain having larger queues. There may also be some kind of prioritisation applied to the queues, with higher-priority requests being shuffled to the top.
One important thing to consider in your solution is that if the request arrival rate is higher than your processing rate (this might be due to a denial-of-service attack, or just because some part of the processing is unusually slow today), then your queues will grow without bound. You need some policy, such as refusing new requests immediately once the queue depth exceeds some value; a small sketch of that follows.
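For illustration, a small sketch of that refusal policy (my own; the capacity of 500 is arbitrary): a bounded BlockingCollection rejects new work immediately once it is full, instead of letting the backlog grow without bound.
using System;
using System.Collections.Concurrent;

class RequestGate
{
    // Bounded queue: at most 500 requests may be pending at once.
    private readonly BlockingCollection<string> _pending =
        new BlockingCollection<string>(boundedCapacity: 500);

    // TryAdd with a zero timeout fails fast when the queue is full,
    // so the caller can immediately return a "try again later" response.
    public bool TryEnqueue(string request) => _pending.TryAdd(request, TimeSpan.Zero);

    // Worker side: blocks until an item is available.
    public string Dequeue() => _pending.Take();
}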