I'm having an issue where calling a WCF web service from Silverlight is very slow. The situation is very similar to the one outlined in http://forums.silverlight.net/t/148027.aspx/1. Like that poster, I set up a simple Silverlight Application project and a simple Console Application project. In each, I added a service reference to my WCF service and added a few test calls to the service. However, the time it takes for each call to go out is drastically different: the console app makes the call in a matter of milliseconds, while the Silverlight app takes anywhere from 3 to 5 seconds.
In the post I linked to above, when the poster changed his service to use binary message encoding, the speed gap disappeared. However, in my situation, the large speed gap remains regardless of whether I'm using binary message encoding or text message encoding.
What could account for this speed difference? I've verified through Fiddler that both the request and response are binary-encoded when the client is configured to use my binary endpoint. I've also verified that the slow speed persists across multiple consecutive web service calls, so it can't just be spin-up time. The problem has to be somewhere on the Silverlight side.
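(For reference, the "binary" endpoint on the client side amounts to a CustomBinding with binary message encoding over HTTP. A minimal sketch, with a placeholder address:)

// Sketch: the "binary" endpoint built in code rather than config.
// Requires: using System.ServiceModel; using System.ServiceModel.Channels;
var binding = new CustomBinding(
    new BinaryMessageEncodingBindingElement(),
    new HttpTransportBindingElement());
var binaryProxy = new WCFRef.WebServicesClient(
    binding, new EndpointAddress("http://localhost/WebServices.svc"));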
Here's a snippet of the code that I'm using to measure the time difference:
var proxy = new WCFRef.WebServicesClient("binary");
var callbegin = DateTime.Now;
proxy.CallWSMethodAsync(); // async dispatch only; the completed callback is not timed here
var callend = DateTime.Now;
var span = callend.Subtract(callbegin);
// TotalSeconds avoids mangling the output (Seconds + "." + Milliseconds
// would print 5 ms as "0.5 sec")
Debug.WriteLine("call time: " + span.TotalSeconds + " sec");
The timespan captured in that code is the span I was talking about with the large difference between console (few milliseconds) and SL (few seconds).
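For comparison, here is a sketch that times both the dispatch and the full round trip, assuming the generated proxy raises the usual CallWSMethodCompleted event:

// Requires: using System.Diagnostics;
var proxy = new WCFRef.WebServicesClient("binary");
var sw = Stopwatch.StartNew();
proxy.CallWSMethodCompleted += (s, e) =>
    Debug.WriteLine("round trip: " + sw.Elapsed.TotalSeconds + " sec");
proxy.CallWSMethodAsync();
// This measures only the synchronous dispatch, i.e. the span discussed above.
Debug.WriteLine("dispatch: " + sw.Elapsed.TotalSeconds + " sec");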
A few months ago we also hit this speed bump with WCF.
Why it is slow, we do not know.
I suspect SOAP in general is a bit slow (lots of overhead), and the serializers aren't all that performant either.
We recently switched to a REST architecture (ServiceStack, though Web API/ASP.NET MVC work fine as well; we simply liked ServiceStack a bit more) and our performance went up anywhere from 2x to 15x.
WCF is terribly slow at accepting and serializing/deserializing requests.
I'm guessing Silverlight, with its stripped-down .NET, has something to do with it.
We also ran some tests on the serializer alone. We got a huge boost from switching the XML/binary serializer that WCF uses to a custom JSON serializer. Alas, it's not easy to swap out the serializer in WCF with Silverlight.
I currently have two static dictionaries in a WCF RESTful service. They both hold lookup data that's not worth putting in a database. Will these stay in memory until the application restarts, or should I put them in HttpContext.Current.Application?
The static data will remain until the process recycles or stops, the same as HttpContext.Current.Application.
If you are looking for a more sophisticated caching option, check out the System.Runtime.Caching namespace introduced in .NET 4.0. It is easy to use, works in any .NET application, and offers features like setting expiration times and registering callbacks to run when an entry expires.
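A minimal sketch of that approach using MemoryCache with an absolute expiration and an eviction callback (the class and key names here are illustrative):

// Requires: using System; using System.Collections.Generic; using System.Runtime.Caching;
class LookupCache
{
    static readonly MemoryCache Cache = MemoryCache.Default;

    public static void Store(string key, Dictionary<string, string> data)
    {
        var policy = new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.Now.AddHours(1),
            RemovedCallback = args =>
                Console.WriteLine("Cache entry evicted: " + args.CacheItem.Key)
        };
        Cache.Set(key, data, policy);
    }

    public static Dictionary<string, string> Load(string key)
    {
        // Returns null once the entry has expired or been evicted.
        return (Dictionary<string, string>)Cache.Get(key);
    }
}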
I'm using Portable Class Libraries to build service classes that all our UI technologies will use to communicate with our services.
These libraries will support Silverlight, ASP.NET, and any other .NET UI technology.
Since Silverlight is supported, all calls must be asynchronous.
With Silverlight, I can call CloseAsync() immediately after the client.Method() call to the service.
However, I'm finding that doesn't work with ASP.NET clients.
I don't want to call CloseAsync() in the completed handlers because, if multiple async calls are in flight, you could run into a timing issue.
I'd also rather not build a lot of extra logic, such as putting a while loop in every async method to make sure CloseAsync() hasn't already been called and completed.
Right now I'm testing with Abort() in the completed sections instead, and everything appears to be working fine.
Just curious whether anybody else out there knows of any problems we may run into using Abort()?
Thanks.
We're using .NET 4.5.
It depends on which binding you're using. If you're using a binding that uses sessions (e.g., WSHttpBinding with reliable messaging), then calling Close[Async] will first attempt to close that session and then close the connection; otherwise the session remains alive on the server side until it times out. If you're using a binding that does not use sessions (e.g., BasicHttpBinding), then Close and Abort are pretty much equivalent.
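The usual client-side pattern, regardless of binding, is to Close on success and Abort on failure, so a faulted channel never hangs waiting for a session to close. A sketch, with a hypothetical generated proxy:

// Requires: using System; using System.ServiceModel;
var client = new WebServicesClient(); // hypothetical generated proxy
try
{
    client.DoWork();
    client.Close();   // closes the session (if any), then the connection
}
catch (CommunicationException)
{
    client.Abort();   // tears the channel down immediately
    throw;
}
catch (TimeoutException)
{
    client.Abort();
    throw;
}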
I was looking into the new WCF 4.5 WebSocket services.
I ran into trouble while making calls to the service from a browser.
As it turns out (after a lot of googling), when the client for your WebSockets is a web browser, the only way WCF 4.5 will work is if you define your OperationContract with Action="*". There is no explicit way to call a specific function from the browser; you can only call ws.send("asd") to send messages to the server, so you need a single handler for all incoming calls to the service, and likewise there can only be one callback function.
Now, if you use Action="*", you can only use the Message data type when defining your contracts.
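For reference, that contract shape looks roughly like this (a sketch; the interface names are made up):

// Requires: using System.ServiceModel; using System.ServiceModel.Channels;
[ServiceContract(CallbackContract = typeof(IBrowserCallback))]
public interface IBrowserSocketService
{
    // One catch-all operation for every frame the browser sends.
    [OperationContract(IsOneWay = true, Action = "*")]
    void OnMessage(Message message);
}

[ServiceContract]
public interface IBrowserCallback
{
    // One catch-all callback for everything pushed back to the browser.
    [OperationContract(IsOneWay = true, Action = "*")]
    void Send(Message message);
}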
This is all well and good if you want to create an echo server, but say you want to upload/download data: in the default (Buffered) mode, the transfer speeds are not what they should be (a 20 MB file takes 40-50 seconds). The only way to improve the speed is to set the transfer mode to Streamed (I tried StreamedResponse).
But now the trouble is: since we can only use Message as the data type when defining the contracts, and Message uses a SOAP-style definition, it uses Buffered mode even when explicitly configured otherwise. [Please correct me if I am wrong here.]
So, my question is: is there any way to achieve streamed data transfer with WCF 4.5 WebSockets?
And yes, I am using byteStreamMessageEncoding (the latest one provided in 4.5).
And I am using a custom binding in web.config, as netHttpBinding doesn't work with browsers.
OK.
Since WCF didn't work, I found out it can be done using ASP.NET 4.5 handlers.
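A minimal sketch of that route: an IHttpHandler that accepts the WebSocket upgrade and moves data in chunks (the handler name is illustrative; a real one would stream file data instead of echoing):

// Requires: using System; using System.Net.WebSockets; using System.Threading;
//           using System.Threading.Tasks; using System.Web; using System.Web.WebSockets;
public class StreamingSocketHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        if (context.IsWebSocketRequest)
            context.AcceptWebSocketRequest(HandleSocketAsync);
    }

    private static async Task HandleSocketAsync(AspNetWebSocketContext ctx)
    {
        var buffer = new ArraySegment<byte>(new byte[64 * 1024]);
        while (ctx.WebSocket.State == WebSocketState.Open)
        {
            var result = await ctx.WebSocket.ReceiveAsync(buffer, CancellationToken.None);
            if (result.MessageType == WebSocketMessageType.Close)
                break;
            // Echo each received chunk straight back.
            await ctx.WebSocket.SendAsync(
                new ArraySegment<byte>(buffer.Array, 0, result.Count),
                WebSocketMessageType.Binary, result.EndOfMessage, CancellationToken.None);
        }
    }
}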
I'm new to WCF Data Services so I've been playing. After some initial tests I am disappointed by the performance of my test data service.
I realize that because a WCF DS is HTTP-based there is overhead inherent in the protocol but my tests are still way slower than I would expect:
Environment:
All on one box: Quad core 64-bit laptop with 4GB RAM running W7. Decent machine.
Small SQL database (SQLExpress 2008 R2) with 16 tables... the table under test has 243 rows.
Hosted my test service in IIS with all defaults.
Code:
I've created an Entity Framework model (DataContext) for this database (stock codegen by VS2010).
I've created a data service based on this model.
I've created a client with a direct service reference (ObjectContext) to this service (stock codegen by VS2010).
In the client I am also able to call the EF model directly and to use native SQL (ADO.NET SqlConnection).
Test Plan:
Each iteration connects to the database (there is an option to reuse connections), queries for all rows in the target table ("EVENTS"), and then counts them (thus forcing any deferred fetches to be performed).
Run for 25 iterations each for Native SQL (SqlConnection/SqlCommand), Entity Framework (DataContext) and WCF Data Services (ObjectContext).
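For reference, each run is shaped roughly like this (a sketch; TestEntities and Events stand in for whichever of the three access paths is under test):

// Requires: using System; using System.Diagnostics; using System.Linq;
var sw = Stopwatch.StartNew();
for (int i = 0; i < 25; i++)
{
    using (var context = new TestEntities())       // fresh connection per iteration
    {
        var count = context.Events.ToList().Count; // force any deferred fetches
    }
}
sw.Stop();
Console.WriteLine("25 iterations: {0}ms", sw.ElapsedMilliseconds);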
Results:
25 iterations of Native SQL: 436ms
25 iterations of Entity Framework: 656ms
25 iterations of WCF Data Services: 12110ms
Ouch. That's about 20x slower than EF.
Since WCF Data Services is HTTP, there's no opportunity for HTTP connection reuse, so the client is forced to reconnect to the web server for each iteration. But surely there's more going on here than that.
EF itself is fairly fast, and the same EF code/model is reused for both the service and the direct-to-EF client tests. There's going to be some overhead for XML serialization and deserialization in the data service, but that much!?! I've had good performance with XML serialization in the past.
I'm going to run some tests with JSON and Protocol-Buffer encodings to see if I can get better performance, but I'm curious if the community has any advice for speeding this up.
I'm not strong with IIS, so perhaps there are some IIS tweaks (caches, connection pools, etc.) that can be set to improve this?
Consider deploying as a Windows service instead? IIS may have ISAPI filters, rewrite rules, etc. that it runs requests through. Even if none of these are active, the IIS pipeline is long enough that something may slow you down marginally.
A self-hosted service should give you a good baseline of how long it takes the request to run, be packed, etc., without the IIS slowdowns.
The link below has video that has some interesting WCF benchmarks and comparisons between WCF data services and Entity Framework.
http://www.relationalis.com/articles/2011/4/10/wcf-data-services-overhead-performance.html
I increased the performance of our WCF Data Services API by 41% simply by enabling compression. It was really easy to do. Follow this link that explains what to do on your IIS server: Enabling dynamic compression (gzip, deflate) for WCF Data Feeds, OData and other custom services in IIS7.
Don't forget to run iisreset after your change!
On the client side:
// Requires: using System.Net; (for DecompressionMethods)

// This is your context, basically; you should have this code throughout your app.
var context = new YourEntities("YourServiceURL");
context.SendingRequest2 += SendingRequest2;

// Add the following method somewhere in a static utility library.
public static void SendingRequest2(object sender, SendingRequest2EventArgs e)
{
    // Ask for gzip/deflate responses and let HttpWebRequest decompress them transparently.
    var request = ((HttpWebRequestMessage)e.RequestMessage).HttpWebRequest;
    request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
}
Try setting security to "none" in the binding section of the configuration. This should yield a big improvement.
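In code, assuming basicHttpBinding is in play, that corresponds to something like:

// Requires: using System.ServiceModel;
// Transport and message security both off (only do this on a trusted network).
var binding = new BasicHttpBinding(BasicHttpSecurityMode.None);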
In order to eliminate most of the connection overhead, you can try batching all operations to the WCF Data Service to see if that makes a significant difference.
NorthwindEntities context = new NorthwindEntities(svcUri);
// someCustomerQuery and someProductsQuery are DataServiceRequest<T> instances,
// e.g. new DataServiceRequest<Customer>(new Uri("Customers", UriKind.Relative))
var batchRequests =
    new DataServiceRequest[] { someCustomerQuery, someProductsQuery };
// A single HTTP round trip carries both queries.
var batchResponse = context.ExecuteBatch(batchRequests);
For more info see here.
WCF Data Services exists to expose your data to disparate clients via the Open Data Protocol (OData), so that you don't have to write/refactor multiple web service methods for every change request. I would never advise using it when the entire system is based on the Microsoft technology stack; it's meant for remote, heterogeneous clients.
How are you running those 25 iterations against WCF?
var WCFobj = new ...Service();
foreach (var calling in CallList)
    WCFobj.Call(...);
If you call it like that, you hit WCF 25 separate times, which consumes too many resources.
For me, I used to build everything up into a DataTable, using the table name to identify the stored procedure being called and one DataRow per set of parameters. When calling, just pass the DataTable in encoded (and optionally compressed) form:
var table = new DataTable("PROC_CALLING");
// ... add one DataRow per call; the columns are the procedure's parameters ...
StringBuilder sb = new StringBuilder();
var xml = System.Xml.XmlWriter.Create(sb);
table.WriteXml(xml);
xml.Flush(); // flush the writer so the StringBuilder actually holds the XML
var bytes = System.Text.Encoding.UTF8.GetBytes(sb.ToString());
// [optional] GZip-compress the bytes here
WCFobj.Call(bytes);
The point is that you pass all 25 calls at once, which can improve performance significantly. If the return objects share the same structure, pass them back as a DataTable in byte form as well and convert it back on the client.
I used to implement this method with GZip for import/export data modules. Passing a large number of bytes uncompressed is going to make WCF unhappy; it comes down to which resource you would rather spend, CPU or network.
Things to try:
1) Results encoding: use binary encoding on your WCF channel if possible (see http://msdn.microsoft.com/en-us/magazine/ee294456.aspx), or alternately use compression: http://programmerpayback.com/2009/02/18/speed-up-your-app-by-compressing-wcf-service-responses/
2) Change your service instance behavior (see http://msdn.microsoft.com/en-us/magazine/cc163590.aspx#S6): try InstanceContextMode = InstanceContextMode.Single with ConcurrencyMode = ConcurrencyMode.Multiple, if you can verify that your service is built in a thread-safe way.
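A minimal sketch of that instance behavior (the service and contract names are made up):

// Requires: using System.ServiceModel;
[ServiceContract]
public interface IEventsService
{
    [OperationContract]
    int CountEvents();
}

// One shared, multi-threaded instance: no per-call activation cost, but
// only safe if the implementation is thread-safe (this one is stateless).
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                 ConcurrencyMode = ConcurrencyMode.Multiple)]
public class EventsService : IEventsService
{
    public int CountEvents()
    {
        return 0; // placeholder; a real service would query the database
    }
}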
Regarding your benchmark: I think you should simulate a more realistic load (including concurrent users) and ignore outliers; the first request to IIS will be really slow because it has to load all the DLLs.
In the app I'm currently working on, we are using a couple of WCF services with a lot of methods. Until now, all the methods have been short-running, often just fetching some data. I've just added a method that takes much longer to run.
I do not want to raise the timeout in the config, because one minute is long enough for all the other methods on the service.
What is the best way to deal with one longer-running method? And how do I provide feedback that it is still running?
A long-running task should really be farmed off as an asynchronous call that can then be polled for status (or report back through an event on a duplex connection). For a really long-running task, you might even want to push it into something like Windows Workflow.
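A sketch of the shape of that polling contract (the names are hypothetical):

// Requires: using System; using System.ServiceModel;
// Start the long job, get a ticket back immediately, then poll for status.
[ServiceContract]
public interface ILongRunningService
{
    [OperationContract]
    Guid StartJob();                 // kicks off the work, returns at once

    [OperationContract]
    string GetJobStatus(Guid jobId); // client polls for progress/"done"
}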
Combining WCF with WF (Workflow Foundation) seems like the best option here. Workflow Foundation gives you lots of goodies, including long-term persistence over the lifetime of your long-running process.
In .NET 3.5 it's possible to do this, but it's clumsy and a lot of work.
Here are a few links for this topic:
http://channel9.msdn.com/posts/mwink/Introduction-to-Workflow-Services-building-WCF-Services-with-WF/
http://code.msdn.microsoft.com/WorkflowServices
http://msdn.microsoft.com/en-us/magazine/cc164251.aspx
With .NET 4.0, these "Workflow Services" will be a big part of the new WF/WCF 4.0 package. You should basically be able to expose any workflow as a WCF service. It sounds very promising; I haven't had a chance to try it myself.
Some links for the new stuff:
WCF / WF 4.0 and "Dublin"
http://blogs.msdn.com/murrayg/archive/2009/06/23/windows-azure-s-net-workflow-service-to-support-net-4-0-workflows.aspx
http://channel9.msdn.com/shows/10-4/10-4-Episode-24-Monitoring-Workflow-Services/
http://channel9.msdn.com/shows/10-4/10-4-Episode-16-Windows-Workflow-4/
Marc