WCF Streamed and Compact Framework 3.5 - wcf

I have a WCF service whose TransferMode is set to Streamed. The service needs to accept a stream.
My client is Compact Framework 3.5. In the client I have a list object that carries large data. I want to serialize this object to a stream and send it to the WCF service, where I will deserialize it.
This turned out to be a mission because of the limited serialization options in the Compact Framework.
Currently I have the following for the serializing:
ServiceClient sc = new ServiceClient(CommonClient.MyDefaultBinding(), CommonClient.MyEndpointAddress);
MemoryStream s = new MemoryStream();
XmlSerializer serializer = new XmlSerializer(typeof(ScannerService.AscAssetCaptureCollection));
serializer.Serialize(s, serverCollection);
OnComplete(sc.Send((Stream)s));
This is not working. The error I'm getting when trying to send is:
The type System.IO.MemoryStream was not expected. Use the XmlInclude or SoapInclude attribute to specify types that are not known statically.
Does anyone know how can I achieve this?

There's no streaming from .NET CF. Because of memory limitations, the WCF version for .NET CF/Windows CE has a dramatically abbreviated tool set. The only option in your case for uploading files from a .NET CF device is buffering, though CF might be able to receive a stream (not sure; I've never tried it). You're also limited in your binding options and encryption: SSL won't work, only message encryption.
Here's a link to the subset of features available in .NET CF 3.5:
http://blogs.msdn.com/b/andrewarnottms/archive/2007/08/21/the-wcf-subset-supported-by-netcf.aspx
Good luck ... I was able to get files uploading/downloading to CF, but it wasn't easy.
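As a rough illustration of the buffered approach, here is a minimal sketch (not the original poster's code; SendBuffered is a hypothetical service operation that accepts byte[], the other names are taken from the question):
using (MemoryStream s = new MemoryStream())
{
    // Serialize exactly as before, but ship the whole payload as a buffered byte[]
    // instead of handing a Stream to the proxy.
    XmlSerializer serializer = new XmlSerializer(typeof(ScannerService.AscAssetCaptureCollection));
    serializer.Serialize(s, serverCollection);

    byte[] payload = s.ToArray();
    OnComplete(sc.SendBuffered(payload));  // hypothetical operation taking byte[] rather than Stream
}
On the service side the operation would accept byte[] and reconstruct the collection with the same XmlSerializer.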

Related

Why doesn't .NET Remoting need known types but WCF does?

We are migrating our .NET Remoting app to use WCF. One thing that confuses me is the concept of "known types" that WCF introduced but Remoting doesn't need. While I sort of understand what known types are and what they do, what I am confused about is the difference between WCF and Remoting: on the sender's side, if WCF doesn't have enough type information about the object at hand to serialize it, why does Remoting? Same for the receiver: why doesn't .NET Remoting have a problem deserializing a received object, but WCF does? Is that because Remoting sends metadata along with the data? If so, why can't WCF do the same?
You're correct - .NET Remoting sends type metadata with the requests. WCF can do the same, but by default it doesn't - it's a lot of extra information which makes the requests larger and more complex to process (affecting performance). Not sending the type information also allows for loosely coupled systems, where the client and the server can version separately, and as long as they adhere to the contracts established in the original version, they will continue working. It also allows WCF to talk to systems written on non-.NET platforms (which isn't possible with Remoting or other technologies that rely on shared type information).
If you really want to go the non-known-types way, you can replace the default serializer used by WCF (the DataContractSerializer) with the NetDataContractSerializer, which sends type information with each request. To use it, search for "wcf netdatacontractserializer" and you'll find how to wire it up; a rough sketch follows.
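A minimal sketch of the usual approach, assuming you swap the serializer via a custom operation behavior (the class name here is illustrative, not from the answer):
using System;
using System.Collections.Generic;
using System.Runtime.Serialization;
using System.ServiceModel.Description;
using System.Xml;

// Substitutes NetDataContractSerializer for the default DataContractSerializer on one operation.
public class NetDataContractOperationBehavior : DataContractSerializerOperationBehavior
{
    public NetDataContractOperationBehavior(OperationDescription operation)
        : base(operation) { }

    public override XmlObjectSerializer CreateSerializer(
        Type type, string name, string ns, IList<Type> knownTypes)
    {
        return new NetDataContractSerializer(name, ns);
    }

    public override XmlObjectSerializer CreateSerializer(
        Type type, XmlDictionaryString name, XmlDictionaryString ns, IList<Type> knownTypes)
    {
        return new NetDataContractSerializer(name, ns);
    }
}
The behavior then has to replace the existing DataContractSerializerOperationBehavior on each operation, on both the client and the service, since it changes the wire format.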

What is StreamBody, and why can't I consume it from a Delphi client?

I am building a WCF service that returns documents (Docx/Doc, PDF, Xls) to the client side.
In the current implementation, the WCF service returns the output document as a Stream to the client.
Without much effort, I can consume it in a .NET client application by copying the returned Stream to a FileStream and saving it to the local file system. I also managed to build a Delphi client which can consume a very simple WCF service that returns a string.
But the problem is that I can't find a way to consume the WCF service that returns a Stream type, because the returned Stream is actually a StreamBody and I don't know how to save that StreamBody to a local file in Delphi. Maybe it is not possible to do that in Delphi?
I am using Delphi 2007, with VS 2010 / .NET 4.0 on the service side.
According to this question on SO, maybe using Stream is not such a great idea at all. I think for documents I could return byte[] to the client side instead, but I still like the idea of a Stream over the network instead of an array.
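For reference, the .NET-side consumption mentioned above is roughly this (a sketch only; the GetDocument proxy call and the target path are placeholders, not from the question):
using (Stream source = client.GetDocument())                       // hypothetical proxy call returning the Stream
using (FileStream target = File.Create(@"C:\Temp\document.docx"))  // hypothetical target path
{
    source.CopyTo(target);  // Stream.CopyTo is available from .NET 4.0 onwards
}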

Consuming a WCF Rest Service with WP7

I have a WCF RESTful service that returns JSON objects that my iPhone and Android apps consume nicely. This is my first attempt at building something like this, and I left WP7 till last as my background lies with C# and VS2010. But it seems it's not going to be as simple as I had guessed.
So I guess I have three questions:
1. Can I consume JSON objects in WP7? If so, does anyone know of a tutorial?
2. If not, can I use the existing service and build some new contracts for consumption in WP7? Or,
3. Do I need to build a whole new service?
Option one is the most desirable, but either way I need to develop for all three operating systems, so does anyone know the best type of model to bring this all together?
Cheers,
Mike.
Yes, but not with the channel factory / proxy programming model which you may be used to with WCF. REST services are usually consumed using simpler classes such as WebClient. You can then use the JSON libraries (DataContractJsonSerializer is in the WP7 profile) to deserialize the data you receive. Even the untyped JSON classes (System.Json from System.Json.dll in Silverlight), while not officially in the profile, kind of work on WP7 as well (I've seen a few people simply reference the SL library in a WP7 project).
If you want proxy support, you can add a new endpoint to the service using BasicHttpBinding, which is supported in WP7; if you don't need it, see 1).
No. See 1) and 2).
Try this to deserialize a JSON object:
public static T Deserialize<T>(string strData) where T : class
{
    // DataContractJsonSerializer is part of the WP7 profile.
    DataContractJsonSerializer serializer = new DataContractJsonSerializer(typeof(T));
    byte[] byteArray = Encoding.UTF8.GetBytes(strData);
    using (MemoryStream memoryStream = new MemoryStream(byteArray))
    {
        return serializer.ReadObject(memoryStream) as T;
    }
}
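A hedged usage sketch of the WebClient approach described above (the URL and the PersonDto type are placeholders, not from the question):
var client = new WebClient();
client.DownloadStringCompleted += (s, e) =>
{
    if (e.Error != null) return;                          // handle/log the failure in real code
    PersonDto person = Deserialize<PersonDto>(e.Result);  // uses the helper above
    // On WP7 the completed callback is raised on the UI thread, so updating the UI here is safe.
};
client.DownloadStringAsync(new Uri("http://example.com/service/people/1"));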
I find a totally WCF-based approach more interesting. This is a good post that addresses the issue:
http://blogs.msdn.com/b/carlosfigueira/archive/2010/04/29/consuming-rest-json-services-in-silverlight-4.aspx

How to Improve WCF Data Services Performance

I'm new to WCF Data Services so I've been playing. After some initial tests I am disappointed by the performance of my test data service.
I realize that because a WCF DS is HTTP-based there is overhead inherent in the protocol but my tests are still way slower than I would expect:
Environment:
All on one box: Quad core 64-bit laptop with 4GB RAM running W7. Decent machine.
Small SQL database (SQLExpress 2008 R2) with 16 tables... the table under test has 243 rows.
Hosted my test service in IIS with all defaults.
Code:
I've created an Entity Framework model (DataContext) for this database (stock codegen by VS2010).
I've created a data service based on this model.
I've created a client which has a direct service reference (ObjectContext) for this service (stock codegen by VS2010).
In the client I am also able to call the EF model directly and also use Native SQL (ADO.NET SqlConnection)
Test Plan:
Each iteration connects to the database (there is an option to reuse connections), queries for all rows in the target table ("EVENTS") and then counts them (thus forcing any deferred fetches to be performed).
Run for 25 iterations each for Native SQL (SqlConnection/SqlCommand), Entity Framework (DataContext) and WCF Data Services (ObjectContext).
Results:
25 iterations of Native SQL: 436ms
25 iterations of Entity Framework: 656ms
25 iterations of WCF Data Services: 12110ms
Ouch. That's about 20x slower than EF.
Since WCF Data Services is HTTP, there's no opportunity for HTTP connection reuse, so the client is forced to reconnect to the web server for each iteration. But surely there's more going on here than that.
EF itself is fairly fast, and the same EF code/model is reused for both the service and the direct-to-EF client tests. There's going to be some overhead for XML serialization and deserialization in the data service, but that much?! I've had good performance with XML serialization in the past.
I'm going to run some tests with JSON and Protocol-Buffer encodings to see if I can get better performance, but I'm curious if the community has any advice for speeding this up.
I'm not strong with IIS, so perhaps there are some IIS tweaks (caches, connection pools, etc.) that can be set to improve this?
Consider deploying as a Windows service instead? IIS may have ISAPI filters, rewrite rules, etc. that requests run through. Even if none of these are active, the IIS pipeline is long enough that something may slow you down marginally.
A self-hosted service should give you a good baseline of how long it takes the request to run, be packed, etc., without the IIS slowdowns.
The link below has a video with some interesting WCF benchmarks and comparisons between WCF Data Services and Entity Framework.
http://www.relationalis.com/articles/2011/4/10/wcf-data-services-overhead-performance.html
I increased the performance of our WCF Data Services API by 41% simply by enabling compression. It was really easy to do. Follow this link that explains what to do on your IIS server: Enabling dynamic compression (gzip, deflate) for WCF Data Feeds, OData and other custom services in IIS7
Don't forget to run iisreset after your change!
On the client-side:
// This is your context basically, you should have this code throughout your app.
var context = new YourEntities("YourServiceURL");
context.SendingRequest2 += SendingRequest2;
// Add the following method somewhere in a static utility library
public static void SendingRequest2(object sender, SendingRequest2EventArgs e)
{
var request = ((HttpWebRequestMessage)e.RequestMessage).HttpWebRequest;
request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
}
Try setting security to "none" in the binding section of the configuration. This should make a big improvement.
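As a hedged illustration, this is the code-based equivalent of that configuration change (the binding type is an assumption, since the post doesn't say which binding is in use):
using System.ServiceModel;

// Hypothetical: equivalent of <security mode="None" /> in the binding configuration.
var binding = new BasicHttpBinding(BasicHttpSecurityMode.None);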
In order to eliminate most of the connection overhead, you can try to batch all operations to the WCF DS to see if that makes a significant difference.
NorthwindEntities context = new NorthwindEntities(svcUri);
var batchRequests =
new DataServiceRequest[]{someCustomerQuery, someProductsQuery};
var batchResponse = context.ExecuteBatch(batchRequests);
For more info see here.
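To actually read the batched results, the usual pattern is roughly the following (a sketch; the query variables above are placeholders from the answer):
// Each element of the batch response corresponds to one of the queries sent above.
foreach (QueryOperationResponse response in batchResponse)
{
    foreach (object entity in response)
    {
        // process each entity returned by that query
    }
}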
WCF Data Services exists to expose the Open Data Protocol (OData) to disparate clients, so you don't have to write or refactor multiple web service methods for each change request. I never advise using it when the entire system is based on the Microsoft technology stack; it's meant for remote clients.
How do you make those 25 calls to WCF?
var WCFobj = new ...Service();
foreach (var calling in CallList)
{
    WCFobj.Call(...);
}
If you call it like that, it means you call WCF 25 times, which consumes too many resources.
For me, I build everything up into a DataTable, use the table name for the stored procedure I'm calling, and a DataRow for the parameters. When calling, just pass the DataTable in encrypted form, using something like:
var table = new DataTable("PROC_CALLING");
// ... populate the table: one DataRow per call (elided in the original) ...
StringBuilder sb = new StringBuilder();
var xml = System.Xml.XmlWriter.Create(sb);
table.WriteXml(xml);
xml.Flush();  // make sure the XmlWriter has written everything to the StringBuilder
var bytes = System.Text.Encoding.UTF8.GetBytes(sb.ToString());
// optionally GZip-compress the bytes here
WCFobj.Call(bytes);
The point is that you pass all 25 calls at once, which can improve performance significantly. If the return objects have the same structure, just pass them back as a DataTable in byte form and convert it back to a DataTable.
I've implemented this method with GZip for import/export data modules. Passing a large number of bytes will make WCF unhappy, so it depends on what you want to spend: computing resources or networking resources.
Things to try:
1) Results encoding: use binary encoding on your WCF channel if possible (see http://msdn.microsoft.com/en-us/magazine/ee294456.aspx), or alternatively use compression: http://programmerpayback.com/2009/02/18/speed-up-your-app-by-compressing-wcf-service-responses/
2) Change your service instance behavior (see http://msdn.microsoft.com/en-us/magazine/cc163590.aspx#S6): try InstanceContextMode = InstanceContextMode.Single and ConcurrencyMode = ConcurrencyMode.Multiple, if you can verify that your service is built in a thread-safe way; a minimal sketch follows below.
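A minimal sketch of 2), assuming a plain WCF service class (the service and contract names are hypothetical, not from the question):
using System.ServiceModel;

[ServiceContract]
public interface IEventsService
{
    [OperationContract]
    int GetEventCount();
}

[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                 ConcurrencyMode = ConcurrencyMode.Multiple)]
public class EventsService : IEventsService
{
    public int GetEventCount()
    {
        // A single instance now serves all callers concurrently, so any shared
        // state inside the service must be protected (locks, concurrent collections, etc.).
        return 0;  // placeholder
    }
}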
Regarding your benchmark, I think you should simulate a more realistic load (including concurrent users) and ignore outliers; the first request to IIS will be really slow because it has to load all the DLLs.

Silverlight 3.0 Binary Serialization Support?

Can I deserialize an object in the Silverlight 3.0 runtime that was serialized using the full .NET 2.0 runtime with the BinaryFormatter? I am using the following code to serialize an object to a byte array, which we write to a DB table:
MemoryStream serStream = new MemoryStream();
BinaryFormatter binFormatter = new BinaryFormatter();
binFormatter.Serialize(serStream, csMetric);
serStream.Position = 0;
return serStream.ToArray();
The Silverlight client then needs to retrieve this binary data from the DB (via a web service call) and deserialize the bytes back into an instance of the csMetric class.
Is this possible? If so, how is that done on the client, given that the BinaryFormatter is not available in the SL 3.0 runtime?
Thanks,
jon
Since you have to go through WCF, and thus the full .NET Framework, to get the data into Silverlight anyway, I'd recommend deserializing the object on the server before sending it back to Silverlight. The Silverlight 3 WCF stack supports binary message encoding, which should make the data transfer reasonably efficient.
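As a hedged sketch, the binary encoding can be enabled programmatically like this (the config-file route works too; this is just the code equivalent):
using System.ServiceModel.Channels;

// Binary message encoding over HTTP, which the Silverlight 3 WCF stack can consume.
var binding = new CustomBinding(
    new BinaryMessageEncodingBindingElement(),
    new HttpTransportBindingElement());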
Jon,
Have you tried to deserialize the object using the DataContractSerializer? I have not tested this exact scenario, but this is how I would approach it:
The following is an extension method on a byte array (byte[]); it needs to live in a static class:
public static T Deserialize<T>(this byte[] yourSerializedByteArray)
{
    T deserializedObject;
    DataContractSerializer serializer = new DataContractSerializer(typeof(T));
    using (MemoryStream ms = new MemoryStream(yourSerializedByteArray))
    {
        deserializedObject = (T)serializer.ReadObject(ms);
    }
    return deserializedObject;
}
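A hypothetical call site (note that DataContractSerializer cannot read BinaryFormatter output, so the server side would need to serialize with DataContractSerializer as well for this to work):
csMetric metric = metricBytes.Deserialize<csMetric>();  // metricBytes: the byte[] fetched from the DB via the web service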
Maybe you would like to try my SharpSerializer. It can serialize data to both binary and XML formats. It works on .NET Full, Compact and Silverlight.
DataContractSerializer has a whole bunch of problems; I've created a binary serializer that removes some of them (at least for me!). It uses reflection and produces reasonably compact representations that can be sent to WCF services.
More info here.