WCF Service with Stream response

I have a WCF service, and one of its methods returns a Stream.
The question is: when I consume that Stream object on the client, am I reading over the network, or has the client already received the full stream on its side?
Would it make any difference if I used a RESTful service instead of WCF?

The whole point of using the streaming interface in WCF is that the client gets a stream from which it can read blocks of bytes. The return object (file, picture, video) will NOT be assembled in full on the server and sent back as one huge chunk; instead, the client retrieves a chunk at a time from the stream returned by the WCF service.
Your client gets back a Stream instance from which it can then read the data, just like from a FileStream or a MemoryStream. That way, the amount of memory needed at any given time stays manageable: instead of potentially multiple gigabytes in buffered mode, you transfer a large file in, say, 1 MB chunks.
Marc
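As a minimal sketch of what this looks like in practice (the service name, file names, and buffer size here are illustrative, not from the question):

```csharp
using System.IO;
using System.ServiceModel;

[ServiceContract]
public interface IDocumentService
{
    // With transferMode="Streamed" on the binding, this return value
    // is not buffered: the client reads bytes as they arrive.
    [OperationContract]
    Stream GetDocument(string name);
}

// Client side: read from the returned stream in small blocks
// instead of holding the whole payload in memory at once.
// (client is assumed to be a generated proxy for IDocumentService.)
using (Stream stream = client.GetDocument("report.pdf"))
using (FileStream file = File.Create(@"C:\temp\report.pdf"))
{
    var buffer = new byte[64 * 1024]; // 64 KB blocks
    int read;
    while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
        file.Write(buffer, 0, read);
}
```

Note that this only behaves as described when the binding's transferMode is set to Streamed; with the default Buffered mode the whole message is assembled before the client sees any of it.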

Related

Which is the best transferMode for soap response?

I have added the service reference to my web.config file, but I am not sure about the transferMode property inside the binding tag.
In basicHttpBinding, which is the best transferMode for a SOAP/XML response?
Basically there are four transfer modes. If you narrow those down to two, Buffered and Streamed, here are the criteria:
If you are transferring large files, mostly binary files, try using Streamed. This mode streams the data to the client instead of sending one big chunk, which makes your application more efficient in terms of memory consumption. Note that some of WCF's advanced functionality is not available with this transfer mode.
Buffered is selected by default. It is suitable for normal messages of relatively small or medium size: the whole request or response is buffered in memory and then flushed to the client or server.
There is also an approach that requires a custom channel and sends messages in multiple chunks:
Chunk Channel
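For reference, the transfer mode is set on the binding in config; a hedged example (the binding name and quota value are illustrative):

```xml
<basicHttpBinding>
  <!-- transferMode can be Buffered (the default), Streamed,
       StreamedRequest, or StreamedResponse -->
  <binding name="streamedHttp"
           transferMode="Streamed"
           maxReceivedMessageSize="2147483647" />
</basicHttpBinding>
```

StreamedRequest and StreamedResponse are the two remaining modes, streaming only one direction of the exchange.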

Cancelling WCF calls with large data?

I'm about to implement a FileService using WCF. It should be able to upload files by providing the file content itself and the file name. The current ServiceContract looks like the following:
[ServiceContract]
public interface IFileService
{
    [OperationContract]
    [FaultContract(typeof(FaultException))]
    byte[] LoadFile(string relativeFileNamePath);

    [OperationContract]
    [FaultContract(typeof(FaultException))]
    void SaveFile(byte[] content, string relativeFileNamePath);
}
It works fine at the moment, but I want to reduce the network payload of my application when using this FileService. I need to provide many files as soon as the user opens a specific section of my application, but I might be able to cancel some of the transfers as the user navigates further through the application. As many of my files are somewhere between 50 and 300 MB, it takes quite a few seconds to transfer them (the application might run on very slow networks, so it might take up to a minute).
To clarify, and to outline the difference from all those other WCF questions: the specific problem is that moving the data between client and server is the bottleneck, not the performance of the service itself. Is changing the interface to a streamed WCF service reasonable?
It is good practice to use a stream if the file size is above a certain amount. In the enterprise application we are writing at work, anything bigger than 16 KB is streamed; anything smaller is buffered. Our file service is specifically designed to handle this logic.
When the transfer mode of your service is set to Buffered, the data is buffered on the client as well as on the service while it is being transmitted. This means that if you are sending a 300 MB file, all 300 MB are buffered on both ends before the call completes, which will definitely create bottlenecks. For performance reasons, buffering should only be used for small files that buffer quickly; otherwise, a stream is the way to go.
If the majority (or all) of your files are large, I'd switch to using a stream.
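As a sketch of what the streamed version of that contract could look like (names are illustrative, not a definitive rewrite): streamed operations allow only a single body member, so the file path moves into a message header via a message contract.

```csharp
using System.IO;
using System.ServiceModel;

[MessageContract]
public class FileUploadMessage
{
    // Streamed operations permit only one body member,
    // so the path travels as a message header instead.
    [MessageHeader]
    public string RelativeFileNamePath { get; set; }

    [MessageBodyMember]
    public Stream Content { get; set; }
}

[ServiceContract]
public interface IFileService
{
    [OperationContract]
    Stream LoadFile(string relativeFileNamePath);

    [OperationContract]
    void SaveFile(FileUploadMessage request);
}
```

Cancelling mid-transfer then amounts to closing the stream/proxy on the client, so the remaining bytes are never pulled across the wire.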

Maximum binary contents length over WCF/Http

We have a WCF service that has a threshold of 30MB to send files over an http message, anything above that value gets transferred by file copy and the path sent back to the caller. Now we were requested to eliminate that file copy because customers complained it was too slow. So the decision was to remove any size limitation when sending binary content over WCF/HTTP.
My question is - how reliable is that? What type of issues will we encounter by pushing, say, a 2GB file over the wire in a single WCF message, if that is even possible?
Thanks!
If you set MaxReceivedMessageSize on your WCF service to a high enough value, you can push a fairly large file through the service. The maximum is Int64.MaxValue = 9,223,372,036,854,775,807, so you can certainly set a value that covers a 2 GB message.
You might want to control MaxBufferSize to ensure you're not trying to hold too much in memory, and maybe consider switching to the more binary-efficient MTOM message encoding if you can. Note that MaxReceivedMessageSize governs the size of the message after the binary file has been encoded, which means the original binary file that can be sent over the service will be somewhat smaller than 2 GB.
MSDN has a very nice article covering sending large amounts of data over WCF and what to look out for: Large Data and Streaming.
Edit: It turns out the maximum value allowed is actually Int64.MaxValue.
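Putting those knobs together, a hedged config sketch (the binding name and the 2.2 GB quota are illustrative values, not recommendations):

```xml
<basicHttpBinding>
  <!-- maxReceivedMessageSize caps the encoded message size, so it is
       set a little above the 2 GB payload; Int64.MaxValue is the ceiling. -->
  <binding name="largeMessages"
           messageEncoding="Mtom"
           transferMode="Streamed"
           maxReceivedMessageSize="2200000000"
           maxBufferSize="65536" />
</basicHttpBinding>
```

With transferMode="Streamed", maxBufferSize only bounds what is buffered (headers and the like) rather than the whole message, which is what keeps memory use in check for multi-gigabyte transfers.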

resuming file upload seeking a stream

I am uploading files from clients to the server. When the server program receives the stream, the Length property is not supported and CanSeek returns false, so how would seeking be possible? I can get the length if I read it on the client and send it as a header in the message contract, but I don't know how to seek. Ideas?
WCF is not a technology for file transfers. Moreover, Seek is not supported by the StreamFormatter used internally, because the whole idea of seeking in a distributed application makes no sense. For it to work correctly, the internal stream would have to be a network protocol with flow control over the transferred data, which it is not; internally the stream is only an array of bytes. That means even if WCF supported seeking, you would still need to transfer all the data up to the seek position.
If you need resume functionality, you must implement it yourself by manually creating chunks of data, uploading them, and appending them to the file on the server. The server keeps track of the last correctly received chunk and refuses chunks it has already received. MSDN has a sample implementation of this as a custom channel.
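A rough sketch of that chunked approach (the contract and names here are hypothetical, not taken from the MSDN sample):

```csharp
using System.IO;
using System.ServiceModel;

[ServiceContract]
public interface IResumableUpload
{
    // The client asks where to resume: the server returns the byte
    // offset just past the last correctly received data.
    [OperationContract]
    long GetResumeOffset(string fileName);

    // The client sends one chunk at a time; the server appends it
    // and refuses chunks it has already stored.
    [OperationContract]
    void AppendChunk(string fileName, long offset, byte[] data);
}

// Server-side append, refusing duplicate or out-of-order chunks:
public void AppendChunk(string fileName, long offset, byte[] data)
{
    string path = Path.Combine(uploadRoot, fileName);
    using (var file = new FileStream(path, FileMode.Append))
    {
        if (offset != file.Length)
            return; // already have this chunk (or a gap): refuse it
        file.Write(data, 0, data.Length);
    }
}
```

After a dropped connection the client simply calls GetResumeOffset and continues from there, which is the "seek" the question is really after.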
The stream sample here http://go.microsoft.com/fwlink/?LinkId=150780 does what you're trying to do.
WCF\Basic\Contract\Service\Stream\CS\Stream.sln
The sample is explained here:
http://msdn.microsoft.com/en-us/library/ms751463.aspx

Send binary data via WCF: binary vs MTOM encoding

I have limited knowledge of WCF and of sending binary data via WCF, so this question may be somewhat rudimentary.
I would like to know the difference between sending data with BinaryMessageEncodingBindingElement and with MtomMessageEncodingBindingElement. Even after reading the MSDN page on Large Data and Streaming, it is still not clear to me when to use which approach.
Also, a small question: are a message with attachments and an MTOM message the same thing?
MTOM is a standard that uses multi-part MIME-encoded messages to send, as pure binary, the portions of a message that are large and would be too expensive to base64 encode. The SOAP message itself is sent as the initial part and contains references to the binary parts, which a web service stack like WCF can then pull back together into a single representation of the message.
Binary encoding is entirely proprietary to WCF and isn't really specific to large messages. It is a binary representation of the XML Infoset, which is far more compact on the wire and faster to parse than text-based formats. If you happen to be sending large binary chunks of data, they simply fit right in with the other bytes being sent.
Streaming can be used with any message format. That's more about when the data is written to the network versus being buffered entirely in memory before being handed to the network transport. Smaller messages make more sense to buffer up before sending, while larger messages, especially those containing large binary chunks or streams, need to be streamed or they will exhaust memory resources.
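In config terms, the two encodings are selected like this (binding names are illustrative; binary encoding over HTTP requires a custom binding, since basicHttpBinding only offers Text and Mtom):

```xml
<!-- MTOM over HTTP: standard and interoperable; binary parts are
     sent as MIME attachments instead of base64 text. -->
<basicHttpBinding>
  <binding name="mtomHttp" messageEncoding="Mtom" />
</basicHttpBinding>

<!-- Binary encoding: WCF-proprietary (the default for netTcpBinding),
     usable over HTTP only via a custom binding. -->
<customBinding>
  <binding name="binaryHttp">
    <binaryMessageEncoding />
    <httpTransport />
  </binding>
</customBinding>
```

The practical rule of thumb this implies: binary encoding when both ends are WCF, MTOM when you need interoperability with non-WCF clients sending large binary payloads.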