I have a WCF REST Service which accepts a JSON string
One of the parameters is a large string of numbers
This causes the following error, which is visible by enabling tracing and viewing the log in SvcTraceViewer:
There was an error deserializing the object of type CarConfiguration. The maximum string content length quota (8192) has been exceeded while reading XML data. This quota may be increased by changing the MaxStringContentLength property on the XmlDictionaryReaderQuotas object used when creating the XML reader.
Now I've read all sorts of articles advising how to rectify this
All of them recommend increasing various config settings on the server and client
e.g.
Error Serializing String in WebService call
http://bloggingabout.net/blogs/ramon/archive/2008/08/20/wcf-and-large-messages.aspx
http://social.msdn.microsoft.com/Forums/en/wcf/thread/f570823a-8581-45ba-8b0b-ab0c7d7fcae1
So my config file looks like this
<webHttpBinding>
<binding name="webBinding" maxBufferSize="5242880" maxReceivedMessageSize="5242880" >
<readerQuotas maxDepth="5242880" maxStringContentLength="5242880" maxArrayLength="5242880" maxBytesPerRead="5242880" maxNameTableCharCount="5242880"/>
</binding>
</webHttpBinding>
...
...
...
<endpoint
address="/"
binding="webHttpBinding"
bindingConfiguration="webBinding"
My problem is that I can change this on the server, but there are no WCF config settings on the client: it's a REST service, and I'm just making an HTTP request using the WebClient object.
Any ideas?
So it turns out you need a fully qualified URL on the endpoint address, not a relative one.
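For illustration, a sketch of what that change looks like (the host name and contract here are hypothetical; the point is an absolute address rather than "/"):
<endpoint
    address="http://myserver/CarConfigurationService.svc"
    binding="webHttpBinding"
    bindingConfiguration="webBinding"
    contract="ICarConfigurationService" />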
That error wouldn't be happening on the client, since reader quotas are a WCF-only thing and WebClient/HttpWebRequest don't do deserialization themselves or enforce any other kind of quotas.
So I'd say it's likely you're putting the configuration in the wrong place and it's not getting picked up.
Either that or... you're not using one of the WCF DataContract Serializers manually on the client side, are you?
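For reference, a minimal sketch of what that client call typically looks like (hypothetical URL and payload) - there is nowhere for a WCF reader quota to apply on this side:
// requires using System.Net;
using (var client = new WebClient())
{
    client.Headers[HttpRequestHeader.ContentType] = "application/json";
    string json = "{\"Numbers\":\"123456789...\"}"; // hypothetical large JSON payload
    string response = client.UploadString("http://myserver/CarConfigurationService.svc/", json);
}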
I have a WCF service with the following custom binding:
<binding name="binaryHttpBinding" >
<binaryMessageEncoding />
<httpTransport maxReceivedMessageSize="2147483647" />
</binding>
(The client, of course, has configuration that matches this binding.) The problem is that the client doesn't receive the generic FaultException detail; that is, "T" never reaches the client, which I can verify by tracing the calls. However, if I replace binaryMessageEncoding with textMessageEncoding using SOAP 1.2, all fault exceptions arrive enriched with the fault detail.
I searched the net and wasn't able to find anything claiming that binary message encoding over HTTP is incompatible with generic WCF fault exceptions. Also, it doesn't look like I can control much about the binary message encoding; for example, I can't set the SOAP message version in configuration (not supported by WCF for binary encoding). I wonder whether this scenario is supported at all.
After spending quite a few hours trying to figure out what could be going wrong, I finally made it work. There were two reasons for the failure, neither of them obvious.
The fault message class had an overridden ToString method that did some computation. Sure, it was unwise to put such logic in ToString, but who could guess this would affect only binary serialization?
The FaultException constructor has an optional "action" parameter that I had set to the name of the method where the exception occurred. Apparently WCF is quite picky about what can be assigned to the action, but leaving it blank always works. Again, who could guess that this only affects binary serialization, and in such a strange way (the fault detail is silently discarded on the client side)?
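A minimal sketch of the second fix (MyFaultDetail is a hypothetical fault contract type): throw the fault without the action argument.
// Detail arrives fine over binaryMessageEncoding when no action is supplied.
throw new FaultException<MyFaultDetail>(
    new MyFaultDetail { Message = "Processing failed" },
    new FaultReason("Processing failed"));
// The overload that also takes an action string is the one described above as problematic:
// throw new FaultException<MyFaultDetail>(detail, reason, new FaultCode("Sender"), "MyMethodName");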
I'm using NetMessagingBinding on an IIS-hosted WCF service to consume messages published on a Windows Server Service Bus topic.
From my understanding there is no limit on message size for topics in Windows Server Service Bus, but nevertheless I'm getting an error deserializing a message from the subscription:
System.ServiceModel.Dispatcher.NetDispatcherFaultException: (...)
The maximum string content length quota (8192) has been exceeded while reading XML data.
This quota may be increased by changing the MaxStringContentLength property on the XmlDictionaryReaderQuotas object used when creating the XML reader.'.
Please see InnerException for more details. ---> System.Runtime.Serialization.SerializationException: There was an error deserializing the object of type [Type].
The maximum string content length quota (8192) has been exceeded while reading XML data. This quota may be increased by changing the MaxStringContentLength property on the XmlDictionaryReaderQuotas object used when creating the XML reader. ---> System.Xml.XmlException:
The maximum string content length quota (8192) has been exceeded while reading XML data. This quota may be increased by changing the MaxStringContentLength property on the XmlDictionaryReaderQuotas object used when creating the XML reader.
The way I see it, there is no setting I can change in the WCF web.config to raise the maximum string content length. The only property that looks related is MaxBufferPoolSize, but it is not exposed through the web.config.
The binding configuration used is:
<bindings>
<netMessagingBinding>
<binding name="messagingBinding"
closeTimeout="00:03:00" openTimeout="00:03:00"
receiveTimeout="00:03:00" sendTimeout="00:03:00"
prefetchCount="-1" sessionIdleTimeout="00:01:00">
<transportSettings batchFlushInterval="00:00:01" />
</binding>
</netMessagingBinding>
</bindings>
Thanks in advance,
Joao Carlos de Sousa
This issue can also be solved by using a custom binding that uses the netMessagingTransport. That way the readerQuotas element can be used to define the reader quotas.
<customBinding>
<binding name="sbBindingConfiguration" sendTimeout="00:01:00" receiveTimeout="00:01:00" openTimeout="00:01:00">
<binaryMessageEncoding>
<readerQuotas maxDepth="100000000" maxStringContentLength="100000000"
maxArrayLength="100000000" maxBytesPerRead="100000000" maxNameTableCharCount="100000000"/>
</binaryMessageEncoding>
<netMessagingTransport manualAddressing="false" maxBufferPoolSize="100000" maxReceivedMessageSize="100000000">
<transportSettings batchFlushInterval="00:00:00"/>
</netMessagingTransport>
</binding>
</customBinding>
Please refer to this post for more details on how to use the custom binding.
From the error, it seems this is a WCF-level error and not a Service Bus one. Have you tried raising the maximum message size? This thread has info on it, but basically, you need to set up something like the following in your binding's configuration in the web.config:
<binding name="yourBinding"
maxReceivedMessageSize="10000000"
maxBufferSize="10000000"
maxBufferPoolSize="10000000">
<readerQuotas maxDepth="32"
maxArrayLength="100000000"
maxStringContentLength="100000000"/>
</binding>
NetMessagingBinding currently does not allow one to change the MaxStringContentLength through XML configuration.
A solution that worked for me was to create a message formatter behaviour extension by implementing the IDispatchMessageFormatter interface.
The extension can then be used either by:
creating an attribute that can be used in code to identify which operation contracts will use the message formatter behaviour
public class MessageFormatterExtensionBehaviorAttribute : Attribute, IOperationBehavior
{
    (...)

    public void ApplyDispatchBehavior(OperationDescription operationDescription, DispatchOperation dispatchOperation)
    {
        dispatchOperation.Formatter = new MessageFormatterExtension();
    }

    (...)
}
creating a configuration element that exposes the custom behaviour.
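A sketch of that second option, assuming the MessageFormatterExtension formatter referenced above: a BehaviorExtensionElement exposes an endpoint behaviour, and the behaviour assigns the formatter to every dispatch operation.
public class MessageFormatterExtensionElement : BehaviorExtensionElement
{
    public override Type BehaviorType
    {
        get { return typeof(MessageFormatterExtensionEndpointBehavior); }
    }

    protected override object CreateBehavior()
    {
        return new MessageFormatterExtensionEndpointBehavior();
    }
}

public class MessageFormatterExtensionEndpointBehavior : IEndpointBehavior
{
    public void AddBindingParameters(ServiceEndpoint endpoint, BindingParameterCollection bindingParameters) { }
    public void ApplyClientBehavior(ServiceEndpoint endpoint, ClientRuntime clientRuntime) { }
    public void Validate(ServiceEndpoint endpoint) { }

    public void ApplyDispatchBehavior(ServiceEndpoint endpoint, EndpointDispatcher endpointDispatcher)
    {
        // Swap in the custom formatter for every operation on this endpoint.
        foreach (DispatchOperation operation in endpointDispatcher.DispatchRuntime.Operations)
        {
            operation.Formatter = new MessageFormatterExtension();
        }
    }
}
The element is then registered under behaviorExtensions and referenced from an endpointBehaviors section in the usual way.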
I have a use case in my ASP.NET MVC app in which I need to save a collection of about 15k records (this comes from a CSV file upload). I'm putting it through CSLA business objects in order to validate the uploaded data with business rules.
I'm making use of the WCF DataPortal. When save is called I get this error after about 30s to 45s:
System.ServiceModel.EndpointNotFoundException: There was no endpoint listening at [my dataportal host address]/WcfPortal.svc that could accept the message.
I have determined that if I break down the collection into smaller chunks, and call save on each of those chunks, the use case completes without a problem.
I have configured my service to use the max values as follows, as recommended in Rocky's book (and increased the sendTimeout based on other guidance):
<binding name="wsHttpBinding_IWcfPortal" maxReceivedMessageSize="2147483647" sendTimeout="05:00:00">
<readerQuotas maxBytesPerRead="2147483647" maxArrayLength="2147483647" maxStringContentLength="2147483647" maxNameTableCharCount="2147483647" maxDepth="2147483647"/>
</binding>
Now I KNOW for a fact that my data does not exceed the 2147483647 size limit. Besides, if it did, I would expect a more meaningful error message indicating this (as I got when the size limits were at their defaults).
I have turned on WCF logging/tracing, which reveals nothing. This seems to be some communication-level error that gets hit before the WCF stack comes into the picture.
Can anyone advise why I am getting this error when trying to save this large collection?
As WCF has changed over the years, other limits have been added that you may also need to change. The latest info on WCF configuration for the data portal is available in two places:
The data portal FAQ page
The Using CSLA 4: Data Portal Configuration ebook
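One hedged example of such a limit, not taken from those resources: for IIS-hosted services, request filtering rejects bodies larger than maxAllowedContentLength (roughly 30 MB by default) before the WCF stack or its tracing ever sees the request, and the client reports the resulting 404 as an EndpointNotFoundException. Raising it looks like this:
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="2147483647" />
    </requestFiltering>
  </security>
</system.webServer>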
I am facing a problem while using WCF to fetch a large amount of data, and I do not want to increase maxReceivedMessageSize="65536". Is there an alternative, or can I achieve this using streaming? If yes, then how?
Please suggest.
Yes, you can stream data in WCF, but WCF has some limitations when working in Streamed mode. So you might consider implementing a method that returns chunks of data and calling it multiple times, if you don't mind handling that yourself.
Otherwise you can enable Streamed mode in configuration like this:
<basicHttpBinding>
<binding name="HttpStreaming" maxReceivedMessageSize="67108864"
transferMode="Streamed"/>
</basicHttpBinding>
<!-- an example customBinding using Http and streaming-->
<customBinding>
<binding name="Soap12">
<textMessageEncoding messageVersion="Soap12WSAddressing10" />
<httpTransport transferMode="Streamed" maxReceivedMessageSize="67108864"/>
</binding>
</customBinding>
And return a Stream object from your Contract method. This way the data will be transferred as you read the stream object.
[ServiceContract]
public interface IRemoteFileService
{
    [OperationContract]
    Stream OpenFile(string serverPath);
}
If your data is already in a stream, as when you transfer a file, you just open the stream and return it. Otherwise you can use a MemoryStream and a DataContractSerializer to serialize almost any object tree.
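A minimal sketch of both cases (the service class name and the LoadData helper are hypothetical):
public class RemoteFileService : IRemoteFileService
{
    public Stream OpenFile(string serverPath)
    {
        // File content is already a stream: open it and return it; WCF pulls from it as the client reads.
        return File.OpenRead(serverPath);
    }

    public Stream GetLargeObjectTree()
    {
        var data = LoadData();
        var serializer = new DataContractSerializer(data.GetType());
        var stream = new MemoryStream();
        serializer.WriteObject(stream, data);
        stream.Position = 0; // rewind so WCF streams it from the beginning
        return stream;
    }

    private static List<string> LoadData()
    {
        return new List<string> { "example", "data" }; // stand-in for the real object tree
    }
}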
For details, check this and this.
While this sounds simple, there are complications and limitations to Streamed mode. If you just need a simple way to bypass the size limits for a big object transfer, consider sending the object in parts over multiple calls.
I have a client-server application, which communicates using WCF, and uses NetDataContractSerializer to serialize the objects graph.
Since a lot of data is transferred between the server and the client, I tried to decrease its size by fine-tuning the sizes of the data members (e.g. changing int to short, long to int, etc.).
After finishing the tuning, I found that the amount of transferred data hadn't changed! The problem is that the NetDataContractSerializer serializes the object graph to XML, so no matter what the size of the data member is, the only thing that matters is the size of its value. For example, the value 10023 of an Int16 data member will be serialized as the string "10023" (0x3130303233) instead of just 10023 (0x2727).
I remember that in Remoting I could use the BinaryFormatter, which serialized values according to the type of the data member, but I don't know if it's possible to use it with WCF.
Does someone have a solution?
WCF uses SOAP messages, but what kind of message encoding is used, is totally up to you.
Basically, out of the box, you have two: text encoding (text representation of XML message) or binary encoding. You can write your own message encoding, if you really must and need to.
Out of the box, the basicHttp and wsHttp bindings use text encoding - but you can change that if you want to. The netTcp binding (which is the clear preferred choice behind corporate firewalls) will use binary by default.
You can also define (just in config) your own "binary http" protocol, if you wish:
<bindings>
<customBinding>
<binding name="BinaryHttpBinding">
<binaryMessageEncoding />
<httpTransport />
</binding>
</customBinding>
</bindings>
and then use it in your service and client side config:
<services>
<service name="YourService">
<endpoint
address="http://localhost:8888/YourService/"
binding="customBinding"
bindingConfiguration="BinaryHttpBinding"
contract="IYourService"
name="YourService" />
</service>
</services>
Now you have an HTTP-based transport protocol which will encode your messages in compact binary, ready for you to use and enjoy!
No additional coding, messy hacks, or manual XML serialization code needed - just plug it together and use it! Ah, the joy of WCF flexibility!
First thought: have you enabled transport compression?
How complex is the data? If it is something that would work with the regular DataContractSerializer (i.e. a simple object tree), then protobuf-net may be of use. It is a very efficient binary serialization library with support for WCF via additional attributes on the service contract - for example:
[ServiceContract]
public interface IFoo
{
    [OperationContract, ProtoBehavior]
    Test3 Bar(Test1 value);
}
(the [ProtoBehavior] attribute is what swaps in the different serializer for this method)
However:
it needs to be able to identify a numeric tag for each property - either via extra attributes, or it can use the Order on a [DataMember(Order = x)] attribute
inheritance (if you are using it) requires extra attributes
it works best if you are using assembly sharing ("mex" doesn't love it...)
Nicely, it also works with MTOM, reducing the base-64 cost for larger messages.
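For example, a data contract that protobuf-net can pick up via the existing DataMember order (a hypothetical type, shaped like the Test1 parameter above):
[DataContract]
public class Test1
{
    [DataMember(Order = 1)]
    public int Id { get; set; }

    [DataMember(Order = 2)]
    public string Name { get; set; }
}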
Here is an example of how to create a custom encoding: https://www.codeproject.com/Articles/434665/WCF-Serialization-A-Case-Study
It's worth noting that what actually gets sent is the same as if you had a service method sending byte[] with default encoding. The message going over the wire still uses a SOAP XML envelope regardless of how you configure serialization.
It looks like this:
POST http://127.0.0.1:12345/forSwerGup182948/Client HTTP/1.1
Content-Type: text/xml; charset=utf-8
VsDebuggerCausalityData: uIDPo+WkoDpet/JOtGlW+EHdpDQAAAAAvFs5XOJ0tEW0wTvNVRDUIiabR6u+p+JNnd5Z+SWl1NcACQAA
SOAPAction: "http://tempuri.org/ITransmissionService/SendData"
Host: 127.0.0.1:12345
Expect: 100-continue
Accept-Encoding: gzip, deflate
Content-Length: 2890
<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/"><s:Body><SendData xmlns="http://tempuri.org/"><message>eyI8Q2FsbGJhY2tJZD5rX19CYWNraW5nRmllbGQiOiJlYTQ3ZWIzMS1iYjIzLTRkODItODljNS1hNTZmNjdiYmQ4MTQiLCI8RnJvbT5rX19CYWNraW5nRmllbGQiOnsiPENoYW5uZWxOYW1lPmtfX0JhY2tpbmdGaWVsZCI6Ikdyb3VwMSIsIjxOYW1lPmtfX0==</message></SendData></s:Body></s:Envelope>
The binary encoder will NOT serialize your object as binary, because it has nothing to do with serialization at all! It works at a lower layer and decides how the message is transported between server and client.
In other words, the object is first serialized (by the DataContractSerializer, for example) and then encoded (by the binary encoder). So your object will always be in XML format as long as the DataContractSerializer is involved.
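A quick way to see this for yourself (a sketch with a hypothetical MyDto type): serialize an object with the DataContractSerializer and look at the bytes - they are XML text regardless of which encoder the binding later uses.
[DataContract]
public class MyDto
{
    [DataMember]
    public short Value { get; set; }
}

// requires using System.IO, System.Runtime.Serialization, System.Text
var serializer = new DataContractSerializer(typeof(MyDto));
using (var ms = new MemoryStream())
{
    serializer.WriteObject(ms, new MyDto { Value = 10023 });
    // Prints something like <MyDto ...><Value>10023</Value></MyDto> - "10023" is five characters of text, not two bytes.
    Console.WriteLine(Encoding.UTF8.GetString(ms.ToArray()));
}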
If you want more compact data and better performance, read this blog:
https://blogs.msdn.microsoft.com/dmetzgar/2011/03/29/protocol-buffers-and-wcf/