How can I make RestSharp use BSON? - serialization

I'm using RestSharp, and using Json.NET for serialization (see here).
Json.NET supports BSON, and since some of my requests have huge blocks of binary data, I think this would speed up my application dramatically. However, as far as I can tell, RestSharp does not seem to have any in-built support for BSON.
The use of Json.NET is implemented as a custom serializer for RestSharp, and so at first glance it looks like it would be possible to modify that custom serializer to use BSON. But, the Serialize method of the RestSharp.Serializers.ISerializer interface returns a string - which is (I assume) unsuitable for BSON. So, I assume that it would take some more significant changes to RestSharp to implement this change.
Has anyone figured out a way to do this?
Update: I looked at the RestSharp source, and discovered that the RestRequest.AddBody method that takes my object and serializes it into the request body eventually calls Request.AddParameter (with the serialized object data, and the parameter type RequestBody).
I figured that I might be able to serialize my object to BSON and then call Request.AddParameter directly - and indeed I can. However, when RestSharp then executes the RestRequest, it fails to put the binary content into the request, because there are other embedded assumptions about the request content being UTF-8 encoded.
Thus it looks like this hack would not work - there would need to be some changes made to RestSharp itself, and I'm not the man for the job...
Update 2: I decided to have a go at using the debugger to figure out how much of RestSharp I'd have to change to overcome the body-encoding issue, so I swapped out my NuGet version of RestSharp and included the RestSharp project in my solution. And... it worked.
It turns out that there has been a change to RestSharp in the last few months that isn't yet in the NuGet version.
So, you can now use AddParameter and pass in an already-BSON-encoded object, and RestSharp will send it off to the server without complaint.

Per the updates in my question, it turns out that if you have the latest RestSharp source, then instead of this:
request.AddBody(myObject);
... you can do this instead whenever you have a payload that would benefit from using BSON:
using (var memoryStream = new System.IO.MemoryStream())
{
    using (var bsonWriter = new Newtonsoft.Json.Bson.BsonWriter(memoryStream))
    {
        var serializer = new Newtonsoft.Json.JsonSerializer();
        serializer.Serialize(bsonWriter, myObject);
        var bytes = memoryStream.ToArray();
        request.AddParameter("application/bson", bytes, RestSharp.ParameterType.RequestBody);
    }
}
Note that the first parameter to AddParameter is supposedly the parameter name, but in the case of ParameterType.RequestBody it's actually used as the content type. (Yuk).
Note that this relies on a change made to RestSharp on April 11 2013 by ewanmellor/ayoung, and this change is not in the current version on NuGet (104.1). Hence this will only work if you include the current RestSharp source in your project.
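If you use this in more than one place, the pattern can be wrapped in a small extension method. A rough sketch (AddBsonBody is my own name, not part of RestSharp, and it assumes the same post-April-2013 RestSharp build):
using System.IO;
using Newtonsoft.Json;
using Newtonsoft.Json.Bson;
using RestSharp;

public static class RestRequestBsonExtensions
{
    // Serializes obj to BSON and attaches it as the raw request body.
    public static IRestRequest AddBsonBody(this IRestRequest request, object obj)
    {
        using (var memoryStream = new MemoryStream())
        using (var bsonWriter = new BsonWriter(memoryStream))
        {
            new JsonSerializer().Serialize(bsonWriter, obj);
            bsonWriter.Flush();
            // The first argument is used as the content type for RequestBody parameters.
            return request.AddParameter("application/bson", memoryStream.ToArray(), ParameterType.RequestBody);
        }
    }
}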

Gary's answer to his own question was incredibly useful for serializing restful calls. I wanted to answer how to deserialize restful calls using JSON.NET. I am using RestSharp version 104.4 for Silverlight. My server is using Web API 2.1 with BSON support turned on.
To accept a BSON response, create a BSON Deserializer for RestSharp like so
using System.IO;
using Newtonsoft.Json;
using Newtonsoft.Json.Bson;
using RestSharp;
using RestSharp.Deserializers;

public class BsonDeserializer : IDeserializer
{
    public string RootElement { get; set; }
    public string Namespace { get; set; }
    public string DateFormat { get; set; }

    public T Deserialize<T>(IRestResponse response)
    {
        using (var memoryStream = new MemoryStream(response.RawBytes))
        {
            using (var bsonReader = new BsonReader(memoryStream))
            {
                var serializer = new JsonSerializer();
                return serializer.Deserialize<T>(bsonReader);
            }
        }
    }
}
Then, ensure your request accepts "application/bson"
var request = new RestRequest(apiUrl, verb);
request.AddHeader("Accept", "application/bson");
And add a handler for that media type
var client = new RestClient(url);
client.AddHandler("application/bson", new BsonDeserializer());
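Putting both halves together, an end-to-end call might look roughly like this (apiUrl, bsonBytes and MyResponseDto are placeholders, not from the original answers):
var client = new RestClient(url);
client.AddHandler("application/bson", new BsonDeserializer());

var request = new RestRequest(apiUrl, Method.POST);
request.AddHeader("Accept", "application/bson");
// Attach a BSON-encoded body exactly as in the answer above.
request.AddParameter("application/bson", bsonBytes, ParameterType.RequestBody);

// The Silverlight build of RestSharp is async-only, so use ExecuteAsync.
client.ExecuteAsync<MyResponseDto>(request, response =>
{
    MyResponseDto result = response.Data; // deserialized by BsonDeserializer
});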

Related

.NET Core pdf downloader "No output formatter was found for content types 'application/pdf'..."

I'm creating a .NET Core 3.1 web API method to download a PDF for a given filename. This method is shared across teams whose client code is generated using NSwag.
I recently changed the Produces attribute from JSON to Produces("Application/pdf"); this change is required so other teams can generate valid client code. However, since this change I haven't been able to download any files from this method. Requests to download documents return a 406 error (in Postman) and the following error is logged to the server event viewer.
No output formatter was found for content types 'application/pdf, application/pdf' to write the response.
Reverting the produced content-type to 'application/json' does allow documents to be downloaded, but as mentioned, this value is required to be pdf.
Any suggestions would be greatly appreciated.
Method:
[HttpGet("{*filePath}")]
[ProducesResponseType(typeof(FileStreamResult), StatusCodes.Status200OK)]
[ProducesResponseType(StatusCodes.Status404NotFound)]
[ProducesResponseType(StatusCodes.Status400BadRequest)]
[ProducesResponseType(StatusCodes.Status401Unauthorized)]
[Produces("Application/pdf")]
public async Task<ActionResult> GetDocument(string fileName) {
    RolesRequiredHttpContextExtensions.ValidateAppRole(HttpContext, _RequiredScopes);
    var memoryStream = new MemoryStream();
    using (var stream = new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.Read, bufferSize: 4096, useAsync: true)) {
        await stream.CopyToAsync(memoryStream);
    }
    memoryStream.Seek(offset: 0, SeekOrigin.Begin);
    return new FileStreamResult(memoryStream, "Application/pdf");
}
I just came across the same error, and after some investigation I found that the cause of the exception was indeed the model binding error. You already wrote about it in your answer, but on closer inspection it became obvious that the reason was not the binding itself, but rather the response body.
Since you specified [Produces("application/pdf")], the framework assumes this content type is the only one possible for this action, but when an exception is thrown, you get application/json containing the error description instead.
So to make this work for both "happy path" and exceptions, you could specify multiple response types:
[Produces("application/pdf", "application/json")]
public async Task<ActionResult> GetDocument(string fileName)
{
...
}
I'm using
public async Task<IActionResult> BuildPDF()
{
    Stream pdfStream = _pdfService.GetData();
    using (var memoryStream = new MemoryStream())
    {
        // Stream has no ToArray(), so buffer it into a MemoryStream first.
        await pdfStream.CopyToAsync(memoryStream);
        return File(memoryStream.ToArray(), "application/pdf");
    }
}
and it works. Could you please try?
The issue was caused by renaming the method parameter and not updating [HttpGet("{*filePath}")] to [HttpGet("{*fileName}")]
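In other words, the catch-all route template has to be renamed together with the parameter. A minimal sketch of the corrected attribute (the body is unchanged and omitted):
[HttpGet("{*fileName}")] // route parameter now matches the method parameter name
[Produces("Application/pdf")]
public async Task<ActionResult> GetDocument(string fileName)
{
    // ... same body as above ...
}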
I had the same error, it is very confusing in some cases.
I got this error after adding a new parameter of type int[] to my method and forgetting the [FromQuery] attribute for it.
After adding the [FromQuery] attribute, the error was gone.
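For illustration (a hypothetical action, not from the original post), the fix is just the binding attribute on the array parameter:
// Without [FromQuery], [ApiController] binding-source inference may try to bind
// the array from the request body, which can surface as a content-negotiation error.
[HttpGet("documents")]
public IActionResult GetDocuments([FromQuery] int[] ids)
{
    return Ok(ids);
}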

Is there a way to require a minor version be present in the URL path segment?

I'm using the aspnet-api-versioning library to create a fairly simple versioning strategy, using namespaces/folders to specify the routes. My startup code looks like this:
services.AddApiVersioning(options =>
{
    options.AssumeDefaultVersionWhenUnspecified = true;
    options.ReportApiVersions = true;
    options.Conventions.Add(new VersionByNamespaceConvention());
});
My controllers generally look like this:
namespace Something.Api.V1_0.Controllers
{
    [ApiController]
    [Route("[controller]")]
    [Route("v{version:apiVersion}/[controller]")]
    public class MessageController : ControllerBase
    {
        [HttpGet]
        public IActionResult Get()
        {
            var formattedVersion = GetApiVersionString(HttpContext.GetRequestedApiVersion());
            var message = $"V1: It is Get in v1 controller (version {formattedVersion}).";
            return Ok(message);
        }

        [HttpGet]
        [Route("specific-route")]
        public IActionResult SpecificRoute()
        {
            var formattedVersion = GetApiVersionString(HttpContext.GetRequestedApiVersion());
            var message = $"V1: It is specific route (version {formattedVersion}).";
            return Ok(message);
        }
    }
}
Don't focus on GetApiVersionString(), it's just a private method that formats the version string.
This setup allows me to send the following routes to this controller's "specific route" method:
/Message/specific-route (this is for clients unaware of versioning)
/v1/Message/specific-route (matches 1.0)
/v1.0/Message/specific-route
Ideally, I would like to eliminate the second of those, and require that all calls which specify a version specify both a major and a minor version. In other words, I'd like the response to a call to "v1" to be the same as the response to a call to any undefined version. Does this library offer any technique to accomplish this?
Note: All of my namespaces/folders will follow the V_x format, so there will always be a minor version available in my scheme.
The answer is - yes ... and no.
Yes, it is possible to force the formatted API version string value to contain the minor version. Using version.ToString("VV") will format the version in the major.minor format. If the minor version is unspecified, it will be assumed to be 0. This is also true for comparisons, which means that 1 == 1.0.
No, in the sense that a client might not specify the minor numeric version. While it may be useful in being 100% explicit, it tends to be more pragmatic to make it optional when versioning by URL for client requests. You are completely free to impose your will though and generate URLs that explicitly have the .0 in it. Ultimately the net effect is the same.
You could also add some type of middleware or filter that short-circuits requests whose requested API version has a null minor version; a rough sketch of that idea follows.
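This is illustrative only; the attribute name is mine, and it assumes it runs at a point where GetRequestedApiVersion() is already populated (for example as a globally registered action filter):
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Filters;

// Rejects URL-versioned requests that omit the minor version (e.g. /v1/... instead of /v1.0/...).
public class RequireMinorApiVersionAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext context)
    {
        var requestedVersion = context.HttpContext.GetRequestedApiVersion();

        // A version was requested, but without a minor component ("v1" rather than "v1.0").
        if (requestedVersion != null && requestedVersion.MinorVersion == null)
        {
            context.Result = new BadRequestObjectResult(
                "Please specify the API version as major.minor, e.g. v1.0.");
        }
    }
}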
BTW: you can still use HttpContext.GetRequestedApiVersion(), but Model Binding makes this much easier now. You can achieve the same thing by adding an API version parameter. For example, public IActionResult Get(ApiVersion version).

How to secure json result from hijacking in ASPNet Core

I need to secure my json responses from hijacking in aspnet core. I used to have a working solution in beta 8, but I cannot get that to work properly in 1.1.
What I want to do is to check whether a Json response is an enumerable or not. If it is, I wrap the enumerable in an object.
This is what I had in an override of JsonOutputFormatter:
public override Task WriteResponseBodyAsync(OutputFormatterContext context)
{
    if (context.Object is IEnumerable)
    {
        context.Object = new { result = context.Object };
    }
    return base.WriteResponseBodyAsync(context);
}
This no longer works due to changes in the framework.
Does anyone have a good suggestion on how to wrap enumerable responses into an object to avoid hijacking?
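One possible direction, sketched as a result filter rather than an output formatter (this is my own untested sketch against 1.1, not a confirmed solution; the class name is mine):
using System.Collections;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Mvc.Filters;

public class WrapEnumerableResultFilter : IResultFilter
{
    public void OnResultExecuting(ResultExecutingContext context)
    {
        // If the action returned an ObjectResult whose value is enumerable (but not a string),
        // wrap it in an object so the JSON body is never a top-level array.
        var objectResult = context.Result as ObjectResult;
        if (objectResult != null && objectResult.Value is IEnumerable && !(objectResult.Value is string))
        {
            objectResult.Value = new { result = objectResult.Value };
        }
    }

    public void OnResultExecuted(ResultExecutedContext context) { }
}
It could be registered globally, e.g. services.AddMvc(options => options.Filters.Add(new WrapEnumerableResultFilter()));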

Very slow performance deserializing using datacontractserializer in a Silverlight Application

Here is the situation:
Silverlight 3 Application hits an asp.net hosted WCF service to get a list of items to display in a grid. Once the list is brought down to the client it is cached in IsolatedStorage. This is done by using the DataContractSerializer to serialize all of these objects to a stream which is then zipped and then encrypted. When the application is relaunched, it first loads from the cache (reversing the process above) and then deserializes the objects using the DataContractSerializer.ReadObject() method. All of this was working wonderfully under all scenarios until recently, with the entire "load from cache" path (decrypt/unzip/deserialize) taking hundreds of milliseconds at most.
On some development machines but not all (all machines Windows 7), the deserialize process - that is, the call to ReadObject(stream) - takes several minutes and seems to lock up the entire machine, BUT ONLY WHEN RUNNING IN THE DEBUGGER in VS2008. Running the Debug configuration code outside the debugger shows no problem.
One thing that looks suspicious is that when you turn on stopping on exceptions, you can see that ReadObject() throws many, many System.FormatExceptions indicating that a number was not in the correct format. When I turn off "Just My Code", thousands of these get dumped to the screen. None go unhandled. These occur both on the read back from the cache AND on deserialization at the conclusion of a web service call to get the data from the WCF Service. HOWEVER, these same exceptions occur on my laptop development machine, which does not experience the slowness at all. And FWIW, my laptop is really old and my desktop is a 4-core, 6GB RAM beast.
Again, no problems unless running under the debugger in VS2008. Anyone else seen this? Any thoughts?
Here is the bug report link: https://connect.microsoft.com/VisualStudio/feedback/details/539609/very-slow-performance-deserializing-using-datacontractserializer-in-a-silverlight-application-only-in-debugger
EDIT: I now know where the FormatExceptions are coming from. It seems that they are "by design" - they occur when I have doubles being serialized that are double.NaN, so that the xml value looks like NaN... It seems that the DCS tries to parse the value as a number, that fails with an exception, and then it looks for "NaN" et al. and handles them. My problem is not that this does not work... it does... it is just that it completely cripples the debugger. Does anyone know how to configure the debugger/VS2008 SP1 to handle this more efficiently?
cartden,
You may want to consider switching over to XMLSerializer instead. Here is what I have determined over time:
The XmlSerializer and DataContractSerializer classes provide a simple means of serializing and deserializing object graphs to and from XML.
The key differences are:
1. XmlSerializer has a much smaller payload than DCS if you use [XmlAttribute] instead of [XmlElement]; DCS always stores values as elements.
2. DCS is "opt-in" rather than "opt-out":
   - With DCS you explicitly mark what you want to serialize with [DataMember]
   - With DCS you can serialize any field or property, even if they are marked protected or private
   - With DCS you can use [IgnoreDataMember] to have the serializer ignore certain properties
   - With XmlSerializer, public properties are serialized, and need setters to be deserialized
   - With XmlSerializer you can use [XmlIgnore] to have the serializer ignore public properties
3. BE AWARE! DCS.ReadObject DOES NOT call constructors during deserialization. If you need to perform initialization, DCS supports the following callback hooks: [OnDeserializing], [OnDeserialized], [OnSerializing], [OnSerialized] (also useful for handling versioning issues).
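For example (a minimal sketch, not from the original answer), initialization that would normally live in a constructor can be moved into an [OnDeserializing] callback:
using System.Collections.Generic;
using System.Runtime.Serialization;

[DataContract]
public class ProfileSettings
{
    [DataMember]
    public string Theme { get; set; }

    // Non-serialized state that a constructor would normally set up.
    private List<string> m_recentItems = new List<string>();
    public List<string> RecentItems { get { return m_recentItems; } }

    // DCS.ReadObject does not run constructors or field initializers,
    // so defaults have to be re-created here before the [DataMember]s are populated.
    [OnDeserializing]
    private void OnDeserializing(StreamingContext context)
    {
        m_recentItems = new List<string>();
    }
}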
If you want the ability to switch between the two serializers, you can use both sets of attributes simultaneously, as in:
[DataContract]
[XmlRoot]
public class ProfilePerson : NotifyPropertyChanges
{
    [XmlAttribute]
    [DataMember]
    public string FirstName { get { return m_FirstName; } set { SetProperty(ref m_FirstName, value); } }
    private string m_FirstName;

    [XmlElement]
    [DataMember]
    public PersonLocation Location { get { return m_Location; } set { SetProperty(ref m_Location, value); } }
    private PersonLocation m_Location = new PersonLocation(); // Should change over time

    [XmlIgnore]
    [IgnoreDataMember]
    public Profile ParentProfile { get { return m_ParentProfile; } set { SetProperty(ref m_ParentProfile, value); } }
    private Profile m_ParentProfile = null;

    public ProfilePerson()
    {
    }
}
Also, check out my Serializer class that can switch between the two:
using System;
using System.IO;
using System.Runtime.Serialization;
using System.Text;
using System.Xml;
using System.Xml.Serialization;

namespace ClassLibrary
{
    // Instantiate this class to serialize objects using either XmlSerializer or DataContractSerializer
    internal class Serializer
    {
        private readonly bool m_bDCS;

        internal Serializer(bool bDCS)
        {
            m_bDCS = bDCS;
        }

        internal TT Deserialize<TT>(string input)
        {
            // ToByteArray() is the author's custom string extension (not shown here)
            MemoryStream stream = new MemoryStream(input.ToByteArray());
            if (m_bDCS)
            {
                DataContractSerializer dc = new DataContractSerializer(typeof(TT));
                return (TT)dc.ReadObject(stream);
            }
            else
            {
                XmlSerializer xs = new XmlSerializer(typeof(TT));
                return (TT)xs.Deserialize(stream);
            }
        }

        internal string Serialize<TT>(object obj)
        {
            MemoryStream stream = new MemoryStream();
            if (m_bDCS)
            {
                DataContractSerializer dc = new DataContractSerializer(typeof(TT));
                dc.WriteObject(stream, obj);
            }
            else
            {
                XmlSerializer xs = new XmlSerializer(typeof(TT));
                xs.Serialize(stream, obj);
            }
            // be aware that the Unicode Byte-Order Mark will be at the front of the string
            // ToUtfString() is the author's custom byte[] extension (not shown here)
            return stream.ToArray().ToUtfString();
        }

        internal string SerializeToString<TT>(object obj)
        {
            StringBuilder builder = new StringBuilder();
            XmlWriter xmlWriter = XmlWriter.Create(builder);
            if (m_bDCS)
            {
                DataContractSerializer dc = new DataContractSerializer(typeof(TT));
                dc.WriteObject(xmlWriter, obj);
            }
            else
            {
                XmlSerializer xs = new XmlSerializer(typeof(TT));
                xs.Serialize(xmlWriter, obj);
            }
            // flush the writer so the StringBuilder contains the complete document
            xmlWriter.Flush();
            string xml = builder.ToString();
            xml = RegexHelper.ReplacePattern(xml, RegexHelper.WildcardToPattern("<?xml*>", WildcardSearch.Anywhere), string.Empty);
            xml = RegexHelper.ReplacePattern(xml, RegexHelper.WildcardToPattern(" xmlns:*\"*\"", WildcardSearch.Anywhere), string.Empty);
            xml = xml.Replace(Environment.NewLine + " ", string.Empty);
            xml = xml.Replace(Environment.NewLine, string.Empty);
            return xml;
        }
    }
}
This is a guess, but I think it is running slow in debug mode because for every exception, it is performing some actions to show the exception in the debug window, etc. If you are running in release mode, these extra steps are not taken.
I've never done this, so I really don't know if it would work, but have you tried just setting that one assembly to run in release mode while all others are set to debug? If I'm right, it may solve your problem. If I'm wrong, then you only waste 1 or 2 minutes.
About your debugging problem, have you tried disabling the exception assistant? (Tools > Options > Debugging > Enable the exception assistant).
Another option is the exception handling in Debug > Exceptions: you can disable the user-unhandled stuff for the CLR, or only uncheck the System.FormatException exception.
Ok - I figured out the root issue. It was what I alluded to in the EDIT to the main question. The problem was that in the xml, it was correctly serializing doubles that had a value of double.NaN. I was using these values to indicate "na" for when the denominator was 0D. Example: ROE (Return on Equity = Net Income / Average Equity) when Average Equity is 0D would be serialized as:
<ROE>NaN</ROE>
When the DCS tries to de-serialize it, evidently it first tries to parse the number, catches the exception when that fails, and then handles the NaN. The problem is that this seems to generate a lot of overhead when in DEBUG mode.
Solution: I changed the property to double? and set it to null instead of NaN. Everything now happens instantly in DEBUG mode. Thanks to all for your help.
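In code terms, the change amounts to something like this (the class and property names are illustrative, not from the original project):
[DataContract]
public class Ratios
{
    // Previously: double, set to double.NaN when AverageEquity was 0D.
    // NaN round-trips fine, but DCS parses it via a caught FormatException on
    // every read, which is what cripples the debugger.
    // Now: nullable double, with null meaning "not applicable".
    [DataMember]
    public double? ROE { get; set; }
}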
Try disabling some IE addons. In my case, the LastPass toolbar killed my Silverlight debugging. My computer would freeze for minutes each time I interacted with Visual Studio after a breakpoint.

WCF: (MTOM) is there any way to change the scheme used in xop:Content reference uris generated by WCF?

WCF uses Content-ID URI references of the form http://tempuri.org/1/<number> when handling streamed MTOM requests.
Is there any way how to force WCF to use a different Content-ID references for the xop:Include?
Background of the problem:
I am building a .NET client for an MTOM-enabled JAX-WS Java web service that handles streamed large data uploads. I have hand-crafted the service and data contracts (the WSDL-generated contracts were not correct and did not allow streaming).
The problem is that the web service (JAX-WS) does not receive the request body containing the data.
It does receive the data that is transferred in the headers.
We have built a java client for the ws - this one works.
I have captured and compared the HTTP traffic when issuing requests from java and wcf, and the only difference is in how Content-ID reference is generated when posting the multipart data:
WCF uses http://tempuri.org/1/... Content-ID references, which yield an encoded value, like href="cid:http%3A%2F%2Ftempuri.org%2F1%2F634019957020047928"
Java client uses "email-style" uris, like href="cid:3f3ec388-8cd9-47aa-a603-fb1bc17935b8#example.jaxws.sun.com"
These yield the following xop:Include elements (Data is the only element in the SOAP body) (XOP include specification):
//WCF:
<Data>
<xop:Include xmlns:xop="http://www.w3.org/2004/08/xop/include" href="cid:http%3A%2F%2Ftempuri.org%2F1%2F634019957020047928" />
</Data>
//JAVA:
<Data>
<xop:Include xmlns:xop="http://www.w3.org/2004/08/xop/include" href="cid:3f3ec388-8cd9-47aa-a603-fb1bc17935b8#example.jaxws.sun.com"/>
</Data>
later on, in the multipart data, the content is referred to by unencoded Content-ID:
--uuid:7e166bb7-042f-4ba3-b6ef-98fbbc21244b+id=1
Content-ID: <http://tempuri.org/1/634019957020047928>
Content-Transfer-Encoding: binary
Content-Type: application/octet-stream
I guess there may be a bug in the JAX-WS framework and that it is not recognizing the WCF-generated, URL-encoded Content-ID URI references.
Is there any way how to force WCF to use a different Content-ID references for the xop:Include?
EDIT: I have found the XmlMtomWriter, which has the GenerateUriForMimePart method; this is used to generate Content-IDs.
public static string GenerateUriForMimePart(int index)
{
    return string.Format(CultureInfo.InvariantCulture,
        "http://tempuri.org/{0}/{1}", new object[] { index, DateTime.Now.Ticks });
}
It does not seem that the ID generation is in any way overridable.
A similar issue is described here, the answer provided does not help: http://social.msdn.microsoft.com/Forums/en/wcf/thread/f90affbd-f431-4602-a81d-cc66c049e351
Answering my own question after a long investigation: not possible without reimplementing the whole XmlMtomWriter and other related layers and concerns in WCF - almost everything involved in the MTOM implementation is internal.
I know it is an old question, but I faced the same problem two days ago.
I found a way which works, BUT it is a VERY VERY dirty hack (I know that; I thought about not publishing it here, but perhaps it will help somebody). Hopefully you will not blame me for that.
The Content-ID is formatted with the use of CultureInfo.InvariantCulture. I didn't find an official way of replacing it with a custom CultureInfo, but with the help of reflection I got it running. The following implementation is only for .NET 4.0.
public class NoTempUriInvariantCultureInfo : CultureInfo, ICustomFormatter
{
    private static CultureInfo originalCulture;
    private static object originalCultureLock;
    private static int enableCounter;

    private NoTempUriInvariantCultureInfo(CultureInfo invariantCulture)
        : base(invariantCulture.Name)
    {
        originalCulture = invariantCulture;
    }

    public static void Enable()
    {
        if (originalCultureLock == null)
            originalCultureLock = new object();
        lock (originalCultureLock)
        {
            if (enableCounter == 0)
            {
                var mInvCultField = typeof(CultureInfo).GetField("s_InvariantCultureInfo", BindingFlags.NonPublic | BindingFlags.Static);
                mInvCultField.SetValue(null, new NoTempUriInvariantCultureInfo(CultureInfo.InvariantCulture));
            }
            enableCounter++;
        }
    }

    public static void Disable()
    {
        // use the same lock object as Enable() so the two calls are mutually exclusive
        lock (originalCultureLock)
        {
            if (enableCounter == 0)
                return;
            enableCounter--;
            if (enableCounter == 0)
            {
                var mInvCultField = typeof(CultureInfo).GetField("s_InvariantCultureInfo", BindingFlags.NonPublic | BindingFlags.Static);
                mInvCultField.SetValue(null, NoTempUriInvariantCultureInfo.originalCulture);
            }
        }
    }

    public override object GetFormat(Type formatType)
    {
        var result = originalCulture.GetFormat(formatType);
        return result ?? this;
    }

    public string Format(string format, object arg, IFormatProvider formatProvider)
    {
        if (format == null)
            return System.Text.RegularExpressions.Regex.Replace(arg.ToString().Replace("http%3A%2F%2Ftempuri.org%2F1%2F", ""), "http[:][/][/]tempuri[.]org[/][0-9]+[/]*", "");
        return String.Format("{0:" + format + "}", arg);
    }
}
I enable my own "InvariantCulture" only before a WCF call.
NoTempUriInvariantCultureInfo.Enable();
try
{
    // make your call
}
finally
{
    NoTempUriInvariantCultureInfo.Disable();
}
CultureInfo.InvariantCulture is a global state object. Enabling my own InvariantCulture affects every other thread.
Again, it is a dirty hack. But it works.
Both of the XOP include samples that you indicated are correct and acceptable according to the W3C. I refer to them as the URL format and the email format respectively.
I am not a Java developer, but I recall a similar problem when interfacing with a particular Java web service. I recall there being a bug in a particular Java release, and after they (the Java developers) upgraded to the next release version, this issue simply went away. I wish I could provide you with more details, but at the time there were enough problems for me to address from my end of the wire, and I was just glad to have one less item on the defect log.
//WCF: using URL format
<Data>
<xop:Include xmlns:xop="http://www.w3.org/2004/08/xop/include" href="cid:http%3A%2F%2Ftempuri.org%2F1%2F634019957020047928" />
</Data>
//JAVA: using EMAIL format
<Data>
<xop:Include xmlns:xop="http://www.w3.org/2004/08/xop/include" href="cid:3f3ec388-8cd9-47aa-a603-fb1bc17935b8#example.jaxws.sun.com"/>
</Data>