.NET API array of objects becomes empty object ONLY when returned as JSON - sql

I have a .NET API that was formerly rendering views with a model, but is now set to return the model directly to the API caller as JSON. When rendering the view with the model, an array of objects populates fine. When returning strictly JSON data, almost all of that data is missing.
Running the debugger, the data is present at the return statement, but the end result has an empty object instead of an array of objects.
Pseudo Code:
obj.property1 = sqlQuery1.ToArray(); //Empty object if returned using Ok(result)
obj.property2 = sqlQuery2.ToArray(); //Empty object if returned using Ok(result)
obj.property3 = sqlQuery3.ToArray(); //This one comes through for some reason
//Pack these objects into array
return Ok(result); //Most properties become {}, debugging confirms they exist at this point
return View(result); //All properties render normally

It was solved. The issue was that the models were using [DataContract], which resulted in empty objects. Removing the attribute safely fixed the issue.
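For context, serializers that honor data contract attributes (Json.NET, DataContractJsonSerializer) treat a [DataContract] type as opt-in: only members marked with [DataMember] are written out, so unannotated properties silently disappear. A minimal sketch of the symptom and of both fixes; the model and property names are assumptions, not the poster's actual classes:

// Hypothetical element type illustrating the empty-object symptom described above.
[DataContract]
public class ItemRow
{
    // No [DataMember]: DataContract-aware serializers skip this member, so each
    // ItemRow serializes as an empty object.
    public string Name { get; set; }

    // Fix 1: mark members explicitly so they are serialized again.
    [DataMember]
    public int Id { get; set; }
}

// Fix 2 (what the poster did): remove [DataContract] from the models entirely;
// the serializer then falls back to opt-out behaviour and writes all public properties.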

Efficient way to bring parameters into controller action URLs

In ASP.NET Core you have multiple ways to generate a URL for a controller action, the newest being tag helpers.
When using tag helpers for GET requests, asp-route-* attributes are used to specify route parameters. From what I understand, complex objects are not supported in the route values. And sometimes a page can have many different links pointing to itself, possibly with a minor addition to the URL for each link.
To me it seems wrong that any modification to a controller action's signature requires changing all tag helpers that use the action. I.e. if one adds string query to the controller, one must add Query to the model and add asp-route-query="@Model.Query" in 20 different places spread across cshtml files. This approach is setting the code up for future bugs.
Is there a more elegant way of handling this? For example, some way of having a request object? (I.e. the request object from the controller can be put into the Model and fed back into the action URL.)
In my other answer I found a way to provide the request object through the Model.
From the SO post @tseng provided I found a smaller solution. This one does not use a request object in the Model, but it retains all route parameters unless explicitly overridden. It won't allow you to specify the route through a request object, which is most often not what you want anyway. But it solves the problem in the OP.
<a asp-controller="Test" asp-action="HelloWorld" asp-all-route-data="@Context.GetQueryParameters()" asp-route-somestring="optional override">Link</a>
This requires an extension method to convert query parameters into a dictionary.
public static Dictionary<string, string> GetQueryParameters(this HttpContext context)
{
    return context.Request.Query.ToDictionary(d => d.Key, d => d.Value.ToString());
}
There's a rationale here that I don't think you're getting. GET requests are intentionally simplistic. They are supposed to describe a specific resource. They do not have bodies, because you're not supposed to be passing complex data objects in the first place. That's not how the HTTP protocol is designed.
Additionally, query string params should generally be optional. If some bit of data is required in order to identify the resource, it should be part of the main URI (i.e. the path). As such, neglecting to add something like a query param should simply result in the full data set being returned instead of some subset defined by the query. Or in the case of something like a search page, it generally will result in a form being presented to the user to collect the query. In other words, your action should account for that param being missing and handle the situation accordingly.
Long and short: no, there is no "elegant" way to handle this, I suppose, but the reason for that is that there doesn't need to be. If you're designing your routes and actions correctly, it's generally not an issue.
To solve this I'd like to have a request object used as route parameters for the anchor tag helper. This means that all route links are defined in only one location, not throughout the solution. Changes made to the request object model automatically propagate to the URL of <a asp-action> tags.
The benefit of this is reducing the number of places in the code we need to change when changing the method signature of a controller action. We localize the change to the model and the action only.
I thought writing a tag helper for a custom asp-object-route attribute could help. I looked into chaining tag helpers so mine could run before AnchorTagHelper, but that does not work. Creating an instance and nesting them would require me to hardcode all properties of ASP.NET Core's AnchorTagHelper, which may require maintenance in the future. I also considered using a custom method with UrlHelper to build the URL, but then the tag helper would not work.
The solution I landed on is to use asp-all-route-data as suggested by @kirk-larkin, along with an extension method for serializing to a Dictionary. Any asp-route-* attribute will override values in asp-all-route-data.
<a asp-controller="Test" asp-action="HelloWorld" asp-all-route-data="@Model.RouteParameters.ToDictionary()" asp-route-somestring="optional override">Link</a>
ASP.NET Core can deserialize complex objects (including lists and child objects).
public IActionResult HelloWorld(HelloWorldRequest request) { }
The request object (when used) would typically have only a few simple properties, but I thought it would be nice if it supported child objects as well. Serializing an object into a Dictionary is usually done using reflection, which can be slow. I figured Newtonsoft.Json would be more optimized than writing simple reflection code myself, and found this implementation ready to go:
public static class ExtensionMethods
{
    // Requires: System.Collections.Generic, System.Globalization, System.Linq, Newtonsoft.Json.Linq
    public static IDictionary<string, string> ToDictionary(this object metaToken)
    {
        // From https://geeklearning.io/serialize-an-object-to-an-url-encoded-string-in-csharp/
        if (metaToken == null)
        {
            return null;
        }

        JToken token = metaToken as JToken;
        if (token == null)
        {
            return ToDictionary(JObject.FromObject(metaToken));
        }

        if (token.HasValues)
        {
            var contentData = new Dictionary<string, string>();
            foreach (var child in token.Children().ToList())
            {
                var childContent = child.ToDictionary();
                if (childContent != null)
                {
                    contentData = contentData.Concat(childContent)
                        .ToDictionary(k => k.Key, v => v.Value);
                }
            }
            return contentData;
        }

        var jValue = token as JValue;
        if (jValue?.Value == null)
        {
            return null;
        }

        var value = jValue.Type == JTokenType.Date
            ? jValue.ToString("o", CultureInfo.InvariantCulture)
            : jValue.ToString(CultureInfo.InvariantCulture);

        return new Dictionary<string, string> { { token.Path, value } };
    }
}
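For context, here is a minimal sketch of how the pieces above might fit together. HelloWorldRequest and the RouteParameters property follow the snippets in this answer; the view model class and its members are assumptions for illustration only:

// Hypothetical request object that ASP.NET Core model-binds from the query string.
public class HelloWorldRequest
{
    public string SomeString { get; set; }
    public int? Page { get; set; }
}

// Hypothetical page model exposing the request so the view can rebuild links from it:
// <a asp-action="HelloWorld" asp-all-route-data="@Model.RouteParameters.ToDictionary()">Link</a>
public class HelloWorldViewModel
{
    public HelloWorldRequest RouteParameters { get; set; }
}

// Controller action: the bound request is handed straight back to the view,
// so every link generated from the model round-trips the current parameters.
public IActionResult HelloWorld(HelloWorldRequest request)
{
    return View(new HelloWorldViewModel { RouteParameters = request });
}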

belongsTo only being set on first and last member of hasMany

My adapter uses findHasMany to load child records for a hasMany relationship.
My findHasMany adapter method is directly based on the test case for findHasMany. It retrieves the contents of the hasMany on demand, and eventually does the following two operations:
store.loadMany(type, hashes);
// ...
store.loadHasMany(record, relationship.key, ids);
(The full code for the findHasMany is below, in case the issue is there, but I don't think so.)
The really strange behavior is: it seems that somewhere within loadHasMany (or in some subsequent async process) only the first and last child records get their inverse belongsTo property set, even though all the child records are added to the hasMany side. I.e., if posts/1 has 10 comments, this is what I get, after everything has loaded:
var post = App.Posts.find('1');
post.get('comments').objectAt(0).get('post'); // <App.Post:ember123:1>
post.get('comments').objectAt(1).get('post'); // null
post.get('comments').objectAt(2).get('post'); // null
// ...
post.get('comments').objectAt(8).get('post'); // null
post.get('comments').objectAt(9).get('post'); // <App.Post:ember123:1>
My adapter is a subclass of DS.RESTAdapter, and I don't think I'm overloading anything in my adapter or serializer that would cause this behavior.
Has anybody seen something like this before? It's weird enough I thought someone might know why it's happening.
Extra
Using findHasMany lets me load the contents of the hasMany only when the property is accessed (valuable in my case because calculating the array of IDs would be expensive). So say I have the classic posts/comments example models; the server returns this for posts/1:
{
  post: {
    id: 1,
    text: "Linkbait!",
    comments: "/posts/1/comments"
  }
}
Then my adapter can retrieve /posts/1/comments on demand, which looks like this:
{
  comments: [
    {
      id: 201,
      text: "Nuh uh"
    },
    {
      id: 202,
      text: "Yeah huh"
    },
    {
      id: 203,
      text: "Nazi Germany"
    }
  ]
}
Here is the code for the findHasMany method in my adapter:
findHasMany: function(store, record, relationship, details) {
  var type = relationship.type;
  var root = this.rootForType(type);
  var url = (typeof(details) == 'string' || details instanceof String) ? details : this.buildURL(root);
  var query = relationship.options.query ? relationship.options.query(record) : {};

  this.ajax(url, "GET", {
    data: query,
    success: function(json) {
      var serializer = this.get('serializer');
      var pluralRoot = serializer.pluralize(root);
      var hashes = json[pluralRoot]; // FIXME: Should call some serializer method to get this?
      store.loadMany(type, hashes);

      // add ids to record...
      var ids = [];
      var len = hashes.length;
      for (var i = 0; i < len; i++) {
        ids.push(serializer.extractId(type, hashes[i]));
      }
      store.loadHasMany(record, relationship.key, ids);
    }
  });
}
Solution
Override the DS.RelationshipChange.getByReference method by inserting the following code into your app:
DS.RelationshipChange.prototype.getByReference = function(reference) {
  var store = this.store;

  // return null or undefined if the original reference was null or undefined
  if (!reference) { return reference; }

  if (reference.record) {
    return reference.record;
  }

  return store.materializeRecord(reference);
};
Yes, this is overriding a private, internal method in Ember Data. Yes, it may break at any time with any update. I'm pretty sure this is a bug in Ember Data, but I'm not 100% certain this is the right solution. But it does solve this problem, and possibly other relationship-related problems.
This fix is designed to be applied to Ember Data master as of 29 Apr 2013.
Reason
DS.Store.loadHasMany calls DS.Model.hasManyDidChange, which retrieves references for all the child records and then sets the hasMany's content to the array of references. This kicks off a chain of observers, eventually calling DS.ManyArray.arrayContentDidChange, in which the first line is this._super.apply(this, arguments);, calling the superclass method Ember.Array.arrayContentDidChange. That Ember.Array method includes an optimization that caches the first and last object in the array and calls objectAt on only those two array members. So there's the part that singles out the first and last record.
Next, since DS.RecordArray implements an objectAtContent method (from Ember.ArrayProxy), the objectAtContent implementation calls DS.Store.recordForReference, which in turn calls DS.Store.materializeRecord. This last function adds a record property to the reference that is passed in as a side effect.
Now we get to what I think is a bug. In DS.ManyArray.arrayContentDidChange, after calling the superclass method, it loops through all the new references and creates a DS.RelationshipChangeAdd instance that encapsulates the owner and child record references. But the first line inside the loop is:
var reference = get(this, 'content').objectAt(i);
Unlike what happens above to the first and last record, this calls objectAt directly on the Ember.NativeArray and bypasses the ArrayProxy methods including the objectAtContent hook, which means that DS.Store.materializeRecord--which adds the record property on the reference object--may have never been called on some references.
Next, the relationship changes created in the loop are immediately afterward (in the same run loop) applied with this call tree: DS.RelationshipChangeAdd.sync -> DS.RelationshipChange.getFirstRecord -> DS.RelationshipChange.getByReference. This last method expects the reference object to have a record property. However, the record property is only set on the first and last reference objects, for reasons explained above. Therefore, for all but the first and last records, the relationship fails to be established because it doesn't have access to the child record object!
The above fix calls DS.Store.materializeRecord whenever the record property doesn't exist on the reference. The last line in the function is the only thing added. On the one hand, it looks like this was the original intention: that var store = this.store; line in the original declares a variable that isn't otherwise used in the function, so what's it there for? Also, without the added line, the function doesn't always return a value, which is a little unusual for a function which is expected to do so. On the other hand, this could lead to mass materialization in some cases where that would be undesirable (but, the relationships just won't work without it in some cases, it seems).
Possibly related
The "chain of observers" I mentioned takes a bit of an odd path. The initiating event was setting the content property on a DS.ManyArray, which extends Ember.ArrayProxy--therefore the content property has a dependent property arrangedContent. Importantly, the observers on arrangedContent are executed before observers on content are executed (see Ember.propertyDidChange). However, the default implementation of Ember.ArrayProxy.arrangedContentArrayDidChange simply calls Ember.Array.arrayContentDidChange, which DS.ManyArray implements! The point being, this looks like a recipe for some code to execute in an unintended order. That is, I think Ember.ManyArray.arrayContentDidChange may getting executed earlier than expected. If this is the case, the above mentioned code that expects the record property to already exist on all references may have been expecting this reasonably, as one of the observers directly on the content property may call DS.Store.materializeRecord on each reference. But I haven't dug deep enough to find out if this is true.

Creating WCF service by determining type at runtime

I am trying to create a WCF service without knowing its type/interface until runtime. To do this, I use ChannelFactory. ChannelFactory is a generic class, so I need to use Type.MakeGenericType. The type I pass to MakeGenericType comes from a list of interfaces I previously gathered with reflection by searching some assemblies.
Ultimately, I call MethodInfo.Invoke to create the object. The object is created just fine, but I cannot cast it to the proper interface. Upon casting, I receive the following error:
"Unable to cast transparent proxy to type 'Tssc.Services.MyType.IMyType'"
After some experimenting, I have found that the interface/type passed to MakeGenericType seems to be the problem. If I substitute the interface in my list with the actual interface, then everything works fine. I have combed through the two objects and cannot see a difference. When I modify the code to produce both types, comparing them with Equals returns false. It is unclear to me whether Equals is just checking that they refer to the same object (they don't) or whether it compares all properties, etc.
Could this have something to do with how I gathered my interfaces (reflection, saving them in a list...)? A comparison of the objects seems to indicate they are equivalent. I printed all properties of both objects and they are the same. Do I need to dig deeper? If so, into where?
// createService() method
//*** tried both of these interfaces, only 2nd works - but they seem to be identical
//Type t = interfaces[i]; // get type from list created above - doesn't work
Type t = typeof(Tssc.Services.MyType.IMyType); // actual type - works OK
// create ChannelFactory type with my type parameter (t)
Type factoryType = typeof(ChannelFactory<>);
factoryType = factoryType.MakeGenericType(new Type[] { t });
// create ChannelFactory<> object with two-param ctor
BasicHttpBinding binding = new BasicHttpBinding();
string address = "blah blah blah";
var factory = Activator.CreateInstance(factoryType, new object[] { binding, address });
// get overload of ChannelFactory<>.CreateChannel with no parameters
MethodInfo method = factoryType.GetMethod("CreateChannel", new Type[] { });
return method.Invoke(factory, null);
//--------------- code that calls code above and uses its return
object ob = createService();
//*** this cast fails
Tssc.Services.MyType.IMyType service = (Tssc.Services.MyType.IMyType)ob;
OK, I understand what's happening here - the problem relates to the same assembly effectively being loaded twice: once via a reference, and once via the assembly load call. What you need to do is change the place where you load your assembly, and check to see if it already exists in the current AppDomain, like this maybe:
Assembly assembly = AppDomain.CurrentDomain.GetAssemblies().FirstOrDefault(a => a.GetName().Name.Equals("ClassLibrary1Name"));
if (assembly == null)
{
    assembly = System.Reflection.Assembly.LoadFile("path to your assembly");
}
// do your work here
This way if the assembly is already loaded into memory, it'll use that one.
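If you want to confirm that this is what is happening, a quick check is to compare where each Type instance actually comes from. A small sketch along the lines of the question's code (interfaces[i] and IMyType are taken from the snippets above; the rest is illustrative):

Type fromList = interfaces[i];                        // gathered via reflection
Type direct = typeof(Tssc.Services.MyType.IMyType);   // compile-time reference

// Identical names but different Assembly instances means the assembly was loaded twice,
// and the runtime treats the two types as unrelated, so the cast fails.
Console.WriteLine(fromList.AssemblyQualifiedName);
Console.WriteLine(direct.AssemblyQualifiedName);
Console.WriteLine(fromList.Assembly.Location);
Console.WriteLine(direct.Assembly.Location);
Console.WriteLine(ReferenceEquals(fromList.Assembly, direct.Assembly)); // false when loaded twice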

JSON result problems in ASP.NET Web API when returning value types?

I'm learning ASP.NET MVC 4 Web API, and find it very easy to implement by simply returning the object from the ApiControllers.
However, when I try to return value types such as bool, int, or string, it does not return in JSON format at all. (In Fiddler it shows the 'true/false' result in the Raw and WebView tabs, but no content in the JSON tab at all.)
Can anyone help me with this?
Thanks.
Some sample code for the TestApiController:
public bool IsAuthenticated(string username)
{
    return false;
}
Some sample code for the jQuery usage:
function isAuthenticated(username) {
    $.getJSON(OEliteAPIs.ApiUrl + "/api/membership/isauthenticated?username=" + username,
        function (data) {
            alert(data);
            if (data)
                return true;
            else
                return false;
        });
}
NOTE: the jQuery above returns nothing because empty content was returned; however, if you check it in Fiddler you can actually see "false" being returned in the WebView tab.
cheers.
Before your callback function is called, the return data is passed to the jQuery parseJSON method, which expects the data to be in JSON format. jQuery will ignore the response data and return null if the response is not formatted correctly. You have two options: wrap your return boolean in a class or anonymous type so that Web API will return a JSON object:
return new { isAuthentication = result }
or don't use getJSON from jQuery since you're not returning a properly formatted JSON response. Maybe just use $.get instead.
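For illustration, a minimal sketch of the first option applied to the sample action from the question (the wrapper property name follows the snippet above; the rest is an assumption):

public object IsAuthenticated(string username)
{
    bool result = false; // whatever the real membership check returns

    // Wrapping the value type in an anonymous object makes Web API return a JSON object
    // such as {"isAuthentication":false}, which $.getJSON can parse.
    return new { isAuthentication = result };
}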
Below is a quote from the jQuery documentation:
Important: As of jQuery 1.4, if the JSON file contains a syntax error,
the request will usually fail silently. Avoid frequent hand-editing of
JSON data for this reason. JSON is a data-interchange format with
syntax rules that are stricter than those of JavaScript's object
literal notation. For example, all strings represented in JSON,
whether they are properties or values, must be enclosed in
double-quotes. For details on the JSON format, see http://json.org/.

Doctrine ODM: Cannot ->prime(true) getSingleResult(); throws cursor error

I have a document that has a ReferenceMany attribute pointing to another document. The reference is set up fine, and the data is returned from the query fine, but each document in the ArrayCollection is returned as a proxy. On this site, I saw it mentioned that I should add ->prime(true) in order to return the actual referenced documents.
When that ArrayCollection of documents is returned, I run a loop over the ids I submitted to the server to remove them from the referenced collection. The removeElement method is not working because the returned documents are proxies, and I am comparing an actual document vs. those proxies. So basically I am trying to:
Look up a single document
Force all documents in the ReferenceMany attribute to be actual documents and not Proxy documents
Loop through my array of id's and load each document
Send the document to the removeElement method
On the first getSingleResult query below, I am getting the error "cannot modify cursor after beginning iteration". I saw a thread on this site mention you should prime the results in order to get actual documents back instead of proxies, and in his example he used getSingleResult.
$q = $this->dm->createQueryBuilder('\FH\Document\Person')->field('target')->prime(true)->field('id')->equals($data->p_id);
$person = $q->getQuery()->getSingleResult();

foreach ($data->loc_id as $loc) {
    $location = $this->dm->createQueryBuilder('\FH\Document\Location')->field('id')->equals(new \MongoId($loc))->getQuery()->getSingleResult();
    $person->removeTarget($location);
}
....
....
....
public function removeTarget($document)
{
    $this->target->removeElement($document);
    return $this;
}
If I remove ->prime(true) from the first query, it doesn't throw an error, yet it doesn't actually remove any elements. When I breakpoint in the method and compare the two documents, the data is exactly the same, except that in $this->target they are Location proxy documents, and the loaded one is an actual Location document.
Can I prime the single result somehow so I can use the ArrayCollection methods properly, or do I need to just do some for loop and compare ids?
UPDATE
So here is an update showing the problem I am having. While the solution below would work just using the MongoId(s), when I submit an actual Document class, it never actually removes the document. The ArrayCollection comes back from Doctrine as a PersistentCollection. Each element in $this->coll is of this Document type:
DocumentProxy\__CG__\FH\Document\Location
And the actual Document is this:
FH\Document\Location
The removeElement method does an array_search like this:
public function removeElement($element)
{
    $key = array_search($element, $this->_elements, true);
    if ($key !== false) {
        unset($this->_elements[$key]);
        return true;
    }
    return false;
}
So because the object types are not exactly the same, even though the proxy object should be inheriting from the actual Document class I created, array_search always comes back false, so the element is not removed. Everything between the two documents is exactly the same, except the object type.
Like I said, I guess I can do it by MongoId, but why isn't it working by submitting the entire object?
Don't worry about the prime(true) stuff for now. All that does is tell Doctrine to pull the referenced data up front, so it doesn't have to make multiple calls to the database when you iterate over the cursor.
What I would do is change your removeTarget method to do the following.
$q = $this->dm->createQueryBuilder('\FH\Document\Person')->field('id')->equals($data->p_id);
$person = $q->getQuery()->getSingleResult();
$person->removeTargets($data->loc_id);
Person.php
public function removeTargets($targets)
{
    foreach ($targets as $target) {
        $this->removeTarget($target);
    }
}

public function removeTarget($target)
{
    if ($target instanceof \FH\Document\Location) {
        return $this->targets->removeElement($target);
    }

    foreach ($this->targets as $t) {
        if ($t->getId() == $target) {
            return $this->targets->removeElement($t);
        }
    }

    return $this;
}
This means you don't have to perform the second query manually, as Doctrine will know it needs to pull the data for that reference when you iterate over it. You can then make the operation quicker by using the prime(true) call so it pulls the information it needs in one call rather than fetching it lazily when you request each object.