Why does storing a Nancy.DynamicDictionary in RavenDB only save the property-names and not the property-values?

I am trying to save (RavenDB build 960) the names and values of form data items passed into a Nancy Module via its built-in Request.Form.
If I save a straightforward instance of a dynamic object (with test properties and values) then everything works and both the property names and values are saved. However, if I use Nancy's Request.Form then only the dynamic property names are saved.
I understand that I will have to deal with further issues to do with restoring the correct types when retrieving the dynamic data (RavenJObjects etc) but for now, I want to solve the problem of saving the dynamic names / values in the first place.
Here is the entire test request and code:
Fiddler Request (PUT)
Nancy Module
Put["/report/{name}/add"] = parameters =>
{
reportService.AddTestDynamic(Db, parameters.name, Request.Form);
return HttpStatusCode.Created;
};
Service
public void AddTestDynamic(IDocumentSession db, string name, dynamic data)
{
var testDynamic = new TestDynamic
{
Name = name,
Data = data
};
db.Store(testDynamic);
db.SaveChanges();
}
TestDynamic Class
public class TestDynamic
{
public string Name;
public dynamic Data;
}
Dynamic contents of Request.Form at runtime
Resulting RavenDB Document
{
"Name": "test",
"Data": [
"username",
"age"
]
}
Note: The type of the Request.Form is Nancy.DynamicDictionary. I think this may be the problem, since it implements IEnumerable<string> rather than something like IEnumerable<KeyValuePair<string, object>>. I think that RavenDB is enumerating the DynamicDictionary and only getting back the dynamic member-names rather than the member name/value pairs.
Can anybody tell me how or whether I can treat the Request.Form as a dynamic object with respect to saving it to RavenDB? If possible I want to avoid any hand-crafted enumeration of DynamicDictionary to build a dynamic instance so that RavenDB can serialise correctly.
Thank You
Edit 1 @Ayende
The DynamicDictionary does appear to implement the GetDynamicMemberNames() method; taking a look at the code on GitHub reveals the following implementation:
public override IEnumerable<string> GetDynamicMemberNames()
{
return dictionary.Keys;
}
Is this what you would expect to see here?
Edit 2 @TheCodeJunkie
Thanks for the code update. To test this I have:
Created a local clone of the NancyFx/Nancy master branch from GitHub
Added the Nancy.csproj to my solution and referenced the project
Run the same test as above
RavenDB Document from new DynamicDictionary
{
"Name": "test",
"Data": {
"$type": "Nancy.DynamicDictionary, Nancy",
"username": {},
"age": {}
}
}
You can see that the resulting document is an improvement. The DynamicDictionary type information is now correctly picked up by RavenDB, and whilst the dynamic property-names are correctly serialized, the dynamic property-values unfortunately are not.
The image below shows the new-look DynamicDictionary in action. It all looks fine to me; the new Dictionary interface is clearly visible. The only thing I noticed was that the 'Results view' (as opposed to the 'Dynamic view') in the debugger shows just the property-names and not their values. The 'Dynamic view' shows both, as before (see image above).
Contents of DynamicDictionary at run time

biofractal,
The problem is the DynamicDictionary. In JSON, types can be either objects or lists; they can't be both.
And for dynamic object serialization, we rely on the implementation of GetDynamicMemberNames() to get the properties, and I assume that it isn't there.
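If the dynamic serialization can't see the values, one workaround is the hand-crafted enumeration the question hoped to avoid: flatten the form into a plain Dictionary<string, object> before storing, so the serializer sees ordinary name/value pairs. A minimal sketch (it assumes that enumerating the DynamicDictionary yields its member names, as the GetDynamicMemberNames() snippet above suggests, and that .Value unwraps a DynamicDictionaryValue):
public void AddTestDynamic(IDocumentSession db, string name, dynamic form)
{
    // Copy each dynamic member into a plain dictionary so RavenDB
    // serializes the values as well as the names.
    var data = new Dictionary<string, object>();
    foreach (string key in form) // enumeration yields the member names
    {
        data[key] = form[key].Value; // .Value unwraps the DynamicDictionaryValue (assumption)
    }
    db.Store(new TestDynamic { Name = name, Data = data });
    db.SaveChanges();
}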

Efficient way to bring parameters into controller action URLs

In ASP.NET Core you have multiple ways to generate a URL for a controller action, the newest being tag helpers.
When using tag helpers for GET requests, asp-route-* attributes are used to specify route parameters. From what I understand, complex objects are not supported as route values. And sometimes a page could have many different links pointing to itself, possibly with a minor addition to the URL for each link.
To me it seems wrong that any modification to a controller action's signature requires changing all tag helpers using that action. I.e. if one adds string query to the controller, one must add query to the model and add asp-route-query="@Model.Query" in 20 different places spread across .cshtml files. This approach sets the code up for future bugs.
Is there a more elegant way of handling this? For example, some way of having a request object? (I.e. a request object from the controller can be put into the Model and fed back into the action URL.)
In my other answer I found a way to provide a request object through the Model.
From the SO article @tseng provided I found a smaller solution. This one does not use a request object in the Model, but retains all route parameters unless explicitly overridden. It won't allow you to specify the route through a request object, which is most often not what you want anyway. But it solved the problem in the OP.
<a asp-controller="Test" asp-action="HelloWorld" asp-all-route-data="#Context.GetQueryParameters()" asp-route-somestring="optional override">Link</a>
This requires an extension method to convert query parameters into a dictionary.
public static Dictionary<string, string> GetQueryParameters(this HttpContext context)
{
    return context.Request.Query.ToDictionary(d => d.Key, d => d.Value.ToString());
}
There's a rationale here that I don't think you're getting. GET requests are intentionally simplistic. They are supposed to describe a specific resource. They do not have bodies, because you're not supposed to be passing complex data objects in the first place. That's not how the HTTP protocol is designed.
Additionally, query string params should generally be optional. If some bit of data is required in order to identify the resource, it should be part of the main URI (i.e. the path). As such, neglecting to add something like a query param should simply result in the full data set being returned, instead of some subset defined by the query. Or in the case of something like a search page, it generally will result in a form being presented to the user to collect the query. In other words, your action should account for that param being missing and handle that situation accordingly.
Long and short: no, there is no "elegant" way to handle this, I suppose, but the reason for that is that there doesn't need to be. If you're designing your routes and actions correctly, it's generally not an issue.
To solve this I'd like to have a request object used as route parameters for anchor TagHelper. This means that all route links are defined in only one location, not throughout solution. Changes made to request object model automatically propagates to URL for <a asp-action>-tags.
The benefit of this is reducing number of places in the code we need to change when changing method signature for a controller action. We localize change to model and action only.
I thought writing a tag helper for a custom asp-object-route could help. I looked into chaining tag helpers so mine could run before AnchorTagHelper, but that does not work. Creating an instance and nesting them requires me to hardcode all properties of ASP.NET Core's AnchorTagHelper, which may require maintenance in the future. I also considered using a custom method with UrlHelper to build the URL, but then the TagHelper would not work.
The solution I landed on is to use asp-all-route-data as suggested by @kirk-larkin, along with an extension method for serializing to a Dictionary. Any asp-route-* attribute will override values in asp-all-route-data.
<a asp-controller="Test" asp-action="HelloWorld" asp-all-route-data="#Model.RouteParameters.ToDictionary()" asp-route-somestring="optional override">Link</a>
ASP.NET Core can deserialize complex objects (including lists and child objects).
public IActionResult HelloWorld(HelloWorldRequest request) { }
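For illustration, a hypothetical request object (HelloWorldRequest's properties and its PageInfo child are invented names here; the model binder maps query strings such as ?query=foo&page.number=2 onto the nested properties):
public class HelloWorldRequest
{
    public string Query { get; set; }
    public PageInfo Page { get; set; } // child object, bound from page.* keys
}

public class PageInfo
{
    public int Number { get; set; }
    public int Size { get; set; }
}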
The request object (when used) would typically have only a few simple properties, but I thought it would be nice if it supported child objects as well. Serializing an object into a Dictionary is usually done using reflection, which can be slow. I figured Newtonsoft.Json would be more optimized than writing simple reflection code myself, and found this implementation ready to go:
// Requires: using System.Collections.Generic; using System.Globalization;
// using System.Linq; using Newtonsoft.Json.Linq;
public static class ExtensionMethods
{
    public static IDictionary<string, string> ToDictionary(this object metaToken)
    {
        // From https://geeklearning.io/serialize-an-object-to-an-url-encoded-string-in-csharp/
        if (metaToken == null)
        {
            return null;
        }
        var token = metaToken as JToken;
        if (token == null)
        {
            return ToDictionary(JObject.FromObject(metaToken));
        }
        if (token.HasValues)
        {
            var contentData = new Dictionary<string, string>();
            foreach (var child in token.Children().ToList())
            {
                var childContent = child.ToDictionary();
                if (childContent != null)
                {
                    contentData = contentData.Concat(childContent)
                        .ToDictionary(k => k.Key, v => v.Value);
                }
            }
            return contentData;
        }
        var jValue = token as JValue;
        if (jValue?.Value == null)
        {
            return null;
        }
        // Dates get a round-trippable format; everything else is invariant-culture text.
        var value = jValue.Type == JTokenType.Date
            ? jValue.ToString("o", CultureInfo.InvariantCulture)
            : jValue.ToString(CultureInfo.InvariantCulture);
        return new Dictionary<string, string> { { token.Path, value } };
    }
}
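A quick usage check (hypothetical values): nested properties flatten to dotted keys via token.Path.
var dict = new { Query = "foo", Page = new { Number = 2 } }.ToDictionary();
// dict now contains { "Query" -> "foo", "Page.Number" -> "2" }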

Combining results of multiple API calls

I am invoking an API operation to fetch a list of objects and then, for each object in that list, I am invoking another API operation to fetch additional details of that object and adding those details into the object. The goal is to return a list of objects with all the properties (or details) that I need. In the example below, the /allObjects call will get a list of all objects with 2 properties - id and k1. But my application needs 3 properties, id, k1 and k4. So I invoke another API method to get detailed information for each object, which includes the k4 property. I copy that detailed property into the original result-set's object and return the "enriched" result-set. But the latency cost of this can be high. What is the best way of achieving this without slowing down the application too much? In the real world, I am dealing with 500-2000 objects.
GET /allObjects yields JSON results =>
{
"data" : [
{
"id" : "123",
"k1" : "v1"
}, {
"id" : "456",
"k1" : "v1"
}
]
}
for (obj in results.data) {
GET /object/{obj.id} yields JSON result =>
{
"data" : {
"id" : "123",
"k1" : "v1",
"k2" : "v2",
"k3" : "v3",
"k4" : "v4"
}
}
// Add k4 property to original result-set object, and assign
// it the value of the k4 property in the individual result object.
obj.k4 = result.data.k4;
}
return results.data;
Your requirement is such that you have no other option but to go for a mash-up (unless you can convince the API developers to combine the two APIs).
You could, however, opt to build a mashup service with a low latency cost that stands between your application and the APIs and abstracts out the mashup logic. Conversely, you could opt to program your application in a language that is custom-made for this kind of work. Both of these options can be accommodated with Ballerina; I've written a post here showing how easy it is to do that using it.
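Whichever way the mashup is hosted, the main latency win is issuing the per-object detail calls concurrently instead of one at a time. A minimal C# sketch (HttpClient and Newtonsoft.Json are assumptions; the URL shapes and property names are taken from the question):
// Requires: using System.Collections.Generic; using System.Linq;
// using System.Net.Http; using System.Threading.Tasks; using Newtonsoft.Json.Linq;
static async Task<List<JObject>> GetEnrichedObjectsAsync(HttpClient client, string baseUrl)
{
    var listJson = JObject.Parse(await client.GetStringAsync(baseUrl + "/allObjects"));
    var objects = listJson["data"].Cast<JObject>().ToList();

    // One task per detail call; Task.WhenAll overlaps the network round-trips.
    var detailTasks = objects.Select(async obj =>
    {
        var detail = JObject.Parse(
            await client.GetStringAsync(baseUrl + "/object/" + obj["id"]));
        obj["k4"] = detail["data"]["k4"]; // copy only the extra property the client needs
    }).ToList();
    await Task.WhenAll(detailTasks);
    return objects;
}
With 500-2000 objects you would also want to cap the concurrency (e.g. with a SemaphoreSlim) so the detail endpoint isn't flooded.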

belongsTo only being set on first and last member of hasMany

My adapter uses findHasMany to load child records for a hasMany relationship.
My findHasMany adapter method is directly based on the test case for findHasMany. It retrieves the contents of the hasMany on demand, and eventually does the following two operations:
store.loadMany(type, hashes);
// ...
store.loadHasMany(record, relationship.key, ids);
(The full code for the findHasMany is below, in case the issue is there, but I don't think so.)
The really strange behavior is: it seems that somewhere within loadHasMany (or in some subsequent async process) only the first and last child records get their inverse belongsTo property set, even though all the child records are added to the hasMany side. I.e., if posts/1 has 10 comments, this is what I get, after everything has loaded:
var post = App.Posts.find('1');
post.get('comments').objectAt(0).get('post'); // <App.Post:ember123:1>
post.get('comments').objectAt(1).get('post'); // null
post.get('comments').objectAt(2).get('post'); // null
// ...
post.get('comments').objectAt(8).get('post'); // null
post.get('comments').objectAt(9).get('post'); // <App.Post:ember123:1>
My adapter is a subclass of DS.RESTAdapter, and I don't think I'm overloading anything in my adapter or serializer that would cause this behavior.
Has anybody seen something like this before? It's weird enough that I thought someone might know why it's happening.
Extra
Using findHasMany lets me load the contents of the hasMany only when the property is accessed (valuable in my case because calculating the array of IDs would be expensive). So, say I have the classic posts/comments example models; the server returns this for posts/1:
{
post: {
id: 1,
text: "Linkbait!"
comments: "/posts/1/comments"
}
}
Then my adapter can retrieve /posts/1/comments on demand, which looks like this:
{
comments: [
{
id: 201,
text: "Nuh uh"
},
{
id: 202,
text: "Yeah huh"
},
{
id: 203,
text: "Nazi Germany"
}
]
}
Here is the code for the findHasMany method in my adapter:
findHasMany: function(store, record, relationship, details) {
var type = relationship.type;
var root = this.rootForType(type);
var url = (typeof(details) == 'string' || details instanceof String) ? details : this.buildURL(root);
var query = relationship.options.query ? relationship.options.query(record) : {};
this.ajax(url, "GET", {
data: query,
success: function(json) {
var serializer = this.get('serializer');
var pluralRoot = serializer.pluralize(root);
var hashes = json[pluralRoot]; //FIXME: Should call some serializer method to get this?
store.loadMany(type, hashes);
// add ids to record...
var ids = [];
var len = hashes.length;
for(var i = 0; i < len; i++){
ids.push(serializer.extractId(type, hashes[i]));
}
store.loadHasMany(record, relationship.key, ids);
}
});
}
Solution
Override the DS.RelationshipChange.getByReference method by inserting the following code into your app:
DS.RelationshipChange.prototype.getByReference = function(reference) {
var store = this.store;
// return null or undefined if the original reference was null or undefined
if (!reference) { return reference; }
if (reference.record) {
return reference.record;
}
return store.materializeRecord(reference);
};
Yes, this is overriding a private, internal method in Ember Data. Yes, it may break at any time with any update. I'm pretty sure this is a bug in Ember Data, but I'm not 100% certain this is the right solution. But it does solve this problem, and possibly other relationship-related problems.
This fix is designed to be applied to Ember Data master as of 29 Apr 2013.
Reason
DS.Store.loadHasMany calls DS.Model.hasManyDidChange, which retrieves references for all the child records and then sets the hasMany's content to the array of references. This kicks off a chain of observers, eventually calling DS.ManyArray.arrayContentDidChange, in which the first line is this._super.apply(this, arguments);, calling the superclass method Ember.Array.arrayContentDidChange. That Ember.Array method includes an optimization that caches the first and last object in the array and calls objectAt on only those two array members. So there's the part that singles out the first and last record.
Next, since DS.RecordArray implements an objectAtContent method (from Ember.ArrayProxy), the objectAtContent implementation calls DS.Store.recordForReference, which in turn calls DS.Store.materializeRecord. This last function adds a record property to the reference that is passed in as a side effect.
Now we get to what I think is a bug. In DS.ManyArray.arrayContentDidChange, after calling the superclass method, it loops through all the new references and creates a DS.RelationshipChangeAdd instance that encapsulates the owner and child record references. But the first line inside the loop is:
var reference = get(this, 'content').objectAt(i);
Unlike what happens above to the first and last record, this calls objectAt directly on the Ember.NativeArray and bypasses the ArrayProxy methods including the objectAtContent hook, which means that DS.Store.materializeRecord--which adds the record property on the reference object--may have never been called on some references.
Next, the relationship changes created in the loop are immediately afterward (in the same run loop) applied with this call tree: DS.RelationshipChangeAdd.sync -> DS.RelationshipChange.getFirstRecord -> DS.RelationshipChange.getByReference. This last method expects the reference object to have a record property. However, the record property is only set on the first and last reference objects, for reasons explained above. Therefore, for all but the first and last records, the relationship fails to be established because it doesn't have access to the child record object!
The above fix calls DS.Store.materializeRecord whenever the record property doesn't exist on the reference. The last line in the function is the only thing added. On the one hand, it looks like this was the original intention: that var store = this.store; line in the original declares a variable that isn't otherwise used in the function, so what's it there for? Also, without the added line, the function doesn't always return a value, which is a little unusual for a function which is expected to do so. On the other hand, this could lead to mass materialization in some cases where that would be undesirable (but, the relationships just won't work without it in some cases, it seems).
Possibly related
The "chain of observers" I mentioned takes a bit of an odd path. The initiating event was setting the content property on a DS.ManyArray, which extends Ember.ArrayProxy--therefore the content property has a dependent property arrangedContent. Importantly, the observers on arrangedContent are executed before observers on content are executed (see Ember.propertyDidChange). However, the default implementation of Ember.ArrayProxy.arrangedContentArrayDidChange simply calls Ember.Array.arrayContentDidChange, which DS.ManyArray implements! The point being, this looks like a recipe for some code to execute in an unintended order. That is, I think Ember.ManyArray.arrayContentDidChange may getting executed earlier than expected. If this is the case, the above mentioned code that expects the record property to already exist on all references may have been expecting this reasonably, as one of the observers directly on the content property may call DS.Store.materializeRecord on each reference. But I haven't dug deep enough to find out if this is true.

Doctrine ODM: Cannot ->prime(true) getSingleResult(); throws cursor error

I have a document that has a ReferenceMany attribute to another document. The reference is set up fine, and the data is returned from the query fine, but each document in the ArrayCollection is returned as a proxy. On this site, I saw it mentioned that I should add ->prime(true) in order to return the actual referenced documents.
When that ArrayCollection of documents is returned, I run a loop over the ids I have submitted to the server to remove them from the referenced collection. The removeElement method is not working because the returned documents are proxies, and I am comparing an actual document vs. those proxies. So basically I am trying to:
Look up a single document
Force all documents in the ReferenceMany attribute to be actual documents and not Proxy documents
Loop through my array of id's and load each document
Send the document to the removeElement method
On the first getSingleResult query method below, I am getting the error "cannot modify cursor after beginning iteration". I saw a thread on this site mention that you should prime the results in order to get actual documents back instead of proxies, and in his example he used getSingleResult.
$q = $this->dm->createQueryBuilder('\FH\Document\Person')->field('target')->prime(true)->field('id')->equals($data->p_id);
$person = $q->getQuery()->getSingleResult();
foreach($data->loc_id as $loc) {
$location = $this->dm->createQueryBuilder('\FH\Document\Location')->field('id')->equals(new \MongoId($loc))->getQuery()->getSingleResult();
$person->removeTarget($location);
}
....
....
....
public function removeTarget($document)
{
$this->target->removeElement($document);
return $this;
}
If I remove ->prime(true) from the first query, it doesn't throw an error, yet it doesn't actually remove any elements. Even when I breakpoint on the method and compare the two documents, the data is exactly the same, except that in $this->target they are Location proxy documents and the loaded one is an actual Location document.
Can I prime the single result somehow so I can use the ArrayCollection methods properly, or do I need to just do some for loop and compare ids?
UPDATE
So here is an update showing the problem I am having. While the solution below would work just using the MongoId(s), when I submit an actual Document class, it never actually removes the document. The ArrayCollection comes back from Doctrine as a PersistentCollection. Each element in $this->coll is of this Document type:
DocumentProxy\__CG__\FH\Document\Location
And the actual Document is this:
FH\Document\Location
The removeElement method does an array_search like this:
public function removeElement($element)
{
$key = array_search($element, $this->_elements, true);
if ($key !== false) {
unset($this->_elements[$key]);
return true;
}
return false;
}
So because the object types are not exactly the same, even though the proxy object should be inheriting from the actual Document I created, $key is always false, so the element is not removed. Everything between the two documents is exactly the same, except the object type.
Like I said, I guess I can do it by MongoId, but why isn't it working by submitting the entire object?
Don't worry about the prime(true) stuff for just now. All that does is tell Doctrine to pull the referenced data now, so it doesn't have to make multiple calls to the database when you iterate over the cursor.
What I would do is change your removeTarget method to do the following.
$q = $this->dm->createQueryBuilder('\FH\Document\Person')->field('id')->equals($data->p_id);
$person = $q->getQuery()->getSingleResult();
$person->removeTargets($data->loc_id);
Person.php
public function removeTargets($targets)
{
foreach ($targets as $target) {
$this->removeTarget($target);
}
}
public function removeTarget($target)
{
if ($target instanceof \FH\Document\Location) {
return $this->targets->removeElement($target);
}
foreach ($this->targets as $t) {
if ($t->getId() == $target) {
return $this->targets->removeElement($t);
}
}
return $this;
}
This would mean you don't have to perform the second query manually, as Doctrine will know it needs to pull the data on that reference when you iterate over it. Then you can make this operation quicker by using the prime(true) call to make it pull the information it needs in one call, rather than doing it dynamically when you request the object.

SubSonic ORM with backbone.js

I am using the SubSonic ORM by Rob Conery with Backbone.js to build a JavaScript single-page demonstration application. In one of the service endpoints there is a contract that sends all the records existing in the data source, like below:
[WebMethod]
[ScriptMethod(UseHttpGet = true)]
public TaskCollection GetAllTasks()
{
TaskCollection coll = new TaskCollection();
coll.Load();
return coll;
}
but it seems that each Task in the collection is polluted with loads of properties that are required only on the server side. This is the JSON returned on request:
[{
"__type": "DAL.Task",
"Taskid": 1,
"Taskname": "welcome to india",
"Createdon": "\/Date(1334591056903)\/",
"Modifiedon": "\/Date(1334591056903)\/",
"ValidateWhenSaving": true,
"DirtyColumns": [],
"IsLoaded": true,
"IsNew": false,
"IsDirty": false,
"TableName": "task",
"ProviderName": null,
"NullExceptionMessage": "{0} requires a value",
"InvalidTypeExceptionMessage": "{0} is not a valid {1}",
"LengthExceptionMessage": "{0} exceeds the maximum length of {1}",
"Errors": []
}]
All I require is CreatedOn, ModifiedOn, TaskName and TaskId. How do I make sure only these are sent down the wire using SubSonic ORM?
Here's a couple of ideas...
Use the viewmodel to autoselect the properties:
public class TaskView
{
public int TaskID { get; set; }
public string TaskDescription { get; set; }
}
...
var results = new Select().From(Tables.Task).ExecuteTypedList<TaskView>();
Use an anonymous type
var qry = new Select(new string[] { Task.Columns.TaskID, Task.Columns.TaskDescription }).From(Tables.Task);
var resultList = new List<object>();
using (IDataReader rdr = qry.ExecuteReader())
{
while (rdr.Read())
resultList.Add(new
{
TaskID = rdr[0].ToString(),
TaskDescription = rdr[1].ToString(),
});
}
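Either way, the service contract can then return the slim shape instead of the full SubSonic collection. A sketch (assuming TaskView is widened to the four properties the question asks for):
[WebMethod]
[ScriptMethod(UseHttpGet = true)]
public List<TaskView> GetAllTasks()
{
    // Only the columns the client needs go over the wire.
    return new Select().From(Tables.Task).ExecuteTypedList<TaskView>();
}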
I don't use SubSonic, but I have to admit this does seem like a good example of where you might want to use a ViewModel: that is, a model that is populated from your Model specifically for the view. As far as binding the ViewModel to the Model, and/or generating properties for many ViewModels from the model for the view (generating ViewModels can definitely be tedious and error-prone for a bunch of models), I've heard of several general solutions. I'm actually trying to find a solution I'm happy with myself; in the meantime I've been forced to hand-write them. I believe if you want strongly typed ViewModels, you can use a tool like AutoMapper, though I've never used it myself. I've also seen solutions which use or inherit from a C# dynamic and then modify the accessors (though I think that might be slightly problematic to generate JSON from).
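For what it's worth, a minimal AutoMapper sketch (untested, since I haven't used it myself; the classic static API maps properties by matching names):
// Configure once at startup: map the SubSonic entity onto the slim view model.
Mapper.CreateMap<DAL.Task, TaskView>();

// Then project each entity before serializing; coll is the TaskCollection from the question.
var views = coll.Select(t => Mapper.Map<TaskView>(t)).ToList();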
The primary reason I've used ViewModels is that I can easily control the format of the Dates. But that might be done better by using a different JSON serializer. Of course, the use of ViewModels can also allow you flexibility to change your data layer as needed. But I have to admit, it's been tedious. I think my implementation can be handled better with a bit of automation, but I don't know how to handle that so far.
I realize this is only a partial answer. I'm curious what other answers might come up.