Automatically generate a delta feed in OData - asp.net-core

In OData (.NET Core) it is possible to return delta results in a delta feed, as described here: https://learn.microsoft.com/en-us/odata/webapi/DeltaFeed_support.
As I see in the example, they create the delta response manually. With very large results this can become a long-running task before the response is sent back to the client.
I wonder whether there is an automated way of doing this, and how?
I use Entity Framework for this.
I have thought about it a bit, but I don't see any option other than implementing a "log" table that records every action on all entities, or using an event sourcing pattern. The second is not really a choice for already implemented solutions, I think.
Maybe there is another way to do this? Any ideas?
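For what it's worth, here is a minimal sketch of the "log" table idea with EF Core: override SaveChanges so every change is recorded centrally, then build the delta response from the log instead of diffing full result sets. ChangeLogEntry and AppDbContext are illustrative names, not part of any OData library:

    using System;
    using System.Linq;
    using Microsoft.EntityFrameworkCore;

    public class ChangeLogEntry
    {
        public long Id { get; set; }
        public string EntityName { get; set; }
        public string EntityKey { get; set; }
        public string Operation { get; set; }      // "Added", "Modified", "Deleted"
        public DateTime TimestampUtc { get; set; }
    }

    public class AppDbContext : DbContext
    {
        public DbSet<ChangeLogEntry> ChangeLog { get; set; }

        public override int SaveChanges()
        {
            // Snapshot tracked changes before EF persists them.
            var changes = ChangeTracker.Entries()
                .Where(e => e.State == EntityState.Added
                         || e.State == EntityState.Modified
                         || e.State == EntityState.Deleted)
                .ToList();

            foreach (var e in changes)
            {
                var key = string.Join(",", e.Metadata.FindPrimaryKey().Properties
                    .Select(p => e.Property(p.Name).CurrentValue));
                ChangeLog.Add(new ChangeLogEntry
                {
                    EntityName = e.Metadata.Name,
                    EntityKey = key,
                    Operation = e.State.ToString(),
                    TimestampUtc = DateTime.UtcNow
                });
            }
            return base.SaveChanges();
        }
    }

A delta token can then be a ChangeLog Id or timestamp: the controller reads every log entry after the client's token and materializes the delta feed from those rows, so the cost is proportional to the number of changes, not the size of the result set.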

Related

Update old processes with the new process definition - Activiti

I have some processes that ran with old process definitions. But due to a requirement change, the user task data has been updated with new attributes, and this new process definition has been deployed. I'm aware that "SetProcessDefinitionVersionCmd" can be used to point the processes to the new definition/version.
I would like to know how to migrate the old process data so that the newly added attributes of the user task are updated in them.
There is no easy way to migrate process instance data; however, when you set the version to the new process definition, the instance data will move with the migrated instance.
What you have to be careful of is to make sure you include null checks for any of the data that may not be present in the migrated process instances.
Hope this helps,
Greg
Indeed there is no easy way to migrate; however, depending on the differences between the two definitions and the extent to which you prefer not to use SetProcessDefinitionVersionCmd, you may find DynamicBpmnService useful when combined with detecting the definitions' versions inside your logic.
And yes, another way would be to use SetProcessDefinitionVersionCmd, but be extra cautious with tasks that were actually active prior to migration. Since Activiti's database model has some redundant data (some of it for performance reasons), you are better off studying the DB tables for these tasks first and then inspecting the before and after migration state. For example, keeping up with a simple changed attribute is much easier than with a boundary event added to an active User Task, which affects the "execution tree".
I would also advise comparing SetProcessDefinitionVersionCmd's implementations between Activiti and Camunda; it is sad to have such enhancement efforts separated, but that is another story.

Breeze and Point In Time Entities

We are creating a system that allows users to create and modify bills for their clients. The modifications need to be maintained as part of the bill for auditing purposes. It is to some extent a point-in-time architecture, but we aren't tracking by time, just by revision. This is an ASP.NET MVC 5, WebAPI2, EntityFramework 6, SQL Server app using Breeze on the client and the server.
I'm trying to figure out how to get Breeze and our data model to work together correctly. When we modify an entity, we essentially keep the old row, make a copy of it with the modifications, and update some entity state fields with date/time/revision number and so on. We can always get the most recent version of the entity based on an entity ID and an EditState field where "1" is the most current.
I made a small sample app to work on getting Breeze working as part of the solution and to enable some nice SPA architecture and inline editing on the client, and it all works... except that since our Entity Framework code automatically creates a new entity that contains the modifications, the SaveChanges response contains the original entity but not the new "updated" entity. Reloading the data on the client works, but it would of course be dumb to do that outside of just hacking around for demo purposes.
So I made a new ContextProvider, inherited from EFContextProvider, and overrode the AfterSaveEntities method, and then things got a bit more complicated. Not all the entities have this "point in time" / revision functionality, but most of them do. If they do, I can, as I said above, get the latest version of that entity using its EntityId and EditState, but I'm not seeing a straightforward way to get the new entity (I'm pretty new to EF and very new to Breeze), so I'm hoping to find some pointers here.
Would this solution lie in Breeze or our DataContext? I could just do some reflection, get the type, query the updated entity, and shove that into the saveMap. It seems like that might break down at some point (not sure how or when, but it seems sketchy). Is our architecture bad? Should we have gone the route of creating audit/log tables to store the modified values, instead of keeping the data model somewhat smaller by keeping all of the revisions of the entities in their original tables with the revision information and making the queries slightly more complicated? Am I just missing something in EF?
... and to head off the obvious response: I know we should have used a document database, but that wasn't an option on this project. We are stuck in relational land.
I haven't tried this, but another approach would be to simply change the EntityState of the incoming entity from Modified to Added in the BeforeSaveEntities method. You will probably also need to update some version field in this 'new' entity so that it doesn't have a primary key conflict with the original.
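A rough sketch of that idea inside the EFContextProvider subclass, assuming the Breeze.ContextProvider API and a hypothetical revisioned Bill type with Revision and EditState fields (adapt the names and namespaces to your model and Breeze version):

    protected override Dictionary<Type, List<EntityInfo>> BeforeSaveEntities(
        Dictionary<Type, List<EntityInfo>> saveMap)
    {
        foreach (var info in saveMap.Values.SelectMany(list => list))
        {
            var bill = info.Entity as Bill;   // hypothetical revisioned entity
            if (bill != null && info.EntityState == EntityState.Modified)
            {
                info.EntityState = EntityState.Added; // insert a new row instead of updating
                bill.Revision += 1;                   // avoid a primary key conflict
                bill.EditState = 1;                   // mark this row as the current one
            }
        }
        return saveMap;
    }

You would still need to flip the previous row's EditState to non-current, for example with an extra update in the same BeforeSaveEntities pass or a database trigger.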
But... having built apps like this in the past, I really recommend another approach. Store the 'historical' entities of each type in a separate table. It can be exactly the same shape as the 'current' table. When you save, you first copy the 'current' entity into the 'historical' table (again with some version numbering or date scheme for the primary key) and then just update your 'current' entity normally.
This might not give you the answer you expected, but here is an idea:
When saving an object, intercept the save on the server. You get an instance of the object to be modified; read the object from the database that has the same ID, put a copy of that old object into a legacy table in your database, and continue with saving into the main table. That way only the latest revision stays in the main table, while the legacy table contains all previous versions.
So, all you would need to do is have two tables containing the same objects:
public DbSet<MyClass> OriginalMyClasses { get; set; }
public DbSet<MyClass> LegacyMyClasses { get; set; }
Override the SaveChanges function and intercept entries whose state is Modified: read the entry's type, get the original and legacy tables, read the object O from the Original table with the same ID, save O to the Legacy table, and finally return base.SaveChanges(); (let it save as it is supposed to by default).
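A minimal sketch of that override, assuming EF6, the two DbSets above, and an integer identity key on the legacy table (GetDatabaseValues reads the row as it still exists in the database, i.e. before this modification):

    public override int SaveChanges()
    {
        // Snapshot modified entries before EF writes anything.
        var modified = ChangeTracker.Entries<MyClass>()
            .Where(e => e.State == EntityState.Modified)
            .ToList();

        foreach (var entry in modified)
        {
            var dbValues = entry.GetDatabaseValues(); // pre-modification row
            if (dbValues != null)
            {
                var legacyCopy = (MyClass)dbValues.ToObject();
                legacyCopy.Id = 0; // assumes the Legacy table generates its own identity key
                LegacyMyClasses.Add(legacyCopy);
            }
        }
        return base.SaveChanges(); // persist both the update and the legacy copy
    }

This only handles one entity type; a generic version would switch on entry.Entity.GetType() to pick the matching legacy DbSet.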

How to access results of Sonar metrics for use with applications like PowerPivot

I'm trying to run a number of applications with known failure rates through Sonar, in hopes of deciding which metrics are most valuable in determining whether a particular application will fail. Ultimately I'll be making some sort of algorithm that will look at the outputs of whatever metrics I'm using and generate a score from 1 to 100. I've got about 21 applications put through Sonar, and the results have been stored in a MySQL database. I originally planned to use PowerPivot to find relationships in the data, but it seems like the formatting of the tables doesn't lend itself well to that. Other questions on Stack Overflow have told me that Sonar's tables are unformatted and that I should instead use the Web Service API to get the information. I'm unfamiliar with web service APIs and was unsuccessful in trying to do what I wanted by looking at Sonar's API documentation.
From an answer to another question:
http://nemo.sonarsource.org/api/timemachine?resource=org.apache.cxf:cxf&format=csv&metrics=ncloc,violations_density,comment_lines_density,public_documented_api_density,duplicated_lines_density,blocker_violations,critical_violations,major_violations,minor_violations
This looks very similar to what I'd like to have, except I'm only looking at each application once (I'm analyzing a sample of all the live applications on a grid), which means the timemachine service isn't really what I'm looking for. Would it be possible to generate a similar table, except that instead of the stats for a particular application per date, it showed the statistics for an application and all of its classes, etc.?
If you're not familiar with the WS API, you can also create your own Sonar plugin to achieve whatever you want. It is written in Java and will execute on every analysis you run. This way, in the code of this custom plugin, you can do whatever you want: write the metrics you need to an output file, push them into a third-party system, etc.
Just take a look at how to write a plugin (most probably you will create a Decorator). There are also concrete examples to get you started faster.
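If you do go the WS API route, fetching the CSV from the question into a file that PowerPivot can import is only a few lines. A minimal C# sketch; the URL is the one quoted above, so consult your Sonar version's Web Service documentation for an endpoint and parameters that return per-class rows instead of per-date rows:

    using System.Net;

    class FetchSonarCsv
    {
        static void Main()
        {
            const string url =
                "http://nemo.sonarsource.org/api/timemachine?resource=org.apache.cxf:cxf"
                + "&format=csv&metrics=ncloc,violations_density,comment_lines_density";
            using (var client = new WebClient())
            {
                // Save the CSV locally, then import it into PowerPivot/Excel.
                client.DownloadFile(url, "sonar-metrics.csv");
            }
        }
    }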

Is it possible to store and retrieve objects created using Objective-C? (in a database, for use in iOS app)

I'm working on an iOS app that creates "location sets", where each row contains a location name and a GeoPoint, and each set has its own name. Each of these sets is stored in an object inside our program (all belonging to the same class). Now we want to give users the capability to create sets and upload them to a database, allowing other users to access and download them to their devices.
I've been looking into back-end solutions for work like this, but pretty much everything I've found so far focuses on relational databases, adding and deleting rows, and using an SQL-like language to retrieve them. Is there a way to store these objects just as objects (without unpacking the info inside into tables), and then retrieve them? It feels like that would be a much simpler way of going about this.
I'm an absolute beginner when it comes to databases, so forgive me if there's info missing here that you would need to help me out. I'll make sure to keep checking back in case someone asks for more info.
Thanks!
Core Data might be useful for you, as it is entity-based, so you can do a variety of things with it using queries (predicates).
But if you just want to save and retrieve data, then as the simplest solution I would suggest creating an array/dictionary with the entity data and saving it into NSUserDefaults, so you can read it back when re-launching the app.
Webservices for iOS development:
raywenderlich
icodeblog
WSDL Webservices
Response data parsing, it would be either JSON or XML:
JSON Parsing
XML Parsing
Hope these links are helpful.
I ended up using Parse's mobile back-end service. That was the type of service I was looking for. I've found other similar services since then, like Applicasa and StackMob, but we're pretty happy with Parse so far.

Ajax autocomplete extender populated from SQL

OK, first let me state that I have never used this control and this is also my first attempt at using a web service.
My dilemma is as follows. I need to query a database to get back a certain column and use that for my autocomplete. Obviously I don't want the query to run every time a user types another word in the textbox, so my best guess is to run the query once and then use that dataset, array, list, or whatever to filter for the autocomplete extender...
I am kinda lost... any suggestions?
Why not keep track of the query executed by the user in a session variable, then use that to filter any further results?
The trick to preventing the database from being overloaded, I think, is really just to limit how frequently the autocomplete is allowed to update; something like once per 2 seconds seems reasonable to me.
What I would do is this: Store the current list returned by the query for word A server side and tie that to a session variable. This should be basically the entire list I would think. Then, for each new word typed, so long as the original word A exists, you can filter the session info and spit the filtered results out without having to query again. So basically, only query again when word A changes.
I'm using "session" in a PHP sense, you may be using a different language with different terminology, but the concept should be the same.
This depends upon how transactional your data store is. Obviously, if you are looking up US states (a data collection that realistically would not change through the life of the application), then I would cache either a System.Collections.Generic List<> or, if you wanted, a DataTable.
You could easily set up a cache of the data you wish to query that is dependent upon an XML file or database, so that your extender always queries the data object cast from the cache, and the cache object is only updated when the data source changes.
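A minimal sketch of that with the classic System.Web.Caching API; the file path and LoadNamesFromXml are illustrative, not from the original answer:

    using System.Collections.Generic;
    using System.Web;
    using System.Web.Caching;

    public static class NameCache
    {
        public static List<string> GetNames(HttpContext context)
        {
            var names = context.Cache["Names"] as List<string>;
            if (names == null)
            {
                var path = context.Server.MapPath("~/App_Data/names.xml");
                names = LoadNamesFromXml(path); // hypothetical loader
                // Evicted automatically when the file changes, so the next
                // request reloads fresh data.
                context.Cache.Insert("Names", names, new CacheDependency(path));
            }
            return names;
        }

        static List<string> LoadNamesFromXml(string path)
        {
            // ... parse the XML into a list of strings ...
            return new List<string>();
        }
    }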
RAM is cheap and SQL is harder to scale than IIS, so cache everything in memory:
- your entire data source, if it is not too large to load in a reasonable time,
- precalculated data,
- autocomplete web service responses.
Depending on the desired autocomplete behavior and performance, you may want to precalculate data and create redundant structures optimized for reading. Make use of structures like SortedList (when you need something like select top x ... where z like @query + '%'), Hashtable, and so on.
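For instance, the "top x where z like @query + '%'" lookup can be answered in O(log n + x) against pre-sorted data (the same idea behind SortedList's ordered keys). A sketch over a plain pre-sorted list, assuming it was sorted with the same case-insensitive ordinal comparer used here:

    // Return up to `max` entries from `sorted` that start with `prefix`.
    static IEnumerable<string> PrefixMatches(List<string> sorted, string prefix, int max)
    {
        int i = sorted.BinarySearch(prefix, StringComparer.OrdinalIgnoreCase);
        if (i < 0) i = ~i; // no exact match: complement gives the insertion point
        for (; i < sorted.Count && max > 0; i++, max--)
        {
            if (!sorted[i].StartsWith(prefix, StringComparison.OrdinalIgnoreCase))
                break; // sorted order: once a prefix stops matching, none follow
            yield return sorted[i];
        }
    }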
While caching everything is certainly a good idea, your question about which data structure to use is an issue that wasn't fully answered here.
The best data structure for an autocomplete extender is a Trie.
You can find a good .NET article and code here.
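If you want to experiment before adapting the article's code, a minimal prefix trie is short to write. This sketch stores lower-cased words and enumerates completions for a prefix:

    public class TrieNode
    {
        public readonly Dictionary<char, TrieNode> Children = new Dictionary<char, TrieNode>();
        public bool IsWord; // true if a stored word ends at this node
    }

    public class Trie
    {
        private readonly TrieNode _root = new TrieNode();

        public void Add(string word)
        {
            var node = _root;
            foreach (char c in word.ToLowerInvariant())
            {
                TrieNode next;
                if (!node.Children.TryGetValue(c, out next))
                {
                    next = new TrieNode();
                    node.Children[c] = next;
                }
                node = next;
            }
            node.IsWord = true;
        }

        // Walk down to the prefix node, then enumerate every word below it.
        public IEnumerable<string> Complete(string prefix)
        {
            var node = _root;
            var lower = prefix.ToLowerInvariant();
            foreach (char c in lower)
            {
                if (!node.Children.TryGetValue(c, out node))
                    yield break; // no stored word has this prefix
            }
            foreach (var word in Collect(node, lower))
                yield return word;
        }

        private static IEnumerable<string> Collect(TrieNode node, string soFar)
        {
            if (node.IsWord) yield return soFar;
            foreach (var kv in node.Children)
                foreach (var word in Collect(kv.Value, soFar + kv.Key))
                    yield return word;
        }
    }

Lookups cost O(length of prefix) to find the subtree, which is why the trie beats re-filtering a flat list once the candidate set gets large.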