We are trying to deploy a Business Connectivity Services (BCS) model solution where the properties in the model depend on the structure of the data exposed by a web service.
Ideally, the BCS model would expose a collection of key/value pairs that are later converted to columns in a SharePoint list, since that would let the same model serve several different datasets. From what we can tell, however, this is not how BCS models have been designed: they rely on the model being strongly typed so that it reflects the entity being imported.
As it stands, we are therefore looking at a solution that lets the user "create" a new external list by providing the URL of a remote dataset through a custom page in SharePoint Central Administration. That page would then automatically construct the BCS model project (by altering a project template) and compile and release the resulting feature on the fly.
This way we can create the "fixed" class with properties that represent the structure of the data being imported.
For example, datasource A could expose:
<cars>
<car>
<color>blue</color>
<make>ford</make>
</car>
<car>
<color>red</color>
<make>lotus</make>
</car>
</cars>
in which case we need a BCS model for "car" with two public properties, color and make.
However, datasource B could expose:
<invoices>
<invoice>
<amount>£34.00</amount>
</invoice>
<invoice>
<amount>£34.00</amount>
</invoice>
</invoices>
in which case we need a BCS Model "invoice" with a single public property for amount.
I would appreciate anyone's feedback on this approach, or on the 'best practice' way of achieving this.
[I've had experience doing something similar in .Net - I'm not sure how relevant this will be to you.]
I had to write an import tool that could handle any file format. To handle this properly, I wrote a small class which would take an XML format definition (name, data type, format string, custom parser, etc.) and generate a class which could read the file and expose an IQueryable<FileFormat> plus some additional metadata.
It's worth noting that to make it completely flexible, I had to allow the format definition to provide a C#/VB lambda which would be compiled and executed (e.g. when the input date format was non-standard and needed a custom parser). This is clearly a security risk, so when I instantiated the dynamic class, I did so in a separate AppDomain with very few privileges. This may not apply in your situation.
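Roughly, the sandboxing looked like this (a from-memory sketch of the .NET Framework approach, not exact code):
// Create a low-privilege AppDomain to host the compiled parser assembly.
// Requires System.Security and System.Security.Permissions.
var permissions = new PermissionSet(PermissionState.None);
permissions.AddPermission(new SecurityPermission(SecurityPermissionFlag.Execution));
var setup = new AppDomainSetup { ApplicationBase = AppDomain.CurrentDomain.BaseDirectory };
AppDomain sandbox = AppDomain.CreateDomain("ParserSandbox", null, setup, permissions);
// ...create the dynamic class inside 'sandbox' via CreateInstanceAndUnwrap, then unload when done.
AppDomain.Unload(sandbox);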
We used a custom templating engine to generate the code, which was then compiled using the System.CodeDom.Compiler namespace; this allowed us to create assemblies and cache them until the definition changed. It may be worth considering the Razor templating engine if you do something similar.
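The compilation step was along these lines (simplified; 'generatedSource' stands in for whatever your templating engine produces):
// Compile the generated C# source into an assembly.
// Requires System.CodeDom.Compiler and System.Reflection.
using (CodeDomProvider provider = CodeDomProvider.CreateProvider("CSharp"))
{
    var options = new CompilerParameters { GenerateInMemory = true };
    options.ReferencedAssemblies.Add("System.dll");
    options.ReferencedAssemblies.Add("System.Core.dll");
    CompilerResults results = provider.CompileAssemblyFromSource(options, generatedSource);
    if (results.Errors.HasErrors)
        throw new InvalidOperationException(results.Errors[0].ToString());
    Assembly generated = results.CompiledAssembly; // cache this until the definition changes
}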
The only real issues we had arose from coding against an unknown data type. By making the custom class implement our own IImportFile interface which exposed metadata in a standard way (effectively the same info as in the xml spec), we could work around it without too much effort.
We were lucky in that this is a tool for trusted users (at least only trusted users can provide a new file format specification) so the security risks were limited. If you're compiling code based on user input, make sure you've got adequate safeguards in place.
Today I was checking out a few technologies: T4 templating, AutoMapper, and some mini ORMs (PetaPoco, SqlFu, OrmLite).
I understand the gist of what these technologies provide. I'm currently working on a three-tier system, and I would have loved to replace the DAL (the data access layer, located on its own data server) and integrate it with a mini ORM; however, I will be making no such plans for now. We currently use .NET Remoting (which predates WCF).
So instead of replacing whatever is on the DataServer, I'd like to extend one of these new technologies on the application server.
I've done some research on how Entity Framework can automatically generate POCO classes based on the context (a step done manually after building the EF model), and I was wondering if I can do the same without using EF.
So here are the facts on what's currently happening:
The application sends a SQL statement (or stored proc) to the DAL to execute.
The DAL returns a DataSet or a DataTable to the application through a TCP channel.
My question is: is it possible to automatically generate a dynamic POCO class, using the "var" and "dynamic" keywords, based on the values sent back in the DataSet, and do the mapping onto it at runtime? Would any of the technologies mentioned above help? Or do I have to manually create the POCO class first and then map onto it?
It seems a bit redundant to manually create a POCO class and map it to a backend SQL table if the application could work out what the POCO class is supposed to contain. For example, if I update a table on the backend, I'd then have to update the POCO class associated with it as well. I'd love for this to be automatic.
If you know the data sets at compile time, then T4 might be an option. You can write a T4 script that downloads the database schema and constructs strongly-typed entity classes and database read/write methods.
As far as late-bound (runtime) classes go, one option is to use the runtime typing provided by CustomTypeDescriptor. You can pass arrays of objects back and forth from the server, and use reflection or other techniques to infer the type.
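For example, if you just want something dynamic at runtime from a DataTable, an ExpandoObject-based projection (a simpler alternative to writing a full CustomTypeDescriptor; this is only a sketch) could look like:
// Turn each DataRow into a dynamic object whose members match the column names.
// Requires System.Data, System.Dynamic, System.Collections.Generic.
static IEnumerable<dynamic> ToDynamic(DataTable table)
{
    foreach (DataRow row in table.Rows)
    {
        IDictionary<string, object> item = new ExpandoObject();
        foreach (DataColumn column in table.Columns)
            item[column.ColumnName] = row[column];
        yield return item;
    }
}
// usage (hypothetical column name): foreach (dynamic record in ToDynamic(dataSet.Tables[0])) Console.WriteLine(record.Id);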
I think it should be clear that the first option (T4) is preferable if you know the types at compile time (which it sounds like you do here). Runtime/dynamic typing should only be a last resort, as it circumvents a lot of valuable compile-time type checks.
Really, I would recommend using one of the micro ORMs like Dapper, etc, if you don't want to use the full Entity Framework. That is, unless you really want to re-invent the wheel.
I'm trying to build a WCF service that gets info from another service via XML. The XML usually has four elements, with types ranging from int and string to DateTime. I want to build the service dynamically, so that when it receives an XML document it stores it in the database. I don't want to hardcode the element names and types in the code; if there is a change, I want the service to add it to the database dynamically. What is the best way of doing this? Is this good practice? Or should I stick with Entity Framework, a fixed model for the database, and hardcoded element names and types?
Thanks
:)
As usual, "it depends...".
If the nature of the data you're receiving is that it has no fixed schema, or a schema that changes frequently as a part of its business logic/domain logic, it's a good idea to design a solution to manage that. Storing data whose schema you don't know at design time is a long-running debate on StackOverflow - look for "property bag", "entity attribute value" or EAV to see various questions and answers.
The thing you give up with this approach is - at the web service layer - the ability to check that the data you're receiving meets the agreed interface contract; this in turn helps to avoid all kinds of exciting bugs - what should your system do if it receives invalid XML? Without an agreed schema/dtd, you have to build all kinds of other checks into your code.
At the database level, you usually end up giving up aspects of the relational model (and therefore the power of SQL) to store your data without "traditional" row-and-column relationships. That often makes queries harder, and may sacrifice standards compliance (e.g. by using vendor specific extensions).
If the data changes because of technical reasons - i.e. it's not in the nature of the data to change, it's just the technology chain that makes you worry about this - I'd suggest you instead build in the concept of versioning, and have "strongly typed" services/data with different versions. Whilst this seems like more work, the benefits of relying on schema validation, the relational database model, and simplicity usually make it a good trade-off.
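As a rough illustration of the versioning idea (contract names and namespaces here are made up):
// Two strongly typed versions of the same contract, distinguished by namespace.
// Requires System.Runtime.Serialization.
[DataContract(Namespace = "http://example.com/invoices/v1")]
public class InvoiceV1
{
    [DataMember] public decimal Amount { get; set; }
}

[DataContract(Namespace = "http://example.com/invoices/v2")]
public class InvoiceV2
{
    [DataMember] public decimal Amount { get; set; }
    [DataMember] public string Currency { get; set; }
}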
Put the xml into the database as xml. Many modern databases support XML directly. If not, just store it as text.
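For example, with SQL Server's xml column type, the insert is roughly this (table and column names made up):
// Store the incoming XML string in an xml column.
// Requires System.Data.SqlClient, System.Data.SqlTypes, System.Xml, System.IO.
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("INSERT INTO Messages (Payload) VALUES (@xml)", connection))
{
    command.Parameters.Add("@xml", SqlDbType.Xml).Value =
        new SqlXml(XmlReader.Create(new StringReader(incomingXml)));
    connection.Open();
    command.ExecuteNonQuery();
}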
I am migrating a win32 Delphi VCL application written in Delphi 2010 to a multi-tiered architecture.
The application makes extensive use of ExpressGrids (TcxGrid) by devexpress for databinding.
I have designed the data tier based on Entity Framework, using DbContext and data transfer objects.
I have exposed the CRUD operations via WCF using T4 templates based on the entity data model.
That's all working fine; the Delphi client successfully communicates with the WCF service to exchange data transfer objects.
The problem is, how do I maintain the Databinding functionality?
I am leaning toward writing a service method that returns a generic data table or a dataset. Can I convert LINQ statement results to DataSets, DataTables, or views?
I am not sure I can bind ExpressGrids to arrays of objects (WCF returns entity collections, but Delphi sees arrays). Has anyone had experience with this type of interoperability?
EDIT
I have the data service returning DataSets, but guess what? The XML DataSet format from .NET is not compatible with Delphi (2010, at least). I have managed to get that to work using the (modified) Gekko example, but it's not going to be practical due to the complexity of the queries we'd need to run from the client. Our options now are to leave the grids as they are until the BLL is complete and start a new project to rewrite the client in C# (as a web or Windows application), or to write a generic grid plugin using RemObjects Hydra to embed in forms with grids; I'm trying to figure that out now.
Update
I have dropped the Gekko DataSet implementation in favour of writing my own by studying the XML format required by TClientDataSet in Delphi. On the server side, I've implemented a service that returns TClientDataSet-compliant XML in the form of a byte array. On the client, I load this into a TStringStream and then load the stream into the TClientDataSet component. I use this for small collections to provide lookups and data binding while we roll out the Hydra solution.
The xml roughly follows this format:
"2.0">
<METADATA>
<FIELDS>
<FIELD attrname="ID" fieldtype="i4"/>
<FIELD attrname="Status" fieldtype="string" WIDTH="10"/>
<FIELD attrname="Created" fieldtype="date"/>
<ROWDATA>
<ROW RowState="4" ID="1" Status="Code1" Created="20130707" Made="20130707T21:37:55341" Stopped="00:00:00000" Volume="1174" IsReady="TRUE"/>
<ROW RowState="4" ID="2" Status="Code2" Created="20130707" Made="20130707T21:37:55341" Stopped="00:00:00000" Volume="2149" IsReady="FALSE"/>
This is one of the few sources of information on the xml format.
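For anyone interested, the general shape of the server-side generation is something like the following sketch (simplified, not my exact code; it only maps int and string columns, and date/time columns would need the exact formats shown above):
// Build DATAPACKET xml from a DataTable and return it as a byte array.
// Requires System.Data, System.Linq, System.Text, System.Xml.Linq.
static byte[] ToClientDataSetXml(DataTable table)
{
    var fields = new XElement("FIELDS",
        table.Columns.Cast<DataColumn>().Select(c =>
            new XElement("FIELD",
                new XAttribute("attrname", c.ColumnName),
                new XAttribute("fieldtype", c.DataType == typeof(int) ? "i4" : "string"))));
    var rows = new XElement("ROWDATA",
        table.Rows.Cast<DataRow>().Select(r =>
            new XElement("ROW",
                new XAttribute("RowState", 4),
                table.Columns.Cast<DataColumn>().Select(c =>
                    new XAttribute(c.ColumnName, r[c])))));
    var packet = new XElement("DATAPACKET",
        new XAttribute("Version", "2.0"),
        new XElement("METADATA", fields),
        rows);
    return Encoding.UTF8.GetBytes(packet.ToString(SaveOptions.DisableFormatting));
}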
I know the question is old, but I would appreciate any insights on interoperability and legacy code migration.
Thanks
I have many (15-20) different XML files that I need to load into VB.NET. They're structured as they would be in a database: they were designed in Access and bulk-exported into XML files. Each file represents a different table in the database.
Now, I need to load this information into VB.NET. Initially, I'd have loved to use DAO and access the MDB directly via queries, but this won't be possible as I'm making sure the project can be easily ported to XNA/C# down the road (the Xbox 360 cannot use MDBs, so I'd rather deal with this problem now than later).
So, I'm stuck now trying to figure out how to wrangle all of these XML files together. I've tried using factories to parse each one individually; e.g., if three XML files contain data for a 'character' class, I'd pass an instance of Character to each XML factory and the classes would apply the necessary data.
I'm trying to get past this, though, as maintaining many different classes with redundant code is a pain, and it is hard to debug as well. So I'm trying to figure out a new solution.
The only thing I can think of right now is using System.Reflection, where I parse through each member of the class/structure I'm instantiating and then use the names of those members to read in the data from the corresponding element of the XML file.
However, this makes the assumption that each member of the structure/class has a matching element in the XML file, and vice-versa.
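In C# terms (since the project is headed toward C#/XNA anyway), the sketch I have in mind is roughly this; it assumes every property has a matching element, as described above:
// Copy each XML child element onto the property of the same name.
// Requires System, System.Xml.Linq, System.Reflection.
static void Populate(object target, XElement element)
{
    foreach (PropertyInfo property in target.GetType().GetProperties())
    {
        XElement child = element.Element(property.Name);
        if (child != null && property.CanWrite)
            property.SetValue(target, Convert.ChangeType(child.Value, property.PropertyType), null);
    }
}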
If you know the schema of the XML files, you could create .NET classes that can deserialize each of those XML files into an instance of a .NET object.
You can also use xsd.exe (which comes with the Windows SDK download) to generate the .NET class definition for you, if you have an XSD file (or can write an XSD more easily than you can write a serializable .NET class).
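For example (C# shown; the Character class and file name are hypothetical, and the root element of the exported file has to match what the serializer expects):
// Deserialize one exported table into a typed object with XmlSerializer.
// Requires System.IO and System.Xml.Serialization.
var serializer = new XmlSerializer(typeof(Character));
using (FileStream stream = File.OpenRead("Characters.xml"))
{
    var character = (Character)serializer.Deserialize(stream);
    // ...use 'character' here
}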
LINQ to XML is a good solution (and even better in VB.NET with things like XML Literals and global namespaces). Treating multiple XML files as DB tables can be a rough road sometimes, but it's certainly not impossible. I guess I'd start with JOIN (even though it has "C#" in the title, the samples are also in VB).
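A joined query across two of the exported files might look something like this (file, element, and key names are made up):
// Join two XML "tables" the way you would join database tables.
// Requires System.Linq and System.Xml.Linq.
XDocument characters = XDocument.Load("Characters.xml");
XDocument items = XDocument.Load("Items.xml");

var query =
    from character in characters.Descendants("Character")
    join item in items.Descendants("Item")
        on (string)character.Element("ID") equals (string)item.Element("CharacterID")
    select new
    {
        Name = (string)character.Element("Name"),
        Item = (string)item.Element("Name")
    };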
How many of the software projects you have worked on used object serialization? I personally never came across a scenario where object serialization was used. One use case I can think of is a server storing objects to disk to save memory. Are there other types of software where object serialization is essential or preferred over a database?
I've used object serialization in a lot of my projects. Sometimes we use it to store computer-specific settings locally. I have also used XML serialization to simplify interaction and generation of XML documents. It is also very beneficial in communication protocols. Serialize on one end and re-inflate on the other end.
Well, converting objects to XML or JSON is a form of serialization that is quite common on the web. I've also worked on a project where objects were created and serialized to a binary file in one application and then imported into another custom application (though that's fragile since it uses C# and serialization has broken in the past between versions of the .NET framework). Also, application settings that have a complex structure may be useful to serialize. I also think remoting APIs use serialization to communicate. Basically, serialization in general is simply a way to store the states of your objects, and this has many different uses.
Here are a few uses I can think of:
Sending an object across the network; the most common example is serializing objects across a cluster.
Serializing an object for (sort of) caching, i.e. saving the state in a file and reading it back later (see the sketch after this list).
Serializing passive/huge data to a file to minimize memory consumption, and reading it back whenever required.
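A minimal sketch of the second case, caching state in a file and reading it back (AppState is a hypothetical [Serializable] type):
// Requires System.IO and System.Runtime.Serialization.Formatters.Binary.
var formatter = new BinaryFormatter();
using (FileStream stream = File.Create("state.cache"))
    formatter.Serialize(stream, currentState);

// ...later, or on the next run:
AppState restored;
using (FileStream stream = File.OpenRead("state.cache"))
    restored = (AppState)formatter.Deserialize(stream);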
I'm using serialization to pass objects across a TCP socket. You put an XmlSerializer on either side, and it parses your data into readily available objects. If you do a little groundwork, you can get to the point where you're basically passing objects back and forth, and it makes socket communication extremely easy, reducing it to little more than socket.Send(myObject).
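The sending side boils down to something like this sketch (not my exact code); the receiver reads the 4-byte length prefix, then that many bytes, and deserializes them:
// XML-serialize the object and send it with a length prefix so the receiver
// knows where one message ends and the next begins.
// Requires System.IO, System.Net.Sockets, System.Xml.Serialization.
static void SendObject<T>(Socket socket, T message)
{
    using (var buffer = new MemoryStream())
    {
        new XmlSerializer(typeof(T)).Serialize(buffer, message);
        byte[] payload = buffer.ToArray();
        socket.Send(BitConverter.GetBytes(payload.Length));
        socket.Send(payload);
    }
}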
Interprocess communication is a biggie.
You can combine a DB and serialization, e.g. when you have to store an object with a lot of attributes (often dynamic, i.e. one object's attribute set will be different from another's) in a relational DB and you don't want to create a new column for each attribute.
We started out with a system that serialized all of the thousands of in-memory objects to disk every 15 minutes or so. When that started taking too long we switched over to a mixed mode of saving the objects into a relational db and pickle file (this was a python system btw). Eventually the majority of the data was stored in a relational database. Interestingly, the system was written in such a way that all of the application code couldn't care less what was going on down there. It was all done using XP and thousands of automated tests.
Document based applications such as word processors and vector graphics editors will often serialize the document model to disk when the user invokes the Save command. Serialization is often preferred over complex databases in these apps.
Using serialization saves you time each time you want to implement an import/export functionality.
Every time you need to export your system's data, create backups or store some kind of settings, you could use serialization instead and just save the state of the objects that represent the actual config, data or whatever else.
Only when you need a specific format for the exported/imported data does it make sense to build a custom parser and exporter/importer.
Serialization also copes well with change: whenever you change the format of an object involved in the exchange, it is still exportable and you don't have to change the logic behind your export/import code.
We used it for backup & update functionality. Basically, serialized Hibernate objects were backed up, then the DB schema was altered by the update, and we delivered a helper class that "converted" the old objects to the new DB schema. This way we had a pretty solid update mechanism that wouldn't break easily and performed an automatic backup at the same time.
I've used XML serialization heavily on one project. The technique was used to persist to the database data structures that had no common shape, so the data couldn't be stored directly in columns. I also used serialization to separate out application settings that could be changed at runtime.