Persisting Blob Streams with NHibernate - nhibernate

If I have a class declared as:
public class MyPersistentClass
{
public int ID { get; set; }
public Stream MyData { get; set; }
}
How can I use NHibernate's mappings to persist the MyData property to and from the database?

You could map a Stream with a custom type, tailored to your storage needs. But there are some issues with using the Stream object directly, as I mention in my blog series about lazy streaming of BLOBs and CLOBs with NHibernate.
What you really need is a Blob object that can in turn create a Stream to read data from. Since a Stream carries state about the position you're reading from, and expects to be closed and disposed, it can cause problems when used directly in a domain model.
I would suggest that you take a look at the blog series as well as the source code of the NHibernate.Lob project. It includes various mapping options for exactly this problem. It's a little sparsely documented so far, but more is coming.
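As a rough illustration of the point above (this is a minimal sketch, not code from the blog series or NHibernate.Lob), you can keep the Stream at the edges of the entity by mapping a byte[] backing property instead; the member names here are made up for the example:

public class MyPersistentClass
{
    public virtual int ID { get; set; }

    // Mapped to a BLOB/varbinary(max) column instead of mapping Stream directly.
    // "MyDataBytes" is an illustrative name, not part of the original question.
    protected virtual byte[] MyDataBytes { get; set; }

    // Hand out a fresh read-only Stream per call so position/dispose state
    // never leaks into the persistent object.
    public virtual System.IO.Stream OpenMyData()
    {
        return new System.IO.MemoryStream(MyDataBytes ?? new byte[0], writable: false);
    }

    public virtual void SetMyData(System.IO.Stream source)
    {
        using (var buffer = new System.IO.MemoryStream())
        {
            source.CopyTo(buffer);
            MyDataBytes = buffer.ToArray();
        }
    }
}

Note that this buffers the whole blob in memory, which is exactly the limitation the Blob/lazy-streaming approach in the blog series is designed to avoid.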

Related

Force WCF to use existing buffers during Deserialization?

I looked for some other topics here as well, but I didn't find a solution to my problem yet.
Imagine the following:
I have a very simple ServiceContract with several OperationContracts. One of these OperationContracts covers a simple use case: "download a data transfer object".
The Service looks like:
...
[OperationContract]
DTO Download(string Id)
...
The DTO class looks like:
[DataContract]
public class DTO
{
[DataMember]
public string Id;
[DataMember]
public byte[] Data;
}
Of course it's very simple and it works fine, but I need to allocate the byte[] in the DTO myself!
My code is part of a framework component and runs in parallel under heavy memory restrictions. I don't want WCF to allocate all the byte[] buffers, and I don't want the managed heap to have to collect them all again. I need to share and reuse all the buffers that exist in parallel.
So when serialization is finished I will reuse the buffer on the server side.
On the client side I want WCF to read into my buffer!
I tried some solutions with my own XmlObjectSerializers and my own OperationBehaviors, but nothing has worked yet.
Does anyone have any other ideas?
Update:
I found a first working solution using a custom XmlObjectSerializer, a custom IContractBehavior and a custom DataContractSerializerOperationBehavior.
While reading from the XmlDictionaryReader during deserialization I use the following snippet:
public bool DeserializeFrom(XmlDictionaryReader source)
{
...
readBytes = source.ReadElementContentAsBase64(MyOwnAlreadyAllocatedBuffer, offset, toRead);
...
}
This works, but it's still a bit slower than the DataContractSerializer.
Any other options?
Notice: I just want to transfer already binary serialized data!
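For reference, the full chunked-read pattern I'm using looks roughly like this (MyOwnAlreadyAllocatedBuffer is a field I allocate up front; the loop shown here is a simplified sketch and assumes the buffer is large enough for the payload):

public bool DeserializeFrom(XmlDictionaryReader source)
{
    int offset = 0;
    int read;
    // ReadElementContentAsBase64 decodes the element content into the supplied
    // buffer and returns 0 once the content has been fully consumed.
    while ((read = source.ReadElementContentAsBase64(
               MyOwnAlreadyAllocatedBuffer, offset, MyOwnAlreadyAllocatedBuffer.Length - offset)) > 0)
    {
        offset += read;
    }
    return offset > 0;
}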

How to deserialize data from ApiController

I have some POCO objects that are set up for use with Entity Framework Code First.
I want to return one of those objects from an ApiController in my ASP.NET MVC 4 website, and then consume it in a client application.
I originally had problems with the serialization of the object at the server end, because the Entity Framework was getting in the way (see Can an ApiController return an object with a collection of other objects?), and it was trying to serialize the EF proxy objects rather than the plain POCO objects. So, I turned off proxy generation in my DbContext to avoid this - and now my serialized objects look OK (to my eye).
The objects in question are "tags" - here's my POCO class:
public class Tag
{
public int Id { get; set; }
public int ClientId { get; set; }
public virtual Client Client { get; set; }
[Required]
public string Name { get; set; }
[Required]
public bool IsActive { get; set; }
}
Pretty standard stuff, but note the ClientId and Client members. Those are EF Code First "navigation" properties. (Every tag belongs to exactly one client).
Here's what I get from my ApiController:
<Tag xmlns:i="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://schemas.datacontract.org/2004/07/Foo">
<Client i:nil="true"/>
<ClientId>1</ClientId>
<Id>1</Id>
<IsActive>true</IsActive>
<Name>Example</Name>
</Tag>
The Client member is nil because having disabled proxy generation I don't get automatic loading of the referenced objects. Which is fine, in this case - I don't need that data at the client end.
So now I'm trying to de-serialize those objects at the client end. I had hoped that I would be able to re-use the same POCO classes in the client application, rather than create new classes. DRY and all that. So, I'm trying:
XmlSerializer xmlSerializer = new XmlSerializer(typeof(Tag));
var tag = xmlSerializer.Deserialize(stream);
But I've run into two problems, both of which are due to EF Code First conventions:
Problem 1: Because my Tag class has a Client member, the XmlSerializer is complaining that it doesn't know how to de-serialize that. I guess that's fair enough (though I had hoped that because the member was nil in the XML it wouldn't care). I could pass in extra types in the XmlSerializer constructor, but when I tried that, it then complained about other classes that Client uses. Since Client references all sorts of other objects, I'd end up having to pass in them all!
I tried using the [DataContract] and [DataMember] attributes to remove the Client member from the XML (by not marking it as a DataMember). That did remove it from the XML, but didn't stop the XmlSerializer from whining about it. So I guess it's not the fact that it's in the XML that's the problem, but that it's in the class definition.
Problem 2: When I did try passing in typeof(Client) as an extra type, it also complained that it couldn't de-serialize that class because it contains an interface member. That's because - again due to EF Code First conventions - it has a Tags member as follows:
`public virtual ICollection<Tag> Tags { get; set; }`
So it looks like even if I get over the referenced-types problem, I'm still not going to be able to use my POCO classes.
Is there a solution to this, or do I have to create new DTO classes purely for use at the client side, and return those from my ApiController?
I just tried using DataContractSerializer instead of XmlSerializer, and for the Tag class that seems to work. I've yet to try it with a class that has a virtual ICollection<T> member...
Update: tried it, and it "works". It still manages to reconstruct the object, and leaves the ICollection member at null.
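For reference, the DataContractSerializer variant I tried looks something like this minimal sketch:

var serializer = new System.Runtime.Serialization.DataContractSerializer(typeof(Tag));
// ReadObject rebuilds the Tag from the XML stream returned by the ApiController
var tag = (Tag)serializer.ReadObject(stream);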
Update 2: OK, that turned out to be a dead end. Yes, it meant that I could correctly serialize and de-serialize the classes, but as everyone kept telling me, DTO classes were a better way to go. (DTO = Data Transfer Objects - classes created specifically for transferring the data across the wire, probably with a subset of the fields of the original).
I'm now using AutoMapper (thanks Cuong Le) so that I can easily transform my POCO entities into simpler DTO classes for serialization, and that's what I'd recommend to anyone faced with the same problem.
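For anyone wanting a concrete picture of that approach, here is a rough sketch (the TagDto class and mapping calls are illustrative, using the static AutoMapper API that was current at the time):

// A flat DTO with only the fields that need to cross the wire
public class TagDto
{
    public int Id { get; set; }
    public int ClientId { get; set; }
    public string Name { get; set; }
    public bool IsActive { get; set; }
}

// One-time configuration, e.g. in Application_Start
Mapper.CreateMap<Tag, TagDto>();

// In the ApiController action, before returning the result
TagDto dto = Mapper.Map<TagDto>(tag);

Because TagDto has no Client navigation property and no interface-typed members, both XmlSerializer and DataContractSerializer handle it without complaint.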

Writing a Custom Hive Provider using objects as datasource

I'm trying to create a hive provider that can work against some plain objects.
An object may look something like this:
public class MyContent
{
public System.Collections.Generic.List<ContentExample> Content { get; set; }
}
public class ContentExample
{
public string Title { get; set; }
public string Text { get; set; }
}
public class MyFiles
{
public System.Collections.Generic.List<FileExample> Files { get; set; }
}
public class FileExample
{
public System.IO.FileInfo File { get; set; }
}
I've downloaded and checked the two Hive providers from the Visual Studio Gallery (Umbraco 5 Hive Provider and Umbraco 5 Simple Hive Provider), but the lack of documentation is a bit disturbing. I also downloaded some of the other example hives, like the Wordpress hive provider, but that one is rather different from the ones in the Visual Studio Gallery.
The Idea
I'm used to working with stuff like ObjectDataSource; the example above could be complemented with full CRUD if required.
Now, I assume one hive provider would be able to serve different parts of Umbraco with content (right?). Just set up a new repository and go? I have no clue yet how to connect all the parts, or even how to get the data into the provider.
Any help in how I could bring all pieces together?
Thanks
The first step is to take a step back and evaluate your business requirements. Will users be updating the information through forms on the frontend? Do you need a tree editor for the content in the backoffice? Do you need to work with data outside of the built-in ORM?
If the answer to these is no, a hive provider is overkill. Evaluate solutions using either simple surface controllers or just a custom document type. Umbraco 5 is a full EAV/CR system, so unlike some CMS products, you'll be able to represent any RDBMS structure you can imagine.
ContentExample could be represented by a document type called 'Article', which has the properties Title and Text. Just by defining this document type we're instantly given add and edit forms for our back office users in the content section. We can even restrict which nodes are able to have children of type 'Article', e.g. News.
In the same way, an upload control is a field type that allows you to attach files to your document.
So what's the point of a custom hive provider?
The goal of a custom hive provider is to unify CRUD actions across data access layers.
As a result, data can be stored in the baked-in NHibernate ORM, custom tables, RSS feeds, or even flat files, while still using a common interface to retrieve and update it. If this sounds like what you're aiming for, read on.
Going back to the business requirements: specifically, where do you want to actually store the data? Given that you have some fields and properties related to flat-file storage, let's say that one TypedEntity (a model) is equivalent to one file, and write some pseudocode.
The first step is, as you say, to 'get the data into the repository.' This involves going back to that VS template and filling in the 'not implemented' methods with your logic for storing and retrieving data.
protected override void PerformAddOrUpdate(TypedEntity entity)
{
// step 1: serialize the typed entity to xml
// step 2: write the file to the hdd, making sure that the file name is named using the hive id so that we can pull it back later.
}
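To make that pseudocode slightly more concrete, a flat-file version might look roughly like the sketch below; the storage folder, the SerializeToXml helper and the use of entity.Id are all assumptions for illustration rather than verified Umbraco 5 Hive API usage:

protected override void PerformAddOrUpdate(TypedEntity entity)
{
    // step 1: serialize the typed entity to xml
    // (SerializeToXml is a hypothetical helper you would write yourself)
    string xml = SerializeToXml(entity);

    // step 2: write the file to disk, naming it after the hive id
    // so the entity can be located again when it is read back
    var path = System.IO.Path.Combine(_storageFolder, entity.Id + ".xml");
    System.IO.File.WriteAllText(path, xml);
}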
Once you've written the data access layer, or DAL, you can hook it up in the hive config, giving it a URL to match; e.g. rather than matching content:\\, yours might match on file-manager:\\.
We can allow our backoffice users to be able to add new entities (indirectly, new files) by writing a custom tree, and we can display the results to our front-end users via macros.

Mapping properties between two databases using Fluent NHibernate & S#arp Architecture

I have a scenario where I have a new application built using NHibernate and S#arp Architecture. This application is being integrated into a legacy application that does not use NHibernate. There will be separate databases for the legacy and new applications. I am able to successfully read objects from both databases using two S#arp NHibernate factory keys (as detailed here). This all works great. My problem is when I want to map a property from a class in my new application to an object in the legacy database. Consider the following scenario:
public class NewUser
{
public virtual LegacyUser LegacyUser { get; set; }
}
public class LegacyUser
{
public virtual string UserName { get; set; }
}
I can read my NewUser objects, but when I attempt to reference the LegacyUser property I get an error:
System.Data.SqlClient.SqlException: Invalid object name 'NewSchema.LegacyUser'.
I suspect this is happening because the session that is retrieving the NewUser objects doesn’t know anything about the session that is required to read the old user objects. The session factory key is usually specified by the S#arp repository.
I've tried specifying the schema in the mapping, but this doesn't work either.
mapping.Schema("OldSchema");
Question: Is there any way to specify in the Fluent NHibernate mapping that when the NewUser object attempts to read the LegacyUser object it needs to use a different session factory? Or do I have to manually read the objects from the legacy database and insert them into my new classes?
Thanks in advance!

Realize entity copy on NHibernate update

How can I achieve the following behaviour in NHibernate?
There is an entity called User in my domain:
// I omit mapping attributes
public class User
{
int Id { get; set; }
int pId { get; set; }
//....other props, such as Login, Name, email...
}
I need to make a full copy of it when updating: pId must be set to the original Id, and the old entity must remain untouched.
So this should work like a versioning system, where pId is the immutable identity and Id is like a version number. I've tried to use the Version mapping attribute, but it just updates the Version field without recreating the full entity. What approach would be better?
I wrote a versioning system for our application. I split the versioned object into two classes, one that represents the whole object and one for its versions.
In our software, the versioning is part of the business logic. The application provides access to the history and the user has some control over the versioning.
A new version is therefore created in memory. You need to perform a deep-copy. You could implement this by implementing an interface in the root entity and all its children. This interface provides a method DeepCopy which is recursively called through the object graph.
There is also a protected MemberwiseClone method on object, which could be helpful. It does not perform a deep copy, though.
We didn't want the pain to maintain the code which copies each single property. So we are using serialization to copy an object graph in memory:
public static T CopyDataContract<T>(T obj)
{
NetDataContractSerializer serializer = new NetDataContractSerializer();
using (MemoryStream stream = new MemoryStream())
{
serializer.Serialize(stream, obj);
stream.Position = 0;
return (T)serializer.Deserialize(stream);
}
}
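As a usage illustration only (combining the helper above with the pId/Id convention from the question; the id reset and session handling are assumptions, not part of our system):

// deep-copy the original, then reset identifiers so NHibernate
// inserts a brand-new row instead of updating the existing one
var copy = CopyDataContract(user);
copy.pId = user.Id;   // keep a pointer back to the original version
copy.Id = 0;          // unsaved-value, so Save() performs an INSERT
session.Save(copy);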
Additionally, we have methods in the entities which reset the ids and do some other cleanup. This is also done recursively through the object graph.
For now, it works fine. In the future we probably need to refactor it to the interface implementation, which is a bit cleaner.
If history is NOT part of your domain logic, I'd suggest using triggers. By "not part of the domain logic" I mean you don't need to show it to users and you don't create new copies from historical versions.
With triggers you have several options. The simplest (assuming you really need to save history for only a handful of tables) would be a separate _history table that is a copy of your original table.
All in all, it really depends on what exactly you are after.