I've tried to lazy-load a property in my domain model, but lazy loading doesn't work. (It is always loaded).
[Property(0, Column = "picture", Lazy=true)]
public virtual System.Byte[] Picture
{
get { return picture; }
set { picture = value; }
}
The documentation here says that this requires build-time bytecode instrumentation. What does this mean, and how do I get it?
Have you tried a collection rather than an array?
[Property(0, Column = "picture", Lazy=true)]
public virtual IList<System.Byte> Picture
{
get { return picture; }
set { picture = value; }
}
For lazy loading to work, NHibernate makes use of interception (via dynamic proxies). That means it wraps your calls to Picture, and the first time Picture is accessed it loads the property from the database.
For this it can use one of three dynamic proxy frameworks:
Castle DynamicProxy
Linfu
Spring
The NHibernate download includes a separate folder containing these dynamic proxy plugins. Copy the DLLs of the provider you choose into the folder where NHibernate.dll lives, and set a property in your NHibernate configuration file:
<property name="proxyfactory.factory_class">NHibernate.ByteCode.LinFu.ProxyFactoryFactory, NHibernate.ByteCode.LinFu</property>
Ref: http://nhforge.org/blogs/nhibernate/archive/2008/11/09/nh2-1-0-bytecode-providers.aspx
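If you configure NHibernate through hibernate.cfg.xml, here is a minimal sketch of where that property sits (connection settings and mappings omitted):
<hibernate-configuration xmlns="urn:nhibernate-configuration-2.2">
  <session-factory>
    <!-- dialect, connection string, mappings, ... -->
    <property name="proxyfactory.factory_class">NHibernate.ByteCode.LinFu.ProxyFactoryFactory, NHibernate.ByteCode.LinFu</property>
  </session-factory>
</hibernate-configuration>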
HTH Alex
As the title says, is there any way to iterate over or display Apache Velocity template attributes?
For example, I have the following code:
${ctx.messages.headerMessage}
And I want to know how many other attributes the variable ${ctx} has.
It's only possible to discover and loop over an object's properties (that is, the ones with getters and/or setters) if you can add a new tool to your Velocity context. If you can't, you're rather stuck.
There are several ways to do this; below I illustrate one using commons-beanutils.
First, add Apache commons-beanutils to your classpath, then add it to your Velocity context from Java:
import org.apache.commons.beanutils.PropertyUtils;
...
context.put("beans", new PropertyUtils());
...
One remark: if you do not have access to the Java part, but if by chance commons-beanutils is already in the classpath, there is one hackish way of getting access to it: #set($beans = $foo.class.forName('org.apache.commons.beanutils.PropertyUtils').newInstance()).
Then, let's say that I have the following object:
class Foo
{
public boolean isSomething() { return true; }
public String getName() { return "Nestor"; }
}
which is present in my context under $foo. Using your new $beans property introspector, you can do:
#set ($properties = $beans.getPropertyDescriptors($foo.class))
#foreach ($property in $properties)
$property.name ($property.propertyType) = $property.readMethod.invoke($foo)
#end
This will produce:
something (boolean) = true
class (class java.lang.Class) = class Foo
name (class java.lang.String) = Nestor
(you'll need to filter out the class property, of course)
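For example, the filtering can be done directly in the loop (same $properties as above):
#foreach ($property in $properties)
#if ($property.name != "class")
$property.name ($property.propertyType) = $property.readMethod.invoke($foo)
#end
#end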
One last remark, though. Templates are for coding the View layer of an MVC application, and doing this kind of generic object introspection in them is rather inappropriate for the view layer. You're far better off moving all this introspection code to the Java side.
I have downloaded sample code from GitHub and run AtLeastOnceDelivery.sln.
Every new run it keeps sending the messages again. And if I change the message namespace, it shows an error starting with
Error loading snapshot [SnapshotMetadata<pid: delivery, seqNr: 0, timestamp: 2018/09/24>], remaining attempts: [0]
If I could clear the persisted data, hopefully it would accept the changed namespace and restart the message id.
By default, all snapshots are stored as files directly in the ./snapshots directory of the application, while events are stored in memory. Because of that, you should consider using one of the akka.persistence plugins for production purposes.
Your problem happens because you're using the Akka.NET default serializers (designed for networking), which are not very version tolerant: changing any fields, their types, class names or namespaces makes the previous version of the class non-deserializable, and the default serializer itself is subject to change in the future. This is also why using the default serializers for persistence is strongly discouraged.
How to make a custom Akka.NET Serializer
While there are plans to improve the serializer API, at the moment (Akka.NET v1.3.9) you make your own serializer simply by inheriting from the Akka.Serialization.Serializer class:
public sealed class MySerializer : Serializer
{
public MySerializer(ExtendedActorSystem system) : base(system) { }
public override int Identifier => /* globally unique serializer id */;
public override bool IncludeManifest => true;
public override byte[] ToBinary(object obj)
{
// serialize object
}
public override object FromBinary(byte[] bytes, Type type)
{
// deserialize object
}
}
Keep in mind that the Identifier property must be unique within the cluster; values below 100 are usually taken by Akka.NET internal serializers, so it's better to use higher values.
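For illustration only, here is a minimal sketch that fills the template in with Json.NET (which Akka.NET's default serializer also uses); the identifier value 1001 and the JSON wire format are arbitrary choices for the example:
using System;
using System.Text;
using Akka.Actor;
using Akka.Serialization;
using Newtonsoft.Json;

public sealed class MySerializer : Serializer
{
    public MySerializer(ExtendedActorSystem system) : base(system) { }

    // Arbitrary value above the range used by the built-in serializers.
    public override int Identifier => 1001;

    public override bool IncludeManifest => true;

    public override byte[] ToBinary(object obj)
    {
        // Serialize the payload as UTF-8 encoded JSON.
        return Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(obj));
    }

    public override object FromBinary(byte[] bytes, Type type)
    {
        // The type provided by Akka tells us what to deserialize into.
        return JsonConvert.DeserializeObject(Encoding.UTF8.GetString(bytes), type);
    }
}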
How to bind serializer to be used for a given type
By convention Akka.NET uses empty interfaces to mark message types that are supposed to be serialized. Then you can setup your HOCON configuration to use a specific serializer for a given interface:
akka.actor {
serializers {
my-serializer = ""MyNamespace.MySerializer, MyAssembly""
}
serialization-bindings {
""MyNamespace.MyInterface, MyAssembly"" = my-serializer
}
}
Where MyInterface is the interface implemented by the message types you want to serialize/deserialize with MySerializer.
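For illustration, the marker interface and a message bound to it could look like this (ShipmentRegistered is just an example name):
namespace MyNamespace
{
    // Empty marker interface; bound to my-serializer in the HOCON above.
    public interface MyInterface { }

    // Any message implementing the marker interface is serialized with MySerializer.
    public sealed class ShipmentRegistered : MyInterface
    {
        public ShipmentRegistered(string shipmentId)
        {
            ShipmentId = shipmentId;
        }

        public string ShipmentId { get; }
    }
}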
I'm doing things that some consider horrible lately, but I personally enjoy this kind of experiment. Here's a telegraph-style description:
Use NH to fetch data objects
Each DataObject is wrapped by a CastleDynamicProxy
When properties decorated with custom attributes are queried, redirect to my own code instead of NHibernate to get the return value.
Object creation / data fetch code
Objects = GetAll().Select(x => ProxyFactory.CreateProxy<T>(x)).ToList();
public IList<Person> GetAll()
{
ISession session = SessionService.GetSession();
IList<Person> personen = session.CreateCriteria(typeof(Person))
.List<Person>();
return personen;
}
The Proxy generation Code:
public T CreateProxy<T>(T inputObject)
{
T proxy = (T)_proxyGenerator.CreateClassProxy(typeof(T), new ObjectRelationInterceptor<T>(inputObject));
return proxy;
}
The Interceptor used is defined like so:
public class ObjectRelationInterceptor<T> : IInterceptor
{
private readonly T _wrappedObject;
public ObjectRelationInterceptor(T wrappedObject)
{
_wrappedObject = wrappedObject;
}
public void Intercept(IInvocation invocation)
{
if (ShouldIntercept(invocation)) { /* Fetch Data from other source*/ }
else
{
invocation.ReturnValue = invocation.Method.Invoke(_wrappedObject, invocation.Arguments);
}
}
public bool ShouldIntercept(IInvocation invocation)
{
// true if Getter / Setter and Property
// has a certain custom attribute
}
}
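For illustration, a possible implementation of ShouldIntercept could look roughly like this (RedirectAttribute is just a placeholder for whatever custom attribute marks the redirected properties):
public bool ShouldIntercept(IInvocation invocation)
{
    // Property getters/setters are compiler-generated "special name" methods
    // prefixed with get_ / set_.
    var method = invocation.Method;
    if (!method.IsSpecialName) return false;
    if (!method.Name.StartsWith("get_") && !method.Name.StartsWith("set_")) return false;

    // Intercept only if the underlying property carries the marker attribute.
    var property = typeof(T).GetProperty(method.Name.Substring(4));
    return property != null && property.IsDefined(typeof(RedirectAttribute), true);
}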
This works fine in an environment without NHibernate (creating objects in code, where the Object holds its own data).
Unfortunately, the else branch in the Intercept method seems to leave NHibernate non-functional; it seems _wrappedObject is reduced to its base-type behaviour (instead of being proxied by NHibernate), so all mapped child collections remain empty.
I tried switching from lazy to eager loading (and confirmed that all SQL gets executed), but that doesn't change anything at all.
Does anybody have an idea what I could do to get this back to work?
Thanks a lot in advance!
I found out that what I did was partially wrong and partially incomplete. Instead of deleting this question, I chose to answer it myself, so that others can benefit from it as well.
First of all, I had misunderstood the class proxy to be an instance proxy, which is why I stored _wrappedObject. I needed the object to perform invocation.Method.Invoke(_wrappedObject, invocation.Arguments), which was the next mistake. Instead of doing that, I should have passed the call on to the next interceptor by making use of invocation.Proceed().
Now, where was it incomplete? NH needs to know metadata about its instances, so I was missing one important line to make NH aware that the proxy is one of its kin:
SessionFactory.GetClassMetadata(entityName).SetIdentifier(instance, id, entityMode);
This only works in an NHibernate interceptor, so the final product differs a bit from my initial one... Enough gibberish; you can see a very comprehensible example of this on Ayende's website. Big props for his great tutorial!
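For readers who do not want to follow the link right away, the approach looks roughly like this (a sketch in the spirit of Ayende's example; ProxyFactory.CreateProxy(type) is a placeholder for your own proxy generator, not the generic CreateProxy<T> shown above):
using System;
using NHibernate;

public class DataBindingInterceptor : EmptyInterceptor
{
    public ISessionFactory SessionFactory { get; set; }

    public override object Instantiate(string entityName, EntityMode entityMode, object id)
    {
        if (entityMode == EntityMode.Poco)
        {
            Type type = Type.GetType(entityName);
            if (type != null)
            {
                // Build the proxy ourselves instead of letting NHibernate do it...
                object instance = ProxyFactory.CreateProxy(type);
                // ...then set the identifier so NHibernate treats the proxy as one of its own.
                SessionFactory.GetClassMetadata(entityName).SetIdentifier(instance, id, entityMode);
                return instance;
            }
        }
        return base.Instantiate(entityName, entityMode, id);
    }
}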
I think it's a little bit ridiculous to ask that in an interview. But if the interviewer asks... you need to answer.
Explain in depth:
why properties and methods must be virtual
how lazy loading works
Regards,
You will have to look at the NHibernate source code for more details, but my understanding is as follows: lazy loading is implemented by substituting the class with a proxy generated at runtime. The proxy inherits from the class, so it can 'intercept' method calls and load the actual data lazily. This interception only works if methods and properties are virtual, because the client code calls them through a reference of the class type; the client code can be unaware that it is really using a proxy (derived from the class). The actual lazy loading logic is a lot more complex, but this is roughly what is going on:
public class Customer {
private String _name;
public virtual String Name {
get { return _name; }
}
}
// code like this gets generated at runtime:
public class CustomerProxy7461293476123947123 : Customer {
private Customer _target;
public override String Name {
get {
if(_target == null){
_target = LoadFromDatabase();
}
return _target.Name;
}
}
}
This way the data only gets loaded when the client actually accesses 'Name':
Customer customer = Session.Load<Customer>(1); // <-- proxy is returned
// or
Customer customer = salesman.FavoriteCustomer; // <-- proxy is returned
...
String name = customer.Name; // <-- proxy's Name will be called, loading data
Similar mechanisms are used for collections except that collections don't need to be generated at runtime. NHibernate has built-in persistent collections that load items lazily.
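As a rough illustration (a conceptual sketch, not NHibernate's actual persistent collection implementation), a lazily-initialized collection can be as simple as this:
using System;
using System.Collections;
using System.Collections.Generic;

// Conceptual sketch of a lazily-loaded collection.
public class LazyCollection<T> : IEnumerable<T>
{
    private readonly Func<List<T>> _load;
    private List<T> _items;

    public LazyCollection(Func<List<T>> load)
    {
        _load = load;
    }

    public IEnumerator<T> GetEnumerator()
    {
        // Items are fetched from the database only on first enumeration.
        if (_items == null)
            _items = _load();
        return _items.GetEnumerator();
    }

    IEnumerator IEnumerable.GetEnumerator()
    {
        return GetEnumerator();
    }
}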
I have two related objects: ProgramSession and ProgramTask, with a one-to-many relationship. A ProgramSession has many ProgramTasks. So the objects look like this:
public class ProgramSession
{
    private IList<ProgramTask> _programTasks = new List<ProgramTask>();

    public virtual IList<ProgramTask> ProgramTasks
    {
        get { return _programTasks; }
        set { _programTasks = value; }
    }
}
public class ProgramTask
{
    private ProgramSession _programSession;

    public virtual ProgramSession ProgramSession
    {
        get { return _programSession; }
        set { _programSession = value; }
    }
}
And the mappings...
ProgramSession.hbm.xml
<bag name="ProgramTasks" lazy="false" cascade="all-delete-orphan" inverse="true" >
<key column="SessionUid"></key>
<one-to-many class="ProgramTask"></one-to-many>
</bag>
ProgramTask.hbm.xml
<many-to-one name="ProgramSession" column="SessionUid" class="ProgramSession" />
The problems begin when I try to change the ProgramSession of a ProgramTask.
If I remove the ProgramTask from the ProgramSession.ProgramTasks list property of the old session, and then add it to the same property on the new session, NHibernate tells me that a deleted object will be re-saved.
If I simply change the value of the ProgramTask.ProgramSession object, I have no problem saving. However, I get weird behavior if I do not save immediately because the ProgramSession.ProgramTasks properties (on both sessions) are not synchronized until after the NHibernate session is refreshed.
Changing the ProgramTask.ProgramSession property without directly modifying the lists as well creates an invalid state. Take the following code as an example:
programTask.ProgramSession = newProgramSession;
Assert.That(newProgramSession.ProgramTasks.Contains(programTask)); // fails
Assert.That(!oldProgramSession.ProgramTasks.Contains(programTask)); // fails
This is more problematic in code that executes later on and assumes the ProgramTasks collections are synchronized with the ProgramSession property. For example:
foreach(var programTask in programSession.ProgramTasks)
{
// whatever
}
One hack I used to work around this was to query the list. I can't use it everywhere, and it's clearly a bad solution, but it underscores the problem:
var tasksActuallyInSession =
programSession.ProgramTasks
.Where(task => task.ProgramSession == programSession)
.ToList();
Is there any way to handle this kind of situation? A best practice? Am I doing something wrong? Is the mapping incorrect? Is there some super-secret NHibernate flag I need to set?
Not sure I understand everything you are doing here. Some thoughts:
If you decide to move ProgramTasks around, then they become independent and should not be mapped using cascade="all-delete-orphan". With that cascade, NH removes the ProgramTask from the database when you remove it from a ProgramSession.
Map it using cascade="none" and control the lifecycle of the object yourself. (this means: store it before a ProgramSession gets stored. Delete it when it is not used anymore.)
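A sketch of what that mapping could look like (the same bag as in the question, only the cascade changed):
<bag name="ProgramTasks" lazy="false" cascade="none" inverse="true">
  <key column="SessionUid" />
  <one-to-many class="ProgramTask" />
</bag>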
Not sure if this is also a problem, but note that if you have an inverse reference, your code is responsible for keeping the references consistent. Of course, references get cleaned up after storing to the database and loading into an empty session, because there is only one foreign key in the database. But this is not the way it should be done. It is not NH's responsibility to manage your references (its only responsibility is to persist what you are doing in memory). So you need to make the references consistent in memory, and implement your business logic as if there were no NHibernate behind it.
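One common way to keep both sides consistent in memory is to do the bookkeeping in a small helper on the entity itself; a sketch (the MoveTo method is my own suggestion, not from the question):
public class ProgramTask
{
    // Setter kept protected so that MoveTo is the only way to change the association.
    public virtual ProgramSession ProgramSession { get; protected set; }

    // Moves this task to another session and keeps both collections in sync.
    public virtual void MoveTo(ProgramSession newSession)
    {
        if (ProgramSession != null)
            ProgramSession.ProgramTasks.Remove(this);   // detach from the old session

        ProgramSession = newSession;

        if (newSession != null && !newSession.ProgramTasks.Contains(this))
            newSession.ProgramTasks.Add(this);          // attach to the new session
    }
}
With such a helper (combined with cascade="none" as suggested above), programTask.MoveTo(newProgramSession) keeps the assertions from the question passing, because both ProgramTasks collections and the ProgramSession property are updated together.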