Hey all. Quick question on Fluent syntax. I thought I had this down, but I'm getting a weird failure. Basically, I have a hierarchical structure that I'm trying to persist, and it all seems to work, except when I run an actual integration test w/ the db.
I have a Node object which has a Parent property, which is another Node, and a _children field backing a readonly Children property, which is a collection of Nodes as well.
The properties handle correlating the relationships, and the in-memory objects test out just fine. When I retrieve them from the repository (an in-memory SQLite db in my tests), though, every Node's Children collection includes the Node itself for some reason. Any ideas?
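The classes look more or less like this (simplified; names match the mappings below):

using System.Collections.Generic;

public class Node
{
    // backing field for the read-only Children collection
    private IList<Node> _children = new List<Node>();

    public virtual int Id { get; protected set; }
    public virtual Node Parent { get; protected set; }
    public virtual IEnumerable<Node> Children { get { return _children; } }

    // keeps both sides of the relationship in sync in memory
    public virtual void AddChild(Node child)
    {
        child.Parent = this;
        _children.Add(child);
    }
}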
My mappings are mostly done w/ AutoMap, but I've overridden the following:
mapping.References(x => x.Parent);
mapping.HasMany(x => x.Children).Inverse().Access.LowerCaseField(Prefix.Underscore);
I've also tried it w/o the Inverse() call.
Got it. The problem was that I needed to tell the children collection what Id field to hook into for the foreign key.
I changed that mapping to look like so:
mapping.HasMany(m => m.Children)
    .Inverse()
    .KeyColumn("ParentId")
    .Access.CamelCaseField(Prefix.Underscore)
    .Cascade.All();
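For anyone hitting the same thing: if you also override the Parent side, it presumably needs to point at the same foreign-key column, otherwise you end up with two columns (e.g. Parent_id and ParentId) that never line up. Something like:

mapping.References(x => x.Parent)
    .Column("ParentId");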
Had a question about what best practice might be for the implementation of "convenience" queries. In reference to this article:
http://www.jasongrimes.org/2012/01/using-doctrine-2-in-zend-framework-2/#toc-install-doctrine-modules
It's clear that the entity manager is available in the IndexController - he does a findAll to list the entire contents of the database. What if, however, we added a "band" column to the database, mapped it out, and wanted to query all albums by the Beatles? What if the Beatles albums were used rather often throughout the codebase (weak example, but you get it).
The EM only seems to be available in controllers, and plain classes don't really seem to be aware of the service locator.
Would you simply break out DQL right in the controller, and repeat the DQL in every controller that needs it? (Not very DRY.)
Do we instead finagle some access to the EM from the entity, or the model?
It doesn't seem as cut-and-dried as straight Zend_Db usage, where you can fire queries anywhere you like, cheating to get things done.
Thanks for helping me cross over into a "real" ORM from the Table Gateway world.
Erm, Doctrine 2 is able to handle relationships (e.g. Bands to Albums and vice versa).
The EntityManager can be made available in every single class you wish, as long as you define the class as a service. I.e. inside your Module.php you can define a factory like this:
// Implement \Zend\ModuleManager\Feature\ServiceProviderInterface
public function getServiceConfig() {
    return array(
        // default stuff
        'factories' => array(
            'my-album-service' => function ($sm) {
                $service = new \My\Service\Album();
                $service->setEntityManager($sm->get('doctrine.entitymanager.orm_default'));
                return $service;
            },
        ),
    );
}
You can then fetch this service from any class that is aware of the ServiceManager, like $this->getServiceLocator()->get('my-album-service').
That class would then automatically be injected with the Doctrine EntityManager.
To be clear: all queries you'd do SHOULD be located inside your services. You'd have your entities, which are roughly Doctrine 2's counterpart to a DB mapper, and then you have your services, which run actions like add(), edit(), findAll(), findCustomQuery(), etc.
You would then populate your services with data from the controllers; the service would give data back to the controller, and the controller would pass said data to the view. Does that make sense to you and answer your question?
As you know, in Seam there are no problems with LazyInitializationException when reading an entity's references to sub-objects. So is there any problem if I prefer walking the tree of relations to read the data I need, instead of sending specific queries to the relevant entities' DAOs? Do I break any important guidelines/principles?
Consider that the phrase
"in Seam there are no problems with LazyInitializationException"
is not true as stated.
In Seam there are no problems with LazyInitializationException only if you use a pattern where your persistence context is kept open for the boundaries of a long-running conversation.
This means using a Seam-injected persistence context like:
@In
private EntityManager entityManager;
Or, if you are using stateful EJBs (bound to conversation scope too):
@PersistenceContext(type = PersistenceContextType.EXTENDED)
EntityManager em;
BTW, once you have understood that, there is no problem navigating the relation tree. You really should do it if you want to bind the data to the interface using JSF.
Consider that you may run into performance problems if you access ManyToOne or OneToMany relations on queries that return more than one result. This is known as the n+1 problem: you basically run one extra round trip to the database for each record returned.
Let's summarize:
Single detail object -> navigate the relation tree.
List of objects -> make a single query through the DAO using left join fetch.
I'm developing a web application and I would like caching to persist across web requests.
I am aware that the first level cache is per-session only. I have second-level caching enabled and this is working for queries.
However, the second-level cache does not seem to work for "getting" entities... therefore most of the DB work the application does is not being cached across web requests.
Is this normal / desirable behaviour? I'm reviewing one particular page that makes lots of round trips to the database; although each query is quick, these trips seem unnecessary if the entities could be cached.
Edit
Okay, so I have the second-level cache enabled and working for queries. I just can't seem to get it working for entities. I have Cache.Is(c => c.ReadWrite()) (Fluent NHibernate) on my main entity that I'm testing. But nope, it still hits the DB each time. Any ideas?
Edit
I've tried using transactions like so:
public override Accommodation Get(int id)
{
    using (var tx = Session.BeginTransaction())
    {
        var accomm = Session.Get<Accommodation>(id);
        tx.Commit();
        return accomm;
    }
}
My mapping is such (and you can see we have a nasty schema):
public void Override(AutoMapping<Core.Entities.Itinerary.Accommodation.Accommodation> mapping)
{
    mapping.HasManyToMany(x => x.Features).Table("AccommodationLinkFeatureType").ChildKeyColumn("FeatureTypeId").NotFound.Ignore();
    mapping.HasManyToMany(x => x.SimilarAccommodation).Table("AccommodationLinkSimilarAccommodation").ChildKeyColumn("SimilarAccommodationId").NotFound.Ignore();
    mapping.HasMany(x => x.TourItinerary).Table("AccommodationTourItinerary");
    mapping.HasOne(x => x.Images).ForeignKey("AccommodationId").Cascade.All().Not.LazyLoad();
    mapping.References(x => x.CollectionType).NotFound.Ignore().Not.LazyLoad();
    mapping.References(x => x.AccommodationUnitType).NotFound.Ignore().Not.LazyLoad();
    Cache.Is(c => c.ReadWrite());
}
However, this still doesn't seem to fetch from the 2nd level cache.
Incidentally, I see a lot of examples online using Cache.ReadWrite() but I can only see an Is method on the Cache helper, so I'm trying Cache.Is(c => c.ReadWrite()) -- has the fluent interface changed?
I have not tested this, but my understanding is that committing transactions is the magic that places objects into second level cache. If you are doing read operations outside of a transaction then the objects will not be placed in the second level cache.
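Also, as far as I know the Cache.Is(...) helper is meant for conventions rather than for an individual Override; in the Fluent NHibernate versions I've used, the entity is marked cacheable on the mapping itself, and the provider has to be enabled when the session factory is built. A rough sketch (the provider class, connection string and auto-mapping setup here are placeholders, not your actual configuration):

// Enabling the second-level cache when the session factory is configured.
// SysCacheProvider comes from the NHibernate.Caches.SysCache package; swap in
// whichever cache provider you actually use.
var sessionFactory = Fluently.Configure()
    .Database(MsSqlConfiguration.MsSql2008.ConnectionString("...")) // placeholder
    .Cache(c => c.UseSecondLevelCache()
                 .UseQueryCache()
                 .ProviderClass("NHibernate.Caches.SysCache.SysCacheProvider, NHibernate.Caches.SysCache"))
    .Mappings(m => m.AutoMappings.Add(CreateAutoMappings())) // placeholder for your AutoPersistenceModel
    .BuildSessionFactory();

// And in the override, the entity itself is marked cacheable on the mapping:
public void Override(AutoMapping<Core.Entities.Itinerary.Accommodation.Accommodation> mapping)
{
    mapping.Cache.ReadWrite();
    // ... the rest of the override as before ...
}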
I had the same problem.
In my case the cause was that the References were mapped with NotFound.Ignore() (i.e. if no entity is found for the foreign key, just ignore it, which is actually a data consistency error anyway). Remove NotFound.Ignore and fix your db.
Maybe you're having some problem with the configuration of your cache provider. I've been able to do what you want using Couchbase as the 2nd level cache provider, as described here:
http://blog.couchbase.com/introducing-nhibernate-couchbase-2nd-level-cache-provider
If your deployment environment is on Azure, I guess this might be useful. Note that the SysCache module can't co-exist with the AzureMemcached module.
http://www.webmoco.com/webmoco-development-blog/orchard-cms-second-level-caching
I am having trouble working out how to correctly cache one-to-many or many-to-many relationships in NHibernate.
For example, an office class may have the following mapping:
public OfficeDbMap()
{
    ...
    HasMany(x => x.Employees)
        .Cache.NonStrictReadWrite();
}
However, I find that when I delete an employee (without specifically removing its relationship to the office), the cache of office->employees does not get invalidated and the employee continues to appear in the office's list of employees.
I suspect it may have something to do with cache regions, but I don't know whether the region should be the office's region or the employee's region (actually I have tried specifying both and neither works).
The problem may be the NonStrictReadWrite configuration. You have to use the Read-Write strategy.
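If that is the cause, the change is just the cache usage on the collection mapping; a minimal sketch, assuming the same OfficeDbMap as above:

HasMany(x => x.Employees)
    .Cache.ReadWrite();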
I have a parent class that contains a list of children. I have the parent and child mapped bidirectionally with a has-many and an inverse on the parent, with Cascade.All turned on. If I modify an object in the child list, but no property on the parent, NHibernate does not save the child. If I modify a property on the parent, everything saves fine. Is this by design or is there a special property I need to set?
This might have something to do with the way you are adding the children to the collection. In a bidirectional relationship, you have to manage both sides of the relationship in code. Consider the example from the Fluent NHibernate Getting Started guide. Check the Store entity.
A Store has many Employees. The Staff property of Store is a collection of Employees. The relationship is set up as bidirectional.
Store has the following method
public virtual void AddEmployee(Employee employee)
{
    employee.Store = this;
    Staff.Add(employee);
}
As you can see, the child's reference to its parent (the Store property here) needs to be set to the parent object. If this is not done, NHibernate will not be able to understand who the parent of the child is and cannot automatically save the child when only the child is modified and SaveOrUpdate(parent) is called.
You need to do both.
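For completeness, the two sides of that mapping usually look something like this in Fluent NHibernate (a sketch based on the Getting Started example's names, not the literal mapping from the guide):

// In the Store mapping: the collection is inverse, so the Employee side owns
// the foreign key, and Cascade.All() lets SaveOrUpdate(store) reach changed children.
HasMany(x => x.Staff)
    .Inverse()
    .Cascade.All();

// In the Employee mapping: the reference back to the parent Store.
References(x => x.Store);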
I figured it out. I was testing auditing using various listeners. When I attached to the IFlushEntityListner it caused saves to stop working. Geez, that was frustrating. Thanks everyone!