How to inject the ExecutionContext or the "format" property and use it in my FieldExtractor with Spring Batch

I had to implement a custom FieldExtractor for my job, like:
public class MyFieldExtractor implements FieldExtractor<MyEntity> {
    @Override
    public Object[] extract(MyEntity e) {
        ......
    }
}
In my XML I use it in my custom line aggregator; the "format" property is bound dynamically:
<bean id="myLineAggregator"
      class="org.springframework.batch.item.file.transform.FormatterLineAggregator" scope="step">
    <property name="fieldExtractor">
        <bean class=".........MyFieldExtractor"/>
    </property>
    <property name="format" value="1$01d%#{jobExecutionContext[$jc{filename}].dynamicFormat}"/>
</bean>
I need a way to inject or to get the "format" property in my MyFieldExtractor class, or at least to inject the ExecutionContext into some field.
I tried:
@Value("#{jobExecutionContext[]}")
private ExecutionContext context;
and
@Value("#{jobExecutionContext}")
private ExecutionContext context;
unsuccessfully...
Is it possible?

JobExecutions (as well as StepExecutions) are not directly bindable, so @Value("#{jobExecutionContext}") doesn't work.
You can inject 1$01d%#{jobExecutionContext[$jc{filename}].dynamicFormat} or #{jobExecutionContext[$jc{filename}]} using the @Value() annotation.
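A minimal sketch of that, assuming the extractor is itself declared as a step-scoped bean; the key 'myFileKey' is a stand-in for whatever the question's $jc{filename} placeholder resolves to:
import org.springframework.batch.item.file.transform.FieldExtractor;
import org.springframework.beans.factory.annotation.Value;

public class MyFieldExtractor implements FieldExtractor<MyEntity> {

    // Late-bound from the job execution context; this only works if the bean has scope="step".
    // 'myFileKey' and 'dynamicFormat' mirror the question's setup and are assumptions here.
    @Value("#{jobExecutionContext['myFileKey'].dynamicFormat}")
    private String dynamicFormat;

    @Override
    public Object[] extract(MyEntity e) {
        // dynamicFormat is now available when deciding which fields to emit
        return new Object[] { /* fields derived from e, guided by dynamicFormat */ };
    }
}
For the nested <bean class=".........MyFieldExtractor"> in the XML above, that means adding scope="step" to it as well, since only step-scoped beans see jobExecutionContext at bind time.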

Related

@Resource UserTransaction utx1 and @PersistenceContext programmatically in a Java application server

I am using a GlassFish server with multiple JDBC connection pools, and I want an EntityManagerFactory for every JDBC connection.
Using the code
Map<String, Object> props = new HashMap<>();
props.put(PersistenceUnitProperties.JTA_DATASOURCE, <my jdbc datasource>);
Persistence.createEntityManagerFactory(<persistence unit name>, props);
works fine, but I cannot combine it with the UserTransaction to get transaction begin, commit and rollback.
Furthermore, entityManager.getTransaction().begin, commit and rollback don't work (records get persisted in the database even when an error occurs).
Things work great when I use the UserTransaction and EntityManager injected with
@Resource
UserTransaction utx1;

@PersistenceContext(unitName = <persistence unit name>)
private EntityManager em1;
With the injection, utx1.begin() and utx1.commit() control the entityManager perfectly!
But the problem is that I cannot use the multiple JDBC connection pools that I have set up on my server.
So my question is: can I do programmatically what this injection does?
@Resource
UserTransaction utx1;

@PersistenceContext(unitName = <persistence unit name>)
private EntityManager em1;
Thanks!
I have tried this, but it doesn't work:
@Resource
UserTransaction utx1;

@PersistenceContext(
        properties = {@PersistenceProperty(name = PersistenceUnitProperties.JTA_DATASOURCE, value = "ORCLH_MARMA")},
        unitName = "MINLO")
private EntityManager em1;
You should use CDI @Qualifier annotations and EntityManager producers for that, and use the persistence.xml file to define your multiple persistence units (with the same entities), provided they are prepared for that.
First things first: define as many datasource qualifiers as you need
@Qualifier
@Retention(RUNTIME)
@Target({METHOD, FIELD, PARAMETER, TYPE})
public @interface MainDatabase {
}
and a second one
@Qualifier
@Retention(RUNTIME)
@Target({METHOD, FIELD, PARAMETER, TYPE})
public @interface SecondaryDatabase {
}
Then create an EntityManager producer using the proper datasources:
@ApplicationScoped
public class EntityManagerProducer {

    @Produces @RequestScoped
    @MainDatabase
    @PersistenceContext(unitName = "main-pu-01")
    private EntityManager mainEntityManager;

    @Produces @RequestScoped
    @SecondaryDatabase
    @PersistenceContext(unitName = "secondary-pu-01")
    private EntityManager secondaryEntityManager;
}
In your persistence.xml, define both persistence units for the same entities, but with different names and data-sources:
<persistence version="2.1" ...>
    <persistence-unit name="main-pu-01" transaction-type="JTA">
        <jta-data-source>jdbc/maindatasource</jta-data-source>
        ...
    </persistence-unit>
    <persistence-unit name="secondary-pu-01" transaction-type="JTA">
        <jta-data-source>jdbc/secondarydatasource</jta-data-source>
        ...
    </persistence-unit>
</persistence>
Then, in your EJBs or CDI beans, inject the desired entity manager to perform your operations:
@RequestScoped
public abstract class EntityController<E> {

    @Inject @MainDatabase
    private EntityManager mainEM;

    @Inject @SecondaryDatabase
    private EntityManager secondaryEM;

    public E doSomethingInMain(E e) {
        return mainEM.something...
    }

    public E doSomethingInSecondary(E e) {
        return secondaryEM.something...
    }
}
Be aware that @Transactional operations may not propagate instantly between EntityManagers if they share the same database, unless you tune the caches properly.
I don't know why you need a different user for each datasource, so I suggest you check that you aren't running into an XY problem.

NHibernate: Updating collections during EventListener "PreUpdateEvent"

I'm trying to write audit tracking for NHibernate that hooks into the PreUpdate event. I have an AuditLogEntry class (when, who, etc.) that contains a list of AuditLogEntryDetails (i.e. the individual properties that changed). If I keep the AuditLogEntry class isolated from the entity being audited, my code runs with no errors. However, if I add a list of AuditLogEntry items to the entity being audited, my code throws a
collection [DomainObjects.AuditTracking.AuditLogEntry.Details] was not processed by flush()
assertion failure when I attempt to save the modified list inside the event listener. This only happens when the audited item already has one (or more) AuditLogEntry instances in the list. If there are no entries, a new list is created and added to the entity being audited and this works fine.
Having isolated the issue to the above, it appears to be related to (lazy) loading the existing list in order to add the new AuditLogEntry instance to it. However, I've been unable to progress any further, and adding lazy="false" to the list mapping doesn't appear to help. I'm really in the early days of using NHibernate, having borrowed concepts from both the NH 3.0 Cookbook and this blog post. My code is very similar to that, but attempts to add the audit history to the item being audited in a list (and as such I think I also need to do that in the pre-, rather than post-, update event).
A snapshot of the entity interfaces/classes in question:
public class AuditLogEntry : Entity
{
    public virtual AuditEntryTypeEnum AuditEntryType { get; set; }
    public virtual string EntityFullName { get; set; }
    public virtual string EntityShortName { get; set; }
    public virtual string Username { get; set; }
    public virtual DateTime When { get; set; }
    public virtual IList<AuditLogEntryDetail> Details { get; set; }
}

public interface IAuditTrackedEntity
{
    Guid Id { get; }
    IList<AuditLogEntry> ChangeHistory { get; set; }
}

public class AuditTrackedEntity : StampedEntity, IAuditTrackedEntity
{
    public virtual IList<AuditLogEntry> ChangeHistory { get; set; }
}

public class LookupValue : AuditTrackedEntity
{
    public virtual string Description { get; set; }
}
For the mappings I have:
AuditTrackedEntry.hbm.xml:
<?xml version="1.0" encoding="utf-8" ?>
<hibernate-mapping xmlns="urn:nhibernate-mapping-2.2" assembly="DomainObjects" namespace="DomainObjects.AuditTracking">
<class name="AuditLogEntry">
<id name="Id">
<generator class="guid.comb" />
</id>
<version name="Version" />
<property name="AuditEntryType"/>
<property name="EntityFullName"/>
<property name="EntityShortName"/>
<property name="Username"/>
<property name="When" column="`When`"/>
<list name ="Details" cascade="all">
<key column="AuditLogEntryId"/>
<list-index column="DetailsIndex" base="1"/>
<one-to-many class="AuditLogEntryDetail"/>
</list>
</class>
</hibernate-mapping>
lookupvalue.hbm.xml:
<?xml version="1.0" encoding="utf-8" ?>
<hibernate-mapping xmlns="urn:nhibernate-mapping-2.2" assembly="DomainObjects" namespace="DomainObjects">
<class name="LookupValue">
<id name="Id">
<generator class="guid.comb" />
</id>
<discriminator type="string">
<column name="LookupValueType" unique-key="UQ_TypeName" not-null="true" />
</discriminator>
<version name="Version" />
<property name="Description" unique-key="UQ_TypeName" not-null="true" />
<property name="CreatedBy" />
<property name="WhenCreated" />
<property name="ChangedBy" />
<property name="WhenChanged" />
<list name ="ChangeHistory">
<key column="EntityId"/>
<list-index column="ChangeIndex" base="1"/>
<one-to-many class="DomainObjects.AuditTracking.AuditLogEntry"/>
</list>
</class>
</hibernate-mapping>
The EventListener's PreUpdate handler calls the following code; the lines that cause the problem are called out in the comments near the end of the block.
public void TrackPreUpdate(IAuditTrackedEntity entity, object[] oldState, object[] state, IEntityPersister persister, IEventSource eventSource)
{
    if (entity == null || entity is AuditLogEntry)
        return;

    var entityFullName = entity.GetType().FullName;
    if (oldState == null)
    {
        throw new ArgumentNullException("No old state available for entity type '" + entityFullName +
            "'. Make sure you're loading it into Session before modifying and saving it.");
    }

    var dirtyFieldIndexes = persister.FindDirty(state, oldState, entity, eventSource);
    var session = eventSource.GetSession(EntityMode.Poco);

    AuditLogEntry auditLogEntry = null;
    foreach (var dirtyFieldIndex in dirtyFieldIndexes)
    {
        if (IsIngoredProperty(persister, dirtyFieldIndex))
            continue;

        var oldValue = GetStringValueFromStateArray(oldState, dirtyFieldIndex);
        var newValue = GetStringValueFromStateArray(state, dirtyFieldIndex);
        if (oldValue == newValue)
        {
            continue;
        }

        if (auditLogEntry == null)
        {
            auditLogEntry = new AuditLogEntry
            {
                AuditEntryType = AuditEntryTypeEnum.Update,
                EntityShortName = entity.GetType().Name,
                EntityFullName = entityFullName,
                Username = Environment.UserName,
                //EntityId = entity.Id,
                When = DateTime.Now,
                Details = new List<AuditLogEntryDetail>()
            };

            //**********************
            // The next three lines cause a problem when included,
            // collection [] was not processed by flush()
            //**********************
            if (entity.ChangeHistory == null)
                entity.ChangeHistory = new List<AuditLogEntry>();
            entity.ChangeHistory.Add(auditLogEntry);
            session.Save(auditLogEntry);
        }

        var detail = new AuditLogEntryDetail
        {
            //AuditLogEntryId = auditLogEntry.Id,
            PropertyName = persister.PropertyNames[dirtyFieldIndex],
            OldValue = oldValue,
            NewValue = newValue
        };
        session.Save(detail);
        auditLogEntry.Details.Add(detail);
    }
    session.Flush();
}
As previously stated, in this configuration I get the assertion failure "collection [] was not processed by flush()". If I remove the three lines above and the list mapping in lookupvalue.hbm.xml, then everything works as expected, other than the entity being audited no longer containing a reference to its own audit items.
We were facing a very similar problem, with exactly the same exception, but in a different situation. No solution found yet...
We have an NH event listener implementing IPreUpdateEventListener, with the OnPreUpdate method used for the audit log. Everything is fine when updating simple properties and dirty checking works well, but there are problems with lazy collections. When updating an object that has a lazy collection, accessing any field of that object in the listener's OnPreUpdate method throws the same exception as mentioned above. With lazy set to false, the problem disappears.
So it seems there is some problem with lazy collections (and initializing the collection before saving has no influence). Our problem isn't connected with creating new collection items; just reading an existing object, merely accessing its fields from the event listener, causes the problem.
So in your case, setting lazy to false only for that association might fix the problem, but on the other hand you probably really want the collection to be lazy. It's hard to say whether the problem has a resolution or whether an IInterceptor has to be used instead.
OK, I have found your problem; this line is actually causing it:
Details = new List<AuditLogEntryDetail>()
You can't initialize an empty collection before you save, because the EntityPersister will not persist the collection, but it will complain that the collection has not been processed.
Also, once NHibernate calls event listeners, cascades no longer work (not sure whether this is by design or not). So even though you are adding the detail item to the collection later, you are only calling save on the detail, not on the parent, so the change is not propagated. I would recommend refactoring so that items are completed in this order:
Detail, then save,
AuditLogEntry, then save,
Entity, then update.
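A rough sketch of that ordering inside the listener, reusing the names from the question (treat it as a starting point rather than a drop-in fix):
// 1. build and save the detail first
var detail = new AuditLogEntryDetail
{
    PropertyName = persister.PropertyNames[dirtyFieldIndex],
    OldValue = oldValue,
    NewValue = newValue
};
session.Save(detail);

// 2. then the audit entry, with its Details collection already populated
var auditLogEntry = new AuditLogEntry
{
    AuditEntryType = AuditEntryTypeEnum.Update,
    EntityFullName = entityFullName,
    EntityShortName = entity.GetType().Name,
    Username = Environment.UserName,
    When = DateTime.Now,
    Details = new List<AuditLogEntryDetail> { detail }
};
session.Save(auditLogEntry);

// 3. finally attach the entry to the audited entity, whose own update completes last
if (entity.ChangeHistory == null)
    entity.ChangeHistory = new List<AuditLogEntry>();
entity.ChangeHistory.Add(auditLogEntry);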
I had exactly the same problem while using an EventListener. I was looping through properties one by one to detect changes, which included enumerating collections. However, when I added a check on the collection using NHibernateUtil.IsInitialized(collection), the problem disappeared. I wouldn't catch-and-ignore the AssertionFailure exception, since it might have unknown side effects.
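In the question's loop, that would amount to something like the following guard (a sketch; the state array and index are the ones already used above):
// skip values that are uninitialized lazy collections/proxies so the listener
// never forces them to load during the flush
var value = state[dirtyFieldIndex];
if (!NHibernateUtil.IsInitialized(value))
    continue;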
There's an issue still open for this problem. There's a patch at the end of the topic that solved it for me.
https://nhibernate.jira.com/browse/NH-3226

Not loading associations without proxies in NHibernate

I don't like the idea of proxies and lazy loading. I don't need them; I want pure POCOs, and I want to control the loading of associations explicitly when I need to.
Here is the entity:
public class Post
{
    public long Id { get; set; }
    public long OwnerId { get; set; }
    public string Content { get; set; }
    public User Owner { get; set; }
}
and the mapping:
<class name="Post">
    <id name="Id" />
    <property name="OwnerId" />
    <property name="Content" />
    <many-to-one name="Owner" column="OwnerId" />
</class>
However, if I specify lazy="false" in the mapping, Owner is always eagerly fetched.
I can't remove the many-to-one mapping because that also disables explicit loading and queries like
from x in session.Query<Post>()
where x.Owner.Title == "hello"
select x;
I specified lazy="true" and set the use_proxy_validator property to false, but Owner is still eagerly loaded.
Is there any way to load only the Post entity?
In short, it is not possible with out-of-the-box NH. But here is an attempt at lazy loading without proxies:
http://thinkbeforecoding.com/post/2009/02/07/Lazy-load-and-persistence-ignorance
Set the class User to lazy="false" in the mapping:
<class name="User" table="Users" lazy="false">
Remove the property <property name="OwnerId" />; to get the owner id you can use Owner.Id, which will not trigger a lazy load. Owner will only be loaded if you hit any property other than the id. To get a flat/simple POCO, you can use projections and ResultTransformers.
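An untested sketch of that last suggestion using QueryOver and the AliasToBean transformer (PostDto is an illustrative class, not from the question; requires using NHibernate.Criterion and NHibernate.Transform):
// flat DTO so no Owner association is ever touched
public class PostDto
{
    public long Id { get; set; }
    public string Content { get; set; }
}

// project only the scalar columns of Post into the DTO
PostDto dto = null;
IList<PostDto> posts = session.QueryOver<Post>()
    .SelectList(list => list
        .Select(p => p.Id).WithAlias(() => dto.Id)
        .Select(p => p.Content).WithAlias(() => dto.Content))
    .TransformUsing(Transformers.AliasToBean<PostDto>())
    .List<PostDto>();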
Davy Brion - Must Everything be Virtual with NHibernate

How to Map Enum in NHibernate to Properly Create DB Field on Schema Export?

I've seen several questions related to properly mapping an enum type using NHibernate.
This article by Jeff Palermo showed me how to do that properly by creating a custom type. I use Schema Export to create my DB during my dev cycles, but this method breaks my export statement. Is there a way to specify the type of the column on export?
Here is my enum code:
public enum OperatorCode
{
    CodeA,
    CodeB,
    CodeC,
    CodeD
}
Here is my custom type:
public class OperatorCodeType : EnumStringType
{
    public OperatorCodeType() : base(typeof(OperatorCode), 20)
    {
    }
}
Here is my property in my mapping file:
<property name="OperatorCode" column="OperatorCode" type="OperatorCodeType" />
And finally here is my class declaration for that property:
public virtual OperatorCode OperatorCode { get; set; }
Is it even possible to do this?
I have not tested it, but you can use a nested column declaration within a property to specify the SQL type. Example from the docs:
<property name="Foo" type="String">
<column name="foo" length="64" not-null="true" sql-type="text"/>
</property>
Granted, this is a string, but you may want to try it with the OperatorCodeType type and a column sql-type of text or nvarchar or whatever works.
If you try it, let me know? Not near my dev machine at the moment.
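An untested sketch of what that might look like for the enum property (the length and sql-type values are guesses chosen to match the 20-character EnumStringType above):
<property name="OperatorCode" type="OperatorCodeType">
    <column name="OperatorCode" length="20" sql-type="nvarchar(20)" />
</property>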

NHibernate: how to map a Point?

I have a class which contains a collection of Points (PointF's rather).
I want to be able to persist instances of that class using NHibernate.
My class looks somewhat like this (simplified):
public class MyClass
{
    public IDictionary<string, PointF> Points = new Dictionary<string, PointF>();

    public void AddPoint(string location, PointF position)
    {
        Points.Add(location, position);
    }
}
The mapping of this collection looks like this (simplified):
<map name="Points" table="Locations">
<key column="MyClassId" />
<index column="LocationName" />
<composite-element class="System.Drawing.PointF, System.Drawing">
<property name="X" column="X" />
<property name="Y" column="Y" />
</composite-element>
</map>
The problem now is that NHibernate throws an error while processing the mapping file, since PointF is not a known (mapped) entity.
How can I solve this in the simplest way?
How can I make sure that NHibernate is able to persist my collection of locations (with their coordinates/points)?
The problem is not that you didn't map the type PointF, because you map it as a composite-element, which is correct.
When mapping such types you need to make sure
that the properties are writable (which is luckily the case here)
that it has a default constructor, which is not the case here.
So how should NH create new instances when there is no default constructor? It can't.
Your options are:
implement an interceptor or NH event; I think it is possible to inject code there that creates instances of certain types, but I don't know how.
implement an NH user type (derived from ICompositeUserType), which is not too hard to do
map another type (e.g. a wrapper around PointF), as sketched below
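For that last option, a minimal sketch of such a wrapper (the class name and members are assumptions, not part of the original answer); it can be mapped as the composite-element in place of PointF because it has a default constructor and settable properties:
using System.Drawing;

// wrapper that NHibernate can instantiate and populate, convertible back to PointF
public class StoredPoint
{
    public StoredPoint() { }                 // the default constructor NH needs

    public StoredPoint(PointF point)
    {
        X = point.X;
        Y = point.Y;
    }

    public float X { get; set; }
    public float Y { get; set; }

    public PointF ToPointF()
    {
        return new PointF(X, Y);
    }
}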