Type used in a using statement should be implicitly convertible to IDisposable - using-statement

I have the following logic:
try
{
    using (var contents = new StreamReader(file.InputStream).ReadToEnd())
    {
        var rows = contents.Split(new[] { Environment.NewLine }, StringSplitOptions.None);
        rows.ForEach(r => mids.Add(r.Split(',')[0]));
    }
}
catch (IOException e)
{}
finally
{
    contents = null;
}
In the using statement I get the error from the question title. It probably happens because I call the .ReadToEnd() method.
Without the using statement I would need try/catch/finally for the cleanup (to fix a Veracode resource cleanup issue).
How can I fix this so that I don't need try/catch/finally and can use only the using statement?

So, using should be used with an object that implements the IDisposable interface. You are calling the ReadToEnd method, which returns a string, and contents is not an IDisposable (because string is not).
You should use it like this:
using (var streamReader = new StreamReader(file.InputStream))
{
    var contents = streamReader.ReadToEnd();
    // Some actions
}
You only want to clean up the StreamReader; contents will be collected by the GC when the method finishes because it is a string.
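For completeness, a minimal sketch of how the whole read could look with only the using statement plus the IOException handler from the question (no finally block is needed, because using disposes the reader even when an exception is thrown):
try
{
    using (var streamReader = new StreamReader(file.InputStream))
    {
        var contents = streamReader.ReadToEnd();
        var rows = contents.Split(new[] { Environment.NewLine }, StringSplitOptions.None);
        foreach (var r in rows)
        {
            mids.Add(r.Split(',')[0]);
        }
    }
}
catch (IOException)
{
    // The reader has already been disposed at this point, so no extra cleanup is required.
}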

Pros & cons bean vs SSJS?

I was trying to build a bean that always retrieves the same document (a counter document), gets the current value, increments it and saves the document with the new value. Finally, it should return the value to the calling method, which would give me a new sequential number in my XPage.
Since the Domino objects cannot be serialized or turned into singletons, what's the benefit of creating a bean for this over creating an SSJS function that does the exact same thing?
My bean must make calls to session, database, view and document, which will then be called every time.
The same goes for the SSJS function, except for session and database.
Bean:
public double getTransNo() {
    try {
        Session session = ExtLibUtil.getCurrentSession();
        Database db = session.getCurrentDatabase();
        View view = db.getView("vCount");
        view.refresh();
        doc = view.getFirstDocument();
        transNo = doc.getItemValueDouble("count");
        doc.replaceItemValue("count", ++transNo);
        doc.save();
        doc.recycle();
        view.recycle();
    } catch (NotesException e) {
        e.printStackTrace();
    }
    return transNo;
}
SSJS:
function getTransNo() {
    var view:NotesView = database.getView("vCount");
    var doc:NotesDocument = view.getFirstDocument();
    var transNo = doc.getItemValueDouble("count");
    doc.replaceItemValue("count", ++transNo);
    doc.save();
    doc.recycle();
    view.recycle();
    return transNo;
}
Thank you
Neither piece of code is good (sorry to be blunt).
If you have one document in your view, you don't need a view refresh, which might be queued behind a refresh on another view and be very slow. Presumably you are talking about a single-server solution (since replication of the counter document would certainly lead to conflicts).
What you do in XPages is create a Java class and declare it as an application bean:
public class SequenceGenerator {
    // Error handling is missing in this class
    private double sequence = 0;
    private String docID;

    public SequenceGenerator() {
        // Here you load the current value from the document
        Session session = ExtLibUtil.getCurrentSession();
        Database db = session.getCurrentDatabase();
        View view = db.getView("vCount");
        Document doc = view.getFirstDocument();
        this.sequence = doc.getItemValueDouble("count");
        this.docID = doc.getUniversalId();
        Utils.shred(doc, view); // Shredding currentDatabase isn't a good idea
    }

    public synchronized double getNextSequence() {
        return this.updateSequence();
    }

    private double updateSequence() {
        this.sequence++;
        // If speed is of the essence I would spin out a new thread here
        Session session = ExtLibUtil.getCurrentSession();
        Database db = session.getCurrentDatabase();
        Document doc = db.getDocumentByUNID(this.docID);
        doc.replaceItemValue("count", this.sequence);
        doc.save(true, true);
        Utils.shred(doc);
        // End of the candidate for a thread
        return this.sequence;
    }
}
The problem with the SSJS code: what happens if two users hit it at the same time? At the very least you would need synchronization there too. Using a bean also makes it accessible in EL (you need to watch out not to call it too often). Also, in Java you can defer the write-back to a different thread, or not write it back at all and instead, in your class initialization code, read the view with the actual documents and pick the value from there.
Update: Utils is a class with static methods:
/**
 * Get rid of all Notes objects
 *
 * @param morituri = the one designated to die, read your Caesar!
 */
public static void shred(Base... morituri) {
    for (Base obsoleteObject : morituri) {
        if (obsoleteObject != null) {
            try {
                obsoleteObject.recycle();
            } catch (NotesException e) {
                // We don't care, we want to get
                // rid of it anyway
            } finally {
                obsoleteObject = null;
            }
        }
    }
}

Sorting an ArrayList of NotesDocuments using a CustomComparator

I'm trying to sort a Documents Collection using a java.util.ArrayList.
var myarraylist:java.util.ArrayList = new java.util.ArrayList()
var doc:NotesDocument = docs.getFirstDocument();
while (doc != null) {
    myarraylist.add(doc)
    doc = docs.getNextDocument(doc);
}
The reason I'm trying an ArrayList and not a TreeMap or HashMap is that the field I need for sorting is not unique, which is a limitation of those two objects (I can't create my own key).
The problem I'm facing is in calling the CustomComparator.
Here is how I'm trying to sort my ArrayList:
java.util.Collections.sort(myarraylist, new CustomComparator());
Here is my class:
import java.util.Comparator;
import lotus.notes.NotesException;

public class CustomComparator implements Comparator<lotus.notes.Document> {
    public int compare(lotus.notes.Document doc1, lotus.notes.Document doc2) {
        try {
            System.out.println("Here");
            System.out.println(doc1.getItemValueString("Form"));
            return doc1.getItemValueString("Ranking").compareTo(doc2.getItemValueString("Ranking"));
        } catch (NotesException e) {
            e.printStackTrace();
        }
        return 0;
    }
}
Error:
Script interpreter error, line=44, col=23: Error calling method
'sort(java.util.ArrayList, com.myjavacode.CustomComparator)' on java
class 'java.util.Collections'
Any help will be appreciated.
I tried running your SSJS code in a try-catch block, printing the exception in the catch block, and I got the following message: java.lang.ClassCastException: lotus.domino.local.Document incompatible with lotus.notes.Document
I think you have got incorrect fully qualified class names for Document and NotesException. They should be lotus.domino.Document and lotus.domino.NotesException respectively.
Here is the SSJS from the repeat control:
var docs:NotesDocumentCollection = database.search(query, null, 0);
var myarraylist:java.util.ArrayList = new java.util.ArrayList()
var doc:NotesDocument = docs.getFirstDocument();
while (doc != null) {
    myarraylist.add(doc)
    doc = docs.getNextDocument(doc);
}
java.util.Collections.sort(myarraylist, new com.mycode.CustomComparator());
return myarraylist;
Here is my class:
package com.mycode;

import java.util.Comparator;

public class CustomComparator implements Comparator<lotus.domino.Document> {
    public int compare(lotus.domino.Document doc1, lotus.domino.Document doc2) {
        try {
            // Numeric comparison
            Double num1 = doc1.getItemValueDouble("Ranking");
            Double num2 = doc2.getItemValueDouble("Ranking");
            return num1.compareTo(num2);
            // String comparison
            // return doc1.getItemValueString("Description").compareTo(doc2.getItemValueString("Description"));
        } catch (lotus.domino.NotesException e) {
            e.printStackTrace();
        }
        return 0;
    }
}
Not that this answer is necessarily the best practice for you, but the last time I tried to do the same thing, I realized I could instead grab the documents as a NotesViewEntryCollection, via SSJS:
var col:NotesViewEntryCollection = database.getView("myView").getAllEntriesByKey(mtgUnidVal)
instead of a NotesDocumentCollection. I just ran through each entry, grabbed the UNIDs of those that met my criteria, added them to a java.util.ArrayList(), and then sent it onward to its destination. I was already sorting the documents for display elsewhere, using a column categorized by parent UNID, so this is probably what I should have done first; I'm still on the leading edge of the XPages/Notes learning curve, so every day brings something new.
Again, if your collection can't be expressed as part of a Notes view, sorry; but for those who have a simple approach available, KISS. I remind myself of that frequently.

NHibernate - Handling StaleObjectStateException to always commit client changes - Need advice/recommendation

I am trying to find the perfect way to handle this exception and force client changes to overwrite any other changes that caused the conflict. The approach I came up with is to wrap the call to Session.Transaction.Commit() in a loop. Inside the loop I use a try-catch block and handle each stale object individually: I copy its properties (except the row-version property), refresh the object to get the latest DB data, recopy the original values onto the refreshed object, and then merge. After that I commit again, and if another StaleObjectStateException occurs the same handling applies. The loop keeps going until all conflicts are resolved.
This method is part of a UnitOfWork class. To make it clearer I'll post my code:
// 'Client-wins' rules, any conflicts found will always cause client changes to
// overwrite anything else.
public void CommitAndRefresh() {
    bool saveFailed;
    do {
        try {
            _session.Transaction.Commit();
            _session.BeginTransaction();
            saveFailed = false;
        } catch (StaleObjectStateException ex) {
            saveFailed = true;

            // Get the stale object with client changes
            var staleObject = _session.Get(ex.EntityName, ex.Identifier);

            // Extract the row-version property name
            IClassMetadata meta = _sessionFactory.GetClassMetadata(ex.EntityName);
            string rowVersionPropertyName = meta.PropertyNames[meta.VersionProperty] as string;

            // Store all property values from client changes
            var propertyValues = new Dictionary<string, object>();
            var publicProperties = staleObject.GetType().GetProperties();
            foreach (var p in publicProperties) {
                if (p.Name != rowVersionPropertyName) {
                    propertyValues.Add(p.Name, p.GetValue(staleObject, null));
                }
            }

            // Get the latest data for the stale object from the database
            _session.Refresh(staleObject);

            // Update the data with the original client changes except for the row-version
            foreach (var p in publicProperties) {
                if (p.Name != rowVersionPropertyName) {
                    p.SetValue(staleObject, propertyValues[p.Name], null);
                }
            }

            // Merge
            _session.Merge(staleObject);
        }
    } while (saveFailed);
}
The above code works fine and handles concurrency with the client-wins rule. However, I was wondering if there are any built-in capabilities in NHibernate to do this for me, or if there is a better way to handle it.
Thanks in advance,
What you're describing is a lack of concurrency checking. If you don't use a concurrency strategy (optimistic-lock, version or pessimistic), StaleObjectStateException will not be thrown and the update will simply be issued.
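For example, an entity needs a mapped version property before optimistic concurrency kicks in. A rough sketch using NHibernate's mapping-by-code (the entity and property names are made up, and the mapper API may differ slightly between NHibernate versions):
public class Order
{
    public virtual Guid Id { get; set; }
    public virtual string Status { get; set; }
    public virtual int Version { get; set; }   // row-version column used for optimistic concurrency
}

public class OrderMap : NHibernate.Mapping.ByCode.Conformist.ClassMapping<Order>
{
    public OrderMap()
    {
        Id(x => x.Id);
        Version(x => x.Version, m => { }); // with this mapping, conflicting updates throw StaleObjectStateException
        Property(x => x.Status);
    }
}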
Okay, now I understand your use case. One important point is that the ISession should be discarded after an exception is thrown. You can use ISession.Merge to merge changes between a detached and a persistent object rather than doing it yourself. Unfortunately, Merge does not cascade to child objects, so you still need to walk the object graph yourself. The implementation would look something like:
catch (StaleObjectStateException ex)
{
    if (isPowerUser)
    {
        var newSession = GetSession();
        // Merge will automatically get first
        newSession.Merge(staleObject);
        newSession.Flush();
    }
}

How to iterate on an entire table using NHibernate?

I am looking for a simple NHibernate example that shows me how to iterate over an entire table. Here is what I have so far, but it is not working. I am getting a "System.InvalidOperationException: Operation is not valid due to the current state of the object.". What am I doing wrong?
public IEnumerable<EMPDATA> getEMPData()
{
    using (ISession session = NHibernateHelper.OpenSession())
    {
        IEnumerable<EMPDATA> empData = session.CreateQuery("from EMPDATA").Enumerable<EMPDATA>();
        return empData;
    }
}
public static void Main(System.String[] args)
{
    log.Debug("Entered main");
    Console.WriteLine("Entered main");
    try
    {
        IEMPDataRepository repository = new EMPDataRepository();
        IEnumerable<EMPDATA> iterList = repository.getEMPData();
        while (iterList.GetEnumerator().MoveNext())
        {
            EMPDATA emp = iterList.GetEnumerator().Current;
            log.Debug(emp.EMP_ID);
        }
    }
    catch (System.Exception ex)
    {
        log.Error("Exception occurred reading emp data", ex);
    }
}
Here is my mapping:
You requested an Enumerable result, which probably relies on the session still being open.
Since you dispose the session after returning the Enumerable instance, you have closed the connection to the database before the results are ever read.
EDIT: see NotSupportedException on IQuery's Enumerable when using statelesssession
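One way around that is to materialize the results while the session is still open; a minimal sketch of the repository method under that assumption (same NHibernateHelper and EMPDATA mapping as in the question):
public IList<EMPDATA> getEMPData()
{
    using (ISession session = NHibernateHelper.OpenSession())
    {
        // List() executes the query and loads all rows before the session is disposed,
        // so the returned list can safely be enumerated by the caller afterwards.
        return session.CreateQuery("from EMPDATA").List<EMPDATA>();
    }
}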
Short answer: use .List instead of .Enumerable.
Longer answer:
1. I agree with Phill: this looks like a job for a stored procedure.
2. Diego is (obviously) right, but if I were you I'd use SetFirstResult() and SetMaxResults() in order to control the amount of data you load into memory in each iteration (don't forget to sort by something when using this method, of course).
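As a rough illustration of point 2, a paging sketch under the same assumptions as above (the page size and the order-by column are placeholders):
const int pageSize = 500;
for (int page = 0; ; page++)
{
    IList<EMPDATA> batch = session.CreateQuery("from EMPDATA e order by e.EMP_ID")
        .SetFirstResult(page * pageSize)   // skip the rows already processed
        .SetMaxResults(pageSize)           // load at most one page into memory
        .List<EMPDATA>();
    if (batch.Count == 0)
        break;
    foreach (var emp in batch)
    {
        // process each row here
    }
    session.Clear(); // detach the processed entities so the first-level cache stays small
}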

An NHibernate audit trail that doesn't cause "collection was not processed by flush" errors

Ayende has an article about how to implement a simple audit trail for NHibernate (here) using event handlers.
Unfortunately, as can be seen in the comments, his implementation causes the following exception to be thrown: collection xxx was not processed by flush()
The problem appears to be the implicit call to ToString on the dirty properties, which can cause trouble if the dirty property is also a mapped entity.
I have tried my hardest to build a working implementation but with no luck.
Does anyone know of a working solution?
I was able to solve the same problem using the following workaround: set the processed flag to true on all collections in the current persistence context within the listener:
public void OnPostUpdate(PostUpdateEvent postEvent)
{
    if (IsAuditable(postEvent.Entity))
    {
        //skip application specific code
        foreach (var collection in postEvent.Session.PersistenceContext.CollectionEntries.Values)
        {
            var collectionEntry = collection as CollectionEntry;
            collectionEntry.IsProcessed = true;
        }
        //var session = postEvent.Session.GetSession(EntityMode.Poco);
        //session.Save(auditTrailEntry);
        //session.Flush();
    }
}
Hope this helps.
The fix should be the following. Create a new event listener class and derive it from NHibernate.Event.Default.DefaultFlushEventListener:
[Serializable]
public class FixedDefaultFlushEventListener : DefaultFlushEventListener
{
    private static readonly log4net.ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);

    protected override void PerformExecutions(IEventSource session)
    {
        if (log.IsDebugEnabled)
        {
            log.Debug("executing flush");
        }
        try
        {
            session.ConnectionManager.FlushBeginning();
            session.PersistenceContext.Flushing = true;
            session.ActionQueue.PrepareActions();
            session.ActionQueue.ExecuteActions();
        }
        catch (HibernateException exception)
        {
            if (log.IsErrorEnabled)
            {
                log.Error("Could not synchronize database state with session", exception);
            }
            throw;
        }
        finally
        {
            session.PersistenceContext.Flushing = false;
            session.ConnectionManager.FlushEnding();
        }
    }
}
Register it during NHibernate configuration:
cfg.EventListeners.FlushEventListeners = new IFlushEventListener[] { new FixedDefaultFlushEventListener() };
You can read more about this bug in Hibernate JIRA:
https://hibernate.onjira.com/browse/HHH-2763
The next release of NHibernate should include this fix as well.
This is not easy at all. I wrote something like this, but it is very specific to our needs and not trivial.
Some additional hints:
You can test if references are loaded using
NHibernateUtil.IsInitialized(entity)
or
NHibernateUtil.IsPropertyInitialized(entity, propertyName)
You can cast collections to IPersistentCollection. I implemented an IInterceptor where I get the NHibernate type of each property; I don't know where you can get this when using events:
if (nhtype.IsCollectionType)
{
    var collection = previousValue as NHibernate.Collection.IPersistentCollection;
    if (collection != null)
    {
        // just skip uninitialized collections
        if (!collection.WasInitialized)
        {
            // skip
        }
        else
        {
            // read collections previous values
            previousValue = collection.StoredSnapshot;
        }
    }
}
When you get the update event from NHibernate, the instance is initialized, so you can safely access properties of primitive types. When you want to use ToString, make sure that your ToString implementation doesn't access any referenced entities or collections.
You may use NHibernate metadata to find out whether a type is mapped as an entity or not. This can be useful for navigating your object model. When you reference another entity, you will get additional update events on it when it changes.
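As a rough sketch of that last hint (assuming you have a reference to the ISessionFactory; GetClassMetadata returns null for types that are not mapped):
public static bool IsMappedEntity(ISessionFactory sessionFactory, object value)
{
    // Unmapped types (strings, value types, components, ...) have no class metadata.
    return value != null && sessionFactory.GetClassMetadata(value.GetType()) != null;
}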
I was able to determine that this error is thrown when application code loads a lazy property where the entity has a collection.
My first attempt involved watching for new CollectionEntries (which I never want to process, as there shouldn't actually be any changes) and then marking them as IsProcessed = true so they wouldn't cause problems.
var collections = args.Session.PersistenceContext.CollectionEntries;
var collectionKeys = args.Session.PersistenceContext.CollectionEntries.Keys;
var roundCollectionKeys = collectionKeys.Cast<object>().ToList();
var collectionValuesCount = collectionKeys.Count;

// Application code that loads a lazy property where the entity has a collection

var postCollectionKeys = collectionKeys.Cast<object>().ToList();
var newLength = postCollectionKeys.Count;

if (newLength != collectionValuesCount) {
    foreach (var newKey in postCollectionKeys.Except(roundCollectionKeys)) {
        var collectionEntry = (CollectionEntry)collections[newKey];
        collectionEntry.IsProcessed = true;
    }
}
However, this didn't entirely solve the issue. In some cases I'd still get the exception.
When OnPostUpdate is called, the values in the CollectionEntries dictionary should all already be set to IsProcessed = true. So I decided to do an extra check to see whether the collections that were not processed matched what I expected.
var valuesNotProcessed = collections.Values.Cast<CollectionEntry>().Where(x => !x.IsProcessed).ToList();
if (valuesNotProcessed.Any()) {
    // Assert: valuesNotProcessed.Count() == (newLength - collectionValuesCount)
}
In the cases that my first attempt fixed, these numbers would match exactly. However, in the cases where it didn't work there were extra items already in the dictionary. In my case I could be sure these extra items also wouldn't result in updates, so I could just set IsProcessed = true for all the valuesNotProcessed.
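For what it's worth, that final step described above would look something like this (only safe if you are sure those extra collection entries really carry no changes):
foreach (var entry in valuesNotProcessed)
{
    entry.IsProcessed = true;
}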