Is there a way to disable event firing when doing a list item update from the Managed Client Object Model?
In the server object model, I do the following. However, I cannot find a way of doing the same with the Managed Client Object Model:
class DisabledEventsScope : SPItemEventReceiver, IDisposable
{
    // Boolean to hold the original value of the EventFiringEnabled property
    bool _originalValue;

    public DisabledEventsScope()
    {
        // Save off the original value of EventFiringEnabled
        _originalValue = base.EventFiringEnabled;

        // Set EventFiringEnabled to false to disable it
        base.EventFiringEnabled = false;
    }

    public void Dispose()
    {
        // Set EventFiringEnabled back to its original value
        base.EventFiringEnabled = _originalValue;
    }
}
using (DisabledEventsScope scope = new DisabledEventsScope())
{
    // State-changing operation occurs here.
    spItem.Update();
}
Thanks in advance.
You cannot do it in the client object model; see the MSDN documentation on the SP.List object: http://msdn.microsoft.com/en-us/library/ee554951. But you can develop a custom web service that is called from the client side and disables event firing on the server.
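As a rough sketch of that approach (not a complete solution: the service, method, and parameter names below are made up, and it assumes a custom ASMX service deployed to the SharePoint server that reuses the DisabledEventsScope class from the question):
using System.Web.Services;
using Microsoft.SharePoint;

[WebService(Namespace = "http://example.com/noeventupdate")]
public class NoEventUpdateService : WebService
{
    [WebMethod]
    public void UpdateItemWithoutEvents(string listTitle, int itemId, string fieldName, string value)
    {
        SPWeb web = SPContext.Current.Web;
        SPListItem item = web.Lists[listTitle].GetItemById(itemId);

        // Reuse the DisabledEventsScope from the question so the update
        // does not fire event receivers on the server.
        using (DisabledEventsScope scope = new DisabledEventsScope())
        {
            item[fieldName] = value;
            item.Update();
        }
    }
}
The client would then call this service instead of performing the update through the client object model.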
I'm new to Stack Overflow, although I use it every day. Today I need your help because I can't find this information anywhere.
My question is:
I want to make a service with callbacks to clients, but I don't want to invoke the callback inside the function they call on the service (something like subscriber/publisher).
I want to save the callback instance.
Then I want another call to a function in my service to trigger the callbacks (like this: callbacks.PrintMessage("Message");).
I am saving the callback instance in a static list in a static class.
When calling callback.Function() I'm getting this error: "you are using a Disposed object",
because I'm getting the instance with this: OperationContext.Current.GetCallbackChannel<"callback interface">.
What can I do to save those callback instances?
Thanks a lot.
Pedro
CODE:
// FUNCTION IN MY SERVICE
public void Subscribe()
{
    var callback = OperationContext.Current.GetCallbackChannel<IMonitoringWebServiceCallback>();
    callbacks.Add(callback);

    callback = OperationContext.Current.GetCallbackChannel<IMonitoringWebServiceCallback>();
    AlarmCallbackSingleton.Instance.AddCallback(callback);

    //callback.PrintString("String"); // HERE IT WORKS! BUT I DON'T WANT TO CALL IT HERE!

    alarmInfoHandler = new AlarmInfoEventHandler(AlarmInfoHandler);
    NewAlarmInfo += alarmInfoHandler;
}

// FUNCTION IN THE SAME SERVICE CALLED BY ANOTHER CLIENT
public void PublishAlarm(string alarm)
{
    AlarmInfoEventArgs e = new AlarmInfoEventArgs();
    e.Alarm = alarm;
    NewAlarmInfo(this, e);
}

public void AlarmInfoHandler(object sender, AlarmInfoEventArgs e)
{
    List<IMonitoringWebServiceCallback> callbacks = AlarmCallbackSingleton.Instance.GetCallbacks();

    // EVERYONE THAT SUBSCRIBED SHOULD EXECUTE THIS (HERE I GET THE DISPOSED ERROR)
    callbacks.ForEach(x => x.ShowString("String!"));
}
Ok. I got it! The answer to this question is as simple as this:
When you subscribe to the service, you need to save the OperationContext somewhere (a list, etc.), not the callback object.
Then, when PublishAlarm is called by another client, the event is triggered and you need to get the OperationContext of every client that subscribed.
I saved those objects in a static list (a singleton class) just for the example.
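For the subscribe side, a minimal sketch could look like this (AddOperationContext is a hypothetical method on the singleton that stores OperationContext instances instead of callback channels):
public void Subscribe()
{
    // Save the OperationContext itself instead of the callback channel;
    // the channel can be re-acquired from the context when publishing.
    AlarmCallbackSingleton.Instance.AddOperationContext(OperationContext.Current);

    alarmInfoHandler = new AlarmInfoEventHandler(AlarmInfoHandler);
    NewAlarmInfo += alarmInfoHandler;
}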
Then:
public void AlarmInfoHandler(object sender, AlarmInfoEventArgs e)
{
    var operation = AlarmCallbackSingleton.Instance.operationContext;
    var callback = operation.GetCallbackChannel<IMonitoringWebServiceCallback>();
    callback.ShowAlarm(); // function you want to call
}
Hope this can help!
I was trying to build a bean that always retrieves the same document (a counter document), gets the current value, increments it, and saves the document with the new value. Finally it should return the value to the calling method, which would give me a new sequential number in my XPage.
Since the Domino objects cannot be serialized or made into singletons, what's the benefit of creating a bean to do this over creating an SSJS function that does the exact same thing?
My bean must make calls to the session, database, view, and document, and these calls will be made every time it is used.
The same goes for the SSJS function, except for the session and database.
Bean:
public double getTransNo() {
    double transNo = 0;
    try {
        Session session = ExtLibUtil.getCurrentSession();
        Database db = session.getCurrentDatabase();
        View view = db.getView("vCount");
        view.refresh();
        Document doc = view.getFirstDocument();
        transNo = doc.getItemValueDouble("count");
        doc.replaceItemValue("count", ++transNo);
        doc.save();
        doc.recycle();
        view.recycle();
    } catch (NotesException e) {
        e.printStackTrace();
    }
    return transNo;
}
SSJS:
function getTransNo() {
    var view:NotesView = database.getView("vCount");
    var doc:NotesDocument = view.getFirstDocument();
    var transNo = doc.getItemValueDouble("count");
    doc.replaceItemValue("count", ++transNo);
    doc.save();
    doc.recycle();
    view.recycle();
    return transNo;
}
Thank you
Neither piece of code is good (sorry to be blunt).
If you have only one document in your view, you don't need a view refresh, which might be queued behind a refresh on another view and be very slow. Presumably you are talking about a single-server solution (since replication of the counter document would certainly lead to conflicts).
What you do in XPages is create a Java class and declare it as an application bean:
public class SequenceGenerator {
    // Error handling is missing in this class
    private double sequence = 0;
    private String docID;

    public SequenceGenerator() {
        // Here you load from the document
        Session session = ExtLibUtil.getCurrentSession();
        Database db = session.getCurrentDatabase();
        View view = db.getView("vCount");
        Document doc = view.getFirstDocument();
        this.sequence = doc.getItemValueDouble("count");
        this.docID = doc.getUniversalID();
        Utils.shred(doc, view); // Shredding currentDatabase isn't a good idea
    }

    public synchronized double getNextSequence() {
        return this.updateSequence();
    }

    private double updateSequence() {
        this.sequence++;
        // If speed is of the essence, I would spin out a new thread here
        Session session = ExtLibUtil.getCurrentSession();
        Database db = session.getCurrentDatabase();
        Document doc = db.getDocumentByUNID(this.docID);
        doc.replaceItemValue("count", this.sequence);
        doc.save(true, true);
        Utils.shred(doc);
        // End of the candidate for a thread
        return this.sequence;
    }
}
The problem with the SSJS code: what happens if two users hit it at the same time? At the very least you need synchronization there too. Using a bean also makes it accessible in EL (you need to watch out not to call it too often). In Java you can also defer the write-back to a different thread, or not write it back at all and instead, in your class initialization code, read the view with the actual documents and pick the value from there.
Update: Utils is a class with static methods:
/**
 * Get rid of all Notes objects
 *
 * @param morituri = the ones designated to die, read your Caesar!
 */
public static void shred(Base... morituri) {
    for (Base obsoleteObject : morituri) {
        if (obsoleteObject != null) {
            try {
                obsoleteObject.recycle();
            } catch (NotesException e) {
                // We don't care; we want to get
                // rid of it anyway
            } finally {
                obsoleteObject = null;
            }
        }
    }
}
I have implemented the RIA WCF side to authenticate with Forms Authentication and everything works from the client as expected.
This application should only allow registered users to use it (users are created by an admin; there is no registration page).
My question, then, is what (and where) would be the most efficient way to handle the authentication; the login has to show up at application start-up (unless "remember me" was on and the cookie is still active), and if the user logs out, the app should automatically leave the interface and return to the login form again.
Update (code trimmed for brevity):
Public Class MainViewModel
    ....
    Public Property Content As Object 'DP property

    Private Sub ValidateUser()
        If Not IsUserValid Then Login()
    End Sub

    Private Sub Login()
        'I want that, when the login returns success, it continues
        'navigating to the original content, i.e.
        Dim _content = Me.Content
        Me.Content = Navigate(Of LoginPage)
        If IsUserValid Then Me.Content = _content
    End Sub
End Class
I saw your other question, so I assume you're using MVVM. I accomplish this by creating a RootPage with a grid control and a navigation frame, and I set the RootVisual to the RootPage. I bind the navigation frame's Source to a variable in the RootPageVM; then, in the constructor of RootPageVM, you can set the frame source to either MainPage or LoginPage based on user authentication. RootPageVM can also receive messages to control further navigation, such as logging out.
Using MVVM-Light.
So, in the RootPageView (set as the RootVisual), something like:
public RootPageViewModel()
{
    Messenger.Default.Register<NotificationMessage>
        (this, "NavigationRequest", Navigate);

    if (IsInDesignMode)
    {
    }
    else
    {
        FrameSource =
            WebContext.Current.User.IsAuthenticated ?
            "Home" :
            "Login";
    }
}
And a method for navigation:
private void Navigate(NotificationMessage obj)
{
    FrameSource = obj.Notification;
}
In the LoginViewModel:
if (loginOperation.LoginSuccess)
{
    Messenger.Default.Send
        (new NotificationMessage(this, "Home"), "NavigationRequest");
}
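For completeness, FrameSource can be a plain bindable property on the RootPageViewModel, for example (a sketch assuming MVVM Light's ViewModelBase):
private string frameSource;
public string FrameSource
{
    get { return frameSource; }
    set
    {
        frameSource = value;
        // RaisePropertyChanged comes from MVVM Light's ViewModelBase.
        RaisePropertyChanged("FrameSource");
    }
}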
I've got a Silverlight 4 RIA Services (SP1) app using Entity Frameworks 4 CTP5. I can databind a grid or listbox to the IEnumerable loaded by the domain context and it shows data from the server. Great.
Now I want to create a new instance of MyEntity and add it to the client-side data so that the user can see the newly added entity. MyEntity is a true entity descendant, not a POCO.
The only Add method I can find is domainContext.EntityContainer.GetEntitySet<MyEntity>().Add(newobj)
This does add the new entity to the domain context, and the domainContext.HasChanges does become true, but the new entity doesn't show up in the databound controls.
How do I get the new entity to show up in the databound controls prior to SubmitChanges?
(Probably related to this SO question from years ago that never got an answer)
Here are the server-side declarations of the domain service, as requested:
[EnableClientAccess()]
public class MyDomainService : LinqToEntitiesDomainService<MyObjectContext>
{
    protected override MyObjectContext CreateObjectContext()
    {
        return new MyObjectContext();
    }

    public IQueryable<MyEntity> GetMyEntities()
    {
        return this.ObjectContext.MyEntities;
    }

    public void InsertMyEntity(MyEntity MyEntity)
    {
        // ...
    }

    public void UpdateMyEntity(MyEntity currentMyEntity)
    {
        // ...
    }

    public void DeleteMyEntity(MyEntity MyEntity)
    {
        // ...
    }
}
I've figured this out with a combination of my own trial and error and hints provided by some of the other responses to this question.
The key point I was missing was that it's not enough for the ViewModel to keep track of the DomainContext and hand out query results to the View for databinding. The ViewModel also has to capture and retain the query results if you want entity adds and deletes performed by the ViewModel to appear in the UI before DomainContext.SubmitChanges(). The ViewModel has to apply those adds to the collection view of the query results.
The ViewModel collection property for View databinding. In this case I'm using the Telerik QueryableDomainServiceCollectionView, but other collection views can be used:
public IEnumerable<MyEntity> MyEntities
{
    get
    {
        if (this.view == null)
        {
            DomainContextNeeded();
        }
        return this.view;
    }
}

private void DomainContextNeeded()
{
    this.context = new MyDomainContext();
    var q = context.GetMyEntitiesQuery();
    this.view = new Telerik.Windows.Data.QueryableDomainServiceCollectionView<MyEntity>(context, q);
    this.view.Load();
}
The ViewModel function that adds a new entity for the UI to display:
public void AddNewMyEntity(object selectedNode)
{
    var ent = new MyEntity() { DisplayName = "New Entity" };

    if (selectedNode == null)
    {
        this.view.AddNew(ent);
    }
    else if (selectedNode is MyEntity)
    {
        ((MyEntity)selectedNode).Children.Add(ent);
    }
}
Other responses mentioned ObservableCollection. The query results and the collection view may not return instances of ObservableCollection. They could be just IEnumerables. What is critical is that they implement INotifyCollectionChanged and IEditableCollectionView.
Thanks to those who contributed responses. I've +1'd each response that was helpful, but since none directly solved my problem I couldn't justify marking any as the definitive answer.
Your domainContext will have a property domainContext.MyEntities. Does it not show up in there when you add it?
Bind to that collection or watch that collection for changes.
domainContext.MyEntities.PropertyChanged += MyEventHandler;
I assume you bind your control to the IEnumerable which is provided by LoadOperation<TEntity>.Entities. In that case your binding source is not the DomainContext.GetEntitySet<MyEntity>().
DomainContext.GetEntitySet<MyEntity>() holds all your currently tracked instances of MyEntity, including the one you add with .Add().
LoadOperation<TEntity>.Entities only contains the instances of MyEntity that were actually loaded by your last LoadOperation/Query.
You have two options: Either add the new entity to the ItemsSource-collection for your control (I recommend that) or rebuild the collection with the contents of DomainContext.GetEntitySet<MyEntity>(). That may contain other elements that you have not cleared out before, though.
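As a rough sketch of the first option (this is not from the original post; the names boundEntities, myGrid, and AddMyEntity are made up, and the snippet assumes it lives in the view's code-behind or ViewModel):
private MyDomainContext domainContext = new MyDomainContext();
private ObservableCollection<MyEntity> boundEntities;   // System.Collections.ObjectModel

private void LoadData()
{
    domainContext.Load(domainContext.GetMyEntitiesQuery(), loadOp =>
    {
        // Keep the loaded entities in a collection the control binds to.
        boundEntities = new ObservableCollection<MyEntity>(loadOp.Entities);
        myGrid.ItemsSource = boundEntities;
    }, null);
}

private void AddMyEntity()
{
    var newEntity = new MyEntity();

    // Track the new entity so the next SubmitChanges() inserts it...
    domainContext.MyEntities.Add(newEntity);

    // ...and add it to the bound collection so the UI shows it immediately.
    boundEntities.Add(newEntity);
}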
Ayende has an article about how to implement a simple audit trail for NHibernate (here) using event handlers.
Unfortunately, as can be seen in the comments, his implementation causes the following exception to be thrown: collection xxx was not processed by flush()
The problem appears to be the implicit call to ToString on the dirty properties, which can cause trouble if the dirty property is also a mapped entity.
I have tried my hardest to build a working implementation but with no luck.
Does anyone know of a working solution?
I was able to solve the same problem using the following workaround: set the processed flag to true on all collections in the current persistence context within the listener.
public void OnPostUpdate(PostUpdateEvent postEvent)
{
    if (IsAuditable(postEvent.Entity))
    {
        // skip application specific code
        foreach (var collection in postEvent.Session.PersistenceContext.CollectionEntries.Values)
        {
            var collectionEntry = collection as CollectionEntry;
            collectionEntry.IsProcessed = true;
        }

        //var session = postEvent.Session.GetSession(EntityMode.Poco);
        //session.Save(auditTrailEntry);
        //session.Flush();
    }
}
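For reference, the listener still has to be registered with NHibernate. One way is on the Configuration before building the session factory (a sketch; it assumes the listener class is called AuditEventListener and implements IPostUpdateEventListener):
var cfg = new Configuration();
cfg.Configure();

// Register the auditing listener for post-update events
// (append to the existing array instead if you already have others).
cfg.EventListeners.PostUpdateEventListeners =
    new IPostUpdateEventListener[] { new AuditEventListener() };

var sessionFactory = cfg.BuildSessionFactory();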
Hope this helps.
The fix should be the following. Create a new event listener class and derive it from NHibernate.Event.Default.DefaultFlushEventListener:
[Serializable]
public class FixedDefaultFlushEventListener : DefaultFlushEventListener
{
    private static readonly log4net.ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);

    protected override void PerformExecutions(IEventSource session)
    {
        if (log.IsDebugEnabled)
        {
            log.Debug("executing flush");
        }

        try
        {
            session.ConnectionManager.FlushBeginning();
            session.PersistenceContext.Flushing = true;
            session.ActionQueue.PrepareActions();
            session.ActionQueue.ExecuteActions();
        }
        catch (HibernateException exception)
        {
            if (log.IsErrorEnabled)
            {
                log.Error("Could not synchronize database state with session", exception);
            }
            throw;
        }
        finally
        {
            session.PersistenceContext.Flushing = false;
            session.ConnectionManager.FlushEnding();
        }
    }
}
Register it during NHibernate configuration:
cfg.EventListeners.FlushEventListeners = new IFlushEventListener[] { new FixedDefaultFlushEventListener() };
You can read more about this bug in Hibernate JIRA:
https://hibernate.onjira.com/browse/HHH-2763
The next release of NHibernate should include that fix as well.
This is not easy at all. I wrote something like this, but it is very specific to our needs and not trivial.
Some additional hints:
You can test if references are loaded using
NHibernateUtil.IsInitialized(entity)
or
NHibernateUtil.IsPropertyInitialized(entity, propertyName)
You can cast collections to IPersistentCollection. I implemented an IInterceptor where I get the NHibernate type of each property; I don't know where you can get this when using events:
if (nhtype.IsCollectionType)
{
    var collection = previousValue as NHibernate.Collection.IPersistentCollection;
    if (collection != null)
    {
        // just skip uninitialized collections
        if (!collection.WasInitialized)
        {
            // skip
        }
        else
        {
            // read the collection's previous values
            previousValue = collection.StoredSnapshot;
        }
    }
}
When you get the update event from NHibernate, the instance is initialized, so you can safely access properties of primitive types. When you want to use ToString, make sure that your ToString implementation doesn't access any referenced entities or collections.
You may use NHibernate metadata to find out whether a type is mapped as an entity or not. This can be useful for navigating your object model. When you reference another entity, you will get additional update events on it when it changes.
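As a rough sketch of that metadata check (not from the original answer; the helper name is made up):
private bool IsMappedEntity(ISessionFactory sessionFactory, object value)
{
    if (value == null)
        return false;

    // GetClassMetadata returns null when the type is not a mapped entity.
    return sessionFactory.GetClassMetadata(value.GetType()) != null;
}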
I was able to determine that this error is thrown when application code loads a lazy property where the entity has a collection.
My first attempt involved watching for new CollectionEntries (which I never want to process, as there shouldn't actually be any changes) and then marking them as IsProcessed = true so they wouldn't cause problems.
var collections = args.Session.PersistenceContext.CollectionEntries;
var collectionKeys = args.Session.PersistenceContext.CollectionEntries.Keys;
var roundCollectionKeys = collectionKeys.Cast<object>().ToList();
var collectionValuesCount = collectionKeys.Count;

// Application code that loads a lazy property where the entity has a collection

var postCollectionKeys = collectionKeys.Cast<object>().ToList();
var newLength = postCollectionKeys.Count;

if (newLength != collectionValuesCount) {
    foreach (var newKey in postCollectionKeys.Except(roundCollectionKeys)) {
        var collectionEntry = (CollectionEntry)collections[newKey];
        collectionEntry.IsProcessed = true;
    }
}
However, this didn't entirely solve the issue; in some cases I'd still get the exception.
When OnPostUpdate is called, the values in the CollectionEntries dictionary should all already be set to IsProcessed = true, so I decided to do an extra check to see whether the collections not yet processed matched what I expected.
var valuesNotProcessed = collections.Values.Cast<CollectionEntry>().Where(x => !x.IsProcessed).ToList();

if (valuesNotProcessed.Any()) {
    // Assert: valuesNotProcessed.Count() == (newLength - collectionValuesCount)
}
In the cases that my first attempt fixed, these numbers would match exactly. However, in the cases where it didn't work, there were extra items already in the dictionary. In my case I could be sure these extra items also wouldn't result in updates, so I could just set IsProcessed = true for all of the valuesNotProcessed.
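In code, that last step is just the following (a sketch based on the valuesNotProcessed variable above):
// Mark the remaining unprocessed collection entries as processed so
// flush() does not complain about them. This is only safe because, in
// this scenario, those collections are known to hold no real changes.
foreach (var entry in valuesNotProcessed)
{
    entry.IsProcessed = true;
}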