How to register ORMObjectListener in Intershop7

We have implemented several custom ORM objects in our webshop implementation that have references (dependencies) to Intershop Product system object.
When a user tries to delete a certain product in back-office, it causes problems because references to that product may still exist in our custom objects. Naturally, deleting a product that is referenced from one of our custom objects generates an exception like this:
java.sql.SQLTransactionRollbackException: ORA-02091: transaction rolled back ORA-02292: integrity constraint (INTERSHOP.A1POSTPAIDPRICE_CO_002) violated - child record found
We figured that we could solve this by implementing an ORMObjectListener and overriding the objectDeleting method to delete all the references before the product actually gets deleted.
Intershop cookbook for ORM layer states:
"Instances must implement the interface ORMObjectListener for a given ORM object type and register at the factory. The listener is called when instances of the given type are created, changed or removed."
(https://support.intershop.com/kb/index.php/Display/2G3270#Cookbook-ORMLayer-Recipe:NotificationofPersistentObjectChanges)
However, we cannot find a cookbook for registering the listener at the factory. What do we need to do to register the listener?
Also, if there is some better way for handling dependencies to system objects on our custom objects during delete event, I'm open to suggestions.
UPDATE:
This is the listener class I have tried with so far:
public class ProductDeleteListener implements ORMObjectListener<ProductPO> {

    @Inject
    ProductPOFactory productPOFactory;

    /** The Constant LOGGER. */
    private static final Logger LOGGER = LoggerFactory.getLogger(ProductDeleteListener.class);

    public ProductDeleteListener() {
        productPOFactory.addObjectListener(this, new AttributeDescription[0]);
    }

    @Override
    public boolean isOldStateNeeded() {
        // TODO Auto-generated method stub
        return false;
    }

    @Override
    public void objectChanged(ProductPO object, Map<AttributeDescription, Object> previousValues) {
        if (LOGGER.isDebugEnabled()) {
            LOGGER.debug("PRODUCT LISTENER TEST - CHANGE");
        }
    }

    @Override
    public void objectChanging(ProductPO object, Map<AttributeDescription, Object> previousValues) {
        // TODO Auto-generated method stub
    }

    @Override
    public void objectCreated(ProductPO object) {
        // TODO Auto-generated method stub
    }

    @Override
    public void objectCreating(ProductPO object) {
        // TODO Auto-generated method stub
    }

    @Override
    public void objectDeleted(ORMObjectKey objectKey) {
        // TODO Auto-generated method stub
    }

    @Override
    public void objectDeleting(ProductPO object) {
        if (LOGGER.isDebugEnabled()) {
            LOGGER.debug("PRODUCT LISTENER TEST - PRE DELETE");
        }
    }
}
But it is not working. Nothing gets logged when an object changes or is deleted.

In addition to what Willem Evertse wrote, you need to place your registration code in a class that gets instantiated via the Intershop Component Framework.
implementation.component:
<components xmlns="http://www.intershop.de/component/2010" scope="global">
    <implementation name="ProductDeleteListenerRegistrar"
        class="your.fullyqualified.ProductDeleteRegistrar" start="start" stop="stop"/>
</components>
instances.component:
<components xmlns="http://www.intershop.de/component/2010">
    <instance name="ProductDeleteListenerRegistrarInstance" with="ProductDeleteListenerRegistrar" scope="global"/>
</components>
You need to write a class, e.g. ProductDeleteRegistrar, and provide a start method in which you add the registration calls Willem described. In the stop method you should safely unregister your object listener. Make sure both methods are declared synchronized.
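For illustration, here is a rough sketch of such a registrar (class names are just examples, and it assumes the factory offers a removeObjectListener counterpart to addObjectListener; check the factory API for the exact method):

public class ProductDeleteRegistrar {

    private ProductDeleteListener listener;

    public synchronized void start() {
        // look up the factory as shown in the other answer and register the listener here,
        // instead of letting the listener register itself in its own constructor
        ProductPOFactory factory = (ProductPOFactory) NamingMgr.getInstance().lookupFactory(ProductPO.class);
        listener = new ProductDeleteListener();
        factory.addObjectListener(listener);
    }

    public synchronized void stop() {
        // unregister safely on shutdown (assumes a removeObjectListener method exists on the factory)
        if (listener != null) {
            ProductPOFactory factory = (ProductPOFactory) NamingMgr.getInstance().lookupFactory(ProductPO.class);
            factory.removeObjectListener(listener);
            listener = null;
        }
    }
}

Because of start="start" and stop="stop" in implementation.component, the component framework calls start() when the instance is created and stop() when it is shut down.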

I think registering a listener would be the right approach. Just keep an eye out for performance problems.
You are right that there is no cookbook recipe for this, but here is an example.
Get the factory that you want to receive notifications from; in your case, it is ProductPOFactory:
ProductPOFactory productFactory = (ProductPOFactory) NamingMgr.getInstance().lookupFactory(ProductPO.class);
productFactory.addObjectListener(new MyProductChangeListener());
MyProductChangeListener needs to extend AbstractORMObjectListener<ProductPO>
and implement the method public void objectDeleting(T object)
Every time a product gets deleted your listener will be called, and then you can clean up your custom ORM objects. You can have a look at ImageSetDefinitionPOListener as an example.
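A bare-bones sketch of such a listener (the clean-up method is just a placeholder for whatever removal logic your custom objects need):

public class MyProductChangeListener extends AbstractORMObjectListener<ProductPO> {

    @Override
    public void objectDeleting(ProductPO product) {
        // called before the product row is deleted, i.e. before the foreign-key
        // constraint can be violated, so remove your own references here
        removeCustomReferencesTo(product);
    }

    private void removeCustomReferencesTo(ProductPO product) {
        // placeholder: look up your custom PO factory (e.g. the factory for your
        // A1PostpaidPrice objects) and delete the objects that reference this product
    }
}

Registration then happens exactly as in the two lines above.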

Related

How to create a BaseClass that adds logging messages

I am using Serenity BDD for my automation testing and the Page Object Model for my framework. I have created a BasePage class which will be inherited by all the other pages. I want to minimize the logging messages in the pages by adding all the log.info messages to a central base page. For example, when calling the click() method, I log before and after the click, as shown below in the BasePage class:
public class BasePage extends PageObject {

    private static final Logger log = LogManager.getLogger(BasePage.class);
    private final WebElementFacade element;

    public static void clickBtn(WebElementFacade btnName) {
        log.info("About to click " + btnName + " button");
        btnName.click();
        log.info("Successfully clicked on " + btnName + " button.");
    }
}
Later I figured that, instead of trying to anticipate every action a user might perform on the web elements and writing a new method for each one (like the one shown above), I could just implement the WebElementFacade interface, get the full list of unimplemented methods in BasePage from WebElementFacade, and write the log messages inside each of them, like so:
public class BasePage extends PageObject implements WebElementFacade {

    private static final Logger log = LogManager.getLogger(BasePage.class);
    private final WebElementFacade element;

    @Override
    public void submit() {
        // TODO Auto-generated method stub
    }

    @Override
    public void sendKeys(CharSequence... keysToSend) {
        // TODO Auto-generated method stub
    }

    @Override
    public String getTagName() {
        // TODO Auto-generated method stub
        return null;
    }

    @Override
    public boolean isSelected() {
        // TODO Auto-generated method stub
        return false;
    }

    // ... remaining WebElementFacade methods ...
}
This will serve two purposes:
I will not have to create new methods for every action in the BasePage class, like the clickBtn() method in the first snippet.
As mentioned before, I will not have to guess which methods other people adding to my code might use and then change the BasePage class to create the new actions. So, basically, less maintenance in the long run.
The problem I am facing is an error that I receive in the second use case:
The return types are incompatible for the inherited methods WebElementFacade.withTimeoutOf(int, TimeUnit), PageObject.withTimeoutOf(int, TimeUnit)
Now my question is:
How can I solve this problem?
Is this the right way to do things, or should I go with the first method and accept the maintenance overhead?
I just figured out another scenario where this might be useful: making sure that subclass methods do not use methods from PageObject directly and are forced to use the methods from the base class only. This could be done by wrapping WebElementFacade and adding the log messages as added functionality. Any thoughts on this would be appreciated.
Thank you!
Honestly it is a neat trick and if you get it working you should be proud.
I think I figured something similar out in a dynamic language.
But you are better off just adding the logging entries and learning the following:
How to name functions so you don't feel like they need renaming.
How to log clearly for debugging use.
This is because logging's power is in its flexibility.
When you learn how to dump something complex like a matrix so you can eyeball it and go "uh-oh", you are increasing your overall skills.
I apologize for not giving you code but I felt some "chase the other rabbit" advice was better.

Is there any way to log after save with Spring Data Repository

I'm trying to log some information after saving an entity with a Spring Data Repository. Is there any way to do that?
Thanks!
You can use annotated repository event handlers.
Just add the @RepositoryEventHandler annotation to your handler class, then implement the handler method in this class with the @HandleAfterCreate and @HandleAfterSave annotations.
@RepositoryEventHandler
public class EventHandler {

    @HandleAfterCreate
    @HandleAfterSave
    public void handleAfterCreateOrSave(Entity e) {
        // LOG...
    }
}
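Note that the handler class has to be registered as a Spring bean, and (as far as I know) these events are only published by Spring Data REST, i.e. for changes made through the exposed REST endpoints. A slightly fuller sketch, with EntityEventHandler and Entity standing in for your own names:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.data.rest.core.annotation.HandleAfterCreate;
import org.springframework.data.rest.core.annotation.HandleAfterSave;
import org.springframework.data.rest.core.annotation.RepositoryEventHandler;
import org.springframework.stereotype.Component;

@Component              // registered as a bean so the events are dispatched to it
@RepositoryEventHandler // marks this class as a handler for repository events
public class EntityEventHandler {

    private static final Logger log = LoggerFactory.getLogger(EntityEventHandler.class);

    @HandleAfterCreate
    @HandleAfterSave
    public void handleAfterCreateOrSave(Entity e) { // Entity is a placeholder for your own type
        log.info("Entity saved: {}", e);
    }
}

For saves made directly through the repository (outside REST), JPA lifecycle callbacks such as @PostPersist/@PostUpdate are an alternative.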

How to correctly dispose objects registered with Autofac

I've implemented the Unit of Work/Repository pattern, as described here, but I'm also using Autofac and constructor injection, so I registered the UnitOfWork and DbContext (PsyProfContext) classes like this:
builder.Register(context => new PsyProfContext()).InstancePerHttpRequest();
builder.RegisterType<UnitOfWork>().As<IUnitOfWork>().InstancePerHttpRequest();
And everything works great!
Except for one thing: I'm also using the Enterprise Library logging block, and I have implemented a CustomTraceListener which uses Entity Framework to write log entries into the database.
My controller looks like this (it is empty because at the moment I just wanted to verify that all the pieces (IoC, logging, Entity Framework) are working):
public class HomeController : Controller
{
private readonly UnitOfWork unitOfWork;
public HomeController(IUnitOfWork unitOfWork)
{
this.unitOfWork = (UnitOfWork) unitOfWork;
}
//
// GET: /Home/
public ActionResult Index()
{
throw new HttpException();
return View();
}
protected override void Dispose(bool disposing)
{
unitOfWork.Dispose();
base.Dispose(disposing);
}
}
And in the Write method of the CustomTraceListener class, I've tried to Resolve UnitOfWork:
DependencyResolver.Current.GetService<IUnitOfWork>() as UnitOfWork;
But I get an instance which is already disposed! So I put in some breakpoints and found out that the controller's Dispose method is called before the Write method of the CustomTraceListener class, so in the end I found no other solution than using the DbContext (PsyProfContext) directly:
public override void Write(object o)
{
    using (var context = new PsyProfContext())
    {
        var customLogEntry = o as CustomLogEntry;
        if (customLogEntry != null)
        {
            var logEntry = new LogEntry
            {
                //a bunch of properties
            };
            context.Exceptions.Add(logEntry);
            context.SaveChanges();
        }
    }
}
But I don't like this solution! What's the point of using the UnitOfWork and Repository patterns if you access the DbContext directly? And what's the point of using DI in a project if you create a registered object manually in some cases?
So I wanted to hear your opinion about how to deal with this kind of situation. Is my current implementation fine, or is it definitely wrong and I should think about another one?
Any help will be greatly appreciated and any ideas are welcome!
It looks like you may have a couple of problems.
First, if you're manually disposing the unit of work object in your controller, your controller should take an Owned<IUnitOfWork> in the constructor. When the request lifetime is disposed it will automatically dispose of any IDisposable components - including the controller and any resolved dependencies - unless you specify somehow that you're going to take over ownership of the lifetime. You can do that by using Owned<T>.
public class HomeController : Controller
{
Owned<IUnitOfWork> _uow;
public HomeController(Owned<IUnitOfWork> uow)
{
this._uow = uow;
}
protected override void Dispose(bool disposing)
{
if(disposing)
{
this._uow.Dispose();
}
base.Dispose(disposing);
}
}
(Note a minor logic fix in the Dispose override there - you need to check the value of disposing so you don't double-dispose your unit of work.)
Alternatively, you could register your units of work as ExternallyOwned, like
builder
.RegisterType<UnitOfWork>()
.As<IUnitOfWork>()
.ExternallyOwned()
.InstancePerHttpRequest();
ExternallyOwned also tells Autofac that you'll take control of disposal. In that case, your controller will look like it does already. (Generally I like to just let Autofac do the work, though, and not take ownership if I can avoid it.)
In fact, looking at the way things are set up, you might be able to avoid the disposal problem altogether if you let Autofac do the disposal for you - the call to DependencyResolver would return the unit of work that isn't disposed yet and it'd be OK.
If that doesn't fix it... you may want to add some detail to your question. I see where your controller is using the unit of work class, but I don't see where it logs anything, nor do I see anything in the listener implementation that's using the unit of work.
(Also, as noted in the first comment on your question, in the constructor of your controller you shouldn't be casting your service from IUnitOfWork to UnitOfWork - that's breaking the abstraction that the interface was offering in the first place.)

Looking for a Ninject scope that behaves like InRequestScope

On my service layer I have injected a UnitOfWork and two repositories in the constructor. The Unit of Work and the repositories hold an instance of a DbContext that I want to share between them. How can I do that with Ninject? Which scope should be considered?
I am not in a web application so I can't use InRequestScope.
I am trying to do something similar... I am using DI; however, I need my UoW to be created and disposed like this:
using (IUnitOfWork uow = new UnitOfWorkFactory().Create())
{
_testARepository.Insert(a);
_testBRepository.Insert(b);
uow.SaveChanges();
}
EDIT: I just want to be sure I understand... After looking at https://github.com/ninject/ninject.extensions.namedscope/wiki/InNamedScope I thought about my current console application architecture, which actually uses Ninject.
Let's say:
Class A is a Service layer class
Class B is an unit of work which take into parameter an interface (IContextFactory)
Class C is a repository which take into parameter an interface (IContextFactory)
The idea here is to be able to do context operations on 2 or more repository and using the unit of work to apply the changes.
Class D is a context factory (Entity Framework) which provides an instance (kept in a container) of the context that is shared between Class B and Class C (and would be for other repositories as well).
The context factory keeps the instance in its container, but I don't want to reuse this instance all the time, since the context needs to be disposed at the end of the service operation... is that the main purpose of InNamedScope, actually?
The solution would be something like the following, but I am not sure at all that I am doing it right; the service instances are going to be transient, which means they are actually never disposed?
Bind<IScsContextFactory>()
.To<ScsContextFactory>()
.InNamedScope("ServiceScope")
.WithConstructorArgument(
"connectionString",
ConfigurationUtility.GetConnectionString());
Bind<IUnitOfWork>().To<ScsUnitOfWork>();
Bind<IAccountRepository>().To<AccountRepository>();
Bind<IBlockedIpRepository>().To<BlockedIpRepository>();
Bind<IAccountService>().To<AccountService>().DefinesNamedScope("ServiceScope");
Bind<IBlockedIpService>().To<BlockedIpService>().DefinesNamedScope("ServiceScope");
UPDATE: This approach works against NuGet current, but relies on an anomaly in the InCallScope implementation which has been fixed in the current Unstable NuGet packages. I'll be tweaking this answer in a few days to reflect the best approach after some mulling over. NB the high-level way of structuring stuff will stay pretty much identical; just the exact details of the Bind<DbContext>() scoping will change. (Hint: CreateNamedScope in unstable would work, or one could set up the Command Handler as DefinesNamedScope. The reason I don't just do that is that I want to have something that composes/plays well with InRequestScope.)
I highly recommend reading the Ninject.Extensions.NamedScope integration tests (seriously, find them and read and re-read them)
The DbContext is a Unit Of Work so no further wrapping is necessary.
As you want to be able to have multiple 'requests' in flight and want to have a single Unit of Work shared between them, you need to:
Bind<DbContext>()
.ToMethod( ctx =>
new DbContext(
connectionStringName: ConfigurationUtility.GetConnectionString() ))
.InCallScope();
The InCallScope() means that:
for a given object graph composed for a single kernel.Get() call (hence In Call Scope), everyone that requires a DbContext will get the same instance.
the IDisposable.Dispose() will be called when a Kernel.Release() happens for the root object (or a Kernel.Components.Get<ICache>().Clear() happens for the root if it is not .InCallScope())
There should be no reason to use InNamedScope() and DefinesNamedScope(); You don't have long-lived objects you're trying to exclude from the default pooling / parenting / grouping.
If you do the above, you should be able to:
var command = kernel.Get<ICommand>();
try {
command.Execute();
} finally {
kernel.Components.Get<ICache>().Clear( command ); // Dispose of DbContext happens here
}
The Command implementation looks like:
class Command : ICommand {
readonly IAccountRepository _ar;
readonly IBlockedIpRepository _br;
readonly DbContext _ctx;
public Command(IAccountRepository ar, IBlockedIpRepository br, DbContext ctx){
_ar = ar;
_br = br;
_ctx = ctx;
}
void ICommand.Execute(){
_ar.Insert(a);
_br.Insert(b);
_ctx.SaveChanges();
}
}
Note that in general, I avoid having an implicit Unit of Work in this way, and instead surface its creation and Disposal. This makes a Command look like this:
class Command : ICommand {
readonly IAccountService _as;
readonly IBlockedIpService _bs;
readonly Func<DbContext> _createContext;
public Command(IAccountService @as, IBlockedIpService bs, Func<DbContext> createContext){
_as = @as;
_bs = bs;
_createContext = createContext;
}
void ICommand.Execute(){
using(var ctx = _createContext()) {
_as.InsertA(ctx);
_bs.InsertB(ctx);
ctx.SaveChanges();
}
}
}
This involves no usage of .InCallScope() on the Bind<DbContext>() (but does require the presence of Ninject.Extensions.Factory's FactoryModule to synthesize the Func<DbContext> from a straightforward Bind<DbContext>()).
As discussed in the other answer, InCallScope is not a good approach to solving this problem.
For now I'm dumping some code that works against the latest NuGet Unstable / Include PreRelease / Install-Package -Pre editions of Ninject.Web.Common without a clear explanation. I have started to write a walkthrough of this technique in the Ninject.Extensions.NamedScope wiki's CreateNamedScope/GetScope article.
Possibly some bits will become Pull Request(s) at some stage too (hat tip to @Remo Gloor, who supplied me the outline code). The associated tests and learning tests are in this gist for now, pending packaging in a properly released format TBD.
The exec summary is you Load the Module below into your Kernel and use .InRequestScope() on everything you want created / Disposed per handler invocation and then feed requests through via IHandlerComposer.ComposeCallDispose.
If you use the following Module:
public class Module : NinjectModule
{
public override void Load()
{
Bind<IHandlerComposer>().To<NinjectRequestScopedHandlerComposer>();
// Wire it up so InRequestScope will work for Handler scopes
Bind<INinjectRequestHandlerScopeFactory>().To<NinjectRequestHandlerScopeFactory>();
NinjectRequestHandlerScopeFactory.NinjectHttpApplicationPlugin.RegisterIn( Kernel );
}
}
Which wires in a Factory[1] and NinjectHttpApplicationPlugin that exposes:
public interface INinjectRequestHandlerScopeFactory
{
NamedScope CreateRequestHandlerScope();
}
Then you can use this Composer to Run a Request InRequestScope():
public interface IHandlerComposer
{
void ComposeCallDispose( Type type, Action<object> callback );
}
Implemented as:
class NinjectRequestScopedHandlerComposer : IHandlerComposer
{
readonly INinjectRequestHandlerScopeFactory _requestHandlerScopeFactory;
public NinjectRequestScopedHandlerComposer( INinjectRequestHandlerScopeFactory requestHandlerScopeFactory )
{
_requestHandlerScopeFactory = requestHandlerScopeFactory;
}
void IHandlerComposer.ComposeCallDispose( Type handlerType, Action<object> callback )
{
using ( var resolutionRoot = _requestHandlerScopeFactory.CreateRequestHandlerScope() )
foreach ( object handler in resolutionRoot.GetAll( handlerType ) )
callback( handler );
}
}
The Ninject Infrastructure stuff:
class NinjectRequestHandlerScopeFactory : INinjectRequestHandlerScopeFactory
{
internal const string ScopeName = "Handler";
readonly IKernel _kernel;
public NinjectRequestHandlerScopeFactory( IKernel kernel )
{
_kernel = kernel;
}
NamedScope INinjectRequestHandlerScopeFactory.CreateRequestHandlerScope()
{
return _kernel.CreateNamedScope( ScopeName );
}
/// <summary>
/// When plugged in as a Ninject Kernel Component via <c>RegisterIn(IKernel)</c>, makes the Named Scope generated during IHandlerFactory.RunAndDispose available for use via the Ninject.Web.Common's <c>.InRequestScope()</c> Binding extension.
/// </summary>
public class NinjectHttpApplicationPlugin : NinjectComponent, INinjectHttpApplicationPlugin
{
readonly IKernel kernel;
public static void RegisterIn( IKernel kernel )
{
kernel.Components.Add<INinjectHttpApplicationPlugin, NinjectHttpApplicationPlugin>();
}
public NinjectHttpApplicationPlugin( IKernel kernel )
{
this.kernel = kernel;
}
object INinjectHttpApplicationPlugin.GetRequestScope( IContext context )
{
// TODO PR for TrgGetScope
try
{
return NamedScopeExtensionMethods.GetScope( context, ScopeName );
}
catch ( UnknownScopeException )
{
return null;
}
}
void INinjectHttpApplicationPlugin.Start()
{
}
void INinjectHttpApplicationPlugin.Stop()
{
}
}
}

Decoupling Silverlight client from service reference generated class

I am researching Prism v2 by going through the quickstarts, and I have created a WCF service with the following signature:
namespace HelloWorld.Silverlight.Web
{
[ServiceContract(Namespace = "http://helloworld.org/messaging")]
[AspNetCompatibilityRequirements(RequirementsMode =
AspNetCompatibilityRequirementsMode.Allowed)]
public class HelloWorldMessageService
{
private string message = "Hello from WCF";
[OperationContract]
public void UpdateMessage(string message)
{
this.message = message;
}
[OperationContract]
public string GetMessage()
{
return message;
}
}
}
When I add a service reference to this service in my silverlight project it generates an interface and a class:
[System.ServiceModel.ServiceContractAttribute
(Namespace="http://helloworld.org/messaging",
ConfigurationName="Web.Services.HelloWorldMessageService")]
public interface HelloWorldMessageService {
[System.ServiceModel.OperationContractAttribute
(AsyncPattern=true,
Action="http://helloworld.org/messaging/HelloWorldMessageService/UpdateMessage",
ReplyAction="http://helloworld.org/messaging/HelloWorldMessageService/UpdateMessageResponse")]
System.IAsyncResult BeginUpdateMessage(string message, System.AsyncCallback callback, object asyncState);
void EndUpdateMessage(System.IAsyncResult result);
[System.ServiceModel.OperationContractAttribute(AsyncPattern=true, Action="http://helloworld.org/messaging/HelloWorldMessageService/GetMessage", ReplyAction="http://helloworld.org/messaging/HelloWorldMessageService/GetMessageResponse")]
System.IAsyncResult BeginGetMessage(System.AsyncCallback callback, object asyncState);
string EndGetMessage(System.IAsyncResult result);
}
public partial class HelloWorldMessageServiceClient : System.ServiceModel.ClientBase<HelloWorld.Core.Web.Services.HelloWorldMessageService>, HelloWorld.Core.Web.Services.HelloWorldMessageService
{
// implementation
}
I'm trying to decouple my application by passing around the interface instead of the concrete class. But I'm having difficulty finding examples of how to do this. When I try and call EndGetMessage and then update my UI I get an exception about updating the UI on the wrong thread. How can I update the UI from a background thread?
I tried, but I get an UnauthorizedAccessException: Invalid cross-thread access.
string messageresult = _service.EndGetMessage(result);
Application.Current.RootVisual.Dispatcher.BeginInvoke(() => this.Message = messageresult );
The exception is thrown by Application.Current.RootVisual.
Here is something I like doing... The service proxy is generated with an interface
HelloWorldClient : IHelloWorld
But the problem is that IHelloWorld does not include the Async versions of the method. So, I create an async interface:
public interface IHelloWorldAsync : IHelloWorld
{
void HelloWorldAsync(...);
event System.EventHandler<HelloWorldEventArgs> HelloWorldCompleted;
}
Then, you can tell the service proxy to implement the interface via partial:
public partial class HelloWorldClient : IHelloWorldAsync {}
Because the HelloWorldClient does, indeed, implement those async methods, this works.
Then, I can just use IHelloWorldAsync everywhere and tell the UnityContainer to use HelloWorldClient for IHelloWorldAsync interfaces.
Ok, I have been messing with this all day and the solution is really much simpler than that. I originally wanted to call the methods on the interface instead of the concrete class. The interface generated by the proxy class generator only includes the BeginXXX and EndXXX methods, and I was getting an exception when I called EndXXX.
Well, I just finished reading up on System.Windows.Threading.Dispatcher and I finally understand how to use it. Dispatcher is a member of any class that inherits from DispatcherObject, which the UI elements do. The Dispatcher operates on the UI thread, and for most WPF applications there is only one UI thread. There are exceptions, but I believe you have to create additional UI threads explicitly, so you'll know if you're doing it. Otherwise, you've only got a single UI thread. So it is safe to store a reference to a Dispatcher for use in non-UI classes.
In my case I'm using Prism and my Presenter needs to update the UI (not directly, but it is firing IPropertyChanged.PropertyChanged events). So what I have done is in my Bootstrapper when I set the shell to Application.Current.RootVisual I also store a reference to the Dispatcher like this:
public class Bootstrapper : UnityBootstrapper
{
protected override IModuleCatalog GetModuleCatalog()
{
// setup module catalog
}
protected override DependencyObject CreateShell()
{
// calling Resolve instead of directly initing allows use of dependency injection
Shell shell = Container.Resolve<Shell>();
Application.Current.RootVisual = shell;
Container.RegisterInstance<Dispatcher>(shell.Dispatcher);
return shell;
}
}
Then my presenter has a ctor which accepts IUnityContainer as an argument (using DI) then I can do the following:
_service.BeginGetMessage(new AsyncCallback(GetMessageAsyncComplete), null);
private void GetMessageAsyncComplete(IAsyncResult result)
{
string output = _service.EndGetMessage(result);
Dispatcher dispatcher = _container.Resolve<Dispatcher>();
dispatcher.BeginInvoke(() => this.Message = output);
}
This is sooooo much simpler. I just didn't understand it before.
Ok, so my real problem was how to decouple my dependency upon the proxy class created by my service reference. I was trying to do that by using the interface generated along with the proxy class, which could have worked fine, but then I would have also had to reference the project which owned the service reference, so it wouldn't be truly decoupled. So here's what I ended up doing. It's a bit of a hack, but it seems to be working, so far.
First here's my interface definition and an adapter class for the custom event handler args generated with my proxy:
using System.ComponentModel;
namespace HelloWorld.Interfaces.Services
{
public class GetMessageCompletedEventArgsAdapter : System.ComponentModel.AsyncCompletedEventArgs
{
private object[] results;
public GetMessageCompletedEventArgsAdapter(object[] results, System.Exception exception, bool cancelled, object userState) :
base(exception, cancelled, userState)
{
this.results = results;
}
public string Result
{
get
{
base.RaiseExceptionIfNecessary();
return ((string)(this.results[0]));
}
}
}
/// <summary>
/// Create a partial class file for the service reference (reference.cs) that assigns
/// this interface to the class - then you can use this reference instead of the
/// one that isn't working
/// </summary>
public interface IMessageServiceClient
{
event System.EventHandler<GetMessageCompletedEventArgsAdapter> GetMessageCompleted;
event System.EventHandler<AsyncCompletedEventArgs> UpdateMessageCompleted;
void GetMessageAsync();
void GetMessageAsync(object userState);
void UpdateMessageAsync(string message);
void UpdateMessageAsync(string message, object userState);
}
}
Then I just needed to create a partial class which extends the proxy class generated by the service reference:
using System;
using HelloWorld.Interfaces.Services;
using System.Collections.Generic;
namespace HelloWorld.Core.Web.Services
{
public partial class HelloWorldMessageServiceClient : IMessageServiceClient
{
#region IMessageServiceClient Members
private event EventHandler<GetMessageCompletedEventArgsAdapter> handler;
private Dictionary<EventHandler<GetMessageCompletedEventArgsAdapter>, EventHandler<GetMessageCompletedEventArgs>> handlerDictionary
= new Dictionary<EventHandler<GetMessageCompletedEventArgsAdapter>, EventHandler<GetMessageCompletedEventArgs>>();
/// <remarks>
/// This is an adapter event which allows us to apply the IMessageServiceClient
/// interface to our MessageServiceClient. This way we can decouple our modules
/// from the implementation
/// </remarks>
event EventHandler<GetMessageCompletedEventArgsAdapter> IMessageServiceClient.GetMessageCompleted
{
add
{
handler += value;
EventHandler<GetMessageCompletedEventArgs> linkedhandler = new EventHandler<GetMessageCompletedEventArgs>(HelloWorldMessageServiceClient_GetMessageCompleted);
this.GetMessageCompleted += linkedhandler;
handlerDictionary.Add(value, linkedhandler);
}
remove
{
handler -= value;
EventHandler<GetMessageCompletedEventArgs> linkedhandler = handlerDictionary[value];
this.GetMessageCompleted -= linkedhandler;
handlerDictionary.Remove(value);
}
}
void HelloWorldMessageServiceClient_GetMessageCompleted(object sender, GetMessageCompletedEventArgs e)
{
if (this.handler == null)
return;
this.handler(sender, new GetMessageCompletedEventArgsAdapter(new object[] { e.Result }, e.Error, e.Cancelled, e.UserState));
}
#endregion
}
}
This is an explicit implementation of the event handler so I can chain together the events. When user registers for my adapter event, I register for the actual event fired. When the event fires I fire my adapter event. So far this "Works On My Machine".
Passing around the interface (once you have instantiated the client) should be as simple as using HelloWorldMessageService instead of the HelloWorldMessageServiceClient class.
In order to update the UI you need to use the Dispatcher object. This lets you provide a delegate that is invoked in the context of the UI thread. See this blog post for some details.
You can make this much simpler still.
The reason the proxy works and your copy of the contract does not is because WCF generates the proxy with code that "Posts" the callback back on the calling thread rather than making the callback on the thread that is executing when the service call returns.
A much simplified, untested, partial implementation to give you the idea of how WCF proxies work looks something like:
{
var state = new
{
CallingThread = SynchronizationContext.Current,
Callback = yourCallback,
EndYourMethod = // assign delegate
};
yourService.BeginYourMethod(yourParams, WcfCallback, state);
}
private void WcfCallback(IAsyncResult asyncResult)
{
// Read asyncResult.AsyncState to get the state object back
// Call EndYourMethod and block until it has finished
state.CallingThread.Post(state.Callback, endYourMethodResultValue);
}
The key is storing the SynchronizationContext and calling the Post method. This gets the callback to occur on the same thread as Begin was called on. It will always work without involving the Dispatcher object, provided you call Begin from your UI thread. If you don't, then you are back to square one with using the Dispatcher, but the same problem would occur with a WCF proxy.
This link does a good job of explaining how to do this manually:
http://msdn.microsoft.com/en-us/library/dd744834(VS.95).aspx
Just revisiting old posts left unanswered where I finally found an answer. Here's a post I recently wrote that goes into detail about how I finally handled all this:
http://www.developmentalmadness.com/archive/2009/11/04/mvvm-with-prism-101-ndash-part-6-commands.aspx