Set a property of an object in an Expect.Call - rhino-mocks

It's kind of hard to explain what I'm looking for, but my example should clarify it.
I have the following code:
var schedule = ScheduleUtil.CreateScheduleDto(user, user);
Expect.Call(() => _scheduleRepository.Save(schedule));
Now, what I want is that when this Save call is made, the schedule.Id property gets set to another value (1, for instance).
I do not want to mock schedule. Can this be done? The Save method doesn't return a value, so that's not a possibility, but I do want the schedule object itself to be modified.
UPDATE: Maybe a small example will clarify what exactly I want.
Say there's a class with a method Create:
public void Create(Entity entity)
{
    //entity is saved to the database
    //entity.Id is updated with the id created in the database
}
So, before the Create call, entity.Id is -1; after it, the id is > 0.
Now, there's a service that uses this Create. Code contracts on this service method say that before it is called the entity must have an id equal to -1, and after it is called it must have an id > 0 (preconditions and postconditions).
So, what I need is something like this:
var entity = new Entity(); //id == -1
Expect.Call(() => _instance.Create(entity));
//Now the entity.id should be a random number > 0. This is what I need, to have Rhino Mocks update the id of entity to a given integer. Is this possible?

No. If you're not mocking the _scheduleRepository, Rhino Mocks doesn't know about it. Why don't you want to mock the _scheduleRepository?
EDIT: Ok, now I see what you want to do. Use the "WhenCalled" extension method to define code to be executed when Rhino.Mocks intercepts the call. Something like this should work:
_scheduleRepository.Expect(s => s.Save(schedule)).WhenCalled(a => ((Schedule) a.Arguments[0]).Id = 1);
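Applied to the updated Create example, a minimal sketch using the AAA syntax could look like the following (the IEntityService interface and the stub setup are assumptions for illustration, not part of the original code):
// Arrange: stub the service and define what happens when Create is intercepted
var instance = MockRepository.GenerateStub<IEntityService>(); // IEntityService is hypothetical
var entity = new Entity(); // entity.Id == -1 at this point

instance.Expect(i => i.Create(entity))
    .WhenCalled(invocation => ((Entity)invocation.Arguments[0]).Id = 1);

// Act: the code under test calls Create
instance.Create(entity);

// Assert: Rhino Mocks has set the id, as if the entity had been persisted
Assert.AreEqual(1, entity.Id);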

Related

CakePHP3: Mock methods in integration tests?

I'm new to unit/integration testing and I want to write an integration test for my controller, which, simplified, looks like this:
// ItemsController.php
public function edit() {
    // some edited item
    $itemEntity

    // some keywords
    $keywordEntities = [keyword1, keyword2, ...]

    // save item entity
    if (!$this->Items->save($itemEntity)) {
        // do some error handling
    }

    // add/replace item's keywords
    if (!$this->Items->Keywords->replaceLinks($itemEntity, $keywordEntities)) {
        // do some error handling
    }
}
I have the models Items and Keywords, where Items belongsToMany Keywords. I want to test the error handling parts of the controller, so I have to mock the save() and replaceLinks() methods so that they return false.
My integration test looks like this:
// ItemsControllerTest.php
public function testEdit() {
    // mock save method
    $model = $this->getMockForModel('Items', ['save']);
    $model->expects($this->any())->method('save')->will($this->returnValue(false));

    // call the edit method of the controller and do some assertions...
}
This works fine for the save() method, but not for the replaceLinks() method, obviously because it is not part of the model itself.
I've also tried something like this:
$method = $this->getMockBuilder(BelongsToMany::class)
    ->setConstructorArgs([
        'Keywords', [
            'foreignKey' => 'item_id',
            'targetForeignKey' => 'keyword_id',
            'joinTable' => 'items_keywords'
        ]
    ])
    ->setMethods(['replaceLinks'])
    ->getMock();
$method->expects($this->any())->method('replaceLinks')->will($this->returnValue(false));
But this is also not working. Any hints for mocking the replaceLinks() method?
When doing controller tests, I usually try to mock as little as possible. Personally, if I want to test error handling in controllers, I try to trigger actual errors, for example by providing data that fails application/validation rules. If that is a viable option, then you might want to give it a try.
That being said, mocking the association's method should work the way shown in your example, but you'd also need to replace the actual association object with your mock. Unlike models, associations do not have a global registry in which the mock could be placed so that your application code picks it up without further intervention (that is what getMockForModel() does for models).
Something like this should do it:
$KeywordsAssociationMock = $this
    ->getMockBuilder(BelongsToMany::class) /* ... */;

$associations = $this
    ->getTableLocator()
    ->get('Items')
    ->associations();

$associations->add('Keywords', $KeywordsAssociationMock);
This would modify the Items table object in the table registry and replace its actual Keywords association with the mocked one (the association collection's add() acts more like a setter, i.e. it overwrites). If you use this together with mocking Items, then you must ensure that the Items mock is created beforehand, as otherwise the table retrieved in the above example would not be the mocked one!

belongsTo only being set on first and last member of hasMany

My adapter uses findHasMany to load child records for a hasMany relationship.
My findHasMany adapter method is directly based on the test case for findHasMany. It retrieves the contents of the hasMany on demand, and eventually does the following two operations:
store.loadMany(type, hashes);
// ...
store.loadHasMany(record, relationship.key, ids);
(The full code for the findHasMany is below, in case the issue is there, but I don't think so.)
The really strange behavior is: it seems that somewhere within loadHasMany (or in some subsequent async process) only the first and last child records get their inverse belongsTo property set, even though all the child records are added to the hasMany side. I.e., if posts/1 has 10 comments, this is what I get, after everything has loaded:
var post = App.Posts.find('1');
post.get('comments').objectAt(0).get('post'); // <App.Post:ember123:1>
post.get('comments').objectAt(1).get('post'); // null
post.get('comments').objectAt(2).get('post'); // null
// ...
post.get('comments').objectAt(8).get('post'); // null
post.get('comments').objectAt(9).get('post'); // <App.Post:ember123:1>
My adapter is a subclass of DS.RESTAdapter, and I don't think I'm overloading anything in my adapter or serializer that would cause this behavior.
Has anybody seen something like this before? It's weird enough that I thought someone might know why it's happening.
Extra
Using findHasMany lets me load the contents of the hasMany only when the property is accessed (valuable in my case because calculating the array of IDs would be expensive). So, say I have the classic posts/comments example models; the server returns this for posts/1:
{
    post: {
        id: 1,
        text: "Linkbait!",
        comments: "/posts/1/comments"
    }
}
Then my adapter can retrieve /posts/1/comments on demand, which looks like this:
{
    comments: [
        {
            id: 201,
            text: "Nuh uh"
        },
        {
            id: 202,
            text: "Yeah huh"
        },
        {
            id: 203,
            text: "Nazi Germany"
        }
    ]
}
Here is the code for the findHasMany method in my adapter:
findHasMany: function(store, record, relationship, details) {
    var type = relationship.type;
    var root = this.rootForType(type);
    var url = (typeof(details) == 'string' || details instanceof String) ? details : this.buildURL(root);
    var query = relationship.options.query ? relationship.options.query(record) : {};

    this.ajax(url, "GET", {
        data: query,
        success: function(json) {
            var serializer = this.get('serializer');
            var pluralRoot = serializer.pluralize(root);
            var hashes = json[pluralRoot]; //FIXME: Should call some serializer method to get this?
            store.loadMany(type, hashes);

            // add ids to record...
            var ids = [];
            var len = hashes.length;
            for (var i = 0; i < len; i++) {
                ids.push(serializer.extractId(type, hashes[i]));
            }
            store.loadHasMany(record, relationship.key, ids);
        }
    });
}
Solution
Override the DS.RelationshipChange.getByReference method by inserting the following code into your app:
DS.RelationshipChange.prototype.getByReference = function(reference) {
    var store = this.store;

    // return null or undefined if the original reference was null or undefined
    if (!reference) { return reference; }

    if (reference.record) {
        return reference.record;
    }

    return store.materializeRecord(reference);
};
Yes, this is overriding a private, internal method in Ember Data. Yes, it may break at any time with any update. I'm pretty sure this is a bug in Ember Data, but I'm not 100% certain this is the right solution. But it does solve this problem, and possibly other relationship-related problems.
This fix is designed to be applied to Ember Data master as of 29 Apr 2013.
Reason
DS.Store.loadHasMany calls DS.Model.hasManyDidChange, which retrieves references for all the child records and then sets the hasMany's content to the array of references. This kicks off a chain of observers, eventually calling DS.ManyArray.arrayContentDidChange, in which the first line is this._super.apply(this, arguments);, calling the superclass method Ember.Array.arrayContentDidChange. That Ember.Array method includes an optimization that caches the first and last object in the array and calls objectAt on only those two array members. So there's the part that singles out the first and last record.
Next, since DS.RecordArray implements an objectAtContent method (from Ember.ArrayProxy), the objectAtContent implementation calls DS.Store.recordForReference, which in turn calls DS.Store.materializeRecord. This last function adds a record property to the reference that is passed in as a side effect.
Now we get to what I think is a bug. In DS.ManyArray.arrayContentDidChange, after calling the superclass method, it loops through all the new references and creates a DS.RelationshipChangeAdd instance that encapsulates the owner and child record references. But the first line inside the loop is:
var reference = get(this, 'content').objectAt(i);
Unlike what happens above to the first and last record, this calls objectAt directly on the Ember.NativeArray and bypasses the ArrayProxy methods including the objectAtContent hook, which means that DS.Store.materializeRecord--which adds the record property on the reference object--may have never been called on some references.
Next, the relationship changes created in the loop are immediately afterward (in the same run loop) applied with this call tree: DS.RelationshipChangeAdd.sync -> DS.RelationshipChange.getFirstRecord -> DS.RelationshipChange.getByReference. This last method expects the reference object to have a record property. However, the record property is only set on the first and last reference objects, for reasons explained above. Therefore, for all but the first and last records, the relationship fails to be established because it doesn't have access to the child record object!
The above fix calls DS.Store.materializeRecord whenever the record property doesn't exist on the reference. The last line in the function is the only thing added. On the one hand, it looks like this was the original intention: that var store = this.store; line in the original declares a variable that isn't otherwise used in the function, so what's it there for? Also, without the added line, the function doesn't always return a value, which is a little unusual for a function which is expected to do so. On the other hand, this could lead to mass materialization in some cases where that would be undesirable (but, the relationships just won't work without it in some cases, it seems).
Possibly related
The "chain of observers" I mentioned takes a bit of an odd path. The initiating event was setting the content property on a DS.ManyArray, which extends Ember.ArrayProxy--therefore the content property has a dependent property arrangedContent. Importantly, the observers on arrangedContent are executed before observers on content are executed (see Ember.propertyDidChange). However, the default implementation of Ember.ArrayProxy.arrangedContentArrayDidChange simply calls Ember.Array.arrayContentDidChange, which DS.ManyArray implements! The point being, this looks like a recipe for some code to execute in an unintended order. That is, I think Ember.ManyArray.arrayContentDidChange may getting executed earlier than expected. If this is the case, the above mentioned code that expects the record property to already exist on all references may have been expecting this reasonably, as one of the observers directly on the content property may call DS.Store.materializeRecord on each reference. But I haven't dug deep enough to find out if this is true.

Given a list of objects in C#, push them to RavenDB without knowing which ones already exist

Given 1000 documents with a complex data structure, e.g. a Car class that has three properties: Make, Model, and an Id property.
What is the most efficient way in C# to push these documents to RavenDB (preferably in a batch) without having to query the Raven collection individually to find out which to update and which to insert? At the moment I am going about it like so, which is totally inefficient.
Note: _session is a wrapper around IDocumentSession, where Commit calls SaveChanges and Add calls Store.
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
    var page = 0;
    const int total = 30;
    do
    {
        var paged = sales.Skip(page * total).Take(total);
        if (!paged.Any()) return;

        foreach (var sale in paged)
        {
            var current = sale;
            var existing = _session.Query<Sale>().FirstOrDefault(s => s.Id == current.Id);
            if (existing != null)
                existing = current;
            else
                _session.Add(current);
        }

        _session.Commit();
        page++;
    } while (true);
}
Your session code doesn't seem to track with the RavenDB api (we don't have Add or Commit).
Here is how you do this in RavenDB
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
    foreach (var sale in sales)
        session.Store(sale);

    session.SaveChanges();
}
Your code sample doesn't work at all. The main problem is that you cannot just switch out the references and expect RavenDB to recognize that:
if (existing != null)
    existing = current;
Instead you have to update each property one-by-one:
existing.Model = current.Model;
existing.Make = current.Make;
This is the way you can facilitate change tracking in RavenDB and many other frameworks (e.g. NHibernate). If you want to avoid writing this uninteresting piece of code, I recommend using AutoMapper:
existing = Mapper.Map<Sale>(current, existing);
Another problem with your code is that you use Session.Query where you should use Session.Load. Remember: If you query for a document by its id, you will always want to use Load!
The main difference is that one uses the local cache and the other does not (the same applies to the equivalent NHibernate methods).
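As a small illustration, a sketch against the raw IDocumentSession (rather than the question's wrapper):
// Query goes through an index: it can return stale results and always hits the server
var queried = _session.Query<Sale>().FirstOrDefault(s => s.Id == current.Id);

// Load fetches the document directly by id and is served from the session's cache
// when the document has already been loaded in this session
var loaded = _session.Load<Sale>(current.Id);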
Ok, so now I can answer your question:
If I understand you correctly, you want to save a bunch of Sale instances to your database, where each one should either be added if it doesn't exist yet or updated if it does. Right?
One way is to correct your sample code with the hints above and let it work. However, that will issue one unnecessary request (Session.Load(existingId)) for each iteration. You can easily avoid that if you set up an index that selects the Ids of all documents inside your Sales collection. Before you loop through your items, you can then load all the existing Ids in one go.
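A rough sketch of that idea (the "Sales/Ids" index name is hypothetical, _session stands for the raw IDocumentSession, and the sketch ignores RavenDB's default query page size):
// Load the ids of all existing Sale documents in one round trip via the index
var existingIds = _session.Query<Sale>("Sales/Ids")
    .Select(s => s.Id)
    .ToList();

foreach (var sale in sales)
{
    if (!existingIds.Contains(sale.Id))
        _session.Store(sale); // new document
    // existing documents are updated separately, e.g. after a batched Load
}
_session.SaveChanges();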
However, I would like to know what you actually want to do. What is your domain/use-case?
This is what works for me right now. Note: the InjectFrom method comes from Omu.ValueInjecter (NuGet package).
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
    var ids = sales.Select(i => i.Id);
    var existingSales = _ravenSession.Load<Sale>(ids);
    existingSales.ForEach(s => s.InjectFrom(sales.Single(i => i.Id == s.Id)));

    var existingIds = existingSales.Select(i => i.Id);
    var nonExistingSales = sales.Where(i => !existingIds.Any(x => x == i.Id));
    nonExistingSales.ForEach(i => _ravenSession.Store(i));

    _ravenSession.SaveChanges();
}

How to work around NHibernate caching?

I'm new to NHibernate and was assigned a task where I have to change the value of an entity property and then check whether this new (cached) value is different from the actual value stored in the DB. However, every attempt to retrieve this value from the DB returned the cached value. As I said, I'm new to NHibernate; maybe this is something easy to do and obviously could be done with plain ADO.NET, but the client demands that we use NHibernate for every access to the DB. To make things clearer, these were my "successful" attempts (i.e., no errors):
1
DetachedCriteria criteria = DetachedCriteria.For<User>()
    .SetProjection(Projections.Distinct(Projections.Property(UserField.JobLoad)))
    .Add(Expression.Eq(UserField.Id, userid));

return GetByDetachedCriteria(criteria)[0].Id; //this is the value I want
2
var JobLoadId = DetachedCriteria.For<User>()
    .SetProjection(Projections.Distinct(Projections.Property(UserField.JobLoad)))
    .Add(Expression.Eq(UserField.Id, userid));

ICriteria criteria = JobLoadId.GetExecutableCriteria(NHibernateSession);
var ids = criteria.List();
return ((JobLoad)ids[0]).Id;
Hope I made myself clear; sometimes it is hard to explain a problem when you don't quite understand the underlying framework yourself.
Edit: Of course, this is a method body.
Edit 2: I found out that it doesn't work properly because the method call is inside a transaction context. If I remove the transaction, it works fine, but I need it to be in this context.
I do that by opening a new stateless session to get the actual object from the database:
User databaseuser;
using (IStatelessSession session = SessionFactory.OpenStatelessSession())
{
    databaseuser = session.Get<User>(id);
}
//do your checks
Within a session, NHibernate will return the same object from its Level-1 Cache (aka Identity Map). If you need to see the current value in the database, you can open a new session and load the object in that session.
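As a rough sketch of that approach, reusing names from the question (SessionFactory, userid, and the JobLoad property are assumed to match your mappings, and user is the entity already loaded in the current session):
// The user loaded in the current session keeps its cached (possibly modified) value
var cachedJobLoad = user.JobLoad;

// A second, independent session sees what is actually stored in the database
JobLoad databaseJobLoad;
using (var freshSession = SessionFactory.OpenSession())
{
    databaseJobLoad = freshSession.Get<User>(userid).JobLoad;
}

bool hasChanged = cachedJobLoad.Id != databaseJobLoad.Id;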
I would do it like this:
public class MyObject : Entity
{
    private string myField;

    public string MyProperty
    {
        get { return myField; }
        set
        {
            if (value != myField)
            {
                myField = value;
                DoWhateverYouNeedToDoWhenItIsChanged();
            }
        }
    }
}
From a quick search of NHForge:
http://nhibernate.info/doc/howto/various/finding-dirty-properties-in-nhibernate.html
This may be able to help you.

LINQ to SQL Attach, Update Check set to Never, but still concurrency conflicts

In the dbml designer I've set Update Check to Never on all properties, but I still get an exception when doing Attach: "An attempt has been made to Attach or Add an entity that is not new, perhaps having been loaded from another DataContext. This is not supported." This approach seems to have worked for others on here, but there must be something I've missed.
using (TheDataContext dc = new TheDataContext())
{
    test = dc.Members.FirstOrDefault(m => m.fltId == 1);
}

test.Name = "test2";

using (TheDataContext dc = new TheDataContext())
{
    dc.Members.Attach(test, true);
    dc.SubmitChanges();
}
The error message says exactly what is going wrong: you are trying to attach an object that was loaded from another DataContext, in your case from another instance of the DataContext. Don't dispose your DataContext (at the end of the using statement it gets disposed) before you change values and submit the changes; this should work if everything happens in one using statement. I also see that you want to attach the object again to the Members collection, but it is already in there. There is no need to do that, so this should work just as well:
using (TheDataContext dc = new TheDataContext())
{
    var test = dc.Members.FirstOrDefault(m => m.fltId == 1);
    test.Name = "test2";
    dc.SubmitChanges();
}
Just change the value and submit the changes.
Latest Update:
(Removed all previous 3 updates)
My previous solution (removed it again from this post), found here, is dangerous. I just read this in an MSDN article:
"Only call the Attach methods on new
or deserialized entities. The only way
for an entity to be detached from its
original data context is for it to be
serialized. If you try to attach an
undetached entity to a new data
context, and that entity still has
deferred loaders from its previous
data context, LINQ to SQL will thrown
an exception. An entity with deferred
loaders from two different data
contexts could cause unwanted results
when you perform insert, update, and
delete operations on that entity. For
more information about deferred
loaders, see Deferred versus Immediate
Loading (LINQ to SQL)."
Use this instead:
// Get the object the first time by some id
using (TheDataContext dc = new TheDataContext())
{
    test = dc.Members.FirstOrDefault(m => m.fltId == 1);
}

// Somewhere else in the program
test.Name = "test2";

// Again somewhere else
using (TheDataContext dc = new TheDataContext())
{
    // Create a copy with the id and current values of the 'test' object
    Member modifiedMember = new Member()
    {
        Id = test.Id,
        Name = test.Name,
        Field2 = test.Field2,
        Field3 = test.Field3,
        Field4 = test.Field4
    };

    dc.Members.Attach(modifiedMember, true);
    dc.SubmitChanges();
}
After having copied the object, all references are detached, and all event handlers (deferred loading from the db) are not connected to the new object. Just the value fields are copied to the new object, which can now be safely attached to the Members table. Additionally, you do not have to query the db a second time with this solution.
It is possible to attach entities from another datacontext.
The only thing that needs to be added to the code in the first post is this:
dc.DeferredLoadingEnabled = false;
But this is a drawback since deferred loading is very useful. I read somewhere on this page that another solution would be to set the Update Check on all properties to Never. This text says the same: http://complexitykills.blogspot.com/2008/03/disconnected-linq-to-sql-tips-part-1.html
But I can't get it to work even after setting the Update Check to Never.
This is a function in my Repository class which I use to update entities
protected void Attach(TEntity entity)
{
    try
    {
        _dataContext.GetTable<TEntity>().Attach(entity);
        _dataContext.Refresh(RefreshMode.KeepCurrentValues, entity);
    }
    catch (DuplicateKeyException) //Data context knows about this entity so just update values
    {
        _dataContext.Refresh(RefreshMode.KeepCurrentValues, entity);
    }
}
Where TEntity is your DB class, and depending on your setup you might just want to do:
_dataContext.Attach(entity);
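As a small usage sketch, a public Update method could wrap the Attach helper shown above (the Update name and the SubmitChanges call are assumptions about the rest of the repository, not part of the original code):
public void Update(TEntity entity)
{
    // Attach the detached entity (or refresh it if the context already tracks it)
    Attach(entity);

    // Persist the kept current values to the database
    _dataContext.SubmitChanges();
}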