Synchronizing NHibernate Session with database - the reverse way

I am using NHibernate on a project, and I am an absolute beginner. I fetch some objects from a table and show them in a form where they can be edited. If a user inserts a new object into the table from some other window, I want that newly inserted object to appear in the edit window. My application uses a tabbed window interface, so a user can have the insert window and the edit window open at the same time.
So basically what I need is a way to determine whether the database contains a newly created object that the ISession has not yet fetched, and if so, to fetch that new object from the database. In other words, I need to synchronize my session with the database, just like the Flush method, but in the reverse direction.
Can anyone help me?

The publish/subscribe pattern works well for this. Check out the Publishing Events part of Ayende's sample desktop application. Basically, after you've added a new item you publish that information, and the other parts of your application that subscribed can update their lists accordingly.
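A minimal sketch of the idea, assuming a hand-rolled event aggregator (all names here are illustrative, not part of NHibernate or Ayende's actual sample):

```csharp
using System;
using System.Collections.Generic;

// Minimal publish/subscribe hub: windows subscribe to message types
// they care about; whoever saves a new object publishes a message.
public static class EventHub
{
    private static readonly Dictionary<Type, List<Action<object>>> subscribers =
        new Dictionary<Type, List<Action<object>>>();

    public static void Subscribe<T>(Action<T> handler)
    {
        if (!subscribers.ContainsKey(typeof(T)))
            subscribers[typeof(T)] = new List<Action<object>>();
        subscribers[typeof(T)].Add(o => handler((T)o));
    }

    public static void Publish<T>(T message)
    {
        List<Action<object>> handlers;
        if (subscribers.TryGetValue(typeof(T), out handlers))
            foreach (var h in handlers) h(message);
    }
}

// Insert window, after committing the new row (hypothetical message type):
//   EventHub.Publish(new CustomerCreated { Id = newId });
// Edit window, on load, re-queries its list when notified:
//   EventHub.Subscribe<CustomerCreated>(msg => ReloadList());
```

The edit window then fetches the new object through its own session (e.g. `session.Get`) when the message arrives, rather than trying to make the session discover it by itself.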

You are taking the path to NHibernate Hell.
Be sure to work out your infrastructure (i.e. defining interfaces, session-management patterns and a notification pattern) and isolate these non-business utilities from the rest of your code before using NHibernate to implement them.
Good luck.

Related

How to explicitly call TIBDataSet.RefreshSQL

I have a list of records in a TIBDataSet (Embarcadero Delphi) and I need to locate and modify one record in that list. There is a chance that the underlying database record has been changed by other queries and operations since the TIBDataSet was opened. Therefore I would like to call RefreshSQL for this one record (to get the latest data) before making any changes and before posting. Is it possible to do so, and how?
I am not concerned about the state of the other records, and I am sure that the record under consideration will always be updated and those updates committed before I need to change it from the TIBDataSet.
As far as I understand, RefreshSQL is used for the automatic retrieval of changes after the TIBDataSet has posted updates to the database. But I need manual (explicit) retrieval of the latest state before doing updates.
Try adding a TButton to your form and add the following code to its OnClick handler:
procedure TForm1.btnRefreshClick(Sender: TObject);
begin
  IBQuery1.Refresh; // or whatever your IBX dataset is called
end;
and set a breakpoint on it.
Then run your app and another one (e.g. a second instance of it), change a row in the second app, and commit it back to the db.
Navigate to the changed row in your app, click btnRefresh, and use the debugger to trace execution.
You'll find that TDataSet.Refresh calls its InternalRefresh, which in turn calls TIBCustomDataSet.InternalRefresh. That calls the inherited InternalRefresh, which does nothing, followed by TIBCustomDataSet.InternalRefreshRow. If you trace into that, you'll find that it constructs a temporary IB query to retrieve the current row from the server, which should give you what you want before making changes yourself.
So that should do what you want. The problem is that it can be thoroughly confusing to monitor the data in two applications, because they may be in different transaction states. So you are rather dependent on other users' apps "playing the transactional game" with you, so that everyone sees a consistent view of the data.

Concurrency violation while updating and deleting newly added rows

I've been developing a CRUD application using DataSets in C# and SQL Server 2012. The project is basically an agenda which holds information about Pokémon (name, abilities, types, image, etc.).
For the past few months I've been facing a problem related to concurrency violations. In other words, when I try to delete or update rows that I've just added during the same run of the program, a concurrency exception is thrown and it becomes impossible to perform any other changes in the database, so I need to restart the program to be able to make changes again. (Important note: this exception only happens for new rows added through C#.)
I've been looking for a solution to this violation (without using Entity Framework or LINQ to SQL), but I couldn't find anything that I could add to the C# source code. Does anyone know how to handle this? What should I implement in my source code? Is there anything to do in SQL Server that could help?
Here is a link to my project, a backup from the database, images of the main form and the database diagram:
http://www.mediafire.com/download.php?izkat44a0e4q8em (C# source code)
http://www.mediafire.com/download.php?rj2e118pliarae2 (Sql backup)
http://imageshack.us/a/img708/3823/pokmonform.png (Main Form)
http://imageshack.us/a/img18/9546/kantopokdexdiagram.png (Database Diagram)
I have looked at your code and it seems that you use AcceptChanges on the datatable daKanto inconsistently. In fact you use AcceptChangesDuringUpdate, which is also fine, although I prefer to call datatable.AcceptChanges() explicitly after the update; your way is fine too.
Anyway, I have noticed that you use AcceptChangesDuringUpdate in the Delete_Click and Update__Click methods but not in Save_Click, and I also think you should use AcceptChangesDuringFill in MainForm_Load, where you fill your datasets.
I cannot guarantee that it will help, but I know that uniform data access throughout an application reduces the risk of unexpected data-consistency errors.
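To make the consistent pattern concrete, here is a hedged sketch (the adapter and table names `daKanto`/`dtPokemon` stand in for whatever your project actually uses; it needs a reachable SQL Server to run):

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch: apply the same AcceptChanges policy everywhere.
static void Load(SqlDataAdapter daKanto, DataTable dtPokemon)
{
    daKanto.AcceptChangesDuringFill = true;   // fetched rows arrive as Unchanged
    daKanto.Fill(dtPokemon);
}

static void Save(SqlDataAdapter daKanto, DataTable dtPokemon)
{
    daKanto.AcceptChangesDuringUpdate = true; // rows become Unchanged after Update
    daKanto.Update(dtPokemon);
    // Note: if the table uses an identity primary key, make sure the
    // generated key is refreshed back into the DataTable (e.g. by a
    // "SELECT SCOPE_IDENTITY()" step in the insert command). If it is not,
    // a later UPDATE or DELETE on that row targets a key value that does
    // not exist, which surfaces as a DBConcurrencyException - exactly the
    // "only new rows fail" symptom described in the question.
}
```

The identity-refresh point is the usual culprit when the exception occurs only for rows added during the current run.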

Storing default instances of an NSManagedObject in every new file

I have a core data document based application. Part of my model works by having a DeviceType table, and a Devices table with a relation between them. I would like my application to be able to store the list of DeviceTypes separately from each file, and possibly be able to sync that to a server later.
What would be the best way to accomplish this?
Thanks,
Gabe
You're using a lot of database terminology with Core Data. You should break that habit as soon as possible (the reasons why are given in the introductory paragraphs to the Core Data Programming Guide).
I assume your "usually-static" device list is something you want to be able to update as new devices come out? I would actually recommend just storing the list as a PLIST resource in your app bundle and pushing an update to the app when new devices come out (for simplicity). Using a dictionary-based PLIST, your keys can be device IDs and that key can be a simple string attribute of your managed objects. It's perfectly reasonable to look things up outside your Core Data model based on some ID.
If you must update, I'd still include the "default" list with the app (see above) but if a ".devicelist" (or whatever) file is present in the documents folder, use that instead. That way you can periodically check for an updated list and download it to the docs folder if it differs.
If I've misunderstood you, I encourage you to clarify either by editing your question or posting comments.

What is the best option, transaction locking for distributed systems?

I am using NHibernate and am really new to it. My dilemma is this: when
I open a web browser, it shows the table data. Meanwhile another person opens another browser and reads the same existing data from the database.
I then make changes in my pages and save, and the other user saves his changes afterwards. When I reload the page, I no longer see my data but his: his changes were the latest, and mine were overwritten.
How can I avoid this issue?
You need to implement optimistic concurrency control: http://nhibernate.info/doc/nhibernate-reference/transactions.html#transactions-optimistic
The most performant way is to add a <version> element to your entity mappings (see http://nhibernate.info/doc/nhibernate-reference/mapping.html#mapping-declaration-version)
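For example, a versioned mapping might look like the following (a sketch only; the class, property, and column names are placeholders, not taken from the question):

```xml
<class name="Customer" table="Customer">
  <id name="Id" column="Id">
    <generator class="native" />
  </id>
  <!-- NHibernate increments this on every update and appends
       "WHERE Version = ?" to the UPDATE statement; if another user
       saved first, zero rows match and NHibernate throws
       StaleObjectStateException instead of silently overwriting. -->
  <version name="Version" column="Version" />
  <property name="Name" />
</class>
```

The entity then needs a matching `Version` property (typically an `int`), and your save code catches `StaleObjectStateException` to tell the user their copy is out of date.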

Writing to one db while reading from another using DevExpress XPO

Does anyone have any experience with working with DevExpress' XPO in an environment where the DB is replicated? From my previous question here and one on serverfault, I think it's been decided that replication is the way to go.
The MySQL docs say that all writes need to happen on the master, and all reads have to come from the slave. This makes sense, but now it's a matter of setting up XPO to write to the master (far away), but read from the slave (local).
I received a good response on the DevExpress forums about how it could be done, which I intend to attempt, but I'm wondering if anyone HAS done it, and any insights/gotchas/references they would have.
EDIT: since you don't like the first approach, here are some master-master replication links in case you haven't seen them.
http://forums.mysql.com/read.php?144,235807,235807
http://code.google.com/p/mysql-master-master/
http://www.mysqlperformanceblog.com/2007/04/05/mysql-master-master-replication-manager-released/
http://www.howtoforge.com/mysql_master_master_replication
Some potential wikipedia entries.
http://en.wikipedia.org/wiki/Replication_%28computer_science%29#Database_replication
http://en.wikipedia.org/wiki/Multi-master_replication
Mysql Replication Solutions (Cached from google, the original link is now dead for some reason)
Have you tried the method suggested on the DevExpress forum yet? That's how I would do it.
From Alian Bismark Here
1. Create SessionA.
2. Call SessionA.Disconnect(), set SessionA's connection string, and call SessionA.Connect().
3. Create SessionB.
4. Call SessionB.Disconnect(), set SessionB's connection string, and call SessionB.Connect().
5. Load objects from SessionA, using XPCollection auxL = new XPCollection(SessionA).
6. Create objects in SessionB, using B b = new B(SessionB).
7. Assign the fields from object a to object b.
8. Save object b.
This approach works well with basic objects; if you have relationships etc., you need to resolve the object references in session B using the info from the objects in session A.
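The steps above might look roughly like this in code. This is a sketch only: `B` is the persistent class from the answer, the connection strings are placeholders for your master/slave setup, and exact XPO member signatures may differ by version.

```csharp
using DevExpress.Xpo;

// Session A reads from the (local) slave.
var sessionA = new Session();
sessionA.Disconnect();
sessionA.ConnectionString = "slave-connection-string";   // placeholder
sessionA.Connect();

// Session B writes to the (remote) master.
var sessionB = new Session();
sessionB.Disconnect();
sessionB.ConnectionString = "master-connection-string";  // placeholder
sessionB.Connect();

// Copy each object read through A into a new object owned by B.
var auxL = new XPCollection(sessionA, typeof(B));
foreach (B a in auxL)
{
    var b = new B(sessionB);
    b.SomeField = a.SomeField;   // copy plain fields one by one
    // References must be re-resolved against sessionB, e.g. by key:
    // b.Parent = sessionB.GetObjectByKey<Parent>(a.Parent.Oid);
    b.Save();
}
```

The key point is that an XPO object belongs to the session that created it, which is why fields are copied into a fresh object rather than saving the session-A object through session B.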