Writing to one db while reading from another using DevExpress XPO - replication

Does anyone have any experience with working with DevExpress' XPO in an environment where the DB is replicated? From my previous question here and one on serverfault, I think it's been decided that replication is the way to go.
The MySQL docs say that all writes need to happen on the master, and all reads have to come from the slave. This makes sense, but now it's a matter of setting up XPO to write to the master (far away), but read from the slave (local).
I received a good response on the DevExpress forums about how it could be done, which I intend to attempt, but I'm wondering if anyone HAS done it, and any insights/gotchas/references they would have.

EDIT: Since you don't like the first approach, here are some master-master replication links, in case you haven't seen them.
http://forums.mysql.com/read.php?144,235807,235807
http://code.google.com/p/mysql-master-master/
http://www.mysqlperformanceblog.com/2007/04/05/mysql-master-master-replication-manager-released/
http://www.howtoforge.com/mysql_master_master_replication
Some potentially relevant Wikipedia entries:
http://en.wikipedia.org/wiki/Replication_%28computer_science%29#Database_replication
http://en.wikipedia.org/wiki/Multi-master_replication
MySQL Replication Solutions (cached from Google; the original link is now dead)
Have you tried the method suggested on the DevExpress forum yet? That's how I would do it.
From Alian Bismark's answer there:
1. Create SessionA.
2. Call SessionA.Disconnect(), set SessionA's ConnectionString, and call SessionA.Connect().
3. Create SessionB.
4. Call SessionB.Disconnect(), set SessionB's ConnectionString, and call SessionB.Connect().
5. Load objects from SessionA, using XPCollection auxL = new XPCollection(SessionA).
6. Create objects in SessionB, using B b = new B(SessionB).
7. Assign fields from object a to object b.
8. Save object b.
This approach works well with basic objects; if you have relationships etc., you need to re-resolve the object references in SessionB using the information from the SessionA objects.
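For illustration, here is a minimal sketch of that recipe, assuming the Session.Disconnect()/Connect() methods and ConnectionString property described in the forum post; the persistent class MyObject and both connection strings are hypothetical placeholders:

    using DevExpress.Xpo;

    // Hypothetical persistent class, mapped identically on both servers.
    public class MyObject : XPObject
    {
        public MyObject(Session session) : base(session) { }

        private string name;
        public string Name
        {
            get { return name; }
            set { SetPropertyValue("Name", ref name, value); }
        }
    }

    public static class Replicator
    {
        public static void CopySlaveToMaster()
        {
            var readSession = new Session();                 // reads: local slave
            readSession.Disconnect();
            readSession.ConnectionString = "XpoProvider=MySql;Server=local-slave;...";    // hypothetical
            readSession.Connect();

            var writeSession = new Session();                // writes: remote master
            writeSession.Disconnect();
            writeSession.ConnectionString = "XpoProvider=MySql;Server=remote-master;...";  // hypothetical
            writeSession.Connect();

            // Load objects from the slave, then re-create and save each one on the master.
            var source = new XPCollection<MyObject>(readSession);
            foreach (MyObject a in source)
            {
                var b = new MyObject(writeSession) { Name = a.Name };  // copy plain fields
                // Any reference properties must be re-resolved against writeSession here.
                b.Save();
            }
        }
    }

The key design point is that each XPO object belongs to exactly one session, so data crossing the master/slave boundary has to be copied field by field rather than shared between sessions.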

Related

How to manually add a user in IBM Cloudant?

I have a Cloudant database with a lot of deleted docs. Since they can't be destroyed, I would like to make a filtered copy of the non-deleted items into a temporary database, destroy the original one, and copy the temporary database into a fresh database with the same name as before.
The problem is that when I destroy the database, the API keys generated for it are also destroyed, so the front-end app calling the new database can't access it!
I would like to manually create a user/password, so I can recreate the same user each time I destroy the database.
I don't know how to do that. Or is there another way to achieve my goal?
To answer your actual question, you can't add "users" to a Cloudant account, only databases. You can, however, make API-keys that span multiple databases, which sounds like it could be what you want:
https://dx13.co.uk/articles/2016/04/11/using-a-cloudant-api-key-with-multiple-cloudant-databases-and-accounts/
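As a rough illustration of that approach, here is a sketch of granting one existing API key access to several databases through Cloudant's _security endpoint; the account name, databases, key, and owner credentials below are all hypothetical. Because the key exists independently of any one database, recreating a database only means re-applying its _security document, not generating a new key:

    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;

    // Grants one API key _reader/_writer access on several databases via the
    // _security endpoint. Account, databases, key, and credentials are hypothetical.
    class GrantApiKey
    {
        static async Task Main()
        {
            var http = new HttpClient { BaseAddress = new Uri("https://myaccount.cloudant.com/") };
            var owner = Convert.ToBase64String(Encoding.ASCII.GetBytes("myaccount:ownerpassword"));
            http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", owner);

            const string apiKey = "exampleapikey";   // an existing API key
            foreach (var db in new[] { "mydb", "mydb-temp" })
            {
                var body = "{\"cloudant\": {\"" + apiKey + "\": [\"_reader\", \"_writer\"]}}";
                var resp = await http.PutAsync(
                    "_api/v2/db/" + db + "/_security",
                    new StringContent(body, Encoding.UTF8, "application/json"));
                resp.EnsureSuccessStatusCode();      // expect 200 on success
            }
        }
    }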
But as was noted by bessbd above, if your data model relies on document deletion, you're working against the grain of Cloudant, and sooner or later you'll end up with problems.
And finally -- the doc links appear to work just fine.
Maybe some useful stuff here: https://blog.cloudant.com/2019/11/21/Best-and-Worst-Practices.html
[disclaimer, I wrote that]
Can you please expand a little further on your use case? Why do you want to get rid of the deleted docs? Is there a way to avoid deleting the docs? Also, have you already read https://cloud.ibm.com/docs/services/Cloudant?topic=cloudant-documents#tombstone-documents ?

Update old processes with the new process definition - Activiti

I have some processes that ran with old process definitions. But due to a requirement change, the user task has been updated with new attributes, and this new process definition has been deployed. I'm aware that SetProcessDefinitionVersionCmd can be used to point running processes to the new definition/version.
I would like to know how to migrate the old process data so that the newly added attributes of the user task are reflected in it.
There is no easy way to migrate process instance data; however, when you point an instance at the new process definition version, its existing data goes with the migrated instance.
What you have to be careful of is to include null checks for any data that may not be present in the migrated process instances.
Hope this helps,
Greg
Indeed there is no easy way to migrate; however, depending on the differences between the two definitions and on the extent to which you may prefer not to use SetProcessDefinitionVersionCmd, you may find DynamicBpmnService useful when combined with detecting definition versions inside your logic.
And yes, another way would be to use SetProcessDefinitionVersionCmd, but be extra cautious with tasks that were actually active prior to migration. Since Activiti's database model has some redundant data (some of it for performance reasons), you would do better to study the DB tables for these tasks first and then inspect the before- and after-migration state. For example, keeping up with a simple changed attribute is much easier than with an added boundary event on an active user task, which affects the execution tree.
I would also advise comparing the SetProcessDefinitionVersionCmd implementations between Activiti and Camunda; it is sad to have such enhancement efforts separated, but that is another story.

Concurrency violation while updating and deleting newly added rows

I've been developing a CRUD application using DataSets in C# and SQL Server 2012. The project is basically an agenda which holds information about Pokémon (name, abilities, types, image, etc.).
For the past few months I've been facing a problem related to concurrency violations. In other words, when I try to delete or update rows that I've just added during the same run of the program, a concurrency exception is thrown and it isn't possible to perform any other changes in the database, so I need to restart the program to be able to make changes again. (Important note: this exception only happens for new rows added through C#.)
I've been looking for a solution to this violation (without using Entity Framework or LINQ to SQL), but I couldn't find anything that I could add to the C# source code. Does anyone know how to handle this? What should I implement in my source code? Is there anything I could do in SQL Server that would help?
Here is a link to my project, a backup from the database, images of the main form and the database diagram:
http://www.mediafire.com/download.php?izkat44a0e4q8em (C# source code)
http://www.mediafire.com/download.php?rj2e118pliarae2 (Sql backup)
http://imageshack.us/a/img708/3823/pokmonform.png (Main Form)
http://imageshack.us/a/img18/9546/kantopokdexdiagram.png (Database Diagram)
I have looked at your code and it seems that you use AcceptChanges on the DataTable daKanto inconsistently. In fact you use AcceptChangesDuringUpdate, which is also fine, although I prefer to call datatable.AcceptChanges() explicitly after the update; your way works too.
Anyway, I have noticed that you use AcceptChangesDuringUpdate in the Delete_Click and Update_Click methods, but not in Save_Click; I also think you should use AcceptChangesDuringFill in MainForm_Load, where you fill your datasets.
I cannot guarantee that it will help, but I know that uniform data access throughout an application reduces the risk of unexpected data-consistency errors.
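For illustration, a minimal sketch of a uniform setup, assuming a plain SqlDataAdapter; the connection string and table name below are hypothetical:

    using System.Data;
    using System.Data.SqlClient;

    // Use the same AcceptChanges behaviour everywhere: during the initial Fill
    // and after every Update. Connection string and table name are hypothetical.
    class KantoData
    {
        static void Main()
        {
            using (var connection = new SqlConnection(
                @"Data Source=.\SQLEXPRESS;Initial Catalog=Pokedex;Integrated Security=True"))
            {
                var daKanto = new SqlDataAdapter("SELECT * FROM Kanto", connection);
                var builder = new SqlCommandBuilder(daKanto);  // generates INSERT/UPDATE/DELETE commands

                daKanto.AcceptChangesDuringFill = true;    // rows arrive as Unchanged in MainForm_Load
                daKanto.AcceptChangesDuringUpdate = true;  // rows become Unchanged after Save/Update/Delete

                var dtKanto = new DataTable();
                daKanto.Fill(dtKanto);

                // ... the form edits dtKanto ...

                daKanto.Update(dtKanto);  // no separate dtKanto.AcceptChanges() call needed afterwards
            }
        }
    }

With both flags set the same way on every code path, each row ends up in the Unchanged state after every round trip, so the adapter's optimistic-concurrency checks always work from consistent RowState and original values.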

NHibernate fetches old values instead of new values from the DB

I use NHibernate in a Windows application. At run time I close the form and then change the data in the DB manually, so after reopening the form it shows the old values instead of the newly entered ones. It shows the new values only when I close the whole program (killing the process) and start the application again.
This question was already asked by Kristoffer, but there was no accepted answer for it.
Please point me toward a solution to this problem.
Thanks
You should read some texts that explain how the NHibernate session (first-level cache) works; otherwise you will run into big problems using it.
To me it sounds like you are keeping a session around for longer than needed. I would recommend the following article; it is essential reading for anyone creating a WinForms application using NHibernate:
MSDN Magazine - Building a Desktop To-Do Application with NHibernate
In a Windows application you should be using session-per-presenter.
This SO Question has some good answers that might provide you with a solution.
Also a good Google phrase is "session per presenter"
The NHibernate Cookbook also has a good example (although you will need to pay for it).
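A minimal sketch of the session-per-presenter pattern (the Customer entity and the sessionFactory wiring are hypothetical): each presenter opens its own short-lived ISession and disposes it when the form closes, so reopening a form re-reads current rows instead of serving stale entries from a long-lived first-level cache.

    using System;
    using NHibernate;

    // Hypothetical entity for the sketch.
    public class Customer
    {
        public virtual int Id { get; set; }
        public virtual string Name { get; set; }
    }

    // Session-per-presenter: one short-lived ISession per form/presenter.
    // A fresh session has an empty first-level cache, so it re-reads the database.
    public class EditFormPresenter : IDisposable
    {
        private readonly ISession session;

        public EditFormPresenter(ISessionFactory sessionFactory)
        {
            session = sessionFactory.OpenSession();
        }

        public Customer Load(int id)
        {
            return session.Get<Customer>(id);   // not served from another form's stale cache
        }

        public void Save(Customer customer)
        {
            using (var tx = session.BeginTransaction())
            {
                session.SaveOrUpdate(customer);
                tx.Commit();
            }
        }

        public void Dispose()
        {
            session.Dispose();                  // call when the form closes
        }
    }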

Synchronizing NHibernate Session with database - the reverse way

I am using NHibernate for a project, and I am an absolute beginner. I am fetching some objects from a table and showing them in a form where they can be edited. If a user inserts a new object into the table from some other window, I want to show this newly inserted object in the edit window. My application uses a tabbed-window interface, so the user can have the insert window and the edit window open at the same time.
So basically what I need is a way to determine whether a newly created object exists in the database that has not yet been fetched by my ISession, and if so, to fetch that new object from the database. In other words, I need to synchronize my session with the database, just like the Flush method but in the reverse direction.
Can anyone help me?
Publish/subscribe works well for this. Check out the Publishing Events part of Ayende's sample desktop application. Basically, after you've added a new item, you publish that information, and other parts of your application that have subscribed can update their lists accordingly.
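As a rough illustration (all names here are hypothetical, not taken from Ayende's code), the mechanism can be as small as this:

    using System;
    using System.Collections.Generic;

    // A tiny publish/subscribe hub. The insert window publishes an EntityCreated
    // message after committing; the edit window subscribes and re-fetches the new
    // row with its own session. All names are hypothetical.
    public static class EventBus
    {
        private static readonly Dictionary<Type, List<Action<object>>> handlers =
            new Dictionary<Type, List<Action<object>>>();

        public static void Subscribe<T>(Action<T> handler)
        {
            List<Action<object>> list;
            if (!handlers.TryGetValue(typeof(T), out list))
                handlers[typeof(T)] = list = new List<Action<object>>();
            list.Add(o => handler((T)o));
        }

        public static void Publish<T>(T message)
        {
            List<Action<object>> list;
            if (handlers.TryGetValue(typeof(T), out list))
                foreach (var handle in list) handle(message);
        }
    }

    public class EntityCreated
    {
        public int Id { get; set; }
    }

    // Insert window, after a successful save:
    //     EventBus.Publish(new EntityCreated { Id = newId });
    //
    // Edit window, at load time:
    //     EventBus.Subscribe<EntityCreated>(msg => RefreshRow(msg.Id)); // re-query via its own session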
You are taking the path to NHibernate Hell.
Be sure to work out your infrastructure (i.e. defining interfaces, session-management patterns, and a notification pattern) and isolate these non-business utilities from the rest of your code before using NHibernate to implement them.
Good luck.