RavenDB - migrate multiple databases from 3.5 to 4.2

We have an application that uses more than 300 databases in a single RavenDB instance (one for every customer).
Now we want to migrate from RavenDB 3.5 to 4.2 and need to migrate all the data too,
and we don't want to migrate each database individually.
I have searched and found plenty of descriptions for migrating a single DB (and that works fine), but nothing for migrating more than one.
I don't care whether it is done by code or with a tool.
Any good ideas out there?

The easiest approach would be to set up one-way replication from 3.5 to 4.2; you can do this in code by modifying the replication destinations document in each database.
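For example, here is a minimal sketch using the RavenDB 3.5 .NET client (member names as in the 3.x API): enumerate the databases on the old server and write the well-known replication destinations document into each one. The server URLs are placeholders, and it assumes a database with the same name already exists on the 4.2 server.

    // Sketch: enable one-way replication from every database on the 3.5
    // server to the matching database on the 4.2 server.
    using System.Collections.Generic;
    using Raven.Abstractions.Replication;
    using Raven.Client.Document;

    class ReplicationSetup
    {
        static void Main()
        {
            using (var store = new DocumentStore { Url = "http://old-35-server:8080" })
            {
                store.Initialize();

                // Page through all database names on the 3.5 instance.
                var databases = store.DatabaseCommands.GlobalAdmin.GetDatabaseNames(1024);

                foreach (var db in databases)
                {
                    using (var session = store.OpenSession(db))
                    {
                        // Storing this document under its well-known id enables
                        // one-way (push) replication for the database.
                        session.Store(new ReplicationDocument
                        {
                            Destinations = new List<ReplicationDestination>
                            {
                                new ReplicationDestination
                                {
                                    Url = "http://new-42-server:8080",
                                    Database = db
                                }
                            }
                        }, "Raven/Replication/Destinations");

                        session.SaveChanges();
                    }
                }
            }
        }
    }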

Related

Migrating user profiles when upgrading from Pentaho Server version x.x to 8.1

In Pentaho Server x.x I have user profiles (user account names, passwords) and their own repository. How do I migrate them when updating Pentaho Server to 8.1?
I am sorry if it's a lame question, but I am not very skilled in web apps.
The x.x requirement is tricky. It'll probably depend a lot on which version you're currently on, as changes were made throughout multiple versions.
But normally your user profiles are in the hibernate database, which is either MySQL, Hypersonic or PostgreSQL (depending on the version and whether it's CE or EE), or something customised.
There you'll find the relevant tables with usernames, encrypted passwords, roles, etc.
Depending on the version you may be able to import the whole hibernate DB (and hope for the best), but you may need to start anew and manually import just the bits you actually need.

Azure cloud: one database per (ASP.NET registered) client

Good morning,
I am using the ASP.NET framework with an Azure client database.
I am now creating another server on Azure to host databases. On this server, for each customer registering on the website (for whom one entry is created in my first database), I need to create a database with 8 tables, identical for each customer.
What would be the best way to map the ASP.NET ID to a new database? Which framework would you recommend?
Thanks
Rather than running a VM where you're going to have to manage a SQL Server installation and write a bunch of code to handle a database-per-tenant scenario, I highly recommend taking a look at Azure SQL's multi-tenant sharding support. All of this code is already written for you. And it's not as if you're paying for one DB per client; check out elastic pooling.
You can read the docs here.
Also note, this option will scale very well.
I have done this three different ways: a database per client where I wrote my own code to manage sharding, a single database with a separate schema per client (a huge pain in the rear), and using Azure SQL sharding support. It's not just the issue of correctly separating client data. You also need to think about querying for reporting across all client databases, and managing schema changes. Under the first two options, if you change a schema, you get to modify N client databases. Azure SQL's sharding tools will manage this for you.
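As a rough sketch of what the tenant-routing side looks like with the Elastic Database client library (the Microsoft.Azure.SqlDatabase.ElasticScale.Client NuGet package): the shard map name, server, and connection strings below are hypothetical, and it assumes the shard map manager database and shard map have already been created.

    // Sketch: register a new tenant database as a shard and route by tenant id.
    using Microsoft.Azure.SqlDatabase.ElasticScale.ShardManagement;

    class TenantSharding
    {
        const string ShardMapManagerConnStr =
            "Server=myserver.database.windows.net;Database=ShardMapManagerDb;User ID=...;Password=...";

        static void Main()
        {
            var smm = ShardMapManagerFactory.GetSqlShardMapManager(
                ShardMapManagerConnStr, ShardMapManagerLoadPolicy.Lazy);

            var map = smm.GetListShardMap<int>("TenantShardMap");

            // When a customer registers: point their tenant id at their database.
            int tenantId = 42;
            var shard = map.CreateShard(
                new ShardLocation("myserver.database.windows.net", "TenantDb_42"));
            map.CreatePointMapping(tenantId, shard);

            // At request time the library resolves the right server/database for
            // the key; only credentials go in this connection string.
            using (var conn = map.OpenConnectionForKey(
                tenantId, "User ID=...;Password=...", ConnectionOptions.Validate))
            {
                // conn is an open SqlConnection scoped to that tenant's database.
            }
        }
    }

The point is that the tenant-to-database mapping lives in the shard map, so the routing logic comes from the library rather than from code you maintain yourself.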

Best practices for continuous integration with a SQL Server project or a local .mdf file in the project

I maintain a project that has a really messy DB that needs a lot of refactoring and has to be published to clients' machines.
I know that I could add a SQL Server Database project that contains just the scripts of the database and produces a .dacpac file, which allows me to update clients' databases automatically.
I also know that I could just add an .mdf file to the App_Data (or even a Solution_Data) folder and have my database there. I suppose the LocalDB that is already installed would allow me to start up my solution without SQL Server.
And lastly, I know that Entity Framework exists with its own migrations. But I don't want to use it, because I can't add and change indexes with its migrations and I don't have enough flexibility when I need to describe difficult migration scenarios.
My goals:
Generate migration scripts for clients' DBs automatically.
Make my solution self-contained, so that any new programmer who comes to the project doesn't even need to install SQL Server on his machine.
Be able to update the local (development) database in 1-2 clicks.
Be able to move back through the history of DB changes (I have a TFS server).
Be able to have a clean DB (containing only dictionary or lookup tables) in the solution, with an up-to-date DB schema.
Additionally, I want to be able to update my DB model (EF or .dbml) automatically or in a very easy way.
So what I want to ask:
What are the strengths and weaknesses of these two approaches, given my goals?
Could it be that I should use some combination of these tools?
Or is there another existing tool from MS that I don't know about?
Is there a way to update my DAL model from this DB?
What are the strengths and weaknesses of these two approaches, given my goals?
Using a database project allows you to version control all of the database objects. You can publish to various database instances and roll out changes incrementally, rather than having to drop and recreate the database, thus preserving data. These changes can be in the form of a dacpac, a SQL script, or done right through the VS interface. You gain a lot of control over deployments using pre- and post-deployment scripts and publishing profiles. Developers will be required to install SQL Server (the developer/express edition is usually good enough).
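As a minimal sketch of the dacpac deployment path, assuming the DacFx API (the Microsoft.SqlServer.Dac NuGet package): the path, database name, and connection string are placeholders.

    // Sketch: apply a database project's .dacpac to a client database,
    // or just generate the upgrade script (goal 1).
    using System;
    using Microsoft.SqlServer.Dac;

    class DacpacDeploy
    {
        static void Main()
        {
            var services = new DacServices(
                "Server=.;Database=master;Integrated Security=true");

            using (var package = DacPackage.Load(@"C:\build\MyDatabase.dacpac"))
            {
                var options = new DacDeployOptions
                {
                    BlockOnPossibleDataLoss = true // refuse changes that would drop data
                };

                // Incrementally upgrade the existing database to match the project.
                services.Deploy(package, "ClientDb", upgradeExisting: true, options: options);

                // Or produce the migration script without running it.
                string script = services.GenerateDeployScript(package, "ClientDb", options);
                Console.WriteLine(script);
            }
        }
    }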
LocalDB is a little easier to work with -- you can make your changes directly in the database without having to publish. LocalDB doesn't have a built-in publish process for pushing changes to other instances. No SQL Server installation required.
Use a database project if you need version control for your database objects, if you have multiple users concurrently making changes, or if you have multiple applications that use the same database. Use LocalDB if none of those conditions apply or for small apps that require their own standalone database.
Could it be that I should use some combination of these tools?
Yes. According to Kevin's comment below, "If the Database Project is set as your startup project, hitting F5 will automatically deploy it to LocalDB. You don't even need a publish profile in this case."
Or is there another existing tool from MS that I don't know about?
Entity Framework's Code First approach comes close.
Is there a way to update my DAL model from this DB?
Entity Framework's POCO generator works well, but if you make changes to your DAL classes, those changes get lost the next time you run the generator.
There is a new tool called SqlSharpener which can generate classes from the SQL files in a database project. I have not used it so I cannot vouch for it but it looks promising.
One way to generate client scripts for DB changes is to use a database modeling tool like ERwin, which has a free community edition. The best way to meet your database version control requirement with easy script generation is Redgate SQL Source Control. Using the Redgate tool you will meet the first five goals mentioned. Moreover, you can then update the EF model with a single click after changing the DB schema (i.e. the database-first approach), as required by goal 6.
I do not recommend using LocalDB at all. It constantly causes issues with source control, like "DB file is in use and can't commit...". In addition, the developers on the project will not have a common set of up-to-date data to work with, unless one developer adds test data to the database and asks the others to get the latest version and overwrite their own databases, or generates an update script with the previously mentioned tool and asks every developer to run it against his LocalDB.
The best approach in your situation is to use a SQL Server instance on the network: a master version that all the developers use. Since you have version control on the database via the previously mentioned tool, you can roll back any buggy change on the database server.
If you think the Redgate tool is too expensive for your project's budget, a second approach is to generate a single SQL file from your database containing all database objects, and have the other developers update that SQL file in source control as they make changes. This can be done easily with the schema compare tool in Visual Studio, appending the generated script to the SQL file in source control. With the EF database-first approach, you will not have to add lots of migration classes as in EF code-first.

What database replication model do I need?

I am on Rails 3 with a local Postgres database. What we want to do is replicate the entire database onto a second server in real time. We are thinking of using Octopus.
I'm confused about what model I'm looking for and how the master-slave model applies.
Postgres 9.0 and later come with streaming replication built in (for master-slave configurations). Check out http://www.postgresql.org/docs/9.2/static/warm-standby.html#STREAMING-REPLICATION for more information on configuration and setup.
There are other third-party solutions for configuration, but I'd start there and see if that meets your needs.

NHibernate 2nd Level Cache - Membase MemCache - Multiple Session Factories

I am using multiple databases in a multi-tenant NHibernate application. I was previously using SysCache, which worked fine; however, I needed to move to Memcached as we are now using a web farm. I am actually using Membase Server, which was very easy to install on Windows and supports the Memcached protocol.
It appears as if my cache is being shared between session factories: if an entity from database A with an ID of 1 gets cached and application B requests the same entity, it should get it from its own database, but instead the entity from database A is returned from the cache.
Is there any additional configuration I need to perform to get this to work?
I am using a MembaseCacheProvider from here (and configured it the same way too):
http://blog.ovesens.net/2011/02/nhibernate-membase-caching-provider/
I have left a comment on the above blog, however I am posting here too in case anyone can help in the meantime.
Paul
You had better use different buckets for different applications. See Couchbase Server Data Buckets for more explanation.
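A complementary lever, using standard NHibernate settings rather than anything specific to the blog's MembaseCacheProvider, is to give each tenant's session factory its own cache region prefix, so that second-level cache keys from different databases cannot collide even when they land in a shared cache. The tenant name and connection string below are hypothetical.

    // Sketch: build one session factory per tenant with a distinct
    // cache region prefix, namespacing that tenant's cache keys.
    using NHibernate;
    using NHibernate.Cfg;

    static class TenantSessionFactories
    {
        public static ISessionFactory Build(string tenantName, string connectionString)
        {
            var cfg = new Configuration().Configure(); // shared mappings/settings

            cfg.SetProperty(Environment.ConnectionString, connectionString);
            cfg.SetProperty(Environment.UseSecondLevelCache, "true");

            // Prefixes every cache region (and thus every cache key) per tenant.
            cfg.SetProperty(Environment.CacheRegionPrefix, tenantName);

            return cfg.BuildSessionFactory();
        }
    }

Separate buckets remain the stronger isolation, since a prefix only namespaces keys within a shared bucket, and whether the prefix reaches the cache keys depends on how the provider builds them.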