Update SQL Server database schema - sql

There is an old SQL Server database that needs to be upgraded to a much-improved version of the schema. Mostly, new columns have been added to existing tables. It is necessary to keep the original data in the old database. Is there any easier way to upgrade the schema than comparing and updating table by table manually?

I've used Adept SQL in the past with a lot of success. It will compare the databases for you, and even generate a script to bring one database up to date with the other. It is not a free product, but you can use it for a trial (with most features, I believe). If this is a one-time operation, it'll be just what you need.
In the interest of full disclosure, we liked the product so much that we did end up purchasing it.
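If the delta really is mostly new columns, the change script (whether tool-generated or hand-written) usually reduces to guarded DDL along these lines; the table and column names here are hypothetical:

    -- Add the column only if it isn't already there; existing rows keep
    -- their data and receive the declared default for the new column.
    IF NOT EXISTS (SELECT 1 FROM sys.columns
                   WHERE object_id = OBJECT_ID(N'dbo.Customer')
                     AND name = N'LoyaltyTier')
    BEGIN
        ALTER TABLE dbo.Customer
            ADD LoyaltyTier int NOT NULL
                CONSTRAINT DF_Customer_LoyaltyTier DEFAULT (0);
    END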

Related

Migrating a SQL Server database: which method gives better performance?

Some background:
A customer has asked a Certified SQL Server Consultant for his opinion on migrating from SQL Server 2005 to SQL Server 2008.
One of his most important recommendations was not to use backup/restore but instead use the migration wizard to copy all the data into a new database.
He said that this would ensure that the inner structure of the database would be in an SQL 2008 format, and would ultimately result in better performance.
The customer is skeptical about this because they can't find anything, in white papers or otherwise, to corroborate the consultant's statement.
So they posed this question to me:
Given a SQL database that originally started out on SQL Server 2000 and has been migrated to newer versions of SQL Server using backup/restore (and is currently on SQL Server 2005):
Would migrating to SQL Server 2008 using the Migration Wizard, in effect copying all the raw data into a new database, result in better performance characteristics than using the backup/restore method again?
I'll repeat what I posted on Twitter, "your consultant is an idiot".
Doing a backup and restore will be much easier, and require a much shorter downtime. Also it will ensure that the data is consistent and that no objects are missed.
So long as you are doing index maintenance (rebuilding or reorging/defragging indexes) then any page splits which have happened are fixed and there will be no performance problems.
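For reference, that index maintenance is plain ALTER INDEX work; a minimal sketch against a hypothetical dbo.Orders table:

    -- REBUILD writes the index to fresh pages, undoing the fragmentation
    -- left behind by page splits; REORGANIZE is the lighter, online
    -- option for mildly fragmented indexes.
    ALTER INDEX ALL ON dbo.Orders REBUILD;
    -- or:
    ALTER INDEX ALL ON dbo.Orders REORGANIZE;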
When the database is moved from one version to another, the physical database file is upgraded to the new version. You'll notice when you restore the database that the compatibility level is still set to the old version's number, but this has nothing to do with the physical structure of the database file; you can change the compatibility level to a lower or higher version at any time. You can see the file upgrade happen if you restore the database using T-SQL: after the restore completes, the messages output lists the specific upgrade steps that were performed.
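A minimal illustration of the two separate knobs, assuming a hypothetical database named OldApp restored from a 2005 backup onto a 2008 instance:

    -- The physical file format is upgraded automatically during RESTORE;
    -- the messages output shows the individual upgrade steps.
    RESTORE DATABASE OldApp
        FROM DISK = N'C:\backups\OldApp.bak'
        WITH RECOVERY;

    -- The compatibility level is independent of the file format and can
    -- be changed at any time; 100 = SQL Server 2008.
    ALTER DATABASE OldApp SET COMPATIBILITY_LEVEL = 100;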
In response to qwerty13579's comment: when the indexes are rebuilt, the index is written to new physical database pages, so exporting and importing the data in a SQL Server database isn't needed.
For the record, the migration wizard is about the worst possible option for moving data from database to database.
I agree with Denny.
Backup/restore is the easiest way to upgrade.
For a no-downtime upgrade, you can use database mirroring to the new server and fail over to the new version.
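A sketch of the failover step, assuming mirroring from the old principal to the upgraded server is already established (note that once the newer-version mirror becomes principal, you cannot fail back to the older version):

    -- Run on the current principal; the mirror on the new version takes
    -- over as principal. SalesDB is a hypothetical database name.
    ALTER DATABASE SalesDB SET PARTNER FAILOVER;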
One important task that improves performance is refreshing all statistics when you upgrade to a new version.
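A minimal sketch of that step:

    -- Refresh all statistics so the new version's optimizer has current
    -- information about the data.
    EXEC sp_updatestats;
    -- or, per table with a full scan (dbo.Orders is a hypothetical name):
    UPDATE STATISTICS dbo.Orders WITH FULLSCAN;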

Create SQL Script to Change Old Database to Current One

I'm creating the front-end for a project and I made a copy of the back-end database from the company's server and put it on my computer. I needed to make some changes (a few new tables and two new columns in an existing table) for security roles and other things so I duplicated the copied database and made my changes on the new one.
I want to deploy my project to the company's server now but we need to modify the original back-end database. I need to generate a SQL script that finds the changes between the old-database and my newer database, which can be run on the old database to create the new tables and columns. The script should retain the data from the old database and NOT add any junk/testing data I made in my new database.
By the way, I'm using SQL Server 2008 R2, and the old database on the server is on 2005. I've been looking around for utilities and found tablediff; however, it looks like it will copy the data, and I can't see an argument on the information page to turn that off.
I'm sure it's simple but I'm not really sure how to do this. Any help would be appreciated. Thanks.
By far the solution I trust most to handle schema comparisons is Red Gate's SQL Compare:
http://www.red-gate.com/products/sql-development/sql-compare/
It has a companion called Data Compare which is designed specifically for data. You can grab the free trial to see if it does what you need in this case.
There are other options as well, for example SQL Server Data Tools has this functionality, though I haven't tested it to any degree that I could compare feature sets, performance, etc.
I've also blogged about why you want to use a tool and just pay for this functionality, rather than solve it programmatically yourself. The post also mentions a variety of alternatives if budget is a primary blocker:
http://bertrandaaron.wordpress.com/2012/04/20/re-blog-the-cost-of-reinventing-the-wheel/
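Whichever tool you pick, the script it generates is essentially guarded DDL that creates what is missing and leaves existing objects and data alone; a hypothetical fragment for one of the new tables:

    -- Create the new table only if it doesn't exist yet; nothing already
    -- in the old database is touched. dbo.SecurityRole is a made-up name.
    IF OBJECT_ID(N'dbo.SecurityRole', N'U') IS NULL
    BEGIN
        CREATE TABLE dbo.SecurityRole
        (
            RoleId   int           NOT NULL PRIMARY KEY,
            RoleName nvarchar(100) NOT NULL
        );
    END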

Synchronize non-data-related changes between different DB instances

Here's the scenario.
Two identical databases:
One Live database and one Archive database; they're supposed to have the exact same schema (tables, views, indexes, SPs, functions), and the only difference is the data. The data in the Live DB is archived according to some business rules, so the data in the Archive DB will differ from the Live DB.
The challenge is that we keep patching changes (SP changes, function changes, data changes, or even table schema changes) to the Live DB in each release. Unfortunately, the corresponding changes to the Archive DB have been forgotten for a long time, and the issue just hasn't been addressed yet. One day the out-of-sync DBs will come back and bite us.
Here's what I want to do: I want to synchronize non-data related changes from Live DB to Archive DB. Either automated or manually.
Any idea is welcome. Here are some ideas that have come to my mind:
Replication? I find replication does not fit this scenario very well.
Scripting the SP/function/view changes? I can manually pull out the scripts and combine them. But what about table schema changes? It's difficult for me to track back and find out what has happened to the table schemas.
I know Red Gate and other products can do the job, but I'd like to explore the full range of options.
If anybody can point out some feasible way that'd be great.
If you are using SQL Server, Visual Studio Team System Database Edition has a schema comparison and patching tool. Have a look at this article.
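If you want to gauge how far the two databases have drifted before committing to a tool, plain metadata queries go a long way; a minimal sketch, assuming both databases (here called LiveDB and ArchiveDB) sit on the same instance:

    -- Columns that exist in LiveDB but are missing or typed differently
    -- in ArchiveDB. Run the two halves swapped to see the reverse drift.
    SELECT s.name AS schema_name, t.name AS table_name,
           c.name AS column_name, c.system_type_id, c.max_length
    FROM LiveDB.sys.columns c
    JOIN LiveDB.sys.tables t ON t.object_id = c.object_id
    JOIN LiveDB.sys.schemas s ON s.schema_id = t.schema_id
    EXCEPT
    SELECT s.name, t.name, c.name, c.system_type_id, c.max_length
    FROM ArchiveDB.sys.columns c
    JOIN ArchiveDB.sys.tables t ON t.object_id = c.object_id
    JOIN ArchiveDB.sys.schemas s ON s.schema_id = t.schema_id;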

Methods of maintaining sample data in a database

Firstly, let me apologize for the title, as it probably isn't as clear as I think it is.
What I'm looking for is a way to keep sample data in a database (SQL Server 2005, 2008, and Express) that gets modified every so often. At present I have a handful of scripts to populate the database with a specific set of data, but every time the database changes, the scripts have to be more or less rewritten, and I'm looking for alternatives.
I've seen a number of tools and other software for creating sample data in a database, some free and some not. Are there any other methods I haven’t considered?
Thanks in advance for any input.
Edit: Also, if anyone has any advice at all in dealing with keeping data in sync with a changing application or database, that would be of some help as well.
If you are looking for tools for SQL Server, go visit Red Gate Software; they have the best tools. They have a data compare tool that you can use to keep lookup-type tables up to date, and a SQL compare tool that you can use to keep the tables synched up between two databases. So, using SQL Data Compare, create a database with all the sample data you want. Then periodically refresh your testing DB (or your prod DB, if these are strictly lookup-type tables) using the compare tool.
I also like the alternative of having a script (you can use Red Gate's tool to create scripts) because that means you can store this info in your source control and use it as part of a deployment package to other servers.
You could save them in another database, or in the same DB in different tables distinguished by name, like employee_test.
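A minimal sketch of that approach, with hypothetical table names:

    -- Keep a pristine copy of the sample data next to the working table.
    SELECT * INTO dbo.employee_test FROM dbo.employee;

    -- Reset the working table from the copy later. (If employee has an
    -- identity column, use SET IDENTITY_INSERT and an explicit column
    -- list instead of SELECT *.)
    TRUNCATE TABLE dbo.employee;
    INSERT INTO dbo.employee SELECT * FROM dbo.employee_test;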
Joseph,
Do you need to keep just the data in sync, or the schema as well?
One solution to the data question would be SQL Server snapshots. You create a snapshot of your initial configuration, so any changes to the "real" database don't show up in the snapshot. Then, when you need to reset the table, select from the snapshot into a new table. I'm not sure how it will work if the schema changes, but it might be worth a try.
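A minimal sketch of the snapshot idea, assuming a hypothetical database named SampleDB whose single data file has the logical name SampleDB_Data (note that database snapshots require Enterprise Edition, so this won't cover the Express targets):

    -- Capture the initial state.
    CREATE DATABASE SampleDB_Initial
        ON (NAME = SampleDB_Data,
            FILENAME = N'C:\snapshots\SampleDB_Initial.ss')
        AS SNAPSHOT OF SampleDB;

    -- Later, revert the whole database to that state.
    RESTORE DATABASE SampleDB
        FROM DATABASE_SNAPSHOT = 'SampleDB_Initial';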
For generation of sample data, the Database project in Visual Studio has functionality that will create fake/random data.
Let me know if this make sense.
Erick

SQL Server 2005 multiple database deployment/upgrading software suggestions

We've got a product which utilizes multiple SQL Server 2005 databases with triggers. We're looking for a sustainable solution for deploying and upgrading the database schemas on customer servers.
Currently, we're using Red Gate's SQL Packager, which appears to be the wrong tool for this particular job. Not only does SQL Packager appear to be geared toward individual databases, but the particular (old) version we own has some issues with SQL Server 2005. (Our version of SQL Packager worked fine with SQL Server 2000, even though we had to do a lot of workarounds to make it handle multiple databases with triggers.)
Can someone suggest a product which can create an EXE or a .NET project to do the following things?
* Create a main database with some default data.
* Create an audit trail database.
* Put triggers on the main database so audit data will automatically be inserted into the audit trail database.
* Create a secondary database that has nothing to do with the main database and audit trail database.
And then, when a customer needs to update their database schema, the product can look at the changes between the original set of databases and the updated set of databases on our server. Then the product can create an EXE or .NET project which can, on the customer's server...
* Temporarily drop triggers on the main database so alterations can be made (see the sketch after this list).
* Alter database schemas, triggers, stored procedures, etc. on any of the original databases, while leaving the customer's data alone.
* Put the triggers back on the main database.
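For the trigger step specifically, SQL Server 2005 can disable and re-enable triggers rather than dropping and recreating them; a minimal sketch with hypothetical trigger and table names:

    -- Suspend the audit trigger while the schema alterations run.
    DISABLE TRIGGER trg_Audit_Customer ON dbo.Customer;

    -- ... ALTER TABLE / stored procedure changes go here ...

    ENABLE TRIGGER trg_Audit_Customer ON dbo.Customer;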
Basically, we're looking for a product similar to SQL Packager, but one which will handle multiple databases easily. If no such product exists, we'll have to make our own.
Thanks in advance for your suggestions!
I was looking for this product myself, knowing that the RedGate solution worked fine for "one" DB; unfortunately, I have been unable to find such a tool :(
In the end, I had to roll my own solution to do something "similar". It was a pain in the… but it worked.
My scenario was way simpler than yours, as we didn't have triggers or T-SQL.
Later, I decided to take a different approach:
Every DB change had a script. Numbered: 001_Create_Table_xXX.SQL, 002_AlterTable_whatever.SQL, etc.
No matter how small the change is, there has to be a script. The new version of the updater does this:
* Makes a backup of the customer's DB (just in case).
* Starts executing the scripts in alphabetical order (001, 002...).
* If a script fails, it drops the DB, logs the script error, script number, etc., and restores the customer's DB.
* If it finishes, it makes another backup of the customer's DB (after the "migration") and updates a table where we store the DB version; the app checks this table to make sure the DB and the app are in sync (a sketch of this table follows below).
* Shows a nice success msg.
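A sketch of that version bookkeeping, with hypothetical table and column names:

    -- The updater records the schema version after all scripts succeed;
    -- the app compares this against the version it expects at startup.
    IF OBJECT_ID(N'dbo.SchemaVersion', N'U') IS NULL
        CREATE TABLE dbo.SchemaVersion
        (
            Version   int      NOT NULL,
            AppliedOn datetime NOT NULL DEFAULT (GETDATE())
        );

    INSERT INTO dbo.SchemaVersion (Version) VALUES (12); -- e.g. after script 012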
This turned out to be a little more "manual", but it has been working with little effort for three years now.
The secret lies in keeping a few testing DBs to test the "upgrade" before deploying. Apart from a few isolated DBs where some scripts failed because of data inconsistencies, this has worked fine.
Since your scenario is a bit more complex, I don't know if this kind of approach can be ok with you.
As of this writing (June 2009) there's still no product on the market that'll do all this for multiple databases. I work for Quest Software, makers of Change Director for SQL Server, another database change automation system. Ours doesn't handle multiple databases like you're after, and I've seen the others out there. No dice.
I wouldn't hold out hope for it either, given the directions I've seen in SQL Server management. Things are going more toward packaged applications being contained in a single database, and most of the code is focusing on that.