Merge databases - SQL

I have a database "Product" in SQL Server 2008. I have another database "ORDER" in SQL Server 2008.
Both exist in different servers.
Now the requirement is to merge both databases, and then test pointing the applications to this new DB.
Can anyone suggest the best way to accomplish this without losing the information?
I have 2 options.
1) Script the DB objects (script both DBs and run the scripts in the new DB)
2) Export DB
Which of these is best, or should I use some other method to avoid errors?
I am new to SQL, so please guide me toward the correct option.
Thanks
SNA

In my opinion the best way to achieve what you want is simply to export the database.
I think this is the best option because it's a lot safer than scripting the DBs into a new one (a sure way to get a lot of frustration and errors).
Just try exporting your database before attempting anything with scripting (which obviously also takes a lot more time). So try the fast solution first and see if it works.
(I see you are using SQL Server 2008.) Are you also using Management Studio? If so, you can open the tables in edit mode and try to copy/paste rows into the new tables. I don't know how big your tables/DBs are, but this could also be an option.
Greetings,
Younes

As you say, two options are scripting or using the SQL Server export/import wizard.
I've used both (for the same database, as it happens).
A third option is to use Visual Studio Team System 2008 Database Edition GDR.
For a one-time export and import, I'd recommend going with the wizard. It is very safe and also very straightforward. Particularly as you are new to SQL Server, you want to take the approach that minimizes risk.
The only downside to doing it this way is that it is perhaps a little less transparent than the other methods.
On the project where I merged databases I ended up using the scripting method but that was mainly because I had a project that was already using GDR to merge incremental database updates, so adding in a data merge script to that was a simple task - all changes needed to go through DBAs who unfortunately weren't very SQL literate (I know!) so keeping all the processes similar was a must.
I also took some of my learnings from scripting the data and applied them to setting up my reference data scripts, so the effort of scripting was not a one time cost.
Either way, the most important tip I can give is to back up the databases before doing any work on them.
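For example, a minimal T-SQL sketch of that backup step (the disk paths here are just placeholders):

    -- Take a full backup of both databases before any merge work.
    -- File paths are illustrative; point them at your backup location.
    BACKUP DATABASE [Product]
        TO DISK = N'C:\Backups\Product_PreMerge.bak'
        WITH INIT, NAME = N'Product pre-merge full backup';

    BACKUP DATABASE [ORDER]
        TO DISK = N'C:\Backups\ORDER_PreMerge.bak'
        WITH INIT, NAME = N'ORDER pre-merge full backup';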

Related

Best way to save SQL versions while working on Tableau?

I am working with Tableau and have to write multiple different SQL queries each time I create new data sources.
I have to save all the SQL changes for every data source.
Currently I paste the SQL into Notepad and save the files in a separate folder on my computer, along with a description of the changes.
Is there any better way to do this?
Assuming you have permission to create objects in the database, begin by creating database views, as @Nick.McDermaid commented.
Then, instead of using Custom SQL data source in Tableau, just connect to the View as if it were a table.
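A minimal sketch of such a view, assuming a hypothetical sales query (all names here are made up):

    -- Wrap the custom SQL in a view; Tableau then connects to
    -- dbo.vw_SalesSummary as if it were a table.
    CREATE VIEW dbo.vw_SalesSummary
    AS
    SELECT
        Region,
        SUM(Amount) AS TotalAmount
    FROM dbo.Sales
    GROUP BY Region;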
If you need to track the changes to these SQL views of your data, you will need to learn how to use source control for the .sql files that can be scripted from within SQL Server Management Studio:
Your company or school may have a preferred source control system already in use, in which case you should use that. If they don't, or if you are learning at home, then Git and Subversion are popular open source choices.
There are many courses available on learning platforms like Coursera that will teach you how to use those systems.
I had a similar problem.
We ended up writing the queries in the SQL editor SQL Workbench (https://www.sql-workbench.eu/), then managed the code history and performed peer code review (logic, error checks, etc.) in a shared team space (like Confluence).
The reasons we did that are:
1) SQL queries are much easier to write in SQL Workbench.
2) Code review is a must! Through implementing a review process you will find more mistakes than you could ever imagine.
3) The shared space is just really convenient, as it is accessible by everyone and all errors are documented. After some time you accumulate a lot of visible knowledge.
I also totally agree with Nick that this is one step toward a reporting solution. But developing a whole reporting server is heavy, costly, and takes time. Unless management is really convinced of the importance of developing a reporting solution, you may have to settle for a workaround with queries and Tableau (at least that was the case for us).
A little late to the party, but I would suggest you simply version the Tableau workbook. The contents of the workbook are XML, so perfect for versioning using file-based tools (Dropbox, OneDrive, etc.) or source control (git, etc.). The workbooks themselves are usually quite small, so just make sure to keep the extract data separate if you use it.

What is the most used and efficient method to copy a database schema to other servers (not the data)?

The servers are all SQL Server 2008, on Windows XP.
I have the following task:
1) Create a huge database. (DONE)
2) Distribute it to the 20 waiting servers.
If there were two or three, I would have taken the trouble of creating the DBs on all of them using SQL Server Management Studio, but I am guessing that there is a more efficient way.
Please note: only a copy of the database structure (the schema) is needed, not the values within the cells!
thank you

Or of course you should have been creating the scripts as you went along and putting them in source control. Then you would have exactly the scripts you need for this version of the software, and you'd be used to doing the same thing for later modifications. You would also script the data inserts for any lookup tables you need to build.
Not having that, you can script the entire database, or use a SQL compare tool. But I strongly urge you to start treating database code like all other code: script it, store it in source control, and version it. Life is so much better when you do.
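As a rough sketch of the scripting route: generate a schema-only script once (SSMS's Generate Scripts wizard, with the data option turned off), then push that one file to every server with sqlcmd from a Windows batch file. The server names and script path below are placeholders:

    @echo off
    rem Run the schema-only script against each waiting server.
    rem Server names and the file path are illustrative.
    for %%s in (SERVER01 SERVER02 SERVER03) do (
        sqlcmd -S %%s -E -i "C:\Deploy\HugeDatabaseSchema.sql" -b
        if errorlevel 1 echo Deployment to %%s failed.
    )

The -E switch uses Windows authentication, and -b makes sqlcmd return a non-zero exit code when the script fails, so a bad server stands out in the output.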
What Gabriel McAdams has shown works; Red Gate SQL Compare also does this very nicely.
If you can spare the moolah, a tool like Red Gate's SQL Packager is an option I have used in the past, and it works well!
The tool can do a lot more as well, and may not be worth the spend if you do not need the other features!
In that case, Gabriel's option above is definitely the easiest one to go with!

Database schemas WAY out of sync - need to get up to date without losing data

The problem: we have one application that has a portion which is used by a very small subset of the total users, and that part of the application is running off of a separate database as well. In a perfect world, the schemas of the two databases would be synced up, but such is not the case. Some migrations have been run on the smaller database, most haven't; and furthermore, there is no revision number or similar marker to easily identify which have and which haven't. We would like to solve this quandary for future projects. During a discussion we've come up with the following possible plan of action, and I am wondering if anyone knows of any project which has already solved this problem:
What we would like to do is create an empty database from the schema of the large fully-migrated database, and then move all of the data from the smaller non-migrated database into that empty one. If it makes things easier, it can probably be assumed for the sake of this problem specifically that no migrations have ever removed anything, only added.
Else, if there are other known solutions, I'd like to hear them as well.
You could use a schema comparison tool like Red-Gate's SQL Compare. You can synchronize the changes and not lose any data. I wrote about this and many alternative tools ranging widely in price here:
http://bertrandaaron.wordpress.com/2012/04/20/re-blog-the-cost-of-reinventing-the-wheel/
The nice thing is that most tools have trial versions. So, you can try them out for 14 days (fully functional) and only buy one if it meets your expectations. I can't speak for the other tools, but I've been using RG for years and it is a very capable and reliable tool.
(Updated 2012-06-23 to help prevent link-rot.)
Red-Gate's SQL Compare as Aaron Bertrand mentions in his answer is a very good option. However, if you are not permitted to purchase something, an option is to try something like:
1) For each database, script out all the tables, constraints, indexes, views, procedures, etc.
2) Run a DIFF, go through all the differences, and make sure that the small DB can accept them. If not, implement any changes (including data) necessary on the small DB so it can accept them.
3) Create a new empty database from the schema of the large DB.
4) Import the data from the small DB into the new DB (see the sketch below).
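For step 4, on a single server the per-table copy can be a plain INSERT ... SELECT across the two databases. A minimal sketch, assuming databases named SmallDB and NewDB and a Customers table with an identity key (all names are hypothetical):

    -- Copy one table's rows from the small DB into the new DB that was
    -- built from the large DB's schema. Names are illustrative.
    SET IDENTITY_INSERT NewDB.dbo.Customers ON;

    INSERT INTO NewDB.dbo.Customers (CustomerID, Name, CreatedOn)
    SELECT CustomerID, Name, CreatedOn
    FROM SmallDB.dbo.Customers;

    SET IDENTITY_INSERT NewDB.dbo.Customers OFF;

Repeat per table, in dependency order so foreign keys are satisfied.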
You could also reverse engineer your database into Visual Studio as a database project. Visual Studio Team Suite Database Edition GDR R2 (I know, long name) has the capability to do a schema comparison and a data comparison, but the beauty of this approach is that you get all of your database into a nice database project where you can manage change and integrate with source control. This would allow you to build from a common source and deploy consistent changes.

How is Database Migration done?

I remember in my previous job I needed to do a data migration. In that case I had to migrate to a new system, which I was to develop, so it had a different table schema. I think, first, I should know:
In general, how is data migrated (with the same schema) to a different DB engine, e.g. MySQL -> MSSQL? In my case my destination DB was MySQL and I used the MySQL Migration Toolkit.
I am thinking that in an enterprise app there may also be stored procedures and triggers that need to be migrated.
If the table schema is different, how do I go about doing this? In my previous job, what I did was import the data (in my case, from Access) into staging tables in my destination (MySQL), keeping the original table structures, then use SQL to select the data and manipulate it as required into the final destination tables.
In my case, where I didn't have documentation for the old DB and the columns were not named meaningfully (it used, say, 'field1', 'field2', etc.), I needed to trace from the application code what the columns meant. Is there a better way? And sometimes columns contain multiple values in delimited data; is reading code the only way?
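To illustrate that staging-then-transform step, here is a T-SQL-flavored sketch; every table and column name is hypothetical, as is the ';' delimiter in the multi-value column:

    -- Map cryptic staging columns (as traced from the application code)
    -- onto meaningful destination columns, splitting a two-part
    -- delimited value assumed to look like 'City;Country'.
    INSERT INTO dbo.Customer (FullName, City, Country)
    SELECT
        s.field1,                                               -- customer name
        LEFT(s.field3, CHARINDEX(';', s.field3) - 1),           -- part before ';'
        SUBSTRING(s.field3, CHARINDEX(';', s.field3) + 1, 8000) -- part after ';'
    FROM dbo.staging_import AS s
    WHERE CHARINDEX(';', s.field3) > 0;  -- skip rows without the delimiter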
It really depends, but from your question I assume you want to hear what other people do.
So here is what I do in my current project.
I have to migrate from Oracle to Oracle but to a completely different schema.
The old system was 2-tier (old client, old database) the new system is 3-tier (new client, business logic, new database). We have more than 600 tables in the new schema.
After much pondering we scrapped the idea of doing a migration from the old database to the new database in SQL. We decided that in our case it would be much easier to go:
old database -> old client -> business logic -> new database
In the old database much of the data is stored in strange ways and the old client
mangles it in complex ways. We have access to the source code of the old client but it is a very large system.
We wrote a migration tool that sits above the old client and the business logic.
We have some SQL before and some SQL after that but the bulk of data is migrated via
old client and business logic.
The downside is that it is slow, a complete migration taking more than 190 hours in our case but otherwise it works well.
UPDATE
As far as stored procedures and triggers are concerned:
Even though we used the same DBMS in the old and new systems (both Oracle), the procedures and triggers were written from scratch for the new system.
When I've performed database migrations, I've used the application itself instead of a general tool to migrate the database. The application connects to two databases and copies objects from one to the other. You don't have to worry about schema or permissions or whatnot, since all that is handled in the application, just like what happens when you set up the application in the first place.
Of course, this may not help you if your application doesn't support this. But if you're writing an application, I strongly recommend doing it this way.
I recommend the Wikipedia article for a good overview and links to the main commercial tools (and some non-commercial ones). Stored procedures (and kin, e.g. user-defined functions), if abundant, are going to be the "hot spots" in the migration, requiring rare and costly human skills -- as soon as you get away from the "declarative" mood of mainstream SQL and into procedural code, you cannot expect automated tools to do a decent job (Turing's Theorem says that they actually can't, in a sufficiently general case ;-). So, you need engineers with a good understanding of the procedural trappings of BOTH engines -- the one you're migrating from and the one you're migrating to. You can buy that -- it's one of the niches where consultants make REALLY good money!
If you are using MS SQL Server, you can use SSMS to script out the schema and all data in one go: SQL Server 2008: Script Data as Inserts.
If you are not using any/many non-standard SQL constructs, then you might be able to manually edit this script without too much effort.
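For reference, that wizard's output is a plain T-SQL script along these lines (the table and rows below are made up for illustration):

    -- Illustrative shape of an SSMS "schema and data" script.
    CREATE TABLE [dbo].[Product](
        [ProductID] [int] NOT NULL,
        [Name] [nvarchar](50) NOT NULL
    );

    INSERT [dbo].[Product] ([ProductID], [Name]) VALUES (1, N'Widget');
    INSERT [dbo].[Product] ([ProductID], [Name]) VALUES (2, N'Gadget');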

Can I add sub folders to a SQL Server Management Studio Project?

What is the best way to keep large projects organized with SSMS?
I want to do something like:
ProjectRoot
    SchemaObjects
        Tables
            Constraints
            Indexes
            Keys
        Functions
        Views
        Stored Procedures
    Scripts
        DataGeneration
And so on, but I cannot find a nice way to do this... Any suggestions?
I don't think there's any way to do this in SSMS, at least not in 2005.
Maybe a third party tool will give you this; there's a list of replacements for 2000 Enterprise Manager here; not the most current, but probably a good starting point.
Sadly, there isn't. Maintain your object scripts outside of SSMS and simply edit them in SSMS.
IMHO this is one thing that is lacking big time in SSMS.