Modifying SQL Database on Shared Hosting - sql-server-2005

I have a live database on a shared hosting server. I am making some major changes to my site's code and I would like to fix some stupid mistakes I made in initially designing the database. These changes involve altering the size of a large number of fields, and enforcing referential integrity between tables properly. I would like to make the changes on both my local test server and the remote server if possible.
I should note that while I'm fairly comfortable with writing complex queries to handle data, I have very little experience modifying database structure without a graphical interface.
I can access the remote database in the Visual Studio database explorer, but I cannot use that for anything other than data manipulation. I installed SQL Server Management Studio Express last night and after 40+ crashes I gave up - I couldn't even patch the damn thing.
The remote server is SQL 2005 / The MyLittleAdmin web interface is available.
So my question is: what is the best way to accomplish these changes? Is there a graphical interface I can use on the remote server? If not, is there an easy way to copy the database to my local machine, fix it, and re-upload it? Finally, if none of the above are viable, does anyone have links to decent info on fixing referential integrity via query?
Sorry for the somewhat general question - I feel like I am making this far harder than it should be, but after searching and trying all night I haven't gotten anywhere. Thanks in advance for the help. I really appreciate it.
...Also does anyone have a time machine I can borrow- I need to go kick my past self's ass for this.

Usually hosting providers allow you to back up and restore your database, so the easiest way to accomplish the move is to back up your live DB, download the backup file, restore it locally, make all the changes, back up the local DB, upload it, then restore it on the live service. Your site should be placed in an administrative shutdown during this time so it does not continue to update the data while you're doing these operations. You have to make sure your local SQL instance is at exactly the same build version (@@VERSION) as the hosting provider's, otherwise your local SQL Server may upgrade the database structure and you'll be unable to restore it back on the hosting provider (or you'll be unable to restore it on your local server if your version is earlier than the host's). The MSDN BOL has a detailed guide on how to Copy Databases using Backup/Restore.
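A minimal sketch of that round trip in T-SQL (database name and file paths are placeholders):

SELECT @@VERSION;  -- run on both servers first and compare builds
BACKUP DATABASE MyDatabase TO DISK = 'C:\Backups\MyDatabase.bak' WITH INIT, CHECKSUM;
-- download the .bak file, then on the local server:
RESTORE DATABASE MyDatabase FROM DISK = 'C:\Backups\MyDatabase.bak' WITH REPLACE;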
An alternative to backup/restore is to detach/attach the database, but I do not recommend it because you need to move both the MDF and the LDF files in sync, and together they're also larger than a backup.
This assumes you can make all the schema changes on your local copy in a wizardly manner, i.e. fast and correct. Of course, that is not easy. The recommended way is to prepare, ahead of time, a script that applies all the transformations needed to reach the new schema. There are tools like SQL Diff, SQL Compare, SQL Delta and others that can generate such a script. Visual Studio Database Edition can do this as well.
Here is how I would do it:
1. Ensure I have exactly the same schema on my dev machine as on the live host. If unsure, I take a backup of the live server and restore it locally. This is my reference, the v1 schema.
2. Keep the backup of v1 for reference.
3. Start developing a script that changes the schema to my target. Sometimes I need to refresh my memory on script syntax myself, and what I do is go to the SQL Server Management Studio wizard for the operation I want to do, select all the options in the UI, and then use the script option, which shows me exactly the script SSMS would run to accomplish my desired change.
4. For each change I add to the script, I can test it by restoring the v1 reference backup from step 1 and running the script.
5. Keep iterating on the script, adding one change at a time, until all the needed schema changes are done. After each change, I can test it again as in step 4.
6. Your script should make not only DDL changes to the schema, but also any DML changes needed (modifying reference data, changing values, moving columns between tables etc.); a combined example follows this list.
7. When the script is ready, I can download a newer backup, apply the script, and upload the updated backup and restore it on the live host. Alternatively, you can simply run the script on the live host (after, of course, backing it up in case something goes horribly wrong).
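As an illustration of step 6, a transformation script for the kinds of changes the question describes (all names here are made up) might mix DDL and DML like this:

-- widen a column that was sized too small (fails if existing data conflicts)
ALTER TABLE Customers ALTER COLUMN Email VARCHAR(255) NOT NULL;
-- DML: remove orphaned rows before enforcing integrity
DELETE FROM Orders WHERE CustomerID NOT IN (SELECT CustomerID FROM Customers);
-- DDL: add the missing foreign key
ALTER TABLE Orders ADD CONSTRAINT FK_Orders_Customers
    FOREIGN KEY (CustomerID) REFERENCES Customers (CustomerID);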
In my projects I always rely on scripts to deploy and upgrade the database. In fact I use the database extended properties to store a 'version' of my application's deployed schema, and in my code I simply roll forward all the scripts that bring the schema up to my latest version. I have an article on my blog describing this technique: Version Control and your Database.
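A rough sketch of the extended-property trick (the property name is made up; sp_addextendedproperty and fn_listextendedproperty are the relevant built-ins):

-- stamp the deployed schema version on the database itself
EXEC sp_addextendedproperty @name = N'SchemaVersion', @value = N'42';
-- read it back at application startup
SELECT name, value
FROM fn_listextendedproperty(N'SchemaVersion', default, default, default, default, default, default);
-- bump it with sp_updateextendedproperty at the end of each upgrade script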

Related

What's the best version neutral method for deploying a SQL Server database?

On my development box, I always run the latest version of SQL Server. I often deploy databases from my dev box to a live/staging area for review or testing. I've done this many times and it has always been a painful process, but I am realizing that I need to find an easier, more reliable and consistent way of performing this basic operation.
I normally use WebMatrix purely for deployment and it's worked fine, but I've been having problems getting it to work on my server for some reason. Consequently, I am seeking an alternative solution.
Creating a SQL dump file would probably work, but it's not an acceptable solution: the database contains images and easily exceeds 2 gigs of data, which would take forever.
The Import/Export utility fails due to issues with incomplete schema copies, identity inserts and checks. The solutions offered for these issues have failed to work in my particular case.
The Backup and Restore method also fails due to some strange incompatibilities between SQL Server 2008 and 2012. SQL Server 2008 Management Studio throws exceptions during the restore process of a 2012 database. It's odd that this happens, even though I set the compatibility of the database to version 2008.
I haven't tried detaching, copying and reattaching the files, since it would probably fail for the same reasons the backup-and-restore method did.
Are there other alternatives out there? Also, why is this so unbelievably hard for a task that is so common and important, especially in this day and age (2013)? Get real Microsoft!
We changed our method of deploying and moving databases between servers, instances and versions by adopting the tools from RedGate. They are expensive, but worth it IMHO.
My team creates scripts for ~everything.
Database creation, alters, inserts, etc.
And we write all scripts so that they check for the existence of things before trying to create them.
That is, we can run the scripts over and over and get the same results.
And we deploy to different environments by using SqlCmd.exe.
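A sketch of that existence-check pattern (object names invented):

IF NOT EXISTS (SELECT * FROM sys.tables WHERE name = 'Customers')
BEGIN
    -- the guard makes the script safe to run repeatedly
    CREATE TABLE Customers (CustomerID INT PRIMARY KEY, Email VARCHAR(255));
END;
-- then deployed per environment with something like:
-- SqlCmd.exe -S myserver -d MyDatabase -i 001_create_customers.sql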
EDIT
See:
http://odetocode.com/blogs/scott/archive/2008/02/02/versioning-databases-views-stored-procedures-and-the-like.aspx
and
http://odetocode.com/blogs/scott/archive/2008/01/30/three-rules-for-database-work.aspx
=============
If that is "too much" then I agree with the other poster, RedGate is your friend.
Points below aside, have you considered the Database Projects within VS2012? They allow you to script off the tables, SPs, triggers, users etc. you want, generate SQLCMD scripts, make changes, do schema compares and version control your database code. I'd certainly recommend it.
"Creating a SQL dump file would probably work, but it's not an
acceptable solution a database contains images and easily exceeds 2
gigs of data which would take forever."
Why is this a problem? Where are you transferring the file from and to, and over what connection?
"The Backup and Restore method also fails due to some strange
incompatibilities between SQL 2008 and 2012. SQL 2008 Management
Studio throws exceptions during the restore process of a 2012
database. It's odd that this happens, even though I set the
compatibility of the database to version 2008"
This shouldn't be an issue if the file is created in 2008 prior to restoring. If you create a new DB in your 2008 instance, then take a backup from that and restore it to a 2012 instance with 2008 compatibility, you should be able to use it there. Be aware, though, that once the database has been restored on the 2012 instance its physical file version is upgraded, so a backup taken from the 2012 instance cannot be restored on 2008 afterwards - the compatibility level setting only controls language behavior, not the storage format.

How to back up a SQL database (tables, mappings, indices, etc.)

I am installing a service pack on our shopping cart. They recommend backing up the SQL database before installing. I know we have backups to tape drives done by our hosting company, but I want one I know the exact time stamp for and can access quickly if I need to reload it because of a goof during the upgrade. (I don't want to have the store down for any longer than needed.)
How do you recommend backing up a SQL database for easy reloading for someone who is used to just writing queries and stored procedures? (I'd like to get everything - mappings & indices, etc - because I wouldn't know what all of them are or how to recreate them.)
I access the database via Remote Desktop and can link my hard drive and DVD drives, if that helps. It's MSSQL 2008.
Thank you so much.
Best wishes,
Andrea
BACKUP DATABASE databasename TO DISK='C:\somefile.bak' WITH COPY_ONLY, INIT, FORMAT, CHECKSUM
Obviously replace databasename and the target C:\somefile.bak as appropriate. Remember, the file and path are on the server; connecting remotely won't change where the backup file is stored - in other words, it won't be on your local machine.
You can omit the WITH options if you want. Drop INIT if you don't want the target .bak file overwritten. COPY_ONLY isn't a big deal either way in your case. CHECKSUM just validates the data before it gets backed up, and may not matter if you don't have checksums turned on for the database - though new databases have them on by default since SQL Server 2005.
The MS documentation for BACKUP and RESTORE isn't too difficult to understand in its basic forms. You can also use the Management Studio Tasks->Back Up or Tasks->Restore GUI if you have access to it.
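And the restore side, following the same placeholders (the MOVE clauses are only needed if the local file paths differ; the logical names here are guesses - RESTORE FILELISTONLY shows the real ones):

-- RESTORE FILELISTONLY FROM DISK = 'C:\somefile.bak'  -- lists the logical file names
RESTORE DATABASE databasename
FROM DISK = 'C:\somefile.bak'
WITH REPLACE,
     MOVE 'databasename' TO 'C:\Data\databasename.mdf',
     MOVE 'databasename_log' TO 'C:\Data\databasename_log.ldf';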

Creating a CHANGE script in Management Studio?

I was wondering if there is a way to automatically append to a script file all the changes I am making to my columns, tables, relationships etc...
The thing is I am doing a lot of different changes on a TEST db and the idea will be to apply this change script when I move the test db to production... hence keeping production data but applying all schema and object changes.
Is there an easy way to do this? Can it also migrate database diagram changes?
I have seen how you can create a change script each time I make a change, but this means I have to copy and paste it into a master file. Actually pretty easy!
I was just wondering if I was missing something?
Do not make changes to the test server using the UI. Write scripts and keep them under source control. You can test your scripts starting from backups of the live data, and you can tune your scripts until they achieve the desired result. Then you can check in the scripts for reference and later apply them to the live server. See this article: Version Control and Your Database.
BTW, check out the SSMS Tools Pack; I think it may do what you want (I'm not sure). My advice stands nonetheless: version your schema, use explicitly created/saved scripts, use source control.
There's no way to directly generate a "delta" script in SSMS.
However, if every time you publish changes you script out the entire database, including data, to SQL using the SQL Server Database Publishing Wizard, you should be able to extract diffs between the versions and get your deltas that way.
If money is no object, you can purchase Visual Studio Team System Database Architect edition and use its fantastic database comparison tools to generate and version control exactly the diffs you want.
Try using tablediff, which came with SQL Server 2005.
SQL Server 2005 TableDiff Utility
tablediff Utility
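A rough invocation (server, database and table names are placeholders; -f asks tablediff to write a T-SQL script that would make the destination match the source):

tablediff -sourceserver "DEVSRV" -sourcedatabase "ShopDB" -sourcetable "Products"
          -destinationserver "PRODSRV" -destinationdatabase "ShopDB" -destinationtable "Products"
          -f "C:\fix_products.sql"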
Our process is that when a developer is done with a change, they script it out and check it into Subversion. In Subversion we have a folder for Tables, Stored Procs, Data, etc. They script it out so it is repeatable (i.e. don't insert the new data if it is already there). This is important to do anyway, so you keep the history of changes for a given object in the database.
In the past, we would just enter each of the files that we wanted scripted out into a text file (e.g. FileListV102.txt). When we were ready to make a release we would do a "get latest" on all of the files (from VSS back then). We then had a simple utility that would read the "file list" file and open each of those files in turn, concatenating them into an output file. That is pretty easy to code.
We outgrew that, and now we have a release management tool (which can be found here and will be on sale mid-September) that takes all of the files and creates one big SQL script file out of them. It does this in the order you would expect based on the folder names - files found in the "Tables" folder are run before those in the "Data" folder, etc.
Either way, once you are done you have a big SQL script file that you can then apply to a fresh copy of production and that is what you test against.
I know I'm way late to the party, but I just wanted to add that there are dozens of third-party products out there. Some are very good, some are very cheap or free, and some are a mixture. I listed 22 here:
http://bertrandaaron.wordpress.com/2012/04/20/re-blog-the-cost-of-reinventing-the-wheel/
We have been using relatively new software called Kal Admin.
It has a Change Management feature and lets you distribute selected changes to other databases very easily. We used to do this by comparing two databases, but that did not satisfy our need for change tracking.
BTW, Kal Admin has metadata and data compare capabilities as well.

What is the best way to transfer a table or tables from one SQL server to another?

I have been developing in VB.NET and SQL Server 2008 for a while now, but haven't gotten into live installs yet. The database system I used to be on had the ability to archive multiple tables into a .dga file, as it was called. I could then restore the .dga file into another database or on another server.
I'm looking for the easiest way to accomplish something similar in SQL Server.
If you want to transfer specific tables, use Data Transformation Services (right-click on the database in SQL Server Management Studio and select "Import Data", which brings up the dialog for it). Of course, this assumes that you have both databases available to you.
If you are comfortable with replacing the database as a whole, you can easily backup the database and then restore it into a new one through SQL Server Management studio (or through calling the appropriate SP).
I would go for one of the following:
From MS SQL Management Studio, right click on the database / Tasks / Generate scripts
From Visual Studio, in the Server Explorer tab, "publish to provider"
Both will launch a wizard allowing you to export the tables you want the way you want (including data or not, creation scripts or not, etc.).
If you want to move tables without data, the simplest thing is to script out the tables you want and run the script.
We script all our db changes, commit them to Subversion, and then run them as part of the deployment process.
If you want to put the whole database on prod including data (scrub out test records first!), then do a backup and restore on the other server.
For future changes, there are also tools that look at the structural differences between the two servers and create scripts for you. Red Gate's SQL Compare is really good for this.
In addition to HLGEM's suggestions, you can look into SSIS if this is an ongoing process.

Is there a version control system for database structure changes?

I often run into the following problem.
I work on some changes to a project that require new tables or columns in the database. I make the database modifications and continue my work. Usually, I remember to write down the changes so that they can be replicated on the live system. However, I don't always remember what I've changed and I don't always remember to write it down.
So, I make a push to the live system and get a big, obvious error that there is no NewColumnX, ugh.
Regardless of the fact that this may not be the best practice for this situation, is there a version control system for databases? I don't care about the specific database technology. I just want to know if one exists. If it happens to work with MS SQL Server, then great.
In Ruby on Rails, there's a concept of a migration -- a quick script to change the database.
You generate a migration file, which has rules to increase the db version (such as adding a column) and rules to downgrade the version (such as removing a column). Each migration is numbered, and a table keeps track of your current db version.
To migrate up, you run a command called "db:migrate" which looks at your version and applies the needed scripts. You can migrate down in a similar way.
The migration scripts themselves are kept in a version control system -- whenever you change the database you check in a new script, and any developer can apply it to bring their local db to the latest version.
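Outside Rails you can imitate the same pattern by hand; a minimal T-SQL sketch (the SchemaVersion table and the change itself are invented for illustration):

-- a one-row table records which migration the database is at
CREATE TABLE SchemaVersion (Version INT NOT NULL);
INSERT INTO SchemaVersion (Version) VALUES (0);

-- each migration script guards itself so it only runs once
IF (SELECT Version FROM SchemaVersion) < 1
BEGIN
    ALTER TABLE Users ADD NewColumnX INT NULL; -- the actual change
    UPDATE SchemaVersion SET Version = 1;
END;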
I'm a bit old-school, in that I use source files for creating the database. There are actually two files - project-database.sql and project-updates.sql - the first for the schema and persistent data, the second for modifications. Of course, both are under source control.
When the database changes, I first update the main schema in project-database.sql, then copy the relevant info to the project-updates.sql, for instance ALTER TABLE statements.
I can then apply the updates to the development database, test, and iterate until it is done well.
Then, check in files, test again, and apply to production.
Also, I usually have a table in the db - Config - such as:
CREATE TABLE Config
(
    cfg_tag   VARCHAR(50),
    cfg_value VARCHAR(100)
);
INSERT INTO Config(cfg_tag, cfg_value) VALUES
( 'db_version', '$Revision: $'),
( 'db_revision', '$Revision: $');
Then, I add the following to the update section:
UPDATE Config SET cfg_value='$Revision: $' WHERE cfg_tag='db_revision';
The db_version only gets changed when the database is recreated, and the db_revision gives me an indication how far the db is off the baseline.
I could keep the updates in their own separate files, but I chose to mash them all together and use cut & paste to extract the relevant sections. A bit more housekeeping is in order, i.e., removing the ':' from $Revision 1.1 $ to freeze them.
MyBatis (formerly iBatis) has a schema migration tool for use on the command line. It is written in Java, though it can be used with any project.
To achieve a good database change management practice, we need to identify a few key goals.
Thus, the MyBatis Schema Migration System (or MyBatis Migrations for short) seeks to:
Work with any database, new or existing
Leverage the source control system (e.g. Subversion)
Enable concurrent developers or teams to work independently
Make conflicts very visible and easily manageable
Allow for forward and backward migration (evolve, devolve respectively)
Make the current status of the database easily accessible and comprehensible
Enable migrations despite access privileges or bureaucracy
Work with any methodology
Encourage good, consistent practices
Redgate has a product called SQL Source Control. It integrates with TFS, SVN, SourceGear Vault, Vault Pro, Mercurial, Perforce, and Git.
I highly recommend SQL Delta. I just use it to generate the diff scripts when I'm done coding my feature, and check those scripts into my source control tool (Mercurial :))
They have both a SQL Server and an Oracle version.
I'm surprised no one has mentioned the open source tool Liquibase, which is Java-based and should work with nearly every database that supports JDBC. Compared to Rails, it uses XML instead of Ruby to describe the schema changes. Although I dislike XML for domain-specific languages, the very cool advantage of XML is that Liquibase knows how to roll back certain operations, like
<createTable tableName="USER">
<column name="firstname" type="varchar(255)"/>
</createTable>
so you don't need to handle that on your own.
Pure SQL statements or data imports are also supported.
Most database engines should support dumping your database into a file. I know MySQL does, anyway. This will just be a text file, so you could submit that to Subversion, or whatever you use. It'd be easy to run a diff on the files too.
If you're using SQL Server it would be hard to beat Data Dude (aka the Database Edition of Visual Studio). Once you get the hang of it, doing a schema compare between your source controlled version of the database and the version in production is a breeze. And with a click you can generate your diff DDL.
There's an instructional video on MSDN that's very helpful.
I know about DBMS_METADATA and Toad, but if someone could come up with a Data Dude for Oracle then life would be really sweet.
Keep your initial CREATE TABLE statements in version control, then add ALTER TABLE statements, but never edit files - just add more alter files, ideally named sequentially, or even grouped as a "change set" so you can find all the changes for a particular deployment.
The hardest part that I can see is tracking dependencies - e.g., for a particular deployment, table B might need to be updated before table A.
For Oracle, I use Toad, which can dump a schema to a number of discrete files (e.g., one file per table). I have some scripts that manage this collection in Perforce, but I think it should be easily doable in just about any revision control system.
Take a look at the Oracle package DBMS_METADATA.
In particular, the following methods are particularly useful:
DBMS_METADATA.GET_DDL
DBMS_METADATA.SET_TRANSFORM_PARAM
DBMS_METADATA.GET_GRANTED_DDL
Once you are familiar with how they work (they're pretty self-explanatory), you can write a simple script to dump the results of those methods into text files that can be put under source control. Good luck!
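For example, pulling the DDL for one table (schema and table names are placeholders; SET LONG is a SQL*Plus setting so the full CLOB prints):

SET LONG 100000
SELECT DBMS_METADATA.GET_DDL('TABLE', 'EMPLOYEES', 'HR') FROM DUAL;
-- SET_TRANSFORM_PARAM can be called first to strip storage clauses, etc.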
Not sure if there is something this simple for MSSQL.
I write my db release scripts in parallel with coding, and keep the release scripts in a project specific section in SS. If I make a change to the code that requires a db change, then I update the release script at the same time.
Prior to release, I run the release script on a clean dev db (copied structure wise from production) and do my final testing on it.
I've done this off and on for years -- managing (or trying to manage) schema versions. The best approaches depend on the tools you have. If you can get the Quest Software tool "Schema Manager" you'll be in good shape. Oracle has its own, inferior tool that is also called "Schema Manager" (confusing much?) that I don't recommend.
Without an automated tool (see other comments here about Data Dude) then you'll be using scripts and DDL files directly. Pick an approach, document it, and follow it rigorously. I like having the ability to re-create the database at any given moment, so I prefer to have a full DDL export of the entire database (if I'm the DBA), or of the developer schema (if I'm in product-development mode).
PL/SQL Developer, a tool from Allround Automations, has a plug-in for repositories that works OK (but not great) with Visual SourceSafe.
From the web:
The Version Control Plug-In provides a tight integration between the PL/SQL Developer IDE and any Version Control System that supports the Microsoft SCC Interface Specification. This includes most popular Version Control Systems such as Microsoft Visual SourceSafe, Merant PVCS and MKS Source Integrity.
http://www.allroundautomations.com/plsvcs.html
ER Studio allows you to reverse your database schema into the tool and you can then compare it to live databases.
Example: Reverse your development schema into ER Studio -- compare it to production and it will list all of the differences. It can script the changes or just push them through automatically.
Once you have a schema in ER Studio, you can either save the creation script or save it as a proprietary binary and keep it in version control. If you ever want to go back to a past version of the schema, just check it out and push it to your db platform.
There's a PHP5 "database migration framework" called Ruckusing. I haven't used it, but the examples show the idea, if you use the language to create the database as and when needed, you only have to track source files.
We've used MS Team System Database Edition with pretty good success. It integrates with TFS version control and Visual Studio more-or-less seamlessly, and allows us to manage stored procs, views, etc. easily. Conflict resolution can be a pain, but version history is complete once it's done. Thereafter, migrations to QA and production are extremely simple.
It's fair to say that it's a version 1.0 product, though, and is not without a few issues.
You can use Microsoft SQL Server Data Tools in Visual Studio to generate scripts for database objects as part of a SQL Server Project. You can then add the scripts to source control using the source control integration that is built into Visual Studio. SQL Server Projects also let you verify the database objects with a compiler and generate deployment scripts to update an existing database or create a new one.
In the absence of a VCS for table changes I've been logging them in a wiki. At least then I can see when and why it was changed. It's far from perfect as not everyone is doing it and we have multiple product versions in use, but better than nothing.
I'd recommend one of two approaches. First, invest in PowerDesigner from Sybase, Enterprise Edition. It allows you to design physical data models, and a whole lot more. It also comes with a repository that lets you check in your models. Each new check-in can be a new version; it can compare any version to any other version, and even to what is in your database at that time. It will then present a list of every difference and ask which should be migrated... and then it builds the script to do it. It's not cheap, but it's a bargain at twice the price, and its ROI is about 6 months.
The other idea is to turn on DDL auditing (works in Oracle). This will create a table recording every change you make. If you query the changes from the timestamp you last moved your database changes to prod up to right now, you'll have an ordered list of everything you've done. A few WHERE clauses to eliminate zero-sum changes, like a CREATE TABLE foo followed by a DROP TABLE foo, and you can easily build a mod script. Why keep the changes in a wiki? That's double the work; let the database track them for you.
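For SQL Server readers, the closest analogue is a DDL trigger; a minimal sketch (table and trigger names invented):

-- audit table to collect every DDL event fired in the database
CREATE TABLE DDLAudit (EventTime DATETIME DEFAULT GETDATE(), EventInfo XML);
GO
CREATE TRIGGER trg_AuditDDL ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
    INSERT INTO DDLAudit (EventInfo) VALUES (EVENTDATA());
GO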
Schema Compare for Oracle is a tool specifically designed to migrate changes from one Oracle database to another. Please visit the URL below for the download link, where you will be able to use the software as a fully functional trial.
http://www.red-gate.com/Products/schema_compare_for_oracle/index.htm
Two book recommendations: "Refactoring Databases" by Ambler and Sadalage and "Agile Database Techniques" by Ambler.
Someone mentioned Rails Migrations. I think they work great, even outside of Rails applications. I used them on an ASP application with SQL Server which we were in the process of moving to Rails. You check the migration scripts themselves into the VCS.
Here's a post by Pragmatic Dave Thomas on the subject.