How to enable direct modifications to the system catalogues in SQL Server?

How do I enable direct modification of the system catalogues in SQL Server?

In old versions of SQL Server (pre SQL 2005), it was possible to modify system tables directly by turning on the allow updates configuration option and running RECONFIGURE WITH OVERRIDE. This option is obsolete from SQL Server 2005 onward; although it still exists, it is ignored.
You can check the configured value with the command below even though the option has no meaning in modern versions:
EXEC sp_configure 'allow updates';
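For reference, the old pre-2005 approach looked like the lines below; on SQL Server 2005 and later the setting is accepted but ignored, so this no longer opens the system tables for editing:
-- Old approach, has no effect on modern versions:
EXEC sp_configure 'allow updates', 1;
RECONFIGURE WITH OVERRIDE;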

You "directly" modify the system catalogs by using DDL (data modification language). These are commands that start with commands such as ALTER, CREATE and DROP.
These commands are well documented.
You should not even think about directly changing system tables/views otherwise. They are owned by the database and managed by the database.
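As a small illustration of the point above (dbo.Orders is a hypothetical table), DDL is how the catalog gets updated on your behalf; you never write to the catalog views yourself:
-- Adding a column via DDL; the engine updates sys.columns for you.
ALTER TABLE dbo.Orders ADD ShippedDate DATETIME NULL;
-- The new column is now visible through the catalog view:
SELECT name, column_id FROM sys.columns WHERE object_id = OBJECT_ID('dbo.Orders');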

Related

SQL Server stored procedures automatically recompiled?

If I use a web application (Web Data Administrator) and I edit a stored procedure's SQL query, does it recompile on its own? (I'm new to SQL Server and this side of database development.)
MSSQL Server does maintain a cache of query plans, but this is not the same as compiled code.
SQL Server manages this cache, and it can be the source of some pain if it caches a plan that is non-optimal. Though this has happened to me less than 5 times in 15 years (and that seemed to be a problem with a particular server), it's best to let SQL Server handle this and not touch it.
You can force SQL Server to recompile by supplying the WITH RECOMPILE option. The same caveat applies: unless you have a substantial reason to, don't.
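A minimal sketch of both forms of the option (the procedure and table names are made up for illustration):
-- Recompile on every execution of this procedure:
CREATE PROCEDURE dbo.GetOrdersByCustomer @CustomerId INT
WITH RECOMPILE
AS
    SELECT OrderId, OrderDate FROM dbo.Orders WHERE CustomerId = @CustomerId;
GO
-- Or recompile just one execution without changing the procedure:
EXEC dbo.GetOrdersByCustomer @CustomerId = 42 WITH RECOMPILE;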
SQL is a scripting language, which means the code you write is not compiled. Rather, it is stored on the server to be used later.
When you edit a stored procedure, you can execute an ALTER script, or a DROP then CREATE script. This sends the text in your Web Data Admin (or SSMS) window to the server, issuing a command that tells the server to store this new query as a procedure for later use.
So, in short, yes, if you execute an ALTER script.
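For example, "saving" an edited procedure in the editor effectively runs an ALTER statement like this hypothetical one, which replaces the stored definition on the server:
ALTER PROCEDURE dbo.GetOrdersByCustomer @CustomerId INT
AS
    SELECT OrderId, OrderDate, ShippedDate  -- newly added column
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;
GO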

Sql Server Script Generator

Is there a tool that will let me generate a single script containing all tables and views? SQL Publishing Wizard drops everything (so all data is lost) and recreates it. It does have an option to not drop, but in that case it doesn't update tables that already exist (if any columns have changed).
In SQL Server Management Studio 2008 you can right-click on a database in the object explorer, go to Tasks > Generate Scripts..., and that will give you the option to choose not only what object types you want to script, but whether or not you want to script the drop as well.
When you are making changes to existing tables, you should be writing alter table scripts to make the change and then putting them in source control like any other code. Then when you deploy a set of changes, you run the scripts you created for that deployment.
Otherwise, yes, use SQL Compare.
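A typical alter-table deployment script of the kind described above can be as simple as this (hypothetical table, column, and file name), checked into source control alongside the rest of the code:
-- 0042_add_customer_middlename.sql
ALTER TABLE dbo.Customer ADD MiddleName NVARCHAR(50) NULL;
GO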
You should look at the Red Gate products, specifically SQL Compare. They'll handle any situation you could need concerning script generation and database synchronization. (You can get a trial license too, to try it out and see if it is what you need.)
Have a look at this tool, which can generate the create and drop scripts for SQL Server objects specified in a configuration file.
It uses the same mechanism as SSMS to generate the scripts.
SQL Server Script Generator Tool (via C#)
Have a look at these:
WinSQL (Lite edition is free, other versions are reasonably priced + free trial)
OpenDbDiff (free)
You can also check out MyDbUtils which can create scripts for:
Stored Procedures
Functions
Views
Triggers

SQL SERVER Project

My application database has no project and is not in SourceSafe. I planned to make my DB into a project and add it to TFS, but I have no idea how to script the stored procedures, triggers, views, and functions, or what the best practice is for making update scripts for all my stored procedures, triggers, views, and functions for my customers' databases.
The best procedure (IMHO) is to manually maintain a strict version of your schemas. Then when you need to make changes you write a delta script to move from one version to the next. I suggest you write the DDL scripts by hand -- you can make them concise and comment them.
You can use a tool like Visual Studio Team System for Database Architects. Take a look at "Running static code analysis on SQL Server database with Visual Studio Team System for database architects"; it will show you how to import the data. Disregard the static code analysis that comes later, as it does not apply to your question.
I've found a good way to get SQL scripts into SCM from an existing database is to use SSMS's "export all to script" option, or whatever it's called (can't remember now).
Then for every subsequent change you add the change script to your SCM with a different version number in the file name.
Every release (or set cycle depending on your development/release methodology) you apply all change scripts, then re-script the entire database, tag it, and start again.
The best way to do it is to save the database in TFS as a set of database creation scripts, i.e. the MyTable table should be added to TFS as a MyTable.sql file (CREATE TABLE ...), etc. We are using SQL Examiner to do this - see the following article: How to keep your database under version control
We are working with SVN and I never tested SQL Examiner with TFS, but I know that the tool supports TFS.

Is it possible to restore Sql Server 2008 backup in sql server 2005

Is it possible to restore a backup of a SQL Server 2008 database onto an instance of SQL Server 2005?
I need to work on a sample application for which the database backup is in SQL Server 2008.
But I'll not be able to install 2008. So is it possible to restore that back up in 2005?
No. It is not possible to restore a database from a backup of a newer version.
If you are dead set on it, I think your best option is to select the database in the Object Explorer in SQL 2008,
right-click, and select Tasks -> Generate Scripts. In the options dialog enable just about everything, including Script Data,
and make sure you select "Script for SQL 2005".
Source
When importing the objects into your target server, if the objects are large you may find that you can't open the SQL file via Management Studio (with a completely useless "The operation could not be completed" error, no less). That's okay, just load the file via sqlcmd.
One important thing is missing in all answers and that is the fact that Generate Scripts in SSMS doesn’t order the scripts correctly.
Scripts have to be ordered in the correct dependency order so that child tables are created after parent tables and such.
This is not an issue for small databases, where it's easy to reorder the scripts manually, but it can be a huge issue when dealing with databases that have 100+ objects.
My experience is that it's most convenient to use third party tools that can read a backup and generate scripts in the correct order. I'm using ApexSQL Diff and Data Diff from ApexSQL, but you can't go wrong with any popular vendor.
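To make the ordering problem concrete, here is the issue in miniature (hypothetical tables): the child table references the parent, so the parent's CREATE must run first or the script fails.
CREATE TABLE dbo.Parent (ParentId INT PRIMARY KEY);
CREATE TABLE dbo.Child
(
    ChildId  INT PRIMARY KEY,
    ParentId INT NOT NULL REFERENCES dbo.Parent(ParentId)
);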
No, not directly. SQL Server 2008 database backups are not backward compatible with SQL Server 2005. However, with SQL Server 2008 Management Studio, you can script data and schemas in SQL Server 2005 mode. This article describes the process in detail.
Yes it is possible
Use the export feature in SQL Server 2008. Go to All Programs --> Microsoft SQL Server 2008 --> Import and Export Data.
The SQL Server Import and Export Wizard window will open. Press Next.
Choose a data source (in your case, SQL Server 2008) and a destination (in your case, SQL Server 2005).
Select "Copy data from one or more tables or views".
Select the source tables and the destination tables.
Click Next and then Finish to complete.
I have had this problem for a long while.
You cannot restore SQL2008 backups onto an SQL2005 instance.
And for me, workarounds like the import/export wizard, or scripting the database from SQL2008 using Generate Scripts with the "for SQL2005" option, won't work.
That's simply because my databases cross-reference each other inside their views, stored procedures and UDFs. They are not entirely my responsibility, so I cannot consolidate them into 1 database.
They are a set of 6 dbs that refer to each other directly inside their views and stored procedures.
When I transfer them from one SQL2005 instance onto another, I usually do full-backup/restore.
If I were to script them, even with the with dependencies option I would get errors at re-creation time as db1 will not find views inside db3 because it so happened that I executed the create db1 script first. If I tried db3 first I get similar exceptions.
The only way to script them so that I won't have such dependency exceptions, is to figure out a sequence that works and script them partially in that manner: say: db1_tables followed by db2_tables followed by db2_views followed by db1_views, sp, udfs etc.
I have such a sequence. And when I need to create a new set of such 6 dbs, I executed the smaller partial scripts in sequence.
This explains why the generate scripts, with dependencies and with data and set to SQL2005 version scripts, will just not work for me.
The import/export wizard is better in my case because it will copy tables, but then you still have to script all views, SPs, UDFs etc.
What I really need is a conversion tool for SQL2008 backup files, to convert them to SQL2005 format. Then my problem will go away.
Or some kind of a tool that would allow restore from SQL2008 full-backup files, without asking me too many questions.
If anyone knows such tools and have used them, let me know.
You can use DBSave; it's a great freeware tool to back up and restore MS SQL Server databases on different machines.
It's very simple to set up and to use.
No you can't, but tools like Red Gate's SQL Compare/Data Compare can read backup files directly and transfer the info across to a live database, dealing with any syntax or settings that aren't compatible on SS2005.
Having had no luck with the Import/Export stuff (flat file exports failed on import claiming charset mapping issues [even though same charset used throughout] and/or truncation issues [even though source and destination had exact same structure]), and having had no luck with using the "generate scripts" option suggested by Garry Shutler (it generated a script with syntax errors), I was finally able to copy the big table I wanted to copy from 2008 to 2005 using the SQL Server bcp utility. So that's another option for this situation, although for an entire database it would be table-by-table and probably doesn't help with views and such.
The steps I used:
On the source server, use "Script Table As...CREATE" to get the structure, run that on the target server.
On the target server, create a bcp format file using your newly-created table:
bcp database.owner.table format nul -f table.fmt -n
(If you're not using Windows auth, you may need the -U and -P options to specify username and password.)
Copy that format file to the source server (if necessary).
Export the data to file on the source server:
bcp database.owner.table out table.dat -f table.fmt
(Again, possibly with -U and -P.)
Copy the data file to the target server (if necessary).
Import the data on the target server:
bcp database.owner.table in table.dat -f table.fmt
(Again, possibly with -U and -P.)
I hope that proves useful to someone else.

Is there a version control system for database structure changes?

I often run into the following problem.
I work on some changes to a project that require new tables or columns in the database. I make the database modifications and continue my work. Usually, I remember to write down the changes so that they can be replicated on the live system. However, I don't always remember what I've changed and I don't always remember to write it down.
So, I make a push to the live system and get a big, obvious error that there is no NewColumnX, ugh.
Regardless of the fact that this may not be the best practice for this situation, is there a version control system for databases? I don't care about the specific database technology. I just want to know if one exists. If it happens to work with MS SQL Server, then great.
In Ruby on Rails, there's a concept of a migration -- a quick script to change the database.
You generate a migration file, which has rules to increase the db version (such as adding a column) and rules to downgrade the version (such as removing a column). Each migration is numbered, and a table keeps track of your current db version.
To migrate up, you run a command called "db:migrate" which looks at your version and applies the needed scripts. You can migrate down in a similar way.
The migration scripts themselves are kept in a version control system -- whenever you change the database you check in a new script, and any developer can apply it to bring their local db to the latest version.
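The same up/down idea can be sketched in plain SQL (this is only an illustration of the pattern, not Rails syntax; the dbo.Users table and the SchemaVersion tracking table are hypothetical):
-- Migration 42 "up": add a column and record the new version
ALTER TABLE dbo.Users ADD LastLoginAt DATETIME NULL;
UPDATE dbo.SchemaVersion SET CurrentVersion = 42;
-- Migration 42 "down": undo the change and step the version back
ALTER TABLE dbo.Users DROP COLUMN LastLoginAt;
UPDATE dbo.SchemaVersion SET CurrentVersion = 41;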
I'm a bit old-school, in that I use source files for creating the database. There are actually 2 files - project-database.sql and project-updates.sql - the first for the schema and persistent data, and the second for modifications. Of course, both are under source control.
When the database changes, I first update the main schema in project-database.sql, then copy the relevant info to the project-updates.sql, for instance ALTER TABLE statements.
I can then apply the updates to the development database, test, iterate until done well.
Then, check in files, test again, and apply to production.
Also, I usually have a table in the db - Config - such as:
CREATE TABLE Config
(
    cfg_tag   VARCHAR(50),
    cfg_value VARCHAR(100)
);
INSERT INTO Config (cfg_tag, cfg_value) VALUES
    ('db_version',  '$Revision: $'),
    ('db_revision', '$Revision: $');
Then, I add the following to the update section:
UPDATE Config SET cfg_value='$Revision: $' WHERE cfg_tag='db_revision';
The db_version only gets changed when the database is recreated, and the db_revision gives me an indication how far the db is off the baseline.
I could keep the updates in their own separate files, but I chose to mash them all together and use cut&paste to extract relevant sections. A bit more housekeeping is in order, i.e., remove ':' from $Revision 1.1 $ to freeze them.
MyBatis (formerly iBatis) has a schema migration tool for use on the command line. It is written in Java, though it can be used with any project.
To achieve a good database change management practice, we need to identify a few key goals.
Thus, the MyBatis Schema Migration System (or MyBatis Migrations for short) seeks to:
Work with any database, new or existing
Leverage the source control system (e.g. Subversion)
Enable concurrent developers or teams to work independently
Make conflicts very visible and easily manageable
Allow for forward and backward migration (evolve, devolve respectively)
Make the current status of the database easily accessible and comprehensible
Enable migrations despite access privileges or bureaucracy
Work with any methodology
Encourage good, consistent practices
Redgate has a product called SQL Source Control. It integrates with TFS, SVN, SourceGear Vault, Vault Pro, Mercurial, Perforce, and Git.
I highly recommend SQL Delta. I just use it to generate the diff scripts when I'm done coding my feature and check those scripts into my source control tool (Mercurial :)).
They have both SQL Server and Oracle versions.
I'm surprised that no one has mentioned the open source tool Liquibase, which is Java based and should work with nearly every database that supports JDBC. Compared to Rails, it uses XML instead of Ruby to perform the schema changes. Although I dislike XML for domain-specific languages, the very cool advantage of XML is that Liquibase knows how to roll back certain operations, like
<createTable tableName="USER">
<column name="firstname" type="varchar(255)"/>
</createTable>
So you don't need to handle this on your own.
Pure SQL statements or data imports are also supported.
Most database engines should support dumping your database into a file. I know MySQL does, anyway. This will just be a text file, so you could submit that to Subversion, or whatever you use. It'd be easy to run a diff on the files too.
If you're using SQL Server it would be hard to beat Data Dude (aka the Database Edition of Visual Studio). Once you get the hang of it, doing a schema compare between your source controlled version of the database and the version in production is a breeze. And with a click you can generate your diff DDL.
There's an instructional video on MSDN that's very helpful.
I know about DBMS_METADATA and Toad, but if someone could come up with a Data Dude for Oracle then life would be really sweet.
Have your initial create table statements in version control, then add alter table statements, but never edit files - just add more alter files, ideally named sequentially or even as a "change set", so you can find all the changes for a particular deployment.
The hardest part that I can see is tracking dependencies - e.g., for a particular deployment, table B might need to be updated before table A.
For Oracle, I use Toad, which can dump a schema to a number of discrete files (e.g., one file per table). I have some scripts that manage this collection in Perforce, but I think it should be easily doable in just about any revision control system.
Take a look at the oracle package DBMS_METADATA.
In particular, the following methods are particularly useful:
DBMS_METADATA.GET_DDL
DBMS_METADATA.SET_TRANSFORM_PARAM
DBMS_METADATA.GET_GRANTED_DDL
Once you are familiar with how they work (pretty self explanatory) you can write a simple script to dump the results of those methods into text files that can be put under source control. Good luck!
Not sure if there is something this simple for MSSQL.
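A minimal sketch of that kind of dump script (Oracle; the HR.EMPLOYEES table is just an example):
-- Optionally strip storage clauses first so the output stays diff-friendly:
EXEC DBMS_METADATA.SET_TRANSFORM_PARAM(DBMS_METADATA.SESSION_TRANSFORM, 'STORAGE', FALSE);
-- Dump the DDL for one table; spool the result to a file and check it in:
SELECT DBMS_METADATA.GET_DDL('TABLE', 'EMPLOYEES', 'HR') FROM DUAL;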
I write my db release scripts in parallel with coding, and keep the release scripts in a project specific section in SS. If I make a change to the code that requires a db change, then I update the release script at the same time.
Prior to release, I run the release script on a clean dev db (copied structure wise from production) and do my final testing on it.
I've done this off and on for years -- managing (or trying to manage) schema versions. The best approaches depend on the tools you have. If you can get the Quest Software tool "Schema Manager" you'll be in good shape. Oracle has its own, inferior tool that is also called "Schema Manager" (confusing much?) that I don't recommend.
Without an automated tool (see other comments here about Data Dude) then you'll be using scripts and DDL files directly. Pick an approach, document it, and follow it rigorously. I like having the ability to re-create the database at any given moment, so I prefer to have a full DDL export of the entire database (if I'm the DBA), or of the developer schema (if I'm in product-development mode).
PL/SQL Developer, a tool from Allround Automations, has a plugin for repositories that works OK (but not great) with Visual SourceSafe.
From the web:
The Version Control Plug-In provides a tight integration between the PL/SQL Developer IDE and any Version Control System that supports the Microsoft SCC Interface Specification. This includes most popular Version Control Systems such as Microsoft Visual SourceSafe, Merant PVCS and MKS Source Integrity.
http://www.allroundautomations.com/plsvcs.html
ER Studio allows you to reverse your database schema into the tool and you can then compare it to live databases.
Example: Reverse your development schema into ER Studio -- compare it to production and it will list all of the differences. It can script the changes or just push them through automatically.
Once you have a schema in ER Studio, you can either save the creation script or save it as a proprietary binary and keep it in version control. If you ever want to go back to a past version of the schema, just check it out and push it to your db platform.
There's a PHP5 "database migration framework" called Ruckusing. I haven't used it, but the examples show the idea: if you use the language to create the database as and when needed, you only have to track source files.
We've used MS Team System Database Edition with pretty good success. It integrates with TFS version control and Visual Studio more-or-less seamlessly and allows us to manage stored procs, views, etc., easily. Conflict resolution can be a pain, but version history is complete once it's done. Thereafter, migrations to QA and production are extremely simple.
It's fair to say that it's a version 1.0 product, though, and is not without a few issues.
You can use Microsoft SQL Server Data Tools in Visual Studio to generate scripts for database objects as part of a SQL Server project. You can then add the scripts to source control using the source control integration that is built into Visual Studio. Also, SQL Server projects allow you to verify the database objects using a compiler and generate deployment scripts to update an existing database or create a new one.
In the absence of a VCS for table changes I've been logging them in a wiki. At least then I can see when and why it was changed. It's far from perfect as not everyone is doing it and we have multiple product versions in use, but better than nothing.
I'd recommend one of two approaches. First, invest in PowerDesigner from Sybase, Enterprise Edition. It allows you to design physical data models and a whole lot more, and it comes with a repository that lets you check in your models. Each new check-in can be a new version; it can compare any version to any other version and even to what is in your database at that time. It will then present a list of every difference and ask which should be migrated, and then it builds the script to do it. It's not cheap, but it's a bargain at twice the price and its ROI is about 6 months.
The other idea is to turn on DDL auditing (works in Oracle). This will create a table with every change you make. If you query the changes from the timestamp you last moved your database changes to prod to right now, you’ll have an ordered list of everything you’ve done. A few where clauses to eliminate zero-sum changes like create table foo; followed by drop table foo; and you can EASILY build a mod script. Why keep the changes in a wiki, that’s double the work. Let the database track them for you.
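One way to get that kind of audit trail in Oracle is a database-level DDL trigger; a rough sketch (the log table name and columns are made up for illustration):
CREATE TABLE ddl_audit_log
(
    changed_at  TIMESTAMP,
    user_name   VARCHAR2(30),
    operation   VARCHAR2(30),
    object_type VARCHAR2(30),
    object_name VARCHAR2(128)
);

CREATE OR REPLACE TRIGGER trg_audit_ddl
AFTER DDL ON DATABASE
BEGIN
    -- The ORA_* event functions describe the DDL statement that just ran
    INSERT INTO ddl_audit_log
    VALUES (SYSTIMESTAMP, ORA_LOGIN_USER, ORA_SYSEVENT, ORA_DICT_OBJ_TYPE, ORA_DICT_OBJ_NAME);
END;
/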
Schema Compare for Oracle is a tool specifically designed to migrate changes from one Oracle database to another. Please visit the URL below for the download link, where you will be able to use the software for a fully functional trial.
http://www.red-gate.com/Products/schema_compare_for_oracle/index.htm
Two book recommendations: "Refactoring Databases" by Ambler and Sadalage and "Agile Database Techniques" by Ambler.
Someone mentioned Rails Migrations. I think they work great, even outside of Rails applications. I used them on an ASP application with SQL Server which we were in the process of moving to Rails. You check the migration scripts themselves into the VCS.
Here's a post by Pragmatic Dave Thomas on the subject.