How to script out stored procedures to files?

Is there a way that I can find where stored procedures are saved so that I can just copy the files to my desktop?

Stored procedures aren't stored as files, they're stored as metadata and exposed to us peons (thanks Michael for the reminder about sysschobjs) in the catalog views sys.objects, sys.procedures, sys.sql_modules, etc. For an individual stored procedure, you can query the definition directly using these views (most importantly sys.sql_modules.definition) or using the OBJECT_DEFINITION() function as Nicholas pointed out (though his description of syscomments is not entirely accurate).
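If it helps, here is a minimal sketch of that kind of query against the catalog views (it assumes you have VIEW DEFINITION permission on the procedures):
-- one row per procedure, with its full definition
select s.name as schema_name, p.name as procedure_name, m.definition
from sys.procedures as p
join sys.schemas as s on s.schema_id = p.schema_id
join sys.sql_modules as m on m.object_id = p.object_id
order by s.name, p.name;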
To extract all stored procedures to a single file, one option would be to open Object Explorer, expand your server > databases > your database > programmability and highlight the stored procedures node. Then hit F7 (View > Object Explorer Details). On the right-hand side, select all of the procedures you want, then right-click, script stored procedure as > create to > file. This will produce a single file with all of the procedures you've selected. If you want a single file for each procedure, you could use this method by only selecting one procedure at a time, but that could be tedious. You could also use this method to script all accounting-related procedures to one file, all finance-related procedures to another file, etc.
An easier way to generate exactly one file per stored procedure would be to use the Generate Scripts wizard - again, starting from Object Explorer - right-click your database and choose Tasks > Generate scripts. Choose Select specific database objects and check the top-level Stored Procedures box. Click Next. For output choose Save scripts to a specific location, Save to file, and Single file per object.
These steps may be slightly different depending on your version of SSMS.

Stored procedures are not "stored" as separate files that you're free to browse and read outside the database. They're stored in the database they belong to, in a set of system tables. The table that contains the definition is called [sysschobjs], which isn't even accessible (directly) to any of us end users.
To retrieve the definition of these stored procedures from the database, I like to use this query:
select definition from sys.sql_modules
where object_id = object_id('sp_myprocedure')
But I like Aaron's answer. He gives some other nice options.

It depends on which version of SQL Server you're running. For recent versions, the source code for stored procedures is available via the system view sys.sql_modules, but a simpler way to get the source for a stored procedure or user-defined function (UDF) is the system function object_definition() (which the view definition of sys.sql_modules itself uses):
select object_definition( object_id('dbo.my_stored_procedure_or_user_defined_function') )
In older versions, stored procedure and UDF source was available via the now-deprecated system view sys.syscomments.
And in still older versions of SQL Server, it was available via the system table dbo.syscomments.
It should be noted that, depending on your access and how the database is configured, the source may not be available to you, or it may be encrypted, which makes it not terribly useful.
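As a quick sketch (the object name is a placeholder), you can check for the encrypted case up front, since an encrypted module returns NULL from object_definition():
-- returns 1 for is_encrypted and NULL for definition when the module is encrypted
select objectproperty(object_id('dbo.my_stored_procedure_or_user_defined_function'), 'IsEncrypted') as is_encrypted,
       object_definition(object_id('dbo.my_stored_procedure_or_user_defined_function')) as definition;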
You can also get the source programmatically using SMO (SQL Server Management Objects).
http://technet.microsoft.com/en-us/library/hh248032.aspx

I recently came across an issue with programmatically extracting stored procedure scripts to file. I started off using the ROUTINE_DEFINITION approach (via INFORMATION_SCHEMA.ROUTINES), but quickly realised that I hit its 4000-character limit... No matter what I tried, I couldn't find a way to get over that hump. (Still interested to know if there's a way around this!)
Instead, I stumbled across a powerful built-in helper: sp_helptext.
In short, for the purposes of extracting stored procedure scripts specifically, sp_helptext extracts each line of source to a row in the output, i.e. 2000 lines of code = 2000 rows in the returned dataset. As long as your individual lines don't exceed the 4000-character limit, nothing will be clipped.
Of course, you can then write the entire table contents to a file pretty easily, either in SQL or, in my case, SSIS.
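For illustration, a rough sketch of that approach (the temp table and procedure name are just placeholders):
-- capture sp_helptext output (one row per line of source) into a temp table
create table #proc_lines (line nvarchar(max));
insert into #proc_lines (line)
exec sp_helptext 'dbo.my_procedure';
-- from here the rows can be written out to a file by SSIS, bcp, etc.
select line from #proc_lines;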

In case someone comes across this problem, I guess the fastest way to extract all the items (stored procedures, views, user-defined tables, functions) is to create a Database Project in any solution, then import everything with Schema Compare and voilà, you have all the items nicely created in the corresponding folders.

Related

create alias for database name within same server

We have three databases on the same server (dev, test and uat). I am using a fourth database to perform some operations. I have views and stored procs created which reference the dev db. When I want to promote the code, I need to change the db name in all the views and stored procs. Is there a better way of doing this? We are constrained to a single server for all three environments.
Thanks
shankara Narayanan
Always script everything. Then you have a nice .SQL file that you can manipulate in whatever way is necessary. I prefer to set them all up with DROP/CREATE pairs for every view, SP and function. If any of them needs to change, I update the script and rerun the whole thing.
I usually use a separate script file for the tables.
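For example, a typical DROP/CREATE pair in such a script might look like this (the object, column and database names are placeholders; the three-part name is the bit you edit when promoting):
if object_id('dbo.vw_MyView', 'V') is not null
    drop view dbo.vw_MyView;
go
create view dbo.vw_MyView
as
    select col1, col2
    from DevDb.dbo.SomeTable;  -- change DevDb to the test/uat database when promoting
go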

How to store/organize DDL script?

First of all, I'm using MySQL on the cloud (Amazon RDS). My database definition script has statements to create views, triggers, stored procedures and users, grant permissions to users, plus insert some data (e.g. look-up tables), etc. This script has 2000 lines of SQL code. I keep this script in just one file and I execute it using: mysql --user=myusername --password=mypassword < my.script.sql. This file is under SVN.
The issue with having all the SQL code in one file is that it's difficult to see the SVN history for just one item (say I want to see the SVN history for the table Task and the view TaskView). So my question is: how do people store such scripts? Do professionals store each item (table, view, stored procedure) in its own file in a directory? If so, does one have to make a script that deploys all the mini SQL scripts in a folder? Do people just make a script that looks for every .SQL file and dumps it on the DB? Do people use various folders to organize such a script, e.g. one folder for views, one folder for tables, one folder for stored procedures?
Cheers!
We have the following folder structure:
+ddl
....group1_ddl.sql
....group2_ddl.sql
+procedures
---level1
......single_sp.sql
......another_sp.sql
---level2
......another_uses_level1_sp.sql
---leveln
......remaining_sp.sql
+views
---level1
......group_of_views.sql
As you can see, we have 3 top-level folders: one each for DDL, SPs and views.
DDL
90% of the time we have one DDL script for all the tables.
Sometimes we maintain DDL scripts separately when they can be separated logically,
ex: staging_ddl.sql, aggregate_ddl.sql
The DDL script includes PK and FK constraints and also additional indexes.
Stored Procedures
Note the multiple folders (level1, level2): since our entire ETL
& business logic is implemented in stored procedures, we have a lot of SPs
(dozens) with hundreds of lines of code each. Since we wrote modular
code, some SPs depend on other SPs, so the SPs which
depend on other SPs go to a higher level.
ex: In our scenario main_sp.sql is one SP which runs the entire workflow; this SP in turn calls the rest of the SPs in sequential order, and they in turn may or may not call other SPs,
so main_sp.sql goes to level3, child_sp.sql goes to level2,
grand_child_sp.sql goes to level1.
The file name is the same as the SP name.
Views:
If your views are less complex and you think you can maintain them easily,
you can manage them in a single script.
But in our case there are some views with close to 2000 lines, so
we maintained them in one script per view.
Mostly we try to avoid using a view in another view; in the cases where we did,
we maintain the multiple-level hierarchy as explained above,
otherwise we maintain a single script per view.
The file name is the same as the view name.
This is how I have been managing the scripts successfully for over 7 years.
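For what it's worth, one way to stitch the per-object files back together for deployment is a master script of mysql client source commands (the file names below are placeholders matching the layout above; just a sketch):
-- master_deploy.sql, run as: mysql --user=myusername --password=mypassword < master_deploy.sql
source ddl/group1_ddl.sql
source procedures/level1/single_sp.sql
source procedures/level2/another_uses_level1_sp.sql
source views/level1/group_of_views.sql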
Hope this helps

Batch file to get sql backup scripts

Is there any way I can use batch files to take a backup of selected scripts from the SQL database...?
Say - I have one stored procedure, one function and one view in a folder.
sp1.sql
vie1.sql
fn1.sql
Before running the batch file, I want to take a backup of these scripts.
Kindly note: I do not want to take an entire database backup, just the provided scripts alone.
Please help me achieve this...
The specific answer depends entirely on the flavor of your database engine. But the general answer is that you need to SELECT the definition from your database's data catalog (metadata). The function and procedure definitions will probably come out intact, but the view definition may come out as just the SELECT statement - you might have to prefix it with the CREATE VIEW XXXXXXX AS part.
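As a rough sketch for SQL Server (the object names are guessed from the file names in the question, and running it from the batch file via sqlcmd with output redirected to a file is an assumption):
-- pull the current definitions of the three objects before overwriting them
select object_definition(object_id('dbo.sp1'));
select object_definition(object_id('dbo.vie1'));
select object_definition(object_id('dbo.fn1'));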

Bteq Scripts to copy data between two Teradata servers

How do I copy data from multiple tables within one database to another database residing on a different server?
Is this possible through a BTEQ Script in Teradata?
If so, provide a sample.
If not, are there other options to do this other than using a flat-file?
This is not possible using BTEQ since you have mentioned both the databases are residing on different servers.
There are two solutions for this.
Arcmain - You need to use Arcmain Backup first, which creates files containing data from your tables. Then you need to use Arcmain restore which restores the data from the files
TPT - Teradata Parallel Transporter. This is a very advanced tool. This does not create any files like Arcmain. It directly moves the data between two Teradata servers. (Wikipedia)
If I am understanding your question, you want to move a set of tables from one DB to another.
You can use the following syntax in a BTEQ Script to copy the tables and data:
CREATE TABLE <NewDB>.<NewTable> AS <OldDB>.<OldTable> WITH DATA AND STATS;
Or just the table structures:
CREATE TABLE <NewDB>.<NewTable> AS <OldDB>.<OldTable> WITH NO DATA AND NO STATS;
If you get real savvy, you can create a BTEQ script that dynamically builds the above statement in a SELECT statement, exports the results, then in turn runs the newly exported file, all within a single BTEQ script.
There are a bunch of other options that you can do with CREATE TABLE <...> AS <...>;. You would be best served reviewing the Teradata Manuals for more details.
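A very rough, untested sketch of that dynamic-build idea (the logon string, database names and file name are placeholders, and it assumes both databases are on the same system, since CREATE TABLE ... AS cannot cross servers):
.LOGON myserver/myuser,mypassword;
.SET WIDTH 500;
.EXPORT REPORT FILE = copy_tables.bteq;
SELECT 'CREATE TABLE NewDB.' || TRIM(TableName) ||
       ' AS OldDB.' || TRIM(TableName) || ' WITH DATA AND STATS;' (TITLE '')
FROM DBC.TablesV
WHERE DatabaseName = 'OldDB'
  AND TableKind = 'T';
.EXPORT RESET;
.RUN FILE = copy_tables.bteq;
.LOGOFF;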
There are a few more options which will allow you to copy from one table to another.
Possibly the simplest way would be to write a smallish program which uses one of their communication layers (ODBC, .NET Data Provider, JDBC, CLI, etc.) to run a SELECT statement against one server and the corresponding INSERT statements against the other. This would require some work, but it would have less overhead than trying to learn how to write TPT scripts. You would not need any 'DBA' permissions to write your own.
Teradata also sells other applications which hide the complexity of some of these tools. Teradata Data Mover provides an abstraction layer between tools like Arcmain and TPT. Access to this tool is most likely restricted to DBA types.
If you want to move data from one server to another server, then we can do this with a flat file.
First we have to fetch the data from the source table into a flat file using a utility such as BTEQ or FastExport,
then we can load this data into the target table with the help of MultiLoad, FastLoad or BTEQ scripts.
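A bare-bones sketch of that flat-file approach using just BTEQ (server names, credentials and the column definitions are all placeholders). First, an export script run against the source server:
.LOGON sourceserver/myuser,mypassword;
.EXPORT DATA FILE = mytable.dat;
SELECT * FROM OldDB.MyTable;
.EXPORT RESET;
.LOGOFF;
Then an import script run against the target server:
.LOGON targetserver/myuser,mypassword;
.QUIET ON;
.IMPORT DATA FILE = mytable.dat;
.REPEAT *
USING (c1 INTEGER, c2 VARCHAR(50))
INSERT INTO NewDB.MyTable (col1, col2) VALUES (:c1, :c2);
.LOGOFF;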

How to backup Sql Server to sql file?

In "Back UP" I only get a bak file, but I would like to create .sql file
Use SQL Server's Generate Scripts command:
1. Right-click on the database; Tasks -> Generate Scripts.
2. Select your tables, click Next.
3. Click the Advanced button.
4. Find Types of data to script - choose Schema and Data.
5. You can then choose to save to file, or put in a new query window.
This results in CREATE and INSERT statements for all table data selected in step 2.
This is a possible duplicate of: SQL script to get table content as "SELECT * FROM tblname"
To do a full database backup to File/Query you can use the 'Generate Scripts...' option on the Database.
Open SQL Server Management studio, right click on the database and choose 'Tasks->Generate Scripts...'
Then use the wizard to back up the database. You can script the whole database or parts of it. Two important options: in the 'Advanced' section, you will probably want to ensure 'Types of data to script' is set to 'Schema and data' and that 'Script Statistics' is on.
This will produce a *.sql file that you can use as a backup that includes the schema and table data.
OK, I read through most of these, but I had no "Advanced" button. There is still a way to do it; it's just a little hard to find, so here you go:
You can generate a script from a database, see http://msdn.microsoft.com/en-us/library/ms178078.aspx
If you want to create a script of your database you right-click on the databases and Generate Scripts (it's in different sub-menus depending on what version of SQL and Enterprise Manager / SQL Server Management studio you're using).
That will, however, only get you the database objects. It will not generate scripts for data. Backing up a database will give you all of the database objects as well as the data, depending on what recovery model your database is set to.
This fellow may have achieved what you are trying to do by creating the backup, and then restoring it and giving it a new name.
This approach copies the data along with all of the database objects.
If you want a file with INSERT statements for your data, have a look here:
This procedure generates INSERT statements using existing data from the given tables and views. Later, you can use these INSERT statements to generate the data. It's very useful when you have to ship or package a database application. This procedure also comes in handy when you have to send sample data to your vendor or technical support provider for troubleshooting purposes.
http://vyaskn.tripod.com/code.htm#inserts
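If you install the procedure from that page (it is typically created as sp_generate_inserts in the master database; treat the name and parameter shown here as assumptions), usage is roughly:
-- hypothetical usage sketch: generate INSERT statements for one table
EXEC sp_generate_inserts 'your_table_name'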