I want to generate scripts (i.e. stored procedures and user-defined functions) from SQL Server 2005. I have done it through SQL Server Management Studio, but I want to do it from the command line. Is there any way to do it through command-line tools such as sqlcmd.exe, or any other scripting approach?
Use sp_helptext with a query from SQLCMD:
http://msdn.microsoft.com/en-us/library/ms176112.aspx
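For instance, sp_helptext can be driven straight from SQLCMD; a minimal sketch, assuming a trusted connection to a local instance and a procedure named dbo.MyProc (substitute your own server, database, and object name):

```
sqlcmd -E -S .\sqlexpress -d test -h -1 -o MyProc.sql -Q "EXEC sp_helptext 'dbo.MyProc'"
```

sp_helptext returns the definition one line per row, so the -h -1 switch (no headers) keeps the output file clean.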
If you prefer something with a little more flexibility, check out the sys.sql_modules system view: it gives you a bit of extra information about the object and has the nice property of being joinable to other system tables.
The rest of MauMen's answer is the right direction though: use SQLCMD to write the output to a file with "-o OutputFile.sql".
For example, create a proc:
create proc dbo.pr_procDefTest
as
print 'This is a proc definition'
go
Show the definition:
select m.definition
from sys.procedures p
inner join sys.sql_modules m on p.object_id = m.object_id
where p.name = 'pr_procDefTest'
Output with SQLCMD:
sqlcmd -E -S .\sqlexpress -d test -o OutputFile.sql -Q "set nocount on; select m.definition from sys.procedures p inner join sys.sql_modules m on p.object_id = m.object_id where p.name = 'pr_procDefTest'" -h -1 -w 65535
There are a variety of parameters passed in, which you can look up on the sqlcmd Utility MSDN page. The important thing to note is the use of "set nocount on;" at the beginning of the script to suppress the "(1 rows affected)" footer.
Microsoft released a new tool a few weeks ago called mssql-scripter that's the command-line version of the "Generate Scripts" wizard in SSMS. It's a Python-based, open-source command-line tool and you can find the official announcement here. Essentially, the scripter allows you to generate a T-SQL script for your database/database objects as a .sql file. You can generate the file and then execute it. This might be a nice solution for you to generate the schema and/or data of your db objects such as stored procs. Here's a quick usage example to get you started:
$ pip install mssql-scripter
# script the database schema and data, redirected to a file.
$ mssql-scripter -S localhost -d AdventureWorks -U sa --include-objects StoredProcs --schema-and-data > ./myFile.sql
More usage examples are on our GitHub page here: https://github.com/Microsoft/sql-xplat-cli/blob/dev/doc/usage_guide.md
Related
I want to export the result set of an SQL stored procedure on one server into a table on another server. Is there a way I can do this?
Please help.
Sometimes using a Linked Server is also an option (I try to avoid them for several reasons), especially if you have enough permissions and need to do this as a one-off. Then your code would simply look like:
INSERT INTO somelinkedservername.somedatabasename.dbo.sometable
EXEC dbo.thesproc
But depending on the size of the result set, I prefer to use BCP and a file share to keep it simple, and to put the code in a SQL Agent job for easy oversight:
bcp.exe "EXEC [AdventureWorks].[dbo].[uspGetEmployees] @managerId = 666" queryout "\\SomeShare\Temp\emps.txt" -S ServerA -T -c
And then load the file into a table on the other server (the target table here is a placeholder):
bcp.exe [SomeDatabase].[dbo].[SomeTargetTable] in "\\SomeShare\Temp\emps.txt" -S ServerB -T -c
Server Objects > Linked Server > Right Click > New Linked Server
Data transfer: SELECT * FROM [Server Name].[Database Name].[Schema Name].[Table Name]
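The same linked server can be created in T-SQL instead of through the GUI; a minimal sketch using sp_addlinkedserver (the server names here are placeholders, and the security configuration will vary with your environment):

```sql
-- Register the remote server as a linked server
EXEC sp_addlinkedserver
    @server = N'SomeLinkedServerName',
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'RemoteHostName';

-- Map logins; @useself = 'TRUE' passes the caller's own credentials through
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'SomeLinkedServerName',
    @useself = N'TRUE';
```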
You can do something like this:
execute ('EXECUTE DatabaseName.Schema.ProcedureName @Parameter1 = ?, @Parameter2 = ?', @Parameter1Value, @Parameter2Value) at [ServerName]
I am accessing my PostgreSQL database (9.3) via R using the RPostgreSQL package.
I have a few very long and big SQL queries (several MB in size, generated from raster2pgsql).
How can I send / execute SQL query files as statements within R?
The normal way
\i query.sql
does not seem to work via dbSendQuery.
I tried to read the whole SQL file in as a character vector via readLines, but this also fails, apparently because dbSendQuery only supports a single command.
dbSendQuery or dbGetQuery is just for the SQL part, not psql meta-commands such as \i.
In your case the simplest is indeed to use readLines and then wrap dbGetQuery in a sapply call. (Note that this assumes each query sits on a single line of the file; multi-line statements would need to be collapsed with paste first.)
con <- dbConnect(...) #Fill this as usual
queries <- readLines("query.sql")
sapply(queries, function(x) dbGetQuery(con,x))
dbDisconnect(con)
Since I use this very often, I have a shortcut for this in my .Rprofile file:
dbGetQueries <- function(con, queries) sapply(queries, function(x) dbGetQuery(con, x))
Of course, you can also go the system way:
system("psql -U username -d database -h 127.0.0.1 -p 5432 -f query.sql") #Remember to use your actual username, database, host and port
Looked everywhere... to no avail.
I am trying to do a basic select using SQLCMD from the command line:
sqlcmd -S myServer -d myDB -E
So far so good.
select * from myTable
Nothing, just goes to the next line. Shouldn't it display a table with values ? Or at least "n row(s) returned" ?
I also tried the -o param: it creates an empty file.
When you use the SQLCMD tool in interactive mode, the statements that you enter are sent to the server only when you issue the keyword GO.
GO signals both the end of a batch and the execution of any cached Transact-SQL statements. When specifying a value for count, the cached statements will be executed count times, as a single batch.
See Use the sqlcmd Utility, specifically the section titled Running Transact-SQL Statements Interactively by Using sqlcmd.
So in your case:
select * from myTable [Enter]
GO [Enter]
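If you would rather skip interactive mode entirely, the same statement can be sent as a single batch with the -Q switch, which runs the query and then exits (the server and database names here are placeholders):

```
sqlcmd -S myServer -d myDB -E -Q "select * from myTable"
```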
I need to export a SQL query in SQL Server to an XML file.
So far, I have made a query that is:
select *
from
products
for xml path ('product'), root ('Products');
With this query, the result is correct, but I have not found a way to export it to a file.
My idea is to make the export from SQL Management Studio, if possible.
If this option is not possible, I would appreciate a hand finding out what other options I can use.
The other options I've seen are SQLCMD and .NET with Visual Basic.
In Management Studio:
Run query as above.
Click on the XML link in the results - this will open the XML in a new window.
Navigate to the new window and File -> Save As - this should save as XML by default.
A bit of a manual process but maybe useful for an ad-hoc scenario?
For the sake of others who will be looking for this answer.
This can be achieved in two ways.
use EXEC xp_cmdshell to call bcp, adding queryout "Report.xml" to the syntax so that it saves the result as an XML file (here with a trusted connection, -T, and character format, -c):
EXEC master.dbo.xp_cmdshell 'bcp "SELECT * FROM DataTable" queryout Report.xml -S [ServerName] -T -c'
use the sqlcmd tool. Save your query as input.sql, then run:
sqlcmd -S <your-server> -i input.sql -o Report.xml
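One caveat with this approach: sqlcmd wraps long result lines, which can mangle FOR XML output. sqlcmd has an :XML ON command for exactly this case; a sketch of what input.sql might contain (the table and element names are illustrative, echoing the query earlier in this thread):

```sql
:XML ON
SELECT * FROM products FOR XML PATH('product'), ROOT('Products');
GO
```

With :XML ON in effect, sqlcmd emits the XML as a continuous stream without headers or line-width formatting.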
What is the age-old method of testing the execution time of stored procedures on Informix 11.5? I am thinking of doing something like this from a Unix prompt:
$ time (echo 'execute procedure foo(1)' | dbaccess ...)
Any other ideas?
Sure, you can do something more elaborate, but if that's all you need, why bother? If there are more steps, move the SQL into a separate file and run
time dbaccess <dbname> file.sql
I use my SQLCMD program (not to be confused with Microsoft's sqlcmd) for this sort of job. It has a benchmark mode (the -B option), and also makes it easier to write the SQL:
sqlcmd -d stores -B -e 'execute procedure foo(1)'
It is open source and available from the IIUG Software Archive.