Multiple outputs from single SP - SQL Server 2008 - sql

I've been testing multiple approaches but keep running into issues. I've created an SP and a BAT script to export a file via BCP and send it to a third party. Normally I would BCP it within the SP, but due to server-to-folder connectivity, I'm trying to handle the %OUTFILE% within the BAT. (If I'm overcomplicating it, let me know.)
I can't post the entire code, so I'll pseudo-code it.
CREATE PROCEDURE dbo.SPICreated
AS
BEGIN
    {{{populates a temp table}}}
    SELECT {requirements} FROM #table;
    SELECT {requirements2} FROM #table;
    SELECT {requirements3} FROM #table;
END
Now, this works just fine in live form.
The BAT file I sent the client is:
SET hourVAR
SET OUTFILE="{FileDirectory}"
bcp "exec SPICreated" queryout %OUTFILE% params
Normally I would do this either within a multi-step job (I can't set up a job for them, though), or I would make the BAT file include the entire BCP "SELECT FROM", but the SELECT is ~30 columns long, and because of the 3rd-party vendor I'm trying to keep all the 'bulk' in the SP.
Can anyone provide insight on how I might do this better? If I assign a variable to the "SELECT" portion, can I call it from the BAT file? SQL Server is not my forte.
(Trying not to create 3 duplicate SPs, and trying to avoid a ~100-line BAT file.)
--- For those wondering: running the SP with all 3 SELECTs broke the BCP export; it appears BCP cannot cleanly consume a procedure that returns multiple result sets.
Additional Info: This is all from the same table, but I need 3 different data sets in 3 different documents.
Data Resembles:
1|2|3|4|5
A|B|C|D|E
Z|X|Y|V|C
AA|BB|3|D|5
I need Document One to be
1|2|3
A|B|C
Z|X|Y
AA|BB|3
I need document Two to be
1|5
A|E
Z|C
AA|5
I need document Three to be
1|4
A|D
Z|V
AA|D
EDIT: Added data examples to assist. The queries to get the data already work within a view, but not for BCP.
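One hedged way to get three files without three near-duplicate SPs is to keep a single procedure but parameterize which result set it returns, so each BCP call sees exactly one result set. The procedure body, column names, and BAT details below are a sketch, not the poster's confirmed code:

CREATE PROCEDURE dbo.SPICreated
    @DocNumber int   -- 1, 2, or 3: which document's columns to return
AS
BEGIN
    {{{populates a temp table, as before}}}
    IF @DocNumber = 1
        SELECT col1, col2, col3 FROM #table;   -- Document One
    ELSE IF @DocNumber = 2
        SELECT col1, col5 FROM #table;         -- Document Two
    ELSE IF @DocNumber = 3
        SELECT col1, col4 FROM #table;         -- Document Three
END

The BAT file then calls BCP three times, once per document:

SET OUTFILE1="{FileDirectory}\doc1.txt"
SET OUTFILE2="{FileDirectory}\doc2.txt"
SET OUTFILE3="{FileDirectory}\doc3.txt"
bcp "exec MyDb.dbo.SPICreated 1" queryout %OUTFILE1% -c -t"|" -S %SERVERNAME% -T
bcp "exec MyDb.dbo.SPICreated 2" queryout %OUTFILE2% -c -t"|" -S %SERVERNAME% -T
bcp "exec MyDb.dbo.SPICreated 3" queryout %OUTFILE3% -c -t"|" -S %SERVERNAME% -T

(MyDb and %SERVERNAME% are placeholders. One caveat: on SQL Server 2008, bcp inspects the query with SET FMTONLY ON first, which can fail for procedures that build temp tables; swapping #table for a table variable is a common workaround.)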

Related

Do while loop with GPDB using Talend

I have a very large data set in GPDB from which I need to extract close to 3.5 million records. I use this for a flat file which is then used to load different tables. I use Talend, and do a select * from the table using the tGreenplumInput component and feed that to a tFileOutputDelimited. However, due to the very large volume of the file, I run out of memory while executing it on the Talend server.
I lack super-user permissions and am unable to do a \copy to output it to a CSV file. I think something like a do-while, or a tLoop with a more limited number of rows, might work for me. But my table doesn't have any row_id or uid to distinguish the rows.
Please help me with suggestions on how to solve this. I appreciate any ideas. Thanks!
If your requirement is to load data into different tables from one table, then you do not need to go through a file first and then load from the file into the tables.
There is a component named tGreenplumRow which allows you to write direct SQL queries (DDL and DML) in it.
Below is a sample job.
If you notice, there are three insert statements inside this component. They will be executed one by one, separated by semicolons.
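For illustration, the SQL placed inside tGreenplumRow could look like the following (table and column names are made up):

-- three INSERT ... SELECT statements, executed one by one, separated by semicolons
INSERT INTO target_table_a SELECT id, col_a FROM source_table;
INSERT INTO target_table_b SELECT id, col_b FROM source_table WHERE col_b IS NOT NULL;
INSERT INTO target_table_c SELECT id, col_c FROM source_table WHERE col_c > 0;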

Execute SQL task (SSIS) and then insert the result set into a table on a different server

This is more of a generic question:
I have file1.sql, file2.sql, file3.sql in a folder. I can run a Foreach container to loop through the files and execute them, but I need each result set to go to its respective table on a different server:
file1 result set --> Server2.TableA
file2 result set --> Server2.TableB .. etc
How can this be achieved with SSIS techniques?
You can do this with a Script Task in the Foreach Loop that analyzes the result set and inserts it into the appropriate destination table.
You could also put all the records into a staging table on one server, with additional columns for the server they will go to and an IsProcessed bit field.
At this point you could do any clean-up of the data that is required.
Then create a separate dataflow for each server to grab the unprocessed records for that server. After they are sent, mark the records as processed.
This will work if you only have a few servers. If there are many possibilities, or you expect the number to keep changing, I would go with @TabAlleman's suggestion.
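A minimal sketch of that staging pattern (the column names and the Server2 tag are assumptions):

-- staging table on one server; every result set lands here, tagged with its destination
CREATE TABLE dbo.ResultStaging (
    Col1 varchar(100) NULL,           -- placeholder result columns
    Col2 varchar(100) NULL,
    TargetServer sysname NOT NULL,    -- which server this row should go to
    IsProcessed bit NOT NULL DEFAULT (0)
);

-- each per-server dataflow pulls its own unprocessed rows...
SELECT Col1, Col2
FROM dbo.ResultStaging
WHERE TargetServer = N'Server2' AND IsProcessed = 0;

-- ...and marks them processed after a successful transfer
UPDATE dbo.ResultStaging
SET IsProcessed = 1
WHERE TargetServer = N'Server2' AND IsProcessed = 0;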
thestralFeather,
If you are new to SSIS, refer to MSDN's tutorial on looping with SSIS here. If you look at this page within the tutorial, you will see the output destination in the dataflow. @TabAlleman and @HLGEM have provided good advice. When you look at the pages I've referred you to, just think in terms of 2 separate loops dropping data to a single location that you can manage in a target dataflow.

SQL Server job to execute query from the output CSV file of first step

This is my first job-creation task as a SQL DBA. The first step of the job runs a query and sends the output to a .CSV file. As the last step, I need the job to execute the query from the .CSV file (the output of the first step).
I have Googled all possible combinations, but no luck.
Your question got lost somehow ...
Your last two comments make it a little clearer.
If I understand it correctly, you create a SQL script which restores all the logins, roles and users, their rights, etc. into a newly created DB.
If this generated script is executable within a query window, you can easily execute it with EXECUTE (https://msdn.microsoft.com/de-de/library/ms188332(v=sql.120).aspx)
Another approach could be SQLCMD (http://blog.sqlauthority.com/2013/04/10/sql-server-enable-sqlcmd-mode-in-ssms-sql-in-sixty-seconds-048/)
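For example, if the first step wrote its script to C:\output\restore_users.sql (a hypothetical path), a later job step could run it from the command line:

sqlcmd -S MyServer -d MyDatabase -E -i "C:\output\restore_users.sql"

or load it into a variable and hand it to EXECUTE from T-SQL:

-- read the whole script file into a variable, then execute it
DECLARE @sql nvarchar(max);
SELECT @sql = BulkColumn
FROM OPENROWSET(BULK 'C:\output\restore_users.sql', SINGLE_CLOB) AS f;
EXECUTE (@sql);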
If you need further help, please come back with more details: What does your "CSV" look like? What have you tried so far?

Verify multiple tables and copy data in ssis/BIML?

I have a package that has about 6 or 7 dataflow tasks. Within those dataflow tasks, I have from 5 to 70 tasks that copy data from a source (an Oracle database) to a destination (a SQL Server database). I need to take a count of each source table, and if the source is not empty I will copy the data. I presently have an Execute SQL task that truncates all the tables. I would like to truncate only if my parameter is > 0. But with my number of tables (177), I can't afford to use a variable for each one to hold the result of the count and then test the rest. Can I make something work with BIML? Can I use a stored procedure and loop through it? I need some advice.
EDIT:
I think I did not explain myself correctly. I have multiple dataflow tasks with a lot of source-to-destination copies. In my control flow, I have an Execute SQL task that truncates all my 177 tables. I need to do a count on all the source tables and store the results so I can send them to my Execute SQL task. After that, I want to check whether my variable is > 0; if it is not, I would skip the task. Is there any easier way to do this than creating 177 variables?
Thanks.
I hope I'm not too late for you. You can use bimlonline.com to reverse-engineer your package.
Bimlonline.com is free.
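To avoid 177 per-table variables, one hedged sketch is to loop over the table list in T-SQL and return every count in a single result set that the package (or a truncation procedure) can consume. The counts would really need to run against the Oracle source (e.g. over a linked server); the table names below are placeholders:

DECLARE @counts TABLE (TableName sysname NOT NULL, RowsFound int NOT NULL);
DECLARE @name sysname, @sql nvarchar(max), @c int;

DECLARE tbl_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM sys.tables
    WHERE name IN (N'Table1', N'Table2');   -- placeholder for the 177 source tables
OPEN tbl_cur;
FETCH NEXT FROM tbl_cur INTO @name;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- build and run a COUNT(*) for each table, capturing the result
    SET @sql = N'SELECT @c = COUNT(*) FROM ' + QUOTENAME(@name) + N';';
    EXEC sp_executesql @sql, N'@c int OUTPUT', @c = @c OUTPUT;
    INSERT INTO @counts (TableName, RowsFound) VALUES (@name, @c);
    FETCH NEXT FROM tbl_cur INTO @name;
END
CLOSE tbl_cur;
DEALLOCATE tbl_cur;

SELECT TableName, RowsFound FROM @counts;   -- truncate/copy only where RowsFound > 0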

How to script out stored procedures to files?

Is there a way that I can find where stored procedures are saved so that I can just copy the files to my desktop?
Stored procedures aren't stored as files, they're stored as metadata and exposed to us peons (thanks Michael for the reminder about sysschobjs) in the catalog views sys.objects, sys.procedures, sys.sql_modules, etc. For an individual stored procedure, you can query the definition directly using these views (most importantly sys.sql_modules.definition) or using the OBJECT_DEFINITION() function as Nicholas pointed out (though his description of syscomments is not entirely accurate).
To extract all stored procedures to a single file, one option would be to open Object Explorer, expand your server > databases > your database > programmability and highlight the stored procedures node. Then hit F7 (View > Object Explorer Details). On the right-hand side, select all of the procedures you want, then right-click, script stored procedure as > create to > file. This will produce a single file with all of the procedures you've selected. If you want a single file for each procedure, you could use this method by only selecting one procedure at a time, but that could be tedious. You could also use this method to script all accounting-related procedures to one file, all finance-related procedures to another file, etc.
An easier way to generate exactly one file per stored procedure would be to use the Generate Scripts wizard - again, starting from Object Explorer - right-click your database and choose Tasks > Generate scripts. Choose Select specific database objects and check the top-level Stored Procedures box. Click Next. For output choose Save scripts to a specific location, Save to file, and Single file per object.
These steps may be slightly different depending on your version of SSMS.
Stored procedures are not "stored" as separate files that you're free to browse and read without the database. They're stored in the database they belong to, in a set of system tables. The table that contains the definition is called [sysschobjs], which isn't even accessible (directly) to any of us end users.
To retrieve the definition of these stored procedures from the database, I like to use this query:
select definition from sys.sql_modules
where object_id = object_id('sp_myprocedure')
But I like Aaron's answer. He gives some other nice options.
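To pull every procedure's definition in one result set, a sketch over the same views:

-- list each procedure's schema, name, and full source
SELECT OBJECT_SCHEMA_NAME(p.object_id) AS SchemaName,
       p.name AS ProcedureName,
       m.definition
FROM sys.procedures AS p
JOIN sys.sql_modules AS m ON m.object_id = p.object_id;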
It depends on which version of SQL Server you're running. For recent versions, source code for stored procedures is available via the system view sys.sql_modules, but a simpler way to get the source of a stored procedure or user-defined function (UDF) is the system function object_definition() (which the view definition of sys.sql_modules itself uses):
select object_definition( object_id('dbo.my_stored_procedure_or_user_defined_function') )
In older versions, stored procedure and UDF source was available via the now-deprecated system view sys.syscomments.
And in older yet versions of SQL Server, it was available via the system table dbo.syscomments.
It should be noted that, depending on your access and how the database is configured, the source may not be available to you, or it may be encrypted, which makes it not terribly useful.
You can also get the source programmatically using SMO (SQL Server Management Objects).
http://technet.microsoft.com/en-us/library/hh248032.aspx
I recently came across an issue with programmatically extracting stored procedure scripts to file. I started off using the ROUTINE_DEFINITION approach, but quickly realised that I hit the 4000-character limit... No matter what I tried, I couldn't find a way over that hump. (Still interested to know if there's a way around this!)
Instead, I stumbled across a powerful built-in helper: sp_helptext.
In short, for the purpose of extracting stored procedure scripts specifically, sp_helptext extracts each line to a row in the output, i.e., 2000 lines of code = 2000 rows in the returned dataset. As long as your individual lines don't exceed the 4000-character limit, nothing will be clipped.
Of course, you can then write the entire table contents to file pretty easily, either in SQL or, in my case, SSIS.
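For example (the procedure name is a placeholder):

EXEC sp_helptext 'dbo.my_stored_procedure';   -- returns one row per line of source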
In case someone comes across this problem: I guess the fastest way to extract all the items (stored procedures, views, user-defined tables, functions) is to create a database project in any solution, then import everything with Schema Compare, and voilà, you have all the items nicely created in corresponding folders.