SQL Server: how to dump/extract each row to a separate file

I have a business requirement to take a table with columns PrimaryKey, A, B, C, D, E and dump each row to a file such as:
filename: Primarykey.txt
(row 1) A
(row 2) B
(row 3) C
etc.
Is there a good way to do this with SQL Server 2008 or should I write a C# program using a datatable? The table I am using has about 200k rows in it.
Thanks

The links below contain some previous posts and another link to a possible solution using SSIS.
You might have to play around until you get what you want.
Good luck.
Some clues here
or
SQL Server Forums - Create multiple files from SSIS

Using xp_cmdshell
If xp_cmdshell is permitted, you might be able to use something like this:
declare @path varchar(200)
declare @pk varchar(100)
declare @sql varchar(2000)
declare @cmd varchar(2500)
set @path = 'C:\your\file\path\'
-- one file per row: loop over the primary keys and bcp each row out
declare rowz cursor for (select PrimaryKey from TableA)
open rowz
fetch next from rowz into @pk
while @@FETCH_STATUS = 0
begin
    -- character mode (-c) with -t\n as field terminator puts each column on its own line
    set @sql = 'select A, B, C, D, E from TableA where PrimaryKey = ''' + @pk + ''''
    set @cmd = 'bcp "' + @sql + '" queryout "' + @path + @pk + '.txt" -T -c -t\n'
    exec xp_cmdshell @cmd
    fetch next from rowz into @pk
end
close rowz
deallocate rowz
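If xp_cmdshell is disabled on the instance, it may need to be enabled first (this requires sysadmin rights and has security implications):
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;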
Using sqlcmd
Alternately, you can create a series of statements to run through sqlcmd to create the separate files.
First, create a temporary table and add some boilerplate entries:
create table #temp (tkey int, things varchar(max))
insert into #temp (tkey, things) values (0, 'SET NOCOUNT ON;
GO'),(2, ':out stdout')
Then, fill the temporary table with the queries that we need:
declare @dir varchar(250)
declare @sql varchar(5000)
declare @cmd varchar(5500)
declare @schema varchar(100)
declare @pk varchar(100)
set @schema = 'YourSchema'
declare rowz cursor for (select PrimaryKey from "YourSchema"..TableA)
open rowz
fetch next from rowz into @pk
while @@FETCH_STATUS = 0
begin
set @dir = '"C:\your\file\path\'+@pk+'.txt"'
set @sql = '
SELECT A +''
''+ B +''
''+ C +''
''+ D +''
''+ E
FROM "'+@schema+'"..TableA where PrimaryKey = '+@pk
set @cmd = ':out '+@dir+@sql+'
GO'
insert into #temp (tkey,things) values (1, @cmd)
fetch next from rowz into @pk
end
close rowz
deallocate rowz
Next, query the temporary table to get the generated queries:
select things from #temp order by tkey
Finally, copy the results to a file and send this file as input to sqlcmd.exe at the command prompt with the parameter -h -1:
C:\your\file\path>sqlcmd -h -1 -S YourServer -i "script.sql"

Stored procedure to drop the column in SQL Server

I created many tables and I have noticed that I created one useless column in all of them. I want to create a stored procedure which will drop that one specific column and can be used on all the tables.
I created this stored procedure but I'm getting an error. Please help.
You cannot parameterize table and column names with parameters - those are only valid for values, not for object names.
If this is a one-time operation, the simplest option would be to generate the ALTER TABLE ... DROP COLUMN ... statements in SSMS using this code:
SELECT
'ALTER TABLE ' + SCHEMA_NAME(t.schema_id) + '.' + t.Name +
' DROP COLUMN Phone;'
FROM
sys.tables t
and then execute this code in SSMS; the output is a list of statements which you can copy & paste into a new SSMS window and execute.
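The generated output is one statement per table, something like this (hypothetical table names):
ALTER TABLE dbo.Customers DROP COLUMN Phone;
ALTER TABLE dbo.Orders DROP COLUMN Phone;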
If you really want to do this as a stored procedure, you can apply the same basic idea - use code (a cursor) to iterate over the generated commands and execute them - something like this:
CREATE PROCEDURE dbo.DropColumnFromAllTables (@ColumnName NVARCHAR(100))
AS
BEGIN
    DECLARE @SchemaName sysname, @TableName sysname
    -- define cursor over all tables which contain the column in question
    DECLARE DropCursor CURSOR LOCAL FAST_FORWARD
    FOR
        SELECT
            SchemaName = s.Name,
            TableName = t.Name
        FROM
            sys.tables t
        INNER JOIN
            sys.schemas s ON t.schema_id = s.schema_id
        WHERE
            EXISTS (SELECT * FROM sys.columns c
                    WHERE c.object_id = t.object_id
                      AND c.Name = @ColumnName);
    -- open cursor and start iterating over the tables found
    OPEN DropCursor
    FETCH NEXT FROM DropCursor INTO @SchemaName, @TableName
    WHILE @@FETCH_STATUS = 0
    BEGIN
        DECLARE @Stmt NVARCHAR(1000)
        -- generate the SQL statement
        SET @Stmt = N'ALTER TABLE [' + @SchemaName + '].[' + @TableName + '] DROP COLUMN [' + @ColumnName + ']';
        -- execute that SQL statement
        EXEC sp_executesql @Stmt
        FETCH NEXT FROM DropCursor INTO @SchemaName, @TableName
    END
    CLOSE DropCursor
    DEALLOCATE DropCursor
END
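The procedure can then be called like this (using the Phone column from the example above):
EXEC dbo.DropColumnFromAllTables @ColumnName = N'Phone';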
This procedure should also work.
It loops through all the columns of the given table and drops every column whose sum(col) is zero.
Take a backup of the table first.
alter procedure deletecolumnsifzero @tablename varchar(1000)
as
set nocount on
declare @n int
declare @sql nvarchar(1000)
declare @sum_cols nvarchar(1000)
declare @c_id nvarchar(100)
set @n = 0
declare c1 cursor for
select column_name from information_schema.columns
where
table_name like @tablename
--Cursor starts
open c1
fetch next from c1
into @c_id
while @@fetch_status = 0
begin
    set @sql = ''
    set @sql = 'select @sum_cols = sum(' + @c_id + ') from [' + @tablename + ']'
    exec sp_executesql @sql, N'@sum_cols int out, @tablename nvarchar(100)', @sum_cols out, @tablename
    if (@sum_cols = 0)
    begin
        set @n = @n + 1
        set @sql = ''
        set @sql = @sql + 'alter table [' + @tablename + '] drop column [' + @c_id + ']'
        exec sp_executesql @sql
    end
    fetch next from c1
    into @c_id
end
close c1
deallocate c1
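A hypothetical call (substitute your own table name):
exec deletecolumnsifzero 'YourTableName'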

bcp xml not exporting all rows to individual files

I have a SQL query that exports each XML row of a table to an individual XML file. The process works, except that it stops before processing all rows in a single run; I have to change the starting row number so it picks up where it left off. It will eventually process all the rows with no errors this way, but I would prefer that it process all rows at once.
This is my query:
DECLARE
@FILENAME VARCHAR(500),
@bcpcmd VARCHAR(2000),
@RN VARCHAR(10),
@i int;
DECLARE @Table table (RN int, IsDone char(1))
INSERT @Table SELECT RN, 0 FROM [x_rpt].[dbo].[x_abc] WHERE RN>=1 --this is where I update RN to process records where it leaves off--
SET @i=0
WHILE @i<= (SELECT COUNT(*) FROM @Table WHERE IsDone = '0')
BEGIN
    SELECT TOP 1 @RN=RN FROM @Table WHERE IsDone = 0
    SET @FILENAME = '"C:\temp\data\abc\Jan_2016_'+@RN+'.xml"'
    SET @bcpcmd = 'BCP "SELECT [XML] from [x_rpt].[dbo].[x_abc] WHERE RN='+@RN+'" queryout "'
    SET @bcpcmd = @bcpcmd + @FILENAME + '" -w -T -S "SERVER"'
    EXEC master..xp_cmdshell @bcpcmd
    UPDATE @Table set IsDone='1' where RN=convert(int,@RN)
    SET @i=@i+1
END
Try to use a CURSOR based approach instead (the original loop likely stops early because the WHILE condition counts only the rows where IsDone = '0'; that count shrinks as rows are marked done while @i keeps growing, so the loop exits roughly halfway through):
CREATE TABLE SomeDB.dbo.MockUp(RN INT,[XML] XML);
INSERT INTO SomeDB.dbo.MockUp VALUES
 (1,N'<test RN="1"/>')
,(17,N'<test RN="17"/>')
,(-300,N'<test RN="-300"/>');
DECLARE
@FILENAME VARCHAR(500),
@bcpcmd VARCHAR(2000),
@rn INT;
DECLARE c CURSOR FOR SELECT RN FROM SomeDB.dbo.MockUp
OPEN c;
FETCH NEXT FROM c INTO @rn;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @FILENAME = '"C:\SomePath\Jan_2016_'+CAST(@rn AS VARCHAR(MAX))+'.xml"'
    SET @bcpcmd = 'BCP "SELECT [XML] from SomeDB.[dbo].MockUp WHERE RN='+CAST(@rn AS VARCHAR(MAX))+'" queryout "'
    SET @bcpcmd = @bcpcmd + @FILENAME + '" -w -T -S ' + @@SERVERNAME
    EXEC master..xp_cmdshell @bcpcmd
    FETCH NEXT FROM c INTO @rn;
END
CLOSE c;
DEALLOCATE c;
GO
DROP TABLE SomeDB.dbo.MockUp
Adjust path and database / schema / table / column names...

Asterisk in sql script [duplicate]

I have a folder called "Dump". This folder contains various .CSV files.
The folder location is 'C:\Dump'.
I want to import the contents of these files into SQL Server.
I would like the rough code along with proper comments so that I understand it.
I have tried a few examples I found on the net, but they haven't quite worked out for me for some strange reason.
The steps I would like are:
Step 1: Copy all the file names in the folder to a table.
Step 2: Iterate through the table and copy the data from the files using BULK INSERT.
Could someone please help me out with this one? Thanks a lot in advance :)
--BULK INSERT MULTIPLE FILES From a Folder
--a table to loop thru filenames (drop table ALLFILENAMES first if it already exists)
CREATE TABLE ALLFILENAMES(WHICHPATH VARCHAR(255),WHICHFILE varchar(255))
--some variables
declare @filename varchar(255),
@path varchar(255),
@sql varchar(8000),
@cmd varchar(1000)
--get the list of files to process:
SET @path = 'C:\Dump\'
SET @cmd = 'dir ' + @path + '*.csv /b'
INSERT INTO ALLFILENAMES(WHICHFILE)
EXEC Master..xp_cmdShell @cmd
UPDATE ALLFILENAMES SET WHICHPATH = @path where WHICHPATH is null
--cursor loop
declare c1 cursor for SELECT WHICHPATH,WHICHFILE FROM ALLFILENAMES where WHICHFILE like '%.csv%'
open c1
fetch next from c1 into @path,@filename
While @@fetch_status <> -1
begin
--bulk insert won't take a variable name, so build the sql and execute it instead:
set @sql = 'BULK INSERT Temp FROM ''' + @path + @filename + ''' '
+ ' WITH (
FIELDTERMINATOR = '','',
ROWTERMINATOR = ''\n'',
FIRSTROW = 2
) '
print @sql
exec (@sql)
fetch next from c1 into @path,@filename
end
close c1
deallocate c1
--Extras
--delete from ALLFILENAMES where WHICHFILE is NULL
--select * from ALLFILENAMES
--drop table ALLFILENAMES
This will give you separate tables for each file.
--BULK INSERT MULTIPLE FILES From a Folder
drop table allfilenames
--a table to loop thru filenames
CREATE TABLE ALLFILENAMES(WHICHPATH VARCHAR(255),WHICHFILE varchar(255))
--some variables
declare @filename varchar(255),
@path varchar(255),
@sql varchar(8000),
@cmd varchar(1000)
--get the list of files to process:
SET @path = 'D:\Benihana\backup_csv_benihana_20191128032207_part_1\'
SET @cmd = 'dir ' + @path + '*.csv /b'
INSERT INTO ALLFILENAMES(WHICHFILE)
EXEC Master..xp_cmdShell @cmd
UPDATE ALLFILENAMES SET WHICHPATH = @path where WHICHPATH is null
delete from ALLFILENAMES where WHICHFILE is null
--SELECT replace(whichfile,'.csv',''),* FROM dbo.ALLFILENAMES
--cursor loop
declare c1 cursor for SELECT WHICHPATH,WHICHFILE FROM ALLFILENAMES where WHICHFILE like '%.csv%' order by WHICHFILE desc
open c1
fetch next from c1 into @path,@filename
While @@fetch_status <> -1
begin
--bulk insert won't take a variable name, so make a sql and execute it instead:
set @sql =
'select * into '+ Replace(@filename, '.csv','')+'
from openrowset(''MSDASQL''
,''Driver={Microsoft Access Text Driver (*.txt, *.csv)}''
,''select * from '+@Path+@filename+''')'
print @sql
exec (@sql)
fetch next from c1 into @path,@filename
end
close c1
deallocate c1
For Step 1, maybe you can look at:
http://www.sql-server-performance.com/forum/threads/copying-filenames-to-sql-table.11546/
or
How to list files inside a folder with SQL Server
and then Step 2
How to cast variables in T-SQL for bulk insert?
HTH
You might need to enable the xp_cmdshell first:
sp_configure 'show advanced options', '1'
RECONFIGURE
go
sp_configure 'xp_cmdshell', '1'
RECONFIGURE
go
And, to enable Ad Hoc Distributed Queries (needed for OPENROWSET):
sp_configure 'show advanced options', 1;
RECONFIGURE;
GO
sp_configure 'Ad Hoc Distributed Queries', 1;
RECONFIGURE;
GO
To solve step 1, xp_dirtree can also be used to list all files and folders.
Keep in mind that it is an undocumented procedure, so security precautions must be considered: intentionally crafted filenames could be an intrusion vector.
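A minimal sketch, assuming the C:\Dump folder from the question and the ALLFILENAMES table from the answers above; since xp_dirtree is undocumented, its result-set columns may differ between versions:
CREATE TABLE #DirTree (subdirectory NVARCHAR(255), depth INT, isfile BIT)
--depth = 1: do not recurse into subfolders; last parameter 1: include files as well as folders
INSERT INTO #DirTree (subdirectory, depth, isfile)
EXEC master.sys.xp_dirtree 'C:\Dump\', 1, 1
--keep only the .csv files and feed them into the same ALLFILENAMES table used above
INSERT INTO ALLFILENAMES (WHICHPATH, WHICHFILE)
SELECT 'C:\Dump\', subdirectory
FROM #DirTree
WHERE isfile = 1 AND subdirectory LIKE '%.csv'
DROP TABLE #DirTree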
In Python you can use d6tstack, which makes this simple:
import d6tstack
import glob
c = d6tstack.combine_csv.CombinerCSV(glob.glob('*.csv'))
c.to_mssql_combine('mssql+pymssql://usr:pwd@localhost/db', 'tablename')
See the SQL examples. It also deals with data schema changes, creates the table, and allows you to preprocess data. It leverages BULK INSERT so it should be just as fast.
To expand upon the answer by SarangArd, you can replace Temp in the BULK INSERT statement with the following if your file name matches your table name:
' + Left(@filename, Len(@filename)-4) + '
This code will create a new table per CSV file that is imported.
It is best suited to populating an empty database from CSV files.
CREATE TABLE ALLFILENAMES
(
WHICHPATH VARCHAR(255)
,WHICHFILE VARCHAR(255)
)
DECLARE @filename VARCHAR(255),
@path VARCHAR(255),
@sql VARCHAR(8000),
@cmd VARCHAR(1000)
SET @path = 'L:\DATA\SOURCE\CSV\' --PATH TO YOUR CSV FILES (CHANGE TO YOUR PATH)
SET @cmd = 'dir ' + @path + '*.csv /b'
INSERT INTO ALLFILENAMES(WHICHFILE)
EXEC Master..xp_cmdShell @cmd
UPDATE ALLFILENAMES
SET WHICHPATH = @path
WHERE WHICHPATH IS NULL
DECLARE c1 CURSOR
FOR SELECT WHICHPATH
,WHICHFILE
FROM ALLFILENAMES
WHERE WHICHFILE LIKE '%.csv%'
OPEN c1
FETCH NEXT FROM c1 INTO @path,
@filename
WHILE @@fetch_status <> -1
BEGIN
CREATE TABLE #Header
(
HeadString NVARCHAR(MAX)
)
DECLARE @Columns NVARCHAR(MAX) = ''
DECLARE @Query NVARCHAR(MAX) = ''
DECLARE @QUERY2 NVARCHAR(MAX) = ''
DECLARE @HeaderQuery NVARCHAR(MAX) = ''
--read only the header row of the file into #Header
SELECT @HeaderQuery = @HeaderQuery + 'bulk insert #Header from ''' + @path + @filename + '''
with(firstrow=1,lastrow=1)'
EXEC (@HeaderQuery)
--build a column list from the header, treating every column as nvarchar(max)
SELECT @Columns = (SELECT QUOTENAME(value) + ' nvarchar(max)' + ','
FROM #Header
CROSS APPLY STRING_SPLIT(HeadString,',') FOR xml PATH(''))
IF ISNULL(@Columns,'') <> ''
BEGIN
SET @Columns = LEFT(@Columns,LEN(@Columns) - 1)
--create a table named after the file, with the columns from the header
SELECT @Query = @Query + 'CREATE TABLE ' + Replace(@filename,'.csv','') + ' (' + replace(@Columns,'"','') + ')'
PRINT @Query
EXEC (@QUERY)
END
--load the file into the new table, skipping the header row
SELECT @QUERY2 = @QUERY2 + 'bulk insert ' + replace(Replace(@filename,'.csv',''),'.TPS','') + ' from ''' + @path + @filename + '''
with(firstrow=2,FORMAT=''csv'',FIELDTERMINATOR='','',ROWTERMINATOR=''\n'')'
EXEC (@QUERY2)
DROP TABLE #Header
FETCH NEXT FROM c1 INTO @path,
@filename
END
CLOSE c1
DEALLOCATE c1

Error "Declare Scalar" when Dynamic Exec Update inside two Cursor loops

Basically I'm looking to loop through a temp table which lists certain table names that need to be updated. I take each table name and use it to populate another temporary table of all the IDs which are to be updated.
I can select the data in each table needing updating using this structure, but I cannot seem to get the inner cursor to run, as it isn't picking up the temp table.
Any help would be greatly appreciated, as this has been doing my nut in for the past few hours.
Cheers,
DECLARE @table INT
DECLARE @prefix nvarchar(3)
DECLARE @TableName nvarchar(50)
DECLARE @TableIdName nvarchar(50)
DECLARE @getTable CURSOR
SET @getTable = CURSOR FOR
SELECT DISTINCT(id)
FROM #t
OPEN @getTable
FETCH NEXT
FROM @getTable INTO @table
WHILE @@FETCH_STATUS = 0
BEGIN
SELECT @TableName = name FROM #t WHERE id = @table
SET @TableIdName = @TableName + 'Id'
SELECT @prefix = prefix FROM #t WHERE name = @TableName
--PRINT @table
PRINT @TableName
--PRINT @TableIdName
--PRINT @prefix
DECLARE @temptable table(rid int, rTableName nvarchar(50), rprefix nvarchar(3), rpk nvarchar(50))
EXEC ('INSERT INTO ' + @temptable + ' SELECT ' + @TableIdName + ', ' + @TableName + ', ' + @prefix + @TableIdName + ' FROM ' + @TableName)
DECLARE @rTableName nvarchar(50)
DECLARE @rpk nvarchar(50)
DECLARE @rprefix nvarchar(3)
DECLARE @row INT
DECLARE @getRow CURSOR
SET @getRow = CURSOR FOR
SELECT DISTINCT(rid)
FROM @temptable
OPEN @getRow
FETCH NEXT
FROM @getRow INTO @row
WHILE @@FETCH_STATUS = 0
BEGIN
PRINT @row
SELECT @rTableName = rTableName FROM @temptable WHERE rid = @row
SELECT @rpk = rpk FROM @temptable WHERE rid = @row
SELECT @rprefix = rprefix FROM @temptable WHERE rid = @row
EXEC ('UPDATE ' + @rTableName + ' SET CoiRef = ' + @rprefix + '_' + @row + ' WHERE ' + @rpk + ' = ' + @row)
FETCH NEXT
FROM @getRow INTO @row
END
CLOSE @getRow
DEALLOCATE @getRow
FETCH NEXT
FROM @getTable INTO @table
END
CLOSE @getTable
DEALLOCATE @getTable
I have also tried another solution using sp_executesql, but I get similar errors there. This was using the code below instead of the first EXEC:
DECLARE @sqlCommand nvarchar(500)
SET @sqlCommand = 'INSERT INTO @temptable SELECT TableIdName, TableName, prefix, TableIdName FROM ' + @TableName
EXECUTE sp_executesql @sqlCommand, N'@temptable nvarchar(50) output', @temptable OUTPUT
Again, any help would be greatly appreciated.
Thanks,
Gerry
I think the error comes from the fact that you're trying to use a table variable inside dynamic SQL. This is not supported because the table variable is out of scope for the dynamic SQL. You should turn your @temptable into a temporary table using the CREATE TABLE command.
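A minimal sketch of that fix, using hypothetical names (Customer, CustomerId, prefix 'cus'); the point is only that a #temp table, unlike a table variable, is visible inside dynamic SQL executed with EXEC or sp_executesql:
IF OBJECT_ID('tempdb..#temptable') IS NOT NULL DROP TABLE #temptable
CREATE TABLE #temptable (rid int, rTableName nvarchar(50), rprefix nvarchar(3), rpk nvarchar(50))
DECLARE @TableName nvarchar(50) = N'Customer'           --hypothetical table name
DECLARE @TableIdName nvarchar(50) = @TableName + N'Id'  --hypothetical key column name
DECLARE @prefix nvarchar(3) = N'cus'                    --hypothetical prefix
--the dynamic batch can now insert into #temptable without the "must declare the scalar variable" error
EXEC ('INSERT INTO #temptable (rid, rTableName, rprefix, rpk)
SELECT ' + @TableIdName + ', ''' + @TableName + ''', ''' + @prefix + ''', ''' + @TableIdName + ''' FROM ' + @TableName)
SELECT * FROM #temptable
--TRUNCATE TABLE #temptable at the top of each outer-loop iteration so results don't accumulate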
In the unsolicited advice category, I'd also suggest attempting to recreate this without cursors if at all possible, as cursors go against the set-based processing that SQL Server is built around.

Exporting binary file data (images) from SQL via a stored procedure

I am trying to export a fairly large number of image files, stored internally in a SQL database as binary data.
Being fairly new to writing stored procedures in SQL, I have come across a couple of very useful guides on how this can be achieved, but I seem to be missing something.
I am running SQL Server 2008 R2 locally, and I am trying to write the files to a folder on my C:\ drive.
Here is the business part of what I have so far:
BEGIN
DECLARE @cmd VARCHAR(8000)
DECLARE @result int
DECLARE curExportBinaryDocs CURSOR FAST_FORWARD FOR
SELECT 'BCP "SELECT Photograph_Data FROM [ALBSCH Trial].[dbo].[Photograph] WHERE Photograph_ID = '
+ CAST(Photograph_ID AS VARCHAR(500)) + '" queryout "' + @OutputFilePath
+ CAST(Photograph_ID AS VARCHAR(500)) + '.jpg"' + ' -n -T'
FROM dbo.Photograph
OPEN curExportBinaryDocs
FETCH NEXT FROM curExportBinaryDocs INTO @cmd
WHILE @@FETCH_STATUS = 0
BEGIN
--PRINT @cmd
EXEC @result = xp_cmdshell @cmd
FETCH NEXT FROM curExportBinaryDocs INTO @cmd
END
CLOSE curExportBinaryDocs
DEALLOCATE curExportBinaryDocs
END
@result is always set to 1 (failed) after the xp_cmdshell call. All the table names/fields are correct, so I suspect there is something wrong with my BCP call, but I am not sure what to try next.
Any help or advice would be very welcome.
Well, first of all (and sorry about this ;) ): DON'T USE CURSORS.
And sorry for the caps...
One of the worst things about cursors is that they can lock your table. What I always do for these purposes (and it is quite a bit faster) is use a loop like this:
declare @totrow int
, @currow int
, @result int
, @nsql nvarchar(max)
declare @sqlStatements table (
Id int identity(1, 1)
, SqlStatement varchar(max)
)
--build all the command lines up front (one row per statement)
insert
into @sqlStatements
select 'QUERY PART'      --the BCP command line built per row, as in the cursor version
from dbo.YourTable       --placeholder: your source table
set @totrow = @@rowcount
set @currow = 1
--then execute them one by one
while @totrow > 0 and @currow <= @totrow
begin
select @nsql = SqlStatement
from @sqlStatements
where Id = @currow
exec @result = xp_cmdshell @nsql
set @currow = @currow + 1
end
For the next part: does the SQL Server service account have enough permission to write to the C: drive? Also, look at the message pane when you execute your code; maybe you can find something there.
What you also can do is try to execute it manually: just take one BCP statement and run it with xp_cmdshell. Does it give any errors?
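For example, a quick manual test of a single command (hypothetical Photograph_ID and output path), so you can read bcp's console output directly:
DECLARE @testcmd VARCHAR(8000)
SET @testcmd = 'BCP "SELECT Photograph_Data FROM [ALBSCH Trial].[dbo].[Photograph] WHERE Photograph_ID = 1" queryout "C:\SQLTest\1.jpg" -n -T'
EXEC xp_cmdshell @testcmd  --the returned result set contains the bcp console output, including any error text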
Here is my final working procedure and format file. I was not able to find the finer details of
BCP commands, permission settings and format file layouts in one place, so maybe this will be of use to someone.
CREATE PROCEDURE [dbo].[ImgExport]
@OutputFilePath VARCHAR(500) = 'C:\SQLTest\'
AS
BEGIN
DECLARE @totrow int
DECLARE @currow int
DECLARE @result int
DECLARE @nsql nvarchar(4000)
DECLARE @sqlStatements table (ID int IDENTITY(1, 1), SqlStatement varchar(max))
--build one BCP command line per photograph
INSERT
INTO @sqlStatements
SELECT 'BCP "SELECT Photograph_Data FROM [ALBSCH_Trial].[dbo].[Photograph] WHERE Photograph_ID = '''
+ CAST(Photograph_ID AS VARCHAR(500)) + '''" queryout ' + @OutputFilePath
+ CAST(Photograph_ID AS VARCHAR(500)) + '.jpg -S localhost\SQLEXPRESS2008 -T -f C:\SQLTest\Images.fmt'
FROM dbo.Photograph
SET @totrow = @@ROWCOUNT
SET @currow = 1
--run each command through xp_cmdshell
WHILE @totrow > 0 and @currow <= @totrow
BEGIN
SELECT @nsql = SqlStatement
FROM @sqlStatements
WHERE ID = @currow
EXEC @result = xp_cmdshell @nsql
SET @currow = @currow + 1
END
END
Format file:
9.0
1
1 SQLBINARY 0 0 "\t" 1 Photograph_Data ""
I hope that helps somebody.
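A hypothetical call, assuming the C:\SQLTest output folder exists and the SQL Server service account can write to it:
EXEC dbo.ImgExport @OutputFilePath = 'C:\SQLTest\'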