Export #tempTable to .txt file in SQL Server - sql

Check my last question, Export values from SQL Server to txt file. I was able to export values to a .txt file with that question's approach. Now I want to export a #tempTable to a .txt file. How can I do this?
Edit: My requirement: I want to export only the newly added data from a table to a .txt file. Say a user inserts 10 new rows into an existing table; I want those 10 rows to be in the .txt file. I used the inserted table in a trigger to get the new rows, but when I try to export them to a .txt file with bcp, I can't, since the inserted table has no full name ([database].[schema].[tableName]) that bcp could reference. So I decided to put the inserted table's data into a #tempTable and export that to the .txt file.
That is how I plan to export the inserted table's data to a .txt file in SQL Server.

You could simply load the in-memory table into a regular table and then use the export code that you already have:
select * into queryout from #tempTable
-- your code from step 1
drop table queryout
Here is an updated version of your trigger which should do the job:
create trigger monitorTest1 on test for insert as
declare @sql varchar(8000);
-- copy the inserted rows into a regular table so bcp can reference it by a full name
select * into tempTable from inserted
-- build the bcp command and run it through xp_cmdshell
SELECT @sql = 'bcp "select * from test2.dbo.tempTable" queryout C:\temp\mytest.txt -c -t -T -S localhost'
exec xp_cmdshell @sql
drop table tempTable
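One caveat worth adding: xp_cmdshell is disabled by default, so the trigger above will fail until it is enabled. A minimal sketch of enabling it (requires sysadmin rights, and has security implications worth weighing first):
-- enable xp_cmdshell (it is off by default)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;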

Related

SQL Server 2012 table data export to Excel file

I would like to export data from a SQL Server stored procedure to an Excel file. How can I achieve that?
I tested it like this:
insert into OPENROWSET(
'Microsoft.ACE.OLEDB.12.0',
'Excel 8.0;Database=D:\test.xlsx;;HDR=YES',
'SELECT EmpID FROM [Sheet1$]')
select * from tb1
but it returns an error:
Column name or number of supplied values does not match table definition.
I think it's because the Excel sheet doesn't have the columns defined yet, right? But there's no way to pre-create the columns before exporting to Excel, and SQL Server can't create the Excel file by itself ...
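For what it's worth, that particular error usually means the column list inside the OPENROWSET SELECT does not match the columns produced by the source query, and INSERT INTO OPENROWSET also expects the workbook to already exist with a header row in the target sheet. A minimal sketch under those assumptions (the EmpID/EmpName columns and the path are illustrative, and the ACE provider plus ad hoc distributed queries must be available):
-- D:\test.xlsx must already exist, with EmpID and EmpName as headers in Sheet1
insert into OPENROWSET(
'Microsoft.ACE.OLEDB.12.0',
'Excel 12.0 Xml;Database=D:\test.xlsx;HDR=YES',
'SELECT EmpID, EmpName FROM [Sheet1$]')
select EmpID, EmpName from tb1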
I also tried with bcp:
bcp "SELECT * FROM mydb.tb1" queryout 'D:\test.xlsx' -t, -c -S . -d DESKTOP-D3PCUR7 -T
It returns error:
Incorrect syntax near 'queryout'.
How can I easily export a table to Excel from SQL Server?
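As for the bcp attempt, "Incorrect syntax near 'queryout'" typically means the command was run as T-SQL inside Management Studio; bcp is a command-line utility and has to be run from a command prompt (or through xp_cmdshell). It also produces delimited text rather than a real .xlsx workbook, so a .csv target is more realistic. A hedged sketch of the corrected command line:
bcp "SELECT * FROM mydb.dbo.tb1" queryout "D:\test.csv" -c -t, -S . -T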

Bash script to export a table into another table

I created a script to export a table from my db into a .csv file
#!/usr/bin/bash
FILE="example.csv"
sqlplus -s abcd/abcd@XE <<EOF
SET PAGESIZE 50000
SET COLSEP ","
SET LINESIZE 200
SET FEEDBACK OFF
SPOOL $FILE
SELECT * FROM myTable;
SPOOL OFF
EXIT
EOF
and now I'd like to modify this script to export my table into another one. How can I change my code?
By "exporting your table into another", do you mean copying data from one table to another? If you don't need indexes or keys or other features on your new table initially, i.e. if it's not for production use, it's quite simple:
#!/usr/bin/bash
TABLE="myOtherTable"
sqlplus -s abcd/abcd@XE <<EOF
CREATE TABLE $TABLE as SELECT * FROM myTable;
EXIT
EOF
You could also do a CREATE TABLE statement first, specifying columns, keys and storage options like any other table, and then have a separate line that does INSERT INTO $TABLE (SELECT * FROM myTable) to fill it with data copied from myTable.
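A minimal sketch of that variant, with illustrative column names and a primary key:
CREATE TABLE myOtherTable (
    id   NUMBER PRIMARY KEY,
    name VARCHAR2(100)
);
INSERT INTO myOtherTable (SELECT * FROM myTable);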

How to get column names in Excel using bcp format in SQL

I already get the .csv files with data, but I need the data with column headers.
I have many tables that I need to export in one go and save in .csv format in one location with headers; I don't want to add the column names by hand since there are 100 tables.
I am using the query below:
Declare @cmd varchar(8000)
Set @cmd = 'bcp "select * from tablename where condition" queryout "path" -s -T -f -t, -c -E'
EXECUTE master..xp_cmdshell @cmd
There are 100 tables like that... any suggestions?
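bcp itself has no switch for writing a header row, so one common workaround is to UNION a literal header line onto the query that bcp exports, casting the data columns to text. A rough sketch, where the database, table and column names (mydb, tablename, col1, col2) are purely illustrative:
Declare @cmd varchar(8000)
-- header row gets ord = 1, data rows get ord = 2, so the header sorts first
Set @cmd = 'bcp "select line from (select 1 as ord, ''col1,col2'' as line '
         + 'union all select 2, cast(col1 as varchar(100)) + '','' + cast(col2 as varchar(100)) '
         + 'from mydb.dbo.tablename) t order by ord" queryout "C:\temp\tablename.csv" -c -T'
EXECUTE master..xp_cmdshell @cmd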

SQL Server Bulk import data from .csv into new table

I can't seem to find the answer to this quite trivial question.
I would like to bulk import data from a .csv file (with an unknown number of columns, comma separated) file to a new SQL Server table, within an existing database. The BULK INSERT statement works fine if the table is predefined, but since I don't know the number of columns of the .csv file upfront, this won't work.
I was trying to use bulk in combination with openrowset, but can't get it working.
By the way: SSIS won't be an option in this case, since I would like to incorporate the query within R (sqlquery) or Python.
Help would be highly appreciated!
I have found a workaround, using R, to solve the problem above. The same logic can be applied in other languages. I advise everyone using this function to keep in mind the useful comments above.
I wrote a small function to capture the steps in R:
SQLSave <- function(dbhandle, data, tablename) {
# Export data to temp path, for example within your SQL Server directory.
write.csv2(data,file = "\\\\pathToSQL\\temp.csv",row.names=FALSE,na="")
# Write first 100 rows to SQL Server, to incorporate the data structure.
sqlSave(dbhandle, head(data,100), tablename = tablename, rownames = FALSE, safer = FALSE)
# SQL Query to remove data in the table, structure remains:
sqlQuery(dbhandle,paste("DELETE FROM [",tablename,"]",sep=""));
# SQL Query to bulk insert all data from temp .csv to SQL Server
sqlQuery(dbhandle,paste("BULK INSERT [",tablename,"]
FROM '\\\\pathToSQL\\temp.csv'
WITH
(
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\\n',
FIRSTROW = 2,
KEEPNULLS
)",sep=""));
# Delete temp file from file directory
file.remove("\\\\pathToSQL\\temp.csv")
}
I was struggling with the same problem. I first read the first row (the headers) using BULK INSERT and created the table from it. Then, again using BULK INSERT from row 2 onwards, I imported the data into the table. You may have to adjust the data types after checking the imported data.
CREATE TABLE #Header(HeadString nvarchar(max))
declare @TableName nvarchar(100)='byom.DenormReportingData_100_100'
DECLARE @Columns nvarchar(max)=''
declare @Query nvarchar(max)=''
DECLARE @QUERY2 NVARCHAR(MAX)=''
-- read only the header line of the csv
bulk insert #Header
from 'F:/Data/BDL_BI_Test.csv'
with(firstrow=1,lastrow=1)
-- split the header into column names and build a column list (every column as VARCHAR(500) for now)
select @Columns=(select quotename(value)+' VARCHAR(500)'+',' from #Header cross apply string_split(HeadString,',') for xml path(''))
if isnull(@Columns,'')<>''
begin
set @Columns = left(@Columns,len(@Columns)-1)
select @Query=@Query+'CREATE TABLE '+@TableName+' ('+@Columns+')'
exec(@QUERY)
end
-- import the data itself, skipping the header row
select @QUERY2 =@QUERY2+'bulk insert '+@TableName+' from ''F:/Data/BDL_BI_Test.csv''
with(firstrow=2,FORMAT=''csv'',FIELDTERMINATOR='','',ROWTERMINATOR=''\n'')'
exec(@QUERY2)
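Two version caveats apply to the snippet above: string_split needs SQL Server 2016 or later with the database at compatibility level 130 or higher, and the FORMAT='csv' option of BULK INSERT needs SQL Server 2017 or later. A quick way to check the current compatibility level:
SELECT name, compatibility_level
FROM sys.databases
WHERE name = DB_NAME();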

Test stored procedure with user defined table type from data in csv file

I have a stored procedure that accepts a UDTT (user-defined table type). I'd like to test its performance using data input from .csv files. Since the stored procedure handles foreign key relationships, I will not use SQL Server's built-in bulk inserts. How can I do this in SQL Server Management Studio?
Here are the steps that I found work:
BCP the data from the .csv file into a temp table.
bcp TempDb.dbo.CsvTest in "C:\test.csv" -T -c -t ,
Use the temp table to populate the UDTT
-- declare the return value and a variable of the procedure's UDTT (the type name here is illustrative)
DECLARE @return_value int, @args dbo.YourTableType
INSERT INTO @args
SELECT col1, col2 FROM TempDb.dbo.CsvTest
EXEC @return_value = [dbo].[myProcedure] @inputs = @args
Not sure if there is a way to skip the temp table.
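One possible way to skip the staging table, sketched here with an illustrative UDTT name (dbo.YourTableType) and format file: OPENROWSET(BULK ...) returns a rowset, so the .csv can be read straight into the table-type variable, provided a format file describing its columns exists:
DECLARE @return_value int, @args dbo.YourTableType
-- read the csv directly into the UDTT variable via OPENROWSET(BULK ...)
INSERT INTO @args (col1, col2)
SELECT col1, col2
FROM OPENROWSET(BULK 'C:\test.csv', FORMATFILE = 'C:\test.fmt') AS src
EXEC @return_value = [dbo].[myProcedure] @inputs = @args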