Moving results of T-SQL query to a file without using BCP?

What I want to do is output some query results to a file. Basically, when I query the table I'm interested in, my results look like this:
HTML_ID   HTML_CONTENT
1         <html>...
2         <html>...
3         <html>...
4         <html>...
5         <html>...
6         <html>...
7         <html>...
The HTML_CONTENT field is of type ntext, and each record's value is 500+ characters of HTML content.
I can create a cursor to move each record's content into a temp table or the like.
But my question is this: instead of a temp table, how would I get this content into files without using BCP?
BCP isn't an option, as our sysadmin has blocked access to sys.xp_cmdshell.
Note: I want to store each record's HTML content in an individual file.
My version of SQL Server is: Microsoft SQL Server 2008 (SP1) - 10.0.2531.0

You can make use of SSIS to read the table data and write each row's content out as a file. The Export Column transformation, available within the Data Flow Task of SSIS packages, can do exactly that.
Here is an example: The Export Column Transformation
MSDN documentation about the Export Column transformation.
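The Export Column transformation needs two input columns: the data to write and the full path of the file to write it to. As a minimal sketch, assuming the table is named dbo.HTML_TABLE and the files should land in C:\Output (both names are assumptions), the OLE DB source query feeding the transformation could look like this:

SELECT HTML_CONTENT,  -- the column whose value is written out (the Extract Column)
       'C:\Output\' + CAST(HTML_ID AS varchar(10)) + '.html' AS FilePath  -- per-row destination (the File Path Column)
FROM dbo.HTML_TABLE;

In the transformation editor you then map HTML_CONTENT as the Extract Column and FilePath as the File Path Column, which produces one file per row.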

This answer would have worked until you added the requirement for individual files.
You can run the SQL from the command line and dump the output into a file. Either of the following utilities can be used for this:
SQLCMD
OSQL
Here is an example using SQLCMD with an inline query:
sqlcmd -S ServerName -E -Q "Select GetDate() CurrentDateAndTime" > output.txt
You can save the query to a file (QueryString.sql) and use -i instead:
sqlcmd -S ServerName -E -i QueryString.sql > output.txt
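By default sqlcmd truncates long variable-length columns, which matters for an ntext column like HTML_CONTENT. As a sketch (dbo.HTML_TABLE is an assumed table name), you can suppress headers with -h -1 and lift the display-width limit with -y 0 to dump one record's content into its own file:

sqlcmd -S ServerName -E -h -1 -y 0 -Q "SET NOCOUNT ON; SELECT HTML_CONTENT FROM dbo.HTML_TABLE WHERE HTML_ID = 1" > 1.html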
Edit
Use SSIS
Create a package
Create a variable called RecordsToOutput of type Object at the package level
Use an EXECUTE SQL task and get the dataset back into RecordsToOutput
Use a For-Each loop to go through the RecordsToOutput dataset
In the loop, create a variable for each column in the dataset (give it the same name)
Add a Data Flow task
Use an OLE DB source and a SQL statement to produce one row (with data you already have)
Use a flat-file destination to write out the row.
Use expressions on the flat file connection to change the name of the destination file for each row in the loop.
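As a sketch of that last step, the expression on the flat file connection manager's ConnectionString property might look like the following (User::HTML_ID is the per-row loop variable created above; the output folder is an assumption):

"C:\\Output\\" + (DT_WSTR, 10) @[User::HTML_ID] + ".html"

SSIS re-evaluates the expression on each loop iteration, so every row lands in its own file.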

Related

Export a Large SQL Database via Query

I am trying to export a database from Microsoft SQL Server Management Studio 2000. My database is extremely large, with over 1,300 unique tables.
I am aware of the export wizard and copy/paste options, but I am looking for something that would allow me to export all 1,300 tables (or at least a few hundred at a time) into a single csv or xls file. Copying and pasting or selecting "save as" for each table would take far too long, and the export wizard only allows a few dozen tables at a time.
Well, I would not recommend doing this, but if you desperately need a solution in the way you have described, here it is.
First, run this query on the database:
SELECT 'sqlcmd -S . -d ' + DB_NAME() + ' -E -s, -W -Q "SET NOCOUNT ON; SELECT * FROM ' + TABLE_SCHEMA + '.' + TABLE_NAME + '" > "C:\Temp\' + TABLE_NAME + '.csv"'
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'  -- base tables only; INFORMATION_SCHEMA.TABLES also lists views
You might want to change the folder name to suit your environment.
Second, copy all the rows returned into an Export.bat or Export.cmd file.
Third, run the Export.bat file to produce the 1,300-odd CSV files, one per table.
Fourth, open a cmd window, navigate to the folder containing your files, and run the following command:
copy *.csv Export.csv
You will have a single Export.csv file containing all your tables, along with the headers for each table.
Perhaps this will help you resolve your problem:
SQL Server Management Studio 2012 - Export all tables of database as csv

Moving data from Excel to SQL Server table

I have a very simple Excel sheet:
I want to put this data into a table in SQL Server. I also want to add a field that contains a date.
What is the best way to do this?
Create a table in SQL Server with the same number of fields as you have in the spreadsheet.
In SQL Server Management Studio Object Explorer, right-click the table you created and select "Edit Top 200 Rows". Highlight the data you want to copy in Excel and right-click -> Copy. Then right-click the space to the left of the first row in the SSMS edit table window and paste.
After the import, check the SQL table to make sure it has the same number of rows as the spreadsheet.
You can add the date column to the SQL table after you do the data import, or add it in the spreadsheet before the import.
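If you add the date column on the SQL side, a minimal sketch looks like this (the table, column, and constraint names are assumptions):

ALTER TABLE dbo.ImportedData
ADD ImportDate date NOT NULL
    CONSTRAINT DF_ImportedData_ImportDate DEFAULT (GETDATE());

Because the new column is NOT NULL and has a default, SQL Server backfills the existing rows with the current date automatically.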
You can first create the table in SQL Server with the added field and then use the Import Wizard in Management Studio to import the Excel file. Or you can create the table during the import task and add the new field later.
Option 1:
Read the data in an IDataReader, and then call a stored procedure to insert the data.
http://granadacoder.wordpress.com/2009/01/27/bulk-insert-example-using-an-idatareader-to-strong-dataset-to-sql-server-xml/
I use the above when I have a lot of rows to import and I want to validate them outside of the database.
Option 2:
http://support.microsoft.com/kb/321686
or search for:
Select FROM OPENDATASOURCE Excel
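That KB article boils down to querying the workbook directly from T-SQL. A sketch, assuming the ACE OLE DB provider is installed, the 'Ad Hoc Distributed Queries' server option is enabled, and the workbook is C:\Data\MySheet.xlsx with a worksheet named Sheet1 (file, sheet, and table names are assumptions):

INSERT INTO dbo.MyTable
SELECT *
FROM OPENDATASOURCE('Microsoft.ACE.OLEDB.12.0',
                    'Data Source=C:\Data\MySheet.xlsx;Extended Properties=Excel 12.0')...[Sheet1$];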
Option N:
There are other options out there.
It depends what you like, how much time you want to put into it, and whether it is a one-off or something you have to do for 333 files every day.
My solution was to convert the .xlsx to .csv and then use an online converter to turn the .csv into .sql. I then ran the SQL file in SQL Server and generated my tables.
It can also be done by creating a batch file.
For this you already need a table created on the server with the same structure as the data in Excel.
Save the Excel data as a CSV file; BCP can then import that file and insert the data into the SQL table.
BCP utility function
sqlcmd -S ServerIP -d DatabaseName -U username -P passwd -Q "TRUNCATE TABLE dbo.TableName"
bcp DatabaseName.dbo.TableName in "C:\path\to\exported_data.csv" -c -t, -S ServerIP -U username -P passwd -b 1000
The sqlcmd step clears the target table first (adjust or drop it as needed), and -b sets the batch size.
You can refer to the link below for more options on the bcp utility:
https://msdn.microsoft.com/en-IN/library/ms162802.aspx
Open your SQL Server interface software and add the date field to the table.
Go to Excel, add the date column, and copy the Excel data.
Go back to your SQL Server interface software and use its functionality to import the data from the clipboard. (SQL Server interface software that supports this includes, for example, Database4.net, but if you have another package with this functionality, use that.)
Alternatively, use VBA with DAO or ADO to interact with the SQL Server database, using SQL statements to add the field and copy the data into your table.

Purging an SQL table

I have a SQL table that is used for logging (there are hundreds of thousands of records in it). I need to purge the table: take a backup of the data, then clear the table.
Is there a standard way of doing this that I can automate?
You can do this within SQL Server Management Studio, by:
right-clicking the database > Tasks > Generate Scripts
You can then select the table you wish to script out and also choose to include any associated objects, such as constraints and indexes.
See this Stack Overflow question, which gives more insight on this:
Table-level backup
For the automation requirement:
You can use the bcp utility, which copies data between an instance of Microsoft SQL Server and a data file in a user-specified format.
Sample syntax to export:
bcp "select * from [MyDatabase].dbo.Customer " queryout "Customer.bcp" -N -S localhost -T -E
You can automate this command using any scheduling mechanism (cron on UNIX, Windows Task Scheduler, SQL Server Agent, etc.).
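For example, a sketch of a Windows scheduled task that runs a batch file wrapping the bcp command every night (the task name and script path are assumptions):

schtasks /Create /TN "NightlyCustomerExport" /TR "C:\Scripts\export_customers.bat" /SC DAILY /ST 02:00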
Simply, we can create a job that runs once a month and:
--> backs up the data into another table, such as an archive table
--> then deletes the data from the main table
It's primitive partitioning, I guess; this way it is more flexible when you need to select data from the past, i.e., the deleted rows now live in the archive table where you backed them up.
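A minimal T-SQL sketch of the job body (dbo.EventLog and dbo.EventLogArchive are assumed names; the archive table has the same schema as the log table):

BEGIN TRANSACTION;

-- copy everything into the archive table
INSERT INTO dbo.EventLogArchive
SELECT * FROM dbo.EventLog;

-- then clear the main table
DELETE FROM dbo.EventLog;

COMMIT TRANSACTION;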

Export records from SQL Server 2005 express edition

I have a little problem. My friend has a database with over 10 tables, and each table has 90-100 records.
I can't find a way to export the records from his tables (to produce a SQL file containing something like INSERT INTO ... VALUES ... for each existing record) so that I can import them into my database.
How do I do that?
I tried: right-click on a table -> Script Table as -> INSERT To -> File ...
but it only generates the INSERT template, not the data.
Is there a solution, or is this feature only in the commercial version?
UPDATE
You can use the BCP command from the command prompt, like this:
For export: bcp ADatabase.dbo.OneTable out d:\test\OneTable.bcp -c -Usa -Ppassword
For import: bcp ADatabase.dbo.OneTable in d:\test\OneTable.bcp -c -Usa -Ppassword
These commands create a BCP file containing the records of the specified table. You can then import an existing BCP file into another database.
If you use a remote database, then:
bcp ADatabaseRemote.dbo.OneTableRemote out d:\test\OneTableRemote.bcp -Slocalhost\SQLExpress -Usa -Ppassword
Instead of localhost\SQLExpress, you can use localhost or another server name...
Probably the simplest way to do this would be to run a SELECT statement that outputs to a file, then import that data into your database.
For simple moves, I have also done a copy/paste manually. Sometimes it is better to use Excel as a staging platform before pasting into the new database. You may need to create a temporary table in your new database that matches up exactly with the data you are pasting over. For example, I usually don't put a PK on the temp table at first and make the PK field just an INT. That way the copy will go more smoothly.
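A sketch of such a staging table (all names and columns here are assumptions; they must match what you paste):

CREATE TABLE dbo.Staging_Customers (
    CustomerID   int NULL,            -- becomes the PK only after the paste succeeds
    CustomerName nvarchar(100) NULL,
    CreatedOn    datetime NULL
);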
In the corporate world, you would use SSIS to move this data around.
A couple of ways you could do this. One, select everything from each table and save the results as a CSV or delimited file (you can do this from SQL Server Management Studio). You can also script the tables as CREATE statements and copy the scripts over to the new database, assuming it is also a SQL Server. Then, for the import, use a bulk-load statement. You may have to google the syntax for SQL Server, but I know the following works in MySQL and Oracle; I haven't tried it in SQL Server yet.
LOAD DATA INFILE 'myfile'
INTO TABLE stuff
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
SET id = NULL;
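For the record, LOAD DATA INFILE does not exist in T-SQL; the closest SQL Server equivalent is BULK INSERT. A sketch, assuming a comma-delimited file with one record per line (the table and path echo the example above):

BULK INSERT stuff
FROM 'C:\Data\myfile.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);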
Or, if you are going to another SQL Server, use the SQL Server Import and Export Wizard.
http://msdn.microsoft.com/en-us/library/ms141209.aspx

Which is a good way to import Excel into a database?

Hi, I am using SQL Server 2008.
How can I import an Excel file into the database? Which way is the easiest and simplest?
OpenRowSet
BulkCopy
Linked Servers
SSIS
I have the above options to import Excel into the database.
In my opinion, the SSIS wizard is the best way to import Excel data: you get a row-and-column view of the whole table data that will be inserted, and you can also specify column names and constraints and parse the data using a query.
UPDATE:
If the data in your Excel file does not require any processing to match your database table, then I recommend you save your Excel file as a CSV and use a combination of BULK INSERT and the bcp.exe program.
To use BULK INSERT you will need a format file that defines how your data file maps to your database table. You can write this by hand to match the existing database table, or you can use the following command to generate the format file you need:
bcp [DatabaseName].[SchemaName].[TableName] format nul -c -f [FormatFileOutputName].fmt -S [ServerHostName] -U [DbUserName] -P [DbUserPassword]
Now you will have two files:
DatafileName.csv
FormatFileName.fmt
Use BULK INSERT within SQL Server to insert your data.
Note: If the columns in your data file are in a different order than your database table, you can simply edit the generated format file to map them correctly.
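A sketch of the final step, reusing the hypothetical file names above (the target table and paths are assumptions):

BULK INSERT [SchemaName].[TableName]
FROM 'C:\Data\DatafileName.csv'
WITH (
    FORMATFILE = 'C:\Data\FormatFileName.fmt',
    FIRSTROW = 2  -- skip the CSV header row, if there is one
);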