I am trying to figure out how to generate multiple CSV exports from a table in SQL based on the date field in that table. The date differs across records, and I would like to export the records that share the same date into one CSV. Every time a different date is found, a CSV should be exported containing the data for the selected columns and that date. How can I go about creating a script to perform this type of action? Is there a way to have the script go through the date column automatically and, for each group of rows that share the same date, export the data for the 4 selected fields and generate a CSV?
Example:
Select Box, Highprice, Lowprice, Date
where date="2016-01-31"
As others have suggested, this is not a good method, and there are a few issues to be aware of.
By default, SQL Server does not grant permission to execute xp_cmdshell.
The files will be generated on the server, and the SQL Server service needs the proper access privileges on the target folder there.
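If you still go down this route, a sysadmin typically has to enable xp_cmdshell first; a minimal sketch:
-- Enable xp_cmdshell (requires sysadmin / ALTER SETTINGS permission)
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'xp_cmdshell', 1
RECONFIGURE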
Below is the way to do it for a static date; you may need to write a cursor to loop over all the dates that are needed (a sketch of that follows the snippet).
DECLARE @sql VARCHAR(8000)
SELECT @sql = 'bcp "SELECT * FROM YourDB.dbo.YourTbl WHERE date = ''2016-01-31''" queryout c:\temp\CSV2016-01-31.csv -c -t, -T -S' + @@servername
EXEC master..xp_cmdshell @sql
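If you need one file per date, here is a rough cursor sketch along those lines (the table, column names, and path are taken from the question and the snippet above; adjust them as needed):
DECLARE @d DATE, @cmd VARCHAR(8000)

DECLARE date_cursor CURSOR FOR
    SELECT DISTINCT [Date] FROM YourDB.dbo.YourTbl

OPEN date_cursor
FETCH NEXT FROM date_cursor INTO @d

WHILE @@FETCH_STATUS = 0
BEGIN
    -- One bcp export per distinct date; the file name carries the date
    SELECT @cmd = 'bcp "SELECT Box, Highprice, Lowprice, [Date] FROM YourDB.dbo.YourTbl WHERE [Date] = ''' + CONVERT(VARCHAR(10), @d, 120) + '''" queryout c:\temp\CSV' + CONVERT(VARCHAR(10), @d, 120) + '.csv -c -t, -T -S' + @@servername
    EXEC master..xp_cmdshell @cmd

    FETCH NEXT FROM date_cursor INTO @d
END

CLOSE date_cursor
DEALLOCATE date_cursor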
I am trying to export a database within Microsoft SQL Server Management Studio 2000. My database is extremely large with over 1,300 unique tables.
I am aware of the export wizard and copy/paste options, but am looking for something that would allow me to export all 1,300 tables (or at least a few hundred at once) into a single csv or xls file. Copying and pasting or selecting "save as" for each table would take far too long, and the export wizard only allows a few dozen tables at a time.
Well, I would not recommend doing this, but if you desperately need a solution in the way you have described, here it is:
First, run this query on the database:
SELECT 'sqlcmd -S . -d ' + DB_NAME() + ' -E -s, -W -Q "SET NOCOUNT ON; SELECT * FROM ' + TABLE_SCHEMA + '.' + TABLE_NAME + '" > "C:\Temp\' + TABLE_NAME + '.csv"'
FROM [INFORMATION_SCHEMA].[TABLES]
You might want to change the folder name to suit your setup.
Second, copy all the rows returned into an Export.bat or Export.cmd file.
Third, run the Export.bat file to get the 1,300-odd tables you need as separate CSV files.
Fourth, open a cmd window, navigate to the folder that contains your files, and use the following command:
copy *.csv Export.csv
You will have a single Export.csv file containing all your tables, along with headers for each table
Perhaps this will help you to resolve your problem.
SQL Server Management Studio 2012 - Export all tables of database as csv
I have a SQL query that uses a database_name parameter in several places, and I have a list of databases in a .txt file. How can I run this query against each database from the .txt file, one after another (the database_name in the query file has to be changed on every iteration)? Thank you in advance!
Read the .txt file into a temporary table
Loop through table
-- Create a variable and populate with SQL and database name
-- Use sp_executesql to execute
Loop
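A rough T-SQL sketch of those steps (the file path C:\Temp\databases.txt and the SELECT against dbo.SomeTable are placeholders; substitute your own query):
-- Read the .txt file (one database name per line) into a temp table
CREATE TABLE #dbs (dbname SYSNAME)
BULK INSERT #dbs FROM 'C:\Temp\databases.txt' WITH (ROWTERMINATOR = '\n')

DECLARE @db SYSNAME, @sql NVARCHAR(MAX)

DECLARE db_cursor CURSOR FOR
    SELECT RTRIM(REPLACE(dbname, CHAR(13), '')) FROM #dbs   -- strip any trailing CR

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @db

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Splice the current database name into the query and run it
    SET @sql = N'SELECT COUNT(*) FROM ' + QUOTENAME(@db) + N'.dbo.SomeTable'
    EXEC sp_executesql @sql

    FETCH NEXT FROM db_cursor INTO @db
END

CLOSE db_cursor
DEALLOCATE db_cursor
DROP TABLE #dbs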
I'm having a problem exporting the results of a stored procedure to a csv file and keeping the results as a 9 character string. The result of the stored procedure is a simple one-column output that looks fine when executed in SSMS, but the returned values in the csv that have leading zeros come back without the zeros. The table column is type varchar(13) and I have done a convert to try to keep the leading zeros from being dropped, but no luck.
Here is the stored procedure:
SELECT DISTINCT
convert(char(8),n.NIIN)
FROM IMMS_ELEC.dbo.NIINList n
Here is the simple BCP script I'm using:
DECLARE @string AS NVARCHAR(4000)
SELECT @string = 'BCP "exec CPLINK_Dev.dbo.spSelectLOG_NiinDistinct"
QUERYOUT:\data.csv -c -T -t'
exec master.dbo.xp_cmdshell @string
Excel loves to think it knows how to format your data better than you do... Here's a trick you can use to outsmart it. Open a new spreadsheet, select all cells and change the type to text (from general), then copy your data from notepad (or SSMS), paste it into Excel, and use text to columns if you have to... Excel should stop messing with your formats then.
You can probably also do the same thing with Excel's Import Data feature, but I find the first approach much more effective, and you can use it to copy and paste directly from SSMS grid results as well.
I have a SQL table that is used for logging purposes (there are hundreds of thousands of records in the table). I need to purge the table (take a backup of the data and then clear the table data).
Is there a standard way of doing this that I can automate?
You can do this within SQL Server Management Studio, by:
right-clicking the database > Tasks > Generate Scripts
You can then select the table you wish to script out and also choose to include any associated objects, such as constraints and indexes.
Please find below the Stack Overflow link, which will give you more insight on this:
Table-level backup
As for your automation requirement:
You can use the bcp utility, which copies data between an instance of Microsoft SQL Server and a data file in a user-specified format.
Sample syntax to export:
bcp "select * from [MyDatabase].dbo.Customer " queryout "Customer.bcp" -N -S localhost -T -E
You can automate this command with any scheduling mechanism (cron on UNIX, SQL Server Agent, etc.).
Simply put, you can create a job that runs once a month and
--> backs the data up into another table, such as an archive table,
--> then deletes the data from the main table.
It's primitive partitioning, I guess; this way it is more flexible when you need to select data you have already purged, i.e. it now lives in the archive table where you backed it up.
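A minimal sketch of that job body, assuming hypothetical dbo.LogTable / dbo.LogArchive tables with identical structure and a LogDate column (schedule it with SQL Server Agent or similar):
BEGIN TRANSACTION

-- Copy rows older than one month into the archive table
INSERT INTO dbo.LogArchive
SELECT *
FROM dbo.LogTable
WHERE LogDate < DATEADD(MONTH, -1, GETDATE())

-- Then delete them from the main table
DELETE FROM dbo.LogTable
WHERE LogDate < DATEADD(MONTH, -1, GETDATE())

COMMIT TRANSACTION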
What I want to do is output some query results to a file. Basically, when I query the table I'm interested in, my results look like this:
HTML_ID HTML_CONTENT
1 <html>...
2 <html>...
3 <html>...
4 <html>...
5 <html>...
6 <html>...
7 <html>...
The HTML_CONTENT field is of type ntext, and each record's value is around 500+ characters of HTML content.
I can create a cursor to move each record's content to a temp table or whatever.
But my question is this: instead of a temp table, how would I move this out without using BCP?
BCP isn't an option as our sysadmin has blocked access to sys.xp_cmdshell.
Note: I want to store each record's HTML content to individual files
My version of sql is: Microsoft SQL Server 2008 (SP1) - 10.0.2531.0
You can make use of SSIS to read the table data and output the content of the table rows as files. The Export Column transformation, available within the Data Flow Task of SSIS packages, might help you do that.
Here is an example: The Export Column Transformation
MSDN documentation about Export Column transformation.
This answer would have worked until you added the requirement for Individual Files.
You can run the SQL from command line and dump the output into a file. The following utilities can be used for this.
SQLCMD
OSQL
Here is an example using SQLCMD with an inline query:
sqlcmd -S ServerName -E -Q "Select GetDate() CurrentDateAndTime" > output.txt
You can save the query to a file (QueryString.sql) and use -i instead:
sqlcmd -S ServerName -E -i QueryString.sql > output.txt
Edit
Use SSIS
Create a package
Create a variable called RecordsToOutput of type Object at the package level
Use an EXECUTE SQL task and get the dataset back into RecordsToOutput
Use a For-Each loop to go through the RecordsToOutput dataset
In the loop, create a variable for each column in the dataset (give it the same name)
Add a Data Flow task
Use an OLE DB source and use a SQL statement to create one row (with data you already have)
Use a flat-file destination to write out the row.
Use expressions on the flat file connection to change the name of the destination file for each row in the loop.