How to export a CSV file from a table in a stored procedure? - sql

I need a query to export a CSV file from a particular table, and that query must be used inside a stored procedure.
I tried this query:
EXEC master..xp_cmdshell
'osql.exe -S ramcobl412 -U connect -P connect
-Q "select * from ramcodb..rct_unplanned_hdr" -o "c:\out.csv" -h-1 -s","'
but that CSV file is not formatted correctly when I open it in an Excel sheet.
Comma-separated values work fine, but the fixed column width is the problem.

Save the .csv file you want to import to SQL Server on your desktop or some other path you can easily access.
Using SQL Server Management Studio, right-click the database you want the CSV file imported into as a table and go to Tasks >
Import Data, then use the Import Wizard to import the CSV file into a table.
The Import Wizard will automatically account for the different lengths you have in some rows. For example, if column X has 5 characters in one row and 10 characters in two other rows, the Import Wizard will automatically set the max length for column X to 10.
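If you stay with the command-line export from the question, the "width" problem comes from osql padding every column to a fixed width around the "," separator. One option is to strip that padding in a small post-processing step; here is a minimal sketch in Python (the file names are just examples, and it assumes no field itself contains a comma):

```python
import csv

def tidy_osql_output(in_path, out_path):
    """Strip the fixed-width padding osql adds around each ',' separator."""
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        for line in src:
            fields = [f.strip() for f in line.rstrip("\n").split(",")]
            if any(fields):  # skip blank trailer lines in the output
                writer.writerow(fields)

# e.g. tidy_osql_output(r"c:\out.csv", r"c:\out_clean.csv")
```

After this step, the cleaned file opens in Excel as a normal comma-separated sheet without the padded columns.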

Related

Export a Large SQL Database via Query

I am trying to export a database within Microsoft SQL Server Management Studio 2000. My database is extremely large with over 1,300 unique tables.
I am aware of the Export Wizard and copy/paste options but am looking for something that would allow me to export all 1,300 tables (or at least a few hundred at once) into a single csv or xls file. Copying and pasting or selecting "Save As" for each table would take far too long, and the Export Wizard only allows a few dozen tables at a time.
Well, I would not recommend you do this, but if you desperately need a solution in the way you have described, here it is:
First, run this query on the database:
SELECT 'sqlcmd -S . -d '+DB_NAME()+' -E -s, -W -Q "
SET NOCOUNT ON;
SELECT * FROM '+TABLE_SCHEMA+'.'+TABLE_NAME+'" > "C:\Temp\'+TABLE_NAME+'.csv"'
FROM [INFORMATION_SCHEMA].[TABLES]
You might want to change the folder name to suit your needs.
Second, copy all the rows returned into an Export.bat or Export.cmd file.
Third, run the Export.bat file to get the 1,300-odd tables you need as separate CSV files.
Fourth, open a cmd window, navigate to the folder containing your files, and use the following command:
copy *.csv Export.csv
You will have a single Export.csv file containing all your tables, along with headers for each table
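One caveat with `copy *.csv Export.csv`: if Export.csv lives in the same folder, it can be swept into its own concatenation, and the table sections are hard to tell apart afterwards. A merge script gives a bit more control; here is a rough sketch in Python (the folder and the "### filename" section markers are my own convention, not part of the original answer):

```python
import glob
import os

def merge_csvs(folder, out_name="Export.csv"):
    """Concatenate every per-table CSV in `folder` into one file,
    prefixing each table's block with its file name so the
    sections stay identifiable."""
    out_path = os.path.join(folder, out_name)
    with open(out_path, "w", newline="") as out:
        for path in sorted(glob.glob(os.path.join(folder, "*.csv"))):
            if os.path.abspath(path) == os.path.abspath(out_path):
                continue  # don't merge the output file into itself
            out.write("### " + os.path.basename(path) + "\n")
            with open(path, newline="") as src:
                out.write(src.read())

# e.g. merge_csvs(r"C:\Temp")
```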
Perhaps this will help you to resolve your problem.
SQL Server Management Studio 2012 - Export all tables of database as csv

Moving data from Excel to SQL Server table

I have a very Simple excel sheet:
I am wanting to put this data into a table in SQL Server. I also wanted to add a field that contains a date.
What is the best way to do this?
Create a table in SQL Server with the same number of fields that you have in the spreadsheet.
In SQL Server Management Studio Object Explorer you can right-click the table you created and select "Edit Top 200 Rows". Highlight the data you want to copy in Excel and right-click -> Copy. Right-click the space to the left of the first row in the SSMS edit table window and paste.
After the import, check the SQL table to make sure it has the same number of rows as the spreadsheet.
You can add the date column to the SQL table after you do the data import. Or add it in the spreadsheet before you do the import.
You can first create the table in SQL Server with the added field and then use the Import Wizard in Management Studio to import the Excel file. Or you can create the table during the import task and add the new field later.
Option 1:
Read the data in an IDataReader, and then call a stored procedure to insert the data.
http://granadacoder.wordpress.com/2009/01/27/bulk-insert-example-using-an-idatareader-to-strong-dataset-to-sql-server-xml/
I use the above when I have a lot of rows to import and I want to validate them outside of the database.
Option 2:
http://support.microsoft.com/kb/321686
or search for:
Select FROM OPENDATASOURCE Excel
Option N:
There are other options out there.
It depends on what you like, how much time you want to put into it, and whether it is a one-off or you have to do it for 333 files every day.
My solution was to convert the .xlsx to .csv and then use this site to convert the .csv to .sql. I then ran the .sql file in SQL Server and generated my tables.
It can also be done by creating a batch file.
For this you already need to have a table created on the server with the same data structure as you have in Excel.
Now, using BCP, you can import that data from the file and insert it into the SQL table.
BCP utility commands:
sqlcmd -S IP -d databasename -U username -P passwd -Q "SQL query; mostly it should be a truncate query to truncate the created table"
bcp databasename.dbo.tablename in "file path from where you need to input the data" -c -t, -S Server_IP -U Username -P passwd -b <batch_size>
You can refer to the link below for more options on the bcp utility:
https://msdn.microsoft.com/en-IN/library/ms162802.aspx
Open your SQL Server interface software and add the date field to the table.
Go to Excel, add the date column, and copy the Excel data.
Go back to your SQL Server interface software and use its functionality to import the data from the clipboard. (SQL Server interface software that has this is, for example, Database4.net, but if you have another package with the functionality, use that.)
Alternatively, use VBA with DAO or ADO to interact with the SQL Server database and use SQL statements to add the field and copy the data into your table.

SQL Server Management Studio 2012 - Export/Import data from/to table

I have a table with more than 3,000,000 rows. I have tried to export the data from it manually and with the SQL Server Management Studio Export Data functionality to Excel, but I have met several problems:
when creating a .txt file manually by copying and pasting the data (in several passes, because copying all rows at once from SQL Server Management Studio throws an out-of-memory error), I am not able to open it with any text editor and copy the rows;
the export of data to Excel does not work, because Excel does not support so many rows.
Finally, with the Export Data functionality I created a .sql file, but it is 1.5 GB and I am not able to open it in SQL Server Management Studio again.
Is there a way to import it with the Import Data functionality, or some other cleverer way to back up the information in my table and then import it again if I need it?
Thanks in advance.
I am not quite sure I understand your requirements (I don't know whether you need to export your data to Excel or want to make some kind of backup).
In order to export data from single tables, you could use the Bulk Copy Program (bcp) tool, which allows you to export the data from single tables to files. You can also use a custom query to export the data.
It is important to note that this does not generate an Excel file, but another format. You could use it to move data from one database to another (both must be MS SQL Server).
Examples:
Create a format file:
bcp [TABLE_TO_EXPORT] format nul -n -f "[FORMAT_FILE]" -S [SERVER] -E -T -a 65535
Export all Data from a table:
bcp [TABLE_TO_EXPORT] out "[EXPORT_FILE]" -f "[FORMAT_FILE]" -S [SERVER] -E -T -a 65535
Import the previously exported data:
bcp [TABLE_TO_EXPORT] in "[EXPORT_FILE]" -f "[FORMAT_FILE]" -S [SERVER] -E -T -a 65535
I redirect the output of the export/import operations to a logfile (by appending "> mylogfile.log" at the end of the commands); this helps if you are exporting a lot of data.
Here is a way of doing it without bcp:
EXPORT THE SCHEMA AND DATA IN A FILE
Use the SSMS wizard:
Database >> Tasks >> Generate Scripts… >> choose the table >> choose the schema and data options
Save the SQL file (can be huge)
Transfer the SQL file on the other server
SPLIT THE DATA IN SEVERAL FILES
Use a program like textfilesplitter to split the file into smaller files of 10,000 lines each (so that no single file is too big)
Put all the files in the same folder, with nothing else
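If you don't have textfilesplitter handy, the split can be scripted as well. Here is a sketch in Python; the chunk size mirrors the 10,000 lines suggested above, and the part_000.sql numbering is my own convention:

```python
import os

def split_sql_file(path, out_dir, lines_per_file=10000):
    """Split a large .sql script into numbered chunks of at most
    `lines_per_file` lines each, written into `out_dir`."""
    os.makedirs(out_dir, exist_ok=True)
    with open(path) as src:
        chunk, part = [], 0
        for line in src:
            chunk.append(line)
            if len(chunk) == lines_per_file:
                with open(os.path.join(out_dir, "part_%03d.sql" % part), "w") as f:
                    f.writelines(chunk)
                chunk, part = [], part + 1
        if chunk:  # write the final, partial chunk
            with open(os.path.join(out_dir, "part_%03d.sql" % part), "w") as f:
                f.writelines(chunk)
```

Note that this splits on line boundaries only; it does not check for GO batch separators, so very long multi-line statements could in principle be cut in half.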
IMPORT THE DATA IN THE SECOND SERVER
Create a .bat file in the same folder, name execFiles.bat
You may need to check the table schema to disable the identity in the first file; you can add it back after the import is finished.
This will execute all the files in the folder against the server and the database; the -f defines the Unicode text encoding to use, so accents are handled:
for %%G in (*.sql) do sqlcmd /S ServerName /d DatabaseName -E -i"%%G" -f 65001
pause

SQL Server 2008 R2: Import table from text file when column names have a / in them

How do you import a table into a database from a text file when that table has /'s in its column names?
I have a SQL Server 2008 R2 table with columns that have /'s in their name. (SAP Database)
For example: /BIC/FIC_SD001 has a column named /BIC/O_CST_00
I have done an export (using the Import/Export Wizard) of this table to a text file and the slashes are there.
When I import the table into a different database (using the wizard), all the /'s in the column names are removed and replaced with spaces.
The above column then looks like this: BIC O_CST_00
Thank you in advance for your help!
In the step of the wizard called Select Source Tables and Views, you have to click Edit Mappings.... There you can manually change the destination column names (putting back the missing slashes).
You can use the BCP utility to import or export data between SQL Server and a text file.
bcp Database.TableName out "Location of the text file" -c -S ServerName -T
The above command exports the data from the SQL Server table to the flat file.
To load the data from the flat file into SQL Server, the command is as follows:
bcp Database.TableName in "Location of the text file" -c -S ServerName -T
The above two commands work with Windows Authentication.

Moving results of T-SQL query to a file without using BCP?

What I want to do is output some query results to a file. Basically, when I query the table I'm interested in, my results look like this:
HTML_ID HTML_CONTENT
1 <html>...
2 <html>...
3 <html>...
4 <html>...
5 <html>...
6 <html>...
7 <html>...
The HTML_CONTENT field is of type ntext, and each record's value is around 500+ characters (containing HTML content).
I can create a cursor to move each record's content to a temp table or whatever.
But my question is this: instead of temp table, how would I move this without using BCP?
BCP isn't an option as our sysadmin has blocked access to sys.xp_cmdshell.
Note: I want to store each record's HTML content to individual files
My version of sql is: Microsoft SQL Server 2008 (SP1) - 10.0.2531.0
You can make use of SSIS to read the table data and output the content of the table rows as files. The Export Column transformation, available within the Data Flow Task of SSIS packages, might help you do that.
Here is an example: The Export Column Transformation
MSDN documentation about the Export Column transformation.
This answer would have worked until you added the requirement for Individual Files.
You can run the SQL from command line and dump the output into a file. The following utilities can be used for this.
SQLCMD
OSQL
Here is an example with SQLCMD and an inline query:
sqlcmd -S ServerName -E -Q "Select GetDate() CurrentDateAndTime" > output.txt
You can save the query to a file (QueryString.sql) and use -i instead:
sqlcmd -S ServerName -E -i QueryString.sql > output.txt
Edit
Use SSIS:
Create a package.
Create a variable called RecordsToOutput of type Object at the package level.
Use an Execute SQL Task to get the dataset back into RecordsToOutput.
Use a For-Each loop to go through the RecordsToOutput dataset.
In the loop, create a variable for each column in the dataset (give it the same name).
Add a Data Flow task.
Use an OLE DB source and a SQL statement to create one row (with the data you already have).
Use a flat-file destination to write out the row.
Use expressions on the flat-file connection to change the name of the destination file for each row in the loop.
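If SSIS feels heavy for this, the same per-row output can be scripted once the rows have been fetched with any client library. Here is a sketch in Python of just the file-writing half; `rows` stands in for the fetched (HTML_ID, HTML_CONTENT) result set, and the <id>.html naming is my own convention:

```python
import os

def dump_rows_to_files(rows, out_dir):
    """Write each (html_id, html_content) row to its own .html file,
    named after the row's id."""
    os.makedirs(out_dir, exist_ok=True)
    for html_id, html_content in rows:
        path = os.path.join(out_dir, "%s.html" % html_id)
        with open(path, "w", encoding="utf-8") as f:
            f.write(html_content)

# e.g. dump_rows_to_files([(1, "<html>...")], r"C:\html_out")
```

This keeps the database side to a single SELECT and needs no cursor, temp table, or xp_cmdshell access.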