What's the easiest way to export data to Excel from SQL Server 2000?
I want to do this from commands I can type into query analyzer.
I want the column names to appear in row 1.
In Query Analyzer, go to the Tools -> Options menu. On the Results tab, choose to send your output to a CSV file and select the "Print column headers" option. The CSV will open in Excel and you can then save it as a .XLS/.XLSX
Manual copy and paste is the only way to do exactly what you're asking. Query Analyzer can include the column names when you copy the results, but I think you may have to enable that somewhere in the options first (it's been a while since I used it).
Other alternatives are:
Write your own script or program to convert a result set into a .CSV or .XLS file
Use a DTS package to export to Excel
Use bcp.exe (but it doesn't include column names, so you have to kludge it; a common workaround is sketched after this list)
Use a linked server to a blank Excel sheet and INSERT the data
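For the bcp kludge mentioned above, the usual trick is to UNION a header row onto the data and cast every column to varchar so the two row types line up. A rough sketch with made-up table and column names:

-- header row sorts first, data rows after it
SELECT col_a, col_b
FROM (
    SELECT 'col_a' AS col_a, 'col_b' AS col_b, 0 AS sort_key
    UNION ALL
    SELECT CAST(col_a AS varchar(50)), CAST(col_b AS varchar(50)), 1
    FROM dbo.source_table
) AS t
ORDER BY sort_key

Run that query through bcp.exe with the queryout option and the header row comes out first.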
Generally speaking, you cannot export data from MSSQL to a flat file using pure TSQL, because TSQL cannot manipulate anything outside the database (using a linked server is sort of cheating). So you usually need to use some sort of client application anyway, whether it's bcp.exe, dtswiz.exe or your own program.
And as a final comment, MSSQL 2000 is no longer supported (unless your company has an extended maintenance agreement) so you may want to look into upgrading at some point.
We want some of our customers to be able to export some data into a file and then we have a job that imports that into a blank copy of a database at our location. Note: a DBA would not be involved. This would be a function within our application.
We can ignore table schema differences - they will match. We have different tables to deal with.
So on the customer side the function would run something like:
insert into myspecialstoragetable select * from source_table
insert into myspecialstoragetable select * from source_table_2
insert into myspecialstoragetable select * from source_table_3
I then run a SELECT * FROM myspecialstoragetable and get a .sql file they can ship to me, which we can then import into our copy of the database with some job or SQL script.
I'm thinking we can use XML somehow, but I'm a little lost.
Thanks
Have you looked at the bulk copy utility bcp? You can wrap it with your own program to make it easier for less sophisticated users.
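As a rough illustration (the server name, database, and output path below are all made up), bcp can be invoked from T-SQL through xp_cmdshell, or your wrapper program can shell out to it directly:

-- xp_cmdshell must be enabled on the server for this to work
EXEC master..xp_cmdshell
    'bcp "SELECT * FROM mydb.dbo.myspecialstoragetable" queryout "C:\export\data.csv" -c -t, -T -S myserver'

Here -c writes character data, -t, sets a comma as the field terminator, and -T uses a trusted connection.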
Since it is a function within your application, in what language is the application front-end written? If it is .NET, you can use Data Transformation Services (DTS) in SQL Server to do a sample export. In the last step, you can save the steps into a VB/.NET module. If necessary, modify this file to change table names etc., then integrate this DTS module into your application. While doing the sample export, export to a suitable format such as .CSV or Excel, whichever format you will be able to import into a blank database.
Every time the user wants to do an export, he will have to click a button that invokes the DTS module integrated into your application, which will dump the data to the desired format. He can then mail the file to you.
If your application is not written in .NET, then whichever language it is written in will have options to read data from SQL Server and dump it to a .CSV or delimited text file. If it is a primitive language, you may have to do it yourself by looping through the records, concatenating the fields of each one, and writing them to a file.
XML would be too far-fetched for this, though it's not impossible. At your end, you would need the ability to parse the XML file and import it at your location. Also, XML is not really suitable if the number of records is very large.
You are probably thinking of a .sql dump file, as in MySQL. In SQL Server, the .sql files generated by the 'Generate Scripts' function of the management interface contain table structures/DDL rather than INSERT statements carrying each record's literal values.
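That said, you can roll your own INSERT generation by concatenating strings in T-SQL. A minimal sketch, assuming hypothetical columns id (int) and name (varchar), and doubling up any embedded single quotes:

SELECT 'INSERT INTO myspecialstoragetable (id, name) VALUES ('
    + CAST(id AS varchar(12)) + ', '''
    + REPLACE(name, '''', '''''') + ''')'
FROM source_table

Save the result set to a file and you have the .sql script described above.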
I want to transfer SQL query results to a new CSV file. This is because I have placed my SQL query inside a loop, which should export the query results to a CSV file on each iteration. I'm using MS SQL Server 2012. I don't want to use a GUI option.
SQL Server is not really designed to import and export files. You can use the bulk copy program (bcp), but I don't think it works inside T-SQL code (for looping). You can use OPENROWSET, but you need to set a special flag that opens up your surface area of attack, which some do not want to do.
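For reference, the flag being alluded to is the 'Ad Hoc Distributed Queries' server option, enabled like this (shown only to make the trade-off concrete; turning it on is exactly the surface-area concern mentioned above):

EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'Ad Hoc Distributed Queries', 1
RECONFIGURE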
The answer is SSIS (or a tool like Talend). It ships with SQL Server and is designed by MS as the go-to tool for importing to and exporting from SQL Server. If you right-click on the database and choose Tasks -> Export Data, the wizard eventually creates and executes an SSIS package.
I recommend you reconsider a GUI option.
P.S. Another answer suggested using Save Results As. I have heard of problems with this method, including issues with delimiters and text-qualified fields.
There are multiple ways to achieve this. You can export the result set using BCP, using the Import/Export wizard, or using CTRL+SHIFT+S (this switches the result grid to Save As mode). Hope this helps.
First, a grumble: MS builds SQL Server Management Studio AND Excel, but can't make one save in the standard format of the other?
OK, I'm a data analyst, but not allowed to change/mod either the data or structures directly. So full READ, but no WRITE.
I'm trying to do a dump so I can do some of this analysis offline, as I have no remote access either.
So one VARCHAR column in this table is for comments on the purchase of the asset being described/tracked. Of course, there are commas. The only export types built into Management Studio are .csv and .txt, and .csv just turns into a mess when the comment text itself contains the comma delimiter.
So after an hour or so of screwing around with this (including reading a thread on methods for excluding one column from a SELECT while still exporting the other 221 columns in the table without writing them all out manually; fun reading, impressive, but it means I'd have to figure out which of those methods actually works, and then still export the one column separately and insert it into the Excel file by hand), I am throwing this problem on the pile at Stack Overflow.
Someone else must have worked around this frustration of the .csv export format versus the commas embedded in 'comment' text.
Any help would be appreciated.
Why don't you simply select all the data in the SSMS results grid, then copy and paste it into a blank Excel file?
It should paste all the data in the correct format, keeping comma-containing values in a single column.
Try that.
So if you replace the ' characters with some special character, you can export it:

SELECT REPLACE(columnName, '''', '`')
FROM [Table]
Another solution, if you use Management Studio, is the Import and Export Wizard:
https://learn.microsoft.com/en-us/sql/integration-services/import-export-data/start-the-sql-server-import-and-export-wizard
Trying (and largely succeeding) to export the results of a query from SQL Server to Excel, like so:
INSERT INTO OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'Excel 8.0;Database=c:\exported excel files\exported_data.xls;',
    'SELECT * FROM [Query$]')
SELECT dbo.blabbityblah FROM dbo.the_table
It works! Sort of. It does export the data to the Excel file, but it puts it all in there as text, even though some of the columns are datetime and most of them are numbers. None of them are being convert()-ed in the query itself. I've tried preformatting the cells in the actual Excel file before running the query, but it ignores the existing formatting and spits it all out as text again.
There's got to be a way to do this, right?
Excel doesn't really have column data types; it's text-based, and preformatting doesn't work because the export replaces the existing file. If you want real data types, try MS Access.
Look into using a schema.ini file to define the data types in a .csv or .txt file. When you open either in Excel, you may achieve what you want:
[sample_out.csv]
Format=CSVDelimited
DecimalSymbol=.
Col1=DATE datetime
Col2=FName Text
Another approach you may want to look at, depending on your needs, is the Import and Export Wizard. You can customize a query for the data and specify the data types in the wizard. If you are using a SKU other than Express, you can then run it right away, or save the SSIS package it generates for further manipulation.
I have an Excel spreadsheet with a few thousand entries in it. I want to import the table into a MySQL 4 database (that's what I'm given). I am using SQuirrel for GUI access to the database, which is being hosted remotely.
Is there a way to load the columns from the spreadsheet (which I can name according to the column names in the database table) to the database without copying the contents of a generated CSV file from that table? That is, can I run the LOAD command on a local file instructing it to load the contents into a remote database, and what are the possible performance implications of doing so?
Note, there is an auto-generated field in the table for assigning ids to new values, and I want to make sure that I don't override that id, since it is the primary key on the table (as well as part of other compound keys).
If you only have a few thousand entries in the spreadsheet then you shouldn't have performance problems (unless each row is very large of course).
You may have problems with some of the Excel data, e.g. currencies, best to try it and see what happens.
Re-reading your question, you will have to export the Excel data into a text file that is stored locally. But there shouldn't be any problems loading a local file into a remote MySQL database. I'm not sure whether you can do this with SQuirreL; you would need access to the MySQL command line to run the LOAD command.
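For reference, the MySQL statement would look roughly like this (the file path, table, and column names are made up; listing the columns explicitly lets the auto-increment id assign itself, which addresses the primary key concern in the question):

LOAD DATA LOCAL INFILE '/path/to/export.csv'
INTO TABLE my_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(col_a, col_b, col_c)

The LOCAL keyword makes the client read the file from your machine and send it to the remote server, and IGNORE 1 LINES skips the header row.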
The best way to do this would be to use Navicat, if you have the budget to make a purchase.
I made this tool where you can paste in the contents of an Excel file and it generates the CREATE TABLE and INSERT statements, which you can then just run. (I'm assuming SQuirreL lets you run a SQL script?)
If you try it, let me know if it works for you.