I'm using a basic query to gather data from a few joined tables, and I need to be able to export the data to a CSV (or text) file in order to be imported into Excel. The query format is:
SELECT
Item1 as 'blah'
FROM
table1 JOIN table2
WHERE Condition
GROUP BY ...
HAVING ....
The query itself already produces the output I need, so I'm only looking for a way to write that output to a file. If it would be easier to use a stored procedure, it would be no problem to wrap one around the query. I'm just looking for something that can write the output to a file, WITHOUT using a third-party tool, as this needs to be moderately portable.
If you need more detail from the query, I can supply that (but it really is basic).
Use the bcp utility [MSDN].
That's what's most commonly used.
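For example, bcp's queryout mode can run the query and write a delimited file in one step. A minimal sketch, run from a command prompt (the server name, database name, query, and output path below are placeholders):
rem -c = character mode, -t, = comma field terminator, -T = trusted connection
bcp "SELECT Item1 AS blah FROM MyDatabase.dbo.table1" queryout "C:\export\results.csv" -c -t, -S MyServer -T
If it has to be kicked off from inside the database, the same command can be wrapped in xp_cmdshell. Note that bcp does not write column headers, so add a header row yourself if Excel needs one.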
If you have access to SSIS (through the Business Intelligence Development Studio), create a data flow task with an OLE DB Source going to a Flat File Destination. Or, you can go straight to Excel, if you don't want to worry about converting the delimited file.
I am using a select statement in the Excel source to pull only specific columns from Excel for import.
But I am wondering: is it possible to select the data in such a way that I select, for example, a column named Column_1, but if that column does not exist in the Excel file, it falls back to a column named Column_2? Currently, if Column_1 is missing, the data flow task fails.
Use a Script Component and write .NET code to read the Excel file, then check whether Column_1 is available in the file. If that column is not present, use Column_2 as the input instead. A Script Component in SSIS can act as a source.
SSIS is metadata-based and does not support dynamic metadata; however, you can use a Script Component, as #nitin-raj suggested, to handle all known source columns. There is a good post below on how it can be done.
Dynamic File Connections
If you have many such files with varying columns, it is better to create a custom component. However, you cannot have dynamic metadata even with a custom component; the set of columns must be known to SSIS up front.
If the list of columns keeps changing and you cannot know the expected columns in advance, you are better off handling the entire thing in C#/VB.NET using a Script Task in the control flow.
As a best practice, because SSIS metadata is static, any data quality and formatting issues in source files should be corrected before the SSIS data flow task runs.
I have seen this situation before and there is a very simple fix. At the beginning of your SSIS package, use a File System Task to create a copy of the source Excel file, then run a C# script or a PowerShell script to fix the columns: if Column_1 does not exist, it is added at the appropriate spot in the Excel file, or, if the column name is simply wrong, it is corrected.
As a result, you will not need to refresh your SSIS metadata every time the package fails. This is a standard data standardization practice.
The easiest way is to add two data flow tasks, one data flow for each Excel source select statement and use precedence constraints to execute the second data flow when the first one fails.
The disadvantage of this approach is that if the first data flow task fails for any other reason, the second one will still be executed. You would need some more advanced error handling to check whether the error was actually thrown because of the missing column.
But in a similar situation, I would use a Script Task to check whether the column exists and build the SQL command dynamically. Note that this SQL command must always return the same metadata (you must use aliases).
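For example, the Script Task could pick between two statements like the following (the sheet name, column alias, and fallback column are placeholders) and write the chosen text into a package variable used as the source query, so the data flow always sees the same column name:
SELECT [Column_1] AS [SourceValue] FROM [Sheet1$]
SELECT [Column_2] AS [SourceValue] FROM [Sheet1$]
Because both statements return a single column aliased as SourceValue, the source metadata stays identical regardless of which column actually exists in the workbook.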
Helpful links
Overview of SSIS Precedence Constraints
Working with Precedence Constraints in SQL Server Integration Services
Precedence Constraints
We want some of our customers to be able to export some data into a file and then we have a job that imports that into a blank copy of a database at our location. Note: a DBA would not be involved. This would be a function within our application.
We can ignore table schema differences - they will match. We have different tables to deal with.
So on the customer side the function would run something like:
insert into myspecialstoragetable select * from source_table
insert into myspecialstoragetable select * from source_table_2
insert into myspecialstoragetable select * from source_table_3
I would then run a select * from myspecialstoragetable to produce a .sql file that they can ship to me, and we would use some job/SQL script to import it into our copy of the database.
I'm thinking we can use XML somehow, but I'm a little lost.
Thanks
Have you looked at the bulk copy utility bcp? You can wrap it with your own program to make it easier for less sophisticated users.
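A rough sketch of what that wrapper might run, assuming trusted connections and made-up server, database, and file names: bcp out on the customer's side to create the file, bcp in at your location to load it into the blank copy.
rem Customer side: export each table to a pipe-delimited character file
bcp CustomerDb.dbo.source_table out source_table.dat -c -t"|" -S CustomerServer -T
rem Your side: load the file into the matching (empty) table
bcp ImportDb.dbo.source_table in source_table.dat -c -t"|" -S YourServer -T
Since the schemas match, one pair of commands per table is all the job needs.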
Since it is a function within your application, in what language is the application front end written? If it is .NET, you can use Data Transformation Services (DTS) in SQL Server to do a sample export and, in the last step, save the steps as a VB/.NET module. If necessary, modify this module to change table names and so on, then integrate the DTS module into your application. When doing the sample export, export to a suitable format such as CSV or Excel, whichever format you will be able to import into the blank database.
Every time the user wants to do an export, he clicks a button that invokes the DTS module integrated into your application, which dumps the data in the desired format. He can then mail that file to you.
If your application is not written in .NET, whatever language it is written in will still have some way to read data from SQL Server and dump it to a CSV or delimited text file. If it is a primitive language, you may have to do it yourself by looping through the records, concatenating the fields of each record, and writing the lines to a file.
XML would be overkill for this, though it's not impossible; at your end you would need to be able to parse the XML file and import it. XML is also not well suited to a very large number of records.
You are probably thinking of a .sql file as used in MySQL. In SQL Server, the .sql files generated by the 'Generate Scripts' function of the management tools contain table structures/DDL rather than INSERT statements carrying each record's literal values.
I have a bunch of different XML sources (over 100), and I am trying to pipe them all into an OLE DB destination via SSIS. I started out trying to use the Union All component, but since not all of the data that goes into each row of the database is of the same type, some of the values are marked as "ignore" by SSIS and come out as null. Each XML source has multiple outputs as well, so I am trying to avoid manually pointing each node of each XML source at a different destination and then combining them. I know you can change the data type of the columns in the XML source under the advanced options, but there doesn't appear to be a way to default everything to one data type (string would work for me). Is there a way to make the Union All feature work for me, or a better way to do this? Thanks in advance.
Does your server have access to the folder that contains the XML files?
If it does, using SSIS for this is not necessary. Just use OPENROWSET.
You can also create a stored procedure that selects the data from all the XML sources and UNION ALLs the results. That stored procedure can then be used as the source in your SSIS package.
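A minimal sketch of that stored-procedure idea, assuming the files sit in C:\xml\ on the server and each file has a root <rows> element with <row id="..." name="..."> children (the paths, element names, and columns are made up); everything is cast to one string type, which also sidesteps the Union All type mismatches:
CREATE PROCEDURE dbo.LoadAllXmlSources
AS
BEGIN
    -- Read one file as a single XML value and shred it into rows;
    -- repeat the block (or loop over a file list) for the other sources.
    SELECT r.value('@id',   'varchar(100)') AS id,
           r.value('@name', 'varchar(100)') AS name
    FROM (SELECT CAST(BulkColumn AS XML) AS doc
          FROM OPENROWSET(BULK 'C:\xml\source1.xml', SINGLE_BLOB) AS f) AS x
    CROSS APPLY x.doc.nodes('/rows/row') AS t(r)
    UNION ALL
    SELECT r.value('@id',   'varchar(100)'),
           r.value('@name', 'varchar(100)')
    FROM (SELECT CAST(BulkColumn AS XML) AS doc
          FROM OPENROWSET(BULK 'C:\xml\source2.xml', SINGLE_BLOB) AS f) AS x
    CROSS APPLY x.doc.nodes('/rows/row') AS t(r);
END
The package then needs only a single OLE DB Source that calls this procedure, and every column arrives as a string.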
How do I copy data from multiple tables within one database to another database residing on a different server?
Is this possible through a BTEQ Script in Teradata?
If so, provide a sample.
If not, are there other options to do this other than using a flat-file?
This is not possible using BTEQ alone since, as you mentioned, the two databases reside on different servers.
There are two solutions for this.
ARCMAIN - You need to run an ARCMAIN backup first, which creates files containing the data from your tables. Then you run an ARCMAIN restore, which restores the data from those files.
TPT - Teradata Parallel Transporter. This is a very advanced tool. It does not create any files the way ARCMAIN does; it moves the data directly between two Teradata servers. (Wikipedia)
If I am understanding your question, you want to move a set of tables from one DB to another.
You can use the following syntax in a BTEQ Script to copy the tables and data:
CREATE TABLE <NewDB>.<NewTable> AS <OldDB>.<OldTable> WITH DATA AND STATS;
Or just the table structures:
CREATE TABLE <NewDB>.<NewTable> AS <OldDB>.<OldTable> WITH NO DATA AND NO STATS;
If you get really savvy, you can create a BTEQ script that dynamically builds the statement above in a SELECT, exports the results, and then runs the newly exported file, all within a single BTEQ script.
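A rough outline of that trick, assuming the tables to copy are listed in DBC.TablesV and the database names are placeholders: the SELECT generates one CREATE TABLE ... WITH DATA AND STATS statement per table, the export writes them to a file, and .RUN FILE executes that file.
.SET WIDTH 500
.EXPORT REPORT FILE = copy_tables.btq;
SELECT 'CREATE TABLE NewDB.' || TRIM(TableName) ||
       ' AS OldDB.' || TRIM(TableName) ||
       ' WITH DATA AND STATS;' (TITLE '')
FROM DBC.TablesV
WHERE DatabaseName = 'OldDB' AND TableKind = 'T';
.EXPORT RESET;
.RUN FILE = copy_tables.btq;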
There are a number of other options available with CREATE TABLE <...> AS <...>;. You would be best served by reviewing the Teradata manuals for the details.
There are a few more options that will allow you to copy data from one table to another.
Possibly the simplest would be to write a smallish program that uses one of Teradata's connectivity layers (ODBC, .NET Data Provider, JDBC, CLI, etc.) to run a SELECT against the source and the corresponding INSERTs against the target. This requires some work, but it has less overhead than learning to write TPT scripts, and you would not need any DBA permissions to write it yourself.
Teradata also sells other applications that hide the complexity of some of these tools. Teradata Data Mover provides an abstraction layer over tools like ARCMAIN and TPT. Access to this tool is most likely restricted to DBA types.
If you want to move data from one server to another:
You can do this with a flat file.
First, fetch the data from the source table into a flat file with a utility such as BTEQ or FastExport.
Then load that data into the target table using MultiLoad, FastLoad, or a BTEQ script.
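A small sketch of that flow using BTEQ only, with a made-up two-column table and placeholder logon strings (FastExport and FastLoad/MultiLoad follow the same export-then-load pattern with much better throughput for large tables):
/* On the source system: export the table to a flat file in data format */
.LOGON source_tdp/user,password;
.EXPORT DATA FILE = source_table.dat;
SELECT id, name FROM SourceDB.source_table;
.EXPORT RESET;
.LOGOFF;
/* On the target system: read the file back and insert row by row */
.LOGON target_tdp/user,password;
.IMPORT DATA FILE = source_table.dat;
.REPEAT *
USING (id INTEGER, name VARCHAR(50))
INSERT INTO TargetDB.source_table (id, name) VALUES (:id, :name);
.LOGOFF;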
What's the easiest way to export data to Excel from SQL Server 2000?
I want to do this from commands I can type into query analyzer.
I want the column names to appear in row 1.
In Query Analyzer, go to the Tools -> Options menu. On the Results tab, choose to send your output to a CSV file and select the "Print column headers" option. The CSV will open in Excel and you can then save it as a .XLS/.XLSX
Manual copy and paste is the only way to do exactly what you're asking. Query Analyzer can include the column names when you copy the results, but I think you may have to enable that somewhere in the options first (it's been a while since I used it).
Other alternatives are:
Write your own script or program to convert a result set into a .CSV or .XLS file
Use a DTS package to export to Excel
Use bcp.exe (but it doesn't include column names, so you have to kludge it)
Use a linked server or OPENROWSET against a blank Excel sheet and INSERT the data (sketched below)
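For the linked server / OPENROWSET route, the classic SQL Server 2000 pattern looks roughly like this. It assumes the Jet provider is available on the server, ad hoc OPENROWSET queries are allowed, and C:\export\orders.xls already exists with a header row in Sheet1 matching the column names (the table and column names here are made up):
INSERT INTO OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'Excel 8.0;Database=C:\export\orders.xls;HDR=YES',
    'SELECT OrderID, CustomerName FROM [Sheet1$]')
SELECT OrderID, CustomerName
FROM dbo.Orders
Because the workbook's header row supplies the column names, this also satisfies the requirement of having the names in row 1.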
Generally speaking, you cannot export data from MSSQL to a flat file using pure T-SQL, because T-SQL cannot manipulate anything outside the database (using a linked server is sort of cheating). So you usually need some sort of client application anyway, whether it's bcp.exe, dtswiz.exe, or your own program.
And as a final comment, MSSQL 2000 is no longer supported (unless your company has an extended maintenance agreement) so you may want to look into upgrading at some point.