Export data to Excel file using SSIS

I have an SSIS package which exports data from a table to an Excel file.
(Control flow and data flow screenshots omitted.)
These are my steps:
Drop the Excel table
Create the Excel table with the same format as the SELECT query I use to retrieve data from the database
Insert the data from the database into the Excel file
I used a query like Select * From Table Where --Some Condition
The query retrieves 3,000 of the 10,000 rows and puts those 3,000 rows in my Excel sheet.
But when I open my Excel sheet, I see a scrollbar that goes all the way down to the 10,000th row before it ends, and hence my Excel file size also increases. How can I reduce the file size? My Excel sheet contains only 3,000 rows, so why are there blank cells all the way down to the 10,000th row?
SQL Server 2008 and Visual Studio 2008 with BIDS

I believe your issue comes from the method you are using to create the file. You have two alternatives, and either should fix it:
Solution #1:
You can create an Excel file with those predefined columns, essentially your empty output file; this would act as your 'template file'. Your flow would then be:
File System Task - Copy the template file to the output or working directory (rename if necessary; see the sketch after this list)
OLEDB Source Task - Query your source for the data (the 3,000 rows)
Data Conversion Task
Excel Destination Task - Put the data into the new Excel file
Note: You already have steps 2 through 4 complete; you just need to make sure you are connecting to the new Excel file. Also, to clarify, step 1 lives in the control flow, outside the Data Flow Task.
This way is helpful because you always have a blank and consistently formatted Excel file to copy and work with.
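For reference, the copy in step 1 is equivalent to a one-line file copy. A minimal C# sketch of what the File System Task does, with hypothetical paths:

    // What the File System Task in step 1 does: copy the blank template
    // over the output file, overwriting any previous run's output.
    // Both paths are hypothetical examples.
    string template = @"C:\Templates\ReportTemplate.xls";
    string output = @"C:\Output\Report.xls";
    System.IO.File.Copy(template, output, true);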
Solution #2:
The other option is to use a Script Task and create the Excel file there; you could also load the data into the file in the same task. This requires a basic understanding of VB.NET or C#, and you would need an XLS library (such as NPOI). This is more complicated, but gives you the most flexibility.
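For illustration, here is a minimal sketch of what that Script Task body might look like with NPOI; the sheet name, columns, rows, and output path are hypothetical, and in practice you would loop over your own source rows:

    using System.IO;
    using NPOI.HSSF.UserModel;  // XLS (Excel 97-2003) workbook model
    using NPOI.SS.UserModel;

    // Build a new XLS workbook with a header row and one data row.
    var workbook = new HSSFWorkbook();
    ISheet sheet = workbook.CreateSheet("Export");

    IRow header = sheet.CreateRow(0);
    header.CreateCell(0).SetCellValue("Id");
    header.CreateCell(1).SetCellValue("Name");

    IRow row = sheet.CreateRow(1);
    row.CreateCell(0).SetCellValue(1);
    row.CreateCell(1).SetCellValue("Sample");

    // The file contains exactly the rows you wrote - no trailing
    // formatted-but-empty cells, which is what caused the size issue.
    using (var fs = new FileStream(@"C:\Output\Report.xls", FileMode.Create))
    {
        workbook.Write(fs);
    }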
I recommend you try solution #1 and see how that works for you.

Drop table SheetName doesn't delete the sheet; it just deletes the rows. If you loaded 10K rows the first time and then executed the package again restricting the number of rows to 3K, the Excel file will still contain those 10K rows' worth of cells, because it retains the sheet along with the empty space.
You can use a Script Task to delete the sheet using COM objects. But for that you need to install the Excel PIA (Primary Interop Assembly) to make it visible to the scripting environment, or else create a new Excel file every time the package runs.
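A rough sketch of that Script Task approach, assuming the Excel PIA is referenced; the path and sheet name are hypothetical (and with older C# versions, Workbooks.Open may need Type.Missing placeholders for its optional arguments):

    using Excel = Microsoft.Office.Interop.Excel;

    // Open the workbook, delete the stale sheet, and save, so the
    // package can recreate the sheet from scratch. Requires the Excel
    // PIA; path and sheet name are hypothetical examples.
    var app = new Excel.Application { DisplayAlerts = false };
    try
    {
        Excel.Workbook book = app.Workbooks.Open(@"C:\Output\Report.xls");
        ((Excel.Worksheet)book.Worksheets["Sheet1"]).Delete();
        book.Save();
        book.Close();
    }
    finally
    {
        app.Quit();
    }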
Otherwise, as suggested by Nicarus, use a File System Task to delete the existing file and create a new Excel file on every execution.
(Diagram and File System Task screenshots omitted.)
Use the same components as before: the Create Table query in an Execute SQL Task, plus your DFT.

Related

Exporting data with NTEXT columns to Excel on a regular basis (as a job)

Initial situation:
I'm stuck on a simple task (in my opinion it should be simple...).
I have a collection of data which should be exported weekly to Excel.
This export contains 104 columns, of which 57 are nvarchar(max) and contain item descriptions and other information in different languages for our sales people.
The report will have somewhere around 2,000 to 8,000 rows.
I am using SQL Server 2017 CU16.
My intention:
I intended to build an SSIS job with an Excel template where the columns are predefined (width, data type, and so on).
The job would have roughly these steps:
Delete the existing Excel file
Copy the Excel template as a new Excel file
Data Flow Task using SQL Server as the source and an Excel Destination as the target
What I already tried:
If I use the Excel template with only headers, I get the following error for each of the nvarchar(max) columns:
[Excel Destination [2]] Error: An error occurred while setting up a binding for the "ColumnName" column. The binding status was "DT_NTEXT".
When I prepare the template prefilled with one row that has long text (more than 255 characters) in the columns that are nvarchar(max) in the source, everything runs fine, but this dummy line still exists in the output.
I also tried dropping the sheet using an Execute SQL Task against the Excel file connection and recreating the sheet with a CREATE TABLE statement in another Execute SQL Task against the same connection; I get the same error as above, although I'm using NTEXT as the data type for the relevant columns.
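For reference, that drop/recreate attempt amounts to statements like the following, shown here as a hedged C#/OLE DB sketch; the provider, path, sheet, and column names are hypothetical, and LONGTEXT is the Jet/ACE memo type that corresponds to DT_NTEXT:

    using System.Data.OleDb;

    // Sketch of dropping and recreating the sheet through the Excel
    // connection. All names and the path are hypothetical examples.
    string conn = @"Provider=Microsoft.ACE.OLEDB.12.0;" +
                  @"Data Source=C:\Output\Report.xlsx;" +
                  @"Extended Properties=""Excel 12.0 Xml;HDR=YES""";

    using (var excel = new OleDbConnection(conn))
    {
        excel.Open();
        new OleDbCommand("DROP TABLE [Sheet1$]", excel).ExecuteNonQuery();
        new OleDbCommand(
            "CREATE TABLE Sheet1 (ItemNo VARCHAR(50), Description LONGTEXT)",
            excel).ExecuteNonQuery();
    }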
Question:
How can I export data seamlessly into a preformatted Excel file when the source contains NTEXT columns?
Thank you very much in advance for any assistance.

SSIS: I need to populate 2 Excel spreadsheets from 2 different queries inside a For Each Loop

I am creating multiple Excel files from an SSIS package. I assign a dynamic file name to each workbook in the For Each Loop, in a File System Task that copies a blank template from the template directory to the output directory, where a few hundred Excel files need to be written. Part of the file name comes from the values in the loop.
Each file has 2 worksheets, and a different SQL query is the source for each. When I output files with 1 worksheet filled, there is no problem. However, when I add the second source/destination pair or another data flow, I get an error that the connection string is in an invalid format, or similar. Should I use a second set of OLE DB Source / Excel Destination components within the same data flow, or use another data flow for the second worksheet?

Processing Excel worksheets with different table structures from SSIS into different SQL Server tables dynamically?

Can I process an Excel workbook's worksheets, each with a different table structure, from SSIS into different SQL Server tables dynamically?
I have an Excel file with 3 worksheets. I want to load those worksheets into one database, but each into its own table. What would be the best practice for processing this file? I'm new to SSIS. Thank you in advance.
You could, assuming you already have the target tables set up in the database. There is a property called OpenRowset in the advanced properties of the Excel Source, where you can specify which sheet to load as well as which columns.
For example, Sheet1$A1:Z will load data from sheet Sheet1, columns A to Z, starting from row #1 if you did not mark the header row as row #1.
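If you want to sanity-check that range syntax outside SSIS, a quick ad-hoc OLE DB query works; a C# sketch with a hypothetical path:

    using System.Data.OleDb;

    // Read the Sheet1$A1:Z range directly - the same syntax the
    // OpenRowset property uses. Path is a hypothetical example;
    // HDR=NO treats row #1 as data rather than headers.
    string conn = @"Provider=Microsoft.ACE.OLEDB.12.0;" +
                  @"Data Source=C:\Input\Book1.xlsx;" +
                  @"Extended Properties=""Excel 12.0 Xml;HDR=NO""";

    using (var excel = new OleDbConnection(conn))
    using (var cmd = new OleDbCommand("SELECT * FROM [Sheet1$A1:Z]", excel))
    {
        excel.Open();
        using (OleDbDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                System.Console.WriteLine(reader[0]); // first column of each row
            }
        }
    }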

UNION ALL with Excel file as data source

I have the following problem.
I have several Excel files in one folder, each containing the data for one country.
I want to pull all of that into one Excel report.
As the content of the source files changes daily, I guess the best way to do this is an import via a SQL statement using UNION ALL.
However, the problem is that MS Query only allows me to access one file at a time. Is there a workaround for this problem?
Maybe create a data model and use DAX?
This sounds like a job for Power Query, a free add-in from Microsoft for Excel 2010 and Excel 2013, and built into Excel 2016 as "Get and Transform" in the Data ribbon.
You can create individual queries against the different Excel files in the folder, then create a query that appends all the previous queries into one table, which can be loaded into the Excel data model or a worksheet table for further processing.
The queries can be refreshed with a click when the data has changed.

Best Way to ETL Multiple Different Excel Sheets Into SQL Server 2008 Using SSIS

I've seen plenty of examples of how to enumerate through a collection of Excel workbooks or sheets using the Foreach Loop Container, with the assumption that the data structure of all of the source files is identical and the data is going to a single destination table.
What would be the best way to handle the following scenario:
- A single Excel workbook with 10-20 sheets, OR 10-20 Excel workbooks with 1 sheet each.
- Each workbook/sheet has a different schema.
- There is a 1:1 matching destination table for each source sheet.
- Standard cleanup: workbooks are created and placed in a "loading" folder; the SSIS package runs as a job that reads the files in the loading folder and moves them to an archive folder upon successful completion.
I know that I can create a separate SSIS package for each workbook, but that seems really painful to maintain. Any help is greatly appreciated!
We faced the same issue a while back. I will summarize what we did.
We wrote an SSIS package programmatically using C#. A meta-table is maintained which holds the information about the flat files (table name, columns, and the positions of those columns in the flat file). We extract the flat file's name and then query the meta-table for the table this flat file belongs to, the columns it has, and the column positions in the flat file.
We execute the package on the SQL Server by passing each flat file as a command-line argument to the package executable, so it reads and processes each flat file.
For example, suppose we have a flat file FF. We first extract the name of the flat file and then get the table name by querying the database; let's say it is TT, which contains columns COL-1 and COL-2 at positions 1 to 10 and 11 to 20 respectively. Reading this information from the meta-table, we create a derived column transformation in the package.
Our application has a set of flat files in a folder, and using an SSIS Foreach Loop Container we pick up one flat file at a time and run the above process.
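As an illustration of the meta-table lookup described above, a small C# sketch; the MetaTable schema, connection string, and names are hypothetical stand-ins for the idea:

    using System.Data.SqlClient;

    // Look up the destination table and the column layout for one
    // flat file. Schema and names are hypothetical illustrations.
    string connStr = "Server=.;Database=Staging;Integrated Security=true";
    string flatFile = "FF";

    using (var db = new SqlConnection(connStr))
    using (var cmd = new SqlCommand(
        "SELECT TableName, ColumnName, StartPos, EndPos " +
        "FROM dbo.MetaTable WHERE FlatFileName = @file ORDER BY StartPos", db))
    {
        cmd.Parameters.AddWithValue("@file", flatFile);
        db.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // e.g. TT: COL-1 (1-10) - this is what drives the
                // derived column transformation that slices each record.
                System.Console.WriteLine("{0}: {1} ({2}-{3})",
                    reader["TableName"], reader["ColumnName"],
                    reader["StartPos"], reader["EndPos"]);
            }
        }
    }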