SSIS: using a SQL Task to overwrite Excel data

I have an SSIS package that writes data to an Excel destination. My issue is that every time the package is run it appends the data to the end of the Excel file, and I want it to clear down the existing data and then insert. So I have added two Execute SQL Tasks before my Data Flow Task: the first drops the Excel table, the second creates the table, and then the Data Flow Task contains the Excel connection.
It is failing on the create SQL task with the following message, and I'm a little confused about how to go about it.
[Execute SQL Task] Error: Executing the query "CREATE TABLE `Excel Destination 1` (
`Name data..." failed with the following error: "Table 'Excel Destination 1' already exists.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.
Is the issue that, because I delete the Excel destination table in the control flow, the package falls over when it gets to the Excel destination task because the file isn't there, and that task then creates the file? The create SQL task uses the same SQL taken from the Excel Destination Editor, next to the name of the Excel sheet:
CREATE TABLE `Excel Destination 1` (
`Name` VARCHAR(225),
`Postcode` VARCHAR(15),
`Date1` DATETIME,
`Date2` DATETIME,
`Date3` DATETIME,
`Date4` DATETIME,
`Date5` DATETIME,
`Date6` DATETIME,
`Date7` DATETIME,
`Date8` DATETIME,
`Date9` DATETIME,
`Date10` DATETIME
)
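For reference, the first Execute SQL Task would run the matching drop statement against the same Excel connection (a sketch, assuming the sheet name matches the CREATE statement above):
DROP TABLE `Excel Destination 1`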
I have also tried to use a File System Task to copy the file to a new location, but this doesn't work: it still just appends data to the end of the worksheet on each run rather than overwriting.

Use the following approach to overwrite the data at the destination.

Have you tried using a "template" Excel file for each execution and renaming it? You can create a template based on the current Excel document and place it in another folder. A generic name such as "template" can be used, as long as the name is different from the file that's written to each time. The steps below outline this process further.
1. Use a File System Task to delete the Excel file from the last execution.
2. Copy the template file to the location you plan to write to with your Excel Destination, using another File System Task.
3. Rename the newly copied template file to the file name used in the Connection Manager for your Excel Destination.
4. Execute your Data Flow Task to load the data into the new Excel file.
One thing to note: make sure that you delete the rows containing data in the Excel file when you first create the template. If you only use the Clear Contents option, new data will be appended below them.

Related

Exporting data with NTEXT columns to Excel on a regular basis (as a job)

Initial situation:
I'm stuck on a simple task (in my opinion it should be simple...).
I have a collection of data which should be exported weekly to Excel.
This export contains 104 columns, of which 57 are nvarchar(max) and contain item descriptions and other information in different languages for our sales guys.
The report will have somewhere around 2,000 to 8,000 rows.
I'm using SQL Server 2017 CU16.
My intention:
I intended to build an SSIS job with an Excel template where the columns are predefined (width, data type and so on).
This job would have something like these steps:
Delete the existing Excel file
Copy the Excel template as a new Excel file
Data Flow Task using SQL Server as the source and the Excel destination as the target
What I already tried:
If I use the Excel template with only headers, I get the following error for each of the nvarchar(max) columns:
[Excel Destination [2]] Error: An error occurred while setting up a
binding for the "ColumnName" column. The binding status was
"DT_NTEXT".
When I prepare the template by prefilling it with one row, where that row has a long text (more than 255 characters) in the columns that are nvarchar(max) in the source, everything runs fine; but this dummy line still exists.
Another thing I tried was dropping the sheet using an Execute SQL Task against the Excel file connection and recreating the sheet with a CREATE TABLE statement in another Execute SQL Task against the same Excel file connection. I get the same error as above, although I'm using NTEXT as the data type for the relevant columns.
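For reference, the recreate statement looked roughly like this (a sketch with hypothetical column names; in Jet/ACE SQL, NTEXT is a synonym for the LONGTEXT memo type):
CREATE TABLE `Sheet1` (
`ItemNumber` VARCHAR(50),
`DescriptionEN` NTEXT,
`DescriptionDE` NTEXT
)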
Question:
How can I seamlessly export data into a preformatted Excel file which contains NTEXT columns?
Thank you very much in advance for any assistance.

Extract Data from SQL to Excel without overwriting the previous Data

I was able to use SQL Server Agent to create a job that extracts data from SQL Server 2008 to Excel format daily. However, is there any way to create a job that keeps all the extracts separately, without overwriting the previous files? I would very much appreciate your help.
Thank you.
When using the OLE DB/Jet data provider for Excel, there is a way to specify the target worksheet name. A worksheet in Excel is a rough equivalent of a database table. One option is to use a different worksheet name each time (say, based on the current date). Another option would be to append the data to the existing worksheet, if that's what you're after. If a SQL Server Agent job does not allow you to do something like that, then you may want to create a small app instead.
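For example, a date-stamped worksheet could be created by running a statement like this through the Excel connection (a sketch; the sheet and column names are hypothetical, and the name would normally be built dynamically from a variable):
CREATE TABLE `Extract_20240315` (
`Name` VARCHAR(255),
`Amount` DOUBLE
)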
When copying from OLE DB to Excel, you can dynamically pass the file name through an expression based on the current datetime.
Ex: FilePath + Date1_mmddyy_hhmmss
Every time the package runs, it will generate a file with a new file name; the Excel file path just needs to be supplied as an expression.
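As a sketch, such an expression could be placed on the Excel connection manager's ExcelFilePath property, assuming a hypothetical package variable @[User::FolderPath] holding the output folder:
@[User::FolderPath] + "Extract_"
+ (DT_WSTR, 4) YEAR(GETDATE())
+ RIGHT("0" + (DT_WSTR, 2) MONTH(GETDATE()), 2)
+ RIGHT("0" + (DT_WSTR, 2) DAY(GETDATE()), 2)
+ ".xls"
With DelayValidation set to True on the connection manager, each execution then writes to a brand-new, date-stamped file instead of appending to the previous one.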

Export data to Excel file using SSIS

I have an SSIS package which exports data from a table to an Excel file.
These are my steps:
1. Drop the Excel table
2. Create the Excel table with a format matching the SELECT query I use to retrieve data from the database
3. Insert data from the database into the Excel file
I used a query like SELECT * FROM Table WHERE --some condition
I retrieve 3,000 rows out of 10,000 rows and put those 3,000 rows in my Excel sheet.
But when I open my Excel sheet I see a scrollbar which goes down to the 10,000th row and then ends, and hence my Excel file size also increases. How can I reduce my Excel file size? My Excel sheet contains only 3,000 rows, so why are there blank cells going down to the 10,000th row?
SQL Server 2008 & Visual Studio 2008 with BIDS
I believe your issue is with the method you are using to create the file. You have two alternatives, and both should fix your issue:
Solution #1:
You can create an Excel file with those predefined columns, essentially your empty output file; this would act as your 'template file'. Your flow would then be this:
1. File System Task - copy the template file to the output or working directory (renaming it if necessary)
2. OLE DB Source Task - query your source for the data (3,000 rows)
3. Data Conversion Task
4. Excel Destination Task - put the data into the new Excel file
Note: You already have steps 2 through 4 complete; you just need to make sure you are connecting to the new Excel file. Also, to clarify, step 1 sits in the Control Flow, outside the Data Flow Task.
This way is helpful because you always have a blank and consistently formatted Excel file to copy and work with.
Solution #2:
The other option is to use a Script Task and create the Excel file in code; you could also load the data into the file in this task. This requires some basic understanding of VB.NET or C#. Basically, you would need to get an XLS library (like NPOI). This is more complicated, but gives you the most functionality.
I recommend you try solution #1 and see how that works for you.
DROP TABLE SheetName doesn't delete the sheet; it just deletes the rows. If on the first run you loaded 10K rows and then executed the package again restricting the number of rows to 3K, the Excel file will still contain those 10K now-empty rows, as it retains the sheet along with the empty cells.
You can use a Script Task to delete the sheet using COM objects, but for that you need to place the Excel PIA (Primary Interop Assembly) where it is visible to VSA, or else create a new Excel file every time the package runs.
Otherwise, as suggested by Nicarus, use a File System Task to delete the existing file and create a new Excel file on every execution.
Use the same components and the same CREATE TABLE query in an Execute SQL Task, together with your DFT.
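As a sketch, the Execute SQL Task in that flow would run a statement like this against the Excel connection after the old file has been deleted (hypothetical sheet and column names); with the Jet/ACE provider this recreates the workbook and defines the sheet before the data flow runs:
CREATE TABLE `Sheet1` (
`Name` VARCHAR(255),
`Amount` DOUBLE
)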

SSIS - SQL Task, ADO Connection, and Excel xlsx file issues

So I have a relatively simple SSIS job that does the following:
Execute SQL Task - Drop Sheet from Excel File
DROP TABLE `Table`
Execute SQL Task - Create Sheet in Excel File
CREATE TABLE `Table` (
`Col1` VarChar (255) ,
`Col2` Long ,
`Col3` VarChar (84) ,
`Col4` VarChar (60) ,
`Col5` VarChar (255) ,
`Col6` VarChar (20) ,
`Col7` VarChar (255) ,
`Col8` VarChar (255) ,
`Col9` VarChar(255))
Data Flow Task - Export Data from SQL to Excel
This just runs a SQL query [OLE DB Source], converts everything to Unicode strings, and exports the data to an Excel Destination.
NOTE: This job executes perfectly with no errors in BIDS 2005. However, when I initially tried running it in BIDS 2008 (32-bit mode), I got the following error on both the Drop Sheet and Create Sheet Execute SQL Tasks mentioned above:
Warning: Multiple-step OLE DB operation generated errors. Check each
OLE DB status value, if available. No work was done.
I found I could fix this by changing the ConnectionType property of my Execute SQL Tasks to ADO, and using the following connection string:
Data Source=\\<filelocation>\<ExcelFileName>.xls;Provider=Microsoft.ACE.OLEDB.12.0;Extended Properties=Excel 8.0;
At this point, I'm back to the package executing in BIDS 2008 with no errors. Everything runs great!
Then I tried to update the job from exporting to an Excel 97-2003 .xls file to exporting to an Excel 2007 .xlsx file.
So, per Microsoft, I had to change my Execute SQL Tasks to use the following connection string:
Data Source=\\<filelocation>\<ExcelFileName>.xlsx;Provider=Microsoft.ACE.OLEDB.12.0;Extended Properties=Excel 12.0 XML;
I also had to update the connection manager for my Excel Destination step (which DOES support the Excel 2007 format in BIDS 2008 per Microsoft) to the following:
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=\\<file location>\<Excel filename>.xlsx;Extended Properties="Excel 12.0 XML;HDR=YES";
So at this point, all the connections test good, and everything APPEARS like it should work. However, when I execute the package I have the following problems:
1. The Drop Sheet Execute SQL Task completes successfully according to BIDS. However, it doesn't actually drop the sheet in the Excel file anymore. It DOES delete all the data contained in the sheet, though.
2. Because of #1, the Create Sheet Execute SQL Task fails, because the sheet was never actually dropped.
Any ideas why this isn't working anymore? Honestly I've looked all over the internet and SO, and I have yet to see someone explain how to do this. Is there some new command to drop a sheet in Excel 2007?
For anyone else who may find this:
I fixed the issue with exporting data to XLSX files by replacing the two Execute SQL Tasks with two File System Tasks: one that deletes the existing file, and another that copies a "template" Excel file (basically just an empty spreadsheet with column headers) to the reporting directory.
Then I export the data to the new template file using the data flow task.
It's not my ideal solution; I much preferred the old method of having one file and just re-creating the table. However, apparently that's no longer an option with Excel 2007 files, and the File System Task method using a template file will work.
The two most helpful resources I found when working on this job were:
SSIS - Excel 2007 Connection Guide
Copy/Rename Files Using SSIS File System Task

SQL SSIS help: import an Excel sheet into a temp table

I have a fairly simple task: taking an Excel sheet and importing it into a SQL 2005 database table. I need to create an SSIS task for this. The Excel sheet does not have all the columns I need to make the insert directly into the permanent SQL table, but I know how I could link out to other tables and get the columns that are missing. So I was wondering how I could import the Excel sheet into a #tempTable (or a table variable), and then, once in a temp table, I could just write my SQL insert code (using the temp table as well as the other tables that I will link on) in a basic Execute SQL Task. But I am having trouble figuring out how to do this with SSIS: when I drag my Excel source and try to link it to a SQL Server Destination, the drop-down doesn't have an option for temp tables.
The SSIS way of doing this would be to use a Merge or Lookup transform. I don't think that you can put things into a temp table like that, but you could have an Execute SQL Task that creates an actual table, which you can then drop at the end of the package. Your package can then use that table.
At design time you might need to have the table in place to link things up, but it shouldn't need to be there when you actually run the package.
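A sketch of that pattern, with hypothetical table and column names:
-- Execute SQL Task, before the data flow: create a real (non-temp) staging table
CREATE TABLE dbo.ExcelStaging (
    ItemKey INT,
    Col1 VARCHAR(255)
);
-- Execute SQL Task, after the data flow has loaded dbo.ExcelStaging:
-- link out to the other tables and insert into the permanent table
INSERT INTO dbo.PermanentTable (ItemKey, Col1, MissingCol)
SELECT s.ItemKey, s.Col1, o.MissingCol
FROM dbo.ExcelStaging AS s
JOIN dbo.OtherTable AS o ON o.ItemKey = s.ItemKey;
-- Execute SQL Task, at the end of the package: drop the staging table again
DROP TABLE dbo.ExcelStaging;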
First, you'll need to create the staging table for the Excel worksheet. Open SSMS, right-click the database, and choose Tasks > Import Data. Set the import source as Excel and browse to the file. Set the destination as SQL Server. You can accept the table name or name it as you wish; I suggest naming it something useful. Depending on your understanding of data types and what is in the Excel sheet, it may take you a while to get this right. Eventually, you will have a table that will accept the contents of the Excel sheet.
Second, create your SSIS package using an Excel source and a SQL Server or OLE DB destination.
1. Take an Execute SQL Task in the control flow to create the target staging table for the Excel sheet source.
2. In the data flow, use an Excel source and an OLE DB destination pointing to the staging table created in the first step.
3. In the control flow, use a merge or join statement to combine your Excel staging table with the other source tables into the final target table.
thanks
prav