How to auto-fit columns (width) using the Excel Destination in SSIS?

I am working on an SSIS 2012 project where I receive a dataset after executing a stored procedure and save the data to an Excel file using an "Excel Destination" task in the data flow. It works well; however, I have a problem: for some fields the column width is not large enough to show the value. Some numbers are shown as #####, for example, or the end user sees only part of a sentence instead of the whole text.
Usually I handle this problem in Visual Studio using code like:
// Auto-fit each column to its contents after the data has been written
for (int i = 1; i <= RAWDATA_COLUMNS.Length; i++)
{
    worksheet.Column(i).AutoFit();
}
However, I could not find a way to handle this in SSIS. Writing a script to receive the data and save it into an Excel file is not an option; using the Excel Destination task in the data flow is a must. Any help or advice would be appreciated.
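For illustration, here is a self-contained version of that auto-fit pattern. It is a minimal sketch assuming the EPPlus library (whose worksheet.Column(i).AutoFit() matches the call above) and an .xlsx output file; the path and sheet index are hypothetical. In SSIS it could run in a Script Task placed after the data flow, so the Excel Destination still writes the data and the script only reformats the finished file:

using System.IO;
using OfficeOpenXml; // EPPlus (assumed library)

class AutoFitColumns
{
    static void Main()
    {
        // Hypothetical path of the file the Excel Destination just wrote
        var file = new FileInfo(@"C:\exports\report.xlsx");
        using (var package = new ExcelPackage(file))
        {
            // First worksheet (1-based index in EPPlus 4.x)
            ExcelWorksheet worksheet = package.Workbook.Worksheets[1];
            // Auto-fit every column in the used range
            for (int i = 1; i <= worksheet.Dimension.End.Column; i++)
            {
                worksheet.Column(i).AutoFit();
            }
            package.Save();
        }
    }
}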

Related

Automating scripts in q/kdb

I have some code in kdb+ (which uses q) that generates data in tabular form. The problem is that I need to run each line separately and export the data to Excel. Is there any way to automate this so that all the Excel files can be generated in one go?
Thanks
Whilst you can of course use CSV as suggested above, .h.edsn is the function used for creating Excel workbooks. It takes a dictionary of sheetNames -> tables.
/ write two sheets (tab1, tab2) into one workbook; columns given explicit names
`:/var/tmp/excel.xls 0: .h.edsn `tab1`tab2!(([]c1:10?10);([]c2:20?20))
Then just open the .xls file in Excel.
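To get all the files in one go, the same function can be applied across a list of tables. A minimal sketch, assuming the tables already exist as global variables (the table names and output directory are hypothetical):

tabs:`tab1`tab2`tab3    / tables to export (hypothetical names)
/ write one workbook per table, named after the table
writeOne:{(hsym `$"/var/tmp/",string[x],".xls") 0: .h.edsn (enlist x)!enlist value x}
writeOne each tabs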

SSIS foreach loop error: [SSIS.Pipeline] Error: "Excel Source" failed validation and returned validation status "VS_NEEDSNEWMETADATA"

I have created an SSIS project to load data from Excel file sources. I am using a foreach loop to take all the Excel files from a selected folder. The loop seems to take the first file, but when it takes the next file an error occurs, saying that it needs new metadata. The Excel sources all have the same file format and the same header (only one column of data).
I am looking for advice.
Sometimes Excel will actually have more columns than it appears to. Two ways to check this:
1. Save as CSV and see if you have extra ,,, at the end of each line.
2. Create a quick test package, connect to the Excel file, and verify that the source has only one column (see the sketch below).
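As a shortcut for the second check, you can list the columns each file exposes through the same kind of provider SSIS uses. A minimal C# sketch, assuming the ACE OLE DB provider is installed and the files are .xlsx (the path is hypothetical):

using System;
using System.Data;
using System.Data.OleDb;

class ListExcelColumns
{
    static void Main()
    {
        string file = @"C:\data\input.xlsx"; // hypothetical file to inspect
        string conn = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + file +
                      ";Extended Properties=\"Excel 12.0;HDR=YES\"";
        using (var cn = new OleDbConnection(conn))
        {
            cn.Open();
            // Ask the provider for every column it sees, sheet by sheet
            DataTable cols = cn.GetOleDbSchemaTable(OleDbSchemaGuid.Columns, null);
            foreach (DataRow row in cols.Rows)
                Console.WriteLine(row["TABLE_NAME"] + " -> " + row["COLUMN_NAME"]);
        }
    }
}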

Dynamically Created Excel from an Excel Template Not Writing Data Properly - SSIS

I need to create an SSIS package that will run daily and export a table's data into a directory. The exported Excel file has a predefined format, so I have used a template Excel file (an Excel file with column headers only).
Here are the steps I followed:
Created a variable FileName which holds the location and name of the Excel file to be generated (based on the current date value).
Added a File System Task in the control flow, with the template Excel file as the source and the FileName variable as the destination.
Added a Data Flow Task in the control flow and connected it to the File System Task.
In the Data Flow Task, added an OLE DB source and configured it with the source table (the table whose data needs to be copied into the Excel file).
Added an Excel connection manager and changed its ExcelFilePath property to the FileName variable (see the expression sketch after these steps).
Added an Excel Destination and configured it with the Excel connection manager.
Set DelayValidation to true on the Data Flow Task and executed the package.
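For reference, the FileName binding on the Excel connection manager is usually done with a property expression; a minimal sketch, assuming the variable lives in the User namespace:

ExcelFilePath = @[User::FileName]

With DelayValidation set to true, validation is postponed until run time, after the File System Task has created the file from the template.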
The package runs successfully and the Excel file is generated in the desired directory, but the file skips around 19,000 rows and only starts writing the data after that. Why is this happening?
Can anyone help me solve this issue?
Thanks for the help.
It is possible that the file is already formatted and that rows remain in use down at the bottom. Excel often jumps down or adds rows if you do not delete rows that were already used, even if they are empty. We must also consider strange events!
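One way to test this theory is to inspect and reset the template's used range before the package copies it. A minimal VBA sketch, assuming the template path and a single sheet with one header row (both are assumptions):

Sub CleanTemplate()
    Dim wb As Workbook, ws As Worksheet
    Set wb = Workbooks.Open("C:\templates\template.xls") ' hypothetical path
    Set ws = wb.Worksheets(1)
    Debug.Print ws.UsedRange.Address   ' e.g. $A$1:$D$19000 would confirm stale rows
    If ws.UsedRange.Rows.Count > 1 Then
        ' Keep only the header row; delete everything Excel still considers used
        ws.Rows("2:" & ws.UsedRange.Rows.Count).Delete
    End If
    wb.Save
    wb.Close
End Sub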

How to keep the first space in a cell in a .xlt file

I am working on an export of data from an ERP to Excel, but I have encountered a problem.
When I receive my data in my Excel template (.xlt; I have no choice about the extension), all the leading spaces in the fields from the ERP disappear from my worksheet.
An example (here, the spaces before "Holder"):
And now, in Excel, without the spaces:
One last piece of information: I think the problem only occurs with the .xlt (97/03) file type (the only one I can use, of course), because when I try an export to .xls, there is no problem.
I have already tried changing the cell type to Text or General, but it doesn't work.
Do you have a solution?
Thanks!
Let me outline a typical solution:
You have a "data source" you cannot control - in this case it's an .xlt file that sits somewhere on your hard drive - call it export1.xlt.
You want to add the data from that data source (export1.xlt) to a "database", which could just be another aggregate spreadsheet or whatever. Let's call it database1.xlsx.
Typically you would create a macro inside database1.xlsx that knows how to import data into itself - in this case, say you give it a path, e.g. C:\temp\export1.xlt, and tell it to copy that data to Sheet1.
When you run that macro it will open export1.xlt, read the data into Sheet1 of database1.xlsx, and perform any necessary post-processing.
In this case the post-processing could simply be looping over every cell looking for a missing space.
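A minimal sketch of that macro, living inside database1.xlsx (the sheet name and path follow the outline above; the post-processing is left as a stub):

Sub ImportExport1()
    Dim src As Workbook, dst As Worksheet
    Set dst = ThisWorkbook.Worksheets("Sheet1")
    Set src = Workbooks.Open("C:\temp\export1.xlt") ' the uncontrolled data source
    ' Copy the used range of the export's first sheet into Sheet1
    src.Worksheets(1).UsedRange.Copy dst.Range("A1")
    src.Close SaveChanges:=False
    ' Post-processing (e.g. restoring missing leading spaces) would go here
End Sub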

In Word, how can I alter mail merge data programmatically with VBA?

I have a Word document that is used as the source document for a mail merge. I can edit the document, but not the data being used for the merge. I need to transform some of the data in the data source (specifically, I need to take numbers (e.g. 342) and add their value in words (e.g. "three hundred forty-two (342)")). I can write a VBA function to do the transformation, but I'm not sure how best to get the data to that function.
Is there some way I can associate a macro with specific points in the document and let the merging drive the transformation process? (I'm thinking of how you can use formulas in Word fields; I have a few things of the form { IF { MERGEFIELD foo } > 75 { MERGEFIELD foo } { = { MERGEFIELD foo } * 20 } } in the document already. If I could add something so I could go { FUNCTION WordNum { MERGEFIELD number } }, that would be ideal.)
Alternatively, I think I can use VBA to rummage around in the mail merge's data source (specifically document.MailMerge.DataSource) and rewrite fields. If I go that route, where should I execute the macro so that it gets to the data after it has been read from the data source, but before it has been merged with the document?
Could you do the data transformation in Microsoft Query? That is, where you currently have:
Data Source -> Mail Merge Template
create a Microsoft Query that sits between your data source and Word:
Data Source -> Microsoft Query -> Mail Merge Template
It's been a while since I used Mail Merge in Word but I don't remember being able to exert much control over it...
There does not appear to be a way to call arbitrary functions from embedded Word fields, so I went the VBA route.
I added an AutoOpen macro to the source document that called MailMerge.EditDataSource and then walked through the table to make its changes.
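For illustration, a minimal sketch of that macro. WordNum stands for the asker's own number-to-words function, and the column holding the number is an assumption:

Sub AutoOpen()
    Dim dataDoc As Document
    Dim r As Long, n As String
    ThisDocument.MailMerge.EditDataSource   ' opens the data source for editing
    Set dataDoc = ActiveDocument            ' the data source document is now active
    With dataDoc.Tables(1)
        For r = 2 To .Rows.Count            ' row 1 holds the merge field names
            ' Word cell text ends with Chr(13) & Chr(7); strip it before converting
            n = Trim$(Replace(.Cell(r, 2).Range.Text, Chr(13) & Chr(7), ""))
            If IsNumeric(n) Then
                ' WordNum: the asker's number-to-words function (assumed to exist)
                .Cell(r, 2).Range.Text = WordNum(CLng(n)) & " (" & n & ")"
            End If
        Next r
    End With
End Sub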