I have 4 different text files, each with a different name and different columns, placed in one folder. I want these 4 files to be inserted or updated into 4 different existing tables. How can I read these 4 files dynamically and insert each into its respective table dynamically in SSIS?
Well, you need to use a Data Flow Task to move data from a Flat File Source to a table destination (an OLE DB Destination, perhaps). Are the columns in your files delimited in any way, for example with (;) or (|) or something like that? If they are, you can create a Flat File Connection Manager and set the delimiter there to split the columns. If not, you might need to use the fixed-width option to separate your columns. To use the OLE DB Destination, you will need to create an OLE DB Connection Manager pointing to the table in your database. I could help you more if I had more information about the files you want to read the data from.
EDIT
Well, you said at the start you were working with 4 files and 4 tables, so you can create 4 Flat File Sources with 4 OLE DB Destinations as well (one of each per flat file). If I understood you correctly, these 4 files may or may not exist yet. If you know the names the files will get, change the package property DelayValidation to true, and then create a connection with a sample text file; you do this so the file path gets saved. The tables, in my opinion, DO need to exist. Now, when you said:
i want to load all the text files into each different existing table whenever there is files inside the folder.
The only way I know of doing something similar is to schedule the execution of your package at a certain time with a SQL Server Agent job. Please let me know if this was what you were looking for.
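If you go that route, here is a minimal T-SQL sketch of creating such a job; the job name, package path, and 15-minute schedule are assumptions for illustration, and in practice you would usually configure this through the SSMS Agent UI instead:

```sql
USE msdb;
GO

-- Hypothetical job that runs the SSIS package every 15 minutes.
EXEC dbo.sp_add_job @job_name = N'Load flat files';

EXEC dbo.sp_add_jobstep
    @job_name  = N'Load flat files',
    @step_name = N'Run SSIS package',
    @subsystem = N'SSIS',
    @command   = N'/FILE "C:\Packages\LoadFlatFiles.dtsx"';  -- assumed path

EXEC dbo.sp_add_schedule
    @schedule_name        = N'Every 15 minutes',
    @freq_type            = 4,   -- daily
    @freq_interval        = 1,
    @freq_subday_type     = 4,   -- repeat every N minutes
    @freq_subday_interval = 15;

EXEC dbo.sp_attach_schedule
    @job_name      = N'Load flat files',
    @schedule_name = N'Every 15 minutes';

-- Target the local server so the Agent actually runs the job.
EXEC dbo.sp_add_jobserver @job_name = N'Load flat files';
GO
```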
Hope you can help me.
I am looking to retrieve multiple file names into a custom column when uploading files to SQL Server via SSIS.
I am aware of using the Advanced Editor on the Flat File Source in Component Properties > FileNameColumnName. However, I am not sure how to make sure it picks up all the file names, or what to enter in this field.
I can upload the files and all the data they hold, but not the filename into a column.
I'm making an SSIS package to create a CSV file using an OLE DB Source and a Flat File Destination.
When I get the file, it doesn't contain the headers, even though they are clearly defined in the destination.
I've tried all the options related to this:
- header rows to skip (set to 1),
- column names in the first data row,
- column delimiter,
- data rows to skip,
- and even resetting the columns.
Please check the "column names in the first data row" option in the Flat File Connection Manager.
If this doesn't work, then the settings are not being carried over to your config files. In that case there are two workarounds:
- Edit the .dtsConfig file and set "firstcolumnhasnames" to 1; this adds the column names without needing to delete the connection from the package.
- Delete the destination connection and recreate it from scratch.
I'm trying to use Oracle external tables to load flat files into a database, but I'm having a bit of an issue with the LOCATION clause. The file names we receive are appended with several pieces of information, including the date, so I was hoping to use wildcards in the LOCATION clause, but it doesn't look like I'm able to.
I think I'm right in assuming I'm unable to use wildcards. Does anyone have a suggestion on how I can accomplish this without writing large amounts of code per external table?
Current thoughts:
The only way I can think of doing it at the moment is to have a shell watcher script and a parameter table. The user can specify the input directory, file mask, external table, etc. When a file is found in the directory, the shell script generates a list of files matching the file mask. For each file found, it issues an ALTER TABLE command to change the location of the given external table to that file and launches the rest of the PL/SQL associated with that file. This can be repeated for each file matching the mask. I guess the benefit of this is that I could also add the date to the end of the log and bad files after each run.
I'll post the solution I went with in the end, which appears to be the only way.
I have a file watcher that looks for files in a given input directory with a certain file mask. The lookup table also includes the name of the external table. I then simply issue an ALTER TABLE on the external table with the list of new file names.
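For anyone wondering what that step looks like, here is a minimal sketch; the external table and file names are hypothetical, and the shell script would substitute whatever files matched the mask on a given run:

```sql
-- Point the external table at the file(s) found for this run.
-- Table and file names are placeholders, not the actual ones.
ALTER TABLE ext_transactions
  LOCATION ('transactions_20130101.dat', 'transactions_20130102.dat');
```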
For me this wasn't much of an issue, as I'm already using shell for most of the file watching and file manipulation. Hopefully this saves someone from searching for ages for a solution.
I've seen plenty of examples of how to enumerate through a collection of Excel workbooks or sheets using the Foreach Loop Container, with the assumption that the data structure of all of the source files is identical and the data is going to a single destination table.
What would be the best way to handle the following scenario:
- A single Excel workbook with 10-20 sheets OR 10-20 Excel workbooks with 1 sheet each.
- Each workbook/sheet has a different schema.
- There is a 1:1 matching destination table for each source sheet.
- Standard cleanup: workbooks are created and placed in a "loading" folder; the SSIS package runs on a job that reads the files in the loading folder and moves them to an archive folder upon successful completion.
I know that I can create a separate SSIS package for each workbook, but that seems really painful to maintain. Any help is greatly appreciated!
We faced the same issue a while back. I will just summarize what we did.
We wrote an SSIS package programmatically using C#. A MetaTable is maintained which holds information about the flat files (table name, columns, and the positions of those columns in the flat file). We extract the flat file's name and then query the MetaTable for the table this flat file belongs to, the columns it has, and the column positions in the flat file.
We execute the package on SQL Server by passing each flat file as a command-line argument to the package executable, so it reads and processes each flat file.
Example: suppose we have a flat file FF. We first extract the name of the flat file and then get the table name by querying the DB; let's say it is TT, which contains columns COL-1 and COL-2 with positions 1 to 10 and 11 to 20 respectively. By reading this information from the MetaTable, we then create a derived column transformation in the package.
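A rough sketch of what such a MetaTable and its lookup might look like, using the FF/TT example above (the schema here is illustrative, not our exact one):

```sql
-- Hypothetical metadata table describing each flat file's layout.
CREATE TABLE MetaTable (
    FlatFileName  VARCHAR(128),   -- e.g. 'FF'
    TableName     VARCHAR(128),   -- e.g. 'TT'
    ColumnName    VARCHAR(128),   -- e.g. 'COL-1', 'COL-2'
    StartPosition INT,            -- e.g. 1, 11
    EndPosition   INT             -- e.g. 10, 20
);

-- Given a flat file name, fetch its target table and column positions,
-- which drive the derived column transformation built in C#.
SELECT TableName, ColumnName, StartPosition, EndPosition
FROM MetaTable
WHERE FlatFileName = 'FF'
ORDER BY StartPosition;
```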
Our application has a set of flat files in a folder, and by using a Foreach Loop Container in SSIS we pick up one flat file at a time and run the above process.
I have a comma-delimited text file containing discrepancies across two different databases, and I need to update one of the databases with information from the aforementioned text file. The text file is in the following format:
ID valueFromDb1 valueFromDb2
1 1234 4321
2 2345 5432
... ... ...
I need to update a table by checking the ID value and, where valueFromDb1 exists, replacing it with valueFromDb2. There are around 11,000 rows that need to be updated. Is there a way I can access the information in this text file directly through an SQL query? My other thought was to write a Java program to do this for me, but I'm not convinced that is the easiest solution.
The article below demonstrates one way to read a text file in MS SQL Server using xp_cmdshell. For it to work, the file has to be on one of the drives of the server. Once you have the file loaded into a table variable (which is what the code in the article does), you should be able to do the joins and updates pretty easily. Let us know if you need any other help.
http://www.kodyaz.com/articles/read-text-file-using-xp_cmdshell.aspx
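A rough sketch of that approach, assuming the comma-delimited layout described above; the file path, target table, and column names are placeholders, and xp_cmdshell has to be enabled on the server:

```sql
-- Read the file into a table variable, one row per line.
DECLARE @raw TABLE (line VARCHAR(4000));
INSERT INTO @raw
EXEC master..xp_cmdshell 'type C:\data\discrepancies.txt';  -- assumed path

-- Split each "id,valueFromDb1,valueFromDb2" line and update the target
-- table where the old value still matches.
;WITH parsed AS (
    SELECT
        CAST(LEFT(line, CHARINDEX(',', line) - 1) AS INT)        AS id,
        SUBSTRING(line,
                  CHARINDEX(',', line) + 1,
                  CHARINDEX(',', line, CHARINDEX(',', line) + 1)
                      - CHARINDEX(',', line) - 1)                AS valueFromDb1,
        SUBSTRING(line,
                  CHARINDEX(',', line, CHARINDEX(',', line) + 1) + 1,
                  4000)                                          AS valueFromDb2
    FROM @raw
    WHERE line LIKE '%,%,%'   -- skips the trailing NULL row xp_cmdshell emits
)
UPDATE t
SET    t.someValue = p.valueFromDb2      -- placeholder column name
FROM   dbo.TargetTable AS t              -- placeholder table name
JOIN   parsed          AS p ON p.id = t.ID
WHERE  t.someValue     = p.valueFromDb1;
```

At roughly 11,000 rows, a single set-based update like this should be quick, with no need for a separate Java program.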