I have a generic SSIS package that is used to upload multiple files, and there is only one batch file for triggering this package, which expects a call from the scheduler for each file.
The problem is that all the files arrive in the system at the same time, and ESP triggers the same batch file multiple times, once per file.
The first component in the package is an Execute SQL Task that deletes the data from the table. Because the package gets called multiple times concurrently, the table loses data for a file that is still loading.
I want to synchronize this behavior: the package should run for the next file only once the first file's load is completed.
You could have a schedule table, then create a Foreach Loop as the first component called. An OLE DB source against your schedule table would give the loop the list of files to iterate through. The schedule table would contain a row for each file. Once the first file is uploaded, an Execute SQL Task updates the schedule table with an updatedatetime. You would select which files to upload with something like select * from schedule where updatedatetime < getdate() - .5. After the first one is done, you execute another SQL task to truncate the table and call the next one in the schedule table.
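A minimal T-SQL sketch of that schedule table and the two statements involved (all names here are placeholders, and the half-day cutoff mirrors the getdate() - .5 above):

    -- Hypothetical schema: one row per file the scheduler drops off.
    CREATE TABLE dbo.schedule (
        file_name      varchar(260) NOT NULL PRIMARY KEY,
        updatedatetime datetime     NULL
    );

    -- Files the loop should still process (not stamped in the last half day).
    SELECT file_name
    FROM dbo.schedule
    WHERE updatedatetime IS NULL
       OR updatedatetime < GETDATE() - 0.5;

    -- After a file finishes loading, stamp it as done.
    UPDATE dbo.schedule
    SET updatedatetime = GETDATE()
    WHERE file_name = ?;  -- current file name supplied by the Foreach Loop

Because every package instance reads the schedule table before truncating anything, a second instance started for a file that is already stamped simply finds nothing left to do.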
The steps I used to upload the .csv file were:
upload, browse, select .csv from my desktop
select auto detect schema and input parameters
create table
One possible cause of the initial malfunction was that my .csv was a zip file, which did not load in the usual manner, so I canceled the process. After several failed attempts to upload the zip file, I tried to upload a different .csv and have not been able to create a table then or since.
I located the error message (HTTP 409), which states that the table already exists, so I've tried changing the write preference to "overwrite" as well as "append to the table", but the processing step runs for several minutes and never completes. I'm still unable to create any tables.
I have a bunch of .csv files, each containing a script that creates a certain table.
I want to create the tables using the scripts in those files (each table to be created from one file).
I have a Foreach Loop container that specifies the path and which files to use.
I don't know how to configure the Execute SQL Task to execute the script in each of these files in order to create a table.
You can use the Execute SQL Task with an input parameter for the table name (I would use the table name that the Foreach Loop container provides). I would first drop the table if it exists and then recreate it with a CREATE TABLE command in the Execute SQL Task.
As other people have noted, you may want to be careful with tasks that drop tables, but I have created plenty of SSIS packages that involve truncating and/or creating tables.
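A minimal T-SQL sketch of what that Execute SQL Task could run, assuming the Foreach Loop maps the current table name into a variable bound to the ? parameter (the column list is a placeholder for your real schema):

    -- Hypothetical sketch: @TableName is fed by SSIS parameter mapping.
    DECLARE @TableName sysname = ?;
    DECLARE @sql nvarchar(max);

    -- Drop the table if it already exists.
    SET @sql = N'IF OBJECT_ID(' + QUOTENAME(@TableName, '''') + N', ''U'') IS NOT NULL '
             + N'DROP TABLE ' + QUOTENAME(@TableName) + N';';
    EXEC sys.sp_executesql @sql;

    -- Recreate it; replace the column list with the one your file expects.
    SET @sql = N'CREATE TABLE ' + QUOTENAME(@TableName)
             + N' (id int NOT NULL, payload nvarchar(255) NULL);';
    EXEC sys.sp_executesql @sql;

The dynamic SQL with QUOTENAME is there because a ? parameter can supply values but not object names directly in DDL.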
I have a workflow that goes through a loop and loads split files into a Teradata table. I would like to direct the rows from the error tables created by TPT, i.e. ET1 and ET2, to another table.
Here is the detailed scenario:
I have a list of file names in a file (abc.txt). These file names are read in a loop, and the workflow (indirect load) is triggered for each file.
I am using a shell script to trigger the workflow. However, I would like to keep track of the error records going to the ET1 and ET2 tables by copying them to another table.
These records need to be appended for each file load, along with the corresponding file name.
Please suggest how to achieve this.
Thanks in advance
The proper way of handling such a scenario would be to have an additional flag in the text file with the file names. This flag can be updated to completed or processed after each successful run of the workflow.
To get the file name inside the mapping, use the text file as a lookup inside the mapping.
Alternatively, to include the file name, just check the Add Currently Processed Flat File Name Port property on the source transformation.
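For the append itself, a minimal SQL sketch of what the shell script could run after each load (error_history is a hypothetical table, the literal file name would be substituted by the script, and the ET tables' exact layouts depend on your TPT job):

    -- Hypothetical target table; tag each error row with the file it came from.
    INSERT INTO error_history
    SELECT 'abc_001.txt', CURRENT_TIMESTAMP, e.*
    FROM ET1 e;

    INSERT INTO error_history
    SELECT 'abc_001.txt', CURRENT_TIMESTAMP, e.*
    FROM ET2 e;

You would run this after each file's load completes, before the next load reuses or recreates the error tables.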
I am new to SQL.
What is the best way to create a TXT file if a table has records > 0?
The code already exists to remove or add records in this table.
I am looking for a way to create a trigger file (with no content in the file) in a specific network folder.
Preferably, I would want this TXT file to be removed at the end of the day, so the process can repeat itself every morning.
In an AFTER DELETE trigger, do a SELECT COUNT(*) from the table, or query one of the system catalog views. If it's zero, call a stored procedure that drops a file onto your share drive.
To write the file, you could create a small package, call PowerShell or bcp (after enabling xp_cmdshell though), or create a CLR function (after enabling CLR). Since the latter two require changing a server setting, you could just create a package.
And since there is no data you actually need to export, you just create a blank file!
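A minimal T-SQL sketch of the trigger-plus-procedure route, assuming xp_cmdshell has been enabled (every object, path, and file name here is a placeholder):

    -- Hypothetical procedure: type nul creates a truly empty file.
    CREATE PROCEDURE dbo.CreateTriggerFile
    AS
    BEGIN
        EXEC master..xp_cmdshell 'type nul > \\server\share\table_empty.txt', no_output;
    END;
    GO

    -- Hypothetical trigger: fire only when the last row has been deleted.
    CREATE TRIGGER trg_MyTable_AfterDelete
    ON dbo.MyTable
    AFTER DELETE
    AS
    BEGIN
        IF NOT EXISTS (SELECT 1 FROM dbo.MyTable)
            EXEC dbo.CreateTriggerFile;
    END;
    GO

A SQL Agent job (or a scheduled task on the file server) could then delete the file at the end of the day so the cycle can restart the next morning.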
I have a folder called "masterData"; this folder has approximately 500GB of data, 100k files, 500 folders.
I would like to load the list of all the file names and their locations into SQL Server.
I can do this using SSIS, but I am wondering if there is a more efficient way, instead of having a job run for 10 hours a day checking whether each file name already exists in the table.
I wonder if there is a way to set a pointer/marker somehow, so that every new file name gets added to the table.
This sounds like a job for a FileSystemWatcher with associated event handlers that update your tables appropriately.
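The watcher itself would live outside SQL Server (for example, in a small .NET service watching masterData with IncludeSubdirectories enabled); a minimal T-SQL sketch of what its Created event handler could execute, with hypothetical table and parameter names:

    -- Hypothetical table; @FileName/@FilePath come from the watcher's event args.
    INSERT INTO dbo.MasterDataFiles (file_name, file_path, discovered_at)
    SELECT @FileName, @FilePath, SYSDATETIME()
    WHERE NOT EXISTS (SELECT 1
                      FROM dbo.MasterDataFiles
                      WHERE file_name = @FileName
                        AND file_path = @FilePath);

Because the insert fires per event rather than rescanning the folder, the 10-hour comparison job goes away.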