SSIS - fill unmapped columns in table in OLE DB Destination

As you can see in the image below, I have a table in SQL Server that I am filling via a flat file source. There are two columns in the destination table that I want to update based on the logic listed below:
SessionID - all rows from the first CSV import will have a value of 1; the second import will have a value of 2, and so on.
TimeCreated - datetime value of when the CSV imports happened.
I don't need help with how to write the TSQL code to get this done. Instead, I would like someone to suggest a method to implement this as a Data Flow task within SSIS.
Thank you in advance for your thoughts.
Edit 11/29/2012
Since all answers so far suggested taking care of this on the SQL Server side, I wanted to show you what I had initially tried doing (see image below), but it did not work. The trigger did not fire in SQL Server after SSIS inserted the data into the destination table.
If any of you can explain why the trigger did not fire, that would be great.

If you are able to modify the destination table, you could let default values for SessionID and TimeCreated do the work for you. SessionID could be an auto-incrementing integer, while the default value for TimeCreated would be GETDATE() or SYSDATETIME(), depending on the data type.
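As a rough sketch of the default-value idea (the destination table name dbo.ImportTable is hypothetical), the TimeCreated default could be added like this; note that an IDENTITY column numbers every row rather than every import, so it only covers SessionID if per-row numbering is acceptable:
ALTER TABLE dbo.ImportTable
    ADD CONSTRAINT DF_ImportTable_TimeCreated DEFAULT (GETDATE()) FOR TimeCreated;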
Now, if you truly need the values to be created as part of your workflow, you can use variables for each.
SessionID would be a package variable which is set by an Execute SQL Task. Just reference the variable in your result set and have your SQL determine the next number to use. There are potential concurrency issues with this, though.
TimeCreated is easily done by creating a Derived Column in your data flow based on the system variable StartTime.
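For illustration, the query inside the Execute SQL Task (with ResultSet set to Single row) might look like the following, assuming a hypothetical destination table dbo.ImportTable; map the returned column to the package variable:
SELECT ISNULL(MAX(SessionID), 0) + 1 AS NextSessionID FROM dbo.ImportTable;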

You can use a Derived Column to fill the TimeCreated column. If you want the time at which the data flow happens, just use the GETDATE() date and time function to get the current datetime. If you want a common timestamp for the whole package (all files), you can use the system variable @[System::StartTime].
For looping over the CSV files, you use a Foreach Loop container and map an iterative value to a user variable, which you then map in the Derived Column for SessionID as mentioned above.

First, I'd rather do it on the SQL Server side :)
But if you don't want to, or cannot, do it on the server side, you can use this approach:
Obviously you need to store SessionID somewhere: you could create a txt file for that or, better, a settings table in SQL Server; there are other approaches as well.
To add the SessionID and TimeCreated columns to the OLE DB Destination, you can use Derived Columns.
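As a sketch of the settings-table option (the table and column names here are hypothetical), an Execute SQL Task run before the data flow could increment and return the counter in one statement, storing the result in a package variable that the Derived Column then adds to every row:
--Increment the session counter and return the new value (requires SQL Server 2005 or later)
UPDATE dbo.ImportSettings SET LastSessionID = LastSessionID + 1 OUTPUT inserted.LastSessionID AS NextSessionID;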

Related

Update after a Copy Data Activity in Azure Data Factory

I've got this question about Azure Data Factory. My pipeline has a Copy Data activity, and after loading the information into the table I need to update a field in that destination based on a parameter. It is a simple update, but given that we do not have a SQL task (as in SSIS), I do not know what to use. Creating a stored procedure for this does not seem to be the most appropriate solution, and besides, modifying the database is complicated. I thought the "Use Query" option in the Lookup activity could be a solution, but it does not let me create a SQL query with a parameter the way a Source does.
What could be a possible workaround?
You are on the right track with the Lookup. That is definitely the way to go. The query field there will allow you to create dynamic SQL just like you did within the copy activity. You just need to reference the variable/parameter properly.
Also, with the Lookup, it will always expect something returned. You don't have to do anything with that returned value. Just ignore it, but the Lookup will not work without returning something. So, that query field would contain something like:
UPDATE dbo.MyTable SET IsComplete = 1 WHERE RunId = @{pipeline().parameters.runId};
SELECT 0 AS DummyValue; -- Necessary for Lookup to work

Use SQL Field in SSIS Variable

Is it possible to reference a SQL field in your SSIS variable?
For instance, I would like to use the field from the "table" below
Select '999999' AS Physician_Profile_ID
as a dynamic variable (named "CMSPhysProID" in our example) here
I plan on concatenating multiple IDs into an IN statement.
Possible by using an Execute SQL Task. In the General tab of the Execute SQL Task:
1. Set the result set to Single row.
2. Set the connection type to OLE DB.
3. Set the connection and form the SQL statement, as you mentioned: Select '999999' AS Physician_Profile_ID
4. Go to the Result Set page.
5. Add the variable where you want to store '999999'.
6. Click OK.
If you are looking to store the value within the variable to be used later, you can simply use an Execute SQL Task with a single row result set. More details in the following article:
SSIS Basics: Using the Execute SQL Task to Generate Result Sets
If you are looking to add a computed column while importing data, you must use a Derived Column Transformation within the data flow task to add a column based on another one; you can refer to the following article for more details about this component:
SSIS Derived Columns with Multiple Expressions vs Multiple Transformations
What are you trying to accomplish by concatenating the IDs into an "IN" statement? If the idea is to use the values of the IDs to limit the results, as a dynamic WHERE clause, you may have better luck just using a lookup against either a table you maintain with the desired IDs or even a static list generated in the package with a script task. (If you can use the lookup table method it will be much easier to maintain as you only have to update a table, not your source code.)
Alternatively, you may even be able to accomplish the goal with a join. Create a temp table from the profile IDs you want to keep and join to it, or, again, use it as a lookup component. Dynamically creating a where clause using IN will come in a lot slower and will be cumbersome to maintain.
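As a rough sketch of the join alternative (the table names here are hypothetical), keep the wanted IDs in a table you maintain and join to it instead of building an IN clause:
SELECT s.*
FROM dbo.SourceData AS s
INNER JOIN dbo.KeepProfileIDs AS k ON k.Physician_Profile_ID = s.Physician_Profile_ID;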

How to use the date from a FileName and use this date in an expression - SSIS Package

I am quite a beginner with SSIS packages so bear with me. What I am trying to do:
I have daily files in the format of yyyy-mm-dd_filename_bla_bla.tsv
The date from the file name needs to be added to the table where I am trying to import it. Currently I am doing this manually with a derived column with the expression: (DT_DATE)(DT_DBDATE)"yyyy-mm-dd"
Is there a possibility to automatically take the file name and use only the date part when importing into the table?
The things I find on the internet are about getting the date into the file name, but that is exactly the opposite of what I need.
I hope I provided enough information, and that someone can help me out with this problem.
Thanks in advance.
If you know the file name, then keep that file name in a variable.
For example, let the file name be: 2015-02-01_kjh.bgd
Then, using a Derived Column, apply a string function such as LEFT(@[User::variable], 10)
(10 is the length of the date part), and map it to your OLE DB destination.
To update/insert into a table you can use the SSIS Execute SQL Task and pass the variable value (which you know how to get) into the update/insert statement as a parameter. It's not hard; just be careful with your parameter mapping settings (with an OLE DB connection the first parameter is named 0).
Here is a detailed description on how to do exactly what you are asking once you have a parameter set up:
How to pass variable as a parameter in Execute SQL Task SSIS?
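For illustration only (the table and column names here are made up), the statement inside the Execute SQL Task could look like the following, where ? is the OLE DB parameter marker mapped to the SSIS variable that holds the date taken from the file name:
--Hypothetical statement; map the SSIS variable to parameter 0
UPDATE dbo.ImportTable SET FileDate = ? WHERE FileDate IS NULL;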

Augmenting Rows With Value From T-SQL Statement

I'm using SSIS 2012 to import an Excel file into a database. One of the fields I need to populate into the database on import cannot be stored in the Excel file. The value that goes into the field can only be known at the time the record is being created in the database. The particular software I am using stores the last used value for this field in a separate database. When creating the records on import I need to increment this field and insert the new value in the new record. I have a T-SQL script that generates this value but I don't know enough about SSIS to know how to get that value for each row during Import.
Here's the script that I'm using to generate the value I need:
--Declare some variables
DECLARE @I_sCompanyID smallint,
    @O_mNoteIndex numeric(19,5),
    @O_iErrorState int
--Get the CompanyID
select @I_sCompanyID = CMPANYID
from DYNAMICS..SY01500
where INTERID = DB_Name()
--Get and increment the next note index
exec DYNAMICS.[dbo].[smGetNextNoteIndex] @I_sCompanyID, 1, @O_mNoteIndex output, @O_iErrorState output
--Print the Next Note Index
SELECT @O_mNoteIndex
The option that comes to mind is to use a Script Component to add a column named O_mNoteIndex into your data flow. You will basically need to use your above TSQL code and either work with OleDB, SqlClient or Odbc to query the Dynamics server and generate your id.
You will need to add the column into your output buffer and assign it the T-SQL value. I'm not finding any of my previous answers that explore how to do this, but the MSDN site ought to get you started.
I suggest you run the above script and return the result into an SSIS variable. Then add that variable in a Derived Column transform. That is assuming you only need to run that script once before you load your dataset. If you need to run it per record, you need to go with @billinkc's method.
Or, if you want to avoid script, you could load your source data into an OLE DB recordset and then use that in a Foreach Loop and call that script for each iteration of the loop.

Create delimited string from a row in stored procedure with unknown number of elements

Using SQL Server 2000 and Microsoft SQL Server MS, is there a way to create a delimited string based upon an unknown number of columns per row?
I'm pulling one row at a time from different tables and am going to store them in a column in another table.
A simple SQL query can't do anything like that. You need to specify the fields you are concatenating.
The only method that I'm aware of is to dynamically build a query for each table.
I don't recall the exact structure of MSSQL 2000, so I won't try to give an exact example; maybe someone else can. But there -are- system tables that contain table definitions. By parsing the contents of those system tables you can dynamically build the necessary query for each source data table.
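A rough sketch of that idea for SQL Server 2000, assuming a hypothetical source table dbo.SourceTable whose columns can all be converted to varchar:
--Build the column list from the system tables, then run a query that
--concatenates the columns into one pipe-delimited string
--(varchar(8000) is the practical limit in SQL Server 2000)
DECLARE @cols varchar(8000), @sql varchar(8000)
SELECT @cols = ISNULL(@cols + ' + ''|'' + ', '')
             + 'ISNULL(CONVERT(varchar(8000), [' + c.name + ']), '''')'
FROM syscolumns c
JOIN sysobjects o ON o.id = c.id
WHERE o.name = 'SourceTable'
ORDER BY c.colid
SET @sql = 'SELECT ' + @cols + ' AS DelimitedRow FROM dbo.SourceTable'
EXEC (@sql)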
TSQL that writes TSQL, however, can be a bit tricky to debug and maintain :) So be careful how you structure everything...
Dems.
EDIT:
Or just do it in your client application.