Changing variable values during run time in SSIS [duplicate]

I have an SSIS package into which two records are coming. I need to insert the records into the table with an extra column (say, Sequence). If there are two records, the Sequence column should have the value 1 for the first record and 2 for the second. If next time I get three records, the sequence again starts from 1, 2, 3.
Is there any way to do this without using a script or a stored procedure?

There are two methods to achieve this:
(1) Using a Script Component
Using a Script Component is more efficient and gives you more control, since you can write your own logic:
In the Data Flow Task, add a Script Component.
Add an Output Column of type DT_I4 (e.g. NewID).
In the Script Editor, use the following code (I used Visual Basic):
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

<Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute> _
<CLSCompliant(False)> _
Public Class ScriptMain
    Inherits UserComponent

    'Counter persists across rows within a single package execution
    Private CurrentID As Integer = 0

    Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
        'Assign 1, 2, 3, ... to each incoming row
        CurrentID += 1
        Row.NewID = CurrentID
    End Sub

End Class
In the OLE DB Destination, map the NewID column to the destination Sequence column.
(2) Using a Staging Table
Similar to what I mentioned in this answer:
Create a staging table with an identity column.
Each time, truncate the table (this restarts the identity) and insert the data into the staging table.
Insert the data from the staging table into the destination using a Data Flow Task or an Execute SQL Task (see the sketch below).
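A minimal T-SQL sketch of the staging approach (table and column names are placeholders, not from the package):
-- One-time setup: staging table whose identity column supplies the sequence
CREATE TABLE dbo.StagingTable (
    Sequence INT IDENTITY(1,1),
    SomeValue VARCHAR(50)
);

-- Every package run: truncate first, which restarts the identity at 1
TRUNCATE TABLE dbo.StagingTable;

-- Load the incoming records; Sequence becomes 1, 2, 3, ...
INSERT INTO dbo.StagingTable (SomeValue) VALUES ('first record'), ('second record');

-- Move the sequenced rows into the final destination table
INSERT INTO dbo.DestinationTable (Sequence, SomeValue)
SELECT Sequence, SomeValue FROM dbo.StagingTable;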

You can use a staging table with an IDENTITY(1,1) column; each time you execute the package, TRUNCATE the table to reset the identity, so it starts from 1 again.
Or you can write your own logic using a Script Component

You can achieve this in the database itself, without adding any logic to the SSIS package. Just add a column with IDENTITY to your destination table and it will be incremented automatically; there is no need for additional logic in SSIS.
You can add the IDENTITY column (if you don't have one on the table already) by simply altering your table:
ALTER TABLE YourTable
ADD SeqNo INT IDENTITY(1,1)
IDENTITY(1,1) means the value of SeqNo for the first record will be 1, and it will then be incremented by 1 for each record inserted.
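As a quick illustration (the table and its SomeValue column are hypothetical, not from the question):
CREATE TABLE YourTable (SomeValue VARCHAR(50));

ALTER TABLE YourTable
ADD SeqNo INT IDENTITY(1,1);

-- SeqNo is assigned automatically: 1 for the first row, 2 for the second
INSERT INTO YourTable (SomeValue) VALUES ('first record'), ('second record');

SELECT SeqNo, SomeValue FROM YourTable;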

Related

SQL Server overwrite unique constraint violation

I have two files which I am importing via Node.js to SQL Server. The table has a unique key on the equity instrument identifier (ISIN).
data1.csv and data2.csv
I first import data1.csv and each row is inserted into the database. After this I import data2.csv (the values are again inserted into the database), which may contain the same ISINs, but its values take priority over those from the first file (there are not many of these ISINs, 5 out of 1000 or so).
What can I do with SQL Server to overwrite the values if the unique constraint is violated? I understand that there is an option to upload data2.csv first; however, there are some external constraints that do not allow me to do that.
Please tell me if additional information is required.
I would recommend a staging process to do this:
1. Create a staging table with a similar schema to your target table.
2. Before loading, delete all rows from the staging table (you can use TRUNCATE).
3. Upload the file to the staging table.
4. Load your data into the final table. Here you can use some logic to insert only new rows and update existing rows; the MERGE command is useful in a scenario like this (see the sketch after this list).
Repeat steps 2 to 4 for each source file.
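A minimal T-SQL sketch of step 4, assuming hypothetical tables dbo.Instruments (target) and dbo.InstrumentsStaging, each with an ISIN key and a Price column (these names are placeholders, not from the question):
MERGE dbo.Instruments AS t
USING dbo.InstrumentsStaging AS s
    ON t.ISIN = s.ISIN
WHEN MATCHED THEN
    -- Row already exists: overwrite with the higher-priority values
    UPDATE SET t.Price = s.Price
WHEN NOT MATCHED THEN
    -- New ISIN: insert it
    INSERT (ISIN, Price) VALUES (s.ISIN, s.Price);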

Referencing a dynamically created table in SSIS "source assistance"

I have an SSIS project that does the following:
Executes a SQL task (in the control flow) that creates (and populates with data) 4 tables. After that executes, a Data Flow Task references those 4 tables, joins them, then uses the Destination Assistant to create a new table called "Step". Then those 4 tables are deleted, leaving only the "Step" table.
However, when I try to run it, it throws an error because technically I am referencing 4 tables that do not yet exist. How do I work around this?
I would do this the same way you have to handle temp tables in a data flow source.
Use a stored procedure for your data source, and start it with something like:
IF 1 = 0
    SELECT {dummy-valued column list that matches the column names and datatypes of your expected output}
ELSE
    {your real stored procedure code}
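For example, a minimal sketch (the procedure name, columns, and final SELECT are hypothetical):
CREATE PROCEDURE dbo.GetStepSource
AS
BEGIN
    IF 1 = 0
        -- Never executed, but exposes the column names and types
        -- that SSIS needs to resolve metadata at design time
        SELECT CAST(NULL AS INT) AS Id, CAST(NULL AS VARCHAR(50)) AS Name;
    ELSE
    BEGIN
        -- Real code: create and populate the 4 tables here,
        -- then return the joined result set with matching columns
        SELECT Id, Name FROM dbo.Step;
    END
END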

Handling insert and delete with multiple users in SQL

I would like to maintain a system for uploading data through Excel to SQL Server using ADO. The process consists of these steps:
the raw data is inserted into a temporary table, say dbo.TableTemp
the raw data is processed with a stored procedure and inserted into dbo.GoodTable
delete from dbo.TableTemp at the end of the stored procedure
Is there any way to be sure that the activities of two users do not overlap? For example, that the delete from dbo.TableTemp of user1 will not be executed after user2 inserts data and before that data is processed?
Update. Unfortunately, I have not been successful with #temp tables. They seem to be too temporary: by the time I try to insert data into them, the #temp tables no longer exist. For uploading data I use a variation of the code by Sergey Vaselenko downloaded from here: http://www.excel-sql-server.com/excel-sql-server-import-export-using-vba.htm#Excel Data Export to SQL Server using ADO
In Sergey's solution it is possible to create the table with a stored procedure prior to inserting the data in step 1. But when I create a #temp table with a stored procedure, it vanishes at the end of the procedure, so I cannot insert data into it. Any help please?
Use local temporary tables such as #TableTemp. Those are specific to each session and thus would not overlap.
There are two types of temporary tables: local and global. They differ from each other in their names, their visibility, and their availability. Local temporary tables have a single number sign (#) as the first character of their names; they are visible only to the current connection for the user, and they are deleted when the user disconnects from the instance of SQL Server. Global temporary tables have two number signs (##) as the first characters of their names; they are visible to any user after they are created, and they are deleted when all users referencing the table disconnect from the instance of SQL Server.
Update. It looks like this particular Excel-SQL Server import-export using VBA uses separate functions to create the table and to upload the data, each opening and closing its own connection. From the SQL Server perspective those functions operate in different sessions, and thus temporary tables do not persist. I think this solution can be rewritten to use a single connection to create the temporary table, populate it, process the data, and output the results into a permanent table.
You might also find this question useful: How do I make an ADODB.Connection Persistent in VBA in Excel? In particular, Kevin Pope's answer suggests the use of a global connection variable opened and closed with the workbook itself:
Global dbConnPublic As ADODB.Connection
In the "ThisWorkbook" object:
Private Sub Workbook_Open()
    Set dbConnPublic = openDBConn() 'Or whatever your DB connection function is called
End Sub

Private Sub Workbook_BeforeClose(Cancel As Boolean)
    dbConnPublic.Close
End Sub
Another approach: use a table variable. Note that a table variable is declared with DECLARE and an @ prefix:
DECLARE @AddedValues TABLE (ID INT, SomeValue VARCHAR(50));
Then use it as you would a normal table in the query:
INSERT INTO @AddedValues (ID, SomeValue) VALUES (1, 'Test');
SELECT ID FROM @AddedValues WHERE SomeValue = 'Test';
A table variable's scope is limited to the batch, so you can be sure that another user, or even the same user, will not access it from another batch.
From MSDN:
A table variable behaves like a local variable. It has a well-defined scope. This is the function, stored procedure, or batch that it is declared in.
Instead of creating a temp table in the user database, you can put it in tempdb. Prefix the table name in a CREATE TABLE statement with # to create it in tempdb.
For example
CREATE TABLE #TableTemp (....)
Only the session that creates the temp table has access to it, and SQL Server deletes the table automatically when the session ends.
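A rough sketch of that single-session flow, tied to the question's tables (the column list is an assumption):
-- Everything below must run on the same connection/session
CREATE TABLE #TableTemp (ID INT, SomeValue VARCHAR(50));

-- Step 1: insert the raw data
INSERT INTO #TableTemp (ID, SomeValue) VALUES (1, 'raw data');

-- Step 2: process and move it to the permanent table
INSERT INTO dbo.GoodTable (ID, SomeValue)
SELECT ID, SomeValue FROM #TableTemp;

-- Step 3: clean up (also happens automatically when the session closes)
DROP TABLE #TableTemp;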

AS400 DB2 Duplicate Key Error during Insert in Table with PK Identity Column

I have a table with an auto-increment column, which looks like:
ALTER TABLE SOMESCHEMA.SOMETABLE
ALTER COLUMN ID
SET DATA TYPE INTEGER GENERATED BY DEFAULT
SET INCREMENT BY 1
SET NO ORDER
SET NO CYCLE
SET MINVALUE 1
SET MAXVALUE 2147483647
SET NO CACHE;
As long as I let the DBMS generate the IDs, everything works fine, and I can get the generated ID via:
SELECT IDENTITY_VAL_LOCAL() FROM sysibm.sysdummy1
But sometimes I need to insert a row with an ID of my choice, and there I get into trouble.
Let's say we have a single row in the table with ID 1. Now I insert a new row with a manually assigned ID of 2. The next time I try to insert a new row without a preset ID, I get the error SQL0803 "DUPLICATE KEY".
I assume the internal "NextId" field for that auto-increment column doesn't update itself if the ID of a row is set manually.
So I tried resetting this field with:
ALTER TABLE SOMESCHEMA.SOMETABLE ALTER COLUMN ID RESTART WITH 3
But this causes a permanent table lock, which I don't know how to unlock.
How can I get this "mixed-mode" ID column working? Is it possible to get it to work like MySQL, where the DBMS automatically updates the "NextID" upon an insert with a manual ID? If not, how can I release that {insert swear-word here} lock that pops up when I try to reset the NextID?
SQL0913 isn't creating a lock - it is reporting that a lock exists. ALTER TABLE needs an exclusive lock on the table in order to reset the ID number. A table can be locked by another process having it open, or it can be locked by this process if there are uncommitted rows.
There is another reason the table is in use: soft close (or pseudo-close). For performance reasons, DB2 for i keeps cursors in memory so that they can be reused as efficiently as possible. So even if you say CLOSE CURSOR, DB2 keeps it in memory. These soft-closed cursors can be closed with the command
ALCOBJ OBJ((SOMESCHEMA/SOMETABLE *FILE *EXCL)) WAIT(1) CONFLICT(*RQSRLS)
The CONFLICT(*RQSRLS) parameter tells DB2 to close all soft-closed cursors.
So the root of the issue is that DB2 wants exclusive access to the table. Which is sort of a design question, because typically one doesn't manipulate the table's structure during the work day. It sounds as though this table is sometimes a parent and sometimes a child when it comes to ID numbers. If that is the case, may I suggest that you ALTER the table again?
I think the implementation might be better if you used a trigger rather than auto-increment. Fire the trigger on INSERT. If ID is supplied, do nothing. If ID is not supplied, SELECT MAX(ID)+1 and use that as the actual ID number you commit to the database.
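A rough, untested sketch of that idea in DB2 SQL, assuming ID is changed to a plain (non-identity) INTEGER column and that your release allows reading the subject table in a BEFORE trigger:
CREATE TRIGGER SOMESCHEMA.SOMETABLE_SET_ID
BEFORE INSERT ON SOMESCHEMA.SOMETABLE
REFERENCING NEW AS n
FOR EACH ROW
WHEN (n.ID IS NULL)
    -- No ID supplied: take the current maximum plus one
    SET n.ID = (SELECT COALESCE(MAX(ID), 0) + 1 FROM SOMESCHEMA.SOMETABLE);
Note that MAX(ID) + 1 is not safe under concurrent inserts without additional locking, so treat this only as a starting point.
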
ALTER TABLE table_name ALTER COLUMN column_name RESTART WITH 99999;
Fixed my issue. "99999" is the next ID to be used, for example.

How do I create a trigger to insert a value into an ID field that is Max([ID Field])+1 on insert

When I add a new record I want SQL Server to automatically add a fresh ID.
There are already some records which have been migrated over (from Access) and until I finish preparing the server for other required functionality I will be manually migrating further records over (if this affects any possible answers).
What is the simplest way to implement this?
The simplest way would be to make the column an IDENTITY column. Here is an example of how to do this (it's not as simple as ALTER TABLE).
Make use of the Identity field type. This will automatically create a value for you using the next available number in the sequence.
Here is an example of how to create an Identity column (adding a new column) on an existing table:
ALTER TABLE MyTable ADD IdColumn INT IDENTITY(1,1)
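For illustration, a hedged example of how the column then behaves (MyTable's other columns are assumptions); and since you mention manually migrating records that may already have IDs, SET IDENTITY_INSERT lets you keep those explicit values:
-- New rows get the next ID automatically
INSERT INTO MyTable (SomeColumn) VALUES ('new row');

-- Migrated rows can keep their original IDs
SET IDENTITY_INSERT MyTable ON;
INSERT INTO MyTable (IdColumn, SomeColumn) VALUES (42, 'migrated row');
SET IDENTITY_INSERT MyTable OFF;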