SSIS: create a table that is the destination in the data flow - sql

I have a database that is dropped and recreated completely. I want to schedule an SSIS package that:
executes a SQL script that creates TableA,
then proceeds to the Data Flow Task that transfers data from the source table to TableA.
The problem is that I get an error saying TableA does not exist as a destination, so validation of the Data Flow Task fails because the destination does not exist yet.
How can I set up the routine so that step 1 is done before step 2 in one dtsx?
Please note that solutions such as truncation are not an option for me.

In the SSIS package, under the properties of the Data Flow Task, set the DelayValidation property to True.
This delays validation of that step until run time, after the table has been created by the preceding Execute SQL Task.
Be careful, though: some errors may be suppressed when this property is used.
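For completeness, a minimal sketch of what the Execute SQL Task script might look like; the column list for TableA is purely illustrative and not from the original question:
-- Hypothetical script for the Execute SQL Task that runs before the Data Flow Task.
-- Replace the column definitions with the real schema of TableA.
IF OBJECT_ID(N'dbo.TableA', N'U') IS NOT NULL
    DROP TABLE dbo.TableA;

CREATE TABLE dbo.TableA
(
    Id        INT           NOT NULL PRIMARY KEY,
    SomeValue NVARCHAR(100) NULL,
    LoadDate  DATETIME      NOT NULL DEFAULT (GETDATE())
);
With DelayValidation set to True on the Data Flow Task, the destination is validated only after the table above has been created.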

Related

Determine whether to execute SSIS package inside its Control Flow based on whether an SQL Agent job is currently running

How to let an Execute SQL Task in SSIS decide whether to continue package execution, based on whether a specific SQL Agent job is running.
This is an easy way of letting the package decide whether to proceed, based on whether a specific SQL Agent job is running at the time the package is invoked.
To do so:
Create an Execute SQL Task at the start of your package.
Give it a query that asks [msdb] for the execution status of the SQL Agent job you are interested in.
Like so:
USE [msdb];
SELECT ISNULL(MAX([sj].[name]),'') AS [JobRunning]
FROM [msdb].[dbo].[sysjobactivity] AS [sja]
INNER JOIN [msdb].[dbo].[sysjobs] AS [sj]
ON [sja].[job_id] = [sj].[job_id]
WHERE [sja].[start_execution_date] IS NOT NULL
AND [sja].[stop_execution_date] IS NULL
AND CAST([sja].[start_execution_date] AS DATE) = CAST(GETDATE() AS DATE)
AND ISNULL([sj].[name],'') LIKE '%NameOfSQLAgentJobThatNeedsToNotBeRunning%';
Make sure to set the ResultSet property to Single row and create a package-scoped variable (User::JobRunning in this example) to hold the returned value.
Then add the first step of the package logic, whatever that might be, and connect the two.
When you have connected the two, edit the Precedence Constraint, set its evaluation operation to Expression and Constraint with a Success constraint, and set the expression to:
@[User::JobRunning] != "NameOfSQLAgentJobThatNeedsToNotBeRunning"
And that's it.
Save, build, deploy :)

How to invoke a job in T-SQL (SQL Server) that behaves like a listener in Java

I want to implement the design below.
Data arrives in a source table, source_tab, and I have 3 procedures to invoke to process this data. I currently invoke each procedure manually.
Is there any way to create a job that is invoked as soon as new data is available in source_tab and processes it by invoking those 3 procedures sequentially? Also, the next job cycle should not start until the current execution has finished. It should behave the same way listeners do in Java.
I don't want to use TRIGGERS.
I agree with the comment suggesting MSMQ. A dirty method, if you don't want to use triggers, is to set up a job on a short-interval schedule.
It could check the table; if new data exists, go to the first step and flow on from there, and if not, do nothing.
How you determine what "new" data is depends on the data. It is easy enough if you have a dateadded column or something similar (see the sketch after this answer). If not, you may need an additional job step that writes the table as it stands into a staging table, then compares that snapshot with the current table on the next run through.
Like I said, not nice but an option.
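For illustration, here is a minimal sketch of what the first job step could look like. The dateadded column, the dbo.etl_watermark table, and the three procedure names are assumptions for the example, not objects from the original post:
-- Hypothetical first step of the polling job.
-- Assumes source_tab has a dateadded column and that dbo.etl_watermark
-- holds the last processed timestamp in a single, pre-initialized row.
DECLARE @last_processed DATETIME;

SELECT @last_processed = last_processed
FROM dbo.etl_watermark;

IF EXISTS (SELECT 1 FROM dbo.source_tab WHERE dateadded > @last_processed)
BEGIN
    -- New data found: run the three procedures sequentially.
    EXEC dbo.usp_process_step1;
    EXEC dbo.usp_process_step2;
    EXEC dbo.usp_process_step3;

    -- Advance the watermark so the next cycle only sees newer rows.
    UPDATE dbo.etl_watermark
    SET last_processed = (SELECT MAX(dateadded) FROM dbo.source_tab);
END
-- If nothing is new, the step simply does nothing until the next schedule tick.
Because SQL Agent does not start a new instance of a job while a previous instance is still running, the "next cycle waits for the current one" requirement is covered by the scheduler itself.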

Debugging SP on SSIS

I have a stored procedure that I execute through SSIS using an Execute SQL Task. It appears to work in SSIS, but when I look at the database the record is not created, even though the connection points to the correct database. That is the problem.
I have put a breakpoint on, checked all the variables being fed in, and then run it manually in SQL Server Management Studio.
The SP works perfectly in SSMS with the same input parameters, but when executed through SSIS it does not create the required records and does not give any error.
In the SP I have a TRY...CATCH that writes any errors from the stored procedure to a table, but there is no entry for the SSIS run. According to the error table, both the SP and SSIS look like they executed successfully, yet when I go to check, the record is not created. I cannot see the problem. Is there something I can put into the stored procedure to debug this, or anything further I can do in SSIS to work it out?
I have spent 3 hours on this problem, so I am looking for a fresh perspective on what is happening.
The SSIS package definitely points to the correct database and stored procedure.
From the watch window, all the parameters appear to get the correct values, and nothing errors in SSIS.
Worked it out with SQL Profiler. In the target database there is a sequence that is incremented each time a new record needs to be created. When I deleted the record to rerun it, it was created with a different ID number; I was expecting it to be created with the same ID number.
Thanks Billinkc!
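For anyone hitting the same surprise: this is normal sequence behaviour, since values handed out by NEXT VALUE FOR are consumed permanently and are not reused when the row is deleted or the insert is rolled back. A generic demonstration (not the original database's objects):
-- Illustrative only: sequence values are never handed out twice.
CREATE SEQUENCE dbo.demo_seq AS INT START WITH 1 INCREMENT BY 1;

SELECT NEXT VALUE FOR dbo.demo_seq;   -- 1
SELECT NEXT VALUE FOR dbo.demo_seq;   -- 2
-- Deleting or rolling back rows that used 1 and 2 does not give them back:
SELECT NEXT VALUE FOR dbo.demo_seq;   -- 3

-- The current position of any sequence can be inspected here:
SELECT name, current_value FROM sys.sequences WHERE name = 'demo_seq';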

Stored procedure passing control back too quickly - VB6

I have a stored procedure that updates records in a very large table (over 100 million records).
The steps are as follows:
Store the record IDs to be updated in a recordset (not all records will be updated - only about 20,000).
Loop through the recordset and call the stored procedure for each record ID in the recordset.
Each time the stored procedure finishes (for each record in the recordset from step 1), update a flag in a table to say that the update completed.
I am finding some strange behaviour. It appears that the stored procedure is passing control back to VB6 before it has completed its updates, and processing continues with the next record. The stored procedure then times out later on (on another record ID). As a result there are flags that say updated (step 3) even though the stored procedure has not run (because it timed out). Is this normal behaviour, i.e. for the stored procedure to pass control back to VB6 before it has finished the work?
I have Googled this and discovered that it could be because of the way the stored procedure is optimised by SQL Server. I would expect control to be passed back to VB6 only after the updates have completed. Is this not the case?
Please note that I realise there may be better ways of approaching this. My question specifically relates to SQL Server passing control back to VB6 before it has finished the work (update).
The following article proved to be the solution to this problem: http://weblogs.sqlteam.com/dang/archive/2007/10/20/Use-Caution-with-Explicit-Transactions-in-Stored-Procedures.aspx. It appears that the following was happening:
1) Record 1. Run stored procedure and create transaction. A timeout occurs on the SQL Command object.
2) Record 2. Run stored procedure successfully. Return control to VB6 to update the flag in the database.
3) Record 3. Run stored procedure successfully. Return control to VB6 to update the flag in the database.
4) Record 4. Run stored procedure successfully. Return control to VB6 to update the flag in the database.
5) Program ends. The stored procedure's transaction is rolled back (the transaction now encompasses records 1-4). Therefore the changes for records 1-4 are not committed.
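The core recommendation from that article, sketched here in a generic form (the table and procedure names are invented, not the poster's actual code), is to let the procedure own its transaction completely, so a client-side timeout cannot leave an open transaction behind on the connection:
-- Illustrative pattern only; object names are hypothetical.
CREATE PROCEDURE dbo.usp_update_one_record
    @record_id INT
AS
BEGIN
    SET NOCOUNT ON;
    -- XACT_ABORT makes SQL Server roll back the transaction immediately when the
    -- batch is aborted (for example by a client command timeout), instead of
    -- leaving an open transaction that later calls are silently enlisted into.
    SET XACT_ABORT ON;

    BEGIN TRY
        BEGIN TRANSACTION;

        UPDATE dbo.big_table
        SET processed_flag = 1
        WHERE id = @record_id;

        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF XACT_STATE() <> 0
            ROLLBACK TRANSACTION;

        DECLARE @msg NVARCHAR(2048);
        SET @msg = ERROR_MESSAGE();
        RAISERROR (@msg, 16, 1);   -- surface the error to the caller
    END CATCH
END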
Can you...
run the code in SQL Server Management Studio, see what happens, and report back? If so, I will update this answer, as that will help us understand whether it is the code, the connection, or SQL.
Other things to investigate, given we don't know what cases you have tested for...
Use the same code path in your VB application and change only the SQL in the stored procedure to something very simple that has the same signature in terms of what it does (i.e. basic reading if there is reading, basic deleting if there is deleting, and the same for updating and adding), to see what happens.
Also, some other thoughts...
If you are using MSSQL, it can be as simple as someone leaving a query window open and tying up the database. This is easily tested; I've had the same trouble before. I've run stored procedures that had no timeout and would normally run immediately, yet they would sit overnight and not run, only for me to realise another person had left their query window open. Close their window and, poof, it finally runs. Check this out: it could be a table lock, whether held by the application itself or by another user querying the DB. Also check that your application is closing its connections to the DB each time they are used.
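A quick way to check for that kind of blocking while the update appears stuck (generic commands, not specific to this poster's environment):
-- Shows current sessions; the BlkBy column tells you which session is blocking which.
EXEC sp_who2;

-- On SQL Server 2005 and later, this gives a more targeted view of blocked requests.
SELECT session_id, blocking_session_id, wait_type, wait_time, command
FROM sys.dm_exec_requests
WHERE blocking_session_id <> 0;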

How to determine the name of a process that caused a trigger to fire

Short Version:
Does anyone know of a way --inside a SQL 2000 trigger-- of detecting which process modified the data, and exiting the trigger if a particular process is detected?
Long Version
I have a customized synchronization routine that moves data back and forth between dissimilar database schemas.
When this process grabs a modified record from database A, it needs to transform it into a record that goes into database B. The databases are radically different, but share some of the same data, such as user accounts and user activity (though even these tables are structurally different).
When data is modified in one of the pertinent tables, a trigger fires and writes the PK of that record to a "sync" table. This "sync" table is monitored by a process (a stored proc) which grabs the PKs in sequence and copies the related data from database A to database B, making transformations as necessary.
Both databases have triggers that fire and copy the PK to the sync table, but these triggers must ignore the sync process itself so as not to enter an "endless" loop (or as endless as the nesting limits allow).
In SQL 2005 and up, I use the following code in the Sync process to identify itself:
SET CONTEXT_INFO 0xHexValueOfProcName
Each trigger has the following code at the beginning, to see if the process that modified the data is the sync process itself:
IF (CONTEXT_INFO() = 0xHexValueOfProcName)
BEGIN
-- print '## Process Sync Queue detected. This trigger is exiting! ##'
return
END
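For reference, one way to produce the hex value for a given label is shown below; 'SyncProcess' is just an example label, not a name from the original setup:
-- Compute the binary value for a label by casting the string to varbinary.
SELECT CAST('SyncProcess' AS VARBINARY(128));   -- returns 0x53796E6350726F63657373

-- The sync procedure then marks its session before touching any data:
SET CONTEXT_INFO 0x53796E6350726F63657373;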
This system works great, keeps chugging along, and keeps the data in sync. The problem now, however, is that a SQL 2000 server wants to join the party.
Does anyone know of a way --inside a SQL 2000 trigger-- of detecting which process modified the data, and exiting the trigger if a particular process is detected?
Thanks guys!
(As per Andriy's request, I am answering my own question.)
I put this at the top of my trigger; it works like a charm.
-- How to check context info in SQL 2000
IF ((select CONTEXT_INFO from master..sysprocesses where spid = @@SPID) = 0xHexValueOfProcName)
BEGIN
print 'Sync Process Detected -- Exiting!'
return
END