Variable is cleared on OnPostExecute event (SSIS 2008)

I am trying to pass the package variable packno from an Execute SQL Task to a Data Flow task.
The variable is filled with the needed value, 20717, on the OnProgress event.
But on the OnPostExecute event that value is cleared and -1 is assigned,
so the SQL command is executed with the value -1.
Any idea why this is happening?

It turned out to be a problem with the server; everything worked fine on a different one.

Related

How to set the SSIS package status to Failure when Propagate was set to false for a Sequence Container

I have an SSIS package with a For Each Loop > Sequence Container. The Sequence Container tries to read a file from the For Each Loop and process its data. The requirement was to not fail the entire package when an exception happened while processing a file, but to continue processing the next file until all the files from the For Each Loop were processed. For this, I set the Propagate variable for the Sequence Container to False. I also added an email step on the Sequence Container's OnError event. The package runs as expected and is able to process all the files even when an exception happens with one of them. But I would like the final status of my SSIS package to be Failure, since one of the files failed. How can I achieve that?
Did you try these options?
Go to View -> Properties Window, then click on your Sequence Container and it will show you the properties of the Sequence Container.
If I were you, the first property I would try is FailPackageOnFailure; if I understand the question correctly, it should cover your case.
P.S. You can also see the properties of the whole package when you click on a free spot on the design surface.
UPDATED (after the comments and a clearer understanding of the task):
The idea is to set the MaximumErrorCount property on the Sequence Container as high as you need. That way the package won't stop because one of the files failed inside the Sequence Container, and the next file will still be processed; but the package should stop after the Sequence Container finishes its work, because you don't change MaximumErrorCount on the package itself.
Important: a value of zero sets the error count threshold to infinity, so the package or task never reports Failure.

Database update committed despite BAPI_TRANSACTION_ROLLBACK

I have a MODIFY statement that updates a custom table, after which I call BAPI_CONTRACT_CHANGE. When the BAPI fails to change the document, it calls BAPI_TRANSACTION_ROLLBACK. However, this does not roll back the data in my custom table that was updated by the MODIFY statement.
IF gt_return[] IS NOT INITIAL.
  READ TABLE gt_return INTO gwa_return WITH KEY type = 'E'.
  IF sy-subrc EQ 0.
    CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
    MESSAGE i021(zxx).
  ELSE.
    CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
      EXPORTING
        wait = abap_true.
  ENDIF.
ENDIF.
Appreciate your response.
The MODIFY statement is somewhere before the BAPI call, and this program can also be run from an external portal. The behaviour is as expected when I run it from the portal, i.e. BAPI_TRANSACTION_ROLLBACK works and the data does not get updated in the custom table. It fails only when I run it from ECC.
You get an implicit database commit on each PBO screen interaction.
So any INSERT/MODIFY/UPDATE on the database will be committed by any subsequent user interaction, without the code needing to issue a COMMIT WORK.
If you want to roll back, it must be done before the next PBO cycle.
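One way around this is not to MODIFY the custom table directly, but to register the update in the update task, so it lives and dies in the same LUW as the BAPI. A minimal sketch, assuming a hypothetical update function module Z_ZTABLE_UPDATE (processing type "Update Module") that performs the MODIFY internally, and an internal table gt_ztable holding the rows:

* Queue the custom-table update instead of modifying the table directly
CALL FUNCTION 'Z_ZTABLE_UPDATE' IN UPDATE TASK
  TABLES
    it_ztable = gt_ztable.

* ... CALL FUNCTION 'BAPI_CONTRACT_CHANGE' and fill gt_return here ...

READ TABLE gt_return INTO gwa_return WITH KEY type = 'E'.
IF sy-subrc EQ 0.
* ROLLBACK WORK discards the queued update-task call as well
  CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
ELSE.
* COMMIT WORK executes the queued update together with the BAPI's changes
  CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
    EXPORTING
      wait = abap_true.
ENDIF.

Because the MODIFY itself only runs inside the update task at COMMIT WORK, an intermediate screen change has nothing of yours to commit.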

How to correctly add a labor transaction record in a Maximo automation script

Hi, I'm trying to add a labor transaction from an action automation script whose object is ASSIGNMENT in Maximo. I am currently trying the code below.
from psdi.server import MXServer
from java.text import SimpleDateFormat
from java.util import Date

# ui is the UserInfo of the current user
labTransSet = MXServer.getMXServer().getMboSet("LABTRANS", ui)
labTrans = labTransSet.add()
labTrans.setValue("laborcode", userLabor)
labTrans.setValue("wonum", assignWonum)
# format the start/finish times as hours.minutes AM/PM
sds1 = SimpleDateFormat("hh.mm aa").format(firstDate)
sds2 = SimpleDateFormat("hh.mm aa").format(Date())
labTrans.setValue("STARTTIME", sds1)
labTrans.setValue("FINISHTIME", sds2)
labTransSet.save()
labTransSet.close()
userLabor is the username of the current user
assignWonum is the assignment work order number
firstDate is the scheduled date field from the assignment
The labor record is added correctly with the right data, but when I go to route my workflow after the script is called from a button, I get the warning BMXAA8229W: WOACTIVITY has been updated by another user, and the work order does not route. I am under the impression that this happens because the assignment object for the script is being queried at the same time I try to add and save a labor record. Does anyone know if my guess is correct, or what else the problem is and how I can fix it? Thanks
That error occurs because Maximo already has one version of the record loaded into memory when the record in the database is modified independently. Maximo then tries to work with the in-memory object and sees it doesn't match what is in the database and throws that error. Timing doesn't really have anything to do with it (other than that an edit happened at some point after the record was loaded into memory).
What you need to do is make sure you are modifying the exact same task/assignment/labtrans record that has already been loaded into memory. That "MXServer.getMXServer().getMboSet" approach is guaranteed to use a new object; that is how you start a new transaction in Maximo, and how you make sure you are not using anything already loaded into memory. I suspect you want to get your set off of the implicit "mbo" object the script gives you.
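A minimal sketch of that, assuming "mbo" is the implicit ASSIGNMENT Mbo handed to the script, and that a LABTRANS relationship is defined from ASSIGNMENT (the relationship name here is an assumption; check Database Configuration):

# use the set hanging off the in-memory assignment, not a fresh one
labTransSet = mbo.getMboSet("LABTRANS")
labTrans = labTransSet.add()
labTrans.setValue("laborcode", userLabor)
labTrans.setValue("wonum", assignWonum)
# no explicit save()/close(): the new row rides along in the owning
# Mbo's transaction, so routing won't see a competing update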

SQL Server Agent 2005 job runs but no output

Essentially I have a package which runs in BIDS and as a stand-alone package, but when it runs under the SQL Server Agent it doesn't complete properly (no error messages, though).
The job steps are:
1) Delete all rows from table;
2) Use a For Each Loop to fill up the table from Excel spreadsheets;
3) Clean up table.
I've tried this MS page (steps 1 & 2), but didn't see any need to start changing the server-side security.
I also tried this SQLServerCentral.com page, with no resolution.
How can I get error logging or a fix?
Note: I've reposted this from Server Fault, as it's one of those questions that's not purely admin or programming.
I have logged in as the proxy account I'm running this under, and the job runs stand-alone but complains that the Excel tables are empty.
Here's how I managed tracking "returned state" from an SSIS package called via a SQL Agent job. If we're lucky, some of this may apply to your system.
Job calls a stored procedure
Procedure builds a DTEXEC call (with a dozen or more parameters)
Procedure calls xp_cmdshell, with the call as a parameter (@Command)
SSIS package runs
"local" SSIS variable is initialized to 1
If an error is raised, SSIS "flow" passes to a step that sets that local variable to 0
In a final step, use Expressions to set SSIS property "ForceExecutionResult" to that local variable (1 = Success, 0 = Failure)
Full form of the SSIS call stores the returned value like so:
EXECUTE @ReturnValue = master.dbo.xp_cmdshell @Command
...and then it gets messy, as you can get a host of values returned from SSIS. I logged actions and activity in a DB table while going through the SSIS steps, and consult that to try to work things out (which is where @Description below comes from). Here's the relevant code and comments:
-- Evaluate the DTEXEC return code
SET @Message = case
      when @ReturnValue = 1 and @Description <> 'SSIS Package' then 'SSIS Package execution was stopped or interrupted before it completed'
      when @ReturnValue in (0,1) then ''  -- Package success or failure is logged within the package
      when @ReturnValue = 3 then 'DTEXEC exit code 3, package interrupted'
      when @ReturnValue in (4,5,6) then 'DTEXEC exit code ' + cast(@ReturnValue as varchar(10)) + ', package could not be run'
      else 'DTEXEC exit code ' + isnull(cast(@ReturnValue as varchar(10)), '<NULL>') + ' is an unknown and unanticipated value'
   end
-- Oddball case: if the cmd.exe process is killed, the return value is 1, but the process will continue
-- anyway and could finish 100% successfully... and @ReturnValue will equal 1. If you can figure out how,
-- write a check for this in here.
That last comment references the "what if, while SSIS is running, some admin joker kills the CMD session (from, say, Task Manager) because the process is running too long" situation. We've never had it happen, that I know of, but they were uber-paranoid when I was writing this, so I had to look into it...
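For reference, a bare-bones sketch of steps 2 and 3 above; the package path and the /SET parameter are illustrative, not from the real system:

-- Build the DTEXEC call (step 2); the /SET entries vary per package
DECLARE @Command varchar(4000), @ReturnValue int
SET @Command = 'DTEXEC /FILE "D:\SSIS\MyPackage.dtsx"'
             + ' /SET "\Package.Variables[User::RunDate].Value";"2024-01-01"'
-- Step 3: run it through xp_cmdshell, keeping the exit code for the CASE evaluation above
EXECUTE @ReturnValue = master.dbo.xp_cmdshell @Command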
Why not use the logging built into SSIS? We send our logs to a database table and then parse them out to another table in a more user-friendly format, and can see every step of every package that was run. And every error.
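For what it's worth, with the SQL Server log provider on SSIS 2005 the log lands in dbo.sysdtslog90 (dbo.sysssislog on 2008+), and a quick query like this pulls out just the errors:

SELECT source, starttime, message
FROM dbo.sysdtslog90   -- dbo.sysssislog on SSIS 2008+
WHERE event IN ('OnError', 'OnTaskFailed')
ORDER BY starttime DESC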
I did fix this eventually, thanks for the suggestions.
Basically I logged into Windows with the proxy user account I was running under and started to see errors like:
"The For Each File enumerator is empty"
I copied the project files across and started testing. It turned out that I'd still left a file path (N:/) in the properties of the For Each Loop, although I'd changed the connection properties. It's easier once you've got error conditions to work with. I also had to recreate the variable mapping.
No wonder people just recreate the whole package.
Now fixed and working!

OnTaskFailed event handler in SSIS

If I use the OnError event handler in my SSIS package, there are the variables System::ErrorCode and System::ErrorDescription from which I can get the error information if anything fails during execution.
But I can't find the same for the OnTaskFailed event handler. How do I get the ErrorCode and ErrorDescription from the OnTaskFailed event handler when something fails during execution, in case we want to implement only the OnTaskFailed event handler for our package?
This might be helpful; it's a list of all the system variables and when they are available:
http://msdn.microsoft.com/en-us/library/ms141788.aspx
I've just run into the same issue, and I've worked around it by:
Creating an @[ErrorCache] variable
In my case the task was being retried multiple times, so I needed an Expression Task to reset the @[ErrorCache] variable at the beginning of each retry
Creating an OnError event handler, which contains an Expression Task purely to append the @[ErrorMessage] to the @[ErrorCache] (see the sketch after this list)
Creating an OnTaskFailed event handler which then utilises the @[ErrorCache]
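A sketch of that append expression, assuming the cache variable lives in the User namespace and using the stock System::ErrorDescription (only populated inside the OnError handler) as the message:

@[User::ErrorCache] = @[User::ErrorCache] + @[System::ErrorDescription] + "\n"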
Go to the event handlers of the task you want to monitor for errors and click the link to create a new handler. Then create a task such as Send Mail, and create two variables: mail_subject and mail_body.
IMPORTANT: Move the variables from the current scope to the OnError scope, otherwise the values won't be available when the package is processed.
Define the mail_subject variable as a String and set its expression to: "Error " + @[System::TaskName] + " when executing " + @[System::PackageName] + " package."
Define the mail_body variable as a String and set its expression to: REPLACENULL( @[System::ErrorDescription], "" ) + "\nNotify your system administrator."
In the task editor, create an expression assigning Subject to the mail_subject variable. Set MessageSourceType to Variable and MessageSource to the mail_body variable.
In the task that you put in the error event handler, you can select parameters that are only available in an error handler, such as System::ErrorDescription or System::SourceName (which provides the task it failed on). We use these as input variables to a stored proc which inserts into an error table (and sends an email for the failed process) that stores information beyond just the logging table. We also use the logging table to log our steps, and include OnError in that, so general error information goes there.
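For illustration, the Execute SQL Task in the error handler might run something like this (usp_LogPackageError is a hypothetical procedure name; each ? placeholder is bound to a system variable on the task's Parameter Mapping tab):

-- parameter 0: System::PackageName
-- parameter 1: System::SourceName (the task that failed)
-- parameter 2: System::ErrorDescription
EXEC dbo.usp_LogPackageError ?, ?, ?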