Counting how many scripts in an SSIS package have been completed - scripting

In an SSIS package I have multiple scripts running within a job. At any given time I want to read how many scripts have been executed, e.g., 5/10 (50%) completed. Please tell me how I can achieve that.

Currently there is no functionality provided by SSIS to track the progress of package execution.
You will need to write your own custom utility/application or use a third-party one.
There are a few ways to do this:
1. Use the /Reporting (or /Rep) switch of DTEXEC at the command line. For example:
DTEXEC /F ssisexample.dtsx /Rep P > progress.txt
2. Implement package logging or customize it (a sketch follows this list).
3. Implement an event handler on the required executable. You can also use the OnPipelineRowsSent log event of the Data Flow Task.
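If you go the logging route (option 2), here is a minimal sketch of computing progress from the log table. It assumes the package logs OnPreExecute/OnPostExecute events through the SQL Server log provider into dbo.sysssislog and that the Invoke-Sqlcmd PowerShell cmdlet is available; the server and database names are placeholders:

# Count tasks finished vs. started for the most recent execution (sketch)
$query = @"
SELECT SUM(CASE WHEN [event] = 'OnPostExecute' THEN 1 ELSE 0 END) AS Completed,
       SUM(CASE WHEN [event] = 'OnPreExecute'  THEN 1 ELSE 0 END) AS Started
FROM dbo.sysssislog
WHERE executionid = (SELECT TOP 1 executionid FROM dbo.sysssislog ORDER BY starttime DESC)
"@
$r = Invoke-Sqlcmd -ServerInstance "localhost" -Database "LogDB" -Query $query
"{0}/{1} tasks completed" -f $r.Completed, $r.Started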
If you want to write your own application, the thread below provides a nice starting point:
How do you code the Package Execution Progress window in C#

In my experience, we use a separate piece of software to monitor the jobs that are running: http://en.wikipedia.org/wiki/CA_Workload_Automation_AE
You can also try creating your own application that runs in the background and checks the status of your jobs by reading the logs.
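As a rough sketch of such a background checker, assuming the jobs run under SQL Server Agent and you can query msdb (server name and polling interval are placeholders):

# Poll SQL Server Agent for jobs that have started but not yet finished
$query = @"
SELECT j.name, a.start_execution_date
FROM dbo.sysjobactivity a JOIN dbo.sysjobs j ON a.job_id = j.job_id
WHERE a.session_id = (SELECT MAX(session_id) FROM dbo.syssessions)
  AND a.start_execution_date IS NOT NULL AND a.stop_execution_date IS NULL
"@
while ($true) {
    Invoke-Sqlcmd -ServerInstance "localhost" -Database "msdb" -Query $query | Format-Table -AutoSize
    Start-Sleep -Seconds 60   # check once a minute
}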

Related

How to schedule Pentaho Kettle transformations?

I've set up four transformations in Kettle. Now I would like to schedule them so that they will run daily at a certain time, one after another. For example,
transformation1 -> transformation2 -> transformation3 -> transformation4
should run daily at 8.00 am. How can I do that?
There are basically two ways of scheduling jobs in PDI.
1. You can use the command line (as correctly written by Anders):
for transformation scheduling:
<pentaho-installation directory>/pan.sh -file:"your-transformation.ktr"
for job scheduling:
<pentaho-installation directory>/kitchen.sh -file:"your-job.kjb"
2. You can also use the inbuilt scheduler in Pentaho Spoon.
If you are using the EE version of PDI, you will have a built-in scheduler in Spoon itself. It's a UI you can use to schedule jobs easily. You can also read this section of the documentation for more.
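Without the EE scheduler, on Windows you can still get the daily 8:00 run by registering an OS scheduled task for the command line above, e.g. with PowerShell (paths and task name are placeholders; Register-ScheduledTask needs Windows 8/Server 2012 or later):

# Run the job that chains the four transformations every day at 8:00
$action  = New-ScheduledTaskAction -Execute "C:\pentaho\data-integration\Kitchen.bat" -Argument '/file:"C:\etl\daily.kjb"'
$trigger = New-ScheduledTaskTrigger -Daily -At "8:00AM"
Register-ScheduledTask -TaskName "PDI daily ETL" -Action $action -Trigger $trigger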
You can execute a transformation from the command line using the tool Pan:
Pan.bat /file:transform.ktr /param:name=value
The syntax might be different depending on your system - check out the link above for more information. Once you have a batch file executing your transformation, you can schedule it to run using any scheduling tool on whatever system you are running.
Also, you could put all the transformations in a job and execute that from the command line with Kitchen.
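If you chain them from the command line instead of from a job, a small wrapper script can enforce the one-after-another order and stop on the first failure (Pan returns a non-zero exit code on error; paths and file names are placeholders):

# Run the four transformations in order; abort the chain if one fails
$transforms = "transformation1.ktr", "transformation2.ktr", "transformation3.ktr", "transformation4.ktr"
foreach ($t in $transforms) {
    & "C:\pentaho\data-integration\Pan.bat" /file:"C:\etl\$t"
    if ($LASTEXITCODE -ne 0) { Write-Error "$t failed with exit code $LASTEXITCODE"; break }
}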
I'd like to add another answer that many first-time Spoon users miss. Let's say you have a transformation exampleTrafo.ktr that you want to run at a certain interval. Then what you could do is create a job exampleJob.kjb which merely runs the transformation: little more than a START entry connected to a Transformation entry that points at exampleTrafo.ktr.
The START node here is the important thing: right-click on it, choose Edit..., and you'll be presented with a job scheduling window where you can specify your desired job schedule. Then save and run this job (either locally or remotely on a slave using PDI's Carte server). Basically what you will end up with is an indefinitely running job called exampleJob that will execute your exampleTrafo at the desired intervals.

Test Automation Framework - Stuck

I am wondering where to start in building a test framework here.
I have created a VB.Net application that displays the list of available projects and allows the user to select the project and the date and time at which the test needs to be executed.
Once the user decides the time and task, I want my system to schedule a task on a remote machine, where the test execution would happen at the specified time.
I am stuck at point two. Any pointers or questions are much appreciated.
I use TestComplete for automation.
I want my system to schedule a task onto a remote machine where the test execution would happen at the specified time
There's a Windows Task Scheduler and associated API that supports scheduling tasks at specific times. The API is aimed at C++ programmers.
You could use the Task Scheduler Managed Wrapper available on CodePlex for easy interop with VB.Net.
The task to execute could be copied to a network drive so that it is accessible from the remote machine.
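A sketch of what the remote scheduling can look like without writing any code against the API, using schtasks.exe (machine name, credentials, date, time, and paths are all placeholders, and the /SD date format follows the remote machine's regional settings):

# Create a one-shot task on the remote machine for the chosen date and time
schtasks /Create /S REMOTEPC /U DOMAIN\user /P password /TN "ScheduledUITest" /SC ONCE /ST 22:30 /SD 01/15/2016 /TR "\\fileserver\tests\runtest.cmd"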
For point 2 you'll have to call TestComplete from the command line as per these instructions:
http://support.smartbear.com/viewarticle/55587/
You can also call TestExecute from the command line; it's a cut-down version of TestComplete that will run your tests. Your license may or may not include it.
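For reference, the call from those instructions boils down to something like this (installation path, suite, and project names are placeholders; check the linked article for the exact switches your version supports):

# Run one project from a suite, then close TestComplete when done
& "C:\Program Files (x86)\SmartBear\TestComplete 10\Bin\TestComplete.exe" "C:\Tests\MySuite.pjs" /run /project:MyProject /exit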
Did you also consider taking a look at Jenkins for scheduling your test runs?

Execute Multiple PowerShell Files using a SSIS Package

I have multiple PowerShell script files that I need to execute in a sequential flow (one after the other). Can someone please tell me how to schedule multiple PowerShell files to be executed using an SSIS package? I also need to build a fault-tolerant model where I can re-execute a PowerShell script in case of failure.
Running PowerShell
There isn't a built-in Execute PowerShell task (pity) so you'll need to use an Execute Process Task with the path to powershell.exe
Something that you will need to take into consideration is that the default execution policy for PowerShell is Restricted which cannot run a script. Further complicating matters is the account that runs the SSIS package will also need to have its execution policy modified to be able to fire off those scripts. It's a simple matter of Set-ExecutionPolicy RemoteSigned or whatever level you feel is appropriate but you'll need to do this from within the account.
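Concretely, the Execute Process Task would point at powershell.exe with arguments along these lines (the script path is a placeholder); passing -ExecutionPolicy on the command line scopes the policy to that one process, which can save you from touching each account's settings:

# What the Execute Process Task effectively invokes
powershell.exe -NoProfile -NonInteractive -ExecutionPolicy Bypass -File "C:\scripts\step1.ps1"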
Fault Tolerance
The simple approach is to ignore the return code in the Execute Process Task. Alternatively, if the desire is to keep running the PS1 until it doesn't fail, then you'd wrap a For Loop Container around the Execute Process Task and only set the terminal condition once the task returns a success value. Things might still go sideways depending on what the failure is.
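As an alternative to the For Loop Container, the retry can live in PowerShell itself, so the SSIS side stays a single Execute Process Task. A rough sketch, assuming each script signals failure through its exit code (the path and limits are placeholders):

# Retry a script up to 3 times; exit non-zero only if every attempt fails
$maxAttempts = 3
for ($i = 1; $i -le $maxAttempts; $i++) {
    & "C:\scripts\step1.ps1"
    if ($LASTEXITCODE -eq 0) { exit 0 }   # success: report it back to SSIS
    Start-Sleep -Seconds 30               # back off before retrying
}
exit 1                                    # all attempts failed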

MS Access: doing repetitive processes with VBA/SQL

I have an Access database backend that contains three tables, and I have distributed the front end to several users. This is a very simple database with minimal functionality. I need to import certain rows from a file every hour into one of the tables in the database. I would like to know the best way to automate this process so that it runs hourly, more or less as a service in the background. Can you tell me how you would do this?
You could have, for example:
an MS Access file with all the necessary code to run the import procedure
a BAT file containing the command line(s) that will run this MS Access file with all requested parameters (check the MS Access command-line parameters to see the available options)
task scheduler software to launch the BAT file: depending on the task scheduler and the command line to be sent, you could even skip the BAT file step (a sketch follows this list)
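Putting those pieces together, a sketch of what the BAT file and the scheduling command might look like. MSACCESS.EXE's /x switch runs a macro (which can call your import code); all paths, the macro name, and the task name are placeholders:

REM run-import.bat - opens the front end and runs the import macro
"C:\Program Files\Microsoft Office\Office14\MSACCESS.EXE" "C:\db\import.accdb" /x ImportRows

# Register the BAT file to run every hour (Windows Task Scheduler)
schtasks /Create /TN "HourlyImport" /SC HOURLY /TR "C:\db\run-import.bat"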
If all you want to do is run some queries, I would not do this by automating all of Access, but instead by writing a VBScript that uses DAO to execute the SQL directly. That's a much more efficient way to do it, and will run without a console logon (which may or may not be required for full Access to be run by the task scheduler).
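A minimal sketch of that idea, here written as PowerShell driving the DAO COM object rather than VBScript (the same approach, just a different host); the ProgID, database path, and query name are assumptions that depend on your Jet/ACE version:

# Open the backend directly and run a saved action query - no Access UI involved
$engine = New-Object -ComObject DAO.DBEngine.120   # DAO.DBEngine.36 for older Jet installs
$db = $engine.OpenDatabase("C:\db\backend.mdb")
$db.Execute("qryHourlyImport", 128)                # 128 = dbFailOnError
$db.Close()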

How to get details about the DTS Step in a running job?

I have scheduled a DTS to run from a scheduled job. The DTS has several steps in it. Whenever the job is running and I look at the Jobs section in Enterprise Manager, the status always displays 'Executing Job Step 1...', although it is running all the steps properly. How do I know which step the DTS is currently on?
Can I get the status maybe from sql analyzer?
You can add something that shows you where the DTS is currently running. In my view the best way is to raise an alert using a script. There is no other direct way to trace a DTS task!
The display you get is a snapshot; you need to keep refreshing it.
There is only one step in the job, the command to run the DTS package.
If you want to see progress of steps within the package, you need to add something to the DTS package to record each step as it finishes in a logging table.
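A sketch of that logging-table approach: create a small table once, have each step finish by inserting a row (e.g. from an Execute SQL task), and poll the table from Query Analyzer or a script. The table, database, and step names are placeholders, and Invoke-Sqlcmd here stands in for whatever query tool you use:

# One-time setup for the progress table
Invoke-Sqlcmd -ServerInstance "localhost" -Database "Admin" -Query "CREATE TABLE dbo.DtsStepLog (PackageName sysname, StepName sysname, FinishedAt datetime DEFAULT GETDATE())"
# Each DTS step then ends with something like:
#   INSERT INTO dbo.DtsStepLog (PackageName, StepName) VALUES ('MyPackage', 'Step 3')
# And you watch progress with:
Invoke-Sqlcmd -ServerInstance "localhost" -Database "Admin" -Query "SELECT TOP 1 StepName, FinishedAt FROM dbo.DtsStepLog ORDER BY FinishedAt DESC"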
Since the DTS mostly executes against database tables, on the SQL Server side you can see which sessions are currently active, the statements they are executing, and so on, if you have administrative privileges. You can find this under Management > Activity Monitor.