Scheduling a Pentaho job in SQL Server Agent

I have built out a simple FTP job in Pentaho that places a file in a local directory. I need to be able to call this job from a SQL Server Agent job that I can then schedule and use, but when I set the Agent job up, it runs through the steps successfully yet produces nothing to show that it was in fact successful.
I am pretty confident the Pentaho job itself is fine, because it runs correctly from the UI, the command line, and a .bat file. Everything works as expected except when I try to run it through this SQL Server Agent job, and I have no idea why!
Here is the only step in the job. When I use it, I'm prompted with no errors, but nothing actually happens. If I try to enclose it in quotes, I get an error.
Any help would be appreciated.

Figured it out!
Apparently, only the first line of the command was executing, so it was navigating to a different directory but never running the Kitchen command. I fixed this by putting everything on one line and joining the two commands with &&.
Command line used: cd c:\pentaho\data-integration && kitchen.bat /file:C:\pentaho\Jobs\BW\FTP_BW_TRN.kjb /level:Basic
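If the step ever needs more than one command, an alternative (a sketch only, using the same paths as above; the wrapper file name is hypothetical, and it assumes Kitchen sets a non-zero exit code on failure) is to put the commands in a small batch file and point the CmdExec step at that single file:
@echo off
rem run_ftp_job.bat - wrapper that the SQL Agent CmdExec step calls as one command
cd /D C:\pentaho\data-integration
call kitchen.bat /file:C:\pentaho\Jobs\BW\FTP_BW_TRN.kjb /level:Basic
rem pass Kitchen's exit code back so a failed job shows up as a failed step
exit /b %ERRORLEVEL%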

Related

Where are the .cs files when a script task is run from the SSISDB using the agent?

I have an SSIS job that contains a script task.
There is also a connection to another server within the package that can only be accessed by a specific user, and the agent cannot be given read rights to the other server.
So the solution to this is have the SQL job be run as the required user.
The problem with this is that the user does not have full control over the folder/file where the script task code is temporarily placed while it runs, so the script task fails. If I run the package in Visual Studio or under the normal agent account, the script task executes successfully.
I have tried giving the user access to all major drives on the server, but this has not solved the problem.
Is the script doing something on the folder?
Or does it just need permission to execute script?
In any case, I would advise you to implement error handling in the script:
https://learn.microsoft.com/en-us/sql/integration-services/extending-packages-scripting/task/logging-in-the-script-task?view=sql-server-2017
Make sure it works in Visual Studio, so that when you get that error you can be sure it is caused by permissions/environment.
You could also take a look at this article.
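One way to confirm it really is a permissions/environment difference is to run the package from a command line as the required user and see where it fails (a sketch; the domain, user name and package path below are placeholders):
runas /user:YOURDOMAIN\RequiredUser "dtexec /f C:\SSIS\YourPackage.dtsx"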

Windows Task Scheduler: SQLCMD command does not return an error when it fails?

I need some help with this.
I've scheduled a task in Windows Task Scheduler that calls SQLCMD with the parameter: -i "path\script.sql"
My problem is this:
My script starts with: USE [DatabaseX]
DatabaseX does not exist on the server, so the script fails.
But the Scheduled Task ends with a Successful result, even if the script fails.
I need to see that the last run failed, whether in the scheduled task or somewhere else.
Is this possible?
Thanks,
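For what it's worth, a small wrapper in the scheduled task can surface the failure (a sketch only; it assumes sqlcmd's -b option, which makes sqlcmd return a non-zero exit code when a T-SQL error occurs, and Task Scheduler should then show that code as the Last Run Result):
sqlcmd -b -i "path\script.sql"
exit /b %ERRORLEVEL%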

Bat file to run a sql query on a schedule through Task Scheduler

I am trying to run a .sql script on a schedule. I have created a batch file to run the script. The script runs fine in SQL Server Management Studio and also when I run the batch file contents through cmd.
Contents of the batch file:
sqlcmd -S omfmesql -U OMESRV -P orat -i "\\pvsrv-fsr14\data\Projects\Stat_Table_Creation_unique.sql"
The SQL script is supposed to update a stat table. When I run it through cmd and refresh the stat table, the numbers are updated. But when I run this batch file through Task Scheduler, the only action that seems to be performed is running C:\Windows\SYSTEM32\cmd.exe
The task is reported as completed successfully, but the SQL query is simply not run.
I am not too experienced with Task Scheduler. Any help here would be very much appreciated. Thanks!
Note: I am not intending to use SQL Server Agent.
If you have not done so, you need to set the starting location ("Start in" directory) for the task in Task Scheduler (TS). In at least some versions of TS, this can only be done when you create a basic task, not from the more general "Create Task..." option. Ensure that all the paths in the batch file are absolute or are based in this location.
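A belt-and-braces variant is to have the batch file pin down its own working directory, so it behaves the same regardless of the "Start in" setting (a sketch; %~dp0 is standard batch syntax for the folder containing the batch file, and the sqlcmd line is the one from the question):
@echo off
rem %~dp0 expands to the folder this batch file lives in, so the job no longer
rem depends on whichever directory Task Scheduler happens to start it in
pushd "%~dp0"
sqlcmd -S omfmesql -U OMESRV -P orat -i "\\pvsrv-fsr14\data\Projects\Stat_Table_Creation_unique.sql"
popd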

SSIS Execute Process Task Can't Find executable

I am using the 7zip standalone .exe to unzip a file, via the Execute Process task. I have tested this over and over again on multiple machines and I know it works (at least in debug mode/Visual Studio). I have uploaded this package to the server and created a job that calls said package from the Package Store. The package is not able to find the .exe no matter where I put it.
My first thought was to put the .exe on the C:\ drive, which failed. I have also failed in my attempts to place the .exe on a network location that the account the package is running under has full control over.
Basically, has anybody else had issues getting the Execute Process Task to find an executable when the package is uploaded to the server?
The error message is
Can't find 7za.exe in directory C:\7zip
I'll risk a downvote for being wrong, but I believe you have a permission issue.
You say it runs fine on other servers from BIDS; try it without BIDS. Call it from a command line on a box where it works.
dtexec.exe /file C:\HereComesTheUnzipper.dtsx
If that works, then repeat the step on the troublesome server. RDC into the box and try again
dtexec.exe /ser localhost /sq HereComesTheUnzipper
If that still works, then you are looking at an issue with the job. What account is the SQL Agent service running as? Is the SSIS job step running as a particular set of credentials? If so, is it a SQL Server login (which wouldn't map to anything on the physical box)? Regardless of what your answer is, the resolution will be to ensure the account has access to:
7z.exe
whatever scratch area 7zip may use while unpacking files (I assume %temp%)
the output folder (C:\bin\7z.exe -e e:\data\MyThing.7z)
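A quick way to see what the job account can actually reach (a sketch; whoami, dir and icacls are standard Windows commands, and C:\7zip is the folder from the error message) is to add a temporary Operating system (CmdExec) step that runs:
rem show which account the job step is really running as
whoami
rem confirm the account can see the executable and inspect its permissions
dir C:\7zip\7za.exe
icacls C:\7zip\7za.exe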

Powershell Transcript is empty when running script from SQL Agent Job in 2005 SQL Server

I have a complex PowerShell script that gets run as part of a SQL Server 2005 Agent job. The script works fine, but it uses the "Start-Transcript $strLogfile -Append" command to log all of its actions to a transcript file. The problem is that the transcript is always empty. It adds the header and footer to indicate that the transcript is starting and stopping, but it doesn't actually log anything. Example:
**********************
Windows PowerShell Transcript Start
Start time: 20100304173001
Username : xxxxxxxxxxxx\SYSTEM
Machine : xxxxx-xxx (Microsoft Windows NT 5.2.3790 Service Pack 2)
**********************
**********************
Windows PowerShell Transcript End
End time: 20100304173118
**********************
When I execute the script from a command prompt or Start -> Run, everything works just fine. Here is the command used to run the script (the same command is used in the Operating system (CmdExec) step of the SQL Agent job):
powershell.exe -File "c:\temp\Backup\backup script.ps1"
I first thought it must have something to do with the script running under the System account (default SQL Agent account), but even when I tried changing the SQL Agent to run under my own personal account it still created a blank transcript.
Is there any way to get PowerShell Transcripts to work when executing them as part of a 2005 SQL Server Agent Job?
If your script uses native commands (console exes), Start-Transcript does not log any of that output. This issue has been logged on Connect; you can vote on it. One way to capture all of the output is to use cmd.exe:
cmd /c powershell.exe -file "C:\temp\backup script.ps1" > backup.log
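If you also want errors from those native commands in the same file, standard cmd redirection of standard error can be added (a sketch; the script path and log name are the ones from the line above):
cmd /c powershell.exe -file "C:\temp\backup script.ps1" > backup.log 2>&1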
sqlps.exe does not implement certain methods, including the method that supports Write-Host. This may explain why you are not seeing output from Start-Transcript when running sqlps.exe from a SQL Agent PowerShell job step. See http://blogs.msdn.com/mwories/archive/2009/09/30/the-use-of-write-host-and-sql-server-agent-powershell-job-steps.aspx for more information.
I am still not sure why the PowerShell transcript is empty, but we found a workaround. Under the CmdExec step of the SQL job there is an advanced option to capture the output to a file, which, combined with the "Append output to existing file" option and a Logfile.rtf extension, is about the same as the PowerShell transcript. This way anything that gets printed to the host from the PowerShell script (including output from native console executables piped to "| Out-Host") will be captured in the log file.