bcp not running from task scheduler - sql

I am trying to schedule a bcp job in server 2012 task scheduler. My batch file works fine when I double-click on it. It includes this line:
bcp "SELECT * FROM [TIME_KEEPER]" queryout D:\DATA\TIMESHEET_DBASE.csv -S 10.0.0.54 /c /t, -T
The file is created from the command line. Scheduler has:
Action: start a program
Script: D:\DATA\myBatch.bat
Start in: D:\Data
I am using the same account for other scheduled tasks and they are running fine.

Sounds like a security issue.
Do any of the other scheduled tasks use the bcp executable and connect to the same server, pulling data from the same table? If not, then you have to track down the security context being used.
When you double-click your batch file, it runs as the account you are currently logged in with. Is it possible that your scheduled task is running under a different account than the one you are logged in as?
As a test, can you log in to the Windows server with the same account Task Scheduler uses to run the task (assuming the accounts are different) and run the batch file manually?
You should see a similar error at that point.
Just a start.
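As a further troubleshooting step, you could have the batch file redirect bcp's output to a log file so that whatever error the scheduled run hits becomes visible (the log path below is only an example):
rem same bcp command as in the batch file, with output and errors redirected to a log for the scheduled run
bcp "SELECT * FROM [TIME_KEEPER]" queryout D:\DATA\TIMESHEET_DBASE.csv -S 10.0.0.54 /c /t, -T > D:\DATA\bcp_run.log 2>&1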

Related

Using rsync preserving all the attributes and as foreground job

I have a server which is always online. During night hours, I want to run a job on the server that simply copies from /folderA -> /folderB.
I would rather not keep my screen on while rsync is performing the job.
Is there any way of using this command:
rsync -avhW --no-compress --progress folderA/. folderB/
while I turn off my SSH station (laptop)?
Thanks
You are looking to set up a cron job. Cron jobs are tasks that run automatically on a schedule. To set up your rsync job:
Log in to your server via SSH
Run crontab -e. This will open a file in the server's default text editor. Typically, this file will have some comments with details on how to add a job
On a new line at the end of the file, add 30 23 * * * rsync -avhW --no-compress --progress folderA/. folderB/. This will run the job at 11:30 PM every day. To change the scheduled time, adjust the first two fields (minute and hour).
Save and exit
You can now log out from the server and the job will run every day at the scheduled time. Since the job is scheduled on the server, your workstation can be off and you do not need to be logged in to the server. The task will keep running on schedule until you remove the line from the crontab file.
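Optionally, you can redirect the command's output to a log file so you can check afterwards what each nightly run did (the log path is only an example):
# same rsync job as above, appending stdout and stderr to a log file
30 23 * * * rsync -avhW --no-compress --progress folderA/. folderB/ >> $HOME/rsync_nightly.log 2>&1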

Windows Task Scheduler: SQLCMD command does not return an error when it fails?

I need some help with this.
I've scheduled a task on Windows Task Scheduler, that calls the command SQLCMD using the parameter: -i "path\script.sql"
My problem is this:
My script starts with: USE [DatabaseX]
DatabaseX does not exist on the server, so the script fails.
But the scheduled task ends with a successful result, even though the script fails.
I need to see in the scheduled task (or somewhere else, but somewhere) that the last run failed.
Is this possible?
Thanks,
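One approach, assuming you can change how the task invokes SQLCMD: sqlcmd's -b switch makes it exit with a non-zero ERRORLEVEL when a SQL error occurs, and if the batch file passes that code on, Task Scheduler shows it in the task's Last Run Result. A minimal sketch, keeping the script path placeholder from the question:
rem -b makes sqlcmd return a non-zero exit code when the script raises an error
sqlcmd -b -i "path\script.sql"
rem pass sqlcmd's exit code back to Task Scheduler so a failed run shows up in Last Run Result
exit /b %ERRORLEVEL%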

Bat file to run a sql query on a schedule through Task Scheduler

I am trying to run a .sql script on a schedule. I have created a batch file to run the script. The script runs fine in SQL Server Management Studio and also when I run the batch file contents through cmd.
Contents of the batch file:
sqlcmd -S omfmesql -U OMESRV -P orat -i "\\pvsrv-fsr14\data\Projects\Stat_Table_Creation_unique.sql"
The SQL script is supposed to update a stat table. When I run it through cmd and refresh the stat table, the numbers are updated. But when I run this batch file through Task Scheduler, the only action that seems to be performed is launching C:\Windows\SYSTEM32\cmd.exe.
The task is reported as completed successfully, but the SQL query is just not run.
I am not too experienced with Task Scheduler. Any help here would be very much appreciated. Thanks!
Note: I am not intending to use SQL Server Agent
If you have not done so, you need to set the "Start in" location in Task Scheduler (TS). In at least some versions of TS, this can only be done when you create a basic task, not from the more general "Create Task..." option. Ensure that all the paths in the batch file are absolute or are based in this location.
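A minimal sketch of the batch file with everything fully qualified (the sqlcmd.exe path below assumes a default SQL Server 2012 tools install and may differ on your machine):
rem fully qualified sqlcmd path and an absolute UNC path to the script, so the task does not depend on the working directory or PATH
"C:\Program Files\Microsoft SQL Server\110\Tools\Binn\sqlcmd.exe" -S omfmesql -U OMESRV -P orat -i "\\pvsrv-fsr14\data\Projects\Stat_Table_Creation_unique.sql"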

Oracle EXPDP from remote client

I am running an Oracle expdp process from a remote client using the command line. I can see the process start, and a few progress messages are shown in the command prompt window. But after some time there are no further progress messages. When I checked the log created in the server directory, I could see that the expdp process completed and the .dmp file was generated. What could be the reason for the command prompt on the client not receiving progress updates after some time?
Below is a sample of the expdp command used.
expdp Schemas=XXXX directory=exportdir dumpfile=xxxx.dmp logfile=xxxx.log Job_name=xxxx
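The server log shows the export itself finished, so the client most likely just stopped receiving status messages. One way to check on (or resume watching) a Data Pump job after the client has lost touch with it is to re-attach from the command line; the credentials and job name below are placeholders matching the sample command:
expdp xxxx/password attach=xxxx
At the Export> prompt, STATUS reports the job's current state and CONTINUE_CLIENT resumes logging output to the client.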

Powershell Transcript is empty when running script from SQL Agent Job in 2005 SQL Server

I have a complex PowerShell script that gets run as part of a SQL Server 2005 Agent job. The script works fine, but it uses the "Start-Transcript $strLogfile -Append" command to log all of its actions to a transcript file. The problem is that the transcript is always empty. It adds the header and footer to indicate that the transcript is starting and stopping, but it doesn't actually log anything. Example:
**********************
Windows PowerShell Transcript Start
Start time: 20100304173001
Username : xxxxxxxxxxxx\SYSTEM
Machine : xxxxx-xxx (Microsoft Windows NT 5.2.3790 Service Pack 2)
**********************
**********************
Windows PowerShell Transcript End
End time: 20100304173118
**********************
When I execute the script from a command prompt or Start -> Run, everything works just fine. Here is the command used to run the script (the same command is used in the Operating system (CmdExec) step of the SQL Agent job):
powershell.exe -File "c:\temp\Backup\backup script.ps1"
I first thought it must have something to do with the script running under the System account (default SQL Agent account), but even when I tried changing the SQL Agent to run under my own personal account it still created a blank transcript.
Is there any way to get PowerShell Transcripts to work when executing them as part of a 2005 SQL Server Agent Job?
If your script uses native commands (console exes), Start-Transcript does not log any of their output. This issue has been logged on Connect, where you can vote on it. One way to capture all of the output is to use cmd.exe:
cmd /c powershell.exe -file "C:\temp\backup script.ps1" > backup.log
sqlps.exe does not implement certain host methods, including the one that supports Write-Host. This may explain why you are not seeing output from Start-Transcript when running sqlps.exe from a SQL Agent PowerShell job step. See http://blogs.msdn.com/mwories/archive/2009/09/30/the-use-of-write-host-and-sql-server-agent-powershell-job-steps.aspx for more information.
I am still not sure why the PowerShell transcript is empty, but we found a workaround. Under the CmdExec step of the SQL job there is an advanced option to capture the output to a file, which, combined with the "Append output to existing file" option and a Logfile.rtf extension, is about the same as the PowerShell transcript. This way anything that gets printed to the host from the PowerShell script (including native console executables piped to "| Out-Host") will be captured in the log file.
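For reference, inside the script that approach looks roughly like this (the executable path is just a placeholder, not the actual command from the original script):
# placeholder native executable; 2>&1 merges its error stream and Out-Host sends everything to the host,
# which the CmdExec "output file" option then captures in the log
& "C:\path\to\native.exe" 2>&1 | Out-Host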