Batch file to run a SQL query on a schedule through Task Scheduler

I am trying to run a .sql script on a schedule. I have created a batch file to run the script. The script runs fine in SQL Server Management Studio and also when I run the batch file's contents through cmd.
Contents of the batch file:
sqlcmd -S omfmesql -U OMESRV -P orat -i "\\pvsrv-fsr14\data\Projects\Stat_Table_Creation_unique.sql"
The SQL script is supposed to update a stat table. When I run it through cmd and refresh the stat table, the numbers are updated. But when I run this batch file through Task Scheduler, the only action that seems to be performed is running C:\Windows\SYSTEM32\cmd.exe
The task is reported as completed successfully, but the SQL query is just not run.
I am not too experienced with Task Scheduler. Any help here would be very much appreciated. Thanks!
Note: I am not intending to use SQL Server Agent

If you have not done so, you need to set the start-in location for the task in Task Scheduler (TS). In at least some versions of TS, this can only be done when you create a basic task, not from the more general "Create Task..." option. Ensure that all the paths in the batch file are absolute or are relative to this location.
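For example, a minimal sketch of the batch file with output redirected to a log file, so a silent failure under Task Scheduler at least leaves a trace (the server, login, and script path are from the question; the log path is my own placeholder and assumes C:\Logs exists):

rem Hypothetical logging wrapper: same sqlcmd call as in the question, but
rem stdout and stderr are captured so scheduled-run failures are visible.
sqlcmd -S omfmesql -U OMESRV -P orat -i "\\pvsrv-fsr14\data\Projects\Stat_Table_Creation_unique.sql" > "C:\Logs\stat_update.log" 2>&1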

Related

bcp not running from task scheduler

I am trying to schedule a bcp job in Server 2012 Task Scheduler. My batch file works fine when I double-click it. It includes this line:
bcp "SELECT * FROM [TIME_KEEPER]" queryout D:\DATA\TIMESHEET_DBASE.csv -S 10.0.0.54 /c /t, -T
The output file is created when I run it from the command line. The scheduled task has:
Action: start a program
Script: D:\DATA\myBatch.bat
Start in: D:\Data
I am using the same account for other scheduled tasks and they are running fine.
Sounds like a security issue.
Do any of the other scheduled tasks use the bcp executable and connect to the same server, pulling data from the same table? If not, then you have to track down the security context being used.
When you double-click your batch file, it runs as the account you are logged in as. Is it possible that your scheduled tasks are running as a different account than the one you are logged in as?
As a test, are you able to log in to the Windows server using the same account Task Scheduler uses to execute the tasks (assuming they are different)?
You should get a similar error at that point.
Just a start.
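As a quick check along those lines (my own sketch, not from the original answer), the scheduled batch could log which account it actually runs under, and capture bcp's output instead of letting it vanish (task_debug.log is a placeholder name):

rem Hypothetical diagnostic lines added to myBatch.bat: record the executing
rem account, then append bcp's output and errors to the same log.
whoami > D:\DATA\task_debug.log 2>&1
bcp "SELECT * FROM [TIME_KEEPER]" queryout D:\DATA\TIMESHEET_DBASE.csv -S 10.0.0.54 /c /t, -T >> D:\DATA\task_debug.log 2>&1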

Unable to run .sql file in SQL Server

I have a 20 GB .sql dump file and I am trying to run it in MySQL Workbench using Run Script; after successful execution, I'll migrate the data from MySQL to SQL Server using SSMA. I have migrated data this way many times successfully; however, for a 20 GB file it seems very time-consuming. Please let me know if there is any alternative way to achieve this more quickly. I have followed this link:
Steps to migrate mysql tables to sql server using SSMA!
From your title, "unable to run .sql file in SSMS", and "I have a 20 GB .sql dump file": are you trying to open a 20 GB .sql file in SSMS? That's never going to work. SSMS is a 32-bit application, so the maximum addressable memory is 2 GB. If you want to run your .sql file, I suggest using sqlcmd.
Open up Powershell, and then run the command below replacing the appropriate parts:
sqlcmd -S {Server Name/ServerIP} -U {Your Login} -i {Your full path to your script}
You'll be prompted for your password, and then the file will be run. So, as an example, you might run:
sqlcmd -S svSQL2017 -U Larnu -i \\svFileServer\SQLShare\Scripts\BigBatchFile.sql
If you are using integrated security, then don't pass the -U parameter for the command.
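For instance, a sketch of that integrated-security variant (-E explicitly requests a trusted connection; sqlcmd also defaults to it when -U is omitted):

sqlcmd -S svSQL2017 -E -i \\svFileServer\SQLShare\Scripts\BigBatchFile.sql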
Edit: This answer is not relevant to the OP's question, as they were using "SSMS" as a synonym for SQL Server, which it is not. I have left this here for the moment so the OP can review my comments, and I will likely remove this answer at a later point.

Windows Task Scheduler: SQLCMD command does not return error when fails?

I need some help with this.
I've scheduled a task in Windows Task Scheduler that calls the SQLCMD command with the parameter: -i "path\script.sql"
My problem is this:
My script starts with: USE [DatabaseX]
DatabaseX does not exist on the server, so the script fails.
But the Scheduled Task ends with a Successful result, even if the script fails.
I need to see, in the scheduled task or somewhere else, that the last run failed.
Is this possible?
Thanks,
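One possible approach (my own sketch, not an answer from this thread): sqlcmd's -b switch makes it exit with a nonzero ERRORLEVEL when a SQL error occurs, and a wrapper batch can hand that code back to Task Scheduler, where it shows up as a nonzero Last Run Result (the server name is a placeholder; the script path is from the question):

rem Hypothetical wrapper batch: -b makes sqlcmd return a nonzero exit code
rem on SQL errors; exit /b passes that code back to Task Scheduler.
sqlcmd -S <myServer> -b -i "path\script.sql"
exit /b %ERRORLEVEL%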

Scheduling a Pentaho job in SQL Server Agent

I have built out a simple FTP job in Pentaho that places a file in a local directory. I need to be able to call this job from a SQL Server Agent job, which I can then schedule and use, but when I set the Agent job up it runs through the steps successfully yet does not produce anything to show that it was in fact successful.
I am pretty confident the Pentaho job itself is fine because it can be run through the UI, command line, and .bat file. Everything works as expected except when I try to make this SQL Server Agent job and I have no idea why!
Here is the only step in the job. When I use this, I'm prompted with no errors but nothing actually happens. If I try to enclose it in quotes, I get an error.
Any help would be appreciated
Figured it out!
Apparently, only the first line of the command was executing, so it was navigating to a different directory but not executing any commands. I remedied this by putting everything on one line and joining the commands with &&.
Command line used: cd c:\pentaho\data-integration && kitchen.bat /file:C:\pentaho\Jobs\BW\FTP_BW_TRN.kjb /level:Basic
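For the record, a hedged sketch of how that one-liner could live in a wrapper .bat for the Agent step (cd /d also switches drives; the errorlevel check assumes kitchen.bat sets a nonzero exit code on failure, as PDI's documentation describes):

rem Hypothetical wrapper for the CmdExec job step.
cd /d C:\pentaho\data-integration && kitchen.bat /file:C:\pentaho\Jobs\BW\FTP_BW_TRN.kjb /level:Basic
if errorlevel 1 exit /b 1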

SQL Server 2008 Job based on changing Stored Procedures

I have looked through the SQL Server questions and answers and I didn't see an answer to this one, if it is out there and I've missed it, please let me know.
Here's the situation:
I write stored procedures and views that are then run as reports (using Crystal) - this is not the problem. Before I am able to release the reports into production, I need to have the end users run the reports and check them for errors, etc. In a perfect world, I would have a frozen test environment, but I don't live in a perfect world. Every night, everything I place into my test environment is wiped out, and every morning anything that is in end-user testing needs to be re-added. This means that when I come in, the first thing I do is run all of the stored procedures, along with a script that unhides the reports in the program we use.
What I'd like to be able to do is to write a package that would find all of the stored procedures in a folder and execute them to add them to the database and, then, run the script that unhides the reports.
I know how to set up an SSIS package to run a stored procedure, but I don't know how to set one up that would run an ever changing list of stored procedures. Is this even possible? And, if it is, how do I go about starting this up?
I should note that while I have more than 10 years of query-writing experience, I haven't used VB since VB 6.0 and I am very new to the SSIS and SSRS world.
Thanks in advance!
Good old NT shell will do the trick. Run this statement in the folder containing the files:
for %A in (*.sql) DO sqlcmd -i %A -S <myServer> -d <myDb> -E
if you want to include it in a batch file it could look like
@echo off
for %%A in (*.sql) DO sqlcmd -i %%A -S <myServer> -d <myDb> -E
sqlcmd -i script_to_update_config.sql -S <myServer> -d <myDb> -E
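If execution order matters, here is a hedged variant of the same loop that uses dir /b /o:n to sort the files by name, so scripts with dependencies load predictably:

@echo off
rem Hypothetical ordered variant: dir /b /o:n lists bare file names sorted
rem alphabetically, and for /f feeds each one to sqlcmd in that order.
for /f "delims=" %%A in ('dir /b /o:n *.sql') do sqlcmd -i "%%A" -S <myServer> -d <myDb> -E
sqlcmd -i script_to_update_config.sql -S <myServer> -d <myDb> -E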
This actually sounds like it may be more of a deployment issue than a SQL one. Take a look at Jenkins CI. I believe it's mostly used for code build and deployment, but it can also be used for any automated task.
If you had one SQL file that listed all the changed procs and their associated files, you could use that single script to run all the others: http://www.devx.com/tips/Tip/15132. For that matter, you could just use a scheduled task to run it every morning.
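A sketch of what that single master script might look like (the file names are hypothetical; :r is sqlcmd's include directive):

-- master.sql (hypothetical): run with sqlcmd -S <myServer> -d <myDb> -E -i master.sql
:r .\usp_Report1.sql
:r .\usp_Report2.sql
:r .\script_to_unhide_reports.sql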
Adding one more step, you could build that file based on the contents of a folder (using a little PowerShell script or the like).
I'm not sure an SSIS package is the right tool for this job.