How to check if an SQL Script executed successfully in MS SQL Server? - sql

I have created multiple SQL DB maintenance scripts which I am required to run in a defined order. I have 2 scripts. I want to run the 2nd script only on successful execution of the 1st script. The scripts contain queries that create tables, stored procedures, SQL jobs, etc.
Please suggest an optimal way of achieving this. I am using MS SQL Server 2012.
I am trying to implement it without using an SQL job.

I'm sure I'm stating the obvious, and it's probably because I don't fully understand what you meant by "executed successfully", but if you meant no SQL error while running:
The optimal way to achieve it is to create a job for your scripts, then create two steps - one for the first script and one for the second. Once both steps are there, go to the advanced options of step 1 and set it up to your needs.
(Screenshot of the job step's Advanced options page omitted.)

Can you create a SQL Server Agent job? You could set the steps to be Step 1: run the first script, Step 2: run the second script. In the agent job setup you can decide what to do when step 1 fails: just have it not run step 2. You can also set it up to email you, skip to another step, run some other code, etc. If anything the first script did failed with an error message, your second script would not run. -- If you really need to avoid a job, you could add some IF EXISTS checks to your second script, but that will get very messy very fast.
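For reference, the same two-step job can be scripted through the msdb procedures instead of the Agent UI. A minimal sketch, assuming the scripts live at C:\Scripts\script1.sql and C:\Scripts\script2.sql and the job name is a placeholder; sqlcmd's -b flag makes the step fail on any SQL error:
USE msdb;
GO
EXEC dbo.sp_add_job @job_name = N'Run maintenance scripts in order';
EXEC dbo.sp_add_jobstep
    @job_name  = N'Run maintenance scripts in order',
    @step_name = N'Run script 1',
    @subsystem = N'CmdExec',
    @command   = N'sqlcmd -S . -E -b -i "C:\Scripts\script1.sql"',
    @on_success_action = 3,   -- go to the next step
    @on_fail_action    = 2;   -- quit the job reporting failure (script 2 never runs)
EXEC dbo.sp_add_jobstep
    @job_name  = N'Run maintenance scripts in order',
    @step_name = N'Run script 2',
    @subsystem = N'CmdExec',
    @command   = N'sqlcmd -S . -E -b -i "C:\Scripts\script2.sql"',
    @on_success_action = 1,   -- quit the job reporting success
    @on_fail_action    = 2;
EXEC dbo.sp_add_jobserver @job_name = N'Run maintenance scripts in order';
GO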

If the two scripts are in different files:
Add a statement at the end of the first script that logs its completion and the date into a table. Change the second script to read that table first and exit if the first script did not complete successfully.
If both are in the same file:
Ensure they run in a transaction, read @@TRANCOUNT at the start of the second script, and exit if it is less than 1.
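A minimal sketch of both checks, with assumed table and script names (dbo.ScriptRunLog and 'Script1' are placeholders):
-- End of script 1: record successful completion.
IF OBJECT_ID('dbo.ScriptRunLog') IS NULL
    CREATE TABLE dbo.ScriptRunLog (ScriptName sysname, CompletedAt datetime2 DEFAULT SYSDATETIME());
INSERT INTO dbo.ScriptRunLog (ScriptName) VALUES (N'Script1');
GO

-- Start of script 2: bail out unless script 1 has logged a completion.
IF NOT EXISTS (SELECT 1 FROM dbo.ScriptRunLog WHERE ScriptName = N'Script1')
BEGIN
    RAISERROR(N'Script 1 has not completed successfully; aborting.', 16, 1);
    RETURN;
END;
GO

-- Same-file variant: both parts run inside one transaction, and the second
-- part checks that the transaction opened by the first part is still open.
IF @@TRANCOUNT < 1
BEGIN
    RAISERROR(N'No open transaction from the first part; aborting.', 16, 1);
    RETURN;
END;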

SQL Server 2005's job scheduling subsystem, SQL Server Agent, maintains a set of log files with warning and error messages about the jobs it has run, written to the %ProgramFiles%\Microsoft SQL Server\MSSQL.1\MSSQL\LOG directory. SQL Server will maintain up to nine SQL Server Agent error log files. The current log file is named SQLAGENT.OUT, whereas archived files are numbered sequentially. You can view SQL Server Agent logs by using SQL Server Management Studio.

Related

Run multiple SQL scripts with Oracle SQL Developer

Right now I'm dropping, creating and then running scripts for several databases several times each day. It's getting a little tedious.
I have simple scripts for dropping and creating databases, and additional scripts that write data to the DBs.
I run these scripts through Oracle SQL Developer. Is there a way I can run all of these scripts at the same time, like in a batch file or another tool? I.e.:
Drop existing DBs
Create DBs
Run scripts for DBs
I haven't been able to figure it out.
You can create a scheduled job in SQL Developer. See the steps below to create a job:
1- From your connection in SQL Developer, select Scheduler, then right-click and select "New Job ...".
2- In the window that opens, select "PL/SQL Block" for "Type of Job", then write a block just like the one in the picture (not reproduced here; a rough equivalent is sketched after this list). Then select "Repeating" and enter the interval and the start and end dates.
3- Keep in mind that the "Enable" box should be checked.
4- Click "Apply". Your job will be run based on the interval and start date you entered.
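The dialog ultimately creates a DBMS_SCHEDULER job, so the same thing can be done directly in a worksheet. A minimal sketch, assuming the drop/create/load work has been wrapped in a procedure named refresh_dbs (the procedure name, job name and schedule are placeholders):
BEGIN
  DBMS_SCHEDULER.CREATE_JOB (
    job_name        => 'REFRESH_DBS_JOB',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN refresh_dbs; END;',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY; BYHOUR=6',   -- run once a day at 06:00
    enabled         => TRUE,
    comments        => 'Drops, recreates and reloads the databases');
END;
/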

Run MS SQL server script on startup

I am trying to run an SQL script when I start (or restart) my windows 2012 R2 server instance (Google Cloud Server). I am doing so using an SQL script, a Batch-file and the task-scheduler.
For the sake of testing I have created a simple SQL-script that adds a datestamp to a table:
USE <Databasename>
GO
INSERT INTO testingTable(time_Stamp)
VALUES (GETDATE())
SELECT * FROM testingTable
(where Databasename obviously contains the name of the specific database)
The batch-file looks as follows:
sqlcmd -S <servername> -i "C:\Temp\testQuery.sql" > C:\Temp\output.txt
I am outputting everything to a text-file. When I run the Batch-file the output looks fine: it prints a list with all the times I have run this SQL-query and saves it in the text-file.
I have scheduled this task to run on startup (following the steps here: https://www.sevenforums.com/tutorials/67503-task-create-run-program-startup-log.html). I have tried a whole range of settings here but nothing seems to work, including the exact settings as highlighted in the forum.
When I now restart the server the output file shows the following error message:
Msg 904, Level 16, State 3, Server <servername>, Line 1
Database 7 cannot be autostarted during server shutdown or startup.
Msg 208, Level 16, State 1, Server <servername>, Line 2
Invalid object name 'testingTable'.
It seems like MS SQL does not allow scripts to be run before you log in to one of the user accounts.
The problem really is that the actual SQL tasks that I want to run have to run very early in the morning, so that they are done when everyone arrives at the office. I have managed to automate the startup of the server using VMPower, but I cannot automate logging in to one of the accounts.
I was hoping someone could give me some feedback on how to resolve this issue. Preferably I would want my current setup to work, but if anyone has an idea on how to automate logging in to an account on an existing google cloud server instance that would be really helpful as well.
Thank you,
Joost
SQL Server offers the system stored procedure sp_procoption which can be used to designate one or more stored procedures to automatically execute when the SQL Server service is started.
For instance, you may have an expensive query in your database which takes some time to run at first execution. Using sp_procoption, you could run this query at server startup to pre-compile the execution plan so one of your users does not become the unfortunate soul of being first to run this particular query. I've used this feature to set up the automatic execution of a Profiler server side trace which I've scripted. The scripted trace was made part of a stored procedure that was set to auto execute at server start up.
EXEC sp_procoption @ProcName = N'stored procedure name',
    @OptionName = 'startup',
    @OptionValue = 'on';   -- or 'off'
Read more: Automatically Running Stored Procedures at SQL Server Startup.
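A minimal end-to-end sketch for the timestamp example from the question; the procedure name is a placeholder, <Databasename> is the question's own placeholder, and note that a startup procedure has to live in the master database:
-- Allow startup procedures to run (advanced option).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'scan for startup procs', 1;
RECONFIGURE;
GO

-- Startup procedures must be created in master.
USE master;
GO
CREATE PROCEDURE dbo.usp_LogStartupTime
AS
BEGIN
    INSERT INTO <Databasename>.dbo.testingTable (time_Stamp)
    VALUES (GETDATE());
END;
GO

-- Mark the procedure to run automatically when the service starts.
EXEC sp_procoption @ProcName = N'dbo.usp_LogStartupTime',
                   @OptionName = 'startup',
                   @OptionValue = 'on';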
Docker
For a solution for the MSSQL Docker image, see: SQL Server Docker container is stopping after setup.

How to execute a SQL script if a SQL Job step fails

We have got a SQL job that consists of several steps. Currently, the On Failure property is set to Quit the job reporting failure, which only notifies the relevant parties of the failed step. Is it possible to execute a SQL script if the step fails (for further processing etc.) as well as sending the notification? We are using SSMS 2014.
You can create another step that will be executed whenever another step fails.
In this new step, simply put your Transact-SQL (T-SQL) script.
Then, on this new step, whether it fails or not: Quit the job reporting failure.
Make sure no step falls into this step on success.
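The same wiring can be scripted against msdb. A minimal sketch, assuming the job is called 'Nightly job', step 1 is the step that can fail, step 5 is the new error-handling step, and dbo.usp_HandleJobFailure is a placeholder for the further-processing script:
USE msdb;
GO
-- Add the error-handling step at the end of the job.
EXEC dbo.sp_add_jobstep
    @job_name  = N'Nightly job',
    @step_name = N'On-failure processing',
    @step_id   = 5,
    @subsystem = N'TSQL',
    @database_name = N'YourDatabase',
    @command   = N'EXEC dbo.usp_HandleJobFailure;',
    @on_success_action = 2,   -- quit the job reporting failure
    @on_fail_action    = 2;   -- quit the job reporting failure either way

-- Point the step that can fail at the error-handling step.
EXEC dbo.sp_update_jobstep
    @job_name  = N'Nightly job',
    @step_id   = 1,
    @on_fail_action  = 4,     -- go to step ...
    @on_fail_step_id = 5;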

Oracle DB function log file

I executed a script in Unix that called a function in an Oracle DB. I didn't give the logfile information for the Unix script. Usually, when I run a script to call a DB function, I give the script a logfile and monitor the Unix log file to know whether the function is still running or is done. The logfile also records whether the function executed successfully or not.
I have following concerns, based on above situation:
Can I monitor whether the function is still running using Oracle SQL Developer?
Can I know whether the function executed successfully in the Oracle DB or not? If Oracle saves a log of function executions and I could access that, it would be great.
Thank You
Yes, you can monitor if the function is still running by checking the session's status in v$session. See this answer for information on how: How to list active / open connections in Oracle?
As for what the execution result was... probably not.
The PL/SQL you executed won't directly appear in dba_audit_trail, but any queries it ran as part of execution might. The audit trail will show if the queries were successful or not, but it won't show the query results or the final result of the function execution.
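A minimal sketch of the v$session check (the username filter is a placeholder, and querying v$session requires the appropriate privileges):
-- Sessions for a given user that are actively executing a call right now.
SELECT sid,
       serial#,
       username,
       status,         -- ACTIVE while a call is executing
       sql_id,         -- currently running statement, if any
       last_call_et    -- seconds since the current call started
  FROM v$session
 WHERE username = 'YOUR_SCHEMA'
   AND status   = 'ACTIVE';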

SQL Server PRINT and Messages export to a .txt file

When executing IF/THEN queries in SQL Server I am using a PRINT statement to let the user/myself know what has happened.
Also, when I run a query, SQL Server mentions how many rows have been affected.
After searching I have only come across the MySQL approach of exporting output via SELECT ... INTO OUTFILE. Does SQL Server have a way to send the PRINT statements and/or messages to a .txt file for logging?
If you run your script as a SQL Agent job, you can specify an output file on the Advanced settings of a step.
You don't have to use SQL Agent as a scheduler; you can run the job manually, or use sp_start_job.
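A minimal sketch of that setup in T-SQL (job name, database, command and file path are placeholders); the step's @output_file_name captures what would appear in the Messages tab, including PRINT output and row counts:
USE msdb;
GO
EXEC dbo.sp_add_job @job_name = N'Log PRINT output';
EXEC dbo.sp_add_jobstep
    @job_name  = N'Log PRINT output',
    @step_name = N'Run script',
    @subsystem = N'TSQL',
    @database_name = N'YourDatabase',
    @command   = N'PRINT ''Starting...''; -- your IF/ELSE logic here',
    @output_file_name = N'C:\Temp\print_output.txt';
EXEC dbo.sp_add_jobserver @job_name = N'Log PRINT output';
GO
-- Run it on demand rather than on a schedule.
EXEC dbo.sp_start_job @job_name = N'Log PRINT output';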