I need to run a batch file against one of my database tables. The best solution for me would be to run it from a trigger. I'd like to know whether that's a good idea or not, and in case it isn't, what other solutions could I implement?
- I need to run it every time there's an insert or update.
- My .bat file takes 20-30 seconds to finish.
- The table gets around 10 inserts a day.
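A 20-30 second external call inside a trigger would hold up every INSERT and UPDATE (and its transaction) for that long, so the usual alternative is to have the trigger merely record that work is pending and let a scheduled job run the .bat file asynchronously. A rough sketch of that pattern; all the table and trigger names here are invented, since the original post doesn't give any:

```sql
-- Queue table that the trigger writes to; a SQL Agent job polls it
-- periodically and runs the .bat file outside the transaction.
CREATE TABLE dbo.BatchQueue (
    Id        INT IDENTITY PRIMARY KEY,
    QueuedAt  DATETIME NOT NULL DEFAULT GETDATE(),
    Processed BIT      NOT NULL DEFAULT 0
);
GO

CREATE TRIGGER trg_MyTable_QueueBatch
ON dbo.MyTable              -- stand-in for the table from the question
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- Keep the trigger fast: just note that the batch needs to run.
    INSERT INTO dbo.BatchQueue DEFAULT VALUES;
END
```

With only ~10 inserts a day, a job that checks the queue every minute or so would add negligible load while keeping the inserts themselves instant.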
I have a stored procedure that I run in a new Microsoft SQL Server query window, and its duration is very short; it only takes a few seconds. But when I copy and paste the query into a job, the execution time grows for no apparent reason.
I have tried adding WITH RECOMPILE to the stored procedure, but the same thing still happens.
The stored procedure just copies the information from one table to another; it's very simple.
I need to put it in a job because I want this copy to be done every so often, but with such a long run time I don't see it as feasible.
Thank you very much for your help in advance.
Check your query execution plan; it looks like the execution is going through full table scans or something like that.
The other thing to check is whether the indexes on the columns you are targeting are properly maintained.
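To follow that advice concretely, you can turn on I/O statistics (or enable the actual execution plan in Management Studio) and inspect the indexes involved. A minimal sketch; the table names are placeholders, not from the original question:

```sql
-- Show I/O and timing statistics for the copy statement.
SET STATISTICS IO, TIME ON;

-- Hypothetical stand-in for the procedure's copy logic.
INSERT INTO dbo.TargetTable (Id, Value)
SELECT Id, Value
FROM dbo.SourceTable;

SET STATISTICS IO, TIME OFF;

-- List the indexes currently defined on the source table.
EXEC sp_helpindex 'dbo.SourceTable';
```

A table scan in the plan, or a high logical-reads count in the statistics output, would point at a missing or stale index.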
I'm relatively new to this.
I have a stored procedure that should run when a job is triggered. It all worked fine with the files containing test data that I used for import and testing (Excel sheets). I got a new file to test my solution with before deploying, but after executing the job with the new file, the stored procedure just keeps running without getting anything done.
I tried to delete the (Excel) file and start again, but it says it's open in another program (it isn't). I then noticed that any time I try to perform a simple select on one of the tables used in the stored procedure, it just keeps loading and never finishes. None of the simple commands work.
I've tried this:
SELECT * FROM Cleaned_Rebate
SELECT TOP 1 * FROM Cleaned_Rebate
TRUNCATE TABLE Cleaned_Rebate
DELETE FROM Cleaned_Rebate
SELECT COUNT(*) FROM Cleaned_Rebate
I also tried to create a new stored procedure identical to the original one, but the CREATE query just never executes; it keeps on loading. It only creates the new one if I remove 90% of the code.
I can't even execute the stored procedure for the sake of saving it (F5) with just an added comment...
I don't know what is wrong, why, or what I should do to fix this. Does anyone have an idea of what could be wrong? Any advice is appreciated!
I want to add that these aren't big tables; some of them should even be empty. The data sets aren't large either (about 100-300 rows?).
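Symptoms like this (every query against the table hangs, even a SELECT TOP 1) usually mean the table is locked by another session, for example an import transaction that never committed. Assuming only the standard system views, here is a sketch of how to find the blocker:

```sql
-- Show blocked sessions and which session is blocking them.
SELECT r.session_id,
       r.blocking_session_id,  -- the session holding the lock
       r.wait_type,
       r.wait_resource,
       r.command
FROM sys.dm_exec_requests AS r
WHERE r.blocking_session_id <> 0;

-- If the blocker turns out to be an abandoned session,
-- it can be ended with: KILL <blocking_session_id>;
```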
I've been experiencing a strange situation with a specific SELECT on SQL Server 2008 Standard.
I have a proc that executes a bunch of commands.
One of the last tasks is to insert the result of a select statement into a table. This proc is executed for each step of a job (with 20 steps) just changing one parameter, store code (the company has 20 stores).
This insert-select command normally executes in 2 minutes, but sometimes, it gets stuck with an SOS_SCHEDULER_YIELD lastwaittype and hangs forever... until I kill the process. After that, if I execute the same command, it executes in the normal 2 minutes.
I noticed that it doesn't matter if the CPU has 99% usage or 1%, or if there are other processes executing at the same time. I didn't find any pattern with these events. It just happens sometimes (almost every day with one or two steps).
Has anyone experienced this? I don't know what to do; I made a job to kill the process and execute it again if it hangs in SOS_SCHEDULER_YIELD for a long time, but I'd like to have a solution.
Thanks.
We have many SQL Server scripts. But there are a few critical scripts that should only be run at certain times under certain conditions. Is there a way to protect us from ourselves with some kind of popup warning?
i.e. When these critical scripts are run, is there a command to ask the user if they want to continue?
(We've already made some rollback scripts to handle these cases, but it's better if they're not accidentally run at all.)
No, there is no such thing.
You can write an application (windows service?) that will only run the scripts as and when they should be.
The fact that you are even asking the question shows that this is something that should be automated; the sooner the better.
You can mitigate the problem in the meantime by using IF to test for these conditions and only executing if they are met. If this is a series of scripts, you should wrap them in transactions to boot.
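A sketch of that guard-plus-transaction pattern; the precondition table is invented for illustration:

```sql
-- Only run the critical statements when the precondition holds,
-- and wrap them in a transaction so a failure rolls back cleanly.
IF EXISTS (SELECT 1 FROM dbo.DeploymentWindow WHERE IsOpen = 1)  -- hypothetical condition
BEGIN
    BEGIN TRANSACTION;
    BEGIN TRY
        -- ... the critical statements go here ...
        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        ROLLBACK TRANSACTION;
        THROW;
    END CATCH
END
ELSE
    RAISERROR('Preconditions not met; script skipped.', 16, 1);
```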
One workaround you can use is the following, which requires you to update a value in another table:
CREATE PROC dbo.MyProc
AS
    -- Spin until another session sets the go flag.
    WHILE (SELECT GoBit FROM dbo.OKToRun) = 0
    BEGIN
        RAISERROR('Waiting for GoBit to be set!', 0, 1)
        WAITFOR DELAY '00:00:10'
    END

    -- Reset the flag so the next run waits again.
    UPDATE dbo.OKToRun
    SET GoBit = 0

    ... DO STUFF ...
This requires you to update that table manually, from another spid or session, before it will proceed.
This gets a lot more complicated with multiple procedures, so it will only work as a very short-term workaround.
SQL is a query language; it does not have the ability to accept user input.
The only thing I can think of would be to make it variable driven: the first part should set @shouldRunSecond = 1, and the second part should be wrapped in a
IF @shouldRunSecond = 1
BEGIN
    ...
END
The second portion will be skipped if it's not desired.
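Put together, the pattern looks roughly like this (the flag name and the guarded work are illustrative):

```sql
-- Part one decides whether part two is allowed to run.
DECLARE @shouldRunSecond BIT = 0;

-- ... first part of the script; set the flag only on success ...
SET @shouldRunSecond = 1;

-- Part two executes only if the flag was set above.
IF @shouldRunSecond = 1
BEGIN
    PRINT 'Running second portion...';
    -- ... second part of the script ...
END
```

Note that variables do not survive a GO batch separator, so both parts have to live in the same batch.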
The question is: where are these scripts located?
If you have them as a .sql file that you open every time before you run it, then you can simply add some "magic numbers" at the beginning of the script that you have to calculate every time before running it. In the example below, each time before you run your script you have to put the correct day and minute into the IF condition, otherwise the script will not run:
IF DATEPART(dd, GETDATE()) != 5 OR DATEPART(mi, GETDATE()) != 43
BEGIN
    RAISERROR ('You have tried to accidentally run your dangerous script!!!', 16, 1);
    RETURN;
END

-- Some dangerous actions
DROP DATABASE MostImportantCustomer;
UPDATE Personal SET Bonus = 0 WHERE UserName = SUSER_SNAME();
If your scripts reside in a stored procedure, you can add some kind of "I am sure, I know what I am doing" parameter that you always have to pass; for example, the minute multiplied by the day.
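A sketch of that variant; the procedure and parameter names are invented:

```sql
CREATE PROCEDURE dbo.DangerousCleanup
    @IAmSure INT  -- must equal the current minute multiplied by the day
AS
BEGIN
    IF @IAmSure <> DATEPART(mi, GETDATE()) * DATEPART(dd, GETDATE())
    BEGIN
        RAISERROR('Safety check failed: pass minute * day to confirm.', 16, 1);
        RETURN;
    END

    -- ... the dangerous actions go here ...
END
```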
Hope it helps.
I have seen batch scripts containing SQLCMD ..., so instead of running the .sql script from code or Management Studio, you could add a prompt in the script.
I have (on limited occasion) created an #AreYouSure parameter that must be passed into a stored procedure, then put comments next to the declaration in the stored procedure explaining the danger of running said procedure.
At least that way, no randos will wander into your environment and kick off stored procedures when they don't understand the consequences. The parameter could be worked into an IF statement that checks its value, or it doesn't really have to be used at all; but if it must be passed, then they at least have to figure out what to pass.
If you use this too much, though, others may just start passing a 'Y' or a 1 into every stored procedure without reading the comments. You could switch up the datatypes, but at some point it becomes more work to maintain this scheme than it is worth. That is why I use it on limited occasion.
I have a SQL Script that inserts about 8000 rows into a TABLE variable.
After inserting into that variable, I use a WHILE loop to loop over that table and perform other operations. That loop is perhaps 60 lines of code.
If I run the TABLE variable insert part of the script, without the while loop, it takes about 5 seconds. That's great.
However, if I run the entire script, it takes about 15 minutes.
Here's what is interesting and what I can't figure out:
When I run the entire script, I don't see any print statements until many minutes into the script.
Then, once it figures out what to do (presumably), it runs the inserts into the table var, does the loop, and that all goes rather fast.
Then, toward the end of the loop, or even after it, it sits and hangs for many more minutes. Finally, it chugs through the last few lines or so of the script that come after the loop.
I can account for all the time taken during the insert, and then all the time taken in the loop. But I can't figure out why it appears to be hanging for so many minutes before and at the end of the script.
For kicks, I added a GO statement after the insert into the table variable, and everything up to that point ran as you'd expect; however, I can't do that because I need that variable, and the GO statement obviously kills it.
I believe I'm going to stop using the table variable and go with a real table so that I can issue the GO, but I would really like to know what's going on here.
Any thoughts on what SQL Server is up to during that time?
Thanks!
You can always check what a script is doing from the Activity Monitor or from the sys.dm_exec_requests view. The script will be blocked by something, and you'll be able to see what is blocking it in the wait_type and wait_resource columns.
There are several likely culprits, like waiting on row locks or table locks, but from the description of the problem I suspect a database or log growth event. Those tend to be very expensive once the database is big enough, and the default 10% increase means growth of GBs. If that's the case, try to pre-size the database at the required size and make sure Instant File Initialization is enabled for the data files.
PRINTs are buffered, so you can't judge performance from them.
Use RAISERROR ('Message', 0, 1) WITH NOWAIT to see the output immediately.
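For example, inside a long-running loop (the loop body is illustrative):

```sql
DECLARE @i INT = 0;

WHILE @i < 5
BEGIN
    -- Unlike PRINT, this message is flushed to the client immediately.
    RAISERROR('Processing batch %d...', 0, 1, @i) WITH NOWAIT;

    -- ... the work for this iteration ...
    SET @i += 1;
END
```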
To understand what the process is doing, I'd begin with calling sp_who2 a few times and looking at the values for the process of interest: isn't it being blocked, what are the wait types if any, and so on. Also, just looking at the server hardware load (CPU, disk activity) might help (unless there're other active processes).
And please post some code. Table var definition and the loop will be enough, I believe, no need for INSERT stuff.
If you are using the table variable, can you try substituting it with a temp table and see if there is any change in performance?
And if possible, please post the code so that it can be analysed for possible areas of interest.
From the wording of your question, it sounds like you're using a cursor to loop through the table. If this is the case, issuing a "SET NOCOUNT ON" command before starting the loop will help.
The table variable was mentioned in a previous answer, but as a general rule, you should really use a temp table if you have more than a few rows.
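A minimal sketch of that substitution; the column names and source table are invented:

```sql
-- Table variable: no column statistics, so the optimizer
-- tends to assume very few rows.
DECLARE @Rows TABLE (Id INT PRIMARY KEY, Payload VARCHAR(100));

-- Temp table: gets real statistics, which usually helps
-- once you are dealing with thousands of rows.
CREATE TABLE #Rows (Id INT PRIMARY KEY, Payload VARCHAR(100));

INSERT INTO #Rows (Id, Payload)
SELECT Id, Payload
FROM dbo.SourceData;      -- hypothetical source of the ~8000 rows

-- ... the WHILE loop and later processing run against #Rows ...

DROP TABLE #Rows;
```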