I have a SQL Server job that runs every night. It truncates a table, runs a query that pivots counts from another table into columns, and inserts the results into the truncated table.
I have gone through quite a few iterations of this job already, and this is what I've settled on so far. I initially used a MERGE statement, which is much cleaner, but unfortunately I discovered that MERGE isn't available in SQL Server 2005. Next, I made a bastardized version of the code with INSERT and UPDATE and some choice IF statements. This bastardized version appeared to work, but I soon discovered that the update portion wasn't working. I then decided to cut out the update and just truncate and insert again each morning. This worked great for what I needed, until I discovered that the job was inserting duplicate records into the table. To combat this, I created a primary key on the table. I ran the job manually and the table came out fine.
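To give a rough idea of the shape of the job, here is a simplified sketch with made-up table and column names, assuming the counts are pivoted with CASE expressions and grouped on the key column:

-- Simplified sketch with made-up names; the real job follows the same truncate-then-insert shape
TRUNCATE TABLE dbo.ReportCounts;

INSERT INTO dbo.ReportCounts (CustomerID, TypeA_Count, TypeB_Count)
SELECT  s.CustomerID,
        SUM(CASE WHEN s.RecordType = 'A' THEN 1 ELSE 0 END) AS TypeA_Count,
        SUM(CASE WHEN s.RecordType = 'B' THEN 1 ELSE 0 END) AS TypeB_Count
FROM    dbo.SourceData AS s
GROUP BY s.CustomerID;  -- one row per key, so the primary key should never be violated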
To automate this process I turned the truncate/insert into a SQL Server Agent job that runs every morning before work hours, and life was good. It turns out that the next day the job failed to complete because of a primary key violation: the code is still trying to insert a duplicate. If I run the code manually through SQL Server Agent I receive no errors, but if I let the job run on its normal morning schedule, it fails.
Is there anything in SQL Server Agent that would cause SQL Server to behave differently or process the code differently? I don't understand how the system can fail to run this code automatically when I can run it manually with the same tool and it works fine.
Try changing the time of day the job runs and see if that helps. It could be an issue with something else that just happens to be running at the same time.
We have jobs in SQL Server Agent running on schedules. We know of at least one job (scheduled every day at three different times: morning, afternoon, night) which rarely (roughly once every few months) fires twice at the same time, as if in parallel, even though it is only scheduled once for that moment. The job runs a CLR stored procedure, which launches a console application that processes some incoming data and recalculates results.
This behaviour causes problems for us.
Why does this happen? Where is the problem, and what is the cause?
Thank you for any ideas.
(Of course, we should handle it and prevent this behaviour.)
I found an error in the job history matching the time of the job execution: error 6535.
This error appears twice for this job iteration, first without an additional message and then with the additional message "The step succeeded." When I looked for information about error 6535, I found that it occurs on SQL Server 2005, but according to SELECT @@VERSION the customer is using SQL Server 2008 R2.
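In case it is useful, the duplicate executions should also be visible in the Agent history tables; a quick sketch, where 'MyJob' is just a placeholder for the real job name:

-- 'MyJob' is a placeholder for the real job name
SELECT  j.name,
        h.step_id,
        h.step_name,
        h.run_date,      -- yyyymmdd as an integer
        h.run_time,      -- hhmmss as an integer
        h.run_status,    -- 0 = failed, 1 = succeeded
        h.message
FROM    msdb.dbo.sysjobhistory AS h
JOIN    msdb.dbo.sysjobs       AS j ON j.job_id = h.job_id
WHERE   j.name = N'MyJob'
ORDER BY h.run_date DESC, h.run_time DESC;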
I have created multiple SQL database maintenance scripts which I am required to run in a defined order. I have two scripts, and I want to run the second script only on successful execution of the first. The scripts contain queries that create tables, stored procedures, SQL jobs, etc.
Please suggest an optimal way of achieving this. I am using MS SQL Server 2012.
I am trying to implement it without using an SQL job.
I'm sure I'm stating the obvious, and it's probably because I don't fully understand what you meant by "executed successfully", but if you meant no SQL error while running:
The optimal way to achieve it is to create a job for your scripts and then create two steps, one for the first script and one for the second. Once both steps are there, go to the advanced options of step 1 and set it up to your needs.
Can you create a SQL Server Agent job? You could set the steps to be Step 1: run the first script; Step 2: run the second script. In the job setup you can decide what to do when step 1 fails: just have it not run step 2. You can also set it up to email you, skip to another step, run some other code, and so on. If anything the first script did failed with any error message, your second script would not run. If you really need to avoid a job, you could add some IF EXISTS checks to your second script, but that will get very messy very fast.
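If a job does turn out to be acceptable, the same on-failure behaviour can be scripted rather than clicked through; a rough sketch, where the job name and the @command values are placeholders for however your scripts are actually invoked:

-- Sketch only: job name, step names, and @command values are placeholders
EXEC msdb.dbo.sp_add_job
     @job_name = N'Run maintenance scripts';

EXEC msdb.dbo.sp_add_jobstep
     @job_name          = N'Run maintenance scripts',
     @step_name         = N'Script 1',
     @subsystem         = N'TSQL',
     @command           = N'EXEC dbo.MaintenanceScript1;',  -- placeholder
     @on_success_action = 3,   -- go to the next step
     @on_fail_action    = 2;   -- quit the job reporting failure

EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Run maintenance scripts',
     @step_name = N'Script 2',
     @subsystem = N'TSQL',
     @command   = N'EXEC dbo.MaintenanceScript2;';          -- placeholder

EXEC msdb.dbo.sp_add_jobserver
     @job_name = N'Run maintenance scripts';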
If the two scripts are in different files:
At the end of the first script, add a statement that logs the completion and date into a table. Change the second script to read this table first and exit if it doesn't find a successful completion.
If both are in the same file:
Ensure they run in a transaction, read @@TRANCOUNT at the start of the second script, and exit if it is less than 1.
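A rough sketch of the first variant, using a hypothetical ScriptRunLog table; the second script bails out unless the first one has logged a completion for today:

-- Hypothetical log table and names
-- End of script 1: record that it finished
INSERT INTO dbo.ScriptRunLog (ScriptName, CompletedAt)
VALUES (N'Script1', GETDATE());

-- Start of script 2: bail out unless script 1 logged a completion today
IF NOT EXISTS (SELECT 1
               FROM dbo.ScriptRunLog
               WHERE ScriptName  = N'Script1'
                 AND CompletedAt >= CAST(GETDATE() AS date))
BEGIN
    RAISERROR(N'Script 1 has not completed successfully; aborting.', 16, 1);
    RETURN;
END;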
SQL Server 2005’s job scheduling subsystem, SQL Server Agent, maintains a set of log files with warning and error messages about the jobs it has run, written to the %ProgramFiles%\Microsoft SQL Server\MSSQL.1\MSSQL\LOG directory. SQL Server will maintain up to nine SQL Server Agent error log files. The current log file is named SQLAGENT.OUT, whereas archived files are numbered sequentially. You can view SQL Server Agent logs by using SQL Server Management Studio.
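If you prefer T-SQL to the Log File Viewer, the undocumented but widely used xp_readerrorlog procedure can also read the Agent log; parameter support varies slightly by version, so treat this as a sketch:

-- 0 = the current log, 2 = the SQL Server Agent log (1 would be the SQL Server error log)
EXEC master.dbo.xp_readerrorlog 0, 2;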
I use an application connected to a SQL database. Using the Profiler, I found that the application runs an UPDATE query with a syntax error. I don't have access to the application's source code. The result is that the record is not updated. Is there a way to modify the query every time it is executed, with something like a trigger? I can't use INSTEAD OF because no record is ever updated or inserted.
This answer
https://stackoverflow.com/a/3319031/1359088
suggests a way to log all the errors to a text file. You could write a little utility and schedule it to run every hour or so to read through this log, find the erroneous SQL statements, fix them, and then run them itself.
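As an alternative to the text-file approach, and assuming the server is SQL Server 2008 or later, an Extended Events session can capture the failing statement text directly; a rough sketch:

-- Sketch: capture the text of statements that raise errors (SQL Server 2008+ only)
CREATE EVENT SESSION CaptureFailedStatements ON SERVER
ADD EVENT sqlserver.error_reported (
    ACTION (sqlserver.sql_text, sqlserver.client_app_name)
    WHERE severity >= 11              -- real errors only, not informational messages
)
ADD TARGET package0.ring_buffer;      -- or package0.event_file for a persistent trace

ALTER EVENT SESSION CaptureFailedStatements ON SERVER STATE = START;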
Given an entry in a sysdtslog90 table in SQL Server 2005 from an SSIS package, how do you figure out which SQL Server Agent job the entry was run under?
It looks like, after many tests and examination of the generated data, you should be able to piece this information together for your environment(s) from a mix of the computer, operator, and source columns, and maybe the sourceid if that remains consistent between restarts of SQL Agent. Deriving it by correlating job run times with the starttime and endtime columns is also possible, and even more complex options could be built by setting a value in the datacode column (however that may be done).
I no longer have anything to do with DTS packages, so I don't have (and can't generate) sample data to provide a better answer. (I'd have made this a comment, but no one else has replied yet, so I'm making it a bit more formal.)
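That said, an untested sketch of the time-correlation idea might look something like this; msdb.dbo.agent_datetime is an undocumented helper, so if it isn't available the run_date/run_time integers have to be converted by hand:

-- Untested: match each package log row to the Agent job run whose window contains its starttime
SELECT  l.source,
        l.event,
        l.starttime,
        j.name AS job_name,
        msdb.dbo.agent_datetime(h.run_date, h.run_time) AS job_start
FROM    dbo.sysdtslog90        AS l              -- in whatever database the log provider writes to
JOIN    msdb.dbo.sysjobhistory AS h
          ON l.starttime >= msdb.dbo.agent_datetime(h.run_date, h.run_time)
         AND l.starttime <  DATEADD(SECOND,
                                    (h.run_duration / 10000) * 3600
                                  + (h.run_duration / 100 % 100) * 60
                                  + (h.run_duration % 100),
                                    msdb.dbo.agent_datetime(h.run_date, h.run_time))
JOIN    msdb.dbo.sysjobs       AS j ON j.job_id = h.job_id
WHERE   h.step_id = 0;   -- step 0 is the overall job-outcome row, which spans the whole run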
I've an SSIS package that runs a stored proc to export to an Excel file. Everything worked like a champ until I needed to do a bit of rewriting on the stored proc. The proc now takes about a minute to run and the exported columns are different, so my problems are the following:
1) SSIS complains when I hit the preview button "No column information returned by command"
2) It times out after about 30 seconds.
What I've done:
Tried to clean up and optimize the query. That helped a bit, but it's still doing some major calculations, and it runs just fine in SSMS.
Changed the timeout values to 90 seconds. Didn't seem to help. Maybe someone here can?
Thanks,
Found this little tidbit which helped immensely.
No Column Names
Basically all you need to do is add the following to your SQL query text in SSIS.
SET FMTONLY OFF
SET NOCOUNT ON
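Put together, with a made-up procedure name, the SQL command text in the SSIS source ends up looking something like this:

-- The whole SQL command text in the SSIS source; the procedure name is a placeholder
SET FMTONLY OFF;   -- makes SSIS actually execute the proc when it asks for metadata
SET NOCOUNT ON;    -- suppresses "rows affected" messages that can confuse the provider
EXEC dbo.usp_ExportReport;   -- placeholder for the real stored procedure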
Only problem now is it runs slow as molasses :-(
EDIT: It's running just too damn slow.
Changed from using a #tempTable to a regular table and added the appropriate DROP statements. Argh...
Although it appears you may have answered part of your own question, you are probably getting the "No column information returned by command" error because the table doesn't exist at the time it tries to validate the metadata. Creating the tables as non-temporary tables resolves this issue.
If you insist on using temporary tables, you can create the temporary table in the step preceding the data flow. You would need to create it as a ## (global) table and turn off connection sharing for the connection for this to work, but it is an alternative to creating permanent tables.
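For example, the preceding Execute SQL Task could create the global temp table with something like this (table and column names are made up); the data flow and the stored proc then use ##ExportData instead of a session-local #temp table:

-- Run in an Execute SQL Task before the data flow; table and columns are made up
IF OBJECT_ID('tempdb..##ExportData') IS NOT NULL
    DROP TABLE ##ExportData;

CREATE TABLE ##ExportData
(
    CustomerID  int            NOT NULL,
    ExportValue decimal(18, 2) NULL
);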
A shot in the dark based on something obscure I hit years ago: When you modified the procedure, did you add a call to a second procedure? This might mess up SSIS's ability to determine the returned data set.
As for (2), does the procedure take 30+ or 90+ seconds to run in SSMS? If not, do you know that the query is actually getting into SQL from SSIS? Might be worth firing up SQL Profiler to see what's actually being sent to SQL Server. [Which was the way I found out my obscure factoid.]