I have 12 queries to run, but they cannot run at the same time. I would like to start the first query, then as soon as it completes the second query begins; when that completes the third query starts, and so on. I want this to be automatic. Each query writes a txt file. Thanks
Seems to me like you just have to create a script and call that script:
#query1.sql
#query2.sql
...
Or am I missing something?
Paste them all in one file with a GO statement between each; that should do the trick.
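A minimal sketch of that combined file, assuming it is run with sqlcmd (or SSMS in SQLCMD mode); the :Out paths and table names below are placeholders, and each of your real queries would go in its own batch:
-- combined.sql: each batch finishes before the next one starts.
-- :Out (SQLCMD mode) redirects the following batch's output to a text file.
:Out C:\output\query1.txt
SELECT * FROM dbo.Orders;       -- placeholder for query 1
GO

:Out C:\output\query2.txt
SELECT * FROM dbo.Customers;    -- placeholder for query 2
GO

-- ...and so on for the remaining 10 queries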
I am new to Pentaho. I currently have a job like this:
condition 1 -> condition 2 -> if successful then run these SQL scripts, if failed then send email
I would like to have a loop that is more like:
(condition 1 -> condition 2) are run every 30 minutes
-> if successful, run the SQL scripts and stop going back to check the conditions
-> if failed, loop back and run again at the next interval
Is this possible to achieve?
Many thanks!
You can do it with a job. See screenshots.
You'll need to replace the simple evaluation step with your own condition-checking routine.
You could schedule the job to run every 30 minutes and add a step that generates a file or inserts/updates a DB table, which you can then check to decide whether the conditions still need to run.
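A minimal sketch of that flag-table idea (the table and column names are assumptions and the syntax is SQL Server-flavored; adapt it to your database):
-- Hypothetical status table checked by the scheduled job.
CREATE TABLE JobStatus (
    JobName       varchar(100) NOT NULL PRIMARY KEY,
    ConditionsMet bit          NOT NULL DEFAULT 0,
    LastRun       datetime     NULL
);

-- First step of the 30-minute job: if ConditionsMet = 1, skip the condition checks.
SELECT ConditionsMet FROM JobStatus WHERE JobName = 'my_loop_job';

-- Last step after the SQL scripts succeed: record the success so later runs stop checking.
UPDATE JobStatus
SET ConditionsMet = 1, LastRun = GETDATE()
WHERE JobName = 'my_loop_job';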
Currently, in Databricks, running a query only returns the first 1000 rows on the first run. If we need all the rows, we have to execute the query again.
In situations where we know we need to download the full data (1000+ rows), is there a workaround to get all the rows on the first run without re-executing the query?
There should be a down arrow next to the download button which allows you to Download full results. Are you seeing that in your interface? (Source)
I run pretty complex code in Access VBA. It pulls data from Excel or from another query into a table in an Access frontend database. Then I want to run DoCmd.RunSQL to update the backend database based on this temporary data, but this line of code acts weirdly.
When I run the code it updates 0 records.
If I Debug.Print the SQL string into the Immediate window and run it from there, it updates the records as expected.
When I paste the SQL string into the Access query builder, again it updates the records.
If I use SetWarnings = True, it shows a warning that I'm about to update 0 records; if I press No, VBA throws an error on the DoCmd.RunSQL line. When I then press Run again on the same line, it now wants to update all the records as expected.
I tried running the query twice within the code. In both tests it almost seemed to be a solution, but now it is updating 0 records again.
I also tried a loop that runs for a set time (something like Excel's Application.Wait), suspecting that maybe the temporary data in the table has to refresh before the database can see it.
Does anyone know where the problem is? Do I have to refresh the tables somehow? Thank you in advance.
UPDATE (SELECT [AUC],[Comment] FROM [AuC] IN 'C:\Users\test.accdb') AS Q1, [tempComments]
SET [Q1].[Comment] = [tempComments].[Comment]
WHERE [Q1].[AUC] = [tempComments].[AUC]
This should work:
With Access.DoCmd
    .RunSQL "UPDATE [AuC] IN 'C:\Users\test.accdb', [tempComments] " & _
            "SET [AuC].[Comment] = [tempComments].[Comment] " & _
            "WHERE [AuC].[AUC] = [tempComments].[AUC]"
End With
I have a SQL query that needs to run against the system views sys.dm_exec_requests and sys.dm_exec_sessions every 60 seconds to pull specific information and dump it into a separate table. After a specified time I would like the loop to stop. How should the loop be formatted?
This sounds like a SQL Agent job. If so, the short form of the answer is:
Create the job with one step that runs the query
Add a Schedule that runs it once a minute, starting whenever you want it to start
Set the schedule to stop running it when the cut-off time is reached
The long form, of course, is all the detail work behind creating a SQL Agent job. Best to read up on them in Books Online (here)
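As a rough sketch of those steps in T-SQL (the job name, target database, schedule times, and the logging proc it calls are all placeholders, not values from the question):
USE msdb;
GO

EXEC dbo.sp_add_job
     @job_name = N'Log DMV snapshots';

EXEC dbo.sp_add_jobstep
     @job_name      = N'Log DMV snapshots',
     @step_name     = N'Run logging query',
     @subsystem     = N'TSQL',
     @database_name = N'YourDatabase',
     @command       = N'EXEC dbo.LogCurrentData;';   -- or paste the query itself here

-- Runs once a minute, each day, only between the start time and the cut-off time.
EXEC dbo.sp_add_jobschedule
     @job_name             = N'Log DMV snapshots',
     @name                 = N'Every minute until cut-off',
     @freq_type            = 4,        -- daily
     @freq_interval        = 1,
     @freq_subday_type     = 4,        -- subday unit: minutes
     @freq_subday_interval = 1,        -- every 1 minute
     @active_start_time    = 90000,    -- 09:00:00
     @active_end_time      = 170000;   -- 17:00:00

EXEC dbo.sp_add_jobserver
     @job_name = N'Log DMV snapshots';
GO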
Don't do this in a loop. Do it with a job.
Write a sproc that does the query and saves the results, then call it from a job.
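A minimal sketch of such a sproc, assuming a target table named dbo.RequestLog and a guessed column list (pick whichever DMV columns you actually need):
CREATE TABLE dbo.RequestLog (
    LogTime       datetime      NOT NULL DEFAULT GETDATE(),
    SessionId     int           NOT NULL,
    LoginName     nvarchar(128) NULL,
    HostName      nvarchar(128) NULL,
    RequestStatus nvarchar(30)  NULL,
    WaitType      nvarchar(60)  NULL,
    CpuTime       int           NULL,
    ElapsedMs     int           NULL
);
GO

CREATE PROCEDURE dbo.LogCurrentData
AS
BEGIN
    SET NOCOUNT ON;

    -- Snapshot the currently executing requests joined to their sessions.
    INSERT INTO dbo.RequestLog (SessionId, LoginName, HostName, RequestStatus, WaitType, CpuTime, ElapsedMs)
    SELECT r.session_id,
           s.login_name,
           s.host_name,
           r.status,
           r.wait_type,
           r.cpu_time,
           r.total_elapsed_time
    FROM sys.dm_exec_requests AS r
    INNER JOIN sys.dm_exec_sessions AS s
        ON s.session_id = r.session_id;
END
GO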
I think you should use a job as well, but in some work environments that is not practical. In that case you could have something like:
DECLARE @StopTime datetime = DATEADD(HOUR, 8, GETDATE()); -- e.g. stop after 8 hours

WHILE GETDATE() < @StopTime
BEGIN
    EXEC LogCurrentData;
    WAITFOR DELAY '00:01:00'; -- wait 1 minute
END
I think the best way is to create a Job.
There is a post that explains how to create a job step by step (with images) in SQL Server.
You can visit the post here
If you prefer a video tutorial, you can visit this link
I've always thought that looping over SQL files in a directory in SSIS is easy... but I've got a problem today:
Execute SQL Task isn't executing statements that are in the sql file.
In the sql file I've got delete statement and then insert statement.
The SSIS Execute SQL Task component finishes after about 2 seconds, while executing the same script manually usually takes about 2 minutes, and of course in SSIS it doesn't insert anything.
I checked the variable value coming from the Foreach loop (the full filename path) and it is ok.
I've got a File Connection parametrized (by Expression) with this variable.
What am I doing wrong? Thanks for help.
Do you have GO after each call within your SQL file?
Example:
-- your DELETE statement here
GO
-- your INSERT statement here
GO
-- etc.
It will not continue if you don't have this.