SQL INSERT - how to execute a list of queries automatically - sql

I've never done this before, so apologies if I'm being quite vague.
Scenario
I need to run a long series of INSERT SQL queries. The data is inserted into a table to be processed by a web service's client, i.e. the data is uploaded to a different server and the table is cleared as the process progresses.
What I've tried
I have tried adding a delay before each INSERT statement, like so:
WAITFOR DELAY '00:30:00'
INSERT INTO TargetTable (TableName1, Id, Type) SELECT 'tablename1', ID1 , 1 FROM tablename1
WAITFOR DELAY '00:30:00'
INSERT INTO TargetTable (TableName2, Id, Type) SELECT 'tablename2', ID2 , 1 FROM tablename2
But this has the disadvantage of assuming that a query will finish executing in 30 minutes, which may not be the case.
Question
I have run the queries manually in the past, but that's excruciatingly tedious. So I would like to write a program that does that for me.
The program should:
Run each query in the order given
Wait to run the next query until the previous one has been processed, i.e. until the target table is clear.
I'm thinking of a script that I can copy into the command prompt console, SQL itself or whatever and run.
How do I go about this? Windows service application? Powershell function?
I would appreciate any pointers to get me started.

You need to schedule a job in SQL Server:
http://www.c-sharpcorner.com/UploadFile/raj1979/create-and-schedule-a-job-in-sql-server-2008/
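The wait-until-clear requirement can also be expressed directly in T-SQL as a polling loop, which a scheduled job (or a script run from SSMS) could execute. This is only a sketch, reusing the table and column names from the question:

```sql
-- Poll until the web service's client has drained TargetTable,
-- then load the next batch (names taken from the question).
WHILE EXISTS (SELECT 1 FROM TargetTable)
    WAITFOR DELAY '00:00:30';  -- re-check every 30 seconds instead of guessing 30 minutes

INSERT INTO TargetTable (TableName1, Id, Type)
SELECT 'tablename1', ID1, 1 FROM tablename1;

WHILE EXISTS (SELECT 1 FROM TargetTable)
    WAITFOR DELAY '00:00:30';

INSERT INTO TargetTable (TableName2, Id, Type)
SELECT 'tablename2', ID2, 1 FROM tablename2;
```

This removes the fixed 30-minute assumption: each batch starts only once the previous one has actually been consumed.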

Related

Sequential execution of SQL Scripts that depend on the previous script

I'm probably asking a silly question, but is there a way to execute SQL scripts that depend on each other?
I have a set of 20 scripts, each one dependent on the table that the previous script creates. Currently it's a case of waiting for each one to finish without error before setting off the next one. This was fine for a while, but now the total run time is around 15 hours, so it would be really good if I could just set this off over a weekend and leave it without having to keep an eye on things.
You can create a stored procedure like this (note the parameter uses @, not #):
CREATE PROC SPWaitforTable
    @tableName varchar(255)
AS
WHILE 1 = 1
BEGIN
    IF EXISTS (SELECT name FROM sys.tables WHERE name = @tableName)
        RETURN
    ELSE
        WAITFOR DELAY '00:00:01'
END
You can then start all of your scripts at once; each one will wait until the table it depends on has been created before proceeding.
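A hypothetical usage sketch (the table name Script19Output is invented for illustration): each script begins by blocking on the table the previous script creates.

```sql
-- At the top of script 20: block until script 19's output table exists.
-- 'Script19Output' is a made-up name; substitute the real one.
EXEC SPWaitforTable @tableName = 'Script19Output';
-- ...script 20's own work can now safely reference that table.
```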

Atomicity of a job execution in SQL Server

I would like to find the proper documentation to confirm my understanding of a SQL Server job I recently wrote. My fear is that data could be inconsistent for a few milliseconds (between the start of the job's execution and its end).
Let's say the job is set up to run every 30 minutes. It has only one step, with the following SQL statement:
DELETE FROM myTable
INSERT INTO myTable
SELECT *
FROM myTableTemp
Could it happen that a SELECT query is executed exactly in between the DELETE statement and the INSERT statement, and thus returns empty results?
And what if I had created 2 steps in my job, one for the DELETE query and another for the INSERT INTO? Is atomicity between the several steps of one job protected by SQL Server?
Thanks for your help on this one
No, there is no automatic atomic handling of jobs, whether they are multiple statements or multiple steps.
Use this:
BEGIN TRANSACTION
DELETE ...
INSERT ...
-- anything else you need to be atomic
COMMIT TRANSACTION
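A fuller sketch of the same idea using the job's own table names, with TRY/CATCH so a failure in the INSERT also rolls back the DELETE (assumes SQL Server 2012+ for THROW; on older versions use RAISERROR instead):

```sql
BEGIN TRY
    BEGIN TRANSACTION;
    DELETE FROM myTable;
    INSERT INTO myTable
    SELECT * FROM myTableTemp;
    COMMIT TRANSACTION;  -- under default READ COMMITTED, readers see either the
                         -- old rows or the new rows, never the empty in-between state
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;
    THROW;  -- re-raise so the job step is reported as failed
END CATCH;
```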

Stop execution of sql script

I have a huge SQL script with many batches (using GO).
Under certain conditions, I want to stop execution of the whole script.
I tried using NOEXEC, but I keep on getting invalid object errors, since my script includes creating and adding data to new tables and columns.
I do not want to use RAISERROR, is there any other way to do this?
There is no good solution to this problem. One way to share state between batches is a table (persistent or temporary). You can then have logic that depends on the state of that table, or on the values of particular columns in it, like this:
--CREATE TABLE debug(col int)
--INSERT INTO debug VALUES(1)
--DELETE FROM debug
SELECT 1
GO
SELECT 2
SELECT 3
GO
IF EXISTS(SELECT * FROM debug)
SELECT 6
GO
Add an IF EXISTS(SELECT * FROM debug) check in front of every block in the script that you want to be able to skip; depending on whether the table has rows, those code blocks will execute or not.
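One caveat worth noting: IF guards only the single statement that follows it, so to skip a multi-statement block you need BEGIN...END. A minimal sketch reusing the debug table from above:

```sql
IF EXISTS (SELECT * FROM debug)
BEGIN
    -- this whole block runs only while debug has rows
    SELECT 6;
    SELECT 7;
END
GO
```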

Can't run a simple SELECT TOP 1 * FROM one particular table - what could the problem be?

I got this error:
Timeout expired. The timeout period elapsed prior to completion
of the operation or the server is not responding.
I have a process that does inserts and updates at night, and another process that also runs queries at night (an ETL or DTS job), on SQL Server 2005. Now we need to query this table and it doesn't work; when I run my process again it never finishes, and no one can query this table (other tables are fine). Users told me they could query it yesterday, but today they couldn't. Is it possible that my nightly process never finished and left a BEGIN TRANSACTION open?
How can I be sure of this, and close it from SSMS?
This is not a problem of permissions - we could do queries and inserts/updates yesterday.
It happens only with one table.
Try this:
SELECT TOP 1 * FROM Table WITH (NOLOCK)
Does that return results? If so, it sounds like a locking issue.
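To answer the "how can I be sure, and close it from SSMS" part: on SQL Server 2005 you can check for a transaction left open overnight with DBCC OPENTRAN and the legacy sys.sysprocesses view, then KILL the offending session. The session id 53 below is just a placeholder:

```sql
-- Report the oldest active transaction in the current database
DBCC OPENTRAN;

-- Show sessions that hold open transactions or are blocked
SELECT spid, blocked, open_tran, status, hostname, program_name
FROM sys.sysprocesses
WHERE open_tran > 0 OR blocked <> 0;

-- Once you have identified the stuck session (53 is a placeholder):
-- KILL 53;
```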

Concurrent access problem in mysql database

Hi, I'm writing a Python script that will create a number of child processes that fetch and execute tasks from the database.
The tasks are inserted into the database by a PHP website running on the same machine.
What's a good (it needs to be fast) way to select those tasks and update them to "in progress", so that they aren't selected multiple times by the Python scripts?
Edit: the database is MySQL.
Thanks in advance.
Use an InnoDB table Tasks, then:
SELECT TaskId, ... FROM Tasks WHERE State = 'New' LIMIT 1;
UPDATE Tasks SET State = 'In Progress' WHERE TaskId = <from above> AND State = 'New';
If the UPDATE succeeds (i.e. affects one row), you can work on the task. Otherwise, try again.
You'll want an index on TaskId and State.
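An alternative sketch that claims a task in a single statement, so there is no window between the SELECT and the UPDATE. The ClaimedBy column is an assumption (you would have to add it to the table); MySQL's UPDATE ... ORDER BY ... LIMIT makes the claim atomic:

```sql
-- Atomically claim one task; ClaimedBy is a hypothetical extra column.
UPDATE Tasks
SET State = 'In Progress', ClaimedBy = CONNECTION_ID()
WHERE State = 'New'
ORDER BY TaskId
LIMIT 1;

-- Then read back the task this connection just claimed:
SELECT TaskId FROM Tasks
WHERE ClaimedBy = CONNECTION_ID() AND State = 'In Progress';
```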
Without knowing more about your architecture, I suggest the following method:
1) Lock the Process table
2) SELECT ... FROM Process table WHERE State = "New"
3) processlist = [list of process ids from step 2]
4) UPDATE Process table SET State = "In progress" WHERE ProcessId IN (processlist)
5) Unlock the Process table
A way to speed things up is to put the whole sequence into a stored procedure and return the selected row from that procedure. That way, there is only one round trip to the db server.