T-SQL query to find jobs and tables

I want to find the SQL Server Agent jobs that operate on a particular table. For example:
I have a table called TAB1 which is updated daily by a job called SAJ1. I need a query to extract this information.

You can start with this SELECT: if the job step is written in T-SQL, it will find the text you declare as the table name.
use msdb

-- Search every job step's command text for the declared table name
Declare @table_name varchar(50)
set @table_name = 'Test'

select j.name, js.command
from dbo.sysjobs j
inner join dbo.sysjobsteps js
    on j.job_id = js.job_id
where js.command like '%' + @table_name + '%'

One of the options available is the table msdb.dbo.sysjobsteps, which lists all jobs and their steps. You can look through the Command column of the output for the information you need.
select * from msdb.dbo.sysjobsteps;

Related

Execute a table with a list of BigQuery commands

I have a table with BigQuery update commands, for example this row:
UPDATE `table` AS t2 SET chr3_3268035_CT=1 FROM `sorted_500` AS t1 WHERE t1.sample_id = t2.sample_id AND t1.PIK3CA_features="chr3_3268035_CT"
Is there a way to send this table column for execution?
Many thanks!
You cannot directly execute commands listed in your table, but you could write a script to read a series of commands from a table and then execute them.
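For example, with BigQuery scripting you can loop over the rows and run each statement with EXECUTE IMMEDIATE. A sketch, where mydataset.commands and its command column are hypothetical names:
-- Sketch: run every statement stored in a table column, one at a time.
-- mydataset.commands and its command column are hypothetical names.
FOR rec IN (
  SELECT command
  FROM mydataset.commands
)
DO
  EXECUTE IMMEDIATE rec.command;
END FOR;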

Create table with checksum of all tables in a database?

I'm trying to figure out how to determine if a table has been affected by a number of processes that run in sequence, and need to know what the state of the table is before and after each runs. What I've been trying to do is run some SQL before all the processes run that saves a before checksum of every table in the db to a table, then running it again when each ends and updating the table row with an after checksum. After all the processes are over, I compare the checksums and get all rows where before <> after.
The only problem is that I'm not the best at SQL, and I'm a little lost. Here's where I'm at right now:
select checksum_agg(binary_checksum(*)) from empcomp with (nolock)
create table Test_CheckSum_Record ( TableName varchar(max), CheckSum_Before int, CheckSum_After int)
SELECT name into #TempNames
FROM sys.Tables where is_ms_shipped = 0
And the pseudocode for what I want to do is something like
foreach(var name in #TempNames)
insert into Test_CheckSum_Record(name, ExecuteSQL N'select checksum_agg(binary_checksum(*)) from ' + name + ' with (nolock)', null)
But how does one do this?
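One way to make that pseudocode concrete is a cursor over the table names plus sp_executesql with an OUTPUT parameter. A sketch along those lines, using the table and column names from the question (untested):
-- Sketch: loop over user tables, compute each table's checksum via
-- dynamic SQL, and record it as the Before value.
DECLARE @name sysname, @sql nvarchar(max), @cs int;

DECLARE name_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM sys.tables WHERE is_ms_shipped = 0;

OPEN name_cursor;
FETCH NEXT FROM name_cursor INTO @name;

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'SELECT @cs = CHECKSUM_AGG(BINARY_CHECKSUM(*)) FROM '
             + QUOTENAME(@name) + N' WITH (NOLOCK);';
    EXEC sp_executesql @sql, N'@cs int OUTPUT', @cs = @cs OUTPUT;

    INSERT INTO Test_CheckSum_Record (TableName, CheckSum_Before, CheckSum_After)
    VALUES (@name, @cs, NULL);

    FETCH NEXT FROM name_cursor INTO @name;
END

CLOSE name_cursor;
DEALLOCATE name_cursor;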
Judging by the comments, you need to create a trigger that handles all CRUD operations and just places a flag.
The syntax is:
Create TRIGGER [TriggerName] ON [TableName]
AFTER INSERT, UPDATE, DELETE
In the trigger you can do a
select CHECKSUM_AGG([Columns you want to compare against]) from [ParentTable]
store that value in a variable, and check it against the CheckSum_Before column of the checksum table. If no entry exists, add a new one with the DELETED table's CHECKSUM_AGG value as the Before entry.
Please note that the choice not to use the inserted table is just my preference when calculated columns are involved.
I will edit later when I have more time to add code
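In the meantime, a minimal sketch of that trigger against the question's Test_CheckSum_Record table; [ParentTable] is a placeholder, and BINARY_CHECKSUM(*) stands in for the columns you want to compare against:
CREATE TRIGGER trg_ParentTable_Checksum ON [ParentTable]
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- Checksum of the table's current state
    DECLARE @cs int;
    SELECT @cs = CHECKSUM_AGG(BINARY_CHECKSUM(*)) FROM [ParentTable];

    IF EXISTS (SELECT 1 FROM Test_CheckSum_Record WHERE TableName = 'ParentTable')
        -- Entry exists: record the latest state as the After checksum
        UPDATE Test_CheckSum_Record
        SET CheckSum_After = @cs
        WHERE TableName = 'ParentTable';
    ELSE
        -- No entry yet: store the current state as the Before checksum
        INSERT INTO Test_CheckSum_Record (TableName, CheckSum_Before, CheckSum_After)
        VALUES ('ParentTable', @cs, NULL);
END;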

Table Sync from DB2 to SQL Server

We have a table in DB2 that we need to bring over to MS SQL Server (read-only), and I want it to sync every 15 minutes (one way, from DB2 to SQL Server). Can you suggest the best approach?
Have a SQL Agent job execute an SSIS package every 15 minutes.
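If the package and the Agent job already exist, attaching the 15-minute schedule is plain msdb scripting. A sketch, where the job name Sync_DB2_Table is hypothetical:
EXEC msdb.dbo.sp_add_jobschedule
    @job_name = N'Sync_DB2_Table',   -- hypothetical job name
    @name = N'Every 15 minutes',
    @freq_type = 4,                  -- daily
    @freq_interval = 1,              -- every day
    @freq_subday_type = 4,           -- sub-day unit: minutes
    @freq_subday_interval = 15;      -- every 15 minutes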
I know that MERGE is usually the right option to sync tables in SQL Server, but I wasn't sure whether it can also be used with linked servers. Anyway, after some research I got this task accomplished using MERGE. MERGE will update, insert, and delete whatever is required, but it takes a little more time to update the whole table every 15 minutes when the job runs. So you can create a #Temptable holding only the transactions made since the last job run. You can use the datetime stamp in the source table to retrieve the transactions made since the last job ran (15 min). If you don't have the datetime in the source table, you can use that source table's audit table (if applicable).
(JLT is the job log table we need to create on the linked server; it has three columns: last_job_end, cur_job_start, and a job identity. It gives us the last job end and current job start times. We update last_job_end at the end of the job's query on every run, and cur_job_start at the beginning of the job.)
SELECT *
INTO #TEMPtable
FROM OPENQUERY([DB2], 'select * from source_table
    where some_id_column in
        (select some_id_column
         from audit_table AT, Job_log_table JLT
         where datetime > last_job_end
           and datetime <= cur_job_start
           and c_job = ''some_job_id'')')
If you don't have the audit table but you do have the datetime in the source table:
SELECT *
INTO #TEMPtable
FROM OPENQUERY([DB2], 'Select *
from source_table s, JOB_CYCLE_TABLE pr
where s.DATETIME <= pr.cur_job_start
and s.DATETIME > pr.last_job_end
and pr.c_job = ''some_job_id''')
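From there, the MERGE from #TEMPtable into the local copy looks roughly like this. A sketch: the target table and its columns are hypothetical, and deletes must be derived from the audit table rather than a NOT MATCHED BY SOURCE clause, because #TEMPtable holds only the recent delta:
MERGE dbo.target_table AS t
USING #TEMPtable AS s
    ON t.some_id_column = s.some_id_column
WHEN MATCHED THEN
    UPDATE SET t.col1 = s.col1,
               t.col2 = s.col2
WHEN NOT MATCHED BY TARGET THEN
    INSERT (some_id_column, col1, col2)
    VALUES (s.some_id_column, s.col1, s.col2);
-- No NOT MATCHED BY SOURCE ... DELETE here: the staging table is only a
-- delta, so that clause would wrongly delete unchanged rows.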

Track the number of rows in a #table whose population is in progress

I am working in SQL Server 2012 Management Studio.
In a SQL query window, an insert into a #table is in progress. It is expected to insert somewhere around 80 million rows of 3 INT columns each.
The query is still executing.
Is there a way I can track the number of rows already in the #table?
Since you cannot run two queries in the same window simultaneously, and temp tables declared with a single # are not accessible from other sessions, you should try defining it with a double # in your insert query.
Then you can query it using WITH (NOLOCK).
Open a new query window on the same db and try
SELECT COUNT(*)
FROM ##YourTableName WITH(NOLOCK)
This will get dirty reads, but I do not think that would be a problem in your case, as you only want a rough measure of where your INSERT is.
One method is to query the DMVs using the temp table object id. You can get the local temp table object id from the session that created it using this query:
SELECT OBJECT_ID(N'tempdb..#table', 'U');
Then run the script below in another window, supplying the object_id value from the above query (-1180342868 in this example):
DECLARE @object_id int = -1180342868;

SELECT SUM(rows)
FROM tempdb.sys.partitions
WHERE
    object_id = @object_id
    AND index_id IN (0, 1);
Of course, this method assumes you had the foresight to get the temp table object id before running the insert. If the query is currently running, you could run the script below and make an educated guess as to which object might be the temp table being loaded.
USE tempdb;
SELECT OBJECT_NAME(object_id), SUM(rows)
FROM tempdb.sys.partitions
WHERE
index_id IN(0,1)
AND OBJECTPROPERTYEX(object_id, 'IsUserTable') = 1
GROUP BY
OBJECT_NAME(object_id);
Be aware that this might not be a reliable way to track the load progress. Much depends on the query plan particulars. It could be that the costly operators are earlier in the plan and the actual insert won't occur until the last minute.
If you wish to run the query to count rows in another window or outside the scope where the table was declared, please use a global temp table.
For example:
CREATE TABLE ##table(
a int,
b int,
c int)
Then in another window you can run this, and it will work:
SELECT COUNT(*) FROM ##table WITH (NOLOCK)

SQL Server Global Temporary Table Locking

How do I lock a global temporary table in a stored procedure that's getting created and populated by a SELECT INTO statement? For example:
SELECT *
INTO ##TempEmployee
FROM Employee
This stored procedure is executed to generate reports, and it exists in every client database (multi-tenant architecture with a different DB per client). I do not want data in this global temporary table to be shared between clients when the report is generated concurrently. I have no choice but to use a global temp table, because I use it to generate columns on the fly with PIVOT.
Why not include it inside a transaction block, like:
BEGIN TRANSACTION;

SELECT *
INTO ##TempEmployee
FROM Employee;

-- ... generate the report here; SELECT ... INTO holds a schema lock on
-- ##TempEmployee, so other sessions are blocked until the commit ...

COMMIT TRANSACTION;
Try this:
-- In WorkDummySQL:
create table rr(id integer, name varchar(20))
insert into rr values(1, 'aa')
select * from rr

-- In tempdb:
select * into ##ta from WorkDummySQL.dbo.rr