When I try to update an existing job I get "Supply either @job_id or @job_name to identify the job" - sql-server-2012

I am new to modifying existing SQL Agent jobs, and I got a requirement from my client to rename a job and add a description to it.
I am trying the statement below, but it gives me the error "Supply either @job_id or @job_name to identify the job."
EXEC msdb.dbo.sp_update_job @job_id='CD63A0B5-522C-495D-BED5-D9F900F71202',
@job_name=N'JobName',
@new_name=N'NewJobName',
@enabled=0,
@description=N'The Description is now available.'
I don't know what I am missing here.

The error means you need to specify either @job_id or @job_name to identify the existing job, but not both. You can use either of these scripts to accomplish the task:
EXEC msdb.dbo.sp_update_job @job_id='CD63A0B5-522C-495D-BED5-D9F900F71202',
@new_name=N'NewJobName',
@enabled=0,
@description=N'The Description is now available.';
GO
EXEC msdb.dbo.sp_update_job @job_name=N'JobName',
@new_name=N'NewJobName',
@enabled=0,
@description=N'The Description is now available.';
GO
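If you are unsure which GUID belongs to which job, a quick read-only check against msdb first (assuming you have access to msdb) will confirm the @job_id you should pass:
-- Look up the job_id for a given job name (or drop the WHERE clause to list all jobs).
SELECT job_id, name, enabled, description
FROM msdb.dbo.sysjobs
WHERE name = N'JobName';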

Related

Schedule SP in SQL Agent

I have a stored procedure which runs an SSIS package.
Below is the script:
CREATE PROCEDURE MRA
AS
DECLARE @execution_id bigint
EXEC [SSISDB].[catalog].[create_execution] @package_name=N'CNP_FORMAT.dtsx',
@execution_id=@execution_id OUTPUT,
@folder_name=N'MRA',
@project_name=N'CNP_FORMAT',
@use32bitruntime=False,
@reference_id=NULL
EXEC [SSISDB].[catalog].[start_execution] @execution_id
The script runs correctly in SQL Server and the targeted tables are loaded correctly.
I have scheduled the above script in SQL Agent.
The scheduled job runs without errors, but the targeted tables are NOT loaded.
This means the scheduled job in SQL Agent did not actually run the SSIS package!
I'm currently stuck with this issue and don't know how to proceed.
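One point worth noting (not from the original thread, just a common cause of this symptom): catalog.start_execution is asynchronous, so the Agent job step reports success as soon as the execution is queued, and any package failure only shows up in the SSISDB execution reports. A minimal sketch of an alternative procedure body, assuming the same folder/project/package names as above, that forces the call to wait and fails the step if the package fails:
DECLARE @execution_id bigint
EXEC [SSISDB].[catalog].[create_execution] @package_name=N'CNP_FORMAT.dtsx',
@execution_id=@execution_id OUTPUT,
@folder_name=N'MRA',
@project_name=N'CNP_FORMAT',
@use32bitruntime=False,
@reference_id=NULL
-- SYNCHRONIZED=1 makes start_execution wait until the package finishes,
-- so a package failure surfaces as a job-step failure instead of a silent "success".
EXEC [SSISDB].[catalog].[set_execution_parameter_value] @execution_id,
@object_type=50, @parameter_name=N'SYNCHRONIZED', @parameter_value=1
EXEC [SSISDB].[catalog].[start_execution] @execution_id
-- Status 7 = succeeded; anything else fails the Agent job step explicitly.
IF 7 <> (SELECT [status] FROM SSISDB.catalog.executions WHERE execution_id = @execution_id)
RAISERROR('CNP_FORMAT.dtsx did not succeed; check the SSISDB execution reports.', 16, 1)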

Stored procedure with parameter validation for job name

I would like to create a stored procedure in SQL that executes only jobs that are specified in the parameter validation of the procedure itself. For example, I would like my dev team to only be able to pass the job names that I specify; that way they will only be able to run the jobs they need. From their perspective it will look something like this:
Exec sp_run_only_jobs myjob
or
Exec sp_run_only_jobs myotherjob
If they try to run a job other than the two above, it should fail with an error message.
You can create a stored procedure that checks whether the specified job name is in an allowed list of jobs and, if so, starts the job. Otherwise, it throws an error:
CREATE PROCEDURE dbo.usp_run_only_jobs @JobName NVARCHAR(4000)
AS
BEGIN
    IF @JobName IN (N'AllowedJob1', N'AllowedJob2')
    BEGIN
        EXEC msdb.dbo.sp_start_job @JobName;
    END
    ELSE
    BEGIN
        THROW 51000, 'The specified job is not allowed to be started', 1;
    END
END
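A quick usage check (the job names here are just the placeholders from the sketch above):
Exec dbo.usp_run_only_jobs @JobName = N'AllowedJob1';   -- starts the job
Exec dbo.usp_run_only_jobs @JobName = N'SomeOtherJob';  -- fails with error 51000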

Is it possible to create Job Schedule using T-SQL script (not SQL Server Agent schedule creation wizard)?

I have to create a script which will execute at scheduled times. Because my client is not familiar with SQL Server, I would like to create a schedule on my machine as per my client's requirements using SQL Server Agent schedule creation wizard, then create a script of the created schedule (which I will send to the client).
How can I create a T-SQL Job Schedule without using the SQL Server Agent schedule creation wizard?
You can use Management Studio to get the creation script:
Create the job in Management Studio.
Then right-click the job and, from the context menu, select Script Job as -> CREATE To -> New Query Editor Window.
Alter the script details to match your needs.
To create a SQL Server Agent job using Transact-SQL:
Execute sp_add_job to create a job.
Execute sp_add_jobstep to create one or more job steps.
Execute sp_add_schedule to create a schedule.
Execute sp_attach_schedule to attach a schedule to the job.
Execute sp_add_jobserver to set the server for the job.
More here. A minimal sketch of those five calls, with placeholder names, is shown below.
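For illustration only (the job, step, schedule, database, and procedure names here are placeholders, not from the question), the five calls fit together roughly like this:
USE msdb;
GO
DECLARE @jobId uniqueidentifier;
-- 1) Create the job container.
EXEC dbo.sp_add_job @job_name = N'NightlyLoad', @enabled = 1, @job_id = @jobId OUTPUT;
-- 2) Add a T-SQL step for the job to run.
EXEC dbo.sp_add_jobstep @job_id = @jobId, @step_name = N'Run load proc',
@subsystem = N'TSQL', @command = N'EXEC dbo.usp_NightlyLoad;', @database_name = N'MyDatabase';
-- 3) Create a daily 02:00 schedule (freq_type 4 = daily).
EXEC dbo.sp_add_schedule @schedule_name = N'Daily_0200', @freq_type = 4,
@freq_interval = 1, @active_start_time = 020000;
-- 4) Attach the schedule to the job.
EXEC dbo.sp_attach_schedule @job_name = N'NightlyLoad', @schedule_name = N'Daily_0200';
-- 5) Register the job on the local server so the Agent will run it.
EXEC dbo.sp_add_jobserver @job_name = N'NightlyLoad', @server_name = N'(LOCAL)';
GO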
I see you accepted the above answer, but if you're trying to create a schedule specifically (which is what your question said), as opposed to a job, you use sp_add_schedule.
You can start SQL Profiler, run your wizard steps, and then see what SQL has been executed in the background, using this as a basis for preparing your scripts.
If you know the name of the job and want to add a new schedule and attach it to multiple jobs, you could script it this way (this is an example for a frequency of once per day at 02:00, added to 2 jobs):
USE [msdb]
GO
DECLARE @schedule_id int, @job_id uniqueidentifier
-- create the schedule
EXEC msdb.dbo.sp_add_schedule @schedule_name=N'name of the schedule',
@enabled=1,
@freq_type=4,
@freq_interval=1,
@freq_subday_type=1,
@freq_subday_interval=0,
@freq_relative_interval=0,
@freq_recurrence_factor=1,
@active_start_date=20180703,
@active_end_date=99991231,
@active_start_time=020000,
@active_end_time=235959, @schedule_id = @schedule_id OUTPUT
-- add it to job 1
set @job_id = (select job_id from msdb.dbo.sysjobs where name='Name of job 1')
EXEC msdb.dbo.sp_attach_schedule @job_id=@job_id, @schedule_id=@schedule_id
-- add it to job 2
set @job_id = (select job_id from msdb.dbo.sysjobs where name='Name of job 2')
EXEC msdb.dbo.sp_attach_schedule @job_id=@job_id, @schedule_id=@schedule_id
GO
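If you want to confirm the attachment afterwards, one way (just a read-only check) is to join the msdb catalog views:
-- List which jobs the new schedule is attached to.
SELECT j.name AS job_name, s.name AS schedule_name
FROM msdb.dbo.sysjobschedules js
JOIN msdb.dbo.sysjobs j ON j.job_id = js.job_id
JOIN msdb.dbo.sysschedules s ON s.schedule_id = js.schedule_id
WHERE s.name = N'name of the schedule';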

SQL Server (TSQL) - Is it possible to EXEC statements in parallel?

SQL Server 2008 R2
Here is a simplified example:
EXECUTE sp_executesql N'PRINT ''1st '' + convert(varchar, getdate(), 126) WAITFOR DELAY ''000:00:10'''
EXECUTE sp_executesql N'PRINT ''2nd '' + convert(varchar, getdate(), 126)'
The first statement will print the date and delay 10 seconds before proceeding.
The second statement should print immediately.
The way T-SQL works, the 2nd statement won't be evaluated until the first completes. If I copy and paste it to a new query window, it will execute immediately.
The issue is that I have other, more complex things going on, with variables that need to be passed to both procedures.
What I am trying to do is:
Get a record
Lock it for a period of time
while it is locked, execute some other statements against this record and the table itself
Perhaps there is a way to dynamically create a couple of jobs?
Anyway, I am looking for a simple way to do this without having to manually PRINT statements and copy/paste to another session.
Is there a way to EXEC without wait / in parallel?
Yes, there is a way, see Asynchronous procedure execution.
However, chances are this is not what you need. T-SQL is a data access language, and once you take transactions, locking, and commit/rollback semantics into consideration, it is almost impossible to have a parallel job. Parallel T-SQL works, for instance, with request queues, where each request is independent and there is no correlation between jobs.
What you describe doesn't sound at all like something that can, or should, actually be parallelized.
If you want to lock a record so you can execute statements against it, you may want to execute those statements as a transaction.
To execute SQL in parallel, you need to parallelize the SQL calls by executing your SQL from separate threads/processes in Java, C++, Perl, or any other programming language (even launching "isql" in a shell script in the background will work).
If, after reading all of the above about the potential problems, you still want to run things in parallel, you can try SQL Agent jobs: put your queries in different jobs, then execute them by starting the jobs like this:
EXEC msdb..sp_start_job 'Job1'
EXEC msdb..sp_start_job 'Job2'
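Note that sp_start_job returns as soon as the job has been started, so both jobs run concurrently in the background. If the caller needs to wait for them, one option (a sketch, not from the original answer; the job names are the placeholders above) is to poll msdb.dbo.sysjobactivity until neither job is still executing:
DECLARE @running int = 1;
WHILE @running > 0
BEGIN
    WAITFOR DELAY '00:00:05';  -- check every 5 seconds
    SELECT @running = COUNT(*)
    FROM msdb.dbo.sysjobactivity ja
    JOIN msdb.dbo.sysjobs j ON j.job_id = ja.job_id
    WHERE j.name IN (N'Job1', N'Job2')
      AND ja.start_execution_date IS NOT NULL
      AND ja.stop_execution_date IS NULL;
END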
SQL Agent Jobs can run in parallel and be created directly from TSQL. The answer by Remus Rusanu contains a link that mentions this along with some disadvantages.
Another disadvantage is that additional security permissions are required to create the job. Also, for the implementation below, the job must run as a specific user+login with additional job management privileges.
It is possible to run the arbitrary SQL as a different (safer) user; however, I believe it requires sysadmin privilege to designate the job as such.
The returned @pJobIdHexOut can be used to stop the job if needed.
create function Common.ufn_JobIdFromHex(
@pJobIdBinary binary(16)
)
returns varchar(100) as
/*---------------------------------------------------------------------------------------------------------------------
Purpose: Convert the binary representation of the job_id into the job_id string that can be used in queries
against msdb.dbo.sysjobs.
http://stackoverflow.com/questions/68677/how-can-i-print-a-binary-value-as-hex-in-tsql
http://stackoverflow.com/questions/3604603
MsgBoards
Modified By Description
---------- -------------- ---------------------------------------------------------------------------------------
2014.08.22 crokusek Initial version, http://stackoverflow.com/questions/3604603 and MsgBoards.
---------------------------------------------------------------------------------------------------------------------*/
begin
-- Convert from binary and strip off the '0x'.
--
declare
@jobIdHex varchar(100) = replace(convert(varchar(300), @pJobIdBinary, 1), '0x', '');
-- The endianness appears to be backwards and there are dashes needed.
--
return
substring(@jobIdHex,7,2) +
substring(@jobIdHex,5,2) +
substring(@jobIdHex,3,2) +
substring(@jobIdHex,1,2) +
'-' +
substring(@jobIdHex,11,2) +
substring(@jobIdHex,9,2) +
'-' +
substring(@jobIdHex,15,2) +
substring(@jobIdHex,13,2) +
'-' +
substring(@jobIdHex,17,4) +
'-' +
substring(@jobIdHex,21,12);
end
go
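A quick sanity check of the function (the job name is a placeholder; this just round-trips an existing job_id through its binary(16) form and back to the GUID string):
-- Round-trip an existing job_id through binary(16) and back to its string form.
declare @bin binary(16) =
    (select convert(binary(16), job_id) from msdb.dbo.sysjobs where name = N'SomeExistingJob');
select Common.ufn_JobIdFromHex(@bin) as job_id_string;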
create proc [Common].[usp_CreateExecuteOneTimeBackgroundJob]
@pJobNameKey varchar(100), -- Caller should ensure uniqueness to avoid a violation
@pJobDescription varchar(1000),
@pSql nvarchar(max),
@pJobIdHexOut varchar(100) = null out, -- JobId as Hex string. For SqlServer 2014 binary(16) = varchar(64)
@pDebug bit = 0 -- True to include print messages
--
with execute as 'TSqlBackgroundJobOwner' -- requires special permissions (See below)
as
/*---------------------------------------------------------------------------------------------------------------------
Purpose: Create a one time background job and launch it immediately. The job is owned by the "execute as" UserName.
Caller must ensure the @pSql argument is safe.
Required Permissions for "execute as" user:
-- User must be created with associated login (w/ deny connect).
use [msdb];
create user [$UserName$] for login [$LoginName$];
alter role [SQLAgentUserRole] add member [$UserName$];
alter role [SQLAgentReaderRole] add member [$UserName$];
alter role [SQLAgentOperatorRole] add member [$UserName$];
grant select on dbo.sysjobs to [$UserName$];
grant select on dbo.sysjobactivity to [$UserName$];
use [Master];
create user [$UserName$] for login [$LoginName$];
grant execute on xp_sqlagent_is_starting to [$UserName$];
grant execute on xp_sqlagent_notify to [$UserName$];
Modified By Description
---------- ----------- ------------------------------------------------------------------------------------------
2014.08.22 crokusek Initial version
2015.12.22 crokusek Use the SP caller as the job owner (removed the explicit setting of the job owner).
---------------------------------------------------------------------------------------------------------------------*/
begin try
declare
@usp varchar(100) = object_name(@@procid),
@currentDatabase nvarchar(100) = db_name(),
@jobId binary(16),
@jobOwnerLogin nvarchar(100);
set xact_abort on; -- ensure transaction is aborted on non-catchables like client timeout, etc.
begin transaction
exec msdb.dbo.sp_add_job
@job_name=@pJobNameKey,
@enabled=1,
@notify_level_eventlog=0,
@notify_level_email=2,
@notify_level_netsend=2,
@notify_level_page=2,
@delete_level=3,
@description=@pJobDescription,
@category_name=N'Database Maintenance',
-- If not overridden then the current login is the job owner
--@owner_login_name=@jobOwnerLogin, -- Requires sysadmin to set this so avoiding.
@job_id = @jobId output;
-- Get the job_id string of the jobId (special format)
--
set @pJobIdHexOut = Common.ufn_JobIdFromHex(@jobId);
if (@pDebug = 1)
begin
print 'JobId: ' + @pJobIdHexOut;
print 'Sql: ' + @pSql;
end
exec msdb.dbo.sp_add_jobserver @job_id=@jobId; -- default is local server
exec msdb.dbo.sp_add_jobstep
@job_id=@jobId,
@step_name=N'One-Time Job Step 1',
@step_id=1,
@command=@pSql,
@database_name=@currentDatabase,
@cmdexec_success_code=0,
@on_success_action=1,
@on_fail_action=2,
@retry_attempts=0,
@retry_interval=0,
@os_run_priority=0,
@subsystem=N'TSQL',
@flags=0
;
declare
@startResult int;
exec @startResult = msdb.dbo.sp_start_job
@job_id = @jobId;
-- End the transaction
--
if (@startResult != 0)
raiserror('Unable to start the job', 16, 1); -- causes rollback in catch block
else
commit; -- Success
end try
begin catch
declare
@CatchingUsp varchar(100) = object_name(@@procid);
if (xact_state() = -1)
rollback;
--exec Common.usp_Log
-- @pMethod = @CatchingUsp;
--exec Common.usp_RethrowError
-- @pCatchingMethod = @CatchingUsp;
end catch
go
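For reference, a call might look like this (the job name and the SQL text to run are illustrative placeholders):
declare @jobIdHex varchar(100);
exec Common.usp_CreateExecuteOneTimeBackgroundJob
    @pJobNameKey = 'Background_20140822_001',        -- must be unique
    @pJobDescription = 'One-time background rebuild',
    @pSql = N'exec dbo.usp_SomeLongRunningWork;',     -- hypothetical work to run
    @pJobIdHexOut = @jobIdHex out,
    @pDebug = 1;
print @jobIdHex;  -- job_id string usable against msdb.dbo.sysjobs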
It might be worth checking out the article Asynchronous T-SQL Execution Without Service Broker.
You can create an SSIS package that has 2 tasks that run in parallel, then make an unscheduled Agent job that calls this package. Finally, you can execute this unscheduled Agent job using sp_start_job.
Picking up on one of the answers above, it is possible to produce something similar to multi-threaded execution using SQL Agent jobs, some auxiliary tables, and SQL Server metadata. I have already built this and was able to call the same procedure 32 times on a server, with each call processing 1/32 of the data.
Of course, one needs to pay close attention to the data-partitioning logic so the datasets do not overlap. The best way is to use the modulo operator over a numeric field.
This logic even allows different partitioning sets between steps of the procedure. In one step you can partition on field A, in the next step on field B.
As mentioned above, you need to be very careful with table locks, and something I noticed is that partitioning the tables will also speed up data insertion and updates.
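As a rough illustration of that modulo partitioning (the procedure, table, and column names here are made up, not from the answer), each of the N parallel jobs runs the same procedure with a different partition index:
create procedure dbo.usp_ProcessPartition
    @PartitionIndex int,   -- 0 .. @PartitionCount - 1, one value per parallel job
    @PartitionCount int    -- e.g. 32
as
begin
    -- Each job only touches rows whose numeric key falls in its own modulo bucket,
    -- so the parallel executions never overlap.
    update w
    set w.Processed = 1
    from dbo.WorkTable w
    where w.Id % @PartitionCount = @PartitionIndex
      and w.Processed = 0;
end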
I built a master job-generator engine using T-SQL that triggered the requested number of procedures via jobs. All of this was called from an SSIS job.
The process was far from simple to develop, but it mimics C# or Java multi-threading logic quite well.
I also had to build some auxiliary tables that hold each job's status so the master T-SQL job-engine procedure could check on each job.
I used SQL Server metadata, but each job that was created and started knew how to update its own status: when Job X starts, it updates its row in the main status-monitor table, it does the same while running, and when it finishes it closes out its status. The main job procedure keeps checking these auxiliary tables for jobs in running status and only ends when all of them have finished.
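A stripped-down version of such a status table might look like this (table and column names are illustrative only): each spawned job updates its own row, and the master procedure polls until no row is left unfinished:
create table dbo.ParallelJobStatus
(
    JobName   sysname     not null primary key,
    Status    varchar(20) not null,   -- 'Queued', 'Running', 'Finished', 'Failed'
    UpdatedAt datetime2   not null default sysdatetime()
);
-- Master loop: wait until every spawned job has reported a terminal status.
while exists (select 1 from dbo.ParallelJobStatus where Status not in ('Finished', 'Failed'))
    waitfor delay '00:00:10';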
Microsoft could think about developing something similar in SSIS.

Which user account is running SQLCMD in T-SQL Script without -U and -P option?

I am using SQLCMD in a T-SQL script to write a text file to a network location. However, SQLCMD is failing to write to that location due to access permissions on the network folder. The stored procedure is being run under my user account, which has access to the network folder.
Could you please help me understand which account SQLCMD will run under if I do not specify the -U and -P options in the T-SQL script?
Use this to find the user name:
PRINT SUSER_SNAME();
If you don't provide credentials with -U/-P, it will try to use Windows authentication, i.e. the Windows account of whoever is running it.
I often just use Process Monitor to look at what account is being used and what the permission error is.
You say you are using SQLCMD in a T-SQL script; don't you mean you are using SQLCMD to run a T-SQL script? How is your script writing a text file? Does it work in SQL Manager? My guess is that the user account SQL Server is running under doesn't have access to that location.
If you call an SQL script via xp_cmdshell without the User and Password parameters, it will run in the environment of the MSSQLSERVER service, which is very restricted, and without changing security parameters you will mostly get an 'Access is denied' message instead of the results of the script.
To avoid this security conflict I use the following trick in my stored procedure create_sql_proc: I read the text of the script file and wrap it in a procedure by adding a header and a footer to it. Now I have a script that creates a stored procedure, named @procName, from the SQL file.
If you then run this stored procedure with EXEC @procName, it will run in your security environment and deliver the result you would get by running it from a command prompt:
CREATE PROCEDURE create_sql_proc(@procName sysname, @sqlFile sysname) AS
BEGIN
DECLARE @crlf nvarchar(2) = char(13)+char(10)   -- carriage return + line feed
DECLARE @scriptText nvarchar(max)
DECLARE @cmd nvarchar(max)
= N'SET @text = (SELECT * FROM openrowset(BULK '''+@sqlFile+''', SINGLE_CLOB) as script)'
EXEC sp_executesql @cmd, N'@text nvarchar(max) output', @text = @scriptText OUTPUT
DECLARE @ProcHead nvarchar(max) = N'CREATE or ALTER PROCEDURE '+@procName+ ' AS '+@crlf+'BEGIN'+@crlf
DECLARE @ProcTail nvarchar(max) = @crlf + N'END '
SET @scriptText = @ProcHead + @scriptText + @ProcTail
-- create the wrapper stored procedure --
EXEC sys.sp_executesql @scriptText
END
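Usage would then look something like this (the file path and procedure name are placeholders):
-- Wrap the script file into a stored procedure, then run it under the caller's security context.
EXEC create_sql_proc @procName = N'usp_MyWrappedScript', @sqlFile = N'C:\Scripts\MyScript.sql';
EXEC usp_MyWrappedScript;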