Execute SQL Server stored procedure from changeset - Liquibase

We have changesets that call SQL Server scripts, and several of those scripts are identical except for a version number (a string).
To consolidate the code, we are considering a stored procedure that takes the version string as a parameter.
How can a stored procedure with a parameter be called from a changeset? I've seen one example for Oracle, but it doesn't seem to work for SQL Server.
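One approach that should work for SQL Server is a formatted SQL changelog, roughly like the sketch below (the author, changeset id, and procedure name are placeholders; splitStatements:false keeps Liquibase from splitting the statement):

--liquibase formatted sql

--changeset someauthor:call-versioned-proc splitStatements:false
-- Call the shared procedure, passing the version string as a parameter.
EXEC dbo.usp_apply_version @version = '1.2.3';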

Related

Execute SQL Server stored procedure from Oracle

I have an Oracle procedure that executes a SQL Server stored procedure. The SQL Server stored procedure executes one of three stored procedures based on an id field passed to it from the Oracle procedure. Two of the stored procedures execute okay, but the third does not.
The third stored procedure has several nested stored procedure calls. It performs some calculations, inserts into 3 different tables, then does some additional calculations and updates one of the tables it just inserted into. If I execute the stored procedure in SSMS, it runs without issue. When it is executed from the Oracle side, I get nothing.
Could this be because of the nesting, but I see it work in SSMS because of the implicit commits? I tried doing a BEGIN TRANSACTION in the starting stored procedure on the MS SQL Server side and a COMMIT at the end where it should return. Still nothing. I have TRY and CATCH blocks in the MS SQL stored procedures and don't receive any errors.
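Roughly the shape of the SQL Server side, heavily simplified (all procedure names here are placeholders rather than the real ones):

CREATE PROCEDURE dbo.usp_dispatch
    @id INT
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        BEGIN TRANSACTION;
        IF @id = 1
            EXEC dbo.usp_calc_a;       -- runs fine from Oracle
        ELSE IF @id = 2
            EXEC dbo.usp_calc_b;       -- runs fine from Oracle
        ELSE
            EXEC dbo.usp_calc_c;       -- nested calls, inserts, updates; "does nothing" from Oracle
        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
        THROW;
    END CATCH
END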
Any suggestions would be greatly appreciated.
Thank you.
"Does nothing" means none of the inserts or updates show up in the tables. No error messages.
As far as posting code: what specifically would you be interested in seeing? There are 4 stored procedures, each with hundreds of lines of code.

Getting fields from SQL Server Stored Procedure

I'm trying to build a transformation in Kettle that gets FIELDS from a SQL Server stored procedure and inserts them into a MySQL table.
The problem is that I can't find a way to get the stored procedure "fields". I understand that the Call DB Procedure step expects in/out params, and that's not my case, so I'm trying to use "Execute SQL Statements" with the following SQL:
exec credisfera.dbo.sp_insere_parcelas @dt_ref = '2016-05-03'
Is there a way to achieve this?
Simply put the exec statement in a Table input step. Upon execution (or when you click "Output fields..."), PDI will get the metadata from the JDBC driver.
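For example, the Table input SQL can be just the call from the question; depending on what the procedure does, a SET NOCOUNT ON in front may be needed so the driver returns the result set rather than row-count messages:

-- SQL for the Table input step (procedure and parameter taken from the question)
SET NOCOUNT ON;
EXEC credisfera.dbo.sp_insere_parcelas @dt_ref = '2016-05-03';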

Running synchronous commands between two SQL Servers

I'm running a stored procedure on server1 from my application. The stored procedure does a bunch of stuff and populates a table on server2 with the result.
I'm using a linked server to accomplish this.
When the stored procedure is done running, the application continues and tries to do some manipulation of the result from the stored procedure.
My problem is that the results from the stored procedure have not been completely inserted into the tables yet, so the manipulation of the tables fails.
So my question is: is it possible to ensure the INSERT INTO on the linked server is done synchronously? I would like the stored procedure not to return until the tables on the linked server are actually done.
You can use an output parameter on the first procedure. When the table is created on the second server, the output parameter value will be returned to your application, indicating the operation is ready.
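A sketch of the idea (procedure, server, and table names are made up):

-- On server1: the OUTPUT parameter is only set after the linked-server insert finishes
CREATE PROCEDURE dbo.usp_populate_remote
    @rows_inserted INT OUTPUT
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO [server2].[remotedb].dbo.results (id, amount)
    SELECT id, amount
    FROM dbo.staging;

    SET @rows_inserted = @@ROWCOUNT;   -- returned to the application when the procedure ends
END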
If that turns out to be difficult, you can try setting a different isolation level in your stored procedure:
http://msdn.microsoft.com/en-us/library/ms173763.aspx
I found the reason for this strange behavior. There was a line of code in my stored procedure, added during debugging, that did a SELECT on a temporary in-memory table before the data in that same table was written to the linked server.
When the SELECT statement ran, control was given back to my application while the stored procedure continued running. I guess the stored procedure was running synchronously from the start.
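In other words, something like this (simplified, with made-up names):

DECLARE @work TABLE (id INT, amount DECIMAL(10, 2));

INSERT INTO @work (id, amount)
VALUES (1, 10.00), (2, 20.00);

-- Debug statement left in by mistake: it returns a result set here,
-- so the application got its "result" and carried on at this point...
SELECT * FROM @work;

-- ...while the insert over the linked server was still running.
INSERT INTO [server2].[remotedb].dbo.results (id, amount)
SELECT id, amount FROM @work;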

Check if stored procedure is valid

I have a stored procedure (sproc A) which is syntactically correct. So when I hit "run" on its create or alter statement, it is saved into the database.
However, sproc A has a call to another stored procedure (sproc B). It does not provide enough parameters for sproc B, so I don't see how it's a valid stored procedure.
I want to detect any stored procedures in my database which aren't passing enough parameters to the stored procedures they call.
Thank you,
Fidel
Unfortunately, there is no mechanism in SQL Server to test dependencies, parameters, etc.
You have to search and check, or provide defaults for the parameters. Otherwise you'll only pick it up by testing.
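For the "search" part, something along these lines will at least list candidate callers to review by hand (substitute the name of the called procedure):

-- Find procedures whose definition mentions the called procedure
SELECT OBJECT_SCHEMA_NAME(m.object_id) AS schema_name,
       OBJECT_NAME(m.object_id)        AS proc_name
FROM sys.sql_modules AS m
JOIN sys.procedures  AS p ON p.object_id = m.object_id
WHERE m.definition LIKE '%sprocB%';    -- name of the procedure being called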
A good autocomplete tool like Red Gate SQL Prompt can list parameters and types for you.
Note:
It's a long-standing problem, and there is even a feature request to Microsoft covering this.
SP parameter checking is one of the OPTION STRICT suggestions.

Stored procedure SQL SELECT statement issue

I am using SQL Server 2008 Enterprise on Windows Server 2008 Enterprise. In a stored procedure, we can execute a SELECT statement directly, and it could also be executed in this new way. I am wondering which method is better, and why?
New method,
declare @teststatement varchar(500)
set @teststatement = 'SELECT * from sometable'
print @teststatement
exec (@teststatement)
Traditional method,
SELECT * from sometable
regards,
George
FYI: it’s not a new method, it is known as Dynamic SQL.
Dynamic SQL is preferred when we need to set or concatenate certain values into SQL statements.
Traditional (static) SQL statements are recommended because stored procedures are compiled on their first run, and the execution plans for their statements are created at compile time.
Dynamic SQL is skipped when those execution plans are created, because it is only a string (VARCHAR or NVARCHAR, as declared).
Refer to the following articles for more details about dynamic queries and stored procedures:
Introduction to Dynamic SQL Part 1
Introduction to Dynamic SQL Part 2
Everything you wanted to know about Stored Procedures
The traditional method is safer, because the query is parsed when you save it. The query in the EXEC method is not parsed until run time and can contain errors.
The "new" way, as mentioned, has nothing to do with SQL 2008. EXEC has been available for quite some time. It's also - in most cases - a Very Bad Idea.
You lose parameterization - meaning you are now vulnerable to SQL injection. It's ugly and error-prone. It's less efficient. And it creates a new execution scope - meaning it can't share variables, temp tables, etc. with its calling stored proc.
sp_executesql is another (and preferred) method of executing dynamic SQL. It's what your client apps use, and it supports parameters - which fixes the most glaring problem of EXEC. However, it too has very limited use cases within a stored proc. About the only redeeming use is when you need a dynamic table or column name. T-SQL does not support a variable for that - so you need to use sp_executesql. The number of times you need or should be doing that is very low.
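For reference, a rough example of the parameterized sp_executesql form (the table is the one from the question; the id column and value are made up):

DECLARE @sql NVARCHAR(500) = N'SELECT * FROM sometable WHERE id = @id';
DECLARE @id  INT = 42;

-- The value travels as a typed parameter, not as concatenated text,
-- so it is not an injection vector and the plan can be reused.
EXEC sys.sp_executesql @sql, N'@id INT', @id = @id;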
Bottom line - you'd be best off forgetting you ever heard of it.