Catching the OUT parameter value of a BigQuery procedure from Composer - google-bigquery

I am calling a BigQuery procedure with the BigQueryInsertJob operator, using "DECLARE out_var STRING; CALL bq_sp("input params", out_var)". The call goes through and the stored procedure executes correctly. What I also need is to print the value of out_var to the Composer log, i.e. to capture the value of out_var so that I can do further work depending on it. Could you please help me with how to capture the OUT parameter value from Cloud Composer? Thank you.
I have a stored procedure with input and output parameters and am calling it from Composer.
This is working absolutely fine.
I am expecting to fetch the OUT parameter value in the Cloud Composer job.
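One possible approach (a minimal sketch, not from the original thread): run the script from a PythonOperator with the google-cloud-bigquery client, end the script with a SELECT of the OUT variable, and read the script job's result, which is the result of the script's last statement. The project, dataset, and procedure names below are placeholders.
from google.cloud import bigquery

def call_sp_and_log_out_var(**context):
    client = bigquery.Client()
    # A script job's result set is that of its last statement,
    # so SELECT the OUT variable at the end to make it readable.
    sql = """
        DECLARE out_var STRING;
        CALL `my-project.my_dataset.bq_sp`('input params', out_var);
        SELECT out_var;
    """
    job = client.query(sql)
    out_value = list(job.result())[0][0]
    print(f"out_var = {out_value}")  # shows up in the Composer task log
    return out_value                 # pushed to XCom for downstream tasks
The trailing SELECT is the only change to the script; it turns the OUT value into a result set that the client, and hence the Composer log and XCom, can see.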

Related

Dynamic Procedure BigQuery

Does BigQuery support dynamic procedure calls? I am looking to generate a procedure name as a string based on variables and call that string.
EXECUTE IMMEDIATE returns this error, so clearly it doesn't support CALL. Is there any other way?
SQL created by EXECUTE IMMEDIATE contains unsupported statement type: CallStatement.
Thank you.
I am making this answer for visibility; the approach you can use is as follows (as mentioned in the comments by mikhail):
CREATE OR REPLACE PROCEDURE `project-id.dataset`.maker(option INT64)
BEGIN
  IF option = 1 THEN
    CALL `project-id.dataset`.create_user();     # call to procedure
  ELSEIF option = 2 THEN
    CALL `project-id.dataset`.create_customer(); # call to procedure
  ELSE
    SELECT "END"; # default
  END IF;
END
To use it:
CALL `project-id.dataset`.maker(2)
As stated in the comments, EXECUTE IMMEDIATE does not currently support CALL.
I also found a feature request on the Google issue tracker about supporting CALL with EXECUTE IMMEDIATE; you can leave a comment there to support it.
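For completeness, a minimal sketch (assuming the google-cloud-bigquery client and the hypothetical project/dataset names above) of invoking the dispatcher procedure from Python:
from google.cloud import bigquery

client = bigquery.Client()
# option=2 routes to create_customer() inside the maker procedure.
client.query("CALL `project-id.dataset`.maker(2)").result()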

Pentaho Carte: How to pass a parameter at job level

I am currently trying to develop a simple parameter-passing process using Pentaho and to execute the job from the web (Carte). I have a transformation and also a job. I can successfully pass the parameter if I execute the transformation directly: http://cluster:cluster@localhost:8080/kettle/executeTrans/?trans=/data-integration/PENTAHO_JOB/test_var.ktr&testvar=1234567
However, when I put the transformation in a job and execute it at job level, I cannot get the parameter testvar, even though the job runs successfully. I also found out that there is no Get Variables step at job level. Can I get the parameter testvar when executing at job level in Carte?
http://cluster:cluster@localhost:8080/kettle/executeJob/?job=/data-integration/PENTAHO_JOB/test_var.kjb&testvar=1234567
@Raspi Surya:
It's working for me. You need to define the variable as a parameter at job level. I used the below URL:
http://localhost:8080/kettle/executeJob/?job=C:\Data-Integration\carte-testing.kjb&file_name=C:\Data-Integration\file1.csv
See the attached screenshot.
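As an illustration only (not from the original answer), the same job-level call can be scripted; the host, credentials, and paths are the hypothetical ones from the question:
import requests

# Carte runs the job and substitutes testvar, provided testvar is
# declared as a named parameter in the .kjb job settings.
resp = requests.get(
    "http://localhost:8080/kettle/executeJob/",
    params={"job": "/data-integration/PENTAHO_JOB/test_var.kjb", "testvar": "1234567"},
    auth=("cluster", "cluster"),
)
print(resp.status_code)
print(resp.text)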

Call Worklight JavaScript SQL Adapter from REST Client

I am using IBM Worklight 7.1 and am trying to call a JavaScript SQL adapter from a REST client such as HttpRequester. I can call the adapter but cannot figure out how to pass parameters to the procedure.
For an adapter named MyAdapter and a procedure named myProc, I can call the adapter via baseUrl/MyAdapter/myProc, using both the GET and POST methods from the REST client, but all the parameters in the procedure are undefined.
function myProc(a, c) {
    return {
        result: "OK"
    };
}
I have tried passing parameters in the following ways:
As a query string: ?a=b&c=d
As a JSON string: {"a":"b","c":"d"}
As an array: parameters=['b','c']
Why do this?
The reason is to automate data setup, the procedure call, output checking, and data cleanup by scripting them, so that testing is easy and automatic. So feel free to suggest a better existing process for automating these steps.
When calling a JavaScript adapter (this answer is not applicable to Java adapters), the REST call should look like:
/{project-context}/adapters/{adapter-name}/{procedure-name}/?params=[a,b,c,d]
In other words, a JavaScript procedure receives only ONE parameter, called params, which needs to be an array of ordered, unnamed values.
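For example, a minimal sketch (hypothetical host, port, and project context) of calling the procedure above with two positional values:
import requests

# params must be a JSON array of ordered, unnamed values;
# inside myProc they arrive as a == "b" and c == "d".
resp = requests.get(
    "http://localhost:10080/MyProject/adapters/MyAdapter/myProc",
    params={"params": '["b", "d"]'},
)
print(resp.json())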

How do you retrieve the return value of a DB2 SQL sproc using Perl DBI?

I need to retrieve the value returned by a DB2 sproc that I have written. The sproc returns the number of rows in a table and is used by the calling process to decide whether or not to update other data.
I have looked at several similar questions on SO, but they refer to the use of OUT parameters rather than the sproc's return value, for example:
Perl Dbi and stored procedures
I am using a standard DBI connection to the database with both RaiseError and PrintError enabled.
$sql_stmt = "call MY_TABLE_SPACE.MY_SPROC('2011-10-31')";
$sth = $dbh->prepare($sql_stmt)
    or die "Unable to prepare SQL '$sql_stmt': " . $dbh->errstr;
$rsp = $sth->execute();
unless ($rsp) {
    print(STDERR "Unable to execute sproc: " . $dbh->errstr . "\n");
}
print(STDERR "$?\n");  # $? holds a child-process status, not the sproc's return value
I have tried looking at $h->err for both the statement handle and the db handle.
I would really prefer communicating the number of rows via a return code rather than using the SQLSTATE mechanism, if I can.
Edit:
I finished up using a dedicated OUT parameter to communicate the number of rows updated, as follows:
$sql_stmt = "call MY_TABLE_SPACE.MY_SPROC('2011-10-31', ?)";
$sth = $dbh->prepare($sql_stmt)
    or die "Unable to prepare SQL '$sql_stmt': " . $dbh->errstr;
$rows_updated = 0;
# bind_param_inout is a statement-handle method and takes a reference
$sth->bind_param_inout(1, \$rows_updated, 128)
    or die "Unable to bind OUT parameter: " . $dbh->errstr;
$rsp = $sth->execute();
unless ($rsp) {
    print(STDERR "Unable to execute sproc: " . $dbh->errstr . "\n");
}
print(STDERR "$rows_updated\n");
Edit 2:
And now, thinking about this further, I have realised that I should apply the PragProg principle of "Tell, Don't Ask". That is, I shouldn't call the sproc, have it give me back a number, and then decide whether or not to call another sproc, i.e. "Ask".
I should just call the first sproc and have it decide whether to call the other sproc or not, i.e. "Tell", and let it decide.
What is wrong with using an output parameter in your procedure? I've not got a working DB2 lying around right now or I'd provide an example, but when I was using it you could define output parameters in procedures and bind them with bind_param_inout. I cannot remember whether a DB2 procedure can return a value (like a function), but if it can, then using "? = call MY_TABLE_SPACE.MY_SPROC('2011-10-31')" would allow you to bind the output return value. If that doesn't work, you could use a DB2 function, which definitely can return a value. However, at the end of the day, the way you get data out of a procedure/function is to bind output parameters - that is just the way it is.
I've no idea what you mean by "using SQLSTATE". I've also no idea what you mean by looking at $h->err, as that is only set if the procedure fails or you cannot call it (an SQL error etc.).

Executing the contents of a memo on a TADOQuery

I have a really long list of SQL commands in a memo; when I try to execute it I get the following error:
Parameter object is improperly defined. Inconsistent or incomplete information was provided.
The code to execute it:
Query.SQL.Text := Memo1.Lines.Text;
Query.ExecSQL;
I have a vague idea that the error is caused by the way the query content was added, so here's how I'm doing it now:
1) Memo1.Lines.LoadFromFile('Patch.sql');
2) Proceed to the query commands
As you can see, the contents of the memo are loaded from a file. Is there any other way to do this successfully?
P.S.: I'm using Microsoft SQL Server 2008.
Thank you!
It looks like you're not using parameters, so switch ParamCheck off:
Query.ParamCheck := False;
If there is a colon ':' in a string in the SQL, TADOQuery otherwise thinks it's a parameter.