array of input tables as pipeline parameter - azure-data-factory-2

Continuing from a question that has already been resolved:
how to concatenate with parameter in azure data factory / passing dynamic table names in source query
I have created a pipeline parameter and am passing an array of tables to perform multiple (incremental) copy operations using a control table (watermark table). I have successfully tested the pipeline with a single table as a parameter of type String. However, when I change the parameter from String to Array and pass 2 tables, ["table1","table2"], I get the error:
"concat does not have an overload that supports the arguments given:(StringLiteral,Array.........)"
I then updated the parameterized query as follows (converting the array to a string):
#concat('select * from DW_GL.',string(pipeline().parameters.p_param_input_table),' where updated_on > ''',activity('Old_Lookup1').output.firstRow.date_value,''' and updated_on <= ''',activity('Old_Lookup1').output.firstRow.date_value_new, '''')
Now I get the following error:
ErrorCode=UserErrorInvalidValueInPayload,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to convert the value in 'table' property to 'System.String' type. Please make sure the payload structure and value are correct.,Source=Microsoft.DataTransfer.DataContracts,''Type=System.InvalidCastException,Message=Object must implement IConvertible.,Source=mscorlib,'

You have an array of tables and are trying to use it in the same query simply by substituting the array where the string used to be.
If you want to apply the process that worked for a single table to each table separately, you can use a ForEach loop. Let's say I have the following array of table names:
I configured the ForEach activity with its items value set to the above parameter. Inside the ForEach I placed a sample Script activity, using a query similar to the one below:
#concat('select * from DW_GL.',item(),' where updated_on > ''',activity('Old_Lookup1').output.firstRow.id,''' and updated_on <= ''',activity('Old_Lookup1').output.firstRow.id, '''')
Within this ForEach, keep all the activities that you used in the single-table (String parameter) case.
If you are instead trying to query all the tables simultaneously, you can check which query actually gets executed in the debug input of the Script activity. Using the same dynamic content as yours builds the query in the following way:
To select from multiple tables at a time, you can use join() instead of directly converting the array to a string. Use the following dynamic content instead:
#concat('select * from DW_GL.',join(pipeline().parameters.p_param_input_table,',DW_GL.'),' where updated_on > ''',activity('Old_Lookup1').output.firstRow.date_value,''' and updated_on <= ''',activity('Old_Lookup1').output.firstRow.date_value_new, '''')
NOTE: My activity failed because I only wanted to demonstrate which query would be executed for the given dynamic content. Adapt it as needed.
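To see what that join() expression evaluates to, here is a small Python sketch that mimics the string construction. The table names and dates are made-up stand-ins for the pipeline parameter and the lookup outputs:

```python
# Hypothetical stand-ins for the pipeline parameter and lookup values.
tables = ["table1", "table2"]            # pipeline().parameters.p_param_input_table
date_old, date_new = "2023-01-01", "2023-02-01"

# join(array, ',DW_GL.') glues the names with ',DW_GL.' between them;
# prepending 'DW_GL.' gives a schema-qualified list of all the tables.
table_list = "DW_GL." + ",DW_GL.".join(tables)
query = (f"select * from {table_list} "
         f"where updated_on > '{date_old}' and updated_on <= '{date_new}'")
print(query)
```

This makes it easy to verify the generated FROM clause before wiring the expression into the copy activity.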

Related

How can you filter Snowflake EXPLAIN AS TABULAR syntax when its embedded in the TABLE function? Can you filter it with anything?

I have a table named Posts I would like to count and profile in Snowflake using the current Snowsight UI.
When I return the results via EXPLAIN USING TABULAR, I am able to return the set with the combination of the TABLE, RESULT_SCAN, and LAST_QUERY_ID functions, but any predicate, filter, or column reference seems to fail.
Is there a valid way to do this in Snowflake with the TABLE function, or is there another way to query the output of EXPLAIN USING TABULAR?
-- Works
EXPLAIN using TABULAR SELECT COUNT(*) from Posts;
-- Works
SELECT t.* FROM TABLE(RESULT_SCAN(LAST_QUERY_ID())) as t;
-- Does not work
SELECT t.* FROM TABLE(RESULT_SCAN(LAST_QUERY_ID())) as t where operation = 'GlobalStats';
-- invalid identifier 'OPERATION', the column does not seem recognized.
I tried the third example and expected the predicate to apply to the function output. I don't understand why the filter works on some TABLE() results and not others.
You need to double-quote the column name:
where "operation" = 'GlobalStats'
From the documentation:
Note that because the output column names from the DESC USER command were generated in lowercase, the commands use delimited identifier notation (double quotes) around the column names in the query to ensure that the column names in the query match the column names in the output that was scanned.
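If you build such predicates programmatically, the quoting rule can be captured in a tiny helper. This is a hedged Python sketch with a hypothetical function name, not part of any Snowflake client API:

```python
def quoted_filter(column: str, value: str) -> str:
    """Build a WHERE clause using delimited-identifier notation, so
    Snowflake matches the lowercase column name from RESULT_SCAN exactly."""
    return f'where "{column}" = \'{value}\''

print(quoted_filter("operation", "GlobalStats"))
```

Without the double quotes, Snowflake upper-cases the bare identifier and fails to find the lowercase column in the scanned output.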

Custom column in Custom SQL on BigQuery

I'm stuck on an issue in Tableau when trying to run a Custom SQL query with a string parameter. I'd like to query one column dynamically from a certain table on BigQuery.
My SQL looks like:
select <Parameters.column for research> as column,
count(*) as N
from table_name
where date=<Parameters.date>
group by 1
Here I'm trying to use the parameter as a column name.
But unfortunately I receive a string column containing only the value of the parameter.
Is it possible to execute my request? If so, how should I write the Custom SQL?

PostgreSQL - How to cast dynamically?

I have a column that has the type of the dataset in text.
So I want to do something like this:
SELECT CAST ('100' AS %INTEGER%);
SELECT CAST (100 AS %TEXT%);
SELECT CAST ('100' AS (SELECT type FROM dataset_types WHERE id = 2));
Is that possible with PostgreSQL?
SQL is strongly typed and static. Postgres demands to know the number of columns and their data types at the time of the call. So you need dynamic SQL in one of the procedural-language extensions for this. And then you still face the obstacle that functions (necessarily) have a fixed return type. Related:
Dynamically define returning row types based on a passed given table in plpgsql?
Function to return dynamic set of columns for given table
Or you go with a two-step flow: first concatenate the query string (with another SELECT query), then execute the generated query string. Two round trips to the server.
SELECT '100::' || type FROM dataset_types WHERE id = 2; -- record resulting string
Execute the result. (And make sure you didn't open any vectors for SQL injection!)
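The two-step flow can be sketched in Python. The dictionary stands in for the dataset_types table, and the type name is assumed to come back as 'integer'; in practice both round trips would go through a driver such as psycopg2:

```python
# Step 1 (simulated): look up the target type name.
# Stand-in for: SELECT '100::' || type FROM dataset_types WHERE id = 2;
dataset_types = {2: "integer"}      # hypothetical table contents
type_name = dataset_types[2]

# Step 2: concatenate the query string using the short cast syntax,
# then send the generated string back to the server for execution.
query = f"SELECT '100'::{type_name};"
print(query)
```

Because the type name is spliced into the SQL text, it must come from a trusted source (or be validated against pg_type) to avoid SQL injection.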
About the short cast syntax:
Postgres data type cast

Can you concatenate two strings to create dynamic column names in PostgreSQL?

I have a form where people can type in a start and end date, as well as a column name prefix.
In the backend, I want to do something along the lines of
SELECT *, CAST('{{startDate}}' AS TIMESTAMP) AS ({{prefix}} + '_startDate')
Is this possible? Basically, I want to dynamically create the name of the new column. The table is immediately returned to the user, so I don't want to mutate the underlying table itself. Thanks!
You can execute a dynamic query that you have prepared by using the EXECUTE keyword; otherwise it is not possible to have a dynamic SQL structure.
Since you are preparing your SQL outside database, you can use something like:
SELECT *, CAST('{{startDate}}' AS TIMESTAMP) AS {{prefix}}_startDate
Assuming that {{prefix}} is replaced with some string by your templating engine before the query is sent to the database.

Array parameter for TADOQuery in Delphi 2010

I need to execute a simple query:
SELECT * FROM MyTable WHERE Id IN (:ids)
Obviously, it returns the set of records which have their primary key 'Id' in the given list. How can I pass an array of integer IDs into ADOQuery.Parameters for parameter 'ids'? I have tried VarArray - it does not work. Parameter 'ids' has FieldType = ftInteger by default, if it matters.
There is no parameter type that can be used to pass a list of values to in. Unfortunately, this is one of the shortcomings of parameterized SQL.
You'll have to build the query in code and either generate the list of values, or generate a list of parameters which can then be filled from code. That way, you can pass each value as a different parameter, like this:
SELECT * FROM MyTable WHERE Id IN (:id1, :id2, :id3)
But since the list will probably have a variable size, you'll have to alter the SQL to add the parameters. In that case it is just as easy to generate the list of values, although parameterized queries may be cached better, depending on which DB you use.
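The placeholder-generation step can be sketched in Python (the ID values are hypothetical; in Delphi you would build the same string and then add one TParameter per value):

```python
ids = [10, 20, 30]  # hypothetical list of primary-key values

# One named parameter per value (:id1, :id2, ...), so each value can
# still be bound through ADOQuery.Parameters instead of being inlined.
placeholders = ", ".join(f":id{i}" for i in range(1, len(ids) + 1))
sql = f"SELECT * FROM MyTable WHERE Id IN ({placeholders})"
print(sql)
```

Keeping the values as bound parameters preserves type safety and avoids having to escape them yourself.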
The IN clause just takes a comma-separated list of values like (1,2,3,4,5), so I assume you could set the datatype to ftString, build the string, and pass that...? I haven't tried it, but it's what I would try.