How to get straight table values or table data into a variable in QlikView?

I am going to use the table data, fetched through a variable, in JavaScript to generate PPT reports using the pptxgenjs library.

Use something like LET MyVar = Peek(...) (see the QlikView documentation for Peek()).
If you need to merge values from multiple rows or fields, do that first with the script function Concat() to load those values into a string in a new (temporary) table, then use Peek() on that table, as sketched below.
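A minimal script sketch of that approach, assuming the data is already loaded into a table called MyData with a field MyField (both names are illustrative):

TempConcat:
LOAD Concat(MyField, ',') AS AllValues
RESIDENT MyData;

// Peek() reads a single field value; row 0 is the first (and only) row here
LET vMyVar = Peek('AllValues', 0, 'TempConcat');

DROP TABLE TempConcat;

The variable vMyVar then holds the concatenated string, which the front end (or an extension) can read and hand to the JavaScript/pptxgenjs code.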

Related

Azure Data Factory - pass multiple values from lookup into dynamic query?

I have a lookup function that returns a list of valid GUID IDs in ADF. I then have a foreach process which runs a stored procedure for each GUID ID and assigns an ID column to it.
What I want to do is then have another lookup run the below query to bring me the GUID and also the newly assigned ID. It is very simple to write in SQL:
SELECT GUID, Identifier from DBO.GuidLOAD
WHERE GUID in ('GUIDID','GUIDID','GUIDID')
However, I am struggling to translate this into ADF. I have got as far as the @concat part, and most of the help I find online only refers to dynamic queries with single values as input parameters, whereas mine is a list of GUIDs of which there may be one, more, or none at all.
Can someone advise the best way of writing this dynamic query?
The first two activities run fine; I just need the third lookup to run the query based on the output of the first lookup.
You can use string interpolation (@{...}) instead of concat(). I have a sample demo table with 2 records.
Now, I have a sample lookup which returns 3 GUID records (visible in the debug output of the lookup activity).
Next, I used a ForEach loop to build an array of the GUIDs returned by the lookup, using an Append variable activity. The Items value of the ForEach activity is @activity('get guid').output.value, and the Append variable activity inside the ForEach appends:
@item().guids
I then used the join() function on the above array variable to create a string which can be used in the required query:
"@{join(variables('req'),'","')}"
Now, the query expects GUIDs wrapped in single quotes, i.e. WHERE GUID in ('GUIDID','GUIDID','GUIDID'). So I created 2 parameters with the following values, and used them to replace the double quotes in the final variable with single quotes:
singlequote: '
doublequote: "
Now, in the lookup where you want to use your query (after assigning the joined string above to a variable named final), you can build it with the following dynamic content:
SELECT guid, identifier from dbo.demo WHERE GUID in (@{replace(variables('final'),pipeline().parameters.doublequote,pipeline().parameters.singlequote)})
Now, when I debug the pipeline, the following query is executed, which can be seen in the debug input of the final lookup.
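With the placeholder GUIDs from the question, the resolved query that reaches the final lookup should look roughly like this:

SELECT guid, identifier FROM dbo.demo WHERE GUID in ('GUIDID','GUIDID','GUIDID')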
Only one row should be returned from the sample data I have taken, and the output is as expected.

Dynamic Parameter in Power Pivot Query

We are using Excel 2013 and Power Pivot to build modules that consist of several Pivot tables that are all pulling data from the same Power Pivot table, which queries our T-SQL data warehouse.
In an effort to simplify and fully automate this module, we wanted to create a text field that would allow a user to enter a value (a client ID# for example), and then have that value be used as a parameter in the Power Pivot query.
Is it possible to pass a Parameter in the Power Pivot query, which is housed in a text field outside of the query?
You can also pass a slicer or combobox selection to a cell. Define a name for that cell. Put that cell (and others if you have multiple text variables to use) in a table. For convenience, I usually name this table "Parameters". You can then 'read in' the parameters to your query and drop them in your query statements.
The code at the top of your query to read these parameters in might look like...
let
    // Read the workbook table of parameters ("Parameters", as named above)
    Parameter_Table = Excel.CurrentWorkbook(){[Name="Parameters"]}[Content],
    // Note: row indexing in M is 0-based, so {1} is the table's second row
    XXX_Value = Parameter_Table{1}[Value],
    YYY_Value = Parameter_Table{2}[Value],
    ZZZ_Value = Parameter_Table{3}[Value],
This is followed by your query, wherein instead of searching for, say, a manually typed-in customer called "BigDataCo", you would use XXX_Value; a sketch follows below.
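As a rough illustration only (the server, database, table, and column names here are hypothetical, not from the original answer), the full pattern might look like:

let
    Parameter_Table = Excel.CurrentWorkbook(){[Name="Parameters"]}[Content],
    XXX_Value = Parameter_Table{1}[Value],
    // Hypothetical source: a SQL Server warehouse with a dbo.Sales table
    Source = Sql.Database("MyServer", "MyWarehouse"),
    Sales = Source{[Schema="dbo", Item="Sales"]}[Data],
    // Filter on the parameter instead of a hard-coded value such as "BigDataCo"
    Filtered = Table.SelectRows(Sales, each [Customer] = XXX_Value)
in
    Filtered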
Refreshing the link each time a different customer is selected will indeed be a very slow approach, but this has worked for me.
Rather than pass a parameter to the data source SQL query, why not utilize a pivot table filter or slicer to allow the users to dynamically filter the data? This is much faster than refreshing the data from the source.
If for some reason you need to pass this directly to the source query, you'll have to do some VBA work.

DATA foo TYPE ANY TABLE possible?

I have the following problem:
I'm testing many BAPIs and don't want to create a table with the corresponding row types each time I call a new BAPI.
Is it possible to declare something like a generic table:
DATA foo TYPE ANY TABLE.
and pass it as the table parameter to receive the result of the BAPI?
No, this is not possible - you can't declare variables using a generic type. However, you could determine the data type at runtime (e.g. using the RPY_FUNCTIONMODULE_* function modules) and then use CREATE DATA to create the table dynamically through a data reference. Check the documentation of CREATE DATA for an example.
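A minimal sketch of the CREATE DATA approach (the structure name 'BAPIRET2' and the variable names are only illustrative; in practice the type name would come from the function module's interface):

* The row type is only known at runtime, so create the table dynamically
DATA lr_table TYPE REF TO data.
FIELD-SYMBOLS <lt_table> TYPE STANDARD TABLE.

CREATE DATA lr_table TYPE STANDARD TABLE OF ('BAPIRET2').
ASSIGN lr_table->* TO <lt_table>.

* <lt_table> can now be passed as the TABLES parameter of the BAPI call.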

SSIS execute sql task parameter mapping

I'm trying to execute a SQL script using the Execute SQL Task in SSIS.
My script just inserts a bunch of name/value pairs into a table. For example -
insert into mytable (name, value)
values
(?, 'value1'),
(?, 'value2')
Now, I want a variable defined in SSIS to be mapped to the parameters in the statement above. I tried defining a scalar variable, but the Execute SQL Task doesn't seem to like that. Note that all of the name parameters in the insert statement resolve to a single variable.
For example I want
insert into mytable (name, value)
values
('name1', 'value1'),
('name1', 'value2')
When I open the Parameter Mapping tab for the task, it wants me to map each parameter individually, like -
Variable Name - User::Name
Direction - Input
Data Type - LONG
Parameter Name - 0
Parameter Size - -1
Variable Name - User::Name
Direction - Input
Data Type - LONG
Parameter Name - 1
Parameter Size - -1
This quickly gets out of hand and cumbersome if I have 5-10 values for a name, and forces me to add multiple mappings for the same variable.
Is there an easy(-ier) way to do this?
The easiest (and most extensible) way, is to use a Data Flow Task instead of using an Execute SQL Task.
Add a Dataflow Task; I assume that you have all the variables filled with the right parameters, and that you know how to pass the values onto them.
Create a dummy row with the columns you will need to insert, so use whatever pleases you the most as a source (in this example, I've used an OLE DB connection). One good tip is to define the datatype(s) of each column in the source as you will need them in your destination table. This will align the metadata of the dataflow with that of the insert table (Screenshot #1).
Then add a multicast component to the dataflow.
For the first parameter/value, add a derived column component, name it cleanly and proceed to substitute the content of your parameters with your variables.
For each further parameter/value that needs to be added, copy the previously created derived column component, add one extra branch from the multicast component, and substitute the column parameter/value as necessary.
Add a Union All component and join all the flows.
Insert into the table.
Voilà! (Screenshot #2)
The good thing about this method is that you can make it as extensible as you wish: validate each value with different criteria, modify the data, add business rules, discard non-compliant values (by checking the full number of complying values)!
Have a nice day!
Francisco.
PS: I had prepared a couple more screenshots, but Stack Overflow has decided that I am too new to the site to post things with images or more than two links. Oh well.
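For completeness, if you would rather keep the Execute SQL Task, one workaround sometimes used (not part of the answer above, and only a sketch - verify it against your connection manager) is to map the SSIS variable to a single ? parameter and reuse it through a T-SQL variable. With an OLE DB connection, mapping User::Name once to parameter name 0:

-- The single ? is bound to User::Name via the Parameter Mapping tab
DECLARE @name NVARCHAR(100) = ?;

INSERT INTO mytable (name, value)
VALUES (@name, 'value1'),
       (@name, 'value2');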

query a table not in 3rd normal form

Hi, I have a table which was designed by a lazy developer who did not create it in 3rd normal form. He saved arrays in the table instead of using an M:M relation, and the application is running, so I cannot change the database schema.
I need to query the table like this:
SELECT * FROM myTable
WHERE usergroup = 20
where the usergroup field contains data like this: 17,19,20, or it could also be only 20 or only 19.
I could search with LIKE:
SELECT * FROM myTable
WHERE usergroup LIKE '%20%'
but in this case it would also match fields which contain 200, for example.
Does anybody have any idea?
Thanks
Fix the bad database design.
A short-term fix is to add a related table with the correct structure. Add a trigger that parses the info in the old field into the related table on insert and update. Then write a script to parse out the existing data. Now you can properly query, but you haven't broken any of the old code. Then you can search for the old code and fix it. Once you have done that, just change how data is inserted or updated in the original table to use the new table, and drop the old column.
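A rough T-SQL sketch of such a related table (all names are hypothetical, and it assumes myTable has an id primary key); once populated, the query becomes a plain join:

CREATE TABLE myTableUsergroup (
    myTableId INT NOT NULL,   -- references myTable's primary key
    usergroup INT NOT NULL,
    PRIMARY KEY (myTableId, usergroup)
);

SELECT t.*
FROM myTable AS t
JOIN myTableUsergroup AS ug ON ug.myTableId = t.id
WHERE ug.usergroup = 20;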
Write a table-valued user-defined function (a UDF in SQL Server; it will have a different name in other RDBMSs) to parse the values of the column containing the list stored as a string. For each item in the comma-delimited list, the function should return a row in the result table. Then query against the results returned from the UDF.
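On SQL Server 2016 or later, the built-in STRING_SPLIT function can play the role of such a UDF; a minimal sketch against the table from the question:

SELECT t.*
FROM myTable AS t
CROSS APPLY STRING_SPLIT(t.usergroup, ',') AS s
WHERE LTRIM(RTRIM(s.value)) = '20';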
Write a function to convert a comma delimited list to a table. Should be pretty simple. Then you can use IN().