Spotfire dynamic filtering

I have a file which consists of a few part numbers. Using this file, I need to exclude data in a dashboard from another table which also has part numbers. How can I filter data out of the table based on the part numbers present in the file, given that the part numbers in the file can change over time?

When you import the file with a list of part numbers, add a calculated column under transformations (also make sure that it's not reading the first record of your part list file as a header row--I don't know what your file looks like). In the expression box, just enter something simple like 1. Call this new dataset something like part_list. This column represents a flag that we will add to the table that is already in your dashboard. Let's suppose that table is called data.
Once the file is imported, click Insert > Columns... and ensure that data is selected in the "Add columns to data table:" drop-down box, and that part_list is selected in the "Add columns from:" menu. Click Next. Match the part number column in both tables, and click Next. Add the flag column to data with a left outer join (assuming this makes sense with your data). Once the column is added, you can filter out the rows where the flag is 1.
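If it helps to see the logic outside Spotfire, here is a rough pandas sketch of the same flag / left outer join / filter idea. The column name PartNumber and the file names are assumptions for illustration, not taken from your data:

# Rough sketch of the flag + left outer join + filter logic (names assumed).
import pandas as pd

data = pd.read_csv("data.csv")            # the table already in your dashboard
part_list = pd.read_csv("part_list.csv")  # the file with part numbers to exclude
part_list["flag"] = 1                     # the calculated column that is always 1

# Left outer join: rows whose part number appears in the file get flag = 1.
joined = data.merge(part_list[["PartNumber", "flag"]], on="PartNumber", how="left")

# Keep only the rows that did NOT match the part list.
filtered = joined[joined["flag"].isna()].drop(columns="flag")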
If this does not answer your question, consider providing more details about what your data looks like.

Related

Create table schema and load data into a BigQuery table using Google Drive as source

I am creating a table using Google Drive as the source and Google Sheet as the format.
I have selected "Drive" as the value for "Create table from". For the file format, I selected Google Sheet.
I also selected Auto Detect for the schema and input parameters.
It creates the table, but the first row of the sheet is loaded as data instead of being used as the table's field names.
Please tell me what I need to do to get the first row of the sheet treated as the table column names rather than as data.
It would have been helpful if you could include a screenshot of the top few rows of the file you're trying to upload, at least to see the data types you have in there. As of when this response was composed, BigQuery cannot differentiate between column names and data rows if both have similar data types while schema auto detection is used. For instance, if your data looks like this:
headerA, headerB
row1a, row1b
row2a, row2b
row3a, row3b
BigQuery would not be able to detect the column names (at least automatically using the UI options alone) since all the headers and row data are strings. The "Header rows to skip" option would not help with this.
Schema auto detection should be able to detect and differentiate column names from data rows when you have different data types for different columns, though. For example, if headerB's rows contained numbers such as 10, 20 and 30, the string header would stand out from the numeric data.
You have an option to skip header rows under Advanced options. Simply put 1 as the number of rows to skip (your first row is where your header is). It will skip the first row and use it for your header.
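If you are creating the table programmatically rather than through the UI, the same option is exposed in the client libraries. Here is a minimal sketch using the google-cloud-bigquery Python client to define a Sheets-backed external table; the table ID and spreadsheet URL are placeholders, and the credentials used by the client also need Drive access:

# Minimal sketch (placeholder IDs/URL): skip the first row so it becomes the header.
from google.cloud import bigquery

client = bigquery.Client()

external_config = bigquery.ExternalConfig("GOOGLE_SHEETS")
external_config.source_uris = ["https://docs.google.com/spreadsheets/d/YOUR_SHEET_ID"]
external_config.autodetect = True                # schema auto detection
external_config.options.skip_leading_rows = 1    # same as "Header rows to skip: 1"

table = bigquery.Table("your-project.your_dataset.your_table")
table.external_data_configuration = external_config
client.create_table(table)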

SM30: Set matching column heading

I created a table in SAP via SE11, then I used the table maintenance generator.
Now I edit the table via SM30:
The second and the third columns both have the heading "Feldname" ("field name").
The first "Feldname" column is called COLUMN_NAME and its data element is "Fieldname".
The second "Feldname" column is called AUTH_FIELD and its data element is "XUFIELD".
I would like to see the column names which I gave the columns in SE16 (COLUMN_NAME, AUTH_FIELD) in the headings.
How can I prevent the table maintenance generator from putting other names in the headings?
Option 1 - use custom data elements:
Instead of using the Fieldname and XUFIELD data elements, you can create your own custom data elements and give them whatever headings you would like.
(You will have to regenerate table maintenance)
Option 2 - editing screen
When you generated the table maintenance, you supplied a function group and a screen number.
Go to SE80 -> Function Groups -> <function_group_supplied> -> screens -> <screen_supplied>.
Then edit it as you want.
Note: Modifying a generated object is considered risky. Your customized changes might be overwritten in a future regeneration.
Add custom data elements with suitable descriptions. Let the new data elements refer to the original ones (or to their domains) to avoid having to reinvent everything.
Data element descriptions can be translated.
You can set different descriptions for different lengths, e.g. "Field" for the narrow column with length 10, and "Field name" for a wide label with length 30.
Regenerating the maintenance screen won't accidentally delete the changed descriptions.

SSRS SQL Report Builder: deleting a column

I have a problem with SSRS Report Builder. Basically, what I want to do is delete a column. I have a report that someone else made, and there is one column (xxx) that no longer exists in the data source tables, so I need to delete it.
When I go to the query designer and delete this column from the code and run it there, it works. I close the query designer window and see that the list of columns (fields) is now updated and the xxx column is not there. Then I delete this column manually in the designer (default screen), and when I try to run the report, it doesn't work:
"The Value expression for the text box ‘XXXDataField’ refers to the field ‘xxx’. Report item expressions can only refer to fields within the current dataset scope or, if inside an aggregate, the specified dataset scope. Letters in the names of fields must use the correct case."
But that field should already be deleted. So I don't know what else I can do, or what it could be linked to. I just want to delete it. Any idea?
Thank you
If your dataset no longer contains xxx but your report table still does, this error is normal.
Either delete that column from your table, or at least delete the dataset binding from the table so that SSRS does not try to retrieve that column from the dataset.
There will be a reference to the field in another report item. For example, if you deleted a column that showed an OrderShipped status, you might have another text box highlighted based on it.
The error is telling you which textbox is in error. So, click somewhere on the designer, then in the properties window, right at the top, click the dropdown which allows you to choose specific report items, choose XXXDataField (the one named in the error message) and then check the value expression. In there you will find the reference to the column you deleted.
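If clicking through the designer gets tedious, the .rdl file is plain XML, so you can also search it for any leftover references to the deleted field. A rough Python sketch, where the report file name and field name are placeholders:

# Scan a report definition (.rdl) for leftover references to a deleted field.
# The file name and field name below are placeholders.
from pathlib import Path

rdl_path = Path("MyReport.rdl")
deleted_field = "xxx"

for line_no, line in enumerate(rdl_path.read_text(encoding="utf-8").splitlines(), start=1):
    if f"Fields!{deleted_field}" in line:
        print(f"line {line_no}: {line.strip()}")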

PDI / Kettle - Passing data from a previous hop to a database query

I'm new to PDI and Kettle, and what I thought was a simple experiment to teach myself some basics has turned into a lot of frustration.
I want to check a database to see if a particular record exists (i.e. vendor). I would like to get the name of the vendor from reading a flat file (.CSV).
My first hurdle is selecting only the vendor name from the 8 fields in the CSV.
The second hurdle is how to use that vendor name as a variable in a database query.
My third issue is what type of step to use for the database lookup.
I tried a dynamic SQL query, but I couldn't determine how to build the query using a variable, then how to pass the desired value to the variable.
The database table (VendorRatings) has 30 fields, one of which is vendor. The CSV also has 8 fields, one of which is also vendor.
My best effort was a dynamic query using:
SELECT * FROM VENDORRATINGS WHERE VENDOR = ?
How do I programmatically assign the desired value to "?" in the query? Specifically, how do I link the output of a specific field from Text File Input to the "vendor = ?" SQL query?
The best practice is a Stream lookup. For each record in the main flow (VendorRatings), look up the vendor details (the lookup fields) in the reference file (the CSV), based on its identifier (possibly its number or name, or firstname+lastname).
First "hurdle": once the path of the CSV file is defined, press the Get Fields button.
It will take the first line as the header to learn the field names, and explore the first 100 (customizable) records to determine the field types.
If the names are not on the first line, uncheck Header row present, press the Get Fields button, and then change the names in the panel.
If there is more than one header row or other complexities, use the Text file input step.
The same is valid for the lookup step: use the Get lookup fields button and delete the fields you do not need.
Due to the fact that there is at most one vendor rating per vendor, and that you have to do something if there is no match, I suggest the following flow:
Read the CSV and, for each row, look up in the table (i.e. the lookup table is the SQL table rather than the CSV file), and put a default value when there is no match. I suggest something really visible like "--- NO MATCH ---".
Then, in case of no match, a filter redirects the flow to the alternative action (here: insert into the SQL table). The two flows are then merged back into the downstream flow.
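If it helps to see what that flow boils down to outside PDI, here is a rough Python sketch of the same lookup-with-default-and-insert logic. It uses sqlite3 purely as a stand-in for your real database connection, and the file, table and column names are assumptions:

# Rough sketch of the lookup + insert-on-no-match logic (names assumed).
import csv
import sqlite3  # stand-in for your real database connection

conn = sqlite3.connect("ratings.db")
cur = conn.cursor()

with open("vendors.csv", newline="") as f:
    for row in csv.DictReader(f):
        vendor = row["vendor"]  # keep only the field you need from the 8 CSV fields
        cur.execute("SELECT * FROM VENDORRATINGS WHERE VENDOR = ?", (vendor,))
        if cur.fetchone() is None:
            # the "--- NO MATCH ---" branch: insert the missing vendor
            cur.execute("INSERT INTO VENDORRATINGS (VENDOR) VALUES (?)", (vendor,))

conn.commit()
conn.close()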

QlikView Current Selections box to use labels rather than table.fieldnames

In my QlikView document I want to change the Current Selections information to use the label applied to the field rather than the table.field format.
For example
PartsTable.PartNo
Would be
Part Number
Unfortunately there's no way to do this in the Current Selections object. However, you may have some alternatives depending on your requirements.
First of all, if you're just happy with seeing the current selections as text (rather than having the functionality of the Current Selections object), you can create a Text object and use the expression:
=replace(GetCurrentSelections(), 'PartsTable.PartNo', 'Part Number')
This will then show the current selections as text with "Part Number" substituted for the table.field name.
The other alternative is to use the RENAME statement in the load script after all your table loads are complete. RENAME allows you to rename a single field or a collection of fields by using a mapping table. The syntax for a single field is shown below:
RENAME FIELD oldname to newname
If you should need to rename more than one field at a time, you can expand this to:
RENAME FIELD oldname1 to newname1, oldname2 to newname2,...
More detail on the syntax including using a mapping table can be found in the QlikView installed help file.
For your example, I put together a small demo:
QUALIFY *;
PartsTable:
LOAD * INLINE [
PartNo
100
200
300
];
UNQUALIFY *;
RENAME FIELD PartsTable.PartNo to [Part Number];
This then results in the field appearing as "Part Number" in the front end.
RENAME is similar to the alias (AS) statement, except that you can first load all of your data and then do the rename at the end. This will rename your field so that it appears under its new name in any front-end controls (e.g. Current Selections). However, this may not be suitable if you already have an existing field named Part Number in your script.