Salesforce flow data table - API

I am getting data from an API and want to show it in a Data Table on a screen flow. The API is returning the data. I tried to map the Data Table's data source to the API fields, but the data source always comes back blank and simply will not map to the Data Table. I also tried a manual variable in the action window, but it does not accept multi-value variables. How do I map a collection variable to hold the API data, and then map it to display in the Data Table? I hope I have explained the challenge I am facing.

Related

Adding a dynamic column in the Copy activity of Azure Data Factory

I am using a Data Flow in my Azure Data Factory pipeline to copy data from one Cosmos DB collection to another. I am using the Cosmos DB SQL API for both the source and sink datasets.
The problem is that when copying the documents from one collection to the other, I would like to add an additional column whose value is the same as one of the existing keys in the JSON. I am trying the Additional columns option in the source settings, but I am not able to figure out how to assign an existing column's value there. Any help on this?
In the case of a Copy activity, you can assign an existing column's value to a new column under Additional columns by specifying the value as $$COLUMN and selecting the column whose value should be copied.
If you are adding the new column in a Data Flow instead, you can achieve this with a Derived Column transformation.
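For the Copy activity route, the source-side setting can be sketched roughly as the pipeline JSON below. The column and type names here are made up for illustration; check the exact property names against the Azure Data Factory documentation for your dataset type:

```json
"source": {
    "type": "CosmosDbSqlApiSource",
    "additionalColumns": [
        {
            "name": "customerIdCopy",
            "value": "$$COLUMN:customerId"
        }
    ]
}
```

The `$$COLUMN:<source column>` form tells the Copy activity to duplicate an existing source column into the new column, which is what the answer above describes doing through the UI.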

How to add a table data to transport request programmatically?

I have a task to add selected rows from an ALV grid to a transport request.
At this moment I already have:
The name of the transport request.
The selected rows (I put them into a table because I don't know what type they should be if I want to put them into the transport request).
First I get indexes:
call method grid->get_selected_rows
  importing
    et_index_rows = lt_rows.
Second I get the rows that I need and put them into a new table:
if lt_rows is not initial.
  loop at lt_rows into ls_row.
    " ls_row is of type lvc_s_row; the row number sits in its INDEX component
    read table lt_variable into ls_variable index ls_row-index.
    append ls_variable to lt_variable_changed.
  endloop.
endif.
As I understand it, I need to use all of this with the function module TR_OBJECTS_INSERT, but unfortunately I haven't found any information that would help me confirm that I am doing it correctly.
What is the critical need for transporting data at runtime? It is unstable and not recommended.
Instead, create a customizing table in the Data Dictionary and insert the necessary ALV grid rows into it at runtime.
Then use a transport with object type R3TR TABU to move that customizing table to another system. Do it in batches, not in pieces of 2 or 3 rows as you intend.
Here is the full tutorial.
But doing it like this is bad practice. Replicating data across the landscape on a regular basis is a BASIS task and should be done by BASIS, not by a developer.
And replicating rows of business data at runtime is a horrible practice.

Parse.com REST API - Can I query a value within a subset of an Object data type column?

I am a new user of Parse.com and like what I see so far. I am helping a friend out. They have a locations table that stores lat and long values. However, the developer who created it used an Object data type instead of a GeoPoint data type to store the lat/long info. The developer wrote an iOS app that pulls records from Parse and stores them locally to display on a map. What I am trying to figure out is how that developer pulls only the records within X miles, or even within a bounding box. Anyway, my question is: if I have an Object column with data like what I have shown below, is there any way to filter the results by lat/long values (gt, lt, etc.)?
{"Latitude":33.51882,"Longitude":-93.97484}
It seems like an Object data type is a loose way of storing key/value pairs without having to create an entities table, etc. However, I cannot find a way to query within this subset.
Thanks for any help!
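As far as I know, Parse's REST API accepts MongoDB-style dot notation and comparison operators ($gt, $lt, $gte, $lte) in the where parameter, which lets you constrain a key nested inside an Object column. A minimal sketch of building such a query URL, where the column name "coordinates" and the class name "Locations" are assumptions for illustration:

```python
import json
from urllib.parse import urlencode

# Constrain nested keys of a hypothetical Object column named "coordinates"
# that stores {"Latitude": ..., "Longitude": ...}, using a bounding box.
where = {
    "coordinates.Latitude": {"$gte": 33.0, "$lte": 34.0},
    "coordinates.Longitude": {"$gte": -94.5, "$lte": -93.5},
}

# The REST API expects the where clause as URL-encoded JSON.
params = urlencode({"where": json.dumps(where)})
url = "https://api.parse.com/1/classes/Locations?" + params
print(url)
```

You would send a GET request to that URL with your X-Parse-Application-Id and X-Parse-REST-API-Key headers. Note that this only gives rectangular bounding-box filtering; radius ("within X miles") queries need the GeoPoint type and its $nearSphere operator.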

SSRS Subscription with Multiple Data Sources

Using the subscription functionality of SSRS, I have automatically run reports in a scheduled manner and sent out e-mails with the results of the report. I have only done this using a single data source. My question is, can I do this while using multiple data sources?
My goal is to just run the same report across a collection of data sources, and then have all of the results from each data source get sent out in one e-mail as a subscription.
In my specific case, I just need a single row from each data source. My intent would be to form a table, with one row for each data source.
"Using the subscription functionality of SSRS, I have automatically run reports in a scheduled manner and sent out e-mails with the results of the report. I have only done this using a single data source. My question is, can I do this while using multiple data sources?"
Not with subscriptions alone, because a subscription can only be matched to a single report.
"My goal is to just run the same report across a collection of data sources, and then have all of the results from each data source get sent out in one e-mail as a subscription."
Do this in a single report that uses multiple data sources.
"In my specific case, I just need a single row for each data source. My intent would be to form a table, with one row for each data source."
Sounds like you have two options here:
Option 1: use linked servers.
See "Linking Servers" (Link 1) and "Create Linked Servers" (Link 2).
Write a query that returns a single row for each source and use UNION ALL to combine them into a single result set/'table'.
Option 2: create a report that uses multiple data sources.
Have a single-row table for each source, then arrange the tables to look like a single table when rendered.
Either way, create a single subscription for your new report that combines the multiple data sources.
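The linked-server option can be sketched as a single dataset query. The server, database, table, and column names below are invented for illustration; each four-part name assumes a linked server has already been registered:

```sql
-- One row per data source, combined with UNION ALL.
SELECT 'ServerA' AS SourceName, s.[RowTotal], s.[LastLoadDate]
FROM   [LinkedServerA].[SalesDb].[dbo].[LoadSummary] AS s
UNION ALL
SELECT 'ServerB', s.[RowTotal], s.[LastLoadDate]
FROM   [LinkedServerB].[SalesDb].[dbo].[LoadSummary] AS s;
```

With this as the report's only dataset, a single tablix renders the combined rows and one ordinary subscription can e-mail the result.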

wicket - table - dynamically inserting rows without complete refresh of the table

I've been playing around with Wicket for some time now, and one of the latest requirements I received specifies that I should display some data in a table. I'm wondering what the best solution is in my case: DataTable, DataView, ...?
What is my case?
The table should display new rows submitted from a form on the same page, whether by the current user or by another user connected to the application, without refreshing the entire table, in order to reduce the amount of data transferred.
What's your point of view?