React Admin - Make input for filter based on other resource

I am using React Admin to build a dashboard, and I have a Lead resource with a status field that is computed from another resource, Call. I want to build a filter component for the Lead list. The way it works is that for each lead, I query the last call (sorted by a date field) associated with that lead and read its status; the lead's status is the status of its last call.
The lead status query:
{ filter: { lead }, sort: { date: -1 }, limit: 1 }
I use this query to build a field (which appears in the list, in each lead's row), and I want to know how I can build an input component to use as a filter in the list. I know this pattern is odd, but it's hard to change on the backend because of how it's structured. I am open to suggestions on how to clean up this messy computed-field situation, but as I said, I would be satisfied with knowing how to create the input component.

The solution I'm going with is a computed field. In my case, as I use MongoDB, it will be done through an aggregation pipeline. As I'm using REST instead of GraphQL, I cannot use a resolver that would run only when the status field is actually needed, so the aggregation (getting the last Call for a given Lead) will sometimes run unnecessarily. However, it won't incur an additional round trip - it only consumes more processing time in the DB - whereas having react-admin compute this field through a reference would. And status is an important field that will usually be needed anyway.
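For reference, a minimal sketch of such a pipeline, assuming collections named leads and calls, a lead reference field on each call, and MongoDB 4.4+ (for $first on arrays); the names are placeholders for whatever the actual schema uses:

db.leads.aggregate([
    // For each lead, fetch only its most recent call
    {
        $lookup: {
            from: 'calls',
            let: { leadId: '$_id' },
            pipeline: [
                { $match: { $expr: { $eq: ['$lead', '$$leadId'] } } },
                { $sort: { date: -1 } },
                { $limit: 1 },
            ],
            as: 'lastCall',
        },
    },
    // Promote the last call's status onto the lead; null if the lead has no calls
    { $set: { status: { $first: '$lastCall.status' } } },
    { $unset: 'lastCall' },
]);

With status materialized this way, the filter input on the list can be an ordinary <SelectInput source="status" choices={...} /> over the known call statuses.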

Related

Apigee Integration: How to use the listEntitiesPageSize parameter in conjunction with the listEntitiesPageToken parameter to navigate through the pages

Good day everyone,
We are trying, through the integrations of Google's Apigee service, to fetch all the rows in a BigQuery table that have a certain value in a field.
This operation is quite easy to do, but problems arise when the result has more than 200 rows.
The problem is that, when using the integration to connect to BigQuery, no listEntitiesPageToken value is returned, and no listEntitiesNextPageToken value either,
so I can't figure out how to navigate the result pages.
Has anyone had the same problem? What do you suggest?
In the tutorial "https://cloud.google.com/apigee/docs/api-platform/integration/connectors-task#configure-the-connectors-task" it says: "For example, if you are expecting 1000 records in your result set, you can set the listEntitiesPageSize to 100. So when the Connectors task runs for the first time, it returns the first 100 records, the next 100 records in the second run and so on."
And there is a tip: "Use the listEntitiesPageSize parameter in conjunction with the listEntitiesPageToken parameter to navigate through the pages."
I used the tutorial to understand how to use the For Each Loop task, and I understood that I should create a "sub-integration" that is called by a "main integration" for each element present in a list/array.
But what can I do, since these tokens are empty?

React-Admin filters that relate to the current results

We're really enjoying using the capabilities offered by React-Admin.
We're using <ReferenceArrayInput> to allow filtering of a <List> by Country. The drop-down contains all countries in the database.
But, we'd like it to just contain the countries that relate to the current set of filtered records.
So, in the context of the React-Admin demo, if we've filtered for Returned, then the Customer drop-down would only contain customers who have returned items. This would make a real difference in finding the records of interest.
Our current plan is to (somehow) handle this in our <DataProvider>. But is there a more React-Admin-friendly way of doing this?
So you want to build dependent filters, which is not a native feature of react-admin - and a complex beast to tame.
First, doing so in the dataProvider will not work, because you'll only have the data of the first page of results. A record in a following page may have another value for your array input.
You could implement that logic in a custom Input component instead. This component can wrap the original <ReferenceArrayInput>, read the current ListContext to get the current data and filter values (https://marmelab.com/react-admin/useListContext.html), and then narrow the set of possible values using the filter prop (https://marmelab.com/react-admin/ReferenceArrayInput.html#filter).
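A minimal sketch of that idea, assuming a country_id field on the listed records and a countries reference resource (those names, and whether your dataProvider understands an id-array filter, are assumptions to adapt):

import { ReferenceArrayInput, SelectArrayInput, useListContext } from 'react-admin';

// Dependent filter: only offer the countries present in the current results
const CurrentCountriesInput = (props) => {
    const { data } = useListContext();
    // Distinct country ids among the records currently displayed
    const countryIds = [...new Set((data ?? []).map((record) => record.country_id))];
    return (
        <ReferenceArrayInput {...props} reference="countries" filter={{ id: countryIds }}>
            <SelectArrayInput optionText="name" />
        </ReferenceArrayInput>
    );
};

Note that the caveat above still applies: data only covers the current page of results, so the choices derived from it are per-page, not per-result-set.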

Checking Whether Table Data Exists, Updating / Inserting Into Two Tables & Posting End Outcome

I am working on my cron system, which gathers information via an API call. For the most part it has been fairly straightforward, but now I am faced with multiple difficulties, as the API response depends on who is making the request. The cron runs through each user's API key, and certain information will be visible or hidden to them, and vice versa for the public.
There are teams, and users are part of teams. A user can stealth their move; all information will still be shown to them and their team, but it will not be visible to their opponent. Both teams share the same attack id and have access to the same information, just one side can see more of it than the other.
Defendant's Point of View
"attacks": {
    "12345": {
        "timestamp": 1645345234,
        "attacker_id": "",
        "attacker_team_id": "",
        "defender_id": 321,
        "defender_team_id": 1,
        "stealthed": 1
    }
}
Attacker's Point of View
"attacks": {
    "12345": {
        "timestamp": 1645345234,
        "attacker_id": 123,
        "attacker_team_id": 2,
        "defender_id": 321,
        "defender_team_id": 1,
        "stealthed": 1,
        "boosters": {
            "fair_fight": 3,
            "retaliation": 1,
            "group_attack": 1
        }
    }
}
So, if the defendant's API key is processed first, id 12345 will already be in the team_attacks table, but without attacker_id and attacker_team_id. For each insert thereafter, I need to check whether the incoming id already exists and whether the new data has any additional information to add to the row.
Here is the part of my code that loops through the API response and stores the data; it loops through all the attacks per API key:
else if ($category === "attacks") {
    $database = new Database();
    foreach ($data as $attack_id => $info) {
        $database->query('INSERT INTO team_attacks (attack_id, attacker_id, attacker_team_id, defender_id, defender_team_id) VALUES (:attack_id, :attacker_id, :attacker_team_id, :defender_id, :defender_team_id)');
        $database->bind(':attack_id', $attack_id);
        $database->bind(':attacker_id', $info["attacker_id"]);
        $database->bind(':attacker_team_id', $info["attacker_team_id"]);
        $database->bind(':defender_id', $info["defender_id"]);
        $database->bind(':defender_team_id', $info["defender_team_id"]);
        $database->execute();
    }
}
I have also been posting to the news table; typically I simply post that X new entries have been added, or the like. However, I haven't a clue whether there is a way, during the loop above, to tell new entries apart from updated entries, so as to produce two news items:
2 attacks have been updated.
49 new attacks added.
For this part I was simply counting the length of the array, but that only works for the first ever upload; I know I cannot simply count the array length on future inserts, which require the additional checks.
If the attack_id does NOT already exist, I also need to insert the boosters into another table. For this I was adding them to an array during the loop above and then looping through that array to insert them, but this too depends on the existence check above, rather than blindly attempting an upload for each one. Boosters share the attack_id.
With over 1,000 teams, each potentially having at least one member join my site, I need this to be as efficient as possible. The API returns the last 100 attacks per call, and I want this to run within my cron, which collects any new data every 30 seconds, so I may need to sort through potentially 100,000 rows.
In SQL, you can check conditions when inserting new data by using MERGE:
https://en.wikipedia.org/wiki/Merge_(SQL)
Depending on the database you are using, the name and syntax of the command may differ; common names for the same idea are upsert and replace.
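For MySQL, a minimal sketch using INSERT ... ON DUPLICATE KEY UPDATE, reusing the Database wrapper from the question and assuming attack_id is the primary (or a unique) key and that missing values are stored as NULL:

$database->query(
    'INSERT INTO team_attacks
        (attack_id, attacker_id, attacker_team_id, defender_id, defender_team_id)
     VALUES
        (:attack_id, :attacker_id, :attacker_team_id, :defender_id, :defender_team_id)
     ON DUPLICATE KEY UPDATE
        -- keep an attacker id that is already known, otherwise take the new one
        attacker_id      = COALESCE(attacker_id, VALUES(attacker_id)),
        attacker_team_id = COALESCE(attacker_team_id, VALUES(attacker_team_id))'
);

As a side benefit for the news feed: on MySQL the affected-row count of such a statement is 1 for a fresh insert, 2 for a row that was updated, and 0 for an identical row, so tallying "new" versus "updated" can be done in the same loop.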
But: if you are after high performance and near-real-time behaviour, consider keeping a cache that holds the critical aggregated data instead of doing the aggregation 100,000 times per minute.
This may or may not be the "answer" you're looking for. The question implies use of a single table for both teams. It's worth considering one table per team for writes, to avoid write contention altogether. The two data sets could then be combined at query time in order to return "team" results via the API. At scale, you could have another process calculate and store combined team results in an API-specific cache table that serves the API requests.

Podio API filtering of returned fields without using a view

Based on the information provided by Pavlo, it appears there is no way to exclude certain fields, and no way to include only specific fields, in the Podio data returned from a JSON POST query. For the purpose of this question, the fields involved are (for example) text, category, date, calc fields and others which can be added using 'modify template'.
The best workaround is a redesign of the app to reduce the amount of field data.
(original question)
Is there a way to limit the amount of Podio data returned from a JSON Post query to include only a few specific fields instead of every field?
I understand how to use a podio view or filtering in the Post query to limit how many items are returned, but my question has to do with reducing the amount of data returned for each item by preventing data in unnecessary fields from being returned.
(The following is an example of the query I currently use; as stated above, I'm looking for a way to limit the fields returned to a small subset.)
Example JSON query: https://api.podio.com/item/app/14773320/filter
Example JSON body:
{
    "filters": {
        "created_on": {
            "from": "{date.addMonths(-6).format()}",
            "to": "{date.today}"
        }
    },
    "limit": 250,
    "offset": {props.offSet}
}
You can use the fields parameter for that.
More details on how it works and how else it can be used are here: https://developers.podio.com/index/api - scroll down to the "Bundling responses using fields parameter" section.
Most likely you are looking for the fields=items.view(micro) parameter. The Podio API will then return only 5 values for each item:
app_item_id
item_id
title
link
revision
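Applied to the filter call from the question, that would look something like this (assuming fields is passed as a query-string parameter, per the docs section above):
https://api.podio.com/item/app/14773320/filter?fields=items.view(micro)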

Rally Lookback, Snapshot with empty custom field

I am trying to get a snapshot of a deleted user story, to read the value of a custom field (c_Dep). I get the snapshot, but the custom field is empty, even though it had a value in it. Does Lookback not save values for user-created custom fields?
findConfig: {
    _TypeHierarchy: 'HierarchicalRequirement',
    "ObjectID": 12345,
    "_ValidFrom": {
        "$lte": "2017-01-25T19:00:57.475Z"
    }
}
Sarita, it is hard to tell precisely what is going on from the information you have given. However, I can give you some pointers.
The Lookback API will store changes in the values of custom fields. The selection you have shown is valid from 24th Jan to 25th Jan. Was the custom field set during this period? Probably not, because the array is only one element long, and I think it is showing the creation event.
Was the custom field updated to contain something after this time period?
The reason for asking is that a common misunderstanding is that the records stored in the Lookback database hold the current values of fields - they don't. They hold the changes to fields. If c_Dependencies didn't change during that time period, you may not see an entry returned in the array. The next entry in the database might be the record where the c_Dependencies field was set (changed from null to something), and that might be 'after' your time period filter.
It looks like your query is requesting snapshots earlier than 2017/1/25 ($lte). Since there's only one, it's probably the creation snapshot. If you get all snapshots for the ObjectID by removing the _ValidFrom parameter, you should see the changes made to c_Dep after artifact creation.
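In other words, something like this (a sketch; the exact config keys depend on which Lookback client you use):

findConfig: {
    _TypeHierarchy: 'HierarchicalRequirement',
    "ObjectID": 12345
    // no _ValidFrom filter: return every snapshot of this artifact,
    // so the change that set c_Dep shows up as its own entry
}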
As I am not allowed to comment, I have to post a new answer.
I think William Scott meant removing the _ValidTo filter. The snapshot you have is the creation change; the update will be afterwards.