ListGrid.updateData() for multiple records - smartclient

Is there a SmartClient solution to update multiple records at once?
saveAllEdits didn't send any update request to the server, and updateData works on a single Record object; if I try to pass it an array, it arrives at the server as
0: {
    name: "example",
    permission: "high"
},
1: {
    name: "test",
    permission: "low"
},
2: {
    name: "inquery",
    permission: "low"
}
I need a solution that will send the request as
records: [
    {
        name: "example",
        permission: "high"
    },
    {
        name: "test",
        permission: "low"
    },
    {
        name: "inquery",
        permission: "low"
    }
]

Use queueing: everything issued between RPCManager.startQueue() and RPCManager.sendQueue() is combined and sent to the server in a single HTTP request.
RPCManager.startQueue();
grid.updateData(record1);
grid.updateData(record2);
RPCManager.sendQueue();
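If the set of edited records varies, the same pattern works in a loop (a sketch; records is assumed to be whatever array of record objects you have collected):

// Queue one updateData() call per record; sendQueue() then flushes
// the whole batch to the server in a single HTTP turnaround.
RPCManager.startQueue();
records.forEach(function (record) {
    grid.updateData(record);
});
RPCManager.sendQueue();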

Related

Fetching nearest events with user location using meetup.com GraphQL API

I am trying to find a way to fetch nearby events using the meetup.com GraphQL API. After digging into the documentation for quite some time, I wasn't able to find a query that suits my needs. Furthermore, I wasn't able to find the old REST documentation, where a solution for my case might be present.
Thanks in advance!
This is what I could figure out so far. The documentation for SearchNode is missing, but I could get IDs for events:
query($filter: SearchConnectionFilter!) {
    keywordSearch(filter: $filter) {
        count
        edges {
            cursor
            node {
                id
            }
        }
    }
}
Input JSON:
{ "filter" : {
"query" : "party",
"lat" : 43.8,
"lon" : -79.4, "radius" : 100,
"source" : "EVENTS"
}
}
Hope that helps. I'm still trying to figure out this new GraphQL API.
You can do something like this (customize it with whatever fields you want from Event):
const axios = require('axios');

const data = {
    query: `
        query($filter: SearchConnectionFilter!) {
            keywordSearch(filter: $filter) {
                count
                edges {
                    cursor
                    node {
                        id
                        result {
                            ... on Event {
                                title
                                eventUrl
                                description
                                dateTime
                                going
                            }
                        }
                    }
                }
            }
        }`,
    variables: {
        filter: {
            query: "party",
            lat: 43.8,
            lon: -79.4,
            radius: 100,
            source: "EVENTS",
        },
    },
};

axios({
    method: "post",
    url: `https://api.meetup.com/gql`,
    headers: {
        Authorization: `Bearer YOUR_OAUTH_ACCESS_TOKEN`,
    },
    data,
});
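The call returns a promise; a minimal sketch of consuming the response, assuming the request above succeeds and the shape follows the query (the axios body nests the GraphQL payload under data.data):

axios({ /* same config as above */ })
    .then((response) => {
        // keywordSearch.edges holds one node per matching event.
        const edges = response.data.data.keywordSearch.edges || [];
        edges.forEach(({ node }) => {
            if (node.result) {
                console.log(node.result.title, node.result.eventUrl);
            }
        });
    })
    .catch((error) => console.error(error));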

Delete many records in one query on 8base

On 8base I'm trying to delete multiple records in one query in my app, but nothing seems to work properly because the two arguments of the delete mutation (data and filter) do not seem to support an array of ids.
Is there a way to do that properly in one query?
Same thing for nested data: is there a way to delete, for example, a user and all their posts in one query?
Thank you in advance!
There is this community post that answers the question: https://community.8base.com/t/delete-many-records-in-one-query/464/3
I've also included part of the solution here:
mutation {
    user1: userDelete(data: { id: "id1" }) { success }
    user2: userDelete(data: { id: "id2" }) { success }
    user3: userDelete(data: { id: "id3" }) { success }
    user4: userDelete(data: { id: "id4" }) { success }
    # ...etc
}
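If the list of ids is dynamic, the aliased mutation string can be generated in code. A sketch in plain JavaScript (buildDeleteMutation is a made-up helper, not an 8base API):

// Build one aliased userDelete field per id, so a single request deletes them all.
function buildDeleteMutation(ids) {
    const fields = ids
        .map((id, i) => `user${i + 1}: userDelete(data: { id: "${id}" }) { success }`)
        .join('\n    ');
    return `mutation {\n    ${fields}\n}`;
}

console.log(buildDeleteMutation(['id1', 'id2', 'id3']));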

Why is a Date query with aggregate not working in parse-server?

I want to query users where updatedAt is less than or equal to today, using aggregate because I'm also doing other things like sorting by pointers.
I'm using Cloud Code to define the query on the server.
I first tried using MongoDB Compass to check my query using ISODate and it works, but running it from Node.js doesn't seem to work correctly.
I also noticed this problem, which they say was already fixed. I also saw their tests.
Here's a link to that PR.
I'm passing date like this:
const pipeline = [
    {
        project: {
            _id: true,
            process: {
                $substr: ['$_p_testdata', 12, -1]
            }
        }
    },
    {
        lookup: {
            from: 'Test',
            localField: 'process',
            foreignField: '_id',
            as: 'process'
        }
    },
    {
        unwind: {
            path: '$process'
        }
    },
    {
        match: {
            'process._updated_at': {
                $lte: new Date()
            }
        }
    }
];
const query = new Parse.Query('data');
return query.aggregate(pipeline);
I expect the value to be an array with a length of 4, but it only gives me an empty array.
I was able to fetch the data without matching on the date.
Please try this:
const pipeline = [
    {
        match: {
            'editedBy.updatedAt': {
                $lte: new Date()
            }
        }
    }
];
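And run it the same way as the pipeline in the question:

// Execute the aggregation through the Parse SDK, exactly as above.
const query = new Parse.Query('data');
return query.aggregate(pipeline);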

Read query from apollo cache with a query that doesn't exist yet, but has all info stored in the cache already

I have a graphql endpoint where this query can be entered:
fragment ChildParts on ChildNode {
    id
    __typename
}

fragment ParentParts on ParentNode {
    __typename
    id
    children {
        edges {
            node {
                ...ChildParts
            }
        }
    }
}

query {
    parents {
        edges {
            node {
                ...ParentParts
            }
        }
    }
}
When executed, it returns something like this:
"data": {
"edges": [
"node": {
"id": "<some id for parent>",
"__typename": "ParentNode",
"children": {
"edges": [
node: {
"id": "<some id for child>",
"__typename": "ChildNode"
},
...
]
}
},
...
]
}
Now, with Apollo Client, after a mutation I can read this query from the cache and update / add / delete any ParentNode, and also any ChildNode, but I have to walk the structure returned by this query.
I'm looking for a way to get a list of ChildNodes out of the cache (which already has them, since the cache stores entities as a flat list), to make updating nested data a bit easier. Is there a way to read a query from the cache without having first read the same query from the server?
You can use the client's readFragment method to retrieve any one individual item from the cache. This just requires the id and a fragment string.
import gql from 'graphql-tag';

// `client` is your ApolloClient instance.
const todo = client.readFragment({
    id,
    fragment: gql`
        fragment fooFragment on Foo {
            id
            bar
            qax
        }
    `,
});
Note that id here is the cache key returned by the dataIdFromObject function -- if you haven't specified a custom function, then (provided the __typename and id or _id fields are present) the default implementation is just:
${result.__typename}:${result.id || result._id}
If you provided your own dataIdFromObject function, you'll need to provide whatever id is returned by that function.
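Applied to the ChildNode records from the question, that means a child can be read straight from the cache without going through any parent query (a sketch; the id placeholder stands in for a real child id):

const child = client.readFragment({
    // Default key shape: `${__typename}:${id}`
    id: 'ChildNode:<some id for child>',
    fragment: gql`
        fragment ChildParts on ChildNode {
            id
            __typename
        }
    `,
});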
As @Herku pointed out, depending on the use case, it's also possible to use cache redirects to utilize data cached for one query when resolving another. This is configured as part of setting up your InMemoryCache:
const cache = new InMemoryCache({
    cacheRedirects: {
        Query: {
            book: (_, args, { getCacheKey }) =>
                getCacheKey({ __typename: 'Book', id: args.id }),
        },
    },
});
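With that redirect in place, a single-book query can be resolved locally from objects that some other query already put in the cache (a sketch; BOOK_QUERY is a hypothetical query against the same Book type):

const BOOK_QUERY = gql`
    query Book($id: ID!) {
        book(id: $id) {
            id
            title
        }
    }
`;

// Served from the cache whenever Book:<id> is already stored.
client.query({ query: BOOK_QUERY, variables: { id: '42' } });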
Unfortunately, as of writing this answer, I don't believe there's any method to delete a cached item by id. There's ongoing discussion around that point in the Apollo issue tracker.

Add a new element in each array of objects where array may have different length in mongodb

I have the following schema.
{
    id: "week",
    output: {
        headerValues: [
            { startDate: "0707", headers: "ID|week" },
            { startDate: "0715", headers: "ID1|week1" },
            { startDate: "0722", headers: "ID2|week2" }
        ]
    }
}
I have to add a new field to each object in the headerValues array, like this:
{
    id: "week",
    output: {
        headerValues: [
            { startDate: "0707", headers: "ID|week", types: "used" },
            { startDate: "0715", headers: "ID1|week1", types: "used" },
            { startDate: "0722", headers: "ID2|week2", types: "used" }
        ]
    }
}
I tried different approaches like this:
1)
db.CollectionName.find({}).forEach(function (data) {
    for (var i = 0; i < data.output.headerValues.length; i++) {
        db.CollectionName.update(
            {
                "_id": data._id,
                "output.headerValues.startDate": data.output.headerValues[i].startDate
            },
            {
                "$set": {
                    "output.headerValues.$.types": "used"
                }
            },
            true, true
        );
    }
});
With this approach, the script executes but then fails partway through, leaving the result only partially updated.
2)
Another approach I followed, based on this link:
https://jira.mongodb.org/browse/SERVER-1243
db.collectionName.update(
    { "_id": "week" },
    { "$set": { "output.headerValues.$[].types": "used" } }
)
But it fails with error:
cannot use the part (headerValues of output.headerValues.$[].types) to
traverse the element ({headerValues: [ { startDate: "0707", headers:
"Id|week" } ]}) WriteError#src/mongo/shell/bulk_api.js:469:48
Bulk/mergeBatchResults#src/mongo/shell/bulk_api.js:836:49
Bulk/executeBatch#src/mongo/shell/bulk_api.js:906:13
Bulk/this.execute#src/mongo/shell/bulk_api.js:1150:21
DBCollection.prototype.updateOne#src/mongo/shell/crud_api.js:550:17
#(shell):1:1
I have searched for many different ways to add a new field to each object in the array, but with no success. Can anybody please suggest what I am doing wrong?
Your query is {"_id": "week"}, but in your data the field is id, with the value week.
So change {"_id": "week"} to {"id": "week"}. Also make sure you're on a recent MongoDB version: the all-positional operator $[] requires MongoDB 3.6 or later.
db.collectionName.update(
    { "id": "week" },
    { "$set": { "output.headerValues.$[].types": "used" } }
)
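You can then verify that every element got the new field (plain mongo shell, for checking only):

db.collectionName.find({ "id": "week" }).pretty()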