Querying (and filtering) in a many-to-many relationship in Backand

I'm trying to use the pet-owner example to create some sort of playlist app where a playlist can be shared among different users.
I have read both links to understand how many-to-many relationship is created in Backand:
Link 1 -
Link 2
According to the pet example, to get all the owners of a pet I should fetch the pet object (using its id field) and then filter its user_pets list to match the user id. That may work for a small number of users/pets, but I'd prefer to query the user_pets table directly, filtering by user_id and pet_id.
My approach has been this code, without success:
$http({
  method: 'GET',
  url: getUrl(), // this maps to the pets_owner "table"
  params: {
    deep: true,
    exclude: 'metadata',
    filter: [
      { fieldName: 'pet', operator: 'equals', value: pet_id },
      { fieldName: 'owner', operator: 'equals', value: user_id }
    ]
  }
})
Any idea how to query/filter to get only related results?
Thanks in advance

Because user_id and pet_id are both object fields, the operator should be "in".
From the Backand docs, the following are the possible operators depending on the field type:
numeric or date fields:
-- equals
-- ...
object fields:
-- in
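Applied to the request in the question, a minimal sketch of the corrected call could look like this (same field names and getUrl() as above; only the operator changes):
$http({
  method: 'GET',
  url: getUrl(), // still maps to the pets_owner "table"
  params: {
    deep: true,
    exclude: 'metadata',
    filter: [
      { fieldName: 'pet', operator: 'in', value: pet_id },
      { fieldName: 'owner', operator: 'in', value: user_id }
    ]
  }
})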

FaunaDB: how to fetch a custom column

I'm just learning FaunaDB and FQL and having some trouble (mainly because I come from MySQL). I can successfully query a table (eg: users) and fetch a specific user. This user has a property users.expiry_date which is a faunadb Time() type.
What I would like to do is know if this date has expired by using the function LT(Now(), users.expiry_date), but I don't know how to create this query. Do I have to create an Index first?
So in short, just fetching one of the users documents gets me this:
{
  id: 1,
  username: 'test',
  expiry_date: Time("2022-01-10T16:01:47.394Z")
}
But I would like to get this:
{
  id: 1,
  username: 'test',
  expiry_date: Time("2022-01-10T16:01:47.394Z"),
  has_expired: true
}
I have this FQL query now (ignore oauthInfo):
Query(
  Let(
    {
      oauthInfo: Select(['data'], Get(Ref(Collection('user_oauth_info'), refId))),
      user: Select(['data'], Get(Select(['user_id'], Var('oauthInfo'))))
    },
    Merge({ oauthInfo: Var('oauthInfo') }, { user: Var('user') })
  )
)
How would I do the equivalent of the MySQL query SELECT users.*, IF(users.expiry_date < NOW(), 1, 0) AS is_expired FROM users in FQL?
Your use of Let and Merge shows that you are thinking about FQL in a good way. These are functions that can go a long way toward making your queries more organized and readable!
I will start with some notes, but they will be relevant to the final answer, so please stick with me.
The Query function
https://docs.fauna.com/fauna/current/api/fql/functions/query
First, you should not need to wrap anything in the Query function here. Query is necessary for defining functions in FQL that will be run later, for example, in a User-Defined Function body. You will always see it as Query(Lambda(...)).
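For instance, this is roughly what a UDF definition looks like (a hypothetical function; the name and body are made up for illustration):
CreateFunction({
  name: "double", // illustrative name
  // Query defers evaluation, so the Lambda is stored as the function body
  // instead of being run immediately
  body: Query(Lambda("x", Multiply(Var("x"), 2)))
})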
Fauna IDs
https://docs.fauna.com/fauna/current/learn/understanding/documents
Remember that Fauna assigns a unique ID to every Document for you. When I see fields named id, that is a bit of a red flag, so I want to highlight it. There are plenty of reasons you might store some business ID in a Document, but be sure that you need it.
Getting an ID
A Document in Fauna is shaped like:
{
  ref: Ref(Collection("users"), "101"), // <-- "id" is 101
  ts: 1641508095450000,
  data: { /* ... */ }
}
In the JS driver you can read this id with documentResult.ref.id (other drivers can do this in similar ways).
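For example, a minimal sketch with the JavaScript driver (the secret and the document reference are placeholders):
const faunadb = require('faunadb');
const q = faunadb.query;
const client = new faunadb.Client({ secret: 'YOUR_SECRET' }); // placeholder secret

client
  .query(q.Get(q.Ref(q.Collection('users'), '101'))) // placeholder document
  .then((doc) => console.log(doc.ref.id)); // -> "101"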
You can access the ID directly in FQL as well, using the Select function.
Let(
  {
    user: Get(Select(['user_id'], Var('oauthInfo'))),
    id: Select(["ref", "id"], Var("user"))
  },
  Var("id")
)
More about the Select function.
https://docs.fauna.com/fauna/current/api/fql/functions/select
You are already using Select and that's the function you are looking for. It's what you use to grab any piece of an object or array.
Here's a contrived example that gets the zip code for the 3rd user in the Collection:
Let(
  {
    // Paginate returns { data: [ Ref, Ref, ... ] }
    page: Paginate(Documents(Collection("user"))),
    // take the third Ref in the page and fetch its Document
    third_user: Get(Select(["data", 2], Var("page")))
  },
  Select(["data", "address", "zip"], Var("third_user"))
)
Bring it together
That said, your Let function is a great start. Let's break things down into smaller steps.
Let(
  {
    oauthInfo_ref: Ref(Collection('user_oauth_info'), refId),
    oauthInfo_doc: Get(Var("oauthInfo_ref")),
    // make sure that user_oauth_info.user_id is a full Ref, not just a number
    user_ref: Select(["data", "user_id"], Var("oauthInfo_doc")),
    user_doc: Get(Var("user_ref")),
    user_id: Select("id", Var("user_ref")),
    // calculate expired: the date has expired if it is before Now()
    expiry_date: Select(["data", "expiry_date"], Var("user_doc")),
    has_expired: LT(Var("expiry_date"), Now())
  },
  // if the data does not overlap, Merge is not required.
  // you can build plain objects in FQL
  {
    oauthInfo: Var("oauthInfo_doc"), // entire Document
    user: Var("user_doc"),           // entire Document
    has_expired: Var("has_expired")  // an extra field
  }
)
If you do want to Merge the auth info and the user and/or add additional fields, rather than returning them as separate fields, then feel free to do that:
// ...
  Merge(
    Select("data", Var("user_doc")), // just the data
    {
      user_id: Var("user_id"),        // added field
      has_expired: Var("has_expired") // added field
    }
  )
)
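If it helps, here is a minimal sketch of running roughly the same query from the JavaScript driver (assuming a configured client, the faunadb npm package, and a known refId; it uses the same expired-when-before-Now() comparison as above):
const faunadb = require('faunadb');
const q = faunadb.query;
// `client` and `refId` are assumed to exist already

client
  .query(
    q.Let(
      {
        oauthInfo_ref: q.Ref(q.Collection('user_oauth_info'), refId),
        oauthInfo_doc: q.Get(q.Var('oauthInfo_ref')),
        user_ref: q.Select(['data', 'user_id'], q.Var('oauthInfo_doc')),
        user_doc: q.Get(q.Var('user_ref')),
        expiry_date: q.Select(['data', 'expiry_date'], q.Var('user_doc')),
        has_expired: q.LT(q.Var('expiry_date'), q.Now())
      },
      {
        oauthInfo: q.Var('oauthInfo_doc'),
        user: q.Var('user_doc'),
        has_expired: q.Var('has_expired')
      }
    )
  )
  .then((result) => console.log(result.has_expired));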

Creating an index for all active items

I have a collection of documents that follow this schema {label: String, status: Number}.
I want to introduce a new field, deleted_at: Date, that will record whether a document has already been deleted. This seems like a perfect use case for an index, so that I can search for all undeleted tasks.
CreateIndex({
  name: "activeTasks",
  source: Collection("tasks"),
  terms: [
    { field: ["data", "deleted_at"] }
  ]
})
And then filter by undefined / null value in shell:
Paginate(Match(Index("activeTasks"), null))
Paginate(Match(Index("activeTasks"), undefined))
It returns nothing, even for documents where I explicitly set deleted_at to null.
That's not my point, though. I want to get documents that do not have deleted_at defined at all, so that I do not have to update the whole collection.
PS. When I add a document where deleted_at: "test" and query for it, the shell does return the expected result.
What am I not getting?
The reason is that FaunaDB doesn't support reading empty/null values the way you think it does. You need to use a special binding to do that.
Make sure to check out https://docs.fauna.com/fauna/current/tutorials/indexes/bindings.html#empty for a more thorough explanation and examples.
My understanding of how bindings work would yield the following code. I haven't tested it though and I'm not sure it works.
You need a special binding index:
CreateIndex({
  name: "activeTasks",
  source: [{
    collection: Collection("tasks"),
    fields: {
      null_deleted_at: Query(
        Lambda(
          "doc",
          Equals(Select(["data", "deleted_at"], Var("doc"), null), null)
        )
      )
    }
  }],
  terms: [{ binding: "null_deleted_at" }]
})
Usage:
Map(
  Paginate(Match(Index("activeTasks"), true)),
  Lambda("X", Get(Var("X")))
)

Set two different memory stores for one Dojo widget (dijit/form/FilteringSelect) at the same time

I have two different JSON structures. One represents the individual users of the system and the other represents groups made of these users. So I created two memory stores from these (each has a different idProperty: userId and groupId, respectively).
I have a FilteringSelect dropdown, and my requirement is to use both of these as the data store of the list, so that either a valid user or a valid group can be selected from the dropdown.
Two possible ways I could think of doing this:
1) creating one common memory store from the two JSONs, but the idProperty is different so I'm not sure how this is possible;
2) adding both memory stores to the widget, but again the idProperty differs so I'm not sure.
I am very new to Dojo, so any help would be really appreciated. Thanks in advance!
I think that, if you use a store to represent something (model data), it should be formed so that it can be used properly within a widget.
So in your case I would add both of them to a single store. If they have a different ID (for example when it's a result of a back-end service), then you could map both types of models into a single object structure. For example:
var groups = [{
  groupId: 1,
  groupName: "Group 1",
  users: 10
}, {
  groupId: 2,
  groupName: "Group 2",
  users: 13
}, {
  groupId: 3,
  groupName: "Group 3",
  users: 2
}];
var users = [{
  userId: 1,
  firstName: "John",
  lastName: "Doe"
}, {
  userId: 2,
  firstName: "Jane",
  lastName: "Doe"
}, {
  userId: 3,
  firstName: "John",
  lastName: "Smith"
}];
require(["dojo/store/Memory", "dijit/form/FilteringSelect", "dojo/_base/array", "dojo/domReady!"], function(Memory, FilteringSelect, array) {
  var filterData = array.map(groups, function(group) {
    return {
      id: "GROUP" + group.groupId,
      groupId: group.groupId,
      name: group.groupName,
      type: "group"
    };
  });
  Array.prototype.push.apply(filterData, array.map(users, function(user) {
    return {
      id: "USER" + user.userId,
      userId: user.userId,
      name: user.firstName + " " + user.lastName,
      type: "user"
    };
  }));
});
In this example, we have two arrays groups and users, and to merge them I used the map() function of dojo/_base/array and then I concatenated both results.
They still contain their original ID and a type, so you will still be able to reference the original object.
From my previous experiences, I learned that your model data should not represent pure business data, but data that is easily used in the view/user interface.
By giving both arrays a similar object structure, you can easily use them in a dijit/form/FilteringSelect, which you can see here: http://jsfiddle.net/ut5hjbyb/
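A minimal sketch of wiring the merged array into the widget could look like this, inside the require() callback above (the "selectNode" placeholder id is an assumption):
// after filterData has been built:
var store = new Memory({ data: filterData }); // one store, with unified "id" and "name" fields
var select = new FilteringSelect({
  store: store,
  searchAttr: "name" // match against the unified display name
}, "selectNode");     // assumed id of a placeholder node in the page
select.startup();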

MongoDB: How to retrieve newly constructed data instead of the original documents in the collection?

I have a collection in which documents are all in this format:
{"user_id": ObjectId, "book_id": ObjectId}
It represents the relationship between user and book, which is one-to-many; that means a user can have more than one book.
Now I have three book_ids, for example:
["507f191e810c19729de860ea", "507f191e810c19729de345ez", "507f191e810c19729de860efr"]
I want to query for the users who have all three of these books. The result I want is not the documents in this collection, but a newly constructed array of user_ids. It seems complicated and I have no idea how to build the query; please help me.
NOTE:
The reason why I didn't use a structure like:
{"user_id": ObjectId, "book_ids": [ObjectId, ...]}
is that in my system books increase frequently and have no limit in number; in other words, a user may read thousands of books, so I think it's better to store it the traditional way.
This question is not restricted to MongoDB; you can answer it in relational-database terms.
Using a regular find you cannot get back the user_ids that own all of the book_ids, because you normalized your collection (flattened it).
You can do it if you use the aggregation framework:
db.collection.aggregate([
  {
    $match: {
      book_id: {
        $in: [
          "507f191e810c19729de860ea",
          "507f191e810c19729de345ez",
          "507f191e810c19729de860efr"
        ]
      }
    }
  },
  {
    $group: {
      _id: "$user_id",
      count: { $sum: 1 }
    }
  },
  {
    $match: {
      count: 3
    }
  },
  {
    $group: {
      _id: null,
      users: { $addToSet: "$_id" }
    }
  }
]);
What this does is pass through the pipeline only the documents that match one of the three book_id values; it then groups by user_id and counts how many matches each user got. If they got three, they pass to the next pipeline stage, which collects them into an array of user_ids. This solution assumes that each (user_id, book_id) pair can appear only once in the original collection.
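With that pipeline, the final output (when at least one user qualifies) is a single document whose users array holds the matching user_id values, along these lines (ObjectIds abbreviated):
{ "_id": null, "users": [ ObjectId("..."), ObjectId("...") ] }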

How to filter by sub-object in Rally 2 SDK

I'm trying to query for user stories whose release start date is greater than a particular date. Is it possible to do this using the "filters" config rather than querying all stories and then checking manually?
Is this valid?
Ext.create('Rally.data.WsapiDataStore', {
  model: 'UserStory',
  context: {
    project: '/project/xxxx'
  },
  autoLoad: true,
  fetch: ['Rank', 'FormattedID', 'Release'],
  filters: [
    {
      property: 'Release.ReleaseStartDate',
      operator: '>',
      value: '2012-10-10'
    }
  ]
});
It doesn't work; it just fetches all records.
The code posted above does work. I was actually using a comboBox value in the "value" property. Turns out I didn't convert it to a proper DateTime format and hence the comparison was failing and returning all records.
In my case I had to use Rally.util.DateTime.toIsoString in order to compare the value in the combobox.
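For reference, a sketch of that fix could look like the following (releaseCombo is an assumed reference to the combo box whose value is a JavaScript Date):
// convert the Date from the combo box into an ISO string WSAPI can compare
var isoDate = Rally.util.DateTime.toIsoString(releaseCombo.getValue());

Ext.create('Rally.data.WsapiDataStore', {
  model: 'UserStory',
  context: {
    project: '/project/xxxx'
  },
  autoLoad: true,
  fetch: ['Rank', 'FormattedID', 'Release'],
  filters: [
    {
      property: 'Release.ReleaseStartDate',
      operator: '>',
      value: isoDate
    }
  ]
});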