Delete record field via deepstream Record - deepstream.io

I'm looking for a way to clean up a nested field in a document. For example, consider this JSON object:
{
  fieldToClean: {
    fieldA: '..',
    fieldB: '..',
    fieldC: '..'
  }
}
I know that I don't need fieldB anymore. I found one solution that looks like:
var record = deepstream.record.getRecord('<proper path>')
record.whenReady(function(){
  var fieldToClean = record.get('fieldToClean')
  delete fieldToClean.fieldB
  record.set('fieldToClean', fieldToClean)
})
I wonder if deepstream provides something like:
record.delete('fieldToClean.fieldB')
or
record.set('fieldToClean.fieldB', undefined)
I wasn't able to find something like this in documentation.
Thank you for your time!

There's actually an open issue for this. Our main design question is around deleting an index in an array: should the slot become null, or should the array be spliced? It would be great to have your feedback!
https://github.com/deepstreamIO/deepstream.io/issues/29
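To make the design question concrete, here is a plain-JavaScript illustration of the two candidate semantics (hypothetical data, not the deepstream API):

```javascript
// What should deleting index 1 of ['a', 'b', 'c'] produce?
const original = ['a', 'b', 'c'];

// "null" semantics: the slot is kept, so later indices do not shift.
const nulled = original.slice();
nulled[1] = null;

// "splice" semantics: the element is removed, so later indices shift down.
const spliced = original.slice();
spliced.splice(1, 1);

console.log(nulled);  // [ 'a', null, 'c' ]
console.log(spliced); // [ 'a', 'c' ]
```

The trade-off: null preserves the positions of the other elements, while splice keeps the array dense but invalidates any indices clients may be holding.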

Related

firestore remove documents by field value

Each of the docs in my Firestore db contains one field: {name: 'some value'}.
I would like to loop through all the docs, and if a doc's field value is equal to my param, remove that doc.
I'm trying to do it like so:
removeContact: function(name){
  console.log('removing contact', name)
  db.collection("contacts").forEach(doc => {
    if (doc.data().name === name) {
      doc.delete()
    }
  })
}
but I get the error that forEach() is not defined.
You need to use .get() following the collection or query to get a query snapshot promise, which you then handle accordingly. You can use forEach on the snapshot and delete each doc.
A better way, instead of searching through every document and using an if statement, would be to use a query like where('name', '==', name) and delete the document that way. Using a query would leave less for your function to do.
To delete a document, you need to know the full path to that document. And since you don't know the document IDs, that means you'll first need to read the documents.
The good news is that this means you can also perform a query to filter only the documents you're interested in, instead of doing a client-side if.
db.collection("contacts").where("name", "==", name)
  .get()
  .then((querySnapshot) => {
    querySnapshot.forEach(doc => {
      doc.ref.delete()
    })
  })
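For intuition, the where() clause performs exactly the filter the client-side if was doing, just on the server, so only matching documents ever reach the client. A plain-JavaScript sketch with made-up data:

```javascript
// Made-up documents standing in for the "contacts" collection.
const docs = [
  { id: 'a', name: 'alice' },
  { id: 'b', name: 'bob' },
  { id: 'c', name: 'alice' }
];

// where("name", "==", name) would return exactly this subset.
const name = 'alice';
const matching = docs.filter(doc => doc.name === name);
const idsToDelete = matching.map(doc => doc.id);

console.log(idsToDelete); // [ 'a', 'c' ]
```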

Podio API JS - Update relationship field of a Item

Using NodeJS, I am trying to update a relationship field that links to another app (contacts-leads). I have tried every combination but am still getting an error. I think I have the necessary data to post: app_id, item_id, external_id, etc. I need help with forming the JSON structure.
p.request('put', 'item/<Item_Id>/value', data)
where data is an object built from some combination of:
app_id: '<app_id>'
value: '<value>' (the value is the app_item_id of the item in the linked application; that is the number in the URL)
app_item_id: '<app_item_id>'
external_id: '<external_id>'
I was able to update non-relationship field without problem.
Thanks
Well, I'm going to answer my own question. The following works for a single app link; I'm not sure about multiple ones.
data = {
  "<external_id>": {
    "apps": [{ "app_id": <app_id> }],
    "value": <app_item_id>
  }
}
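To keep the payload shape testable before sending it with p.request, it can help to build it with a small helper. This is my own sketch, not part of the Podio SDK, and the function and argument names are hypothetical:

```javascript
// Hypothetical helper: builds the payload shape from the answer above.
// externalId is the relationship field's external_id; appId and appItemId
// identify the linked app and the target item.
function buildAppRelationPayload(externalId, appId, appItemId) {
  return {
    [externalId]: {
      apps: [{ app_id: appId }],
      value: appItemId
    }
  };
}

console.log(JSON.stringify(buildAppRelationPayload('lead-contact', 123, 456)));
```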

how to populate keystoneJS relationship

In KeystoneJS's docs:
Populating related data in queries
You can populate related data for relationship fields thanks to Mongoose's populate functionality. To populate the author and category documents when loading a Post from the example above, you would do this:
Post.model.findOne().populate('author categories').exec(function(err, post) {
  // the author is a fully populated User document
  console.log(post.author.name);
});
My question: are there any options I can configure so that these List APIs populate the many-relationship automatically?
Thanks.
I think not. This is how I do it when I use Keystone as an API (using .populate).
exports.getStoreWithId = function (req, res) {
  Store.model
    .find()
    .populate('productTags productCategories')
    .where('_id', req.params.id)
    .exec(function (err, item) {
      if (err) return res.apiError('database error', err);
      res.apiResponse({
        store: item,
      });
    });
};
Pretty sure the short answer here is no. If you want to populate you'll need to include the .populate.
That being said, keystone gives you access to the mongoose schema, so the answer here should work. Obviously their mongoose.Schema is done by your Post.add stuff so I think you can ignore their first snippet, and you should be able to add the hooks as Post.schema.pre(... for the second snippet.
The Post.schema.pre('save',... hooks definitely work with keystone, so I assume the pre-find hooks work too, however I've not actually tested this. (I'd be interested to know the outcome though!)
Finally, if that works, you could also have a look at the mongoose-autopopulate package, and see if you can get that to play nicely with keystone.

Right way to dynamically update view in Angular

What is the right way to update the model in the view, say after a successful API POST? I have a textarea, something like in Twitter, where a user can enter text and post it. The entered text must show up as soon as it is posted successfully.
How to achieve this? Should I make another call to get the posts separately or is there any other way to do this?
My Code looks like
feedsResolve.getFeeds().then(function(feeds){
  $scope.feeds = feeds;
});
where feedsResolve is a service returning a promise
$scope.postFeed = function(){
  var postObj = Restangular.all('posts');
  postObj.post( $scope.feed.text ).then(function(res){
    // res contains only the new feed id
  })
}
How do I update the $scope.feeds in the view?
I assume you are posting a new post and that generally posts look like:
{
  id: 42,
  text: 'This is my text'
}
In this case you can do something like:
$scope.postFeed = function(){
  var postObj = Restangular.all('posts');
  var feedText = $scope.feed.text;
  postObj.post( feedText ).then(function(res){
    $scope.feeds.push({ id: res.id, text: feedText });
  })
};
A better practice when writing a RESTful service, though, is to have your POST return an actual JSON object with the new feed that was added (not just the id). If that were the case you could just add it to your feeds array.
If your JSON object is complex, this practice is the most common and easiest way to handle the update without needing extra requests to the server. Since you are already on the server, and you've likely already created the object (in order to insert it into the database), all you have to do is serialize it back out to the HTTP response. This adds little to no overhead and gives the client all the information it needs to update effortlessly.
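If the POST echoes the created post back, the client-side update reduces to a plain append. A pure-JavaScript sketch (Restangular omitted; the helper name is my own):

```javascript
// Hypothetical sketch: the server echoes the full created post as JSON,
// so the client just appends it to the existing feeds array.
function appendCreatedPost(feeds, createdPost) {
  return feeds.concat([createdPost]);
}

const feeds = [{ id: 41, text: 'First post' }];
const updated = appendCreatedPost(feeds, { id: 42, text: 'This is my text' });
console.log(updated.length); // 2
```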

Using meteor subscribe onReady function followed by observe results in repeated data

I use datatables on the client to allow speedy live sorting/filtering of around 10,000 rows of data. It is much faster to supply an array of rows to a DataTable during table creation than to add the rows individually. I can use the onReady function in subscribe to achieve this.
If I then call observe to pick up changes, I get the data already supplied in subscribe again.
While I can hack around this, I presume I am just not using meteor correctly and appreciate any advice.
Here is some sample code:
Meteor.subscribe("books", function(){
  // Runs when subscription is complete
  var mData = Books.find().fetch();
  MyTable = $('#testTable').dataTable({
    'aoColumns': [
      { sTitle: 'title', sClass: 'alignRight', mDataProp: 'title' },
    ],
    'aaData': mData
  });
  // Add any new books.
  Books.find().observe({
    added: function(item){
      // ERR: Adds the books already fetched into mData as well as any new books.
      MyTable.fnAddData([item]);
    }
  });
});
There's a hidden option to observe ({_suppress_initial: true}) that avoids this behaviour. I'm not sure if it's a good idea to use it, but it is there.
As for advice around how to structure your code: it's not as easy as it should be, but I think you want to do something like the following:
Wrap your table in a {{#constant}} helper so it never gets re-rendered.
Make sure the table doesn't get rendered the one-and-only time until the data is ready (this could help: https://github.com/oortcloud/unofficial-meteor-faq#how-do-i-know-when-my-subscription-is-ready-and-not-still-loading)
Do your code above in the table's Template.table.rendered callback.
That approach seems more modular.
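If you'd rather not rely on the undocumented _suppress_initial flag, the same effect can be achieved manually by dropping the observe callbacks fired for documents already in the initial fetch. A plain-JavaScript sketch (the names are hypothetical, not Meteor API):

```javascript
// The handler drops items until markReady() is called, which you would do
// right after the bulk fetch has been handed to the DataTable.
function makeAddedHandler(onNew) {
  let initialising = true;
  return {
    added: function (item) {
      if (!initialising) onNew(item);
    },
    markReady: function () {
      initialising = false;
    }
  };
}

const received = [];
const handler = makeAddedHandler(item => received.push(item));
handler.added({ title: 'already fetched' }); // dropped: still initialising
handler.markReady();
handler.added({ title: 'new book' });        // kept
console.log(received.length); // 1
```

This works because observe delivers the "added" callbacks for pre-existing documents synchronously, before observe() returns, so flipping the flag afterwards only lets genuinely new documents through.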