How to update values of existing keys or insert new key-value pairs into a MongoDB document based on a Python dictionary using pymongo?

I have a dictionary object in Python:
dic = {"_id": 1,
       "A": 123,
       "B": 234,
       "C": 222}
and a collection in MongoDB that already stores a document with the same _id, which looks like:
mongodoc = {"_id": 1, "A": 233, "B": 234, "D": 999}
I want to update the document in MongoDB directly from the dictionary values using pymongo, so that it becomes:
mongodoc = {"_id": 1, "A": 123, "B": 234, "D": 999, "C": 222}
If a key exists in both the dictionary and the MongoDB document, its value should be updated; otherwise the new key-value pair should be inserted.
I tried using
collection.update({"_id": 1}, {"$set": {...}})
but this requires naming specific keys, which doesn't work for my problem statement. Not sure how to proceed.
Kindly help.

The $set operator takes a dictionary, which will achieve what you are looking to do:
collection.update_one({'_id': dic['_id']}, {'$set': dic})
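To see why a single $set with the whole dictionary is enough, here is a plain-Python sketch of the merge semantics (no database involved; set_merge is just an illustrative helper, not a pymongo API):

```python
def set_merge(stored, update):
    """Mimic what {'$set': update} does to a stored document:
    existing keys are overwritten, new keys are added, other keys are kept."""
    merged = dict(stored)   # copy the stored document
    merged.update(update)   # overwrite/add keys from the update dict
    return merged

mongodoc = {"_id": 1, "A": 233, "B": 234, "D": 999}
dic = {"_id": 1, "A": 123, "B": 234, "C": 222}
print(set_merge(mongodoc, dic))
# {'_id': 1, 'A': 123, 'B': 234, 'D': 999, 'C': 222}
```

Note that "D" survives untouched: $set only writes the fields you pass, it does not replace the whole document.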

Related

How to check if a value in a Python dictionary exists in MongoDB? If it exists, update the value in MongoDB; else insert it

I have a Python dictionary containing an ObjectID, a GUID, and score entries such as {'A': "A's score", 'B': "B's score"}. I have five GUIDs in the dictionary.
Documents in the same format are already stored in MongoDB.
I want to check whether the GUIDs in the Python dict are in the MongoDB collection. If a GUID exists, its document should be updated; otherwise a new document should be inserted.
How can I do this using pyMongo?
Assuming that RESULT is a dict whose keys match your IDs and whose values are your scores, you can do the following:
for id in RESULT:
    collection.update_one({"ID": id}, {"$set": {id: RESULT[id]}}, upsert=True)
The upsert=True argument makes the update insert a new document if one doesn't already exist.
Here are the pymongo docs
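The upsert logic itself can be sketched without a server, using a list of dicts in place of the collection (update_one_upsert is an illustrative stand-in for pymongo's update_one with upsert=True, not a real API):

```python
def update_one_upsert(collection, filter_doc, set_doc):
    """Mimic update_one(filter_doc, {'$set': set_doc}, upsert=True)
    against a list of dicts standing in for a MongoDB collection."""
    for doc in collection:
        if all(doc.get(k) == v for k, v in filter_doc.items()):
            doc.update(set_doc)      # matched: update the existing document
            return
    new_doc = dict(filter_doc)       # no match: insert filter fields + set fields
    new_doc.update(set_doc)
    collection.append(new_doc)

collection = [{"ID": "guid-1", "score": 10}]
RESULT = {"guid-1": 99, "guid-2": 55}
for id in RESULT:
    update_one_upsert(collection, {"ID": id}, {"score": RESULT[id]})
print(collection)
# [{'ID': 'guid-1', 'score': 99}, {'ID': 'guid-2', 'score': 55}]
```

The existing GUID is updated in place; the new GUID is inserted, which is exactly the "update if exists, else insert" behaviour the question asks for.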

How to get keys from the value in redis

I checked the following, but it didn't work:
https://redis.io/commands/keys
KEYS Room:*
1) "Room:120"
2) "Room:121"
3) "Room:122"
The following are the Redis keys/values (stored with HMSET):
Room:120 [SocketId:mOQDJusPjDTBN5L-AAAC,TimeStamp:10-10-2017 12:10:00 AM]
Room:121 ....
Room:122 ....
...
I need to search across Room:* for SocketId mOQDJusPjDTBN5L-AAAC.
How can I search for a SocketId in the collection, given only the value:
mOQDJusPjDTBN5L-AAAC
The question is not entirely clear.
Since you mentioned HMSET, I am assuming that you are using hashes to store your data.
Per your data, 'Room:120' would be the key, 'SocketId' the field, and 'mOQDJusPjDTBN5L-AAAC' the value.
To search for the SocketId field, you can use HSCAN, which iterates through the fields of a particular key: https://redis.io/commands/scan
If instead you are using plain key/value storage, i.e. 'SocketId' being the key and 'mOQDJusPjDTBN5L-AAAC' being the value, you can use the command KEYS *SocketId* to search for the key.
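The lookup the answer describes (walk every Room:* key, read its SocketId field, compare) can be sketched in plain Python, with a dict of dicts standing in for the Redis hashes (find_rooms_by_socket is an illustrative helper; with redis-py the equivalent would scan keys and read the hash field per key):

```python
# Dict-of-dicts stand-in for the Redis hashes Room:120, Room:121, ...
hashes = {
    "Room:120": {"SocketId": "mOQDJusPjDTBN5L-AAAC",
                 "TimeStamp": "10-10-2017 12:10:00 AM"},
    "Room:121": {"SocketId": "xYz-other-socket",
                 "TimeStamp": "10-10-2017 12:11:00 AM"},
}

def find_rooms_by_socket(hashes, socket_id):
    """Return the keys of all Room:* hashes whose SocketId field matches."""
    return [key for key, fields in hashes.items()
            if key.startswith("Room:") and fields.get("SocketId") == socket_id]

print(find_rooms_by_socket(hashes, "mOQDJusPjDTBN5L-AAAC"))
# ['Room:120']
```

In real Redis this is a full scan over the Room:* keys; if the lookup is frequent, keeping a reverse-index key per socket (e.g. SocketId:<id> mapping back to the room) avoids scanning entirely.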

std::map issue while auto sorting string as key

I am using std::map to add values:
std::map<CString, int>
The following is the list of items to be inserted as keys:
X_101
X_101_A
X_70
X_67
I was expecting a sorted map with:
X_67
X_70
X_101
X_101_A
but I am getting the result as:
X_101
X_101_A
X_67
X_70
Is there a way that I can sort the keys properly in the map?

Redis: How to increment hash key when adding data?

I'm iterating through data and dumping some to a Redis DB. Here's an example:
hmset id:1 username "bsmith1" department "accounting"
How can I increment the unique ID on the fly and then use that during the next hmset command? This seems like an obvious ask but I can't quite find the answer.
Use another key, a String, for storing the last ID. Before calling HMSET, call INCR on that key to obtain the next ID. Wrap the two commands in a MULTI/EXEC block or a Lua script to ensure the atomicity of the transaction.
As Itamar mentions, you can store your index/counter in a separate key. In this example I've chosen the name index for that key.
Python 3
KEY_INDEX = 'index'
r = redis.from_url(host)
def store_user(user):
    # INCR is atomic: it creates the key at 0 if missing, increments it,
    # and returns the new value as an int, so no separate GET/decode is needed
    # (reading the counter back with GET would race with other writers).
    index = r.incr(KEY_INDEX)
    result = r.set('user::%d' % index, user)
    ...
Note that user::<index> is an arbitrary key chosen by me. You can use whatever you want.
If you have multiple machines writing to the same DB you probably want to use pipelines.
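The counter pattern itself is easy to see with an in-memory stand-in for the two Redis operations involved (FakeRedis is an illustrative toy, not a real client):

```python
class FakeRedis:
    """Toy stand-in for the two Redis commands the pattern needs: INCR and SET."""

    def __init__(self):
        self.store = {}

    def incr(self, key):
        """Like Redis INCR: create the key at 0 if missing, add 1, return the new value."""
        self.store[key] = self.store.get(key, 0) + 1
        return self.store[key]

    def set(self, key, value):
        self.store[key] = value

r = FakeRedis()
for name in ["bsmith1", "jdoe2"]:
    new_id = r.incr("index")          # 1 on the first call, 2 on the second
    r.set("user::%d" % new_id, name)  # stores under user::1, then user::2

print(sorted(k for k in r.store if k.startswith("user::")))
# ['user::1', 'user::2']
```

Each iteration gets a fresh ID from the counter and uses it in the next key, which is exactly the "increment on the fly, then use it in the next command" the question asks about.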

Generate dynamic hstore key calls in Prawn

I have an hstore column that I'm using to build a table in Prawn (a PDF builder). The data will consist of records for a given month. Since it is hstore, the keys used will likely change from day to day, so this needs to be dynamic.
I need to determine which unique keys are used that month.
I created a helper to find the unique keys used in the month. These will be used as column headers.
keys(@users_logs)
# this returns an array like ["XC", "PIC", "Mountain"]
The table will display a user's dutylog data for the month. For testing, if I explicitly call known hstore keys, the data displays correctly. But since it's hstore, I won't know what the table columns will be in production.
For testing, I call known hstore keys; this creates the Prawn table row data per duty log:
@users_logs.map do |dutylog|
  [ dutylog.properties["XC"],
    dutylog.properties["PIC"],
    dutylog.properties["Mountain"]
  ]
end
But since this is hstore, I won't know what keys to call in production, so I need to make the above iteration dynamic.
I tried, without success, to iterate over each dutylog entry, then iterate over each unique key and output one dutylog.properties[k] call per key, but this just outputs the array of keys. I tried using send() in the block, but that didn't help.
@users_logs.map do |dutylog|
  [ keys(@users_logs).each { |k| dutylog.properties[k] }.join(",") ]
end
Any ideas on how I could make the dutylog.properties[k] call dynamic?
Took some head scratching... but it turned out to be quite easy.
This builds the rows for the Prawn table (each log gets one cell per unique key, falling back to "0" when the key is absent):
def hstore_duty_log_rows
  [keys(@users_logs)] +
    @users_logs.map do |dutylog|
      keys(@users_logs).map { |key| dutylog.properties.keys.include?(key) ? "#{dutylog.properties[key]}" : "0" }
    end
end