I have an array of values; some of them may already be present in my database. I want to upsert the new values and increment the count of the existing ones.
One way to do this is:
For all the values that are already present in the database, run an update command that increments count. This can be done with: db.test.update({ link: {$in: ["A", "B"]}}, {$inc: {count: 1}}, {upsert: true, multi: true})
For all the values not present in the database, check each value individually and upsert it.
The second step may put some load on the network. Is there any way to do it in one command?
For example consider this:
Initial state of my database:
{ "_id" : ObjectId("5a45f97f84527190e1f28cb7"), "link" : "A", "count" : 3 }
and I have the following array: const values = ['A', 'B', 'C']
Now I want to have something like this:
{"_id": ObjectId("abc"), "link": "A", "count" : 4},
{"_id": ObjectId("xyz"), "link": "B", "count" : 1},
{"_id": ObjectId("fgh"), "link": "C", "count" : 1}
To achieve the second step (as you mentioned) optimally, you can perform a lookup in your DB using the find() method.
If an array value isn't present in the DB, find() returns an empty result, so you can check the length of what it returns: if empty, insert into the DB; otherwise, update the existing document.
Example code for a Node.js environment:
var mongojs = require('mongojs'); // assumes the mongojs driver is installed
var db = mongojs('dbname', ['collection_name']);
var theLink = 'XYZ';

db.collection_name.find({link: theLink}, function (err, obj) {
    if (err)
        throw err;
    if (obj.length === 0) {
        // not found: insert logic goes here
    } else {
        // found: update logic goes here
    }
});
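For completeness, both steps can be collapsed into a single round trip with a bulk write: one upsert-with-$inc operation per value. A minimal sketch in Python, with the operations built as plain documents so the idea is visible without a live connection (the collection name test and the values list are taken from the question; the commented pymongo call is the assumed execution site):

```python
values = ["A", "B", "C"]

# One upsert per value: $inc sets count to 1 when the document is
# inserted and increments it when the link already exists, so both
# steps of the question collapse into a single bulk call.
ops = [
    {"filter": {"link": v}, "update": {"$inc": {"count": 1}}, "upsert": True}
    for v in values
]

# With pymongo this would be sent in one round trip, e.g.:
#   from pymongo import UpdateOne
#   db.test.bulk_write([UpdateOne(o["filter"], o["update"], upsert=True)
#                       for o in ops])
```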
Related
I'm new to MongoDB. I've an object below
{
"_id" : "ABCDEFGH1234",
"level" : 0.6,
"pumps" : [
{
"pumpNo" : 1
},
{
"pumpNo" : 2
}
]
}
And I just want to move level field to pumps array's objects like this
{
"_id" : "ABCDEFGH1234",
"pumps" : [
{
"pumpNo" : 1,
"level" : 0.6
},
{
"pumpNo" : 2,
"level" : 0.6
}
]
}
I've checked the Aggregation section of the MongoDB docs but didn't find anything. In SQL I could do this with a JOIN or a subquery, but this is NoSQL.
Could you please help me with this? Thank you
Try this on for size:
db.foo.aggregate([
    // Run the existing pumps array through $map and, for each
    // item (the "in" clause), create a doc with the existing
    // pumpNo plus the level field from the enclosing doc. All "peer"
    // fields to 'pumps' are addressable as $field.
    // By $projecting to a same-named field (pumps), we effectively
    // overwrite the old pumps array with the new one.
    {$project: {pumps: {$map: {
        input: "$pumps",
        as: "z",
        in: {pumpNo: "$$z.pumpNo", level: "$level"}
    }}}}
]);
I strongly recommend exploring the power of $map, $reduce, $concatArrays, $slice, and the other array operators that set the MongoDB query language apart from SQL's more scalar-based approach.
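To make the $map stage concrete, here is what it computes, expressed in plain Python against the sample document from the question (a sketch only; the real work happens server-side in the aggregation pipeline):

```python
doc = {
    "_id": "ABCDEFGH1234",
    "level": 0.6,
    "pumps": [{"pumpNo": 1}, {"pumpNo": 2}],
}

# Equivalent of the $map "in" clause: for each pump, keep pumpNo and
# copy the document-level "level" field into the element.
projected = {
    "_id": doc["_id"],
    "pumps": [{"pumpNo": p["pumpNo"], "level": doc["level"]}
              for p in doc["pumps"]],
}
# projected["pumps"] → [{'pumpNo': 1, 'level': 0.6}, {'pumpNo': 2, 'level': 0.6}]
```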
Is it possible to bulk update (upsert) an array of documents with MongoDB by an array of replacement fields (documents)?
Basically, I want to get rid of the for loop in this pseudocode example:
for user in users {
    db.users.replaceOne(
        { "name" : user.name },
        user,
        { "upsert": true }
    )
}
The updateMany documentation only covers the case where every document is updated in the same way:
db.collection.updateMany(
<query>,
{ $set: { status: "D" }, $inc: { quantity: 2 } },
...
)
I am trying to update (upsert) an array of documents where each document has its own set of replacement fields:
updateOptions := options.UpdateOptions{}
updateOptions.SetUpsert(true)
updateOptions.SetBypassDocumentValidation(false)
_, error := collection.Col.UpdateMany(ctx, bson.M{"name": bson.M{"$in": names}}, bson.M{"$set": users}, &updateOptions)
Where users is an array of documents:
[
{ "name": "A", ...further fields},
{ "name": "B", ...further fields},
...
]
Apparently, $set cannot be used for this case since I receive the following error: Error while bulk writing *v1.UserCollection (FailedToParse) Modifiers operate on fields but we found type array instead.
Any help is highly appreciated!
You may use Collection.BulkWrite().
Since you want to update each document differently, you have to prepare a different mongo.WriteModel for each document update.
You may use mongo.ReplaceOneModel for individual document replaces. You may construct them like this:
wm := make([]mongo.WriteModel, len(users))
for i, user := range users {
    wm[i] = mongo.NewReplaceOneModel().
        SetUpsert(true).
        SetFilter(bson.M{"name": user.Name}).
        SetReplacement(user)
}
And you may execute all the replaces with one call like this:
res, err := coll.BulkWrite(ctx, wm)
Yes, we still have a loop here, but it only prepares the write models we want to carry out. All of them are sent to the database in a single call, and the database is "free" to carry them out in parallel if possible. This is likely to be significantly faster than calling Collection.ReplaceOne() for each document individually.
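The same pattern exists in other drivers. For comparison, a sketch of it in Python/pymongo terms, with the write models built as plain documents so it runs without a connection (the users list and its fields are hypothetical; the commented bulk_write call is the assumed execution site):

```python
users = [
    {"name": "A", "age": 30},  # hypothetical user documents
    {"name": "B", "age": 25},
]

# One replace-with-upsert model per user, mirroring the Go code above.
models = [
    {"filter": {"name": u["name"]}, "replacement": u, "upsert": True}
    for u in users
]

# With pymongo, the single-call equivalent of coll.BulkWrite would be:
#   from pymongo import ReplaceOne
#   coll.bulk_write([ReplaceOne(m["filter"], m["replacement"], upsert=True)
#                    for m in models])
```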
I've got two collections in MongoDB called collection_A and collection_B.
Essentially, I want to copy over a field from collection B given a condition in collection A but I'm not quite sure how to do this via PyMongo.
Collection Schemas:
collection_A
{
"ID": ObjectId(1234567),
"match": "yes"
}
collection_B
{
"B_ID": 1234567,
"add_field": "field_name"
}
In pseudocode, what I aim to do is:
if collection_A match = "yes":
loop through records
if collection_A ID = collection_B B_ID:
add "add_field": "field_name" field to collection_A
What I have so far:
values = collection_A.find({"match": "yes"})
for doc in values:
TO-DO
I've looked around a lot on how to do this via PyMongo but was unable to find much. If anyone can help me with this problem, or direct me to useful links, that'd be helpful. Thank you
This example copies the required field from B to A for every record that matches on ID. The comments explain each step.
for a_record in db.collection_A.find({'match': 'yes'}): # Loop through collection A based on filter
b_record = db.collection_B.find_one({'B_ID': a_record.get('ID')}, {'_id': 0, 'add_field': 1}) # Lookup corresponding record in Collection B
if b_record is not None: # B record will be None if no match
db.collection_A.update_one({'_id': a_record['_id']}, {'$set': b_record}) # Update A with the value from B
else:
db.collection_A.update_one({'_id': a_record['_id']}, {'$set': {'add_field': None}})
Worked example with data:
from pymongo import MongoClient
db = MongoClient()['mydatabase']
db.collection_A.insert_one({
"ID": 1234567,
"match": "yes"
})
db.collection_B.insert_one({
"B_ID": 1234567,
"add_field": "field_name"
})
for a_record in db.collection_A.find({'match': 'yes'}): # Loop through collection A based on filter
b_record = db.collection_B.find_one({'B_ID': a_record.get('ID')}, {'_id': 0, 'add_field': 1}) # Lookup corresponding record in Collection B
if b_record is not None: # B record will be None if no match
db.collection_A.update_one({'_id': a_record['_id']}, {'$set': b_record}) # Update A with the value from B
print(list(db.collection_A.find({})))
prints:
[{'_id': ObjectId('5fc555172bc0555f17ccb918'), 'ID': 1234567, 'match': 'yes', 'add_field': 'field_name'}]
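If the per-document round trips are a concern, the same copy can also be pushed entirely server-side with an aggregation pipeline: $lookup to join collection_B, then $merge to write the result back into collection_A (both stages require MongoDB 4.2+). This is a sketch under those assumptions, with the pipeline built as a plain Python list; the field names come from the question:

```python
pipeline = [
    {"$match": {"match": "yes"}},
    # Join each A record with the B record whose B_ID equals its ID.
    {"$lookup": {
        "from": "collection_B",
        "localField": "ID",
        "foreignField": "B_ID",
        "as": "b",
    }},
    # Lift add_field out of the joined array (null when there is no match).
    {"$set": {"add_field": {"$arrayElemAt": ["$b.add_field", 0]}}},
    {"$unset": "b"},
    # Write the result back into collection_A.
    {"$merge": {"into": "collection_A", "whenMatched": "merge"}},
]
# db.collection_A.aggregate(pipeline)  # executes entirely on the server
```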
I have this collection:
{
username : "user1",
arr : [
{
name : "test1",
times : 0
},
{
name : "test2",
times : 5
}
]
}
I have an array containing some objects. Each object has a name and a times value. Now I want to add new objects if my array doesn't already contain them. Example:
The two objects with the names "test1" and "test2" are already in the collection. I now want to insert the objects "test2", "test3" and "test4". It should only add "test3" and "test4" to the array, not "test2" again. The times value doesn't matter in this case; it should just be 0 when the object is inserted.
Is there a way to do this with one query?
If you can insert test1, test2, ... one by one, then you can do something like this:
db.collection.update(
    {username: "user1", 'arr.name': {$ne: 'test2'}},
    {$push: {
        arr: {'name': 'test2', 'times': 0}
    }}
)
The $ne condition will prevent the update if the name is already present in arr.
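To cover several names in one round trip rather than one update call per name, the same $ne + $push trick can be batched with a bulk write, one operation per candidate name. A sketch with the operations built as plain documents (the names come from the question; the commented pymongo call is the assumed execution site):

```python
new_items = ["test2", "test3", "test4"]

# One conditional push per name: the $ne filter makes the update a
# no-op when the name is already in arr, so duplicates are skipped.
ops = [
    {
        "filter": {"username": "user1", "arr.name": {"$ne": name}},
        "update": {"$push": {"arr": {"name": name, "times": 0}}},
    }
    for name in new_items
]

# With pymongo, a single call would apply them all:
#   from pymongo import UpdateOne
#   db.collection.bulk_write([UpdateOne(o["filter"], o["update"]) for o in ops])
```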
You can now use the $addToSet operator, which is built for exactly this: it adds a value to an array only if the value is not already present. Note that $addToSet compares the entire embedded document, so {name: "test2", times: 0} and {name: "test2", times: 5} count as different values.
Documentation: https://docs.mongodb.com/manual/reference/operator/update/addToSet/
MongoDB seems to interpret $set paths with numerical components as object keys rather than array indexes if the field has not already been created as an array.
> db.test.insert({_id: "one"});
> db.test.update({_id: "one"}, {$set: {"array.0.value": "cheese"}});
> db.test.find({_id: "one"})
{ "_id" : "one", "array" : { "0" : { "value" : "cheese" } } }
I expected to get "array": [{"value": "cheese"}], but instead it was initialized as an object with a key with the string "0".
I could get an array by initializing the whole array, like so:
> db.test.update({_id: "one"}, {$set: {"array": [{"value": "cheese"}]}});
... but this would clobber any existing properties and other array elements that might have been previously set.
Is there any way to convince $set that I want "array" to be an array type, with the following constraints:
I want to execute this in a single query, without looking up the record first.
I want to preserve any existing array entries and object values
In short, I want the behavior of $set: {"array.0.value": ... } if "array" had already been initialized as an array, without knowing whether or not it has. Is this possible?
I am not sure this is possible without a lookup. Perhaps you can change the schema design and try something like this:
db.test.insert({_id: "one"});
db.test.update({_id: "one"}, {$addToSet: {array: { $each:['cheese', 'ham'] }}});
db.test.findOne({_id:'one'});
// { "_id" : "one", "array" : [ "cheese", "ham" ] }
Handling array elements (sub-documents in arrays) in MongoDB is a pain: https://jira.mongodb.org/browse/SERVER-1243