Move an element from one array to another within the same document in MongoDB

I have data that looks like this:
{
"_id": ObjectId("4d525ab2924f0000000022ad"),
"array": [
{ id: 1, other: 23 },
{ id: 2, other: 21 },
{ id: 0, other: 235 },
{ id: 3, other: 765 }
],
"zeroes": []
}
I would like to $pull an element from one array and $push it to a second array within the same document, to end up with something that looks like this:
{
"_id": ObjectId("id"),
"array": [
{ id: 1, other: 23 },
{ id: 2, other: 21 },
{ id: 3, other: 765 }
],
"zeroes": [
{ id: 0, other: 235 }
]
}
I realize that I can do this by doing a find and then an update, i.e.
db.foo.findOne({"_id": param._id})
.then((doc)=>{
db.foo.update(
{
"_id": param._id
},
{
"$pull": {"array": {id: 0}},
"$push": {"zeroes": {doc.array[2]} }
}
)
})
I was wondering if there's an atomic function that I can do this with.
Something like:
db.foo.update({"_id": param._id}, {"$move": [{"array": {id: 0}}, {"zeroes": 1}]})
I found this post, which generously provided the data I used, but the question remains unsolved after 4 years. Has a solution to this been crafted in the meantime?
Move elements from $pull to another array

There is no $move in MongoDB. That being said, the easiest solution is a 2 phase approach:
Query the document
Craft the update with a $pull and $push/$addToSet
The important part here, to make sure everything is idempotent, is to include the original array document in the query for the update.
Given a document of the following form:
{
_id: "foo",
arrayField: [
{
a: 1,
b: 1
},
{
a: 2,
b: 1
}
]
}
Let's say you want to move { a: 1, b: 1 } to a different field, maybe called someOtherArrayField. You would want to do something like this:
var doc = db.col.findOne({ _id: "foo" });
var arrayDocToMove = doc.arrayField[0];
db.col.update(
  { _id: "foo", arrayField: { $elemMatch: arrayDocToMove } },
  { $pull: { arrayField: arrayDocToMove }, $addToSet: { someOtherArrayField: arrayDocToMove } }
)
The reason we use $elemMatch is to be sure the element we are about to remove from the array hasn't changed since we first queried the document. Strictly speaking it isn't required when coupled with a $pull, and if there is no parallelism in your application and you only run a single application instance it matters even less, but I am typically overly cautious in these situations.
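If you want to act on that caution, the write result tells you whether the guard fired; a minimal sketch, using the same doc and arrayDocToMove variables as above:
// Minimal sketch: if another writer changed or removed the element between the
// findOne and the update, the filter no longer matches and matchedCount is 0,
// so nothing is moved and you can re-read and retry.
var res = db.col.updateOne(
  { _id: "foo", arrayField: { $elemMatch: arrayDocToMove } },
  { $pull: { arrayField: arrayDocToMove }, $addToSet: { someOtherArrayField: arrayDocToMove } }
);
if (res.matchedCount === 0) {
  // the array changed since the read; re-read the document and retry (or give up)
}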
Now when we check the resulting document, we get:
db.col.findOne()
{
"_id" : "foo",
"arrayField" : [
{
"a" : 2,
"b" : 1
}
],
"someOtherArrayField" : [
{
"a" : 1,
"b" : 1
}
]
}
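There is still no $move, but if you are on MongoDB 4.2 or newer, an aggregation-pipeline update can do the pull and the push in a single atomic statement. A hedged sketch against the same document, assuming the element to move is identified by a: 1:
db.col.updateOne({ _id: "foo" }, [
  {
    $set: {
      // append the matching element(s) to the destination array (creating it if absent)...
      someOtherArrayField: {
        $concatArrays: [
          { $ifNull: ["$someOtherArrayField", []] },
          { $filter: { input: "$arrayField", cond: { $eq: ["$$this.a", 1] } } }
        ]
      },
      // ...and keep only the non-matching elements in the source array;
      // both expressions read the pre-update value of arrayField.
      arrayField: {
        $filter: { input: "$arrayField", cond: { $ne: ["$$this.a", 1] } }
      }
    }
  }
]);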

Related

Mongo updateMany statement with an inner array of objects to manipulate

I'm struggling to write a Mongo UpdateMany statement that can reference and update an object within an array.
Here I create 3 documents. Each document has an array called innerArray always containing a single object, with a single date field.
use test;
db.innerArrayExample.insertOne({ _id: 1, "innerArray": [ { "originalDateTime" : ISODate("2022-01-01T01:01:01Z") } ]});
db.innerArrayExample.insertOne({ _id: 2, "innerArray": [ { "originalDateTime" : ISODate("2022-01-02T01:01:01Z") } ]});
db.innerArrayExample.insertOne({ _id: 3, "innerArray": [ { "originalDateTime" : ISODate("2022-01-03T01:01:01Z") } ]});
I want to add a new date field, based on the original date field, to end up with this:
{ _id: 1, "innerArray": [ { "originalDateTime" : ISODate("2022-01-01T01:01:01Z"), "copiedDateTime" : ISODate("2022-01-01T12:01:01Z") } ]}
{ _id: 2, "innerArray": [ { "originalDateTime" : ISODate("2022-01-02T01:01:01Z"), "copiedDateTime" : ISODate("2022-01-02T12:01:01Z") } ]}
{ _id: 3, "innerArray": [ { "originalDateTime" : ISODate("2022-01-03T01:01:01Z"), "copiedDateTime" : ISODate("2022-01-03T12:01:01Z") } ]}
In pseudo code I am saying take the originalDateTime, run it through a function and add a related copiedDateTime value.
For my specific use-case, the function I want to run strips the timezone from originalDateTime, then overwrites it with a new one, equivalent to the Java ZonedDateTime function withZoneSameLocal. Aka 9pm UTC becomes 9pm Brussels (therefore effectively 7pm UTC). The technical justification and methodology were answered in another Stack Overflow question here.
The part of the query I'm struggling with is the part that updates/selects data from an element inside an array. For my simplistic example I have crafted this query, but unfortunately it doesn't work:
This query puts copiedDateTime in the correct place... but doesn't evaluate the operators that manipulate the date:
db.innerArrayExample.updateMany(
  { "innerArray.0.originalDateTime": { $exists: true } },
  { $set: {
      "innerArray.0.copiedDateTime": {
        $dateFromString: {
          dateString: { $dateToString: { date: "$innerArray.0.originalDateTime", format: "%Y-%m-%dT%H:%M:%S.%L" } },
          format: "%Y-%m-%dT%H:%M:%S.%L",
          timezone: "Europe/Paris"
        }
      }
  } }
);
// output
{
_id: 1,
innerArray: [
{
originalDateTime: ISODate("2022-01-01T01:01:01.000Z"),
copiedDateTime: {
'$dateFromString': {
dateString: { '$dateToString': [Object] },
format: '%Y-%m-%dT%H:%M:%S.%L',
timezone: 'Europe/Paris'
}
}
}
]
}
This simplified query also has the same issue:
db.innerArrayExample.updateMany(
  { "innerArray.0.originalDateTime": { $exists: true } },
  { $set: { "innerArray.0.copiedDateTime": "$innerArray.0.originalDateTime" } }
);
//output
{
_id: 1,
innerArray: [
{
originalDateTime: ISODate("2022-01-01T01:01:01.000Z"),
copiedDateTime: '$innerArray.0.originalDateTime'
}
]
}
As you can see, this issue looks to be separate from the other Stack Overflow question. Instead of being about changing timezones, it's about getting things inside arrays to update.
I plan to take this query, create 70,000 variations of it with different location/timezone combinations and run it against a database with millions of records, so I would prefer something that uses updateMany instead of using Javascript to iterate over each row in the database... unless that's the only viable solution.
I have tried putting $set in square brackets. This changes the way everything is interpreted, evaluating the right-hand side, but causes other problems:
test> db.innerArrayExample.updateMany({ "_id" : 1 }, [{ $set: { "innerArray.0.copiedDateTime" : "$innerArray.0.originalDateTime" }}]);
//output
{
_id: 1,
innerArray: [
{
'0': { copiedDateTime: [] },
originalDateTime: ISODate("2022-01-01T01:01:01.000Z")
}
]
}
Above, it seems to interpret .0. as a literal field name rather than an array index. (For my needs I know the array only ever has 1 item.) I'm at a loss finding an example that meets my needs.
I have also tried experimenting with arrayFilters, documented in the MongoDB updateMany documentation, but I cannot fathom how they work with objects:
test> db.innerArrayExample.updateMany(
... { },
... { $set: { "innerArray.$[element].copiedDateTime" : "$innerArray.$[element].originalDateTime" } },
... { arrayFilters: [ { "originalDateTime": { $exists: true } } ] }
... );
MongoServerError: No array filter found for identifier 'element' in path 'innerArray.$[element].copiedDateTime'
test> db.innerArrayExample.updateMany(
... { },
... { $set: { "innerArray.$[0].copiedDateTime" : "$innerArray.$[element].originalDateTime" } },
... { arrayFilters: [ { "0.originalDateTime": { $exists: true } } ] }
... );
MongoServerError: Error parsing array filter :: caused by :: The top-level field name must be an alphanumeric string beginning with a lowercase letter, found '0'
If someone can help me understand the subtleties of the Mongo syntax and help me back on to the right path I'd be very grateful.
You want to be using pipelined updates. The issue with the syntax you're using is that a plain update document does not allow aggregation operators or references to existing document field values.
Here is a quick example of how to do it:
db.collection.updateMany({},
[
{
"$set": {
"innerArray": {
$map: {
input: "$innerArray",
in: {
$mergeObjects: [
"$$this",
{
copiedDateTime: "$$this.originalDateTime"
}
]
}
}
}
}
}
])
Mongo Playground
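The example above only copies the value as-is. To also apply the "same local time in Europe/Paris" shift the question describes, the question's own $dateToString/$dateFromString expression can be dropped into the $mergeObjects; a hedged sketch combining the two, assuming that expression does what the question intends:
db.innerArrayExample.updateMany({}, [
  {
    $set: {
      innerArray: {
        $map: {
          input: "$innerArray",
          in: {
            $mergeObjects: [
              "$$this",
              {
                copiedDateTime: {
                  $dateFromString: {
                    // render the original instant as a naive local string...
                    dateString: {
                      $dateToString: { date: "$$this.originalDateTime", format: "%Y-%m-%dT%H:%M:%S.%L" }
                    },
                    // ...then re-interpret that wall-clock time in the target zone
                    format: "%Y-%m-%dT%H:%M:%S.%L",
                    timezone: "Europe/Paris"
                  }
                }
              }
            ]
          }
        }
      }
    }
  }
]);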

Reference value from positional element in array in update

Suppose I have a document that looks like this:
{
"id": 1,
"entries": [
{
"id": 100,
"urls": {
"a": "url-a",
"b": "url-b",
"c": "url-c"
},
"revisions": []
}
]
}
I am trying to add a new object to the revisions array that contains its own urls field. Two of the fields should be copied from the entry's urls, while the last one will be new. The result should look like this:
{
"id": 1,
"entries": [
{
"id": 100,
"urls": {
"a": "url-a",
"b": "url-b",
"c": "url-c"
},
"revisions": [
{
"id": 1000,
"urls": {
"a": "url-a", <-- copied
"b": "url-b", <-- copied
"c": "some-new-url" <-- new
}
}
]
}
]
}
I am on MongoDB 4.2+, so I know I can use $property on the update query to reference values. However, this does not seem to be working as I expect:
collection.updateOne(
{
id: 1,
"enntries.id": 100
},
{
$push: {
"entries.$.revisions": {
id: 1000,
urls: {
"a": "$entries.$.urls.a",
"b": "$entries.$.urls.b",
"c": "some-new-url"
}
}
}
}
);
The element gets added to the array, but all I see for the url values is the literal $entries.$.urls.a value. I suspect the issue is with combining the reference with selecting a specific positional array element. I have also tried using $($entries.$.urls.a), with the same result.
How can I make this work?
Starting from MongoDB 4.2 you can use an aggregation pipeline in updates, which means the update part of the query is wrapped in []. There you can take advantage of aggregation operators and also use existing field values in the update.
Issue:
Since you've not wrapped the update part in [] to say it's an aggregation pipeline, .updateOne() is treating "$entries.$.urls.a" as a plain string. Also, I believe you won't be able to use the $ positional operator in updates that use an aggregation pipeline.
Try the below query, which uses an aggregation pipeline:
collection.updateOne(
{
id: 1,
"entries.id": 100 /** "entries.id" is optional but much needed to avoid execution of below aggregation for doc where `id :1` but no `"entries.id": 100` */,
},
[
{
$set: {
entries: {
$map: { // aggregation operator `map` iterate over array & creates new array with values.
input: "$entries",
in: {
$cond: [
{ $eq: ["$$this.id", 100] }, // `$$this` is current object in array iteration, if condition is true do below functionality for that object else return same object as is to array being created.
{
$mergeObjects: [
"$$this",
{
revisions: { $concatArrays: [ "$$this.revisions", [{ id: 1000, urls: { a: "$$this.urls.a", b: "$$this.urls.b", c: "some-new-url" } } ]] }
}
]
},
"$$this" // Returning same object as condition is not met.
]
}
}
}
}
}
]
);
$mergeObjects replaces the existing revisions field in $$this (the current object) with the value of { $concatArrays: [ "$$this.revisions", [{ id: 1000, urls: { a: "$$this.urls.a", b: "$$this.urls.b", c: "some-new-url" } }] ] }.
Since the field is named revisions and is an array, I've assumed it can hold multiple objects, so we use the $concatArrays operator to push the new object into the revisions array of the particular entries object.
In any case, if your revisions array only ever contains one object, either make it an object instead of an array, or keep it as an array and use the below query, where we've removed $concatArrays because there is no need to merge the new object into an existing revisions array when it only ever holds one object.
collection.update(
{
id: 1,
"entries.id": 100
},
[
{
$set: {
entries: {
$map: {
input: "$entries",
in: {
$cond: [
{ $eq: ["$$this.id", 100] },
{
$mergeObjects: [
"$$this",
{
revisions: [ { id: 1000, urls: { a: "$$this.urls.a", b: "$$this.urls.b", c: "some-new-url" } } ]
}
]
},
"$$this"
]
}
}
}
}
}
]
);
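Not part of the original answer, but if some documents might not have a revisions array at all, wrapping it in $ifNull keeps $concatArrays from failing on a missing field; a hedged variation of the first query under that assumption:
collection.updateOne(
  { id: 1, "entries.id": 100 },
  [
    {
      $set: {
        entries: {
          $map: {
            input: "$entries",
            in: {
              $cond: [
                { $eq: ["$$this.id", 100] },
                {
                  $mergeObjects: [
                    "$$this",
                    {
                      revisions: {
                        $concatArrays: [
                          { $ifNull: ["$$this.revisions", []] }, // fall back to [] when the field is missing
                          [{ id: 1000, urls: { a: "$$this.urls.a", b: "$$this.urls.b", c: "some-new-url" } }]
                        ]
                      }
                    }
                  ]
                },
                "$$this"
              ]
            }
          }
        }
      }
    }
  ]
);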
Test: try your aggregation pipeline here: mongoplayground
Ref: .updateOne()
Note: if .updateOne() throws an error due to an incompatible client or shell, try the query with .update(). Executing an aggregation pipeline in the update saves extra DB calls, and it is most useful on arrays with a small number of elements.

MongoDB: $pull / $unset with multiple conditions

Example Document:
{
_id: 42,
foo: {
bar: [1, 2, 3, 3, 4, 5, 5]
}
}
The query:
I'd like to "remove all entries from foo.bar that are $lt: 4 and the first matching entry that matches $eq: 5". Important: The $eq part must only remove a single entry!
I have a working solution that uses 3 update queries, but that's too much for such a simple task. Nevertheless, here's what I've done so far:
1. Find the first entry matching $eq: 5 and $unset it. (As you know: $unset doesn't remove it. It just sets it to null):
update(
{ 'foo.bar': 5 },
{ $unset: { 'foo.bar.$': 1 } }
)
2. $pull all entries $eq: null, so that former 5 is really gone:
update(
{},
{ $pull: { 'foo.bar': null } }
)
3. $pull all entries $lt: 4:
update(
{},
{ $pull: { 'foo.bar': { $lt: 4 } } }
)
Resulting Document:
{
_id: 42,
foo: {
bar: [4, 5]
}
}
Ideas and Thoughts:
Extend query 1., so that it will $unset the entries $lt: 4 and one entry $eq: 5. Afterwards we can execute query 2. and there's no need for query 3..
Extend query 2. to $pull everything that matches $or: [{$lt: 4}, {$eq: 5}]. Then there's no need for query 3..
Extend query 2. to $pull everything that is $not: { $gte: 4 }. This expression should match $lt: 4 and $eq: null.
I already tried to implement those queries, but sometimes it complained about the query syntax and sometimes the query did execute and just removed nothing.
Would be nice, if someone has a working solution for this.
Not sure if I get your full meaning here, but to "bulk" update documents you can always take this approach: in addition to the original $pull, add some "detection" of which documents you need to remove the "duplicate" 5 from:
// Remove less than four first
db.collection.update({},{ "$pull": { "foo.bar": { "$lt": 4 } } },{ "multi": true });
// Initialize Bulk
var bulk = db.collection.initializeOrderedBulkOp(),
count = 0;
// Detect and cycle documents with duplicate five to be removed
db.collection.aggregate([
// Project a "reduced" array and calculate if the same size as orig
{ "$project": {
"foo.bar": { "$setUnion": [ "$foo.bar", [] ] },
"same": { "$eq": [
{ "$size": { "$setUnion": [ "$foo.bar", [] ] } },
{ "$size": "$foo.bar" }
] }
}},
// Filter to the documents whose arrays actually changed (i.e. contained duplicates)
{ "$match": { "same": false } }
]).forEach(function(doc) {
bulk.find({ "_id": doc._id })
.updateOne({ "$set": { "foo.bar": doc.foo.bar.sort() } });
count++;
// Execute per 1000 processed and re-init
if ( count % 1000 == 0 ) {
bulk.execute();
bulk = db.collection.initializeOrderedBulkOp();
}
});
// Clean up any batched
if ( count % 1000 != 0 )
bulk.execute();
That trims out anything less than "4" and all duplicates where a "duplicate" is detected from the difference in "set" length.
If you just want values of 5 removed as duplicates you can take a similar logic approach to the detection and modification, just not with "set operators" that remove anything that is a "duplicate" making it a valid "set".
At any rate, some detection strategy is going to be better than iterating updates until "all but one" value is gone.
Of course you can simplify your statements a little and remove one update operation. It's not pretty, because $pull does not allow an $or condition in its query, but I hope you get the idea if this applies:
db.collection.update(
{ "foo.bar": 5 },
{ "$unset": { "foo.bar.$": 1 } },
{ "multi": true }
); // same approach
// So include all the values "less than four"
db.collection.update(
{ "foo.bar": { "$in": [1,2,3,null] } },
{ "$pull": { "foo.bar": { "$in": [1,2,3,null] } }},
{ "multi": true }
);
It's a bit less processing but of course those need to be exact integer values. Otherwise stick with the three updates you are doing. Better than cycling in code.
For reference, the "nicer" syntax that will unfortunately not work would be something like this:
db.collection.update(
{
"$or": [
{ "foo.bar": { "$lt": 4 } },
{ "foo.bar": null }
]
},
{
"$pull": {
"$or": [
{ "foo.bar": { "$lt": 4 } },
{ "foo.bar": null }
]
}
},
{ "multi": true }
);
Probably worth a JIRA issue, but I suspect it's mostly because the array field is not the "first" argument directly following $pull.
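For completeness, on MongoDB 4.2 or newer (newer than what this answer assumed) both rules can be expressed in one pipeline update: $filter drops everything less than 4, and a $reduce pass drops only the first 5. A hedged sketch under that assumption:
db.collection.updateMany({}, [
  {
    $set: {
      "foo.bar": {
        $reduce: {
          // First drop everything less than 4, then walk the rest once,
          // skipping only the first 5 that is seen.
          input: { $filter: { input: "$foo.bar", cond: { $gte: ["$$this", 4] } } },
          initialValue: { kept: [], droppedFive: false },
          in: {
            $cond: [
              { $and: [{ $eq: ["$$this", 5] }, { $eq: ["$$value.droppedFive", false] }] },
              { kept: "$$value.kept", droppedFive: true },
              { kept: { $concatArrays: ["$$value.kept", ["$$this"]] }, droppedFive: "$$value.droppedFive" }
            ]
          }
        }
      }
    }
  },
  // The $reduce left an object { kept, droppedFive }; unwrap it back to a plain array.
  { $set: { "foo.bar": "$foo.bar.kept" } }
]);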
You can use the Array.prototype.filter() and Array.prototype.splice() methods.
The filter() method creates a new array with the foo.bar values that are $lt: 4, then you use the splice() method to remove those values and the first value equal to 5 from foo.bar:
var idx = [];
db.collection.find().forEach(function(doc){
  // collect all values less than 4
  idx = doc.foo.bar.filter(function(el){
    return el < 4;
  });
  // remove each collected value from the array
  for(var i in idx){
    doc.foo.bar.splice(doc.foo.bar.indexOf(idx[i]), 1);
  }
  // remove the first value equal to 5, if there is one
  var fiveIdx = doc.foo.bar.indexOf(5);
  if (fiveIdx !== -1) {
    doc.foo.bar.splice(fiveIdx, 1);
  }
  db.collection.save(doc);
})

MongoDB Find Exact Array Match but order doesn't matter

I am querying to find an exact array match, and that works, but when I try to find the exact array with its values in a different order, it fails.
Example
db.coll.insert({"user":"harsh","hobbies":["1","2","3"]})
db.coll.insert({"user":"kaushik","hobbies":["1","2"]})
db.coll.find({"hobbies":["1","2"]})
2nd Document Retrieved Successfully
db.coll.find({"hobbies":["2","1"]})
Showing Nothing
Please help
The currently accepted answer does NOT ensure an exact match on your array, just that the size is identical and that the array shares at least one item with the query array.
For example, the query
db.coll.find({ "hobbies": { "$size" : 2, "$in": [ "2", "1", "5", "hamburger" ] } });
would still return the user kaushik in that case.
What you need to do for an exact match is to combine $size with $all, like so:
db.coll.find({ "hobbies": { "$size" : 2, "$all": [ "2", "1" ] } });
But be aware that this can be a very expensive operation, depending on your amount and structure of data.
Since MongoDB keeps the order of inserted arrays stable, you might fare better by ensuring arrays are in sorted order when inserting into the DB, so that you can rely on a static order when querying.
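A minimal sketch of that "store arrays pre-sorted" idea, assuming hobbies are plain strings as in the question:
// Sort on the way in so the stored order is canonical.
var hobbies = ["2", "1"];
db.coll.insertOne({ user: "newUser", hobbies: hobbies.slice().sort() }); // stored as ["1", "2"]

// With a canonical stored order, a plain (indexable) equality match is enough:
db.coll.find({ hobbies: ["2", "1"].sort() }); // i.e. { hobbies: ["1", "2"] }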
To match the array field exactly, Mongo provides the $eq operator, which can be applied to an array just like a single value.
db.collection.find({ "hobbies": {$eq: [ "singing", "Music" ] }});
Also $eq checks the order in which you specify the elements.
If you use the below query:
db.coll.find({ "hobbies": { "$size" : 2, "$all": [ "2", "1" ] } });
then it will not return only exact matches. Suppose you query:
db.coll.find({ "hobbies": { "$size" : 2, "$all": [ "2", "2" ] } });
This query will return all documents that contain the element "2" and have size 2 (e.g. it will also return the document having hobbies: ["2", "1"]).
MongoDB: filter by exactly these array elements, either without regard to order or in the specified order.
Source: https://savecode.net/code/javascript/mongodb+filter+by+exactly+array+elements+without+regard+to+order+or+specified+order
// Insert data
db.inventory.insertMany([
{ item: "journal", qty: 25, tags: ["blank", "red"], dim_cm: [ 14, 21 ] },
{ item: "notebook", qty: 50, tags: ["red", "blank"], dim_cm: [ 14, 21 ] },
{ item: "paper", qty: 100, tags: ["red", "blank", "plain"], dim_cm: [ 14, 21 ] },
{ item: "planner", qty: 75, tags: ["blank", "red"], dim_cm: [ 22.85, 30 ] },
{ item: "postcard", qty: 45, tags: ["blue"], dim_cm: [ 10, 15.25 ] }
]);
// Query 1: filter by exactly array elements without regard to order
db.inventory.find({ "tags": { "$size" : 2, "$all": [ "red", "blank" ] } });
// result:
[
{
_id: ObjectId("6179333c97a0f2eeb98a6e02"),
item: 'journal',
qty: 25,
tags: [ 'blank', 'red' ],
dim_cm: [ 14, 21 ]
},
{
_id: ObjectId("6179333c97a0f2eeb98a6e03"),
item: 'notebook',
qty: 50,
tags: [ 'red', 'blank' ],
dim_cm: [ 14, 21 ]
},
{
_id: ObjectId("6179333c97a0f2eeb98a6e05"),
item: 'planner',
qty: 75,
tags: [ 'blank', 'red' ],
dim_cm: [ 22.85, 30 ]
}
]
// Query 2: filter by exactly array elements in the specified order
db.inventory.find( { tags: ["blank", "red"] } )
// result:
[
{
_id: ObjectId("6179333c97a0f2eeb98a6e02"),
item: 'journal',
qty: 25,
tags: [ 'blank', 'red' ],
dim_cm: [ 14, 21 ]
},
{
_id: ObjectId("6179333c97a0f2eeb98a6e05"),
item: 'planner',
qty: 75,
tags: [ 'blank', 'red' ],
dim_cm: [ 22.85, 30 ]
}
]
// Query 3: filter by an array that contains both the elements without regard to order or other elements in the array
db.inventory.find( { tags: { $all: ["red", "blank"] } } )
// result:
[
{
_id: ObjectId("6179333c97a0f2eeb98a6e02"),
item: 'journal',
qty: 25,
tags: [ 'blank', 'red' ],
dim_cm: [ 14, 21 ]
},
{
_id: ObjectId("6179333c97a0f2eeb98a6e03"),
item: 'notebook',
qty: 50,
tags: [ 'red', 'blank' ],
dim_cm: [ 14, 21 ]
},
{
_id: ObjectId("6179333c97a0f2eeb98a6e05"),
item: 'planner',
qty: 75,
tags: [ 'blank', 'red' ],
dim_cm: [ 22.85, 30 ]
}
]
This query will find the exact array in either order.
let query = {$or: [
{hobbies:{$eq:["1","2"]}},
{hobbies:{$eq:["2","1"]}}
]};
db.coll.find(query)
With $all we can achieve this.
Query: { cast: { $all: ["James J. Corbett", "George Bickel"] } }
Output: cast: ["George Bickel", "Emma Carus", "George M. Cohan", "James J. Corbett"]
Using aggregate, this is how I got mine working efficiently and faster:
db.collection.aggregate([
  { $unwind: "$array" },
  {
    $match: {
      "array.field": "value"
    }
  }
  // ...further stages go here
])
You can then $group the results to rebuild a flat array from the unwound documents.
This question is rather old, but I was pinged because another answer shows that the accepted answer isn't sufficient for arrays containing duplicate values, so let's fix that.
Since we have a fundamental underlying limitation with what queries are capable of doing, we need to avoid these hacky, error-prone array intersections. The best way to check if two arrays contain an identical set of values without performing an explicit count of each value is to sort both of the arrays we want to compare and then compare the sorted versions of those arrays. Since MongoDB does not support an array sort to the best of my knowledge, we will need to rely on aggregation to emulate the behavior we want:
// Note: make sure the target_hobbies array is sorted!
var target_hobbies = ["1", "2"];
db.coll.aggregate([
{ // Limits the initial pipeline size to only possible candidates.
$match: {
hobbies: {
$size: target_hobbies.length,
$all: target_hobbies
}
}
},
{ // Split the hobbies array into individual array elements.
$unwind: "$hobbies"
},
{ // Sort the elements into ascending order (do 'hobbies: -1' for descending).
$sort: {
_id: 1,
hobbies: 1
}
},
{ // Insert all of the elements back into their respective arrays.
$group: {
_id: "$_id",
__MY_ROOT: { $first: "$$ROOT" }, // Aids in preserving the other fields.
hobbies: {
$push: "$hobbies"
}
}
},
{ // Replaces the root document in the pipeline with the original stored in __MY_ROOT, with the sorted hobbies array applied on top of it.
// Not strictly necessary, but helpful to have available if desired and much easier than a bunch of 'fieldName: {$first: "$fieldName"}' entries in our $group operation.
$replaceRoot: {
newRoot: {
$mergeObjects: [
"$__MY_ROOT",
{
hobbies: "$hobbies"
}
]
}
}
},
{ // Now that the pipeline contains documents with hobbies arrays in ascending sort order, we can simply perform an exact match using the sorted target_hobbies.
$match: {
hobbies: target_hobbies
}
}
]);
I cannot speak for the performance of this query, and it may very well cause the pipeline to become too large if there are too many initial candidate documents. If you're working with large data sets, then once again, do as the currently accepted answer states and insert array elements in sorted order. By doing so you can perform static array matches, which will be far more efficient since they can be properly indexed and will not be limited by the pipeline size limitation of the aggregation framework. But for a stopgap, this should ensure a greater level of accuracy.
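Not part of the original answer: if you are on MongoDB 5.2 or newer, the $sortArray operator makes the $unwind/$sort/$group/$replaceRoot dance unnecessary; a hedged sketch under that assumption:
var target_hobbies = ["1", "2"]; // keep this sorted
db.coll.aggregate([
  // Cheap pre-filter, same idea as before.
  { $match: { hobbies: { $size: target_hobbies.length, $all: target_hobbies } } },
  // Exact comparison against a sorted copy of each document's array.
  { $match: { $expr: { $eq: [{ $sortArray: { input: "$hobbies", sortBy: 1 } }, target_hobbies] } } }
]);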

Update embedded mongoose document in array

Let's say that I have the following document in the books collection:
{
_id:0 ,
item: "TBD",
stock: 0,
info: { publisher: "1111", pages: 430 },
tags: [ "technology", "computer" ],
ratings: [ { _id: id1, by: "ijk", rating: 4 }, { _id: id2, by: "lmn", rating: 5 } ],
reorder: false
}
I would like to update the value of ratings[k].rating, and all I know is the _id of the document and the _id of the object inside the ratings array.
The MongoDB tutorial has the following example, which uses the position of the object inside the array. But if the update can only be done by knowing the position, does that mean I first have to find the position and then proceed with the update? Can I do the update with only one call, and if so, how?
db.books.update(
{ _id: 1 },
{
$inc: { stock: 5 },
$set: {
item: "ABC123",
"info.publisher": "2222",
tags: [ "software" ],
"ratings.1": { by: "xyz", rating: 3 }
}
}
)
Sorry for the late answer; I think this is what you want to do with Mongoose.
Books.findOneAndUpdate({
_id: 1,
'ratings._id': id1
},
{
$set: {
'ratings.$.rating' : 3
}
}, function(err, book){
// Response
});
The positional operator may help you:
db.books.update(
// find book by `book_id` with `rating_id` specified
{ "_id": book_id, "ratings._id": rating_id },
// set new `value` for that rating
{ $set: { 'ratings.$.rating': value }}
);
$ will save the position of the matched array element.
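If you prefer to target the element by its own _id without relying on the positional $ match in the filter, arrayFilters (MongoDB 3.6+) work too; a hedged Mongoose sketch, assuming the same Books model and the id1 value from the question:
Books.updateOne(
  { _id: 1 },
  { $set: { "ratings.$[r].rating": 3 } },
  { arrayFilters: [{ "r._id": id1 }] }
).then(function (result) {
  // inspect result.modifiedCount (nModified on older Mongoose) to confirm a rating changed
});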
