Swap elements in a MongoDB array given only their ids (in-place)

I read some threads about it, such as this one and that one, but they both come with major flaws & caveats (to me at least).
What I want is simple! An update query that:
swaps 2 elements in an array, where each element is a sub-document.
takes as input only the two indexes of the elements that need to be swapped.
AVOIDS using a JavaScript function, if at all possible.
is ATOMIC!
Here is the part I'm struggling with:
Option 1: Using update WITH an aggregation pipeline
( summary: succeeds at $arrayElemAt, fails over at "images.0" )
db.users.update(
{ userID: 1 },
[
{ $set: {
"images.0": { $arrayElemAt: ['$images', 2] },
"images.2": { $arrayElemAt: ['$images', 0] },
}}
]
)
If I use an aggregation pipeline in my update, I get access to the $arrayElemAt operator, which is great! But this solution fails because "images.0" doesn't do what I expect it to. I would have expected it to mean "overwrite the first element inside the images array". Instead, it adds a new field called "0": { ... } to each and every sub-document inside the images array.
Option 2: Using update WITHOUT an aggregation pipeline
( summary: succeeds at "images.0", fails over at $arrayElemAt)
db.users.update(
{ userID: 1 },
{ $set: {
"images.0": { $arrayElemAt: ['$images', 2] },
"images.2": { $arrayElemAt: ['$images', 0] },
}}
)
This one fails & succeeds in exactly the opposite places.
"images.n" DOES replace the n-th element with the given input.
The problem is that, since this is no longer an aggregation pipeline, I have no access to the $arrayElemAt operator, and the new n-th element simply becomes:
{
"$arrayElemAt" : [
"$aboutMe.images",
2.0
]
},
Does anyone know what can be done?

This is a nice question that I haven't come across before.
The catch here is that you need a pipeline in order to reference existing values, but a pipeline prevents you from working with a direct index via dot notation, or even with $push.
Hence, one option is using $reduce:
$set temporary keys holding the wanted values.
$reduce over the array, using the size of $$value (as in this answer) to decide where to substitute the swapped values.
db.users.update(
  {userID: 1},
  [
    {$set: {
      firstItem: {$arrayElemAt: ["$images", 2]},
      secondItem: {$arrayElemAt: ["$images", 0]}
    }},
    {$set: {
      images: {
        $reduce: {
          input: "$images",
          initialValue: [],
          in: {$concatArrays: [
            "$$value",
            {$cond: [
              {$eq: [{$size: "$$value"}, 0]},
              ["$firstItem"],
              {$cond: [
                {$eq: [{$size: "$$value"}, 2]},
                ["$secondItem"],
                ["$$this"]
              ]}
            ]}
          ]}
        }
      },
      firstItem: "$$REMOVE",
      secondItem: "$$REMOVE"
    }}
  ]
)
See how it works on the playground example
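For completeness, here is a minimal sketch (not from the original answer) of how the same two-stage $set pattern could be parameterized for arbitrary indexes i and j supplied by application code; the hard-coded values below are only for illustration:
// Minimal sketch, shell style: swap the elements at indexes i and j of "images".
// i and j are assumed to come from application code; hard-coded here for illustration.
const i = 0, j = 2;
db.users.updateOne({ userID: 1 }, [
  { $set: {
      firstItem: { $arrayElemAt: ["$images", j] },   // element that will land at index i
      secondItem: { $arrayElemAt: ["$images", i] }   // element that will land at index j
  }},
  { $set: {
      images: { $reduce: {
        input: "$images",
        initialValue: [],
        in: { $concatArrays: [
          "$$value",
          { $cond: [
            { $eq: [{ $size: "$$value" }, i] }, ["$firstItem"],
            { $cond: [
              { $eq: [{ $size: "$$value" }, j] }, ["$secondItem"],
              ["$$this"]
            ]}
          ]}
        ]}
      }},
      firstItem: "$$REMOVE",
      secondItem: "$$REMOVE"
  }}
]);
Because everything runs in a single pipelined update, the swap stays atomic per document.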

Related

Removing an element in a mongoDB array based on the position of element with dynamically defined array

My question is a combination of:
This question: Removing the array element in mongoDB based on the position of element
And this question: mongodb set object where key value dynamically changes
I know you can define the field (array) dynamically like this:
{ [arrayName]: { <condition> } }
But, I then want to remove a certain element in a dynamically defined array by specifying its position (which is also defined dynamically). In other words, the function that processes this query receives two parameters: the array's name and the index of the element to remove.
The options given by the selected answer were the following:
Option 1, does not work (in general), adapted to my case this looks like:
{ $pull : { [arrayName] : { $gt: index-1, $lt: index+1 } } }
Option 2, I cannot use dynamically defined values in field selectors with quotation marks (as far as I am aware):
{ $pull : "[arrayName].[index]" }
or
{ $pull : "[arrayName].$": index }
Option 3, is different method but can't use it for the same reason:
{ $unset: { "[arrayName].[index]": 1 } } // Won't work
{ $pull: { [arrayName]: null } } // Would probably work
The only workarounds I can think of right now involve significantly changing the design which would be a shame. Any help is appreciated!
PS: I'm using mongoose as a driver on the latest version as of today (v6.3.5) and MongoDB version 5.0.8
On MongoDB 4.2+ you can use pipelined updates to achieve this. It can be done in multiple ways; here are the two I consider easiest:
Using $slice and $concatArrays to remove a certain element: the first $slice takes the elements before the index, the second takes the elements after it, and $concatArrays joins them back together:
db.collection.update({},
  [
    {
      $set: {
        [arrayName]: {
          $concatArrays: [
            { $slice: [`$${arrayName}`, index] },
            { $slice: [`$${arrayName}`, index + 1, { $size: `$${arrayName}` }] }
          ]
        }
      }
    }
  ])
Mongo Playground
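To illustrate the two $slice calls, here is a worked example with hypothetical values (not from the original answer): suppose arrayName is "arr", the stored array is ["a", "b", "c", "d"], and index is 2.
// { $slice: ["$arr", 2] }                    -> ["a", "b"]        (everything before index 2)
// { $slice: ["$arr", 3, { $size: "$arr" }] } -> ["d"]             (everything after index 2)
// $concatArrays of the two results           -> ["a", "b", "d"]   (element at index 2 removed)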
Using $filter and $zip to filter out the element based on its index: $zip pairs each element with its position (generated by $range), $filter drops the pair whose position equals index, and $map unwraps the remaining pairs back into the original elements:
db.collection.updateOne(
  {},
  [
    {
      "$set": {
        [arrayName]: {
          $map: {
            input: {
              $filter: {
                input: {
                  $zip: {
                    inputs: [
                      { $range: [0, { $size: `$${arrayName}` }] },
                      `$${arrayName}`
                    ]
                  }
                },
                cond: { $ne: [{ "$arrayElemAt": ["$$this", 0] }, index] }
              }
            },
            in: { $arrayElemAt: ["$$this", 1] }
          }
        }
      }
    }
  ])
Alternatively, you can just prepare the updated array in application code and $set it directly.
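A minimal sketch of that client-side alternative in shell-style JavaScript, assuming a read-modify-write is acceptable (note that this loses the atomicity of a single pipelined update):
// Read the document, remove the element at `index` from the dynamically named array
// in application code, then write the whole array back with $set.
const doc = db.collection.findOne({});
const updated = doc[arrayName].filter((_, i) => i !== index);
db.collection.updateOne({ _id: doc._id }, { $set: { [arrayName]: updated } });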

Mongo updateMany statement with an inner array of objects to manipulate

I'm struggling to write a Mongo UpdateMany statement that can reference and update an object within an array.
Here I create 3 documents. Each document has an array called innerArray always containing a single object, with a single date field.
use test;
db.innerArrayExample.insertOne({ _id: 1, "innerArray": [ { "originalDateTime" : ISODate("2022-01-01T01:01:01Z") } ]});
db.innerArrayExample.insertOne({ _id: 2, "innerArray": [ { "originalDateTime" : ISODate("2022-01-02T01:01:01Z") } ]});
db.innerArrayExample.insertOne({ _id: 3, "innerArray": [ { "originalDateTime" : ISODate("2022-01-03T01:01:01Z") } ]});
I want to add a new date field, based on the original date field, to end up with this:
{ _id: 1, "innerArray": [ { "originalDateTime" : ISODate("2022-01-01T01:01:01Z"), "copiedDateTime" : ISODate("2022-01-01T12:01:01Z") } ]}
{ _id: 2, "innerArray": [ { "originalDateTime" : ISODate("2022-01-02T01:01:01Z"), "copiedDateTime" : ISODate("2022-01-02T12:01:01Z") } ]}
{ _id: 3, "innerArray": [ { "originalDateTime" : ISODate("2022-01-03T01:01:01Z"), "copiedDateTime" : ISODate("2022-01-03T12:01:01Z") } ]}
In pseudo code I am saying take the originalDateTime, run it through a function and add a related copiedDateTime value.
For my specific use-case, the function I want to run strips the timezone from originalDateTime, then overwrites it with a new one, equivalent to the Java ZonedDateTime function withZoneSameLocal. Aka 9pm UTC becomes 9pm Brussels (therefore effectively 7pm UTC). The technical justification and methodology were answered in another Stack Overflow question here.
The part of the query I'm struggling with is the part that updates/selects data from an element inside an array. For my simplistic example I have crafted this query, but unfortunately it doesn't work:
This query puts copiedDateTime in the correct place... but doesn't evaluate the operators that manipulate the date:
db.innerArrayExample.updateMany(
  { "innerArray.0.originalDateTime" : { $exists : true } },
  { $set: {
      "innerArray.0.copiedDateTime" : {
        $dateFromString: {
          dateString: { $dateToString: { "date" : "$innerArray.0.originalDateTime", format: "%Y-%m-%dT%H:%M:%S.%L" } },
          format: "%Y-%m-%dT%H:%M:%S.%L",
          timezone: "Europe/Paris"
        }
      }
  }}
);
// output
{
_id: 1,
innerArray: [
{
originalDateTime: ISODate("2022-01-01T01:01:01.000Z"),
copiedDateTime: {
'$dateFromString': {
dateString: { '$dateToString': [Object] },
format: '%Y-%m-%dT%H:%M:%S.%L',
timezone: 'Europe/Paris'
}
}
}
]
}
This simplified query, also has the same issue:
db.innerArrayExample.updateMany({ "innerArray.0.originalDateTime" : { $exists : true }}, { $set: { "innerArray.0.copiedDateTime" : "$innerArray.0.originalDateTime" }});
//output
{
_id: 1,
innerArray: [
{
originalDateTime: ISODate("2022-01-01T01:01:01.000Z"),
copiedDateTime: '$innerArray.0.originalDateTime'
}
]
}
As you can see, this issue appears to be separate from the other Stack Overflow question: instead of being about changing timezones, it's about getting things inside arrays to update.
I plan to take this query, create 70,000 variations of it with different location/timezone combinations and run it against a database with millions of records, so I would prefer something that uses updateMany instead of using JavaScript to iterate over each row in the database... unless that's the only viable solution.
I have tried putting $set in square brackets. This changes the way everything is interpreted, so the right-hand side is now evaluated, but it causes other problems:
test> db.innerArrayExample.updateMany({ "_id" : 1 }, [{ $set: { "innerArray.0.copiedDateTime" : "$innerArray.0.originalDateTime" }}]);
//output
{
_id: 1,
innerArray: [
{
'0': { copiedDateTime: [] },
originalDateTime: ISODate("2022-01-01T01:01:01.000Z")
}
]
}
Above, it seems to interpret .0. as a literal field name rather than an array index. (For my needs I know the array only has 1 item at all times.) I'm at a loss finding an example that meets my needs.
I have also tried experimenting with arrayFilters, documented in the MongoDB updateMany documentation, but I cannot fathom how it works with objects:
test> db.innerArrayExample.updateMany(
... { },
... { $set: { "innerArray.$[element].copiedDateTime" : "$innerArray.$[element].originalDateTime" } },
... { arrayFilters: [ { "originalDateTime": { $exists: true } } ] }
... );
MongoServerError: No array filter found for identifier 'element' in path 'innerArray.$[element].copiedDateTime'
test> db.innerArrayExample.updateMany(
... { },
... { $set: { "innerArray.$[0].copiedDateTime" : "$innerArray.$[element].originalDateTime" } },
... { arrayFilters: [ { "0.originalDateTime": { $exists: true } } ] }
... );
MongoServerError: Error parsing array filter :: caused by :: The top-level field name must be an alphanumeric string beginning with a lowercase letter, found '0'
If someone can help me understand the subtleties of the Mongo syntax and help me back on to the right path I'd be very grateful.
You want to be using pipelined updates; the issue with the syntax you're using is that it does not allow the use of aggregation operators or references to the document's own field values.
Here is a quick example of how to do it:
db.collection.updateMany({},
[
{
"$set": {
"innerArray": {
$map: {
input: "$innerArray",
in: {
$mergeObjects: [
"$$this",
{
copiedDateTime: "$$this.originalDateTime"
}
]
}
}
}
}
}
])
Mongo Playground
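If you also need the timezone shift described in the question (rather than a plain copy), the same $map/$mergeObjects shape can wrap the $dateToString/$dateFromString expression from the attempted query. A sketch, assuming the Europe/Paris example zone from the question:
db.innerArrayExample.updateMany(
  { "innerArray.0.originalDateTime": { $exists: true } },
  [
    { $set: {
        innerArray: {
          $map: {
            input: "$innerArray",
            in: {
              $mergeObjects: [
                "$$this",
                {
                  // Format the UTC wall-clock time as a string, then re-parse it in the
                  // target zone, so e.g. 9pm UTC becomes 9pm in the target zone.
                  copiedDateTime: {
                    $dateFromString: {
                      dateString: {
                        $dateToString: {
                          date: "$$this.originalDateTime",
                          format: "%Y-%m-%dT%H:%M:%S.%L"
                        }
                      },
                      format: "%Y-%m-%dT%H:%M:%S.%L",
                      timezone: "Europe/Paris"
                    }
                  }
                }
              ]
            }
          }
        }
    }}
  ]
)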

MongoDB array of objects

The problem I am facing is below:
I have a MongoDB document whose structure is as follows:
{
  "name": "XYZ",
  "array": [
    { "value": "Alpha" },
    { "value": "Beta" },
    { "value": "Alpha" }
  ]
}
and I have to count how many objects have the value "Alpha".
I have tried the following two queries, but both only give me the value 1.
db.current_database.find({array: {$elemMatch: {value: "Alpha"}}}).count()
db.current_database.find({'array.value': 'Alpha'}).count()
The find collection method returns matching documents, not fragments of them, so .count() counts documents rather than matching array elements.
A few options to count occurrences of elements in an array:
Most languages provide a way to filter/reduce/count elements in an array, so this should be fairly straightforward on the client side.
The MongoDB aggregation framework provides $reduce, $filter, $size, $group, $unwind, and a few other operators that might be useful in this situation (a $filter/$size variant is sketched after the example below).
One possible solution using $reduce:
db.current_database.aggregate([
{$match: {"array.value": "Alpha"}},
{$addFields:{
count: {
$reduce: {
input: "$array",
initialValue: 0,
in: {
$cond: {
if: {$eq: ["$$this.value", "Alpha"]},
then: {$sum: ["$$value", 1]},
else: "$$value"
}
}
}
}
}}
])
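A sketch of the $filter/$size alternative mentioned above (not part of the original answer): keep only the matching elements, then count them.
db.current_database.aggregate([
  {$match: {"array.value": "Alpha"}},
  {$addFields: {
    count: {
      $size: {
        $filter: {
          input: "$array",
          cond: {$eq: ["$$this.value", "Alpha"]}
        }
      }
    }
  }}
])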

Reference value from positional element in array in update

Suppose I have a document that looks like this:
{
"id": 1,
"entries": [
{
"id": 100,
"urls": {
"a": "url-a",
"b": "url-b",
"c": "url-c"
},
"revisions": []
}
]
}
I am trying to add a new object to the revisions array that contains its own urls field. Two of the fields should be copied from the entry's urls, while the last one will be new. The result should look like this:
{
"id": 1,
"entries": [
{
"id": 100,
"urls": {
"a": "url-a",
"b": "url-b",
"c": "url-c"
},
"revisions": [
{
"id": 1000,
"urls": {
"a": "url-a", <-- copied
"b": "url-b", <-- copied
"c": "some-new-url" <-- new
}
}
]
}
]
}
I am on MongoDB 4.2+, so I know I can use $property on the update query to reference values. However, this does not seem to be working as I expect:
collection.updateOne(
{
id: 1,
"entries.id": 100
},
{
$push: {
"entries.$.revisions": {
id: 1000,
urls: {
"a": "$entries.$.urls.a",
"b": "$entries.$.urls.b",
"c": "some-new-url"
}
}
}
}
);
The element gets added to the array, but all I see for the url values is the literal string "$entries.$.urls.a". I suspect the issue is with combining the reference with selecting a specific positional array element. I have also tried using $($entries.$.urls.a), with the same result.
How can I make this work?
Starting from MongoDB version 4.2 you can use an aggregation pipeline in updates, which means the update part of the query is wrapped in []; this lets you execute aggregation operators and use existing field values in the update.
Issue:
Since you've not wrapped the update part in [] to say it's an aggregation pipeline, .updateOne() considers "$entries.$.urls.a" a plain string. Also, I believe you will not be able to use the $ positional operator in updates that use an aggregation pipeline.
Try the below query, which uses an aggregation pipeline:
collection.updateOne(
{
id: 1,
"entries.id": 100 /** "entries.id" is optional but recommended, to avoid running the aggregation below on a doc that has `id: 1` but no `"entries.id": 100` */,
},
[
{
$set: {
entries: {
$map: { // the $map aggregation operator iterates over the array & creates a new array of values.
input: "$entries",
in: {
$cond: [
{ $eq: ["$$this.id", 100] }, // $$this is the current object in the iteration; if the condition is true, apply the merge below to it, else return the object unchanged.
{
$mergeObjects: [
"$$this",
{
revisions: { $concatArrays: [ "$$this.revisions", [{ id: 1000, urls: { a: "$$this.urls.a", b: "$$this.urls.b", c: "some-new-url" } } ]] }
}
]
},
"$$this" // Returning same object as condition is not met.
]
}
}
}
}
}
]
);
$mergeObjects will replace the existing revisions field in $$this (the current object) with the value of { $concatArrays: [ "$$this.revisions", [{ id: 1000, urls: { a: "$$this.urls.a", b: "$$this.urls.b", c: "some-new-url" } }] ] }.
Since the field is named revisions and is an array, I've assumed it holds multiple objects, so we use the $concatArrays operator to push the new object into the revisions array of the matching entries object.
If your revisions array will only ever contain one object, either make it an object instead of an array, or keep it as an array and use the query below, where $concatArrays is removed because the new object will always be the only one, so there is nothing to merge it with:
collection.update(
{
id: 1,
"entries.id": 100
},
[
{
$set: {
entries: {
$map: {
input: "$entries",
in: {
$cond: [
{ $eq: ["$$this.id", 100] },
{
$mergeObjects: [
"$$this",
{
revisions: [ { id: 1000, urls: { a: "$$this.urls.a", b: "$$this.urls.b", c: "some-new-url" } } ]
}
]
},
"$$this"
]
}
}
}
}
}
]
);
Test : Test your aggregation pipeline here : mongoplayground
Ref : .updateOne()
Note: if .updateOne() throws an error due to an incompatible client or shell, try the same query with .update(). Executing the aggregation pipeline in the update saves multiple DB calls and is most useful on arrays with a small number of elements.

Filtering out the unique values between 2 mongodb arrays using mongodb 3.4

I have 2 arrays:
"array1": [
"057a7",
"05790",
"0575d",
"0579f",
"0576b",
"05784",
"05775"
]
"array2": [
"0579f",
"057a7",
"05790",
"05784",
"0575d",
"0576a",
"0576b",
"05775"
]
I have tried $setDifference, $setUnion and $setIntersection and these only output the elements that match. I would like to output the one that does not ("0576a"). The examples I find in stack overflow only seem to show you how to output the duplicates and not the unique values. The final output should be an array like so:
"final_array": ["0576a"]
Trying to do this in mongodb aggregation and not have to tap into mapReduce.
{
"$project": {
"_id": 0,
"unique": {
"$setDifference": [
"$array2",
"$array1"
]
}
}
}
The following should work for you:
db.collection('test').aggregate([
  {
    $project: {
      "unique": {
        $concatArrays: [
          { $setDifference: [ "$array1", "$array2" ] },
          { $setDifference: [ "$array2", "$array1" ] }
        ]
      }
    }
  }
])
The key thing to understand about $setDifference is that the argument order matters since according to the documentation it...
...takes two sets and returns an array containing the elements that only
exist in the first set; i.e. performs a relative complement of the
second set relative to the first.
That's why you have to look at your arrays from both directions, which gives you all of the unique elements, and then you can simply merge the two results using $concatArrays.
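For the sample arrays in the question, the two directions evaluate roughly as follows (a sketch of the intermediate values, not actual shell output):
// $setDifference: ["$array1", "$array2"]  ->  []         (every element of array1 is also in array2)
// $setDifference: ["$array2", "$array1"]  ->  ["0576a"]  (the only element missing from array1)
// $concatArrays of the two results        ->  ["0576a"]  (the wanted "final_array")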
