Mongo: Used $addToSet without $each. How to remove arrays?

I have an object like so:
object: {
  ids: [ ObjectId('1...'), ObjectId('2...') ]
}
And I want to add a few elements to the ids array:
[ObjectId('3...'), ObjectId('4...')]
Instead of doing that properly:
Object.update({ _id: someId },
  {
    $addToSet: { ids: { $each: [ObjectId('3...'), ObjectId('4...')] } }
  }
);
I unintentionally did a bad thing:
Object.update({ _id: someId },
  {
    $addToSet: { ids: [ObjectId('3...'), ObjectId('4...')] }
  }
);
Sticking me now with an object that looks like:
{
  ids: [ ObjectId('1...'), ObjectId('2...'), [ObjectId('3...'), ObjectId('4...')] ]
}
So obviously, that's a problem. I tried some programmatic solutions, such as checking whether an element was an array (Array.isArray(elem)), but it looked like Mongoose was flattening the results on query.
What is a good way to reverse this problem? Or in other terms, how do I flatten the array?
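One possible way to repair the affected documents, offered purely as a sketch (it assumes MongoDB 4.2+ so an aggregation-pipeline update is available, and a collection named objects), is to flatten one level of nesting with $reduce:
db.objects.updateOne(
  { _id: someId },
  [{
    $set: {
      ids: {
        $reduce: {
          input: "$ids",
          initialValue: [],
          in: {
            // keep plain ObjectIds as-is, splice nested arrays in
            $concatArrays: [
              "$$value",
              { $cond: [{ $isArray: ["$$this"] }, "$$this", ["$$this"]] }
            ]
          }
        }
      }
    }
  }]
)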

Related

Is there any possible way to upsert a document inside an array in MongoDB? [duplicate]

I have the following collection
{
  "_id" : ObjectId("57315ba4846dd82425ca2408"),
  "myarray" : [
    {
      userId : ObjectId("570ca5e48dbe673802c2d035"),
      point : 5
    },
    {
      userId : ObjectId("613ca5e48dbe673802c2d521"),
      point : 2
    }
  ]
}
This is what I want to do:
I want to push into myarray: if the userId doesn't exist, the new element should be appended to myarray; if the userId exists, its point should be updated.
I found this
db.collection.update({
_id : ObjectId("57315ba4846dd82425ca2408"),
"myarray.userId" : ObjectId("570ca5e48dbe673802c2d035")
}, {
$set: { "myarray.$.point": 10 }
})
But if userId doesn't exist, nothing happens.
and
db.collection.update({
_id : ObjectId("57315ba4846dd82425ca2408")
}, {
$push: {
"myarray": {
userId: ObjectId("570ca5e48dbe673802c2d035"),
point: 10
}
}
})
But if the userId already exists, it will be pushed again.
What is the best way to do this in MongoDB?
Try this
db.collection.update(
{ _id : ObjectId("57315ba4846dd82425ca2408")},
{ $pull: { "myarray": { userId: ObjectId("570ca5e48dbe673802c2d035") } } }
)
db.collection.update(
{ _id : ObjectId("57315ba4846dd82425ca2408")},
{ $push: { "myarray": {
userId: ObjectId("570ca5e48dbe673802c2d035"),
point: 10
} } }
)
Explanation:
In the first statement, $pull removes the element with userId = ObjectId("570ca5e48dbe673802c2d035") from the array on the document where _id = ObjectId("57315ba4846dd82425ca2408").
In the second one, $push inserts this object { userId: ObjectId("570ca5e48dbe673802c2d035"), point: 10 } into the same array.
In the accepted answer by Flying Fisher, the existing record is first deleted and then pushed again.
A safer approach (common sense) would be to try to update the record first, and if that did not find a match, insert it, like so:
// first try to overwrite existing value
var result = db.collection.update(
{
_id : ObjectId("57315ba4846dd82425ca2408"),
"myarray.userId": ObjectId("570ca5e48dbe673802c2d035")
},
{
$set: {"myarray.$.point": {point: 10}}
}
);
// you probably need to modify the following if-statement to some async callback
// checking depending on your server-side code and mongodb-driver
if(!result.nMatched)
{
// record not found, so create a new entry
// this can be done using $addToSet:
db.collection.update(
{
_id: ObjectId("57315ba4846dd82425ca2408")
},
{
$addToSet: {
myarray: {
userId: ObjectId("570ca5e48dbe673802c2d035"),
point: 10
}
}
}
);
// OR (the equivalent) using $push:
db.collection.update(
{
_id: ObjectId("57315ba4846dd82425ca2408"),
"myarray.userId": {$ne: ObjectId("570ca5e48dbe673802c2d035"}}
},
{
$push: {
myarray: {
userId: ObjectId("570ca5e48dbe673802c2d035"),
point: 10
}
}
}
);
}
This should also (common sense, untested) improve performance: if in most cases the record already exists, only the first query will be executed.
Starting from MongoDB v4.2 there is an option to update documents with an aggregation pipeline:
use $cond to check whether the userId is already present in myarray.userId
if yes, $map over the myarray array and, on the matching element, merge in the new document using $mergeObjects
if no, use $concatArrays to concatenate myarray with the new object
let _id = ObjectId("57315ba4846dd82425ca2408");
let updateDoc = {
userId: ObjectId("570ca5e48dbe673802c2d035"),
point: 10
};
db.collection.update(
{ _id: _id },
[{
$set: {
myarray: {
$cond: [
{ $in: [updateDoc.userId, "$myarray.userId"] },
{
$map: {
input: "$myarray",
in: {
$mergeObjects: [
"$$this",
{
$cond: [
{ $eq: ["$$this.userId", updateDoc.userId] },
updateDoc,
{}
]
}
]
}
}
},
{ $concatArrays: ["$myarray", [updateDoc]] }
]
}
}
}]
)
Unfortunately "upsert" operation is not possible on embedded array. Operators simply do not exist so that this is not possible in a single statement.Hence you must perform two update operations in order to do what you want. Also the order of application for these two updates is important to get desired result.
I haven't found any solution based on one atomic query. Instead, there are 3 ways based on a sequence of two queries:
1. always $pull (to remove the item from the array), then $push (to add the updated item to the array)
db.collection.update(
{ _id : ObjectId("57315ba4846dd82425ca2408")},
{ $pull: { "myarray": { userId: ObjectId("570ca5e48dbe673802c2d035") } } }
)
db.collection.update(
{ _id : ObjectId("57315ba4846dd82425ca2408")},
{
$push: {
"myarray": {
userId:ObjectId("570ca5e48dbe673802c2d035"),
point: 10
}
}
}
)
2. try to $set (to update the item in the array if it exists), then check the result to see whether the update succeeded or whether a $push is needed (to insert the item)
var result = db.collection.update(
{
_id : ObjectId("57315ba4846dd82425ca2408"),
"myarray.userId": ObjectId("570ca5e48dbe673802c2d035")
},
{
$set: {"myarray.$.point": {point: 10}}
}
);
if(!result.nMatched){
db.collection.update({_id: ObjectId("57315ba4846dd82425ca2408")},
{
$addToSet: {
myarray: {
userId: ObjectId("570ca5e48dbe673802c2d035"),
point: 10
}
}
}
);
}
3. always $addToSet (to add the item if it does not already exist), then always $set to update the item in the array
db.collection.update(
{ _id: ObjectId("57315ba4846dd82425ca2408"),
myarray: { $not: { $elemMatch: { userId: ObjectId("570ca5e48dbe673802c2d035") } } } },
{
$addToSet : {
myarray: {
userId: ObjectId("570ca5e48dbe673802c2d035"),
point: 10
}
}
},
{ multi: false, upsert: false});
db.collection.update({
_id: ObjectId("57315ba4846dd82425ca2408"),
"myArray.userId": ObjectId("570ca5e48dbe673802c2d035")
},
{ $set : { myArray.$.point: 10 } },
{ multi: false, upsert: false});
The 1st and 2nd ways are unsafe, so a transaction must be used to prevent two concurrent requests from pushing the same item and creating a duplicate.
The 3rd way is safer: $addToSet adds the item only if it doesn't already exist, otherwise nothing happens. With two concurrent requests, only one of them adds the missing item to the array.
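For the 1st and 2nd ways, here is a sketch of what such a transaction could look like in the mongo shell (MongoDB 4.0+ on a replica set; the session calls and the database name mydb are assumptions, not part of the original answers):
var session = db.getMongo().startSession();
session.startTransaction();
try {
  // both updates run inside one transaction, so a concurrent request
  // cannot interleave between the $pull and the $push
  var coll = session.getDatabase("mydb").getCollection("collection");
  coll.update({ _id: ObjectId("57315ba4846dd82425ca2408") },
    { $pull: { myarray: { userId: ObjectId("570ca5e48dbe673802c2d035") } } });
  coll.update({ _id: ObjectId("57315ba4846dd82425ca2408") },
    { $push: { myarray: { userId: ObjectId("570ca5e48dbe673802c2d035"), point: 10 } } });
  session.commitTransaction();
} catch (e) {
  session.abortTransaction();
  throw e;
} finally {
  session.endSession();
}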
Possible solution with aggregation pipeline:
db.collection.update(
{ _id },
[
{
$set: {
myarray: { $filter: {
input: '$myarray',
as: 'myarray',
cond: { $ne: ['$$myarray.userId', ObjectId('570ca5e48dbe673802c2d035')] },
} },
},
},
{
$set: {
myarray: {
$concatArrays: [
'$myarray',
[{ userId: ObjectId('570ca5e48dbe673802c2d035'), point: 10 },
],
],
},
},
},
],
);
We use 2 stages:
filter myarray (i.e. remove the element if its userId already exists)
concat the filtered myarray with the new element
When you want to update or insert a value in an array, try this.
Object in db:
{
  key: name,
  key1: name1,
  arr: [
    {
      val: 1,
      val2: 1
    }
  ]
}
Query:
var query = {
  $inc: {
    "arr.0.val": 2,
    "arr.0.val2": 2
  }
}
db.collection.updateOne({ "key": name }, query, { upsert: true })
Result in db:
{
  key: name,
  key1: name1,
  arr: [
    {
      val: 3,
      val2: 3
    }
  ]
}
In MongoDB 3.6 it is now possible to upsert elements in an array.
Array update and create don't mix under one query. If you care much about atomicity, then there's this solution:
normalise your schema to:
{
"_id" : ObjectId("57315ba4846dd82425ca2408"),
userId : ObjectId("570ca5e48dbe673802c2d035"),
point : 5
}
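With the schema normalised like this, a single upsert covers both the insert and the update case. A sketch under assumed names (collection points, one document per userId):
db.points.updateOne(
  { userId: ObjectId("570ca5e48dbe673802c2d035") },
  { $set: { point: 10 } },
  { upsert: true }
)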
You could use a variation of the .forEach/.updateOne method I currently use in mongosh CLI to do things like that. In the .forEach, you might be able to set all of your if/then conditions that you mentioned.
Example of .forEach/.updateOne:
let medications = db.medications.aggregate([
{$match: {patient_id: {$exists: true}}}
]).toArray();
medications.forEach(med => {
try {
db.patients.updateOne({patient_id: med.patient_id},
{$push: {medications: med}}
)
} catch {
console.log("Didn't find match for patient_id. Could not add this med to a patient.")
}
})
This may not be the most "MongoDB way" to do it, but it definitely works and gives you the freedom of JavaScript to do things within the .forEach.

MongoDB: find out query array's element not in database

Is it possible to find out which elements of a query array are not in the database?
example:
const query = ['aaa','bbb','ccc']
Documents in db:
[{name:'bbb'},{name:'ccc'}]
I want to find the query array elements that are not in the database. The returned result should be:
['aaa']
I can't find a quick method to do this other than querying each element (or a batch?) of the array.
Does anyone have a better method? Thanks.
Querying for things that are missing is always a more expensive operation, and there is no "magic" query to do it for you. I recommend using Mongo's distinct method, like so:
const queryArr = ['aaa', 'bbb', 'ccc'];
const allNames = await db.collection.distinct('name');
const notInDb = queryArr.filter(e => !allNames.includes(e));
However, if you want to do it in one db command, you could do something like this:
db.collection.aggregate([
{
$group: {
_id: null,
names: {
"$addToSet": "$name"
}
}
},
{
"$replaceRoot": {
"newRoot": {
results: {
$filter: {
input: [
"aaa",
"bbb",
"ccc"
],
as: "datum",
cond: {
$not: {
"$setIsSubset": [
[
"$$datum"
],
"$names"
]
}
}
}
}
}
}
}
])
As you can tell, both approaches require you to load all the names into memory; there is no way around this. If your db's scale is too big for these approaches, you will have to iterate over the query input and check one element at a time.
const queryArr = ['aaa', 'bbb', 'ccc'];
for (let queryName of queryArr) {
const found = await db.collection.findOne({name: queryName})
if (!found) {
//ding
}
}
Assuming you have an index on the name field, this should be very efficient.
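If that index does not exist yet, it can be created as a standard single-field index (using the same generic collection name as above):
db.collection.createIndex({ name: 1 })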

Mongoose add to array without duplicates and sorted

In Mongoose it is possible to add new values to an array without duplicates with $addToSet:
Model.update({ _id: elementFromDb._id }, { $addToSet: { array: newElement } })
And it is possible to add a new value to the array in correct order (sorted) with $push and $sort:
Model.update({ _id: elementFromDb._id }, { $push: { array: { $each: [newElement], $sort: 1 } } })
But it seems NOT possible to add a new element to the array without duplicates AND sorted, because $addToSet doesn't work with $sort...
How can I do this?
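One workaround, offered here only as a sketch rather than an answer from the thread, is to keep $addToSet for deduplication and then run a second update with an empty $each plus the $sort modifier, which re-sorts the existing array without adding anything (inside an async function):
// 1) add the element only if it is not already present
await Model.updateOne(
  { _id: elementFromDb._id },
  { $addToSet: { array: newElement } }
);
// 2) push nothing, but let the $sort modifier reorder the whole array
await Model.updateOne(
  { _id: elementFromDb._id },
  { $push: { array: { $each: [], $sort: 1 } } }
);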

MongoDB find all not in this array

I'm trying to find all users except for a few, like this:
// get special user IDs
var special = db.special.find({}, { _id: 1 }).toArray();
// get all users except for the special ones
var users = db.users.find({_id: {$nin: special}});
This doesn't work because the array that I'm passing to $nin is not an array of ObjectId but an array of { _id: ObjectId() }.
Variable special looks like this after the first query:
[ { _id: ObjectId(###) }, { _id: ObjectId(###) } ]
But $nin in the second query needs this:
[ ObjectId(###), ObjectId(###) ]
How can I get just the ObjectId() in an array from the first query so that I can use them in the second query?
Or, is there a better way of achieving what I'm trying to do?
Use the cursor.map() method on the cursor returned by find() to transform the list of { _id: ObjectId(###) } documents into an array of ObjectIds, as in the following:
var special = db.special.find({}, { _id: 1 }).map(function(doc){
return doc._id;
});
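With special now holding plain ObjectIds, the original exclusion query from the question works as intended:
var users = db.users.find({ _id: { $nin: special } });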
Another approach you can consider is using the $lookup operator in the aggregation framework to do a "left outer join" on the special collection and filtering the documents on the new "joined" array field. The filter should match on documents whose array field is empty.
The following example demonstrates this:
db.users.aggregate([
{
"$lookup": {
"from": "special",
"localField": "_id",
"foreignField": "_id",
"as": "specialUsers" // <-- this will produce an arry of "joined" docs
}
},
{ "$match": { "specialUsers.0": { "$exists": false } } } // <-- match on empty array
])

MongoDB rename database field within array

I need to rename indentifier in this:
{ "general" :
{ "files" :
{ "file" :
[
{ "version" :
{ "software_program" : "MonkeyPlus",
"indentifier" : "6.0.0"
}
}
]
}
}
}
I've tried
db.nrel.component.update(
{},
{ $rename: {
"general.files.file.$.version.indentifier" : "general.files.file.$.version.identifier"
} },
false, true
)
but it returns: $rename source may not be dynamic array.
For what it's worth, while it sounds awful to have to do, the solution is actually pretty easy. This of course depends on how many records you have. But here's my example:
db.Setting.find({ 'Value.Tiers.0.AssetsUnderManagement': { $exists: 1 } }).snapshot().forEach(function(item)
{
for(i = 0; i != item.Value.Tiers.length; ++i)
{
item.Value.Tiers[i].Aum = item.Value.Tiers[i].AssetsUnderManagement;
delete item.Value.Tiers[i].AssetsUnderManagement;
}
db.Setting.update({_id: item._id}, item);
});
I iterate over my collection where the array is found and the "wrong" name is found. I then iterate over the sub collection, set the new value, delete the old, and update the whole document. It was relatively painless. Granted I only have a few tens of thousands of rows to search through, of which only a few dozen meet the criteria.
Still, I hope this answer helps someone!
Edit: Added snapshot() to the query. See why in the comments.
You must apply snapshot() to the cursor before retrieving any documents from the database.
You can only use snapshot() with unsharded collections.
From MongoDB 3.4, the snapshot() function was removed, so if you are using Mongo 3.4+, remove snapshot() from the example above.
As mentioned in the documentation, there is no way to directly rename fields within arrays with a single command. Your only option is to iterate over your collection documents, read them, and update each with $unset old / $set new operations.
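A minimal sketch of that iterate-and-update approach for the schema in the question (untested; it builds one $set/$unset pair per array index and writes them in a single update per document):
db.nrel.component.find({ "general.files.file.version.indentifier": { $exists: true } }).forEach(function (doc) {
  var sets = {}, unsets = {};
  doc.general.files.file.forEach(function (f, i) {
    if (f.version && f.version.indentifier !== undefined) {
      // copy the value to the correctly spelled field and drop the old one
      sets["general.files.file." + i + ".version.identifier"] = f.version.indentifier;
      unsets["general.files.file." + i + ".version.indentifier"] = "";
    }
  });
  db.nrel.component.update({ _id: doc._id }, { $set: sets, $unset: unsets });
});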
I had a similar problem. In my situation I found the following was much easier:
I exported the collection to json:
mongoexport --db mydb --collection modules --out modules.json
I did a find and replace on the json using my favoured text editing utility.
I reimported the edited file, dropping the old collection along the way:
mongoimport --db mydb --collection modules --drop --file modules.json
Starting Mongo 4.2, db.collection.update() can accept an aggregation pipeline, finally allowing the update of a field based on its own value:
// { general: { files: { file: [
// { version: { software_program: "MonkeyPlus", indentifier: "6.0.0" } }
// ] } } }
db.collection.updateMany(
{},
[{ $set: { "general.files.file": {
$map: {
input: "$general.files.file",
as: "file",
in: {
version: {
software_program: "$$file.version.software_program",
identifier: "$$file.version.indentifier" // fixing the typo here
}
}
}
}}}]
)
// { general: { files: { file: [
// { version: { software_program: "MonkeyPlus", identifier: "6.0.0" } }
// ] } } }
Literally, this updates documents by (re)$setting the "general.files.file" array, $mapping its "file" elements into a "version" object containing the same "software_program" field and the renamed "identifier" field, which holds what used to be the value of "indentifier".
A couple additional details:
The first part {} is the match query, filtering which documents to update (in this case all documents).
The second part [{ $set: { "general.files.file": { ... }}}] is the update aggregation pipeline (note the squared brackets signifying the use of an aggregation pipeline):
$set is a new aggregation operator which in this case replaces the value of the "general.files.file" array.
Using a $map operation, we replace all elements from the "general.files.file" array by basically the same elements, but with an "identifier" field rather than "indentifier":
input is the array to map.
as is the variable name given to looped elements
in is the actual transformation applied to elements. In this case, it replaces elements with a "version" object composed of "software_program" and "identifier" fields. These fields are populated by extracting their previous values using the $$file.xxxx notation (where file is the name given to elements in the as part).
I had to face the issue with the same schema, so this query will be helpful for someone who wants to rename a field in an embedded array.
db.getCollection("sampledocument").updateMany({}, [
{
$set: {
"general.files.file": {
$map: {
input: "$general.files.file",
in: {
version: {
$mergeObjects: [
"$$this.version",
{ identifier: "$$this.version.indentifier" },
],
},
},
},
},
},
},
{ $unset: "general.files.file.version.indentifier" },
]);
Another solution:
I also wanted to rename a property in an array, and I used this:
db.getCollection('YourCollectionName').find({}).snapshot().forEach(function(a){
a.Array1.forEach(function(b){
b.Array2.forEach(function(c){
c.NewPropertyName = c.OldPropertyName;
delete c["OldPropertyName"];
});
});
db.getCollection('YourCollectionName').save(a)
});
The easiest and shortest solution using aggregate (Mongo 4.0+).
db.myCollection.aggregate([
{
$addFields: {
"myArray.newField": {$arrayElemAt: ["$myArray.oldField", 0] }
}
},
{$project: { "myArray.oldField": false}},
{$out: {db: "myDb", coll: "myCollection"}}
])
The problem with using a forEach loop as mentioned above is the very bad performance when the collection is huge.
My proposal would be this one:
db.nrel.component.aggregate([
{ $unwind: "$general.files.file" },
{
$set: {
"general.files.file.version.identifier": {
$ifNull: ["$general.files.file.version.indentifier", "$general.files.file.version.identifier"]
}
}
},
{ $unset: "general.files.file.version.indentifier" },
{ $set: { "general.files.file": ["$general.files.file"] } },
{ $out: "nrel.component" } // carefully - it replaces entire collection.
])
However, this works only when the array general.files.file contains just a single document. Most likely this will not always be the case, and then you can use this one instead:
db.nrel.component.aggregate([
{ $unwind: "$general.files.file" },
{
$set: {
"general.files.file.version.identifier": {
$ifNull: ["$general.files.file.version.indentifier", "$general.files.file.version.identifier"]
}
}
},
{ $unset: "general.files.file.version.indentifier" },
{ $group: { _id: "$_id", general_new: { $addToSet: "$general.files.file" } } },
{ $set: { "general.files.file": "$general_new" } },
{ $unset: "general_new" },
{ $out: "nrel.component" } // carefully - it replaces entire collection.
])
