I have a structure like this:
unions { // collection
members { // array
instanceId // some id
...
}
...
}
In my documents, I have an ids prop (an array).
I need to look up all unions that have at least one id from ids (basically an $in).
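For context, a document in the source collection might look roughly like this (the ids field name is from the description above; everything else is assumed for illustration):
{
  _id: ObjectId("..."),
  // instance ids to match against unions.members.instanceId
  ids: ["a1", "b2", "c3"]
}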
The problem is that it doesn't work.
First I wanted to try this variant:
{
from: 'unions',
let: { instanceIds: '$ids' },
as: 'unions',
pipeline: [
{
$match: { 'members.instanceId': { $in: '$$instanceIds' } },
},
],
}
But we can't use aggregation variables in a plain query filter like that; for that, we need to use $expr:
{
from: 'unions',
let: { instanceIds: '$ids' },
as: 'unions',
pipeline: [
{
$match: {
$expr: {
$in: ['$members.instanceId', '$$instanceIds']
}
},
},
],
}
But then it returns 0 documents. The instanceIds array is not empty, I've checked it.
Also, if I paste a literal array of values into the example without $expr, it returns the right results. So most likely the problem is how I build this $lookup.
Inside $expr, '$members.instanceId' resolves to the whole array of ids, so $in checks whether that entire array is an element of $$instanceIds, which is never true. Test for a non-empty overlap instead: use { $ne: [{ $setIntersection: ['$members.instanceId', '$$instanceIds'] }, []] }
{
from: 'unions',
let: { instanceIds: '$ids' },
as: 'unions',
pipeline: [
{
$match: {
$expr: {
  $ne: [{ $setIntersection: ['$members.instanceId', '$$instanceIds'] }, []]
},
},
},
],
}
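For completeness, the full stage wrapped in an aggregate call might look like the sketch below (the documents collection name is a placeholder; only the ids field is known from the question):
db.documents.aggregate([
  {
    $lookup: {
      from: 'unions',
      let: { instanceIds: '$ids' },
      pipeline: [
        {
          $match: {
            $expr: {
              // keep unions whose members' instanceIds overlap this document's ids
              $ne: [{ $setIntersection: ['$members.instanceId', '$$instanceIds'] }, []]
            }
          }
        }
      ],
      as: 'unions'
    }
  }
])
Note that $setIntersection returns null when members is missing, and null is not equal to [], so unions without a members array would also match; add an $isArray guard if that matters.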
Related
I have this MongoDB query, which queries a collection called songs and, for each song, returns the associated album:
db.songs.aggregate([{
$lookup: {
from: "albums",
let: { album: '$album' },
as: "album",
pipeline: [{
$match: {
$expr: {
$and: [
{ $eq: ['$albumId', '$$album._id'] },
{ $eq: ['$status', 'Draft'] },
]
}
}
}]
}
}])
In the above query, my intention was to return a song only if the album was in Draft status, but in contrast, it returns all songs, and for the ones for which the album is not in Draft, it just returns an empty array inside the lookup. How can I not return the song document at all if the album is not in Draft?
Additionally, is it possible to flatten the results in the document, i.e. merge all the fields of the album into the song document?
Once you perform the $lookup you can filter out the documents with an empty array:
{ $match: { album: { $ne: [] } }}
Then there is an example in the MongoDB documentation for the $mergeObjects operator that is very similar to your case. Assuming that each song belongs to one album, your aggregation pipeline, put together, may look like this:
db.songs.aggregate([
{
$lookup: {
from: "albums",
let: { album: '$album' },
as: "album",
pipeline: [{
$match: {
$expr: {
$and: [
{ $eq: ['$albumId', '$$album._id'] },
{ $eq: ['$status', 'Draft'] },
]
}
}
}]
}
},
{ $match: { album: { $ne: [] } }},
{
$replaceRoot: { newRoot: { $mergeObjects: [ { $arrayElemAt: [ "$album", 0 ] }, "$$ROOT" ] } }
},
{ $project: { album: 0 } }
])
You may want to experiment going in the other direction: find albums with status = Draft, then get the songs:
db.album.aggregate([
{$match: {"status":"Draft"}}
,{$lookup: {from: "song",
localField: "album", foreignField: "album",
as: "songs"}}
// songs is now an array of docs. Run $map to turn that into an
// array of just the song title, and overwrite it (think x = x + 1):
,{$addFields: {songs: {$map: {
input: "$songs",
in: "$$this.song"
}} }}
]);
If you have a LOT of material in the song document, you can use the fancier $lookup to cut down the size of the docs in the lookup array -- but you still need the $map to turn it into an array of strings.
db.album.aggregate([
{$match: {"status":"Draft"}}
,{$lookup: {from: "song",
let: { aid: "$album" },
pipeline: [
{$match: {$expr: {$eq:["$album","$$aid"]}}},
{$project: {song:true}}
],
as: "songs"}}
,{$addFields: {songs: {$map: {
input: "$songs",
in: "$$this.song"
}} }}
]);
I have a Model which is structured similarly to this:
{
"_id": ObjectId("5c878c5c18a4ff001b981zh5"),
"books": [
ObjectId("5d963a7544ec1b122ab2ddc"),
ObjectId("5d963be01f663d168f8ea4dc"),
ObjectId("5d963bcb1f663d168f8ea2f4"),
ObjectId("5d963bdf1f663d16858ea7c9"),
}
Now I want to use the aggregation framework to get a list of only the populated books, like:
{ _id: ObjectId("5d963a7544ec1b122ab2ddc"), title: ...., ... },
..
.aggregate([
{
$lookup: {
from: 'books',
let: { books: '$books' },
pipeline: [{ $match: { $expr: { _id: { $in: ['_id', '$$books'] } } } }],
as: 'bookInfos'
}
},
{ $unwind: '$bookInfos' },
{ $replaceRoot: { newRoot: '$bookInfos' } }
])
I am not too sure about your question, but I think this might be what you're looking for.
So this query worked for me:
{
$match: {
_id: user._id,
},
},
{
$lookup: {
from: "books",
localField: "books",
foreignField: "_id",
as: "booksInfo",
},
},
{ $unwind: "$booksInfo" },
{
$replaceRoot: {
newRoot: "$booksInfo",
},
},
Thanks @zishone. Somehow your query returned all the books available in the DB and not only the ones referenced by the User model, but it works as desired when looking up the documents with localField and foreignField.
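For what it's worth, the pipeline-style lookup in the question likely returned every book because $expr: { _id: { $in: ['_id', '$$books'] } } is not a valid filter expression: the field paths are missing their $ prefixes and $in has to be the top-level operator inside $expr, so the whole expression ends up truthy for every document. A corrected sketch of that variant (the localField/foreignField answer above is simpler and equivalent here):
.aggregate([
  { $match: { _id: user._id } },
  {
    $lookup: {
      from: 'books',
      let: { books: '$books' },
      pipeline: [
        // $in as a top-level aggregation operator, with $-prefixed field path
        { $match: { $expr: { $in: ['$_id', '$$books'] } } }
      ],
      as: 'bookInfos'
    }
  },
  { $unwind: '$bookInfos' },
  { $replaceRoot: { newRoot: '$bookInfos' } }
])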
Aggregation in Node.js is resulting in nested JSON. Can I get it without nesting, taking only one _id from each collection? Is there any possibility to get the data without nested JSON?
I was trying aggregation in Node.js with the code below. I got the output given in the output section below, but I would like to get the expected output instead, since I can't use a loop within a loop.
Student.aggregate([
{
$match: { name: 'abcd'}
},
{
$lookup:{
from:'teachers',
pipeline: [
{
$match: { name: 'pqrs' }
},
{
$project:{
"_id":1
}
}
],
as: "teacherLookup"
}
},
{
$lookup:
{
from:'subjects',
pipeline: [
{
$match: { name: 'computer' }
},
{
$project:{
"_id":1
}
}
],
as: "subjectLookup"
}
}
])
output
[
{
_id: '52301c7878965455d2a4',
teacherLookup: [ '5ea737412589688930' ],
subjectLookup: [ '5ea745821369999917' ]
}
]
I am expecting the output as (without nested JSON):
[
{
studentId: '5ea1c7878965455d2a4',
teacherId: '5ea737412589688930' ,
subjectId: '5ea745821369999917'
}
]
You can use $arrayElemAt to get the first element from the array.
Student.aggregate([
{
$match: { name: "abcd" },
},
{
$lookup: {
from: "teachers",
pipeline: [
{
$match: { name: "pqrs" },
},
{
$project: {
_id: 1,
},
},
],
as: "teacherId",
},
},
{
$lookup: {
from: "subjects",
pipeline: [
{
$match: { name: "computer" },
},
{
$project: {
_id: 1,
},
},
],
as: "subjectId",
},
},
{
$project: {
teacherId: { $arrayElemAt: ["$teacherId", 0] },
subjectId: { $arrayElemAt: ["$subjectId", 0] },
},
}
]);
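As written, teacherId and subjectId will hold the first { _id: ... } sub-document from each lookup array. If you want the plain id values exactly as in the expected output, the final $project could instead look like this sketch (studentId here is just the student document's own _id renamed):
{
  $project: {
    _id: 0,
    studentId: "$_id",
    teacherId: { $arrayElemAt: ["$teacherId._id", 0] },
    subjectId: { $arrayElemAt: ["$subjectId._id", 0] }
  }
}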
I have changed one of the fields of my collection in MongoDB from an array of strings to an array of objects, each containing 2 strings. New documents get inserted without any problem, but when a GET method is called to query all the documents, I get this error:
Failed to decode 'Students'. Decoding 'photoAddresses' errored
with: readStartDocument can only be called when CurrentBSONType is
DOCUMENT, not when CurrentBSONType is STRING.
photoAddresses is the field that was changed in Students.
I was wondering: is there any way to update all the records so they all have the same data type, without losing any data?
The old version of photoAddresses:
"photoAddresses" : ["something","something else"]
This should be updated to the new version like this:
"photoAddresses" : [{photoAddresses:"something"},{photoAddresses:"something else"}]
The following aggregation queries update the string array to an array of objects, but only if the array has string elements. The aggregation operator $map is used to map the string array elements to objects. You can use either of the two queries.
db.test.aggregate( [
  {
    $match: {
      $expr: {
        $and: [
          { $isArray: "$photoAddresses" },
          { $gt: [ { $size: "$photoAddresses" }, 0 ] }
        ]
      },
      "photoAddresses.0": { $type: "string" }
    }
  },
  {
    $project: {
      photoAddresses: {
        $map: {
          input: "$photoAddresses",
          as: "ph",
          in: { photoAddresses: "$$ph" }
        }
      }
    }
  },
] ).forEach( doc => db.test.updateOne( { _id: doc._id }, { $set: { photoAddresses: doc.photoAddresses } } ) )
The following query works with MongoDB version 4.2+ only. Note that the update operation uses an aggregation pipeline instead of an update document. See updateMany.
db.test.updateMany(
  {
    $expr: {
      $and: [
        { $isArray: "$photoAddresses" },
        { $gt: [ { $size: "$photoAddresses" }, 0 ] }
      ]
    },
    "photoAddresses.0": { $type: "string" }
  },
  [
    {
      $set: {
        photoAddresses: {
          $map: {
            input: "$photoAddresses",
            as: "ph",
            in: { photoAddresses: "$$ph" }
          }
        }
      }
    }
  ]
)
[EDIT ADD]: The following query works with MongoDB version 3.4:
db.test.aggregate( [
{
$addFields: {
matches: {
$cond: {
if: { $and: [
{ $isArray: "$photoAddresses" },
{ $gt: [ { $size: "$photoAddresses" }, 0 ] },
{ $eq: [ { $type: { $arrayElemAt: [ "$photoAddresses", 0 ] } }, "string" ] }
] },
then: true,
else: false
}
}
}
},
{
$match: { matches: true }
},
{
$project: {
photoAddresses: {
$map: {
input: "$photoAddresses",
as: "ph",
in: { photoAddresses: "$$ph" }
}
}
}
},
] ).forEach( doc => db.test.updateOne( { _id: doc._id }, { $set: { photoAddresses: doc.photoAddresses } } ) )
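After running any of these migrations, a quick sanity check (a sketch, assuming the same test collection) is to confirm that no document still has a string as its first array element:
// should return no documents once the migration is complete
db.test.find( { "photoAddresses.0": { $type: "string" } } )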
I have the following collection, for example:
// vehicles collection
[
{
"_id": 321,
manufactor: SOME-OBJECT-ID
},
{
"_id": 123,
manufactor: ANOTHER-OBJECT-ID
},
]
And I have a collection named tables:
// tables collection
[
{
"_id": SOME-OBJECT-ID,
title: "Skoda"
},
{
"_id": ANOTHER-OBJECT-ID,
title: "Mercedes"
},
]
As you can see, the vehicles collection's documents pull data from the tables collection's documents: the first document in the vehicles collection has a manufactor id which is pulled from the tables collection and named Skoda.
That is great.
When I query the DB using aggregate I can easily pull the remote data from the remote collections, without any problem.
I can also easily apply rules and limitations like $project, $sort, $skip, $limit and others.
But I want to display to the user only those vehicles that are manufactured by Mercedes.
Since Mercedes is not mentioned in the vehicles collection, only its ID, a $text $search would not return the right results.
This is the aggregate pipeline that I provide:
[
{
$match: {
$text: {
$search: "Mercedes"
}
}
},
{
$lookup: {
from: "tables",
let: {
manufactor: "$manufactor"
},
pipeline: [
{
$match: {
$expr: {
$eq: [
"$_id", "$$manufactor"
]
}
}
},
{
$project: {
title: 1
}
}
],
as: "manufactor"
},
},
{
$unwind: "$manufactor"
},
{
$lookup: {
from: "tables",
let: {
model: "$model"
},
pipeline: [
{
$match: {
$expr: {
$eq: [
"$_id", "$$model"
]
}
}
},
{
$project: {
title: 1
}
}
],
as: "model"
},
},
{
$unwind: "$model"
},
{
$lookup: {
from: "users",
let: {
joined_by: "$_joined_by"
},
pipeline: [
{
$match: {
$expr: {
$eq: [
"$_id", "$$joined_by"
]
}
}
},
{
$project: {
personal_info: 1
}
}
],
as: "joined_by"
},
},
{
$unwind: "$joined_by"
}
]
As you can see, I am using the $text $search $match as the first stage in the pipeline - otherwise MongoDB will throw an error.
But this $text $search only searches the origin collection - the vehicles collection.
Is there a way to tell MongoDB to search in the remote collection with the $text and $search method, and then include in the aggregation only the results that match both?
UPDATE
When I do this instead:
{
$lookup: {
from: "tables",
pipeline: [
{
$match: {
$text: {
$search: "Mercedes"
}
}
},
{
$project: {
title: 1
}
}
],
as: "manufactor"
},
},
This is what I receive:
MongoError: pipeline requires text score metadata, but there is no text score available
If you are using one of the affected versions mentioned in this thread, you need to update your MongoDB server. As you can see, the issue was fixed in version 4.1.8.
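If upgrading the server is not an option, one workaround sketch is to drop $text from the sub-pipeline entirely and filter on the joined title after the manufactor $lookup and $unwind stages shown in the question (a plain equality match rather than a text-index search, so text scoring and stemming are lost):
// ...placed after the manufactor $lookup and $unwind stages:
{
  $match: { "manufactor.title": "Mercedes" } // plain equality instead of $text; a regex is another option
}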