Query using ObjectId in MongoDB

I have a notes collection as:
{
  note: {
    type: String,
  },
  createdBy: {
    type: String,
    required: true
  },
}
where "createdBy" contains _id of a user from users collection.
First Question: Should I define it as String or ObjectId?
Second Question:
While querying the data as db.users.find({ createdBy: ObjectId(userid) }, 'notes'), is it an O(1) operation?
Or do I have to create an index for that to be O(1)?

If your users collection uses ObjectId, then you should also use ObjectId in the notes collection, since you may want to $lookup between them.
Only the _id field is indexed by default when a collection is created. You need to create an index on createdBy if you want fast lookups; strictly speaking an index lookup is O(log n) rather than O(1), but it avoids a full collection scan, which is O(n).
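A minimal sketch of creating that index in the mongo shell, assuming the collection is named notes:

```javascript
// Create the index on createdBy (runs against a live MongoDB instance):
db.notes.createIndex({ createdBy: 1 })

// Confirm the query now uses it: the winning plan should show IXSCAN
// rather than COLLSCAN.
db.notes.find({ createdBy: ObjectId(userid) }).explain("executionStats")
```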

Related

Prisma: Is it possible to have a field on a model be an array of objects without any relation?

I have a field on my user model called favorites. I want this to be an array of objects. I cannot set the field to be an array without some kind of relation or defining it, but there is no way to define it with an object. I also cannot use types since I am using a PostgreSQL DB. Is there any way I can have an array as a field that takes in objects without that field having any relation to another model?
An example of some dummy data in the favorites field
[
  { id: 1, title: 'blah' },
  { id: 2, title: 'ok' },
]
my schema:
model User {
  id        Int      @id @default(autoincrement())
  createdAt DateTime @default(now())
  updatedAt DateTime @updatedAt
  email     String   @unique
  firstName String
  lastName  String
  password  String
  playlists Playlist[]
  favorites Song[]
}
I currently have favorites related to a Song model which I do not need. I just want favorites to be an array of objects that I store with no relation. Something like:
model User {
  favorites {}[]
}
One way to do this would be using the Json type in the Prisma schema:
model User {
  favorites Json
}
The main drawback is that this is currently not typed (until Prisma supports typed Json fields), so you won't get any autocompletion or type-safety from the TS compiler.
If you want to have the type-safety, you'll need to model it as a relation as of now (or use MongoDB where embedded documents are already supported via the type keyword).
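Until Prisma types Json fields, one way to recover some safety at runtime is a small validation guard. A sketch in plain JavaScript, with the favorite shape ({ id, title }) assumed from the question's dummy data:

```javascript
// Values read from a Json column come back untyped, so validate at runtime.
// The { id, title } shape is taken from the question's dummy data.
function parseFavorites(value) {
  if (!Array.isArray(value)) return [];
  return value.filter(
    (v) =>
      typeof v === "object" &&
      v !== null &&
      typeof v.id === "number" &&
      typeof v.title === "string"
  );
}

console.log(parseFavorites([{ id: 1, title: "blah" }, "junk"]));
// → [ { id: 1, title: 'blah' } ]
```

In TypeScript the same guard can double as a type predicate, so downstream code gets autocompletion despite the untyped column.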

How do I insert a document along with the result of a find query together in a collection in a single query in MongoDB?

Suppose I have a collection named oldCollection which has a record like
{
  name: "XYZ"
}
Now I want to insert this data into a new collection named newCollection. But I also want to add another key-value pair (say, a boolean field named exists) to this same record, like:
{
  name: "XYZ",
  exists: true
}
I am using a find query to extract the required data and insert it into the new collection, but how can I add more fields (like exists in the example above) to the same record?
Use the $out aggregation stage on the source collection:
db.oldCollection.aggregate([
  { "$addFields": { "exists": true } },
  { "$out": "newCollection" }
])
Note that $out replaces the target collection if it already exists.
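If the target collection already contains documents you want to keep, a sketch using $merge (available on MongoDB 4.2+; collection names taken from the question):

```javascript
// $merge inserts unmatched documents and merges matched ones (by _id),
// instead of replacing the whole target collection like $out does.
db.oldCollection.aggregate([
  { "$addFields": { "exists": true } },
  { "$merge": { "into": "newCollection" } }
])
```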

Update unique compound indexes on an existing data set

Problem:
I'm trying to update a unique compound index on an existing data set and Mongo isn't updating the index.
Background:
In the database for our web app we have a unique compound index using a user's clubID and email. This means emails must be unique with respect to a user's clubID.
I'm in the process of updating this index to allow users to share emails. We added a new property on the user model called 'primaryAccountHolder'.
I want the new compound index to allow users with the same clubID to share an email, but only one user in the same club can have the field primaryAccountHolder set to true. I have this index working locally, but updating it on our existing data set is unsuccessful.
I believe this is because we have existing entries in our DB that won't allow this index to be updated. So my question is:
how can I update a compound index that maintains uniqueness on an existing data set?
Below are the indexes I have created using Mongoose / Typescript. These work locally but not on our existing db.
Old Index:
UserSchema.index({ email: 1, clubID: 1 }, { unique: true })
// Won't allow a user of the same club to have the same email. This is the index on our DB.
New Index:
UserSchema.index(
  { email: 1, clubID: 1 },
  {
    unique: true,
    partialFilterExpression: {
      email: { $exists: true },
      primaryAccountHolder: { $eq: true }
    }
  }
)
// Will allow users to share an email, but only one of them can have the
// primaryAccountHolder field set to true.
The new index uses a partial filter expression. This is the part that isn't created on the existing data set.
Thanks for the help!
Sam Gruse
You'll have to drop and recreate the index. Mongoose's Schema.index() only declares an index, so do this against the collection itself, e.g. in the mongo shell:
db.users.dropIndex({ email: 1, clubID: 1 })
And then recreate it:
db.users.createIndex(
  { email: 1, clubID: 1 },
  {
    unique: true,
    partialFilterExpression: {
      email: { $exists: true },
      primaryAccountHolder: { $eq: true }
    }
  }
)
From the MongoDB documentation (https://docs.mongodb.com/manual/tutorial/manage-indexes/#modify-an-index): MongoDB cannot update an existing index. You need to drop the current index and create the new one:
To modify an existing index, you need to drop and recreate the index. The exception to this rule is TTL indexes, which can be modified via the collMod command in conjunction with the index collection flag.

Querying mongoDB document based on Array element

This is one user's notes. I want to query and get only the notes of this user with "activeFlag: 1". My query object code is
findAccountObj = {
  _id: objectID(req.body.accountId),
  ownerId: req.body.userId,
  bookId: req.body.bookId,
  "notes.activeFlag": 1
};
But this query returns all the notes, including the ones with "activeFlag:0".
How do I fix this?
A filter like "notes.activeFlag": 1 matches whole documents in which at least one note has activeFlag: 1; it does not filter the array itself. If you are on v2.2+, use the $elemMatch projection operator to return only the first matching array element. v3.2 and above support aggregation with $filter, which can return every matching element.
Here is an example: Retrieve only the queried element in an object array in MongoDB collection
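A sketch of the $filter approach, assuming the documents live in an accounts collection and using the field names from the query object above:

```javascript
// Match the document as before, then keep only the notes whose
// activeFlag is 1 in the projected output.
db.accounts.aggregate([
  {
    $match: {
      _id: objectID(req.body.accountId),
      ownerId: req.body.userId,
      bookId: req.body.bookId
    }
  },
  {
    $project: {
      notes: {
        $filter: {
          input: "$notes",
          as: "note",
          cond: { $eq: ["$$note.activeFlag", 1] }
        }
      }
    }
  }
])
```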

Is there a built-in function to get all unique values in an array field, across all records?

My schema looks like this:
var ArticleSchema = new Schema({
  ...
  category: [{
    type: String,
    default: ['general']
  }],
  ...
});
I want to parse through all records and find all unique values for this field across all records. This will be sent to the front-end by a service call, for look-ahead search when tagging articles.
We could iterate through every record and check each array value, but this would be O(n²).
Is there an existing function or another way that has better performance?
You can use the distinct function to get the unique values across all category array fields of all documents:
Article.distinct('category', function(err, categories) {
  // categories is an array of the unique category values
});
Put an index on category for best performance.
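What distinct('category') computes can be sketched in plain JavaScript over a couple of sample documents (the sample data is made up for illustration):

```javascript
// Each article carries an array of category strings.
const articles = [
  { category: ["general", "tech"] },
  { category: ["tech", "science"] },
];

// Flatten all category arrays and deduplicate via a Set; this is the
// result shape distinct('category') produces server-side.
const categories = [...new Set(articles.flatMap((a) => a.category))];

console.log(categories); // → [ 'general', 'tech', 'science' ]
```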
