Remove all items from an array NOT in another array in MongoDB

I have two collections in Mongo. For simplicity I'm providing a minimal example.
Template Collection:
{
  templateId: 0,
  values: [
    { data: "a" },
    { data: "b" },
    { data: "c" },
    { data: "e" }
  ]
}
Data Collection:
{
  dataId: 0,
  templateId: 0,
  values: [
    { data: "a", value: 10 },
    { data: "b", value: 120 },
    { data: "c", value: 3220 },
    { data: "d", value: 0 }
  ]
}
I want to sync from the Template Collection to the Data Collection, between template 0 and all documents using that template. In the case of the example, that would mean:
Copy {data:"e"} into the arrays of all documents with the templateId: 0
Remove {data:"d"} from the arrays of all documents with the templateId: 0
BUT do not touch the rest of the items. I can't simply replace the array, because those values have to be kept.
I've found a solution for 1.
db.getCollection('data').update(
  { templateId: 0 },
  { $addToSet: { values: {
    $each: [
      { data: "a" },
      { data: "b" },
      { data: "c" },
      { data: "e" }
    ]
  } } },
  { multi: true }
)
And a partial solution for 2.
I got it: I first tried $pullAll, but the normal $pull seems to work together with the $nin operator:
db.getCollection('data').update(
  { templateId: "QNXC4bPAF9J6r9FQu" },
  { $pull: { values: { $nin: [
    { data: "a" },
    { data: "b" },
    { data: "c" },
    { data: "e" }
  ] } } },
  { multi: true }
)
This will remove {data:"d"} from all document arrays, but it seems to overwrite the complete array, which is not what I want, as those value entries need to be preserved.
But how can I perform a query like Remove everything from an array EXCEPT/NOT IN [a,b,c,d,...] ?
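For the record, a sketch (not from the original post) of the kind of query step 2 seems to need: apply $nin to the data field of the array entries rather than to the whole sub-documents, so the entries that stay keep their value untouched:
db.getCollection('data').update(
  { templateId: 0 },
  // Pull every entry whose data field is NOT in the template's list.
  { $pull: { values: { data: { $nin: ["a", "b", "c", "e"] } } } },
  { multi: true }
)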

Related

How to insert/delete an item by index in an "array" in Firebase's realtime database?

In my app I have 3 vehicles in an array, ordered by an index.
So my array looks like this:
// My array
const array = [
  { name: 'car', index: 1 },
  { name: 'aeroplane', index: 2 },
  { name: 'bus', index: 3 }
];
// How I push to the database
database.ref('/').push(array[0]);
database.ref('/').push(array[1]);
database.ref('/').push(array[2]);
// How my realtime db looks right after
{
  "-M9UBf4uJa9cy0_ZT4LF" : {
    "index" : 1,
    "name" : "car"
  },
  "-M9UBf50Jdyo4GmXO23l" : {
    "index" : 2,
    "name" : "aeroplane"
  },
  "-M9UBf526cmgjUw99dWw" : {
    "index" : 3,
    "name" : "bus"
  }
}
No problem so far.
However, I'm struggling when it comes to deleting or inserting vehicles in Firebase, because the indexes remain as they were. So, for example, if I delete 'aeroplane' above with the following code:
// Delete second item
database.ref('/-M9UBf50Jdyo4GmXO23l').remove();
...I get an array like this:
[{name: 'car', index: 1}, {name: 'bus', index: 3}].
I need the bus's index to become '2' instead of remaining '3'. I realize there's no built-in way to achieve this using Firebase's API (and on purpose, since it would create problems in real-time applications). However, I absolutely need the indexes to stay consecutive.
I thought about manually .update()-ing every index greater than the one I deleted, but in cases where I have millions and millions of vehicles, wouldn't the performance be horrible?
Is there any way I could do this with Firebase's realtime database, or should I go with another database?
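For what it's worth, here is a minimal sketch of the manual re-indexing idea from the question, assuming the v8-style SDK used above and a hypothetical deletedIndex variable holding the index of the removed vehicle. It batches all writes into a single multi-location update(), but it still has to touch every later record, so the performance concern stands:
const rootRef = database.ref('/');
rootRef.orderByChild('index').startAt(deletedIndex + 1).once('value')
  .then((snapshot) => {
    const updates = {};
    snapshot.forEach((child) => {
      // Shift every vehicle after the deleted one down by one position.
      updates[child.key + '/index'] = child.val().index - 1;
    });
    // One multi-location update instead of one write per vehicle.
    return rootRef.update(updates);
  });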

Using $rename in MongoDB for an item inside an array of objects

Consider the following MongoDB collection of a few thousand Objects:
{
  _id: ObjectId("xxx")
  FM_ID: "123"
  Meter_Readings: Array
    0: Object
      Date: 2011-10-07
      Begin_Read: true
      Reading: 652
    1: Object
      Date: 2018-10-01
      Begin_Reading: true
      Reading: 851
}
The wrong key ("Begin_Reading") was entered for the 2018 entry in the array and needs to be renamed to "Begin_Read". Using another aggregate, I have a list of all the objects that have the incorrect key. The objects within the array don't have an _id value, so they are hard to select. I was thinking I could iterate through the collection, find the array index of the errored readings, and use the _id of the object to perform the $rename on the key.
I am trying to get the index of the array, but cannot seem to select it correctly. The following aggregate is what I have:
[
    {
        '$match': {
            '_id': ObjectId('xxx')
        }
    }, {
        '$project': {
            'index': {
                '$indexOfArray': [
                    '$Meter_Readings', {
                        '$eq': [
                            '$Meter_Readings.Begin_Reading', True
                        ]
                    }
                ]
            }
        }
    }
]
Its result is always -1, which I think means my expression must be wrong, as the expected result would be 1.
I'm using Python for this script (I can use JavaScript as well). If there is a better way to do this (maybe a filter?), I'm open to alternatives; this is just what I've come up with.
I fixed this myself. I was close with the aggregate, but I needed to look at a different field; for some reason that one did not work:
{
    '$project': {
        'index': {
            '$indexOfArray': [
                '$Meter_Readings.Water_Year', 2018
            ]
        }
    }
}
What I did learn is that to find an object within an array, you can just reference its field through the array identifier in the $indexOfArray operator. I hope that might help someone else.
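One caveat worth adding here (my note, not the original poster's): $rename cannot target a path that goes through an array element, so the actual fix has to copy the value to the correct key and remove the wrong one, for example with a $set/$unset pair built from the index found above. A shell-syntax sketch, with a placeholder collection name and assuming the value is the true shown in the sample document:
var idx = 1;  // index returned by $indexOfArray for the 2018 entry
db.getCollection('meters').updateOne(  // 'meters' is a placeholder name
  { _id: ObjectId('xxx') },
  {
    $set: { ['Meter_Readings.' + idx + '.Begin_Read']: true },
    $unset: { ['Meter_Readings.' + idx + '.Begin_Reading']: "" }
  }
)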

MongoDB: upsert multiple documents at the same time (with a key like $in)

I stumbled upon a funny behavior in MongoDB:
When I run:
db.getCollection("words").update({ word: { $in: ["nico11"] } }, { $inc: { nbHits: 1 } }, { multi: 1, upsert: 1 })
it will create "nico11" if it doesn't exist, and increase nbHits by 1 (as expected).
However, when I run:
db.getCollection("words").update({ word: { $in: ["nico10", "nico11", "nico12"] } }, { $inc: { nbHits: 1 } }, { multi: 1, upsert: 1 })
it will correctly update the keys that are already in the DB, but not insert the missing ones.
Is that the expected behavior, and is there any way I can provide an array to mongoDB, for it to update the existing elements, and create the ones that need to be created?
That is expected behaviour according to the documentation:
The update creates a base document from the equality clauses in the <query> parameter, and then applies the update expressions from the <update> parameter. Comparison operations from the <query> will not be included in the new document.
And, no, there is no way to achieve what you are attempting to do here using a simple upsert. The reason for that is probably that the expected outcome would be impossible to define. In your specific case it might be possible to argue along the lines of: "oh well, it is kind of obvious what we should be doing here". But imagine a more complex query like this:
db.getCollection("words").update({
a: { $in: ["b", "c" ] },
x: { $in: [ "y", "z" ]}
},
{ $inc: { nbHits: 1 } },
{ multi: 1, upsert: 1 })
What should MongoDB do in this case?
There is, however, the concept of bulk write operations in MongoDB where you would need to define three separate updateOne operations and package them up in a single request to the server.
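A sketch of that bulk approach in shell syntax, reusing the words from the question:
db.getCollection("words").bulkWrite(
  ["nico10", "nico11", "nico12"].map((word) => ({
    updateOne: {
      filter: { word: word },
      update: { $inc: { nbHits: 1 } },
      upsert: true   // missing words are inserted, existing ones are incremented
    }
  }))
)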

Array .map() returning undefined

This is the structure of the array when I console.log it.
-[]
-0: Array(31)
-0:
-date: "2018-08-26T00:00:00-04:00"
-registered:
-standard: 0
-vip: 0
-waitlisted:
-standard: 0
-vip: 0
This is my code to map the date and the registered (two separate arrays):
this.data.map((value) => value.date);
this.data.map((value) => value.registered['standard']);
I either get an empty array or undefined when I log these. What am I doing wrong?
I want to use these for a chart using ChartJS where:
this.lineChart = new Chart(lineCtx, {
  type: 'line',
  data: {
    datasets: [{
      label: (I want the dates to be the labels),
      data: (I want the list of standard registrants)
    }]...
EDIT:
I've updated the way I get the data to show the following structure:
{
  "registrationHistory": [
    {
      "date": "2018-08-26T00:00:00-4:00",
      "registered": {
        "vip": 0,
        "standard": 0
      },
      "waitlisted": {
        "vip": 0,
        "standard": 0
      }
    },
    ...
  ]
}
Your array is two-dimensional and map iterates only over the first dimension, i.e.:
- []
  - 0: Array(31)   // first dimension
    - 0:           // second dimension
      - date: "2018-08-26T00:00:00-04:00"
      ...
This would look like the following JSON string:
[[{"date":"2018-08-26T00:00:00-04:00", ...}]]
Since you haven't provided a full example it's impossible to recommend the most applicable solution:
If you control the data source, remove the first dimension since it appears redundant.
Assuming you only want the first element of the first dimension, refer to that key (see the fuller sketch after these options):
this.data[0].map((value) => value.date);
If your data model is more complex than revealed in your question you'll need to figure out another approach.
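For completeness, a sketch (not part of the original answer) of the second option feeding the ChartJS config from the question, assuming the original two-dimensional this.data; the dataset label text is a placeholder:
const labels = this.data[0].map((value) => value.date);
const standard = this.data[0].map((value) => value.registered.standard);

this.lineChart = new Chart(lineCtx, {
  type: 'line',
  data: {
    labels: labels,                    // the dates become the x-axis labels
    datasets: [{
      label: 'Standard registrants',   // placeholder label text
      data: standard
    }]
  }
});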

$set operator interpreting array index as object index in mongodb

MongoDB seems to interpret $set paths with numerical components as object keys rather than array indexes if the field has not already been created as an array.
> db.test.insert({_id: "one"});
> db.test.update({_id: "one"}, {$set: {"array.0.value": "cheese"}});
> db.test.find({_id: "one"})
{ "_id" : "one", "array" : { "0" : { "value" : "cheese" } } }
I expected to get "array": [{"value": "cheese"}], but instead the field was initialized as an object with the string "0" as a key.
I could get an array by initializing the whole array, like so:
> db.test.update({_id: "one"}, {$set: {"array": [{"value": "cheese"}]}});
... but this would clobber any existing properties and other array elements that might have been previously set.
Is there any way to convince $set that I want "array" to be an array type, with the following constraints:
I want to execute this in a single query, without looking up the record first.
I want to preserve any existing array entries and object values
In short, I want the behavior of $set: {"array.0.value": ... } if "array" had already been initialized as an array, without knowing whether or not it has. Is this possible?
I am not sure if this is possible without a lookup. Perhaps you can change the schema design and try something like this:
db.test.insert({_id: "one"});
db.test.update({_id: "one"}, {$addToSet: {array: { $each:['cheese', 'ham'] }}});
db.test.findOne({_id:'one'});
// { "_id" : "one", "array" : [ "cheese", "ham" ] }
Handling array elements (sub-documents in an array) in MongoDB is a pain. See https://jira.mongodb.org/browse/SERVER-1243
