Return specific array from object collection - reactjs

So I get some data over my socket. The code on the client is:
useEffect(() => {
  const socket = io("http://localhost:5000/api/socket");
  socket.on("newThought", (thought) => {
    console.log(thought);
  });
}, []);
And then the code on my server is:
connection.once("open", () => {
  console.log("MongoDB database connected");
  console.log("Setting change streams");

  const thoughtChangeStream = connection.collection("phonenumbers").watch();

  thoughtChangeStream.on("change", (change) => {
    io.of("/api/socket").emit("newThought", change);
  });
});
When something in my "phonenumbers" collection gets changed, I get the whole collection in return. How would I be able to get only the entry that changed, rather than the whole collection object?
So, for example, if the only entry that changed is the one with id "607deefd13c4ebcbcfa0900a", only that one should be returned and not the whole collection object.

For update operations, the change event includes an updateDescription field containing a delta that describes the changes made to the document. You can also pass the fullDocument option in the options (second) argument to the watch method if you want the full updated document included as well:
const thoughtChangeStream = connection.collection("phonenumbers").watch([], {
  fullDocument: 'updateLookup'
});

thoughtChangeStream.on("change", (change) => {
  io.of("/api/socket").emit("newThought", change);
});
This will then produce a change event document like the following, where updateDescription contains the fields that were modified by the update:
{
  _id: {
    _data: '8260931772000000012B022C0100296E5A1004ABFC09CB5798444C8126B1DBABB9859946645F696400646082EA7F05B619F0D586DA440004'
  },
  operationType: 'update',
  clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 1, high_: 1620252530 },
  ns: { db: 'yourDatabase', coll: 'yourCollection' },
  documentKey: { _id: 6082ea7f05b619f0d586da44 },
  updateDescription: {
    updatedFields: { updatedField: 'newValue' },
    removedFields: []
  }
}
Note: This will only work for update operations and will not work for replace, delete, insert, etc.
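If the goal is to send only what changed to the client rather than the whole change event, a minimal sketch along these lines could work (it reuses the change stream and namespace from above; the shape of the emitted payload is my own example, not from the original answer):

thoughtChangeStream.on("change", (change) => {
  if (change.operationType === "update") {
    // Forward only the id of the changed document and the changed fields
    // (example payload shape, adjust to whatever the client expects).
    io.of("/api/socket").emit("newThought", {
      documentKey: change.documentKey,
      updatedFields: change.updateDescription.updatedFields,
      removedFields: change.updateDescription.removedFields
    });
  }
});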
See also:
http://mongodb.github.io/node-mongodb-native/3.0/api/Collection.html.
https://docs.mongodb.com/manual/reference/change-events/

Related

how should I make my database fetch continuously

function shield() {
  setInterval(async function () {
    const ProfileModelS = require("../models/ProfileSchema");
    await ProfileModelS.find({}).then((doc) => {
      doc.forEach(async (u) => {
        if (u.ShieldPoints <= 0) return console.log(u.Name);
        if (u.ShieldPoints > 0) {
          await ProfileModelS.findOneAndUpdate(
            { userID: u.userID },
            {
              $inc: {
                ShieldPoints: -1,
              },
            },
            console.log("done")
          );
        }
      });
    });
  }, 1000);
}
module.exports = shield;
I want my MongoDB fetch to run on every interval, but it's not doing that. Whenever I run my code it fetches the model, for example it will fetch:
[{name: 'Joseph', Points: 10}, {name: 'carman', Points: -1}, {name: 'thee', Points: 2}]
According to the code it correctly does not reduce the points of objects whose points are 0 or below, but it keeps decreasing the points of objects above 0. I want it to stop reducing an object's points once they reach 0, while continuing to decrease the points of objects whose points are greater than 0.
In short, the process for a particular object should stop once its points reach 0.
You could try using Promise.all to iterate over all of the retrieved models, as you seem to want to perform an async operation on every instance with ShieldPoints > 0 and your findOneAndUpdate operations are independent of one another.
function shield() {
  // Import the model schema
  const ProfileModelS = require("../models/ProfileSchema");

  setInterval(async function () {
    // Retrieve all models
    const users = await ProfileModelS.find();

    // Parallelise the process of updating the models that need to be updated
    Promise.all(
      users.map(async (user) => {
        if (user.ShieldPoints > 0) {
          await ProfileModelS.findOneAndUpdate(
            { userID: user.userID },
            {
              $inc: {
                ShieldPoints: -1,
              },
            }
          );
        }
      })
    );
  }, 1000);
}

module.exports = shield;
However, if your findOneAndUpdate operations do not complete within the 1000 ms interval, the code will issue a second findOneAndUpdate for those same model instances, which could cause them to be updated more than once. That is unintended behaviour, so you would need to add some form of guard against it.
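As a rough sketch of such a guard (this is my own assumption, not part of the original answer), you could simply skip a tick while the previous batch of updates is still in flight:

function shield() {
  const ProfileModelS = require("../models/ProfileSchema");
  let running = false; // assumed guard flag: true while a batch is still in flight

  setInterval(async function () {
    if (running) return; // previous batch not finished yet, skip this tick
    running = true;
    try {
      const users = await ProfileModelS.find();
      // Decrement only the users that still have points left.
      await Promise.all(
        users
          .filter((user) => user.ShieldPoints > 0)
          .map((user) =>
            ProfileModelS.findOneAndUpdate(
              { userID: user.userID },
              { $inc: { ShieldPoints: -1 } }
            )
          )
      );
    } finally {
      running = false;
    }
  }, 1000);
}

module.exports = shield;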

findOneAndUpdate causing duplication problem

I am having a problem with findOneAndUpdate in mongoose.
The case is that I am updating a document after finding it.
The query is as follows:
UserModel.findOneAndUpdate({
  individualId: 'some id'
}, {
  $push: {
    supporterOf: 'some string'
  }
})
'supporterOf' is a ref in UserModel and its type is 'ObjectId'.
The issue I am facing is that 'some string' is being pushed twice into 'supporterOf' in the document.
Can anyone tell me how to push an array element into the document correctly?
I was having the same problem; my solution is below. I was using await like this:
await schema.findOneAndUpdate(queryParms, {
  "$push": {
    "array1": arrayDetails,
    "array2": array2Details
  }
}, {
  "upsert": true,
  "new": true
},
function (error, updateResponse) {
  if (error) {
    throw new Error(error);
  } else {
    // do something with updateResponse;
  }
});
Simply removing the await resolved the problem for me.
I still need to find the root cause; any pointers or references are welcome.
I recently encountered the same problem. I managed to work around it with some other logic (details below), but I couldn't understand why findOneAndUpdate was inserting duplicate entries in MongoDB.
You can work around the problem as follows: use findOne or findById instead of findOneAndUpdate to find the document in your collection, then modify it manually and call save().
This code snippet should give you a better idea:
return new Promise(function (resolve, reject) {
  Model.findOne({
    someCondition...
  }, function (err, item) {
    if (err) {
      reject(err);
    } else {
      item.someArray.push({
        someKeyValue...
      });
      item.save().then((result) => {
        resolve(result)
      }).catch((err) => {
        reject(err)
      });
    }
  }).catch((err) => {
    reject(err)
  });
});
This will not insert duplicate items. However, if you figure out the reason behind the duplication, please do update this thread.
The issue seems to stem from combining an await and a callback. I had the same issue until I realised I was using both an (err, resp) callback and a .catch(...):
models[auxType].findOneAndUpdate(
  filter,
  updateObject,
  options,
  (err, resp) => {
    if (err) {
      console.log("Update failed:", err);
      res.json(err);
    } else if (resp) {
      console.log("Update succeeded:", resp);
      res.json(resp);
    } else {
      console.log("No error or response returned by server");
    }
  })
  .catch((e) => { console.log("Error saving Aux Edit:", e); }); // << THE PROBLEM WAS HERE!!
The problem resolved as soon as I removed the .catch(...) line.
From the mongoose documentation:
"Mongoose queries are not promises. They have a .then() function for co and async/await as a convenience. However, unlike promises, calling a query's .then() can execute the query multiple times."
(https://mongoosejs.com/docs/queries.html#queries-are-not-promises)
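Based on that, here is a minimal sketch of the same call using only the promise form, with the callback removed (filter, updateObject, options, and res are the names from the snippet above):

models[auxType].findOneAndUpdate(filter, updateObject, options)
  .then((resp) => {
    if (resp) {
      console.log("Update succeeded:", resp);
      res.json(resp);
    } else {
      console.log("No error or response returned by server");
    }
  })
  .catch((err) => {
    console.log("Update failed:", err);
    res.json(err);
  });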
Use $addToSet instead of $push; it should solve the problem. I believe there is an issue with the data structure used in the creation of the mongoose 'Model'. $push is an array operation (arrays allow duplicates), while $addToSet behaves like a set operation (sets do not allow duplicates).
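For illustration, using the model and field names from the question (a sketch, not part of the original answer):

// $push appends the value even if it is already present in the array:
await UserModel.findOneAndUpdate(
  { individualId: 'some id' },
  { $push: { supporterOf: 'some string' } }
);

// $addToSet only appends the value if it is not already in the array:
await UserModel.findOneAndUpdate(
  { individualId: 'some id' },
  { $addToSet: { supporterOf: 'some string' } }
);

Note that this prevents duplicate values in the array, but it does not stop the query itself from running twice if a callback and a promise are still being combined.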
The problem with the accepted answer is that it only solves the problem by wrapping it in an unnecessary additional promise, when the findOneAndUpdate() method already returns a promise. Additionally, it uses both promises AND callbacks, which is something you should almost never do.
Instead, I would take the following approach:
I generally like to keep my update query logic separate from other concerns, for both readability and re-usability, so I would make a wrapper function kind of like:
const update = (id, updateObj) => {
  const options = {
    new: true,
    upsert: true
  };
  return model.findOneAndUpdate({ _id: id }, { ...updateObj }, options).exec();
};
This function could then be reused throughout my application, saving me from having to rewrite repetitive options setup or exec calls.
Then I would have some other function that is responsible for calling my query, passing values to it, and handling what comes back from it.
Something kind of like:
const makePush = async () => {
  try {
    const result = await update('someObjectId', { $push: { someField: value } });
    // do whatever you want to do with the updated document
  } catch (e) {
    handleError(e);
  }
};
No need to create unnecessary promises, no callback hell, no duplicate requests, and stronger adherence to single responsibility principles.
I was having the same problem. My code was:
const doc = await model.findOneAndUpdate(
  { filter }, { update },
  { new: true }, (err, item) => { if (err) console.log(err); }
);
res.locals.doc = doc;
next();
The thing is, for some reason this callback after the "new" option was creating a double entry. I removed the callback and it worked.
I had the same problem, and I found a solution: I was using a callback and a promise (i.e. the await keyword) simultaneously.
Using a callback and a promise simultaneously will result in the query being executed twice. You should be using one or the other, but not both.
options = {
  upsert: true // creates the object if it doesn't exist. defaults to false.
};

await Company.findByIdAndUpdate(company._id,
  { $push: { employees: savedEmployees } },
  options,
  (err) => {
    if (err) {
      debug(err);
    }
  }
).exec();
to
options = {
  upsert: true // creates the object if it doesn't exist. defaults to false.
};

await Company.findByIdAndUpdate(company._id,
  { $push: { employees: savedEmployees } },
  options
).exec();
UserModel.findOneAndUpdate(
  { _id: id },
  { object }
)
Even if you use _id as a parameter, don't forget to make the filter explicit by id.
In my case, changing the async callback solved the problem.
Changing this:
await schema.findOneAndUpdate(
  { queryData },
  { updateData },
  { upsert: true },
  (err) => {
    if (err) console.log(err);
    else await asyncFunction();
  }
);
To this:
await schema.findOneAndUpdate(
  { queryData },
  { updateData },
  { upsert: true },
  (err) => {
    if (err) console.log(err);
  }
);

if (success) await asyncFunction();
Using $addToSet instead of $push allowed me to prevent duplicate entries in the array field of my User document in MongoDB, like this:
const blockUserServiceFunc = async (req, res) => {
  let filter = {
    _id: req.body.userId
  };
  let update = { $addToSet: { blockedUserIds: req.body.blockUserId } };

  await User.findOneAndUpdate(filter, update, (err, user) => {
    if (err) {
      res.json({
        status: 501,
        success: false,
        message: messages.FAILURE.SWW
      });
    } else {
      res.json({
        status: 200,
        success: true,
        message: messages.SUCCESS.USER.BLOCKED,
        data: {
          'id': user._id,
          'firstName': user.firstName,
          'lastName': user.lastName,
          'email': user.email,
          'isActive': user.isActive,
          'isDeleted': user.isDeleted,
          'deletedAt': user.deletedAt,
          'mobileNo': user.mobileNo,
          'userName': user.userName,
          'dob': user.dob,
          'role': user.role,
          'reasonForDeleting': user.reasonForDeleting,
          'blockedUserIds': user.blockedUserIds,
          'accountType': user.accountType
        }
      });
    }
  }).catch(err => {
    res.json({
      status: 500,
      success: false,
      message: err
    });
  });
};

remove a field from mongodb using mongoose

This is my model function:
Project.collection.findOneAndUpdate(query, updatedProject)
  .then((res) => {
    if (res) {
      return resolve(res);
    } else reject();
  })
  .catch((err) => {
    return reject(err);
  });
updatedProject holds JSON data like:
{
  "data": "has data",
  "fields": "some data"
}
and query holds:
let query = { _id: mongoose.Types.ObjectId(projectId) }
My project collection document in the database currently is:
{
  "_id": ObjectId("5b446a58ab89ec3cc34c2bec"),
  "name": "my name",
  "data": "has data123",
  "fields": "some data123",
  "record": {
    "data1": "some records",
    "data2": "some records",
    "data3": "some records"
  }
}
Now, my updatedProject holds only data and fields, so I'm expecting the other fields (name, record) to be removed, but the response comes back with all the fields, just with updated values. However, if I pass updatedProject like this,
"record": {
  "data1": "some new records"
}
it works as expected (removes data2 and data3). Please help me solve this issue.
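There is no answer attached to this question in the thread, but purely as a hedged sketch (not from the original post): if the intent is for fields that are missing from updatedProject to be removed from the stored document, a full replace or an explicit $unset are two options:

// Replace the whole document, so fields missing from updatedProject (name, record) are dropped.
Project.collection.findOneAndReplace(query, updatedProject)
  .then((res) => resolve(res))
  .catch((err) => reject(err));

// Or keep findOneAndUpdate and remove specific fields explicitly with $unset.
Project.collection.findOneAndUpdate(query, {
  $set: updatedProject,
  $unset: { name: "", record: "" }
});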

Can I update various items in an Immutable List in a React/Redux App

On submitting a form with some updated values, I need to update the state to reflect these changes, but I am new to Immutable.js and am unsure how to do so.
Is it possible to pass a function as a second argument to set or update, to update values based on certain criteria?
I have a function which receives state and an array of objects called values. The data in values looks like this:
[
  {
    key: 'name',
    value: 'fred'
  },
  {
    key: 'height',
    value: '201'
  },
  {
    key: 'weight',
    value: '78'
  }
]
I need to map over this data and the state list, and update the corresponding values in the state list from the values array.
How can I do this? I have put together a function which the reducer calls to update the state with the new data, but I'm unsure exactly how to get the end result:
function updateValue(state, values = []) {
  const items = state.get('items').map((i) => {
    const key = i.get('key');
    values.map(v => {
      if (v.key === key) {
        return state.update('value', v.value);
      }
    })
  });
  return state.update('items', /* Can I use a function here to replace the code above.. to update all of the items in the state List that correspond to the items in the measurements array (which have new values) */);
}
Thank you very much.
Update
Tried the following, but getting the error: Expected [K, V] tuple: i
function updateValue(state, values = []) {
  const items = state.get('items').map((i) => {
    const key = i.get('key');
    values.map(v => {
      if (v.key === key) {
        return state.update('value', v.value);
      }
    })
  });
  return state.update('items', items);
}
More details on the error from Immutable:
function validateEntry(entry) {
  if (entry !== Object(entry)) {
    throw new TypeError('Expected [K, V] tuple: ' + entry);
  }
}
You can use 'merge' to return a new object:
return state.merge({
  items: values,
});
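If only the matching entries should be updated rather than the whole items list being replaced, a sketch along these lines may be closer to what the question asks (assuming items is an Immutable List of Maps with key and value entries, as in the question; this is my own sketch, not part of the answer above):

function updateValue(state, values = []) {
  const items = state.get('items').map((item) => {
    // Look up a submitted value for this item's key, if there is one.
    const match = values.find((v) => v.key === item.get('key'));
    return match ? item.set('value', match.value) : item;
  });
  return state.set('items', items);
}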

how to get excel cell data changed event using new api or Excel object in office.js

I am trying to find out how to get the cell changed event using the Excel object
Excel.run(function (ctx) {
})
in Office 2016.
Is the context used by Office.context.document the same as the context used in the run function?
I found the answer to this.
The binding concept used earlier can still be used, as shown in this example: https://github.com/OfficeDev/office-js-docs/blob/master/reference/excel/bindingcollection.md
(function () {
  // Create myTable
  Excel.run(function (ctx) {
    var table = ctx.workbook.tables.add("Sheet1!A1:C4", true);
    table.name = "myTable";
    return ctx.sync().then(function () {
      console.log("MyTable is Created!");

      // Create a new table binding for myTable
      Office.context.document.bindings.addFromNamedItemAsync("myTable", Office.CoercionType.Table, { id: "myBinding" }, function (asyncResult) {
        if (asyncResult.status == "failed") {
          console.log("Action failed with error: " + asyncResult.error.message);
        }
        else {
          // If successful, add the event handler to the table binding.
          Office.select("bindings#myBinding").addHandlerAsync(Office.EventType.BindingDataChanged, onBindingDataChanged);
        }
      });
    })
    .catch(function (error) {
      console.log(JSON.stringify(error));
    });
  });

  // When data in the table is changed, this event is triggered.
  function onBindingDataChanged(eventArgs) {
    Excel.run(function (ctx) {
      // Highlight the table in orange to indicate data changed.
      var fill = ctx.workbook.tables.getItem("myTable").getDataBodyRange().format.fill;
      fill.load("color");
      return ctx.sync().then(function () {
        if (fill.color != "Orange") {
          ctx.workbook.bindings.getItem(eventArgs.binding.id).getTable().getDataBodyRange().format.fill.color = "Orange";
          console.log("The value in this table got changed!");
        }
      })
      .then(ctx.sync)
      .catch(function (error) {
        console.log(JSON.stringify(error));
      });
    });
  }
})();
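Separately, as an assumption on my part (not from the original answer): if your Office build supports a newer Excel JavaScript API requirement set (ExcelApi 1.7 or later), the worksheet also exposes an onChanged event, which avoids the need for a binding. A minimal sketch:

Excel.run(function (ctx) {
  // Assumes the ExcelApi 1.7 requirement set is available on this Office build.
  var sheet = ctx.workbook.worksheets.getItem("Sheet1");
  // Fires whenever data on the worksheet is changed.
  sheet.onChanged.add(function (eventArgs) {
    return Excel.run(function (innerCtx) {
      console.log("Changed range: " + eventArgs.address + ", change type: " + eventArgs.changeType);
      return innerCtx.sync();
    });
  });
  return ctx.sync();
}).catch(function (error) {
  console.log(JSON.stringify(error));
});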
