I am working on a React project which uses GraphQL for API integration, which I have done successfully.
Now I am working on a module where I need to use subscription.
What I am doing
I have charts to show on one UI, which I have created successfully using ReCharts.
So initially I am getting data from a query and displaying it on the UI, which works totally fine.
Then, as per the requirement, there is some live data which I am getting through a subscription, i.e. a WebSocket.
So what I am doing is: I store the query data in a state, and when I get the live data I append it to that state, and the UI updates correctly.
Issue I am facing
As I have many charts on the UI, I conditionally check which chart the incoming data is for and then append it to that particular one.
But when a lot of data comes in, arriving every 2-6 seconds, the browser hangs after some time, like after 10 or 15 minutes, which is a bit frustrating.
I don't know what is wrong here; maybe my approach is not good, or it is something else.
My code
// below query is for getting the list of factories
//const { loading: loadingFactoryy, data: factoryList } = useQuery( FACTORYLIST);
// from the above query I'll get data shaped like the state below
// as I am not connected to the backend, I am using static data
const [factoryList, setFactoryList] = useState([
{
factoryId: 16,
factoryName: "factory one",
__typename: "user_factoryNames"
},
{
factoryId: 17,
factoryName: "factory two",
__typename: "user_factoryNames"
},
{
factoryId: 18,
factoryName: "factory Three",
__typename: "user_factoryNames"
},
{
factoryId: 19,
factoryName: "factory four",
__typename: "user_factoryNames"
}
]);
My Machine code
// below query to get the machines for each factory; if I click on
// factory one it will give me all machines of factory one
// const [
// getMachines,
// { loading: loadingMachines, data: machineData },
// ] = useLazyQuery(FACTORYMACHINEBYID);
// I am using a lazy query so when I click I'll get the data
// useEffect(() => { // this will run when the page loads the first time
// if (factoryList !== undefined) {
// if (factoryList.getUserFactorydNames.length > 0) {
// getInitialDataFun({
// variables: {
// factoryId:
// factoryList?.getUserFactorydNames[parseInt(activeDashboard)]
// ?.factoryId,
// },
// });
// }
// }
// }, [active_tab, factoryId, getInitialDataFun]);
// all functions for factory clicks
let active_tab = localStorage.getItem("active_tab") || 0;
const [active_menu, setActive_menu] = useState(Number(active_tab));
const [machineList, setmachineList] = useState([
{
chartId: 12,
chartBame: "machine1",
data: [
{
dValue: 5,
dName: "Data1"
},
{
dValue: 10,
dName: "Data2"
}
]
},
{
chartId: 13,
chartBame: "machine2",
data: [
{
dValue: 5,
dName: "Data1"
},
{
dValue: 10,
dName: "Data2"
}
]
}
]);
My subscription code
// my subscription code
// const {
// data: LiveDataMachine,
// error: errorLiveDataMachine,
// } = useSubscription(LIVEDATAMACHINEFUN, {
// variables: { topic: { topic: 'livemessage/' } },
// });
const factoryClick = (li, ind) => {
setActive_menu(ind);
localStorage.setItem("active_tab", ind);
};
//My live data looks like below
// {
// "chartId": 16,
// "chartBame": "machine1",
// "data": [
// {
// "dValue": 7,
// "dName": "Data1"
// },
// {
// "dValue": 18,
// "dName": "Data2"
// }
// ]
// }
// So what I am doing is, whenever I get the live data,
// I loop over the main machine data and check the chartId;
// once it matches I append the data like below
// useEffect(() => {
// if(machineList){
// machineData.map((li,ind)=>{
// if(LiveDataMachine.chartId===li.chartId){
// setmachineList(machineList=>[...machineList,LiveDataMachine])
// }
// })
// }
// }, [LiveDataMachine]);
// but what is happening is it updates the UI so fast that my browser hangs
return (
<div className="App">
<FactoryComp
factoryData={factoryList}
active_menu={active_menu}
factoryClick={factoryClick}
/>
<hr />
<MachinesComp machineData={machineList} />
</div>
);
Above is how I am writing my code and managing the GraphQL subscription; I don't know what I am doing wrong that makes my browser hang so much.
Here I am looking for the correct approach to use a subscription together with my query.
Here is my working code; I have written everything as comments for better understanding.
It looks like you are calling setmachineList inside an if during render. That will endlessly trigger component updates, creating a cycle. Perhaps it is better to move this logic into a useEffect?
useEffect(() => {
  if (machineList && LiveDataMachine) {
    machineList.forEach((li) => {
      if (LiveDataMachine.chartId === li.chartId) {
        setmachineList(machineList => [...machineList, LiveDataMachine]);
      }
    });
  }
}, [LiveDataMachine]);
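Note that this still appends the whole LiveDataMachine object to machineList on every message, so the array grows without bound and every chart re-renders. A minimal sketch of an alternative, assuming the intent is to merge the incoming points into the matching chart and keep only a bounded history (MAX_POINTS is a hypothetical cap; the other names follow the question's code):

const MAX_POINTS = 50; // hypothetical cap on how many points each chart keeps

useEffect(() => {
  if (!LiveDataMachine) return; // subscription has not delivered anything yet

  setmachineList(prev =>
    prev.map(machine =>
      machine.chartId === LiveDataMachine.chartId
        ? {
            ...machine,
            // append the new points and drop the oldest ones beyond the cap
            data: [...machine.data, ...LiveDataMachine.data].slice(-MAX_POINTS),
          }
        : machine
    )
  );
}, [LiveDataMachine]);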
So I get some data through my socket.
The code on the client is:
useEffect(() => {
const socket = io("http://localhost:5000/api/socket");
socket.on("newThought", (thought) => {
console.log(thought);
});
}, []);
And then the code on my server is:
connection.once("open", () => {
console.log("MongoDB database connected");
console.log("Setting change streams");
const thoughtChangeStream = connection.collection("phonenumbers").watch();
thoughtChangeStream.on("change", (change) => {
io.of("/api/socket").emit("newThought", change);
});
});
When something in my "phonenumbers" collection gets changed, I get back the whole collection. How would I be able to get only the array that changed from the object in the collection?
So, for example, if the only service in the collection that changed is the one with id "607deefd13c4ebcbcfa0900a", that should be the only one returned, and not the whole collection object.
Setting the fullDocument option in the options (second) argument of the watch method to 'updateLookup' makes the change event for update operations carry the full updated document; the event also contains an updateDescription delta describing the changes to the document:
const thoughtChangeStream = connection.collection("phonenumbers").watch([], {
fullDocument: 'updateLookup'
});
thoughtChangeStream.on("change", (change) => {
io.of("/api/socket").emit("newThought", change);
});
This will then return a response document like this where updateDescription contains the fields that were modified by the update:
{
_id: {
_data: '8260931772000000012B022C0100296E5A1004ABFC09CB5798444C8126B1DBABB9859946645F696400646082EA7F05B619F0D586DA440004'
},
operationType: 'update',
clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 1, high_: 1620252530 },
ns: { db: 'yourDatabase', coll: 'yourCollection' },
documentKey: { _id: 6082ea7f05b619f0d586da44 },
updateDescription: {
updatedFields: { updatedField: 'newValue' },
removedFields: []
}
}
Note: This will only work for update operations and will not work for replace, delete, insert, etc.
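If only the modified fields are needed on the client, the handler can forward just that part of the change event rather than the whole thing. A small sketch, assuming the update-event shape shown above (the newThought event name and namespace are kept from the original code):

thoughtChangeStream.on("change", (change) => {
  if (change.operationType === "update") {
    // forward only the document key and the delta, not the whole change event
    io.of("/api/socket").emit("newThought", {
      _id: change.documentKey._id,
      updatedFields: change.updateDescription.updatedFields,
      removedFields: change.updateDescription.removedFields,
    });
  }
});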
See also:
http://mongodb.github.io/node-mongodb-native/3.0/api/Collection.html.
https://docs.mongodb.com/manual/reference/change-events/
Following is code I implemented to create a bar chart using Chart.js in a React app. Currently it creates a bar chart with all the data in an array. But I want to change this code so that the x-axis shows the destination and the y-axis shows the number of occurrences of that destination, since there are many repeated destinations.
I searched for ways to do this but I couldn't find a correct solution.
Can anyone help me do this?
const dataArrayY4 = [];
res.data.map(item => {
dataArrayY4.push(item.time)
})
const dataArrayX4 = []
res.data.map(item => {
dataArrayX4.push(item.destination)
})
this.setState({
data4: dataArrayY4,
labels4: dataArrayX4,
});
This could be done as follows:
const res = {
data: [
{ time: 1, destination: 'A'},
{ time: 3, destination: 'A'},
{ time: 2, destination: 'B'}
]
};
let tmp4 = [];
res.data.map((o, i) => {
const existing = tmp4.find(e => e.destination == o.destination);
if (existing) {
existing.time += o.time;
} else {
tmp4.push({time: o.time, destination: o.destination});
}
})
this.setState({
data4: tmp4.map(o => o.time),
labels4: tmp4.map(o => o.destination)
});
The above code could be optimized further by using Array.reduce() instead of Array.map(); a sketch follows below.
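A possible reduce version, kept as a sketch with the same field names as above:

const tmp4 = res.data.reduce((acc, o) => {
  const existing = acc.find(e => e.destination === o.destination);
  if (existing) {
    existing.time += o.time; // accumulate into the existing entry
  } else {
    acc.push({ time: o.time, destination: o.destination });
  }
  return acc;
}, []);

this.setState({
  data4: tmp4.map(o => o.time),
  labels4: tmp4.map(o => o.destination)
});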
I would make the code more efficient. Instead of dataArrayY4 being an array, I would make it an object keyed by value, holding the number of occurrences of each value. This way you can count the occurrences of all items in res.data:
const dataArrayY4 = {};
res.data.forEach(item => {
dataArrayY4[item.destination] = (dataArrayY4[item.destination] || 0) + 1
})
const dataArrayX4 = []
res.data.forEach(item => {
dataArrayX4.push(item.destination)
})
this.setState({
data4: dataArrayY4,
labels4: dataArrayX4,
});
Then, if you want to look up the number of occurrences of a particular value, e.g. Sri Lanka, you use:
this.state.data4['Sri Lanka']
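Note that labels4 here still contains every destination, including duplicates, while data4 is an object rather than an array. If the chart needs two parallel arrays instead, a small sketch derived from the counting object above (key order follows insertion order):

this.setState({
  labels4: Object.keys(dataArrayY4),   // each destination exactly once
  data4: Object.values(dataArrayY4),   // occurrence count for each destination, in the same order
});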
The old value comes back into the input field.
I had initialized this.props.house.rent (which is the old value coming back) with the input field value this.state.rent (the user-entered value), but I cannot do that as it is an anti-pattern; you can see it in the commented line of the code below.
cacheUpdate = (cache, { data: { updateHouse } }) => {
const data = cache.readQuery({
query: QUERY_CONFIRM_QUERY,
variables: { id: this.props.confirmId },
});
const houses = data.queryConfirm.houses;
const prevHouse = this.props.house;
//prevHouse.rent = this.state.rent; // this.state.rent is user entered input value
const updatedHouses = houses.map(house => {
if (house.id === prevHouse.id) {
const updatedHouseItem = _.pickBy(updateHouse, _.identity);
return { ...prevHouse, ...updatedHouseItem };
}
return house;
});
data.queryConfirm.houses = updatedHouses;
cache.writeQuery({
query: QUERY_CONFIRM_QUERY,
variables: {
id: this.props.confirmId,
},
data,
});
};
I want the old value to be removed from readQuery or writeQuery of cache.
Hope this helps someone. Oof, I overthought it; it was actually simple. I thought of using a callback, but that is not possible as the values come from the database. Anyway, in conclusion: simply look at the value of data.queryConfirm.houses in the console, pick out the required field, in my case data.queryConfirm.houses[0].rent, and assign this.state.rent to it before writing the query back. That's all.
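A minimal sketch of that idea, assuming the same query shape as in the cacheUpdate above and that this.state.rent holds the user-entered value (the _.pickBy merge is kept from the original code):

cacheUpdate = (cache, { data: { updateHouse } }) => {
  const data = cache.readQuery({
    query: QUERY_CONFIRM_QUERY,
    variables: { id: this.props.confirmId },
  });

  const updatedHouses = data.queryConfirm.houses.map(house => {
    if (house.id !== this.props.house.id) return house;
    const updatedHouseItem = _.pickBy(updateHouse, _.identity);
    // overwrite the stale rent with the user-entered value instead of mutating props
    return { ...house, ...updatedHouseItem, rent: this.state.rent };
  });

  cache.writeQuery({
    query: QUERY_CONFIRM_QUERY,
    variables: { id: this.props.confirmId },
    data: { ...data, queryConfirm: { ...data.queryConfirm, houses: updatedHouses } },
  });
};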
Currently I need to push a large CSV file into a MongoDB database, and the order of the values needs to determine the key for each DB entry:
Example CSV file:
9,1557,358,286,Mutantville,4368,2358026,,M,0,0,0,1,0
9,1557,359,147,Wroogny,4853,2356061,,D,0,0,0,1,0
Code to parse it into arrays:
var fs = require("fs");
var csv = require("fast-csv");
fs.createReadStream("rank.txt")
.pipe(csv())
.on("data", function(data){
console.log(data);
})
.on("end", function(data){
console.log("Read Finished");
});
Code Output:
[ '9',
'1557',
'358',
'286',
'Mutantville',
'4368',
'2358026',
'',
'M',
'0',
'0',
'0',
'1',
'0' ]
[ '9',
'1557',
'359',
'147',
'Wroogny',
'4853',
'2356061',
'',
'D',
'0',
'0',
'0',
'1',
'0' ]
How do I insert the arrays into my Mongoose schema to go into MongoDB?
Schema:
var mongoose = require("mongoose");
var rankSchema = new mongoose.Schema({
serverid: Number,
resetid: Number,
rank: Number,
number: Number,
name: String,
land: Number,
networth: Number,
tag: String,
gov: String,
gdi: Number,
protection: Number,
vacation: Number,
alive: Number,
deleted: Number
});
module.exports = mongoose.model("Rank", rankSchema);
The order of the array needs to match the order of the schema; for instance, the first number in the array, 9, always needs to be saved under the key "serverid", and so forth. I'm using Node.js.
You can do it with fast-csv by getting the headers from the schema definition, which will return the parsed lines as "objects". You actually have some mismatches, so I've marked them with corrections:
const fs = require('mz/fs');
const csv = require('fast-csv');
const { Schema } = mongoose = require('mongoose');
const uri = 'mongodb://localhost/test';
mongoose.Promise = global.Promise;
mongoose.set('debug', true);
const rankSchema = new Schema({
serverid: Number,
resetid: Number,
rank: Number,
name: String,
land: String, // <-- You have this as Number but it's a string
networth: Number,
tag: String,
stuff: String, // the empty field in the csv
gov: String,
gdi: Number,
protection: Number,
vacation: Number,
alive: Number,
deleted: Number
});
const Rank = mongoose.model('Rank', rankSchema);
const log = data => console.log(JSON.stringify(data, undefined, 2));
(async function() {
try {
const conn = await mongoose.connect(uri);
await Promise.all(Object.entries(conn.models).map(([k,m]) => m.remove()));
let headers = Object.keys(Rank.schema.paths)
.filter(k => ['_id','__v'].indexOf(k) === -1);
console.log(headers);
await new Promise((resolve,reject) => {
let buffer = [],
counter = 0;
let stream = fs.createReadStream('input.csv')
.pipe(csv({ headers }))
.on("error", reject)
.on("data", async doc => {
stream.pause();
buffer.push(doc);
counter++;
log(doc);
try {
if ( counter > 10000 ) {
await Rank.insertMany(buffer);
buffer = [];
counter = 0;
}
} catch(e) {
stream.destroy(e);
}
stream.resume();
})
.on("end", async () => {
try {
if ( counter > 0 ) {
await Rank.insertMany(buffer);
buffer = [];
counter = 0;
}
// resolve even when there was nothing left in the buffer
resolve();
} catch(e) {
stream.destroy(e);
}
});
});
} catch(e) {
console.error(e)
} finally {
process.exit()
}
})()
As long as the schema actually lines up with the provided CSV, it's okay. These are the corrections that I can see, but if you need the actual field names aligned differently then you need to adjust. Basically, there was a Number in the position where there is a String, and essentially an extra field, which I'm presuming is the blank one in the CSV.
The general things are getting the array of field names from the schema and passing that into the options when making the csv parser instance:
let headers = Object.keys(Rank.schema.paths)
.filter(k => ['_id','__v'].indexOf(k) === -1);
let stream = fs.createReadStream('input.csv')
.pipe(csv({ headers }))
Once you actually do that then you get an "Object" back instead of an array:
{
"serverid": "9",
"resetid": "1557",
"rank": "358",
"name": "286",
"land": "Mutantville",
"networth": "4368",
"tag": "2358026",
"stuff": "",
"gov": "M",
"gdi": "0",
"protection": "0",
"vacation": "0",
"alive": "1",
"deleted": "0"
}
Don't worry about the "types" because Mongoose will cast the values according to schema.
The rest happens within the handler for the data event. For maximum efficiency we are using insertMany() to only write to the database once every 10,000 lines. How that actually goes to the server and processes depends on the MongoDB version, but 10,000 should be pretty reasonable based on the average number of fields you would import for a single collection in terms of the "trade-off" for memory usage and writing a reasonable network request. Make the number smaller if necessary.
The important parts are to mark these calls as async functions and await the result of the insertMany() before continuing. Also we need to pause() the stream and resume() on each item otherwise we run the risk of overwriting the buffer of documents to insert before they are actually sent. The pause() and resume() are necessary to put "back-pressure" on the pipe, otherwise items just keep "coming out" and firing the data event.
Naturally the control for the 10,000 entries requires we check that both on each iteration and on stream completion in order to empty the buffer and send any remaining documents to the server.
That's really what you want to do, as you certainly don't want to fire off an async request to the server on "every" iteration of the data event, or essentially without waiting for each request to complete. You'll get away with not checking that for "very small files", but for any real world load you're certain to exceed the call stack due to "in flight" async calls which have not yet completed.
FYI, here is the package.json used. The mz package is optional, as it's just a modernized, Promise-enabled library of standard node "built-in" libraries that I'm simply used to using. The code is, of course, completely interchangeable with the fs module.
{
"description": "",
"main": "index.js",
"dependencies": {
"fast-csv": "^2.4.1",
"mongoose": "^5.1.1",
"mz": "^2.7.0"
},
"keywords": [],
"author": "",
"license": "ISC"
}
Actually, with Node v8.9.x and above we can make this much simpler with an implementation of AsyncIterator through the stream-to-iterator module. It's still in Iterator<Promise<T>> mode, but it should do until Node v10.x becomes stable LTS:
const fs = require('mz/fs');
const csv = require('fast-csv');
const streamToIterator = require('stream-to-iterator');
const { Schema } = mongoose = require('mongoose');
const uri = 'mongodb://localhost/test';
mongoose.Promise = global.Promise;
mongoose.set('debug', true);
const rankSchema = new Schema({
serverid: Number,
resetid: Number,
rank: Number,
name: String,
land: String,
networth: Number,
tag: String,
stuff: String, // the empty field
gov: String,
gdi: Number,
protection: Number,
vacation: Number,
alive: Number,
deleted: Number
});
const Rank = mongoose.model('Rank', rankSchema);
const log = data => console.log(JSON.stringify(data, undefined, 2));
(async function() {
try {
const conn = await mongoose.connect(uri);
await Promise.all(Object.entries(conn.models).map(([k,m]) => m.remove()));
let headers = Object.keys(Rank.schema.paths)
.filter(k => ['_id','__v'].indexOf(k) === -1);
//console.log(headers);
let stream = fs.createReadStream('input.csv')
.pipe(csv({ headers }));
const iterator = await streamToIterator(stream).init();
let buffer = [],
counter = 0;
for ( let docPromise of iterator ) {
let doc = await docPromise;
buffer.push(doc);
counter++;
if ( counter > 10000 ) {
await Rank.insertMany(buffer);
buffer = [];
counter = 0;
}
}
if ( counter > 0 ) {
await Rank.insertMany(buffer);
buffer = [];
counter = 0;
}
} catch(e) {
console.error(e)
} finally {
process.exit()
}
})()
Basically, all of the stream "event" handling and pausing and resuming gets replaced by a simple for loop:
const iterator = await streamToIterator(stream).init();
for ( let docPromise of iterator ) {
let doc = await docPromise;
// ... The things in the loop
}
Easy! This gets cleaned up in later Node implementations with for await...of when it becomes more stable. But the above runs fine from the specified version and above.
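For reference, a sketch of how the same loop might look with for await...of on Node 10+, where readable streams are async iterable (same stream, Rank model, and batching as above; not the original answer's code):

let buffer = [];
let counter = 0;

// Node 10+: the parser stream can be consumed directly as an async iterator
for await (const doc of stream) {
  buffer.push(doc);
  counter++;
  if (counter > 10000) {
    await Rank.insertMany(buffer);
    buffer = [];
    counter = 0;
  }
}

// flush whatever is left after the stream ends
if (counter > 0) {
  await Rank.insertMany(buffer);
}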
Unlike @Neil Lunn's answer, this approach needs a header line within the CSV itself.
Example using the csvtojson module:
const csv = require('csvtojson');
const csvArray = [];
csv()
.fromFile(filePath) // path to the CSV file
.on('json', (jsonObj) => {
csvArray.push({ name: jsonObj.name, id: jsonObj.id });
})
.on('done', (error) => {
if (error) {
return res.status(500).json({ error});
}
Model.create(csvArray)
.then((result) => {
return res.status(200).json({result});
}).catch((err) => {
return res.status(500).json({ error: err });
});
});