Issue with pushing data into firebase - reactjs

Here I'm trying to push data into an array in Firebase, but it keeps pushing the data continuously until the app's cache is destroyed. Here are my code and a Firebase screenshot.
Code:
var Input = {
  AaMessage: 'brb',
};
var query = firebase.database().ref('UserList/');
query
  .orderByChild('PostId')
  .equalTo(this.state.PostID)
  .on('value', snapshot => {
    snapshot.forEach(child => {
      firebase
        .database()
        .ref('UserList/' + child.key + '/Paymentdetails')
        .push(Input)
        .then(resp => {
          console.log('Done', resp);
        })
        .catch(err => {
          console.log('Error', err);
        });
    });
  });
Firebase view: (screenshot omitted)

You're opening a listener which writes to the same node it listens to. Even if that's not causing recursion, you're still writing a new doc for every child every single time your UserList is updated.
Also, avoid mixing lists and documents in a single Realtime Database node; that can only lead to pain.
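For instance, a flatter layout (these node names are illustrative, not from the question) keeps the pushed list out of the user documents:

{
  "UserList": {
    "<userKey>": { "PostId": "..." }
  },
  "Paymentdetails": {
    "<userKey>": {
      "<pushId>": { "AaMessage": "brb" }
    }
  }
}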
It's difficult to understand what you are trying to do -- but it looks like you might want to call once instead of on, so the listener doesn't stay open and keep writing (potentially lots of) new documents.
Additionally, I would recommend not writing to the node you're listening to.
database().ref("SomewhereElse/").push(doc);
I don't know why you would want to push new docs whenever the snapshot updates; you're going to get a lot of duplicates. If that was a mistake, you likely want to do those pushes in an onCreate trigger.
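Putting the suggestions together, a minimal sketch of the once-based version (same data layout as the question), so each matching child gets exactly one write per invocation:

firebase
  .database()
  .ref('UserList/')
  .orderByChild('PostId')
  .equalTo(this.state.PostID)
  .once('value')
  .then(snapshot => {
    snapshot.forEach(child => {
      firebase
        .database()
        .ref('UserList/' + child.key + '/Paymentdetails')
        .push({ AaMessage: 'brb' })
        .catch(err => console.log('Error', err));
    });
  });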

What could be the reason for Firebase showing huge unexplained amounts of reads?

I am developing a Chrome extension that reads data from Firestore. In the last few days I'd finish a day's work with a maximum of 50 reads; today, for some reason, it is showing a whopping 34K reads. How can I fix this?
This is my code:
useEffect(() => {
  const getNotes = () => {
    db.collection("Users")
      .doc(user.email)
      .collection("Notes")
      .get()
      .then((snapshot) => {
        const loadedNotes = snapshot.docs.map((docs) => {
          return {
            note: docs.data().note,
            id: docs.id,
          };
        });
        setNotes(Object.values(loadedNotes) ?? []);
      });
  };
  getNotes();
});
This function is written 3 times in 3 different components: for notes, todos, and upcoming events. The todos have an extra document field for completion state, and upcoming events have a field for the event date.
I also have Firebase Authentication in my project; I don't know if this matters or not.
Where are your useEffect dependencies? As written, this will run on EVERY render of the component... is that what you intended?
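A minimal sketch of the fix that comment is pointing at, assuming the notes should be re-fetched only when the signed-in user changes:

useEffect(() => {
  db.collection("Users")
    .doc(user.email)
    .collection("Notes")
    .get()
    .then((snapshot) => {
      setNotes(snapshot.docs.map((doc) => ({ note: doc.data().note, id: doc.id })));
    });
}, [user.email]); // dependency array: run on mount and when user.email changes, not on every render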
If you are using get() on collection references, every document in that collection is loaded every time the query runs, which is highly excessive on reads. You should be using limit(n) and pagination where needed and, if possible, the cache source when reading documents.
Ideally, you should use some sort of Redux or local-storage cache that you populate once on app load and read from, instead of making repeated trips to your Firestore collections.
https://firebase.google.com/docs/reference/android/com/google/firebase/firestore/Source
https://firebase.google.com/docs/firestore/query-data/order-limit-data
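For illustration, a hedged sketch combining both suggestions (the limit of 25 and the cache-first fallback are arbitrary choices, not from the answer):

const notesRef = db.collection("Users").doc(user.email).collection("Notes").limit(25);

notesRef
  .get({ source: "cache" })                              // try the local cache first
  .then((snap) => (snap.empty ? notesRef.get() : snap))  // fall back to the server if the cache is empty
  .catch(() => notesRef.get())                           // or if the cache read fails outright
  .then((snapshot) => {
    setNotes(snapshot.docs.map((doc) => ({ note: doc.data().note, id: doc.id })));
  });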

Updating multiple Firestore collections

I'm trying to update 2 Firestore collections that contain the same array element.
For example, I'm building a job app, so when a user creates a job, it pushes that job object into a collection called alljobs, under a document called alljobs. In addition, the same job is pushed to a collection called createdjobs, where each user on the app has their individual created jobs and each doc is named after the user's id.
Is there an easy way to update a specific job in both the alljobs and createdjobs collections?
For example, my current approach looks like this.
Individual Job component (obtained by previously mapping through all the jobs)
const [userjobs, setUserjobs] = useState([])
const { job, createdjobs } = props

function updateJob() {
  createdjobs?.map(job1 => {
    if (job1.jobid === job.jobid) {
      const jobindex = createdjobs.indexOf(job1)
      createdjobs[jobindex].jobtitle = 'New title'
      db.collection('createdjobs').doc(user.uid).update({
        jobs: createdjobs
      })
    }
  })
}
I'll basically have to repeat this same process, this time mapping through alljobs, to update the job that has just been updated in the createdjobs collection. This gets repetitive and messy, so I'm looking for a better solution.
useEffect(() => {
  db.collection('alljobs').doc('alljobs').onSnapshot(snap => {
    setAlljobs(snap.data().jobs)
  })
}, [])
I don't think there is a shortcut for your problem, but I suggest writing a sync function in Firebase Functions.
It will watch for changes in one source and sync them to the others, so your logic code only needs to manage one source of truth.
As #Thanh Le suggested, you can write a Google Cloud Function to fulfill this purpose. Cloud Functions has a function type named triggers, and you can use these triggers.
Cloud Function triggers
We can write functions which trigger automatically when the specified document, or set of documents, changes:
onCreate - triggered when a document is created.
onUpdate - triggered when a document already exists and has any value changed.
onDelete - triggered when a document is deleted.
onWrite - triggered when onCreate, onUpdate or onDelete fires.
Of these triggers, you can use onWrite to implement the function.
const functions = require('firebase-functions');
const admin = require('firebase-admin');

admin.initializeApp();

exports.updateAllJobsTrigger = functions.firestore
  .document('createdjobs/{userId}')
  .onWrite(async (change, context) => {
    // Check if the document was deleted
    if (!change.after.exists) {
      functions.logger.log('Document not existing. Function exited.');
      return;
    }
    const allJobsRef = admin.firestore().collection('alljobs').doc('alljobs');
    // Check if the document was just created
    if (!change.before.exists) {
      try {
        // This is a newly created document, so the new job is the first element
        const newJob = change.after.data().jobs[0];
        const data = (await allJobsRef.get()).data();
        if (data) {
          await allJobsRef.update({
            jobs: [...data.jobs, newJob]
          });
          functions.logger.info('Job added to all jobs queue.');
        }
      } catch (exception) {
        functions.logger.error(exception);
      }
      return;
    }
    try {
      // This is an updated document; the newly added job is the last element of the array
      const jobs = change.after.data().jobs;
      const newJob = jobs[jobs.length - 1];
      const data = (await allJobsRef.get()).data();
      if (data) {
        await allJobsRef.update({
          jobs: [...data.jobs, newJob]
        });
        functions.logger.info('Job added to all jobs queue.');
      }
    } catch (exception) {
      functions.logger.error(exception);
    }
  });
As #Nimna Perera said, you can use Cloud Functions to solve this issue. Your function should be triggered when a document is updated, created or deleted (so the onWrite option). Another way to do this is with transactions, when you need to read and write the documents, or batched writes, when you only need to write to one or more documents. In both cases you are not limited to a single collection, so either should work for your issue.
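For illustration, a hedged sketch of the client-side batched-write alternative (the updatedUserJobs and updatedAllJobs arrays are assumed to have been modified beforehand, as in the question's updateJob function):

// Write both documents atomically in a single batch.
const batch = db.batch();
batch.update(db.collection('createdjobs').doc(user.uid), { jobs: updatedUserJobs });
batch.update(db.collection('alljobs').doc('alljobs'), { jobs: updatedAllJobs });
batch.commit()
  .then(() => console.log('Both collections updated'))
  .catch(err => console.error(err));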

A more performance-friendly way to use setInterval() in componentDidMount when fetching data

I am struggling pretty hard here to find the right solution. Currently, I am using setInterval() to "poll" my server and retrieve an array of objects. To fetch the data, I am using axios. Here are the pertinent functions:
componentDidMount() {
  this.timer = setInterval(() => [this.getData(), this.getCustData()], 1000);
}

componentWillUnmount() {
  this.timer && clearInterval(this.timer);
  this.timer = false;
}

getData = () => {
  axios.get('http://localhost:3001/api/v1/pickup_deliveries')
    .then((response) => {
      this.setState({
        apiData: response.data
      });
    })
    .catch((error) => { console.log(error); });
}

getCustData = () => {
  axios.get('http://localhost:3001/api/v1/customers')
    .then((response) => {
      this.setState({
        custData: response.data
      });
    })
    .catch((error) => { console.log(error); });
}
The application is running slowly, and often it will completely hang the server, which makes the whole application unusable. Currently the array it's fetching has over 1,000 objects, and that number is growing daily. If I fetch the data without polling the server, the feel of my application is night and day. I am not quite sure what the answer is, but I do know what I am doing is NOT the right way.
Is this just the nature of mocking "polling" with setInterval() and it is what it is? Or is there a way to fetch data only when state has changed?
If I need to implement SSE or WebSockets, I will go through the hassle but I wanted to see if there was a way to fix my current code for better performance.
Thanks for hearing me out.
On the frontend side, my advice would be to not use setInterval, but setTimeout instead.
With setInterval, your app might send another request even if the response to the previous request hasn't come back yet (e.g. it took more than 1 second). Preferably, you should only send the next request 1 second after the previous response is received.
componentDidMount() {
  this.getData();
}

componentWillUnmount() {
  clearTimeout(this.timer); // stop rescheduling when the component goes away
}

getData = () => {
  axios.get('http://localhost:3001/api/v1/pickup_deliveries')
    .then((response) => {
      this.setState({ apiData: response.data });
      // schedule the next request only after this response has arrived
      this.timer = setTimeout(this.getData, 1000);
    })
    .catch((error) => { console.log(error); });
}
You should also try to reduce the amount of work done on the frontend for each update, for example by reducing the amount of data transferred.
But nevertheless, I think the most important is to rethink the design of your application. Polling huge JSON that is going to only grow bigger is not a scalable design. It's bad for both the server and the client.
If what you are trying to do is have the client be notified of any changes on the server side, you should look into WebSocket. A simple idea is that the browser establishes a WS connection to the server. Upon any update on the server, instead of sending down the whole dataset, the server sends only the delta to the client. The client then updates its own state.
For example, let's say 2 users are viewing the same page, and one user makes a change by adding a new Product. The server receives this request, updates the database accordingly, and also broadcasts a message to all open WebSocket connections (except the one connection that added the Product), containing a simple object like this:
{
  "action": "INSERT",
  "payload": {
    "product": {
      "id": 123123,
      ... // other product data
    }
  }
}
The other user will use this data to update its own state so it matches the server.
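A minimal sketch of what the receiving client could look like (the WS URL and the products state field are illustrative assumptions, not from the answer):

// Hypothetical endpoint; use your own server's WebSocket URL.
const socket = new WebSocket('ws://localhost:3001/updates');

socket.onmessage = (event) => {
  const message = JSON.parse(event.data);
  if (message.action === 'INSERT') {
    // Apply just the delta to local state instead of refetching everything.
    this.setState((prev) => ({
      products: [...prev.products, message.payload.product],
    }));
  }
};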

After axios call, how should I update the front-end? [I have 2 ways]

I have an array of todos. When I delete one of them, I successfully delete it from the database as well with a DELETE call. However, I am not sure how to update the front-end. The first way is changing the state by deleting the related todo.
onDelete(todo) {
  axios.delete('api/todos/' + todo.id).then(res => {
    var array = [...this.state.todos]; // make a separate copy of the array
    var index = array.indexOf(todo);
    array.splice(index, 1);
    this.setState({ todos: array }); // state elements other than todos will not be affected
  });
}
The other way is making a new axios GET request to fetch all the todos from the database (this will be an axios request inside an axios request).
onDelete(todo) {
  axios.delete('api/todos/' + todo.id).then(res => {
    // make an axios get request on api/todos,
    // then set state with the data in the response
  });
}
Hence, which one is the better approach?
The best approach would be the second one: make the GET request once more. That way, no matter what changes the database has undergone, the UI will be an exact replica of the data in the DB, instead of you manually changing (deleting, in this case) the list in the UI. However, there will be the delay of the API response before the UI updates in this case.
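A minimal sketch of that second approach, reusing the endpoints from the question:

onDelete(todo) {
  axios.delete('api/todos/' + todo.id)
    .then(() => axios.get('api/todos'))               // refetch the authoritative list
    .then(res => this.setState({ todos: res.data }))  // mirror the DB state in the UI
    .catch(err => console.log(err));
}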

share() vs ReplaySubject: Which one, and neither works

I'm trying to implement short-term caching in my Angular service -- a bunch of sub-components get created in rapid succession, and each one has an HTTP call. I want to cache them while the page is loading, but not forever.
I've tried the following two methods, neither of which has worked. In both cases, the HTTP URL is hit once for each instance of the component that is created. I want to avoid that: ideally, the URL would be hit once when the grid is created; then the cache expires, and the next time I need to create the component it hits the URL all over again. I pulled both techniques from other threads on Stack Overflow.
share() (in service)
getData(id: number): Observable<MyClass[]> {
  return this._http.get(this.URL)
    .map((response: Response) => <MyClass[]>response.json())
    .share();
}
ReplaySubject (in service)
private replaySubject = new ReplaySubject(1, 10000);

getData(id: number): Observable<MyClass[]> {
  if (this.replaySubject.observers.length) {
    return this.replaySubject;
  } else {
    return this._http.get(this.URL)
      .map((response: Response) => {
        let data = <MyClass[]>response.json();
        this.replaySubject.next(data);
        return data;
      });
  }
}
Caller (in component)
ngOnInit() {
  this.myService.getData(this.id)
    .subscribe((resultData: MyClass[]) => {
      this.data = resultData;
    },
    (error: any) => {
      alert(error);
    });
}
There's really no need to hit the URL each time the component is created -- they return the same data, and in a grid of rows that contain the component, the data will be the same. I could call it once when the grid itself is created and pass that data into the component, but I want to avoid that for two reasons. First, the component should be relatively self-sufficient: if I use the component elsewhere, I don't want the parent component to have to cache data there, too. Second, I want to find a short-term caching pattern that can be applied elsewhere in the application. I'm not the only person working on this, and I want to keep the code clean.
Most importantly, if you want something to persist even while Angular components are created and destroyed, it can't be created in the component itself; it has to live in a service that is shared among your components.
Regarding RxJS, you usually don't have to use ReplaySubject directly; just use publishReplay(1, 10000) followed by refCount() instead.
The share() operator is just a shorthand for publish() plus refCount(), which uses a plain Subject internally, which means it doesn't replay cached values.
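A sketch of that service-level cache, assuming the RxJS 5 operator-patching style already used in the question (the cachedData$ field name is illustrative):

private cachedData$: Observable<MyClass[]>;

getData(id: number): Observable<MyClass[]> {
  if (!this.cachedData$) {
    this.cachedData$ = this._http.get(this.URL)
      .map((response: Response) => <MyClass[]>response.json())
      .publishReplay(1, 10000) // replay the last value to new subscribers for up to 10 s
      .refCount();             // disconnect from the source when the last subscriber leaves
  }
  return this.cachedData$;
}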
