React/Flux: 1 Event Synchronously Updating Multiple Stores

React 0.14.0, Vanilla Flux Pattern
Question:
What if a triggered event needs to update 2 data structures that live in 2 separate stores?
This is a very fundamental pain that I feel with Flux.
You logically decompose your stores, and then one day you find yourself creating an event that unfortunately needs to simultaneously update two separate data structures living in two different stores. (crap)
Why that confuses me during development:
Please Correct Me If This Logic Is Wrong
As far as my understanding of Flux goes, we should not dispatch an action until the re-rendering caused by the previous action is complete. Therefore dispatching multiple synchronous actions to update a store (or multiple stores) is a Flux no-no.
My Solutions:
Crunch Stores Together -
I can move the data that needs to be updated into the same store in order to keep it to one action. (That sounds like over-complicating a store.)
Move State Server-Side -
I can keep track of some of the state server-side, then use asynchronous action calls that update the server first, after which the server pushes the update back to the store. (Sounds unRESTful and slow.)
I'm a dedicated React developer, and I'm open to any advice and/or correction that will help my understanding and let me build great React applications. Thanks.
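For reference, here is a minimal sketch (not from the original post) of how the vanilla Flux Dispatcher delivers a single dispatched action to every registered store, with waitFor() controlling ordering when one store depends on another; the action and store names are hypothetical:

```js
// One dispatched action can be handled by several stores; waitFor() lets a
// store defer until another store has processed the same action first.
import { Dispatcher } from 'flux';

const dispatcher = new Dispatcher();

// Hypothetical "user" store callback
const userStoreToken = dispatcher.register((action) => {
  if (action.type === 'ITEM_PURCHASED') {
    // update user-related data, e.g. decrement credits
  }
});

// Hypothetical "cart" store callback
dispatcher.register((action) => {
  if (action.type === 'ITEM_PURCHASED') {
    // ensure the user store has already handled this action
    dispatcher.waitFor([userStoreToken]);
    // update cart-related data
  }
});

// One event, one dispatch, two stores updated synchronously
dispatcher.dispatch({ type: 'ITEM_PURCHASED', itemId: 42 });
```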

Related

Tracking unused redux data in React components

I'm looking for a good way to track which props received by a component are not being used and can be safely removed.
In a system I maintain, our client single-page app fetches a large amount of data from some private endpoints in our backend services via redux saga. For most endpoints called, all data received is passed directly to our React components, no filtering applied. We are working to improve the overall system performance, and part of that process involves reducing the amount of data returned by our backend-for-frontend services, given those themselves call a large number of services to compose the returned JSON data, which adds to the overall response time.
Ideally, we want to make sure we only fetch the data we absolutely need and save the server from doing unnecessary calls and data normalization. So far, we've been trimming the backend services' data by doing code inspection: we inspect the returned data for each endpoint, then inspect the front-end code, and finally remove the data we identified (as a best guess) as unused. That's proven to be risky and inefficient: frequently we assume some data is unused, then months later find a corner case in which it was actually needed, and have to reverse the work. I'm looking for a smart, automated way to identify unused props in my app. Has anyone else had to work on something like that before? Ideas?
There's an existing library called https://github.com/aholachek/redux-usage-report, which wraps the Redux state in a Proxy to identify which pieces of state are actually being used.
That may be sufficiently similar to what you're doing to be helpful, or at least give you some ideas that you can take inspiration from.
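If you want to experiment before pulling in a library, here is a rough sketch of the Proxy idea (not that library's actual API): wrap the state handed to components so that property reads are recorded, then compare the recorded paths against the full state shape to spot data that is never read.

```js
// Record every property path that gets read from the wrapped state object.
const accessed = new Set();

function trackUsage(obj, path = '') {
  return new Proxy(obj, {
    get(target, prop) {
      const value = target[prop];
      const fullPath = path ? `${path}.${String(prop)}` : String(prop);
      accessed.add(fullPath);
      // Recurse into nested objects so deep reads are tracked too
      return value !== null && typeof value === 'object'
        ? trackUsage(value, fullPath)
        : value;
    },
  });
}

// e.g. during development, pass trackUsage(state) into mapStateToProps,
// exercise the app, then inspect `accessed` to see which paths were read.
```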

React-Redux ideal interaction with a database

Suppose there is a complex Redux store that determines the state of many components throughout the app.
What is the best pattern for when to save things to the DB? I see pros and cons to different approaches, but I am wondering what is standard for applications with a complex UI?
Save the store to the DB every time a change is made. (Makes it difficult to chase lots of instant vs. async processes... either lots of loading states and waiting, or juggling the store and the DB separately.)
Autosaving every now and then... (Allows the store to instantly determine the UI, faster... With occasional loading states.)
Manual saving... Ya, no thanks...
I recommend saving automatically every time a change is made, but use a "debounce" function so that you only save at most every X milliseconds (or whatever interval is appropriate for your situation).
Here is an example of a "debounce" function from lodash: https://lodash.com/docs/#debounce
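A minimal sketch of that approach, assuming a Redux store and a hypothetical saveToServer() persistence call:

```js
import { debounce } from 'lodash';

// Persist at most once per second, no matter how often the store changes.
const saveState = debounce((state) => {
  saveToServer(state); // placeholder for your own API/DB call
}, 1000);

store.subscribe(() => {
  saveState(store.getState());
});
```

The subscriber fires on every dispatch, but the debounce collapses rapid bursts of changes into a single save.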

Managing Server side and Client side state after POST and DELETE

So here is the issue I am facing and can't decide on. I would like to know what the best practice is and which option to choose.
Suppose we have a list of items: we use GET to fetch all the lists, POST for editing, and DELETE for deleting a list. After successful completion of either of these requests, what's the best practice to make sure the client state is in sync with the server state?
GET to fetch all the lists
let the server return all the lists after edit or delete operations,
or have the client trust the 200 OK and update the client state by modifying the lists locally.
This is very opinion-based, as there are lots of variables in play here. Ideally I'd always say, drop the client-state, and always re-get the data after updates. This of course completely depends on the cost/complexity of re-retrieving that state. If it is cheap/small, go ahead and re-GET the lists... it's far simpler than trying to maintain state synchronization. If you DO update a local cache, make sure it's a CACHE pattern, and not a REPOSITORY pattern. This means you shouldn't necessarily trust the cache for important operations, and that you should do things like verify the entities before operating on them.
I would not return updated lists from a POST/DELETE; returning the updated entities is fine, but the whole list should come from a subsequent GET. Anything else and you are not only breaking REST, but you are also defining the behavior of the client application in your service (which you don't want to do).
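As a small illustration of the "re-GET after mutation" approach described above (the /lists endpoint and response shape are hypothetical):

```js
// Delete one list, then re-fetch the authoritative collection from the
// server rather than patching the local copy based on the 200 alone.
async function deleteListAndRefresh(id) {
  const res = await fetch(`/lists/${id}`, { method: 'DELETE' });
  if (!res.ok) {
    throw new Error(`Delete failed with status ${res.status}`);
  }

  const listsRes = await fetch('/lists');
  if (!listsRes.ok) {
    throw new Error(`Refresh failed with status ${listsRes.status}`);
  }
  return listsRes.json(); // use this to replace the client-side state
}
```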

Bursts of data, redux buffering / react rerendering

Here is my data flow:
1: 500+ objects come through the socket in 1-2 s bursts
2: Objects are added directly to the redux store
3: A React table container is connected to the redux store (using it as its data source) and re-renders for every object.
That many re-renders pretty much kills the browser.
What options do I have to buffer incoming objects (events) and send batches to the reducer every, say, 1 s? An even better solution would be to somehow time-limit React rendering (shouldComponentUpdate...), but I doubt that's possible?
I would advise looking into RxJS, and redux-observable specifically. What you're looking for is the debounce operator, if I'm not mistaken.
RxJS has a pretty steep learning curve, but I have managed to set up a basic working solution pretty quickly.
I have been inspired by this talk and I encourage you to take a listen.
You could use a temporary cache in the reducer as local data. Maybe with a timeout before adding the data to the store object.
You can implement your own throttle behavior using setTimeout in shouldComponentUpdate, or use something ready-made like https://github.com/ryo33/react-throttle-render
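For a dependency-free version of the buffering idea, here is a minimal sketch (the socket object and action type are hypothetical): collect incoming objects and flush them to the store as one batched action every second, so connected components re-render once per flush instead of once per object.

```js
let buffer = [];

// Accumulate objects as they arrive from the socket
socket.on('data', (obj) => {
  buffer.push(obj);
});

// Flush the accumulated objects as a single action once per second
setInterval(() => {
  if (buffer.length > 0) {
    store.dispatch({ type: 'OBJECTS_RECEIVED_BATCH', payload: buffer });
    buffer = [];
  }
}, 1000);
```

The reducer then appends the whole array in one update, and the table container sees a single state change per interval.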

Should I be concerned with the rate of state change in my React Redux app?

I am implementing/evaluating a "real-time" web app using React, Redux, and Websocket. On the server, I have changes occurring to my data set at a rate of about 32 changes per second.
Each change causes an async message to the app using Websocket. The async message initiates a RECEIVE action in my redux state. State changes lead to component rendering.
My concern is that the frequency of state changes will lead to unacceptable load on the client, but I'm not sure how to characterize load against number of messages, number of components, etc.
When will this become a problem or what tools would I use to figure out if it is a problem?
Does the "shape" of my state make a difference to the rendering performance? Should I consider placing high change objects in one entity while low change objects are in another entity?
Should I focus my efforts on batching the change events so that the app can respond to a list of changes rather than each individual change (effectively reducing the rate of change on state)?
I appreciate any suggestions.
Those are actually pretty reasonable questions to be asking, and yes, those do all sound like good approaches to be looking at.
As a thought - you said your server-side data changes are occurring 32 times a second. Can that information itself be batched at all? Do you literally need to display every single update?
You may be interested in the "Performance" section of the Redux FAQ, which includes answers on "scaling" and reducing the number of store subscription updates.
Grouping your state partially based on update frequency sounds like a good idea. Components that aren't subscribed to that chunk should be able to skip updates based on React Redux's built-in shallow equality checks.
I'll toss in several additional useful links for performance-related information and libraries. My React/Redux links repo has a section on React performance, and my Redux library links repo has relevant sections on store change subscriptions and component update monitoring.
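To illustrate the state-grouping point above, here is a small sketch (the slice and component names are made up): keep high-frequency data under one key and low-frequency data under another, so components connected only to the slow slice get skipped by react-redux's shallow equality check.

```js
import React from 'react';
import { connect } from 'react-redux';

// Hypothetical component that only cares about the slow-changing slice
function SettingsPanel({ settings }) {
  return <pre>{JSON.stringify(settings, null, 2)}</pre>;
}

// Subscribe only to state.settings; when the high-frequency slice
// (e.g. state.liveTicks) changes, this returns the same settings
// reference and react-redux skips re-rendering the component.
const mapStateToProps = (state) => ({ settings: state.settings });

export default connect(mapStateToProps)(SettingsPanel);
```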

Resources