How to queue requests using react/redux?

I have a pretty weird case to handle.
We have a few boxes, and we can call an action on every box. When we click the button inside a box, we call an endpoint on the server (using axios). The response from the server returns new, updated information about all boxes, not only the one on which we called the action.
Issue:
If the user clicks the submit button on many boxes really fast, the requests call the endpoint one by one. This sometimes causes errors, because the calculations happen on the server in the wrong order (the status of a group of boxes depends on the status of a single box). I know this is arguably more of a backend issue, but I have to try to fix it on the frontend.
Proposed fix:
In my opinion, the easiest fix in this case is to disable every submit button while any request is in progress. Unfortunately this solution feels very slow, and the head of the project rejected the proposal.
What we want to achieve:
We want to queue the requests somehow, without disabling every button. The perfect solution for me at this moment:
click the first button - call the endpoint; the request is pending on the server.
click the second button - the button shows a spinner/loading state without calling the endpoint.
the server sends back the response for the first click - only then do we actually make the second request.
I think something like this is a huge antipattern, but I don't set the rules. ;)
I was reading about redux-observable, for example, but I don't want to add another Redux middleware if I don't have to (we currently use redux-thunk). redux-saga would be OK, but unfortunately I don't know that tool yet. I prepared a simple codesandbox example (I added timeouts in the redux actions for easier testing).
I have only one (probably silly) idea: create an array of the data needed to send the requests, and inside useEffect check whether the array length is equal to 1. Something like this:
import React, { useState, useEffect } from "react";

const App = ({ boxActions, inProgress, ended }) => {
  const [queue, setQueue] = useState([]);

  const handleSubmit = async () => { // this code does not work correctly, it only shows what I was thinking about
    if (queue.length === 1) {
      const [data] = queue;
      await boxActions.submit(data.id, data.timeout);
      setQueue(queue.filter((item) => item.id !== data.id));
    }
  };

  useEffect(() => {
    handleSubmit();
  }, [queue]);

  return (
    <>
      <div>
        {config.map((item) => (
          <Box
            key={item.id}
            id={item.id}
            timeout={item.timeout}
            handleSubmit={(id, timeout) => setQueue([...queue, { id, timeout }])}
            inProgress={inProgress.includes(item.id)}
            ended={ended.includes(item.id)}
          />
        ))}
      </div>
    </>
  );
};
Any ideas?

I agree with your assessment that we ultimately need to make changes on the backend. Any user can mess with the frontend and submit requests in any order they want, regardless of how you organize it.
I get it though, you're looking to design the happy path on the frontend such that it works with the backend as it is currently.
It's hard to tell without knowing the use-case exactly, but there may generally be some improvements we can make from a UX perspective that will apply whether we make fixes on the backend or not.
Is there an endpoint to send multiple updates to? If so, we could debounce our network call and submit only when there is a delay in user activity.
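For illustration, a debounced batch submit might look roughly like the sketch below. The batch endpoint, the 500ms delay, and the helper names are assumptions, not something from your sandbox:
// Sketch: collect box updates and send them in one request once the user
// pauses for 500ms. Assumes a batch endpoint exists on the backend (it may not).
import axios from "axios";
import debounce from "lodash.debounce";

let pendingUpdates = [];

const flushUpdates = debounce(async () => {
  const updates = pendingUpdates;
  pendingUpdates = [];
  await axios.post("/api/boxes/batch-submit", { updates }); // placeholder URL
}, 500);

export const queueBoxUpdate = (id, timeout) => {
  pendingUpdates.push({ id, timeout });
  flushUpdates();
};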
Does the user need to be aware of order of selection and the impacts thereof? If so, it sounds like we'll need to update frontend to convey this information, which may then expose a natural solution to the situation.
It's fairly simple to create a request queue and execute them serially, but it seems potentially fraught with new challenges.
E.g. If a user clicks 5 checkboxes, and order matters, a failed execution of the second update would mean we would need to stop any further execution of boxes 3 through 5 until update 2 could be completed. We'll also need to figure out how we'll handle timeouts, retries, and backoff. There is some complexity as to how we want to convey all this to the end user.
Let's say we're completely set on going that route, however. In that case, your use of Redux for state management isn't terribly important, nor is the library you use for sending your requests.
As you suggested, we'll just create an in-memory queue of updates to be made and dequeue serially. Each time a user makes an update to a box, we'll push to that queue and attempt to send updates. Our processEvents function will retain state as to whether a request is in motion or not, which it will use to decide whether to take action or not.
Each time a user clicks a box, the event is added to the queue, and we attempt processing. If processing is already ongoing or we have no events to process, we don't take any action. Each time a processing round finishes, we check for further events to process. You'll likely want to hook into this cycle with Redux and fire new actions to indicate event success, update the state and UI for each event processed, and so on. It's possible one of the libraries you use offers some feature like this as well.
// Get a better Queue implementation if queue size may get high.
class Queue {
  _store = [];
  enqueue = (task) => this._store.push(task);
  dequeue = () => this._store.shift();
  length = () => this._store.length;
}

export const createSerialProcessor = (asyncProcessingCallback) => {
  const updateQueue = new Queue();

  const addEvent = (params, callback) => {
    updateQueue.enqueue([params, callback]);
  };

  const processEvents = (() => {
    let isReady = true;
    return async () => {
      if (isReady && updateQueue.length() > 0) {
        const [params, callback] = updateQueue.dequeue();
        isReady = false;
        await asyncProcessingCallback(params, callback); // retries, backoff, etc. go in here
        isReady = true;
        processEvents();
      }
    };
  })();

  return {
    process: (params, callback) => {
      addEvent(params, callback);
      processEvents();
    }
  };
};
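To tie this into the Redux side, the thunk could hand its work to the processor, something like the sketch below. The submitRequest/submitSuccess/submitFailure action creators are the ones your sandbox already uses; the endpoint URL and import paths are placeholders:
// boxActions.js — rough sketch, not the exact sandbox code
import axios from "axios";
import { createSerialProcessor } from "../lib/createSerialProcessor";
import { submitRequest, submitSuccess, submitFailure } from "./boxActionCreators"; // assumed path

// One shared processor: box submissions go to the server strictly one at a time.
const processor = createSerialProcessor(async (params, callback) => {
  try {
    await axios.post("/api/boxes/submit", { id: params.id }); // placeholder endpoint
    callback(null);
  } catch (error) {
    callback(error);
  }
});

export const submit = (id, timeout) => (dispatch) => {
  // Mark the box as pending immediately so its button can show a spinner.
  dispatch(submitRequest(id));
  processor.process({ id, timeout }, (error) =>
    dispatch(error ? submitFailure() : submitSuccess(id))
  );
};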
Hope this helps.
Edit: I just noticed you included a codesandbox, which is very helpful. I've created a copy of your sandbox with updates made to achieve your end and integrate it with your Redux setup. There are some obvious shortcuts still being taken, like the Queue class, but it should be about what you're looking for: https://codesandbox.io/s/dank-feather-hqtf7?file=/src/lib/createSerialProcessor.js

In case you would like to use redux-saga, you can use the actionChannel effect in combination with the blocking call effect to achieve your goal:
Working fork:
https://codesandbox.io/s/hoh8n
Here is the code for boxSagas.js:
import {actionChannel, call, delay, put, take} from 'redux-saga/effects';
// import axios from 'axios';
import {submitSuccess, submitFailure} from '../actions/boxActions';
import {SUBMIT_REQUEST} from '../types/boxTypes';

function* requestSaga(action) {
  try {
    // const result = yield axios.get(`https://jsonplaceholder.typicode.com/todos`);
    yield delay(action.payload.timeout);
    yield put(submitSuccess(action.payload.id));
  } catch (error) {
    yield put(submitFailure());
  }
}

export default function* boxSaga() {
  const requestChannel = yield actionChannel(SUBMIT_REQUEST); // buffers incoming requests
  while (true) {
    const action = yield take(requestChannel); // takes a request from queue or waits for one to be added
    yield call(requestSaga, action); // starts request saga and _waits_ until it is done
  }
}
I am using the fact that the box reducer handles the SUBMIT_REQUEST actions immediately (and sets the given id as pending), while the actionChannel+call handles them sequentially, so the actions trigger only one HTTP request at a time.
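For reference, the box reducer mentioned above might look roughly like the sketch below. The inProgress/ended arrays match what the Box components read; the SUBMIT_SUCCESS/SUBMIT_FAILURE type names and payload shapes are assumptions:
// boxReducer.js — assumed shape, not the exact sandbox code
import { SUBMIT_REQUEST, SUBMIT_SUCCESS, SUBMIT_FAILURE } from "../types/boxTypes";

const initialState = { inProgress: [], ended: [] };

export default function boxReducer(state = initialState, action) {
  switch (action.type) {
    case SUBMIT_REQUEST:
      // Runs immediately on click, so the box shows its spinner even while
      // the request is still queued in the action channel.
      return { ...state, inProgress: [...state.inProgress, action.payload.id] };
    case SUBMIT_SUCCESS:
      // Assumes submitSuccess(id) puts the id directly in the payload.
      return {
        ...state,
        inProgress: state.inProgress.filter((id) => id !== action.payload),
        ended: [...state.ended, action.payload],
      };
    case SUBMIT_FAILURE:
      return { ...state, inProgress: [] };
    default:
      return state;
  }
}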
More on action channels here:
https://redux-saga.js.org/docs/advanced/Channels/#using-the-actionchannel-effect

Just store the promise from the previous request and wait for it to resolve before initiating the next request. The example below uses a global variable for simplicity, but you can use something else to preserve state across requests (e.g. extraArgument from the thunk middleware).
// boxActions.ts
let submitCall = Promise.resolve();

export const submit = (id, timeout) => async (dispatch) => {
  dispatch(submitRequest(id));
  submitCall = submitCall.then(() => axios.get(`https://jsonplaceholder.typicode.com/todos`));
  try {
    await submitCall;
    setTimeout(() => {
      return dispatch(submitSuccess(id));
    }, timeout);
  } catch (error) {
    return dispatch(submitFailure());
  }
};
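If you prefer not to keep the chain in a module-level variable, a sketch of the extraArgument variant could look like this. It assumes the classic redux-thunk API (thunk.withExtraArgument); the store file layout and the .catch that keeps the queue alive after a failure are my additions, and axios plus the action creators are imported as in the snippet above:
// store.js — sketch
import { createStore, applyMiddleware } from "redux";
import thunk from "redux-thunk";
import rootReducer from "./reducers";

const requestChain = { current: Promise.resolve() };
export const store = createStore(
  rootReducer,
  applyMiddleware(thunk.withExtraArgument(requestChain))
);

// boxActions.ts — the thunk now reads the chain from its third argument
export const submit = (id, timeout) => async (dispatch, getState, requestChain) => {
  dispatch(submitRequest(id));
  requestChain.current = requestChain.current
    .catch(() => {}) // a failed request should not block the rest of the queue
    .then(() => axios.get(`https://jsonplaceholder.typicode.com/todos`));
  try {
    await requestChain.current;
    dispatch(submitSuccess(id));
  } catch (error) {
    dispatch(submitFailure());
  }
};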

Related

How to clear & invalidate cache data using RTK Query?

I have been facing a problem for some time: I'm unable to clear the cache using RTK Query.
I tried various ways, but the cache data is not cleared.
I used invalidatesTags in my mutation query and it called the API instantly. But in this case I want to refetch multiple APIs again, and not from an RTK query or mutation; I want to make the API call after some user activity, like a click.
How can I solve this problem?
I made a separate function where I return api.util.invalidateTags(tag) or api.util.resetApiState().
This is my code snippet:
const api = createApi({ ..... })

export const resetRtkCache = (tag?: string[]) => {
  if (tag) {
    return api.util.invalidateTags(tag)
  } else {
    return api.util.resetApiState()
  }
}
And I call it using the dispatch method from other files:
const reloadData = () => {
  dispatch(resetRtkCache())
}
But here the cache data is not removed. I think the dispatch function is not working; I don't see the API call being sent to the server in the browser's network tab.
"But in this case I want to refetch multiple APIs again, and not from an RTK query or mutation. I want to make the API call after some user activity, like a click. How can I solve this problem?"
So if I understood correctly, what you want to achieve is to fetch some API that you have in RTK Query only after some kind of user interaction?
Can't you just define something like this?
const { data } = useGetYourQuery(arg, { skip: skipUntilUserInteraction })
where skipUntilUserInteraction is a component state variable that starts as true and is set to false on the user interaction you need (e.g. a click of a button). Note that skip goes in the options object, which is the second argument of the generated query hook, after the query argument.
So essentially, on the initial render that specific endpoint will be skipped, but it will be fetched after the interaction you want happens.
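A minimal sketch of that idea (the hook name, api module, and endpoint are placeholders):
import React, { useState } from "react";
import { useGetYourQuery } from "./api"; // placeholder endpoint hook

const MyComponent = () => {
  // Skip the query until the user clicks the button.
  const [skip, setSkip] = useState(true);
  const { data, isFetching } = useGetYourQuery(undefined, { skip });

  return (
    <div>
      <button type="button" onClick={() => setSkip(false)}>
        Load data
      </button>
      {isFetching ? <span>Loading...</span> : <pre>{JSON.stringify(data)}</pre>}
    </div>
  );
};

export default MyComponent;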
Wow, you're actually asking so many questions at once, but I think you should definitely read the documentation because it covers all the questions you have.
So, trying to answer your questions one by one:
"I used invalidatesTags in my mutation query and it called the API instantly."
Invalidating with tags is one of the ways to clear the cache.
You should first set the tagTypes for your API, then use those tags in your mutation queries to tell RTK Query which entities you want to clear.
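For example (the endpoint names, URLs, and tag names here are purely illustrative):
import { createApi, fetchBaseQuery } from "@reduxjs/toolkit/query/react";

export const api = createApi({
  baseQuery: fetchBaseQuery({ baseUrl: "/api" }),
  tagTypes: ["Post"], // 1. declare the tag types the API knows about
  endpoints: (builder) => ({
    getPosts: builder.query({
      query: () => "/posts",
      providesTags: ["Post"], // 2. queries provide tags...
    }),
    addPost: builder.mutation({
      query: (body) => ({ url: "/posts", method: "POST", body }),
      invalidatesTags: ["Post"], // 3. ...and mutations invalidate them, refetching those queries
    }),
  }),
});

export const { useGetPostsQuery, useAddPostMutation } = api;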
"I want to refetch multiple APIs again"
You can customize the query inside a mutation or query (as in this example), and by calling one query function you can send multiple requests at once. If you want to fetch the API again after the cache is removed, you don't need to do anything, because RTK Query will do it for you.
"I want to make the API call after some user activity like click"
Every mutation gives you a trigger function that you can pass to onClick, like below:
import { use[Mymutation]Mutation } from 'features/api';

const MyComponent = () => {
  const [myMutationFunc, { isLoading }] = use[Mymutation]Mutation();

  return <button type='button' onClick={myMutationFunc}>Click to call the mutation</button>;
};
And remember: if you set providesTags on your endpoints using the tags you defined in tagTypes, then clicking the button and firing myMutationFunc (with matching invalidatesTags) will clear the cache for those tags.
And if you're looking for an optimistic update of the cache, you can find your answer here:
async onQueryStarted({ id, ...patch }, { dispatch, queryFulfilled }) {
  const patchResult = dispatch(
    api.util.updateQueryData('getPost', id, (draft) => {
      Object.assign(draft, patch)
    })
  )
  try {
    await queryFulfilled
  } catch {
    patchResult.undo()
  }
}

Mapping State in React with Server Requests

I'm new to React, as we are trying to migrate our app from AngularJS. One thing I'm struggling to wrap my head around is the best way to make and cache state-mapping requests.
Basically, I do a search that returns a list of objects, and one of the fields is a status code (e.g. 100, 200, 300, etc.), some number. To display the result, I need to map that number to a string, and we do that with an HTTP request to the server, something like this:
GET /lookup/:stateId
So my problem now is:
1. I have a list of results but not many different states. How can I make that async call (useEffect?) so the lookup happens only once per distinct stateId? Right now I can get it to work, but the request is made on every single mapping. I'm putting the Axios call in a utility function to try to reuse it across multiple pages doing similar things, but is that the "React" way? In AngularJS, we used a "|" filter to map the code to text.
2. Once I have that id => string mapping, I want to store it in a cache so the next component that needs it no longer makes the HTTP request. Right now I put the "cache" in an application-level context and use dispatch to update/add values to the cache. Is that efficient? It appears that if I do a language change (I keep the language in the same application context state), the cache gets re-initialized, and I'm not sure what else would reset it. In AngularJS, we used the $rootState to 'cache'.
Thanks for any pointers!
In a lookupUtil.js
const DoLookupEntry = async (entryId) => {
  const lookupUrl = `/lookup/${entryId}`;
  try {
    const response = await Axios.get(lookupUrl);
    return response.data;
  } catch (expt) {
    console.log('error [DoLookupEntry]:', expt);
  }
}
In a formatUtils.js
const formatLookupValue = (entryId) => {
  const appState = useContext(AppContext);
  const appDispatch = useContext(DispatchContext);
  const language = appState.language;

  if (appState.lookupCache
      && appState.lookupCache[entryId]
      && appState.lookupCache[entryId][language]) {
    // return cached value
    const entry = appState.lookupCache[entryId][language];
    return entry.translatedValue;
  }

  // DoLookup is async, but we are not, so we want to wait...
  DoLookupEntry(entryId)
    .then((entry) => { // try to save to cache when value returns
      appDispatch({
        type: States.APP_UPDATE_LOOKUP_CACHE,
        value: { language, entry }
      });
      return entry.translatedValue;
    });
}
And finally, results.js displays the results along these lines (using formatLookupValue to map the id):
{searchState.pageResults.map((item) => {
  return (
    <tr key={item.id}>
      <td><Link to={`/getItem/${item.id}`}>{item.title}</Link></td>
      <td>{item.detail}</td>
      <td>{formatLookupValue(item.stateId)}</td>
    </tr>
  )
})}
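(For context, the application-level reducer branch that handles APP_UPDATE_LOOKUP_CACHE presumably looks something like the sketch below; it assumes the entry returned from the lookup carries its own id.)
// appReducer.js — sketch of the cache-update branch (assumes entry.id exists)
const appReducer = (state, action) => {
  switch (action.type) {
    case States.APP_UPDATE_LOOKUP_CACHE: {
      const { language, entry } = action.value;
      return {
        ...state,
        lookupCache: {
          ...state.lookupCache,
          [entry.id]: {
            ...(state.lookupCache[entry.id] || {}),
            [language]: entry, // cached per entry id and per language
          },
        },
      };
    }
    default:
      return state;
  }
};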

React - Any way to cancel (or ignore) async action on 'OnChange' handler?

I have some code here: https://codesandbox.io/embed/cranky-glade-82leu
Here I have a 'verifyUserNameAvailable' function under the 'api/' folder. In this function, I added a 10-second delay to test the extreme case where our API server is slow.
When the user enters a valid email address (valid email format, like abc@test.net), I see the email address on the helperText line 10 seconds later. (This is because I added the 10-second delay in 'verifyUserNameAvailable'.)
When the user enters three emails, 'abc@test.net', 'abc2@test.net' and 'abc3@test.net', one by one without waiting 10 seconds, I see that all three emails are displayed in sequential order on the helperText line. (I guess this is expected given what I have in the code now.)
My question: I want the helperText line to show only the latest one. When the user enters the above 3 email addresses and 'abc3@test.net' is the last one in the textField, I want to show only 'abc3@test.net' on the helperText line, not the others.
Is there any way to check what's actually in the textField and discard or abort all irrelevant async requests?
There's a way to do this without cancelling the verifyUserNameAvailable promise: Keep track of the number of requests in flight, and ignore any responses other than the latest one.
Here's what that could look like in your example; this uses useRef to keep track of how many requests are in flight (see the React docs for useRef to see how this works: https://reactjs.org/docs/hooks-reference.html#useref):
export const EmailTextField = props => {
  const { onStateChange } = props;
  const [state, setState] = useState({
    errors: [],
    onChange: false,
    pristine: true,
    touched: false,
    value: null
  });

  // `useRef` holds a mutable value that lasts as long
  // as the component, staying "more up to date" than the
  // current closure.
  const requestsInFlight = useRef(0);

  const helperText = "Email address will be used as your username.";

  const handleBlur = async event => {
    // Email Validation logic
    const emailAddress = event.target.value;
    const matches = event.target.value.match(
      `[a-z0-9._%+-]+@[a-z0-9.-]+.[a-z]{2,3}`
    );
    if (matches) {
      // If there's a match, increment the ref for in-flight requests
      requestsInFlight.current += 1;
      await verifyUserNameAvailable(emailAddress);
      // After the response comes back, decrement the ref, and if
      // there are any requests still in flight, ignore that response.
      requestsInFlight.current -= 1;
      if (requestsInFlight.current > 0) {
        return;
      }
      const updatedState = {
        ...state,
        touched: true,
        value: emailAddress,
        errors: [emailAddress]
      };
      setState(updatedState);
      onStateChange(updatedState);
    } else {
      ...
    }
  };
Let's think about possible solutions, starting from the deepest level.
Internally you are calling setTimeout. A timeout can be cancelled with clearTimeout. I don't think this is what you're looking for, so I won't describe how to implement it.
The setTimeout is wrapped in a Promise. Unfortunately, Promises are not cancelable in the current version of JS; there is a proposal for it.
I suspect that you'll use this code with some API to fetch data from the backend. If you use axios, it provides cancellation of pending requests using a cancellation token (cancellation is based on the cancellation proposal for JS); see the sketch below.
And if you use Redux in your app, you may consider redux-saga for backend requests; it also supports cancellation.
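As a sketch of the axios route: recent axios versions also accept an AbortController signal, which makes it easy to drop a stale username check when a newer one starts. The endpoint, params, and hook name here are made up:
import { useRef } from "react";
import axios from "axios";

// Cancels the previous username check whenever a new one starts.
export const useUserNameCheck = () => {
  const controllerRef = useRef(null);

  const verifyUserNameAvailable = async (emailAddress) => {
    // Abort the request that is still in flight, if any.
    if (controllerRef.current) controllerRef.current.abort();
    controllerRef.current = new AbortController();
    try {
      const response = await axios.get("/api/username-available", {
        params: { email: emailAddress }, // placeholder endpoint and params
        signal: controllerRef.current.signal,
      });
      return response.data;
    } catch (error) {
      if (axios.isCancel(error)) return null; // stale request: ignore it
      throw error;
    }
  };

  return verifyUserNameAvailable;
};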

Kick off separate redux-saga on login and logout

I'm learning redux-saga and having a bit of trouble wrapping my head around the correct flow for connecting people to a chat service (Chatkit by Pusher) when they log in and disconnecting them on logout.
So far I have an "auth" saga which waits for a LOGIN_REQUEST action, logs in to a REST API using axios, then stores a username and token in the store by dispatching a USER_SET action.
My question is: when the login happens and the credentials are stored, should I PUT a new action called something like CHAT_CONNECT, which would kick off another saga to connect to Chatkit, or should I get the chat saga to listen for LOGIN_SUCCESS being fired and act on that? Is there even any practical difference between these two approaches?
As a bonus question, what's the best way to receive and act on new websocket messages from Chatkit using redux-saga? Here's the boilerplate code for connecting and receiving events from Chatkit:
chatManager
  .connect()
  .then(currentUser => {
    currentUser.subscribeToRoom({
      roomId: currentUser.rooms[0].id,
      hooks: {
        onNewMessage: message => {
          console.log(`Received new message: ${message.text}`)
        }
      }
    });
  })
  .catch(error => {
    console.error("error:", error);
  })
Regarding your first question:
My question is, when the login happens and the credentials are stored, should I PUT a new action called something like CHAT_CONNECT which would kick off another saga to connect to Chatkit, or should I get the chat saga to listen to the LOGIN_SUCCESS being fired and act on that?
With the information provided it's difficult to decide which approach is ideal, because either will accomplish the same functionality. The biggest difference I see between the two proposed approaches is the direction of dependency. You have two different "modules" (features, packages, ...whatever you call your chunks of code that handle a single responsibility); let's call them log-in and connect-chat.
If you dispatch a CHAT_CONNECT action from within the log-in saga, your log-in module will be dependent on the connect-chat module. Presumably, the CHAT_CONNECT action will live in the connect-chat module.
Alternatively, if your connect-chat saga waits for LOGIN_SUCCESS, then your connect-chat module will be dependent on your log-in module. Presumably, the LOGIN_SUCCESS action will live in the log-in module.
There's nothing wrong with either approach. Which is best depends on your applications needs and functionality.
If you might want to connect to chat at any other time than after successfully logging in, it might make sense to dispatch CHAT_CONNECT from within your log-in saga, because then chat is no longer dependent on log-in. There are several scenarios where either approach will work better than the other, but it really depends on how the rest of your application is set up.
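As a concrete sketch of the second option (the connect-chat saga listening for LOGIN_SUCCESS), something like the following could work. The module paths, the action payload shape, and the connectToChatkit helper are assumptions:
// connect-chat/saga.js — sketch
import { call, takeLatest } from "redux-saga/effects";
import { LOGIN_SUCCESS } from "../log-in/types"; // assumed path
import { connectToChatkit } from "./chatService"; // assumed wrapper around chatManager.connect()

function* connectChatSaga(action) {
  // action.payload is assumed to carry the username/token stored at login
  yield call(connectToChatkit, action.payload);
}

export function* watchLoginSuccess() {
  // Every successful login (re)connects to the chat service.
  yield takeLatest(LOGIN_SUCCESS, connectChatSaga);
}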
Regarding your bonus questions:
One approach to hooking up external events in redux-saga is via eventChannel. Docs: https://redux-saga.js.org/docs/api/#eventchannelsubscribe-buffer-matcher
There's a bit of boilerplate, but I found this approach makes testing easier and truly encapsulates external functionality. Here's a quick example of how I might hook up an event channel to the code snippet you provided:
export const createOnMessageChannel = () =>
  eventChannel((emit) => {
    chatManager
      .connect()
      .then(currentUser => {
        currentUser.subscribeToRoom({
          roomId: currentUser.rooms[0].id,
          hooks: {
            onNewMessage: message => emit({ message }),
          }
        });
      })
      .catch(error => emit({ error }));

    return () => {
      // Code to unsubscribe, e.g. chatManager.disconnect() ?
    };
  });

export function* onMessage({ message, error }) {
  if (error) {
    yield put(handleError(error));
    return;
  }
  yield put(handleMessage(message));
}

// this is what you pass to your root saga
export function* createOnMessageSaga() {
  // using call because this makes it easier to test
  const channel = yield call(createOnMessageChannel);
  if (!channel) return;
  yield takeEvery(channel, onMessage);
}

Strange behavior in react/redux

In my React/Redux app, I make a backend API call to create an entry in a calendar. This is initiated in my handler function which calls the action creator.
Once this initial step is done, I check to see if the entry the user has just created has the same date as the current date my calendar component showing. If so, I call the backend API to get calendar events. I do this to refresh the calendar.
As I step through the process, everything seems to be working fine BUT my calendar does not show updated data.
Here comes the weird part: as I step through this process, everything works and the calendar updates fine. In other words, if I somehow slow down the process, everything seems to be working perfectly fine.
If I don't slow down the process, the calendar fails to update. There are no errors. And as I said, as I step through the process, I see that the API returns correct data, the action creator for SET_CALENDAR_EVENTS gets called, which then calls the reducer, and the reducer sets the data.
Like I said, there are no problems except if I let it happen without slowing down the process, the calendar doesn't update.
Any idea what's causing this? Any suggestions?
My handler function code looks like this:
clickHandleCreateEvent(event) {
  // Call API
  this.props.actions.createEvent(event);

  // Get active date
  const activeDate = this.props.activeDate;
  if (activeDate === event.eventDate) {
    this.props.actions.getCalendarEvents(activeDate);
  }
}
UPDATE:
Here's my getCalendarEvents function:
export const getCalendarEntries = (calendarId, date) => {
  // Create get calendar entries object
  var request = {
    id: calendarId,
    date: date
  };

  // Get calendar entries
  return (dispatch) => fetch('/api/calendars/entries', fetchOptionsPost(request))
    .then((response) => {
      if (response.ok) {
        // Got events
        parseJSON(response)
          .then(entries => {
            dispatch(setEvents(entries))
          })
          .then(() => dispatch(setCalendarIsLoading(false)))
      } else {
        // Couldn't get data
        dispatch(setBadRequest(true))
      }
    })
}
Since both createEvent and getCalendarEvents are async functions involving network communication, there is no guarantee which request reaches the server first, so you might read old data while the createEvent request is still travelling over the wire.
To avoid this you need to synchronize both requests, i.e. call getCalendarEvents only after the server has responded OK to the createEvent request:
clickHandleCreateEvent(event) {
  // Call API
  return this.props.actions
    .createEvent(event)
    .then(() => {
      // Get active date
      const activeDate = this.props.activeDate;
      if (activeDate === event.eventDate) {
        return this.props.actions.getCalendarEvents(activeDate)
      }
    })
}
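For the chained .then() above to fire at the right time, the createEvent thunk has to return its request promise, along these lines (a sketch modeled on your getCalendarEntries thunk; the endpoint and the eventCreated action creator are assumptions):
// Sketch: createEvent returns the fetch promise, so the component's .then()
// runs only after the server has accepted the new entry.
export const createEvent = (event) => (dispatch) =>
  fetch('/api/calendars/entries/create', fetchOptionsPost(event)) // assumed endpoint
    .then((response) => {
      if (response.ok) {
        return parseJSON(response).then((entry) => dispatch(eventCreated(entry)));
      }
      // Couldn't create the event
      dispatch(setBadRequest(true));
    });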
