Timeout on Axios Requests - reactjs

Our site currently has a filter feature that fetches new data via axios depending on what is being filtered.
The issue is that the filtering happens in real time, and every change made via React triggers an axios request.
Is there a way to put a timeout on the axios request so that I only fetch the last state?

I would suggest using debounce in this case, so the API call is only triggered once the user has paused typing for a specified number of milliseconds.
But in case you also need to add a timeout to the axios call itself, that can be achieved like this:
instance.get('/longRequest', {
timeout: 5000
});
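And for the debounce suggestion itself, a minimal sketch (assuming lodash's debounce; the endpoint and handler names are illustrative):

import axios from 'axios';
import debounce from 'lodash/debounce';

// created once so every keystroke reuses the same debounced function
const fetchFiltered = debounce((filterValue) => {
  axios
    .get('/api/items', { params: { filter: filterValue }, timeout: 5000 }) // illustrative endpoint
    .then(({ data }) => {
      // update component state with the latest results here
    });
}, 300); // wait 300 ms after the last change before firing the request

// in the filter input's onChange handler:
// onChange={(e) => fetchFiltered(e.target.value)}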

The problem has two parts.
The first part is debouncing, which is standard for event listeners that can fire often, especially when handling them is expensive or may cause undesirable side effects. HTTP requests fall into this category.
The second part is that if the debounce delay is shorter than the HTTP request duration (true in virtually every case), there will still be competing requests, and their responses will update state over time, not necessarily in the correct order.
The first part is addressed with a debounce function to reduce the number of competing requests; the second part uses the Axios cancellation API to cancel incomplete requests whenever a new one is issued, e.g.:
// at the top of the module
import axios from 'axios';
import debounce from 'lodash/debounce';

const { CancelToken } = axios;

// class fields of the component
onChange = e => {
  this.fetchData(e.target.value);
};

fetchData = debounce(query => {
  // cancel the previous request, if any, before issuing a new one
  if (this._fetchDataCancellation) {
    this._fetchDataCancellation.cancel();
  }
  this._fetchDataCancellation = CancelToken.source();

  // url is the search endpoint built from `query`
  axios.get(url, {
    cancelToken: this._fetchDataCancellation.token
  })
    .then(({ data }) => {
      this.setState({ data });
    })
    .catch(err => {
      // request was cancelled, not a real error
      if (axios.isCancel(err)) return;
      console.error(err);
    });
}, 200);
Here is a demo.

From this axios issue (thanks to zhuyifan2013 for providing the solution), I've found that the axios timeout is a response timeout, not a connection timeout.
Please check this answer.

You can also set a timeout as a general setting for all requests via axios.defaults:
axios.defaults.timeout = 5000
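A related option (a sketch; the baseURL is illustrative) is to set the default on a dedicated instance created with axios.create, which is also what the instance in the first answer's snippet refers to:

import axios from 'axios';

// every request made through this instance inherits the 5000 ms timeout
const instance = axios.create({
  baseURL: 'https://api.example.com', // illustrative
  timeout: 5000
});

instance.get('/longRequest'); // rejects if the response takes longer than 5 s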

Related

How to cancel subsequent requests in axios interceptor?

I am working on a project in React with redux/redux-saga and a question came up. In the Axios response interceptor I am implementing a way to log out the user when the session token has expired.
Basically, what I'm looking for is to log out the user when a call to a private endpoint returns a 403. But I have this problem:
I have routes where I must do 3 dispatches (calls to different endpoints on the API) when the component loads, all of which bring information relevant to the components. Obviously, when the token is expired they will return 403, and the response interceptor will run the logout process to clear the session. However, even after the logout, the other 2 requests will still be sent, and there is no need for them because I already detected on the first call that the token had expired.
// EFFECTS
useEffect(() => {
dispatch(getAccountsInit("users"));
dispatch(getAccountsInit("kash"));
dispatch(getBanksInit());
dispatch(getCurenciesInit());
}, [dispatch]);
How do I prevent this? How do I cancel the subsequent requests when I detect on the first one that the token has expired? I was looking for information about it but couldn't find anything. Thank you very much for the help.
Here is my response interceptor:
export const resInterceptor = (instance) =>
  instance.interceptors.response.use(
    (res) => res,
    (error) => {
      const configRequest = error.config,
        status = error.status || error.response.status;
      console.warn("Error status: ", status || error.code);
      console.log(error);
      if (status === 418 && !configRequest._retry) {
        alert("Your session has ended; you will be redirected and will have to log in again.");
        store.dispatch(logoutSuccess());
      }
      return Promise.reject(error); // keep rejecting so callers still see the error
    }
  );
I assume action creators like getAccountsInit are async thunks. That means you actually send 4 requests in parallel, and once any of them gets a 403 there is little point in trying to stop the others. At most you could try to prevent the interceptor from calling logoutSuccess() repeatedly, but is that really needed? I doubt it.
You can refer to the store inside the interceptor to get the current "logged in or not" status, but to me that seems like an unnecessary complication.
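If you did want that check, a minimal sketch of the error handler (selectIsLoggedIn is a hypothetical selector; the rest mirrors the interceptor from the question):

(error) => {
  const status = error.status || error.response.status;
  // hypothetical selector: only log out while the store still says "logged in"
  if (status === 403 && selectIsLoggedIn(store.getState())) {
    store.dispatch(logoutSuccess());
  }
  return Promise.reject(error);
}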
You can also return a Promise from getAccountsInit, getBanksInit, etc. and chain them:
useEffect(() => {
dispatch(getAccountsInit("users"))
.then(() => dispatch(getAccountsInit("kash")))
.then(() => dispatch(getBanksInit()))
.then(() => dispatch(getCurenciesInit()));
}, [dispatch]);
But this way, for the "normal" flow (session has not expired) the user will get roughly 4x longer loading time (the requests run in sequence instead of in parallel).
TL;DR: just let it send the requests, even if they may be useless due to session expiration; the extra code complexity or slower loading is not worth it.

How to queue requests using react/redux?

I have a pretty weird case to handle.
We have a few boxes, and we can trigger an action on every box. When we click the button inside a box, we call an endpoint on the server (using axios). The response from the server returns new, updated information (about all boxes, not only the one on which we triggered the action).
Issue:
If the user clicks the submit button on many boxes really fast, the requests hit the endpoint one after another. This sometimes causes errors, because things are calculated on the server in the wrong order (the status of a group of boxes depends on the status of a single box). I know it's probably more of a backend issue, but I have to try to fix this on the frontend.
Proposed fix:
In my opinion, the easiest fix in this case is to disable every submit button while any request is in progress. Unfortunately this solution feels very slow, and the head of the project rejected it.
What we want to achieve:
We want to queue the requests somehow without disabling every button. The perfect solution for me at this moment:
click the first button - call the endpoint, request pending on the server.
click the second button - the button shows a spinner/loading state without calling the endpoint.
the server sends us the response for the first click, and only then do we actually send the second request.
I think something like this is a huge antipattern, but I don't set the rules. ;)
I was reading about e.g. redux-observable, but if I don't have to, I don't want to add another redux middleware (we currently use redux-thunk). redux-saga would be OK, but unfortunately I don't know that tool. I prepared a simple codesandbox example (I added timeouts in the redux actions for easier testing).
I have only one (admittedly clumsy) proposal: keep an array of the data needed to send the correct requests, and inside useEffect check whether the array length is equal to 1. Something like this:
const App = ({ boxActions, inProgress, ended }) => {
  const [queue, setQueue] = useState([]);

  // this code does not work correctly; it only shows what I was thinking about
  const handleSubmit = async () => {
    if (queue.length === 1) {
      const [data] = queue;
      await boxActions.submit(data.id, data.timeout);
      setQueue(queue.filter((item) => item.id !== data.id));
    }
  };

  useEffect(() => {
    handleSubmit();
  }, [queue]);

  return (
    <>
      <div>
        {config.map((item) => (
          <Box
            key={item.id}
            id={item.id}
            timeout={item.timeout}
            handleSubmit={(id, timeout) => setQueue([...queue, { id, timeout }])}
            inProgress={inProgress.includes(item.id)}
            ended={ended.includes(item.id)}
          />
        ))}
      </div>
    </>
  );
};
Any ideas?
I agree with your assessment that we ultimately need to make changes on the backend. Any user can mess with the frontend and submit requests in any order they want, regardless of how you organize it.
I get it though, you're looking to design the happy path on the frontend such that it works with the backend as it is currently.
It's hard to tell without knowing the use-case exactly, but there may generally be some improvements we can make from a UX perspective that will apply whether we make fixes on the backend or not.
Is there an endpoint to send multiple updates to? If so, we could debounce our network call to submit only when there is a delay in user activity.
Does the user need to be aware of order of selection and the impacts thereof? If so, it sounds like we'll need to update frontend to convey this information, which may then expose a natural solution to the situation.
It's fairly simple to create a request queue and execute them serially, but it seems potentially fraught with new challenges.
For example, if a user clicks 5 checkboxes and order matters, a failed execution of the second update would mean we would need to stop any further execution of boxes 3 through 5 until update 2 could be completed. We'll also need to figure out how we'll handle timeouts, retries, and backoff. There is some complexity as to how we want to convey all this to the end user.
Let's say we're completely set on going that route, however. In that case, your use of Redux for state management isn't terribly important, nor is the library you use for sending your requests.
As you suggested, we'll just create an in-memory queue of updates to be made and dequeue serially. Each time a user makes an update to a box, we'll push to that queue and attempt to send updates. Our processEvents function will retain state as to whether a request is in motion or not, which it will use to decide whether to take action or not.
Each time a user clicks a box, the event is added to the queue, and we attempt processing. If processing is already ongoing or we have no events to process, we don't take any action. Each time a processing round finishes, we check for further events to process. You'll likely want to hook into this cycle with Redux and fire new actions to indicate event success and update the state and UI for each event processed and so on. It's possible one of the libraries you use offer some feature like this as well.
// Get a better Queue implementation if queue size may get high.
class Queue {
_store = [];
enqueue = (task) => this._store.push(task);
dequeue = () => this._store.shift();
length = () => this._store.length;
}
export const createSerialProcessor = (asyncProcessingCallback) => {
const updateQueue = new Queue();
const addEvent = (params, callback) => {
updateQueue.enqueue([params, callback]);
};
const processEvents = (() => {
let isReady = true;
return async () => {
if (isReady && updateQueue.length() > 0) {
const [params, callback] = updateQueue.dequeue();
isReady = false;
await asyncProcessingCallback(params, callback); // retries and all that included
isReady = true;
processEvents();
}
};
})();
return {
process: (params, callback) => {
addEvent(params, callback);
processEvents();
}
};
};
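For illustration, a sketch of how the processor could be hooked into the question's component (the import path is illustrative, and boxActions.submit is assumed to return a promise):

import { useMemo } from 'react';
import { createSerialProcessor } from './createSerialProcessor'; // illustrative path

const App = ({ boxActions, inProgress, ended }) => {
  // one processor per mounted App; queued submits are executed one at a time
  const boxProcessor = useMemo(
    () => createSerialProcessor((params) => boxActions.submit(params.id, params.timeout)),
    [boxActions]
  );

  // each Box enqueues instead of submitting directly:
  // handleSubmit={(id, timeout) => boxProcessor.process({ id, timeout })}
  // inProgress/ended props stay as in the question
};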
Hope this helps.
Edit: I just noticed you included a codesandbox, which is very helpful. I've created a copy of your sandbox with updates made to achieve your end and integrate it with your Redux setup. There are some obvious shortcuts still being taken, like the Queue class, but it should be about what you're looking for: https://codesandbox.io/s/dank-feather-hqtf7?file=/src/lib/createSerialProcessor.js
In case you would like to use redux-saga, you can use the actionChannel effect in combination with the blocking call effect to achieve your goal:
Working fork:
https://codesandbox.io/s/hoh8n
Here is the code for boxSagas.js:
import {actionChannel, call, delay, put, take} from 'redux-saga/effects';
// import axios from 'axios';
import {submitSuccess, submitFailure} from '../actions/boxActions';
import {SUBMIT_REQUEST} from '../types/boxTypes';
function* requestSaga(action) {
try {
// const result = yield axios.get(`https://jsonplaceholder.typicode.com/todos`);
yield delay(action.payload.timeout);
yield put(submitSuccess(action.payload.id));
} catch (error) {
yield put(submitFailure());
}
}
export default function* boxSaga() {
const requestChannel = yield actionChannel(SUBMIT_REQUEST); // buffers incoming requests
while (true) {
const action = yield take(requestChannel); // takes a request from queue or waits for one to be added
yield call(requestSaga, action); // starts request saga and _waits_ until it is done
}
}
I am using the fact that the box reducer handles the SUBMIT_REQUEST actions immediately (and marks the given id as pending), while actionChannel+call handle them sequentially, so the actions trigger only one HTTP request at a time.
More on action channels here:
https://redux-saga.js.org/docs/advanced/Channels/#using-the-actionchannel-effect
Just store the promise from the previous request and wait for it to resolve before initiating the next request. The example below uses a global variable for simplicity, but you can use something else to preserve state across requests (e.g. the extraArgument from the thunk middleware).
// boxActions.ts
let submitCall = Promise.resolve();
export const submit = (id, timeout) => async (dispatch) => {
dispatch(submitRequest(id));
submitCall = submitCall.then(() => axios.get(`https://jsonplaceholder.typicode.com/todos`))
try {
await submitCall;
setTimeout(() => {
return dispatch(submitSuccess(id));
}, timeout);
} catch (error) {
return dispatch(submitFailure());
}
};
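One caveat with this approach: if one request rejects, the stored chain stays rejected and every later submit would then fail immediately. A possible guard (a sketch, leaving the rest of the action unchanged) is to swallow the previous failure before chaining the next request:

submitCall = submitCall
  .catch(() => {}) // ignore an earlier failure so the queue keeps moving
  .then(() => axios.get(`https://jsonplaceholder.typicode.com/todos`));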

React Query handling response status codes

This is related to this question:
Handling unauthorized request in react-query
I understand the point that React Query doesn't care about response codes because there is no error. So, for example, if the server responds with a 400 "Bad Request", do I have to check for this in the data returned by the mutate function?
const handleFormSubmit = async (credentials) => {
const data = await mutateLogin(credentials);
// Do I have to check this data if, for example, I want to show an error
// message "Invalid Credentials"?
};
I need to save the user in the cache.
const useMutateLogin = () => {
  return useMutation(doLogin, {
    throwOnError: true,
    onSuccess: data => {
      // Do I have to check here again whether I receive the user or a 400 code?
    }
  })
}
Thanks.
react-query does not take care of the requests, and it is completely agnostic of what you use to make them, as long as you have a Promise. From the documentation we have the following specification for the query function:
Must return a promise that will either resolve data or throw an error.
So if you need to fail on specific status codes, you should handle that in the query function.
The confusion comes because popular libraries usually take care of that for you. For example, axios and jQuery.ajax() will throw an error/reject if the HTTP status code falls out of the range of 2xx. If you use the Fetch API (like the discussion in the link you posted), the API won't reject on HTTP error status.
Your first code snippet:
const handleFormSubmit = async (credentials) => {
const data = await mutateLogin(credentials);
};
The content of data depends on the mutateLogin function implementation. If you are using axios, the promise will reject to any HTTP status code that falls out of the range of 2xx. If you use the Fetch API you need to check the status and throw the error or react-query will cache the whole response as received.
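For illustration, here is a minimal sketch of how doLogin could look with the Fetch API, throwing on non-2xx responses so react-query treats a 400 as an error (the endpoint and payload shape are assumptions):

const doLogin = async (credentials) => {
  const res = await fetch('/api/login', { // illustrative endpoint
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(credentials)
  });
  if (!res.ok) {
    // e.g. a 400 "Bad Request" ends up here, so react-query sees a rejected promise
    throw new Error(`Login failed with status ${res.status}`);
  }
  return res.json(); // the resolved value is what react-query hands to onSuccess
};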
Your second code snippet:
const useMutateLogin = () => {
  return useMutation(doLogin, {
    throwOnError: true,
    onSuccess: data => {
      // Do I have to check here again whether I receive the user or a 400 code?
    }
  })
}
Here we have the same case as before. It depends on the doLogin implementation.
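For example, if doLogin uses axios (a sketch; the endpoint is illustrative), a 400 rejects the promise automatically, so onSuccess only ever receives the user and the error case can be handled at the call site:

import axios from 'axios';
import { useMutation } from 'react-query';

// axios rejects on any non-2xx status, so no manual status check is needed here
const doLogin = (credentials) =>
  axios.post('/api/login', credentials).then((res) => res.data); // illustrative endpoint

const useMutateLogin = () => {
  return useMutation(doLogin, {
    throwOnError: true,
    onSuccess: (user) => {
      // only reached for 2xx responses; safe to put the user in the cache here
    }
  });
};

// at the call site:
// try {
//   const user = await mutateLogin(credentials);
// } catch (err) {
//   // e.g. show "Invalid Credentials" when err.response.status === 400
// }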

Redux action on .then promise of another very slow

I have a redux action set up that posts to an external API; this updates a database and returns the updated results. I then run another function inside the .then to check a database table for new results:
this.props.updateAddTest(payload)
.then((response) => {
if (response.error) {
} else {
let payloadTwo = {
parentTestId: this.state.parentTestId,
bespokeTestId: response.response.testId,
selectedTests: selectedTests,
}
page.props.loadAvailableTests(payloadTwo)
.then((response) => {
page.setState({checkInvalidTests: response.response})
})
}
})
Running this code makes the network response time around 10 seconds - why does it take so long? Running the functions separately, each takes around 200ms, e.g. just running:
this.props.updateAddTest(payload);
Why does nesting one redux action inside another slow it down so much?

React JS & Axios chaining promises

I am developing a React JS application and we are using axios, a promise-based library, for calling APIs.
Now, in the initial part of the application, the user gets a login page; when the login is successful, we contact different systems to retrieve some extra information about the user.
axios
.get('url to authentication endpoint') // 1st call
.then(response => {
// if login is successful
// 1. retrieve the user preferences, e.g. which fields the user wants to see on the customised screens
axios.get('user preference endpoint') // 2nd call
// 2. send a request to one more external system, which calculates what the user can and cannot see based on their LDAP role
axios.get('role calculation endpoint') // 3rd call
})
.catch(error => {
})
Now I can see that I can use
axios.all()
for the second and third calls, but with a promise-based client, how do I chain the first and second calls? To retrieve the user preferences, I have to wait for the user to be authenticated.
How do I chain these calls in a promise-based way, rather than in callback style?
As mentioned in the thread for this GitHub issue, axios() and axios.all() return Promise objects which can be chained however you see fit:
axios.get('/auth')
.then(function(response) {
return axios.all([ axios.get('/preferences'), axios.get('/roles') ]);
})
.then(function(responses) {
const [
preferencesResponse,
rolesResponse
] = responses;
// do more things
})
.catch(function(error) {
console.log(error);
});
Dan O's answer is very good and it works perfectly, but it's more readable using async/await, although it still uses promises under the hood:
async yourReactClassFunction(){
try{
let getAuth = await axios.get('/auth');
//if login not successful return;
let result = await Promise.all([axios.get('/preferences'), axios.get('/roles')]);
//Do whatever with the results.
}catch(e){
//TODO error handling
}
}
Although it's the same thing, it 'feels' more readable, in my very subjective opinion.
