Cancel query on refetch in apollo - reactjs

I'm trying to build a small widget, using React and Apollo, with some inputs and cards containing data that is recalculated on the backend. The cards show a loader to indicate that a request is in progress.
Whenever I change an input, I want to trigger a request to the backend, which takes a long time (due to computation) - around 30 seconds.
When you change an input, a mutation is triggered via one Apollo client, and right afterwards it triggers a query on another Apollo client, which takes those 30 seconds or so.
During that time it is possible to update the input again - and in that case I would like to cancel the previous request and trigger a new one.
This is the query
const {
  data,
  loading,
  refetch,
  networkStatus,
} = useCalculationQuery({
  widget_id: widgetId,
  skip: !isCalcTriggered,
});
Code for useCalculationQuery
function useCalculationQuery({ widget_id, skip }: Options = {}) {
  const { teamId } = ...;
  const query = useQuery(QUERY_CALCULATION, {
    skip,
    fetchPolicy: 'network-only',
    variables: {
      team_id: teamId,
      widget_id,
    },
    client: apolloCalculations,
  });
  return query;
}
And this is the fetch function
async function handleCalcFetch() {
  if (isCalcTriggered) {
    setIsFetchInProgress(true);
    await refetch();
    setIsFetchInProgress(false);
  }
  if (!isCalcTriggered) {
    setIsCalcTriggered(true);
  }
}
As you can see - with the isCalcTriggered flag I skip the query execution until handleCalcFetch is called. On the first call I set the flag, and on every subsequent call I do a refetch.
The other flag (isFetchInProgress, set with setIsFetchInProgress) indicates that a request is in flight, because after calling refetch, networkStatus is always 7 rather than 4. That's issue number one.
Issue number two is that I want to cancel the previous request whenever I hit refetch(), so that I am sure I always get the newest data, which is currently not the case.
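For what it's worth, a rough sketch of one possible direction, not taken from the question itself: the hook name useCancellableCalculation is made up, and it assumes Apollo Client 3 with the default HttpLink plus the same apolloCalculations / QUERY_CALCULATION / teamId names used above.

// Sketch only.
// - notifyOnNetworkStatusChange: true on useQuery makes networkStatus report 4
//   during a refetch instead of staying at 7 (issue one).
// - For cancellation (issue two), each run goes through client.query() with its
//   own AbortController; HttpLink spreads fetchOptions onto fetch(), so aborting
//   the signal cancels the in-flight HTTP request.
import { useRef, useState } from 'react';

function useCancellableCalculation(widgetId: string) {
  const { teamId } = ...; // as in the original hook
  const controllerRef = useRef<AbortController | null>(null);
  const [isFetchInProgress, setIsFetchInProgress] = useState(false);

  async function runCalculation() {
    controllerRef.current?.abort(); // drop the previous request, if any
    const controller = new AbortController();
    controllerRef.current = controller;

    setIsFetchInProgress(true);
    try {
      const { data } = await apolloCalculations.query({
        query: QUERY_CALCULATION,
        fetchPolicy: 'network-only',
        variables: { team_id: teamId, widget_id: widgetId },
        context: { fetchOptions: { signal: controller.signal } },
      });
      return data; // only the newest request resolves here with usable data
    } catch (err) {
      return undefined; // aborted requests reject; a newer one replaced them
    } finally {
      if (controllerRef.current === controller) {
        setIsFetchInProgress(false);
      }
    }
  }

  return { runCalculation, isFetchInProgress };
}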

Related

Using ApolloClient pagination API results in requests, even if all page content is already in cache

I am using the ApolloClient core pagination API approach to accumulate paginated requests in a merge function and then repaginate them with a read function: https://www.apollographql.com/docs/react/pagination/core-api
This all works, but now there is a request for each page, even the ones that are already in the cache.
Which defeats the whole purpose when I'm repaginating!
I'm using the default fetchPolicy, cache-first. The docs say:
If all requested data is present in the cache, that data is returned. Otherwise, Apollo Client executes the query against your GraphQL server and returns that data after caching it.
I wonder how ApolloClient checks that all requested data is in the cache with this pagination implementation, because right now (and the docs seem to rely on this) it always makes the request, even when the keyArgs match and the data is in the cache.
Does someone know what causes this, and how I can customize this cache-first strategy so it checks whether all items of the requested page are already in the cache?
Here is my code, in case that helps for context or if I'm just doing something wrong:
typePolicies: {
  Query: {
    fields: {
      paginatedProductTracking: {
        // Include everything except 'skip' and 'take' to be able to use `fetchMore`
        // and repaginate when reading cache
        // (essential for switching between desktop pagination and mobile lazy loading
        // without having to refetch)
        keyArgs: (args) => JSON.stringify(omit(args, ['query.skip', 'query.take'])),
        merge: (existing, incoming, { args }) => {
          if (!existing) {
            return incoming;
          }
          if (!incoming) {
            return existing;
          }
          const data = existing.paginatedData;
          const newData = incoming.paginatedData;
          return {
            ...existing,
            // conservative merge that is robust against pages being requested out of order
            paginatedData: [
              ...data.slice(0, args?.query.skip || 0),
              ...newData,
              ...data.slice((args?.query.skip || 0) + newData.length),
            ],
          };
        },
      },
    },
  },
},
const [pageSize, setPageSize] = useState(100);
const [page, setPage] = useState(0);
const skip = page * pageSize;
const query = {
  filter,
  aggregationInterval,
  order,
  skip,
  take: pageSize,
  search: search ? values : null,
  locations: currentLocations.length > 0 ? currentLocations.map((location) => location.id) : undefined,
};
const { data, loading, fetchMore } = useProductTrackingAggregatedDataQuery({
  variables: {
    query,
  },
});
onPageChange={async (newPage) => {
  await fetchMore({
    variables: {
      query: {
        ...query,
        skip: newPage * pageSize,
      },
    },
  });
  setPage(newPage);
}}
I was recently faced with the exact same issue and had everything implemented the way the official documentation illustrates, until I stumbled upon this issue, which is still open, so I'm guessing this is still how the fetchMore function behaves to date. @benjamn says that:
The fetchMore method sends a separate request that always has a fetch policy of no-cache, which is why it doesn't try to read from the cache first.
This being the case, fetchMore is only useful if you are implementing an endless scroll sort of pagination where you know beforehand that the new data is not in the cache.
The pagination documentation also states:
If you are not using React and useQuery, the ObservableQuery object returned by client.watchQuery has a method called setVariables that you can call to update the original variables.
If you change the variables of your query, it will trigger your read function implementation. If the read function finds the data within existing, it can return it; if it returns undefined instead, that triggers a network request to your GraphQL server to fetch the missing data, which triggers your merge function to merge the data in the desired way, which again triggers the read function, which will now be able to slice the data you requested according to your { args } out of existing and return it, which finally makes your watched ObservableQuery fire and your UI update.
Now, this approach is counter-intuitive and goes against the "recommended" way of implementing pagination, but contrary to the recommended way, this approach actually works.
I was unable to find anything that would prove my conclusions about fetchMore wrong, so if any Apollo Client guru happens to stumble upon this, please do shed some light on it. Until then, the only solution I can offer is working with setVariables instead of fetchMore.
Keep in mind that you will need to implement a read function along with your merge. It is responsible for slicing your cached data and for triggering a network request (by returning undefined) if it cannot find a full slice; a sketch is shown below.
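As a rough sketch of such a read function, matching the merge and keyArgs above (the paginatedData shape and the query.skip / query.take argument names are taken from the question; the hole check is an assumption):

// Hypothetical read function to pair with the merge above. It answers from the
// cache only when the full requested page is already there; otherwise it
// returns undefined, which makes Apollo Client hit the network (and then merge).
read: (existing, { args }) => {
  if (!existing) {
    return undefined; // nothing cached for these keyArgs yet
  }
  const skip = args?.query.skip ?? 0;
  const take = args?.query.take ?? existing.paginatedData.length;
  const page = [];
  for (let i = skip; i < skip + take; i += 1) {
    // Out-of-order merges can leave holes; treat a hole as "not cached".
    if (existing.paginatedData[i] === undefined) {
      return undefined;
    }
    page.push(existing.paginatedData[i]);
  }
  return { ...existing, paginatedData: page };
},

Note that a short final page is indistinguishable from a missing one with this heuristic, so the last page will still hit the network unless the total count is also cached. With useQuery, changing skip/take inside variables (rather than calling fetchMore) is what routes the request through this read function.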

RTK Query hooks - preventing polling to auto refetch when arg change

I'm trying to refresh a token in a Query hook, with the polling feature every 9 seconds:
"/App.tsx"
..
...
const [storedToken, setStoredToken] = useState(getStoredToken());
const { data, error, refetch } = useRefreshUserQuery(storedToken, {
  pollingInterval: 9000,
  // refetchOnMountOrArgChange: false // -> This has no effect
});
...
..
The problem is, it re-fetches instantly when the token is set with setStoredToken(token). The new token is passed as an argument to the query hook (storedToken), which refetches immediately - like an infinite loop.
It would be pretty neat to be able to do this. Is there any better way to refresh a token with polling?
I believe this is nothing to solve at the RTK-Q level - it's a pretty common and expected "limitation" of hooks and the rendering lifecycle. And I feel that RTK-Q polling just won't fit what you are trying to achieve here - it's not actually polling in the common sense. At the very least it's conditional polling, which needs some more logic.
So I would solve this with debouncing and useEffect:
const [storedToken, setStoredToken] = useState<string>(getStoredToken());
const [tokenDebounced] = useDebounce(storedToken, 9000);
const { data } = useRefreshUserQuery(tokenDebounced);

useEffect(() => {
  if (data) {
    setStoredToken(data);
    // console.log(newToken);
  }
}, [data]);
The useEffect content and the data content may differ, but the overall idea should be clear.
useDebounce is from https://www.npmjs.com/package/use-debounce,
but your own implementation should work the same if you already have one defined.
Another idea, touching your auth setup a bit, is to avoid the
const [storedToken, setStoredToken] = useState<string>(getStoredToken());
part altogether and keep useRefreshUserQuery() without params.
The most common approach is to store the token in localStorage or in Redux (or another store) and define a new baseQuery, based on fetchBaseQuery, that sets the header and/or includes cookies with credentials: "include", using the token from that store. You will then need to store the token during the initial auth, of course.
I think the RTK-Q auth example covers this case in some way as well:
https://redux-toolkit.js.org/rtk-query/usage/examples#authentication
Once you avoid that useState and the query hook param, you'll be able to use polling with no issues:
const { data, error, refetch } = useRefreshUserQuery(undefined, {
  pollingInterval: 9000,
});
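As a rough sketch of the baseQuery idea above (the baseUrl, the state.auth.token path and the /refresh endpoint are illustrative assumptions, not part of the original answer):

import { createApi, fetchBaseQuery } from '@reduxjs/toolkit/query/react';

// Hypothetical API slice whose baseQuery reads the token from the Redux store,
// so the query hook needs no argument (and no arg-change refetch can happen).
export const api = createApi({
  baseQuery: fetchBaseQuery({
    baseUrl: '/api',
    credentials: 'include', // send cookies if the backend uses them
    prepareHeaders: (headers, { getState }) => {
      const token = (getState() as any).auth.token;
      if (token) {
        headers.set('authorization', `Bearer ${token}`);
      }
      return headers;
    },
  }),
  endpoints: (build) => ({
    refreshUser: build.query<string, void>({
      query: () => '/refresh',
    }),
  }),
});

export const { useRefreshUserQuery } = api;

With this in place the token is read from the store on every request, so the hook takes no argument and polling is never restarted by an argument change.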
"Polling" here means "fetch X seconds after I have data", but of course you have to get the first data itself - and that is that first fetch. If you prevent that, polling will also never start.
Tbh., this is kind of a weird requirement and doing it like this will fill your cache with dozens of state entries.
I'd do something a little differently - solve it in the endpoint lifecycle.
This is untested pseudocode and you'll need to adjust it a bit:
function waitFor(ms) {
  return new Promise((resolve) => setTimeout(() => resolve("waited"), ms))
}

currentToken: build.query({
  query() {
    // whatever you need for the first token here
  },
  async onCacheEntryAdded(
    arg,
    { updateCachedData, cacheDataLoaded, cacheEntryRemoved }
  ) {
    try {
      // wait for the initial query to resolve before proceeding
      await cacheDataLoaded
      while (true) {
        const result = await Promise.race([waitFor(9000), cacheEntryRemoved])
        if (result !== "waited") {
          // cache entry was removed, stop the loop
          break
        }
        // make a fetch call to get a new token here
        const response = await fetch(...)
        const newToken = await response.json()
        updateCachedData((oldData) => newToken)
      }
    } catch {
      // cacheDataLoaded rejects if the cache entry is removed before data arrives
    }
  },
})
and then just
const result = useCurrentTokenQuery()
in your component

Redux action on .then promise of another very slow

I have a Redux action set up that posts to an external API; this updates a database and returns the updated results. I then run another function inside the .then to check a database table for new results:
this.props.updateAddTest(payload)
  .then((response) => {
    if (response.error) {
    } else {
      let payloadTwo = {
        parentTestId: this.state.parentTestId,
        bespokeTestId: response.response.testId,
        selectedTests: selectedTests,
      }
      page.props.loadAvailableTests(payloadTwo)
        .then((response) => {
          page.setState({checkInvalidTests: response.response})
        })
    }
  })
Running this code makes the network response time around 10 seconds - why does it take so long? Running the functions separately, each takes around 200ms, e.g. just running:
this.props.updateAddTest(payload);
Why does nesting one redux action inside another slow it down so much?

How do I prevent infinite loops with useEffect when the function relies on a value that it sets?

I have a React component that makes a request to the backend to get a list of transactions. There is pagination involved, so the payload returns a cursor that can be used the next time the request is made.
useEffect((): void => {
  const fetchTransactions = async (): Promise<void> => {
    const { data } = await client.query({
      query: QueryTransactions,
      variables: {
        id: accountId,
        after: cursor
      }
    });
    setData(data.data);
    setCursor(data.nextCursor);
  };
  fetchTransactions();
}, [accountId, cursor]);
My problem is that I am setting the cursor each time I receive a payload from the backend, but that cursor value is in the hook dependencies. This makes the effect run again.
How do I get around this infinite loop?
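One common way around this (an illustrative sketch, not an answer from the original thread) is to let the effect load only the first page and request further pages from an explicit handler, so cursor is never an effect dependency:

// Hypothetical sketch: the effect loads the first page for a given account;
// further pages are requested on demand (e.g. from a "Load more" button), so
// updating the cursor no longer re-triggers the effect.
useEffect((): void => {
  const fetchFirstPage = async (): Promise<void> => {
    const { data } = await client.query({
      query: QueryTransactions,
      variables: { id: accountId, after: null },
    });
    setData(data.data);
    setCursor(data.nextCursor);
  };
  fetchFirstPage();
}, [accountId]);

const loadNextPage = async (): Promise<void> => {
  const { data } = await client.query({
    query: QueryTransactions,
    variables: { id: accountId, after: cursor },
  });
  setData((previous) => [...previous, ...data.data]);
  setCursor(data.nextCursor);
};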

React - Any way to cancel (or ignore) async action on 'OnChange' handler?

I have some code here: https://codesandbox.io/embed/cranky-glade-82leu
In there, I have a 'verifyUserNameAvailable' function under the 'api/' folder. In this function, I added a 10-second delay to test the extreme case where our API server is slow.
When the user enters a valid email address (a valid email format like abc@test.net), I see the email address on the helperText line 10 seconds later. (This is because of the 10-second delay I added in 'verifyUserNameAvailable'.)
When the user enters the three emails 'abc@test.net', 'abc2@test.net' and 'abc3@test.net' one by one without waiting 10 seconds, I see all three emails displayed in sequence on the helperText line. (I guess this is expected given what I have in the code now.)
My question: I want the helperText line to show only the latest one. When the user enters the three email addresses above and 'abc3@test.net' is the last one in the textField, I want to show only 'abc3@test.net' on the helperText line, not the others.
Is there any way to check what's actually in the textField and discard or abort all irrelevant async requests?
There's a way to do this without cancelling the verifyUserNameAvailable promise: keep track of the number of requests in flight, and ignore any response other than the latest one.
Here's what that could look like in your example. It uses useRef to keep track of the number of requests in flight (see the React docs for useRef to see how this works: https://reactjs.org/docs/hooks-reference.html#useref):
export const EmailTextField = props => {
  const { onStateChange } = props;
  const [state, setState] = useState({
    errors: [],
    onChange: false,
    pristine: true,
    touched: false,
    value: null
  });
  // `useRef` holds a mutable value that lasts as long
  // as the component, staying "more up to date" than the
  // current closure.
  const requestsInFlight = useRef(0);
  const helperText = "Email address will be used as your username.";
  const handleBlur = async event => {
    // Email Validation logic
    const emailAddress = event.target.value;
    const matches = event.target.value.match(
      `[a-z0-9._%+-]+@[a-z0-9.-]+.[a-z]{2,3}`
    );
    if (matches) {
      // If there's a match, increment the ref for in-flight requests
      requestsInFlight.current += 1;
      await verifyUserNameAvailable(emailAddress);
      // After the response comes back, decrement the ref, and if
      // there are any requests still in flight, ignore that response.
      requestsInFlight.current -= 1;
      if (requestsInFlight.current > 0) {
        return;
      }
      const updatedState = {
        ...state,
        touched: true,
        value: emailAddress,
        errors: [emailAddress]
      };
      setState(updatedState);
      onStateChange(updatedState);
    } else {
      ...
    }
  };
Let's think about possible solutions, starting from the deepest level.
Internally you're calling setTimeout. A timeout can be cancelled with clearTimeout. I don't think this is what you're looking for, so I won't describe how to implement it.
setTimeout is wrapped in a Promise. Unfortunately, Promises are not cancellable in the current version of JS. There is a proposal for it.
I suspect you'll use this code with some API to fetch data from the backend. If you use axios, it provides cancellation of pending requests using a cancellation token (and the cancellation is based on the cancellation proposal for JS).
And if you use Redux in your app, you may consider redux-saga for backend requests. It also supports cancellation. A rough sketch of the axios option is below.
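As a rough sketch of the axios option (the endpoint URL and return shape are assumptions; recent axios versions accept an AbortController signal in place of the older CancelToken):

import axios from 'axios';

// Keeps only the latest username check alive: each call aborts the previous one.
let controller: AbortController | null = null;

export async function verifyUserNameAvailable(email: string): Promise<boolean | null> {
  controller?.abort();
  controller = new AbortController();
  try {
    const { data } = await axios.get('/api/username-available', {
      params: { email },
      signal: controller.signal,
    });
    return data.available;
  } catch (err) {
    if (axios.isCancel(err)) {
      // A newer request replaced this one; the caller can ignore this result.
      return null;
    }
    throw err;
  }
}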
