RTK Query hooks - preventing polling from auto-refetching when an arg changes - reactjs

I'm trying to refresh a token in a Query hook, with the polling feature every 9 seconds:
"/App.tsx"
..
...
const [storedToken, setStoredToken] = useState(getStoredToken());
const { data, error, refetch } = useRefreshUserQuery(storedToken, {
  pollingInterval: 9000,
  // refetchOnMountOrArgChange: false // -> This has no effect
});
...
..
The problem is that it refetches instantly when the token is set with setStoredToken(token): the new token is passed as the storedToken argument to the query hook, which refetches immediately (like an infinite loop).
It would be pretty neat to be able to prevent that. Is there any better way to refresh a token with polling?

I believe this isn't something to solve at the RTK Query level - it's a pretty common and expected "limitation" of hooks and the rendering lifecycle. I also feel that RTK Query polling just won't fit what you're trying to achieve here - it isn't really polling in the usual sense. At the very least it's conditional polling, which needs some extra logic.
So I would solve this just by debouncing and useEffect:
const [storedToken, setStoredToken] = useState<string>(getStoredToken());
const [tokenDebounced] = useDebounce(storedToken, 9000);
const { data } = useRefreshUserQuery(tokenDebounced);

useEffect(() => {
  if (data) {
    setStoredToken(data);
    // console.log(newToken);
  }
}, [data]);
The useEffect content and data content may differ, but the overall idea should be clear.
useDebounce is from https://www.npmjs.com/package/use-debounce,
but your own implementation will work just the same if you already have one defined.
Another idea, which touches your auth setup a bit, is to avoid the
const [storedToken, setStoredToken] = useState<string>(getStoredToken());
part entirely and call useRefreshUserQuery() without params.
The most common approach is to store the token in localStorage or Redux (or another store), and to define a new baseQuery, based on fetchBaseQuery, that sets the authorization header and/or includes cookies with credentials: "include", reading the token from localStorage or that store. You will definitely need to store the token during the initial auth then.
I think RTK-Q auth example reveals this case in some way also:
https://redux-toolkit.js.org/rtk-query/usage/examples#authentication
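A rough sketch of that kind of baseQuery (the "/api" base URL, the "/refresh" endpoint path and the auth.token location in the store are placeholders to adjust to your own setup):
import { createApi, fetchBaseQuery } from '@reduxjs/toolkit/query/react';

const baseQuery = fetchBaseQuery({
  baseUrl: '/api',
  credentials: 'include', // send cookies along if your refresh flow relies on them
  prepareHeaders: (headers, { getState }) => {
    // read the token from the store (or localStorage) instead of passing it as a hook arg
    const token = getState().auth.token;
    if (token) {
      headers.set('authorization', `Bearer ${token}`);
    }
    return headers;
  },
});

export const api = createApi({
  baseQuery,
  endpoints: (build) => ({
    refreshUser: build.query({
      query: () => '/refresh',
    }),
  }),
});

export const { useRefreshUserQuery } = api;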
Once you avoid that useState and the query hook param, you'll be able to use polling with no issues:
const { data, error, refetch } = useRefreshUserQuery(undefined, {
  pollingInterval: 9000,
});

"Polling" here means "fetch X seconds after I have data", but of course you have to get the first data itself - and that is that first fetch. If you prevent that, polling will also never start.
Tbh., this is kind of a weird requirement and doing it like this will fill your cache with dozens of state entries.
I'd do something a little differently - solve it in the endpoint lifecycle.
This is untested pseudocode and you'll need to adjust it a bit:
function waitFor(ms) {
  return new Promise((resolve) => setTimeout(() => resolve("waited"), ms))
}

currentToken: build.query({
  query() {
    // whatever you need for the first token here
  },
  async onCacheEntryAdded(
    arg,
    { updateCachedData, cacheDataLoaded, cacheEntryRemoved }
  ) {
    try {
      // wait for the initial query to resolve before proceeding
      await cacheDataLoaded
      while (true) {
        const result = await Promise.race([waitFor(9000), cacheEntryRemoved])
        if (result !== "waited") {
          // cache entry was removed, stop the loop
          break
        }
        // make a fetch call to get a new token here
        const newToken = await fetch(...)
        updateCachedData((oldData) => newToken)
      }
    } catch {
      // cacheDataLoaded rejects if the cache entry is removed before data arrives - safe to ignore
    }
  },
})
and then just
const result = useCurrentTokenQuery()
in your component

Related

Using ApolloClient pagination API results in requests, even if all page content is already in cache

I am using the ApolloClient core pagination API approach to accumulate paginated requests in a merge function and then repaginate them with a read function: https://www.apollographql.com/docs/react/pagination/core-api
This all works, but now there is a request for each page, even the ones that are already in the cache.
Which defeats the whole purpose when I'm repaginating!
I'm using the default fetchPolicy, cache-first.
If all requested data is present in the cache, that data is returned. Otherwise, Apollo Client executes the query against your GraphQL server and returns that data after caching it.
I wonder how ApolloClient checks that all requested data is in the cache with the pagination implementation.
Because right now (and the docs seem to rely on this) it always makes the request, even when the keyArgs match and the data is in the cache.
Does someone know what causes this and how I can customize this cache-first strategy to check if all the items of the requested page are already in the cache?
Here is my code, in case that helps for context or if I'm just doing something wrong:
typePolicies: {
  Query: {
    fields: {
      paginatedProductTracking: {
        // Include everything except 'skip' and 'take' to be able to use `fetchMore`
        // and repaginate when reading cache
        // (essential for switching between desktop pagination and mobile lazy loading
        // without having to refetch)
        keyArgs: (args) => JSON.stringify(omit(args, ['query.skip', 'query.take'])),
        merge: (existing, incoming, { args }) => {
          if (!existing) {
            return incoming;
          }
          if (!incoming) {
            return existing;
          }
          const data = existing.paginatedData;
          const newData = incoming.paginatedData;
          return {
            ...existing,
            // conservative merge that is robust against pages being requested out of order
            paginatedData: [
              ...data.slice(0, args?.query.skip || 0),
              ...newData,
              ...data.slice((args?.query.skip || 0) + newData.length),
            ],
          };
        },
      },
    },
  },
},
const [pageSize, setPageSize] = useState(100);
const [page, setPage] = useState(0);
const skip = page * pageSize;
const query = {
  filter,
  aggregationInterval,
  order,
  skip,
  take: pageSize,
  search: search ? values : null,
  locations: currentLocations.length > 0
    ? currentLocations.map((location) => location.id)
    : undefined,
};

const { data, loading, fetchMore } = useProductTrackingAggregatedDataQuery({
  variables: {
    query,
  },
});
onPageChange={async (newPage) => {
  await fetchMore({
    variables: {
      query: {
        ...query,
        skip: newPage * pageSize,
      },
    },
  });
  setPage(newPage);
}}
I was recently faced with the exact same issue and had everything implemented the way the official documentation illustrates, until I stumbled upon this issue, which is still open, so I'm guessing this is still how the fetchMore function behaves to date. There, @benjamn says that:
The fetchMore method sends a separate request that always has a fetch policy of no-cache, which is why it doesn't try to read from the cache first.
This being the case, fetchMore is only useful if you are implementing an endless scroll sort of pagination where you know beforehand that the new data is not in the cache.
In the pagination documentation it also states that:
If you are not using React and useQuery, the ObservableQuery object returned by client.watchQuery has a method called setVariables that you can call to update the original variables.
If you change the variables of your query, it will trigger your read function implementation. If the read function finds the data within existing, it can return it; otherwise it returns undefined, which in turn triggers a network request to your GraphQL server to fetch the missing data. That triggers your merge function to merge the data in the desired way, which again triggers the read function, which is now able to slice the data you requested according to your { args } out of existing and return it, which finally causes your watched ObservableQuery to fire and your UI to update.
Now, this approach is counter intuitive and goes against the "recommended" way of implementing pagination, but contrary to the recommended way this approach actually works.
I was unable to find anything that would prove my conclusions about fetchMore wrong, so if any Apollo Client guru happens to stumble upon this, please do shed some light on it. Until then, the only solution I can offer is working with setVariables instead of fetchMore.
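For illustration, here is roughly what the setVariables route could look like (a sketch only; PRODUCT_TRACKING_QUERY, baseArgs and pageSize are placeholders standing in for the query and variables from the question):
const observable = client.watchQuery({
  query: PRODUCT_TRACKING_QUERY,
  variables: { query: { ...baseArgs, skip: 0, take: pageSize } },
});

const subscription = observable.subscribe(({ data, loading }) => {
  // fires whenever the read function returns a new slice,
  // either straight from the cache or after a network round trip + merge
});

// on page change: triggers read(); only hits the network if read() returned undefined
observable.setVariables({
  query: { ...baseArgs, skip: newPage * pageSize, take: pageSize },
});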
Keep in mind that you will need to implement a read function along with your merge. It will be responsible for slicing your cached data and triggering a network request by returning undefined if it was unable to find a full slice.
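For reference, a sketch of such a read function for the field policy above (it assumes the cached shape { paginatedData: [...] } and the query.skip / query.take args used in the question):
read: (existing, { args }) => {
  if (!existing) return undefined;
  const skip = args?.query.skip ?? 0;
  const take = args?.query.take ?? existing.paginatedData.length;
  const page = [];
  for (let i = skip; i < skip + take; i += 1) {
    // a missing index means this part of the page was never fetched,
    // so return undefined and let Apollo hit the network (and then merge)
    if (existing.paginatedData[i] === undefined) return undefined;
    page.push(existing.paginatedData[i]);
  }
  return { ...existing, paginatedData: page };
},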

Mapping State in React with Server Requests

I'm new to React as we are trying to migrate our app from AngularJS. One thing I'm struggling to wrap my head around is what's the best way to make and cache state mapping requests.
Basically, I would do a search, that returns a list of objects and one of the field is a status code (e.g. 100, 200, 300, etc.), some number. To display the result, I need to map that number to a string and we do that with a http request to the server, something like this:
GET /lookup/:stateId
So my problem now is:
I have a list of results but not many different states. How can I make that async call (useEffect?) so the lookup happens only once per distinct stateId? Right now I can get it to work, but the request is made on every single mapping. I'm putting the Axios call in a utility function to try and reuse it across multiple pages doing similar things, but is that the "React" way? In AngularJS, we used the "|" filter to map the code to text.
Once I have that id => string mapping, I want to store it in a cache so the next component that needs it no longer makes the HTTP request. Right now, I put the "cache" in the application-level context and use dispatch to update/add values to it. Is that an efficient approach? It appears that if I do a language change (I keep the language in the same application context state), the cache gets re-initialized, and I'm not sure what else would reset it. In AngularJS, we used the $rootState to 'cache'.
Thanks for any pointers!
In a lookupUtil.js
const DoLookupEntry = async (entryId) => {
  const lookupUrl = `/lookup/${entryId}`;
  try {
    const response = await Axios.get(lookupUrl);
    return response.data;
  } catch (expt) {
    console.log('error [DoLookupEntry]:', expt);
  }
}
In a formatUtils.js
const formatLookupValue = (entryId) => {
  const appState = useContext(AppContext);
  const appDispatch = useContext(DispatchContext);
  const language = appState.language;
  if (appState.lookupCache
      && appState.lookupCache[entryId]
      && appState.lookupCache[entryId][language]) {
    // return cached value
    const entry = appState.lookupCache[entryId][language];
    return entry.translatedValue;
  }
  // DoLookup is async, but we are not, so we want to wait...
  DoLookupEntry(entryId)
    .then((entry) => { // try to save to cache when value returns
      appDispatch({
        type: States.APP_UPDATE_LOOKUP_CACHE,
        value: { language, entry },
      });
      return entry.translatedValue;
    });
}
And finally results.js, displaying the result along these lines (using formatLookupValue to map the id):
{searchState.pageResults.map((item) => {
  return (
    <tr key={item.id}>
      <td><Link to={`/getItem/${item.id}`}>{item.title}</Link></td>
      <td>{item.detail}</td>
      <td>{formatLookupValue(item.stateId)}</td>
    </tr>
  );
})}
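For what it's worth, here is one possible direction as a sketch only (the names are hypothetical, and it uses a module-level cache of promises rather than the app-level context described above):
import { useEffect, useState } from 'react';
import Axios from 'axios';

const lookupCache = new Map(); // entryId -> promise of the looked-up entry

const lookupEntry = (entryId) => {
  if (!lookupCache.has(entryId)) {
    // caching the promise means concurrent renders asking for the same id share one request
    lookupCache.set(
      entryId,
      Axios.get(`/lookup/${entryId}`).then((response) => response.data)
    );
  }
  return lookupCache.get(entryId);
};

// hypothetical hook: returns null until the lookup resolves, then re-renders with the value
const useLookupValue = (entryId) => {
  const [entry, setEntry] = useState(null);
  useEffect(() => {
    let cancelled = false;
    lookupEntry(entryId).then((result) => {
      if (!cancelled) setEntry(result);
    });
    return () => { cancelled = true; };
  }, [entryId]);
  return entry ? entry.translatedValue : null;
};
Because it is a hook, it would be called from a small row/cell component (e.g. a StateCell that renders useLookupValue(item.stateId)) rather than inside the map callback directly.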

How to queue requests using react/redux?

I have a pretty weird case to handle.
We have a few boxes, and we can call some action on every box. When we click the button inside a box, we call an endpoint on the server (using axios). The response from the server returns new, updated information (about all boxes, not only the one on which we called the action).
Issue:
If the user clicks the submit button on many boxes really fast, the requests hit the endpoint one after another. This sometimes causes errors, because things are calculated on the server in the wrong order (the status of a group of boxes depends on the status of a single box). I know this is arguably more of a backend issue, but I have to try to fix it on the frontend.
Proposed fix:
In my opinion the easiest fix in this case is to disable every submit button while any request is in progress. Unfortunately this solution feels very slow, and the head of the project rejected the proposal.
What we want to achieve:
We want to queue the requests in some way without disabling every button. The perfect solution for me at this moment:
click the first button - call the endpoint, request pending on the server.
click the second button - the button shows a spinner/loading state without calling the endpoint.
the server returns the response for the first click - only then do we really call the second request.
I think something like this is a huge antipattern, but I don't set the rules. ;)
I was reading about e.g. redux-observable, but I don't want to add another Redux middleware if I don't have to (we currently use redux-thunk). redux-saga would be OK, but unfortunately I don't know that tool. I prepared a simple codesandbox example (I added timeouts in the redux actions for easier testing).
I have only one (admittedly naive) proposed solution: keep an array of the data needed to send the correct request, and inside useEffect check whether the array length is equal to 1. Something like this:
const App = ({ boxActions, inProgress, ended }) => {
  const [queue, setQueue] = useState([]);

  // this code does not work correctly; it only shows what I was thinking about
  const handleSubmit = async () => {
    if (queue.length === 1) {
      const [data] = queue;
      await boxActions.submit(data.id, data.timeout);
      setQueue(queue.filter((item) => item.id !== data.id));
    }
  };

  useEffect(() => {
    handleSubmit();
  }, [queue]);

  return (
    <>
      <div>
        {config.map((item) => (
          <Box
            key={item.id}
            id={item.id}
            timeout={item.timeout}
            handleSubmit={(id, timeout) => setQueue([...queue, { id, timeout }])}
            inProgress={inProgress.includes(item.id)}
            ended={ended.includes(item.id)}
          />
        ))}
      </div>
    </>
  );
};
Any ideas?
I agree with your assessment that we ultimately need to make changes on the backend. Any user can mess with the frontend and submit requests in any order they want regardless how you organize it.
I get it though, you're looking to design the happy path on the frontend such that it works with the backend as it is currently.
It's hard to tell without knowing the use-case exactly, but there may generally be some improvements we can make from a UX perspective that will apply whether we make fixes on the backend or not.
Is there an endpoint you can send multiple updates to? If so, we could debounce our network call and submit only when there is a pause in user activity (a rough sketch of this follows below).
Does the user need to be aware of order of selection and the impacts thereof? If so, it sounds like we'll need to update frontend to convey this information, which may then expose a natural solution to the situation.
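A rough sketch of that debounce idea, assuming a hypothetical batch endpoint and lodash.debounce (both are placeholders; the real call would go through your existing axios/thunk setup):
import debounce from 'lodash.debounce';
import axios from 'axios';

const pendingUpdates = [];

// flush all accumulated box updates once the user has paused for 500 ms
const flushUpdates = debounce(async () => {
  const batch = pendingUpdates.splice(0, pendingUpdates.length);
  await axios.post('/boxes/batch-update', batch); // hypothetical batch endpoint
}, 500);

export const queueBoxUpdate = (update) => {
  pendingUpdates.push(update);
  flushUpdates();
};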
It's fairly simple to create a request queue and execute them serially, but it seems potentially fraught with new challenges.
E.g. If a user clicks 5 checkboxes, and order matters, a failed execution of the second update would mean we would need to stop any further execution of boxes 3 through 5 until update 2 could be completed. We'll also need to figure out how we'll handle timeouts, retries, and backoff. There is some complexity as to how we want to convey all this to the end user.
Let's say we're completely set on going that route, however. In that case, your use of Redux for state management isn't terribly important, nor is the library you use for sending your requests.
As you suggested, we'll just create an in-memory queue of updates to be made and dequeue serially. Each time a user makes an update to a box, we'll push to that queue and attempt to send updates. Our processEvents function will retain state as to whether a request is in motion or not, which it will use to decide whether to take action or not.
Each time a user clicks a box, the event is added to the queue, and we attempt processing. If processing is already ongoing or we have no events to process, we don't take any action. Each time a processing round finishes, we check for further events to process. You'll likely want to hook into this cycle with Redux and fire new actions to indicate event success and update the state and UI for each event processed and so on. It's possible one of the libraries you use offer some feature like this as well.
// Get a better Queue implementation if queue size may get high.
class Queue {
  _store = [];
  enqueue = (task) => this._store.push(task);
  dequeue = () => this._store.shift();
  length = () => this._store.length;
}

export const createSerialProcessor = (asyncProcessingCallback) => {
  const updateQueue = new Queue();

  const addEvent = (params, callback) => {
    updateQueue.enqueue([params, callback]);
  };

  const processEvents = (() => {
    let isReady = true;
    return async () => {
      if (isReady && updateQueue.length() > 0) {
        const [params, callback] = updateQueue.dequeue();
        isReady = false;
        await asyncProcessingCallback(params, callback); // retries and all that included
        isReady = true;
        processEvents();
      }
    };
  })();

  return {
    process: (params, callback) => {
      addEvent(params, callback);
      processEvents();
    }
  };
};
Hope this helps.
Edit: I just noticed you included a codesandbox, which is very helpful. I've created a copy of your sandbox with updates made to achieve your end and integrate it with your Redux setup. There are some obvious shortcuts still being taken, like the Queue class, but it should be about what you're looking for: https://codesandbox.io/s/dank-feather-hqtf7?file=/src/lib/createSerialProcessor.js
In case you would like to use redux-saga, you can use the actionChannel effect in combination with the blocking call effect to achieve your goal:
Working fork:
https://codesandbox.io/s/hoh8n
Here is the code for boxSagas.js:
import { actionChannel, call, delay, put, take } from 'redux-saga/effects';
// import axios from 'axios';
import { submitSuccess, submitFailure } from '../actions/boxActions';
import { SUBMIT_REQUEST } from '../types/boxTypes';

function* requestSaga(action) {
  try {
    // const result = yield axios.get(`https://jsonplaceholder.typicode.com/todos`);
    yield delay(action.payload.timeout);
    yield put(submitSuccess(action.payload.id));
  } catch (error) {
    yield put(submitFailure());
  }
}

export default function* boxSaga() {
  const requestChannel = yield actionChannel(SUBMIT_REQUEST); // buffers incoming requests
  while (true) {
    const action = yield take(requestChannel); // takes a request from queue or waits for one to be added
    yield call(requestSaga, action); // starts request saga and _waits_ until it is done
  }
}
I am using the fact that the box reducer handles the SUBMIT_REQUEST actions immediately (and sets the given id as pending), while actionChannel + call handle them sequentially, so the actions trigger only one HTTP request at a time.
More on action channels here:
https://redux-saga.js.org/docs/advanced/Channels/#using-the-actionchannel-effect
Just store the promise from the previous request and wait for it to resolve before initiating the next request. The example below uses a global variable for simplicity, but you can use something else to preserve state across requests (e.g. the extraArgument from the thunk middleware).
// boxActions.ts
let submitCall = Promise.resolve();

export const submit = (id, timeout) => async (dispatch) => {
  dispatch(submitRequest(id));
  submitCall = submitCall.then(() => axios.get(`https://jsonplaceholder.typicode.com/todos`));
  try {
    await submitCall;
    setTimeout(() => {
      return dispatch(submitSuccess(id));
    }, timeout);
  } catch (error) {
    return dispatch(submitFailure());
  }
};

React Apollo Client - modify query data before it goes to cache

Is there a way to modify query response data before it is saved in the internal cache?
I'm using apollo hooks, but this question is relevant to any of front-end approaches using apollo client (HOC & Components as well).
const { data, updateQuery } = useQuery(QUERY, {
  onBeforeDataGoesToCache: originalResponseData => {
    // modify data before it is cached? Can I have something like this?
    return modifiedData;
  }
});
Obviously onBeforeDataGoesToCache does not exist, but that's exactly the behavior I'm looking for. There's an updateQuery function in the result, which basically does what is needed, but at the wrong time. I'm looking for something that works as a hook or a middleware inside the query execution.
It sounds like you want Afterware which, much like Middleware that allows operations before the request is made, allows you to manipulate data in the response e.g.
const modifyDataLink = new ApolloLink((operation, forward) => {
  return forward(operation).map(response => {
    // Modify response.data...
    return response;
  });
});

// use with apollo-client
const link = modifyDataLink.concat(httpLink);
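For completeness, a sketch of wiring that link into the client (assuming @apollo/client v3; the endpoint URL is a placeholder):
import { ApolloClient, HttpLink, InMemoryCache } from '@apollo/client';

// the httpLink referenced above
const httpLink = new HttpLink({ uri: '/graphql' });

const client = new ApolloClient({
  link: modifyDataLink.concat(httpLink), // the afterware sees every response before it reaches the cache
  cache: new InMemoryCache(),
});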

Caching in React

In my React app I have an input element. The search query should be memoized, which means that if the user has previously searched for 'John' and the API has provided valid results for that query, then the next time the user types 'Joh', the previously memoized values should be suggested (in this case 'John').
I am new to React and am trying caching for the first time. I read a few articles but couldn't implement the desired functionality.
You don't clarify which API you're using or which stack; the solution would vary somewhat depending on whether you are using XHR requests or something over GraphQL.
For an asynchronous XHR request to some backend API, I would do something like the example below.
Query the API for the search term
_queryUserXHR = (searchTxt) => {
  jQuery.ajax({
    type: "GET",
    url: url,
    data: searchTxt,
    success: (data) => {
      this.setState({
        previousQueries: this.state.previousQueries.concat([searchTxt])
      });
    }
  });
}
You would run this function whenever you want to check the search term against your API. If the API can find the search string you query, insert that data into a local state array variable (previousQueries in my example).
You could also return data from the database to be inserted if there are values unknown to your view (e.g. a database id). Above I just insert the searchTxt, which is what we pass into the function based on what the user typed in the input field. The choice is yours here.
Get suggestions for previously searched terms
I would start by adding an input field that runs a function on the onKeyPress event:
<input type="text" onKeyPress={this._getSuggestions} />
then the function would be something like:
_getSuggestions = (e) => {
  let inputValue = e.target.value;
  let { previousQueries } = this.state;
  let results = [];
  previousQueries.forEach((q) => {
    if (q.toString().indexOf(inputValue) > -1) {
      results.push(q);
    }
  });
  this.setState({ suggestions: results });
}
Then you can output this.state.suggestions somewhere and add behavior there. Perhaps some keyboard navigation or something. There are many different ways to implement how the results are displayed and how you would select one.
Note: I haven't tested the code above
I guess you have a function somewhere that queries the server, such as:
const queryServer = function(queryString) {
  /* access the server */
}
The trick is to memoize this core function only, so that your UI thinks it's actually accessing the server.
In JavaScript it is very easy to implement your own memoization decorator, but you could also use existing ones. For example, lru-memoize looks popular on npm. You use it this way:
const memoize = require('lru-memoize')
const queryServer_memoized = memoize(100)(queryServer)
This code keeps in memory the last 100 request results. Next, in your code, you call queryServer_memoized instead of queryServer.
You can create a memoization function:
const memo = (callback) => {
  // We will save the key-value pairs in the following variable. It will be our cache storage
  const cache = new Map();
  return (...args) => {
    // The key will be used to identify the different arguments combination. Same arguments means same key
    const key = JSON.stringify(args);
    // If the cache storage has the key we are looking for, return the previously stored value
    if (cache.has(key)) return cache.get(key);
    // If the key is new, call the function (in this case fetch)
    const value = callback(...args);
    // And save the new key-value pair to the cache
    cache.set(key, value);
    return value;
  };
};
const memoizedFetch = memo(fetch);
This memo function will act like a key-value cache. If the params (in our case the URL) of the function (fetch) are the same, the function will not be executed. Instead, the previous result will be returned.
So you can just use this memoized version memoizedFetch in your useEffect to make sure network requests are not repeated for that particular call.
For example you can do:
// Place this outside your react element
const memoizedFetchJson = memo((...args) => fetch(...args).then(res => res.json()));

useEffect(() => {
  memoizedFetchJson(`https://pokeapi.co/api/v2/pokemon/${pokemon}/`)
    .then(response => {
      setPokemonData(response);
    })
    .catch(error => {
      console.error(error);
    });
}, [pokemon]);
Demo integrated in React
