Similar to the question here, I have found that when using optimisticResponse and update for a mutation, the id set from the server's response is wrong. Furthermore, the id actually gets set by running the optimistic function again.
In the mutation below, refetchQueries is commented out on purpose. I don't want to use that; I want to manage everything through update only.
Also notice the optimisticResponse id has a "-" prepended to it to prove the optimistic function is run twice:
id: "-" _ uuid(),
Mutation
graphql(MutationCreateChild, {
  options: {
    // refetchQueries: [{QueryAllChildren, variables: {limit: 1000}}],
    update: (proxy, {data: {createChild}}) => {
      const query = QueryAllChildren;
      const data = proxy.readQuery({query});
      data.listChildren.items.push(createChild);
      proxy.writeQuery({query, data});
      console.log("id: ", createChild.id);
    }
  },
  props: props => ({
    createChild: child => {
      return props.mutate({
        variables: child,
        optimisticResponse: () => ({
          createChild: {
            ...child,
            id: "-" + uuid(),
            __typename: "Child"
          }
        })
      });
    }
  })
})
The output from the console.log statement is:
id: -6c5c2a28-8bc1-49fe-92e1-2abade0d06ca
id: -9e0a1c9f-d9ca-4e72-88c2-064f7cc8684e
While the actual server response in the Chrome developer console looks like this:
{"data":{"createChild":{"id":"f5bd1c27-2a21-40c6-9da2-9ddc5f05fd40","__typename":"Child"}}}
Is this a bug or am I not accessing the id correctly in the update function?
It's a known issue, which has now been fixed. I imagine it'll get released to the npm registry soon.
https://github.com/awslabs/aws-mobile-appsync-sdk-js/pull/43
https://github.com/awslabs/aws-mobile-appsync-sdk-js/commit/d26ea1ca1a8253df11dea8f11c1749e7bad8ef05
Using your setup, I believe it is normal for the update function to be called twice, and you are correct that the real id from the server will only be there the second time. Apollo takes the object you return from optimisticResponse and passes it to the update function so your UI can immediately show the change without waiting for the server. When the server response comes back, the update function is called again against the same state (i.e. the state without the optimistic result), and this time you can apply the change with the correct value from the server.
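As a rough sketch (not your exact setup, just the standard Apollo update signature), you can tell the two calls apart inside update, which the "-" prefix you already add makes easy:
update: (proxy, { data: { createChild } }) => {
  // First call: the optimistic object (its id starts with the "-" you prepended).
  // Second call: the real server response.
  const isOptimistic = createChild.id.startsWith("-");
  console.log(isOptimistic ? "optimistic id:" : "server id:", createChild.id);

  const query = QueryAllChildren;
  const data = proxy.readQuery({ query });
  // Defensive guard: only push if this id is not already in the list. If the
  // optimistic layer is rolled back correctly, this should always be true.
  if (!data.listChildren.items.some(item => item.id === createChild.id)) {
    data.listChildren.items.push(createChild);
    proxy.writeQuery({ query, data });
  }
}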
As for why the second id you list with the '-' is not the same as the id you see in the chrome dev console, I am not sure. Are you sure that it was actually that request that matched up with that call to console.log?
Related
I have some queries and mutations, and one mutation updates a fairly big entity, so I want to add an optimistic response when the mutate function is fired. The problem is that even though I pass optimisticResponse an object with the full data the mutation will eventually return, it does not get added to the cache. The UI seems to refresh only once the mutation response is ready: with or without the optimistic response, the time it takes to update the UI is the same, so I assume the optimistic response is not working.
Some code examples I have:
Mutation:
mutation UpdateList($id: ID, $data: ListData) {
  updateList(id: $id, data: $data) {
    list_id # 1
    name
  }
}
action
const [action] = useMutation(mutation_from_above)
// async body function so await can be used
await action({
  variables: { id: 1, data: { name: 'secret name' } },
  optimisticResponse: {
    updateList: {
      __typename: 'List',
      list_id: 1,
      name: 'some updated name until data is back'
    }
  }
})
and of course I have configured the custom id field for the List typename in the cache config like this:
cache: new InMemoryCache({
  typePolicies: {
    List: {
      keyFields: ['list_id'],
    },
  },
}),
It looks pretty simple, but for me it does not work. I also checked on the API side, and the response from the mutation is the same as what I pass to the optimisticResponse object. Is there something important I am missing that explains why it is not working? Can someone explain what I need to do to get this working?
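For reference, a minimal sanity-check sketch of what I would expect (assuming direct access to the configured cache instance; cache.identify is part of Apollo Client 3 and respects keyFields):
// Hypothetical check: with keyFields: ['list_id'], the cache should be able to
// build an identifier for the entity the optimisticResponse is meant to update.
const cacheId = cache.identify({ __typename: 'List', list_id: 1 });
console.log(cacheId); // expected: something like 'List:{"list_id":1}'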
Thanks, cheers!
I have become slightly lost with react-query. Essentially, I have a useQuery to fetch a user from my database. Their details are added to a form and they can update and submit.
The problem I have is that the update is done to a different database. The main database will be batch updated at a later point. As such, instead of refetching the initial data, I need to use setQueryData to update the cache version.
const queryClient = useQueryClient()

const { mutate } = useMutation(postUser, {
  onSuccess: async (response) => {
    console.log(response)
    queryClient.cancelQueries('user');
    const previousUser = queryClient.getQueryData('user');
    console.log(previousUser)
    queryClient.setQueryData('user', {
      ...previousUser,
      data: [
        previousUser.data,
        { '#status': 'true' },
      ],
    })
    return () => queryClient.setQueryData('user', previousUser)
  }
})
At the moment I have something like the above. So it calls postUser and gets a response. The response looks something like so
data:
  data:
    user_uid: "12345"
    status: "true"
  message: "User added."
  status: 1
I then use getQueryData to get the cached version of the data, which currently looks like this:
data:
  #userUuid: "12345"
  #status: ""
message: "User found."
status: 1
So I need to update #status in the cached version to now be "true". With what I have above, it seems to add a new entry to the cache:
data: Array(2)
  0: {#userUuid: "12345", #status: ""}
  1: {#status: "true"}
message: "User found."
status: 1
So how do I overwrite the existing one without adding a new row?
Thanks
This is not really react-query specific. In your setQueryData code, you set data to an array with two entries:
data: [
  previousUser.data,
  { '#status': 'true' },
],
first entry = previousUser.data
second entry = { '#status': 'true' }
overriding would be something like this:
queryClient.setQueryData('user', {
  ...previousUser,
  data: {
    ...previousUser.data,
    '#status': 'true',
  },
})
On another note, it seems like you're mixing the optimistic onMutate callback with the onSuccess callback. If you want to do optimistic updates, you'd implement the onMutate function in a similar way to what you've done above (a sketch follows the list below):
cancel outgoing queries
set data
return "rollback" function that can be called onError
This is basically the workflow found here in the docs.
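A rough sketch of that optimistic flow, reusing the names from your snippet (postUser and the 'user' key; the exact cache shape is an assumption on my part):
const queryClient = useQueryClient()

const { mutate } = useMutation(postUser, {
  onMutate: async (newUser) => {
    // 1) cancel outgoing refetches so they don't overwrite the optimistic update
    await queryClient.cancelQueries('user')
    // 2) snapshot the previous value, then set the optimistic data
    const previousUser = queryClient.getQueryData('user')
    queryClient.setQueryData('user', (old) => ({
      ...old,
      data: { ...old.data, '#status': 'true' },
    }))
    // 3) return a context object that acts as the "rollback" value
    return { previousUser }
  },
  // roll back to the snapshot if the mutation fails
  onError: (err, newUser, context) => {
    queryClient.setQueryData('user', context.previousUser)
  },
})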
If you implement onSuccess, you're updating the cache after the mutation was successful, which is also a legitimate, albeit different, use case. Here, you don't need to return anything; it would be more similar to the updates from mutation responses example.
I have a custom hook useServerStatus that fetches from a RESTful API with axios. Checking the network tab, the response went through fine, I can see all my data. Using console.log to print out the result or using debugger to check the result in the browser works flawlessly. However, calling the setState method that I get from useState will not save the response data.
ServerStatus Interface (ServerStatus.ts)
interface ServerStatus {
  taskid: string
  taskmodule: string
  taskident?: string
  status: string
  server: string
  customer: string
}
useServerStatus Hook (useServerStatus.ts)
export default function useServerStatus() {
  const [serverStatus, setServerStatus] = useState<ServerStatus[][]>([]);

  useEffect(() => {
    fetchServerStatus();
  }, []);

  const fetchServerStatus = () => {
    axios.get<ServerStatus[][]>(`${config.apiURL}/servers`)
      .then(res => setServerStatus(res.data));
  }

  return serverStatus;
}
Network Tab
https://i.imgur.com/cWBSPVz.png
The first request you see in the network tab is handled the same exact way, no problems there.
React Developer Console
https://i.imgur.com/YCq3CPo.png
Try
const fetchServerStatus = () => {
  axios.get<ServerStatus[][]>(`${config.apiURL}/servers`)
    .then(res => { setServerStatus(res.data) });
}
So, to answer my own question:
I figured out that the problem wasn't about data not being saved in state, but data not correctly being received by axios.
I fixed it with a workaround. Instead of returning a ServerStatus[][] in my backend, I returned a ServerStatus[]. I was able to use this data instead.
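For completeness, the hook after that change looks roughly like this (same code as above, just typed with the flat ServerStatus[]):
export default function useServerStatus() {
  // state now holds a flat list instead of a nested array
  const [serverStatus, setServerStatus] = useState<ServerStatus[]>([]);

  useEffect(() => {
    // assumes the backend endpoint now returns ServerStatus[]
    axios.get<ServerStatus[]>(`${config.apiURL}/servers`)
      .then(res => setServerStatus(res.data));
  }, []);

  return serverStatus;
}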
Following the lesson here https://reactjs.org/docs/hooks-custom.html, the most obvious thing that jumps out at me is that you aren't returning the state variable serverStatus in your code, whereas the example returns "isOnline". Try to match this by returning serverStatus from your custom hook to see if it helps.
I am trying to update state with setState in a for loop, but for some reason the state isn't being copied; it's just being replaced. There should be 2 clients, but instead I am getting one. Can anyone tell me why this is happening? The console.log is returning both clients.
const handleViewClients = () => {
  for (let i = 0; i < clients.length; i++) {
    console.log(clients[i].clientid);
    fetch("http://localhost:3005/all-clients/" + clients[i].clientid)
      .then((response) => response.json())
      .then((result) => {
        console.log(result);
        setBarbersClient({
          ...barbersClient,
          client: result,
        });
      });
  }
};
I have also tried this... The console.log is returning what I need
Promise.all(
  clients.map((client) =>
    fetch("http://localhost:3005/all-clients/" + client.clientid)
  )
)
  .then((resp) => resp.json())
  .then((result) => {
    console.log(result.username)
    setBarbersClient({
      ...barbersClient,
      client: result,
    });
  });
Here is the route from the server side
app.get("/all-clients/:clientid", (req, res) => {
db.NewClientsx.findOne({
where: {
id: req.params.clientid,
},
}).then((response) => {
res.json(response);
});
});
There are some fundamental concepts of sync vs. async code that you aren't accounting for here. State changing (and fetching) is asynchronous, so it won't run until after this synchronous loop has finished executing (during which the state value will remain unchanged). Also, it's a bad idea to change state in a loop, for this reason and others.
Fetch all the clients, then do one state change at the end with all the fetched data. You can utilise things like Promise.all and Promise.spread to achieve this. Here's an example of doing multiple fetches then dealing with the results in one batch: How can I fetch an array of URLs with Promise.all?
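A hedged sketch of what that could look like with your endpoint (storing the results under a clients key is an assumption on my part; adapt it to your actual state shape):
const handleViewClients = () => {
  Promise.all(
    clients.map((client) =>
      fetch("http://localhost:3005/all-clients/" + client.clientid)
        // .json() must be called on each response, not on the array Promise.all resolves to
        .then((response) => response.json())
    )
  ).then((results) => {
    // a single state update containing every fetched client, using the updater
    // form so we don't rely on a possibly stale barbersClient reference
    setBarbersClient((previous) => ({
      ...previous,
      clients: results,
    }));
  });
};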
You're making two distinct mistakes of which either is enough to cause the behaviour you're seeing.
1. You're overwriting the client property.
Every time you call the setter function you're overwriting the previous value of the client property. You'll need some data structure that supports multiple values like a map:
setBarbersClient({
  ...barbersClient,
  clients: {
    ...barbersClient.clients,
    [result.id]: result
  },
});
You will need to change your render logic somewhat to accommodate the new data structure.
2. You're using a stale reference.
When you access barbersClient, its setter may already have been called with a different value, and your reference still points to the value from the previous run of the render function. You can make sure your reference is fresh by using a set state action callback.
setBarbersClient(previousValue => ({
  ...previousValue,
  clients: {
    ...previousValue.clients,
    [result.id]: result
  },
}));
previousValue will never be stale inside the set state action function body.
I am using GraphQL with Apollo-Client in my React(Typescript) application with an in memory cache. The cache is updated on new items being added which works fine with no errors.
When an item is removed, the GraphQL Apollo-Server backend returns a string confirming the successful delete operation. That triggers the update function, which reads the cache and then modifies it by filtering out the id of the removed item. This is performed using the mutation hook from Apollo-Client.
const [deleteBook] = useMutation<{ deleteBook: string }, DeleteBookProps>(DELETE_BOOK_MUTATION, {
  variables: { id },
  onError(error) {
    console.log(error);
  },
  update(proxy) {
    const bookCache = proxy.readQuery<{ getBooks: IBook[] }>({ query: GET_BOOKS_QUERY });
    if (bookCache) {
      proxy.writeQuery<IGetBooks>({
        query: GET_BOOKS_QUERY,
        data: { getBooks: bookCache.getBooks.filter((b) => b._id !== id) },
      });
    }
  },
});
The function works and the frontend is updated with the correct items in cache, however the following error is displayed in the console:
Cache data may be lost when replacing the getBooks field of a Query object.
To address this problem (which is not a bug in Apollo Client), define a custom merge function for the Query.getBooks field, so InMemoryCache can safely merge these objects:
existing: [{"__ref":"Book:5f21280332de1d304485ae80"},{"__ref":"Book:5f212a1332de1d304485ae81"},{"__ref":"Book:5f212a6732de1d304485ae82"},{"__ref":"Book:5f212a9232de1d304485ae83"},{"__ref":"Book:5f21364832de1d304485ae84"},{"__ref":"Book:5f214e1932de1d304485ae85"},{"__ref":"Book:5f21595a32de1d304485ae88"},{"__ref":"Book:5f2166601f6a633ae482bae4"}]
incoming: [{"__ref":"Book:5f212a1332de1d304485ae81"},{"__ref":"Book:5f212a6732de1d304485ae82"},{"__ref":"Book:5f212a9232de1d304485ae83"},{"__ref":"Book:5f21364832de1d304485ae84"},{"__ref":"Book:5f214e1932de1d304485ae85"},{"__ref":"Book:5f21595a32de1d304485ae88"},{"__ref":"Book:5f2166601f6a633ae482bae4"}]
For more information about these options, please refer to the documentation:
* Ensuring entity objects have IDs: https://go.apollo.dev/c/generating-unique-identifiers
* Defining custom merge functions: https://go.apollo.dev/c/merging-non-normalized-objects
Is there a better way to update the cache so this error won't be received?
I too faced the exact same warning, and unfortunately didn't come up with a solution other than the one suggested here: https://go.apollo.dev/c/merging-non-normalized-objects
const client = new ApolloClient({
  // ...
  cache: new InMemoryCache({
    typePolicies: {
      Query: {
        fields: {
          getBooks: {
            merge(existing, incoming) {
              return incoming;
            },
          },
        },
      },
    },
  }),
});
(I am not sure whether I wrote your fields and types correctly, though, so you might need to adjust this code a bit.)
Basically, the code above tells Apollo Client how to deal with mergeable data. In this case, I simply replace the old data with the incoming data.
I wonder, though, if there's a better solution.
I've also faced the same problem. I've come across a GitHub thread that offers two alternative solutions here.
The first is evicting what's in your cache before calling cache.writeQuery:
cache.evict({
  // Often cache.evict will take an options.id property, but that's not necessary
  // when evicting from the ROOT_QUERY object, as we're doing here.
  fieldName: "notifications",
  // No need to trigger a broadcast here, since writeQuery will take care of that.
  broadcast: false,
});
In short, this flushes the existing field data from your cache so the data you write next becomes the new source of truth, and there is no longer a concern about silently losing the old data.
An alternative suggestion for the apollo-client v3 is posted further below in the same thread:
cache.modify({
  fields: {
    notifications(list, { readField }) {
      return list.filter((n) => readField('id', n) !== id)
    },
  },
})
This approach removes a lot of boilerplate, so you don't need to use readQuery, evict, and writeQuery. The catch is that if you're using TypeScript you'll run into some implementation issues: under the hood the data is in InMemoryCache format rather than the usual GraphQL shape, so you'll be dealing with Reference objects, types that aren't inferred, and other odd things.
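For example, adapting that snippet to the getBooks field from the question with explicit types might look roughly like this (a sketch; Reference and readField come from @apollo/client):
import { Reference } from "@apollo/client";

cache.modify({
  fields: {
    // existing items arrive as Reference objects in InMemoryCache format,
    // so the id has to be read via readField instead of accessing book._id directly
    getBooks(existingBooks: readonly Reference[] = [], { readField }) {
      return existingBooks.filter((book) => readField<string>("_id", book) !== id);
    },
  },
});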