Apollo GraphQL Caching Questions (cacheRedirects and readFragment)

I'm new to Apollo and Apollo caching. I've been writing a demo project and have run into two issues:
Can't get cacheRedirects to work properly:
In my demo, I "Load All" COVID19 test data, which queries an API and returns an array of results for 3 countries, including Canada. Then I try to load an individual result ("Load Canada") to get it to use my cacheRedirects resolver. I think I have this set up correctly, but it always goes back to the API to do the query rather than reading from the cache.
Here is my Apollo client with cache:
const client = new ApolloClient({
  uri: "https://covid19-graphql.now.sh",
  cache: new InMemoryCache({
    dataIdFromObject: obj => {
      console.log("in cache: obj:", obj);
      let dataId = null;
      switch (obj.__typename) {
        case "Country":
          dataId = obj.__typename + "_" + obj.name;
          break;
        default:
          dataId = defaultDataIdFromObject(obj);
      }
      console.log("in cache: dataId:", dataId);
      return dataId;
    },
    cacheRedirects: {
      Query: {
        country: (_, args, { getCacheKey }) => {
          console.log("in cacheRedirects:", _, args);
          const cacheKey = getCacheKey({ __typename: "Country", ...args });
          console.log("cacheKey:", cacheKey);
          return cacheKey;
        }
      }
    }
  })
  // connectToDevTools: true
});
I can't figure out how to perform a readFragment:
I've tried so many different configurations within this code, but I never get any results.
Here is my function:
const ReadFragment = () => {
  console.log("in ReadFragment");
  try {
    const data = client.readFragment({
      id: GET_DATA.countryData,
      fragment: gql`
        fragment mostRecent on country {
          id
          text
          complete
        }
      `
    });
    if (data) {
      console.log(data);
      return (
        <div>
          From Fragment: {JSON.stringify(data)}
        </div>
      );
    } else {
      return <div>From Fragment: not found</div>;
    }
  } catch (error) {
    // console.error(error);
    return <div>From Fragment: not found</div>;
  }
};
Bonus question: I can't seem to get the Apollo Client Developer Tools extension to work in Chrome. Does it still work? My code never seems to connect to it (I've tried uncommenting connectToDevTools: true). Being able to examine the contents of the cache would be very useful for development and learning. Is there an alternate way to view the cache contents?

Apollo GraphQL maintains the cache itself, so you generally don't have to manage it manually. Here is the FetchPolicy declaration:
export declare type FetchPolicy = 'cache-first' | 'network-only' | 'cache-only' | 'no-cache' | 'standby';
As the declaration shows, there are several options for controlling how a query uses the cache:
Network Only
const { data } = useQuery(GET_LIST, {
  fetchPolicy: 'network-only'
});
Cache First
const { data } = useQuery(GET_LIST, {
  fetchPolicy: 'cache-first'
});
Cache Only
const { data } = useQuery(GET_LIST, {
  fetchPolicy: 'cache-only'
});
The remaining options can be used in the same way, depending on your requirements.
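For example, instead of repeating a fetchPolicy on every useQuery call, defaults can be set once for the whole client. This is only a minimal sketch, assuming a full ApolloClient (not apollo-boost, which ignores constructor defaultOptions as noted in a later answer on this page) and a placeholder endpoint:
const client = new ApolloClient({
  uri: "https://example.com/graphql", // placeholder endpoint, not from the original post
  cache: new InMemoryCache(),
  defaultOptions: {
    // individual useQuery calls can still override these
    watchQuery: { fetchPolicy: 'cache-and-network' },
    query: { fetchPolicy: 'network-only' },
  },
});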
If you want to maintain local state and do that kind of work, write local resolvers for those queries:
const client = new ApolloClient({
  uri: apollo.networkInterface,
  cache,
  resolvers: {
    Mutation: {
      searchQuery: (launch, _args, { cache }) => {
        console.log(cache); // cache can be queried here
        // read from cache if planning to modify the data
        // const { searchQuery } = cache.readQuery({ query: GET_LIST });
        cache.writeQuery({
          query: GET_LIST,
          data: { searchQuery: _args.searchQuery }
        });
      },
    },
  },
})
The Chrome devtools extension does work, but the client first has to connect successfully to the GraphQL server; otherwise nothing will show up in Chrome.
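For reference, a minimal sketch (not the asker's exact setup) of forcing the devtools connection on; calling cache.extract() is also a handy alternate way to dump the cache contents to the console when the extension won't connect:
const client = new ApolloClient({
  uri: "https://covid19-graphql.now.sh",
  cache: new InMemoryCache(),
  // explicitly opt in; otherwise the extension only attaches automatically in development builds
  connectToDevTools: true,
});

// alternate way to inspect the cache contents without the extension
console.log(client.cache.extract());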

Related

How to get GraphQL response data within TinyMCE file_picker_callback

I am working on image upload with the React TinyMCE editor. I have two GraphQL operations that process my image upload. This is my file_picker_callback:
file_picker_callback: function() {
  const input = document.createElement('input');
  input.setAttribute('type', 'file');
  input.setAttribute('accept', 'image/*');
  input.onchange = function() {
    const file = this.files[0];
    const isFileValid = validateUpload(file);
    if (isFileValid) {
      setLastUploadedFile(file);
      getImageSignedUrl({
        variables: {
          id,
          locale: 'en-US',
        },
      });
      setIsImageValid(false);
    }
  };
}
This is the first GraphQL call that I make:
const [getImageSignedUrl] = useLazyQuery(GET_SIGNED_IMAGE_URL, {
  fetchPolicy: 'no-cache',
  onCompleted: async ({ uploadUrlForCustomContentImage }) => {
    const res = await uploadImage(lastUploadedFile, uploadUrlForCustomContentImage?.signedUrl);
    if (res.status === 200) {
      confirmImageUpload({
        variables: {
          input: {
            id,
            locale: 'en-US',
            fileName: uploadUrlForCustomContentImage?.fileName,
          },
        },
      });
    }
  },
});
After getImageSignedUrl finishes, in its onCompleted block I make a second GraphQL call, confirmImageUpload, which should return the imageUrl that I plan to use within file_picker_callback to insert into the input field.
The second call:
const [confirmImageUpload] = useMutation(CONFIRM_IMAGE_UPLOAD, {
  fetchPolicy: 'no-cache',
});
However, I am having trouble accessing the data within file_picker_callback after confirmImageUpload finishes executing. I tried to update my local state in the onCompleted block, but the change is not picked up within file_picker_callback.
This is the first time I am working with the React TinyMCE editor, so if anyone has any suggestions please let me know.

Should manually updating the cache always be the preferred option after mutations as long as I get proper data from server?

I am writing a CRUD app with React Query and I created some custom hooks as described here: https://react-query.tanstack.com/examples/custom-hooks
In the docs I see that there are basically two ways to update the cache after a mutation:
Query invalidation (https://react-query.tanstack.com/guides/query-invalidation)
onSuccess: () => {
  queryClient.invalidateQueries("posts");
}
Updating the cache manually (https://react-query.tanstack.com/guides/invalidations-from-mutations)
// Update post example
// I get the updated post data for onSuccess
onSuccess: (data) => {
  queryClient.setQueryData("posts", (oldData) => {
    const index = oldData.findIndex((post) => post.id === data.id);
    if (index > -1) {
      return [
        ...oldData.slice(0, index),
        data,
        ...oldData.slice(index + 1),
      ];
    }
  });
},
I understand that manual update has the advantage of not doing an extra call for fetching the 'posts', but I wonder if there is any advantage of invalidating cache over the manual update. For example:
import { useMutation, useQueryClient } from "react-query";
const { API_URL } = process.env;
const createPost = async (payload) => {
const options = {
method: "POST",
headers: {
"Content-Type": "application/json",
},
body: JSON.stringify(payload),
};
if (API_URL) {
try {
const response = await fetch(API_URL, options);
if (!response.ok) {
throw new Error(response.statusText);
}
return response.json();
} catch (error) {
throw new Error(error);
}
} else {
throw new Error("No api url is set");
}
};
export default function useCreatePost() {
const queryClient = useQueryClient();
return useMutation((payload) => createPost(payload), {
// DOES INVALIDATING HAVE ANY ADVANTAGE OVER MANUAL UPDATE IN THIS CASE?
// onSuccess: () => {
// queryClient.invalidateQueries("posts");
// },
onSuccess: (data) => {
queryClient.setQueryData("posts", (oldData) => {
return [...oldData, data];
});
},
});
}
Thanks for your time!
As you state yourself, the only advantage of the manual update is that you don't spend another network call updating data you already have.
Here we have a create and delete example.
import { useMutation, useQueryClient } from 'react-query'

const queryClient = useQueryClient()

// createPost(post: PostT) {
//   const { data } = await http.post<{ post: PostT }>('/posts', { post });
//   return data.post;
// }
const createMutation = useMutation(createPost, {
  onSuccess: (post) => {
    queryClient.setQueryData<PostT[]>(['posts'], (oldData = []) => [...oldData, post])
  },
})

// deletePost(id: string) {
//   await http.delete(`/posts/${id}`);
// }
const deleteMutation = useMutation(deletePost, {
  onSuccess: (_, id) => {
    queryClient.setQueryData<PostT[]>(['posts'], (oldData = []) => oldData.filter((post) => id !== post.id))
  },
})
Invalidating the query can also be an option in some cases. The query will be invalidated and the data marked as stale, which triggers a refetch in the background, so you know the data will be as fresh as possible.
This can be handy if you:
need to update multiple queries with data from one mutation
have a (difficult) nested data structure to update
import { useMutation, useQueryClient } from 'react-query'

const queryClient = useQueryClient()

const mutation = useMutation(createPost, {
  onSuccess: () => {
    queryClient.invalidateQueries('posts')
    queryClient.invalidateQueries('meta')
    queryClient.invalidateQueries('headers')
  },
})
But it really is up to you.
The main advantage of manual updates is that you can apply them before the request is even sent to the server. If you only update manually after the request succeeds, there isn't much of a benefit unless the returned data must be shown to the user immediately; in the other cases (which, in my experience, are the majority) you are better off invalidating. With optimistic updates, you assume the request will succeed before you send it, and if it fails you simply roll back your change. That way the action happens instantly, which is better UX than performing the action, showing a loading spinner, and then showing the updated state. So I have found optimistic updates more useful for giving instantaneous feedback to the user than for saving an extra request to the server.
In most cases (including yours) you still need to invalidate the query afterwards, because your manually added post doesn't have an id yet, so you have to sync it with the list of posts from the server. Be careful with that: if you read that id somewhere else on the page, it will be undefined and will throw an error. So, at the end of the day, your mutation is not a great candidate for an optimistic update, and you should handle all the problems that can come up from your posts value containing a post with no id (as opposed to something like a follow action, which just flips a boolean in your database, so you can confidently mutate the cache and undo it if the request fails). Assuming you can handle that, your useMutation hook would look something like this:
return useMutation(
  (payload) => {
    queryClient.setQueryData("posts", (oldData) => {
      return [...oldData, payload];
    });
    return createPost(payload);
  },
  {
    onSettled: () => {
      queryClient.invalidateQueries("posts");
    },
  }
);
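For completeness, here is a sketch of the rollback pattern described above using the React Query v3 callbacks; the temporary id and payload shape are assumptions for illustration, not part of the original question:
return useMutation((payload) => createPost(payload), {
  onMutate: async (payload) => {
    // stop in-flight "posts" fetches from overwriting the optimistic entry
    await queryClient.cancelQueries("posts");
    const previousPosts = queryClient.getQueryData("posts");
    queryClient.setQueryData("posts", (oldData = []) => [
      ...oldData,
      { ...payload, id: "optimistic-temp-id" }, // placeholder until the server assigns a real id
    ]);
    // whatever is returned here is passed to onError/onSettled as `context`
    return { previousPosts };
  },
  onError: (_error, _payload, context) => {
    // roll back to the snapshot taken before the optimistic update
    queryClient.setQueryData("posts", context.previousPosts);
  },
  onSettled: () => {
    // re-sync with the server so the temporary post is replaced by the real one
    queryClient.invalidateQueries("posts");
  },
});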

apollo client 3 cache doesn't update after mutation

First of all, I'd like to apologize if this is a duplicate, but none of the existing answers in similar questions helped me out.
I am using Next.js 9.5.3 and Apollo Client 3.2.2. I have a form that a user fills out to update their profile details. On submission, the data is saved in the database and a response is returned to the client. The issue is that the response does not update the cache, even though it can be found inside ROOT_MUTATION according to Apollo devtools.
I use a query to initially load the user's data, then try to update the cache with the result of the mutation. Below are the query and mutation.
// fragments
export const editUserProfileFragment = gql`
  fragment EditUserProfileFields on ProfileInterface {
    id
    type
    slug
    name
    location {
      name
      address
      country
      latitude
      longitude
    }
    settings
    createdAt
    isActive
  }
`;

// query
export const editUserProfileQuery = gql`
  query EditUserProfile($slug: String) {
    Profile(slug: $slug) {
      ...EditUserProfileFields
    }
  }
  ${editUserProfileFragment}
`;

// mutation
export const editUserProfileMutation = gql`
  mutation EditUserProfile($id: ObjectID!, $profile: ProfileInput!) {
    editProfile(id: $id, profile: $profile) {
      ...EditUserProfileFields
    }
  }
  ${editUserProfileFragment}
`;
Here's how I use the query and mutation:
// the query
const { error, data, loading } = useQuery(editUserProfileQuery, {
  variables: { slug },
  fetchPolicy: "network-only",
})
// data returns a `Profile` object

// the mutation
const [editUserProfileMutate] = useMutation(editUserProfileMutation)
...
// save data
try {
  const response = await editUserProfileMutate({
    variables: { id, profile: inputdata },
    // update: (cache, { data }) => {
    //   const cachedData: any = cache.readQuery({
    //     query: editUserProfileQuery,
    //     variables: { slug: newProfile.slug }
    //   });
    //   const cacheId = cache.identify(data.editProfile) // to see the id; I wanted to try cache.modify() but didn't know how to proceed
    //   console.log('update.cachedData', cachedData);
    //   console.log('update.cachedData.Profile', cachedData.Profile);
    //   console.log('update.data', data);
    //   const newData = { ...cachedData.Profile, ...data.editProfile }; // i first used [] but
    //   console.log('newData', newData);
    //   // cache.writeQuery({
    //   //   query: editUserProfileQuery,
    //   //   variables: { slug: newProfile.slug },
    //   //   data: { editProfile: newData }
    //   // })
    //   // cache.modify({
    //   //   id: cacheId,
    //   // })
    // },
    // tried the below but it didn't work
    // refetchQueries: [{
    //   query: editProfilePageQuery,
    //   variables: { slug: newProfile.slug },
    // }],
    // awaitRefetchQueries: true
  });
  const updatedProfile = response.data.editProfile;
  console.log('updatedProfile', updatedProfile);
  ....
} catch (error) {
  ....
} // trycatch
The Apollo client setup below is mainly based on the Next.js with-apollo example:
...
let apolloClient;
...

const cache = new InMemoryCache({
  // https://www.apollographql.com/docs/react/data/fragments/#using-fragments-with-unions-and-interfaces
  dataIdFromObject: result =>
    `${result.__typename}:${result._id || result.id || result.name || result.slug || Math.floor(Math.random() * 1000000)}`,
  possibleTypes: {
    ProfileInterface: ["Star", "User"],
  },
  // #see https://www.apollographql.com/docs/react/caching/cache-field-behavior/#merging-non-normalized-objects from console warnings
  typePolicies: {
    User: {
      fields: {
        location: {
          merge(_existing, incoming) {
            return incoming;
          },
        },
      },
    },
  },
});

function createClient() {
  const link = makeLink();
  return new ApolloClient({
    cache,
    link,
    connectToDevTools: typeof window !== 'undefined',
    ssrMode: typeof window === 'undefined',
  });
}

export function initializeApollo(initialState = null) {
  const _apolloClient = apolloClient ?? createClient()
  if (initialState) {
    const existingCache = _apolloClient.extract()
    console.log('existingCache', existingCache);
    // _apolloClient.cache.restore({ ...existingCache, ...initialState }) // commented out on purpose
    _apolloClient.cache.restore(initialState)
  }
  if (typeof window === 'undefined') return _apolloClient
  if (!apolloClient) apolloClient = _apolloClient
  return _apolloClient
}

export function useApollo(initialState) {
  const store = useMemo(() => initializeApollo({ initialState }), [initialState])
  return store
}
So after going through my backend source code, I found out that I wasn't returning the updated values from the database, but rather the old ones. I fixed that, and now it works as it should.
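As an aside, once the server returns fresh values the normalized cache merges them automatically, since the mutation result includes the profile's id. For anyone still curious about the cache.modify() route mentioned in the commented-out update callback above, a rough, untested sketch (the field names are just examples taken from the fragment) could look like this:
const response = await editUserProfileMutate({
  variables: { id, profile: inputdata },
  update: (cache, { data }) => {
    cache.modify({
      id: cache.identify(data.editProfile),
      fields: {
        // each modifier replaces the cached field with the value returned by the mutation
        name: () => data.editProfile.name,
        settings: () => data.editProfile.settings,
        location: () => data.editProfile.location,
      },
    });
  },
});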

RelayObservable: Unhandled Error TypeError: Cannot read property 'subscribe' of undefined in React and Relay

I have followed the subscription tutorial from How to GraphQL React + Relay (https://relay.dev/docs/en/subscriptions), but it is still not working.
I'm using Relay Modern in my app and have successfully integrated queries, but the requestSubscription function does not work.
Any help would be awesome.
My environment.js file:
function setupSubscription(config, variables, cacheConfig, observer) {
  const query = config.text;
  const subscriptionClient = new SubscriptionClient('ws://192.168.1.19:8000/subscriptions', { reconnect: true });
  const id = subscriptionClient.on({ query, variables }, (error, result) => {
    console.log(result, 'result');
    observer.onNext({ data: result });
  });
}

const network = Network.create(fetchQuery, setupSubscription);

const environment = new Environment({
  network,
  store,
});

export default environment;
My Subscription.js file:
const subscription = graphql`
  subscription newVoteSubscription {
    leaderboardUpdate {
      id
      game_id
      event_id
      colarr
      rowarr
    }
  }
`;

function newVoteSubscription(callback) {
  const variables = {};
  return requestSubscription(environment, {
    subscription: subscription,
    variables: variables,
    onError: (error) => {
      console.log(error, "error");
    },
    onNext: (res) => {
      console.log(res, 'onNext');
      // callback();
    },
    updater: proxyStore => {
      console.log(proxyStore, 'proxyStore');
    },
    onCompleted: () => {
      console.log('test');
    },
  });
}

export default newVoteSubscription;
I had trouble with the network as well. On Relay 7 using an Observable worked for me. This also handles error cases and the server closing the subscription.
const subscriptionClient = new SubscriptionClient('ws://192.168.1.19:8000/subscriptions', { reconnect: true });

function setupSubscription(request, variables, cacheConfig) {
  const query = request.text;
  // Observable is imported from the relay-runtime package
  return Observable.create(sink => {
    const c = subscriptionClient.request({ query, variables }).subscribe(sink);
    return c.unsubscribe;
  });
}
I'm not sure why I went with the sink/unsubscribe approach, but this is what worked for me. As far as I remember, the Observable types used by Relay and subscriptions-transport-ws were not compatible.
I'd also advise you to hoist the new SubscriptionClient() call outside of the setupSubscription function (as in the snippet above); otherwise you'll open a new WebSocket for each subscription request.
I got the response, but now observer.onNext is undefined. My updated environment.js:
const setupSubscription = (config, variables, cacheConfig, observer) => {
  const query = config.text;
  const subscriptionClient = new SubscriptionClient('ws://192.168.1.19:8000/subscriptions', { reconnect: true });
  subscriptionClient.request({ query, variables }).subscribe((result) => {
    observer.onNext({ data: result });
  });
  return Observable.create(() => {
    return subscriptionClient;
  });
};

const environment = new Environment({
  network: Network.create(fetchQuery, setupSubscription),
  store: new Store(new RecordSource()),
});

React: Data not showing until page refreshes

I am currently building a simple CRUD workflow using React and GraphQL. After I create an object (an article in this case, which just has an id, title, and description), I navigate back to an index page that displays all of the currently created articles. My issue is that after an article is created, the index page does not display it until I refresh the page. I am using Apollo to query the GraphQL API and have disabled caching on it, so I'm not sure why the data isn't displaying. I've set breakpoints in my ArticlesIndex's componentDidMount function and ensured that it executes and that, at that point, the database does include the newly added article.
My server side is never even hit when the client-side query to retrieve all articles executes. I'm not sure what is caching this data and why it is not being retrieved from the server as expected.
My ArticlesCreate component inserts the new record and redirects back to the ArticlesIndex component as follows:
handleSubmit(event) {
  event.preventDefault();
  const { client } = this.props;
  var article = {
    "article": {
      "title": this.state.title,
      "description": this.state.description
    }
  };
  client
    .mutate({ mutation: CREATE_EDIT_ARTICLE, variables: article })
    .then(({ data: { articles } }) => {
      this.props.history.push("/articles");
    })
    .catch(err => {
      console.log("err", err);
    });
}
Then my ArticlesIndex component retrieves all articles from the database as follows:
componentDidMount = () => {
  const { client } = this.props; // client is an ApolloClient
  client
    .query({ query: GET_ARTICLES })
    .then(({ data: { articles } }) => {
      if (articles) {
        this.setState({ loading: false, articles: articles });
      }
    })
    .catch(err => {
      console.log("err", err);
    });
};
and I've configured ApolloClient not to cache data in my App.js as follows:
const defaultApolloOptions = {
  watchQuery: {
    fetchPolicy: 'network-only',
    errorPolicy: 'ignore',
  },
  query: {
    fetchPolicy: 'network-only',
    errorPolicy: 'all',
  },
}

export default class App extends Component {
  displayName = App.name;

  client = new ApolloClient({
    uri: "https://localhost:44360/graphql",
    cache: new InMemoryCache(),
    defaultOptions: defaultApolloOptions
  });

  //...render method, route definitions, etc
}
Why is this happening and how can I solve it?
It looks like this is an issue with Apollo Boost not supporting defaultOptions, as noted in this GitHub issue. To resolve it, I changed:
const defaultApolloOptions = {
  watchQuery: {
    fetchPolicy: 'network-only',
    errorPolicy: 'ignore',
  },
  query: {
    fetchPolicy: 'network-only',
    errorPolicy: 'all',
  },
}

export default class App extends Component {
  displayName = App.name;

  client = new ApolloClient({
    uri: "https://localhost:44360/graphql",
    cache: new InMemoryCache(),
    defaultOptions: defaultApolloOptions
  });

  //...render method, route definitions, etc
}
To:
const client = new ApolloClient({
  uri: "https://localhost:44360/graphql"
});

client.defaultOptions = {
  watchQuery: {
    fetchPolicy: 'network-only',
    errorPolicy: 'ignore',
  },
  query: {
    fetchPolicy: 'network-only',
    errorPolicy: 'all',
  },
};

export default class App extends Component {
  //....
}
I can see that you are fetching the data and setting your component's state on initial mount. Most probably, when you redirect, the componentDidMount lifecycle hook doesn't fire again because the component is already mounted. If that is the issue, try using the componentDidUpdate lifecycle hook so that your component knows there was an update and can re-fetch the data.
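A rough sketch of that suggestion, assuming the index component receives router props (so there is something to compare when the redirect happens) and using a hypothetical loadArticles helper that wraps the client.query call from the question:
loadArticles = () => {
  const { client } = this.props;
  client
    .query({ query: GET_ARTICLES, fetchPolicy: "network-only" })
    .then(({ data: { articles } }) => {
      if (articles) {
        this.setState({ loading: false, articles });
      }
    })
    .catch(err => console.log("err", err));
};

componentDidMount() {
  this.loadArticles();
}

componentDidUpdate(prevProps) {
  // hypothetical check: refetch when the route location changes after the redirect
  if (prevProps.location !== this.props.location) {
    this.loadArticles();
  }
}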
