Re-execute relayjs query - reactjs

I'm working on a chat feature on my website which I am developing with Reactjs and Relayjs.
I can successfully query and display the relevant data (chat messages) in my view. However, the view remains static even when the chat partner sends a new message.
I'm looking for a way to implement this, either by listening for database changes or by re-executing the same query on an interval and fetching any changes.
The Relay documentation doesn't seem to describe how to do this. My first idea was to create a dummy mutation that doesn't do anything, so that the data is re-fetched and the view updates. However, not only does this seem like a bad implementation, it would probably not work, since the mutation would only return the changes and not the entire result set.
I'd rather not use 3rd party libraries, if possible.
Here's my relay container:
export default Relay.createContainer(StartConversationModal, {
  prepareVariables() {
    return {
      limit: 1000,
    };
  },
  fragments: {
    conversation: () => Relay.QL`
      fragment on Conversation {
        id,
        title,
        type,
        conversationDataList(last: $limit) {
          edges {
            node {
              last_read_message_id,
              member {
                id,
                user {
                  firstname,
                  lastname
                }
              }
            }
          }
        },
        messageList(last: $limit) {
          edges {
            node {
              id,
              message,
              sent_on,
              member {
                id,
                role,
                user {
                  firstname,
                  lastname
                }
              }
            }
          }
        },
        task {
          title,
        },
        ${AddMessageMutation.getFragment('conversation')},
      }
    `,
  },
});

To refetch all the fragments for a container, you can use the imperative forceFetch API provided by RelayContainer (see the docs).
Basically, inside StartConversationModal you can set up a poller that calls forceFetch off this.props.relay. Note that this will refetch the entire connection, which might not be what you want; you'd have to experiment with it.
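A minimal sketch of that idea (the 5-second interval and the component internals are illustrative; this.props.relay.forceFetch() is the RelayContainer API referred to above):

class StartConversationModal extends React.Component {
  componentDidMount() {
    // Poll every 5 seconds: forceFetch re-runs the container's fragments
    // against the server and re-renders when new messages arrive.
    this._poller = setInterval(() => {
      this.props.relay.forceFetch();
    }, 5000);
  }
  componentWillUnmount() {
    clearInterval(this._poller);
  }
  render() {
    // render this.props.conversation as before
    return null; // placeholder
  }
}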

How can I map data of multiple collections in snapshot?

I am not too confident working with Firestore and have trouble with more complex API calls to get data. Usually I use SQL backends in my apps.
For the section that I am working on, I would like to combine three collections to get an array of ToDos with the involved users and the category the current user labelled each ToDo with. Every involved person can label the ToDo as they prefer, which makes things a little more complicated. Broken down, the collections are structured as follows.
todo: Firestore Database Document
{
  title: string,
  involved: string[],          // user ids
  involvedCategory: string[]   // category ids mapped by index to involved
}
(I tried to have an array of objects here instead of the two arrays, but it seems I would not be able to query the array for the current user's ID, as mentioned here, so this is a workaround.)
category: Firestore Database Document
{
  title: string,
  color: string
}
user: Firebase Authentication User
{
  uid: string,
  displayName: string,
  photoURL: string,
  ...
}
THE GOAL
An array of ToDo items like this:
{
  id: string,
  title: string,
  involved: User[],
  category?: {
    title: string,
    color: string
  }
}
As I am working with TypeScript, I created an interface to use a converter with. My code looks like this so far:
import {
  DocumentData,
  FirestoreDataConverter,
  WithFieldValue,
  QueryDocumentSnapshot,
  SnapshotOptions,
  query,
  collection,
  where,
} from 'firebase/firestore'
import { store } from '../firebase'
import { useCollectionData } from 'react-firebase-hooks/firestore'
import { User } from 'firebase/auth'
import { useCategories } from './categories'
import { useAuth } from '../contexts/AuthContext'

interface ToDo {
  id: string
  title: string
  involved: User[]
  category?: {
    title: string
    color: string
  }
}

const converter: FirestoreDataConverter<ToDo> = {
  toFirestore(todo: WithFieldValue<ToDo>): DocumentData {
    return {} // not implemented yet
  },
  fromFirestore(
    snapshot: QueryDocumentSnapshot,
    options: SnapshotOptions
  ): ToDo {
    const data = snapshot.data(options)
    return {
      id: snapshot.id,
      title: data.title,
      category: undefined, // ?
      involved: [], // ?
    }
  },
}

export function useToDos() {
  const { currentUser } = useAuth()
  const { categories } = useCategories() // needed in converter
  const ref = query(
    collection(store, 'habits'),
    where('involved', 'array-contains', currentUser.uid)
  ).withConverter(converter)
  const [data] = useCollectionData(ref)
  return {
    todos: data,
  }
}
Is there any way I can do this? I have a hook that returns all of the user's categories, but I obviously can't call it outside the useToDos hook. And creating the const inside the hook does not help either, as it results in an infinite re-render.
I know this is a long one, but does anyone have tips on how I could approach this? Thanks in advance ^^
UPDATE:
I had to make two small adjustments to #ErnestoC's solution, in case anyone is doing something similar:
First, I changed the calls from currentUser.id to currentUser.uid.
Afterwards I got the very misleading Firestore error PERMISSION_DENIED: Missing or insufficient permissions, which made me experiment a lot with my security rules. But that is not where the error originated. Debugging the code line by line, I noticed the category objects resolved by the promises were not correct and had a weird path with multiple spaces at the beginning and the end of their ids. When I removed them before saving them in the promises array, it worked, although I do not see where the spaces came from in the first place.
promises.push(
  getDoc(
    doc(
      store,
      'categories',
      docSnap.data().involvedCategory[userCatIndex].replaceAll(' ', '')
    )
  )
)
The general approach, given that Firestore is a NoSQL database that does not support server-side JOINS, is to perform all the data combinations on the client side or in the backend with a Cloud Function.
For your scenario, one approach is to first query the ToDo documents by the array membership of the current user's ID in the involved array.
Afterwards, you fetch the corresponding category document the current user assigned to that ToDo (going by index mapping between the two arrays). Finally, you should be able to construct your ToDo objects with the data.
const toDoArray = [];
const promises = [];

// Querying the ToDo collection
const q = query(collection(firestoreDB, 'habits'), where('involved', 'array-contains', currentUser.id));
const querySnap = await getDocs(q);

querySnap.forEach((docSnap) => {
  // Uses index mapping
  const userCatIndex = docSnap.data().involved.indexOf(currentUser.id);
  // For each matching ToDo, get the corresponding category from the categories collection
  promises.push(getDoc(doc(firestoreDB, 'categories', docSnap.data().involvedCategory[userCatIndex])));
  // Pushes object to ToDo class/interface
  toDoArray.push(new ToDo(docSnap.id, docSnap.data().title, docSnap.data().involved));
});

// Resolves all promises of category documents, then adds the data to the existing ToDo objects.
await Promise.all(promises).then(categoryDocs => {
  categoryDocs.forEach((userCategory, i) => {
    toDoArray[i].category = userCategory.data();
  });
});

console.log(toDoArray);
Using the FirestoreDataConverter interface would not be that different, as you would still need to perform an additional query for the category data and then add it to your custom objects. Let me know if this was helpful.
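As a rough illustration of that client-side merge, here is a purely hypothetical helper (it assumes the categories returned by useCategories() carry their document id and that the two parallel arrays from the question are present; resolving the involved user IDs to full User objects would still be a separate lookup):

import { DocumentData } from 'firebase/firestore'

interface Category {
  id: string
  title: string
  color: string
}

// Combines one raw ToDo document with the categories already loaded on the client.
function buildToDo(
  id: string,
  data: DocumentData,
  uid: string,
  categories: Category[]
): { id: string; title: string; involved: string[]; category?: Category } {
  const userIndex = data.involved.indexOf(uid)
  const categoryId = data.involvedCategory?.[userIndex]
  return {
    id,
    title: data.title,
    involved: data.involved, // still user ids at this point
    category: categories.find((c) => c.id === categoryId),
  }
}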

How to force Apollo Client to use cached data for detail view page

I have a paginated, cursor-based query TODOS and a detail page with a query TODO that gets the data by ID.
Whenever I go to the detail view and use useQuery with the TODO query (which contains exactly the same data as the TODOS query result), it still tries to get the data from the server rather than from the cache. How can I avoid hitting the server, since the data already exists? I thought Apollo would detect the object by id and return it from the cache, but it doesn't. Any suggestions?
Similar issue as in this post, but I don't think that's the right approach; there should be a better solution (I hope).
This is TODOS query:
query TODOS($paginationOptions: PaginationOptionsInput) {
  todos(paginationOptions: $paginationOptions) {
    pagination {
      minCursor
      maxCursor
      sortOrder
      limit
      hasMoreResults
    }
    result {
      id
      ...SomeTodoFields
    }
  }
}
And on the detail page I have a second query, TODO:
query ($todoId: String!) {
  todo(todoId: $todoId) {
    id
    ...SomeTodoFields
  }
}
Since I am using Apollo Client < 3.0, cacheRedirects worked fine for me; you can have a look further here. Read every note carefully, it is really important! My code example:
cache: new InMemoryCache({
  fragmentMatcher,
  cacheRedirects: {
    Query: {
      todo: (_, args, { getCacheKey }) => {
        return getCacheKey({ __typename: 'TodoType', id: args.todoId })
      }
    }
  }
})
I found a good, relevant article as well, which you might want to check.
This worked for me; hope it helps someone else as well. :)
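For reference, in Apollo Client 3 cacheRedirects were removed in favour of field read functions in typePolicies; a hedged sketch of the equivalent (the typename and argument name simply mirror the snippet above):

import { InMemoryCache } from '@apollo/client';

const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        // Point the detail query at the normalized TodoType entry if it is already cached.
        todo: {
          read(_, { args, toReference }) {
            return toReference({ __typename: 'TodoType', id: args.todoId });
          },
        },
      },
    },
  },
});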

React Apollo updating client cache after mutation

I am trying to update my cache after successfully executing a mutation. Here are my query and mutation:
export const Dojo_QUERY = gql`
  query Dojo($id: Int!) {
    dojo(id: $id) {
      id,
      name,
      logoUrl,
      location {
        id,
        city,
        country
      },
      members {
        id
      },
      disziplines {
        id,
        name
      }
    }
  }`;
export const addDiszipline_MUTATION = gql`
  mutation createDisziplin($input: DisziplineInput!, $dojoId: Int!) {
    createDisziplin(input: $input, dojoId: $dojoId) {
      disziplin {
        name,
        id
      }
    }
  }`;
and my mutation call:
const [createDisziplin] = useMutation(Constants.addDiszipline_MUTATION, {
  update(cache, { data: { createDisziplin } }) {
    console.log(cache)
    const { disziplines } = cache.readQuery({ query: Constants.Dojo_QUERY, variables: { id } });
    console.log(disziplines)
    cache.writeQuery({
      // ...some update logic (crashes in the line above)
    });
  }
});
When I execute this mutation I get the error:
Invariant Violation: "Can't find field dojo({"id":1}) on object {
"dojo({\"id\":\"1\"})": {
"type": "id",
"generated": false,
"id": "DojoType:1",
"typename": "DojoType"
}
}."
In my client cache I can see
data{data{DojoType {...WITH ALL DATA INSIDE APART FROM THE NEW DISZIPLINE}}
and
data{data{DisziplineType {THE NEW OBJECT}}
There seems to be a lot of confusion around the client cache on the web. Somehow none of the proposed solutions helped or made any sense to me. Any help would be greatly appreciated.
EDIT 1:
Maybe this can help?
ROOT_QUERY: {…}
  "dojo({\"id\":\"1\"})": {…}
    generated: false
    id: "DojoType:1"
    type: "id"
    typename: "DojoType"
    <prototype>: Object { … }
  <prototype>: Object { … }
EDIT 2:
I have taken Herku's advice and started using a fragment; however, it still does not quite work.
My updated code:
const [createDisziplin] = useMutation(Constants.addDiszipline_MUTATION, {
  update(cache, { data: { createDisziplin } }) {
    console.log(cache)
    const { dojo } = cache.readFragment({
      fragment: Constants.Diszilines_FRAGMENT,
      id: "DojoType:" + id.toString()
    });
    console.log(dojo)
  }
});
with
export const Diszilines_FRAGMENT = gql`
  fragment currentDojo on Dojo {
    id,
    name,
    disziplines {
      id,
      name
    }
  }
`;
However, the result from console.log(dojo) is still undefined. Any advice?
So I think your actual error is that you have to supply the ID as a string: variables: {id: id.toString()}. You can see that these two lines are different:
dojo({\"id\":1})
dojo({\"id\":\"1\"})
But I would highly suggest using readFragment instead of readQuery and updating the dojo with the ID supplied. This should update the query as well, and all other occurrences of the dojo in all your queries. You can find documentation on readFragment here.
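As an illustration only (the cache ID must match whatever your dataIdFromObject produces, and all names come from the snippets above), the update callback could read the dojo fragment, append the new disziplin, and write it back. Note that readFragment returns the fragment data itself rather than an object with a dojo key, which may also explain the undefined result in the edit above:

const [createDisziplin] = useMutation(Constants.addDiszipline_MUTATION, {
  update(cache, { data: { createDisziplin } }) {
    const fragmentId = 'DojoType:' + id; // must match the dataIdFromObject output
    const dojo = cache.readFragment({
      fragment: Constants.Diszilines_FRAGMENT,
      id: fragmentId,
    });
    if (!dojo) return; // the dojo is not in the cache yet
    cache.writeFragment({
      fragment: Constants.Diszilines_FRAGMENT,
      id: fragmentId,
      data: {
        ...dojo,
        disziplines: [...dojo.disziplines, createDisziplin.disziplin],
      },
    });
  },
});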
Another trick is to simply return the whole dojo in the response of the mutation. I would say people should be less afraid of that and should not do too many manual cache updates, because cache updates are implicit behaviour of your API that is nowhere in your type system. That the new disziplin can be found in the disziplines field is now encoded in your frontend. Imagine you want to add another step where new disziplins have to be approved before they end up in there. If the mutation returns the whole dojo, a simple backend change would do the job and your clients wouldn't have to be aware of that behaviour.
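A hedged sketch of that alternative, assuming the server's createDisziplin payload can expose the parent dojo (that field is not part of the schema shown above):

export const addDiszipline_MUTATION = gql`
  mutation createDisziplin($input: DisziplineInput!, $dojoId: Int!) {
    createDisziplin(input: $input, dojoId: $dojoId) {
      dojo {            # hypothetical payload field
        id
        disziplines {
          id
          name
        }
      }
    }
  }`;

Because the dojo comes back with its id, the normalized cache merges it automatically and every query selecting dojo.disziplines updates without a manual update function.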

Is it ok to put business logic into a saga

I'm developing a small react/redux/saga application. It is basically a multistep form with Back and Next buttons, and a progress bar. Each form step (or form screen) has a number of fields.
According to the spec, the form should store the user-entered data on the server step by step, i.e., upon each click on the Next button. This button acts like a typical submit button for the current form step, but it should also take the user further if all requests succeed. If the server returns an error for at least one field on the current step, the user should not be able to progress to the next one.
The catch is that the requests to the server are not predetermined: some entities may be modified, some added, and some deleted, all depending on the user's choices.
So, technically, before letting the user move to the next step, I must ensure that only the needed requests were sent to the server and, of course, that the server responded to them with success statuses.
I'm using redux-saga to manage side effects, and I just don't see how I can make the individual requests (addUserEducation, deleteUserEducation, etc.) from within the component itself and still meet the spec. It feels like this case forces me to put a significant part of my "business" logic into a saga that takes all the data from the form screen and uses it to decide what should be sent to the server. Since I use a blocking call for each request, I can hopefully rely on the error selector at the end of the saga to take (or not take) the user to the next step conditionally.
This approach works, but I'm not sure it agrees with redux-saga best practices, because a number of my sagas become too "fat" and no longer deal strictly with side effects. I think it may be suboptimal and that there should be more elegant alternatives. Please help!
// Education.js (one of the form screens)
const sendForms = (nextStep, history) => {
  handleEducation(
    { userId, educationList, forms, formList },
    nextStep,
    history,
  );
};

// userSagas.js
function* handleEducation({ payload }) {
  try {
    const {
      screenData: { userId, educationList, forms, formList },
      nextStep,
      history,
    } = payload;

    // Collect ids of the forms that were `dropped` by the user,..
    const formsToDelete = educationList.reduce((acc, edObj) => {
      if (!Object.prototype.hasOwnProperty.call(forms, edObj.id)) {
        return [...acc, edObj.id];
      }
      return acc;
    }, []);

    // ...and delete the corresponding educations on server.
    yield all(
      formsToDelete.map(educationId =>
        call(deleteUserEducation, userId, educationId),
      ),
    );

    // Handle forms that are still present.
    yield all(
      formList.map(([educationId, form], index) => {
        const { school, startYear, endYear, toDate } = form;
        let fields = {
          institution: school,
          enrolled_on: startYear,
          graduated_on: endYear,
          is_current: toDate,
        };
        const isLast = index + 1 === formList.length;
        if (educationId.includes('new')) {
          // Add new education
          if (isLast && toDate === true) {
            // Omit `graduated_on` field
            const { graduated_on: graduatedOn, ...withoutGraduated } = fields;
            fields = withoutGraduated;
            return call(addUserEducation, userId, fields);
          }
          return call(addUserEducation, userId, fields);
        }
        // ...more code here...

    const errors = yield select(userSelectors.selectErrors);
    if (!errors) {
      history.push(steps[nextStep].route);
    }
  } catch (er) {
    const { response } = er;
    console.log('errors: ', response.data);
  }
}

Apollo Client cache

I just started using apollo client on a React application and I'm stuck on caching.
I have a home page with a list of products, where I do a query to get the id and name of those products, and a product page where I query for the id, name, description and image.
I would like that, if a user visits the home page first and then a specific product page, only a query for that product's description and image is made, and the name is displayed during loading (since I should have it cached already).
I followed the "Controlling the Store" part of the documentation (http://dev.apollodata.com/react/cache-updates.html) but still couldn't resolve it.
The query that runs when we go to the product page still asks for the product's id and name, even though they should be cached since I already asked for them.
I think I'm missing something but I can't figure it out.
Here is a bit of the code:
// Create the apollo graphql client.
const apolloClient = new ApolloClient({
  networkInterface: createNetworkInterface({
    uri: `${process.env.GRAPHQL_ENDPOINT}`
  }),
  queryTransformer: addTypename,
  dataIdFromObject: (result) => {
    if (result.id && result.__typename) {
      console.log(result.id, result.__typename); // can see this on the console, seems okay
      return result.__typename + result.id;
    }
    // Make sure to return null if this object doesn't have an ID
    return null;
  },
});
// home page query
// return an array of objects (Product)
export default graphql(gql`
  query ProductsQuery {
    products {
      id, name
    }
  }
`)(Home);
// product page query
// return an object (Product)
export default graphql(gql`
  query ProductQuery($productId: ID!) {
    product(id: $productId) {
      id, name, description, image
    }
  }
`, {
  options: props => ({ variables: { productId: props.params.id } }),
  props: ({ data: { loading, product } }) => ({
    loading,
    product,
  })
})(Product);
And my console output:
The answer to your question actually has two parts:
1. The client cannot actually tell for sure that these queries resolve to the same object in the cache, because they have a different path. One starts with products, the other with product. There's an open PR for client-side resolvers, which will let you give the client hints about where to find things in the cache, even if you haven't explicitly queried for them. I expect that we will publish that feature within a week or two.
2. Even with client-side resolvers, Apollo Client won't do exactly what you described above, because Apollo Client no longer does query diffing since version 0.5. Instead, all queries are fully static now. That means even if your query is in the cache partially, the full query will be sent to the server. This has a number of advantages that are laid out in this blog post.
You will still be able to display the part that's in the cache first, by setting returnPartialData: true in the options.
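As a sketch of that last point, reusing the product-page HOC from the question (returnPartialData is the option the answer refers to; its exact placement can vary between apollo-client versions of that era):

export default graphql(gql`
  query ProductQuery($productId: ID!) {
    product(id: $productId) {
      id, name, description, image
    }
  }
`, {
  options: props => ({
    variables: { productId: props.params.id },
    // Hand over whatever is already cached (e.g. id and name from the home
    // page query) while the full query is in flight.
    returnPartialData: true,
  }),
  props: ({ data: { loading, product } }) => ({ loading, product }),
})(Product);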
This question is quite old; however, there is now a solution to map the query to the correct cache location using cacheRedirects.
In my project, I have a projects query and a project query.
I can make a cacheRedirect like below:
const client = new ApolloClient({
  uri: "http://localhost:3000/graphql",
  request: async (operation) => {
    const token = await localStorage.getItem('authToken');
    operation.setContext({
      headers: {
        authorization: token
      }
    });
  },
  cacheRedirects: {
    Query: {
      project: (_, { id }, { getCacheKey }) => getCacheKey({ id, __typename: 'Project' })
    }
  }
});
Then when I load my dashboard, there is one query which gets the projects. And when navigating to a single project, no network request is made because it's reading from the cache 🎉
Read the full documentation on Cache Redirects
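For completeness, a sketch of the kind of detail query this redirect can serve from the cache (field names are illustrative). A redirect only avoids the network request if every field the detail query selects was already fetched by the list query:

const PROJECT_QUERY = gql`
  query Project($id: ID!) {
    project(id: $id) {
      id
      name
      # every field requested here must already be in the cache from the
      # projects list query, otherwise Apollo still goes to the network
    }
  }
`;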
