I am using react-admin's useGet... hooks to fetch data from my Rails backend. The main problem is the filter argument (the last pair of curly braces in the useGetList call). How can I filter the data by dates (e.g. get only the transactions of the last month)?
This is the current (working) approach:
const { data, loading, error } = useGetList(
'transactions',
{ page: 1, perPage: 10000 },
{ field: 'id', order: 'ASC' },
{},
)
if (loading) return <p>Loading</p>
if (error) return <p>Error</p>
if (!data) return null
The entries in the database all have a createdAt and updatedAt property.
My approach would be to create a filter like this:
// constraints could be dates that I can easily set beforehand
[
  { field: 'createdAt', value: desiredLowerTimeConstraint, operation: '>=' },
  { field: 'createdAt', value: desiredUpperTimeConstraint, operation: '<=' },
]
The react-admin documentation is quite sparse about the filter property; I couldn't find good examples of what these objects are supposed to look like.
It all depends on what filter format your API expects.
For instance, in REST APIs served by JSONServer, a _lte suffix on a query string parameter name indicates that you want results "Less Than or Equal to" the value:
GET /transactions?createdAt_lte=2019-12-05
Provided you use ra-data-simple-rest as your data provider, you can craft this request by passing the parameter in the filter:
const { data, loading, error } = useGetList(
'transactions',
{ page: 1, perPage: 10000 },
{ field: 'id', order: 'ASC' },
{ createdAt_lte: '2019-12-05' },
)
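For a date range like "only the last month", json-server also understands a _gte ("Greater Than or Equal") suffix, so assuming your API follows the same convention you can pass both bounds in the same filter object. A minimal sketch:
// Compute the lower bound ("one month ago"); the date format your API
// expects may differ from ISO strings.
const oneMonthAgo = new Date();
oneMonthAgo.setMonth(oneMonthAgo.getMonth() - 1);

const { data, loading, error } = useGetList(
  'transactions',
  { page: 1, perPage: 10000 },
  { field: 'id', order: 'ASC' },
  {
    createdAt_gte: oneMonthAgo.toISOString(),
    createdAt_lte: new Date().toISOString(),
  },
);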
If your API behaves differently, then you may use the same syntax for useGetList, and transform the parameter in your dataProvider before it's sent to the API.
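As a rough sketch of that second option, you can wrap your data provider and rewrite the filter in getList before it reaches the API. The Ransack-style q[created_at_lteq] parameters below are only an assumption about what a Rails backend might accept; substitute whatever your controller actually reads:
import simpleRestProvider from 'ra-data-simple-rest';

const baseDataProvider = simpleRestProvider('https://my-rails-api.example.com');

export const dataProvider = {
  ...baseDataProvider,
  getList: (resource, params) => {
    // pull the react-admin-side filter keys out and translate them
    const { createdAt_gte, createdAt_lte, ...rest } = params.filter || {};
    const filter = { ...rest };
    if (createdAt_gte) filter['q[created_at_gteq]'] = createdAt_gte; // hypothetical Rails/Ransack param
    if (createdAt_lte) filter['q[created_at_lteq]'] = createdAt_lte; // hypothetical Rails/Ransack param
    return baseDataProvider.getList(resource, { ...params, filter });
  },
};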
I have a query that retrieves a Model. Inside this model, there are nested models with fields.
The shape is roughly like this:
{
  model: [
    {
      id: 1,
      fields: [...]
    },
    {
      id: 2,
      fields: [...]
    }
  ]
}
Additionally, the frontend needs the model normalized into a list, like this:
{
  modelFields: [
    {...},
    {...},
    {...},
    {...}
  ]
}
I’m attempting to derive modelFields declaratively whenever a query or cache update changes model. I’m trying to achieve this in the typePolicies section, with Model: { merge: modelMergeMiddleware }, like so:
export function modelMergeMiddleware(
  __: ModelFragment,
  incoming: ModelFragment,
  { cache, readField }: FieldFunctionOptions
) {
  if (incoming) {
    cache.writeQuery({
      query: ModelFieldsDocument,
      data: {
        // flatten every fieldset's fields into one modelFields list
        modelFields: incoming.fieldsets.reduce(
          (fields: ModelFieldFragment[], fieldset: FieldsetFragment) => {
            return fields.concat(fieldset.fields)
          },
          []
        )
      }
    })
  }
  return incoming
}
However, this runs into problems:
nested cache references don’t get passed through, leaving empty data
readField and lodash’s _.cloneDeep both result in readonly data that causes errors
My question is two-fold:
Is there a method to work around the problems mentioned above to derive data in a merge function?
Is there a different approach where I can declaratively derive local-only state and keep it synchronized with cached objects?
Per question 2, my backup approach is to use a reactiveVar/Recoil to store this data. This approach has the tradeoff of needing to call a setter function in all the places the Model object gets queried or mutated in the app. Not the end of the world, but it’s easy to miss a spot or forget about the setter function.
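For reference, a minimal sketch of that backup approach with an Apollo reactive variable might look like the following; modelFieldsVar and setModelFieldsFromModel are hypothetical names, and the setter would still need to be called from every place that queries or mutates the Model:
import { makeVar } from '@apollo/client';

// holds the flattened, normalized list of fields
export const modelFieldsVar = makeVar([]);

// call this wherever the Model is queried or mutated
export function setModelFieldsFromModel(model) {
  const flattened = model.fieldsets.reduce(
    (fields, fieldset) => fields.concat(fieldset.fields),
    []
  );
  modelFieldsVar(flattened);
}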
I have the following queries in my codebase:
Query all the articles
useInfiniteQuery(
['articles', { pageSize: props.pageSize }],
queryFn
);
Query articles of a single category
useInfiniteQuery(
['articles', { categoryId : props.categoryId , pageSize: props.pageSize }],
queryFn
);
Query articles related to a single user
useInfiniteQuery(
['articles', { username : props.username , pageSize: props.pageSize }],
queryFn
);
Every article has a 'Like' feature, so I have created a mutation for it.
useMutation(
articleApi.likePost(props),
{
onMutate: () => {
// I want to implement cache update here
// is there any way to update the liked article
// from all 3 queries at the same time if it present in all of
// them or some of them
},
}
);
My question is: is there any way to update the liked article in onMutate across all 3 queries at the same time, if it is present in all of them or some of them?
Have a look at setQueriesData
It will call setQueryData for all matching queries, and you can use fuzzy matching to find your entries. Since all 3 queries have the same structure, you can do:
queryClient.setQueriesData(['articles'], newData)
to update them all
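A rough sketch of how that could look inside onMutate for these infinite queries; the assumption here is that each page has an articles array and that each article has an id and a liked flag (adjust to your actual page shape):
import { useMutation, useQueryClient } from 'react-query';

function useLikeArticle() {
  const queryClient = useQueryClient();

  return useMutation(
    (articleId) => articleApi.likePost(articleId),
    {
      onMutate: async (articleId) => {
        // avoid racing against in-flight article fetches
        await queryClient.cancelQueries(['articles']);

        // fuzzy match: every query whose key starts with ['articles']
        queryClient.setQueriesData(['articles'], (old) => {
          if (!old) return old;
          return {
            ...old,
            pages: old.pages.map((page) => ({
              ...page,
              articles: page.articles.map((article) =>
                article.id === articleId
                  ? { ...article, liked: true }
                  : article
              ),
            })),
          };
        });
      },
    }
  );
}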
I followed this video on best practices for creating flat databases with Firestore: Converting SQL structures to Firebase structures
I came up with something that looks like this:
const firestore = {
  events: {
    eventID: { // Doc
      description: "Event Description", // Field
      title: "Event Title", // Field
    }
  },
  eventComments: { // Collection
    eventID: { // Doc
      comments: { // Field
        commentID1: true, // Value
        commentID2: true, // Value
        commentID3: true, // Value
      }
    }
  },
  comments: { // Collection
    commentID1: { // Doc
      createdAt: "Timestamp", // Field
      createdBy: "uid", // Field
      content: "Comment Body" // Field
    },
    commentID2: {...},
    commentID3: {...},
  },
};
I'm not sure what the best way to get the related data is, however.
I'm using React and react-redux-firestore to access the data. My current setup for the app looks like this:
<EventsDetailPage>
  <Comments>
    <Comment />
    <Comment />
    <Comment />
  </Comments>
</EventsDetailPage>
I've come up with two potential methods...
Method 1
I have useFirestoreConnect in each component. The top level gets the event and passes the eventID to the Comments component; the Comments component uses the eventID to get the eventComments list and passes the individual commentID for each comment to the Comment component; finally, each Comment component uses its commentID to get the relevant comment data.
My issue with this: Wouldn't this mean that there is a listener for the event, comment list, and every individual comment? Is that frowned upon?
Ex: This would be in the event, comments, and comment components, each with its respective values:
useFirestoreConnect(() => [
{collection: 'events', doc: eventID},
]);
const event = useSelector(({firestore: {data}}) => data.events && data.events[eventID]);
Method 2
Let's say I have a list of events; I can do a query to get the list:
useFirestoreConnect(() => [{
collection: 'events',
orderBy: ["createdAt", "desc"],
limitTo: 10
}]);
const events = useSelector(({ firestore: { ordered } }) => ordered.events);
This is great because I believe it's only one listener, and if the data in any of the events changes, the listener will still respond to the changes.
My issue with this: I don't know how to do a where clause that would return all events for a given list of IDs.
So like say if I wanted to get a list of events with where: ['id', '==', ['eventID1', 'eventID2', 'eventID3']]
To retrieve up to 10 items by their ID, you can use an in query:
.where('id', 'in', ['eventID1', 'eventID2', 'eventID3'])
If you have more than 10 IDs, you'll have to run multiple of these queries.
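With react-redux-firestore the same constraint should be expressible as where: ['id', 'in', [...]] in the useFirestoreConnect config. If the list can exceed 10 IDs, a sketch of the chunking approach with the plain Firestore SDK could look like this (db and the id field are assumptions carried over from the query above):
// split the IDs into groups of at most 10 (the limit for `in` queries),
// run one query per group, and merge the results
async function getEventsByIds(db, eventIds) {
  const chunks = [];
  for (let i = 0; i < eventIds.length; i += 10) {
    chunks.push(eventIds.slice(i, i + 10));
  }

  const snapshots = await Promise.all(
    chunks.map((chunk) =>
      db.collection('events').where('id', 'in', chunk).get()
    )
  );

  return snapshots.flatMap((snapshot) =>
    snapshot.docs.map((doc) => ({ id: doc.id, ...doc.data() }))
  );
}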
I have a list of Items of whatever type. I can query all of them with the items query, or a single one with item(id).
I realize Apollo can't know in advance what will be returned: it knows the type, but not the exact data. Is there maybe a way to avoid the additional request? To map one query onto another?
Pseudo-code:
// somewhere in Menu.tsx (renders first)
let items = useQuery(GET_ITEMS);
return items.map(item => <MenuItemRepresenation item={item} />);
// meanwhile in apollo cache (de-normalized for readability):
{
  ROOT_QUERY: {
    items: [ // query name per schema
      { id: 1, data: {...}, __typename: "Item" },
      { id: 2, data: {...}, __typename: "Item" },
      { id: 3, data: {...}, __typename: "Item" },
    ]
  }
}
// somewhere in MainView.tsx (renders afterwards)
let neededId = getNeededId(); // 2
let item = useQuery(GET_ITEM, { variables: { id: neededId } } );
return <MainViewRepresentation item={item} />;
Code like this will do two fetches, even though the data is already in the cache after the first one. It seems Apollo thinks at the query level. I would like a way to tell it: "If I make the item query, look over here at the items query you did before. If it has no item with that id, go ahead and make the request."
Something akin to this can be done by querying items in MainView.tsx and combing through the results. It might work for pseudo-code, but in a real app it's not that simple: the cache might be empty in some cases, or not contain all the required fields, which means we would have to load all items when we need just one.
Upon further research, Apollo Link looks promising. It might be possible to intercept outgoing queries. I will investigate tomorrow.
Never mind Apollo Link: what I was looking for is called cacheRedirects.
It's an option on the ApolloClient or cache constructor.
cacheRedirects: {
  Query: {
    // the key must match the query field you want to redirect ("item" here)
    item: (_, args, { getCacheKey }) => {
      const cacheKey = getCacheKey({
        __typename: "Item",
        id: args.id,
      });
      return cacheKey;
    },
  },
},
I'd link to the documentation, but such links are never stable; I've seen too many dead links in questions like this.
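For completeness, a minimal sketch of where the option could go with Apollo Client 2.x and apollo-cache-inmemory (the packages and endpoint URL here are assumptions about the setup):
import { ApolloClient } from 'apollo-client';
import { InMemoryCache } from 'apollo-cache-inmemory';
import { HttpLink } from 'apollo-link-http';

const cache = new InMemoryCache({
  cacheRedirects: {
    Query: {
      // redirect item(id) lookups to Item objects already cached by the items query
      item: (_, args, { getCacheKey }) =>
        getCacheKey({ __typename: 'Item', id: args.id }),
    },
  },
});

const client = new ApolloClient({
  cache,
  link: new HttpLink({ uri: '/graphql' }),
});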
I have a model - Configuration:
var Configuration = Model.extend({
props: {
name: 'string'
}
});
In the database, the configuration table has 3 columns: id, name, and fields. The latter stores the site config as a serialized array. When retrieving the entry from the database, I unserialize it and then pass it to the front end, so the front end receives this:
{
  "id": 1,
  "name": "global",
  "fields": {
    "enabled": true,
    "site_name": "Test"
  }
}
What I want to do is to set whatever is inside fields object as properties on my model, or maybe session so that things get triggered throughout the site when they are updated. To visualize it, I want to achieve something like this:
var Configuration = Model.extend({
props: {
enabled: 'boolean',
site_name: 'string'
}
});
So basically, is there a way to 'unwrap' the contents of the fields object somehow?
The parse method is what you're looking for in this case. See https://github.com/AmpersandJS/ampersand-state/blob/master/ampersand-state.js#L93-L98 It allows you to transform incoming props.
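A minimal sketch of what that could look like for the payload above, assuming ampersand-model (or ampersand-state) is the Model being extended; the prop list just mirrors the example fields, and parse runs on fetch or when { parse: true } is passed to the constructor:
var Model = require('ampersand-model');

var Configuration = Model.extend({
  props: {
    id: 'number',
    name: 'string',
    enabled: 'boolean',
    site_name: 'string'
  },
  // parse receives the raw server attributes before props are set,
  // so we can lift everything inside `fields` up to the top level
  parse: function (resp) {
    var fields = resp.fields || {};
    return {
      id: resp.id,
      name: resp.name,
      enabled: fields.enabled,
      site_name: fields.site_name
    };
  }
});

// constructing by hand with the raw payload:
var config = new Configuration({
  id: 1,
  name: 'global',
  fields: { enabled: true, site_name: 'Test' }
}, { parse: true });

console.log(config.site_name); // "Test"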