DB Setup:
- users
  - A: { // private user info }
- usersPublic
  - A: { // public user info }
- rooms
  - 1: { readAccess: ['A'] }
I have a component that displays all rooms, and I'm fetching them in the following way:
useFirestoreConnect(() => [{collection: 'rooms'}] )
This is working fine, but I now want to also load in the info from usersPublic for each user in the rooms readAccess array.
I'm attempting to use populates in the following way:
useFirestoreConnect(() => [{
collection: 'rooms',
populates: [{
root: 'usersPublic',
child: 'A'
}]
}])
I'm pretty sure my implementation of populates is wrong and I'm failing to understand exactly how to make this work.
I could return a bunch of other query configs for all users with read access once I have the room object but that seems inefficient and it seems that populates is meant to solve exactly this problem.
I'm also open to suggestions on modeling the DB structure - the above made sense to me and offers a nice separation between private/public user info but there might be a better way.
The way to do it is:
populates: [{ root: 'usersPublic', child: 'readAccess' }]
This results in a redux state that looks like:
// ...etc
data: {
  rooms: { /* ... rooms ... */ },
  usersPublic: { A: { /* ... usersPublic[A] ... */ }, /* etc. */ }
}
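If you then need the joined shape in a component, a selector can stitch the populated usersPublic entries onto each room. A minimal sketch against the state shape above (the name roomsWithReaders is illustrative):

const roomsWithReaders = useSelector(({ firestore: { data } }) => {
  // join each room's readAccess IDs with the usersPublic docs loaded by populates
  if (!data.rooms || !data.usersPublic) return [];
  return Object.entries(data.rooms).map(([id, room]) => ({
    id,
    ...room,
    readers: (room.readAccess || []).map((uid) => data.usersPublic[uid]),
  }));
});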
I have a query that retrieves a Model. Inside this model, there are nested models with fields.
The shape is roughly like this:
{
model: [
{
id: 1,
fields: [...]
},
{
id: 2,
fields: [...]
}
]
}
Additionally, the frontend needs the model normalized into a list, like this:
{
modelFields: [
{...},
{...},
{...},
{...}
]
}
I'm attempting to derive modelFields declaratively whenever a query or cache update changes model. I'm trying to achieve this in the typePolicies section, with Model: { merge: modelMergeMiddleware }, like so:
export function modelMergeMiddleware(
  __: ModelFragment,
  incoming: ModelFragment,
  {cache, readField}: FieldFunctionOptions
) {
  if (incoming) {
    cache.writeQuery({
      query: ModelFieldsDocument,
      data: {
        // flatten each fieldset's fields into the single modelFields list
        modelFields: incoming.fieldsets.reduce(
          (fields: ModelFieldFragment[], fieldset: FieldsetFragment) => {
            return fields.concat(fieldset.fields)
          },
          []
        )
      }
    })
  }
  return incoming
}
However, this runs into problems:
- nested cache references don't get passed through, leaving empty data
- readField and lodash's _.cloneDeep both result in Readonly data that causes errors
My question is two-fold:
1. Is there a method to work around the problems mentioned above to derive data in a merge function?
2. Is there a different approach where I can declaratively derive local-only state and keep it synchronized with cached objects?
Per question 2, my backup approach is to use a reactiveVar/Recoil to store this data. This approach has the tradeoff of needing to call a setter function in all the places the Model object gets queried or mutated in the app. Not the end of the world, but it’s easy to miss a spot or forget about the setter function.
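For reference, that fallback would look roughly like the sketch below, using @apollo/client's makeVar/useReactiveVar (modelFieldsVar, syncModelFields, and ModelField are illustrative names, not existing code):

import { makeVar, useReactiveVar } from '@apollo/client';

// Local-only derived state lives in a reactive variable.
export const modelFieldsVar = makeVar([]);

// The setter that has to be called wherever Model is queried or mutated.
export function syncModelFields(models) {
  modelFieldsVar(models.flatMap((entry) => entry.fields));
}

// Components read the derived list reactively.
// <ModelField> is an illustrative presentational component.
function ModelFieldList() {
  const modelFields = useReactiveVar(modelFieldsVar);
  return modelFields.map((field) => <ModelField key={field.id} field={field} />);
}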
I followed this video on best practices for creating flat databases with Firestore: Converting SQL structures to Firebase structures
I came up with something that looks like this:
const firestore = {
events: {
eventID: { // Doc
description: "Event Description", // Field
title: "Event Title", // Field
}
},
eventComments: { // Collection
eventID: { // Doc
comments: { // Field
commentID1: true, // Value
commentID2: true, // Value
commentID3: true, // Value
}
}
},
comments: { // Collection
commentID1: { // Doc
createdAt: "Timestamp", // Field
createdBy: "uid", // Field
content: "Comment Body" // Field
},
commentID2: {...},
commentID3: {...},
},
};
I'm not sure what the best way to get the related data is, however. I'm using React and react-redux-firebase to access the data. My current setup for the app looks like this:
<EventsDetailPage>
<Comments>
<Comment />
<Comment />
<Comment />
</Comments>
</EventsDetailPage>
I've come up with two potential methods...
Method 1
I have useFirestoreConnect in each component. The top-level component gets the event and passes the eventID to the Comments component; the Comments component uses the eventID to get the eventComments list and passes each individual commentID down to a Comment component; finally, each Comment component uses its commentID to get the relevant comment data.
My issue with this: Wouldn't this mean that there is a listener for the event, comment list, and every individual comment? Is that frowned upon?
EX: This would be in the Event, Comments, and Comment components, each with its respective values:
useFirestoreConnect(() => [
{collection: 'events', doc: eventID},
]);
const event = useSelector(({firestore: {data}}) => data.events && data.events[eventID]);
Method 2
Let's say I have a list of events; I can do a query to get the list:
useFirestoreConnect(() => [{
  collection: 'events',
  orderBy: ["createdAt", "desc"],
  limit: 10
}]);
const events = useSelector(({ firestore: { ordered } }) => ordered.events);
This is great because I believe it's a single listener, and if any data changes in any of the events, the listener will still pick up those changes.
My issue with this: I don't know how to do a where clause that would return all events for a given list of IDs.
So, say I wanted to get a list of events with something like where: ['id', '==', ['eventID1', 'eventID2', 'eventID3']].
To retrieve up to 10 items by their ID, you can use an in query:
.where('id', 'in', ['eventID1', 'eventID2', 'eventID3'])
If you have more than 10 IDs, you'll have to run multiple of these queries.
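In the useFirestoreConnect config, that would look roughly like the sketch below. It assumes the documents carry an id field (querying on the document ID itself would need FieldPath.documentId() instead) and that redux-firestore passes the in operator through to Firestore unchanged:

useFirestoreConnect(() => [{
  collection: 'events',
  // Firestore's `in` operator accepts at most 10 values per query
  where: [['id', 'in', ['eventID1', 'eventID2', 'eventID3']]],
}]);

const events = useSelector(({ firestore: { ordered } }) => ordered.events);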
I have a list of Items of whatever type. I can query all of them with query items or one with query item(id).
I realize Apollo can't know what will be returned. It knows the type, but it doesn't know the exact data. Maybe there is a way to avoid the additional request? Map one query onto another?
Pseudo-code:
// somewhere in Menu.tsx (renders first)
const { data } = useQuery(GET_ITEMS);
return data.items.map(item => <MenuItemRepresentation key={item.id} item={item} />);
// meanwhile in apollo cache (de-normalized for readability):
{ ROOT_QUERY: {
items: [ // query name per schema
{ id: 1, data: {...}, __typename: "Item" },
{ id: 2, data: {...}, __typename: "Item" },
{ id: 3, data: {...}, __typename: "Item" },
]
}
}
// somewhere in MainView.tsx (renders afterwards)
let neededId = getNeededId(); // 2
const { data } = useQuery(GET_ITEM, { variables: { id: neededId } });
return <MainViewRepresentation item={data.item} />;
Code like this will do two fetches, even though the data is already in the cache. It seems Apollo thinks at the query level. I would like a way to explain to it: "If I make an item query, look at the items query you did before. If it has no item with that id, go ahead and make the request."
Something akin to this can be done by querying items in MainView.tsx and combing through the results. It might work for the pseudo-code, but in a real app it's not that simple: the cache might be empty in some cases, or not sufficient to satisfy all required fields, which means we would have to load all items when we only need one.
Upon further research, Apollo Link looks promising. It might be possible to intercept outgoing queries. Will investigate tomorrow.
Never mind Apollo Link. What I was looking for is called cacheRedirects.
It's an option for the ApolloClient or cache constructor.
cacheRedirects: {
  Query: {
    // GET_ITEM asks for `item(id)`; redirect it to the Item already
    // normalized in the cache by the earlier `items` query
    item: (_, args, { getCacheKey }) => {
      const cacheKey = getCacheKey({
        __typename: "Item",
        id: args.id,
      });
      return cacheKey;
    },
  },
},
I'd link to documentation but it's never stable. I've seen too many dead links from questions such as this.
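If you are on Apollo Client 3, cacheRedirects is gone, but the same redirect can be expressed as a field read function. A sketch of the equivalent type policy, assuming the Item type from above:

import { InMemoryCache } from '@apollo/client';

const cache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        // Redirect `item(id)` lookups to the Item object already
        // normalized in the cache by the `items` query, if present.
        item: {
          read(_, { args, toReference }) {
            return toReference({ __typename: 'Item', id: args.id });
          },
        },
      },
    },
  },
});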
Given is a nested model structure like this:
Model Website
+ id
+ name
+ images[] // List of Image instances
Model Image
+ imageName
+ imageUrl
A serialised version of the response looks like:
{
"id": 4711,
"name": "Some name",
"images" [
{"imageName": "Beach", "imageUrl": "http://example.com/whatever.jpg"},
...
]
}
This nested model set is persisted in a document store and is returned on request by Website.id.
There is no by-id-relation to the nested list of images, as they are persisted as a list directly in the parent model. As far as I know, the classic relations in Ext.data.Model refer to the related models via a by-id-relation.
The question is: Is there any way that I can tell the parent model to use the Image model for each of the children in its images list?
As a first step, you can make your images data load into the model by using a field type of auto:
Ext.define('My.Model', {
extend: 'Ext.data.Model'
,fields: [
{name: 'images', type: 'auto'}
// ... other fields
]
});
Then:
myModel.get('images');
Should return:
[
{"imageName": "Beach", "imageUrl": "http://example.com/whatever.jpg"},
...
]
From there, you should theoretically be able to implement a fully automated solution that creates the models from this data and -- the hardest part -- keeps the created records and the children data in the parent model synchronized. But this is a very involved hack, and a lot of entry points in the Ext code base would have to be covered. As an illustration, I once tried to do that for "has one" relations, and it represented a lot of code. As a result, I never took the time to consolidate that code, and finally never used it.
I would rather advocate for a simple and local (to the model) solution. You can add a simple method to your model to get the images as records. For example:
Ext.define('My.Model', {
// ...
,getImages: function() {
var store = this.imageStore;
if (!store) {
store = new Ext.data.Store({
model: 'My.ImageModel'
,data: this.get('images') || []
});
this.imageStore = store;
}
return store;
}
});
Creating a store for the associated model will save you from having to play with the proxy and the reader. It also gives you an interface that is close to Ext's default one for associations.
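For example, the store returned by getImages() can be handed straight to a grid or dataview. A usage sketch (the grid config and websiteRecord are illustrative):

// Bind the per-record image store to a grid listing that record's images
var imagesGrid = Ext.create('Ext.grid.Panel', {
    title: 'Images',
    renderTo: Ext.getBody(),
    store: websiteRecord.getImages(), // store created lazily by getImages()
    columns: [
        { text: 'Name', dataIndex: 'imageName', flex: 1 },
        { text: 'URL',  dataIndex: 'imageUrl',  flex: 2 }
    ]
});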
If you need support for loading images more than once for the same parent record, you can hook on the field's convert method.
Finally, you may also need to handle client-side modifications of the associated data in order to be able to save them to the server. If your associated model allows it, you could simply use the child store's sync method (and don't forget to update the parent model's data in the sync callback!). But if your associated model isn't connected to an endpoint on the server side, you should be able to hook on the serialize method to save the data from the associated store (as opposed to the data stored in the parent record, which won't get updated if you work with the associated store).
Here's a last example showing both:
Ext.define('My.Model', {
extend: 'Ext.data.Model'
,fields: [
{
name: 'images'
,type: 'auto'
// enables associated data update
,convert: function(data) {
var store = this.imageStore;
if (store) {
store.loadData(data || []);
}
return data;
}
// enables saving data from the associated store
,serialize: function(value, record) {
var store = record.imageStore;
if (store) {
// care, the proxy we want is the associated model's one
var writer = store.proxy && store.proxy.writer;
if (writer) {
return Ext.Array.map(store.getRange(), function(record) {
return writer.getRecordData(record);
});
} else {
// gross implementation, simply use the records data object
return Ext.pluck(store.getRange(), 'data');
}
} else {
return record.get('images');
}
}
}
// ... other fields
]
,getImages: function() {
var store = this.imageStore;
if (!store) {
store = new Ext.data.Store({
model: 'My.ImageModel'
,data: this.get('images') || []
});
this.imageStore = store;
}
return store;
}
});
Please notice that I haven't tested this code, so it might still contain some mistakes... But I hope it will be enough to give you the general idea!
I have a simple data model that looks something like this (actual code below):
model Game:
  fields: id, team_1_id, team_2_id
model GameScore:
  fields: id, game_id, team_1_score, team_2_score, is_final, submission_date
model SpiritScore:
  fields: id, game_id, team_1_score, team_2_score
What I want seems simple. I already have code that loads Games and GameScores in bulk. I have a 'Game' instance in hand, and can call gameScores(). And I get a store, but it's empty. I have code that will dynamically load it, by placing the store into the model's hasMany definition. But what I would really like is some way to bind the Game.gameScores() call to my existing GameScores store. Even if it used a normal filter underneath, that would give me a single record that I can bind and use in a view. (Important note: the data does not come in nested form.)
This leads to my second question. Game:GameScores is 1:many, but I only ever display the most recent one (from live score reporting). What is the general approach here? I can also manually build a filter from the game_id, but I can only bind 1 record to a view, so I don't see how I can bring that other information into a view, short of a proper hasMany relationship. Is there another way?
Any and all advice, including telling me to RTFM (with a link to the relevant manual) would be greatly appreciated! I've been pulling my hair out on this (pro bono side project) for the last week.
Cheers!
b
Ext.define('TouchMill.model.Game', {
extend: 'Ext.data.Model',
config: {
fields: [ 'id', 'team_1_id', 'team_2_id' ],
hasMany: {
model: 'TouchMill.model.GameScore',
name: 'gameScores',
},
},
});
Ext.define('TouchMill.model.GameScore', {
extend: 'Ext.data.Model',
config: {
fields: [ 'id', 'game_id', 'team_1_score', 'team_2_score', 'is_final', 'submission_date', ],
},
// belongsTo necessary? Don't think so unless I want parent func?
});
Ext.define('TouchMill.model.SpiritScore', {
extend: 'Ext.data.Model',
config: {
fields: [ 'id', 'game_id', 'team_1_score', 'team_2_score', ],
},
});
I've never used Touch, so I'm speaking about Ext4 here (4.2 to be precise)... And your model definitions seem a bit broken to me (is that working in Touch?). But whatever, you'll get the general idea. If my code doesn't work in Touch, please try it with Ext4.
Also, I understood that you're loading all your scores at once. If that's not the case, my solution will need to be adapted...
So, my general reasoning is the following: if you've loaded all your scores in memory, then why not use a memory proxy that uses the score store's data as the data source for the store generated for the association? I tried that and, quite to my surprise, it worked without a glitch.
To understand this, you need to know that a proxy is an independent data source; that is, a proxy can be shared between multiple stores without problem. On the other hand, a store is expected to be bound to a single view or task. For example, if you bind the same store to two different grids, then filtering the first grid will affect the second as well.
And while most proxies do not "contain" their data, the memory proxy does. Here's a relevant excerpt of the Ext.data.proxy.Memory#read method:
resultSet = operation.resultSet = me.getReader().read(me.data)
So, enough theory, here's the proof of concept (tested in this fiddle):
// I instantiate this proxy myself in order to have a reference available
var masterScoreProxy = Ext.create('Ext.data.proxy.Memory');
Ext.define('TouchMill.model.GameScore', {
extend: 'Ext.data.Model',
fields: [ 'id', 'game_id', 'team_1_score', 'team_2_score', 'is_final', 'submission_date' ],
// I've used a remote server to ensure this all works even asynchronously
proxy: {
// configure your own
}
});
Ext.define('TouchMill.model.Game', {
extend: 'Ext.data.Model'
,fields: [ 'id', 'team_1_id', 'team_2_id' ]
,hasMany: {
model: 'TouchMill.model.GameScore'
,name: 'gameScores'
// required in order to avoid Ext autogenerating it as 'touchmill.model.game_id'
,foreignKey: 'game_id'
// needed if we don't want to have to call gameRecord.gameScores().load()
,autoLoad: true
// first part of the magic: make the generated store use my own proxy
,storeConfig: {
proxy: masterScoreProxy
}
}
});
// Just mocking a store with two games
var gameStore = Ext.create('Ext.data.Store', {
model: 'TouchMill.model.Game'
,data: [{id: 1}, {id: 2}]
,proxy: 'memory'
});
// Creating the "master" score store (that will use the model's proxy)
var scoreStore = Ext.create('Ext.data.Store', {
model: 'TouchMill.model.GameScore'
// second part's in there
,listeners: {
load: function(store, records, success) {
if (success) {
// 1. replace the data of the generated association stores' proxy
// (I must say I'm quite surprised that I didn't had to extract the data of
// every records, nor to configure a reader and all for my shared proxy...
// But hey, that works!)
masterScoreProxy.data = records;
// 2. update already generated stores
// Alternatively, you could call gameRecord.gameScores().load() individually
// before each usage of gameRecord.gameStores()
gameStore.each(function(record) {
var childStore = record.gameScoresStore;
if (childStore) {
childStore.load();
}
});
}
}
}
});
// test first load
scoreStore.load({
callback: function(records, operation, success) {
if (success) {
// and here's to prove it
gameStore.each(function(record) {
record.gameScores().each(function(score) {
console.log('Game ' + record.id + ': ' + JSON.stringify(score.data, undefined, 2));
});
});
testRefreshedData();
}
}
});
function testRefreshedData() {
// test refreshing
scoreStore.load({
callback: function(records, operation, success) {
if (success) {
console.log('--- Scores have changed ---');
gameStore.each(function(record) {
record.gameScores().each(function(score) {
console.log('Game ' + record.id + ': ' + JSON.stringify(score.data, undefined, 2));
});
});
}
}
});
}
Regarding your other questions...
If you have a 1:n for Game:Score, you've got a 1:1 for Game:MostRecentScore... So, I'd try to use that.
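For instance, something along these lines could pull the latest score out of the association store (a sketch; getMostRecentScore and gameRecord are illustrative, and it assumes submission_date sorts correctly):

// Return the most recent GameScore for a game, or undefined if there is none
function getMostRecentScore(gameRecord) {
    var scores = gameRecord.gameScores();
    scores.sort('submission_date', 'DESC');
    return scores.first();
}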
As for the view, there should always be a way -- even if hackish -- to access data nested in your records. The way will depend on what you're calling view here... See, for example this question.