Sort Data on react-firebase-hooks/database - reactjs

So in React, I'm reading from the Firebase Realtime Database using the "react-firebase-hooks/database" package.
import { useList } from "react-firebase-hooks/database";
import { db, auth } from "../../firebase";
function GameHistory() {
  var dbRef = db.ref("/" + user.uid);
  const [snapshots, loading, error] = useList(dbRef);
So basically the snapshots variable contains all of the Realtime Database data.
Later in my code I simply map each element of the snapshots array to a component.
Here is the problem: I want to sort my snapshots data by the .timestamp property on each record, but I'm not sure how to do this.
I tried to sort the snapshot data when I map it:
snapshots
  .sort((a, b) => {
    return a.val().timestamp > b.val().timestamp;
  })
  .map((game, index) => (MORE CODE
But that doesn't work, because timestamp is a Firebase object and JavaScript doesn't know how to compare it.
For more context on the timestamp field, I defined it like this:
timestamp: firebase.firestore.FieldValue.serverTimestamp()
So is there any way to sort my snapshot data? Or should I use another package? If I should use something else, please show code for how to read and sort the Realtime Database.

You're mixing up two different databases here:
The import { useList } from "react-firebase-hooks/database" and other code are for the Realtime Database.
The timestamp: firebase.firestore.FieldValue.serverTimestamp() is for Cloud Firestore.
While both databases are part of Firebase, they are completely separate and each has its own API.
To write a server-side timestamp to Realtime Database, use:
timestamp: firebase.database.ServerValue.TIMESTAMP
I'd also let the database server handle the sorting instead of doing it in your code, which you can do with:
var dbRef = db.ref("/" + user.uid);
const dbQuery = dbRef.orderByChild("timestamp");
const [snapshots, loading, error] = useList(dbQuery);
...
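For completeness, here is a minimal sketch of both halves, assuming the compat-style (v8) SDK used in the question; the score field and the list markup are just placeholders:

// Writing a game entry with a Realtime Database server-side timestamp
db.ref("/" + user.uid).push({
  score: 42, // hypothetical game field
  timestamp: firebase.database.ServerValue.TIMESTAMP,
});

// Inside GameHistory, after handling loading/error: useList keeps the order of
// the orderByChild("timestamp") query, so no client-side sort should be needed
// (reverse the array if you want newest first)
return (
  <ul>
    {snapshots.map((snapshot) => (
      <li key={snapshot.key}>
        {new Date(snapshot.val().timestamp).toLocaleString()}
      </li>
    ))}
  </ul>
);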

Related

Specifically, how does Reactjs retrieve data from firebase function triggers?

I am using Express to create my Firebase Functions, and I understand how to create regular callable functions. However, I am lost on exactly how to implement background trigger functions (i.e. onCreate, onDelete, onUpdate, onWrite), as well as how React in the frontend is supposed to receive the data.
The scenario is a generic chat system that uses React, Firebase Functions with Express, and the Realtime Database. I am generally confused about how to use triggers so that when someone sends a message, another user's frontend data gets updated.
I have had a hard time finding a tutorial or documentation that covers this combination. Any links or basic programmatic examples of the lifecycle would be wonderful.
The part I do understand is how to write a trigger function:
const functions = require('firebase-functions');

exports.makeUppercase = functions.database.ref('/messages/{pushId}/original')
  .onWrite((change, context) => {
    // Only edit data when it is first created.
    if (change.before.exists()) {
      return null;
    }
    // Exit when the data is deleted.
    if (!change.after.exists()) {
      return null;
    }
    // Grab the current value of what was written to the Realtime Database.
    const original = change.after.val();
    console.log('Uppercasing', context.params.pushId, original);
    const uppercase = original.toUpperCase();
    // You must return a Promise when performing asynchronous tasks inside a Function,
    // such as writing to the Firebase Realtime Database.
    // Setting an "uppercase" sibling in the Realtime Database returns a Promise.
    return change.after.ref.parent.child('uppercase').set(uppercase);
  });
But I don't understand how this is being called or how the data from this reaches frontend code.
Background functions cannot return anything to the client. They run after a certain event, i.e. onWrite() in this case. If you want updates to the data at /messages/{pushId}/original to reach other users, you'll have to use the Firebase Client SDK to listen to that path:
import { getDatabase, ref, onValue } from "firebase/database";

const db = getDatabase();
const msgRef = ref(db, `/messages/${pushId}/original`);
onValue(msgRef, (snapshot) => {
  const data = snapshot.val();
  console.log(data);
});
You can also listen to /messages/${pushId} with onChildAdded() to get notified about any new node under that path.
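For example, a rough sketch of the onChildAdded() approach with the modular SDK (listening on the whole /messages path here, since pushId is only known once a message exists):

import { getDatabase, ref, onChildAdded } from "firebase/database";

const db = getDatabase();
const messagesRef = ref(db, "/messages");

// Fires once for every existing child and again whenever a new message node is pushed
const unsubscribe = onChildAdded(messagesRef, (snapshot) => {
  console.log("New message:", snapshot.key, snapshot.val());
});

// Call unsubscribe() (e.g. in a useEffect cleanup) to stop listening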

How to store data in this very simple React Native app?

I'm developing an app using React Native that allows you to create your own checklists and add items to them.
For example, you'd have "Create Checklist", and inside that you'd have options like "Add Item", "Delete Item", and "Edit Item": basic CRUD operations.
It's going to be completely offline but I'm wondering what the best approach to storing this data locally would be.
Should I be using a DB such as Firebase? I have read that it is overkill and that I should use something like Redux, but I'm not sure the latter will accomplish everything I need. As long as it stores data that can be edited and saves to the user's device (with minimal effort), it sounds good to me.
Would appreciate some input on this, thanks!
You could use AsyncStorage for persisting data locally on the user's phone. It is a simple, persistent key-value store.
Each checklist is most likely an array of JS objects. The documentation provides an example of how to store objects.
const storeData = async (value) => {
  try {
    const jsonValue = JSON.stringify(value)
    await AsyncStorage.setItem('@storage_Key', jsonValue)
  } catch (e) {
    // saving error
  }
}
The value parameter is any JS object. We use JSON.stringify to create a JSON string and AsyncStorage.setItem to persist the data. The string '@storage_Key' is the key for the object; it could be any string.
We retrieve a persisted object as follows.
const getData = async () => {
  try {
    const jsonValue = await AsyncStorage.getItem('@storage_Key')
    return jsonValue != null ? JSON.parse(jsonValue) : null;
  } catch (e) {
    // error reading value
  }
}
Both examples are taken from the official documentation.
Keep in mind that this functionality should be used for persistence only. While the application is running, you should load the complete list (or parts of it, if it is very large) into some sort of in-memory application state. How to implement this depends heavily on what your current code looks like. If you have a plain view, you could read from local storage in an effect and store the result in local state.
function MySuperList() {
  const [list, setList] = useState([]);
  React.useEffect(() => {
    // retrieve data using the above functionality and set the state
  }, []);
  // render list
  return (...);
}
I would implement some sort of save button for this list. When it is pressed, we persist the data to the phone's local storage.
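A rough end-to-end sketch of that idea, reusing the storeData/getData helpers above (the component, field, and button names are just placeholders):

import React, { useEffect, useState } from 'react';
import { Button, FlatList, Text } from 'react-native';

function ChecklistScreen() {
  const [list, setList] = useState([]);

  // Load any previously saved checklist once on mount
  useEffect(() => {
    getData().then((saved) => {
      if (saved) setList(saved);
    });
  }, []);

  return (
    <>
      <FlatList
        data={list}
        keyExtractor={(item, index) => String(index)}
        // item.label is a hypothetical field on each checklist item
        renderItem={({ item }) => <Text>{item.label}</Text>}
      />
      <Button title="Save checklist" onPress={() => storeData(list)} />
    </>
  );
}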

Airtable blocks: Multiple useRecords hooks to different tables force repeated app restarts

My custom Blocks app reads data from multiple Airtable tables and then displays them in context, interconnected. To react to user changes I'm using the useRecords hook, so that any user updates also propagate to the app.
While investigating why the app loads so slowly, I realized that it restarts over and over again, once for each useRecords hook. Is there a way or pattern to avoid this, while still being able to read and keep all the necessary table data up to date?
My code:
import { initializeBlock, useBase, useRecords } from '@airtable/blocks/ui';
import React from 'react';
...
function MyApp() {
  ...
  const base = useBase();
  // get all necessary Airtable tables
  const tables = React.useMemo(() => {
    return {
      productTable: base.getTableByNameIfExists(Constants.PRODUCTS_TABLE_NAME),
      costTable: base.getTableByNameIfExists(Constants.COST_TABLE_NAME),
      contactsTable: base.getTableByNameIfExists(Constants.CONTACTS_TABLE_NAME),
      conversionTable: base.getTableByNameIfExists(Constants.CONVERSIONS_TABLE_NAME),
      contributorTable: base.getTableByNameIfExists(Constants.CONTRIBUTORS_TABLE_NAME),
    };
  }, [base]);
  console.log('tables have been connected...')
  const conversions = useRecords(tables.conversionTable);
  console.log('conversions data has been read...')
  const contributors = useRecords(tables.contributorTable);
  console.log('contributor data has been read...')
  const costs = useRecords(tables.costTable);
  console.log('tracker data has been read...')
  const products = useRecords(tables.productTable);
  console.log('product data has been read...')
  const vendors = useRecords(tables.contactsTable);
  console.log('vendor data has been read...')
  ...
Console Output:
tables have been connected...
...
tables have been connected...
conversions data has been read...
...
tables have been connected...
conversions data has been read...
contributor data has been read...
...
tables have been connected...
conversions data has been read...
contributor data has been read...
tracker data has been read...
...
tables have been connected...
conversions data has been read...
contributor data has been read...
tracker data has been read...
product data has been read...
...
tables have been connected...
conversions data has been read...
contributor data has been read...
tracker data has been read...
product data has been read...
vendor data has been read...
...
For anyone facing the same issue, here is the response from Airtable dev support:
The useRecords() hook will update the component used by your app with the underlying data changes to records, but it also handles loading data automatically.
Within the useRecords() hook we use suspense while loading data so that downstream logic does not get called until the data has finished loading. As you see from your console output, this causes the component to be re-rendered with each call of useRecords() before the next line can be executed.
You might be able to reduce some of the slowness your app is experiencing by starting the data load outside of the useRecords() call which uses the suspended loads. loadDataAsync() will allow you to do this asynchronously for each table and can be called from a query result, such as the one returned by table.selectRecords(). That way, the data for each table starts loading before suspense is used via useRecords().
You'll still see the component re-render once the data for each is finished loading, but since the loading will now happen asynchronously, this should cut out some of the overall load time.
Here’s what that might look like:
const base = useBase();
// get all necessary Airtable tables
const tables = React.useMemo(() => {
  return {
    productTable: base.getTableByNameIfExists(Constants.PRODUCTS_TABLE_NAME),
    costTable: base.getTableByNameIfExists(Constants.COST_TABLE_NAME),
    contactsTable: base.getTableByNameIfExists(Constants.CONTACTS_TABLE_NAME),
    conversionTable: base.getTableByNameIfExists(Constants.CONVERSIONS_TABLE_NAME),
    contributorTable: base.getTableByNameIfExists(Constants.CONTRIBUTORS_TABLE_NAME),
  };
}, [base]);
console.log('tables have been connected...')

const productTableQueryResult = tables.productTable.selectRecords()
const costTableQueryResult = tables.costTable.selectRecords()
const contactsTableQueryResult = tables.contactsTable.selectRecords()
const conversionTableQueryResult = tables.conversionTable.selectRecords()
const contributorTableQueryResult = tables.contributorTable.selectRecords()

React.useEffect(() => {
  productTableQueryResult.loadDataAsync()
  costTableQueryResult.loadDataAsync()
  contactsTableQueryResult.loadDataAsync()
  conversionTableQueryResult.loadDataAsync()
  contributorTableQueryResult.loadDataAsync()
  return () => {
    productTableQueryResult.unloadData()
    costTableQueryResult.unloadData()
    contactsTableQueryResult.unloadData()
    conversionTableQueryResult.unloadData()
    contributorTableQueryResult.unloadData()
  }
}, [])

const conversions = useRecords(tables.conversionTable);
console.log('conversions data has been read...')
const contributors = useRecords(tables.contributorTable);
console.log('contributor data has been read...')
const costs = useRecords(tables.costTable);
console.log('tracker data has been read...')
const products = useRecords(tables.productTable);
console.log('product data has been read...')
const vendors = useRecords(tables.contactsTable);
console.log('vendor data has been read...')
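One further refinement (not part of the support reply, just a suggestion): memoize the query results so each render reuses the same QueryResult objects instead of calling selectRecords() again on every render, for example:

// Hypothetical variant: create the query results once per tables object
const queryResults = React.useMemo(() => ({
  products: tables.productTable.selectRecords(),
  costs: tables.costTable.selectRecords(),
  contacts: tables.contactsTable.selectRecords(),
  conversions: tables.conversionTable.selectRecords(),
  contributors: tables.contributorTable.selectRecords(),
}), [tables]);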

Firebase realtime database - filtering query not on client side Web/React

I'm quite new to Firebase and I'm facing some issues with the logic for querying/filtering the data I need.
My users are stored under /users and each has a list of projects, like this:
users: {
  userA: {
    projects: {
      projectId1: true,
      projectId2: true
    },
    ...
  }
  ...
}
And obviously I have the projects as such:
projects: {
  projectId1: {
    name: "bla"
  }
  ...
}
For a given user, I want to query all the projects that are in their projects list, based on their IDs.
Right now I only manage to query every single project in the database and then filter on the client side, but this has serious security and loading-time implications, as I don't want anyone to be able to query all the projects and get them. I can add security rules, but then I have access to nothing, since I can no longer query /projects/ as a whole and need to be specific.
I'm using https://github.com/CSFrequency/react-firebase-hooks/tree/master/database
and getting the data as such:
const [projects, loading, error] = useListVals(firebase.db.ref("projects"), {
  keyField: "uid",
});
So I would like to be able to pass an array of project IDs to this request, something like where({ id is included in [projectIds] }).
You'll need to load each individual project for the user separately, pretty much like a client-side join operation. This is not nearly as slow as you may think, as Firebase pipelines the operations over a single connection.
I don't see anything built into the library you use for such client-side joins, but in regular JavaScript it's something like this:
// Read the user's project keys, then fetch each project (a client-side join)
let userRef = firebase.database().ref('users').child(firebase.auth().currentUser.uid).child('projects');
userRef.once('value').then((projectKeys) => {
  let promises = [];
  projectKeys.forEach((projectKey) => {
    let key = projectKey.key;
    let projectRef = firebase.database().ref('projects').child(key);
    promises.push(projectRef.once('value'));
  });
  Promise.all(promises).then((snapshots) => {
    console.log(snapshots.map(snapshot => snapshot.val()));
  });
});
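The library you're using doesn't appear to expose such a join directly, but a rough sketch of wiring the same idea into a React component with the plain compat-style SDK (the hook and state names are just placeholders) could look like this:

function useUserProjects(uid) {
  const [projects, setProjects] = React.useState([]);

  React.useEffect(() => {
    // 1. Read the user's project keys, 2. fetch each project, 3. store the joined result in state
    firebase.database().ref('users').child(uid).child('projects').once('value')
      .then((keysSnapshot) => {
        const promises = [];
        keysSnapshot.forEach((keySnapshot) => {
          promises.push(firebase.database().ref('projects').child(keySnapshot.key).once('value'));
        });
        return Promise.all(promises);
      })
      .then((snapshots) => {
        setProjects(snapshots.map((snapshot) => ({ uid: snapshot.key, ...snapshot.val() })));
      });
  }, [uid]);

  return projects;
}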

Storing big data in redux state - 30,000 items

I currently have a WebSocket that pushes data in JSON:API format to a React Native app, which then places the items in a Redux store at intervals.
This works perfectly for 5,000-10,000 items. However, I need to support around 30,000 items, and each item (object) has around 12 keys on average.
Currently this crashes the app, either when I normalize parts of the data or merge new chunks of data with existing data already in my store.
Any advice would be really appreciated.
import debounce from 'lodash/debounce'
import merge from 'lodash/merge'
import { InteractionManager } from 'react-native'

let data = {}

const migrateData = () => {
  InteractionManager.runAfterInteractions(() => {
    dispatch(createSocketData(data))
    data = {}
  })
}

const debounced = debounce(migrateData, 3000)

cable.subscriptions.create('SyncChannel', {
  received: (payload) => {
    if (payload.serialized) {
      data = merge(data, normalize(payload.serialized))
      debounced()
    }
  }
})
I would not recommend storing big data in the Redux store. For now you need to support 30,000 items; tomorrow you'll need to support 50,000 items, then 100,000 items...
You can try to use storage-specific components to solve your problem.
For example, I use SQLite3: there is an SQLite3 component for React Native that supports both iOS and Android.
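As a rough sketch of that idea, assuming the react-native-sqlite-storage package (the database, table, and column names are just placeholders):

import SQLite from 'react-native-sqlite-storage';

// Open (or create) a local database file on the device
const db = SQLite.openDatabase({ name: 'sync.db', location: 'default' });

db.transaction((tx) => {
  // One row per synced item; the JSON payload is stored as text
  tx.executeSql('CREATE TABLE IF NOT EXISTS items (id TEXT PRIMARY KEY, payload TEXT)');
});

// Upsert a batch of incoming items instead of merging them into Redux
const saveItems = (items) => {
  db.transaction((tx) => {
    items.forEach((item) => {
      tx.executeSql(
        'INSERT OR REPLACE INTO items (id, payload) VALUES (?, ?)',
        [item.id, JSON.stringify(item)]
      );
    });
  });
};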
You might want to use Realm.
There should be no issues storing 1,000 objects in Realm. It has been optimized for storing huge amounts of data and only loads data into memory when needed.
