Clone TensorFlow.js Loaded Model - tensorflow.js

I have a model, hand-shape, that is sequential and stateful.
This means that each inference depends on the output of the previous inference in the sequence.
I would like to run this model for both the left and the right hand, independently.
My current working solution is:
leftModel = await tf.loadLayersModel('hand-shape.json')
rightModel = await tf.loadLayersModel('hand-shape.json')
However, this has the flaw of loading the model from disk twice instead of once.
Is there a way to load the model, then clone it?
leftModel = await tf.loadLayersModel('hand-shape.json')
rightModel = cloneModelSomehow(leftModel) // don't reload from disk

I found a solution with tfjs:
// Load model from disk
leftModel = await tf.loadLayersModel('hand-shape.json');
// Clone model
const modelData = new Promise<ModelArtifacts>(resolve => leftModel.save({save: resolve as any}));
rightModel = await tf.loadLayersModel({load: () => modelData});
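The same idea can be wrapped in a small reusable helper. A minimal sketch, using the in-memory save/load handlers shown above (cloneModel is just an illustrative name):
// Capture the model's artifacts in memory via a custom save handler,
// then build a new, independent model instance from those artifacts.
async function cloneModel(model) {
  const artifacts = new Promise(resolve => model.save({save: resolve}));
  return tf.loadLayersModel({load: () => artifacts});
}

// Usage: both models start from the same weights but run independently.
// leftModel = await tf.loadLayersModel('hand-shape.json');
// rightModel = await cloneModel(leftModel);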

Related

How to store data in this very simple React Native app?

I'm developing an app using React Native that allows you to create your own checklists and add items to them.
For example, you'd have "Create Checklist", and inside that you'd have the options "Add Item", "Delete Item", "Edit Item": basic CRUD methods, etc.
It's going to be completely offline but I'm wondering what the best approach to storing this data locally would be.
Should I be using a DB such as Firebase? I have read that that is overkill and that I should use something like Redux, but I'm not sure the latter will accomplish everything I need. As long as it stores data that can be edited and is saved on the user's device (with minimal effort), it sounds good to me.
Would appreciate some input on this, thanks!
You could use AsyncStorage for persisting data locally on the user's phone. It is a simple persistent key-value store.
Each checklist is most likely an array of JS objects. The documentation provides an example of how to store objects.
const storeData = async (value) => {
  try {
    const jsonValue = JSON.stringify(value)
    await AsyncStorage.setItem('@storage_Key', jsonValue)
  } catch (e) {
    // saving error
  }
}
The value parameter can be any JS object. We use JSON.stringify to create a JSON string and AsyncStorage.setItem to persist the data. The string @storage_Key is the key for the object; it could be any string.
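For example, persisting the user's checklists (the checklists variable here is illustrative) would just be:
await storeData(checklists);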
We retrieve a persisted object as follows.
const getData = async () => {
  try {
    const jsonValue = await AsyncStorage.getItem('@storage_Key')
    return jsonValue != null ? JSON.parse(jsonValue) : null;
  } catch (e) {
    // error reading value
  }
}
Both examples are taken from the official documentation.
Keep in mind that this functionality should be used for persistence only. While the application is running, you should load the complete list, or parts of it if it is very large, into some sort of application cache. How to implement this depends heavily on what your current code looks like. If you have a plain view, you could read the local storage in an effect and store the result in local state.
function MySuperList() {
  const [list, setList] = useState([]);

  React.useEffect(() => {
    // retrieve the persisted data using getData above and put it into state
    getData().then(storedList => setList(storedList ?? []));
  }, [])

  // render list
  return (...)
}
I would implement some sort of save button for this list. When it is pressed, we persist the data in the local storage of the phone, for example:
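A minimal sketch of such a button wired into MySuperList, reusing the getData and storeData helpers from above (the Button/View usage is illustrative):
import React, { useState } from 'react';
import { View, Button } from 'react-native';

function MySuperList() {
  const [list, setList] = useState([]);

  React.useEffect(() => {
    getData().then(storedList => setList(storedList ?? []));
  }, []);

  // Persist the current list only when the user explicitly saves.
  const handleSave = async () => {
    await storeData(list);
  };

  return (
    <View>
      {/* ... render the list and its add/edit/delete controls ... */}
      <Button title="Save" onPress={handleSave} />
    </View>
  );
}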

How to force ngrx-data to clear cached entities and reload data from db

I have a typical ngrx-data arrangement of 'User' entities linked to db.
I implement the standard service to handle the data:
@Injectable({providedIn: 'root'})
export class UserService extends EntityCollectionServiceBase<UserEntity> {
  constructor(serviceElementsFactory: EntityCollectionServiceElementsFactory) {
    super('User', serviceElementsFactory);
  }
}
I read the data using:
this.data$ = this.userService.getAll();
this.data$.subscribe(d => { this.data = d; ... });
Data arrives fine. Now, I have a GUI / HTML form where the user can make changes and update the data. It also works fine. Any changes the user makes in the form are applied via:
this.data[fieldName] = newValue;
This updates the data and ngrx-data automatically updates the entity cache.
I want to implement an option where the user can cancel all changes before they are written to the db and get back the initial data from before any adjustments were made. However, I am somehow unable to overwrite the cached changes.
I tried:
this.userService.clearCache();
this.userService.load();
also tried to re-call:
this.data$ = this.userService.getAll();
but I keep getting the data from the cache that has been changed by the user, not the data from the db. In the db I can see the data is unmodified; no steps were taken to write the changes to the db.
I am unable to find a way to discard my entity cache and reload the original db data to replace the cached values.
Any input is appreciated.
You will need to subscribe to the reassigned observable whenever you change this.data$, which gets a bit messy.
Instead, bind this.data$ once via this.data$ = this.userService.entities$; then, whether you use load() or getAll(), any change to entities$ will flow into your this.data$.subscribe(). You can even skip the load and getAll calls if you already did that in another process.
You can then use clearCache() followed by load() to reset the cache.
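A minimal sketch of that reset, assuming this.data$ is bound to entities$ as described (the method name discardChanges is illustrative):
discardChanges() {
  this.userService.clearCache();  // drop the locally modified entities
  this.userService.load();        // re-fetch from the server; entities$ re-emits the db state
}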
But I strongly recommend keeping the entity data pure. If the user exits in the middle without saving or resetting, the data is changed everywhere you use this entity.
My recommended alternatives:
(1) Use an Angular FormGroup to set the form data from the entity values, then reuse that setting function to reset the form (see the sketch after option 2.2 below).
(2) Make a function that copies the data, then use that copy function as the reset, for example using _.cloneDeep.
(2.1) Using an rxjs BehaviorSubject:
resetTrigger$ = new BehaviorSubject<boolean>(false);

ngOnInit() {
  this.data$ = combineLatest([
    this.resetTrigger$,
    this.userService.entities$
  ]).subscribe(([trigger, data]) => {
    this.data = _.cloneDeep(data);
  });
  // can skip if already loaded before
  this.userService.load();
}
When you want to reset the data, push a new value to the trigger:
resetForm() {
  this.resetTrigger$.next(!this.resetTrigger$.value);
}
(2.2) Using a native function (you need to store the original):
this.data$ = this.userService.entities$.pipe(
  tap(d => {
    this.originData = d;
    this.resetForm();
  })
).subscribe();
resetForm:
resetForm() {
  this.data = _.cloneDeep(this.originData);
}
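A minimal sketch of alternative (1), assuming a simple form whose control names (name, email) are illustrative:
// import { FormControl, FormGroup } from '@angular/forms';
userForm = new FormGroup({
  name: new FormControl(''),
  email: new FormControl('')
});

// Call this after the entity loads, and again when the user cancels their changes.
setFormFromEntity(user: UserEntity) {
  this.userForm.reset({ name: user.name, email: user.email });
}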

Loading large CSV file with Papaparse not working (only first chunk loaded)

I would like to load a local file (client-side) with Papaparse into my React application. Unfortunately, it only loads the first chunk but never the whole file. My file contains about 500 rows, yet never more than 300 rows are loaded. It seems like the complete function is already called after the first chunk.
Since I need to navigate to another page once the file is loaded completely, this is a problem: I need the complete file for further processing.
The code I use at the moment:
async getData() {
  const self = this;
  let dataList = [];
  Papa.parse(await this.fetchCsv(),
    {
      delimiter: ',',
      header: true,
      chunk: function (result, parser) {
        parser.pause();
        dataList = dataList.concat(result.data)
        parser.resume();
      },
      complete: function () {
        self.updateData(dataList);
      }
    });
}

async fetchCsv() {
  const response = await fetch(this.props.location.state.filename);
  const reader = response.body.getReader();
  const result = await reader.read();
  const decoder = new TextDecoder('utf-8');
  return decoder.decode(result.value);
}
What I've also tried is using step instead of chunk but this did not change anything.
Can anyone tell me what I'm doing wrong here and why papaparse does not load the whole file?
You may be able to let Papaparse do more of the work: it can read a local File directly or stream data from a remote server. Note that your fetchCsv calls reader.read() only once, so it returns just the first chunk of the response stream; Papaparse then parses only that fragment and calls complete.
If you only have about 500 records, you may not need the complexity associated with streaming, especially since you're just accumulating the data (which it appears you are). Use streaming primarily to process the data one record at a time.
If you do want to stream, I'd recommend using the step callback instead of the chunk callback so you can process each row of data.
If you use the step or chunk callbacks, the complete callback is still invoked at the end, but it won't receive the parsed data.
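A minimal sketch of letting Papaparse fetch and stream the file itself, assuming this.props.location.state.filename is a URL (if the user picks a local file, pass the File object and drop download: true):
getData() {
  const self = this;
  const dataList = [];
  Papa.parse(this.props.location.state.filename, {
    download: true,   // Papaparse fetches and streams the file itself
    header: true,
    delimiter: ',',
    step: (results) => {
      dataList.push(results.data);  // one parsed row per call
    },
    complete: () => {
      // called once the whole file has been parsed
      self.updateData(dataList);
    }
  });
}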

Updating multiple Firestore collections

I'm trying to update 2 Firebase collections that contain the same array element.
For example, I'm building a job app. When a user creates a job, that job object is pushed into a Firebase collection called alljobs, under a document called alljobs. In addition, the same job is pushed to a Firebase collection called createdjobs, where each user has their own document of created jobs, named with the user's id.
Is there an easy way to update a specific job in both the alljobs and createdjobs collections?
For example, my approach would be something like this.
Individual Job component (obtained by previously mapping over all the jobs):
const [userjobs, setUserjobs] = useState([])
const {job, createdjobs} = props

function updateJob() {
  createdjobs?.map(job1 => {
    if (job1.jobid === job.jobid) {
      const jobindex = createdjobs.indexOf(job1)
      createdjobs[jobindex].jobtitle = 'New title'
      db.collection('createdjobs').doc(user.uid).update({
        jobs: createdjobs
      })
    }
  })
}
I would basically have to repeat this same process, this time mapping over alljobs, to update the job that has just been updated in the createdjobs collection. This gets repetitive and messy, so I'm looking for a better solution.
useEffect(() => {
  db.collection('alljobs').doc('alljobs').onSnapshot(snap => {
    setAlljobs(snap.data().jobs)
  })
}, [])
I don't think there is a shortcut for your problem, but I suggest writing a sync function as a Firebase Cloud Function.
It would watch changes on one source and sync them to the others, so that your logic code only needs to manage one source of truth.
As @Thanh Le suggested, you can write a Google Cloud Function to fulfill this purpose. Cloud Functions support a feature called triggers.
Cloud Function triggers
We can write functions that run automatically when a specified document or set of documents changes:
onCreate - triggered when a document is created
onUpdate - triggered when a document already exists and has any value changed
onDelete - triggered when a document is deleted
onWrite - triggered when onCreate, onUpdate or onDelete is triggered
Of these, you can use the onWrite trigger to implement the function.
// Assumes the usual setup:
// const functions = require('firebase-functions');
// const admin = require('firebase-admin');
// const { logger } = require('firebase-functions');
// admin.initializeApp();
exports.updateAllJobsTrigger = functions.firestore.document('createdjobs/{userId}')
  .onWrite(async (change, context) => {
    // Check if the document was deleted
    if (!change.after.exists) {
      logger.log('Document not existing. Function exited.');
      return;
    }
    const allJobsRef = admin.firestore().collection('alljobs').doc('alljobs');
    // Check if the document was just created
    if (!change.before.exists) {
      try {
        // This is a newly created document, so the new job is the first element
        const newJob = change.after.data().jobs[0];
        const data = (await allJobsRef.get()).data();
        if (data) {
          const jobs = data.jobs;
          await allJobsRef.update({
            jobs: [...jobs, newJob]
          });
          logger.info('Job added to All jobs queue.');
        }
      } catch (exception) {
        logger.error(exception)
      }
      return;
    }
    try {
      // This is an updated document; the newly added job is the last element of the array
      const newJob = change.after.data().jobs[change.after.data().jobs.length - 1];
      const data = (await allJobsRef.get()).data();
      if (data) {
        const jobs = data.jobs;
        await allJobsRef.update({
          jobs: [...jobs, newJob]
        });
        logger.info('Job added to All jobs queue.');
      }
    } catch (exception) {
      logger.error(exception)
    }
  });
As @Nimna Perera said, you can use Cloud Functions to solve this issue. Your Cloud Function should be triggered when a document is updated, created or deleted (so the onWrite option). Another way to do this is through transactions (when you need to both read and write the documents) or batched writes (when you only need to write to one or more documents). In both cases you are not limited to a single collection, so it should work for your issue.
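For example, a minimal sketch of a batched write touching both collections at once, assuming the updated createdjobs and alljobs arrays have already been built as in the question's code:
// Commit both updates atomically: either both documents change or neither does.
const batch = db.batch();
batch.update(db.collection('createdjobs').doc(user.uid), { jobs: createdjobs });
batch.update(db.collection('alljobs').doc('alljobs'), { jobs: alljobs });
await batch.commit();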

How to Update an array without downloading all data from Firebase in ReactJs

I'm new to React and am trying to create an event app in which a user can join an event.
Here is the code for joining an event:
export const JoinEvent = (id) => {
  return async dispatch => {
    let data = await firebase.firestore().collection('Events').doc(id).get()
    let tmpArray = data.data()
    let currentUser = firebase.auth().currentUser
    let newArray = tmpArray.PeopleAttending
    await firebase.firestore().collection('Events').doc(id).update({
      PeopleAttending: {...newArray, [currentUser.uid]: {displayName: currentUser.displayName}}
    })
  }
}
I have created an action; basically, in JoinEvent the id of the clicked event is passed in.
Here is what my Firestore structure looks like:
So basically I have to download the whole data, store it in a local array, add the new user, and finally update.
Since I'm downloading the whole data here, is there any way to simply add a new object without downloading everything?
Thank you.
You are doing it wrong. Firestore's maximum size for a document is 1 MiB (1,048,576 bytes), so sooner or later you're going to reach that limit if you keep adding data like this. It may seem that you're not going to reach it, but it's very unsafe to store data that way. You can check "Firestore query using an object element as parameter" for how to query objects in Firestore documents, but I suggest you don't do it that way.
The proper way to do it is to create a subcollection PeopleAttending on each document inside the Events collection and then use that subcollection to store the data.
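A minimal sketch of that subcollection approach (names follow the question; dispatch is unused here but kept for the thunk shape):
export const JoinEvent = (id) => {
  return async dispatch => {
    const currentUser = firebase.auth().currentUser;
    // One small document per attendee: no need to read the existing list first.
    await firebase.firestore()
      .collection('Events').doc(id)
      .collection('PeopleAttending').doc(currentUser.uid)
      .set({ displayName: currentUser.displayName });
  };
};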
You can also try a document set with merge or mergeFields, as documented here https://googleapis.dev/nodejs/firestore/latest/DocumentReference.html#set and here https://stackoverflow.com/a/46600599/1889685.
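If you do keep the current map field, a set with merge can add a single attendee without first downloading the existing data. A sketch, keeping the structure from the question:
const currentUser = firebase.auth().currentUser;
// merge: true deep-merges the PeopleAttending map, so other attendees are preserved.
await firebase.firestore().collection('Events').doc(id).set({
  PeopleAttending: {
    [currentUser.uid]: { displayName: currentUser.displayName }
  }
}, { merge: true });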
