Bad shape in LSTM model - tensorflow.js

I'm using TensorFlow.js, and I have this code to build my recurrent neural network model for a classification problem with 3 classes and instances of size 250.
I have the following error message when I try to fit my model:
Error: Error when checking target: expected dense_Dense1 to have shape [,3], but got array with shape [4827,1].
I'm pretty new to building my own models in tfjs, and I think I messed up the tensor shapes.
PS: my dataset contains 4827 instances and my embeddingSize is 32
function buildModel(maxLen, vocabularySize, embeddingSize, numClasses)
{
    const model = tensorflow.sequential();
    model.add(tensorflow.layers.embedding(
    {
        inputDim: vocabularySize,
        outputDim: embeddingSize, // embeddingSize = 32
        inputLength: maxLen // maxLen = 250
    }));
    model.add(tensorflow.layers.lstm({units: embeddingSize/*, returnSequences: true*/}));
    model.add(tensorflow.layers.dense({units: numClasses, activation: 'softmax'})); // numClasses = 3
    return model;
}
const history = await model.fit(data, labels, {
    epochs: epochs,
    batchSize: batchSize,
    validationSplit: validationSplit,
    callbacks: () =>
    {
        console.log("Coucou");
    }
});
console.log(history);
Thank you

You need to change the LSTM layer's output shape by setting returnSequences to false:
model.add(tensorflow.layers.lstm({units: embeddingSize, returnSequences: false}));

The problem was my data, which had labels in 1D format (0, 1, or 2) instead of one-hot format ([1,0,0], [0,1,0], [0,0,1]).
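In tfjs that conversion can be done with tf.oneHot on an int32 tensor of labels; as a sketch of what the encoding produces, here is the equivalent in plain JavaScript (oneHot here is an illustrative helper, not the tfjs API):

```javascript
// Sketch of what tf.oneHot(labels, numClasses) produces for integer class labels.
// In tfjs you would call tf.oneHot(tf.tensor1d(labels, 'int32'), numClasses) instead.
function oneHot(labels, numClasses) {
  return labels.map((label) => {
    const row = new Array(numClasses).fill(0);
    row[label] = 1;
    return row;
  });
}

// Labels in 1D format (0, 1 or 2) become rows of length numClasses,
// so a [4827] label array becomes a [4827, 3] target matrix.
console.log(oneHot([0, 2, 1], 3)); // [[1,0,0], [0,0,1], [0,1,0]]
```

With one-hot targets of shape [4827, 3], the dense softmax layer's expected shape [, 3] is satisfied.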

Related

Tensorflow.js predict returning NaNs

I converted a Keras model into a TensorFlow.js model using the simple tensorflowjs_converter --input_format keras ./model/L_keypoint_classifier_final.h5 L_layer_model. I managed to get this model working in a .ts (TypeScript) file.
Now I am focused on deploying this model using React and Typescript (in a .tsx file). My app component is loading the models as such:
const [models, setModels] = useState<Models>({
    L_Model: undefined,
    R_Model: undefined,
});
useEffect(() => {
    loadModel().then((models) => {
        setModels(models);
        setIsLoading(false);
    });
});
The loadModel() function is exported from another file and it is:
export async function loadModel() {
    let result: Models = { R_Model: undefined, L_Model: undefined };
    result.R_Model = await tf.loadLayersModel("/right/model.json");
    result.L_Model = await tf.loadLayersModel("/left/model.json");
    return result;
}
That directory of the models is in the public folder of my project. After loading the models in the app component, I pass them to a child component using props.
<Camera models={models}></Camera>
They are received in the camera component as:
const Camera: FunctionComponent<CameraProps> = (props) => {
const { R_Model, L_Model } = props.models;
In the camera component I pass in a tf.Tensor2D. This tensor does in fact contain values that I checked. But when I pass them to the model.predict() function, it just returns a tensor full of NaNs.
This is my code for preprocessing the input and passing it to the model:
// Preprocess Landmarks
// @ts-ignore
let landmark_list = calc_landmark_list(landmarks);
landmark_list = pre_process_landmarks(landmark_list);
// @ts-ignore
landmark_list = tf.tensor2d([landmark_list]);
console.log(landmark_list.dataSync());
let prediction;
if (isRightHand) {
    prediction = R_Model?.predict(landmark_list);
} else {
    prediction = L_Model?.predict(landmark_list);
}
const scores = prediction.arraySync()[0];
After that, I try to find the maxScore of the predictions, but since arraySync() returns a NaN array, it does not work. My team and I have tried searching for different options. Some include wrapping the predict function inside an async function, but that doesn't seem to work either (or maybe we have done it incorrectly, although we have followed the examples thoroughly).
The console.log of the landmark_list.dataSync() prints out:
Float32Array(42) [0, 0, -0.2683601677417755, -0.1023331806063652, -0.4781370162963867, -0.397993803024292, -0.5191399455070496, -0.6676312685012817, -0.46050554513931274, -0.8477477431297302, -0.30691489577293396, -0.9023468494415283, -0.49582260847091675, -1, -0.5734853148460388, -0.7551659941673279, -0.5509241223335266, -0.5708747506141663, -0.15572300553321838, -0.9109046459197998, -0.38624173402786255, -0.9391834735870361, -0.4641483426094055, -0.6930190920829773, -0.4609870910644531, -0.49743616580963135, -0.00984301045536995, -0.8530527353286743, -0.25299814343452454, -0.7750100493431091, -0.32405075430870056, -0.5182365775108337, -0.32825687527656555, -0.3154793083667755, 0.11740472167730331, -0.7356364130973816, -0.12479904294013977, -0.6477926969528198, -0.21985816955566406, -0.43255504965782166, -0.24492989480495453, -0.25398018956184387, buffer: ArrayBuffer(168), byteLength: 168, byteOffset: 0, length: 42, Symbol(Symbol.toStringTag): 'Float32Array']

Custom Layer with kwargs in tfjs

I'm new to tensorflowjs and I'm struggling to implement some custom layers; if someone could point me in the right direction, that would be really helpful!
For example, I have a layer in the InceptionResnetV1 architecture where I'm multiplying the layer by a constant scale (this was originally an unsupported Lambda layer, which I'm switching out for a custom layer), but the value of this scale changes per block. This works fine in Keras with an implementation such as the one below, using load_model with ScaleLayer in the custom objects:
class ScaleLayer(tensorflow.keras.layers.Layer):
    def __init__(self, **kwargs):
        super(ScaleLayer, self).__init__(**kwargs)

    def call(self, inputs, **kwargs):
        return tensorflow.multiply(inputs, kwargs.get('scale'))

    def get_config(self):
        return {}

x = ScaleLayer()(x, scale = tensorflow.constant(scale))
I tried defining this in a similar way in JavaScript and then registered the class:
class ScaleLayer extends tf.layers.Layer {
    constructor(config?: any) {
        super(config || {});
    }

    call(input: tf.Tensor, kwargs: Kwargs) {
        return tf.tidy(() => {
            this.invokeCallHook(input, kwargs);
            const a = input;
            const b = kwargs['scale'];
            return tf.mul(a, b);
        });
    }

    static get className() {
        return 'ScaleLayer';
    }
}
tf.serialization.registerClass(ScaleLayer);
However I'm finding that the kwargs are always empty. I tried another similar method where I passed scale as another dimension of the input, then did input[0] * input[1], which again worked fine for the keras model but not in javascript.
I feel like I'm missing something key on the way to defining this kind of custom layer with a changing value per block on the javascript end, so if someone would be able to point me in the right direction it would be much appreciated! Thanks.
constructor(config?: any) {
    super(config || {});
}
The config is passed to the parent constructor. But as indicated by the question, the ScaleLayer layer also needs to keep some config properties of its own:
constructor(config?: any) {
    super(config || {});
    // this.propertyOfInterest = config.propertyOfInterest
    // make sure that config is an object
    this.scale = config.scale
}
Then for the computation, the ScaleLayer property scale can be used:
call(input: tf.Tensor, kwargs?: Kwargs) {
    return tf.tidy(() => {
        this.invokeCallHook(input, kwargs);
        const a = input;
        return tf.mul(a, this.scale);
    });
}
Use the layer this way:
const model = tf.sequential();
...
model.add(new ScaleLayer({scale: 1}));
...
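As a self-contained sketch of the pattern this answer uses (capture the per-block constant in the constructor, use it in call), here it is in plain JavaScript; the class below is for illustration only, while a real tfjs layer would extend tf.layers.Layer, multiply with tf.mul, and also implement getConfig so the scale survives save/load:

```javascript
// Minimal sketch of the configure-at-construction pattern from the answer.
// ScaleLayer here is a plain class; it does NOT extend tf.layers.Layer.
class ScaleLayer {
  constructor(config) {
    // Keep the per-block constant as an instance property instead of
    // expecting it in call-time kwargs (which arrive empty in tfjs).
    this.scale = (config && config.scale) !== undefined ? config.scale : 1;
  }

  // Element-wise multiply, standing in for tf.mul(input, this.scale).
  call(input) {
    return input.map((x) => x * this.scale);
  }
}

// Each block gets its own constant, mirroring the per-block scales
// in InceptionResnetV1.
const block1 = new ScaleLayer({ scale: 2 });
console.log(block1.call([1, 2, 3])); // [2, 4, 6]
```

The key point is that the value travels through the layer's config, not through kwargs, which is why the registered tfjs class receives it reliably.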

How to upload image and pass in to tensorflowjs model to get prediction using reactjs?

EDIT
Using the graph model format and the updated code example, I've managed to get it to return a prediction. The issue now is that it always returns 1, no matter which image I feed it, so I'm wondering if I'm not passing in the right image data.
Second EDIT: Changed the way I was passing in the img object, but still getting 1 for every image I feed it.
I have only just started looking into tensorflowjs and am using a prebuilt keras model I have been given access to. This model is a binary classifier. The model has been saved as an .h5 file and I have been asked to run it in the browser using tensorflowjs and react. Essentially I want to select an image from my local storage or an sd card and feed it to the model to get a yes or no classification.
I've followed the tensorflowjs docs in converting the Keras model to a TF.js Layers format, but then couldn't load the model: I was getting an error about an unknown layer, RandomFlip. I then tried converting the model to a graph model, as I couldn't find a solution to the error and thought I'd give it a try. This loaded the model, but then there were more issues when feeding it the image: The shape of dict['image_tensor'] provided in model.execute(dict) must be [-1,380,380,1], but was [380, 380]. I searched for that and got it to resize to [-1,380,380,1], but then it complained about the size not being the expected one, so I thought maybe I'd messed up in some of the previous steps.
To convert to a graph model I used the following command: tensorflowjs_converter --input_format keras --output_format tfjs_layers_model /Users/myUser/Documents/save_at_45.h5 /Users/myUser/Documents/convert-keras-model and in my code loading it with the loadGraphModel method. Following this path has at least allowed me to load the model.
I also tried converting it to a Layers format with: tensorflowjs_converter --input_format keras --output_format tfjs_layers_model /Users/myUser/Documents/save_at_45.h5 /myUser/mariomendes/Documents/convert-keras-model and in my code loading it with the loadLayersModel. This returns the error Unknown layer: RandomFlip. I've tried searching for a solution to this, but haven't been able to find one.
Does knowing it is a .h5 file mean I should know if it needs to be converted to a tf Graph format or Layers format or is there something else that determines which format it should be converted to?
I've stored the converted model in both formats and it's weights in S3 and am getting it from there.
For my react code I have done the following:
import React, { useState, useEffect } from "react";
import "./index.css";
import * as tf from "@tensorflow/tfjs";

function ImgImporter() {
    const [file, setFile] = useState(null);
    const [model, setModel] = useState(null);
    const [processing, setProcessing] = useState(false);
    const [prediction, setPrediction] = useState(null);
    const [imageLoaded, setImageLoaded] = useState(false);

    function readImage(file) {
        return new Promise((rs, rj) => {
            const fileReader = new FileReader();
            fileReader.onload = () => rs(fileReader.result);
            fileReader.onerror = () => rj(fileReader.error);
            fileReader.readAsDataURL(file);
        });
    }

    async function handleImgUpload(event) {
        const {
            target: { files },
        } = event;
        const _file = files[0];
        const fileData = await readImage(_file);
        setFile(fileData);
        setProcessing(true);
    }

    useEffect(() => {
        async function loadModel() {
            if (!model) {
                const _model = await tf.loadGraphModel("/model.json");
                setModel(_model);
            }
        }
        loadModel();
    });

    useEffect(() => {
        async function predict() {
            if (imageLoaded && file) {
                const imageElement = document.createElement("img");
                imageElement.src = file;
                imageElement.onload = async () => {
                    const tensor = tf.browser
                        .fromPixels(imageElement, 1)
                        .resizeNearestNeighbor([380, 380])
                        .expandDims()
                        .toFloat();
                    const prediction = await model.predict(tensor).data();
                    setPrediction(parseInt(prediction, 10));
                    setProcessing(false);
                    setImageLoaded(false);
                };
            }
        }
        predict();
    }, [imageLoaded, model, file]);

    return (
        <div className="File-input-container">
            <form className="Form">
                <label htmlFor="upload-image">Upload image</label>
                <input
                    id="image-selector"
                    type="file"
                    name="upload-image"
                    accept="image/*"
                    className="File-selector"
                    onChange={handleImgUpload}
                    disabled={!model || processing}
                />
            </form>
            <div className="Img-display-container">
                <img
                    onLoad={() => {
                        setImageLoaded(true);
                    }}
                    alt=""
                    src={file}
                />
            </div>
            <div className="Img-processing-container">
                {processing ? (
                    <p>Loading ...</p>
                ) : prediction !== null ? (
                    <div>
                        <p>{prediction === 1 ? "Yes" : "No"}</p>
                    </div>
                ) : null}
            </div>
        </div>
    );
}

export default ImgImporter;
When I upload an image this is returning the following result in the console as the value of prediction:
dataId: {id: 195}
dtype: "float32"
id: 94
isDisposedInternal: false
kept: false
rankType: "2"
scopeId: 6
shape: (2) [1, 1]
size: 1
strides: [1]
Would be great if someone could shed some light on this or help me finding the right direction.
If you want to get the value, you can use prediction.dataSync() or its promise counterpart await prediction.data()
Regarding your second edit. You're converting the image to a float, but does the model expect a normalized float? You might need to append .div(255) or whatever normalization is needed. Please post the specs for your model.
Also, as stated by edkeveked, you can use dataSync() to get your data, but it's worth noting you could also have used arraySync(), which would maintain the returned tensor depth.
Also, I noticed you didn't do any cleanup. So your tensors will build up in GPU memory. Don't forget to dispose.
I hope these things help.
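On the normalization point: if the model was trained on inputs scaled to [0, 1], the raw 0–255 pixel values need dividing by 255 before predict — in tfjs that would be an extra .div(255) in the tensor chain (assuming that is indeed the model's expected preprocessing; check its training pipeline). The arithmetic, sketched in plain JavaScript:

```javascript
// Sketch of the [0, 255] -> [0, 1] scaling that .div(255) performs in tfjs.
function normalizePixels(pixels) {
  return pixels.map((p) => p / 255);
}

console.log(normalizePixels([0, 51, 255])); // [0, 0.2, 1]
```

In the real code this would be done on the tensor itself, ideally inside tf.tidy (or with explicit dispose calls) so the intermediate tensors don't accumulate in GPU memory.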

How to use geometry of vector tile layer in mapbox to make intersections with Turf.js?

I am using Mapbox-gl in React.
I am trying to make an intersection between a polygon selected and a complete layer
map.on('click', 'buildings', function(e) {
    map.getCanvas().style.cursor = 'pointer';
    // getting the main polygon
    const features = map.queryRenderedFeatures(e.point, {layers: ['buildings']});
    // getting the features of the layer with which to make the intersection
    const featurestoTest = map.queryRenderedFeatures({layers: ['floodplains_from_2016']});
    let building = undefined;
    let fll = undefined;
    features.forEach(feat => {
        building = turf.polygon(feat.geometry);
        featurestoTest.forEach(feature => {
            fll = turf.polygon(feature.geometry);
            // until here there is no error, but once it tries to intersect it throws an error
            const featureIntersect = turf.intersect(fll, building);
        });
    });
});
The error thrown is
Error: coordinates must only contain numbers
The geometry objects returned by Mapbox look like this:
type: "Polygon"
coordinates: Array(1)
0: Array(25)
0: (2) [-77.02939443290234, 38.89539175681929]
1: (2) [-77.02935017645359, 38.89536461871285]
...
So I wonder how the polygons should be constructed to use turf.intersect after clicking a polygon in Mapbox.
The turf.polygon() method takes an array of coordinates as an argument. You should use feat.geometry.coordinates instead of only feat.geometry
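The reason is that queryRenderedFeatures returns features whose geometry is a full GeoJSON object ({ type, coordinates }), while turf.polygon expects just the rings array, in which every position is a pair of numbers — hence the coordinates must only contain numbers error. A plain-JavaScript sketch of the distinction (extractRings is an illustrative helper, not a Turf API):

```javascript
// A GeoJSON Polygon geometry shaped like what queryRenderedFeatures returns.
const geometry = {
  type: 'Polygon',
  coordinates: [
    // one linear ring: first and last positions coincide
    [[-77.0293, 38.8953], [-77.0293, 38.8954], [-77.0294, 38.8953], [-77.0293, 38.8953]],
  ],
};

// turf.polygon wants the rings array, i.e. geometry.coordinates,
// not the wrapping geometry object.
function extractRings(geometry) {
  if (geometry.type !== 'Polygon') throw new Error('expected a Polygon');
  return geometry.coordinates;
}

const rings = extractRings(geometry);
console.log(Array.isArray(rings[0][0])); // true: each position is [lng, lat]
```

So in the question's code, turf.polygon(feat.geometry.coordinates) would build a valid Turf polygon for turf.intersect.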

Convert firebase json to typescript array

I can easily get the data working in my HTML, but when I have to convert it for visualization purposes it is a struggle.
I get the data as a FirebaseListObservable. In my case there are 3 value types in each list item, but only one of them has to be part of the array. How do I convert a FirebaseListObservable to an array in TypeScript?
The reason of converting is to use graphs in my app.
Typescript getting the data:
this.measure= this.db.list('/Users/'+this.username+'/Measure');
Typescript for chartjs
this.data = [12, 24, 91, 23]
The data has to be the data from the firebase
Declare an empty array data = []; then get the data with .subscribe():
this.db.list('/Users/'+this.username+'/Measure').subscribe(measures => {
    measures.forEach(measure => {
        this.data.push(measure.number); // .number should be your field in firebase
    })
})
This will fetch the whole list, iterate through all its items, and push your desired values to the array.
FirebaseListObservable is an observable. That means you will get the list itself (=the array you want) when you subscribe to it.
this.measure: FirebaseListObservable<Measure[]> = this.db.list('/Users/'+this.username+'/Measure');

this.measure
    .map(measurement => measurement.numbers) // this extracts only the numbers field, so we basically have an array of arrays
    .map(numbersArrays => numbersArrays.reduce((curr, prev) => curr.concat(prev))) // so let's concat all those arrays into a single one
    .subscribe((numbersArray: number[]) => {
        console.log(numbersArray);
    });
interface Measure {
    data: Date;
    geolocation: string;
    numbers: number[];
}
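Per emission, that map/reduce chain amounts to the following plain-JavaScript transformation (the sample objects follow the Measure interface above; the values are made up for illustration):

```javascript
// Two sample measures shaped like the Measure interface.
const measures = [
  { data: new Date('2018-01-01'), geolocation: 'a', numbers: [12, 24] },
  { data: new Date('2018-01-02'), geolocation: 'b', numbers: [91, 23] },
];

// Extract the numbers field of each measure, then concat into one flat array
// that can be handed straight to chartjs.
const chartData = measures
  .map((measure) => measure.numbers)
  .reduce((prev, curr) => prev.concat(curr), []);

console.log(chartData); // [12, 24, 91, 23]
```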
