How to use geometry of vector tile layer in mapbox to make intersections with Turf.js? - reactjs

I am using Mapbox-gl in React.
I am trying to compute the intersection between a selected polygon and a complete layer:
map.on('click', 'buildings', function(e) {
  map.getCanvas().style.cursor = 'pointer';
  // getting the main polygon
  const features = map.queryRenderedFeatures(e.point, {layers: ['buildings']});
  // getting the features of the layer to intersect with
  const featurestoTest = map.queryRenderedFeatures({layers: ['floodplains_from_2016']});
  let building = undefined;
  let fll = undefined;
  features.forEach(feat => {
    building = turf.polygon(feat.geometry);
    featurestoTest.forEach(feature => {
      fll = turf.polygon(feature.geometry);
      // no error up to here, but once it tries to intersect it throws
      const featureIntersect = turf.intersect(fll, building);
    });
  });
});
The error thrown is
Error: coordinates must only contain numbers
The geometry returned by Mapbox looks like this:
type: "Polygon"
coordinates: Array(1)
0: Array(25)
0: (2) [-77.02939443290234, 38.89539175681929]
1: (2) [-77.02935017645359, 38.89536461871285]
...
So I wonder how the polygons should be constructed in order to use turf.intersect after clicking a polygon in Mapbox.

The turf.polygon() method takes an array of coordinates as its argument. You should use feat.geometry.coordinates instead of feat.geometry.
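As a minimal plain-JavaScript sketch (no Mapbox involved; coordinates borrowed from the question), this is the GeoJSON Feature structure that turf.polygon builds from a coordinates array:

```javascript
// A "Polygon" geometry as returned by queryRenderedFeatures: an array of
// linear rings, each ring an array of [lng, lat] positions.
const geometry = {
  type: "Polygon",
  coordinates: [[
    [-77.02939443290234, 38.89539175681929],
    [-77.02935017645359, 38.89536461871285],
    [-77.02939443290234, 38.89539175681929], // first and last match: the ring is closed
  ]],
};

// turf.polygon(geometry.coordinates) wraps the rings in a Feature like this;
// passing the whole geometry object instead is what triggers
// "coordinates must only contain numbers".
const feature = {
  type: "Feature",
  properties: {},
  geometry: { type: "Polygon", coordinates: geometry.coordinates },
};

console.log(feature.geometry.coordinates[0].length); // 3
```

Note that queryRenderedFeatures can also return MultiPolygon geometries, for which turf.multiPolygon (or a check on feat.geometry.type) would be needed.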


Three.js: changing color on button press removes the details of the model

Hope you are all doing well!
I'm new to 3D work. I'm building a 3D virtual showroom (a 3D commerce page) where I need to place 3D models. I want users to be able to change a model's color (from a limited set of colors), like this: [https://codesandbox.io/s/v1fgk]. (I'm using three.js, not react-three-fiber.)
THREE JS Version: "three": "^0.135.0",
The problem is that:
1) When I change the color of the mesh, the details of the model are deleted. What can I do about this?
2) I face a lot of lighting issues during this work; for now I'm using an HDRI environment instead of lights in my models.
Am I doing this the best way?
Actual model: (screenshot)
After changing the color: (screenshot)
Why are my model's details being lost?
To change the color I'm doing this:
// const loader = new THREE.TextureLoader();
// const aoTexture = loader.load('textures/BottleModel/Model2/Material.002_Mixed_AO.png')
// const displacementTexture = loader.load('textures/BottleModel/Model2/base_Height.png')
// const metalicTexture = loader.load('textures/BottleModel/Model2/Material.002_Metallic.png')
// const roughnessTexture = loader.load('textures/BottleModel/Model2/Material.002_Roughness.png')
const BottleMesh = model.current.children.find((item) => item.name === "cup_base");
BottleMesh.children.forEach((item) => {
  if (item.name === "Cylinder") {
    item.material = new THREE.MeshStandardMaterial({
      color: code,
      // aoMap: aoTexture,
      // metalnessMap: metalicTexture,
      // displacementMap: displacementTexture,
      // roughnessMap: roughnessTexture
    });
  }
});
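A likely cause of the lost detail: assigning a brand-new MeshStandardMaterial throws away the texture maps the loaded material carried. In three.js the usual fix is to mutate the existing material, e.g. item.material.color.set(code). A plain-object sketch of the difference (stand-ins for THREE objects, no three.js required):

```javascript
// Stand-in for a material that came with the glTF model, carrying maps.
const material = {
  color: { value: "#ffffff", set(c) { this.value = c; } },
  aoMap: "AO texture",
  roughnessMap: "roughness texture",
};

// Replacing the material: the new object has a color but no maps,
// so all baked-in surface detail disappears.
const replaced = { color: { value: "#ff0000" } };
console.log(replaced.aoMap); // undefined

// Mutating the existing material (the analogue of
// item.material.color.set(code)): the maps survive.
material.color.set("#ff0000");
console.log(material.color.value, material.aoMap); // #ff0000 AO texture
```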
CREATING MODEL AND ADDING ENVIRONMENT:
export const createModel = (model, scene, path, setLoading, renderer) => { // model is a React ref, remember that
  setLoading(true);
  const dracoLoader = new DRACOLoader();
  // Specify path to a folder containing WASM/JS decoding libraries.
  dracoLoader.setDecoderPath('https://www.gstatic.com/draco/v1/decoders/');
  let loader = new GLTFLoader();
  loader.setDRACOLoader(dracoLoader);
  let generator = new THREE.PMREMGenerator(renderer);
  new RGBELoader().setPath('/textures/hdri/').load('dancing_hall_4k.hdr', function (texture) {
    // let envmap = generator.fromEquirectangular(texture);
    texture.mapping = THREE.EquirectangularReflectionMapping;
    // scene.background = texture;
    scene.environment = texture;
  });
  loader.load(path, function (gltf) {
    model.current = gltf.scene;
    model.current.position.x = 0;
    model.current.position.y = -3;
    scene.add(model.current);
    setLoading(false);
  }, undefined, function (error) {
    console.error(error);
  });
}

Tensorflow.js predict returning NaNs

I converted a Keras model into a TensorFlow.js model with the simple command tensorflowjs_converter --input_format keras ./model/L_keypoint_classifier_final.h5 L_layer_model. I managed to get this model working in a .ts (TypeScript) file.
Now I am focused on deploying this model using React and Typescript (in a .tsx file). My app component is loading the models as such:
const [models, setModels] = useState<Models>({
  L_Model: undefined,
  R_Model: undefined,
});
useEffect(() => {
  loadModel().then((models) => {
    setModels(models);
    setIsLoading(false);
  });
});
The loadModel() function is exported from another file and it is:
export async function loadModel() {
let result: Models = { R_Model: undefined, L_Model: undefined };
result.R_Model = await tf.loadLayersModel("/right/model.json");
result.L_Model = await tf.loadLayersModel("/left/model.json");
return result;
}
The model directories are in the public folder of my project. After loading the models in the app component, I pass them to a child component using props.
<Camera models={models}></Camera>
They are received in the camera component as:
const Camera: FunctionComponent<CameraProps> = (props) => {
const { R_Model, L_Model } = props.models;
In the camera component I pass in a tf.Tensor2D. I checked that this tensor does in fact contain values. But when I pass it to the model.predict() function, it just returns a tensor full of NaNs.
This is my code for preprocessing the input and passing it to the model:
// Preprocess landmarks
// @ts-ignore
let landmark_list = calc_landmark_list(landmarks);
landmark_list = pre_process_landmarks(landmark_list);
// @ts-ignore
landmark_list = tf.tensor2d([landmark_list]);
console.log(landmark_list.dataSync());
let prediction;
if(isRightHand){
prediction = R_Model?.predict(landmark_list);
}else{
prediction = L_Model?.predict(landmark_list);
}
const scores = prediction.arraySync()[0];
After that, I try to find the max score of the predictions, but since arraySync() returns a NaN array, it does not work. My team and I have tried searching for different options, including wrapping the predict function inside an async function, but that doesn't seem to work either (or maybe we have done it incorrectly, although we have followed the examples thoroughly).
The console.log of the landmark_list.dataSync() prints out:
Float32Array(42) [0, 0, -0.2683601677417755, -0.1023331806063652, -0.4781370162963867, -0.397993803024292, -0.5191399455070496, -0.6676312685012817, -0.46050554513931274, -0.8477477431297302, -0.30691489577293396, -0.9023468494415283, -0.49582260847091675, -1, -0.5734853148460388, -0.7551659941673279, -0.5509241223335266, -0.5708747506141663, -0.15572300553321838, -0.9109046459197998, -0.38624173402786255, -0.9391834735870361, -0.4641483426094055, -0.6930190920829773, -0.4609870910644531, -0.49743616580963135, -0.00984301045536995, -0.8530527353286743, -0.25299814343452454, -0.7750100493431091, -0.32405075430870056, -0.5182365775108337, -0.32825687527656555, -0.3154793083667755, 0.11740472167730331, -0.7356364130973816, -0.12479904294013977, -0.6477926969528198, -0.21985816955566406, -0.43255504965782166, -0.24492989480495453, -0.25398018956184387, buffer: ArrayBuffer(168), byteLength: 168, byteOffset: 0, length: 42, Symbol(Symbol.toStringTag): 'Float32Array']
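Since dataSync() shows finite values for this input, the NaNs likely originate in the model weights or in a later input. A hypothetical helper (not part of tf.js) for checking any flattened tensor before calling predict():

```javascript
// Returns true if the flattened tensor data contains NaN or Infinity,
// the usual culprits when predict() yields all-NaN output.
function hasBadValues(values) {
  for (const v of values) {
    if (!Number.isFinite(v)) return true;
  }
  return false;
}

// Works on the Float32Array that dataSync() returns:
const input = new Float32Array([0, 0, -0.2683601677417755, -0.1023331806063652]);
console.log(hasBadValues(input)); // false

console.log(hasBadValues([1, NaN, 2])); // true
```

The same check can be run on model.getWeights() data to rule out a corrupted conversion.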

Bad shape in LSTM model

I'm using TensorFlow.js and I have this code to build my recurrent neural network model for a classification problem with 3 classes and instances of size 250.
I have the following error message when I try to fit my model:
Error: Error when checking target: expected dense_Dense1 to have shape [,3], but got array with shape [4827,1].
I'm pretty new to constructing my own models in tfjs, and I think I messed up the tensor shapes.
PS: my dataset contains 4827 instances and my embeddingSize is 32
function buildModel(maxLen, vocabularySize, embeddingSize, numClasses)
{
const model = tensorflow.sequential();
model.add(tensorflow.layers.embedding(
{
inputDim: vocabularySize,
outputDim: embeddingSize,//embeddingSize = 32
inputLength: maxLen//maxLen = 250
}));
model.add(tensorflow.layers.lstm({units: embeddingSize/*, returnSequences: true*/}));
model.add(tensorflow.layers.dense({units: numClasses, activation: 'softmax'}));//numClasses = 3
return model;
}
const history = await model.fit(data, labels, {
epochs: epochs,
batchSize: batchSize,
validationSplit: validationSplit,
callbacks: () =>
{
console.log("Coucou");
}
});
console.log(history);
Thank you
You need to fix the layer dimensions by setting returnSequences to false on the LSTM layer:
model.add(tensorflow.layers.lstm({units: embeddingSize, returnSequences: false}));
The problem was my data, which had labels in 1D format (0, 1 or 2) instead of one-hot format ([1,0,0], [0,1,0], [0,0,1]).
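In tf.js that conversion can be done with tf.oneHot(tf.tensor1d(labels, 'int32'), numClasses); a plain-JavaScript sketch of the same transformation, which makes the expected label shape [4827, 3] clear:

```javascript
// Turn integer class labels (0, 1 or 2) into one-hot rows of length numClasses.
function oneHot(labels, numClasses) {
  return labels.map((label) => {
    const row = new Array(numClasses).fill(0);
    row[label] = 1;
    return row;
  });
}

console.log(JSON.stringify(oneHot([0, 2, 1], 3)));
// [[1,0,0],[0,0,1],[0,1,0]]
```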

Cannot find a differ supporting object '33.265625' of type 'number'

I have an Ionic app using Google Maps. I am trying to get latitude and longitude from a JSON API for a flight route and then inject that data into a Google Maps polyline. Fetching the JSON API works fine, but when I put the objects into Google Maps I get the error: Error: Cannot find a differ supporting object '33.265625' of type 'number'. NgFor only supports binding to Iterables such as Arrays. I use forEach to move through the array.
my code :
async getmarker(){
this.http.get('/v1/flightjson?flightId=201',{},{})
.then( data=>{
// this.latitude = JSON.parse(data.data).result.response.data.flight.track.latitude
// this.longitude = JSON.parse(data.data).result.response.data.flight.track
for(let datas of JSON.parse(data.data).result.response.data.flight['track']) {
this.longitude = datas.longitude
this.latitude = datas.latitude
console.log(this.longitude)
// Do something.
}
})
}
loadMap() {
let AIR_PORTS = [
this.latitude,
this.longitude
];
this.map = GoogleMaps.create('map_canvas');
let polyline: Polyline = this.map.addPolylineSync({
points: AIR_PORTS,
color: '#AA00FF',
width: 10,
geodesic: true,
clickable: true // clickable = false in default
});
polyline.on(GoogleMapsEvent.POLYLINE_CLICK).subscribe((params: any) => {
let position: LatLng = <LatLng>params[0];
let marker: Marker = this.map.addMarkerSync({
position: position,
title: position.toUrlValue(),
disableAutoPan: true
});
marker.showInfoWindow();
});
}
my data json url
html
<div id="map_canvas"></div>
for(let datas of JSON.parse(data.data).result.response.data.flight['track']) {
this.longitude = datas.longitude
this.latitude = datas.latitude
}
In the above for loop, I assume this.longitude and this.latitude should be arrays.
You should push elements into this.longitude and this.latitude (as shown below) instead of replacing the previous value.
this.longitude.push(datas.longitude);
this.latitude.push(datas.latitude);
You may be using these variables to iterate through different points on the map; since they are plain numbers, ngFor will throw an error saying that content inside ngFor should be iterable.
Array is an iterable whereas number is not.
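A plain-JavaScript sketch of the shape addPolylineSync expects: an array with one {lat, lng} object per track point, built from the parsed route (the track values here are made up for illustration):

```javascript
// Hypothetical parsed track from the flight JSON.
const track = [
  { latitude: 33.265625, longitude: 44.234619 },
  { latitude: 33.312500, longitude: 44.451230 },
  { latitude: 33.401001, longitude: 44.602100 },
];

// One point object per track entry, instead of two overwritten numbers.
const AIR_PORTS = track.map((p) => ({ lat: p.latitude, lng: p.longitude }));

console.log(AIR_PORTS.length); // 3
console.log(AIR_PORTS[0]); // { lat: 33.265625, lng: 44.234619 }
```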

Leaflet: How to use `getLatLngs` on a geoJSON polyline?

I create a geoJSON polyline object called bezier using turf.js like this:
var linestring = turf.linestring(
[[121.465, 31.233], [121.500634, 31.233499], [121.588107, 31.190172], [121.501545, 31.207394], [121.337514, 31.196079]]
, {
"stroke": "#25561F",
"stroke-width": 5
});
var bezier = turf.bezier(linestring, 50000, 0.85);
bezier.properties = {
"stroke": "#6BC65F",
"stroke-width": 5,
"description": "Bezier line from polyline"
}
L.mapbox.featureLayer().setGeoJSON(bezier).addTo(map);
Then I used bezier.geometry.coordinates to access its point array. But what I really need is an array of LatLng objects (because L.animatedMarker in this plugin needs latlngs). I was wondering whether there is a way to extract the LatLng array like the getLatLngs method does on a Leaflet object.
You'll first need to get a reference to the actual layer from the layer you've added it to, in this instance that would be your L.mapbox.featureLayer. Once you've got that, you can just use the getLatLngs method. You can do this in multiple ways:
Use the layeradd event, the cleanest way:
var featureLayer = L.mapbox.featureLayer().addTo(map);
featureLayer.on('layeradd', function (e) {
var latLngs = e.layer.getLatLngs();
})
If you're only going to insert one layer, like you're doing now, you could also fetch it directly from the layers object contained in the featureLayer:
var featureLayer = L.mapbox.featureLayer().setGeoJSON(bezier).addTo(map);
var key = Object.keys(featureLayer._layers)[0];
var latLngs = featureLayer._layers[key].getLatLngs();
Or if you've got multiple layers in your featureLayer and don't want to use events you can loop over the featureLayer and grab it from there:
featureLayer.eachLayer(function (layer) {
var latLngs = layer.getLatLngs();
});
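An alternative that avoids touching the layer at all: convert bezier.geometry.coordinates yourself. GeoJSON stores positions as [lng, lat] while Leaflet's LatLng is (lat, lng); with Leaflet loaded you would map each pair to L.latLng(lat, lng). A plain-object sketch of the swap:

```javascript
// Positions as found in bezier.geometry.coordinates: [lng, lat] order.
const coordinates = [
  [121.465, 31.233],
  [121.500634, 31.233499],
];

// Swap into {lat, lng} objects (use L.latLng(lat, lng) with Leaflet loaded).
const latLngs = coordinates.map(([lng, lat]) => ({ lat, lng }));

console.log(JSON.stringify(latLngs[0])); // {"lat":31.233,"lng":121.465}
```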
