I am building a simple application, but one problem has cost me a lot of time, which is why I am writing here. My app displays images that I fetch from another server, and I need to know the original size of each image. I used React Native's getSize() method to get the dimensions; it worked on iOS, but on Android it calculated incorrect dimensions when the image was very large. After reading a few articles and comments, I concluded that the problem might be in Fresco, but I was unable to solve it.
//returns wrong dimensions
Image.getSize(myUri, (width, height) => {setState({width, height})});
const setImageSize = (imageUrl: any) => {
  const img = new Image();
  img.src = `https:${imageUrl}`;
  img.onload = () => {
    console.log(
      "height-:", img.height,
      "width-:", img.width
    );
  };
};

useEffect(() => {
  setImageSize(image);
}, []);
After that I tried to write a function that would load this URL and read the dimensions, but I cannot create the constructor with new Image(); it throws TypeError: Object is not a constructor (evaluating 'new _reactNativeElements.Image()'). I do not know what to do or where to find a solution. Thank you.
Image.getSize(myUri, (width, height) => { this.setState({ width, height }) });
Refer to this!
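Since the snippet in the question uses hooks, a minimal sketch of the same idea for a function component might look like this (the useImageSize hook name and the error handler are just illustrative assumptions; this adapts the call above, it does not change how Android computes the size):

import { useEffect, useState } from 'react';
import { Image } from 'react-native';

const useImageSize = (uri) => {
  const [size, setSize] = useState({ width: 0, height: 0 });
  useEffect(() => {
    // Image.getSize asks React Native for the intrinsic dimensions of a remote image
    Image.getSize(
      uri,
      (width, height) => setSize({ width, height }),
      (error) => console.warn(error)
    );
  }, [uri]);
  return size;
};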
I am attempting to get the elevation of certain coordinates on the map using Mapbox.
Based on the documentation, I can use queryTerrainElevation.
Sample code:
map.on("click", (data) => {
console.log(data);
const elevation = map?.queryTerrainElevation(data.lngLat, {
exaggerated: true,
});
console.log("Elevation: " + elevation)
});
The console logs show the elevation as null.
Using the Mapbox Tilequery API with the same coordinates:
https://api.mapbox.com/v4/mapbox.mapbox-terrain-v2/tilequery/95.9345,41.2565.json?access_token=<mapbox_token>
There is an elevation value in the response.
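As a rough sketch of reading that response in code (mapboxToken is a placeholder for a real token, and the assumption here is that the contour features expose their elevation as properties.ele):

fetch(`https://api.mapbox.com/v4/mapbox.mapbox-terrain-v2/tilequery/95.9345,41.2565.json?access_token=${mapboxToken}`)
  .then((res) => res.json())
  .then((json) => {
    // Keep only the features that actually carry an elevation value (in metres)
    const elevations = json.features
      .filter((f) => typeof f.properties.ele === 'number')
      .map((f) => f.properties.ele);
    console.log("Tilequery elevations:", elevations);
  });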
You must be adding the custom layer right after the style loads, so the terrain data hasn't been updated yet at that point. That's why you get null.
Do it like this and it should work:
map.on('idle', () => {
  const elevation = map.queryTerrainElevation(coordinates, { exaggerated: false });
});
This runs the layer code after the whole map has loaded, instead of just the style.
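For context, a fuller sketch might look like this (the mapbox-dem source name, the DEM tileset URL, and the idle handler are just one possible setup; adjust them to your own map):

map.on('load', () => {
  // A DEM source must be added and set as the terrain before elevations can be queried
  map.addSource('mapbox-dem', {
    type: 'raster-dem',
    url: 'mapbox://mapbox.mapbox-terrain-dem-v1',
    tileSize: 512,
    maxzoom: 14,
  });
  map.setTerrain({ source: 'mapbox-dem', exaggeration: 1 });
});

map.on('idle', () => {
  // By the time the map is idle, the terrain tiles have been loaded
  const elevation = map.queryTerrainElevation([95.9345, 41.2565], { exaggerated: false });
  console.log('Elevation: ' + elevation);
});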
Below are snippets of my code. I'm mapping over my data to rename item.image.url, but I am getting an undefined value for my image.
export function flattenProducts(data) {
  return data.map((item) => {
    let image = item.image.url;
    return { ...item, image };
  });
}
React.useEffect(() => {
  axios.get(`${url}/products`).then((response) => {
    const products = flattenProducts(response.data);
    console.log(products);
  });
}, []);
Please see the images below. Any help is appreciated. Thank you
Your question actually answers itself. Strapi treats your image field as an array (of images or other data), so mapping item.image.url is not correct: even if the array holds only one item, you cannot access it without specifying the index of the array element. It would work if you wrote item.image[0].url instead of item.image.url inside the flattenProducts function, as #viet pointed out. Try:
export function flattenProducts(data) {
  return data.map((item) => {
    let image = item.image[0].url;
    return { ...item, image };
  });
}
The best solution for the problem would be to change the content type of your image element inside the Strapi Admin Panel.
Following the data your API returns, item.image is an array, so let image = item.image[0]?.url should work.
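If the field can come back either as an array or as a single object, a defensive version of flattenProducts might look something like this (just a sketch, not Strapi-specific API):

export function flattenProducts(data) {
  return data.map((item) => {
    // Handle both shapes: image as an array of media objects or as a single object
    const media = Array.isArray(item.image) ? item.image[0] : item.image;
    return { ...item, image: media?.url };
  });
}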
As titled: is it possible to initialize a canvas with an image already drawn on it using Pixi, so that I can add a filter on top of it? I'm using gsap 3.2.6 and Pixi 5.2.4.
I have the canvas created like this (using React btw)
useEffect(() => {
  let ctx = imgDisplayer.current.getContext('2d');
  let img = new Image();
  img.onload = function () {
    imgDisplayer.current.width = img.width;   // set canvas width to image width
    imgDisplayer.current.height = img.height; // set canvas height to image height
    ctx.drawImage(img, 0, 0);
    setCanvas(imgDisplayer.current);
  };
  img.src = source;
}, []);
That setCanvas call stores the loaded canvas in state, so that I can use it to initialize Pixi.
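(For context, canvas here is assumed to be ordinary component state, along the lines of const [canvas, setCanvas] = useState(null);.)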
if (canvas) {
  const app = new Pixi.Application({
    width: 1300,
    height: 866,
    backgroundColor: 0x000000,
    autoResize: true,
    view: canvas
  });
}
The problem is that this throws the following error:
Error: This browser does not support WebGL. Try using the canvas renderer
If only I could fetch an https image (a direct image link), this wouldn't be a problem, and I could load the image using Pixi.Sprite instead. But because of cross-origin restrictions, I cannot think of a way to render the image. I can render it just fine on the canvas, but not with Pixi.
The error you pasted is saying that your browser does not support WebGL.
Please check WebGL support on those pages: https://get.webgl.org/ , https://webglreport.com/
Can you try to initialize Pixi on some simpler page, just to check whether it works for you? I mean, create a simple HTML page (without React etc.), set up Pixi there, and try to draw anything (a red circle, for example).
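A minimal sketch of such a page could be something like this (plain Pixi v5, no React; the sizes and colours are arbitrary):

import * as Pixi from 'pixi.js';

const app = new Pixi.Application({ width: 400, height: 400, backgroundColor: 0x000000 });
document.body.appendChild(app.view);

// Draw a simple red circle to confirm that the WebGL renderer works
const circle = new Pixi.Graphics();
circle.beginFill(0xff0000);
circle.drawCircle(200, 200, 50);
circle.endFill();
app.stage.addChild(circle);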
In my React app, I am trying to make a view page that shows a 3D mesh exported from Pix4D. This mesh consists of three types of files (.obj, .mtl, .jpg), see pix4d.com.
I am new to react-three-fiber, which I suppose is the best way to solve my problem.
Below is the whole code of the React component used to load and render the 3D model.
code
Thanks in advance!
I want to understand how to attach the texture and material to my rendered OBJ model.
I was looking for this answer for a couple of weeks; finally I've found a way to make it work.
The material loader does not resolve the import of remote texture files by itself (it does on the web, but not on mobile; maybe it will in the future). So I create the material and assign the images to it by hand.
Something like this:
import { TextureLoader } from 'expo-three';
import { OBJLoader } from 'three/examples/jsm/loaders/OBJLoader';
import { MTLLoader } from 'three/examples/jsm/loaders/MTLLoader';
// useThree is used to get scene reference
import { useThree } from 'react-three-fiber';
const textureLoader = new TextureLoader();
const mapImage = textureLoader.load(require('path/to/image1.png'))
const normalMapImage = textureLoader.load(require('path/to/image2.png'))
Note that TextureLoader from expo-three can handle the file resource returned from require()
const loaderObj = new OBJLoader();
const loaderMtl = new MTLLoader();
export default props => {
  const { scene } = useThree();

  loaderMtl.load(
    'https://url_to/material.mtl',
    mtl => {
      mtl.preload();
      loaderObj.setMaterials(mtl);
      loaderObj.load(
        'https://url_to/model.obj',
        obj => {
          // simple logic for an obj with a single child
          obj.children[0].material.map = mapImage;
          obj.children[0].material.normalMap = normalMapImage;
          scene.add(obj);
        }
      );
    }
  );

  return null;
};
This is my first successful attempt to render an obj with mtl including a map and a normal map, so since it works, we can keep updating the code for improvements.
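For completeness, that component could be mounted inside a react-three-fiber Canvas roughly like this (the file name, camera position and light are assumptions, not part of the original answer):

import React from 'react';
import { Canvas } from 'react-three-fiber';
import Model from './Model'; // the component shown above

export default function Viewer() {
  return (
    <Canvas camera={{ position: [0, 0, 5] }}>
      {/* some light so the textured material is visible */}
      <ambientLight intensity={0.8} />
      <Model />
    </Canvas>
  );
}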
Another way to load a model with its texture is to specify the path where the texture is stored. In my case, the .mtl, .obj and texture files are stored in the React project's 'public/models/' directory. You can specify that path by calling setPath() before loading your material or object file, and it should then load the texture onto the material. You may also want to make sure that the texture file name in the .mtl file is correct: it is the file referenced after map_Kd, as in the excerpt below.
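For illustration (the material and texture file names here are just placeholders):

newmtl material_0
map_Kd texture.jpg

With that in place, the component itself can look like this: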
const Model = () => {
  // useLoader (from react-three-fiber) handles the async loading;
  // setPath points the material loader at the 'public/models/' folder
  const materials = useLoader(MTLLoader, MaterialFile, (loader) => loader.setPath('./models/'));
  const obj = useLoader(OBJLoader, ObjectFile, (loader) => {
    materials.preload();
    loader.setMaterials(materials);
  });
  return <primitive object={obj} />;
};
I'm using Snap.svg to display some SVG. I used this answer to do so in React.
However, I'd like to take it a step further: is it possible to load two SVGs from URLs and then embed one inside the other?
So far, I have managed to load my two SVGs and add them to my div, using this simple bit of code:
const element = Snap(this.svgDiv);
Snap.load(url, function (baseProduct) {
  if (element) {
    element.add(baseProduct);
    let designUrl = [my url];
    Snap.load(designUrl, function (design) {
      if (element) {
        console.log(baseProduct);
        baseProduct.select('rect').attr({ fill: 'transparent' });
        const image = design.select('image');
        baseProduct.select('rect').add(image);
      }
    });
  }
});
Unfortunately, this only displays the first image: the second one, which is supposed to be displayed inside a rect, is not shown. However, if I inspect my SVG, the image tag can be found inside my rect. How come?