I am trying to store a local image from my React app's public folder in React state, but after eons of trying I am not able to do it.
Is it possible? If yes, can you point me in the right direction?
You can convert the image to base64 and then use the Fetch API to convert it to a blob for uploading. Something like this:
// a base64 data URL for a small PNG
var url = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAUAAAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO9TXL0Y4OHwAAAABJRU5ErkJggg=="

// fetch the data URL and read the response as a Blob
fetch(url)
  .then(res => res.blob())
  .then(blob => console.log(blob))
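If the goal is simply to keep an image from the public folder in state, you can also fetch it by its public path and store an object URL. A minimal sketch, assuming the file lives at public/logo.png and setImage is a useState setter:

// minimal sketch; '/logo.png' and setImage are placeholders for your own file and state setter
fetch('/logo.png')
  .then(res => res.blob())
  .then(blob => setImage(URL.createObjectURL(blob)));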
I am trying to display an image from a Supabase storage bucket in my React app. I console log the public URL and it is correct, but the image won't show. Also, when I go to the public URL the image downloads automatically rather than appearing in the browser.
useEffect(() => {
  const { data, error } = supabase
    .storage
    .from('images')
    .getPublicUrl(`kk2.tiff`)

  setImage(data.publicURL)
}, [])
As written on MDN:
Browser compatibility: No browsers integrate support for TIFF; its value is as a download format.
You should convert the image to another format that browsers can display, such as PNG or JPEG.
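Once the file is re-uploaded in a browser-supported format, the same call from the question should work. A minimal sketch, assuming the image was re-uploaded as kk2.png and the same supabase-js client (exposing data.publicURL) as in the question:

useEffect(() => {
  // 'kk2.png' is a placeholder for the re-uploaded, browser-displayable file
  const { data, error } = supabase
    .storage
    .from('images')
    .getPublicUrl('kk2.png')

  if (!error) setImage(data.publicURL)
}, [])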
I have tried multiple ways and finally created a web camera that uploads to Cloudinary. Is there a way to take these images from Cloudinary and upload them into Firebase? If not, can we create a camera in React.js that can upload to the Firebase database?
Not sure about Cloudinary because I have never used it, but you can add a camera to a React app, save the image as a blob, and later use that blob to save the image in Firebase Storage.
To open the camera, use an input element with the file type and the capture attribute:
<input
  type="file"
  accept="image/*"
  capture
/>
When an image is taken through the camera, you can create an object URL for it.
This goes in the onChange handler of the file input:
// pull the selected files out of the change event
const {
  target: { files },
} = e;

// create an object URL pointing at the captured image
const imageUrl = window.URL.createObjectURL(files[0]);
Now create a blob from the URL
let resFront = await fetch(imageUrl);
let tempblobFront = await resFront.blob();
and then save the blob to Firebase Storage:
firebase
  .storage()
  .ref('images')          // the folder name in Firebase Storage (example placeholder)
  .child('photo-front')   // the name to save the image under (example placeholder)
  .put(tempblobFront)     // the image blob created above
  .then((res) => res)
  .catch((err) => {
    console.log(err);
  });
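Put together, the whole flow might look like this minimal sketch; the folder name 'images' and file name 'photo-front' are placeholders, and it assumes firebase has already been initialized:

// minimal sketch of the full onChange handler; 'images' and 'photo-front' are placeholder names
const handleCapture = async (e) => {
  const {
    target: { files },
  } = e;

  // object URL -> fetch -> blob
  const imageUrl = window.URL.createObjectURL(files[0]);
  const res = await fetch(imageUrl);
  const tempblobFront = await res.blob();

  // upload the blob to Firebase Storage
  try {
    await firebase.storage().ref('images').child('photo-front').put(tempblobFront);
  } catch (err) {
    console.log(err);
  }
};

It would be wired to the input above as <input type="file" accept="image/*" capture onChange={handleCapture} />.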
You might want to try the Cloudinary upload widget: https://cloudinary.com/documentation/upload_widget
It includes camera, local drive, Google Photos and much more.
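A minimal sketch of wiring the widget up, assuming the widget script (https://upload-widget.cloudinary.com/global/all.js) is loaded on the page; 'my-cloud' and 'my-preset' are placeholders for your own cloud name and unsigned upload preset:

// minimal sketch; 'my-cloud' and 'my-preset' are placeholders
const widget = window.cloudinary.createUploadWidget(
  { cloudName: 'my-cloud', uploadPreset: 'my-preset', sources: ['local', 'camera'] },
  (error, result) => {
    if (!error && result && result.event === 'success') {
      console.log('Uploaded image URL:', result.info.secure_url);
    }
  }
);

// open the widget, e.g. from a button's onClick
widget.open();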
My React Native app receives data (products) from an API that contains an array of objects, and each object has a link to a picture. There is an option to download all the products for offline viewing (I'm using redux-persist + Realm for that), but the problem is that the pictures themselves are not downloaded, only the links to them.
What would be the best way to download the pictures so that I can attach them to the corresponding products?
There are multiple ways to do that. One manual way is to download each image as base64 using the addresses that come in the API response. You can use plain JavaScript for the download; here is an example of fetching an image and converting it to base64:
const imageLink = 'https://www.google.com/images/branding/googlelogo/2x/googlelogo_color_272x92dp.png';

// download the image as a Blob, then read it as a base64 data URL
fetch(imageLink)
  .then(res => res.blob())
  .then(data => {
    const reader = new FileReader();
    reader.readAsDataURL(data);
    reader.onloadend = () => {
      const base64data = reader.result;
      console.log(base64data); // you can store it in your realm
    };
  });
After downloading each image, store it in your local Realm database and read it from Realm when you need it.
There are other approaches, such as using libraries for caching images, like react-native-cached-image.
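A minimal sketch of the Realm write, assuming a hypothetical Product schema with an imageBase64 string field and that base64data comes from the snippet above:

// minimal sketch; the 'Product' schema and its fields are assumptions, not part of the original code
realm.write(() => {
  realm.create('Product', { id: product.id, imageBase64: base64data }, Realm.UpdateMode.Modified);
});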
So I've successfully saved my image file to MongoDB using Multer and Multer-GridFS-Storage, but I'm having trouble retrieving it.
When I tried to retrieve the data using GridFS-Stream, it came back like this previous question:
GridFS : How to display the result of readstream.pipe(res) in an <img/> tag?
When I use the code below, only the first chunk in the collection is sent to my front end, but it's actually usable.
const readstream = gfs.createReadStream({ filename: files[0].filename });

// this responds on the first 'data' event, so only the first chunk reaches the client
readstream.on('data', (chunk) => {
  res.send({ image: chunk.toString('base64') })
})
How am I able to get back all of the chunks? Should I give up and start using GridFSBucket?
I ended up trying this and it worked! Accumulate the chunks and only send the result once the stream ends:
let data = ''

// concatenate every chunk as it arrives
readstream.on('data', (chunk) => {
  data += chunk.toString('base64')
})

// respond only once the whole file has been read
readstream.on('end', () => {
  res.send(data)
})
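If you do decide to move to GridFSBucket (part of the official MongoDB driver), a minimal sketch of the equivalent download looks like this; the db handle and the 'uploads' bucket name are assumptions:

// minimal sketch using the driver's GridFSBucket; 'uploads' is a placeholder bucket name
const { GridFSBucket } = require('mongodb');

const bucket = new GridFSBucket(db, { bucketName: 'uploads' });

// stream the whole file straight to the response instead of collecting chunks by hand
bucket.openDownloadStreamByName(files[0].filename).pipe(res);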
I'm working on a Google Maps project with React. I assign the following method as an onClick handler:
getStreetView(lat, lng) {
  let url = `https://maps.googleapis.com/maps/api/streetview?size=600x300&location=${lat},${lng}&heading=151.78&pitch=-0.76&key=MYAPIKEY`
  axios.get(url).then(response => this.setState({ image: response.config.url }));
}
The state = { image: null } is then assigned the URL, which I later pass to an image tag in a child component, such as <img src={props.image} alt='street view'/>. Everything works like a charm; however, I have come across various other solutions such as:
// download the image as raw bytes and convert them to a base64 string
function getBase64(url) {
  return axios
    .get(url, {
      responseType: 'arraybuffer'
    })
    .then(response => Buffer.from(response.data, 'binary').toString('base64'))
}
from b4dnewz in the axios documentation. However, I can't find a reasonable approach for displaying the image in a child component with that response type. My question is: is my approach valid? Is there any reason why I shouldn't be using it like that?
If you are getting the image URL from the API, then your first method is the right approach. The
Buffer.from(response.data, 'binary').toString('base64')
method is used when the image itself is received from the server as binary data, which is converted to base64 and given to <img>.
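A minimal sketch of displaying that base64 string, assuming base64 holds the value returned by getBase64 and the server responded with a PNG (adjust the MIME type to match your image):

// minimal sketch; `base64` is assumed to hold the string returned by getBase64(url)
<img src={`data:image/png;base64,${base64}`} alt="street view" />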