I am creating a custom image search application. I have a search text field where the user enters the text, and from my Flask backend I use the Google Image Search API to fetch the matching images.
Now I want to display these images on the front end; I am using React for my front end.
On the button click, I make the fetch call as below:
fetch('localhost/image_search/' + this.state.inputQuery)
  .then(res => res.json())
  .then(
    (result) => {
      this.setState({
        images: result
      });
    }
  );
I have two questions.
How should I be sending the images from the backend? Should I send them as blobs?
Once I receive the images from the backend, how can they be displayed on the front end?
TL;DR: I am creating my own Google-style application to search for images on the web. Once the user enters the search text, how can I display the returned images on the UI?
You can send the images as URLs, i.e. an array of image URLs:
['https://cdn0.tnwcdn.com/wp-content/blogs.dir/1/files/2020/02/Google-Image-Search.jpg','https://cdn0.tnwcdn.com/wp-content/blogs.dir/1/files/2020/02/Google-Image-Search.jpg'];
Since you are setting them on images in state, you can map over them in your render method:
render() {
  return (
    <div>
      {this.state.images.map((x, i) => (
        <img src={x} alt="image" key={i} />
      ))}
    </div>
  );
}
Hope this helps!
Related
Hi, I am developing an e-commerce site and I am running into this problem:
Suppose I have an array of products. I want to map over the array and hit a POST API for every item in it, and only call a second API to get the response once all of the items have been posted.
I am showing all cart items on the checkout page. When the user wants to pay online, I place an order and get the gateway URL. I have to clear the cart, right? But when my cart is cleared, the checkout page appears blank, and after a few milliseconds the page redirects to the gateway URL. I want the cart to be cleared only after the redirect begins, because showing the user a blank page is bad UX.
Here is my code:
```
await getData(`${REST_BASE_API_URL}/mobileapps/orders/${order_id}`, {}, userToken())
  .then(res => {
    let increment_id = res?.increment_id;
    if (paymentMethod === 'sslcommerz') {
      // navigate('/order-success');
      let url = `anything`;
      getData(url, {}, userToken())
        .then(res => {
          window.location.href = res; // redirect to gateway url
          dispatch(saveUpdatedCart([])); // clear the cart
          clearCartAfterOrder(); // set new cart id in redux
          console.log('ssl_res', res);
        })
        .catch(err => {
          console.log(err);
        });
    }
  });
```
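One way to get what you describe is to wait for every item's POST to finish before calling the second API, and only start the redirect and the cart clean-up once that whole chain succeeds. This is only a rough sketch, not a drop-in solution: postItem and cartItems are hypothetical placeholders, while getData and userToken are the helpers from the snippet above.
// Sketch: postItem is a hypothetical helper that POSTs a single cart item.
const postAll = (items) =>
  Promise.all(items.map(item => postItem(item, userToken())));

postAll(cartItems)
  .then(() => getData(url, {}, userToken())) // second API, only after every POST succeeds
  .then(gatewayUrl => {
    window.location.href = gatewayUrl; // start the redirect first...
    dispatch(saveUpdatedCart([]));     // ...then clear the cart, so the empty checkout page is never shown
    clearCartAfterOrder();
  })
  .catch(err => console.log(err));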
I am trying to display an image from a Supabase storage bucket in my React app. I console.log the public URL and it is correct, but the image won't show. Also, when I go to the public URL the image downloads automatically rather than appearing in the browser.
useEffect(() => {
  const { data, error } = supabase
    .storage
    .from('images')
    .getPublicUrl(`kk2.tiff`)

  setImage(data.publicURL)
}, [])
As written on MDN:
Browser compatibility: No browsers integrate support for TIFF; its value is as a download format
You should convert the image to another format that browsers can display, such as PNG or JPEG.
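For example, one option (just a sketch, not part of the original answer) is to convert the file with the sharp library in a small Node script before uploading it to the bucket; the file names below are placeholders.
// Sketch: convert the TIFF to PNG with sharp, then upload kk2.png instead.
const sharp = require('sharp');

sharp('kk2.tiff')
  .png()
  .toFile('kk2.png')
  .then(() => console.log('converted kk2.tiff -> kk2.png'))
  .catch(err => console.error(err));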
I have tried multiple ways and finally created a web camera that uploads to Cloudinary. Is there a way to take these images from Cloudinary and upload them into Firebase? If not, can we create a camera in React.js that can upload to the Firebase database?
I'm not sure about Cloudinary because I have never used it, but you can add a camera to a React app, save the captured image as a blob, and later use that blob to save the image to Firebase storage.
To open the camera, use an input element with the file type and the capture attribute:
<input
type="file"
accept="image/*"
capture
/>
When an image is taken through the camera, you can create an object URL for it in the file input's onChange handler:
const {
target: { files },
} = e;
const imageUrl = window.URL.createObjectURL(files[0]);
Now create a blob from the URL
let resFront = await fetch(imageUrl);
let tempblobFront = await resFront.blob();
and then save the blob to Firebase storage:
firebase
  .storage()
  .ref('captures')                  // your folder name in Firebase storage
  .child(`photo-${Date.now()}.jpg`) // the name you want to save the image under
  .put(tempblobFront)               // the image blob created above
  .then((res) => res)
  .catch((err) => {
    console.log(err);
  });
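Putting the pieces together, a minimal sketch of the whole flow could look like the handler below (the folder and file names are placeholders, and firebase is assumed to be already initialized). Note that files[0] is itself a Blob, so it could also be passed to .put() directly without the object URL/fetch round trip.
// Sketch of the full onChange handler.
async function handleCapture(e) {
  const { target: { files } } = e;
  const imageUrl = window.URL.createObjectURL(files[0]);

  const resFront = await fetch(imageUrl);
  const tempblobFront = await resFront.blob();

  await firebase
    .storage()
    .ref('captures')                  // placeholder folder name
    .child(`photo-${Date.now()}.jpg`) // placeholder file name
    .put(tempblobFront);
}
// Used on the input shown above:
// <input type="file" accept="image/*" capture onChange={handleCapture} />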
You might want to try the Cloudinary upload widget: https://cloudinary.com/documentation/upload_widget
It includes camera, local drive, Google Photos, and much more.
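A rough sketch of the widget, based on the documentation linked above (the cloud name and upload preset are placeholders for your own Cloudinary settings):
// Requires the widget script: https://upload-widget.cloudinary.com/global/all.js
const widget = cloudinary.createUploadWidget(
  {
    cloudName: 'your-cloud-name',         // placeholder
    uploadPreset: 'your-unsigned-preset', // placeholder
    sources: ['local', 'camera']          // camera capture is built in
  },
  (error, result) => {
    if (!error && result && result.event === 'success') {
      console.log('Uploaded image info:', result.info);
    }
  }
);

// e.g. open it from a button:
// <button onClick={() => widget.open()}>Upload</button>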
I'm trying to find a way on a Gatsby.js site to add a custom Facebook pixel conversion event when visitors submit the contact form. I couldn't figure it out even using the Facebook pixel documentation.
The plugin that I'm using - https://github.com/gabefromutah/gatsby-plugin-facebook-pixel
The event that I need to track is:
fbq('track', 'Lead', {content_name: 'ContactForm'});
Is the best way to do it through a button click on the contact form? If so, how do I add it in?
Any help would be appreciated. Thanks
This plugin (gatsby-plugin-facebook-pixel) only works in a production build (i.e. gatsby build && gatsby serve); perhaps that's why you're not seeing any tracking?
It sounds like you only want to track successful submissions. If so, you might want to avoid tracking the button click itself and instead fire the event once the submission has been received. For example, if you're using fetch to send your data, it might look roughly like this:
const submitHandler = (data) => fetch(url, {
  method: 'POST',
  body: JSON.stringify(data)
})
  .then(res => {
    if (res.ok) {
      // fire the pixel only after a successful response
      fbq('track', 'Lead', { content_name: 'ContactForm' });
    }
    return res.json();
  })
  .catch(err => console.error(err));
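As a hypothetical usage example (not from the original answer), the handler could then be wired to the contact form's onSubmit:
<form onSubmit={e => {
  e.preventDefault();
  submitHandler(Object.fromEntries(new FormData(e.target)));
}}>
  {/* contact form fields */}
</form>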
I'm working on a Google Maps project with React. I assign the following method to an onClick handler:
getStreetView(lat,lng){
let url = `https://maps.googleapis.com/maps/api/streetview?size=600x300&location=${lat},${lng}&heading=151.78&pitch=-0.76&key=MYAPIKEY`
axios.get(url).then(response=>this.setState({image:response.config.url}));
}
The state = { image: null } is then being assigned the URL, which I later pass on to an image tag in a child component, such as <img src={props.image} alt='street view'/>. Everything works like a charm; however, I have come across various other solutions such as:
function getBase64(url) {
return axios
.get(url, {
responseType: 'arraybuffer'
})
.then(response => Buffer.from(response.data, 'binary').toString('base64'))
}
from b4dnewz in the axios documentation. However, I can't find a reasonable approach for displaying the image in a child component with that response type. My question is: is my approach valid? Are there any reasons why I shouldn't be using it like that?
If you are getting the image URL from the API, then your first method is the right approach, but the
Buffer.from(response.data, 'binary').toString('base64')
method is used when the image itself is received from the server as binary data, which is converted to base64 and given to <img>.
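For completeness, a minimal sketch of rendering such a base64 string with the getBase64 helper above (assuming the image is a JPEG; adjust the MIME type to match, and note that Buffer is a Node global, so a browser bundle may need a polyfill):
import React, { useEffect, useState } from 'react';

// Sketch: fetch the image as base64 and render it through a data URI.
function StreetViewImage({ url }) {
  const [src, setSrc] = useState(null);

  useEffect(() => {
    getBase64(url).then(base64 => setSrc(`data:image/jpeg;base64,${base64}`));
  }, [url]);

  return src ? <img src={src} alt="street view" /> : null;
}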