Is it possible to directly upload images captured by camera to Firebase Storage? - reactjs

I'm using React.js to create an application that would take a photo and upload it to Firebase Storage. I am using the react-webcam library, which uses this command to take a photo:
const imageSrc = webcamRef.current.getScreenshot();
This is how I tried uploading the photo to Storage:
storage.ref(`/images`).put(imageSrc)
.on("state_changed" , alert("success") , alert)
However, the file that is uploaded is undefined (no photo).
I tried to construct a URL of the photo using a blob:
const imageUrl = window.URL.createObjectURL(new Blob(webcamRef.current.getScreenshot()))
But I get this error: Failed to construct 'Blob': The provided value cannot be converted to a sequence.
The library docs state that getScreenshot() "Returns a base64 encoded string of the current webcam image". So I tried the atob command, but I get the error: Failed to execute 'atob' on 'Window': The string to be decoded is not correctly encoded.
Does anyone know how I could upload the image to Firebase Storage? Any help would be appreciated!

Instead of a blob, try the putString() method like this:
const task = firebase.storage().ref(`/images`).putString(imageSrc, 'data_url')
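A fuller sketch of that flow (the child path and callbacks below are illustrative, not from the original post; giving each shot its own path keeps captures from overwriting each other):
const capture = () => {
  const imageSrc = webcamRef.current.getScreenshot(); // data URL string
  const task = firebase.storage()
    .ref(`/images/${Date.now()}.jpg`) // unique path per capture
    .putString(imageSrc, 'data_url');
  task.on(
    'state_changed',
    (snapshot) => console.log(`${snapshot.bytesTransferred}/${snapshot.totalBytes}`),
    (error) => alert(error.message), // upload failed
    () => alert('success') // upload finished
  );
};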

As explained in the docs, if you want to upload from a base64url-formatted string, you call the putString() method as follows (example from the docs):
var message = '5b6p5Y-344GX44G-44GX44Gf77yB44GK44KB44Gn44Go44GG77yB';
ref.putString(message, 'base64url').then((snapshot) => {
  console.log('Uploaded a base64url string!');
});
In your case, getScreenshot() actually returns a full data URL (the base64 payload prefixed with data:image/jpeg;base64,), so the matching format string is 'data_url'. Note also that .on() takes callbacks, so alert("success") must be wrapped in a function rather than called inline:
const imageSrc = webcamRef.current.getScreenshot();
storage.ref(`/images`).putString(imageSrc, 'data_url')
  .on("state_changed", null, (error) => alert(error), () => alert("success"));

Related

How to read the contents of a json file returned from s3 storage as a blob in React

I have successfully retrieved a json file from s3 storage. It is returned as a blob. I am able to turn the blob into text with this code (taken from https://docs.amplify.aws/lib/storage/download/q/platform/js/#monitor-progress-of-download):
export async function getS3Item(filename) {
  const result = await Storage.get(filename, { download: true });
  result.Body.text().then(string => {
    // handle the String data
    console.log(string);
  });
}
but the text is all gibberish (I'm assuming because the object is binary?)... such as: "h�b```f�d`a}��ǀ|#1V ..."
Is there a way I can directly read this as a json object in javascript so that I can extract data from it...?
Optionally, I can download the json file (which is shown in the link above, and I've gotten this to work) -- but I'd prefer not to download it, just to extract legible data from the file.
Thanks so much (I'm quite unfamiliar with blobs).
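If the stored object really is plain UTF-8 JSON, a minimal sketch (same Amplify call as above; the helper name is illustrative) is to parse the text you already have:
export async function getS3Json(filename) {
  const result = await Storage.get(filename, { download: true });
  // Body is a Blob in the browser; text() decodes it as UTF-8.
  const text = await result.Body.text();
  return JSON.parse(text); // throws if the content isn't valid JSON
}
If the text still comes out as gibberish, the object is most likely compressed (e.g. gzipped) or isn't actually JSON, and it would need to be decompressed before parsing.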

How to send a local image instead of URL to Computer Vision API using React

I would like to upload a local image file and extract text from it. I followed the link below and it works as expected when I pass a URL. https://learn.microsoft.com/en-us/azure/developer/javascript/tutorial/static-web-app/add-computer-vision-react-app
I managed to configure it for a local image and get the base64 encoded dataURL of the uploaded image. But when I pass the base64 encoded dataURL to the Computer Vision API, it says "Input data is not a valid image" (POST 400 status code). I am getting the error in the line shown below:
const analysis = await computerVisionClient.analyzeImage(urlToAnalyze, { visualFeatures });
The code I have included for handling local image:
const handleChange = (e) => {
  var file = e.target.files[0];
  var reader = new FileReader();
  reader.onloadend = function() {
    setFileSelected(reader.result); // this is the base64 encoded dataurl
  };
  reader.readAsDataURL(file);
}
In computerVision.js, I changed the 'Content-Type' in the header as below.
const computerVisionClient = new ComputerVisionClient(
  new ApiKeyCredentials({ inHeader: { 'Ocp-Apim-Subscription-Key': key, 'Content-Type': 'application/octet-stream' } }),
  endpoint);
I tried replacing client.read() with readTextInStream() as per the docs in computerVision.js (please refer to the link above), but it still throws the error.
May I know why I get the error "Input data is not a valid image"? Thanks.
Here is the link for input requirements.
There is a brand new online portal provided by Microsoft: https://preview.vision.azure.com/demo/OCR
The advantage is that it directly lists your available resources, so you just have to pick the right one, then test; there are also some samples.
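If you'd rather keep analyzing the local file in code, the JS SDK also has an analyzeImageInStream() method that accepts raw bytes instead of a URL. A rough sketch, reusing the question's handler but reading the file as an ArrayBuffer rather than a base64 data URL:
const handleChange = async (e) => {
  const file = e.target.files[0];
  const arrayBuffer = await file.arrayBuffer(); // raw bytes, not a data URL
  const analysis = await computerVisionClient.analyzeImageInStream(arrayBuffer, { visualFeatures });
};
The "Input data is not a valid image" error is consistent with the service receiving base64 text where it expected binary image data.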

How to send parsed .csv file as a byte array or ArrayBuffer data from Node.js backend to AngularJS frontend?

I'm working on AngularJS app.
The module I'm currently working on should be able to either show a preview of a spreadsheet file or allow downloading it.
The steps:
When "Preview File" is clicked, it should send the needed file's name as a parameter of a POST request.
The backend will find the needed file, which is a .csv file, convert it to a byte array and send it to the frontend.
The frontend should handle this byte array and convert it to the .xls or .xlsx filetype.
The spreadsheet data should be opened in some small preview read-only window, like 1000x1000 px.
The POST request line looks like this:
this.$http.post(this.url + 'endpoint/getFile', params,
{responseType: "arraybuffer", showLoadingOverlay: true}
)
The response does indeed look like an ArrayBuffer: three views of it in one object, i.e. Uint8Array, Uint16Array and Uint32Array.
The code that should read this and convert it to content suitable for preview is not working:
const byteArray = new Uint8Array(data);
const blob = new Blob([byteArray], { type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet' });
const objectUrl = URL.createObjectURL(blob);
this.$window.open(objectUrl, 'C-Sharpcorner', 'width=1000,height=1000');
Because the blob already has a length of 0 bytes when created, there's no data inside.
The matter of visualising the .xls in a browser window can, I think, be achieved with the canvas-datagrid library. I haven't used it, but it looks cool.
Also, I have a problem trying to set up mock data for Node.js (and AngularMock), for local testing when there's no data on the Java backend.
I'm using 'fs' and 'csv-parse':
const fs = require('fs');
const csvParse = require("csv-parse/lib/es5");
module.exports = function stir(app) {
  const getFile = () => {
    const csvOutput = csvParse('../static/someData.csv', (parsed) => {
      return parsed;
    });
    fs.readFileSync(csvOutput);
  };
  app.post('/stir/getFile', (req, res) => res.json(getFile()));
};
Which results in an error:
TypeError: path must be a string or Buffer
What is the proper way of parsing the .csv using 'csv-parse' and sending parsed data as an ArrayBuffer to frontend in Node and AngularMock?
The csv-parse docs say that, underneath, the lib will convert the parsed object to a Node stream.
So why does that error happen?
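A guess at what's going wrong, with a sketch: csvParse('../static/someData.csv', cb) parses the literal string '../static/someData.csv' as CSV content (csv-parse takes input text, not a path), and fs.readFileSync(csvOutput) is then handed the parser's return value instead of a path, which triggers the TypeError. Reading the file first and then parsing its contents avoids both problems (assuming a csv-parse version that exports a parse function):
const fs = require('fs');
const { parse } = require('csv-parse');

module.exports = function stir(app) {
  app.post('/stir/getFile', (req, res) => {
    const csvContents = fs.readFileSync('../static/someData.csv'); // raw bytes
    parse(csvContents, (err, records) => {
      if (err) return res.status(500).json({ error: err.message });
      res.json(records); // parsed rows as JSON
    });
  });
};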

React Native: Read local image as Base64 string to send it in JSON

In React Native I'm trying to load an image stored at a relative path as a base64 string, but require returns 3 as the response instead of the image source.
I'm sure the path is correct, and the require command works elsewhere in my React Native JSX code to load images from the same relative path (<Image source={require('../resources/examplecar.jpg')}>) without any problem.
How can I get a local image from the filesystem as a base64 string to send it in JSON?
var body = {
  path: '../resources/examplecar.jpg',
  data: {
    image: require('../resources/examplecar.jpg'),
  }
}
Using the RNFS plugin it is possible to access the React Native assets and convert the data into a range of formats, including Base64.
const imageData = await RNFS.readFile(RNFS.MainBundlePath + "/assets/resources/examplecar.jpg", 'base64');
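For the JSON body in the question, that would look something like this (a sketch; RNFS.MainBundlePath is iOS-only, so the exact path is illustrative):
async function buildBody() {
  const imageData = await RNFS.readFile(
    RNFS.MainBundlePath + '/assets/resources/examplecar.jpg', 'base64');
  return {
    path: 'resources/examplecar.jpg',
    data: { image: imageData }, // base64 string instead of require()'s numeric asset id
  };
}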
You can't use a variable in require; it won't work for an image source in React Native. Do:
image: require('../resources/examplecar.jpg')

Firebase storage uploading broken image

I have a cropped image using this cropper. It is returning a base64 version of the cropped image.
I am trying to convert that base64 image to a blob or a file to be able to upload it to firebase storage using the code below.
var imageBase64 = $scope.cropper.croppedImage.split(',')[1];
var blob = new Blob([imageBase64], {type: 'image/png'});
var file = new File([imageBase64], 'imageFileName2.png', {type: 'image/png'});
When I tried to print file, it showed an actual File object that looks like this:
{
  lastModified: 1471604365544,
  lastModifiedDate: "Fri Aug 19 2016 18:59:25 GMT+0800 (PHT)",
  name: "imageFileName2.png",
  size: 228808,
  type: "image/png",
  webkitRelativePath: ""
}
The File is being uploaded successfully, but the result is broken: the preview on the Firebase Storage dashboard never stops loading, and when I tried to download it, I got a broken image.
To prove that my base64 image is not broken, here's a base64 pic of my cat Akasha, you can preview it in this base64 viewer.
In the 3.3.0 JS client, you can just upload a base64 string (docs)!
var imageBase64 = $scope.cropper.croppedImage.split(',')[1];
ref.putString(imageBase64, 'base64').then(function(snapshot) {
  console.log('Uploaded a base64 string!');
});
The Blob and File constructors take an array of binary parts (such as a Uint8Array), and a base64 string isn't one: you were creating a blob whose single part was the base64 text itself rather than the decoded image bytes, so the uploaded file wasn't a valid PNG. Use the upload above to solve the problem.
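If you do need a real Blob (for example to keep using put()), the base64 payload has to be decoded into bytes first; a minimal sketch:
function base64ToBlob(base64, type = 'image/png') {
  const binary = atob(base64); // base64 -> binary string
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i); // copy each byte
  }
  return new Blob([bytes], { type });
}
var blob = base64ToBlob($scope.cropper.croppedImage.split(',')[1]);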
