Firebase upload multiple files and get status - reactjs

I have a React form where the user can upload multiple files. These are stored in fileList
async function uploadFiles(id) {
  try {
    const meta = await storageUploadFile(fileList, id);
    console.log(meta);
  } catch (e) {
    console.log(e);
  }
}
This calls my helper function that uploads the files to Firebase
export const storageUploadFile = function (files, id) {
  const user = firebase.auth().currentUser.uid;
  return Promise.all(
    files.map((file) => {
      return storage.child(`designs/${user}/${id}/${file.name}`).put(file);
    })
  );
};
What I'd like is, on calling uploadFiles, to get the total file size of all items and then show the overall progress.
At the moment, my code only returns the file status in an array on completion:
[
  { bytesTransferred: 485561, totalBytes: 485561, state: "success" },
  { bytesTransferred: 656289, totalBytes: 656289, state: "success" }
]

This is the way I do it:
import Deferred from 'es6-deferred';

export const storageUploadFile = function (files, id) {
  const user = firebase.auth().currentUser.uid;
  // To track the remaining files
  let itemsCount = files.length;
  // To store our file refs
  const thumbRef = [];
  // Our main tasks
  const thumbUploadTask = [];
  // This will store our promises
  const thumbCompleter = [];
  for (let i = 0; i < files.length; i += 1) {
    thumbRef[i] = storage.ref(`designs/${user}/${id}/${files[i].name}`);
    thumbUploadTask[i] = thumbRef[i].put(files[i]);
    thumbCompleter[i] = new Deferred();
    thumbUploadTask[i].on('state_changed',
      (snap) => {
        // Here you can check the progress
        console.log(i, (snap.bytesTransferred / snap.totalBytes) * 100);
      },
      (error) => {
        thumbCompleter[i].reject(error);
      }, () => {
        const url = thumbUploadTask[i].snapshot.metadata.downloadURLs[0];
        itemsCount -= 1;
        console.log(`Items left: ${itemsCount}`);
        thumbCompleter[i].resolve(url);
      });
  }
  return Promise.all(thumbCompleter).then((urls) => {
    // Here we can see our file urls
    console.log(urls);
  });
};
Hope it helps.
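To get the overall progress the question asks about (sum the sizes of all files up front, then report one combined percentage), a similar pattern works without Deferred by wrapping each upload task in a plain Promise. This is only a minimal sketch, assuming the same storage reference and file list as above; onProgress is a hypothetical callback you would pass in (for example to update component state):

export const storageUploadFilesWithProgress = function (files, id, onProgress) {
  const user = firebase.auth().currentUser.uid;
  // Total size of all files, computed up front so we can report overall progress
  const totalBytes = files.reduce((sum, file) => sum + file.size, 0);
  const transferred = new Array(files.length).fill(0);
  return Promise.all(
    files.map((file, i) => {
      return new Promise((resolve, reject) => {
        const task = storage.child(`designs/${user}/${id}/${file.name}`).put(file);
        task.on('state_changed',
          (snap) => {
            // Record this file's bytes so far, then report the combined percentage
            transferred[i] = snap.bytesTransferred;
            const done = transferred.reduce((sum, b) => sum + b, 0);
            onProgress((done / totalBytes) * 100);
          },
          reject,
          () => resolve(task.snapshot));
      });
    })
  );
};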

Related

How to upload multiple audio files in React but send only 3 POST requests at a time

I want to select 100 audio files at a time but hit only 3 API calls at a time. Once these 3 files are uploaded (pass or fail), only then should the next 3 API requests be sent.
Basically I am providing an input field of file type:
<input type="file" multiple name="file" className="myform"
onChange={handleFileChange}
accept="audio/wav"
/>
and I am storing it as an array in state.
Below this I am providing an UPLOAD button.
When the user hits upload, I want to send 3 POST requests using axios. Once all 3 are done, either fail or pass, only then should the next 3 go.
You can do this by iterating the FileList collection in groups of 3 and sending the requests in parallel using Promise.allSettled().
Simply because I cannot recommend Axios, here's a version using the Fetch API
const BATCH_SIZE = 3;

const [fileList, setFileList] = useState([]);
const [uploading, setUploading] = useState(false);

const handleFileChange = (e) => {
  setFileList(Array.from(e.target.files)); // just a guess
};

const handleUploadClick = async (e) => {
  e.preventDefault();
  setUploading(true);
  const files = [...fileList]; // avoid mutation during long uploading process
  for (let i = 0; i < files.length; i += BATCH_SIZE) {
    const result = await Promise.allSettled(
      files.slice(i, i + BATCH_SIZE).map(async (file) => {
        const body = new FormData();
        body.append("file", file);
        const res = await fetch(UPLOAD_URL, { method: "POST", body });
        return res.ok ? res : Promise.reject(res);
      })
    );
    const passed = result.filter(({ status }) => status === "fulfilled");
    console.log(
      `Batch ${i + 1}: ${passed.length} of ${BATCH_SIZE} requests uploaded successfully`
    );
  }
  setUploading(false);
};
Promise.allSettled() will let you continue after each set of 3 are uploaded, whether they pass or fail.
This method makes 3 separate requests with 1 file each.
With Axios, it would look like this (just replacing the for loop)
for (let i = 0; i < files.length; i += BATCH_SIZE) {
  const result = await Promise.allSettled(
    files
      .slice(i, i + BATCH_SIZE)
      .map((file) => axios.postForm(UPLOAD_URL, { file }))
  );
  const passed = result.filter(({ status }) => status === "fulfilled");
  console.log(
    `Batch ${i + 1}: ${passed.length} of ${BATCH_SIZE} requests uploaded successfully`
  );
}
Axios' postForm() method is available from v1.0.0. See https://github.com/axios/axios#files-posting
If you want to send 3 files in a single request, it would look like this for Fetch
for (let i = 0; i < files.length; i += BATCH_SIZE) {
  const body = new FormData();
  files.slice(i, i + BATCH_SIZE).forEach((file) => {
    body.append("file", file); // use "file[]" for the first arg if required
  });
  try {
    const res = await fetch(UPLOAD_URL, { method: "POST", body });
    if (!res.ok) {
      throw new Error(`${res.status} ${res.statusText}`);
    }
    console.log(`Batch ${i + 1} passed`);
  } catch (err) {
    console.warn(`Batch ${i + 1} failed`, err);
  }
}
and this for Axios
for (let i = 0; i < files.length; i += BATCH_SIZE) {
  try {
    await axios.postForm(
      UPLOAD_URL,
      {
        file: files.slice(i, i + BATCH_SIZE),
      },
      {
        formSerializer: {
          indexes: null, // set to false if you need "[]" added
        },
      }
    );
    console.log(`Batch ${i + 1} passed`);
  } catch (err) {
    console.warn(`Batch ${i + 1} failed`, err.response?.data);
  }
}
You can use a combination of JavaScript's for loop and Promise.all functions to achieve this. First, you will need to divide your files array into chunks of 3. You can do this using a for loop and the slice method. Next, you can use Promise.all to send all the requests in parallel, and only move on to the next set of requests once all the promises in the current set have been resolved. Here's some sample code that demonstrates this approach:
const chunkSize = 3;
for (let i = 0; i < files.length; i += chunkSize) {
  const fileChunk = files.slice(i, i + chunkSize);
  const promises = fileChunk.map(file => {
    return axios.post('/api/upload', { file });
  });
  await Promise.all(promises);
}
This will send 3 POST requests at a time and will wait until all the requests are completed before sending another 3 API requests.
You can also use the useState hook with useEffect to set the state of files that are uploaded and use a variable to keep track of the number of files uploaded.
const [uploadedFiles, setUploadedFiles] = useState([]);
const [uploadCount, setUploadCount] = useState(0);

useEffect(() => {
  if (uploadCount === files.length) {
    // all files have been uploaded
    return;
  }
  const chunkSize = 3;
  const fileChunk = files.slice(uploadCount, uploadCount + chunkSize);
  const promises = fileChunk.map(file => {
    return axios.post('/api/upload', { file });
  });
  Promise.all(promises).then(responses => {
    setUploadedFiles([...uploadedFiles, ...responses]);
    setUploadCount(uploadCount + chunkSize);
  });
}, [uploadCount]);
This code will work for you.

Web worker causes a gradual increase of memory usage! How to use transferable objects?

I am trying to move web-worker logic into a React custom hook, but unfortunately I noticed that memory usage is gradually increasing. After some research, I found out that in order to transfer large data between web workers and the main thread, a good practice is to use transferable objects. I tried to add transferable objects, but every time I get the following errors:
// postMessage(arrayBuffer , '/', [arrayBuffer]) error:
Uncaught TypeError: Failed to execute 'postMessage' on 'DedicatedWorkerGlobalScope': Overload resolution failed.
// postMessage(arrayBuffer, [arrayBuffer]) error:
Uncaught DOMException: Failed to execute 'postMessage' on 'DedicatedWorkerGlobalScope': Value at index 0 does not have a transferable type.
Any ideas how I can solve this problem (any alternative solutions or any possible web worker improvements) and where the problem is?
web-worker main job:
- connect to a mqtt client
- subscribe to topics
- listen to changes for every topic, store all values into an object, and every 1 second send the stored topics data object to the main thread (note that the data is large)
custom hook main job:
- create a web-worker
- on every onmessage event, update the redux store
// react custom hook code
import React, { useEffect, useRef } from 'react';
import { useDispatch, useSelector } from 'react-redux';
import { setMqttData } from 'store-actions';

const useMqttService = () => {
  const dispatch = useDispatch();
  const topics = useSelector(state => state.topics);
  const workerRef = useRef<Worker>();

  useEffect(() => {
    workerRef.current = new Worker(new URL('../mqttWorker.worker.js', import.meta.url));
    workerRef.current.postMessage({ type: 'CONNECT', host: 'ws://path ...' });
    workerRef.current.onmessage = (event: MessageEvent): void => {
      dispatch(setMqttData(JSON.parse(event.data)));
      // dispatch(setMqttData(bufferToObj(event.data)));
    };
    return () => {
      if (workerRef.current) workerRef.current.terminate();
    };
  }, [dispatch]);

  useEffect(() => {
    if (workerRef.current) {
      workerRef.current.postMessage({ type: 'TOPICS_CHANGED', topics });
    }
  }, [topics]);

  return null;
};
// web-worker, mqttWorker.worker.js file code
import mqtt from 'mqtt';

export default class WorkerState {
  constructor() {
    this.client = null;
    this.topics = [];
    this.data = {};
    this.shareDataTimeoutId = null;
  }

  tryConnect(host) {
    if (host && !this.client) {
      this.client = mqtt.connect(host, {});
    }
    this.client?.on('connect', () => {
      this.data.mqttStatus = 'connected';
      this.trySubscribe();
    });
    this.client?.on('message', (topic, message) => {
      const value = JSON.parse(message.toString());
      this.data = { ...this.data, [topic]: value };
    });
  }

  trySubscribe() {
    if (this.topics.length > 0) {
      this.client?.subscribe(this.topics, { qos: 0 }, err => {
        if (!err) {
          this.tryShareData();
        }
      });
    }
  }

  tryShareData() {
    clearTimeout(this.shareDataTimeoutId);
    if (this.client && this.topics.length > 0) {
      postMessage(JSON.stringify(this.data));
      // Attempt 1, error:
      // Uncaught TypeError: Failed to execute 'postMessage' on
      // 'DedicatedWorkerGlobalScope': Overload resolution failed.
      // const arrayBuffer = objToBuffer(this.data);
      // postMessage(arrayBuffer, '/', [arrayBuffer]);
      // Attempt 2, error:
      // Uncaught DOMException: Failed to execute 'postMessage' on
      // 'DedicatedWorkerGlobalScope': Value at index 0 does not have a transferable type.
      // const arrayBuffer = objToBuffer(this.data);
      // postMessage(arrayBuffer, [arrayBuffer]);
      this.shareDataTimeoutId = setTimeout(() => {
        this.tryShareData();
      }, 1000);
    }
  }

  onmessage = (data) => {
    const { type, host = '', topics = [] } = data;
    if (type === 'CONNECT_MQTT') {
      this.tryConnect(host);
    } else if (type === 'TOPICS_CHANGED') {
      this.topics = topics;
      this.trySubscribe();
    }
  };
}

const workerState = new WorkerState();

self.onmessage = (event) => {
  workerState.onmessage(event.data);
};
// transform functions
function objToBuffer(obj) {
  const jsonString = JSON.stringify(obj);
  return Buffer.from(jsonString);
}

function bufferToObj(buffer) {
  const jsonString = Buffer.from(buffer).toString();
  return JSON.parse(jsonString);
}
I updated the transform functions:
function objToBuffer(obj) {
  // const jsonString = JSON.stringify(obj);
  // return Buffer.from(jsonString);
  const jsonString = JSON.stringify(obj);
  const uint8_array = new TextEncoder().encode(jsonString);
  const array_buffer = uint8_array.buffer;
  return array_buffer;
}

function bufferToObj(array_buffer) {
  // const jsonString = Buffer.from(array_buffer).toString();
  // return JSON.parse(jsonString);
  const decoder = new TextDecoder('utf-8');
  const view = new DataView(array_buffer, 0, array_buffer.byteLength);
  const string = decoder.decode(view);
  const object = JSON.parse(string);
  return object;
}
In the web-worker file, add:
const arrayBuffer = objToBuffer(this.data);
postMessage(arrayBuffer, [arrayBuffer]);
Finally, in the custom hook's onmessage handler, add:
dispatch(setMqttData(bufferToObj(event.data)));
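Transferring the buffer hands it to the main thread instead of copying it, so after the call the worker-side ArrayBuffer is detached (its byteLength drops to 0) and the worker no longer retains the data. A small sketch of that behaviour inside tryShareData, using the objToBuffer helper above:

const arrayBuffer = objToBuffer(this.data);
postMessage(arrayBuffer, [arrayBuffer]); // the second argument lists the transferables
console.log(arrayBuffer.byteLength); // 0: the buffer is now detached in the worker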

How to wait for .map() to finish and generate new keys in the array[index]

I'm trying to generate an array with values as follows:
{ name: 'John', age: 35, employer: 'ABC', paycheck: 5000, last_paycheck: 4900, change: 100 } // new array
with the initial values in the array as follows:
{ name: 'John', age: 35, employer: 'ABC' } // initial array
The function convertData() is handling all the array conversion.
async function convertData(data) {
  if (data.length === 0) return data;
  // generates new array
  const convertedDataArray = await data.map(async (row) => {
    let name = row.name;
    let paycheck = 0;
    let last_paycheck = 0;
    let change = 0;
    const response = await axios.get('/getData', {
      params: {
        name,
      },
    });
    let apiData = response.data.data;
    if (apiData.length > 0) {
      let newData = apiData[0];
      let oldData = apiData[1];
      change = newData.payCheck - oldData.payCheck;
      paycheck = newData.payCheck;
      last_paycheck = oldData.payCheck;
    }
    console.log(apiData); // prints records up to 100 elements
    return { ...row, paycheck, last_paycheck, change };
  });
  console.log(convertedDataArray); // prints [Promise]
  return Promise.all(convertedDataArray).then(() => {
    console.log(convertedDataArray); // prints [Promise]
    return convertedDataArray;
  });
};
where convertData() is called:
const response = await axios.get('/getEmployees', {
  params: {
    token: id,
  },
});
const dataRows = response.data; // initial array
const tableRows = await convertData(dataRows);
return Promise.all(tableRows).then(() => {
  console.log(tableRows); // prints [Promise]
  dispatch(setTableRows(tableRows));
});
I'm not sure why I keep getting a Promise back; I am still learning how to use promises correctly. Any help would be great, thank you in advance!
You should get an array of promises and use Promise.all to get all the data first.
Then use the map() function to construct your data structure.
Example below:
async function convertData(data) {
  try {
    if (data.length === 0) return data;
    const arrayOfPromises = data.map(row =>
      axios.get("/getData", {
        params: {
          name: row.name,
        },
      })
    );
    const arrayOfData = await Promise.all(arrayOfPromises);
    const convertedDataArray = arrayOfData.map((response, i) => {
      const apiData = response.data.data;
      let paycheck = 0;
      let last_paycheck = 0;
      let change = 0;
      if (apiData.length > 0) {
        const newData = apiData[0];
        const oldData = apiData[1];
        change = newData.payCheck - oldData.payCheck;
        paycheck = newData.payCheck;
        last_paycheck = oldData.payCheck;
      }
      return { ...data[i], paycheck, last_paycheck, change };
    });
    return convertedDataArray;
  } catch (err) {
    throw new Error(err);
  }
}

(async function run() {
  try {
    const response = await axios.get("/getEmployees", {
      params: {
        token: id,
      },
    });
    const dataRows = response.data;
    const tableRows = await convertData(dataRows);
    dispatch(setTableRows(tableRows));
  } catch (err) {
    console.log(err);
  }
})();
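For what it's worth, the original approach was close: data.map(async ...) already produces an array of promises, so awaiting Promise.all on it and returning the resolved values (rather than the promise array itself) would also work. A minimal sketch of that variant, kept to the same endpoint and fields as the question:

async function convertData(data) {
  if (data.length === 0) return data;
  // Promise.all resolves to the array of plain objects returned by each async callback
  const convertedDataArray = await Promise.all(
    data.map(async (row) => {
      const response = await axios.get('/getData', { params: { name: row.name } });
      const apiData = response.data.data;
      let paycheck = 0;
      let last_paycheck = 0;
      let change = 0;
      if (apiData.length > 0) {
        paycheck = apiData[0].payCheck;
        last_paycheck = apiData[1].payCheck;
        change = paycheck - last_paycheck;
      }
      return { ...row, paycheck, last_paycheck, change };
    })
  );
  return convertedDataArray; // an array of objects, not promises
}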

Lifecycle of useState hook in React.js

I have the following synchronization problem. Given that I know the React useState hook is asynchronous, I run into the following: I'm downloading some images from Amazon S3 and I manage to save them correctly in my hook, defaultSelfiePictures, but depending on the size of the image (or so I think) sometimes the images load correctly and sometimes not. I have tried to force state changes after I finish saving the object in my hook, but it never renders the image; only if I switch to another component and come back is it shown, in the cases that take longer to load.
const [defaultSelfiePictures, setDefaultSelfiePictures] = useState([])

useEffect(() => {
  if (savedUser.docs !== undefined) {
    loadAllPictures()
  }
}, [savedUser.docs.length])

const loadAllPictures = () => {
  let p1 = loadUrlDefaultFrontPictures()
  let p2 = loadUrlDefaultBackPictures()
  let p3 = loadUrlDefaultSelfiePictures()
  Promise.all([p1, p2, p3]).then(result => {
    console.log('end all promises')
    setTimestamp(Date.now())
  })
}

const loadUrlDefaultSelfiePictures = async () => {
  if (savedUser.docs.length > 0) {
    let readedPictures = []
    for (let i = 0; i < savedUser.docs.length; i++) {
      if (
        savedUser.docs[i].type === 'SELFIE'
        //&& savedUser.docs[i].side === 'FRONT'
      ) {
        if (
          savedUser.docs[i].s3Href !== null &&
          savedUser.docs[i].s3Href !== undefined
        ) {
          const paramsKeyArray = savedUser.docs[i].s3Href.split('')
          let paramsKey = paramsKeyArray.pop()
          let params = {
            Bucket: process.env.REACT_APP_S3_BUCKET,
            Key: paramsKey
          }
          await s3.getSignedUrl('getObject', params, function (err, url) {
            readedPictures.push({
              idKycDoc: savedUser.docs[i].idKycDoc,
              name: 'selfie.jpeg',
              type: savedUser.docs[i].type,
              url: url
            })
          })
        } else {
          let urlPicture = savedUser.docs[i].localHref
          let response = await axios.get(`${URL_IMG}${urlPicture}`, {
            responseType: 'blob'
          })
          function readAsDataURL(data) {
            return new Promise((resolve, reject) => {
              const reader = new FileReader()
              reader.readAsDataURL(data)
              reader.onloadend = () => {
                resolve(reader.result)
              }
            })
          }
          const base64Data = await readAsDataURL(response.data)
          readedPictures.push({
            idKycDoc: savedUser.docs[i].idKycDoc,
            name: 'selfie.jpeg',
            type: savedUser.docs[i].type,
            url: `data:image/jpeg;base64,${base64Data.slice(21)}`
          })
        }
      }
    }
    setDefaultSelfiePictures(readedPictures)
  }
}
And I obtain this:
I can see that the hook has content, but that content is not updated until the next rendering of the component; also, if I try to make any changes when I detect that the .length has changed, it tells me that it is 0...
And right after the next render I get this:
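One likely culprit (an assumption on my part, not something confirmed in the question) is that s3.getSignedUrl is called with a callback, so awaiting it does not actually wait: the pushes into readedPictures happen after setDefaultSelfiePictures has already run with an empty array. A minimal sketch of wrapping the call in a Promise so the loop really waits before the state update:

const getSignedUrlAsync = (params) =>
  new Promise((resolve, reject) => {
    s3.getSignedUrl('getObject', params, (err, url) => {
      if (err) reject(err)
      else resolve(url)
    })
  })

// inside the for loop, instead of awaiting the callback version:
const url = await getSignedUrlAsync(params)
readedPictures.push({
  idKycDoc: savedUser.docs[i].idKycDoc,
  name: 'selfie.jpeg',
  type: savedUser.docs[i].type,
  url: url
})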

React Ant Design multiple files upload doesn't work

I'm in the process of sending multiple files from React.js to a backend via formData.append().
At the backend (Spring Boot), I was able to verify with Postman that multiple files were saved correctly.
The problem occurred in React.
(I'm using "Ant Design", which is a React UI library.)
Below is the source that appends the files to FormData along with extra data.
const formData = new FormData();
formData.append('webtoonId', this.state.selectedToonId);
formData.append('epiTitle', this.state.epiTitle);
formData.append('eFile', this.state.thumbnail[0].originFileObj);
for (let i = 0; i < this.state.mains.length; i++) {
  formData.append('mFiles', this.state.mains[i].originFileObj);
}
uploadEpi(formData)
uploadEpi() is the POST API call.
Below is the state.
this.state = {
  toons: [],
  epiTitle: '',
  thumbnail: [],
  mains: [],
  selectedToonID: ''
}
When I submit, the text and the single file are stored in the DB normally, but the multiple files cannot be saved.
There was no error; the multiple files just weren't saved.
The state "mains" is configured as shown below.
I guess it's because I'm using Ant Design incorrectly.
(Ant Design : https://ant.design/components/upload/)
The reason I guessed so is that when I add the multiple attribute to <Dragger> like below,
<Dragger onChange={this.onChangeMain} beforeUpload={() => false} multiple={true}>
the state "mains" multiple files became undefined.
Below is onChange={this.onChangeMain}
onChangeMain = ({ fileList }) => {
  this.setState({ mains: fileList }, function () {
    console.log(this.state)
  });
}
The bottom line is, I want to know how to upload multiple files through <Upload> (or <Dragger>) in "React Ant Design."
I don't know what I should do.
This is my GitHub for this project.
I'd appreciate your help. Thanks.
const [loading, setLoading] = useState<boolean>(false);
const [fileList, setFileList] = useState<any[]>([]);
const [/* fileListBase64 */, setFileListBase64] = useState<any[]>([]);

const propsUpload = {
  onRemove: (file: any) => {
    const index = fileList.indexOf(file);
    const newFileList: any = fileList.slice();
    newFileList.splice(index, 1);
    return setFileList(newFileList);
  },
  beforeUpload: (file: any) => {
    setFileList([...fileList, file]);
    return false;
  },
  onChange(info: any) {
    setLoading(true);
    const listFiles = info.fileList;
    setFileList(listFiles);
    const newArrayFiles = listFiles.map((file: any) => file.originFileObj ? file.originFileObj : file);
    const anAsyncFunction = async (item: any) => {
      return convertBase64(item);
    };
    const getData = async () => {
      return Promise.all(newArrayFiles.map((item: any) => anAsyncFunction(item)));
    };
    getData().then(data => {
      /* setFileSend(data) */
      setFileListBase64(data);
      setLoading(false);
      // console.log(data);
    });
  },
  directory: true,
  fileList: fileList,
};

const convertBase64 = (file: File) => {
  return new Promise((resolve, reject) => {
    const fileReader = new FileReader();
    fileReader.readAsDataURL(file);
    fileReader.onload = () => {
      resolve(fileReader?.result);
    };
    fileReader.onerror = (error) => {
      reject(error);
    };
  });
};

const handleDeleteListFiles = () => {
  setFileList([]);
  setFileListBase64([]);
};
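The answer doesn't show the JSX, but presumably these props are spread onto the antd Upload (or Dragger) component inside the same component that defines them. A minimal sketch of that wiring, assuming the propsUpload, loading, and handleDeleteListFiles values above:

import { Upload, Button } from 'antd';

// Rendered by the same component that defines propsUpload and loading.
// beforeUpload returning false stops antd from uploading automatically,
// so the selected files simply accumulate in the fileList state.
return (
  <>
    <Upload {...propsUpload} multiple>
      <Button loading={loading}>Select files</Button>
    </Upload>
    <Button onClick={handleDeleteListFiles}>Clear</Button>
  </>
);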
It seems like you are overriding the value of mFiles.
const formData = new FormData();
formData.append('webtoonId', this.state.selectedToonId);
formData.append('epiTitle', this.state.epiTitle);
formData.append('eFile', this.state.thumbnail[0].originFileObj);

let mFiles = [];
for (let i = 0; i < this.state.mains.length; i++) {
  mFiles[i] = this.state.mains[i].originFileObj;
}
formData.append('mFiles', mFiles);
uploadEpi(formData)
Maybe this can work: formData.append('mFiles[]', mFiles)
If you add [] to the string it should not overwrite but add to the array
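Either way, one way to verify what actually ends up in the request is to inspect the FormData entries before sending. As an aside (not from the answer above): appending a plain JavaScript array serializes it to a single string value, so appending each File under the same key, as the original loop does, is what produces multiple file parts.

for (const [key, value] of formData.entries()) {
  console.log(key, value); // each appended File shows up as its own entry
}
console.log(formData.getAll('mFiles')); // every value appended under the 'mFiles' key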
