I am stuck on a file upload flow in a React app. In the app, I am trying to upload local video files to Google Cloud Storage.
I take the input file from this element:
<input type={`file`} accept=".mp4" onChange={VideoSelectChangeFunc} />
In VideoSelectChangeFunc, I get a local object URL for the selected file:
let file = URL.createObjectURL(event.target.files[0])
Then I pass it to axios to send it to the cloud:
export const UploadVideo = async (file, signedurl, asset_uuid) => {
  let resultState = { state: '', data: {} };
  await axios({
    method: 'put',
    url: signedurl,
    data: file,
    headers: {
      'Content-Type': 'application/octet-stream',
    },
  }).then(function (response) {
    resultState.state = 'success';
    resultState.data = response.data;
  }).catch(function (error) {
    resultState.state = 'error';
    resultState.data.message = error.message;
    window.toastr.error(error.message);
    console.log(error);
  });
  return resultState;
};
What I see on cloud is:
blob:http://localhost:3000/9b650cbf-8b49-440b-9e90-da6bdb5d392a
This is just the local object URL of the file stored as a string, not the video itself; when I copy and paste it into the browser, I can see the video. I searched around and saw that 'Content-Type': 'blob' would solve the problem. However, we check headers in our CORS policy, so it has to stay 'Content-Type': 'application/octet-stream'. Is there a way to work this out?
Before sending it, converting the blob URL back into a File worked. I only added these two lines and then called axios:
let blob = await fetch(blobURL).then(r => r.blob());
var file = new File([blob], "thisVideo.mp4",{type:"video/mp4", lastModified:new Date().getTime()})
This can be useful in situations where the file is not uploaded right away but the URL is saved temporarily to be used later, which was the case here. If you are interested, visit this question too:
How to get a file or blob from an object URL?
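Putting it together, a rough sketch of how this fits into the flow above (the handler name, the state setter, and where the signed URL comes from are all just illustrative):

// Illustrative handler: keep the object URL (or the File itself) in state for later
const VideoSelectChangeFunc = (event) => {
  const blobURL = URL.createObjectURL(event.target.files[0]);
  setSelectedVideoURL(blobURL); // assumed state setter; the upload happens later
};

// When the upload is actually triggered
const startUpload = async (blobURL, signedurl, asset_uuid) => {
  // Turn the temporary object URL back into a real File before sending it
  const blob = await fetch(blobURL).then((r) => r.blob());
  const file = new File([blob], 'thisVideo.mp4', { type: 'video/mp4', lastModified: Date.now() });
  return UploadVideo(file, signedurl, asset_uuid); // now the binary body is uploaded, not the URL string
};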
We have a requirement where we have to read 1-10000 image URLs from one location. Sample array:
[
  {
    "asin": "B00HZ9Q8XM",
    "image": "https://m.media-amazon.com/images/I/41gKlxTYnkL._SL500_.jpg"
  },
  {
    "asin": "B00JOW20TY",
    "image": "https://m.media-amazon.com/images/I/511Ae304D5L._SL500_.jpg"
  }
]
We need to loop through the above array, download or read each image, and call an external Node API which accepts the image as a file with multipart/form-data; the file type can only be jpg.
We want to do this without saving files on the local server, as there will be hundreds of such requests and we don't want to fill up server space.
What we have tried so far: we can download the image as a blob or arrayBuffer like the following:
export const downloadImageFromUrl = async (url) => {
  // const response = await fetch({
  //   url,
  //   method: 'GET',
  //   responseType: 'stream'
  // });
  const response = await axios({
    url,
    method: 'GET',
    responseType: 'blob'
  });
  console.log('response ', response)
  // fs.writeFileSync('./temp.jpg', res.data);
  return response;
}
But when we send it to the Node API, it says the file type is blob and rejects our request. Can you please advise on the best approach to achieve this?
Just use the arraybuffer response type.
P.S. Blobs are not currently supported by Axios on the Node.js platform:
const response = await axios({
  url,
  method: 'GET',
  responseType: 'arraybuffer'
});
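As a rough sketch of the rest of the flow under that advice (the form-data package, the endpoint URL, and the 'file' field name are assumptions about the external API, so adjust them to whatever it actually expects):

const axios = require('axios');
const FormData = require('form-data'); // builds multipart/form-data bodies in Node

const forwardImage = async (item) => {
  // Download the image into memory; nothing is written to disk
  const { data } = await axios({ url: item.image, method: 'GET', responseType: 'arraybuffer' });

  // Attach the bytes as a named .jpg file part
  const form = new FormData();
  form.append('file', Buffer.from(data), { filename: `${item.asin}.jpg`, contentType: 'image/jpeg' });

  // form.getHeaders() supplies the multipart boundary header
  return axios.post('https://external-api.example.com/upload', form, { headers: form.getHeaders() });
};

From there, looping over the array is just a matter of calling forwardImage for each entry, ideally with some concurrency limit if it really is 10000 URLs.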
I am trying to upload a file to the server, and the server APIs are written using Django. The file upload works perfectly from Postman, but when I try to upload from the mobile app (React Native) using axios, the backend is not able to read it.
Following is the frontend snippet:
let accessToken = await AsyncStorage.getItem('accessToken')
let formData = new FormData()
formData.append('doc_type', this.state.selectedDoc.id)
formData.append('document', this.state.selectedFiles) // <- This is the fetched file in array format . [{filname:'abc', size:12344,.....}]
formData.append('description', this.state.description.value)
formData.append('data', JSON.stringify(this.state.selectedDoc.fields))
let url = `${AppConstants.url}api/${AppConstants.apiVersion}/upload_doc`
var config = {
  method: 'post',
  url: url,
  data: formData,
  headers: {
    'Authorization': `Bearer ${accessToken}`,
  }
}
axios(config)
  .then((resp) => {
    resolve(resp)
  })
  .catch((err) => {
    reject(err)
  });
And the backend if-else statement is as follows:
if request.FILES.getlist("document"):
    files = request.FILES.getlist("document")
    ....
    ....
    ....
else:
    return response.JsonResponse({
        'success': False,
        'message': 'Please Upload a file'
    }, status=status.HTTP_200_OK)
The above else block is executed even though the UI is sending a valid file.
Could you please share a solution?
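One thing worth double-checking on the React Native side: FormData there generally expects each file to be appended as a single { uri, name, type } object rather than the raw array returned by a picker, and appending anything else tends to arrive at Django as a plain string instead of a file. A minimal sketch under that assumption (the picker field names are purely illustrative):

let formData = new FormData()
formData.append('doc_type', this.state.selectedDoc.id)

// Append each picked file as a { uri, name, type } object, not the raw array
this.state.selectedFiles.forEach((f) => {
  formData.append('document', {
    uri: f.uri,                               // e.g. 'file:///...' from the picker (illustrative)
    name: f.filename || 'document.pdf',       // illustrative fallback name
    type: f.type || 'application/octet-stream',
  })
})

formData.append('description', this.state.description.value)
formData.append('data', JSON.stringify(this.state.selectedDoc.fields))
// Leave Content-Type unset so the multipart boundary is generated automatically

If the files are already appended in that shape, the next thing to look at is whether anything is overriding the multipart Content-Type header.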
I have a client-side app in React that needs to request files from a backend server (Kotlin + Spring Boot) in order to download them.
Using the request endpoint in Swagger, Postman, and Insomnia, I can successfully download any file of any size.
In my client-side app, I have a list of these files, and a download can be triggered by clicking an icon. I can download files smaller than 10MB with no error, but when a file is larger than 10MB, it fails with a Failed to fetch error.
It is actually weird behavior. Let's say I have FILE A, which is under 10MB, and FILE B at 25MB (the max size allowed for upload). On a fresh page load, if my first request is to download FILE B, it throws Failed to fetch. But if the first request is for FILE A and FILE B comes after, the FILE B download succeeds. I'm really confused about what is going on here.
Code:
const options = {
  method: 'GET',
  headers: { "Authorization": `Bearer ${user?.token}` },
};

fetch(`http://localhost:8080/storage/download?fileName=${filePath}`, options)
  .then(function (response) {
    return response.blob();
  })
  .then(function (myBlob) {
    setSpinControl(false);
    const file = new Blob(
      [myBlob],
      { type: 'application/pdf' }
    );
    const fileURL = URL.createObjectURL(file);
    if (window) {
      window.open(fileURL, '_blank');
    }
  })
  .catch((err) => {
    setSpinControl(false);
    console.log(err)
  });
I already tried some alternatives:
Using axios (throws a Network Error);
Using libraries such as file-saver;
Setting the timeout to 9999999.
All show the same behavior.
I also read that createObjectURL uses memory to perform the download, but the max file size is validated to be 25MB.
(Screenshots of the Network tab were attached here: the request header, the request response, and the network list.)
Any tips on what I can do here?
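One debugging note rather than a fix: fetch only rejects with "Failed to fetch" on network-level problems (CORS failures, dropped or reset connections), never because of an HTTP error status, so it can help to surface the status explicitly and see which of the two cases this is. A minimal variant of the handler above, with the endpoint and state setter unchanged:

fetch(`http://localhost:8080/storage/download?fileName=${filePath}`, options)
  .then((response) => {
    // fetch resolves even for 4xx/5xx; only network/CORS failures reject with "Failed to fetch"
    if (!response.ok) {
      throw new Error(`Download failed with HTTP ${response.status}`);
    }
    return response.blob();
  })
  .then((myBlob) => {
    setSpinControl(false);
    const fileURL = URL.createObjectURL(new Blob([myBlob], { type: 'application/pdf' }));
    window.open(fileURL, '_blank');
  })
  .catch((err) => {
    setSpinControl(false);
    console.log(err); // a rejection here points at the network/CORS layer, not the response body
  });

If the failed FILE B request never shows up in the backend logs at all, that would point at something in front of the controller (proxy or CORS preflight) rather than at this handler.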
I get a pop-up "Do you want to allow this website to open an app on your computer?" in IE11 when a PDF is downloaded.
With the code below in Angular (TypeScript), the correct pop-up "Do you want to open or save the file?" is shown, but so is "Do you want to allow this website to open an app on your computer?"
const headerOptions = new HttpHeaders({
  // 'Cache-Control': 'private',
  // 'Content-Disposition': 'attachment; filename = ' + filename,
  'Content-Type': 'application/pdf'
});

const requestOptions = {
  headers: headerOptions,
  responseType: 'blob' as 'blob'
};

this.http
  .post(
    `${this.url}?id=${id}&datasource=${datasource}&device=${device}&browser=${browser}&link=${link}`,
    dataObj,
    requestOptions
  )
  .catch(error => {
    return this.clickHandlerError(error);
  })
  .pipe(
    map((data: any) => {
      const blob = new Blob([data], {
        type: 'application/pdf'
      });
      window.navigator.msSaveOrOpenBlob(blob, filename);
    })
  )
  .subscribe((result: any) => {});
I expect to have just the correct pop-up to open or save the file.
This is a client-side message letting the user know that this action will cause an application to open on the user's system. If you want this to stop, then you'll need to configure the client machine, not the web application.
I suggest you check your IE options and disable this prompt.
If the issue persists, resetting your IE settings can help to fix it.
I'm trying to PUT a video file to my bucket using a pre-signed URL in Angular 4.
Node:
let s3 = new AWS.S3();
s3.config.update({
  accessKeyId: process.env.VIDEO_ACCESS_KEY,
  secretAccessKey: process.env.VIDEO_SECRET_KEY
})

let videoId = await Video.createVideo()

let params = {
  ACL: "public-read",
  Bucket: process.env.BUCKET_NAME,
  ContentType: 'video/mp4',
  Expires: 100,
  Key: req.jwt.username + "/" + videoId,
}

return s3.getSignedUrl('putObject', params, function (err, url) {
  if (!err) {
    console.log(url);
    res.status(200);
    res.json({
      url: url,
      reference: `${process.env.BUCKET_NAME}/${req.jwt.username}/${videoId}`,
      acl: params.ACL,
      bucket: params.Bucket,
      key: params.Key,
      contentType: params.ContentType,
    });
  } else {
    console.log(err);
    res.status(400);
    res.json({
      message: "Something went wrong"
    })
  }
});
This successfully generates a URL for me, and I try to use it in my POST request on the front end.
Angular:
this.auth.fileUpload().subscribe((result) => {
  console.log(result["key"], result["acl"], result["bucket"], result["contentType"])
  if (!result["message"]) {
    let formData = new FormData();
    formData.append('file', file.files[0]);
    const httpOptions = {
      headers: new HttpHeaders({
        "Key": result["key"],
        "ACL": result["acl"],
        "Bucket": result["bucket"],
        "Content-Type": result["contentType"],
      })
    };
    this.http.post(result["url"], formData, httpOptions).subscribe((response) => {
      console.log("response");
      console.log(response);
      let reference = `https://s3.amazonaws.com/${result["reference"]}`
      this.auth.makeVideo(result["reference"]).subscribe((result) => {
        console.log(result);
      });
    }, (error) => {
      console.log("error");
      console.log(error);
    })
  }
});
But this generates an error:
SignatureDoesNotMatch
The request signature we calculated does not match the signature you provided. Check your key and signing method
Here's the URL that I generate
https://MY_BUCKET_HERE.s3.amazonaws.com/admin/87f314f1-9f2e-462e-84ff-25cba958ac50?AWSAccessKeyId=MY_ACCESS_KEY_HERE&Content-Type=video%2Fmp4&Expires=1520368428&Signature=Ks0wfzGyXmBTiAxGkHNgcYblpX8%3D&x-amz-acl=public-read
I'm pretty sure I'm just making a simple mistake, but I can't figure it out for the life of me. Do I need to do something with my headers? Do I need to change the way I read the file for the POST? I've gotten it to work with a public bucket using FormData and a simple POST request with no headers, but now that I'm working with policies and a private bucket, my understanding is much weaker. What am I doing wrong?
If you generate a pre-signed URL for PutObject then you should use the HTTP PUT method to upload your file to that pre-signed URL. The POST method won't work (it's designed for browser uploads).
Also, don't supply HTTP headers when you invoke the PUT. They should be supplied when generating the pre-signed URL, but not when using the pre-signed URL.
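As a rough sketch of what the Angular side could look like under that advice (fileInput is just an illustrative reference to the file input element; the URL above was signed with ContentType: 'video/mp4', and HttpClient should pick that Content-Type up from the File's own type when the File is sent directly):

const file: File = fileInput.files[0]; // illustrative: the selected .mp4 File

this.http.put(result["url"], file, { observe: 'response', responseType: 'text' as 'text' })
  .subscribe((response) => {
    console.log(response.status); // S3 answers 200 with an empty body on success
    this.auth.makeVideo(result["reference"]).subscribe((res) => console.log(res));
  }, (error) => {
    console.log(error);
  });

Note there is no FormData and no manually set Key/ACL/Bucket headers; the key and ACL are already encoded in the pre-signed URL's path and query string.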