How do I upload pdf file/blob to S3? - reactjs

I'm using the react-pdf library and it generates a blob and URL as shown in the docs: https://react-pdf.org/advanced#on-the-fly-rendering
Blob object copied from the console:
{blob: Blob}
blob: Blob
size: 3597607
type: "application/pdf"
[[Prototype]]: Blob
[[Prototype]]: Object
I'm trying to upload this to S3.
Frontend code:
const fileName = Date.now().toString() + ".pdf";
const file = new File([blob], fileName);

axios
  .post("api/upload-poster", {
    fileName,
    file,
  })
  .then(({ data }) => console.log(data));
Backend code (nextjs handler):
const { fileName, file } = req.body;

const params = {
  Bucket: "my-bucket-name",
  Key: fileName,
  Body: file,
  contentType: "application/pdf",
};

const uploaded = await S3.upload(params).promise();

res.status(200).json({ uploaded });
I get the error Error: Unsupported body payload object

Based on the documentation for S3.upload, you need to pass a Buffer, Blob object, or stream as the Body.
Pass the blob object (not the blob URL) back to your API as part of params and the rest of your code should work fine.
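Since a File/Blob does not survive JSON serialization (it arrives on the server as an empty object), one way to apply this advice is to encode the blob before sending it and rebuild a Buffer in the Next.js handler. This is only a minimal sketch, assuming the same axios call and handler as in the question; note that Next.js API routes limit JSON bodies to roughly 1 MB by default, so the route's bodyParser sizeLimit may need raising for a multi-megabyte PDF.
// Frontend: encode the blob as base64 so it can travel in the JSON body
const toBase64 = (blob) =>
  new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onload = () => resolve(reader.result.split(",")[1]);
    reader.onerror = reject;
    reader.readAsDataURL(blob);
  });

const fileName = Date.now().toString() + ".pdf";
const file = await toBase64(blob);
axios.post("api/upload-poster", { fileName, file }).then(({ data }) => console.log(data));

// Backend (Next.js handler): decode back into a Buffer, which S3.upload accepts as Body
const { fileName, file } = req.body;
const params = {
  Bucket: "my-bucket-name",
  Key: fileName,
  Body: Buffer.from(file, "base64"),
  ContentType: "application/pdf",
};
const uploaded = await S3.upload(params).promise();
res.status(200).json({ uploaded });
A multipart upload with FormData would avoid the base64 size overhead, but this keeps the existing JSON-based route intact.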

Firebase Storage not displaying image properly (shows a small box)

EDIT: I've updated the CORS config but it's still showing the same error.
I have a TinyMCE RTE on my page, and when you drop an image into the editor, I have some functions that upload it to Firebase Storage, then swap out the src in the text editor with the URL fetched from Firebase. It works mostly OK, but the image is displayed as a broken-link icon.
When I checked the link, it was because originally the image gets downloaded when the link is clicked. I added a metadata property when it uploads, but now it just shows a tiny box.
Here is the code where the image dropped into the editor is uploaded to Firebase Storage:
const imagesUploadHandler = async (blobInfo, success, failure) => {
  try {
    const file = blobInfo.blob();
    const storageRef = ref(storage, file.name);
    const metadata = {
      contentType: 'image/jpeg',
    };
    await uploadBytes(storageRef, file, metadata);
    const url = await getDownloadURL(storageRef);
    console.log(url);
    return url;
  } catch (error) {
    // Call the failure callback with the error message
    console.log(error.message);
  }
};
Originally, I didn't include the contentType metadata, and it was uploading as application/octet-stream, which I assume is why it prompts you to save the image.
Image link: https://firebasestorage.googleapis.com/v0/b/cloudnoise-news.appspot.com/o/ref.jpg?alt=media&token=1edc90e7-1668-4a06-92a3-965ce275798b
Currently it's displaying a tiny box instead of the image.
Some things I checked:
Firebase Storage rules are in test mode, so anyone should be able to read and write.
I tried sticking in different MIME types, but it either shows the tiny box or it shows "undefined".
The files upload successfully and the "swap" in the TinyMCE editor is also all good.
Any idea why this is happening?
You need to set the contentType metadata to the file's actual type:
const metadata = {
  contentType: file.type,
};
This should ensure that the correct content type is set when the image is uploaded to Firebase Storage.
If this does not resolve the issue, you may need to check that the URL returned from getDownloadURL is valid and points to the correct image. You can try opening the URL in a new browser tab to verify that the image is accessible.
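If you want to confirm what was actually stored, Firebase Storage also lets you read the object's metadata back after the upload. A minimal sketch, assuming the same modular firebase/storage imports used in the handler above:
import { getMetadata } from "firebase/storage";

// After uploadBytes(...) resolves, check which content type Firebase stored.
const stored = await getMetadata(storageRef);
console.log(stored.contentType, stored.size);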
I fixed it by adding a Blob: I created a Blob object with the file data and uploaded that instead of the file directly.
const imagesUploadHandler = async (blobInfo, success, failure) => {
  try {
    const file = blobInfo.blob();
    const storageRef = ref(storage, file.name);
    const metadata = {
      contentType: file.type,
    };
    // Create a new Blob object with the file data
    const blob2 = new Blob([file], { type: file.type });
    // Upload the Blob to Firebase Storage
    await uploadBytes(storageRef, blob2, metadata);
    const url = await getDownloadURL(storageRef);
    console.log(url);
    return url;
  } catch (error) {
    // Log the error message
    console.log(error.message);
  }
};

How do I download a file in React when the server returns a gzip file?

I want to download a gzip file from the server. I have an API that returns a gzip file, so I have to call that API from React and download the gzip file on the client side.
Here is the code where I call the API:
const res = await API.get(`${baseUrl}/${advId}/${type}`, {
  params: { campaign_id: id, start_date: startDate, end_date: endDate },
});
const data = res.data;
const url = URL.createObjectURL(new Blob(data));
const link = document.createElement('a');
link.href = url;
link.click();
I tried to convert the response from the server into a Blob, but got the error Unhandled Rejection (TypeError): Failed to construct 'Blob': The provided value cannot be converted to a sequence. The response data should be a gzip file.
I can think of two things you may try (a combined sketch follows below):
if you are using axios, make sure to set the responseType to 'arraybuffer': axios.get(`${baseUrl}/${advId}/${type}`, { responseType: 'arraybuffer' })
use an array to create your Blob, as in new Blob([response.data])
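Putting both suggestions together, a minimal sketch based on the question's code; the filename, the application/gzip type, and the download attribute are illustrative assumptions:
// Fetch the gzip as binary data rather than text
const res = await API.get(`${baseUrl}/${advId}/${type}`, {
  params: { campaign_id: id, start_date: startDate, end_date: endDate },
  responseType: 'arraybuffer',
});

// Wrap the binary data in an array when constructing the Blob
const blob = new Blob([res.data], { type: 'application/gzip' });
const url = URL.createObjectURL(blob);

const link = document.createElement('a');
link.href = url;
link.download = 'report.gz'; // hypothetical filename
link.click();
URL.revokeObjectURL(url);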

upload image to S3 presigned url using react-native-image-picker and axios

I am trying to get a presigned URL image upload working correctly. Currently the upload succeeds when selecting an image from the iOS simulator, but when I actually try to view the file it seems the file is corrupted and will not open as an image. I suspect it has something to do with my FormData, but I'm not sure.
export async function receiptUpload(file) {
  const date = new Date();
  const headers = await getAWSHeaders();

  const presignUrl = await request.post(
    urls.fileUpload.presignUpload,
    {file_name: `${date.getTime()}.jpg`},
    {headers}
  )
  .then(res => res.data);

  const formData = new FormData();
  formData.append('file', {
    name: `${date.getTime()}.jpg`,
    uri: file.uri,
    type: file.type
  });

  const fileUpload = presignUrl.presignUrl && await request.put(
    presignUrl.presignUrl,
    formData
  )
  .then(res => res.status === 200);
}
Based on other fixes, I have tried changing the file uri like so:
Platform.OS === 'android' ? file.uri : file.uri.replace('file://', '');
however this does not seem to work either.
I did this just recently in my current project and the following code is a working example for my use case. I didn't need to convert to a blob either, though I am uploading to AWS S3, so if you are uploading elsewhere that may be the issue.
export const uploadMedia = async (fileData, s3Data, setUploadProgress = () => {}) => {
  let sendData = { ...fileData };
  sendData.data.type = sendData.type;

  let formData = new FormData();
  formData.append('key', s3Data.s3Key);
  formData.append('Content-Type', fileData.type);
  formData.append('AWSAccessKeyId', s3Data.awsAccessKey);
  formData.append('acl', 'public-read');
  formData.append('policy', s3Data.s3Policy);
  formData.append('signature', s3Data.s3Signature);
  formData.append('file', sendData.data);

  return axios({
    method: 'POST',
    url: `https://${s3Data.s3Bucket}.s3.amazonaws.com/`,
    data: formData,
    onUploadProgress: progressEvent => {
      let percentCompleted = Math.floor((progressEvent.loaded * 100) / progressEvent.total);
      setUploadProgress(percentCompleted);
    }
  });
}
I would first check where the issue is occurring. After uploading, can you view the file on whatever storage service you are uploading to? If so, it's something on the React Native side. If it never gets uploaded to that location, you know it's an error in your upload process. That might help you track down the exact location of the error.
I had to do this recently for a project. I believe the data is a base64 string when coming directly from the file input, so by simply passing the data field you are uploading a base64 string, not the image. I had to process it before uploading to the signed URL with the following method.
private dataUriToBlob(dataUri) {
  const binary = atob(dataUri.split(',')[1]);
  const array = [];
  for (let i = 0; i < binary.length; i++) {
    array.push(binary.charCodeAt(i));
  }
  return new Blob([new Uint8Array(array)], { type: 'image/jpeg' });
}
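For context, a rough sketch of how such a helper might be used with a presigned URL; dataUri and presignedUrl are assumptions for illustration, not part of the original answer:
// dataUri comes from the picker, e.g. "data:image/jpeg;base64,...";
// the helper above converts it to a binary Blob before the upload.
const blob = this.dataUriToBlob(dataUri);

await fetch(presignedUrl, {
  method: 'PUT',
  headers: { 'Content-Type': 'image/jpeg' },
  body: blob, // send the raw blob, not FormData
});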
This answer fixed it for me: How can I upload image directly on Amazon S3 in React Native?
I had tried uploading with axios and fetch with FormData. The upload went through, but the image file was not readable, even when downloaded to my Mac from the S3 console:
The file "yourfile.jpg" could not be opened. It may be damaged or use a file format that Preview doesn’t recognize.
Only after uploading with XHR and the correct Content-Type header did it work. Your signedUrl should be correct as well, which seems to be the case if the upload goes through.
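A rough sketch of that XHR approach, assuming presignedUrl and file.uri exist as in the question; the jpeg content type and filename are assumptions:
// PUT the raw file body to the presigned URL with a matching Content-Type.
// React Native's XMLHttpRequest accepts a { uri, type, name } object as the body.
const xhr = new XMLHttpRequest();
xhr.open('PUT', presignedUrl);
xhr.setRequestHeader('Content-Type', 'image/jpeg');
xhr.onload = () => console.log('status', xhr.status);
xhr.onerror = (e) => console.log('upload error', e);
xhr.send({ uri: file.uri, type: 'image/jpeg', name: `${Date.now()}.jpg` });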

upload local image file to aws s3 without input field

I'm making a React app where I let the user upload a PDF file; I then convert the PDF to a JPG file, store it locally, and upload it to AWS S3. However, currently when I upload the JPG file to AWS S3 the file is not uploaded as an image file. I want to upload a local image file to AWS S3 without having to use <input type="file" />.
I tried this using fs.readFile() with s3.upload(), but the file that gets uploaded is not an image file. I also tried using multer-s3, but that requires the user to manually select the image file using an <input type="file" />, which I want to avoid.
Here is how I'm uploading the jpg file:
const aws = require("aws-sdk");
const multer = require("multer");
const fs = require("fs");
const path = require("path"); // needed for path.join / path.basename below
const convertPdf = require("pdf-poppler");

const s3 = new aws.S3({
  accessKeyId: "<key>",
  secretAccessKey: "<secret>",
  Bucket: "<bucketName>"
});

const storage = multer.diskStorage({
  destination: function(req, file, cb) {
    cb(null, path.join(__dirname + "/uploads/"));
  },
  filename: function(req, file, cb) {
    // let pdfName = "samplePDF";
    // req.body.file = pdfName;
    cb(null, file.originalname);
  }
});

const upload = multer({
  storage: storage
});

// SAVING PDF TO DISK STORAGE
router.post("/pdf", upload.single("pdf"), (req, res, next) => {
  const uploadPath = req.file.path;
  var imagePath = req.file.destination + req.file.originalname.slice(0, -4) + "-1.jpg";

  let opts = {
    format: "jpg",
    out_dir: req.file.destination,
    out_prefix: path.basename(req.file.path).slice(0, -4),
    page: null
  };

  // CONVERTING PDF TO JPG
  convertPdf.convert(uploadPath, opts).then(() => {
    fs.readFile(imagePath, (err, data) => {
      if (err) throw err;

      // UPLOADING FILE, BUT NOT IN IMAGE FORMAT
      const params = {
        Bucket: "<bucketName>",
        Key: path.basename(imagePath),
        Body: data
      };

      s3.upload(params, (s3Error, data) => {
        if (s3Error) throw s3Error;
        console.log(`File uploaded successfully at ${data.Location}`);
        res.json({
          image: data.key,
          location: data.Location
        });
      });
    });
  });
});
I expected an image file to be uploaded, but the uploaded file is not an image file, which is the problem. Is there any way to upload a local image file to AWS S3 without requiring the use of an input field?
EDIT: it turns out AWS S3 makes the uploaded file private by default, which is why the file could not be read; the issue is resolved when I make the file public.
The uploaded file was an image file; it just didn't have the right permissions. I added the ACL: "public-read" param and now the file displays as expected.
Updated code:
fs.readFile(imagePath, (err, data) => {
  if (err) throw err;

  const params = {
    Bucket: "flyingfishcattle",
    Key: path.basename(imagePath),
    Body: data,
    ACL: "public-read"
  };

  s3.upload(params, (s3Error, data) => {
    if (s3Error) throw s3Error;
    console.log(`File uploaded successfully at ${data.Location}`);
    res.json({
      image: data.key,
      location: data.Location
    });
  });
});
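If the browser should also render the file inline rather than offering it as a download, it can help to set ContentType on the upload as well; a small variation on the params above, where the image/jpeg type is an assumption based on the converted file:
const params = {
  Bucket: "flyingfishcattle",
  Key: path.basename(imagePath),
  Body: data,
  ACL: "public-read",
  ContentType: "image/jpeg" // assumption: the converted file is a JPG
};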

React dropzone, how to upload image?

Using React dropzone, I've successfully accessed the image using the onDrop callback. However, I'm trying to upload to Amazon S3 by sending the image to my server, saving to an S3 bucket, and returning a signed url to the image back to the client.
I can't do this with the information I have so far and the docs don't seem to mention this to my knowledge.
onDrop triggers a function call in my redux actions with the files:
export function saveImageToS3 (files, user) {
  file = files[0]
  // file.name -> filename.png
  // file -> the entire file object
  // file.preview -> blob:http://localhost:3000/1ds3-sdfw2-23as2
  return {
    [CALL_API]: {
      method: 'post',
      path: '/api/image',
      successType: A.SAVE_IMAGE,
      body: {
        name: file.name,
        file: file,
        preview: file.preview,
        username: user
      }
    }
  }
}
However, when I get to my server, I'm not sure how to save this blob image (that's only referenced from the browser.)
server.post('/api/image', (req, res) => {
  // req.body.preview --> blob:http://localhost:3000/1ds3-sdfw2-23as2
  // req.body.file -> {preview: blob:http://localhost:3000/1ds3-sdfw2-23as2}, no other properties for some reason
})
React Dropzone returns an array of File objects which can be sent to a server with a multipart request. Depending on the library you use, it can be done differently.
Using the Fetch API it looks as follows:
var formData = new FormData();
formData.append('file', files[0]);

fetch('http://server.com/api/upload', {
  method: 'POST',
  body: formData
})
Using Superagent you would do something like:
var req = request.post('/api/upload');
req.attach(file.name, files[0]);
req.end(callback);
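On the server side, the question's goal (save the file to S3 and return a signed URL) could then look roughly like the sketch below. This assumes Express, multer with its default in-memory storage, and the v2 AWS SDK; the bucket name and the 'file' field name are illustrative assumptions:
const multer = require("multer");
const aws = require("aws-sdk");

const upload = multer(); // default storage keeps the uploaded file in memory as a Buffer
const s3 = new aws.S3();

server.post('/api/image', upload.single('file'), (req, res) => {
  const params = {
    Bucket: 'my-bucket-name',          // assumption: your bucket
    Key: req.file.originalname,
    Body: req.file.buffer,             // the multipart file as a Buffer
    ContentType: req.file.mimetype
  };

  s3.upload(params, (err, data) => {
    if (err) return res.status(500).json({ error: err.message });

    // Return a time-limited signed URL for the uploaded object
    const signedUrl = s3.getSignedUrl('getObject', {
      Bucket: params.Bucket,
      Key: params.Key,
      Expires: 60 * 60 // one hour
    });
    res.json({ signedUrl });
  });
});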
