I'm making a React app where the user uploads a PDF file, which I convert to a JPG file, store locally, and then upload to AWS S3. Currently, however, when I upload the JPG file to AWS S3 it is not stored as a readable image file. I want to upload a local image file to AWS S3 without having to use an <input type="file" />.
I tried this using fs.readFile() with s3.upload(), but the uploaded file is not readable as an image. I also tried multer-s3, but that requires the user to manually select the image file via an <input type="file" />, which I want to avoid.
Here is how I'm uploading the JPG file:
const aws = require("aws-sdk");
const multer = require("multer");
const path = require("path"); // required for path.join/path.basename used below
const fs = require("fs");
const convertPdf = require("pdf-poppler");

const s3 = new aws.S3({
  accessKeyId: "<key>",
  secretAccessKey: "<secret>",
  Bucket: "<bucketName>"
});

const storage = multer.diskStorage({
  destination: function(req, file, cb) {
    cb(null, path.join(__dirname, "/uploads/"));
  },
  filename: function(req, file, cb) {
    // let pdfName = "samplePDF";
    // req.body.file = pdfName;
    cb(null, file.originalname);
  }
});

const upload = multer({
  storage: storage
}); // SAVING PDF TO DISK STORAGE
router.post("/pdf", upload.single("pdf"),(req, res, next) => {
const uploadPath = req.file.path;
var imagePath = req.file.destination + req.file.originalname.slice(0, -4) + "-1.jpg";
let opts = {
format: "jpg",
out_dir: req.file.destination,
out_prefix: path.basename(req.file.path).slice(0, -4),
page: null
}
//CONVERTING PDF TO JPG
convertPdf.convert(uploadPath, opts).then(() =>
{
fs.readFile(imagePath, (err, data) => {
if (err) throw err;// UPLOADING FILE BUT NOT IN IMAGE FORMAT
const params = {
Bucket: "<bucketName>",
Key: path.basename(imagePath),
Body: data
};
s3.upload(params, (s3Error, data) => {
if (s3Error) throw s3Error;
console.log(`File uploaded successfully at ${data.Location}`);
res.json({
image: data.key,
location: data.Location
});
});
});
});
});
I expected an image file to be uploaded, but the uploaded file doesn't behave like an image, which is the problem. Is there any way to upload a local image file to AWS S3 without requiring the use of an input field?
EDIT: It turns out AWS S3 makes uploaded files private by default, which is why the file could not be read; the issue is resolved when I make the file public.
The uploaded file was an image file all along, it just didn't have the right permissions. I added the ACL: "public-read" param and now the file displays as expected.
Updated code:
fs.readFile(imagePath, (err, data) => {
  if (err) throw err;
  const params = {
    Bucket: "flyingfishcattle",
    Key: path.basename(imagePath),
    Body: data,
    ACL: "public-read"
  };
  s3.upload(params, (s3Error, data) => {
    if (s3Error) throw s3Error;
    console.log(`File uploaded successfully at ${data.Location}`);
    res.json({
      image: data.key,
      location: data.Location
    });
  });
});
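One more thing worth noting: without an explicit ContentType, S3 may serve the object with a generic octet-stream Content-Type, which can make browsers download the file instead of rendering it inline. A minimal sketch of the same params with the type set; the "image/jpeg" value is an assumption based on the JPG conversion above:
const params = {
  Bucket: "flyingfishcattle",
  Key: path.basename(imagePath),
  Body: data,
  ACL: "public-read",
  ContentType: "image/jpeg" // assumed MIME type, so browsers render the file inline
};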
EDIT: I've updated the CORS config, but it's still showing the same error.
I have a TinyMCE RTE on my page, and when you drop an image into the editor, I have some functions that upload it to Firebase Storage, then swap out the src in the text editor with the URL fetched from Firebase. It works reasonably well, but the image is displayed as a broken-link icon.
When I check the link, it turns out the image originally gets downloaded when the link is clicked. I added a metadata property on upload, but now it just shows a tiny box.
Here is the code where an image dropped into the editor is uploaded to Firebase Storage:
const imagesUploadHandler = async (blobInfo, success, failure) => {
  try {
    const file = blobInfo.blob();
    const storageRef = ref(storage, file.name);
    const metadata = {
      contentType: 'image/jpeg',
    };
    await uploadBytes(storageRef, file, metadata);
    const url = await getDownloadURL(storageRef);
    console.log(url);
    return url;
  } catch (error) {
    // Log the error message (the failure callback could be called here instead)
    console.log(error.message);
  }
};
Originally, I didn't include the contentType metadata, and the file was uploaded as application/octet-stream, which I assume is why it prompts you to save the image.
Image link: https://firebasestorage.googleapis.com/v0/b/cloudnoise-news.appspot.com/o/ref.jpg?alt=media&token=1edc90e7-1668-4a06-92a3-965ce275798b
Currently it just displays a tiny broken-image box.
Some things I checked:
Firebase Storage rules are in test mode, so anyone should be able to read and write.
I tried sticking in different MIME types, but it either shows the tiny box or it shows "undefined".
The files upload successfully, and the "swap" in the TinyMCE editor also works fine.
Any idea why this is happening?
You need to set the contentType in the metadata from the file itself:
const metadata = {
  contentType: file.type,
};
This should ensure that the correct content type is set when the image is uploaded to Firebase Storage.
If this does not resolve the issue, you may need to check that the URL returned from getDownloadURL is valid and points to the correct image. You can try opening the URL in a new browser tab to verify that the image is accessible.
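If it still misbehaves, one way to confirm what was actually stored is to read the object's metadata back. A minimal sketch using the modular SDK's getMetadata, assuming the same storageRef as in the handler above:
import { getMetadata } from "firebase/storage";

// Read back the stored object's metadata to verify the contentType
// that Firebase Storage will serve the file with.
const stored = await getMetadata(storageRef);
console.log(stored.contentType); // should be an image type, not "application/octet-stream"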
I fixed it by adding a Blob: I created a Blob object with the file data, then uploaded that Blob object instead of the file itself.
const imagesUploadHandler = async (blobInfo, success, failure) => {
  try {
    const file = blobInfo.blob();
    const storageRef = ref(storage, file.name);
    const metadata = {
      contentType: file.type,
    };
    // Create a new Blob object with the file data (the Blob constructor is synchronous)
    const blob2 = new Blob([file], { type: file.type });
    // Upload the Blob to Firebase Storage
    await uploadBytes(storageRef, blob2, metadata);
    const url = await getDownloadURL(storageRef);
    console.log(url);
    return url;
  } catch (error) {
    // Log the error message (the failure callback could be called here instead)
    console.log(error.message);
  }
};
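For completeness, this is roughly how the handler plugs into the editor config; the selector value is a placeholder, and this assumes the plain tinymce.init API rather than the React wrapper:
tinymce.init({
  selector: "#editor", // placeholder selector
  images_upload_handler: imagesUploadHandler,
});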
I'm using the react-pdf library, and it generates a blob and a URL as shown in the docs: https://react-pdf.org/advanced#on-the-fly-rendering
Blob object copied from the console:
{blob: Blob}
  blob: Blob
    size: 3597607
    type: "application/pdf"
    [[Prototype]]: Blob
  [[Prototype]]: Object
I try to upload this to S3.
Frontend code:
const fileName = Date.now().toString() + ".pdf";
const file = new File([blob], fileName);

axios
  .post("api/upload-poster", {
    fileName,
    file,
  })
  .then(({ data }) => console.log(data));
Backend code (Next.js handler):
const { fileName, file } = req.body;
const params = {
  Bucket: "my-bucket-name",
  Key: fileName,
  Body: file,
  contentType: "application/pdf",
};
const uploaded = await S3.upload(params).promise();
res.status(200).json({ uploaded });
I get the error Error: Unsupported body payload object
Based on the documentation for AWS.S3.upload, you need to pass a Buffer, a typed array, a Blob, a string, or a stream as the Body.
Pass the blob object - not the blob URL - back to your API, and the rest of your code should work fine. Note that posting a File inside a JSON body serializes it to an empty object, which is why your handler ends up with an unsupported Body; send the raw bytes instead.
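One way to do that is to send the blob as multipart form data and hand S3 the resulting Buffer. A minimal sketch under those assumptions (the "poster" field name and the use of multer are mine, not part of the original setup):
// Frontend: send the blob as multipart form data instead of a JSON body.
const formData = new FormData();
formData.append("poster", blob, fileName);
axios.post("api/upload-poster", formData).then(({ data }) => console.log(data));

// Backend: after a multipart parser such as multer (with Next.js's default
// bodyParser disabled) has populated req.file, pass the Buffer straight to S3.
const params = {
  Bucket: "my-bucket-name",
  Key: req.file.originalname,
  Body: req.file.buffer, // Buffer is a supported Body payload
  ContentType: "application/pdf",
};
const uploaded = await S3.upload(params).promise();
res.status(200).json({ uploaded });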
After looking through a lot of Multer documentation, I believe I have the proper setup on my Express server. However, each time I attempt to access the req.files.filename attribute of my incoming file, the value comes back as undefined.
Here is my Express server side code:
const storage = multer.diskStorage({
  destination: '../Uploads',
  filename: (req, file, cb) => cb(null, file.originalname)
});

const upload = multer({
  storage: storage
});

Router.post("/", upload.single("file"), async (req, res) => {
  debug.log("connected");
  debug.log(req.file);
  debug.log(req.body);
  if (!req.files) {
    debug.log("receiving connection but file not found");
    res.send({
      status: false,
      message: "No file uploaded",
    });
  } else {
    debug.log("receiving connection and file found");
    try {
      debug.log(req.files.filename);
      debug.log("attempting to save file" + req.files);
      const post = await new Post({
        title: req.files.filename,
        owner: req.body.owner,
        industry: req.body.industry,
      });
If I understand correctly, calling upload.single should make req.file.filename a usable attribute, but it does not seem to be working.
EDIT:
I solved this problem FINALLY!
Apparently using the app.use(fileUpload()); middleware was somehow interfering with Multer.
Before, when I had app.use(fileUpload()); enabled, I could access the file through req.files, but the file wouldn't save.
After commenting out app.use(fileUpload());, I can access the file through req.file and it saves correctly.
When you upload a single file (upload.single), look at req.file, not req.files.
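To illustrate, a minimal sketch of the route above using the singular req.file; the response body here is an assumption, just to keep the example self-contained:
Router.post("/", upload.single("file"), (req, res) => {
  // upload.single("file") populates req.file; req.files stays undefined
  debug.log(req.file.originalname); // name the client sent
  debug.log(req.file.filename);     // name Multer saved to disk
  debug.log(req.file.path);         // full path under '../Uploads'
  res.send({ status: true });
});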
I am integrating CKEditor in a React project, and I am using an AWS S3 bucket to store the images added in the text editor. Upload is working fine. The problem is that when I delete an image in the text editor, it does not get deleted from the AWS bucket.
This leaves a lot of unwanted images in the bucket. Hence, I need to delete an image from AWS if it's no longer present in the text editor.
How can I do it?
I have the link to the image in the React part as the response of the upload.
You need the bucket name and the key of the file in order to delete it from AWS S3:
const deleteS3Object = async (key, BUCKET_NAME) => {
  return new Promise((resolve, reject) => {
    try {
      let s3bucket = new AWS.S3({
        accessKeyId: IAM_USER_KEY,
        secretAccessKey: IAM_USER_SECRET,
        Bucket: BUCKET_NAME,
      });
      var params = { Bucket: BUCKET_NAME, Key: key };
      s3bucket.deleteObject(params, function(err, data) {
        if (err) reject(err); // an error occurred
        else resolve(data);   // successful response
      });
    } catch (e) {
      reject(e);
    }
  });
};
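Since the React side only has the image link, one way to get the key is to parse it out of that URL. A hypothetical usage sketch, assuming a standard virtual-hosted S3 URL (imageUrl and the bucket name are placeholders):
// e.g. imageUrl === "https://<bucketName>.s3.amazonaws.com/some-image.jpg"
const key = decodeURIComponent(new URL(imageUrl).pathname.slice(1)); // "some-image.jpg"
await deleteS3Object(key, "<bucketName>");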
Using React Dropzone, I've successfully accessed the image using the onDrop callback. However, I'm trying to upload to Amazon S3 by sending the image to my server, saving it to an S3 bucket, and returning a signed URL for the image back to the client.
I can't do this with the information I have so far, and as far as I can tell the docs don't cover it.
onDrop triggers a function call in my redux actions with the files:
export function saveImageToS3 (files, user) {
  const file = files[0]
  // file.name -> filename.png
  // file -> the entire file object
  // file.preview -> blob:http://localhost:3000/1ds3-sdfw2-23as2
  return {
    [CALL_API]: {
      method: 'post',
      path: '/api/image',
      successType: A.SAVE_IMAGE,
      body: {
        name: file.name,
        file: file,
        preview: file.preview,
        username: user
      }
    }
  }
}
However, when I get to my server, I'm not sure how to save this blob image (which is only referenced from the browser).
server.post('/api/image', (req, res) => {
  // req.body.preview --> blob:http://localhost:3000/1ds3-sdfw2-23as2
  // req.body.file -> { preview: 'blob:http://localhost:3000/1ds3-sdfw2-23as2' }, no other properties for some reason
})
React Dropzone returns an array of File objects, which can be sent to a server with a multipart request. Depending on the library you use, it can be done differently.
Using the Fetch API, it looks as follows:
var formData = new FormData();
formData.append('file', files[0]);

fetch('http://server.com/api/upload', {
  method: 'POST',
  body: formData
})
Using Superagent, you would do something like:
var req = request.post('/api/upload');
req.attach(file.name, files[0]);
req.end(callback);
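On the server side, a minimal sketch of receiving that multipart upload with Multer and pushing it to S3 before returning a signed URL; the middleware setup, bucket name, and s3 client here are assumptions, not part of the original answer:
const multer = require("multer");
const upload = multer({ storage: multer.memoryStorage() }); // keep the file in memory as a Buffer

server.post("/api/image", upload.single("file"), (req, res) => {
  const params = {
    Bucket: "<bucketName>",
    Key: req.file.originalname,
    Body: req.file.buffer,          // Buffer is a supported S3 Body payload
    ContentType: req.file.mimetype, // preserve the image MIME type
  };
  s3.upload(params, (err, data) => {
    if (err) return res.status(500).json({ error: err.message });
    // Hand back a time-limited signed URL instead of the raw object URL
    const signedUrl = s3.getSignedUrl("getObject", {
      Bucket: "<bucketName>",
      Key: data.Key,
      Expires: 60 * 60, // one hour
    });
    res.json({ url: signedUrl });
  });
});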