I've uploaded a few SVG images to an S3 bucket (which I've set to public). After uploading them, I get each image's URL, but when I click on one of those URLs it just downloads the image for me.
Also, does anyone know why, when I use those URLs in an img tag (e.g. <img src='https://***.s3.***.amazonaws.com/***.svg' />), it just shows a broken image?
Here is my Lambda function:
'use strict'

const aws = require('aws-sdk')
const s3 = new aws.S3()
const { parse } = require('aws-multipart-parser')

const response = (statusCode, data) => ({
  statusCode,
  headers: {
    'Content-Type': 'application/json',
    'Access-Control-Allow-Origin': '*'
  },
  body: JSON.stringify(data)
})

exports.handler = async event => {
  const inputData = parse(event, true)
  if (inputData.file) {
    try {
      const params = {
        Bucket: ***,
        region: ***,
        Key: `${inputData.file.filename}`,
        Body: inputData.file.content,
        ACL: 'public-read'
      }
      const s3Response = await s3.upload(params).promise()
      return response(200, { statusCode: 200, url: s3Response['Location'] })
    } catch (error) {
      console.log('Error: ', error)
      return response(500, {
        error: error.message
      })
    }
  } else {
    return response(400, {
      error: 'Please provide input file.'
    })
  }
}
You need to set the content type for your SVG images on S3 to "image/svg+xml".
To change the content type through the S3 console:
Select the object in the S3 console.
Click on Actions.
Click on Change metadata.
Change the content type to image/svg+xml. The value is already available in the drop-down.
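If you have many objects to fix, the same change can be made programmatically. Here is a minimal sketch using the aws-sdk v2 client from the question (bucket and key are placeholders). A copy with MetadataDirective: 'REPLACE' is needed because S3 metadata can't be edited in place, and the ACL has to be re-applied because it isn't carried over by a copy:

const aws = require('aws-sdk')
const s3 = new aws.S3()

// Copy the object onto itself, replacing the stored Content-Type.
const fixSvgContentType = (bucket, key) =>
  s3.copyObject({
    Bucket: bucket,
    CopySource: `${bucket}/${key}`,
    Key: key,
    ContentType: 'image/svg+xml',
    MetadataDirective: 'REPLACE',
    ACL: 'public-read' // re-apply, since ACLs are not copied
  }).promise()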
As you are using API Gateway to upload the images, you can set the respective content type in your putObject request.
Reference: AWS S3 PutObject API documentation
You can refer to the following AWS documentation on uploading an object with a ContentType through the JS SDK:
https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html#putObject-property
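Applied to the Lambda handler from the question, that means adding a ContentType to the upload params. A minimal sketch, assuming every upload is an SVG (otherwise derive the type from the filename, or from the content type your multipart parser reports):

const params = {
  Bucket: ***, // as in the question
  Key: `${inputData.file.filename}`,
  Body: inputData.file.content,
  ContentType: 'image/svg+xml', // lets browsers render the file instead of downloading it
  ACL: 'public-read'
}
const s3Response = await s3.upload(params).promise()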
I faced a similar issue with SVGs being served from S3. Opening the SVG URL in the browser worked fine, but passing the URL in the img's src attribute gave a broken image.
After comparing my SVG with other working SVGs, I found that the attributes below were missing from mine.
xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink"
On adding them it started working fine.
It has to be something like this:
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
...
...
</svg>
My Front-End developers were complaining about this same issue. Blame AWS!
For some reason, they mark SVG images as binary/octet-stream instead of marking them as image/svg+xml.
My back end is developed in Golang and I use the official AWS SDK to upload the files (https://github.com/aws/aws-sdk-go). What I did to fix the problem was to pass the file's MIME type (ContentType) as a parameter.
If you don't pass AWS a MIME type, they will try to identify one for your file automatically, but they can get it wrong in some cases. If you pass a MIME type, AWS will use yours instead of trying to identify it, and that will likely solve the problem.
In my case, in Golang, I used this package to identify the MIME type automatically: https://github.com/gabriel-vasile/mimetype
// Detect the MIME type from the raw file bytes (file is assumed to be a []byte here).
var mimeTypePtr *string
mime := mimetype.Detect(file)
if mime != nil {
    mimeType := mime.String()
    mimeTypePtr = &mimeType
}

uploader := s3manager.NewUploader(cfg)
result, err := uploader.Upload(&s3manager.UploadInput{
    Bucket:      aws.String(os.Getenv("AWS_BUCKET")), // UploadInput fields are *string
    Key:         aws.String(filename),
    Body:        bytes.NewReader(file), // Body expects an io.Reader
    ContentType: mimeTypePtr,
})
In my case the SVG image still got force-downloaded when its MIME type was image/svg.
You can fix it by setting the file's MIME type to image/svg+xml.
EDIT: I've updated the CORS config but it's still showing the same error.
I have a TinyMCE RTE on my page, and when you drop an image into the editor, I have some functions that upload it to Firebase Storage, then swap out the src in the text editor with the URL fetched from Firebase. It works kinda OK, but it's being displayed as a broken-link image icon.
When I check the link, it's because originally it downloads the image when the link is clicked. I added a metadata property when it uploads, but now it's just showing a tiny box.
Here is the code where the image dropped into the editor is uploaded to Firebase Storage:
const imagesUploadHandler = async (blobInfo, success, failure) => {
  try {
    const file = blobInfo.blob();
    const storageRef = ref(storage, file.name);
    const metadata = {
      contentType: 'image/jpeg',
    };
    await uploadBytes(storageRef, file, metadata);
    const url = await getDownloadURL(storageRef);
    console.log(url);
    return url;
  } catch (error) {
    // Call the failure callback with the error message
    console.log(error.message);
    failure(error.message);
  }
};
Originally, I didn't include the contentType metadata, and it was just uploading as application/octet-stream, which I assume is why it prompts you to save the image.
Image link: https://firebasestorage.googleapis.com/v0/b/cloudnoise-news.appspot.com/o/ref.jpg?alt=media&token=1edc90e7-1668-4a06-92a3-965ce275798b
Currently it's displaying this (a tiny box instead of the image).
Some things I checked:
Firebase Storage rules are in test mode, so anyone should be able to read and write.
I tried sticking in different MIME types, but it either shows the tiny box or it shows "undefined".
The files upload successfully and the "swap" in the TinyMCE editor is also all good.
Any idea why this is happening?
You need to set the metadata tag:
const metadata = {
  contentType: file.type,
};
This should ensure that the correct content type is set when the image is uploaded to Firebase Storage.
If this does not resolve the issue, you may need to check that the URL returned from getDownloadURL is valid and points to the correct image. You can try opening the URL in a new browser tab to verify that the image is accessible.
I fixed it by adding a blob: I created a Blob object with the file data, then made it upload the Blob object instead of the file directly.
const imagesUploadHandler = async (blobInfo, success, failure) => {
  try {
    const file = blobInfo.blob();
    const storageRef = ref(storage, file.name);
    const metadata = {
      contentType: file.type,
    };
    // Create a new Blob object with the file data (the Blob constructor is synchronous)
    const blob2 = new Blob([file], { type: file.type });
    // Upload the Blob to Firebase Storage
    await uploadBytes(storageRef, blob2, metadata);
    const url = await getDownloadURL(storageRef);
    console.log(url);
    return url;
  } catch (error) {
    // Call the failure callback with the error message
    console.log(error.message);
    failure(error.message);
  }
};
I've successfully built a React UI to select and upload N files. The key part of it is this:
<input type='file' accept='image/*' id='selectFiles' multiple onChange={handleFileChange} />
The selected files are stored in this state variable:
const [fileList, setFileList] = React.useState<FileList>();
I know they're correctly there because I iterate through them and show them in a preview DIV.
Following ImageKit's instructions, I successfully built an Auth endpoint which returns the auth credentials.
Then, within a useEffect I iterated through fileList to upload one photo at a time to the ImageKit server. But even trying just one file, I keep getting a 400 error informing me that the fileName parameter is missing. It definitely is not missing so I suspect that the problem lies with what I'm providing as the file parameter.
Here's the critical code (with some data obscured for privacy reasons):
const uploadFile = async (file: File) => {
  try {
    const body = {
      file: file,
      publicKey: 'my_public_key',
      signature: 'imageKit_signature',
      expire: 'imageKit_expiry_value',
      token: 'imageKit_token',
      fileName: 'test123.jpg',
      useUniqueFileName: false,
      folder: userName,
      overwriteFile: false,
    };
    const response = await axios.post('https://upload.imagekit.io/api/v1/files/upload', body);
    console.log(response.status, response.data);
  } catch (err) {
    console.error(err);
  }
};
Might anyone see what I'm doing wrong?
Robert
I solved the problem. If you're going to upload a file to ImageKit using their POST endpoint, you need to explicitly set the headers like this:
const response = await axios.post(
  'https://upload.imagekit.io/api/v1/files/upload',
  body,
  { headers: { 'Content-Type': 'multipart/form-data' } }
);
For those wondering, here's the barebones body:
const body = {
  file: base64File,
  publicKey: imageKitPublicKey,
  signature: authParams?.signature,
  expire: authParams?.expire,
  token: authParams?.token,
  fileName: 'someFilename.ext',
};
But indeed, specifying the Content-Type as above will allow you to upload a file to their server.
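The base64File above is the file's content encoded as base64. For anyone wondering how to produce it from a File object in the browser, here is a sketch using the FileReader API; fileToBase64 is a hypothetical helper name, and note that readAsDataURL returns a data: URL, so depending on what the endpoint expects you may need to strip the "data:*;base64," prefix:

// Read a File into a base64 data URL with the browser's FileReader API.
const fileToBase64 = (file) =>
  new Promise((resolve, reject) => {
    const reader = new FileReader();
    // reader.result looks like "data:image/jpeg;base64,/9j/4AAQ..."
    reader.onload = () => resolve(reader.result);
    reader.onerror = reject;
    reader.readAsDataURL(file);
  });

// Usage: const base64File = await fileToBase64(fileList[0]);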
I am trying to get a presigned URL image upload working correctly. Currently the upload succeeds when selecting an image from the iOS simulator, but when I actually try to view the file it seems to be corrupted and will not open as an image. I suspect it has something to do with my FormData, but I'm not sure.
export async function receiptUpload(file) {
  const date = new Date();
  const headers = await getAWSHeaders();

  const presignUrl = await request.post(
    urls.fileUpload.presignUpload,
    { file_name: `${date.getTime()}.jpg` },
    { headers }
  )
    .then(res => res.data);

  const formData = new FormData();
  formData.append('file', {
    name: `${date.getTime()}.jpg`,
    uri: file.uri,
    type: file.type
  });

  const fileUpload = presignUrl.presignUrl && await request.put(
    presignUrl.presignUrl,
    formData
  )
    .then(res => res.status === 200);
}
Based on other fixes, I have tried changing the file URI like so:
Platform.OS === 'android' ? file.uri : file.uri.replace('file://', '');
however this does not seem to work either.
I did this just recently in my current project, and the following code is a working example for my use case. I didn't need to convert to a blob either, though I am uploading to AWS S3, so if you are uploading elsewhere that may be the issue.
export const uploadMedia = async (fileData, s3Data, setUploadProgress = () => {}) => {
  let sendData = { ...fileData };
  sendData.data.type = sendData.type;

  let formData = new FormData();
  formData.append('key', s3Data.s3Key);
  formData.append('Content-Type', fileData.type);
  formData.append('AWSAccessKeyId', s3Data.awsAccessKey);
  formData.append('acl', 'public-read');
  formData.append('policy', s3Data.s3Policy);
  formData.append('signature', s3Data.s3Signature);
  formData.append('file', sendData.data);

  return axios({
    method: 'POST',
    url: `https://${s3Data.s3Bucket}.s3.amazonaws.com/`,
    data: formData,
    onUploadProgress: progressEvent => {
      let percentCompleted = Math.floor((progressEvent.loaded * 100) / progressEvent.total);
      setUploadProgress(percentCompleted);
    }
  })
}
I would first check to see where the issue is occurring. After uploading, can you view the file on whatever storage service you are uploading to? If so, it's something on the React Native side. If it never gets uploaded to that location, you know it's an error in your upload process. That might help you track down the exact location of the error.
I had to do this recently for a project. I believe the data is a base64 string when coming directly from the file input, so the issue is that you are uploading a base64 string, not the image, by simply passing the data field. I had to process it before uploading to the signed URL with the following method.
private dataUriToBlob(dataUri) {
  // Decode the base64 payload of the data URI into raw bytes.
  const binary = atob(dataUri.split(',')[1]);
  const array = [];
  for (let i = 0; i < binary.length; i++) {
    array.push(binary.charCodeAt(i));
  }
  return new Blob([new Uint8Array(array)], { type: 'image/jpeg' });
}
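For completeness, a sketch of how the resulting blob might then be sent to the presigned URL (assuming a PUT-style URL signed for image/jpeg; signedUrl is a placeholder):

const blob = dataUriToBlob(dataUri);
await fetch(signedUrl, {
  method: 'PUT',
  // If the URL was signed with a content type, this header must match it.
  headers: { 'Content-Type': 'image/jpeg' },
  body: blob
});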
This answer fixed it for me: How can I upload image directly on Amazon S3 in React Native?
I had tried uploading with axios and fetch with FormData. The upload went through, but the image file was not readable, even when downloaded to my Mac from the S3 console:
The file "yourfile.jpg" could not be opened. It may be damaged or use a file format that Preview doesn’t recognize.
Only after trying to upload with XHR with the correct Content-Type header did it work. Your signedUrl should be correct as well, which seems to be the case if the download goes through.
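Roughly, the XHR version looks like this (a sketch; signedUrl and the content type are placeholders for your own values):

const xhr = new XMLHttpRequest();
xhr.open('PUT', signedUrl);
// Must match the content type the URL was signed with.
xhr.setRequestHeader('Content-Type', 'image/jpeg');
xhr.onload = () => console.log('Upload finished with status', xhr.status);
xhr.onerror = () => console.error('Upload failed');
xhr.send(file); // send the raw file/blob, not FormData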
I am a newbie to React. I have been trying to upload files (images, JSON files, etc.) to an AWS S3 bucket from a ReactJS application using ReactS3Uploader (version 4.8.0). I am following this example: https://www.npmjs.com/package/react-s3-uploader
I have added the code below to one of my components where I want the file upload functionality:
<ReactS3Uploader
  getSignedUrl={getSignedUrl}
  accept="image/*"
  s3path="/uploads/test/"
  preprocess={this.onUploadStart}
  onSignedUrl={this.onSignedUrl}
  onProgress={this.onUploadProgress}
  onError={this.onUploadError}
  onFinish={this.onUploadFinish}
  signingUrlHeaders={{ }}
  signingUrlQueryParams={{ }}
  signingUrlWithCredentials={ true }  // in case when need to pass authentication credentials via CORS
  uploadRequestHeaders={{ 'x-amz-acl': 'public-read' }}  // this is the default
  contentDisposition="auto"
  scrubFilename={(filename) => filename.replace(/[^\w\d_\-.]+/ig, '')}
  inputRef={cmp => this.uploadInput = cmp}
  autoUpload={true}
  server="http://cross-origin-server.com"
/>
I have also created another module for getSignedUrl (S3SignedUrl.js) as follows (as described here: https://www.npmjs.com/package/react-s3-uploader):
import React, { Component } from 'react';
import { toast } from 'react-toastify';
import axios from '../../shared/axios';

function getSignedUrl(file, callback) {
  console.log('.........Inside getSignedUrl()>>file.nameeeee.........' + file.name)
  console.log('.........Inside getSignedUrl()>>file.size.........' + file.size)
  const filename = file.name;
  const params = {
    filename: file.name
    //contentType: file.type
  };
  var headers = {
    'Content-Type': 'application/json'
  }
  axios.post(`/api/link/admin/v1/s3/sign?filename=${filename}`, { headers: headers })
    .then(data => {
      console.log('data.data.signedUrl>>>>>>>>>>>' + data.data.signedUrl)
      callback(data);
      return data.data
    })
    .catch(error => {
      console.error(error);
    });
}

export default getSignedUrl;
I have a Groovy-based backend API (a Spring Boot application) which creates the S3 signed URL in the following format:
{
  "signedUrl": "<complete signed url>",
  "uploadPath": "mybucket/apidocs/dev/version/logo/04137a9c-fb60-48dd-ae0f-c53d78e4e379/logo.png",
  "expiresAt": 1552083549794
}
I am successfully able to call my Groovy /s3/sign URL from my React application (through S3SignedUrl.js, which uses Axios), but right after that, when the ReactS3Uploader component tries to upload the file to the AWS S3 bucket, it gives me an HTTP 403 error.
When I look at the network tab (by inspecting in Google Chrome), the underlying call being made by the ReactS3Uploader component is
PUT https://localhost:3000/apps/gateway/undefined with HTTP 403
I am not sure what is undefined here within the URL. Shouldn't the ReactS3Uploader component automatically be doing an HTTP PUT to the signed URL?
I do see some fixes in react-s3-uploader version 4.6.2 around undefined in the file path when not providing the s3path property: https://changelogs.md/github/odysseyscience/react-s3-uploader/
But I'm not sure if it has anything to do with the problem I am getting. By the way, I am using version 4.8.0.
Just to confirm: I can successfully upload the file using that signed URL manually through curl.
Any help here would highly be appreciated.
Thanks in advance
I know this is an old post, but I've been searching for something similar and came across this.
You should have callback(data.data);.
ReactS3Uploader will redirect to an undefined URL if it's configured to use a getSignedUrl function and the callback isn't given an object containing a signedUrl field.
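So the fixed function from the question would look roughly like this (a sketch; only the callback line changes, passing the response body that contains signedUrl):

function getSignedUrl(file, callback) {
  axios.post(`/api/link/admin/v1/s3/sign?filename=${file.name}`)
    .then(data => {
      // data.data is the response body: { signedUrl, uploadPath, expiresAt }
      callback(data.data);
    })
    .catch(error => {
      console.error(error);
    });
}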
Using React Dropzone, I've successfully accessed the image using the onDrop callback. However, I'm trying to upload to Amazon S3 by sending the image to my server, saving it to an S3 bucket, and returning a signed URL for the image back to the client.
I can't do this with the information I have so far and the docs don't seem to mention this to my knowledge.
onDrop triggers a function call in my redux actions with the files:
export function saveImageToS3 (files, user) {
  const file = files[0]
  // file.name -> filename.png
  // file -> the entire file object
  // file.preview -> blob:http://localhost:3000/1ds3-sdfw2-23as2
  return {
    [CALL_API]: {
      method: 'post',
      path: '/api/image',
      successType: A.SAVE_IMAGE,
      body: {
        name: file.name,
        file: file,
        preview: file.preview,
        username: user
      }
    }
  }
}
However, when I get to my server, I'm not sure how to save this blob image (that's only referenced from the browser.)
server.post('/api/image', (req, res) => {
  // req.body.preview --> blob:http://localhost:3000/1ds3-sdfw2-23as2
  // req.body.file -> { preview: 'blob:http://localhost:3000/1ds3-sdfw2-23as2' }, no other properties for some reason
})
React Dropzone returns an array of File objects which can be sent to a server with a multipart request. Depending on the library you use, it can be done differently.
Using Fetch API it looks as follows:
var formData = new FormData();
formData.append('file', files[0]);

fetch('http://server.com/api/upload', {
  method: 'POST',
  body: formData
})
Using Superagent you would do something like:
var req = request.post('/api/upload');
req.attach(file.name, files[0]);
req.end(callback);
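On the server side, the multipart request then carries the real file bytes rather than a blob: URL. A minimal sketch of the receiving endpoint, assuming an Express server with the multer middleware (route and field name chosen to match the examples above):

const express = require('express');
const multer = require('multer');

const app = express();
const upload = multer(); // no options: files are kept in memory, in req.file.buffer

app.post('/api/upload', upload.single('file'), (req, res) => {
  // req.file.buffer now holds the image bytes and can be passed
  // to s3.upload(...) as the Body, together with req.file.mimetype.
  console.log(req.file.originalname, req.file.mimetype, req.file.size);
  res.sendStatus(200);
});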