HEIC is Apple's own format for storing high-resolution images taken with iOS cameras. However, I would rather store JPG on the backend, because HEIC is not displayed in most browsers, not even in Safari.
SOLUTION 1
I tried this for the conversion:
import heicConvert from 'heic-convert'

const buffer = Buffer.from(await file.arrayBuffer())
const jpegBuffer = await heicConvert({ buffer, format: 'JPEG' })
const imgBase64 = btoa(
  jpegBuffer.reduce((data, byte) => `${data}${String.fromCharCode(byte)}`, '')
)
but because I use Next.js, it is not compatible with it:
Failed to compile.
./node_modules/libheif-js/libheif/libheif.js
Module not found: Can't resolve 'fs' in '/Users/janoskukoda/Workspace/tikex/portal/team/node_modules/libheif-js/libheif'
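One workaround I have seen suggested (untested here, and I don't know whether it is enough for libheif-js specifically) is to tell webpack 5 to stub out fs in the client bundle via next.config.js:

// next.config.js — sketch of the standard webpack 5 fallback for missing Node core modules
module.exports = {
  webpack: (config, { isServer }) => {
    if (!isServer) {
      // Tell webpack not to try to bundle 'fs' for the browser
      config.resolve.fallback = { ...config.resolve.fallback, fs: false };
    }
    return config;
  },
};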
SOLUTION 2
I tried this also:
import sharp from 'sharp'

const uploadImage = async (file) => {
  const buffer = await file.arrayBuffer()
  const image = sharp(buffer)
  const metadata = await image.metadata()

  if (metadata.format === 'heic') {
    // Convert the image to JPG
    const jpgBuffer = await image.jpeg().toBuffer()

    // Encode the JPG image as a base64 string
    const imgBase64 = btoa(
      jpgBuffer.reduce((data, byte) => `${data}${String.fromCharCode(byte)}`, '')
    )
  }
}

export default uploadImage
But it does not compile; it seems sharp is not meant to be used on the client side.
Do you have any other way to do it?
Anyway, the idea comes from here: https://itnext.io/tackling-iphone-or-ipad-images-support-in-browser-8e3e64e9aaa1
If you show me a solution that uses a serverless API, that is also OK. It is important that the file comes from an HTML input element.
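For illustration, the serverless direction I have in mind would be something like the sketch below: a Next.js API route that runs heic-convert on the server, where Node modules such as fs are available. The route name and the way the file is posted are assumptions, not tested code.

// pages/api/convert-heic.js — minimal sketch, assuming the heic-convert package
// and that the client POSTs the raw HEIC bytes as the request body
import convert from 'heic-convert'

export const config = {
  api: { bodyParser: false }, // read the raw request stream ourselves
}

export default async function handler(req, res) {
  // Collect the raw HEIC bytes from the request stream
  const chunks = []
  for await (const chunk of req) chunks.push(chunk)
  const inputBuffer = Buffer.concat(chunks)

  // Convert to JPEG on the server, where heic-convert's Node dependencies are available
  const jpegBuffer = await convert({ buffer: inputBuffer, format: 'JPEG', quality: 0.9 })

  res.setHeader('Content-Type', 'image/jpeg')
  res.send(Buffer.from(jpegBuffer))
}

On the client, the file from the input element could then be posted with something like fetch('/api/convert-heic', { method: 'POST', body: file }).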
SOLUTION 3
import loadImage from 'blueimp-load-image'

convertedImage = await new Promise((resolve, reject) => {
  loadImage(
    propsAndFile.file,
    resolve,
    { orientation: true, canvas: true },
    reject
  )
})
Before getting to the answer: you can never trust data uploaded by a client. You must always validate/convert using a process that is not accessible to the user (a backend process) to ensure data validity. Even if there are mechanisms in place to verify that received network requests come from an authenticated user on your site, any user can still use developer tools to execute arbitrary JavaScript and send whatever kinds of network requests with whatever payloads they want.
Regarding iOS devices and getting JPEG instead of HEIF from a file input: You don't need to do this yourself — iOS can do it for you.
<input type="file"> elements support an accept attribute that can be used to restrict the kinds of file media types that can be uploaded: read more at the Limiting accepted file types section of the MDN documentation article for <input type="file">.
Below is an example which shows how to use that attribute. When an iOS user selects the input, they can choose to take a photo with the camera or select one from the photo library. iOS performs the necessary conversion to JPEG automatically in both cases (for example, even when the selected photo-library image is in HEIF format). You can verify this by trying the example with a HEIF-encoded image on an iOS device:
const input = document.getElementById('image-upload');
const output = document.getElementById('output');

const updateDisplayedFileInfo = () => {
  const file = input.files?.[0];

  if (!file) {
    output.textContent = 'No file selected';
    return;
  }

  const dateModified = new Date(file.lastModified).toISOString();
  const {name, size, type} = file;

  const data = {
    dateModified,
    name,
    size,
    type,
  };

  const json = JSON.stringify(data, null, 2);
  output.textContent = json;
};

input.addEventListener('change', updateDisplayedFileInfo);
updateDisplayedFileInfo();
pre {
  background-color: hsla(0, 0%, 50%, 0.1);
  padding: 0.5rem;
}

code {
  font-family: monospace;
}
<input id="image-upload" type="file" accept="image/jpeg" />
<pre><code id="output"></code></pre>
Install the heic2jpeg library by running the following command in your terminal:
npm install heic2jpeg
Import the heic2jpeg library in your React component:
import heic2jpeg from 'heic2jpeg';
Convert the HEIC file to JPG by calling the convert method of the heic2jpeg library and passing in the HEIC file as an argument:
const jpegData = await heic2jpeg.convert(heicFile);
You can then use the jpegData to create a new JPG file or display it in an img element:
const jpegFile = new File([jpegData], 'image.jpg', { type: 'image/jpeg' });
// or
const imageElement = document.createElement('img');
imageElement.src = URL.createObjectURL(jpegFile);
document.body.appendChild(imageElement);
Note that the heic2jpeg library requires the canvas and process modules to be available in the global context, so you may need to include these modules in your application as well.
EDIT: I've updated the CORS config, but it's still showing the same error.
I have a TinyMCE RTE on my page, and when you drop an image into the editor, I have some functions that upload it to Firebase Storage and then swap out the src in the text editor with the URL fetched from Firebase. It works kind of OK, but it's being displayed as a broken-link image icon.
When I check the link, it's because it originally downloads the image when the link is clicked. I added a metadata property when uploading, but now it's just showing a tiny box.
Here is the code where the image dropped into the editor is uploaded to Firebase Storage:
const imagesUploadHandler = async (blobInfo, success, failure) => {
  try {
    const file = blobInfo.blob();
    const storageRef = ref(storage, file.name);
    const metadata = {
      contentType: 'image/jpeg',
    };

    await uploadBytes(storageRef, file, metadata);
    const url = await getDownloadURL(storageRef);

    console.log(url);
    return url;
  } catch (error) {
    // Call the failure callback with the error message
    failure(error.message);
    console.log(error.message);
  }
};
Originally, I didn't include the contentType metadata, and it was just uploading as application/octet-stream, which I assume is why it prompts you to save the image.
Image link: https://firebasestorage.googleapis.com/v0/b/cloudnoise-news.appspot.com/o/ref.jpg?alt=media&token=1edc90e7-1668-4a06-92a3-965ce275798b
Currently it's displaying this:
Some things I checked:
Firebase Storage rules are in test mode, so anyone should be able to read and write.
I tried sticking in different MIME types, but it either shows the tiny box or it shows "undefined".
The files upload successfully, and the "swap" in the TinyMCE editor is also all good.
Any idea why this is happening?
You need to set the metadata tag:
const metadata = {
  contentType: file.type,
};
This should ensure that the correct content type is set when the image is uploaded to Firebase Storage.
If this does not resolve the issue, you may need to check that the URL returned from getDownloadURL is valid and points to the correct image. You can try opening the URL in a new browser tab to verify that the image is accessible.
I fixed it by adding a Blob: I created a Blob object with the file data, then uploaded the Blob object instead of the file directly.
const imagesUploadHandler = async (blobInfo, success, failure) => {
  try {
    const file = blobInfo.blob();
    const storageRef = ref(storage, file.name);
    const metadata = {
      contentType: file.type,
    };

    // Create a new Blob object with the file data
    const blob2 = new Blob([file], { type: file.type });

    // Upload the Blob to Firebase Storage
    await uploadBytes(storageRef, blob2, metadata);
    const url = await getDownloadURL(storageRef);

    console.log(url);
    return url;
  } catch (error) {
    // Log the error message
    console.log(error.message)
  }
};
I'm using Uppy for file uploads in React, with a Rails API using Shrine.
I'm trying to show a preview for an uploaded video before submitting a form. It's important to emphasize that this is specifically for a video upload, not an image. So the 'thumbnail:generated' event will not apply here.
I can't seem to find any event that Uppy provides that returns a cached video preview (the way thumbnail:generated does), or anything that passes back a presigned URL for the uploaded file (less expected, obviously), so the only option I see is constructing the URL manually. Here's what I'm currently trying (irrelevant code removed for brevity):
import React, { useEffect, useState } from 'react'
import AwsS3 from '@uppy/aws-s3'
import Uppy from '@uppy/core'
import axios from 'axios'
import { DragDrop } from '@uppy/react'
import { API_BASE } from '../../../api'

const constructParams = (metadata) => ([
  `?X-Amz-Algorithm=${metadata['x-amz-algorithm']}`,
  `&X-Amz-Credential=${metadata['x-amz-credential']}`,
  `&X-Amz-Date=${metadata['x-amz-date']}`,
  '&X-Amz-Expires=900',
  '&X-Amz-SignedHeaders=host',
  `&X-Amz-Signature=${metadata['x-amz-signature']}`,
].join('').replaceAll('/', '%2F'))
const MediaUploader = () => {
  const [videoSrc, setVideoSrc] = useState('')

  const uppy = new Uppy({
    meta: { type: 'content' },
    restrictions: {
      maxNumberOfFiles: 1
    },
    autoProceed: true,
  })

  const getPresigned = async (id, type) => {
    const response = await axios.get(`${API_BASE}/s3/params?filename=${id}&type=${type}`)
    const { fields, url } = response.data
    const params = constructParams(fields)
    const presignedUrl = `${url}/${fields.key}${params}`
    console.log('presignedUrl from Shrine request data: ', presignedUrl)
    setVideoSrc(presignedUrl)
  }

  useEffect(() => {
    uppy
      .use(AwsS3, {
        id: `AwsS3:${Math.random()}`,
        companionUrl: API_BASE,
      })

    uppy.on('upload-success', (file, _response) => {
      const { type, meta } = file

      // First attempt to construct presigned URL here
      const url = 'https://my-s3-bucket.s3.us-west-1.amazonaws.com'
      const params = constructParams(meta)
      const presignedUrl = `${url}/${meta.key}${params}`
      console.log('presignedUrl from upload-success data: ', presignedUrl)

      // Second attempt to construct presigned URL here
      const id = meta.key.split(`${process.env.REACT_APP_ENV}/cache/`)[1]
      getPresigned(id, type)
    })
  }, [uppy])

  return (
    <div className="MediaUploader">
      <div className="Uppy__preview__wrapper">
        <video
          src={videoSrc || ''}
          className="Uppy__preview"
          controls
        />
      </div>
      {(!videoSrc || videoSrc === '') && (
        <DragDrop
          uppy={uppy}
          className="UploadForm"
          locale={{
            strings: {
              dropHereOr: 'Drop here or %{browse}',
              browse: 'browse',
            },
          }}
        />
      )}
    </div>
  )
}

export default MediaUploader
Both urls here come back with a SignatureDoesNotMatch error from AWS.
The manual construction of the url comes mainly from constructParams. I have two different implementations of this, the first of which takes the metadata directly from the uploaded file data in the 'upload-success' event, and then just concatenates a string to build the url. The second one uses getPresigned, which makes a request to my API, which points to a generated Shrine path that should return data for a presigned URL. API_BASE simply points to my Rails API. More info on the generated Shrine route here.
It's worth noting that everything works perfectly with the upload process that passes through Shrine, and after submitting the form, I'm able to get a presigned url for the video and play it without issue on the site. So I have no reason to believe Shrine is returning incorrectly signed urls.
I've compared the two presigned urls I'm manually generating in the form, with the url returned from Shrine after uploading. All 3 are identical in structure, but have different signatures. Here are those three urls:
presignedUrl from upload-success data:
https://my-s3-bucket.s3.us-west-1.amazonaws.com/development/cache/41b229fb17cbf21925d2cd907a59be25.mp4?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAW63AYCMFA4374OLC%2F20221210%2Fus-west-1%2Fs3%2Faws4_request&X-Amz-Date=20221210T132613Z&X-Amz-Expires=900&X-Amz-SignedHeaders=host&X-Amz-Signature=97aefd1ac7f3d42abd2c48fe3ad50b542742ad0717a51528c35f1159bfb15609
presignedUrl from Shrine request data:
https://my-s3-bucket.s3.us-west-1.amazonaws.com/development/cache/023592fb14c63a45f02c1ad89a49e5fd.mp4?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAW63AYCMFA4374OLC%2F20221210%2Fus-west-1%2Fs3%2Faws4_request&X-Amz-Date=20221210T132619Z&X-Amz-Expires=900&X-Amz-SignedHeaders=host&X-Amz-Signature=7171ac72f7db2b8871668f76d96d275aa6c53f71b683bcb6766ac972e549c2b3
presigned url displayed on site after form submission:
https://my-s3-bucket.s3.us-west-1.amazonaws.com/development/cache/41b229fb17cbf21925d2cd907a59be25.mp4?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAW63AYCMFA4374OLC%2F20221210%2Fus-west-1%2Fs3%2Faws4_request&X-Amz-Date=20221210T132734Z&X-Amz-Expires=900&X-Amz-SignedHeaders=host&X-Amz-Signature=9ecc98501866f9c5bd460369a7c2ce93901f94c19afa28144e0f99137cdc2aaf
The first two urls come back with SignatureDoesNotMatch, while the third url properly plays the video.
I'm aware the first and third urls have the same file name, while the second url does not. I'm not sure what to make of that, but its relevance is secondary to me, since that approach was more of a last-ditch effort anyway.
I'm not at all attached to the current way I'm doing things. It's just the only solution I could come up with, due to lack of options. If there's a better way of going about this, I'm very open to suggestions.
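One thing I have since realized might work for the preview itself (a sketch on my part, not verified): Uppy keeps the selected file's Blob on file.data, so URL.createObjectURL could give a local, playable preview without any presigned URL at all.

// Sketch: local preview from the file Uppy already holds, no S3 round trip involved
uppy.on('file-added', (file) => {
  // file.data is the underlying Blob/File object
  const objectUrl = URL.createObjectURL(file.data)
  setVideoSrc(objectUrl)
})

// Later, when the preview is no longer needed, the object URL can be released:
// URL.revokeObjectURL(objectUrl)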
I have only recently started dealing with the AWS SDK, so please excuse me if my approach is complete nonsense.
I want to upload a simple media file to my S3. I was following this tutorial and so far I am able to upload files without a problem. For usability, a progress bar would be a nice extra, so I researched how to achieve this. I quickly found that the current AWS SDK v3 no longer supports httpUploadProgress and that we should use @aws-sdk/lib-storage instead. Using this library, I am still able to upload files to S3, but I can't get the progress tracker to work! I assume this has something to do with me not fully understanding how to deal with async within a React component.
So here is my minified component example (I am using Chakra UI here):
const TestAWS: React.FC = () => {
  const inputRef = useRef<HTMLInputElement | null>(null);
  const [progr, setProgr] = useState<number>();

  const region = "eu-west-1";
  const bucketname = "upload-test";

  const handleClick = async () => {
    inputRef.current?.click();
  };

  const handleChange = async (e: any) => {
    console.log('Start file upload');

    const file = e.target.files[0];
    const target = {
      Bucket: bucketname,
      Key: `jobs/${file.name}`,
      Body: file,
    };

    const s3 = new S3Client({
      region: region,
      credentials: fromCognitoIdentityPool({
        client: new CognitoIdentityClient({ region: region }),
        identityPoolId: "---MY ID---",
      }),
    });

    const upload = new Upload({
      client: s3,
      params: target,
    });

    const t = upload.on("httpUploadProgress", progress => {
      console.log("Progress", progress);
      if (progress.loaded && progress.total) {
        console.log("loaded/total", progress.loaded, progress.total);
        setProgr(Math.round((progress.loaded / progress.total) * 100)); // I was expecting this line to be sufficient for updating my component
      }
    });

    await upload.done().then(r => console.log(r));
  };

  console.log('Progress', progr);

  return (
    <InputGroup onClick={handleClick}>
      <input ref={inputRef} type={"file"} multiple={false} hidden accept='video/*' onChange={e => handleChange(e)} />
      <Flex layerStyle='uploadField'>
        <Center w='100%'>
          <VStack>
            <PlusIcon />
            <Text>Choose Video File</Text>
          </VStack>
        </Center>
      </Flex>
      {progr && <Progress value={progr} />}
    </InputGroup>
  );
};

export default TestAWS;
So basically I see the event getting fired (start file upload). Then it takes a while and I see the Promise result and the Progress, 100 in my console. This means to me that the state variable gets updated (at least once) but the component does not re-render?
What is it what I am doing wrong here? Any help appreciated!
Alright, I have found the solution. The callback on the state variable works fine and does what it should, but the configuration of the Upload object was off. After digging into the source I found out that the event listener only gets triggered after the uploader has uploaded more data. Because the uploader chunks the upload, there are two separate config parameters which allow you to split your upload into separate chunks. So
const upload = new Upload({
  client: s3,
  params: target,
  queueSize: 4,               // 4 is the default
  partSize: 5 * 1024 * 1024   // 5 MB is the minimum part size
});
basically does the job when the file we upload is larger than 5MB! Only then does the event get triggered again and update the state variable.
Since this uploader is made for handling large file uploads, this totally makes sense, and we can simply adjust queueSize and partSize according to the file we want to upload. Something like
let queueSize = 10;
const file = event.target.files[0];
let partSize = file.size / (10 * 1024 * 1024); // 1/10th of the file size in MB

const upload = new Upload({
  client: s3,
  params: target,
  queueSize: partSize > 5 ? queueSize : undefined,
  partSize: partSize > 5 ? partSize : undefined
});
Obviously, this can be done in a much more sophisticated way, but I did not want to spend too much time on it since it is not part of the original question.
Conclusion
If your file is large enough (>5MB), you will see progress updates, depending on the number of chunks (of 5MB or more) you have chosen to split your file into.
Since this only affects the handleChange method from the original example, I post it here for completeness:
const handleChange = async (event) => {
  const file = event.target.files[0]
  const target = {
    Bucket: 'some-S3-bucket',
    Key: `jobs/${file.name}`,
    Body: file,
  };

  const s3 = new S3Client({
    region: 'your-region',
    credentials: fromCognitoIdentityPool({
      client: new CognitoIdentityClient({ region: 'your-region' }),
      identityPoolId: "your-id",
    }),
  });

  // this will default to queueSize=4 and partSize=5MB
  const upload = new Upload({
    client: s3,
    params: target
  });

  upload.on("httpUploadProgress", progress => {
    console.log('Current Progress', progress);
    setProgr(progress);
  });

  await upload.done().then(r => console.log(r));
}
Maybe this helps someone who has the same problem.
I came across your answer after having exactly the same problem (with Vue) today!
Indeed you are right: the AWS SDK JS v3 event only fires per part, which is not at all clear, and I wasted time debugging that too. For a 4MB file, for example, it would only ever fire at 100%.
As you say, you can experiment with the part size, but the minimum is 5MB, so on a slow connection I found an upload can appear stuck while you wait for the first 5MB to go through. So what I did was look at the size of the file being uploaded: if it is under a threshold (say 25MB, or whatever is applicable), it is probably safe to upload it all in one go, since you don't really need multipart uploading. For that case I made a presigned URL (https://aws.amazon.com/blogs/developer/generate-presigned-url-modular-aws-sdk-javascript/) which can be used to PUT with axios (since fetch does not support progress events yet).
That way you can use Upload for large files (where you actually need multipart uploading and where 5MB as a percentage of the file size is small), and a presigned URL for small files, and so get much more frequent updates.
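For reference, the presigned PUT URL itself can be generated with the modular SDK roughly like this, following the AWS blog post linked above (bucket name, key layout, and expiry below are placeholders):

import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

// Sketch: sign a PUT for a single small file
async function getUploadUrl(file) {
  const s3 = new S3Client({ region: "your-region" });
  const command = new PutObjectCommand({
    Bucket: "some-S3-bucket",   // placeholder bucket
    Key: `jobs/${file.name}`,   // placeholder key layout
  });
  // URL is valid for 15 minutes; use it as SIGNED-URL-HERE below
  return getSignedUrl(s3, command, { expiresIn: 900 });
}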
The same progress event handler can be used by both.
this.$axios
  .request({
    method: "PUT",
    url: SIGNED-URL-HERE,
    data: file,
    timeout: 3600 * 1000,
    onUploadProgress: this.uploadProgress,
  })
  .then((data) => {
    console.log("Success", data);
  })
  .catch((error) => {
    console.log("Error", error.code, error.message);
  });
Not ideal but it helps.
I have a create-react-app that reads and writes local files using File System Access API. When run in a browser (Chrome or Edge that support it), both reading and writing files work fine.
When the app is run in Electron, reading works but writing fails due to: Uncaught (in promise) DOMException: The request is not allowed by the user agent or the platform in the current context.
I am using the latest Electron (12.0.1) which uses the same Chromium (89.0.4389.82) as the one in my Chrome browser.
Below is the relevant code. The console log after the requestPermission call shows true and granted in the browser, and true and denied in Electron.
I tried disabling webSecurity when creating the BrowserWindow and disabling the sandbox with appendSwitch, but nothing helped.
Is there a way to give Chromium in Electron more permissions?
If not, I am willing to handle file writing differently when in Electron. In that case, what to write in place of TODO in the code? Note that because it is a create-react-app, the fs module is not available.
export async function chooseAndReadFile() {
  const fileHandle = await window.showOpenFilePicker().then((handles) => handles[0])
  const file = await fileHandle.getFile()
  const contents = await file.text()
  return contents
}

export async function chooseAndWriteToFile(contents: string) {
  const fileHandle = await window.showSaveFilePicker()

  const descriptor: FileSystemHandlePermissionDescriptor = {
    writable: true,
    mode: "readwrite"
  }
  const permissionState = await fileHandle.requestPermission(descriptor)

  console.log(window.isSecureContext)
  console.log(permissionState)

  const writable = await fileHandle.createWritable()
  await writable.write(contents)
  await writable.close()
}

let isElectron = require("is-electron")

export async function chooseAndWriteToFileUniversal(contents: string) {
  if (isElectron()) {
    // TODO: Do what???
  } else {
    chooseAndWriteToFile(contents)
  }
}
Answering my own question: I finally used a solution with the HTML download attribute, nicely described here. When this technique is used in Electron, it presents a file save dialog, which is exactly what I want. When used in a browser, it just downloads the file without a prompt, so I will continue using the File System Access API for browser environments.
Here is the code that handles downloading when running in Electron.
function download(filename: string, contents: string) {
  var element = document.createElement('a');
  element.setAttribute('href', 'data:text/plain;charset=utf-8,' + encodeURIComponent(contents));
  element.setAttribute('download', filename);
  element.style.display = 'none';

  document.body.appendChild(element);
  element.click();
  document.body.removeChild(element);
}

let isElectron = require("is-electron");

export async function chooseAndWriteToFileUniversal(contents: string) {
  if (isElectron()) {
    download("data.txt", contents)
  } else {
    chooseAndWriteToFile(contents) // See the original question for implementation of this function
  }
}
Still, it would be nice to know why/how Chromium in Electron is more restricted than in a normal Chrome or Edge browser, and whether that can be changed.
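Another option I considered (a sketch only; it assumes contextIsolation with a preload script, and the channel and function names are made up for illustration) is to delegate the write to Electron's main process, where dialog and fs are available, and expose it to the React code through the preload script:

// main.js (Electron main process) — sketch; the 'save-file' channel name is arbitrary
const { ipcMain, dialog } = require('electron');
const fs = require('fs/promises');

ipcMain.handle('save-file', async (_event, contents) => {
  const { canceled, filePath } = await dialog.showSaveDialog({});
  if (!canceled && filePath) {
    await fs.writeFile(filePath, contents, 'utf-8');
  }
});

// preload.js — expose a minimal API to the renderer
const { contextBridge, ipcRenderer } = require('electron');
contextBridge.exposeInMainWorld('electronApi', {
  saveFile: (contents) => ipcRenderer.invoke('save-file', contents),
});

// In the create-react-app code, the Electron branch could then become:
// if (isElectron()) { await window.electronApi.saveFile(contents) }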
I have a problem uploading an image to an image column in SharePoint Online via PnPjs.
I don't know how to convert the image and upload it to the image column in a SharePoint list.
I tried lots of ways (converting the image to a blob, file data) but nothing works.
Keep in mind this is not an attachment on the list.
It's a new column (image) in the SharePoint list.
Reference image: click here
It looks like the image is not stored in the list; JSON is stored. So you can just upload the image to Site Assets (that's what SharePoint does when you set the image manually) and then put the JSON into the field. I would try something like this (assuming you are using PnPjs):
import * as React from "react";
import { sp } from "@pnp/sp/presets/all";

// hello world react component
export const HelloWorld = () => {

  const uploadFile = async (evt) => {
    const file: File = evt.target.files[0];

    // upload to the root folder of site assets in this demo
    const assets = await sp.web.lists.ensureSiteAssetsLibrary();
    const fileItem = await assets.rootFolder.files.add(file.name, file, true);

    // bare minimum; probably you'll want other properties as well
    const img = {
      "serverRelativeUrl": fileItem.data.ServerRelativeUrl,
    };

    // create the item, stringify json for image column
    await sp.web.lists.getByTitle("YourListWithImageColumn").items.add({
      Title: "Hello",
      YourImageColumn: JSON.stringify(img)
    });
  };

  return (<div>
    <input type='file' onChange={uploadFile} />
  </div>);
};
@azarmfa,
The image file is in fact not stored in the image field; the field just references its location. You could first upload the image file to a library (by default it will be Site Assets), then update the item like below:
let list = sp.web.lists.getByTitle("mylinks");

let json = {
  "fileName": "Search_Arrow.jpg",
  "serverUrl": "https://abc.sharepoint.com",
  "serverRelativeUrl": "/sites/s01/Style%20Library/Images/Search_Arrow.jpg"
};
let jsonstr = JSON.stringify(json);

const i = await list.items.getById(3).update({
  Title: "My New Tit",
  img: jsonstr
});
BR