How can I upload large media files to S3 from the client side? - reactjs

I have a form where a user submits an image from the client side. Currently, I am using AWS Amplify to upload images to S3. I have not faced an issue so far because most of the time users selected images smaller than 5MB. But now I need to raise this limit and also optimize performance, since larger files will be uploaded. While researching performance, I learned about S3 Transfer Acceleration and its compatibility with the AWS SDK but not with AWS Amplify, so I would have to move my upload logic to the SDK. My backend architecture is Serverless with Lambda functions. Say the user submits a form whose request contains a 10MB image: Lambda will not process this request because its request/response payload is limited to 6MB. How can I improve performance while also increasing the size limit? Also, can this be achieved with AWS Amplify itself?

Your use case is best served by S3 pre-signed URLs. In this approach your Lambda function generates a URL for an S3 upload using the aws-sdk and returns only that URL to your frontend. The frontend then uploads directly to this URL without AWS Lambda being invoked again, so all data transfer occurs between your frontend and S3.
You can find more details about this solution here.
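As a rough sketch of the Lambda side of that flow (assuming the AWS SDK for JavaScript v3; the bucket name, key scheme, and 5-minute expiry are placeholders), something like this could hand out the URL. The `useAccelerateEndpoint` flag ties in the Transfer Acceleration mentioned in the question, provided it has been enabled on the bucket:

```typescript
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

// Opt the generated URL into S3 Transfer Acceleration
// (assumes acceleration has been enabled on the bucket).
const s3 = new S3Client({ useAccelerateEndpoint: true });

export const handler = async (event: {
  queryStringParameters?: { key?: string };
}) => {
  const key = event.queryStringParameters?.key ?? `uploads/${Date.now()}`;
  const command = new PutObjectCommand({
    Bucket: "my-media-bucket", // hypothetical bucket name
    Key: key,
    ContentType: "image/jpeg",
  });

  // Only this short-lived URL travels through Lambda; the 10MB image
  // itself goes straight from the browser to S3.
  const url = await getSignedUrl(s3, command, { expiresIn: 300 });
  return { statusCode: 200, body: JSON.stringify({ url }) };
};
```

The matching frontend upload is sketched in the last answer below.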

Related

How can I access the generated thumbnails in S3 from the frontend?

I have created a Lambda trigger: when a video file is uploaded to an S3 input bucket, it creates a thumbnail in an output bucket, but I don't know how to access it. Please help me.
I am generating 3 thumbnails from a single video; in the screenshot below (omitted here) there are 4 videos. The files are named dfdf, treasersonthing, vijay.treaser.ve, and vollyball, and I want all 3 thumbnails for each of these file names.
The question is quite open - do you want to access the generated thumbnails on the frontend of your website? If so, I will try to provide some ideas for the architecture, based on some assumptions.
Publicly Accessible Thumbnails
Assuming you want to make the thumbnails publicly accessible, S3 can expose them through its own HTTP endpoint. See AWS documentation. Please note that this involves enabling Public Access on your bucket, which can potentially be risky. See How can I secure files in my Amazon S3 bucket for more information. While this is an option, I'm not going to elaborate on it, as it's probably not the best.
The preferred way is to have a CloudFront distribution serve files from your S3 bucket. This has the advantages of a typical CDN - you can have edge locations caching your files across the globe, thus reducing the latencies your customers see. See this official guide on how to proceed with this CloudFront + S3 solution.
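If the trigger writes thumbnails under a predictable prefix, you can list them by video name from wherever AWS credentials are available (e.g. a small API, or the frontend via Amplify Auth). A minimal sketch; the bucket name, key layout, region, and CloudFront domain are all assumptions about your setup:

```typescript
import { S3Client, ListObjectsV2Command } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" }); // assumed region

// List every thumbnail generated for a given video, assuming the
// trigger writes them under thumbnails/<video-name>/ in the output bucket.
export async function listThumbnails(videoName: string): Promise<string[]> {
  const { Contents = [] } = await s3.send(
    new ListObjectsV2Command({
      Bucket: "my-output-bucket",         // hypothetical output bucket
      Prefix: `thumbnails/${videoName}/`, // hypothetical key layout
    })
  );
  // Serve the keys through your CloudFront distribution's domain.
  return Contents.map((obj) => `https://dxxxxxxxx.cloudfront.net/${obj.Key}`);
}
```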
Restricted-access Thumbnails
If your thumbnails are not meant to be public, then you can consider two options:
implement your own service (hosted on any compute engine you prefer) to handle the authentication & authorization, then return the files to your customers, or
use the CloudFront + S3 solution and control the authentication and authorization with Lambda@Edge (see the sketch after this list). See AWS docs.
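A minimal Lambda@Edge viewer-request sketch, assuming a placeholder bearer-token check (real validation, e.g. JWT verification, is up to you):

```typescript
import type { CloudFrontRequestHandler } from "aws-lambda";

// Runs on CloudFront's viewer-request event, so only authorized
// viewers ever reach the S3 origin holding the thumbnails.
export const handler: CloudFrontRequestHandler = async (event) => {
  const request = event.Records[0].cf.request;
  const auth = request.headers["authorization"]?.[0]?.value;

  // Placeholder check; substitute your real token validation.
  if (auth !== "Bearer expected-token") {
    return {
      status: "403",
      statusDescription: "Forbidden",
      body: "Not authorized",
    };
  }
  return request; // pass the request through to the origin
};
```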

ReactJS FastAPI - Get Azure blob image?

I have developed a ReactJS front end, FastAPI backend, and MongoDB database. I want to upload images to a private Azure storage account and consume them on the site.
What I'm struggling to understand is: what is the best practice for loading the images?
Should I:
Get a SAS key from my API, then go directly from Reactjs to the storage account URL?
Get the image from the Azure storage using the API and serve it to the Reactjs
A better option?
Thanks in advance
The preferred way would be to go with option 1 - get a SAS key from my API, then go directly from ReactJS to the storage account URL.
The advantage of this approach is that the browser requests the images directly from Azure Storage, so there is less load on your API fetching and serving them. You may need to configure CORS settings on your Storage account if your React app makes an Ajax request to fetch blob contents from Storage. A client-side sketch follows below.
Doing it this way, however, exposes your SAS URL to the client. If you do not want clients to know where the images are being served from, go with option 2.
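A minimal client-side sketch of option 1 (the `/api/sas` endpoint and its response shape are assumptions about your FastAPI backend):

```typescript
// Ask the FastAPI backend for a short-lived SAS URL, then fetch the
// blob directly from Azure Storage and display it in the page.
export async function loadImage(blobName: string): Promise<string> {
  const res = await fetch(`/api/sas?blob=${encodeURIComponent(blobName)}`);
  const { sasUrl } = await res.json(); // e.g. https://acct.blob.core.windows.net/images/a.png?sv=...

  const blob = await (await fetch(sasUrl)).blob();
  return URL.createObjectURL(blob); // usable as an <img src={...}>
}
```

As noted above, the storage account's CORS rules must allow your React app's origin for the second fetch to succeed.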

Allowing client to upload large number of files to cloud storage bucket

I have a React web application in which I allow users to upload DICOM files to the Google Healthcare API. In the current implementation, the files first get uploaded to my back-end server, which then uploads them to the Healthcare API. I allow users to upload a full DICOM study (100MB - 2+GB), which could contain anywhere from 1 to 500+ DICOM files (each usually 50KB-50MB). Our current approach has worked thus far, but as we expand it seems an inefficient use of my server.
My goal is to allow the user to upload directly to a Google Cloud Storage bucket from the React app. I want to perform some validation logic before I export the files to the Google Healthcare API. I have looked into signed URLs, but since the files being uploaded are medical images, I wasn't sure if they would be secure enough. The users don't necessarily have a Google account.
What is the best way to allow a user to directly upload a directory to a GCS bucket without going through my server? Are there dangers involved with this approach if the user uploads a virus? Also, signed URLs are valid for a set amount of time; can I deactivate a signed URL as soon as the uploads are complete?
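For reference, the backend half of the signed-URL approach could look roughly like this with the @google-cloud/storage Node SDK (the bucket name and 15-minute window are assumptions; each file in the study would get its own URL). As far as I know, an individual V4 signed URL cannot be revoked before it expires, which is why a short expiry matters:

```typescript
import { Storage } from "@google-cloud/storage";

const storage = new Storage();

// One short-lived upload URL per DICOM file; the client PUTs each file
// directly to GCS, so the study never passes through the backend server.
export async function signedUploadUrl(fileName: string): Promise<string> {
  const [url] = await storage
    .bucket("dicom-staging-bucket") // hypothetical staging bucket
    .file(fileName)
    .getSignedUrl({
      version: "v4",
      action: "write",
      expires: Date.now() + 15 * 60 * 1000, // keep the window short
      contentType: "application/dicom",
    });
  return url;
}
```

Validation could then run against the staging bucket (e.g. from an object-finalize trigger) before anything is exported to the Healthcare API.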

Upload videos from ReactJS directly to Azure MediaService

I would like to upload videos directly from a ReactJS component to Azure Media Services without a middle server.
I investigated and found this article and the @azure/arm-mediaservices SDK, but it seems that secret tokens are involved, and I assume it's not a good idea to use them on the client side.
Can someone share thoughts and examples on how to simply upload a video directly from the client side (a ReactJS component) to Azure Media Services with a temporary token?
I found a way to upload images to Blob storage the way I want to, but I didn't find a way to do the same for videos to Media Services (without middle server-side operations).
You are correct that we do not recommend ever storing your account secrets on the client side. You will still want a mid-tier for that. One simple way to do that securely is with Azure Functions.
All you need to do is create the Asset and then get a SAS URL to its container to upload content into. Once you have that, there should be samples out there for uploading to a SAS URL in Azure Storage; a sketch follows below.
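On the client, uploading into that container SAS URL could look roughly like this with @azure/storage-blob (the `/api/asset-sas` endpoint returning the SAS URL is an assumption about your mid-tier):

```typescript
import { ContainerClient } from "@azure/storage-blob";

// Upload a video file straight into the Asset's container using a
// container-scoped SAS URL handed out by a mid-tier (e.g. an Azure Function).
export async function uploadVideo(file: File): Promise<void> {
  const res = await fetch("/api/asset-sas");    // hypothetical endpoint
  const { containerSasUrl } = await res.json(); // URL already carries the SAS token

  const container = new ContainerClient(containerSasUrl);
  const blockBlob = container.getBlockBlobClient(file.name);

  // uploadData chunks large files into blocks automatically in the browser.
  await blockBlob.uploadData(file, {
    blobHTTPHeaders: { blobContentType: file.type },
  });
}
```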

Is it better to upload a file to an S3 bucket on the front end or backend?

I am creating a web app with an option for authenticated users to upload pictures. I am confused as to whether it's better to do this on the front end or the backend. I have already implemented it on the front end, but I had to include my "accessKeyId" and "secretKey", and I don't know if this compromises my security. I am using cloud functions for my backend. If anyone can help me with best practices in relation to this, I will be very grateful.
You can generate pre-signed URLs from your backend; then your frontend can safely upload files directly into S3 without exposing your credentials (see the sketch after the list below).
Take a look into the documentation here.
Also, this article points out some of the advantages of that strategy:
You can still allow only authenticated users to get access to presigned urls
You can set expiration time for the generated presigned urls
You save bandwidth, memory processing and upload time by avoiding your files to pass through your backend function
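To complement the Lambda sketch in the first answer above, the frontend half could look like this (the `/presign` route on your backend is hypothetical):

```typescript
// Ask the backend for a pre-signed PUT URL, then send the file
// straight to S3; AWS credentials never reach the browser.
export async function uploadToS3(file: File): Promise<void> {
  const res = await fetch(`/presign?key=${encodeURIComponent(file.name)}`);
  const { url } = await res.json();

  const put = await fetch(url, {
    method: "PUT",
    headers: { "Content-Type": file.type }, // must match the signed ContentType
    body: file,
  });
  if (!put.ok) throw new Error(`Upload failed: ${put.status}`);
}
```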
