Best practice for uploading videos into S3 bucket - reactjs

What is the best practice for uploading images and videos into S3 buckets? In my use case, users can upload their videos and images, and I have to store them in an S3 bucket in an effective way (without data loss). I read some related posts but could not find a better solution. I am using React JS and have to upload from the React JS code. Each video's size would be more than 200 MB, so I am worried about how to send those videos to the S3 bucket quickly and efficiently. Please suggest a good approach to overcome this problem.
Thanks in advance!!!

S3 will not lose your data. If you receive a 200 response from S3, you can be confident there will be no data lost.
As for best practices, you should use PUT Object for files that are smaller than 5 GB. You can also use POST Object to allow your users to upload files directly from the browser. The 5 GB size limit still applies in the case of POST Object.
Once you exceed the 5 GB limit, your only choice is to use S3's multipart upload API. Using the multipart upload API, you can upload files of up to 5 TB in size.
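If you want the React code to push the file straight to S3, one option is the Upload helper from the AWS SDK for JavaScript v3 (@aws-sdk/lib-storage), which switches to the multipart API automatically once the body is large enough. A minimal sketch; the bucket name, region, and key prefix are assumptions, and in a browser you would typically supply credentials through something like Cognito Identity Pools or have your backend hand out presigned URLs instead:

import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

// Assumed region and bucket; replace with your own values.
const s3 = new S3Client({ region: "us-east-1" });

async function uploadVideo(file: File): Promise<void> {
  const upload = new Upload({
    client: s3,
    params: { Bucket: "my-bucket", Key: `videos/${file.name}`, Body: file },
    partSize: 10 * 1024 * 1024, // 10 MB per part
    queueSize: 4,               // parts uploaded in parallel
  });

  // Progress events can drive an upload progress bar in the UI.
  upload.on("httpUploadProgress", (p) => {
    console.log(`uploaded ${p.loaded ?? 0} of ${p.total ?? 0} bytes`);
  });

  await upload.done();
}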

Related

Public s3 bucket security

I have a simple MERN stack application that allows users to upload gallery images. I host all static assets, including all gallery images, in a public S3 bucket. I did this on purpose because I want every image to be public, and load time is much faster since we don't need to retrieve and encode images to display them on the website. However, I'm worried about unexpected costs in case someone tries to generate unwanted traffic to my bucket. As we know, AWS charges for every request and data transfer, so my question is: what are my options to prevent that? Although generating the images through a server, as mentioned before, is probably the best option, I don't want to do it. I read somewhere that I could use budget alerts to trigger a Lambda function that makes the S3 bucket private, but I can't find information on how to implement it.
Maybe there is some other way to do it?
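For what it's worth, the budget-alert idea is feasible: AWS Budgets can publish a notification to an SNS topic, and an SNS-triggered Lambda can switch on the bucket's public access block. A rough sketch of such a handler, assuming the Budgets/SNS wiring is already in place and the bucket name arrives via an environment variable:

import { S3Client, PutPublicAccessBlockCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({});
const BUCKET = process.env.BUCKET_NAME ?? "my-gallery-bucket"; // assumed env var

// Invoked via SNS when the budget threshold notification fires.
export const handler = async (): Promise<void> => {
  // Blocking public access makes the bucket effectively private again,
  // cutting off the anonymous traffic that was driving up the bill.
  await s3.send(
    new PutPublicAccessBlockCommand({
      Bucket: BUCKET,
      PublicAccessBlockConfiguration: {
        BlockPublicAcls: true,
        IgnorePublicAcls: true,
        BlockPublicPolicy: true,
        RestrictPublicBuckets: true,
      },
    })
  );
};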

Uploading a large file to AWS database

I have to upload large files to the AWS S3 bucket, approximately 500,000 lines long and 30MB. What would be the best way to do this, taking speed and number of calls into account?
You can upload large objects using the AWS SDK. For example, assume you are using Java: to upload large files to an Amazon S3 bucket, use createMultipartUpload().
To see an example of how to use this method, see:
https://github.com/awsdocs/aws-doc-sdk-examples/blob/master/javav2/example_code/s3/src/main/java/com/example/s3/S3ObjectOperations.java
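If you are working in JavaScript rather than Java, the same create → upload parts → complete flow exists in the AWS SDK for JavaScript v3. A minimal sketch, with bucket, key, and body as placeholders:

import {
  S3Client,
  CreateMultipartUploadCommand,
  UploadPartCommand,
  CompleteMultipartUploadCommand,
} from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" }); // assumed region

async function multipartUpload(bucket: string, key: string, body: Buffer): Promise<void> {
  // 1. Start the upload and keep the UploadId that S3 hands back.
  const { UploadId } = await s3.send(
    new CreateMultipartUploadCommand({ Bucket: bucket, Key: key })
  );

  // 2. Upload 10 MB parts (every part except the last must be at least 5 MB).
  const partSize = 10 * 1024 * 1024;
  const parts: { ETag?: string; PartNumber: number }[] = [];
  for (let offset = 0, partNumber = 1; offset < body.length; offset += partSize, partNumber++) {
    const { ETag } = await s3.send(
      new UploadPartCommand({
        Bucket: bucket,
        Key: key,
        UploadId,
        PartNumber: partNumber,
        Body: body.subarray(offset, offset + partSize),
      })
    );
    parts.push({ ETag, PartNumber: partNumber });
  }

  // 3. Ask S3 to stitch the parts together into the final object.
  await s3.send(
    new CompleteMultipartUploadCommand({
      Bucket: bucket,
      Key: key,
      UploadId,
      MultipartUpload: { Parts: parts },
    })
  );
}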
You can try uploading these files to S3 either from the console or programmatically. If you face any speed-related issues and the upload takes a long time, you should consider using S3 multipart upload, which uploads a file in multiple parts. There are several factors to consider when you use multipart; go through them before using it:
https://docs.aws.amazon.com/AmazonS3/latest/userguide/mpuoverview.html
From my point of view, uploading a 30 MB file won't be an issue; you can try uploading it directly.

Display a 1 GB video file in react js frontend, stored in private S3 bucket using flask

I need to display/stream large video files in React JS. These files are uploaded to a private S3 bucket by the user, using a React form and Flask.
I tried the getObject method, but my file size is too large. The get-a-signed-URL approach required me to download the file.
I am new to the AWS-Python-React setup. What is the best/most efficient/least costly approach to display large video files in React?
AWS offers other streaming-specific services, but if you really want to serve the videos straight off S3, you could retrieve the files over BitTorrent, which, with the right client/video player, would let you start playing them without downloading the whole file.
Since you mentioned you're using Python, you could do this with the AWS SDK like so:
import boto3

s3 = boto3.client('s3')
response = s3.get_object_torrent(
    Bucket='my_bucket',
    Key='/some_prefix/my_video.mp4'
)
The response object will have this format:
{
    'Body': StreamingBody()
}
Full docs for get_object_torrent are in the boto3 reference.
Then you could use something like webtorrent to stream it on the frontend.
Two things to note about this approach (quoting docs):
Amazon S3 does not support the BitTorrent protocol in AWS Regions launched after May 30, 2016.
You can only get a torrent file for objects that are less than 5 GBs in size.

How to best upload high volume of resources through filepond?

I've recently been using filepond for an enterprise web application that allows end users to upload a maximum of 1,500 images (medium size, avg 200 KB, max 500 KB).
There is very little backend processing once an image is uploaded, other than its temporary storage in a database; we later perform asynchronous processing, picking up the files from that temporary storage. The current challenge is that the browser serializing the requests stretches the upload to as long as 2 hours! We've been able to cut this to close to 1 hour by increasing the max parallel uploads in filepond, but that is still far from acceptable (the target is 20 minutes), and we still see the serialization happening in Chrome DevTools with such a volume of images being uploaded.
With this in mind, I'm currently looking for a filepond plugin to zip the dropped files and then upload a single archive to the backend, without the user having to do that themselves. I couldn't find anything related on filepond's plugins page, and most plugins listed there seem to deal with image transformation. Hopefully the jszip library could do the trick. Am I on the right track? Any further suggestions?
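If the archive route works for you, the zipping itself is straightforward with jszip. A rough sketch of bundling the dropped File objects into a single blob before posting it; the /api/upload endpoint is just a placeholder for your own backend:

import JSZip from "jszip";

// Bundle the dropped files into one zip blob in the browser.
async function zipFiles(files: File[]): Promise<Blob> {
  const zip = new JSZip();
  for (const file of files) {
    zip.file(file.name, file); // JSZip accepts Blob/File content directly
  }
  // STORE (no compression) is usually enough for already-compressed JPEGs
  // and keeps client-side CPU time low.
  return zip.generateAsync({ type: "blob", compression: "STORE" });
}

// Usage sketch: one request instead of 1,500.
async function uploadAll(files: File[]): Promise<void> {
  const archive = await zipFiles(files);
  const form = new FormData();
  form.append("archive", archive, "images.zip");
  await fetch("/api/upload", { method: "POST", body: form }); // endpoint is an assumption
}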
Other things on the radar our team is exploring:
creating multiple DNS endpoints to increase the number of parallel requests made by the browser;
researching CDN service alternatives.
Thanks a bunch!

Patterns to show content in a React and AWS website

Sorry about the "stupid" title, but I don't really know how to explain this.
I want to have a page on my site (built in React) that shows the release notes for each version of my site/product. I could hardcode the content of the release notes in the page, but I want an approach that doesn't force me to recompile the site just to change content.
My site is hosted on AWS, so I was wondering whether there are any patterns for storing the content of the page in an S3 bucket as a text file, or as an entry in DynamoDB.
Does this make sense?
These are things I remember, but I would like to ask how "you" have done this in the past.
Thank you in advance!
You could really use either S3 or DynamoDB, though S3 ends up being more favorable for a few reasons.
For S3, the general pattern would be to store your formatted release notes as an HTML file (or multiple files) as S3 objects and have your site make AJAX requests to the S3 object(s) and load the HTML stored there as your formatted release notes.
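A minimal sketch of that client-side fetch as a React component, assuming a publicly readable object (or one served through CloudFront) with a CORS rule that allows GETs from your site's origin:

import { useEffect, useState } from "react";

// Assumed public object URL; replace with your bucket/key or CloudFront URL.
const NOTES_URL = "https://my-bucket.s3.amazonaws.com/release-notes.html";

export function ReleaseNotes() {
  const [html, setHtml] = useState("Loading release notes…");

  useEffect(() => {
    // Fetch the pre-formatted HTML asynchronously so page load isn't blocked.
    fetch(NOTES_URL)
      .then((res) => res.text())
      .then(setHtml)
      .catch(() => setHtml("Release notes are unavailable right now."));
  }, []);

  // The HTML comes from your own bucket, so injecting it here is acceptable.
  return <div dangerouslySetInnerHTML={{ __html: html }} />;
}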
Benefits:
You can make the request client-side and asynchronously via AJAX, so the rest of the page load time isn't negatively impacted.
If you want to change the formatting of the release notes, you can do so by just changing the S3 object. No site recompilation required.
S3 is cheap.
If you were to use DynamoDB, you would have to request the contents server-side and format them server-side (the format would not be changeable without site recompilation). You get 25 read capacity units for free, but if your site sees a lot of traffic, you could end up paying much more than you would with S3.
