Upload big files into S3 from React

I'm playing around with making a video platform (for fun). I currently have three big pieces: a frontend written in React, a backend written in Node.js, and file storage for user media in DigitalOcean Spaces (an S3-compatible bucket).
I want to keep this project lightweight, and since it's practice for me I don't want to use a library that does everything for me. This is just an architecture question.
I already implemented image upload through the backend (the frontend sends the binary to the backend, which validates it and then uploads it to storage). I also have an endpoint to stream video from storage, which works fine.
Now I'm trying to do the same for videos, but I have a dilemma: if I follow the image approach, the file has to be uploaded first to the backend and then to storage, and AFAIK that's really inefficient.
Would it be better to upload the video directly from the frontend and then store the resulting URL in the DB? And if so, how do I keep the S3 connection secure without exposing the bucket's credentials?

Related

How to store files uploaded from a website or an HTML form?

I am currently building a digital library website. The site needs to store uploaded files and serve them back to users. I have found a way to store the uploaded files in Google Cloud. Is there a better way than this?

Professional way to send images from server to client

We are planning our final school project and I need to find a way to send images from the server to the client (a Flutter app). Due to a lack of experience in a professional environment, I'm struggling to do so.
In my smaller projects I've always saved the image name or path in the database, fetched that data via an API, and then requested the image, which was hosted on a web server, over HTTP or HTTPS. Pretty easy in Flutter with Image.network.
However, that doesn't sound like the best option.
We are planning on using:
Ubuntu or Microsoft Server (still to decide)
MariaDB alone or with MongoDB, or even MS SQL Server (still to decide)
ASP.NET Core for the API
Flutter App and Web-Interface for client-side
Any suggestions are appreciated!
What you are doing in your smaller projects is correct; it's a standard approach. When the frontend (mobile app or web app) uploads an image through an API, the backend (in your case ASP.NET Core) simply stores it on the server (in your case Ubuntu or Microsoft Server). That said, I would store all media files (audio, video, images, documents, etc.) in an AWS S3 bucket, because it is difficult to add disk space to your server when it runs low, whereas S3 can store any amount of data.
After saving the media files on S3 (or on the server), store their URLs in the database. Send a file's URL to the client via the API when requested; on the client side you just use that URL to display or download the file.
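To make that flow concrete, here is a tiny sketch of the "store the URL, serve the URL" idea. The in-memory Map stands in for a real database, and the function names are illustrative, not any framework's API:

```javascript
// Hypothetical sketch: persist only the media URL, hand it to clients on request.
const db = new Map(); // id -> media record (stand-in for MariaDB/MongoDB/etc.)

// Called after the file has landed on S3 (or your own server).
function saveMedia(id, url, mimeType) {
  const record = { id, url, mimeType };
  db.set(id, record);
  return record;
}

// What an API endpoint like GET /media/:id would return.
function getMediaResponse(id) {
  const record = db.get(id);
  if (!record) return { status: 404, body: { error: 'not found' } };
  return { status: 200, body: { url: record.url, mimeType: record.mimeType } };
}
```

The client receives only the URL and fetches or displays the file directly (e.g. with Flutter's Image.network); the API never streams the bytes itself.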

How can I upload files to my S3 bucket from my React app without a backend?

I am creating a web app using React where the user can upload files to a folder in my S3 bucket. This folder will have a unique passcode name. The user (or someone else) can use this passcode to retrieve these files from the S3 folder. So basically there is no login/authentication system.
My current issue is: how do I safely allow read/write access to my S3 bucket? Almost every tutorial stores the access keys in the client code, which I've read is very bad practice, but I also don't want to create a backend for something this simple. Someone suggested presigned URLs, but I have no idea how to set those up (do I use Lambda? IAMs?). I'm really new to AWS (and web dev in general). Does anyone have pointers on what I could look into?
do I use Lambda? IAMs?
The setup and process are fully explained in this AWS blog post:
Uploading to Amazon S3 directly from a web or mobile application
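The gist of that post: an authenticated endpoint (e.g. Lambda behind API Gateway, with an IAM role allowing s3:PutObject) returns a presigned URL, and the browser uploads straight to it. A hedged sketch of the client side, with the fetch implementation injectable so the flow can be exercised without a network (the function name is illustrative):

```javascript
// Hypothetical helper: PUT a File/Blob directly to a presigned URL.
// `fetchImpl` defaults to the global fetch but can be stubbed in tests.
async function uploadToPresignedUrl(presignedUrl, file, fetchImpl = fetch) {
  const res = await fetchImpl(presignedUrl, {
    method: 'PUT',
    headers: { 'Content-Type': file.type || 'application/octet-stream' },
    body: file,
  });
  if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
  // The part before the "?" is the plain object URL you would record.
  return presignedUrl.split('?')[0];
}
```

In a React component this would be called from the file input's change handler, after first asking your endpoint for the presigned URL; no credentials ever reach the client.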

Best way to upload REALLY big files with angularjs and rails-api

I have a Rails API connected to an Angular app whose main purpose is uploading different files to the server and making copies of them. Until now it has worked perfectly, but the biggest files I've had to upload so far are around 500 MB, and now I have to upload batches of files that tend to total more than 20 GB.
I use Angular np-upload to send files over HTTP and the CarrierWave gem to manage file and folder creation on the server side, but I don't want to do the same when the payload is this big. So my question is: is there a way to do this differently? Something like Google Drive folders, which synchronize constantly without the user even noticing. I want to accomplish something like that in my application, so that while really big files are uploading the app doesn't freeze and the user can upload several things at once and keep interacting with the app while it uploads or synchronizes files to the server.
Thanks in advance for any advice.
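For files in the multi-gigabyte range, the standard answer on S3-compatible storage is a multipart upload: the client splits the file into parts of at least 5 MiB, uploads each part independently (in parallel, retrying only failed parts), then asks the server to complete the upload, so the UI never blocks on one giant request. A sketch of the part-boundary math a client would use with `File.slice` (names are illustrative):

```javascript
// S3 requires every part of a multipart upload except the last
// to be at least 5 MiB.
const MIN_PART_SIZE = 5 * 1024 * 1024;

// Compute {partNumber, start, end} byte ranges covering the whole file.
function partRanges(totalBytes, partSize = MIN_PART_SIZE) {
  if (partSize < MIN_PART_SIZE) throw new Error('parts must be >= 5 MiB');
  const ranges = [];
  for (let start = 0, part = 1; start < totalBytes; start += partSize, part++) {
    ranges.push({
      partNumber: part,
      start,
      end: Math.min(start + partSize, totalBytes),
    });
  }
  return ranges;
}
```

In the browser you would upload `file.slice(start, end)` for each range to its own presigned part URL, then call a complete-multipart endpoint on the server; since CarrierWave manages files only once they reach the server, the Rails API would typically issue the part URLs itself for a direct-to-storage flow.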

Using SoundManager2 to create a non-downloadable playlist

Currently I am trying to develop a music streaming service, using SoundManager2 for streaming. I am running it on Google App Engine, and the music files are stored in Google Cloud Storage buckets.
After each song finishes, I make an AJAX query to fetch the URL of the next playable song.
But in doing so I expose the direct URL of the .mp3 file, and the file can be downloaded through that URL (with a little help from Chrome DevTools).
How can I stop the song from being downloaded, or do something like SoundCloud, where the URL can't be identified in the developer tools?
