I would like to upload videos directly from a ReactJS component to Azure Media Services without a middle server.
I investigated and found this article and this @azure/arm-mediaservices SDK. It seems that secret tokens are involved, and I assume it's not a good idea to use them on the client side.
Can someone share thoughts and examples on how to simply upload a video directly from the client side (a ReactJS component) to Azure Media Services with a temporary token?
I found a way to upload images to Blob Storage the way I want to, but I didn't find a way to do the same for videos to Media Services (without middle-tier server operations).
You are correct that we never recommend storing your account secrets on the client side. You will still want a mid tier for that. One simple way to do that securely is with Azure Functions.
All you need to do is create the Asset and then get a SAS URL to its container to upload content into it. Once you have those, there are samples out there for uploading to a SAS URL in Azure Storage.
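For illustration, here is a minimal sketch of the client-side half, assuming your mid tier (for example an Azure Function behind a hypothetical `/api/get-upload-sas` endpoint) has already created the Asset and returned a container SAS URL. It uses `BlockBlobClient` from `@azure/storage-blob`; the endpoint name and response shape are assumptions, not part of the original answer.

```typescript
// Client-side upload of a video file into an Azure Media Services asset container,
// using a SAS URL obtained from a mid tier (e.g. an Azure Function).
// The /api/get-upload-sas endpoint and its response shape are assumptions for this sketch.
import { BlockBlobClient } from "@azure/storage-blob";

async function uploadVideo(file: File): Promise<void> {
  // Ask the mid tier to create the asset and return a container SAS URL.
  const res = await fetch(`/api/get-upload-sas?fileName=${encodeURIComponent(file.name)}`);
  const { containerSasUrl } = await res.json(); // e.g. https://<account>.blob.core.windows.net/<container>?<sas>

  // Build a blob URL inside the SAS-scoped container and upload the file into it.
  const url = new URL(containerSasUrl);
  url.pathname = `${url.pathname}/${encodeURIComponent(file.name)}`;
  const blobClient = new BlockBlobClient(url.toString());

  await blobClient.uploadData(file, {
    blobHTTPHeaders: { blobContentType: file.type },
    onProgress: (p) => console.log(`${p.loadedBytes} bytes uploaded`),
  });
}
```

The account key never reaches the browser; the SAS expires on its own, so the client only ever holds a short-lived, scoped token.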
Related
I have developed a ReactJS front end, a FastAPI backend, and a MongoDB database. I want to upload images to a private Azure storage account and consume them on the site.
What I'm struggling to understand is what the best practice is for loading the images.
Should I:
Get a SAS key from my API, then go directly from ReactJS to the storage account URL?
Get the image from Azure Storage via the API and serve it to ReactJS?
A better option?
Thanks in advance
The preferred way would be to go with option 1: get a SAS key from your API, then go directly from ReactJS to the storage account URL.
The advantage of this approach is that the browser requests the images directly from Azure Storage, so there is less load on your API to fetch and serve them. You may need to configure CORS settings on your Storage account if your React app makes an Ajax request to fetch the blob contents from Storage.
Doing it this way, however, exposes your SAS URL to the client. If you do not want clients to know where the images are served from, go with option 2.
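As a rough sketch of option 1 (assuming a hypothetical `/api/images/{id}/sas` endpoint on your FastAPI backend that returns a blob URL with a read-only SAS token already appended), the React side could look something like this:

```tsx
// React component that asks the backend for a SAS-protected blob URL
// and lets the browser load the image directly from Azure Storage.
// The /api/images/{id}/sas endpoint and its response shape are assumptions.
import { useEffect, useState } from "react";

export function PrivateImage({ imageId }: { imageId: string }) {
  const [sasUrl, setSasUrl] = useState<string | null>(null);

  useEffect(() => {
    // The API builds the SAS server-side, so the storage account key never reaches the browser.
    fetch(`/api/images/${imageId}/sas`)
      .then((res) => res.json())
      .then((body: { url: string }) => setSasUrl(body.url));
  }, [imageId]);

  // The <img> tag fetches the blob straight from the storage account URL.
  return sasUrl ? <img src={sasUrl} alt="" /> : null;
}
```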
We are planning our final school project and I need to find a way to send images from the server to the client (a Flutter app). Due to a lack of experience in a professional environment, I'm struggling to do so.
In my smaller projects I've always saved the image name or image path in the database, fetched that data via an API, and then requested the image, which was located on a web server, via HTTP or HTTPS. That's pretty easy in Flutter with Image.network.
However, that doesn't sound like the best option.
We are planning on using:
Ubuntu or Microsoft Server (still to decide)
MariaDB alone or with MongoDB, or even MS SQL Server (still to decide)
ASP.NET Core for the API
Flutter App and Web-Interface for client-side
Any suggestions are appreciated!
You are doing it correctly in your smaller projects; that is the usual way to do it. When the frontend (mobile app or web app) uploads an image through an API, the backend (in your case ASP.NET Core) simply stores it on the server (in your case Ubuntu or Microsoft Server). However, I would store all media files such as audio, video, images, and documents in an AWS S3 bucket, because it is difficult to increase server disk space when it runs low, whereas S3 can store any amount of data.
After saving the media files on S3 or the server, store their file URLs in the database. Send the URL to the client via the API when it requests it; on the client side you just need to use that URL to display or download the file.
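As a language-agnostic illustration of that flow (your real backend would be ASP.NET Core; this sketch uses TypeScript with Express and multer purely to make the idea concrete, and the route, hostname, and persistence helper are assumptions):

```typescript
// Minimal upload endpoint: store the file, save its URL in the DB, return the URL.
// Express/multer are used only for illustration; the same flow applies to ASP.NET Core.
import express from "express";
import multer from "multer";

const app = express();
const upload = multer({ dest: "uploads/" }); // or stream straight to S3 instead of local disk

// Hypothetical persistence helper standing in for your MariaDB/MongoDB layer.
async function saveImageRecord(userId: string, url: string): Promise<void> {
  /* INSERT INTO images (user_id, url) VALUES (?, ?) */
}

app.post("/api/users/:id/images", upload.single("image"), async (req, res) => {
  // Build the URL under which the file will later be served (assumed hostname).
  const url = `https://media.example.com/uploads/${req.file!.filename}`;
  await saveImageRecord(req.params.id, url);
  // The Flutter client can now consume this URL with Image.network.
  res.json({ url });
});

app.listen(3000);
```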
I am creating a web app using React where the user can upload files to a folder in my S3 bucket. This folder will have a unique passcode name. The user (or someone else) can use this passcode to retrieve these files from the S3 folder. So basically there is no login/authentication system.
My current issue is how do I safely allow read/write access to my S3 bucket? Almost every tutorial stores the access keys in the client code, which I've read is very bad practice, but I also don't want to create a backend for something this simple. Someone suggested presigned URLs, but I have no idea how to set those up (do I use Lambda? IAMs?). I'm really new to AWS (and web dev in general). Does anyone have any pointers on what I could look into?
do I use Lambda? IAMs?
The setup and process is fully explained in the AWS blog post:
Uploading to Amazon S3 directly from a web or mobile application
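In short, a small Lambda (behind API Gateway or a function URL) generates a presigned PUT URL, and the browser uploads directly to S3 with it, so no credentials ever live in the client code. A rough sketch, assuming the AWS SDK for JavaScript v3 and a hypothetical bucket name and key scheme:

```typescript
// Lambda handler that returns a presigned PUT URL so the React app can upload
// straight to S3 without any credentials in the client code.
// Bucket name, key scheme, and expiry are assumptions for this sketch.
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const s3 = new S3Client({});

export const handler = async (event: {
  queryStringParameters?: { passcode?: string; fileName?: string };
}) => {
  const { passcode, fileName } = event.queryStringParameters ?? {};

  const command = new PutObjectCommand({
    Bucket: "my-upload-bucket",      // assumed bucket name
    Key: `${passcode}/${fileName}`,  // "folder" named after the passcode
  });

  // URL is valid for 5 minutes; the Lambda's IAM role needs s3:PutObject on the bucket.
  const uploadUrl = await getSignedUrl(s3, command, { expiresIn: 300 });

  return { statusCode: 200, body: JSON.stringify({ uploadUrl }) };
};

// On the React side the upload is then a plain PUT:
//   await fetch(uploadUrl, { method: "PUT", body: file });
```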
I have a REST API that supports a multi-user React app and has been running on an EC2 instance for a while, and I am moving toward uploading photos/PDFs that will be attached to specific users. I was able to accomplish this in testing using EFS, which I had provisioned on the same EC2 instance as the API, but I am looking to move to S3.
When I was testing with EFS, I sent everything through the REST API: the user would do a POST, the API would store the file in EFS along with metadata in my DB recording where the file was stored, and to retrieve the data the user would do a GET to the REST API and the server would fetch the file from EFS based on the metadata in the DB.
I am wondering what the usual use case is for S3. Do I still have to send everything through my REST API to be sure that users only have access to the PDFs/images they are supposed to see, or is there a way to verify their identity and request the resources from S3 directly on the front end, with my API just returning a list of S3 URLs for the files?
The particular use case I have in mind is letting users upload profile pictures, so that when a user searches for another user by name, the profile pictures of all users returned by the query can be displayed in the list of results.
As far as I know, there is no single "normal" way to deal with this particular situation; any of the options below could make sense depending on your needs.
Here are some options:
Option 1
It's possible to safely allow users to access resources directly from S3 by using AWS STS to generate temporary credentials that your users can use to call the S3 APIs (a sketch follows after these options).
Option 2
If you're happy for the pics to be public, you could configure the bucket as a static website and simply use those public URLs in your web application.
Option 3
Use CloudFront to serve private content from your S3 buckets.
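Here is a rough sketch of option 1, assuming the AWS SDK for JavaScript v3 and a role that your API is allowed to assume; the role ARN, bucket name, and per-user prefix layout are assumptions for illustration:

```typescript
// API-side handler: after authenticating the user, hand back short-lived STS
// credentials scoped (via an inline session policy) to that user's prefix in the bucket.
// Role ARN, bucket name, and prefix layout are assumptions for this sketch.
import { STSClient, AssumeRoleCommand } from "@aws-sdk/client-sts";

const sts = new STSClient({});

export async function getScopedCredentials(userId: string) {
  const { Credentials } = await sts.send(
    new AssumeRoleCommand({
      RoleArn: "arn:aws:iam::123456789012:role/user-media-access", // assumed role
      RoleSessionName: `user-${userId}`,
      DurationSeconds: 900, // 15 minutes
      Policy: JSON.stringify({
        Version: "2012-10-17",
        Statement: [
          {
            Effect: "Allow",
            Action: ["s3:GetObject", "s3:PutObject"],
            Resource: `arn:aws:s3:::my-user-media/${userId}/*`,
          },
        ],
      }),
    })
  );

  // The front end can construct an S3 client with these temporary credentials.
  return Credentials; // { AccessKeyId, SecretAccessKey, SessionToken, Expiration }
}
```

The session policy means the temporary credentials can only reach that user's own prefix, so the front end can talk to S3 directly without your API proxying every file.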
I am building a web application where users can upload images and videos and store them in their account. I want to store these files somewhere and save only the URL in the DB.
What is the right way to do it using Azure services? Is there a dedicated server for this, or some VM?
Yes, there is a dedicated service for this purpose: Azure Blob Storage. You are highly advised to save any and all user-uploaded content to that service instead of to the local file system.
The provided link has samples for almost every language that has a client SDK provided by Microsoft.
If, in the end, you use a platform or language that is not directly supported by an SDK, you can always refer to the Blob Storage REST API documentation.
You will need to go through the Blob service concepts to get a deeper understanding of the service and how to use it.
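For example, a minimal server-side upload with the `@azure/storage-blob` SDK could look like the sketch below (connection string, container name, and blob naming scheme are assumptions); the returned URL is what you would save in your DB:

```typescript
// Upload a user file to Azure Blob Storage and return the blob URL to store in the DB.
// Connection string, container name, and naming scheme are assumptions for this sketch.
import { BlobServiceClient } from "@azure/storage-blob";
import { randomUUID } from "crypto";

const blobService = BlobServiceClient.fromConnectionString(
  process.env.AZURE_STORAGE_CONNECTION_STRING!
);

export async function saveUserUpload(
  userId: string,
  fileName: string,
  data: Buffer,
  contentType: string
): Promise<string> {
  const container = blobService.getContainerClient("user-uploads");
  await container.createIfNotExists(); // private container; serve later via SAS or your API

  const blob = container.getBlockBlobClient(`${userId}/${randomUUID()}-${fileName}`);
  await blob.uploadData(data, {
    blobHTTPHeaders: { blobContentType: contentType },
  });

  // Persist this URL in your database alongside the user's record.
  return blob.url;
}
```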