I have a REST API that supports a multi-user React app, and it has been running on an EC2 instance for a while. I am now adding uploads of photos/PDFs that will be attached to specific users. I was able to accomplish this in testing using EFS, which I had provisioned on the same EC2 instance as the API, but I am looking to move to S3.
When I was testing it out with EFS, I sent everything through the REST API: the user would do a POST, the API would store the file in EFS along with metadata in my DB recording where the file was stored, and then to retrieve the data the user would do a GET to the REST API and the server would fetch the file from EFS based on the metadata in the DB.
I am wondering what the usual approach is with S3. Do I still have to send everything through my REST API to be sure that users only have access to the PDFs/images they are supposed to see, or is there a way for me to verify their identity and request the resources from S3 directly on the front end, with my API just returning a list of S3 URLs for the files?
The particular use case I have in mind is letting users upload profile pictures, so that when a user searches for another user by name, the profile pictures of all the users returned by the query can be displayed in the list of results.
As far as I know, there is no single "normal" way to handle this situation - either approach could make sense depending on your needs.
Here are some options:
Option 1
It's possible to safely allow users to access resources directly from S3 by using AWS STS to generate temporary credentials that your users can utilise to access the S3 APIs.
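A minimal sketch of what that could look like server-side with boto3 - the bucket name and the per-user key prefix here are hypothetical, and in practice you'd build the policy from the authenticated user's ID:

```python
import json

import boto3

sts = boto3.client("sts")

# Inline policy scoping the temporary credentials to a single user's folder.
# "my-app-uploads" and the "users/42/" prefix are made-up examples.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:PutObject"],
        "Resource": "arn:aws:s3:::my-app-uploads/users/42/*",
    }],
}

resp = sts.get_federation_token(
    Name="app-user-42",         # shows up in CloudTrail for auditing
    Policy=json.dumps(policy),
    DurationSeconds=3600,       # credentials self-expire after an hour
)
creds = resp["Credentials"]     # AccessKeyId, SecretAccessKey, SessionToken
```

Your API returns those three values to the browser, which can then call S3 directly; anything outside the user's prefix is denied by the inline policy.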
Option 2
If you're happy for the pics to be public, you could configure the bucket as a static website and simply use those public URLs in your web application.
Option 3
Use CloudFront to serve private content from your S3 buckets.
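CloudFront restricts private content with signed URLs (or signed cookies). A rough sketch of signing a URL server-side with botocore and the cryptography package - the key-pair ID, key file, and distribution domain below are placeholders:

```python
from datetime import datetime, timedelta

from botocore.signers import CloudFrontSigner
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

def rsa_signer(message: bytes) -> bytes:
    # Sign with the private key matching the public key registered in CloudFront.
    with open("cloudfront_private_key.pem", "rb") as f:
        key = serialization.load_pem_private_key(f.read(), password=None)
    return key.sign(message, padding.PKCS1v15(), hashes.SHA1())

signer = CloudFrontSigner("KEY_PAIR_ID", rsa_signer)  # placeholder key-pair ID
url = signer.generate_presigned_url(
    "https://d1234example.cloudfront.net/users/42/avatar.jpg",  # placeholder
    date_less_than=datetime.utcnow() + timedelta(minutes=15),
)
```

Your API would hand out URLs like this instead of raw S3 URLs, and they stop working after the expiry time.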
Related
I have developed a Reactjs front end, FastAPI backend and MongoDB database. I want to upload images to a private Azure storage account and consume them on the site.
What I'm struggling to understand is the best practice for loading the images.
Should I:
Get a SAS key from my API, then go directly from Reactjs to the storage account URL?
Get the image from Azure Storage using the API and serve it to the Reactjs app
A better option?
Thanks in advance
The preferred way would be option 1 - get a SAS key from your API, then go directly from Reactjs to the storage account URL.
The advantage of this approach is that the browser requests the images directly from Azure Storage, so there is less load on your API to fetch and serve the images. You may need to configure CORS settings on your storage account if your React app is making an Ajax request to fetch the blob contents from Storage.
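A minimal sketch of that API endpoint's logic using the Python azure-storage-blob SDK (the account, container, and blob names are whatever yours are; a short expiry keeps the exposure window small):

```python
from datetime import datetime, timedelta

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

def make_read_url(account: str, key: str, container: str, blob: str) -> str:
    # Short-lived, read-only SAS token for a single blob.
    sas = generate_blob_sas(
        account_name=account,
        container_name=container,
        blob_name=blob,
        account_key=key,
        permission=BlobSasPermissions(read=True),
        expiry=datetime.utcnow() + timedelta(minutes=15),
    )
    return f"https://{account}.blob.core.windows.net/{container}/{blob}?{sas}"
```

Your FastAPI endpoint would authenticate the user, then return URLs built this way for the images they are allowed to see.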
Doing it this way, however, exposes your SAS URL to the client. If you do not want clients to know where the images are served from, go with option 2.
I have a React web application in which I allow users to upload DICOM files to the Google Healthcare API. In the current implementation, the files first get uploaded to my back-end server, which uploads them to the Healthcare API. I am allowing users to upload a full DICOM study (100MB - 2+GB), which could have anywhere from 1 to 500+ DICOM files (each usually 50KB-50MB). This approach has worked thus far, but as we expand it seems an inefficient use of my server.
My goal is to allow users to upload directly to a Google Cloud Storage bucket from the React app. I want to perform some validation logic before I export to the Google Healthcare API. I have looked into signed URLs, but since the files being uploaded are medical images I wasn't sure if they would be secure enough. The users don't necessarily have a Google account.
What is the best way to allow users to upload a directory directly to a GCS bucket without going through my server? Are there dangers with this approach if a user uploads a virus? Also, signed URLs are valid for a set amount of time - can I deactivate a signed URL as soon as the uploads are complete?
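For reference, the signed-URL flow I've been looking at is roughly this, using the google-cloud-storage Python SDK (the bucket and object names below are placeholders):

```python
from datetime import timedelta

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-dicom-staging")          # placeholder bucket
blob = bucket.blob("studies/123/image-0001.dcm")    # placeholder object name

# V4 signed URL the browser can PUT a single file to; it expires on its own.
upload_url = blob.generate_signed_url(
    version="v4",
    expiration=timedelta(minutes=15),
    method="PUT",
    content_type="application/dicom",
)
```

The backend would generate one of these per file in the study and return the batch to the React app.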
I am using this API to build an app (Xcode), and the maximum number of calls per day is 5000. The way I have currently built the app for testing purposes is to call the API every time the user refreshes the data, so I am running out of calls per day. I was wondering how to connect an API to a database like Firebase, then update the data in the database maybe 4 times a day at specific times. When users refresh, they would pull data from the database instead. I'm new to programming and am not sure if this is the best solution; I would appreciate it if anyone could direct me to more resources. Thanks!
This is the api I am using: https://projects.propublica.org/api-docs/congress-api/?
Edit: Would something like this also mean I would have to build a REST API? https://github.com/unitedstates/congress It is a repository that includes data-importing scripts and scrapers. I'm guessing this isn't compatible with Swift, but is it compatible with building a REST API on AWS or Firebase?
You can use AWS (Amazon Web Services). Their free tier includes many of their services for free (for 12 months, within usage limits), including the ones I would recommend for this project:
Make an AWS account.
Use S3 storage buckets to host a datafile.
Use API Gateway to make an API.
Use Lambda to run a Python/JavaScript function in the cloud that connects the API with the S3 bucket (your data).
Use IAM to create roles and permissions that let the S3 bucket, API, and Lambda scripts communicate.
Here's how you set up the API: https://www.youtube.com/watch?v=uFsaiEhr1zs
Here's how you read the S3 bucket: https://www.youtube.com/watch?v=6LvtSmJhVRE
You can also use these tools to set up an API that PUTs data to the S3 bucket and updates the data regularly.
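For the read path, a minimal sketch of the Lambda handler behind API Gateway might look like this (the bucket and key names are made up; your scheduled job would refresh latest.json a few times a day):

```python
import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # Fetch the pre-built data file from S3 and return it through API Gateway.
    obj = s3.get_object(Bucket="my-congress-data", Key="latest.json")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": obj["Body"].read().decode("utf-8"),
    }
```

Your Swift app then hits the API Gateway URL instead of the rate-limited upstream API.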
I'm writing an app where users can write Notes, and each note can have many files attached to it.
I would like users to be able to click 'Browse', select multiple files, which will be uploaded when the user clicks 'Save Note'.
I want these files to be uploaded directly into Amazon S3 (or some other cloud storage solution?) without going through my server, so I don't have to worry about uploads blocking my server.
What is the best way to accomplish this?
I have seen many examples of uploading directly to Amazon S3, but none of them supports multiple files. Will I have to do this all in JavaScript by looping through a collection of files selected with the Browse button?
Thanks!
Technically, your JavaScript running in the browser could make RESTful HTTP calls to AWS and store data in S3, but then you would be exposing the security credentials for connecting to AWS in the script - not good.
I guess the only way is to process it through a web server that can securely access AWS and store the notes - or you could just write those notes to a local disk (where the web server sits) and schedule a tool like s3cmd to automatically sync them with your S3 buckets.
I am building an iPhone app that stores user logon credentials in an AWS DynamoDB table. In another DynamoDB table I am storing the locations of files (stored in S3) for each user. What I don't understand is how to make this secure. If I use a Token Vending Machine that gives the application an ID with access to the user DynamoDB table, isn't it possible that any user could access the entire DB and add or delete whatever they want? They would also be able to access the entire S3 bucket with this setup. Any recommendations on how I could set this up securely and properly?
I am new to user DB management, and any links to helpful resources would be much appreciated.
Regarding S3 and permissions, you may find the answer to the following question useful:
Temporary Credentials Using AWS IAM
IAM permissions are more fine-grained than you might think. You can allow/disallow specific API calls, so for example you might only allow read operations. You can also allow access to a specific resource only. On S3 this means you can limit access to a specific file or folder, but DynamoDB policies can only be set at the table level.
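As a rough illustration of that contrast (all ARNs below are hypothetical), here is a policy document built in Python - the S3 statement can point at one user's folder, while the DynamoDB statement can only point at a whole table:

```python
import json

# Hypothetical ARNs: per-user S3 folder vs. a whole DynamoDB table.
policy = json.dumps({
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::my-app-files/users/42/*",
        },
        {
            "Effect": "Allow",
            "Action": ["dynamodb:GetItem", "dynamodb:Query"],
            "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/UserFiles",
        },
    ],
})
```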
Personally, I wouldn't allow users direct access to DynamoDB - I'd have a web service mediating access to that - although users uploading directly to S3 or downloading straight from S3 is a good thing (your web service can generally hand out pre-signed URLs for that).
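For example, a minimal sketch of handing out pre-signed URLs with boto3 (the bucket and key are placeholders; your web service would authorise the user before generating them):

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket/key layout with one folder per user.
download_url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-app-files", "Key": "users/42/report.pdf"},
    ExpiresIn=900,  # link is only valid for 15 minutes
)
upload_url = s3.generate_presigned_url(
    "put_object",
    Params={"Bucket": "my-app-files", "Key": "users/42/report.pdf"},
    ExpiresIn=900,
)
```

The app uses these URLs directly against S3, so uploads and downloads never pass through your web service, while the service still controls exactly which objects each user can touch.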