How to protect some files/objects in public bucket? - reactjs

I would like to create a publicly readable AWS S3 bucket in which some files are read-restricted to an IAM role.
First of all:
I am using the Amplify CLI to deploy my "static" website.
The website is a React app.
The app has public pages/React components and an admin area.
I would like to restrict the admin area/admin pages/admin React components to an AWS IAM role.
More details:
The React app is quite big, so I split the components using code splitting, e.g. const Dashboard = asyncComponent(() => import('./pages/Dashboard')).
So when I build the app, instead of one big file I get several small files, and all of these files live in the same bucket.
Now I want to build the admin pages. Still using asyncComponent, I get a collection of "admin" files, also hosted in the same bucket. But for security reasons I want to restrict access to authenticated users with a certain IAM role (e.g. AdminRole).
I have gone through a lot of documentation, from the Amplify config to AWS::S3::Bucket in CloudFormation, and I saw various things suggesting it is possible, but I am quite lost.
So finally I ask:
How can I restrict read access to some files/objects in an S3 bucket to an IAM role?
And how can I "tag" the admin components in the React app, or via Amplify? Maybe by matching files with a regex, or by putting them in a specific folder, in order to apply this read restriction?
Thank you in advance for your reply.

Content in Amazon S3 is private by default.
Therefore, anything you are happy for everyone in the world to view can be made publicly accessible via a Bucket Policy (whole bucket or part of a bucket) or via Access Control Lists (ACLs) on the objects themselves.
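For instance, a bucket policy can make just one prefix public while the rest of the bucket stays private. A minimal sketch, built in Python so it can be applied with boto3 (the bucket name `my-site-bucket` and the `public/` prefix are hypothetical):

```python
import json

# Hypothetical bucket name -- adjust to your setup.
BUCKET = "my-site-bucket"

# Grant anonymous read access only to objects under the public/ prefix;
# everything else in the bucket stays private by default.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadForPublicPrefix",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/public/*",
        }
    ],
}

policy_json = json.dumps(policy)
print(policy_json)

# To apply it (requires boto3 and real credentials):
#   import boto3
#   boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=policy_json)
```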
To serve content that should be restricted to specific users, take advantage of Pre-Signed URLs. These are time-limited URLs that provide temporary access to private objects in Amazon S3. They are easy to generate (no API calls required).
The way it would work is:
Users would authenticate with your application
When they wish to access restricted content, the application would determine whether they are permitted access
If they are permitted access, the application would generate a pre-signed URL. These can also be used in <a> and <img> tags to refer to pages and images.
Users will receive/view the content just like normal web content
Once the expiry time has passed, the pre-signed URLs will no longer work
See: Share an Object with Others - Amazon Simple Storage Service
(I'm not an Amplify person, so I can't speak to how Amplify would specifically generate/use pre-signed URLs.)

Related

Serving a JSON file from an AWS Amplify React Project

We have an AWS Amplify React project associated with our domain, which means all files and content are served through the underlying React Router.
In order to support backend API communications with Microsoft APIs, we need to host a specific JSON file at a particular location within our domain, such as mydomain.com/.well-known/microsoft-identity-association.json.
I am unsure how to do this. My first question is whether this is best accomplished via static routes within React Router or, instead, by configuring CloudFront and Route 53 to serve the JSON file for this exact URL.
I have been trying the second approach and have created a CloudFront distribution for a specific S3 bucket storing the JSON file. I have named the S3 bucket "mydomain", with a subfolder ".well-known" containing a JSON file named "microsoft-identity-association.json". My problem is that I do not know how to configure Route 53 to route to this distribution, as my root domain (mydomain.com) is associated with my Amplify project and handled by React Router. I'm not sure if I can somehow configure a specific route or alias to serve up the exact JSON file.
I have reviewed this post (How do I return a json file from s3 to a specific url, but only that url) but it seems to be addressing a slightly different problem.
Any and all guidance appreciated.
Addressed this issue by splitting my site: I used a static S3-hosted site for the public pages (including the JSON file) and moved the React app to a subdomain.

How can I upload files to my S3 bucket from my React app without a backend?

I am creating a web app using React where the user can upload files to a folder in my S3 bucket. This folder will have a unique passcode name. The user (or someone else) can use this passcode to retrieve these files from the S3 folder. So basically there is no login/authentication system.
My current issue is how do I safely allow read/write access to my S3 bucket? Almost every tutorial stores the access keys in the client code, which I read is very bad practice, but I also don't want to create a backend for something this simple. Someone suggested pre-signed URLs, but I have no idea how to set those up (do I use Lambda? IAM?). I'm really new to AWS (and web dev in general). Does anyone have any pointers on what I could look into?
do I use Lambda? IAMs?
The setup and process are fully explained in this AWS blog post:
Uploading to Amazon S3 directly from a web or mobile application

Amazon S3 for Media Storage for Restful API

I have a REST API that supports a multi-user React app and has been running on an EC2 instance for a while. I am now moving towards uploading photos/PDFs that will be attached to specific users. In testing I was able to accomplish this using EFS provisioned on the same EC2 instance as the API, but I am looking to move to S3.
When I was testing with EFS, I would send everything through the REST API: the user would do a POST and the API would store the file in EFS, along with metadata in my DB recording where the file was stored. To retrieve the data, the user would do a GET to the REST API and the server would fetch the file from EFS based on the metadata in the DB.
I am wondering what the usual use case is for S3. Do I still have to send everything through my REST API to be sure that users only have access to the PDFs/images they are supposed to see? Or is there a way to verify their identity and request the resources directly from S3 on the frontend, with my API just returning a list of S3 URLs for the files?
The particular use case I have in mind: users can upload profile pictures, and when a user searches for another user by name, the profile pictures of all users returned by the query are displayed in the list of results.
As far as I know, there is no "normal" way to deal with this particular situation - either could make sense depending on your needs.
Here are some options:
Option 1
It's possible to safely allow users to access resources directly from S3 by using AWS STS to generate temporary credentials that your users can utilise to access the S3 APIs.
Option 2
If you're happy for the pics to be public, you could configure the bucket as a static website and simply use those public URLs in your web application.
Option 3
Use CloudFront to serve private content from your S3 buckets.

Securing S3 bucket for users?

I have developed a react native app, with AWS Amplify to support the backend (DynamoDB, S3). All users of the app have to use Auth.signIn() to sign in and are part of a user pool.
Once in, they can start to upload videos to S3 via the app or view videos in the app that are in the S3 bucket that is PUBLIC.
I use the path to the S3 video (https://myS3bucket....) as the source URL of the video. However, the videos are only visible in my app when the bucket is public; with any other setting (protected/private), no video is visible. How can I make this more secure?
S3 Buckets have 3 methods of managing security:
IAM: Any user or role within the same AWS account as the bucket can be granted permissions to interact with the S3 Bucket and its objects.
S3 Bucket Policies: Grant bucket wide (or prefix) access to S3 buckets.
S3 ACLs - Per object level permissions.
It's generally advised to avoid S3 ACLs these days, as S3 bucket policies have superseded most of their functionality. Only use them if you need a specific object to have a different set of permissions.
I suggest not making the files or the bucket public if you want authenticated users to upload and/or download files. Instead, use S3 pre-signed URLs to give users access: the backend authenticates the user, generates a signed URL, and the React Native app then consumes that URL as, e.g., a video source.
You will need to change a few things, but this guide should cover that.
I have recently published an article which describes in detail the security best practices, which help address the following points:
How to secure an S3 buckets, which store sensitive user data and the application code.
How to securely configure a CloudFront distribution.
How to protect frontend apps against common OWASP threats with CloudFront Functions.
To learn more have a look at the article.

Accessing Cloud Storage Objects via App Engine without making bucket publicly readable

I'd like to be able to access the files in a Cloud Storage Bucket from my App Engine App without making the objects or the bucket itself Publicly Readable. While I'm aware of a bunch of options out there that allow access to bucket objects with authentication (client libraries, signed urls, etc.), the complicating factor is that I'd like to be able to access the files with path that is similar to the folder structure of the bucket in question.
For example, if I make my bucket publicly readable, I can access objects with the public link: https://storage.googleapis.com/MY_BUCKET/FOLDER_IN_MY_BUCKET/FILE_IN_FOLDER.txt. This URL mimics the internal folder structure of the bucket. However, there doesn't appear to be a comparable URL if the bucket is not publicly readable. My App Engine app's service account has been added as a Storage Admin for the bucket I need, but I'm not sure there is a URL I can use to access the bucket's objects. An object's mediaLink won't work because generation information is appended to the end, and its selfLink results in a 404 error.
The need for a URL like this is because the bucket contains several thousand objects. Downloading them with a client library to App Engine's persistent storage would defeat the purpose of using Cloud Storage in my case. Obtaining signed URLs for all of them when a request is made would be time-consuming, and then I'd have to manage thousands of signed URLs somewhere.
Is there a way to read from the Cloud Storage bucket with a predictable URL, like the public one, while still authenticating the request?
Rather than trying to vend thousands of signed URLs in the response, you can create a 'redirect' endpoint in your App Engine app.
e.g. the user does a 'GET' against www.myapp.com/fetch/<bucket>/<object>
Your App Engine code handling this endpoint authorizes the user to make sure they should have access, pulls the bucket/object out of the URL, then generates a signed URL granting access to the resource, and returns a 302 redirect to that URL.
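A sketch of that redirect endpoint, with the authorization check and the signer stubbed out. In a real App Engine app the signer would be the `google-cloud-storage` library's `Blob.generate_signed_url`, and `user_may_read` would be your own access logic; both names here are placeholders:

```python
from datetime import timedelta

def user_may_read(user: str, bucket: str, obj: str) -> bool:
    """Stub authorization check -- replace with your real logic."""
    return user == "authorized-user"

def sign_url(bucket: str, obj: str, ttl: timedelta) -> str:
    """Stub signer. With google-cloud-storage this would be roughly:
        client.bucket(bucket).blob(obj).generate_signed_url(expiration=ttl)
    Here we just fabricate a URL of the same shape."""
    return f"https://storage.googleapis.com/{bucket}/{obj}?X-Goog-Signature=stub"

def handle_fetch(user: str, bucket: str, obj: str):
    """GET /fetch/<bucket>/<object>: authorize, then 302 to a signed URL."""
    if not user_may_read(user, bucket, obj):
        return 403, None  # forbidden: no URL is minted at all
    return 302, sign_url(bucket, obj, timedelta(minutes=5))

status, location = handle_fetch(
    "authorized-user", "MY_BUCKET", "FOLDER_IN_MY_BUCKET/FILE_IN_FOLDER.txt"
)
```

Because the signed URL is minted per request and per object, nothing needs to be stored or managed ahead of time.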
The URL you mention can be accessed without making the bucket or file public given that your browser is authenticated with an account having access to those resources:
https://storage.cloud.google.com/MY_BUCKET/FOLDER_IN_MY_BUCKET/FILE_IN_FOLDER
Regarding the access to the file through a different application (for example App Engine), you can always use the client libraries for your preferred language. You can test how the API works in the documentation, just by defining the bucket parameter as MY_BUCKET and the object parameter as FOLDER_IN_MY_BUCKET/FILE_IN_FOLDER. You should use this same structure when applying it to the client library of your choice.