I'm writing an app where users can write Notes, and each note can have many files attached to it.
I would like users to be able to click 'Browse', select multiple files, which will be uploaded when the user clicks 'Save Note'.
I want these files to be uploaded directly into Amazon S3 (or some other cloud storage solution?) without going through my server, so I don't have to worry about uploads blocking my server.
What is the best way to accomplish this?
I have seen many examples of uploading directly to Amazon S3, but none of them support multiple files. Will I somehow have to do this all in JavaScript, by looping through a collection of files selected with the Browse button?
Thanks!
Technically, the JavaScript running in the browser could make RESTful HTTP calls to AWS and store data in S3, but then you would be exposing the security credentials for AWS in the script... not good.
I guess the only way is to process it through a web server that can securely access AWS and store the notes. Or you could just write those notes to a local disk (where the web server sits) and schedule a tool like s3cmd to automatically sync them with an S3 bucket.
I am creating a web app using React where the user can upload files to a folder in my S3 bucket. This folder will have a unique passcode name. The user (or someone else) can use this passcode to retrieve these files from the S3 folder. So basically there is no login/authentication system.
My current issue is how to safely allow read/write access to my S3 bucket. Almost every tutorial stores the access keys in the client code, which I have read is very bad practice, but I also don't want to create a backend for something this simple. Someone suggested presigned URLs, but I have no idea how to set that up (do I use Lambda? IAM?). I'm really new to AWS (and web dev in general). Does anyone have any pointers on what I could look into?
do I use Lambda? IAM?
The setup and process are fully explained in this AWS blog post:
Uploading to Amazon S3 directly from a web or mobile application
I have a REST API that supports a multi-user React app, which I have been running on an EC2 instance for a while, and I am now moving toward uploading photos/PDFs that will be attached to specific users. I was able to accomplish this in testing using EFS, which I had provisioned to the same EC2 instance as the API, but I am looking to move to S3.
When I was testing with EFS, I sent everything through the REST API: the user would POST a file, and the API would store it in EFS along with metadata in my DB recording where the file was stored. To retrieve the data, the user would GET from the REST API, and the server would fetch the file from EFS based on the metadata in the DB.
I am wondering what the usual use case for S3 is. Do I still have to send everything through my REST API to be sure that users only have access to the PDFs/images they are supposed to see, or is there a way to verify their identity and request the resources from S3 directly on the front end, with my API just returning a list of S3 URLs for the files?
The particular use case I have in mind is letting users upload profile pictures, so that when a user searches for another user by name, the profile pictures of all users returned by the query can be displayed in the list of results.
As far as I know, there is no single "normal" way to deal with this particular situation; any of the following could make sense depending on your needs.
Here are some options:
Option 1
It's possible to safely allow users to access resources directly from S3 by using AWS STS to generate temporary credentials that your users can utilise to access the S3 APIs.
Option 2
If you're happy for the pics to be public, you could configure the bucket as a static website and simply use those public URLs in your web application.
Option 3
Use CloudFront to serve private content from your S3 buckets.
I have Amazon S3 as my file storage server and EC2 instances as my application logic server. Now my app needs to upload some files which need necessary processing. I can think of two ways to do this:
Upload the file directly from mobile devices, get the file name and location (URL), then send the URL to my backend. The backend gets the file by URL and does its job.
Send the file to the backend using a multipart form; the backend accepts the file, does its job, and finally saves the file to Amazon S3.
Which is the standard way? What are the reasons?
Sending the object directly to Amazon S3 will be more scalable, less error-prone and cheaper (since you need less web server capacity to handle the uploads). Send the corresponding information to an Amazon Simple Queue Service (SQS) queue that the back-end service can monitor and process. That way, if your back-end is ever offline, the jobs will simply queue up and will get processed when the server is running again. A good use of loose coupling.
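The queue message itself can be tiny: just enough for the worker to locate the object. A sketch in Python (the field names are hypothetical; actually sending requires boto3, real credentials, and a queue URL, so that part is only shown in a comment):

```python
import json

def build_upload_message(bucket, key, user_id):
    """Body of the SQS message telling the worker which object to process."""
    return json.dumps({"bucket": bucket, "key": key, "user_id": user_id})

body = build_upload_message("my-example-bucket", "uploads/photo.jpg", "user-42")
print(body)

# sending requires real credentials and a queue URL, e.g.:
# import boto3
# boto3.client("sqs").send_message(QueueUrl=queue_url, MessageBody=body)
```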
A third option would be to send the file directly from your mobile to Amazon S3, using Metadata fields to identify originating user, and then configure the S3 bucket to trigger some code in AWS Lambda that can process the file. This could do all the processing, or could simply trigger a process on your web server. Again, this reduces load on the web server and would not require sending a message to trigger the processing.
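An S3-triggered Lambda along those lines might look like this sketch in Python (the processing step is a placeholder; note that S3 event notifications URL-encode object keys, so they must be decoded before use):

```python
import urllib.parse

def lambda_handler(event, context):
    """Handle an S3 'ObjectCreated' notification (processing is a placeholder)."""
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # S3 event notifications URL-encode object keys
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        # real work (resize an image, extract text, ...) would fetch the object here
        processed.append((bucket, key))
    return {"processed": processed}

# local smoke test with a minimal fake event
fake_event = {
    "Records": [
        {"s3": {"bucket": {"name": "my-example-bucket"},
                "object": {"key": "uploads/my+photo.jpg"}}}
    ]
}
print(lambda_handler(fake_event, None))
```

Metadata fields set at upload time (like the originating user) arrive on the object itself, so the handler can fetch them with a `head_object` call if the processing needs them.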
In my GWT application, a 'root' user uploads a specific text file with data, and that data should be available to anyone who has access to the app (using GAE).
What's the classic way to store data that should be available to all users? I don't want to use any database (Objectify!?) since this is a relatively small amount of information, and it is changed from time to time by root.
I was wondering if there is some static MAP at the 'engine level' (not the user's session) where this info can be stored (and if the server goes down, no biggie, root will upload it again).
Thanks
You have three primary options:
Add this file to your /war/ directory and deploy it with the app. This is what we typically do with all static files that rarely change (like .css files, images, etc.). This file will be available to all users, whether they are authenticated or not.
Add this file to your /war/WEB-INF/ directory and deploy it with the app. This file will be available to your server-side code, so you can read it on the server side and show it to a user. This way you can decide which users can see this file and which users should not have access to it.
Upload this file to Google Cloud Storage. You can do it through an app, or you can simply upload it manually to a bucket using a GCS console or gsutil command-line tool. Then you simply provide a link to your users. The advantage of this option is that you do not have to redeploy your app when a file changes.
The only reason to go with the first two options is to have this file under version control. If you don't need that, I would recommend going with the GCS option.
Here is the basic concept of what I am trying to do. My web app allows my clients to log in to a dashboard.
One of the things I want to show on their dashboard is THEIR work files.. ie: PDF files.
I store these files in OneDrive, in a separate folder for each client:
Root Doc Directory
- Client A
  - File1.pdf
  - File2.pdf
- Client B
  - File1.pdf
etc.
So when Client A logs in, I want to show all the files in the Client A folder...
The concept sounds simple, and with storage on my own server I could do this easily, but I can't find how to do it using OneDrive...
Does anyone out there have any ideas? All the info I have found about the OneDrive APIs requires users to actually log in to OneDrive, which I don't want.
Basically, you're using OneDrive wrong. You should be asking each user of your service to sign in with their Microsoft Account and storing the files in that user's OneDrive. Storing them all in your OneDrive means they can't access those files outside of your app (such as by logging into OneDrive). Instead of using Microsoft Account as the security for those files, you're putting all of the security requirements on your own ability to protect access to your OneDrive account. Doing it the way you proposed is strongly not recommended.
You can pretty easily integrate OAuth into your website so that a user can connect your site to OneDrive and then have access to their files from OneDrive in your service.
The alternative would be to use something like Azure Blob Storage to store/retrieve these files. Then your app would just have the set of access keys required to access storage and you wouldn't have to deal with signing into a single OneDrive account from the service and keeping the access and refresh tokens up to date.