What's the best way to control S3?

I am using:
React as the front end,
.NET Core as the back end.
I have an S3 bucket.
Purpose: upload files to S3.
I want to allow users to upload directly from the browser (via the AWS SDK or an HTTP POST).
What I want to achieve:
Because uploads are authorized by the S3 access key, I don't want to give that key to any single user; otherwise one user could upload and read other users' files.
I also don't want to pass the file to my server and then upload it from the server.
What's the best way for me to control this? Getting a unique key from my backend server for a particular user?
Or is there any link/training you can suggest?

You can generate an S3 presigned POST URL on the backend using the secret access key of an appropriate IAM user (let's call it User A). Return this presigned POST URL to the client; the client can then use it to upload files to the S3 bucket on behalf of User A. Here is the documentation, which describes how to POST an object to S3 in detail.

Related

Sending files to Kloudless saver from clientside

I'm currently using a Dropbox client JS script to push zip files to a folder (in test, a couple of KB; in production, a couple of hundred MB). There currently isn't a server/back end, so I am posting from an ArrayBuffer, not a server URL.
var zip = new JSZip();
zip.file("test.txt", "Hello World\n");
var content = zip.generate({type:"arraybuffer"});
// ... code to pick a dropbox folder ...//
client.writeFile(url+"/"+fileName, content, function(error){ ... etc
This all works fine: the client is able to write the binary file (which Dropbox's own Saver is unfortunately unable to do). I'm trying to see if Kloudless can do the same, since I also need to support Google, Box, etc. at some point. The documentation for https://github.com/kloudless/file-explorer/ says the saver's files option is an array of URLs:
explorer({
  ...
  files: [{
    "url": "http://<your image url>",
    "name": "filename.extension"
  },
It doesn't seem to like local file references created with URL.createObjectURL(blob), so I'm guessing the API tells the remote services to pull the files rather than having the client push their data.
You are correct that the Kloudless API backend servers stream the file from the URL to the final destination in whichever cloud service you would like the file to be uploaded to (e.g. a folder in a Dropbox account).
If the files are present only on the client side, I would recommend using the Kloudless Chooser instead, to prompt the user to choose a folder to save the files in, and then manually handle uploading the file data to that destination from the client side.
To do this, refer to this example configuration: https://jsfiddle.net/PB565/139/embedded/
I have set retrieve_tokens to true so that my client-side JavaScript receives not just the metadata of the folder the user chooses to upload the data to, but also the Bearer token to use to gain access to the user's account. This allows the client-side JavaScript to then make upload or multipart upload requests to Kloudless to upload the file data to that folder. The advantage of multipart uploads is that an error uploading one chunk doesn't require retrying the whole upload.
Be sure to add the domain you are hosting the File Explorer on to your Kloudless App's Trusted Domains (on the App Details page) so that it can in fact receive the Bearer token in the response JS callback. In my JSFiddle example, I would have to add 'fiddle.jshell.net' to my app's list of Trusted Domains to be able to receive the Bearer token to perform further requests from the client side to the Kloudless API.

GAE Calling Servlet with user authenticated through gapi.auth.authorize

I have a Google Cloud Endpoint which I access from an HTML page through JavaScript and the Google JavaScript client Library.
I authenticate with OAuth2.0 by using the standard
gapi.auth.authorize({client_id: CLIENT_ID, scope: SCOPES, immediate: mode}, callback);
Everything works correctly and I am able to read/write data from/to the underlying Datastore.
In the same AppEngine project I have a servlet that generates a PDF based on data that is in the Datastore.
I would like to be able to call this servlet from my HTML page as the same user that was authenticated through the gapi.auth.authorize() method.
And in the servlet, get the User through
UserService userService = UserServiceFactory.getUserService();
and query the datastore for the data of this user and then generate a PDF showing this data.
I have no idea how to call this URL (servlet) with the credentials of the OAuth-authenticated user.
Can you help me, please?
Thanks in advance!
Note that the same question was asked some months ago but without a "complete" answer: GAE User API with OAuth2
You should look into bucket/object ACLs. When your API endpoint gets the User object, it can use the user's email to set the ACL on the generated PDF; that way, you can serve the PDF file to the user simply via its URL. You could also check with an Endpoints API call whether the user is indeed authenticated as the person allowed to access the requested PDF (having stored a parallel Datastore entry, perhaps), and generate a signed URL once this is confirmed.
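The "check, then sign" variant can be sketched as plain logic. Everything here is hypothetical naming: pdfRecord stands in for your Datastore entity, and the injected signUrl callback stands in for a real Cloud Storage signed-URL helper.

```javascript
// Sketch: confirm the authenticated user owns the requested PDF
// (per a stored record), then hand back a signed URL for it.
function authorizePdf(userEmail, pdfRecord, signUrl) {
  // Deny anonymous users and anyone who isn't the recorded owner.
  if (!userEmail || pdfRecord.ownerEmail !== userEmail) {
    return { allowed: false };
  }
  // Owner confirmed: delegate to the signed-URL helper.
  return { allowed: true, url: signUrl(pdfRecord.objectName) };
}
```

The servlet (or Endpoints method) would run this check before returning anything, so the PDF's real location is never exposed to a user who fails it.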

How to authenticate requests to images in an angularJS app using an access_token

I have an angularJS application whose authentication system is based on an access_token, communicating with a backend server written in Go.
To authenticate the user, I have to add the access_token to each request, either as ?access_token=SomeTokenHere or via a header Access-Token: SomeTokenHere.
It works great.
The issue is that my backend serves protected images, and I cannot put the access token in the image src for security reasons (if the user copied the link to someone else, the access_token would be given away too ...).
What I'd like to do is inject a header containing my access token, but since the request is made by the browser, that isn't possible.
This could probably be done with a cookie, but I'd like to avoid cookies, especially because the Angular app and the backend are not on the same sub-domain.
Another idea would be to create a directive that fetches the raw data and sets it on the image. First, I don't know if this is possible, and is it a good idea for hundreds of images on the page?
If you have any ideas, I'd be glad to hear from you!
Thanks
This is a typical problem, and fortunately it has already been solved, for example by Amazon S3.
The idea is simple: instead of using the secret key directly, you generate a signature with an access policy, using that key.
There is a good algorithm designed especially for generating signatures for this purpose: HMAC.
You can keep one secret key on your server and use it to sign every image URL before it is sent to the client; you can also add extra policies, like a URL expiration time...

get value of access token

I started using the Google API recently. I am using simpleauth (https://github.com/crhym3/simpleauth) for authentication on Google App Engine. Now I am using the Google Blogger API for publishing my blog and fetching data.
This API requires an access_token value for authorization (https://developers.google.com/blogger/docs/3.0/using#RetrievingPostsForABlog). I can't find a way to get the value of the access token.
Is there a way to get the value of the access token, or am I doing something wrong?
You need to register your web app with Google to get a client ID and client secret. Then you can configure your OAuth2 library with these details to allow you to send fully authenticated requests from your web app to Blogger.
For the specific scenario you listed, retrieving a blog post, I think you can follow step 1 of this page and then follow these steps. You should be able to copy and paste the key from there into the query params of the GET request.
To issue fully authenticated requests, for publishing new posts, for example, you'll have to configure your OAuth2 library with the client ID and client secret and have it issue the requests for you.
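Once your OAuth2 library hands you the token, using it is just a matter of attaching it to the request. A tiny sketch: the blog ID, post ID, and token values are placeholders, while the endpoint path follows the Blogger v3 URL shape.

```javascript
// Sketch: shape of an authenticated Blogger v3 GET request.
// The access token goes in the Authorization header, not the URL.
function buildBloggerRequest(blogId, postId, accessToken) {
  return {
    url: `https://www.googleapis.com/blogger/v3/blogs/${blogId}/posts/${postId}`,
    headers: { Authorization: `Bearer ${accessToken}` },
  };
}
```

Your HTTP client then issues a GET with that URL and header; the same header works for authenticated POSTs when publishing.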

Redirect GET request and include an authentication token

GAE = Google App Engine
GCS = Google Cloud Storage
My GAE application receives GET requests for files that are actually stored on a bucket of GCS. I would like to redirect those requests to their real location and include an auth token in the redirected request so that GCS accepts to serve them.
To issue a redirection, GAE exposes webapp2.RequestHandler.redirect, which does not let me add any header to the redirected request.
Is it possible to redirect the GET request and include an auth token in it?
An HTTP redirect is just a response with a 3XX status code. You can't forward a header or request body to the new location.
That said, you will want to implement some logic on the client. Your client has to issue one request to your GAE application, process the response, and then issue one more request to GCS with all the headers or body that you want to supply (the auth token, in your case).
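That two-step flow can be sketched as a pair of plain functions. The JSON shape and the Bearer scheme here are assumptions for illustration, not a GAE or GCS API: the handler returns the real location plus a token instead of a 302, and the client builds the follow-up request itself.

```javascript
// Sketch: what the GAE handler would return instead of a redirect.
function buildLocationResponse(bucket, objectName, token) {
  return {
    status: 200,
    body: {
      url: `https://storage.googleapis.com/${bucket}/${objectName}`,
      token,
    },
  };
}

// Sketch: what the client sends to GCS after reading that response.
function buildFollowUpRequest(locationResponse) {
  const { url, token } = locationResponse.body;
  return { url, headers: { Authorization: `Bearer ${token}` } };
}
```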
I updated another thread with this as well, but just in case you didn't see it.
In the upcoming 1.6.4 release of App Engine, we've added the ability to pass a Google Storage object name to blobstore.send_blob() to send Google Storage files of any size from your App Engine application. We create the correct token for your application to access the objects in the Google Storage bucket.
Here is the pre-release announcement for 1.6.4.
