Salesforce uploading to S3

Our app directly uploads images to S3 from the client side, and on success we store the URL in a custom object in Salesforce. However, I have seen that a lot of people upload images to S3 indirectly through Salesforce, using the AWS API. I wonder whether my approach will pass the Salesforce security review.

1. It's about Salesforce limits.
If you upload files to AWS via Salesforce, you cannot upload files larger than 5 MB, since Salesforce accepts files only up to 5 MB. By uploading directly to S3 you no longer need to worry about this limit (a client-side sketch of this flow follows after the list).
2. It's easy.
Uploading files to AWS is straightforward, since Amazon provides the required APIs. You are probably already aware of the difficulties of doing the same inside the Salesforce environment.
3. It's more secure.
S3 provides a full range of security features. Routing files through Salesforce makes the process slower and less secure.
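For reference, here is a minimal browser-side sketch (TypeScript) of the flow the question describes: the image goes straight from the client to S3 via a presigned URL, and only the resulting object URL is handed to Salesforce to store on the custom object. The /presign endpoint and the field names are illustrative assumptions, not the poster's actual implementation.

```typescript
// Hypothetical sketch: upload an image directly to S3 from the browser, then
// return the object URL so the caller can store it on the Salesforce custom object.
async function uploadImageAndGetUrl(file: File): Promise<string> {
  // 1. Ask a backend you control for a short-lived presigned PUT URL (assumed endpoint).
  const res = await fetch(
    `/presign?name=${encodeURIComponent(file.name)}&type=${encodeURIComponent(file.type)}`
  );
  const { uploadUrl, objectUrl } = await res.json();

  // 2. Send the bytes straight to S3 -- Salesforce never handles the file,
  //    so its request-size limits do not apply.
  const put = await fetch(uploadUrl, {
    method: "PUT",
    headers: { "Content-Type": file.type }, // must match the content type the URL was signed for
    body: file,
  });
  if (!put.ok) throw new Error(`S3 upload failed: ${put.status}`);

  // 3. The caller stores objectUrl on the custom object (e.g. via a platform API call).
  return objectUrl;
}
```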

Related

How do you delete an S3 object from a bucket in Next.js?

It is my first time using AWS S3. I am trying to store images (jpg/png etc.) on S3 and have the URL stored in a database. The application essentially logs the user's session and the S3 URL when they submit a picture (profile picture); when they want to update the profile picture, the original picture stored in S3 should be deleted and replaced with a new one.
Currently I am using next-s3-upload (https://next-s3-upload.codingvalue.com/setup), which successfully uploads images to S3 but does not delete them. The tech stack is Next.js, Prisma, and PlanetScale, to be hosted on Vercel.
Any tips on how to perform CRUD operations against S3 would be appreciated; I have been reading up on the AWS SDK but have had no joy with React/Next.js.
Read the AWS SDK for JavaScript Developer Guide. This will teach you how to perform AWS operations using the AWS SDK for JavaScript, including many Amazon S3 operations.
What's the AWS SDK for JavaScript?
This SDK can be used with React too.
Getting started in React Native
For Amazon S3 operations, see:
Amazon S3 examples using SDK for JavaScript V3
To delete an object, see this topic, which shows the code example:
https://docs.aws.amazon.com/sdk-for-javascript/v3/developer-guide/javascript_s3_code_examples.html#w4aac23b9c25c13
You can also find Amazon S3 code examples on GitHub here:
https://github.com/awsdocs/aws-doc-sdk-examples/tree/main/javascriptv3/example_code/s3/src
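For the delete step in particular, a minimal sketch of a Next.js API route using the AWS SDK for JavaScript v3 might look like the following. The route path, request shape, and environment-variable names are assumptions; only the S3Client/DeleteObjectCommand usage comes from the SDK.

```typescript
// pages/api/delete-image.ts (hypothetical route name)
// Deletes a single S3 object by key with the AWS SDK for JavaScript v3.
import type { NextApiRequest, NextApiResponse } from "next";
import { S3Client, DeleteObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: process.env.AWS_REGION }); // assumed env var

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.method !== "DELETE") {
    return res.status(405).json({ error: "Method not allowed" });
  }

  // The key of the old profile picture, as stored in your database.
  const { key } = req.body as { key: string };

  try {
    await s3.send(
      new DeleteObjectCommand({
        Bucket: process.env.S3_BUCKET_NAME, // assumed env var
        Key: key,
      })
    );
    return res.status(200).json({ deleted: key });
  } catch (err) {
    return res.status(500).json({ error: (err as Error).message });
  }
}
```

Call a route like this when the user replaces their profile picture, then update the stored URL in the database.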

Uploading a large file to Google Cloud Storage directly from App Engine

I'm trying to build a system where a user selects a large dataset from their Dropbox, and this data is downloaded to a Google Cloud Storage bucket.
The problem is that my backend code runs on App Engine, so I cannot download the large file to disk before uploading it to the bucket.
Is there a way to programmatically tell Cloud Storage to retrieve data from a URL?
Or is there another way to download this data on an App Engine instance and upload it from there?
You can't directly tell GCS to download a file from the internet and save it in a bucket.
On the other hand, moving a large collection of objects is the business of Google's Storage Transfer Service. It may suit your needs, depending on what you mean by "a large dataset": https://cloud.google.com/storage-transfer/docs/overview
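If the transfer service doesn't fit, you can also stream the remote file straight into the bucket from your own backend without ever writing it to local disk. The question doesn't say which runtime the App Engine backend uses, so this is only a rough sketch with the Node.js @google-cloud/storage client; the bucket and object names are placeholders.

```typescript
// Stream a file from a URL directly into a GCS object, chunk by chunk, with no local disk use.
import https from "node:https";
import { pipeline } from "node:stream/promises";
import { Storage } from "@google-cloud/storage";

const storage = new Storage();

async function copyUrlToBucket(sourceUrl: string, bucketName: string, destName: string): Promise<void> {
  const gcsWriteStream = storage
    .bucket(bucketName)
    .file(destName)
    .createWriteStream({ resumable: true }); // resumable uploads are safer for large files

  await new Promise<void>((resolve, reject) => {
    https
      .get(sourceUrl, (res) => {
        if (res.statusCode !== 200) {
          reject(new Error(`Download failed with status ${res.statusCode}`));
          return;
        }
        // Pipe the HTTP response straight into the GCS write stream.
        pipeline(res, gcsWriteStream).then(resolve, reject);
      })
      .on("error", reject);
  });
}

// Example (placeholder names):
// await copyUrlToBucket("https://example.com/dataset.zip", "my-bucket", "datasets/dataset.zip");
```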

Upload an image to a Google Cloud Storage bucket using jQuery-File-Upload

I've been able to get jQuery-File-Upload from https://github.com/blueimp/jQuery-File-Upload (Python version) to allow my website users to upload images to a Google App Engine app; however, it saves the images (for a limited time) as a blob.
It would be preferable to have the images saved in a Google Cloud Storage bucket, but I'm unable to figure out how to get the files saved in a bucket rather than as a blob.
Can anyone recommend a similar way for my website visitors to upload images to my bucket, or a way to configure jQuery-File-Upload to do so?
I created an App Engine Python signed URL example here:
There is a Cloud Storage API for uploading files directly to a bucket via JavaScript; however, implementing security can make things a bit sticky.
In general, it should be fairly easy to use the Google Cloud services found here:
Use the storage insert API call.
Now, if you would like to give clients access to upload data on your application's behalf, you may have to build a signed URL, signed with your key, and then upload directly to the cloud; read more here.
I find it easy to do server-side operations, if the upload size is fairly small, by using the Blobstore API. You can use the Blobstore API to store blobs in Cloud Storage instead of storing them in Blobstore. Read all about it here.
This method might be the easiest to implement in your case.
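The signed-URL route mentioned above usually looks something like this on the server. The answer refers to an App Engine Python example; purely for illustration, here is a rough equivalent using the Node.js @google-cloud/storage client, with placeholder bucket and object names.

```typescript
// Server-side sketch: mint a short-lived V4 signed URL that lets the browser
// PUT one object directly into the bucket, without exposing any credentials.
import { Storage } from "@google-cloud/storage";

const storage = new Storage();

async function createUploadUrl(bucketName: string, objectName: string, contentType: string): Promise<string> {
  const [url] = await storage
    .bucket(bucketName)
    .file(objectName)
    .getSignedUrl({
      version: "v4",
      action: "write",                      // permits a single PUT of this object
      expires: Date.now() + 15 * 60 * 1000, // valid for 15 minutes
      contentType,                          // must match the Content-Type the client sends
    });
  return url;
}
```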

Direct file upload to Google Cloud Storage (CDN) in Angular without uploading the file to the app server

I'm using Cloud Storage (a bucket) as a CDN for my application. I do not store user-uploaded files on my app server; instead they are saved to Google Cloud. The current implementation is as follows: the user uploads a file from the website (an HTML page) to the API server, and the API server then saves the file to Google Cloud Storage (the CDN/bucket), which is a two-step process.
Now I want to upload the file directly from the Angular HTML page, i.e. from the website, to Google Cloud Storage. I know we can upload files from the API server (Python, Java, or many other languages) to Cloud Storage, but I don't want to do that. The reason is that I want to eliminate the time lost in the two-step process of first uploading the file to our server and then having the server send it to Google Cloud. If the file is small, the above workflow is acceptable, but what if I'm uploading large files to the server?
Is it possible to create some authentication header values on the server side that the client (HTML/website) can use to upload the file directly to the Google Cloud Storage bucket? Then, after the response from the Google Cloud API, the client would call the API server to save the respective changes.
I did a lot of research on this but did not find any useful solution.
I also checked the link below, but no luck.
Upload file to Google Cloud Storage using AngularJS multipart form data
Create a signed URL to upload a file directly to GCS without authentication:
https://cloud.google.com/storage/docs/access-control#Signed-URLs
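On the client side, the flow the question describes (sign on the server, upload from the browser, then notify the API server) can be sketched roughly like this in plain TypeScript; the same calls work from an Angular service. The /api/... endpoint paths are assumptions.

```typescript
// Browser-side sketch: upload a file directly to GCS using a signed URL from your API,
// then tell the API server about the new object so it can record the change.
async function uploadDirectToGcs(file: File): Promise<void> {
  // 1. Get a signed upload URL from your own server (assumed endpoint).
  const res = await fetch(
    `/api/signed-url?name=${encodeURIComponent(file.name)}&type=${encodeURIComponent(file.type)}`
  );
  const { url } = await res.json();

  // 2. PUT the file bytes straight to Cloud Storage; the app server never touches them.
  const upload = await fetch(url, {
    method: "PUT",
    headers: { "Content-Type": file.type }, // must match the signed contentType
    body: file,
  });
  if (!upload.ok) throw new Error(`Upload failed: ${upload.status}`);

  // 3. Notify the API server so it can save the respective changes.
  await fetch("/api/uploads", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name: file.name }),
  });
}
```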

Google App Engine - uploading files, best practices

I want to upload files into my Google App Engine project.
I've been reading about this issue for a while, and there are a lot of answers arguing that Blobstore is the best option.
But if I understood this correctly, these are database objects. I would like to upload the files as filesystem files, to fit a caching scheme or to ease a possible future migration to another CDN.
Imagine I want to save several files for each user: a couple of images, text files, maybe some video, etc.
If some day I want to move these static files to another CDN, shouldn't they be kept out of the database?
Is that a good idea? Is there a solution?
I would recommend using the GAE Datastore to store the file references (for videos, images, etc.) and uploading the content itself to Amazon S3. You may even allow clients to upload the content directly to Amazon S3, without processing it through GAE, using Amazon HTML POST forms (Browser Uploads to S3 using HTML POST Forms). AWS also offers a CDN (CloudFront) with tight integration with S3.
If you only need to store small pictures, you may also consider the Google Images service, as it offers a "free" CDN and some handy out-of-the-box transformations.
You can't write to the filesystem on App Engine, so your options are to store data in the Datastore or the Blobstore. Of the two, the Blobstore is generally the better choice for storing file uploads.
Regarding porting to a CDN: no matter how you do this, it's going to require changes. App Engine is not a CDN, no two CDNs are exactly alike, and it's unlikely a CDN would expose an interface you can interact with using standard filesystem operations anyway.
