Direct file upload to Google Cloud Storage (CDN) in Angular without uploading the file to the app server - angularjs

I'm using Cloud Storage (a bucket) as a CDN server for my application. I do not store user-uploaded files on my app server; instead they are saved to Google Cloud. The current implementation is as follows: the user uploads a file from the website (HTML page) to the API server, and the API server then saves this file to Google Cloud Storage (CDN/bucket), which is a two-step process.
But now I want to upload the file directly from the Angular HTML page, i.e. from the website, to Google Cloud Storage. I know we can upload a file from the API server (Python, Java, or many other languages) to Cloud Storage, but I don't want to do that. The reason is that I want to eliminate the time spent on the two-step process: first uploading the file to our server, and then having the server send that file to Google Cloud. If the file size is small, the above workflow is acceptable, but what if I'm uploading large files to the server?
Is it possible to create some authentication header values on the server side that the client (HTML/website) can use to upload the file directly to Google Cloud Storage (bucket)? After the response from the Google Cloud API, the client would call the API server to save the respective changes.
I did a lot of research on this but didn't find any useful solution.
I also checked the link below, but no luck.
Upload file to Google Cloud Storage using AngularJS multipart form data

Create a signed URL to upload a file directly to GCS without authentication.
https://cloud.google.com/storage/docs/access-control#Signed-URLs
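A minimal server-side sketch of that approach, assuming the google-cloud-storage Python client and a service account that is allowed to sign; the bucket and object names here are placeholders, not values from the question:

    # Generate a V4 signed URL that the browser can PUT the file to directly.
    # "my-cdn-bucket" and the object path are hypothetical placeholders.
    from datetime import timedelta
    from google.cloud import storage

    def make_upload_url(bucket_name, object_name, content_type):
        client = storage.Client()  # runs on your API server with its own credentials
        blob = client.bucket(bucket_name).blob(object_name)
        return blob.generate_signed_url(
            version="v4",
            expiration=timedelta(minutes=15),
            method="PUT",
            content_type=content_type,
        )

    print(make_upload_url("my-cdn-bucket", "uploads/avatar.png", "image/png"))

The Angular client then PUTs the file body straight to that URL (with a matching Content-Type header) and, once Cloud Storage responds, calls your API server to record the uploaded object.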

Related

Can frontend applications upload files directly to Google Cloud Storage without a middle tier?

I'm trying to build a web app that does some image/PDF processing. The intent is to allow users to upload a file and get some response back.
My frontend stack is React, and I'm using GCP Storage along with Cloud Functions that run based on notifications from Storage.
All the examples I find for uploading files to Google Storage are for NodeJS implementations with Express, and so is their SDK. The only references I see for the web are via Firebase, but I'm not sure I want to go that route. Is there any other way to build such functionality without an Express implementation?
Most JS libraries can work in both NodeJS and the browser (where your React app runs).
Sometimes, though, libraries are tied to NodeJS only, or to the browser only. It depends on the runtime APIs they use (e.g. the "fs" module is available in NodeJS only).
Moreover, before uploading files you want to make sure your users are authenticated and have permission to upload. So normally you have the following flow (a server-side sketch follows the list):
Mobile app authenticating users
Mobile app invoking your backend/function for upload passing auth token
Backend/function validating the request authentication token
Backend/function pushing the file to the storage (the backend/function itself needs to run using a service account that's allowed to write to the storage bucket)
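One hedged sketch of the last two steps, written as an HTTP Cloud Function in Python rather than NodeJS: instead of streaming the file through the function, it validates the caller and hands back a short-lived signed upload URL, so the browser uploads to the bucket directly. The verify_token helper and the bucket name are illustrative, and the function's service account must be allowed to sign URLs and write to the bucket:

    from datetime import timedelta
    from google.cloud import storage
    import functions_framework

    BUCKET = "my-upload-bucket"  # hypothetical bucket name

    def verify_token(token):
        # Placeholder: validate your auth token (Firebase, JWT, ...) here.
        return bool(token)

    @functions_framework.http
    def get_upload_url(request):
        token = request.headers.get("Authorization", "")
        if not verify_token(token):
            return ("Unauthorized", 401)
        name = request.args.get("name", "upload.bin")
        blob = storage.Client().bucket(BUCKET).blob("user-uploads/" + name)
        url = blob.generate_signed_url(
            version="v4", expiration=timedelta(minutes=10), method="PUT")
        return {"uploadUrl": url}

The React app calls this function with its auth token, receives uploadUrl, and PUTs the file to it from the browser; the Storage notification then triggers your processing function as before.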

How to download data from the web and move it directly into Google Cloud Storage

I want to download data from an authorized web portal and store it in Google Cloud Storage. Certainly I can download it to my desktop and then upload it to Cloud Storage using gsutil. I wonder if it is possible to download and upload the data directly (between the web portal and Google Cloud Storage) without going through my desktop. With that, I think I could schedule such a task regularly with a Google cron job.
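One hedged sketch of skipping the desktop hop, assuming the portal exposes the data over HTTPS with an auth header and using the requests and google-cloud-storage Python packages; the URL, credentials, and bucket are placeholders:

    import requests
    from google.cloud import storage

    def portal_to_gcs(portal_url, auth_headers, bucket_name, object_name):
        blob = storage.Client().bucket(bucket_name).blob(object_name)
        with requests.get(portal_url, headers=auth_headers, stream=True) as resp:
            resp.raise_for_status()
            resp.raw.decode_content = True
            # upload_from_file reads the response stream, so nothing touches local disk
            blob.upload_from_file(resp.raw)

    portal_to_gcs(
        "https://portal.example.com/export.csv",      # placeholder portal URL
        {"Authorization": "Bearer <portal-token>"},   # placeholder credentials
        "my-data-bucket",
        "imports/export.csv",
    )

Run somewhere other than your desktop (a small VM, Cloud Function, or App Engine cron job), this can be scheduled regularly as described in the question.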

How to migrate Google Blobstore chunk files to Azure Blob storage

I am migrating my Google App Engine project to the Windows Azure app platform. Please help me migrate all the Google App Engine Blobstore files to Azure Blob storage.
I have one solution in Python, but I am not very familiar with Python. Please help me if it is possible with JavaScript, Java, or any tool.
A simple way to do this (if you are not working with a huge amount of data) would be to download the Google App Engine Blobstore files to a local drive, and then upload them using a tool like Azure Storage Explorer (http://storageexplorer.com/). Azure Storage Explorer lets you upload a directory through the user interface. Or you can use the AzCopy command-line interface to upload local files to Blob storage (https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/).
For downloading Google Blobstore files, you may be able to use a tool or interface like the ones described here: How can I get to the blobstore in the new Google Cloud Console?
If you have a very large amount of data, then you may need a different option. For uploading very large amounts of data to Azure Blob storage, you can use the Import/Export service, and send Microsoft a drive containing your data (https://azure.microsoft.com/en-us/documentation/articles/storage-import-export-service/).
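If you would rather script the upload half than use the GUI or AzCopy, here is a hedged sketch using the azure-storage-blob Python package; the connection string, container name, and local directory are placeholders:

    import os
    from azure.storage.blob import BlobServiceClient

    conn_str = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
    container = BlobServiceClient.from_connection_string(conn_str) \
        .get_container_client("migrated-blobstore")

    local_dir = "blobstore_export"  # files previously downloaded from Blobstore
    for root, _, files in os.walk(local_dir):
        for name in files:
            path = os.path.join(root, name)
            blob_name = os.path.relpath(path, local_dir).replace(os.sep, "/")
            with open(path, "rb") as fh:
                container.upload_blob(name=blob_name, data=fh, overwrite=True)
            print("uploaded", blob_name)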

Upload image to a Google bucket using jQuery-File-Upload

I've been able to get jQuery-File-Upload from https://github.com/blueimp/jQuery-File-Upload (Python version) to allow my website users to upload images to a Google App Engine app; however, it saves the images (for a limited time) as a blob.
It would be preferable to have the images saved in a Google bucket, but I'm unable to figure out how to get the files to save to a bucket rather than as a blob.
Can anyone recommend a similar way for my website visitors to upload images to my Google bucket, or a way to configure jQuery-File-Upload to do so?
I created an App Engine Python signed URL example here:
There is a Cloud API to upload files directly to a Google Cloud bucket via JavaScript; however, implementing security could make things a bit sticky.
In general, it should be fairly easy to use the Google Cloud services found here,
using the Storage insert API call.
Now, if you would like to give clients access to upload data on your application's behalf, you may have to build a signed URL, signed with your key, and then upload directly to the cloud; read more here.
I find it easy to do server-side operations, if the upload size is fairly small, by using the Blobstore API. You can use the Blobstore API to store blobs in Cloud Storage instead of storing them in Blobstore. Read all about it here.
This method might be the easiest to implement in your case.
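A minimal App Engine (Python 2 runtime, webapp2) sketch of that Blobstore-to-Cloud-Storage approach: passing gs_bucket_name to create_upload_url() makes the browser's form POST land in your bucket instead of the Blobstore. The handler paths and bucket name are illustrative:

    from google.appengine.ext import blobstore
    from google.appengine.ext.webapp import blobstore_handlers
    import webapp2

    class UploadForm(webapp2.RequestHandler):
        def get(self):
            upload_url = blobstore.create_upload_url(
                '/upload_done', gs_bucket_name='my-image-bucket')
            self.response.write(
                '<form action="%s" method="POST" enctype="multipart/form-data">'
                '<input type="file" name="file"><input type="submit"></form>'
                % upload_url)

    class UploadDone(blobstore_handlers.BlobstoreUploadHandler):
        def post(self):
            # The file is already in the bucket; just record its object name.
            file_info = self.get_file_infos()[0]
            self.response.write('Stored at %s' % file_info.gs_object_name)

    app = webapp2.WSGIApplication([
        ('/upload_form', UploadForm),
        ('/upload_done', UploadDone),
    ])

jQuery-File-Upload can POST to the generated upload URL the same way it posts to your current Blobstore handler.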

How to transfer large files from Google App Engine to an external server

I need my application on Google App Engine to be able to transfer files up to 25 MB in size to an external server via a web service. How can this be done?
I also need to do it the other way around: transfer a large file from an external server to Google App Engine. Is that possible?
Thanks.
For sending files, you can't do it with URLFetch, as the request size limit is 10 MB.
You can do it with sockets, where you can open a socket (e.g. with fsockopen()) directly to the target web service and then write as much data as you want over it.
For retrieving a large file, you can have files of up to 100 TB uploaded directly to Google Cloud Storage using createUploadURL if you want your app to act as a service; otherwise, you can retrieve files up to 32 MB using URLFetch.
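A hedged sketch of the "retrieve up to 32 MB with URLFetch" path on the Python 2 App Engine runtime, writing the result into Cloud Storage with the GoogleAppEngineCloudStorageClient library; the source URL and bucket are placeholders:

    from google.appengine.api import urlfetch
    import cloudstorage as gcs

    def fetch_to_gcs(source_url, bucket, object_name):
        result = urlfetch.fetch(source_url, deadline=60)
        if result.status_code != 200:
            raise RuntimeError('fetch failed: %d' % result.status_code)
        # URLFetch responses are capped at 32 MB, so this only suits smaller files.
        content_type = result.headers.get('Content-Type', 'application/octet-stream')
        with gcs.open('/%s/%s' % (bucket, object_name), 'w',
                      content_type=content_type) as f:
            f.write(result.content)

    fetch_to_gcs('https://example.com/report.pdf', 'my-app-bucket', 'incoming/report.pdf')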
