I'm trying to build a system where a user selects a large dataset from their Dropbox, and this data is downloaded to a Google Cloud Storage bucket.
The problem is that my backend code runs on App Engine, so I cannot download the large file to disk before uploading it to the bucket.
Is there a way to programmatically tell Cloud Storage to retrieve data from a URL?
Or is there another way to download this data on an App Engine instance and upload it from there?
You can't directly tell GCS to fetch a file from the Internet and save it in a bucket.
On the other hand, moving a large collection of objects is the business of Google's Storage Transfer service. It may suit your needs, depending on what you mean by "a large dataset." https://cloud.google.com/storage-transfer/docs/overview
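If the dataset can be exposed as a list of publicly reachable URLs (e.g. Dropbox direct-download links), a Storage Transfer job with an HTTP data source can pull the files into a bucket without the data ever passing through your App Engine instance. Below is a minimal sketch using the Python discovery client; the project ID, bucket name, and URL-list location are placeholders, not values from the question:

# Minimal sketch: create a Storage Transfer Service job that pulls files
# listed in a public TSV "URL list" (URL, size, MD5 per line) into a GCS
# bucket. Project ID, bucket name, and URL-list location are placeholders.
from datetime import date

import googleapiclient.discovery  # pip install google-api-python-client

def create_http_transfer_job(project_id, list_url, sink_bucket):
    client = googleapiclient.discovery.build("storagetransfer", "v1")
    today = date.today()
    job = {
        "description": "Pull files listed in a URL list into GCS",
        "status": "ENABLED",
        "projectId": project_id,
        "schedule": {
            # One-off run: start and end on the same day.
            "scheduleStartDate": {"year": today.year, "month": today.month, "day": today.day},
            "scheduleEndDate": {"year": today.year, "month": today.month, "day": today.day},
        },
        "transferSpec": {
            "httpDataSource": {"listUrl": list_url},
            "gcsDataSink": {"bucketName": sink_bucket},
        },
    }
    return client.transferJobs().create(body=job).execute()

# create_http_transfer_job("my-project", "https://example.com/urllist.tsv", "my-bucket")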
Related
I have a requirement to access large pandas DataFrame files to run some analytics in an App Engine worker via Google Cloud Tasks.
Can somebody please suggest which Google Cloud component can be used for storing and accessing files quickly?
Any reference to an example would be of great help.
I think Google Cloud Storage is the best place to store the files and access them quickly:
https://cloud.google.com/storage/docs/discover-object-storage-console
GCS can store large files, and a large number of them.
You can also use:
gsutil to move or copy files between buckets.
Storage Transfer Service:
https://cloud.google.com/storage-transfer/docs/overview
Your App Engine application can then use Cloud Storage directly.
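For the pandas use case specifically, DataFrame files can be written to and read from a bucket directly. Here is a minimal sketch, assuming the gcsfs package (plus a Parquet engine such as pyarrow) is installed so pandas can resolve gs:// paths; the bucket and object names are placeholders:

# Minimal sketch: store a DataFrame in GCS and read it back from a worker.
# Requires pandas, gcsfs, and a Parquet engine such as pyarrow.
# The bucket and object names below are placeholders.
import pandas as pd

GCS_PATH = "gs://my-bucket/analytics/frame.parquet"

def save_frame(df: pd.DataFrame) -> None:
    # Parquet keeps dtypes and loads much faster than CSV for large frames.
    df.to_parquet(GCS_PATH, index=False)

def load_frame() -> pd.DataFrame:
    return pd.read_parquet(GCS_PATH)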
I am migrating my Google App Engine project to the Windows Azure app platform. Please help me migrate all the Google App Engine Blobstore files to Azure Blob storage.
I have one solution in Python, but I am not very familiar with Python. Please help me if it is possible with JavaScript, Java, or any tool.
A simple way to do this (if you are not working with a huge amount of data) would be to download the google app engine blobstore files to a local drive, and then upload them using a tool like Azure Storage Explorer (http://storageexplorer.com/). Azure Storage Explorer lets you upload a directory through the user interface. Or you can use the AzCopy command-line interface to upload local files to Blob storage (https://azure.microsoft.com/en-us/documentation/articles/storage-use-azcopy/).
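If you would rather script the upload step than use Azure Storage Explorer, the same can be done with the azure-storage-blob Python package (equivalent SDKs exist for Java and JavaScript). A minimal sketch, with the connection string, container name, and local directory as placeholders:

# Minimal sketch: upload every file in a local directory (e.g. the files
# previously downloaded from the App Engine Blobstore) to an Azure Blob
# storage container. Connection string, container name, and directory path
# are placeholders.
import os

from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

def upload_directory(connection_string: str, container: str, local_dir: str) -> None:
    service = BlobServiceClient.from_connection_string(connection_string)
    container_client = service.get_container_client(container)
    for root, _dirs, files in os.walk(local_dir):
        for name in files:
            path = os.path.join(root, name)
            # Preserve the relative path as the blob name.
            blob_name = os.path.relpath(path, local_dir).replace(os.sep, "/")
            with open(path, "rb") as data:
                container_client.upload_blob(name=blob_name, data=data, overwrite=True)

# upload_directory("DefaultEndpointsProtocol=...", "migrated-blobs", "./blobstore-export")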
For downloading google blobstore files, you may be able to use a tool or interface like the ones described here: How can I get to the blobstore in the new Google Cloud Console?
If you have a very large amount of data, then you may need a different option. For uploading very large amounts of data to Azure Blob storage, you can use the Import/Export service, and send Microsoft a drive containing your data (https://azure.microsoft.com/en-us/documentation/articles/storage-import-export-service/).
I've been able to get jQuery-File-Upload from https://github.com/blueimp/jQuery-File-Upload (the Python version) to let my website users upload images to a Google App Engine app; however, it saves the images (for a limited time) as a blob.
It would be preferable to have the images saved in a Google Cloud Storage bucket, but I'm unable to figure out how to get the files to save in a bucket rather than as a blob.
Can anyone recommend a similar way for my website visitors to upload images to my bucket, or a way to configure jQuery-File-Upload to do so?
I created an App Engine Python signed URL example here:
There is a Cloud API to upload files directly to a Google Cloud Storage bucket via JavaScript; however, implementing security could make things a bit sticky.
In general, it should be fairly easy to use the Google Cloud services found here.
Use the Cloud Storage objects insert API call.
Now, if you would like to give clients access to upload data on your application's behalf, you may have to build a signed URL, signed with your key, and then upload directly to Cloud Storage; read more here.
I find it easy to do server-side operations, if the upload size is fairly small, by using the Blobstore API. You can use the Blobstore API to store blobs in Cloud Storage instead of storing them in Blobstore. Read all about it here.
This method might be the easiest to implement in your case.
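Concretely, the Blobstore upload URL can be pointed at a Cloud Storage bucket via the gs_bucket_name argument. A minimal sketch for the App Engine Python runtime; the handler paths and bucket name are placeholders:

# Minimal sketch (App Engine Python, webapp2): generate a Blobstore upload URL
# that writes the uploaded file into a Cloud Storage bucket instead of the
# Blobstore. Handler paths and bucket name are placeholders.
import webapp2
from google.appengine.ext import blobstore
from google.appengine.ext.webapp import blobstore_handlers

class UploadUrlHandler(webapp2.RequestHandler):
    def get(self):
        # jQuery-File-Upload should POST the file to this URL.
        upload_url = blobstore.create_upload_url(
            '/upload_done', gs_bucket_name='my-image-bucket')
        self.response.write(upload_url)

class UploadDoneHandler(blobstore_handlers.BlobstoreUploadHandler):
    def post(self):
        # With gs_bucket_name set, the uploaded object lives in GCS; its path
        # is available from the upload's FileInfo.
        file_info = self.get_file_infos()[0]
        self.response.write(file_info.gs_object_name)  # e.g. /gs/my-image-bucket/...

app = webapp2.WSGIApplication([
    ('/upload_url', UploadUrlHandler),
    ('/upload_done', UploadDoneHandler),
])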
I am writing an Android application that allows users to upload and share photos. The server is based on Google App Engine. App Engine's datastore does not allow saving files, so currently I just have the URLs saved. Looking for a way to store files, I read about Google Cloud Storage. My question is: if I'm looking for a host for user-uploaded files, is Google Cloud Storage what I'm looking for?
Yes, Google Cloud Storage is the way forward. There is also the Blobstore API on App Engine that allows you to store large amounts of data, but the road map seems to be clear, i.e. use Google Cloud Storage moving forward.
The case for GCS is also strengthened by the fact that eventually you might want the various tools and utilities people have written that work directly with GCS. With the Blobstore API, you would need to write those utilities yourself or rely on the Admin console's support for taking backups, etc., which is not much.
In summary, go with GCS.
Yes, that's what you want. The Google Cloud Storage docs say the same.
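As a concrete starting point, writing a user-uploaded photo to a bucket from your backend takes only a few lines with the Cloud Storage client; a minimal sketch, with the bucket and object names as placeholders:

# Minimal sketch: store an uploaded photo in GCS and keep only its object
# name (or a URL derived from it) in the datastore. Bucket and object names
# are placeholders.
from google.cloud import storage  # pip install google-cloud-storage

def save_photo(photo_bytes: bytes, object_name: str, bucket_name: str = "my-photo-bucket") -> str:
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(object_name)
    blob.upload_from_string(photo_bytes, content_type="image/jpeg")
    # Store this reference (or a signed/serving URL) in the datastore entity.
    return f"gs://{bucket_name}/{object_name}"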
I'm new to App Engine and I'm building an app that accepts user image uploads from Android devices.
I built it with Cloud Storage, but then I realized that I have problems uploading large files (maybe because of request time limits?),
so I figured I should use Blobstore's upload URL to properly upload multiple large files.
Blobstore also has the on-the-fly image resizing feature, which is very nice.
The thing is, Cloud Storage is cheaper than Blobstore.
Should I move the uploaded files from Blobstore to Cloud Storage after uploading?
Is there a way to upload multiple large files to App Engine without going through the Blobstore upload URL?
I'm using Go, if it matters.
The simplest answer is probably to use a signed url to allow the user to upload directly to Cloud Storage. This lets you bypass App Engine entirely for your upload, which in turn simplifies the network usage and allows you to take full advantage of all of Cloud Storage's upload infrastructure.
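A minimal sketch of that approach is below, shown in Python for consistency with the other examples (the Go client library offers the same capability through its SignedURL support); the bucket name, object name, and content type are placeholders:

# Minimal sketch: generate a V4 signed URL that lets a client PUT one object
# directly into a bucket, bypassing App Engine for the upload itself.
# Bucket name, object name, and content type are placeholders; signing
# requires credentials with a private key (e.g. a service account).
from datetime import timedelta

from google.cloud import storage  # pip install google-cloud-storage

def make_upload_url(bucket_name: str, object_name: str, content_type: str = "image/jpeg") -> str:
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(object_name)
    return blob.generate_signed_url(
        version="v4",
        expiration=timedelta(minutes=15),
        method="PUT",
        content_type=content_type,  # the client must send the same Content-Type header
    )

# The client then uploads with: PUT <url>  (body = file bytes, Content-Type: image/jpeg)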
Currently, Blobstore is $0.0009 / GB-hour, while Cloud Storage is $0.0027 / GB-hour, so it seems that Blobstore is now 3 times cheaper than Cloud Storage. So while there may be reasons to move to Cloud Storage, cost is not currently one of them. Note that the prices changed recently.
If you need the richer API provided by Cloud Storage, then that's another story of course.