Copy data from google datastore to CSV - google-app-engine

I am storing data in google cloud datastore via an appengine project and would like to download one of the entities as a CSV file.
I have set up gsutil so that it defaults to this appengine project.
I have also created a bucket under cloud storage.
Is there a way to use gsutil to move data into the bucket?
gsutil cp gs://bucket_name
seems a likely candidate.
But IS this the way to go? And if so, what is the structure of the URI?
Many thanks!

You can use the lookup method to retrieve an entity's JSON representation by key.
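If you want the CSV end-to-end, here is a minimal sketch using the google-cloud-datastore Python client; the kind name "Order" and the output file name are placeholder assumptions:

import csv
from google.cloud import datastore

# The client picks up the project and credentials from the environment.
client = datastore.Client()

# Fetch every entity of a hypothetical kind; swap in your own kind name.
entities = list(client.query(kind="Order").fetch())

# One CSV row per entity, one column per property.
with open("orders.csv", "w", newline="") as f:
    fieldnames = sorted({prop for e in entities for prop in e.keys()})
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    for e in entities:
        writer.writerow({k: e.get(k, "") for k in fieldnames})

From there the file can be copied into your bucket with gsutil cp orders.csv gs://bucket_name/orders.csv, which is also the URI structure you were asking about: gs://bucket_name/object_name.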

Related

Recommended way to access data files in App Engine Python services

I have a requirement to access large pandas DataFrame files to run some analytics in an App Engine worker via Google Cloud Tasks.
Can somebody please suggest what Google Cloud component can be used for storing and accessing files quickly?
Any reference to an example would be a great help.
I think Google Cloud Storage is the best place to store and access the files quickly:
https://cloud.google.com/storage/docs/discover-object-storage-console
GCS can store large files and a large number of files.
You can also use:
gsutil to move or copy files between buckets.
Storage Transfer Service: https://cloud.google.com/storage-transfer/docs/overview
Your App Engine application can then read those files straight from Cloud Storage.
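For example, with the gcsfs package installed, pandas can load a DataFrame straight from a bucket by its gs:// path; the bucket and file names below are placeholder assumptions:

import pandas as pd

# pandas delegates gs:// paths to gcsfs, so there is no explicit download step.
# "my-analytics-bucket" and the file name are assumptions; use your own.
df = pd.read_parquet("gs://my-analytics-bucket/frames/sales.parquet")

# The frame is now in memory for the Cloud Tasks worker to analyze.
print(df.describe())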

Uploading a large file to Google Storage directly from App Engine

I'm trying to build a system where a user selects a large dataset from their Dropbox, and this data is downloaded to a Google Cloud Storage bucket.
The problem is that my backend code runs on App Engine, and therefore I cannot download the large file to disk before uploading it to the bucket.
Is there a way to programmatically tell Cloud Storage to retrieve data from a URL?
Or is there another way to download this data on an AppEngine instance, and upload it from there?
You can't directly tell GCS to download a file from the Internet and save it in a bucket for you.
On the other hand, moving a large collection of objects is the business of Google's Storage Transfer service. It may suit your needs, depending on what you mean by "a large dataset." https://cloud.google.com/storage-transfer/docs/overview
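If Storage Transfer doesn't fit, one workaround is to stream the file through your instance without ever writing it to disk. A rough sketch, assuming the requests library and placeholder bucket and URL names:

import requests
from google.cloud import storage

# Placeholder names; substitute your own source URL and bucket.
SOURCE_URL = "https://example.com/big-dataset.zip"
BUCKET_NAME = "my-destination-bucket"

client = storage.Client()
blob = client.bucket(BUCKET_NAME).blob("big-dataset.zip")

# Stream the HTTP response body directly into the blob, chunk by chunk,
# so the whole file never has to fit on local disk.
with requests.get(SOURCE_URL, stream=True) as resp:
    resp.raise_for_status()
    blob.upload_from_file(resp.raw)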

Upload image to google bucket using jQuery-File-Upload

I've been able to get jQuery-File-Upload from https://github.com/blueimp/jQuery-File-Upload (Python version) to let my website users upload images to a Google App Engine app; however, it saves the images (for a limited time) as a blob.
It would be preferable to have the images saved in a Google Cloud Storage bucket, but I'm unable to figure out how to get the files to save in a bucket rather than as a blob.
Can anyone recommend a similar way for my website visitors to upload images to my bucket, or how to configure jQuery-File-Upload to do so?
I created an App Engine Python signed URL example here:
There is a cloud API to upload files directly to a Google Cloud bucket via JavaScript; however, implementing security could make things a bit sticky.
In general, it should be fairly easy to use the Google Cloud services found here:
use the Storage objects insert API call.
Now, if you'd like to let clients upload data on your application's behalf, you may have to build a signed URL, signed with your key, and then upload directly to Cloud Storage; read more here.
I find it easy to do server-side operations, if the upload size is fairly small, by using the Blobstore API. You can use the Blobstore API to store blobs in Cloud Storage instead of storing them in Blobstore. Read all about it here.
This method might be the easiest to implement in your case.
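If you go the signed-URL route instead, here is a minimal sketch with the google-cloud-storage client; the bucket and object names are assumptions:

from datetime import timedelta
from google.cloud import storage

def make_upload_url(bucket_name, object_name):
    """Return a V4 signed URL the browser can PUT an image to."""
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(object_name)
    return blob.generate_signed_url(
        version="v4",
        expiration=timedelta(minutes=15),  # short-lived on purpose
        method="PUT",
        content_type="image/jpeg",
    )

# Placeholder names; the client must run with a service account key
# so the URL can be signed.
print(make_upload_url("my-images-bucket", "uploads/photo.jpg"))

jQuery-File-Upload (or plain XHR) can then PUT the file to that URL with a matching Content-Type header, so no credentials ever reach the browser.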

appcfg download_data on downloading blobstore

In App Engine I could download the data from appspot to my local machine using appcfg download_data, but it does not seem to download the blobs stored via the Blobstore API. Is there any way I could download the blobs as well as the data, keeping their keys synchronized?
The desired end result: I download the data from the server (including blobs), go to another PC, and upload all the data and blobs, so that when I run the app it looks just like the one on appspot. Thanks

Google appengine datastore alternative?

I'm using Google App Engine with the built-in Datastore, but I want to move all Datastore data to my new VPS.
I'll use Apache Cassandra. How do I move from GAE Datastore to Apache Cassandra?
My guess is you're looking for a tool such as the bulk loader/downloader:
http://code.google.com/appengine/docs/python/tools/uploadingdata.html
You'll want to export all your data to CSV, then write a script to import it into whatever new format you want.
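The import half of such a script might look like the following, assuming the cassandra-driver package and placeholder keyspace, table, and column names:

import csv
from cassandra.cluster import Cluster

# Placeholder connection and schema details; adjust for your VPS.
cluster = Cluster(["127.0.0.1"])
session = cluster.connect("my_keyspace")

insert = session.prepare(
    "INSERT INTO entities (key, name, value) VALUES (?, ?, ?)"
)

# Replay each exported row into Cassandra.
with open("export.csv", newline="") as f:
    for row in csv.DictReader(f):
        session.execute(insert, (row["key"], row["name"], row["value"]))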
You cannot use the bulk downloader if you are using the High Replication datastore.
Alternatively, you can take a manual approach, such as listing all your entities as dictionaries. That gives you a JSON-formatted string, which you can use to regenerate the entities in a form suitable for your new system.
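For instance, with App Engine's (legacy) ndb library an entity can be flattened to a dictionary and serialized; the model below is a placeholder:

import json
from google.appengine.ext import ndb

class Item(ndb.Model):  # stand-in for your real model
    name = ndb.StringProperty()
    price = ndb.FloatProperty()

# Fetch every entity of the kind and flatten each to a plain dictionary.
records = [entity.to_dict() for entity in Item.query()]

# One JSON string you can feed into whatever system replaces Datastore.
payload = json.dumps(records)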
