How do I get GeoPoint data from the Cloud Boost database? The docs are not clear on how to do this; in fact, they don't show how to do it at all.
Related
How can a developer view documents in App Engine's Search API on a development machine?
I save documents using the Search API, and I can search and fetch results using the Java API, but how do I view the documents? Is there a document viewer for development mode?
Details here: https://cloud.google.com/appengine/docs/standard/java/search/
As per the Google documentation:
In the Cloud Console, you can view information about your application's indexes and the documents they contain. Clicking an index name displays the documents that index contains. You'll see all the defined schema fields for the index; for each document with a field of that name, you'll see the field's value. You can also issue queries on the index data directly from the console.
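On a development machine, one way to inspect the documents is to dump them programmatically. Below is a minimal sketch, assuming the Python runtime's Search API and a hypothetical index named myIndex (the Java API has an equivalent getRange method):

# Minimal sketch: list the documents stored in an index.
# "myIndex" is a hypothetical index name.
from google.appengine.api import search

index = search.Index(name='myIndex')

# get_range() iterates documents in document-ID order.
for doc in index.get_range(limit=100):
    print doc.doc_id, [(f.name, f.value) for f in doc.fields]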
How can I use the SQL NOT LIKE operator in Python with the Google App Engine datastore? I want to filter strings in the database.
q = Post.all()
q.filter("text NOT LIKE", "%Something%")
This feature is not supported by the App Engine Datastore.
The datastore can only query entities through indexes (a get by ID, and a SELECT without any filters or ordering, are essentially queries over the index on the key).
This is an architectural limitation of the distributed data storage.
There are solutions which can do what you want, but Google Cloud Datastore is not one of them.
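If you only need this for small result sets, a common workaround is to fetch the candidates and apply the negative match in application code. A minimal sketch, assuming the Post model from the question:

# Workaround sketch: the datastore cannot evaluate NOT LIKE, so filter
# in memory instead. This fetches every candidate entity, so it is only
# practical for small result sets.
q = Post.all()
results = [post for post in q.run() if "Something" not in post.text]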
So I want to export some data from the GAE datastore. This is how I am trying to do it:
1. Creating a Cloud Datastore backup of the kind which I want to export.
2. Loading the backup into Google BigQuery.
3. Exporting it from Google BigQuery.
Everything works fine, but there's one problem: the Google BigQuery loader ignores Blob type fields when loading data from a Cloud Datastore backup (https://cloud.google.com/bigquery/loading-data-cloud-datastore).
How can I export all types of fields, including the Blob type field?
As per the documentation, blobs cannot be stored in BigQuery, since that is not its main purpose, which is big data analysis. BigQuery only accepts the standard SQL data types specified here. In any case, depending on your use case and what those blobs contain, you may have a few options.
Since the Datastore maximum entity size is 1 MB while BigQuery's maximum row size is 100 MB, you could convert the blob before transferring it. On the other hand, you could use another service such as Google Cloud Storage to store the blobs, and keep in your Datastore database a reference to each file in Cloud Storage.
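A minimal sketch of that second option, assuming the App Engine cloudstorage client library, a hypothetical bucket my-blob-bucket, and a hypothetical Post kind:

# Sketch: write the blob to Cloud Storage and store only its path on the
# entity, so the BigQuery loader sees a plain string property instead of
# a Blob. Bucket and model names are hypothetical.
import cloudstorage
from google.appengine.ext import db

class Post(db.Model):
    blob_ref = db.StringProperty()  # gs:// path instead of a BlobProperty

def save_post(key_name, blob_bytes):
    path = '/my-blob-bucket/blobs/%s' % key_name
    with cloudstorage.open(path, 'w') as f:
        f.write(blob_bytes)
    Post(key_name=key_name, blob_ref='gs:/' + path).put()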
I have created a feature request for this in the public issue tracker, so you will be able to follow its progress there. However, there is no ETA for this implementation.
I am looking into using Google App Engine for a project and would like to make sure I have a way to export all my data if I ever decide to leave GAE (or GAE shuts down).
Everything I search about exporting data from GAE points to https://developers.google.com/appengine/docs/python/tools/uploadingdata. However, that page contains this note:
Note: This document applies to apps that use the master/slave datastore. If your app uses the High Replication datastore, it is possible to copy data from the app, but Google does not currently support this use case. If you attempt to copy from a High Replication datastore, you'll see a high_replication_warning error in the Admin Console, and the downloaded data might not include recently saved entities.
The problem is that the master/slave datastore was recently deprecated in favor of the High Replication datastore. I understand that the master/slave datastore is still supported for a little while, but I don't feel comfortable using something that has officially been deprecated and is on its way out. So that leaves me with the High Replication datastore, and the only way to export the data seems to be the method above, which is not officially supported (and thus does not guarantee that I can get my data out).
Is there any other (officially supported) way of exporting data from the High Replication datastore? I don't feel comfortable using Google App Engine if it means my data could be locked in there forever.
It took me quite a long time to set up the download of data from GAE, as the documentation is not as clear as it should be.
If you are extracting data from a Unix server, you may be able to reuse the script below.
Also, if you do not provide the "config_file" parameter, it will extract all your data for the kind, but in a proprietary format which can only be used for restoring data afterwards (a sketch of such a config file appears after the script).
#!/bin/sh
#------------------------------------------------------------------
#-- Param 1 : Namespace
#-- Param 2 : Kind (table id)
#-- Param 3 : Directory in which the csv file should be stored
#-- Param 4 : output file name
#------------------------------------------------------------------
appcfg.py download_data --secure --email=$BACKUP_USERID \
    --config_file=configClientExtract.yml \
    --filename=$3/$4.csv --kind=$2 \
    --url=$BACKUP_WEBSITE/remote_api \
    --namespace=$1 --passin <<EOF
$BACKUP_PASSWORD
EOF
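An example invocation, assuming the script is saved as download_data.sh and that BACKUP_USERID, BACKUP_PASSWORD and BACKUP_WEBSITE are exported in the environment (all names here are illustrative):

# Export the "Post" kind from the default namespace into /tmp/exports/posts.csv
export BACKUP_USERID=admin@example.com
export BACKUP_PASSWORD=secret
export BACKUP_WEBSITE=https://myapp.appspot.com
./download_data.sh "" Post /tmp/exports posts

And a minimal sketch of what configClientExtract.yml might look like, assuming the standard bulkloader transformer format and a hypothetical Post kind with a single text property:

python_preamble:
- import: google.appengine.ext.bulkload.transform
- import: google.appengine.api.datastore

transformers:
- kind: Post
  connector: csv
  property_map:
    - property: __key__
      external_name: key
      export_transform: transform.key_id_or_name_as_string
    - property: text
      external_name: text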
Currently, the App Engine datastore supports another option as well. The data backup facility can be used to copy selected data into the Blobstore or Google Cloud Storage. This function is available under the Datastore Admin area in the App Engine console. If required, the backed-up data can then be downloaded from the Blob viewer or from Cloud Storage. When backing up a High Replication datastore, it is recommended that datastore writes are disabled before taking the backup.
You need to configure a builtin called remote_api. This article has all the information and guidance you need to be able to download all your data, today and in the future.
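For the Python runtime, enabling the builtin is a two-line addition to app.yaml (a sketch based on the documented builtins syntax):

builtins:
- remote_api: on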
Is it possible to get a data source URL, like Google Spreadsheets has, for App Engine datastore entities? I want to use Google Visualization query objects to query my datastore. Or how can I expose my datastore through a data source URL?
Also, for a Google Visualization based project, which is better: Google Spreadsheet or GAE's Bigtable-backed datastore? Google Spreadsheet has very good query options and works harmoniously with Google Visualization; one can get a DataTable directly from a data source URL. Doing the same with the GAE datastore takes a good amount of work. Please share your experience in this area.
There's nothing built in to do this. You'll need to write your own code that returns your data in a format GViz supports.
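A minimal sketch of such an endpoint, assuming the open-source gviz_api Python library (google-visualization-python) and a hypothetical Post model with a text property:

# Sketch: a handler that serves datastore entities in the Google
# Visualization wire format. Library, route and model are assumptions.
import webapp2
import gviz_api
from google.appengine.ext import db

class Post(db.Model):
    text = db.StringProperty()

class GvizHandler(webapp2.RequestHandler):
    def get(self):
        table = gviz_api.DataTable({'text': ('string', 'Text')})
        table.LoadData([{'text': p.text} for p in Post.all()])
        # Echo back the tqx request parameter so the GViz client can
        # match the response to its query.
        self.response.write(table.ToJSonResponse(tqx=self.request.get('tqx')))

app = webapp2.WSGIApplication([('/gviz', GvizHandler)])

A visualization can then point a google.visualization.Query at the /gviz route, just as it would at a spreadsheet's data source URL.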