In my GAE app I run a datastore backup for all Kinds.
When I try to load the backup into BigQuery, almost all Kinds load successfully, but one Kind fails with this error:
invalid - Invalid field name "rows.pedido_automatico". Fields must
contain only letters, numbers, and underscores, start with a letter or
underscore, and be at most 128 characters long.
My GAE Kind:
from google.appengine.ext import ndb

class StockRow(ndb.Model):
    pedido_automatico = ndb.StringProperty(default="N", choices=set(["S", "N"]))

class Stock(ndb.Model):
    # repeated StructuredProperty: the backup exports its fields with dotted
    # names such as "rows.pedido_automatico"
    rows = ndb.StructuredProperty(StockRow, repeated=True)
Is this a known bug?
This is a known bug; we have a fix for it internally, and it should be in our next release.
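In the meantime, a possible workaround (a minimal sketch, not the official fix) is to export the problematic Kind yourself as newline-delimited JSON with the dotted names flattened to underscores, and load that file into BigQuery instead of the backup; the export_stock helper below is hypothetical:

import json

def export_stock(out):
    # Flatten "rows.pedido_automatico" into a legal BigQuery column name.
    for stock in Stock.query():
        record = {
            'rows_pedido_automatico': [row.pedido_automatico for row in stock.rows],
        }
        out.write(json.dumps(record) + '\n')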
I am building a tool using the Google Cloud Datastore Java API. The backend of this tool has a bunch of methods and APIs that we made, which are hosted on Google App Engine. The data we collect from the tool comes from a Chrome extension we built, and using the above-mentioned APIs we store our data in GCD. Everything works perfectly well in our implementation except for one thing: the identifiers.
I created a method to store all our relevant information in several tables, and while submitting I create each Entity with an identifier that is the next number in ascending order after the previous entry in the table. The tool is being used by several people, and the entries for any particular day are stored in the correct order. However, every day the ID variable seems to be reset, and our table starts overwriting information as the ID starts from 1 again. It remains constant during the day, but as soon as the date changes, the ID starts from 1 again.
// In-memory counter: it lives only as long as this App Engine instance
AtomicInteger Identifier = new AtomicInteger();

public void DataEntity(String EmpName, String Date, String Col1, String Col2)
{
    long id = Identifier.incrementAndGet();
    Entity en = new Entity("DataTable", id);
    en.setProperty("Employee Name", EmpName);
    en.setProperty("Submit_Date", Date);
    en.setProperty("Column1", Col1);
    en.setProperty("Column2", Col2);
    ...
    ds.put(en);
}
My guess is that at the end of the day all the methods are garbage collected. I should also note that our app is threadsafe, so data is not being overwritten by concurrent requests; it is only on the next day, when all the variables seem to have been reset, that everything starts from 1 again. Any help would be much appreciated. Please let me know if you have any questions; I'll be happy to provide more info.
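A likely culprit: the AtomicInteger lives in instance memory, and App Engine recycles instances (often overnight, when traffic drops), so the counter restarts from zero on every fresh instance. Letting the datastore assign the ID avoids keeping any counter in memory at all. A minimal sketch of that pattern, in Python like the rest of this page (the DataRow model is a hypothetical stand-in for the "DataTable" kind):

from google.appengine.ext import ndb

class DataRow(ndb.Model):
    employee_name = ndb.StringProperty()
    submit_date = ndb.StringProperty()

row = DataRow(employee_name='Alice', submit_date='2013-04-01')
key = row.put()    # the datastore allocates a unique numeric ID here
print key.id()     # stable across instance restarts; no in-memory counter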
Apparently the default query limit on the number of returned documents is currently 20. You can change it with QueryOptions.Builder.setLimit(). The Java dev docs don't seem to indicate the allowed maximum.
I have thousands of records indexed in my application and searches might potentially return a large number of objects. Instead of hardcoding something like MAX_QUERY_RESULTS = 1000 in the app, is there a way to programmatically access this search quota?
The class com.google.appengine.api.search.checkers.SearchApiLimits has a long list of constants of this ilk, including SEARCH_MAXIMUM_LIMIT with the value 1000.
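If you're on the Python runtime, the search module exposes similar module-level constants; a minimal sketch, assuming MAXIMUM_DOCUMENTS_RETURNED_PER_SEARCH is the Python counterpart of the Java SEARCH_MAXIMUM_LIMIT (the index name and query string are made up for illustration):

from google.appengine.api import search

# Assumption: this constant mirrors Java's SEARCH_MAXIMUM_LIMIT (1000).
limit = search.MAXIMUM_DOCUMENTS_RETURNED_PER_SEARCH
options = search.QueryOptions(limit=limit)
results = search.Index(name='docs').search(
    search.Query(query_string='body:hello', options=options))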
I am trying to use app engine's search API to search locations:
https://developers.google.com/appengine/docs/python/search/overview#Performing_Location-Based_Searches
The problem is that no matter what I do, I get zero results. I set the search lat/lng to the exact point in a document's GeoPoint property and it still returns zero.
I know the regular search is working because if I change the query to be a regular full-text search, it works.
Here is an example of my data (this is actually from the example app here: http://www.youtube.com/watch?v=cE6gb5pqr1k)
Full Text Search > stores1
Document Id: sanjose
Field Name      Field Value
store_address   123 Main St.
store_location  search.GeoPoint(latitude=37.37, longitude=-121.92)
store_name      San Jose
And then my query:
from google.appengine.api import search

index = search.Index('stores1')
loc = (37.37, -121.92)
query = "distance(store_location, geopoint(37.37, -121.92)) < 4500"
loc_expr = "distance(store_location, geopoint(37.37, -121.92))"
sortexpr = search.SortExpression(
    expression=loc_expr,
    direction=search.SortExpression.ASCENDING,
    default_value=4501)
search_query = search.Query(
    query_string=query,
    options=search.QueryOptions(
        sort_options=search.SortOptions(expressions=[sortexpr])))
results = index.search(search_query)
print results
And it returns:
search.SearchResults(number_found=0L)
Am I missing something or doing something wrong? This should return at least that one result, right?
** UPDATE **
After some prying/searching/testing, I think this may be a bug in the Google App Engine development server.
If I run location searches on the same data in the production environment, I get the expected results. When I run the exact same query on the same data in the development environment, I get the unexpected 0 results.
If anybody has any insight on this, please advise. Otherwise, for those of you seeing the same problem, I created an issue on App Engine's issue tracker
here.
You've probably already figured this out, but in case someone comes across this post, the geosearch feature of AppEngine's Search API returns zero results on the dev server. From https://developers.google.com/appengine/training/fts_intro/lesson2:
"...some search queries are not fully supported on the Development Web Server (running locally), so you’ll need to run them using a deployed application."
Here's another useful link:
https://developers.google.com/appengine/docs/python/search/devserver
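One way to cope locally, as a minimal sketch (the fallback query string is hypothetical; the goal is just to degrade gracefully instead of silently returning nothing), is to detect the development server and skip the distance() filter there:

import os

# The dev server reports SERVER_SOFTWARE as something like "Development/2.0".
IS_DEV_SERVER = os.environ.get('SERVER_SOFTWARE', '').startswith('Development')

if IS_DEV_SERVER:
    query = "store_name:jose"  # hypothetical full-text fallback for local testing
else:
    query = "distance(store_location, geopoint(37.37, -121.92)) < 4500"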
I have been unable to get the new dev_appserver to work. When I roll back to the old dev_appserver, my application has trouble fetching data from the datastore because model.key().id() doesn't seem to return the correct ID.
Does anybody know what I might be doing wrong?
Example:
When looking at the datastore in _ah/admin I can see that the entities have an ID of 5764607523034234880, but calling entity_instance.key().id() returns 5188146770730811000.
In other words, calling Model.get_by_id(entity.key().id()) returns None. I believe it should return the entity.
I found that something changed between 1.7.5 and 1.7.6 that causes these long numbers to be truncated by JSON.stringify().
I fixed the issue by casting the long ID to str before placing it in the dict to stringify.
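For illustration, a minimal sketch of that fix (JavaScript numbers lose integer precision above 2^53, so 64-bit datastore IDs must travel as strings; the helper name is made up):

import json

def entity_to_json(entity_instance):
    # Cast the 64-bit ID to str so JSON.stringify/parse on the client
    # cannot mangle it.
    return json.dumps({'id': str(entity_instance.key().id())})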
1.7.6 changed the default ID allocation from sequential to scattered, which produces such big ID numbers and the problems you've encountered. There's a bug registered to fix this issue.
Meanwhile, my suggestion for local development is to manually set ID allocation back to sequential, as described here (Specifying the Automatic ID Allocation Policy).
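If memory serves, the 1.7.6-era dev server takes a flag for this; a sketch, assuming the flag name is unchanged in your SDK (check dev_appserver.py --help for the exact spelling):

dev_appserver.py --auto_id_policy=sequential myapp/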
from google.appengine.ext import db

class MyEntity(db.Model):
    timestamp = db.DateTimeProperty()
    title = db.StringProperty()
    number = db.FloatProperty()

db.GqlQuery("SELECT * FROM MyEntity "
            "WHERE title = 'mystring' "
            "AND timestamp >= date('2012-01-01') "
            "AND timestamp <= date('2012-12-31') "
            "ORDER BY timestamp DESC").fetch(1000)
This should fetch ~600 entities on App Engine. On my dev server it behaves as expected and builds the index in index.yaml; I upload the index and test on the server, but on App Engine the query returns nothing.
Index:
- kind: MyEntity
  properties:
  - name: title
  - name: timestamp
    direction: desc
I tried splitting the query up in the Datastore Viewer to see where the issue is, and the timestamp constraints work as expected. The query returns nothing on WHERE title = 'mystring', when it should return a bunch of entities.
I vaguely remember fussy filtering where you had to call .filter("prop =", propValue) with a space between the property and the operator, but this is a GqlQuery, so it's not that (and I tried that format in the GQL too).
Does anyone know what my issue is?
One thing I can think of:
I added the MyEntity entities to the app via bulkloader.py before the new index was created on my dev server and uploaded. Would that make a difference?
The last line you wrote is probably the problem: the entities in the actual datastore are missing the index entries required for the query.
As far as I know, when you add a new index, App Engine is supposed to rebuild it for you. This may take some time; you can check the state of your indexes on the admin page to see whether the index is still building.
Turns out there's a slight bug in the bulkloader supplied with the App Engine SDK: the autogenerated config imports strings as db.Text, which is no good if you want those fields indexed. The correct import_transform directive should be:
transform.none_if_empty(str)
This will instruct App Engine to index the uploaded field as a db.StringProperty().
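For context, a minimal sketch of where that directive lives in an autogenerated bulkloader.yaml (the surrounding keys follow the bulkloader config format as I recall it, so treat the exact layout as an assumption):

transformers:
- kind: MyEntity
  connector: csv
  property_map:
  - property: title
    external_name: title
    # was: transform.none_if_empty(db.Text); indexed fields need str
    import_transform: transform.none_if_empty(str)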