I was wondering if I'm unconsciously using the put() method in the last line of my code (please have a look). Thanks.
from google.appengine.ext import db

class User(db.Model):
    name = db.StringProperty()
    total_points = db.IntegerProperty()
    points_activity_1 = db.IntegerProperty(default=100)
    points_activity_2 = db.IntegerProperty(default=200)

    def calculate_total_points(self):
        self.total_points = self.points_activity_1 + self.points_activity_2

# initialize a user (this is obviously a put)
User(key_name="key1", name="person1").put()

# get user by key name
user = User.get_by_key_name("key1")

# QUESTION: is this also a put? It worked and updated my user entity's total points.
User.calculate_total_points(user)
While that method will certainly update the copy of the object that is in memory, I do not see any reason to believe that the change will be persisted to the datastore. Datastore write operations are costly, so they are not going to happen implicitly.
After running this code, use the Datastore Viewer to look at the copy of the object in the datastore. I think that you may find that it does not have the changed total_points value.
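A minimal sketch of making the write explicit, assuming the User model above; the put() after the calculation is what actually persists the new total:

user = User.get_by_key_name("key1")
user.calculate_total_points()  # updates only the in-memory copy
user.put()                     # explicitly writes total_points back to the datastore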
At some point in the future I may need to bulk-load migration data (e.g. from a CSV). Has anyone had exceptions raised doing the following? Also, is there any change in behaviour if the ndb.put_multi() function is used?
from google.appengine.ext import ndb

class X(ndb.Model):
    id = ndb.StringProperty()
    name = ndb.StringProperty()

class Y(ndb.Model):
    pass

def read_csv_row(line):
    """returns tuple"""

while True:
    id, name = read_csv_row(readline())
    if not id:
        break
    x = X(parent=ndb.Key('Y', 'static_id'))
    x.id, x.name = id, name
    x.put()
From my research, and thanks to the comments, it seems that the code above (were it made into valid code) creates problems which would eventually lead to google.appengine.api.datastore_errors.Timeout exceptions being raised.
See another question:
Datastore write limit tests - trying to break app engine, but it won't break ;)
The best suggestion I have so far is to use a Task Queue to rate-limit this. More information on:
blog.notdot.net/tag/deferred
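A sketch of that approach: the deferred library spreads the writes across separate task queue tasks, and ndb.put_multi() batches each group into a single RPC. Here rows is assumed to be the already-parsed (id, name) tuples, and chunks_of is a hypothetical helper that splits a list into fixed-size groups:

from google.appengine.ext import deferred, ndb

def put_chunk(rows):
    # runs inside a task queue task, so the queue's rate settings throttle the writes
    entities = []
    for row_id, row_name in rows:
        x = X(parent=ndb.Key('Y', 'static_id'))
        x.id, x.name = row_id, row_name
        entities.append(x)
    ndb.put_multi(entities)  # one batched RPC instead of one put per entity

for chunk in chunks_of(rows, 100):  # chunks_of is hypothetical, not a real API
    deferred.defer(put_chunk, chunk)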
This might be the dumbest question, and my apologies for it, but I am confused.
I have the following entity:
class Profile(ndb.Model):
    name = ndb.StringProperty()
    identifier = ndb.StringProperty()
    pic = ndb.BlobKeyProperty()  # stores the key to the profile picture blob
I want to delete the "pic" property value of the above entity so that it looks as fresh as if "pic" had never been assigned any value. I do not intend to delete the complete entity. Is the approach below correct:
qry = Profile.query(Profile.identifier == identifier)
result_record_list = qry.fetch()
if result_record_list:
    result_record_list[0].pic.delete()  # or result_record_list[0].pic = None? or undefined or null?
I am deleting the actual blob referred to by this blob key separately.
Assign None to it and put the entity back to the datastore:
result_record_list[0].pic = None
result_record_list[0].put()
The datastore is a schemaless object database, so you can add and remove properties from the Kind (ndb.Model) without the need for a schema update.
If you also want to clean up the entities, look at this answer from Guido.
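Putting both steps together, a minimal sketch (assuming the blobstore API, since the asker deletes the underlying blob separately):

from google.appengine.ext import blobstore

profile = Profile.query(Profile.identifier == identifier).get()
if profile and profile.pic:
    blobstore.delete(profile.pic)  # remove the underlying blob itself
    profile.pic = None             # clear the property on the entity
    profile.put()                  # persist the change to the datastore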
Dear all,
Currently I'm using the ndb API to store some statistical information. Unfortunately, this has become the major source of my cost. I'm thinking it should be much cheaper if I only save it to memcache; it doesn't matter if data is lost due to cache expiry.
After reading the manual, I assume the _use_datastore class variable can be used to configure this behaviour:
class StaticModel(ndb.Model):
    _use_datastore = False

    userid = ndb.StringProperty()
    created_at = ndb.DateTimeProperty(auto_now_add=True)
May I know if the above is the right solution?
Cheers!
I think there are three ways to achieve what you want.
The first is to set _use_datastore = False on the NDB model class as per your question.
The second would be to pass use_datastore=False whenever you put / get / delete a StaticModel. An example would be:
model = StaticModel(userid="foo")
key = model.put(use_datastore=False)
n = key.get(use_datastore=False)
The third option would be to set a datastore policy in the NDB Context which returns false for any StaticModel keys. Something like:
context.set_datastore_policy(lambda key: key.kind() != 'StaticModel')
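For the third option, a minimal sketch of wiring the policy up, assuming it runs before any StaticModel operations; ndb.get_context() returns the context for the current request:

from google.appengine.ext import ndb

ctx = ndb.get_context()
# use the datastore for every kind except StaticModel; memcache stays on by default
ctx.set_datastore_policy(lambda key: key.kind() != 'StaticModel')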
I have a latency problem in my application because the datastore does additional queries for referenced entities. I have received good advice on how to handle this for single-value properties by using the get_value_for_datastore() function. However, my application also has one-to-many relationships, as shown in the code below, and I have not found a way to prefetch these entities. The result is unacceptable latency when trying to show a table of 200 documents and their associated DocumentFiles (>6000 ms).
(There will probably never be more than 10,000 Documents or DocumentFiles.)
Is there a way to solve this?
models.py
class Document(db.Expando):
    title = db.StringProperty()
    lastEditedBy = db.ReferenceProperty(DocUser, collection_name='documentLastEditedBy')
    ...

class DocUser(db.Model):
    user = db.UserProperty()
    name = db.StringProperty()
    hasWriteAccess = db.BooleanProperty(default=False)
    isAdmin = db.BooleanProperty(default=False)
    accessGroups = db.ListProperty(db.Key)
    ...

class DocumentFile(db.Model):
    description = db.StringProperty()
    blob = blobstore.BlobReferenceProperty()
    created = db.DateTimeProperty()  # needs to be stored here in relation to upload/download of everything
    document = db.ReferenceProperty(Document, collection_name='files')

    @property
    def link(self):
        # anchor markup is illustrative; the original format string had lost its HTML
        return '<a href="/serve/%s">%s</a>' % (self.key().id(), self.blob.filename)
    ...
main.py
docUsers = DocUser.all()
docUsersNameDict = dict([(i.key(), i.name) for i in docUsers])

documents = Document.all()
for d in documents:
    out += '<td>%s</td>' % d.title
    docUserKey = Document.lastEditedBy.get_value_for_datastore(d)
    out += '<td>%s</td>' % docUsersNameDict.get(docUserKey)
    out += '<td>'
    # Creates a new query for each document, resulting in unacceptable latency
    for file in d.files:
        out += file.link + '<br>'
    out += '</td>'
Denormalize and store the link in your Document, so that getting the link will be fast.
You will need to be careful that when you update a DocumentFile, you also update the associated Document. This operates under the assumption that you read the link from the datastore far more often than you update it.
Denormalizing is often the fix for poor performance on App Engine.
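A sketch of that denormalization; the fileLinks property is an assumed name, not part of the original models:

class Document(db.Expando):
    title = db.StringProperty()
    lastEditedBy = db.ReferenceProperty(DocUser, collection_name='documentLastEditedBy')
    fileLinks = db.StringListProperty()  # assumed: denormalized copies of DocumentFile.link

def refresh_file_links(document):
    # call this whenever one of the document's DocumentFiles changes;
    # the single query happens at write time, not on every page render
    document.fileLinks = [f.link for f in document.files]
    document.put()

The render loop then reads d.fileLinks directly instead of querying d.files.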
Load your files asynchronously. Use get_value_for_datastore on d.files, which should return a collection of keys, which you can then pass to db.get_async(key) to get back a future object. You will not be able to write out your result procedurally as you have done, but it should be trivial to assemble a partial request / dictionary for all documents, with a collection of pending future gets(), and then when you do your iteration to build the results, you can finalize the futures, which will have finished without blocking (~0 ms latency).
Basically, you need two iterations. The first iteration goes through and asynchronously requests the files you need, and the second iteration goes through, finalizes your gets, and builds your response.
https://developers.google.com/appengine/docs/python/datastore/async
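A related sketch that avoids the per-document queries without the async machinery (a simple batching alternative, not the answer's approach, viable given the stated bound of under 10,000 entities): fetch every DocumentFile in one query and group them by document key, mirroring the docUsersNameDict pattern from the question:

filesByDoc = {}
for f in DocumentFile.all():
    # reads the stored key without dereferencing f.document (no extra fetch)
    docKey = DocumentFile.document.get_value_for_datastore(f)
    filesByDoc.setdefault(docKey, []).append(f)

for d in documents:
    out += '<td>'
    for f in filesByDoc.get(d.key(), []):  # no query per document
        out += f.link + '<br>'
    out += '</td>'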
I have a Python program in Google App Engine.
How can I do a direct read of an object in the datastore when I have its key as a string? Below is my code, which performs a loop instead; not good...
class Opportunity(db.Model):
    customer = db.ReferenceProperty(Customer, collection_name='opportunitys')
    BNusername = db.StringProperty()
    opportunity_no = db.StringProperty()
    # etc etc etc.....

# BnPresets holds the object key as a string
opportunitys = Opportunity.all()
opportunitys.filter('BNusername =', BnPresets.myusername)
for oprec in opportunitys:
    if str(oprec.key()) == BnPresets.recordkey:
        opportunity = oprec
        # I have the object here and can process etc etc
You can instantiate db.Key from a string by passing it directly to the constructor:
opportunity_key = db.Key(BnPresets.recordkey)
Once you have that, simply call db.get to obtain the entity identified by this key:
opportunity = db.get(opportunity_key)
I guess (judging by the query you use) that you also want to verify the username of the object you got:

if opportunity.BNusername == BnPresets.myusername:
    process_opportunity(opportunity)
That should be pretty much it. The bottom line is that you should use the key first - as it uniquely identifies your entity - rather than querying for some other property and iterating through results.
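Putting the pieces together, a minimal sketch (process_opportunity is the hypothetical handler from above):

opportunity = db.get(db.Key(BnPresets.recordkey))  # one direct read, no loop
if opportunity and opportunity.BNusername == BnPresets.myusername:
    process_opportunity(opportunity)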