GAE-NDB: how to prevent a projection from changing the results - google-app-engine

I used an NDB projection query, but it changed the results. How can I keep the results from being affected by the projection?
class T(ndb.Model):
    name = ndb.StringProperty()
    name2 = ndb.StringProperty(repeated=True)

    @classmethod
    def test(cls):
        for i in range(0, 10):
            t = T(name=str(i))
            if i % 2 == 0:
                t.name2 = ["zzz"]
            t.put()
        qr = T.query()
        qo = ndb.QueryOptions(projection=['name', 'name2'])
        items, cursor, more = qr.fetch_page(20, options=qo)
        print len(items)
        qo = ndb.QueryOptions(projection=['name'])
        items, cursor, more = qr.fetch_page(20, options=qo)
        print len(items)
The result is 5, 10. How can I make it 10, 10?
Thanks

An empty repeated property (repeated=True) doesn't get indexed, and since it is the index that projection queries use to return results, entities with no value for a projected property won't be returned.
Your test case is also susceptible to the eventual consistency that Tim's comment mentions, but that isn't the only issue.
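One workaround, if you can change how the data is stored, is to make sure the repeated property always holds at least one value so that it is always indexed. A minimal sketch of that idea, using a made-up EMPTY_MARKER sentinel (not part of the original code) that application code filters out again when reading projected results:

from google.appengine.ext import ndb

EMPTY_MARKER = '__none__'  # hypothetical sentinel value

class T(ndb.Model):
    name = ndb.StringProperty()
    name2 = ndb.StringProperty(repeated=True)

    def _pre_put_hook(self):
        # Guarantee name2 always has at least one indexed value,
        # so a projection on name2 can return every entity.
        if not self.name2:
            self.name2 = [EMPTY_MARKER]

With that in place, both projections should return all 10 entities; you just have to treat EMPTY_MARKER as "no value" when reading the projected results.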

Related

Django: executing UPDATE query always returns rowcount 0

I'm new to programming and I'm not sure whether the problem is in my code or in Django's. I call the link method from my view and update the MatchId field on the Record model. The database is SQL Server 2017.
My view:
class RecordViewSet(viewsets.ModelViewSet):
    """
    API for everything that has to do with Records.
    Additionally we provide an extra `link` action.
    """
    queryset = Record.objects.all().order_by("Id")
    serializer_class = RecordSerializer
    permission_classes = [permissions.IsAuthenticated]

    @action(methods=["post"], detail=False)
    def link(self, request, *args, **kwargs):
        idToMatch = request.POST.getlist("Id")
        recordsToMatch = Record.objects.filter(Id__in=idToMatch)
        lastMatchId = Record.objects.latest("MatchId").MatchId
        matchedSuccesfully = recordsToMatch.update(MatchId=lastMatchId + 1)
        if matchedSuccesfully > 1:
            return Response(data=matchedSuccesfully, status=status.HTTP_200_OK)
        else:
            return Response(data=matchedSuccesfully, status=status.HTTP_404_NOT_FOUND)
For some reason matchedSuccesfully is always zero. The relevant Django code:
def execute_sql(self, result_type):
    """
    Execute the specified update. Return the number of rows affected by
    the primary update query. The "primary update query" is the first
    non-empty query that is executed. Row counts for any subsequent,
    related queries are not available.
    """
    cursor = super().execute_sql(result_type)
    try:
        rows = cursor.rowcount if cursor else 0
        is_empty = cursor is None
    finally:
        if cursor:
            cursor.close()
    for query in self.query.get_related_updates():
        aux_rows = query.get_compiler(self.using).execute_sql(result_type)
        if is_empty and aux_rows:
            rows = aux_rows
            is_empty = False
    return rows
I rewrote execute_sql as follows:
def execute_sql(self, result_type):
    """
    Execute the specified update. Return the number of rows affected by
    the primary update query. The "primary update query" is the first
    non-empty query that is executed. Row counts for any subsequent,
    related queries are not available.
    """
    cursor = super().execute_sql(result_type)
    try:
        if cursor:
            cursor.execute("select @@rowcount")
            rows = cursor.fetchall()[0][0]
        else:
            rows = 0
        is_empty = cursor is None
    finally:
        if cursor:
            cursor.close()
    for query in self.query.get_related_updates():
        aux_rows = query.get_compiler(self.using).execute_sql(result_type)
        if is_empty and aux_rows:
            rows = aux_rows
            is_empty = False
    return rows
and now it works, but I'm unsure if there is a more elegant way to resolve this since now I have to ship this exact code everywhere. Source code at:
https://github.com/django/django/blob/main/django/db/models/sql/compiler.py
I've faced the same issue and ended up at the same point in Django's internals.
In my case the problem was a trigger configured for UPDATE: it should have returned @@ROWCOUNT as its result, but it didn't.
Because I wasn't allowed to edit the triggers, I instead overrode the save method in an abstract base model so that updates run with force_update=True:
from django.db import models, DatabaseError

class BaseModel(models.Model):
    def save(self, force_insert=False, force_update=False, using=None, update_fields=None):
        if self._state.adding:
            super().save(force_insert=force_insert, force_update=force_update, using=using, update_fields=update_fields)
        else:
            try:
                super().save(force_insert=force_insert, force_update=True, using=using, update_fields=update_fields)
            except DatabaseError as e:
                if str(e) == 'Forced update did not affect any rows.':
                    pass
                else:
                    raise e

    class Meta:
        managed = False
        abstract = True
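For completeness, a hypothetical concrete model using that base class could look like the following (the field names just mirror the Record model from the question; adjust to your schema):

class Record(BaseModel):
    Id = models.AutoField(primary_key=True)
    MatchId = models.IntegerField(null=True)

    class Meta(BaseModel.Meta):
        db_table = 'Record'

Because BaseModel is abstract, Record picks up the forced-update save behaviour and the managed = False setting without further changes.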

Golang RavenDB giving wrong query result when using OrderBy

When making a query with OrderBy, it always returns the wrong result: the first document is repeated multiple times in the returned list.
q := session.Advanced().DocumentQueryAllOld(reflect.TypeOf(&models.List{}), "", "user", false)
//q = q.WhereNotEquals("Doc.hh", "Tarzan")
q = q.OrderBy("Docnn")
q = q.Statistics(&statsRef)
//q = q.Skip(0)
//q = q.Take(6)
q.ToList(&queryResult)
If there is no index beforehand it returns the right result, but if an index auto-created for a different OrderBy field already exists, it returns the wrong result.

Google App Engine - query vs. filter clarification

My model:
class User(ndb.Model):
    name = ndb.StringProperty()
Is there any difference in terms of efficiency/cost/speed between the following two queries?
u = User.query(User.name==name).get()
u = User.query().filter(User.name==name).get()
Should I use one of them over the other? I assume the second one is worse because it first gets the entire User queryset and then applies the filter?
There is no difference in functionality between the two, so you can choose whichever you like best. The Google documentation shows these two examples:
query = Account.query(Account.userid >= 40, Account.userid < 50)
and
query1 = Account.query() # Retrieve all Account entities
query2 = query1.filter(Account.userid >= 40) # Filter on userid >= 40
query3 = query2.filter(Account.userid < 50) # Filter on userid < 50 too
and states:
query3 is equivalent to the query variable from the previous example.
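It may also help to note that neither form does any work by itself: query() and filter() only build Query objects, and no datastore call happens until you invoke get(), fetch(), or iterate. So the second form does not "get the entire User queryset" first. A small sketch reusing the User model from the question (the name value is made up):

name = 'alice'  # example value

q1 = User.query(User.name == name)
q2 = User.query().filter(User.name == name)

# Both are lazy Query objects carrying the same single equality filter;
# the datastore is only hit here, when get() runs.
u1 = q1.get()
u2 = q2.get()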

In GQL (App Engine), how do I check if an entity exists matching 2 filters before inserting a new one?

The two filtered fields would actually be a unique index in SQL, so I want to see if an entity exists based on these two fields before inserting a new one.
currently i have:
t2get = db.GqlQuery("SELECT __key__ FROM Talk2 WHERE ccc = :1 AND ms = :2", c, theDay)
for x in t2get:
    theKey = x[0]
if theKey:
    t2 = Talk2.get(theKey)
else:
    t2 = Talk2()
which errors with:
UnboundLocalError: local variable 'theKey' referenced before assignment
if the entity doesn't exist.
Any ideas?
If the two fields really form a unique index, maybe you should use them as the key_name instead. It will be faster, and you can use a transaction if needed.
def txn():
    key_name = "%d.%d." % (c, theDay)
    t2 = Talk2.get_by_key_name(key_name)
    if not t2:
        t2 = Talk2(key_name=key_name)
    t2.put()

db.run_in_transaction(txn)
D'uh, I figured it out. After hours of trawling the web, 10 more minutes found me the answer:
t2 = Talk2.all().filter('ccc =', c).filter('ms =', theDay).get()
returns the first entity (if any) ready to be edited.
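Putting that together, a minimal get-or-create sketch along those lines (assuming ccc and ms are ordinary properties on Talk2; note this version is not transactional, so the key_name approach above is still safer against races):

t2 = Talk2.all().filter('ccc =', c).filter('ms =', theDay).get()
if t2 is None:
    # no existing entity matches both fields, so create a new one
    t2 = Talk2(ccc=c, ms=theDay)
# ... modify t2 as needed ...
t2.put()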

Between query equivalent on App Engine datastore?

I have a model containing ranges of IP addresses, similar to this:
class Country(db.Model):
    begin_ipnum = db.IntegerProperty()
    end_ipnum = db.IntegerProperty()
On a SQL database, I would be able to find rows which contained an IP in a certain range like this:
SELECT * FROM Country WHERE ipnum BETWEEN begin_ipnum AND end_ipnum
or this:
SELECT * FROM Country WHERE begin_ipnum < ipnum AND end_ipnum > ipnum
Sadly, GQL only allows inequality filters on one property, and doesn't support the BETWEEN syntax. How can I work around this and construct a query equivalent to these on App Engine?
Also, can a ListProperty be 'live' or does it have to be computed when the record is created?
Question updated with a first stab at a solution:
So based on David's answer below and articles such as these:
http://appengine-cookbook.appspot.com/recipe/custom-model-properties-are-cute/
I'm trying to add a custom field to my model like so:
class IpRangeProperty(db.Property):
    def __init__(self, begin=None, end=None, **kwargs):
        if not isinstance(begin, db.IntegerProperty) or not isinstance(end, db.IntegerProperty):
            raise TypeError('Begin and End must be Integers.')
        self.begin = begin
        self.end = end
        super(IpRangeProperty, self).__init__(self.begin, self.end, **kwargs)

    def get_value_for_datastore(self, model_instance):
        begin = self.begin.get_value_for_datastore(model_instance)
        end = self.end.get_value_for_datastore(model_instance)
        if begin is not None and end is not None:
            return range(begin, end)

class Country(db.Model):
    begin_ipnum = db.IntegerProperty()
    end_ipnum = db.IntegerProperty()
    ip_range = IpRangeProperty(begin=begin_ipnum, end=end_ipnum)
The thinking is that after I add the custom property I can just import my dataset as-is and then run queries based on the ListProperty like so:
q = Country.gql('WHERE ip_range = :1', my_num_ipaddress)
When I try to insert new Country objects this fails, though, complaining about not being able to create the name:
...
File "/Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/google/appengine/ext/db/__init__.py", line 619, in _attr_name
return '_' + self.name
TypeError: cannot concatenate 'str' and 'IntegerProperty' objects
I tried defining an attr_name method for the new property, and just setting self.name, but that does not seem to help. Am I hopelessly stuck, or heading in the right direction?
Short answer: Between queries aren't really supported at the moment. However, if you know a priori that your range is going to be relatively small, then you can fake it: just store a list on the entity with every number in the range. Then you can use a simple equality filter to get entities whose ranges contain a particular value. Obviously this won't work if your range is large. But here's how it would work:
class M(db.Model):
    r = db.ListProperty(int)

# create an instance of M which has a range from `begin` to `end` (inclusive)
M(r=range(begin, end+1)).put()

# query to find instances of M which contain a value `v`
q = M.gql('WHERE r = :1', v)
The better solution, eventually (for now the following only works on the development server due to a bug, see issue 798): in theory, you can work around the limitations you mentioned and perform a range query by taking advantage of how db.ListProperty is queried. The idea is to store both the start and end of your range in a list (in your case, integers representing IP addresses). Then, to get entities whose ranges contain some value v (i.e., between the two values in the list), you simply perform a query with two inequality filters on the list: one to ensure that v is at least as big as the smallest element in the list, and one to ensure that v is at least as small as the biggest element in the list.
Here's a simple example of how to implement this technique:
class M(db.Model):
    r = db.ListProperty(int)

# create an instance of M which has a range from `begin` to `end` (inclusive)
M(r=[begin, end]).put()

# query to find instances of M which contain a value `v`
q = M.gql('WHERE r >= :1 AND r <= :1', v)
My solution doesn't follow the pattern you have requested, but I think it would work well on App Engine. I'm using a list of strings of CIDR ranges to define the IP blocks instead of specific begin and end numbers.
from google.appengine.ext import db

class Country(db.Model):
    subnets = db.StringListProperty()
    country_code = db.StringProperty()

c = Country()
c.subnets = ['1.2.3.0/24', '1.2.0.0/16', '1.3.4.0/24']
c.country_code = 'US'
c.put()

c = Country()
c.subnets = ['2.2.3.0/24', '2.2.0.0/16', '2.3.4.0/24']
c.country_code = 'CA'
c.put()

# Search for 1.2.4.5 starting with most specific block and then expanding until found
result = Country.all().filter('subnets =', '1.2.4.5/32').fetch(1)
result = Country.all().filter('subnets =', '1.2.4.4/31').fetch(1)
result = Country.all().filter('subnets =', '1.2.4.4/30').fetch(1)
result = Country.all().filter('subnets =', '1.2.4.0/29').fetch(1)
# ... repeat until found
# optimize by starting with the largest routing prefix actually found in your data (probably not 32)
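The "repeat until found" loop can be generated instead of written out by hand. A minimal sketch of that idea (candidate_blocks and lookup_country are made-up helper names, not part of the answer above):

import socket
import struct

def candidate_blocks(ip, min_prefix=8):
    """Yield CIDR blocks containing `ip`, from most to least specific."""
    ipnum = struct.unpack('!I', socket.inet_aton(ip))[0]
    for prefix in range(32, min_prefix - 1, -1):
        mask = (0xFFFFFFFF << (32 - prefix)) & 0xFFFFFFFF
        network = socket.inet_ntoa(struct.pack('!I', ipnum & mask))
        yield '%s/%d' % (network, prefix)

def lookup_country(ip):
    # probe progressively broader blocks until one matches a stored subnet
    for block in candidate_blocks(ip):
        match = Country.all().filter('subnets =', block).fetch(1)
        if match:
            return match[0]
    return None

lookup_country('1.2.4.5') would try 1.2.4.5/32, 1.2.4.4/31, 1.2.4.4/30, and so on, returning the first Country whose subnets list contains a matching block.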
