Closure templates - create a reusable alias for complicated `record` definition - google-closure-templates

I have a Soy template that looks like this:
{template .fullView}
  {@param people: list<[age: int, name: string]>}
  {call .headers}
    {param people: $people /}
  {/call}
  {call .content}
    {param people: $people /}
  {/call}
{/template}

{template .headers}
  {@param people: list<[age: int, name: string]>}
  // headers
{/template}

{template .content}
  {@param people: list<[age: int, name: string]>}
  // content
{/template}
As the record definition for "people" has become more complex than just age and name, it has become tedious to update the param definition in all three places. Is it possible to instead create an alias or something similar that could be reused in each template?
{alias [age: int, name: string] as Person}

{template .headers}
  {@param people: list<Person>}
  // headers
{/template}

Why not define a proto for Person instead? The docs also seem to recommend using protos over records:
In many cases, defining a protocol buffer is superior to using records
since it is less verbose.
So you could define a message like this:
syntax = "proto3";

message Person {
  int32 age = 1;
  string name = 2;
  // more fields here
}

Related

How does one get properties from a related table as properties of its own table in Laravel 5?

The question might sound a little confusing, but I don't know how to explain it better in one sentence.
This basically describes the problem:
I have a Users table which can contain two types of users. I know how I can separate them by role. But here's the thing: users with role 1 (editor_in_chief) have different attributes than users with role 2 (reviewer).
My idea was to create tables named 'reviewer_attributes' and 'editor_in_chief_attributes' and create a one-to-one relation with the users table to hold the attributes for each type of user.
Maybe you have a better idea; that would be great as well. But for this scenario, I would like to know if it is possible to make a single call to the database and get these users' properties from the other table as properties of the User object.
When using a DB call with relations, Laravel will give me something like this:
user {
    id: 1,
    name: "Name",
    reviewer_attributes: {
        attribute_1: 'attribute_1',
        attribute_2: 'attribute_2',
        attribute_3: 'attribute_3',
    }
}
But this is what I want the resulting object to look like:
user {
    id: 1,
    name: "Name",
    attribute_1: 'attribute_1',
    attribute_2: 'attribute_2',
    attribute_3: 'attribute_3',
}
I want to achieve this by a single database call instead of setting the properties after the call.
I find this a very interesting topic; I hope you can help!
If I got your problem right, you may call something like this:
DB::table('users')
    ->join('reviewer_attributes', 'users.id', '=', 'reviewer_attributes.user_id')
    ->find($id);
You may add a select to get specific attributes of each table:
DB::table('users')
    ->join('reviewer_attributes', 'users.id', '=', 'reviewer_attributes.user_id')
    ->select('users.id', 'users.name', 'reviewer_attributes.*')
    ->find($id);
Update: You can also use collections to restructure your results returned by Eloquent:
$result = User::with('reviewerAttributes')->find($id);

$result = $result->get('reviewer_attributes')
    ->merge($result->forget('reviewer_attributes')->all())
    ->all();
Do you need to export the model to JSON?
If so, override the toArray method:
public function toArray()
{
    $arr = parent::toArray();
    $reviewer_attributes = $this->getReviewerAttributesSomeHow();
    return array_merge($arr, $reviewer_attributes);
}

Equivalent ndb attribute for _model_class attribute of db queries [duplicate]

Let's say there is an ndb.Model that looks like this:
class Foo(ndb.Model):
    bar = ndb.StringProperty()
My question is: if my only input is Foo.query(), how can I get the model class that this query belongs to?
def query_to_model(query):
    # some magic
    return model
Foo.query().kind returns the model's name as a string, but I didn't manage to find a way to get it as a class object.
The following works using eval, but only when the model is defined in the same file:
def query_to_model(query):
    return eval(query.kind)
I want something more general than that.
After you have imported the code with this model definition, the dict ndb.Model._kind_map should contain it. Here is the magic:
def query_to_model(query):
    return ndb.Model._kind_map[query.kind]
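A quick round-trip check (my own example, not part of the original answer), using the Foo model from the question:
from google.appengine.ext import ndb

class Foo(ndb.Model):
    bar = ndb.StringProperty()

# query_to_model as defined above
query = Foo.query()
assert query_to_model(query) is Foo            # resolves back to the class
assert ndb.Model._kind_map[query.kind] is Foo  # same lookup, spelled out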
I use this code to find the model class when you have the kind name:
model_module = KIND_MODULES[kind_name]
mod = __import__(model_module, globals(), locals(), [kind_name], -1)
model_class = getattr(mod, kind_name)
The KIND_MODULES dict holds the modules to import the models from:
KIND_MODULES = { 'Users' : 'models', 'Comments' : 'models', 'Cities' : 'examples.models' }
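A roughly equivalent sketch using importlib instead of the raw __import__ call (my own variant; model_for_kind is a hypothetical helper name, and the KIND_MODULES mapping is the one above):
import importlib

KIND_MODULES = {'Users': 'models', 'Comments': 'models', 'Cities': 'examples.models'}

def model_for_kind(kind_name):
    # Import the module that defines the kind, then pull the class off it.
    mod = importlib.import_module(KIND_MODULES[kind_name])
    return getattr(mod, kind_name)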

GAE: Error when downloading data, if ndb.KeyProperty(repeated=True)

I am creating the bulkloader.yaml automatically from my existing schema and have trouble downloading my data due to the repeated=True on my KeyProperty.
class User(ndb.Model):
    firstname = ndb.StringProperty()
    friends = ndb.KeyProperty(kind='User', repeated=True)
The automatically created bulkloader.yaml looks like this:
- kind: User
  connector: csv
  connector_options:
    # TODO: Add connector options here--these are specific to each connector.
  property_map:
    - property: __key__
      external_name: key
      export_transform: transform.key_id_or_name_as_string
    - property: firstname
      external_name: firstname
      # Type: String Stats: 2 properties of this type in this kind.
    - property: friends
      external_name: friends
      # Type: Key Stats: 2 properties of this type in this kind.
      import_transform: transform.create_foreign_key('User')
      export_transform: transform.key_id_or_name_as_string
This is the error message I am getting:
google.appengine.ext.bulkload.bulkloader_errors.ErrorOnTransform: Error on transform. Property: friends External Name: friends. Code: transform.key_id_or_name_as_string Details: 'list' object has no attribute 'to_path'
What can I do please?
Possible Solution:
After Tony's tip I came up with this:
- property: friends
  external_name: friends
  # Type: Key Stats: 2 properties of this type in this kind.
  import_transform: myfriends.stringToValue(';')
  export_transform: myfriends.valueToString(';')
myfriends.py
from google.appengine.ext.bulkload import transform

def valueToString(delimiter):
    def key_list_to_string(value):
        keyStringList = []
        if value == '' or value is None or value == []:
            return None
        for val in value:
            keyStringList.append(transform.key_id_or_name_as_string(val))
        return delimiter.join(keyStringList)
    return key_list_to_string
And this works! The output encoding is UTF-8 though, so make sure to open the file in LibreOffice as such or you will see garbled content.
The biggest challenge is the import. This is what I came up with, without any luck:
def stringToValue(delimiter):
    def string_to_key_list(value):
        keyvalueList = []
        if value == '' or value is None or value == []:
            return None
        for val in value.split(';'):
            keyvalueList.append(transform.create_foreign_key('User'))
        return keyvalueList
    return string_to_key_list
I get the error message:
BadValueError: Unsupported type for property friends: <type 'function'>
According to the Datastore viewer, I need to create something like this:
[datastore_types.Key.from_path(u'User', u'kave#gmail.com', _app=u's~myapp1')]
Update 2:
Tony, you appear to be a real expert on the Bulkloader. Thanks for your help. Your solution worked!
I have moved my other question to a new thread.
But one crucial problem remains: when I create new users, I can see the friends field shown as <missing>, and it works fine.
Now when I use your solution to upload the data, users without any friend entries get a <null> entry. Unfortunately this seems to break the model, since friends can't be null.
Changing the model to reflect this seems to be ignored:
friends = ndb.KeyProperty(kind='User', repeated=True, required=False)
How can I fix this please?
Update:
Digging further into it: when the status <missing> is shown in the data viewer, in code it shows up as friends = [].
However, when I upload the data via CSV I get a <null>, which translates to friends = [None]. I know this because I exported the data into my local datastore and could follow it in code. Strangely enough, if I empty the list with del user.friends[:], it works as expected. There must be a better way to set it while uploading via CSV, though...
Final Solution
This turns out to be a bug that hasn't been resolved for over a year.
In a nutshell, even though there is no value in the CSV, because a list is expected, GAE creates a list with a None inside. This is game-breaking, since retrieving such a model ends up in an instant crash.
The fix is adding a post_import_function which deletes the lists with a None inside.
In my case:
def post_import(input_dict, instance, bulkload_state_copy):
    if instance["friends"] is None:
        del instance["friends"]
    return instance
Finally everything works as expected.
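If the loader ever hands the value over as a one-element [None] list rather than None itself (as described in the update above), a slightly more defensive variant of the same idea might be (my sketch, not tested against the bulkloader):
def post_import(input_dict, instance, bulkload_state_copy):
    # Drop 'friends' whether the empty CSV cell arrived as None or as [None].
    friends = instance.get("friends")
    if friends is None or friends == [None]:
        if "friends" in instance:
            del instance["friends"]
    return instance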
When you are using repeated properties and exporting to a CSV, you should do some formatting to concatenate the list into a format CSV understands. Please check the example here on importing/exporting a list of dates; I hope it can help you.
EDIT: Adding the suggestion for the import transform from an earlier comment to this answer.
For import, please try something like:
from google.appengine.api import datastore

def stringToValue(delimiter):
    def string_to_key_list(value):
        keyvalueList = []
        if value == '' or value is None or value == []:
            return None
        for val in value.split(';'):
            keyvalueList.append(datastore.Key.from_path('User', val))
        return keyvalueList
    return string_to_key_list
If you have an id instead of a name, add something like val = int(val).
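For instance, that id-based variant might look like this (my sketch of the adjustment, still assuming the 'User' kind and the ';' delimiter passed in from bulkloader.yaml):
from google.appengine.api import datastore

def stringToValue(delimiter):
    def string_to_key_list(value):
        # Build keys from numeric ids instead of key names.
        if value == '' or value is None or value == []:
            return None
        keyvalueList = []
        for val in value.split(delimiter):
            keyvalueList.append(datastore.Key.from_path('User', int(val)))
        return keyvalueList
    return string_to_key_list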

How do I rename a mongo collection in Mongoid?

I have a collection called artists; I'd like to rename it to artist_lookups. How do I do this?
With mongoid5 / mongo ruby driver 2:
# if you need to check whether foo exists
return unless Mongoid.default_client.collections.map(&:name).include?('foo')

# rename to bar
Mongoid.default_client.use(:admin).command(
  renameCollection: "#{Mongoid.default_client.database.name}.foo",
  to: "#{Mongoid.default_client.database.name}.bar"
)
Very simple; in the mongo shell, do this:
db.artists.renameCollection( "artist_lookups" );
If you want to drop artist_lookups if it already exists:
db.artists.renameCollection( "artist_lookups", true );
Some exceptions you can get:
10026 – Raised if the source namespace does not exist.
10027 – Raised if the target namespace exists and dropTarget is either false or unspecified.
15967 – Raised if the target namespace is an invalid collection name.
From the Mongoid Docs:
class Band
  include Mongoid::Document
  store_in collection: "artists", database: "music", session: "secondary"
end
Use store_in collection: "artist_lookups" in your model. This will let you store your Artist model in the artist_lookups collection.
If you want to preserve the existing data in the artists collection, and rename it, I suggest shutting down your app temporarily, renaming the collection to artist_lookups on your MongoDB server, and then restarting the app.
db.artists.renameCollection("artist_lookups")
will work for sure.

Case insensitive Charfield in django models

I am trying to achieve a Category model where name has unique=True,
but in practice I can still add the same category name with different cases.
I.e. I have a category called Food,
and I am still able to add food, FOOD, fOod, FOOd.
Is there any philosophy behind this, or is it a work in progress?
Because in the real world, if I think of the category Food, it will always be food, no matter which case is used to write it.
Thank you in advance for looking at this.
To answer my own question:
I have found I can have a clean method on my model. So I added:
class Category(models.Model):
    name = models.CharField(max_length=200, unique=True)

    def clean(self):
        self.name = self.name.capitalize()
It capitalises the first letter, so names that differ only in case end up identical; validate_unique (run as part of full_clean) then raises the uniqueness error.
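Note that clean() only runs when full_clean() is called (a ModelForm does this automatically; a plain save() does not), so a minimal sketch of triggering the check by hand would be:
from django.core.exceptions import ValidationError

category = Category(name="fOod")
try:
    category.full_clean()  # runs clean_fields(), clean(), then validate_unique()
    category.save()
except ValidationError as e:
    print(e.message_dict)  # reports the duplicate 'name' if "Food" already exists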
You can use the PostgreSQL-specific model fields called citext fields (case-insensitive fields).
There are three options at the moment:
CICharField(**options), CIEmailField(**options), and CITextField(**options)
Example:
from django.db import models
from django.contrib.postgres.fields import CICharField

class Category(models.Model):
    name = CICharField(verbose_name="Name", max_length=255)
But don't forget to create an extension for the citext fields.
See here.
Basically, you have to add the extension operation in the migration file, inside the operations list, before the first CreateModel operation.
# migration file
from django.contrib.postgres.operations import CITextExtension
from django.db import migrations

operations = [
    CITextExtension(),  # <------ here
    migrations.CreateModel(
        ...
    ),
    ...,
]
Setting the column to a case-insensitive collation should fix this. You may need to do it at the SQL level.
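For example, on MySQL a raw-SQL migration along these lines could switch the column to a case-insensitive collation (the table name catalog_category, app label, and migration dependency here are assumptions, not from the question):
from django.db import migrations

class Migration(migrations.Migration):
    dependencies = [("catalog", "0001_initial")]

    operations = [
        # A case-insensitive collation makes the existing unique index treat
        # 'Food' and 'food' as duplicates.
        migrations.RunSQL(
            "ALTER TABLE catalog_category "
            "MODIFY name VARCHAR(200) "
            "CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci;"
        ),
    ]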
