What is dbReferenceProperty? - google-app-engine

In the Python App Engine docs, I see something called db.ReferenceProperty. I can't understand what it is or how it's used. I'm using the Java interface to App Engine, so I'm not sure if there's an equivalent.
I'm interested in it because it sounds like some sort of pseudo-join, where we can point a property of one class at some other object's value - something like if we had:
class User {
    private String mPhotoUrl;
    private String mPhone;
    private String mState;
    private String mCountry;
    // .. etc ..
}
class UserLite {
    @ReferenceProperty User.mPhotoUrl;
    private String mPhotoUrl;
}
then if we had to update a User object's mPhotoUrl value, the change would somehow propagate out to all UserLite instances referencing it, rather than our having to update every UserLite instance manually.
Thanks

A db.ReferenceProperty simply holds the key of another datastore entity, which is automatically fetched from the datastore when the property is used.
There's some additional magic: an entity referenced by entities of type Foo also gets access to a query for those referencing entities, through the special attribute foo_set.
The Java datastore API instead has owned relationships, which serve the same purpose.
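In Java, independent of owned relationships, a rough analogue of what db.ReferenceProperty stores is simply the referenced entity's Key, fetched on demand. A minimal low-level-API sketch (the UserLite class and field names here are illustrative, not part of any App Engine API):
import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.EntityNotFoundException;
import com.google.appengine.api.datastore.Key;

public class UserLite {
    // Only the User's key is stored; the User itself is read fresh on each access,
    // so an updated mPhotoUrl is seen without touching any UserLite instances.
    private Key userKey;

    public String fetchPhotoUrl() throws EntityNotFoundException {
        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        Entity user = ds.get(userKey);
        return (String) user.getProperty("mPhotoUrl");
    }
}
Because nothing is copied into UserLite, the "propagation" you describe happens automatically, at the cost of an extra datastore read per access.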

Related

ApiTransformer for parametrized, unavailable type

I'm using Objectify and wish to have its Key<> type passed around in my API. I've created an ApiTransformer, but my question is where to declare it: since the serialized Key<> class is not available to me, I cannot declare its transformer as a class annotation. I tried declaring it in the @Api annotation, but it doesn't work; I still get the error:
There was a problem generating the API metadata for your Cloud Endpoints classes: java.lang.IllegalArgumentException: Parameterized type com.googlecode.objectify.Key<[my package].User> not supported.
The ApiTransformer looks like:
public class KeyTransformer implements Transformer<Key<?>, String> {
    public String transformTo(Key<?> in) {
        return in.getString();
    }
    public Key<?> transformFrom(String in) {
        return Key.valueOf(in);
    }
}
And in my @Api I have:
@Api(name = "users", version = "v1", transformers = {KeyTransformer.class})
Unfortunately you can't. As you said, it needs to be declared on the Key class, so your only chances to make this work are either:
1) Recompile the Key class for Objectify with the @ApiTransformer annotation.
2) Extend the Key class with your own implementation and define the transformer there.
I don't really like either of those options, so the way I usually resolve this is to hide the key object's getter (using @ApiResourceProperty(ignored=AnnotationBoolean.TRUE)) and only expose the id from that key.
That way you get an Endpoints-friendly object; the only downside is you'll have to reconstitute the key using Key.create(YourClass.class, longId) manually whenever you need it.
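A minimal sketch of that workaround (assuming Objectify 4-style annotations; the User class and its fields here are just examples):
import com.google.api.server.spi.config.AnnotationBoolean;
import com.google.api.server.spi.config.ApiResourceProperty;
import com.googlecode.objectify.Key;
import com.googlecode.objectify.annotation.Entity;
import com.googlecode.objectify.annotation.Id;

@Entity
public class User {
    @Id
    private Long id;

    // Hidden from the Endpoints API; clients only ever see the numeric id.
    @ApiResourceProperty(ignored = AnnotationBoolean.TRUE)
    public Key<User> getKey() {
        return Key.create(User.class, id);
    }

    public Long getId() {
        return id;
    }
}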
You can add transformers for third-party classes by listing the transformer in the @Api annotation. I'm not dead sure it'll work for a parameterized class, but I don't see why not.
https://cloud.google.com/appengine/docs/java/endpoints/javadoc/com/google/api/server/spi/config/Api#transformers()

JDO on GAE - @Unowned fields returned as nulls

Let's say I have a very easy, classic setup: GAE(1.7.4) + GWT(2.5.0) Application, running on local Jetty (Development Server), using JDO for persistence.
Let's also say I have just 2 #PersistenceCapable classes: Person and Color. Every Person has exactly one favourite Color, but it does not mean that this Person owns this Color - many different Persons can have the same favourite Color. There is a limited number of well-known Colors and a Color may exist even if it is not anyone's favourite.
To model this I should use an @Unowned relationship - please correct me if I am wrong:
@PersistenceCapable
public class Color { // just the most regular Entity class
    @PrimaryKey
    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
    private Key key;
    @Persistent
    String rgb;
    // getter, setter, no constructor
}
@PersistenceCapable
public class Person {
    @PrimaryKey
    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
    private Key key;
    @Persistent
    String surname;
    @Persistent
    @Unowned // here is the tricky part
    Color color;
    // getters, setters, no constructor
}
With some simple, well-known, PersistenceManager-based code, I am able to successfully create and persist an instance of the Color class. I see it in GAE Development Console -> Datastore Viewer, with a nicely generated Key, an ID/Name of (13), and my assigned RGB value.
With very similar code, I am able to create an instance of the Person class (in another request), assign a pre-existing Color as his favourite color (it pre-existed; I obtained it by pm.getObjectById()) and persist it. I see it in the Datastore Viewer, with a nicely generated Key, an ID/Name of (15), my assigned surname, and a color_key_OID of (13). This looks very promising.
But then, when I fetch Person(15) back from the DB (a simple pm.getObjectById(), no transactions), it has my assigned surname correctly, but null instead of Color(13)! So the Datastore Viewer gets it right, but my code does not.
Oh, the problematic code? "Person p = pm.getObjectById(Person.class, key);".
(Side notes: I am also having the same problem with @Unowned collections - a nice list of values in the Datastore Viewer, but a null collection field in my code. My JDO jars on the classpath are "datanucleus-api-jdo-3.1.1.jar" and "jdo-api-3.0.1.jar", so I assume they support @Unowned. There is no problem with non-@Unowned fields. I get no exceptions upon persisting or fetching, just plain nulls as field values.)
Either mark the color to be "eagerly fetched":
@Persistent(defaultFetchGroup="true")
@Unowned
Color color;
or define your own fetch group like this:
@FetchGroup(name="eager", members={@Persistent(name="color")})
@PersistenceCapable
public class Person {
and use it if required by specifying the group to be fetched:
PersistenceManager pm = pmf.getPersistenceManager();
pm.getFetchPlan().addGroup("eager");
I was facing the same issue with one of my @Unowned lists; I had two others whose contents were fetched perfectly.
What solved the issue for me was changing the property's name to a longer one - in your case, that would mean renaming the property from "color" to something longer, such as "myfavoritecolor".
I had the same issue you describe. As DataNucleus says, you need to keep the whole lifecycle of the objects in mind. In my case the problem was solved by forcing a read of the color from the person object before closing the PersistenceManager with close().
Remember that JDO uses lazy loading to fetch referenced objects.
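A minimal sketch of that workaround, using the Person/Color classes from the question (getColor() and getRgb() are the getters the question alludes to, and PMF is the usual PersistenceManagerFactory singleton):
PersistenceManager pm = PMF.get().getPersistenceManager();
try {
    Person p = pm.getObjectById(Person.class, key);
    // Touch the unowned field while the PersistenceManager is still open,
    // so the lazily-loaded Color is actually resolved before close().
    Color favourite = p.getColor();
    String rgb = favourite.getRgb();
} finally {
    pm.close();
}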
I was able to solve this problem by adding the fetch group to the query rather than to the PersistenceManager.
PersistenceManager pm = PMF.get().getPersistenceManager();
logger.info("EVENTS FETCH GROUPS : " + pm.getFetchPlan().getGroups());
/*pm.getFetchPlan().addGroup("eventFetchGroup");
pm.getFetchPlan().setMaxFetchDepth(2);*/
Query q = pm.newQuery(Event.class);
q.getFetchPlan().addGroup("eventFetchGroup");
logger.info("EVENTS FETCH GROUPS : " +q.getFetchPlan().getGroups());
q.setFilter("date >= fromDate && date <= toDate");
q.declareParameters("java.util.Date fromDate, java.util.Date toDate");

Add a new attribute to entity in datastore?

I have an entity in my App Engine datastore. There's actually only one instance of this entity. I can see it in my admin console. Is it possible to add a new attribute to the entity via the admin console (using GQL, perhaps)?
Right now it looks something like:
Entity: Foo
Attributes: mName, mAge, mScore
and I'd like to add a new boolean attribute to this entity like "mGraduated" or something like that.
In the worst case I can write some code to delete the entity and then save a new one, but I was just wondering.
Thanks
-------- Update ---------
Tried adding the new attribute to my class (using Java), and upon loading from the datastore I get the following:
java.lang.NullPointerException:
Datastore entity with kind Foo and key Foo(\"Foo\") has a null property named mGraduated.
This property is mapped to com.me.types.Foo.mGraduated, which cannot accept null values.
This is what my entity class looks like, I just added the new attribute (mGraduated), then deployed, then tried loading the single entity from the datastore (which produced the above exception):
@PersistenceCapable
public class Foo
{
    @PrimaryKey
    private String k;
    /** Some old attributes look like the following. */
    @Persistent
    @Extension(vendorName = "datanucleus", key = "gae.unindexed", value="true")
    private String mName;
    ...
    /** Tried adding the new one. */
    @Persistent
    @Extension(vendorName = "datanucleus", key = "gae.unindexed", value="true")
    private boolean mGraduated;
The only way to implement this is to use Boolean as the type of the new property.
The setter can still accept a boolean value; that's no issue.
If you want the getter to also return boolean, you can, but be sure to check whether the stored value is null and, if so, return a default value (e.g. true).
so
private Boolean newProp = null; // can also assign a default value, e.g. true
public void setNewProp(boolean val)
{
    this.newProp = val;
}
public boolean getNewProp()
{
    if (this.newProp == null)
        return true; // default value if not set
    return this.newProp.booleanValue();
}
I recommend you not migrate your data in this case - it can be very costly and can deplete your quota easily (read old data, create new, delete old = 3 operations for every entry in your datastore).
You can't do this through the admin console, but you shouldn't have to delete the entity. Instead, just update it - the Datastore does not enforce schemas for kinds.
E.g., if Foo is a subclass of db.Model (Python), change your model subclass to include the new property; fetch the model instance (e.g., by its key); update the instance, including setting the value of the new field; and save the modified instance. Since you have just one instance, this is easy. With many such instances to update, you'd probably want to do this via task queue tasks or a MapReduce job.
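In the asker's Java/JDO setup the same idea looks roughly like this (a sketch; setGraduated() is a hypothetical setter for the new field, and PMF is the usual PersistenceManagerFactory singleton):
PersistenceManager pm = PMF.get().getPersistenceManager();
try {
    // Load the single existing entity by its key, set the new field,
    // and make the updated object persistent so the entity is rewritten.
    Foo foo = pm.getObjectById(Foo.class, key);
    foo.setGraduated(true);
    pm.makePersistent(foo);
} finally {
    pm.close();
}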
You have declared the new mGraduated field using the primitive type boolean, which cannot be null. The existing entity can't be loaded into the model class because it doesn't have this property. One option is to declare this property using the Boolean class, which can accept a null value.
The Admin Console only knows about properties in existing entities. You cannot use the Admin Console directly to create a new property with a name not used by any existing entities. (This is just a limitation of the Console. App code can do this easily.)

How to know what class is being deserialized in a Jackson Deserializer?

I'm using the App Engine datastore, so I have an entity like this:
@PersistenceCapable
public class Author {
    @PrimaryKey
    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
    @JsonProperty("id")
    @JsonSerialize(using = JsonKeySerializer.class)
    @JsonDeserialize(using = JsonKeyDeserializer.class)
    private Key key;
    ....
}
When the model is sent to the view, it will serialize the Key object as an Id value. Then, if I send data back from the view, I want to deserialize the Id back into a Key object by using the JsonKeyDeserializer class.
public class JsonKeyDeserializer extends JsonDeserializer<Key> {
    @Override
    public Key deserialize(JsonParser jsonParser, DeserializationContext deserializeContext)
            throws IOException, JsonProcessingException {
        String id = jsonParser.getText();
        if (id.isEmpty()) {
            return null;
        }
        // Here is the problem: I have several entities, so I can't hard-code the Author class in this deserializer like this.
        // I want to know what class is being deserialized at runtime.
        // return KeyFactory.createKey(Author.class.getSimpleName(), Integer.parseInt(id))
    }
}
I tried to debug the values in deserialize's parameters, but I can't find a way to get the target class being deserialized. How can I solve this?
You may have misunderstood the role of KeySerializer/KeyDeserializer: they are used for Java Map keys, not as generic identifiers in the database sense of the term "key".
So you probably would need to use regular JsonSerializer/JsonDeserializer instead.
As to the type: it is assumed that handlers are constructed for specific types, and no extra type information is passed during the serialization or deserialization process; the expected type (if handlers are used for different types) must be passed during construction.
When registering general serializers or deserializers, you can do this when implementing a Module, as one of the arguments is the type for which the (de)serializer is requested.
When defining handlers directly for properties (as when using annotations), this information is available via the createContextual() callback of the ContextualSerializer (and -Deserializer) interface, if your handler implements it: a BeanProperty is passed to identify the property (in this case the annotated field), and you can access its type. This information needs to be stored so it can be used during (de)serialization.
EDIT: as the author pointed out, I actually misread the question: KeySerializer here is the class name, not the annotation.
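A minimal sketch of the contextual approach described above, assuming Jackson 2.x (the datastore kind is derived from the declaring class of the annotated field, e.g. Author; adapt as needed):
import java.io.IOException;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.BeanProperty;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.JsonMappingException;
import com.fasterxml.jackson.databind.deser.ContextualDeserializer;
import com.google.appengine.api.datastore.Key;
import com.google.appengine.api.datastore.KeyFactory;

public class JsonKeyDeserializer extends JsonDeserializer<Key>
        implements ContextualDeserializer {

    private final String kind;

    public JsonKeyDeserializer() {
        this(null);
    }

    private JsonKeyDeserializer(String kind) {
        this.kind = kind;
    }

    @Override
    public JsonDeserializer<?> createContextual(DeserializationContext ctxt, BeanProperty property)
            throws JsonMappingException {
        // The property being deserialized is known here; use its declaring
        // class (e.g. Author) as the datastore kind.
        String kindName = property.getMember().getDeclaringClass().getSimpleName();
        return new JsonKeyDeserializer(kindName);
    }

    @Override
    public Key deserialize(JsonParser jsonParser, DeserializationContext ctxt) throws IOException {
        String id = jsonParser.getText();
        if (id == null || id.isEmpty()) {
            return null;
        }
        return KeyFactory.createKey(kind, Long.parseLong(id));
    }
}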

Dapper Correct Object / Aggregate Mapping

I have recently started evaluating Dapper as a potential replacement for EF, since I was not too pleased with the SQL that was being generated and wanted more control over it. I have a question regarding mapping a complex object in my domain model. Let's say I have an object called Provider; Provider can contain several properties of type IEnumerable that should only be accessed by going through the parent Provider object (i.e. the aggregate root). I have seen similar posts explaining the use of QueryMultiple and a Map extension method, but I was wondering, if I wanted to write a method that brings back the entire object graph eagerly loaded, whether Dapper would be able to do this in one fell swoop or whether it would need to be done piecemeal. As an example, let's say my object looked something like the following:
public class AggregateRoot
{
    public int Id {get;set;}
    ...//simple properties
    public IEnumerable<Foo> Foos
    public IEnumerable<Bar> Bars
    public IEnumerable<FooBar> FooBars
    public SomeOtherEntity Entity
    ...
}
Is there a straightforward way of populating the entire object graph using Dapper?
I have a similar situation. I made my SQL return a flat result set, so that all the sub-objects come back. Then I use Query<> to map the full set. I'm not sure how big your sets are.
So something like this:
var cnn = sqlconnection();
var results = cnn.Query<AggregateRoot, Foo, Bars, FooBar, SomeOtherEntity, AggregateRoot>("sqlsomething",
    (ar, f, b, fb, soe) => {
        ar.Foo = f;
        ar.Bars = b;
        ar.FooBar = fb;
        ar.someotherentity = soe;
        return ar;
    }, ....., splitOn: "").FirstOrDefault();
So the last type parameter in the Query call is the return type. For splitOn, you have to think of the return as a flat array that the mapping will run through. You pick the first returned column of each new object so that the new mapping starts there.
example:
select ID, fooid, foo1, foo2, BarName, barsomething, foobarid, foobaritem1, foobaritem2 from blah
The splitOn would be "ID,fooid,BarName,foobarid". As it runs over the return set, it maps the properties it can find in each object.
I hope that this helps, and that your return set is not too big to return flat.
