I have just finished watching the following videos in an attempt to
understand JDO and Google App Engine datastore:
http://www.youtube.com/watch?v=2jW2iSKDipY
http://www.youtube.com/watch?v=Yl_J-UYE94w
http://www.youtube.com/watch?v=pzctc48c0BM
http://www.youtube.com/watch?v=tx5gdoNpcZM
Now I wonder: take the example from one of the videos where an entity of kind Grandparent has a child entity of kind Parent, which in turn has a child entity of kind Child. The key for one of the Child entities could be:
Grandparent:Jane/Parent:Jack/Child:Joe
How do I code the classes for this in JDO (presumably there will be three classes)? I would like to see an implementation where the key values are set explicitly as part of the key. Any ideas?
I also wonder, what is the difference between using JDOQL to access my
data and iterating through the various instances using iterators programmatically?
Thanks,
John Goche
There is a more concrete example in the App Engine datastore Java documentation: Child Objects and Relationships
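To illustrate, here is a minimal sketch of what the Child class and the explicit key construction might look like, following the pattern in that documentation. The setKey/getKey accessors and the createJoe helper are assumptions for the example; Grandparent and Parent would be declared the same way, each with its own Key primary key.

import javax.jdo.PersistenceManager;
import javax.jdo.annotations.IdGeneratorStrategy;
import javax.jdo.annotations.PersistenceCapable;
import javax.jdo.annotations.Persistent;
import javax.jdo.annotations.PrimaryKey;

import com.google.appengine.api.datastore.Key;
import com.google.appengine.api.datastore.KeyFactory;

@PersistenceCapable
public class Child {

    // The full key encodes the ancestor path Grandparent:Jane/Parent:Jack/Child:Joe.
    @PrimaryKey
    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
    private Key key;

    public void setKey(Key key) { this.key = key; }
    public Key getKey() { return key; }

    // Hypothetical helper showing the key values being set explicitly before persisting.
    public static void createJoe(PersistenceManager pm) {
        Key childKey = new KeyFactory.Builder("Grandparent", "Jane")
                .addChild("Parent", "Jack")
                .addChild("Child", "Joe")
                .getKey();
        Child child = new Child();
        child.setKey(childKey);
        pm.makePersistent(child);
    }
}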
You can construct datastore queries either with the JDOQL string syntax or by calling methods on a Query object; there is no difference in how the data is accessed, and both return a Collection. You can see more concrete examples in Introducing Queries in JDO
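For example, the two styles below should be equivalent (a sketch only; the Child class and its name field are assumptions):

import java.util.List;
import javax.jdo.PersistenceManager;
import javax.jdo.Query;

public class ChildQueries {

    // JDOQL single-string syntax.
    @SuppressWarnings("unchecked")
    public static List<Child> byNameString(PersistenceManager pm, String name) {
        Query q = pm.newQuery("SELECT FROM " + Child.class.getName()
                + " WHERE name == nameParam PARAMETERS String nameParam");
        return (List<Child>) q.execute(name);
    }

    // The same query built by calling methods on the Query object.
    @SuppressWarnings("unchecked")
    public static List<Child> byNameMethods(PersistenceManager pm, String name) {
        Query q = pm.newQuery(Child.class);
        q.setFilter("name == nameParam");
        q.declareParameters("String nameParam");
        return (List<Child>) q.execute(name);
    }
}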
I am trying to write a JPA query to get the type for a particular entity, given the id of the entity. I have an abstract class Account, and concrete subclasses CustomerAccount and AdministratorAccount. The id is an attribute of the Account, so I am trying to construct a query to return the Type (i.e. foo.bar.CustomerAccount) given the ID of the account.
I tried the following:
String sql = "SELECT TYPE(a) from Account a where a.id = :userId";
But that doesn't seem to work. Any ideas? I'm using the Google App Engine JPA implementation (DataNucleus) if that helps.
Firstly, FWIW you are using Google's JPA plugin which just happens to use some ancient jars provided by the DataNucleus project. You are not using DataNucleus JPA.
Secondly, GAE/Datastore and Google's JPA plugin are not likely to support JPQL "TYPE", since that came along after their plugin was developed.
Finally, you might get the info you want more efficiently by just doing
Object obj = em.find(Account.class, id);
Class type = obj.getClass();
since this also inspects the L1/L2 caches.
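As a sketch, that approach could be wrapped into a small helper (the AccountTypes class and findAccountType method names are hypothetical):

import javax.persistence.EntityManager;

public class AccountTypes {

    // Loads the entity and lets the JPA provider report its concrete subclass,
    // e.g. foo.bar.CustomerAccount or foo.bar.AdministratorAccount.
    public static Class<?> findAccountType(EntityManager em, Object userId) {
        Account account = em.find(Account.class, userId); // may be served from the L1/L2 caches
        return account == null ? null : account.getClass();
    }
}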
Although I know there is already a Blobstore service for App Engine, I want to experiment with storing big blobs within the datastore.
Basically I am trying to persist this object using Objectify:
BigBlob.java
BigBlobFragments.java
However, appengine is complaining that: "BigBlobFragment is not a supported property type"
For the BigBlob type I created a DAO class with CRUD operations and registered the type like this:
static {
    ObjectifyService.register(BigBlob.class);
    //ObjectifyService.register(BigBlobFragment.class);
}

protected BigblobDaoImpl() {
    super(BigBlob.class);
}
I actually also have tried registering BigBlobFragment.
Hopefully someone can share some ideas on how to actually persist Big blobs and fragments using Objectify.
I have not used blobs myself, but I noticed you have @PersistenceCapable above your entities... that should be @Entity.
import com.googlecode.objectify.annotation.Entity;
Then you should be able to register your entity with Objectify.
ObjectifyService.register(BigBlob.class);
You need to register both the BigBlob and the BigBlobFragment and replace all of your JDO annotations with Objectify annotations (assuming you are using ofy4). You might also want to consider embedding the BigBlobFragment object inside of the BigBlob for performance, using @Embed.
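A minimal sketch of the embedded variant, assuming Objectify 4 annotations; the fragments and data field names are assumptions, since the original classes are not shown:

import java.util.ArrayList;
import java.util.List;

import com.googlecode.objectify.annotation.Embed;
import com.googlecode.objectify.annotation.Entity;
import com.googlecode.objectify.annotation.Id;

@Entity
public class BigBlob {
    @Id
    private Long id;

    // Embedding stores the fragments inside the BigBlob entity itself.
    @Embed
    private List<BigBlobFragment> fragments = new ArrayList<BigBlobFragment>();
}

class BigBlobFragment {
    // A chunk of the raw data; the whole BigBlob entity still has to stay
    // under the datastore's 1 MB entity size limit.
    private byte[] data;
}

With the embedded version, registering the root entity via ObjectifyService.register(BigBlob.class) should be enough; if the fragments remain separate entities, both classes need @Entity and their own registration.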
Is there good information on how to use the Python fixture module with Google App Engine's NDB?
It seems there are a few problems, such as:
obj.delete() on teardown (in ndb it's obj.key.delete())
It is not intuitive how to set up nested StructuredProperty elements.
Are there workarounds to permit the Fixture module to work with ndb, or an alternative fixture system that would work with ndb?
Thank you.
I'm guessing that fixture's GoogleDatastoreFixture class intercepts the Datastore operations at the ext.db module level. Since NDB has a different API, that class would need changing. Perhaps you can contribute a GoogleNdbFixture class. Or perhaps the right thing to do would be to intercept things at a lower level -- again, something you might take up with fixture's author and see if there's a way you can help.
Did you consider using Testbed? It sets up GAE service stubs appropriately, so you can test against the datastore (and other services), and it will tear down all your datastore writes after each test.
To create fixtures for your tests, you just put some entities directly into the datastore in the setUp() method, and you can use the NDB API both to put the fixtures and in the tests themselves if you like.
We're migrating to the HR datastore. We can't find this in the docs and want to make sure it'll work.
We have a few properties defined as:
ListProperty(db.Key)
Will those keys migrate normally? (they're not strings, they're db.Keys)
Same goes for
ListProperty(SomeAwesomeCustomModel)
Will those get migrated normally?
ListProperty(db.Key) gets migrated just like KeyProperty().
I'm not sure how you could have ListProperty(SomeAwesomeCustomModel) -- ListProperty() only supports a limited list of types, not Model subclasses.
I am using Objectify as a data access layer in my Google App Engine-hosted application.
The problem comes when I try to persist a map. My bean looks like this:
@Entity
@Cached
class MyBean {
    @Id
    private Long id;

    @Embedded
    Map<String, String> parameters = new HashMap<String, String>();

    public MyBean() {}

    // getters and setters below
}
First of all, note that the 'parameters' map is not private; making it private was throwing a runtime exception.
Saving the map goes well, but retrieving it from the Datastore fails.
My workaround is to use the @Serialized annotation. This is just a workaround, since what I want to achieve is to use the expando feature of the GAE Datastore.
According to the Objectify documentation I'm doing the right operations.
Exception details:
Caused by: java.lang.NullPointerException
    at com.googlecode.objectify.impl.Transmog.loadSingleValue(Transmog.java:364)
    at com.googlecode.objectify.impl.load.EmbeddedMapSetter.safeSet(EmbeddedMapSetter.java:65)
    at com.googlecode.objectify.impl.load.CollisionDetectingSetter.set(CollisionDetectingSetter.java:37)
    at com.googlecode.objectify.impl.Transmog.loadSingleValue(Transmog.java:359)
    at com.googlecode.objectify.impl.Transmog.load(Transmog.java:340)
    at com.googlecode.objectify.impl.ConcreteEntityMetadata.toObject(ConcreteEntityMetadata.java:203)
    at com.googlecode.objectify.impl.QueryImpl$ToObjectIterator.translate(QueryImpl.java:668)
    at com.googlecode.objectify.impl.QueryImpl$ToObjectIterator.translate(QueryImpl.java:657)
    at com.googlecode.objectify.util.TranslatingIterator.next(TranslatingIterator.java:35)
Embedded maps were poorly supported in Objectify3 and should not have been publicly announced. The section on @Embedded Map has been removed from the Objectify3 documentation.
Objectify4 supports maps extensively, including these expando-style maps:
Map<String, String> (or any primitive value type)
Map<String, Key<?>> (key references)
Map<String, SomeEmbeddedClass> (embedded classes)
In addition, there is a @Mapify annotation that lets you take a normal collection of objects, pick one property out as a key, and store that as a Map.
Unfortunately Objectify4's documentation is not ready at this time. However, the source code is in active use by several parties. If you feel daring, build from trunk.
I also recommend using Objectify 4; I've upgraded my app and found it fairly easy to do. I much prefer the support for Map fields in particular.
To answer the question, you should never put @Embedded on an array containing only primitives, so you don't need to specify @Embedded on your map, because String is a primitive type in the Google App Engine Datastore.
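A minimal sketch of how the bean might look under Objectify 4, assuming the Map<String, String> field is stored expando-style without any extra annotation (the Objectify 3 @Cached annotation and the getters/setters are omitted here):

import java.util.HashMap;
import java.util.Map;

import com.googlecode.objectify.annotation.Entity;
import com.googlecode.objectify.annotation.Id;

@Entity
public class MyBean {
    @Id
    private Long id;

    // Stored expando-style: each map entry becomes its own datastore property,
    // so no @Embedded (or @Serialized) annotation is needed on the field.
    private Map<String, String> parameters = new HashMap<String, String>();

    public MyBean() {}

    // getters and setters below
}

Registration stays the same as before: ObjectifyService.register(MyBean.class).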