I'd like to re-open "Deleted Datastore entries reappear" as a registered user. Can the old question be deleted?
I'll try to be more specific this time. I'm experiencing the following problem:
Initially I put N entities of the same kind into the Datastore like this:
datastore_entity = MyModel(model_property=property_value)
datastore_entity.put()
Afterwards I delete them, using both the Datastore Admin interface and a self-defined handler for the mapreduce library. The deleted entities then appear neither in the Datastore viewer nor in the Datastore Admin view.
When I then put even a single new entity of this kind into the Datastore, the old entities reappear in the Datastore Admin view while the new one does not (judging by the entity count). The Datastore viewer, by contrast, correctly reflects the Datastore state, and a query returns only the newly created entity.
There are no pending tasks at the time the new entity is put into the Datastore.
I'm also not encountering this problem on my local machine, where I use the --clean_datastore option when starting the development server.
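For reference, a mapreduce-based delete handler of this sort typically looks something like the following (a minimal sketch of the documented delete-operation pattern from the GAE mapreduce library, not my exact code; it assumes the mapper is pointed at the MyModel kind via a DatastoreInputReader in mapreduce.yaml):

# Sketch of a mapper that bulk-deletes entities of a kind.
# Assumes the GAE mapreduce library is bundled with the app and that
# mapreduce.yaml wires this function to a DatastoreInputReader over MyModel.
from mapreduce import operation as op

def delete_entity(entity):
    # Yield a Delete operation for every entity the input reader hands us.
    yield op.db.Delete(entity)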
The Datastore Admin and Datastore Statistics are not "live". The Datastore viewer offers a live view.
Check "Entity statistics last updated..." and you will notice the difference.
If the old entities are not visible in the Datastore viewer, there is no need to worry; the statistics will eventually be updated.
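If you want to see the difference programmatically, you can compare a live count with the lagging statistics; a minimal sketch, assuming the google.appengine.ext.db.stats module and the MyModel kind from the question (names are illustrative):

# Compare a live query count with the (possibly stale) datastore statistics.
from google.appengine.ext.db import stats
import logging

live_count = MyModel.all(keys_only=True).count()
kind_stat = stats.KindStat.all().filter('kind_name =', 'MyModel').get()
if kind_stat:
    logging.info('Live count: %d, stats count: %d (stats last updated %s)',
                 live_count, kind_stat.count, kind_stat.timestamp)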
Related
I'm wondering if deleting these BlobUploadSession entities would affect my app's functionality or performance in any way. The reason for deleting them is that when a new form is created and no files are uploaded to it, unnecessary entities end up being created.
(edit: additional info from comment)
I use the blobstore (via NDB) to store images asynchronously via the upload URL functionality. When I run the app on localhost, a kind called "BlobUploadSession" is created automatically in the datastore. This is where the URLs the images are to be uploaded to are stored as entities. When I upload a photo to one of those URLs, it goes into the "BlobInfo" kind. Since the photos have already been uploaded, I have no further need for the URLs, so I'm wondering if I can delete the BlobUploadSession entities? By the way, BlobUploadSession and BlobInfo are both kinds that are created automatically.
The __BlobUploadSession__ and __BlobInfo__ entities are created by, and used only internally by, the development server while it emulates the blobstore functionality.
There are other, similarly named __SomeEntityName__ entities used to emulate other pieces of functionality; for example, a number of them are created when you request datastore statistics (functionality that doesn't exist as such in production).
These entities aren't created on GAE itself, so there's no need to worry about them in production.
See also the related question: How to remove built-in kinds' names in google datastore using kind queries
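If you want to list kinds while skipping these dunder-named internal ones, a kind metadata query along the following lines works; this is a sketch that assumes google.appengine.ext.db.metadata.get_kinds() is available in your SDK version:

# List user-defined kinds, skipping internal __SomeEntityName__ kinds.
from google.appengine.ext.db import metadata

def user_kinds():
    return [kind for kind in metadata.get_kinds()
            if not (kind.startswith('__') and kind.endswith('__'))]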
We're using Cloudant as the remote database for our app. The database contains documents for each user of the app. When the app launches, we need to query the database for all the documents belonging to a user. What we found is that the CDTDatastore API only allows pulling the entire database into a local copy inside the app and then running the query against that local copy. The initial pull to the local datastore takes about 10 seconds, and I imagine it will take longer as more users are added.
Is there a way I can save only part of the remote database to the local datastore? Or, are we using the wrong service for our app?
You can use a server-side replication filter function; you'll need to add information about your filter to the pull replicator. However, replication takes a performance hit when a filter function is used.
That being said, a common pattern is to use one database per user; however, this has other trade-offs and is something you should read up on. There is some information on the database-per-user pattern here.
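To illustrate the filter approach: the filter function lives server-side in a design document, and the pull replication then references it by name along with any query parameters. A rough sketch of installing such a filter, using Python and requests purely for illustration (the account, database, credentials and filter names are placeholders; on the device you would configure your CDTDatastore pull replication to use the "app/by_user" filter with a matching user parameter):

# Install a server-side replication filter on a Cloudant database.
# ACCOUNT, DATABASE and the credentials are placeholders.
import requests

design_doc = {
    "_id": "_design/app",
    "filters": {
        # Only replicate documents owned by the requested user.
        "by_user": "function(doc, req) { return doc.owner === req.query.user; }"
    }
}

url = "https://ACCOUNT.cloudant.com/DATABASE/_design/app"
requests.put(url, json=design_doc, auth=("API_KEY", "API_PASSWORD"))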
My application has the following flow:
User enters the New Entity page.
User hits the save button and the system puts the new entity into the Datastore.
The system immediately redirects the user to the Edit page.
The Edit page makes a query for the newly inserted entity.
(Problem) The newly inserted entity is sometimes not available.
I think this is because the Datastore needs to do some data replication, so the newly inserted data is not available immediately after the Put(..) call returns. What should I do about this problem, or do I need to use a transaction?
You should read about eventual consistency: https://cloud.google.com/appengine/docs/go/datastore/structuring_for_strong_consistency
You could make an ancestor query or you could try to refer to the newly created entity by key.
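For illustration, here is a minimal sketch of both options in Python (the model, key and property names are made up, and the same ideas carry over to Go): a get by key is strongly consistent, and an ancestor query is strongly consistent within its entity group.

# Two ways to read your own write with strong consistency.
# Model, key and property names are illustrative.
from google.appengine.ext import db

class Article(db.Model):
    title = db.StringProperty()

# Option 1: fetch by key. Pass the key to the Edit page and get the entity
# directly; gets by key are strongly consistent.
key = Article(title='Draft').put()
fresh = db.get(key)

# Option 2: use an ancestor. Create the entity in an entity group and use an
# ancestor query, which is strongly consistent within that group.
parent_key = db.Key.from_path('UserAccount', 'some-user-id')
Article(parent=parent_key, title='Draft').put()
articles = Article.all().ancestor(parent_key).fetch(10)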
In Singapore, we are teaching students Python using Singpath (singpath.appspot.com). In addition to letting students practice writing software in Python, we would like to familiarize them with the google.appengine.ext.db API used to access Bigtable.
What is the easiest way to modify db.Model settings in an App Engine app so that any puts or gets access a local, temporary datastore rather than writing to Bigtable? I'm trying to do something similar to how gaeunit creates a new temporary datastore each time unit tests are run.
from google.appengine.ext import db
import logging

class MyModel(db.Model):
    name = db.StringProperty()

# Configure a temp datastore to be updated instead of Bigtable.
m = MyModel()
m.put()  # shouldn't update Bigtable
result = MyModel.all()  # should fetch from the temp datastore
logging.info("There were %s models saved", result.count())
You can certainly do this in the dev server by creating a new stub datastore when you want, as gaeunit does. I don't think the concept really transfers to the production environment, though: a temporary datastore would have to have some kind of backing store, either the real datastore or memcache, and AFAIK there's no built-in support for either.
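For completeness, the stub-datastore approach gaeunit takes can be set up in newer SDK versions with the testbed module; a minimal sketch, dev/test only (it does not apply to production):

# Run datastore calls against an in-memory stub instead of the real datastore.
# Dev/test only; assumes the testbed module shipped with the SDK.
import logging
from google.appengine.ext import testbed

tb = testbed.Testbed()
tb.activate()
tb.init_datastore_v3_stub()

m = MyModel(name='example')
m.put()  # goes to the in-memory stub, not Bigtable
logging.info('There were %s models saved', MyModel.all().count())
tb.deactivate()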
An alternative would be to use the real datastore with some sandboxing.
You could override db.Model.kind() to prefix a session ID:

@classmethod
def kind(cls):
    return "%s_%s" % (SESSION_ID, cls.__name__)
This would give you basic namespacing for user-created entities.
If you have a session entity, you could set it as the parent on any entity you create and on any query that does not already specify one. This would force all of your user's entities into one entity group.
In either case, at the start of the session you could schedule a task to run later that would clean up entities that the user created.
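Putting the last two ideas together, a rough sketch (the Session kind, SESSION_ID and the one-hour countdown are illustrative, and it assumes the deferred library is enabled for the app):

# Key user-created entities under a session parent, and schedule a deferred
# task that deletes everything in that entity group later.
from google.appengine.ext import db, deferred

session_key = db.Key.from_path('Session', SESSION_ID)

# Create the user's entities under the session parent.
MyModel(parent=session_key, name='student work').put()

def cleanup(session_key):
    # Kindless keys-only ancestor query: grabs everything in the group.
    keys = db.Query(keys_only=True).ancestor(session_key).fetch(500)
    db.delete(keys)

# At the start of the session, schedule the cleanup to run an hour later.
deferred.defer(cleanup, session_key, _countdown=3600)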