I am looking for a way to get a page of data from the datastore into memcache. Basically a comment system like Facebook, where you load a set of 10 comments at a time. The datastore persists each comment as an object.
I would load 10 comments into an array object and store that in memcache, using a page-ID suffix for the key by convention.
Now the problem: the datastore doesn't seem to promise that auto-generated IDs increment by 1. I checked this on SO - Autoincrement ID in App Engine datastore.
Upon eviction, how can I load these 10 comments from the datastore into the cache for a particular range, let's say page #5 or #6, when I can't access datastore objects by an incremented key?
Any suggestions are welcome; even if you feel the whole approach is flawed, let me know.
I did explore Google Cloud SQL as an alternative to the datastore, which would take care of my paging and ID-increment problems, but felt it's not the best option since I expect this comments table to eventually grow into a very large dataset.
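Since contiguous IDs aren't guaranteed, one common workaround is to order comments by a created timestamp and define a "page" by position in that ordering rather than by key, so any page can be rebuilt on cache eviction. A minimal sketch in plain Python, where `fetch_comments_sorted` stands in for a datastore query ordered by a `created` property (all names here are hypothetical):

```python
# Sketch: rebuild a page of comments after cache eviction by ordering on
# a `created` timestamp instead of relying on contiguous auto IDs.
# `cache` is a dict standing in for memcache; `fetch_comments_sorted`
# stands in for a datastore query ordered by `created`.

PAGE_SIZE = 10

def page_key(post_id, page_num):
    # Cache key convention: one entry per (post, page).
    return "comments:%s:page:%d" % (post_id, page_num)

def load_page(cache, post_id, page_num, fetch_comments_sorted):
    key = page_key(post_id, page_num)
    page = cache.get(key)
    if page is None:
        # Offset-based slice of the timestamp-ordered result; on App
        # Engine a query cursor would be cheaper than a large offset.
        comments = fetch_comments_sorted(post_id)
        start = (page_num - 1) * PAGE_SIZE
        page = comments[start:start + PAGE_SIZE]
        cache[key] = page
    return page
```

On App Engine itself you would store a query cursor alongside each cached page rather than slicing by offset, but the cache-key convention is the same.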
Thanks !
Why are the counts I see in my database different from what I see in Google Analytics? The goal conversion number showing in Google Analytics is much lower than what I see in the database. This has been the case for several months.
A few reasons here:
Sampled data vs. unsampled data: You can read about it here: https://support.google.com/analytics/answer/1042498?hl=en - For API work I normally use the Query Explorer to verify that my API calls are being sent and the responses match the data: https://ga-dev-tools.appspot.com/explorer/
Ad blockers: You might get hits/submissions from people who are using an ad blocker, hence more entries in the database than in Google Analytics.
Users vs. sessions vs. hits: You may be looking at unique visitors/sessions in Google Analytics instead of the total number of "Events". Not sure how your goal is set up, but it is best to use events and look at "Total Events" and "Unique Events" to get a sense.
Implementation: You may be firing the JavaScript after the person has hit the button without waiting for the page change; this can happen on sites where you take the user to a thank-you page or similar. Best to check how this is set up and the order in which the tag fires and the page loads.
I am new to Google App Engine development. We have developed an application with Android and Google App Engine. We tried to delete all the data, but the datastore write-operations quota reached 100% and we simply cannot delete any further records. How can we delete the data without exceeding the quota?
Can someone please explain the steps we should follow?
Thanks,
Prashant
You simply cannot delete records without it counting against your quota.
Ref: https://cloud.google.com/appengine/pricing#costs-for-datastore-calls
Entity Delete (per entity) --> 2 writes + 2 writes per indexed property value + 1 write per composite index value
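Plugging numbers into that formula makes the cost concrete; a small (hypothetical) helper:

```python
def delete_cost(indexed_property_values, composite_index_values):
    # Entity delete cost per the App Engine pricing page:
    # 2 writes + 2 per indexed property value + 1 per composite index value.
    return 2 + 2 * indexed_property_values + composite_index_values

# e.g. deleting an entity with 3 indexed property values and no
# composite indexes costs 2 + 2*3 + 0 = 8 write operations.
```

So bulk-deleting even a modest number of indexed entities burns through the free write quota quickly.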
So if you are on the free quota and have exhausted it, you cannot delete or write anything further without enabling billing.
Work around
Google allows 25 free App Engine projects per user ID. Create a new project and upload the old project's code to the new project ID. If your daily traffic stays within the free quota, you can use it as long as you want.
I am writing a web app and I am trying to improve the performance of search/displaying results. I am relatively new to programming this sort of thing, so I apologize in advance if these are simple questions/concepts.
Right now I have a database of ~20,000 sites, each with properties, and I have a search form that (for now) just asks the database to pull all sites within a set distance (for this example, say 50km). I have put the data into an index and use the Search API to find sites.
I am noticing that the database search takes ~2-3 seconds to:
1) Search the index
2) Get a list of key names (this is stored in the search index)
3) Using key names, pull from datastore (in a loop) and extract data properties to be displayed to the user
4) Transmit data to the user via jinja template variables
This is also only getting 20 results (the default maximum for a Search API query); I haven't implemented cursors here yet, although I will have to.
For whatever reason, it feels quite slow. I am wondering what websites do to make the process seem faster. Do they implement some kind of "asynchronous" search, where the page loads while the search/data pulls are processed in the background, and the results are then shown to the user?
Are there "standard" ways of performing searches here where the processing/loading feels seamless to the user?
Thanks.
edit
Would something like passing a "query ID" via the page work, then using AJAX to get the data from the datastore as JSON? That is, can App Engine redirect the user to the final page, passing in only a "query ID", run the search in the meantime, and then, once the data is ready, pass the information to the user via JSON?
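The query-ID idea described above can work: render the page shell immediately, run the search out of band, and let the client poll a JSON endpoint. A minimal, framework-agnostic sketch of the server side (stdlib only; all names hypothetical, and on App Engine the pending-search store would need to be memcache or the datastore rather than an in-process dict):

```python
import json
import uuid

# Hypothetical in-process store of pending searches, keyed by query ID.
PENDING = {}

def start_search(params):
    # Called by the page handler: register the search and return an ID
    # to embed in the rendered page; the search itself runs elsewhere
    # (e.g. a task queue task).
    qid = uuid.uuid4().hex
    PENDING[qid] = {"done": False, "results": None, "params": params}
    return qid

def finish_search(qid, results):
    # Called by the background work when the search completes.
    PENDING[qid].update(done=True, results=results)

def results_json(qid):
    # Called by the client's AJAX poll; returns JSON either way.
    entry = PENDING.get(qid)
    if entry is None or not entry["done"]:
        return json.dumps({"ready": False})
    return json.dumps({"ready": True, "results": entry["results"]})
```

The client simply polls `results_json` until `ready` is true, then renders the results.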
Make sure you are getting entities from the datastore in parallel. Since you already have the key names, you just have to pass your list of keys to the appropriate method.
For db:
MyModel.get_by_key_name(key_names)
For ndb:
ndb.get_multi([ndb.Key('MyModel', key_name) for key_name in key_names])
If you needed to do datastore queries, you could enable parallel fetches with the query.run (db) and query.fetch_async (ndb) methods.
I'm trying to develop a gaming site. Users can add other users as friends, and a user earns points as he completes game levels. I need to show, on each game's page, the average points of all the user's friends who have already played that game (for example, when a user plays game A, the average of the points earned by his friends on game A is displayed on the game A page; similarly, the friends' average points for game B are shown when he plays game B).
My approach:
Store the user's friend list (max 1000) as a multi-valued property in the datastore and load it into GAE memcache when the user logs into the site.
Use a resident backend to cache all the users' game data (points earned for each specific game). A cron job updates the backend cache every hour.
When the user requests a game page (e.g. game A) for the first time, the request handler contacts the backend via the URL Fetch service to compute the average of the friends' points.
The backend then gets the user's friends list (max 1000) from memcache, fetches the friends' game A points from its in-memory cache (the backend cache), and returns the computed average.
The request handler, after getting the average, persists it in the datastore and also stores it in memcache, so that subsequent requests for the game A page fetch the data from memcache/datastore without the computation overhead on the backend. This average is valid for 1 hour and is re-computed after that upon the next request to the game A page.
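The backend computation in the steps above boils down to a simple average over cached points. A sketch with plain dicts standing in for memcache and the backend's in-memory cache (key format taken from the question; function name hypothetical):

```python
def friends_average(friend_ids, game, points_cache):
    # points_cache maps "userid/gamename" -> points, mirroring the
    # backend's in-memory cache. Friends who never played the game
    # have no entry and are skipped.
    scores = [points_cache[fid + "/" + game]
              for fid in friend_ids
              if fid + "/" + game in points_cache]
    if not scores:
        return None  # no friend has played this game yet
    return float(sum(scores)) / len(scores)
```

With 1000 friends this is a 1000-key lookup against an in-memory dict, so the per-request cost on the backend is small; the expensive part is keeping `points_cache` fresh, which the hourly cron handles.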
My questions :
Is the above-mentioned approach the right way to solve this problem?
How to implement an in-memory cache efficiently and reliably with backend instance (python-2.7) ?
How to estimate memory and cpu required at backend for only this job ? (Assuming 0.1 million key-value pairs have to be stored with "userid/gamename" as key and user-points as value. User friend list max size is 1000.)
If I have to use multiple backend instances as the load increases, how to load balance them?
Thanks in advance.
Have a look at this blog post from Nick Johnson about counters: http://blog.notdot.net/2010/04/High-concurrency-counters-without-sharding
Use the NDB datastore for:
- automatic caching, instead of your own memcache
- some new interesting properties, like a JSON property with compression, and repeated properties, which act like Python lists
And have a look at mapreduce for efficient updating.
I'm making good progress with web2py and Google App Engine, but now I have to decide how to store images without wasting GAE resources!
I have a "table" were I store products.
Each product can have a maximum of 12 pictures.
When I request a product page, let's say:
/product/7484/
I need only the product information, without the pictures, but GAE fetches all fields from the datastore! I thought this could be solved using Google App Engine projection queries, so I would fetch only the fields I need.
Is that possible with web2py, or will I have to change my database to store the images in another "table"?
I will only need to fetch each picture field when it is requested by the browser... but right now, each picture request causes the whole product entity to be fetched from the database!
We have it as an experimental feature in trunk. Perhaps you can help us test it. Nothing special to do, just the usual:
db(query).select(db.table.field1, db.table.field2, etc.)
Unless the query selects a single record by id, the arguments of select are converted into a projection query. Unfortunately, GAE does not support projection for get_by_id().
Thanks to Christian (howesc), who pulled this off within 24 hours of you requesting the feature. Please join us on the web2py Google group if you can help with testing.