The number of entries is a little over 337k, yet queries to fetch even a single entry are taking anywhere from 600-1500 ms. That seems fairly high to me. Is that normal, or am I doing something wrong?
The queries are made from a GAE instance in us-central using a very simple Go script. Here's what the dashboard is telling me: http://take.ms/gpsoZ
Code is at: http://play.golang.org/p/oFjmBKfXgA
As Greg pointed out, if you're on App Engine, the Cloud Datastore API might not be the way to go. Rather, use the App Engine datastore API directly.
Related
I currently serve my app in Korea.
However, my app is deployed in us-central, because GAE does not support deployment in Asia.
So I suspect it is very slow because my users are far away from GAE.
If that's the problem, how can I solve it?
Please advise. Thank you.
I have been using Google Cloud Platform for 4 years now, including Google App Engine. An application backend is usually slow only when the developer has not optimised the program well. I would suggest you look at the following key aspects when tackling your problem:
Use Memcache for requests that are common across users and do not require instant, real-time updates.
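A minimal sketch of that cache-aside pattern (a plain dict stands in for App Engine's memcache service here; the real API exposes `memcache.get(key)` and `memcache.set(key, value, time=ttl)`):

```python
import time

# A plain dict stands in for google.appengine.api.memcache in this sketch.
_cache = {}

def get_or_compute(key, compute, ttl=60):
    """Cache-aside: serve common requests from cache, recompute on miss/expiry."""
    entry = _cache.get(key)
    now = time.time()
    if entry is not None and entry[1] > now:
        return entry[0]            # cache hit
    value = compute()              # cache miss: do the expensive work once
    _cache[key] = (value, now + ttl)
    return value

calls = []
def expensive():
    calls.append(1)
    return "rendered page"

assert get_or_compute("home", expensive) == "rendered page"
assert get_or_compute("home", expensive) == "rendered page"
assert len(calls) == 1   # second request served from cache
```

With the real memcache service the cache is shared across instances and entries can be evicted at any time, so the `compute` path must always remain correct on its own.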
Look at the algorithms you put in place; they are very important for your throughput. For example, if you need to search through a billion records, sort them first (e.g. with a three-way quicksort) so that lookups can use binary search instead of a linear scan.
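To illustrate: sorting once (Python's built-in sort below, standing in for the three-way quicksort mentioned above) turns every subsequent lookup into an O(log n) binary search:

```python
import bisect
import random

def build_index(records):
    """Sort the records once; pay the O(n log n) cost up front."""
    return sorted(records)

def contains(index, value):
    """Binary search via bisect: O(log n) per lookup instead of an O(n) scan."""
    pos = bisect.bisect_left(index, value)
    return pos < len(index) and index[pos] == value

records = [random.randrange(10**9) for _ in range(100_000)]
index = build_index(records)
assert contains(index, records[0])
assert not contains(index, -1)
```

For a billion records a single machine would hold the sorted index on disk or shard it, but the asymptotic argument is the same.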
Lastly, look at your choice of database. If you are only using SQL, you could mix in NoSQL. If you are into big data, then use BigQuery. This way your application's performance can increase drastically and scale up enormously.
I have a Python App Engine app that uses the datastore via the ndb API, and I want to do background work and store the results in the datastore so the App Engine app can use them.
I wanted to use GCE or my own computer to do so, but the ndb API is not available outside App Engine, and the alternative seems to be the gcloud.datastore API, which is very different.
How do you guarantee that what you push (with the gcloud API) is consistent with what you get (i.e. matches an ndb entity)?
I can't write unit tests because the local servers are not the same (gcd vs dev_appserver). Here is a workaround (but in Java).
Should I replace the ndb code with gcloud.datastore inside App Engine to ensure consistency (losing ndb advantages like automatic caching)?
Is there an obvious solution I'm missing? If you have had the same issue, how did you handle it?
Thanks
If you're really worried about consistency and you're using fancy features of ndb, you should probably look into the Remote API for App Engine, which effectively lets you run arbitrary code via a remote (HTTP) interface. It might help you get the job done, but remember that the CPU cycles you use will be spent in GAE -- not in GCE.
If you're willing to wait a while, we're working on porting the ndb API to run against the Cloud Datastore API, which would mean the same code you run in App Engine would work outside of App Engine (on your local machine or inside Google Compute Engine).
The gcloud.datastore (gcloud-python) API is much more low-level, so you should have even more control over the data that ends up in the Datastore. It wasn't built to be identical to ndb (and therefore doesn't have some of the fancy stuff like derived fields or geo-points as first-class citizens); however, ndb stores those fields using its own Python logic, so you should be able to write safely if you're comfortable with the lower-level data representations.
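Given that ndb applies its own Python logic to derived fields, one pragmatic safeguard is to round-trip a sample entity and diff the raw property maps. A minimal sketch, with plain dicts standing in for real entities and all helper names hypothetical (not part of either API):

```python
def round_trip_consistent(ndb_props, gcloud_props, ignore=("class_",)):
    """Compare two raw property maps, skipping bookkeeping keys like ndb's
    polymodel class marker. Both arguments are plain {name: value} dicts,
    e.g. from ndb's entity.to_dict() and from iterating a gcloud Entity."""
    a = {k: v for k, v in ndb_props.items() if k not in ignore}
    b = {k: v for k, v in gcloud_props.items() if k not in ignore}
    return a == b

# Simulated round trip: what ndb wrote vs what gcloud read back.
assert round_trip_consistent({"name": "x", "age": 3}, {"name": "x", "age": 3})
assert not round_trip_consistent({"name": "x"}, {"name": "y"})
```

Running a check like this against a handful of representative entities in a staging namespace is cheaper than full parity tests between gcd and dev_appserver.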
I am working on a project that we are going to put on Google Cloud.
There will be a member requirement, so logins and profiles to store. Members will make projects that will be linked to their accounts; other members can join these projects, etc. It's not overly complex, but I need it to be fast and scalable from the off.
I have a few (simple) questions about the best setup to go for.
Do I build a PHP front end when PHP is only in beta? Do I just use Python for the front end? Is one framework better than the others?
Do I create an App Engine API for the front end to call using Python or Java or something else?
Which database do I use? Do I go down the Compute Engine/MongoDB approach or just go straight for Google datastore? (MySQL is disregarded here)
Do I use a shared memcache or get a dedicated one?
These sorts of things. Using Google Cloud appears 'fairly' straightforward, but I would appreciate some pointers from those in the know who have already got their hands dirty -- in a virtual sense, of course!
Many thanks in advance
You appear to have four many-faceted questions -- and apparently you aren't taking them to Google Groups, so let me do my best here.
Do I have a PHP front end if PHP is only in beta? Do I just use Python
for the front end? Is there a better framework than others to use?
For guaranteed solidity, use Python or Java -- PHP and Go aren't quite as mature yet. Many Python frameworks are fine, from the very lightweight webapp2 that comes with App Engine, through intermediate-weight ones such as "flask", all the way to the rich "django". I'm personally a "frameworks should stay out of my way!" guy, so webapp2 is my own favorite.
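All of those Python frameworks ultimately speak WSGI; a stdlib-only sketch of the bare protocol that webapp2, flask, and django wrap:

```python
from wsgiref.util import setup_testing_defaults

def app(environ, start_response):
    """The whole WSGI contract: take an environ dict and a start_response
    callable, emit status + headers, return an iterable of bytes."""
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello from a bare WSGI app\n"]

# Exercise the app without a server, the way framework test clients do.
environ = {}
setup_testing_defaults(environ)
status_headers = []
body = b"".join(app(environ, lambda s, h: status_headers.append((s, h))))
assert status_headers[0][0] == "200 OK"
assert body.startswith(b"Hello")
```

A webapp2 RequestHandler or a flask view is essentially sugar over this callable, which is why any of them can be deployed on App Engine's Python runtime.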
Do I create an App Engine API for the front end to call using Python
or Java or something else?
Python and Java are both fully supported and stable. I personally prefer Python, but, hey, that's just me! Endpoints, if that's what you mean by "an App Engine API", is equally well supported either way, with Python perhaps a tad ahead in integration with the datastore thanks to https://github.com/GoogleCloudPlatform/endpoints-proto-datastore/tree/master/endpoints_proto_datastore .
Which database do I use? Do I go down the Compute Engine/MongoDB
approach or just go straight for Google datastore? (MySQL is
disregarded here)
I think the GAE datastore (with add-ons as needed, e.g. to shunt images and videos off to Cloud Storage, or structured data for search, including geo functionality, to the Search API) is going to serve you fine.
Do I use a shared memcache or get a dedicated one?
Start with the shared (free) variety; then, once you have it all working, design and run stress load tests and check how they perform with the shared cache vs a dedicated (paid) one. Make data-driven decisions -- let the numbers guide you: how much improvement do you get by paying $X/month for a dedicated cache? Decide accordingly!-)
I am developing a project in my final year at uni, and it will be an Android application.
Basically, the "company" updates the database with jobs to be done around the country. Its field workers will use the app to display the jobs available in their location. Workers then select the jobs they are committing to do and send the selection back to database.
I would like to use Google App Engine for that and I am just studying it at the moment.
I came across two ways to store data on GAE: the Datastore and Cloud SQL.
Personally, I would like to use the NoSQL Datastore in order to experiment with and learn it.
What would you suggest I use for my use case?
What are the pros and cons of each method?
If I go with Google Datastore, is this guide good for me to start with? https://developers.google.com/appengine/docs/java/datastore/
I would say both will work. If you want to discover the Google Datastore then go for it.
But I would suggest you have a look at Objectify; this excellent library makes things easier with this technology.
Go with the Google App Engine Datastore; it's very efficient to use. And yes, that document is enough to get started.
I have an AppEngine application that currently has about 15GB of data, and it seems to me that it is impractical to use the current AppEngine bulk loader tools to back up datasets of this size. Therefore, I am starting to investigate other ways of backing up, and would be interested in hearing about practical solutions that people may have used for backing up their AppEngine Data.
As an aside, I am starting to think that the Google Cloud Storage might be a good choice. I am curious to know if anyone has experience using the Google Cloud Storage as a backup for their AppEngine data, and what their experience has been, and if there are any pointers or things that I should be aware of before going down this path.
No matter which solution I end up with, I would like a backup solution to meet the following requirements:
1) Reasonably fast to back up, and reasonably fast to restore (i.e. if a serious error, data deletion, or malicious attack hits my website, I don't want to have to bring it down for multiple days while restoring the database -- by fast I mean hours, as opposed to days).
2) A separate location and account from my AppEngine data -- i.e. I don't want someone with admin access to my AppEngine data to necessarily have write/delete access to the backup location. For example, if my AppEngine account were compromised by a hacker, or if a disgruntled employee decided to delete all my data, I would like to have backups that are separate from the AppEngine administrator accounts.
To summarize: given that getting the data out of the cloud seems slow and painful, what I would like is a cloud-based backup solution that plays the role tape backups served in the past -- if I had a backup tape, nobody else could modify its contents. Since I can't get a tape, can I store a secure copy of my data somewhere that only I have access to?
Kind Regards
Alexander
There are a few options here, though none are (currently) quite what you're looking for.
With the latest release, version 1.5.5 of the SDK, we now support interfacing with Google Storage directly -- you can see how here. With this you can write data to Google Storage, but to the best of my knowledge there's no way to write a file that the app will then be unable to delete.
To actually gather the data, you could use the App Engine mapreduce API. It has built-in support for writing to the App Engine blobstore; writing to Google Storage would currently require you to implement your own output writer.
Another option, as WoLpH suggests, is to use the Datastore Admin tool to back up data to another app. With a little extra effort you could modify the remote_api stub to prohibit deletes to the target (backup) app.
One thing you should definitely do regardless is to enable two-factor authentication for your Google account; this makes it a lot harder for anyone to get control of your account, even if they discover your password.
The bulkloader is probably one of the fastest ways to back up and restore your data.
The problem with App Engine is that you have to do everything through views, so you inherit the restrictions that views have... the result is that a fast backup/restore still has to use the same APIs as the rest of your app. So the bulkloader (possibly with a few modifications) is definitely your best option here.
Perhaps, though (I haven't tried it yet), you can use the new Datastore Admin to copy the data to another app -- one that only you control. That way you can copy it back from the other app when needed.