I am working on Google App Engine, and I have a case of reading 5,000 records from the Datastore; this read operation can be called any number of times in a day. The pricing page says 50k operations are free per day, after which reads are charged at $0.06 per 100,000 entities.
Here is my question: since I am reading multiple times a day, the results will surely be stored in the shared cache. Is reading from the cache counted as a fetch operation in App Engine?
The link you mentioned is the pricing for the Datastore. The cache is not the Datastore.
I'd like to make a GAE app multi-tenant to cater to different clients (companies); Datastore namespaces seem like the GAE-endorsed solution. Is there a meaningful way to split GAE fees among clients/namespaces? GAE costs mainly depend on user activity, i.e. backend instance uptime, because new instances are created or (after a 15-minute delay) terminated in proportion to server load, not to the total volume of data a user has or created. Ideally the fee split should be meaningful and explainable to the clients.
I suppose the fairest fee-splitting solution is simply to create a new app for each new client, so all costs are reported separately, but the total cost would grow: I expect a few apps sharing the same instances would use server resources more economically.
Every App Engine request is logged with a rough estimated cost measurement. It is possible to log the namespace/client associated with every request and query the logs to add up the estimated instance costs for that namespace. Note that the estimated cost field is deprecated and may be inaccurate. It is mostly useful as a rough guide to the proportion of instance cost associated with each client.
As far as datastore pricing goes, the cloud console will tell you how much data has been stored in each namespace, and you can calculate costs from that. For reads/writes, we have set up a logging system to help us track reads and writes per namespace (i.e. every request tracks the number of datastore reads and writes it does in each namespace and logs these numbers at the end of the request).
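As an illustration, the per-request tracking described above can be sketched as a counter that is reset at the start of each request and flushed to the logs at the end. The `NamespaceOpCounter` class and its method names are hypothetical, not part of any App Engine API; in a real app you would call the record methods from a thin wrapper around your datastore calls.

```python
import logging
from collections import defaultdict

class NamespaceOpCounter:
    """Tallies datastore reads/writes per namespace for one request.

    Hypothetical helper: call record_read()/record_write() from a
    wrapper around your datastore calls, then log_summary() at the
    end of the request so the logs can be queried per client.
    """

    def __init__(self):
        self.reads = defaultdict(int)
        self.writes = defaultdict(int)

    def record_read(self, namespace, count=1):
        self.reads[namespace] += count

    def record_write(self, namespace, count=1):
        self.writes[namespace] += count

    def log_summary(self):
        # One log line per namespace; sum these lines later to
        # apportion datastore read/write costs per client.
        for ns in sorted(set(self.reads) | set(self.writes)):
            logging.info("ns=%s reads=%d writes=%d",
                         ns, self.reads[ns], self.writes[ns])

counter = NamespaceOpCounter()
counter.record_read("client_a", 3)
counter.record_write("client_b")
counter.log_summary()
```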
The bottom line is that with some investments into infrastructure and logging, it is possible to roughly track costs per namespace. But no, App Engine does not make this easy, and it may be impossible to calculate very accurate cost estimates.
I have about 20GB of data in the datastore.
The builtin indexes (which I have no control over) have increased to ~200GB.
I backed up all the data to BigQuery and don't need it anymore.
I am trying to get out of the Datastore, but I can't: the Datastore Admin console can't delete that many entities (the bulk-delete option uses MapReduce, which fails on quota within an hour or so), and deleting each entity programmatically is too expensive (> $1000, because the many indexes make each delete cost many write operations).
Meanwhile, Google charges me $50/month for storing data I don't need :(
How do I close my Datastore project (not the App Engine project, just the Datastore part of it), or just wipe out all the data?
Please help me get out of this!
Wait until July 1, 2016. Then delete all entities.
Starting from July 1, 2016 the pricing is $0.02 per 100,000 deletes regardless of indexed properties.
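For a rough sense of what the flat rate means in practice, here is a back-of-the-envelope calculation. The entity count is a made-up figure for illustration; under the old scheme the per-delete cost depended on how many indexed properties each entity had, which is why no single number can be given for it.

```python
def delete_cost_usd(num_entities, price_per_100k=0.02):
    """Cost of deleting num_entities at a flat per-delete rate
    ($0.02 per 100,000 deletes under the July 2016 pricing)."""
    return num_entities / 100_000 * price_per_100k

# Assuming, say, 50 million entities (hypothetical figure):
print(delete_cost_usd(50_000_000))  # roughly $10 at the flat rate
```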
So I have 25,000 .py scripts running at the same time every 2 minutes as cron jobs (distributed equally over 1,000 servers).
They all have to access the same DB at the same time, ONLY to READ from the DB, not to write.
So I was wondering: could Google Datastore (for example) handle this? 25,000 requests to the same DB at the same time, just to read?
I couldn't find this in Google's docs.
Best Regards
Unlike traditional databases, which have separate, distinct "instances", Google Datastore is a single huge data store with no explicit per-application limit on the number of requests it can process. Your requests go through the same route as requests from all the other apps that use the Datastore.
The only limit you might face is on the write side, where one entity group should not be updated more frequently than once per second.
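If you ever do need frequent writes to what is logically a single record, the usual workaround for that limit is to shard it across several entity groups and sum the shards on read. The following is a minimal in-memory sketch of the idea only; real code would store each shard as a separate datastore entity and increment it in a transaction.

```python
import random

class ShardedCounter:
    """Spreads increments across N shards so no single 'entity group'
    absorbs the full write rate; reads sum all shards."""

    def __init__(self, num_shards=20):
        self.shards = [0] * num_shards

    def increment(self):
        # Picking a random shard spreads the write load roughly
        # evenly, so each shard sees ~1/N of the total write rate.
        self.shards[random.randrange(len(self.shards))] += 1

    def count(self):
        # Reading is cheap: just sum every shard.
        return sum(self.shards)

c = ShardedCounter(num_shards=5)
for _ in range(100):
    c.increment()
print(c.count())  # 100
```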
I'm getting the following exception from a query that was working just fine up until a few moments ago:
OverQuotaError: The API call datastore_v3.RunQuery() required more quota than is available.
However, in the quota details it's not showing us as being over any quotas related to the datastore:
Any idea what might be causing this OverQuotaError?
When using Datastore, you should make sure that your App Engine budget is also able to handle any Datastore usage above the Free Quota.
In particular, it looks like you are at 50K Datastore Read operations for the day. This is the maximum amount of daily free quota you receive for read operations. At this point any billable operations must be within your App Engine Budget. You can follow the instructions on the quotas page for increasing your daily budget, which will allow you to exceed the Datastore free quota.
Does anyone have numbers for Google App Engine's free quota in terms of the total number of requests and unique visitors it allows per day?
Maybe someone who has live production code can tell us this?
Rough number is enough, just to get the idea.
I cannot work this out from the pricing model.
Thanks
I had this question when I first started using App Engine, but it's impossible to answer with the information in your question.
You need an estimate of your usage of each individual API quota, then calculate from that.
You might be able to simplify it by figuring out which API quota you're likely to hit first, and then working out the number of requests you can serve before that quota runs out, e.g.:
Storing photos or other large data for users? You'll probably hit the blobstore quota first. Daily/unique visitor counts probably won't matter.
Serving lots of photos or large data? You'll probably hit the bandwidth quota first.
Need to start a channel for every view? You'll probably hit the channel quota first and get 100 views per day.
Need to send an email for every view? You'll probably hit the mail quota first.
Need to query the datastore a lot? You'll probably hit the datastore limit first.
The datastore limit is the hardest to calculate. You get 50k read and 50k write ops. Most likely you'd read more than write.
If you need 2 read ops per page, you could do 25k views per day.
If you need 2 read ops per page, but you're smart and you memcache them, and memcache is effective 80% of the time, you could get 125k views per day.
If you need 500 read ops per page and you can't cache it, you can do 100 views per day. That's provided you don't run out of one of the other quotas.
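The arithmetic in those examples can be captured in one small helper. The 50k figure is the free daily read quota mentioned above; the cache hit rate is whatever your app actually achieves with memcache, so treat these numbers as illustrations rather than guarantees:

```python
def views_per_day(daily_read_quota, reads_per_page, cache_hit_rate=0.0):
    """Max page views before the datastore read quota runs out.

    cache_hit_rate is the fraction of reads served from memcache,
    which do not count against the datastore quota.
    """
    effective_reads = reads_per_page * (1 - cache_hit_rate)
    return int(daily_read_quota / effective_reads)

print(views_per_day(50_000, 2))       # 25000
print(views_per_day(50_000, 2, 0.8))  # 125000
print(views_per_day(50_000, 500))     # 100
```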
Do your own math.
The quotas and rates (for free and paid apps) are listed on https://developers.google.com/appengine/docs/quotas.