Datastore & 30 second request limit - google-app-engine

I'm writing an Appengine app: one of its duties is to email all the users every night (I know that I'll have to enable billing to email many users -- that's no problem).
I'm just worried about the 30-second request limit; if I have thousands of users and I have to mail them all a daily reminder, won't that limit be hit sometime soon?
Any ideas on how I can handle this problem? Or maybe App Engine isn't right for this type of application...? Thanks!

Use the task queue: each task emails N users (whatever number you determine you can safely email well within the 30-second limit) and queues up another task to email the next N, and so on.
Brett Slatkin's video covers the best ways to split up such "batch" tasks so they run effectively on App Engine.
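A rough sketch of that chaining pattern on the Python runtime, assuming a hypothetical UserAccount model and a /tasks/email_batch task handler (all names here are illustrative, not from the question):
from google.appengine.api import mail, taskqueue
from google.appengine.ext import db, webapp

BATCH_SIZE = 50  # tune so one batch finishes comfortably within the request deadline

class UserAccount(db.Model):      # hypothetical user model
    email = db.StringProperty()

class EmailBatchHandler(webapp.RequestHandler):
    def post(self):
        offset = int(self.request.get('offset', 0))
        users = UserAccount.all().fetch(BATCH_SIZE, offset=offset)
        for user in users:
            mail.send_mail(sender='Reminders <reminders@example.com>',
                           to=user.email,
                           subject='Daily reminder',
                           body='Your daily summary...')
        if len(users) == BATCH_SIZE:
            # There may be more users: chain the next batch as a new task.
            taskqueue.add(url='/tasks/email_batch',
                          params={'offset': offset + BATCH_SIZE})
Offset-based paging is the simplest thing that works; query cursors would scale better for very large user counts.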

Related

Quota exceeded for quota metric 'Queries' and limit 'Queries per minute per user' of service 'gmail.googleapis.com'

Our application makes use of a service account that has been authorized for the entire domain by the admin. With this service account our application accesses the domain user's emails with Gmail APIs like GetMessage.
All of a sudden, starting this week, we have started intermittently receiving the following error:
Quota exceeded for quota metric 'Queries' and limit 'Queries per minute per user' of service 'gmail.googleapis.com' for consumer 'project_number:XYZ'
There has been no change in our application or in the frequency at which we access emails. We are using a batch size of 10 when calling the API.
The 'Quota exceeded errors count (10 sec) - Queries per minute' graph in the GCP dashboard is empty. So we are really not sure what is going on and why we are suddenly hitting the quota limits.
Also, I am not sure how the 'per-user' limit is applied when my app accesses the user mailboxes with the service account. The documentation around this is vague, at least to me.
These errors are really impacting our ability to serve our customers. Moreover, not knowing why we are getting them is shaking our confidence in the Gmail APIs.
Any help in this regard is highly appreciated.
Thanks
UPDATE:
Today we are seeing lots of
"User-rate limit exceeded. Retry after <timestamp>"
errors. It seems like this time around we are hitting some quota limit other than 'queries per minute'. While I look into my client implementation and figure out why this is happening, feel free to share any recommendations you may have.
Thanks.
Google APIs like this have limits and quotas on requests. You can request an increase. If you don't want to, note that daily quotas are refreshed at midnight Pacific Time.

Insufficient tokens for quota 'administrator' and limit 'CLIENT_PROJECT-100s' of service 'pubsub.googleapis.com' for consumer 'project_number:#'

I sometimes get the following error when creating a subscription:
Insufficient tokens for quota 'administrator' and limit 'CLIENT_PROJECT-100s' of service 'pubsub.googleapis.com' for consumer 'project_number:'
Waiting it out does the trick, but I'd like to increase the quota. In the IAM & Admin section of the Google Cloud Console, I can filter on the Pub/Sub API, but can't find the limit...
You are running up against the quota for administrative operations. In the Quotas page, under "Quota type," select "All quotas," then under "Service" select "Google Cloud Pub/Sub API." The quota you want to increase is "Administrator operations per 100 seconds," which you can raise up to the maximum allowed limit of 10,000 per 100 seconds, as detailed on the Pub/Sub quota page.
I was hitting a similar error.
I checked the quota section, as per Kamal Aboul-Hosn's suggestion, but it was already maxed out.
A workaround was to put a sleep call in the code so the API wouldn't get hammered within any 100-second window. I hope that helps.
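For illustration, a minimal sketch of that throttling workaround with the Python Pub/Sub client; the project, topic, and subscription names are placeholders of mine, and the sleep interval is just a guess at a safe pace:
import time

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
topic_path = subscriber.topic_path('my-project', 'my-topic')        # hypothetical names
subscription_names = ['sub-a', 'sub-b', 'sub-c']                    # hypothetical names

for name in subscription_names:
    subscription_path = subscriber.subscription_path('my-project', name)
    subscriber.create_subscription(name=subscription_path, topic=topic_path)
    # Space out administrative calls so they don't all land inside the
    # same 100-second quota window.
    time.sleep(2)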
According to Aboul-Hosn on the GCP Slack:
"It does look like quota is pooled across users of the default application credentials and that the quota is significantly lower for users authenticated in this way. I believe when going to app engine, a service account is created (https://cloud.google.com/appengine/docs/flexible/python/access-control#using_service_accounts), so I would not anticipate this error happening when running on app engine."
So the reason this is happening is because the quota for your admin credentials is being consumed elsewhere, and as far as I know, there is no way to increase this!

How to schedule repeated jobs or tasks from user parameters in Google App Engine?

I'm using Google App Engine and I would like to be able to schedule jobs based on users' parameters.
I know this can be done with cron jobs, but it looks like they don't offer any flexibility from the user's point of view; they only let you schedule predefined jobs.
For example, suppose I have a news app where users can subscribe to different topics: I want the admin to be able to decide when to send a summary email, for instance every day at 8am, and to be able to edit this later.
Is there anything that provides this?
You may want to star Issue 3638: Cron jobs to be scheduled programmatically.
Meanwhile you can write your own implementation: have a generic cron job running periodically (every 1 minute being the finest resolution). Inside that cron job, check user-programmed scheduling data persisted somewhere (the datastore, for example) and, if needed, trigger whatever needs to be executed, either inline or by enqueueing a task in some task queue.
It's possible to drive the scheduling resolution even below 1 minute if needed; see High frequency data refresh with Google App Engine.
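A minimal sketch of that approach on the Python runtime, assuming a hypothetical EmailSchedule model edited by the admin and a cron.yaml entry that hits /cron/dispatch every minute (all names are mine, not from the answer):
import datetime

from google.appengine.api import taskqueue
from google.appengine.ext import db, webapp

class EmailSchedule(db.Model):
    # Hypothetical model the admin edits (e.g. "daily summary at 8am").
    topic = db.StringProperty()
    interval_hours = db.IntegerProperty(default=24)
    next_run = db.DateTimeProperty()

class CronDispatchHandler(webapp.RequestHandler):
    def get(self):
        now = datetime.datetime.utcnow()
        due = EmailSchedule.all().filter('next_run <=', now).fetch(100)
        for schedule in due:
            # The cron job only dispatches; the real work runs in a task.
            taskqueue.add(url='/tasks/send_summary',
                          params={'topic': schedule.topic})
            schedule.next_run = now + datetime.timedelta(hours=schedule.interval_hours)
            schedule.put()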
For my debt-tracking app DebtsTracker.io I've implemented this manually.
When a user creates a debt record, they can specify a due date, which is stored in an unindexed DueDate field and an indexed ReminderDateTime field.
I have a cron job that queries records with ReminderDateTime < now and sends notifications. Once a notification is sent, ReminderDateTime is set to null or to the far future so it isn't picked up in the next cron run. If the user hits Remind me again, I update ReminderDateTime to some date in the future (the user decides when).
If the ReminderDateTime is closer than the cron interval, I simply put a task on the queue with an appropriate delay.
This works very well and is very cheap to run.
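Roughly, the choice between waiting for the next cron run and enqueueing a delayed task could look like this; a sketch of the described pattern with a hypothetical Debt model, not the actual DebtsTracker.io code:
import datetime

from google.appengine.api import taskqueue
from google.appengine.ext import db

CRON_INTERVAL = datetime.timedelta(minutes=10)       # assumed cron frequency
FAR_FUTURE = datetime.datetime(2999, 1, 1)

class Debt(db.Model):                                 # hypothetical model
    DueDate = db.DateTimeProperty(indexed=False)
    ReminderDateTime = db.DateTimeProperty()

def schedule_reminder(debt, remind_at):
    delay = (remind_at - datetime.datetime.utcnow()).total_seconds()
    if delay < CRON_INTERVAL.total_seconds():
        # Sooner than the next cron run: enqueue a delayed task instead.
        taskqueue.add(url='/tasks/remind',
                      params={'debt_id': str(debt.key())},
                      countdown=max(0, int(delay)))
        debt.ReminderDateTime = FAR_FUTURE            # keep it out of the cron query
    else:
        debt.ReminderDateTime = remind_at
    debt.put()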

How many App Engine instance hours should I expect?

I have just developed a mobile app that basically lets users upload and download photos; add, update, search, delete and refresh transactions; and query reports. Every action needs to submit a request to the App Engine server.
I am using Cloud Endpoints, OAuth 2.0 and Objectify to implement this App Engine backend. When I'm testing alone, I have already used up 40% of the instance-hours quota. How much should I expect to be billed for instances if 100 people use this app? How are instance hours calculated: by requests submitted, or by the time an instance spends working on multiple requests?
Is it worth it?
If my target is more than 100 users using my app, is it worth it? Could you please share what exactly I have misunderstood about instances?
Thanks
As others have commented, the question is very hard to answer. The easiest answer I can think of comes from looking at the response header "X-AppEngine-Estimated-CPM-US-Dollars". You have to be a member of the Cloud Platform project (see the Permissions page in the Cloud Platform developers console) to see this header (you can check it in your browser).
The header tells you the estimated cost of the request in US dollars, multiplied by 1000 (i.e. the cost per thousand such requests).
But think of it as an indication. If your request spawns other processes such as tasks, those costs are not included in the number you see in that header.
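For example, you could sample the header from a script; a rough sketch, with the app URL being a placeholder and assuming the request is made with the credentials of a project member (otherwise the header may simply be absent):
import requests

# Placeholder URL; App Engine only adds the header for callers it can
# identify as members of the Cloud Platform project.
resp = requests.get('https://your-app-id.appspot.com/some/handler')
cost_per_1000 = resp.headers.get('X-AppEngine-Estimated-CPM-US-Dollars')
print('Estimated cost of 1000 such requests (USD):', cost_per_1000)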
The relationship between Frontend instance hours and the number of requests is not linear either. For one, you will be charged a number of minutes (not sure if it's 15 minutes) when the instance spins up. And there are other performance settings that determine how this works.
Your best bet is to run the app for a while against real users and find out what the costs were in a given month or so.

How to send mail to all users in an app engine application

I work on a Google App Engine application which currently has about 4000 users, and I want to write a handler to send an email to all of them.
The problem is that App Engine has limitations on getting entities from the datastore. For example, the maximum number of rows that can be returned from the datastore is 1000.
I can get all users incrementally by using a loop with the limit and offset parameters of GQL, but then the maximum lifetime of a handler, which is 30 seconds, limits me.
I did some research to overcome this problem and ended up at backends, but it seems to me that backends are meant for something different and are not appropriate for this operation.
How can I achieve this task?
Thanks in advance.
from google.appengine.api import mail

mail.send_mail(sender="Example.com Support <support@example.com>",
               to="Albert Johnson <Albert.Johnson@example.com>",
               subject="Your account has been approved",
               body="""
Dear Albert:

Your example.com account has been approved. You can now visit
http://www.example.com/ and sign in using your Google Account to
access new features.

Please let us know if you have any questions.

The example.com Team
""")
Task queues give you a 10-minute deadline; see the documentation.
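As a sketch, the whole mail-out could be pushed onto a task queue, for example via the deferred library, so it runs under the 10-minute task deadline instead of the normal request deadline; UserAccount and its email property are placeholders of mine:
from google.appengine.api import mail
from google.appengine.ext import db, deferred

class UserAccount(db.Model):      # hypothetical user model
    email = db.StringProperty()

def email_all_users():
    # Runs inside a push task, so it gets the 10-minute deadline.
    for user in UserAccount.all():
        mail.send_mail(sender="Example.com Support <support@example.com>",
                       to=user.email,
                       subject="Daily reminder",
                       body="...")

# From the request handler, enqueue the work and return immediately:
deferred.defer(email_all_users)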
You can get more than 1000 items in one request. Just avoid using fetch and try this:
entities = Entity.all()   # <-- no fetch(); iterating the query pulls results in batches
for e in entities:
    mail.send_mail(sender="Example.com Support <support@example.com>",
                   to=e.email,    # assumes the entity has an email property
                   subject="Your account has been approved",
                   body="...")
This will keep fetching users until the 10-minute limit runs out: that is a lot of entities, and more than enough for 4000 users.
