How to send mail to all users in an app engine application - google-app-engine

I work on a Google App Engine application which currently has about 4000 users, and I want to write a handler to send an email to all of them.
The problem is that App Engine has limitations on getting entities from the datastore. For example, the maximum number of rows that can be returned from the datastore is 1000.
I can get all users incrementally by using a loop with the limit and offset parameters of GQL, but then the maximum lifetime of a handler, which is 30 seconds, limits me.
I did some research to overcome this problem and ended up with backends, but backend usage seems to be meant for something different; it doesn't look appropriate for this operation.
How can I achieve this task?
Thanks in advance.

from google.appengine.api import mail

mail.send_mail(sender="Example.com Support <support@example.com>",
               to="Albert Johnson <Albert.Johnson@example.com>",
               subject="Your account has been approved",
               body="""
Dear Albert:
Your example.com account has been approved. You can now visit
http://www.example.com/ and sign in using your Google Account to
access new features.
Please let us know if you have any questions.
The example.com Team
""")

Task Queues give you a 10-minute deadline. See the documentation
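For example, here is a minimal sketch of kicking the mail job off as a task; the handler URL is a placeholder, not something from the question:

from google.appengine.api import taskqueue

# Enqueue a task: the handler mapped to this URL will run under the task
# queue's 10-minute deadline instead of the 30-second request deadline.
# '/tasks/send_mail_to_all' is a placeholder URL mapped in your app.
taskqueue.add(url='/tasks/send_mail_to_all')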

You can get more than 1000 items in one request. Just avoid using fetch and iterate over the query directly:

entities = Entity.all()  # <-- no fetch; iterating the query streams results in batches
for e in entities:
    mail.send_mail(sender="Example.com Support <support@example.com>",
                   to=e.email,  # assumes your user entity has an email property
                   subject="...",
                   body="...")

This will keep fetching users until the 10-minute limit runs out: a lot of entities, and more than enough for 4000 users.

Related

Insufficient tokens for quota 'administrator' and limit 'CLIENT_PROJECT-100s' of service 'pubsub.googleapis.com' for consumer 'project_number:#'

I sometimes get the following error when creating a subscription:
Insufficient tokens for quota 'administrator' and limit 'CLIENT_PROJECT-100s' of service 'pubsub.googleapis.com' for consumer 'project_number:'
Waiting it out does the trick, but I'd like to increase the quota. In the IAM & Admin section of the Google Cloud Console, I can filter on the Pub/Sub API, but can't find the limit...
You are running up against the quota for administrative operations. In the Quotas page, under "Quota type," select "All quotas," then under "Service" select "Google Cloud Pub/Sub API." The quota you want to increase is "Administrator operations per 100 seconds," which you can raise up to the maximum allowed limit of 10,000 per 100 seconds, as detailed on the Pub/Sub quota page.
I was hitting a similar error.
I checked the quota section, as per Kamal Aboul-Hosn's suggestion, however it was already maxed out.
A workaround was to put a sleep in the code so the API wouldn't get hammered within any given 100-second window (sketched below). I hope that helps.
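For example, a minimal sketch of that workaround, assuming the google-cloud-pubsub client library; the project, topic, and subscription names are placeholders:

import time
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
project = 'your-project-id'                          # placeholder
topic = 'projects/%s/topics/your-topic' % project    # placeholder topic

for i in range(20):
    name = 'projects/%s/subscriptions/your-subscription-%d' % (project, i)
    subscriber.create_subscription(name=name, topic=topic)
    # Space the administrative calls out so they don't all land in the
    # same 100-second quota window.
    time.sleep(5)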
According to Aboul-Hosn on the GCP Slack:
"It does look like quota is pooled across users of the default application credentials and that the quota is significantly lower for users authenticated in this way. I believe when going to app engine, a service account is created (https://cloud.google.com/appengine/docs/flexible/python/access-control#using_service_accounts), so I would not anticipate this error happening when running on app engine."
So the reason this is happening is that the quota for your admin credentials is being consumed elsewhere, and as far as I know, there is no way to increase it!

Getting "Rate Limit Exceeded" when trying to add new API Project

I have 27 API projects currently set up on one Google user account, via https://console.developers.google.com. I need to add more, but whenever I try I get the following error: "Rate Limit Exceeded". I can see from the Requests column that the current projects hardly make any requests at all, so I can't see why we would have hit any limit. Is there a limit on the total number of projects you can set up against one Google user account?
Thanks!
There is a limit of 25 free projects per Google account.
Source
How many applications can I create with Google App Engine?
Each account can host 25 free applications and an unlimited number of
paid applications. If you reach the free limit, you can delete
existing applications to create more. Note that you can't re-register
an application ID.

How many App Engine instance hours should I expect?

I have just developed a mobile app which basically lets users upload and download photos, add, update, search, delete, and refresh transactions, and query reports. Every action has to submit a request to the App Engine server.
I am using Cloud Endpoints, OAuth 2.0 and Objectify to implement this App Engine backend. When I'm testing alone, 40% of the instance hours quota has already been used up. How much should I expect to be billed for instances if 100 people use this app? How are instance hours calculated: by the number of requests submitted, or by the time an instance spends working on multiple requests?
Is it worth it?
If my target is more than 100 users for my app, is it worth it? Could you please tell me what exactly I have misunderstood about instances.
Thanks
As others have commented, the question is very hard to answer. The easiest approach I can think of is to look at the response header "X-AppEngine-Estimated-CPM-US-Dollars". You have to be a member of the Cloud Platform project (see the Permissions page in the Cloud Platform developers console) to see this header (you can check it in your browser).
The header tells you the estimated cost of the request in US dollars, multiplied by 1000 (i.e. the estimated cost per thousand such requests).
But think of it as an indication. If your request spawns other processes such as tasks, those costs are not included in the number you see in that header.
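For example, here is a quick way to look at that header from Python; the URL is a placeholder, and the request has to be made while authenticated as a project member for the header to appear, so checking in your browser's developer tools while signed in is often easier:

import requests

# Placeholder URL for one of your app's handlers.
resp = requests.get('https://your-app-id.appspot.com/some-handler')
cpm = resp.headers.get('X-AppEngine-Estimated-CPM-US-Dollars')
if cpm is None:
    print('Header not present (the request was probably not authenticated).')
else:
    print('Estimated cost per 1000 such requests: $' + cpm)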
The relationship between frontend instance hours and the number of requests is not linear either. For one, you will be charged a number of minutes (I'm not sure if it's 15 minutes) when the instance spins up. And there are other performance settings that determine how this works.
Your best bet is to run the app for a while against real users and find out what the costs were in a given month or so.

Salesforce API 10 request limit

I read somewhere that the Salesforce API has a 10-request limit. If we write code to integrate with Salesforce:
1. What is the risk of this limit?
2. How can we write code to mitigate this risk?
My real concern is that I don't want to build our customer this great standalone website that integrates with Salesforce, only to have users 11 and 12 kicked out to wait until requests 1-10 are complete.
Edit:
Some more details on the specifics of the limitation can be found at http://www.salesforce.com/us/developer/docs/api/Content/implementation_considerations.htm. Look at the section titled limits.
"Limits
There is a limit on the number of queries that a user can execute concurrently. A user can have up to 10 query cursors open at a time. If 10 QueryLocator cursors are open when a client application, logged in as the same user, attempts to open a new one, then the oldest of the 10 cursors is released. This results in an error in the client application.
Multiple client applications can log in using the same username argument. However, this increases your risk of getting errors due to query limits.
If multiple client applications are logged in using the same user, they all share the same session. If one of the client applications calls logout(), it invalidates the session for all the client applications. Using a different user for each client application makes it easier to avoid these limits."
Not sure which limit you're referring to, but the governor limits are all listed in the Apex documentation. These limits apply to code running in a given Apex transaction (i.e. in response to a trigger/web service call etc), so adding more users won't hurt you - each transaction gets its own allocation of resources.
There are also limits on the number of long-running concurrent API requests and total API calls in a day. Most of these are per-license, so, again, as the number of users rises, so do the limits.
A few comments on:
I don't want to build our customer this great standalone website that integrates with Salesforce, only to have users 11 and 12 kicked out to wait until requests 1-10 are complete.
There are two major things you need to consider when planning real-time Sfdc integration, besides the API call limits mentioned in metadaddy's answer (and if you make a lot of queries, it's easy to hit those limits):
Sfdc has routine maintenance outage periods.
Querying Sfdc will always be significantly slower than querying a local data source.
You may want to consider a local mirror where you replicate your Sfdc data.
Cheers,
Tymek
All API usage limits are calculated over a 24-hour period.
Limits apply to the whole organization, so if you have several users connecting through the API, all of them count against the same limit.
You get 1,000 API requests per Salesforce user. Even Unlimited Edition is actually limited to 5,000 per user.
If you want to check your current API usage status, go to Your Name | Setup | Company Profile | Company Information.
You can purchase additional API calls.
You can read more in the Salesforce API Limits documentation.
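If you would rather check usage programmatically, newer API versions also expose a REST "limits" resource. Here is a minimal sketch, assuming you already have an OAuth access token and instance URL (both placeholders below):

import requests

INSTANCE_URL = 'https://yourInstance.salesforce.com'  # placeholder
ACCESS_TOKEN = 'your-oauth-access-token'              # placeholder

resp = requests.get(INSTANCE_URL + '/services/data/v52.0/limits',
                    headers={'Authorization': 'Bearer ' + ACCESS_TOKEN})
resp.raise_for_status()
api = resp.json()['DailyApiRequests']
print('Daily API requests remaining: %s of %s' % (api['Remaining'], api['Max']))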

Datastore & 30 second request limit

I'm writing an App Engine app; one of its duties is to email all the users every night (I know that I'll have to enable billing to email that many users -- that's no problem).
I'm just worried about the 30-second request limit; if I have thousands of users and I have to mail them all a daily reminder, won't that limit be hit sometime soon?
Any ideas on how I can handle this problem? Or maybe App Engine isn't right for this type of application...? Thanks!
Use the task queue: each task emails N users (the number you determine you can safely email well within the 30-second limit) and queues up another task to email the next N, and so on.
Brett Slatkin's video has more about the best ways to split up such "batch" tasks for the purpose of running them effectively on App Engine.
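A minimal sketch of that pattern with the Python task queue and datastore APIs; the handler URL, the User model, the sender address, and the batch size are all placeholders/assumptions, not something from the original answer:

from google.appengine.api import mail, taskqueue
from google.appengine.ext import db
import webapp2

BATCH_SIZE = 100  # a number you can safely email well within the deadline


class User(db.Model):            # placeholder model
    email = db.StringProperty()


class NightlyMailHandler(webapp2.RequestHandler):
    def post(self):
        cursor = self.request.get('cursor')
        query = User.all()
        if cursor:
            query.with_cursor(cursor)   # resume where the previous task stopped
        users = query.fetch(BATCH_SIZE)
        for user in users:
            mail.send_mail(sender="support@example.com",   # placeholder sender
                           to=user.email,
                           subject="Daily reminder",
                           body="...")
        if len(users) == BATCH_SIZE:
            # More users may remain: chain another task that continues
            # from the current query cursor.
            taskqueue.add(url='/tasks/nightly_mail',
                          params={'cursor': query.cursor()})


app = webapp2.WSGIApplication([('/tasks/nightly_mail', NightlyMailHandler)])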
