Not sure where to find limits for Google Places API free use - mobile

How much will I pay per month for 10k DAU for an average geolocation mobile game using Google Places (for POI)?

Here's a link to the pricing and plans comparison, but the limits are stated in requests, so you need to calculate the average number of requests per user for your app and multiply that by your 10k DAU.
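A back-of-the-envelope conversion might look like the sketch below; the per-request price, monthly credit and requests-per-user figures are placeholders, so substitute the numbers from the current pricing sheet:

# Rough monthly cost estimate for a Places-backed feature.
# All numeric values below are illustrative placeholders, not real prices.
dau = 10000                # daily active users
requests_per_user = 5      # assumed average Places requests per user per day
price_per_1000 = 17.00     # assumed USD per 1000 requests (check current pricing)
monthly_credit = 200.00    # assumed monthly free usage credit (check current terms)

monthly_requests = dau * requests_per_user * 30
gross_cost = monthly_requests / 1000.0 * price_per_1000
net_cost = max(0.0, gross_cost - monthly_credit)
print("Requests/month: %d, estimated bill: $%.2f" % (monthly_requests, net_cost))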

Related

Using the YouTube Data API V3 – quota very quickly runs out

I am making a Discord bot using discord.js and am using the YouTube API for music commands.
However, I managed to use up all of my quota just in testing. The bot can load playlists, search for songs, accept URLs, and uses the ytdl-core module to download the videos. Is it normal to use this much quota, or is something wrong? I also noticed that you cannot get a quota extension without having an organisation, which I do not.
I'm running into the same problem. The YouTube Data API has a very low quota. While 10,000 queries per day sounds reasonable, in practice it is much lower. First, each search query is billed at 100 units, so divide 10,000 by 100 and that's how many search calls you are really allowed to make. Furthermore, results are paginated with a maximum of 50 results (videos) per page. So in reality, YouTube will let you access data on about 5,000 videos per day through the YouTube Data API's search functionality. To confirm the math, I just checked my quota for the day: after querying the API for 10 pages of results (50 videos per page, 500 videos total), my usage is listed at 1,000 out of the 10,000 daily allotment. Essentially, the YouTube Data API is useless for anything search-heavy.
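To make that arithmetic explicit, here is a minimal sketch using the unit costs described above (100 units per search call, 50 results per page, 10,000 units per day):

# YouTube Data API v3 quota arithmetic as described in the answer above.
DAILY_QUOTA = 10000        # default daily quota, measured in "units"
SEARCH_COST = 100          # units charged per search.list call
RESULTS_PER_PAGE = 50      # maximum results returned per call

searches_per_day = DAILY_QUOTA // SEARCH_COST            # 100 search calls
videos_per_day = searches_per_day * RESULTS_PER_PAGE     # 5,000 video results
pages_fetched = 10
units_used = pages_fetched * SEARCH_COST                 # 1,000 units for 500 videos
print("%d searches/day -> %d videos/day; %d pages cost %d units"
      % (searches_per_day, videos_per_day, pages_fetched, units_used))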

Track user-selected values on an ecommerce (amazon.com) website (GAE Python) using the Google Analytics API

I have designed an ecommerce website like amazon.com and want to collect user analytics data for it. I am still evaluating options, and the Google Analytics API looks good as it does not charge anything for a fair amount of usage, and it should be fast.
What information do I want to collect about user behavior on my website:
- Which "dropdown" menu values they select (say, for narrowing down the exact product category)
- Which "checkboxes" they select to filter by price range, size, type, or a specific feature
- Which result entries (matched products) on the search page they click
- The time of day when the above happened
- How long the user viewed a page (with an upper limit for cases when a user opens a tab and forgets to close it)
I hope the Google Analytics API has some feature to prepare charts based on the above data, as well as provide average monthly values for the above counts.
I decided to use an external analytics tool, since writing into NDB would take time, increase write operations and increase the storage requirement.
Any guidance is highly appreciated. Anyone who has implemented similar analytics for their website, please share your advice.
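One common approach for this kind of interaction tracking is to send Google Analytics events instead of writing raw clicks to NDB. The sketch below sends an event hit over the Universal Analytics Measurement Protocol, assuming the GAE Python 2 runtime mentioned in the title; the tracking ID and the category/action/label names are placeholders, not a recommended schema:

# Hypothetical server-side event hit via the GA Measurement Protocol.
# The tracking ID and event fields below are placeholders.
import urllib
import urllib2

def track_event(client_id, category, action, label=None, value=None):
    payload = {
        'v': '1',                 # protocol version
        'tid': 'UA-XXXXXX-Y',     # placeholder tracking ID
        'cid': client_id,         # anonymous client id, e.g. a per-visitor UUID
        't': 'event',             # hit type
        'ec': category,           # event category, e.g. 'filter'
        'ea': action,             # event action, e.g. 'price_range_selected'
    }
    if label is not None:
        payload['el'] = label     # event label, e.g. '$50-$100'
    if value is not None:
        payload['ev'] = str(value)
    urllib2.urlopen('https://www.google-analytics.com/collect',
                    urllib.urlencode(payload))

# Example: record that a user ticked a "free shipping" checkbox.
# track_event('visitor-555', 'filter', 'checkbox_selected', 'free_shipping')

Dropdown selections, checkbox filters and result clicks each map naturally to an event category/action pair, and GA's standard reports already provide hourly breakdowns and monthly averages for event counts.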

In-app credit/currency using the PayPal API

I am building a web app in CakePHP that involves a credit system.
Credits are bought with real money; at some point in the future they are spent, and a portion of the credits' value goes to me while the rest goes to a third party.
So far no problem but...
If I use something like PayPal's Express Checkout, PayPal takes a cut when the credits are initially bought and another cut when the third party is paid.
Or I can use PayPal's Adaptive Payments and do a Delayed Chained Payment, but that has a maximum delay of 90 days.
So my question is do I have any other options?
There are only two options:
- Use another payment service provider; they'll all take their share for the service, though PayPal has become comparatively expensive.
- Add the fees on top of what the user pays for the credits; you'll have to calculate them in advance.
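For that second option, the usual gross-up formula is charge = (net_amount + fixed_fee) / (1 - percentage_fee). A small sketch, shown in Python for brevity; the fee figures are placeholders, not PayPal's actual rates, so check your rate card:

# How much to charge so that, after the payment provider's cut,
# you still net the intended credit value.
# The fee figures below are placeholders, not PayPal's actual rates.
def price_with_fees(net_amount, pct_fee=0.029, fixed_fee=0.30):
    return (net_amount + fixed_fee) / (1 - pct_fee)

charge = price_with_fees(10.00)            # credits worth $10.00 to you
print("Charge the buyer $%.2f" % charge)   # roughly $10.61 with these sample fees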

Web application design strategy on App Engine

I'm trying to develop a gaming site. Users can add other users as friends, and a user earns points as he completes various game levels. I need to show, on each game's page, the average points of all of the user's friends who have already played that game (for example: when a user plays game A, the average of the points earned by his friends on game A is displayed on the game A page; similarly, the average for game B is shown when he plays game B).
My approach:
- Store each user's friend list (max 1000) as a multi-valued property in the datastore and load it into GAE memcache when the user logs into the site.
- Use a resident backend to cache all users' game data (points earned for each specific game). A cron job refreshes the backend cache every hour.
- When a user requests a game page (e.g. game A) for the first time, the request handler asks the backend to compute the average of his friends' points via the URL Fetch service.
- The backend gets the user's friend list (max 1000) from memcache, fetches the friends' game A points from its in-memory cache, and returns the computed average.
- After getting the average, the request handler persists it in the datastore and also stores it in memcache, so subsequent requests to the game A page read it from memcache/datastore without any computation on the backend. The average is valid for 1 hour and is re-computed on the next request after that (a rough sketch of this read path follows below).
My questions:
- Is the above approach a reasonable way to solve this problem?
- How do I implement an in-memory cache efficiently and reliably within a backend instance (Python 2.7)?
- How do I estimate the memory and CPU required at the backend for this job alone? (Assume 0.1 million key-value pairs have to be stored with "userid/gamename" as the key and the user's points as the value, and a friend list of at most 1000.)
- If I have to use multiple backend instances as load increases, how do I load-balance them?
Thanks in advance.
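A rough sketch of the read path described above, assuming the Python 2.7 runtime; the model, key convention and cache-key format are hypothetical:

# Hypothetical sketch of the "average of friends' points" read path.
from google.appengine.api import memcache
from google.appengine.ext import ndb

class GameScore(ndb.Model):
    # key id convention (made up for this sketch): '<user_id>/<game_name>'
    points = ndb.IntegerProperty(default=0)

def friends_average(user_id, game_name, friend_ids):
    cache_key = 'avg:%s:%s' % (user_id, game_name)
    avg = memcache.get(cache_key)
    if avg is not None:
        return avg
    keys = [ndb.Key(GameScore, '%s/%s' % (fid, game_name)) for fid in friend_ids]
    scores = [s.points for s in ndb.get_multi(keys) if s is not None]
    avg = float(sum(scores)) / len(scores) if scores else 0.0
    memcache.set(cache_key, avg, time=3600)   # re-computed after an hour
    return avg

With get_multi and memcache in front, the per-request cost is one batched datastore read at most once per hour per (user, game) pair, which may remove the need for a separate resident backend.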
Have a look at this blog post from Nick Johnson about counters: http://blog.notdot.net/2010/04/High-concurrency-counters-without-sharding
Use the NDB datastore for:
- automatic caching, instead of managing your own memcache
- newer property types, such as a JSON property with compression and repeated properties, which act like Python lists
And have a look at mapreduce for efficient bulk updating.
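A minimal NDB model along those lines (the property names are illustrative): a repeated property for the friend list and a compressed JSON property for per-game points:

# Illustrative NDB model: a repeated property behaves like a Python list,
# JsonProperty(compressed=True) stores a dict of game -> points.
from google.appengine.ext import ndb

class Player(ndb.Model):
    friend_ids = ndb.StringProperty(repeated=True)    # up to ~1000 friend ids
    game_points = ndb.JsonProperty(compressed=True)   # e.g. {'gameA': 120, 'gameB': 45}

player = Player(id='user123',
                friend_ids=['user456', 'user789'],
                game_points={'gameA': 120, 'gameB': 45})
player.put()    # NDB transparently caches gets in memcache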

Geocoding out of Salesforce.com (large number of static addresses)

We have developed an application in Salesforce.com to geocode and display account information (address information) on Google Maps within Salesforce.com. We have around 750k addresses for the initial (one-time) geocoding process (static address data), as we store the lat/lon with the account.
Is there any way to get this geocoding done in one 'batch', as we seem to hit the daily allowance of requests (3k per day)?
Thanks for your support and feedback.
If you'd like to use Google to do this and you need to precompute the lat/lon coordinates, you'll need to upgrade to the Google Maps API for Business. The table on that link provides a detailed overview of the limit differences, and I suspect that the terms of use may also be more in line with what you're developing.
If your goal is to create a Google Map from the addresses, you may also consider just passing the address data instead of the latitude and longitude. Their static maps will work fine as long as you know which locations you'd like to show on the map, you don't have too many locations in one map, and you don't need interactivity. Otherwise, you should use the JavaScript API, and their documentation for that is very good.
If you don't know which locations you'd like to use on the map, you could use the lat/lon coordinates of their zip codes (which are freely available) to create a view port.
If you'd like a static map of the locations (whether or not you precompute the lat/lon), I'd recommend taking a look at the apexgooglestaticmap Github repository. It makes it quite easy to make certain types of static Google maps in APEX and VisualForce. Here's an example from that README:
APEX Controller
String[] homes = new String[]{'Albany, NY','Wellesley, MA','New York, NY','Pittsburgh, PA','01945','Ann Arbor, MI','Chicago, IL'};
GoogleStaticMap.MapPath moves = new GoogleStaticMap.MapPath(homes).color('0x000000ff');
String movesUrl = new GoogleStaticMap().addPath(moves).url;
Visualforce Page
<apex:image value="{!movesUrl}" />
Google Maps URL and Image
https://maps.google.com/maps/api/staticmap?sensor=false&size=500x350&markers=label:0%7CAlbany%2C+NY&markers=label:1%7CWellesley%2C+MA&markers=label:2%7CNew+York%2C+NY&markers=label:3%7CPittsburgh%2C+PA&markers=label:4%7C01945&markers=label:5%7CAnn+Arbor%2C+MI&markers=label:6%7CChicago%2C+IL&markers=label:0%7CAlbany%2C+NY&markers=label:1%7CWellesley%2C+MA&markers=label:2%7CNew+York%2C+NY&markers=label:3%7CPittsburgh%2C+PA&markers=label:4%7C01945&markers=label:5%7CAnn+Arbor%2C+MI&markers=label:6%7CChicago%2C+IL&path=weight:5%7Ccolor:0x000000ff%7CAlbany%2C+NY%7CWellesley%2C+MA%7CNew+York%2C+NY%7CPittsburgh%2C+PA%7C01945%7CAnn+Arbor%2C+MI%7CChicago%2C+IL&
There are also many other geocoding APIs available. They each have their own terms of service, so make sure that your application matches the acceptable uses before you build anything with them. I've personally used PCMiler Web Services with success, and Geonames for reverse geocoding (they have a number of other great features too).
Google only allows that maximum per IP address. If you can spread the requests over more locations (run some at work, at home, or maybe your local Starbucks), you would be able to hit the goal easily. If you have more than 250 employees, just assign the 'homework' of going home and geocoding a batch! Spread the wealth, and the fun, of geocoding!
The other solution is using the Google Maps API for Business, which allows you to do a maximum of 100,000 requests per day. For more information, look at: http://code.google.com/apis/maps/documentation/geocoding/ ; for information specifically on the Google Maps API for Business: http://code.google.com/apis/maps/documentation/business/index.html
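If you stay on the standard Geocoding web service, the practical pattern is to spread the 750k addresses over many days and persist each result as you go, so a rate-limited run can simply resume later. A rough sketch in Python; the key, the daily budget and the save_latlon() hook are placeholders:

# Sketch: drain a queue of addresses against an assumed daily request budget.
import json
import time
import urllib
import urllib2

API_KEY = 'YOUR_KEY'       # placeholder
DAILY_BUDGET = 2500        # assumed daily request allowance; check your plan's limit

def geocode(address):
    params = urllib.urlencode({'address': address, 'key': API_KEY})
    url = 'https://maps.googleapis.com/maps/api/geocode/json?' + params
    data = json.load(urllib2.urlopen(url))
    if data['status'] == 'OK':
        location = data['results'][0]['geometry']['location']
        return location['lat'], location['lng']
    return None

def run_daily_batch(addresses, save_latlon):
    for address in addresses[:DAILY_BUDGET]:
        result = geocode(address)
        if result:
            save_latlon(address, *result)   # persist immediately so the job can resume
        time.sleep(0.2)                     # stay under the per-second rate limit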
I use this geocoder: I add my list and use Yahoo to get the lat and long.
We have FindNearby installed, so once I get all the addresses geocoded, I add them back into my Excel worksheet in the column matching the Record ID. Since we use FindNearby, I have to add a column titled Mapping Status, with all cells under it set to Located. I then use the Apex Data Loader to mass-upload into Salesforce. Note: the geocoder has a limit of only 1000 records per single "code" click, but there is no limit on how many lists you can process per day.
I changed some code to use Bing Maps services instead of Google. Their limits are based on keys and are much larger. Also, when a key hits its maximum number of requests, you can just get a new free key and apply it.
