Assume that I upload 1 GB (one gigabyte) of data to my GAE blobstore every day. How can I calculate my storage cost at the end of the first year?
At the end of the year you will have 365 GB, which costs $0.13 (at today's prices) per GB per month, so for blob storage alone you will pay $47.45 per month (~$1.50 per day).
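Since the question asks for the cost at the end of the first year, it may also help to sum the cost as the data accumulates. A minimal sketch, assuming a flat $0.13/GB/month rate pro-rated daily over 30-day months (both assumptions; check current pricing):

    RATE = 0.13            # $ per GB per month
    DAYS_PER_MONTH = 30.0

    # Bill for the final month, when all 365 GB are stored:
    print("Final month: $%.2f" % (365 * RATE))            # -> $47.45

    # Cumulative cost over the first year: on day d you are storing d GB,
    # so you accrue d * RATE / DAYS_PER_MONTH dollars that day.
    total = sum(d * RATE / DAYS_PER_MONTH for d in range(1, 366))
    print("First-year total: ~$%.2f" % total)             # -> ~$289.45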
My simple Python website hosted on App Engine got some increase in traffic. It went from a total of 447 visitors and 860 views in September (peak of 33 visitors in a day) to 1K visitors and 1.5K views in October (peak of 61 visitors in a day).
Meanwhile the cost went from $0.00 in September to $10.66 in October. The cost breakdown shows that the entire amount is attributed to frontend instances, totaling 930.15 hours of usage. That is about 30 hours a day.
I had already set my max_instances and max_idle_instances to 1. With a single instance running, how is it possible to have 30 hours of usage in a day that only lasts 24 hours?
I am using the F4 instance class - once a month I parse an Excel sheet (which doesn't depend on the number of visits), and smaller instance classes were exceeding the soft memory limit of 256 MB. Also, my front end is optimized and fits in less than 30 KB. So with only 1.5K views a month, how can I have that many frontend instance hours?
It is possible to consume more than 24 billed hours in a day because an F4 instance consumes four instance hours per hour of uptime.
See here.
For example, if you use an F4 instance for one hour, you see "Frontend Instance" billing for four instance hours at the F1 rate.
And App Engine bills based on how many hours your instances are up; even with no traffic, you may still be billed.
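As a back-of-the-envelope check of the numbers in the question (930.15 billed hours over a 31-day month, F4 billed at four times the F1 rate):

    F4_MULTIPLIER = 4
    billed_hours = 930.15    # October's bill, from the question
    days = 31

    uptime = billed_hours / F4_MULTIPLIER
    print("Real uptime: %.1f h total, %.1f h/day" % (uptime, uptime / days))
    # -> 232.5 h total, about 7.5 h/day -- well within 24 h for one instance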
I am trying to calculate inventory sales based on missing inventory, broken up into hours per day. (Each hour has a sequence number that references the timestamp, i.e. sequence #1 = 0:00-1:00.) I need to get the average sales per hour over the course of a 3-month period. On top of that, I also need to throw out upward readings in inventory (restocks).
Here is a snippet of my data:
UPDATE:
My plan is to take the current inventory level (InvValue) and subtract the previous inventory level (InvValue) for that hour from it. My problem is that I don't know how to structure the loop so that the data is grouped under the right hour (inv_sequence).
My end goal is to have:
hour 1 average : hour 2 average : hour 3 average
etc.
"average sales per hour" = count of total sales / (days * 24)
You are either overcomplicating things, or your specification of the task is not accurate...
All you need is the Average of Inventory per Hour?
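If the delta-per-hour plan described in the question is really what you need, here is a minimal sketch in plain Python. The names inv_sequence and InvValue follow the question; the row format is an assumption:

    from collections import defaultdict

    def hourly_sales_averages(rows):
        """rows: (inv_sequence, inv_value) tuples in chronological order."""
        sales_by_hour = defaultdict(list)
        prev_value = None
        for inv_sequence, inv_value in rows:
            if prev_value is not None:
                delta = prev_value - inv_value   # drop in inventory = units sold
                if delta >= 0:                   # throw out "up" readings (restocks)
                    sales_by_hour[inv_sequence].append(delta)
            prev_value = inv_value
        return {hour: sum(v) / float(len(v)) for hour, v in sales_by_hour.items()}

Printing the result keyed by hour gives the "hour 1 average : hour 2 average : ..." output described above.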
I'm deciding between GSS (Google Site Search) and CSE (Custom Search Engine) with the JSON API, but I'm a little bit confused about JSON API billing.
My approved starting budget is $100 per year, which buys 20,000 queries/year in GSS. How many queries will I get with the JSON API, and how must I set the quota so as not to exceed the budget?
Here is my understanding of how Google does the billing:
The price of 1 query is $0.005 = $5 / 1000 queries => https://developers.google.com/custom-search/json-api/v1/overview#pricing
Google adds up the day's queries (beyond the 100 free ones) and then bills monthly. So my quota has to be set to 154 (100 free + 54):
54 queries per day * 31 days * 12 months = 20,088 queries * $0.005 = $100.44, which is the maximum I will pay (likely less).
Am I right? Or does Google bill in a different way?
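The arithmetic does check out under the assumption that Google simply charges $0.005 per query beyond the 100 free queries per day; a sketch:

    PRICE_PER_QUERY = 5.0 / 1000   # $0.005, from the pricing page above
    FREE_PER_DAY = 100
    paid_per_day = 54              # paid queries allowed per day

    daily_quota = FREE_PER_DAY + paid_per_day              # 154
    max_yearly = paid_per_day * 31 * 12 * PRICE_PER_QUERY  # worst case: 31-day months
    print("Quota: %d/day, max yearly cost: $%.2f" % (daily_quota, max_yearly))
    # -> Quota: 154/day, max yearly cost: $100.44

Since not every month has 31 days, the real worst case is slightly lower.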
My GAE app will request weekly data from Google Analytics, such as:
number of visitors during the last week
number of visitors of a particular page during the last week
etc.
Then I would like to show this data on my GAE web page with Google Charts. The data will be shown for the last X weeks (say, 10 weeks).
What is the best approach to store this data (number of metrics multiplied by number of weeks)? Old data can be deleted.
I don't think I should use a datastore model like:
class Visitors(ndb.Model):
    week1 = ndb.IntegerProperty(default=0)  # should store week start and end dates also
    week2 = ndb.IntegerProperty(default=0)
    ...
Probably it would be better to store the data like:
class Analytics(ndb.Model):
    visitors = ndb.StringProperty(default='')  # comma-separated values like '1000,1001,1002'; last value is the previous week
    page_visitors = ndb.IntegerProperty(repeated=True)  # [1000, 1001, 1002]; a repeated property cannot take a default
    ...
What are you trying to optimize?
With this amount of data, you will pay pennies, or less, for data storage. You are well within the free quota on datastore reads and writes. Performance-wise, the difference is negligible.
I would recommend going with the most straightforward solution: each week is a new entity, each data point is in its own property.
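A minimal sketch of that layout (the model and property names here are illustrative, not from the question):

    from google.appengine.ext import ndb

    class WeeklyAnalytics(ndb.Model):
        week_start = ndb.DateProperty(required=True)
        visitors = ndb.IntegerProperty(default=0)
        page_visitors = ndb.IntegerProperty(default=0)

    def last_weeks(n=10):
        """Most recent n weeks, newest first, ready for charting."""
        return (WeeklyAnalytics.query()
                .order(-WeeklyAnalytics.week_start)
                .fetch(n))

Deleting old data is then just deleting old entities, with no schema change.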
I need a NoSQL database for writing continuous log data: approx. 100 writes per second, where a single record contains 3 columns and is less than 1 KB. Reads are needed only once a day, after which I can delete all the daily data. But I can't decide which is the cheaper solution: Google App Engine and Datastore, or Heroku and MongoLab?
I can give you costs for GAE:
Taking the billing docs and assuming about 259M operations per month (86,400 seconds per day * 100 writes/s * 30 days), this would cost you:
Writing: 259M records * ($0.20 / 100k ops) ≈ $518 for writing unindexed data
Reading: 259M records * ($0.07 / 100k ops) ≈ $181 for reading each record once
Deleting: 259M records * ($0.20 / 100k ops) ≈ $518 for deleting unindexed data
Storage: 8.64M entities at 1 KB per day = 8.64 GB per day ≈ 259 GB accumulated over a month, averaging ~130 GB
Storage cost: 130 GB * $0.12/GB ≈ $16 / month
So your total operating cost on GAE would be roughly $1,230 per month. Note that using a structured database for writing unstructured data is not optimal, and that is reflected in the price.
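The same arithmetic as a script, using the rates quoted above (rates change, so treat the output as a sketch):

    WRITES_PER_SEC = 100
    OPS = WRITES_PER_SEC * 86400 * 30      # ~259M operations per month

    write_cost  = OPS / 100000.0 * 0.20    # unindexed writes
    read_cost   = OPS / 100000.0 * 0.07    # each record read once
    delete_cost = OPS / 100000.0 * 0.20    # unindexed deletes

    # 1 KB per entity: storage grows to ~259 GB by month's end, averaging half that
    storage_cost = (OPS / 1e6) / 2 * 0.12

    print("total: ~$%.0f / month" % (write_cost + read_cost + delete_cost + storage_cost))
    # -> total: ~$1234 / month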
With App Engine, it is recommended that you use memcache for operations like this, and memcache does not incur database charges. Using Python 2.7 and ndb, memcache is used automatically, and you will get at most 1 database write per second (a sketch of the buffering idea follows the numbers below).
At current billing:
6 cents per day for reads/writes.
Less than $1 per day for storage.
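One way to read that suggestion is to buffer rows in memcache and persist them in batches. A rough sketch (the key scheme is made up, and memcache entries can be evicted at any time, so buffered rows can be lost):

    from google.appengine.api import memcache

    def buffer_row(row):
        # Atomic counter gives each buffered row its own key (hypothetical scheme).
        seq = memcache.incr('log_seq', initial_value=0)
        memcache.set('log_row:%d' % seq, row)

    def drain(start, end):
        # Fetch buffered rows [start, end] in one round trip for a batched write.
        keys = ['log_row:%d' % i for i in range(start, end + 1)]
        return memcache.get_multi(keys)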