How does Google Channel API pricing work? [closed] - google-app-engine

I read in the official docs that opening a new channel costs $0.01 and that a channel lasts 2 hours.
So if I have 1000 concurrent users who each use my site for 2 hours daily, the total cost will be 1000 * $0.01 = $10 per day, plus bandwidth and CPU costs. Right?
Do they also charge by the hour? I.e. if the users use the site for 4 hours daily, will the total cost be 1000 * $0.01 * 2 = $20?

It's only $0.01 per 100 channels, which equates to $0.0001 per channel. You can also change the lifetime of the channel token from the default 2 hours (you can make it greater or smaller), so you can effectively reuse channel tokens, depending on how they're used in your application.
So, if you leave the channel token lifetime at 2 hours, it would be 1000 * $0.0001 * 2 = $0.20 per day for the cost of channel token creation alone (the factor of 2 corresponding to your four-hour example, where each user consumes two tokens).
The rest of the cost, as you've indicated here, will depend on your bandwidth, CPU, and other server-side usage costs.
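A quick sketch of that arithmetic in Python (the per-channel price and the 2-hour token lifetime come from the figures above; the usage numbers are the ones from the question):

    # Rough channel-token cost estimate; not an official pricing tool.
    price_per_channel = 0.01 / 100                # $0.01 per 100 channels
    users = 1000                                  # concurrent daily users
    hours_per_day = 4                             # time each user spends on the site
    token_lifetime_hours = 2                      # default channel token lifetime

    tokens_per_user = hours_per_day / token_lifetime_hours   # tokens consumed per user per day
    daily_token_cost = users * tokens_per_user * price_per_channel
    print(daily_token_cost)                       # 0.2 -> $0.20/day, before bandwidth/CPU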
Seems like the calculation shown in https://developers.google.com/appengine/docs/billing is also wrong.

Related

How to formulate the LP problem and solve it in CPLEX using a set (ranges) [closed]

Exercise question:
Resource data table:
The Dakota Furniture Company manufactures desks, tables, and chairs. The manufacture of each type of furniture requires lumber and two types of skilled labor: finishing and carpentry.
The amount of each resource needed to make each type of furniture is given in Table 2.
Currently, 48 board feet of lumber, 20 finishing hours, and 8 carpentry hours are available.
A desk sells for $60, a table for $30, and a chair for $20.
Dakota believes that demand for desks and chairs is unlimited, but at most five tables can be sold.
Because the available resources have already been purchased, Dakota wants to maximize total revenue.
My current code:
The maximum objective value Z should be 280 with x1 = 2, x2 = 0, x3 = 8, but I am getting Z = 180.
What is this supposed to mean:
forall (i in I, j in J)
sum(j in J) x[i][j] <= S[i];
The index j is declared in the forall and again in the sum, so the inner sum shadows the outer j. Similarly for the other constraints.
You may want to generate an LP file to see what OPL is actually generating. It is likely different from what you think.
OPL should really not accept this kind of input.
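For reference, here is a minimal sketch of the LP with scipy. Table 2 is not reproduced above, so the per-unit resource requirements below are taken from the standard textbook statement of the Dakota problem and are an assumption; the prices, resource limits, and the five-table demand bound come from the question. It reproduces the expected optimum Z = 280 with x1 = 2, x2 = 0, x3 = 8, which gives you something to compare your OPL model against:

    # Sketch of the Dakota LP; the Table 2 coefficients are assumed, not taken from the post.
    from scipy.optimize import linprog

    c = [-60, -30, -20]              # maximize 60*x1 + 30*x2 + 20*x3 (linprog minimizes)
    A = [[8, 6, 1],                  # board feet of lumber per desk, table, chair
         [4, 2, 1.5],                # finishing hours
         [2, 1.5, 0.5]]              # carpentry hours
    b = [48, 20, 8]                  # available lumber, finishing hours, carpentry hours
    bounds = [(0, None), (0, 5), (0, None)]   # at most five tables can be sold

    res = linprog(c, A_ub=A, b_ub=b, bounds=bounds, method="highs")
    print(res.x, -res.fun)           # [2. 0. 8.] 280.0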

Binary classification of sensor data using minimal code space [closed]

I am trying to classify the events above as 1 or 0: 1 would be the lower values and 0 would be the higher values. Usually the data does not look as clean as this. Currently the approach I am taking is to use two different thresholds, so that in order to go from 0 to 1 the signal has to go past the 1-to-0 threshold and stay above it for 20 sensor values. This threshold is set to the highest value I receive minus ten percent of that value. I don't think a machine learning approach will work because I have too few features to work with, and the implementation has to take up minimal code space. I am hoping someone can point me in the direction of a known algorithm that applies well to this sort of problem; googling and checking my other sources isn't producing great results. The current implementation is very effective and the hardware isn't going to change.
Currently the approach I am taking is to have two different thresholds so that in order to go from 0 to 1 it has to go past the 1 to 0 threshold and it has to be above for 20 sensor values
Calculate the area on your graph of those 20 sensor values. If the area is greater than a threshold (perhaps half the peak value) assign it as 1, else assign it as 0.
Since your measurements are one unit wide (pixels, or sensor readings) the area ends up being the sum of the 20 sensor values.
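A minimal sketch of that idea (the 20-sample window comes from the question; treating "half the peak value" as half the peak per sample is just one reading of the suggestion):

    # Classify a window of sensor readings by its area (sum), as suggested above.
    def classify_window(readings, peak):
        threshold = 0.5 * peak * len(readings)   # "half the peak value", applied per sample
        area = sum(readings)                     # unit-width samples, so area == sum
        return 1 if area > threshold else 0      # this answer's convention; invert if 1 must mean the low state

    # Usage: label = classify_window(last_20_values, peak_value_seen_so_far)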

Estimating data logging size [closed]

I have a device generating some number of values, say N, each value being 32 bits.
I am logging these values every 10 seconds by writing a new row to an Excel file, and I will create a new file every day.
I have to estimate the hard disk storage capacity necessary to store these log files for a period of 10 years.
Can someone give any hints on calculating the size of the log file generated per day?
Assuming the worst case, a 2's-complement 32-bit value written as ASCII...
-2147483648 is 11 characters; call it 13 bytes per value to allow for a separator
1 value / 10 seconds
3600 seconds / hour
24 hours / day
that's 112,320 bytes per day, per value N
"round" that up to 112,640 bytes (divisible by 1024) per day
365.25 days per year
10 years
that's N * 411,417,600 bytes, or roughly N * 400 MB
So if N were 10, that would be a bit over 4 GB.
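The same arithmetic as a quick Python sketch (using the unrounded 112,320 bytes/day figure; the 13 bytes per value and the 10-second period come from the estimate above):

    # Rough log-size estimate for worst-case 32-bit values stored as ASCII text.
    bytes_per_value = 13                          # "-2147483648" plus a separator, worst case
    rows_per_day = 24 * 3600 // 10                # one row every 10 seconds -> 8640 rows/day
    bytes_per_day = rows_per_day * bytes_per_value            # 112,320 bytes/day per value
    bytes_per_10_years = bytes_per_day * 365.25 * 10          # per value of N

    N = 10
    print(N * bytes_per_10_years / 1e9)           # ~4.1 GB for N = 10 values per row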
Create a sample spreadsheet, add 1000 rows, and save it under a different name. That will give an estimate of the per-row cost.
Incremental writing is not a good scenario for a complex format such as a spreadsheet: a spreadsheet tends to be rewritten in full on each flush, whereas a text log file can simply be appended to.

Maximum weight for USPS service [closed]

I am trying to integrate the USPS shipping service and I have it working. But if I exceed the weight limit I get a warning message like "Warning - The package weight cannot exceed 70 pounds." I want to set a maximum weight limit for USPS on my side. Does USPS only provide service up to a maximum of 70 lbs? Is it possible to set a max weight limit?
Googling usps maximum package weight tells me that they indeed have a hard limit of 70 pounds.
Remember, we won't accept an item that's over 70 lbs or 130". Mail pieces weighing more than 13 oz bearing only stamps as postage are not eligible for pickup.
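USPS itself imposes the 70 lb cap, so the usual approach is simply to validate the weight on your side before calling the USPS API. A minimal sketch (the function name and message are illustrative, not part of any USPS library):

    # Client-side guard; 70 lb is USPS's own hard limit for a package.
    USPS_MAX_WEIGHT_LBS = 70

    def check_package_weight(weight_lbs):
        if weight_lbs > USPS_MAX_WEIGHT_LBS:
            raise ValueError("Package weight cannot exceed %d pounds for USPS." % USPS_MAX_WEIGHT_LBS)
        return weight_lbs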

Estimating database size [closed]

I was wondering what you do when developing a new application in terms of estimating database size.
E.g. I am planning to launch a website, and I am having a hard time estimating what size I could expect my database to grow. I don't expect you to tell me what size my database will be, but I'd like to know if there are general principles in estimating this.
E.g. When Jeff developed StackOverflow, he (presumably) guesstimated his database size and growth.
My dilemma is that I am going for a hosted solution for my web application (it's about cost at this stage), and I preferably don't want to shoot myself in the foot by not purchasing enough SQL Server space (they charge a premium for this).
If you have a database schema, sizing is pretty straightforward ... it's just estimated rows * avg row size for each table * some factor for indexes * some other factor for overhead. Given the ridiculously low price of storage nowadays, sizing often isn't a problem unless you intend to have a very high traffic site (or are building an app for a large enterprise).
For my own sizing exercises, I've always created an Excel spreadsheet listing:
col 1: each table that will grow
col 2: estimated row size in bytes (the sum of its column sizes)
col 3: estimated # of rows (per year or max, depending on the application)
col 4: index factor (I always set this to 2)
col 5: overhead factor (I always set this to 1.2)
col 6: total (col 2 × col 3 × col 4 × col 5)
The sum of col 6 (total column), plus the initial size of your database without growth tables, is your size estimate. You can get much more scientific, but this is my quick and dirty way.
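The same quick-and-dirty estimate can be written as a short script. The tables, row sizes, and row counts below are made-up placeholders; the factors of 2 and 1.2 are the ones mentioned above:

    # Quick-and-dirty database size estimate, mirroring the spreadsheet columns above.
    INDEX_FACTOR = 2.0        # col 4
    OVERHEAD_FACTOR = 1.2     # col 5

    # (table name, estimated row size in bytes, estimated number of rows) -- placeholder values
    growth_tables = [
        ("users",     400,    50000),
        ("posts",    1200,   500000),
        ("comments",  300,  2000000),
    ]

    total_bytes = sum(row_size * rows * INDEX_FACTOR * OVERHEAD_FACTOR
                      for _, row_size, rows in growth_tables)
    print(total_bytes / 1e9, "GB, plus the initial size of the non-growing tables")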
Determine:
how many visitors per day, V
how many records of each type will be created per visit, N1, N2, N3...
the size of each record type, S1, S2, S3...
EDIT: I forgot the index factor; a good rule of thumb is a factor of 2.
Total growth per day = 2 * V * (N1*S1 + N2*S2 + N3*S3 + ...)
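As code, the same formula looks like this (V, the Ni, and the Si are the quantities defined above; the values plugged in are placeholders):

    # Daily growth estimate: 2 * V * sum(Ni * Si), with the factor of 2 covering indexes.
    INDEX_FACTOR = 2
    V = 1000                               # visitors per day (placeholder)
    records = [(3, 250), (1, 2000)]        # (Ni records per visit, Si bytes per record) -- placeholders
    growth_per_day = INDEX_FACTOR * V * sum(n * s for n, s in records)
    print(growth_per_day)                  # bytes added per day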
My rules of thumb are:
how many users do I expect?
what content can they post?
how big is a user record?
how big is each content item a user can add?
how much will I be adding?
how long will those content items live? forever? just a couple weeks?
Multiply the user record size times the number of users; add the number of users times the content item size; multiply by two (for a convenient fudge factor).
The cost of estimating is likely to be larger than the cost of the storage.
Most hosting providers sell capacity by the amount used at the end of each month, so just let it run.
