I'm a little confused about this, because the docs say I can use Stackdriver for "Request logs and application logs for App Engine applications." Does that mean web requests? As in, millions of web requests?
Stackdriver's pricing is per resource, so does that mean I can log all of my web servers' request logs (which would be HUGE) for no extra cost, i.e. I would not be charged by the volume of storage the logs use?
Does Stackdriver use GCP Cloud Storage as a backend, and do I have to pay for that storage? It looks like I can get hundreds of gigabytes of log aggregation for virtually no money; I just want to make sure I'm understanding this.
I bring up ELK because Elastic just partnered with Google, so Stackdriver must not do everything Elasticsearch does (for almost no money), otherwise it would be a competitor, right?
Things definitely seem to be moving quickly at Google's cloud division, and the documentation does seem to suffer a bit.
Having said that, the document you linked to also details the limitations:
"The request and application logs for your app are collected by a Cloud Logging agent and are kept for a maximum of 90 days, up to a maximum size of 1GB. If you want to store your logs for a longer period or store a larger size than 1GB, you can export your logs to Cloud Storage. You can also export your logs to BigQuery and Pub/Sub for further processing."
It should work out of the box for small to medium-sized projects, though the built-in log viewer is pretty basic.
From your description, it sounds like you may have more specific needs, so you should not assume this will be free. Factor in Cloud Storage costs for the logs you want to retain, and BigQuery costs depending on how you need to crunch the logs.
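If you do end up exporting, a sink is the mechanism. Here's a minimal sketch of creating one with the Cloud Logging v2 API via google-api-python-client; the project, sink, and bucket names are placeholders, the bucket has to exist already, and the same thing can be done from the console or the gcloud CLI:

```python
# Sketch: create a log export sink that sends App Engine logs to a
# Cloud Storage bucket. All names here are placeholders.
from googleapiclient.discovery import build
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
logging_api = build('logging', 'v2', credentials=credentials)

sink_body = {
    'name': 'my-gae-logs',                                  # hypothetical sink name
    'destination': 'storage.googleapis.com/my-log-bucket',  # pre-created GCS bucket
    'filter': 'resource.type="gae_app"',                    # only App Engine logs
}
logging_api.projects().sinks().create(
    parent='projects/my-project-id', body=sink_body).execute()
```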
Related
I am implementing a dictionary website using App Engine and Cloud Storage. App Engine controls the backend, like user authentication etc., and Cloud Storage is used to store a JSON file for each dictionary entry.
I would like to rate-limit how much a user can download in a given time period, so they can't bulk-download the JSON files and leave me with a big bill. Ideally, the dictionary would display a captcha if a user downloads too much at once, and let them keep downloading if they pass the captcha. What is the best way to achieve this?
Is there a specific service for rate limiting based on IP address or authenticated user? Should I do this through App Engine and only access Cloud Storage through App Engine (perhaps slower, since it uses some of my dynamic resources to serve static content)? Or is it possible to have the frontend access Cloud Storage directly and implement the rate limiting there? Is a Cloud Storage bucket even the right service for storage here? And how can I allow search engine indexing bots to bypass the rate limiting?
As explained by Doug Stevenson in this post:
"There is no configuration for limiting the volume of downloads for
files stored in Cloud Storage."
and explaining further:
"If you want to limit what end users can do, you will need to route
them through some middleware component that you build that tracks how
they're using your provided API to download files, and restrict what
they can do based on their prior behavior. This is obviously
nontrivial to implement, but it's possible."
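To make that concrete, here's a rough sketch of the kind of middleware he describes, using App Engine's memcache to count downloads per client; the limits, key scheme, and load_entry_json helper are all made up for illustration:

```python
# Sketch: count downloads per client in a fixed time window and redirect
# to a captcha page once the threshold is crossed.
from google.appengine.api import memcache
import webapp2

LIMIT = 100          # max downloads per window (assumption)
WINDOW_SECS = 3600   # window length in seconds (assumption)

class EntryHandler(webapp2.RequestHandler):
    def get(self, entry_id):
        client = self.request.remote_addr  # or the authenticated user id
        key = 'dl:%s' % client
        memcache.add(key, 0, time=WINDOW_SECS)  # no-op if the counter exists
        count = memcache.incr(key)              # atomic increment
        if count is not None and count > LIMIT:
            self.redirect('/captcha?next=' + self.request.path)
            return
        self.response.write(load_entry_json(entry_id))  # your own lookup (hypothetical)
```

Counting by remote_addr is crude (NAT, proxies); counting by authenticated user is more reliable. For the crawler question, the usual approach is to verify bots like Googlebot via reverse DNS lookup rather than trusting IP or user-agent alone, and exempt verified bots from the counter.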
When I try to store an image in my App Engine blobstore, I get the following error:
The API call blobstore.CreateUploadURL() required more quota than is available.
My app is a paid app. I checked my storage bucket where Google stores blobstore data and it is at 5GB -- the limit they set for free apps. I have tried to find the quota settings both in my app engine settings as well as in the general quota settings for my project, but I can't seem to find my blobstore quota anywhere.
I expect to be able to store more than 5GB of data, but it seems that I've hit some sort of limit. I don't want to migrate to Google Cloud Storage because that will take time.
You will need to enable billing on your project to uncap the 5GB limit.
If you have a daily budget set for your application, it could affect the quota; try raising it to a non-zero value and see if that helps.
I have deployed an application to Google App Engine that will be consumed by millions of users.
I want to test the application against a high amount of traffic before going live, just to make sure I have provided the correct configuration to support automatic load balancing and scalability.
From what I've read in the Google documentation, App Engine should handle all of this, but I have to be 100% sure.
Is there anything I should keep in mind before going live (database connections, other resources in Cloud Storage, etc.)?
Thanks.
You should look into making sure you are using the Cloud SQL instance effectively. For example, how many total connections do you expect to have from your App Engine app to MySQL?
There's ultimately a limit on the number of concurrent connections that a MySQL server can handle. You would want to make sure your application is designed such that you are reusing connections when possible.
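One common pattern (a sketch, not the only way) is to keep a module-level connection per App Engine instance rather than opening one per request; the project, instance, and database names below are placeholders:

```python
# Sketch: reuse a single MySQL connection per instance.
import MySQLdb

_conn = None

def get_connection():
    global _conn
    if _conn is None:
        _conn = MySQLdb.connect(
            unix_socket='/cloudsql/my-project:my-instance',  # placeholder instance
            user='appuser', db='mydb')                       # placeholder credentials
    return _conn
    # A production version would also handle dropped connections
    # (ping the connection and reconnect on failure).
```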
I would recommend performing a load test to determine the limits of your application and its dependencies.
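Before reaching for a dedicated tool (ab, JMeter, Locust), even a tiny script like this sketch can reveal obvious bottlenecks; the URL and concurrency numbers are placeholders:

```python
# Sketch: hammer one endpoint from N threads and report average latency.
import time
import urllib2
from threading import Thread

URL = 'https://my-app.appspot.com/health'  # placeholder endpoint
THREADS, REQUESTS_EACH = 50, 100           # placeholder load shape

def worker(latencies):
    for _ in range(REQUESTS_EACH):
        start = time.time()
        urllib2.urlopen(URL).read()
        latencies.append(time.time() - start)

latencies = []
threads = [Thread(target=worker, args=(latencies,)) for _ in range(THREADS)]
for t in threads: t.start()
for t in threads: t.join()
print 'requests: %d, avg latency: %.3fs' % (
    len(latencies), sum(latencies) / len(latencies))
```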
My application currently runs on App Engine, and it continuously writes records (for logging and reporting).
Scenario: view counts on the website. When someone opens the website, it hits the server to add a record with the time and type of view, and these counts are shown in the user's dashboard.
These requests have become huge, currently around 40/sec. The App Engine datastore writes are getting heavy and the cost is increasing rapidly.
Is there any way to reduce this, or another DB better suited to logging the views?
Google App Engine's Datastore is NOT suitable for a requirement like this, where you continuously write records and read them back far less often.
You need to offload this task to a third-party service (either one you write yourself or an existing one).
A better option for user tracking and analytics is Google Analytics (although you won't be able to show the hit counters on your website directly from Analytics).
If you want to show your user page hit count use a page hit counter: https://www.google.com/search?q=hit+counter
In this case you should avoid Datastore.
For this kind of analytics it's best to do the following:
1. Dump data to the GAE log (yes, this sounds counter-intuitive, but it's actually advice from Google engineers). The GAE log is persistent and is guaranteed not to lose data you write to it.
2. Periodically parse the log for your data and export it to BigQuery.
BigQuery has a quite powerful query language, so it's capable of producing complex analytics reports.
Luckily this has already been done: see the Mache framework, and also the related video.
Note: there is now a new BigQuery feature called streaming inserts, which could potentially replace the cumbersome middle step (files on Cloud Storage) used in Mache.
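For example, a streaming insert is just a tabledata().insertAll call; this sketch uses google-api-python-client, and the project, dataset, table, and schema (ts, view_type) are all assumptions:

```python
# Sketch: stream one view record straight into a BigQuery table.
from googleapiclient.discovery import build
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
bq = build('bigquery', 'v2', credentials=credentials)

body = {
    'rows': [
        # Assumed schema: a timestamp and a view type per row.
        {'json': {'ts': '2014-01-01T00:00:00Z', 'view_type': 'page'}},
    ]
}
bq.tabledata().insertAll(
    projectId='my-project-id', datasetId='analytics',  # placeholders
    tableId='views', body=body).execute()
```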
I'm new to GAE development and have some questions regarding extracting data.
I have an app that collects data from end users; the data is stored in the high-availability datastore, and I need to send a subset of the collected data to business partners on a regular basis.
Here are my questions:
1. How can I back up data in the datastore on a regular basis, say a daily incremental backup and a weekly full backup?
2. What are the best practices for generating daily data dump files that can be downloaded or sent to my partners in a secure way? I expect a few hundred MB of data files every day, eventually growing into the few-GB range.
3. Can my business partners be authenticated through basic HTTP auth, or do they have to use OAuth?
Google is in effect backing up your data by storing it in multiple data centres.
You can however use the bulk loader if desired and back it up manually:
Uploading and Downloading Data
You can authenticate users however you choose; it's totally up to you. The "users" service is integrated directly into App Engine, however, so if everybody has (or could have) a Google account, that's even easier for you to use.
The users service
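For reference, the core of the users service is just a couple of calls (this is the standard App Engine API, nothing hypothetical):

```python
# Identify the signed-in Google account, or build a sign-in link.
from google.appengine.api import users

user = users.get_current_user()
if user:
    greeting = 'Hello, %s' % user.nickname()
else:
    # Send the visitor to Google's sign-in page and back here afterwards.
    greeting = '<a href="%s">Sign in</a>' % users.create_login_url('/')
```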
Due to the size of your files, you'll have to use something other than the datastore, unless you want to piece the files together from multiple entities (the datastore has a 1MB limit per entity); that's perfectly possible, but awkward.
You should probably look at the Google Cloud Storage API instead, as it has no such file size limit.
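Writing a dump file there from App Engine is straightforward with the GAE Cloud Storage client library (GoogleAppEngineCloudStorageClient); a sketch, with placeholder bucket and object names:

```python
# Sketch: write a daily JSON dump to a Cloud Storage object.
import cloudstorage as gcs

filename = '/my-dump-bucket/daily/2014-01-01.json'  # placeholder path
with gcs.open(filename, 'w', content_type='application/json') as f:
    f.write('{"entries": []}')  # your serialized dump goes here
```

Your partners could then fetch the files over signed URLs or through an authenticated App Engine handler, whichever fits your auth choice above.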