How to measure execution time of query on Google Cloud - google-app-engine

How can I measure the execution time of a query on Google Cloud?
Earlier, Google provided a Query Browser, but it is now obsolete and redirects to the Google Cloud Console.
Thanks.

One easy way is to use Stackdriver Trace. Search for "Trace" in the GCP console and it shows a list of requests. Click on the request you are interested in to see its execution timeline.
Appstats provides more detail and customization, but definitely needs more setup work.
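For reference, here is a minimal sketch of turning on Appstats for a first-generation Python App Engine app (Appstats ships with that SDK; the hook name below is the standard one):

# appengine_config.py -- record RPC timings (datastore queries, etc.)
# for every request using Appstats on the legacy Python runtime.
from google.appengine.ext.appstats import recording

def webapp_add_wsgi_middleware(app):
    # Wrap the WSGI app so each request's API calls are recorded.
    return recording.appstats_wsgi_middleware(app)

With the appstats builtin also enabled in app.yaml, the recorded timelines are viewable at /_ah/stats.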

Related

Setting up Google App Engine cron job logs

I'm wondering how to set up logging for Google App Engine cron jobs. I haven't found any information about this specific topic in the App Engine documentation.
There's a page https://console.cloud.google.com/appengine/cronjobs in GCP. Every cron job has a "View" link in the "Log" column, which leads a user to the Logs Viewer with the following filters:
protoPayload.taskName="..."
protoPayload.taskQueueName="__cron"
In my case, no logs for cron jobs are displayed.
The service that serves the endpoints for the cron jobs is a Node.js application that uses Winston logging with the transport provided by the @google-cloud/logging-winston package. This application does more than just process cron jobs, and logging works fine for the rest of it: for instance, I'm able to filter specific queries by Google's trace id.
Is there anything I can provide with the logs payload to be able to filter them by taskName and taskQueueName? And where would I take these values, i.e. are there any request headers I could read them from and write with logs?
It would be great if this is achievable with @google-cloud/logging-winston. If not, a library/language-agnostic answer would also be helpful.
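Since a language-agnostic sketch was invited, one possible approach (shown here in Python with the google-cloud-logging client; the endpoint, logger name, and label keys are illustrative) is to read the headers App Engine attaches to task queue and cron requests and write them as labels on each entry. Two caveats: labels written this way are queried as labels.taskName rather than protoPayload.taskName, and it is worth verifying on your runtime that cron requests actually carry these headers (cron requests are also marked with X-Appengine-Cron: true).

# Sketch: copy App Engine task headers onto log entries as labels.
# Assumes a Flask app and the google-cloud-logging client library.
from flask import Flask, request
from google.cloud import logging as gcloud_logging

app = Flask(__name__)
log_client = gcloud_logging.Client()
logger = log_client.logger("cron")  # illustrative logger name

@app.route("/tasks/cleanup")  # illustrative cron endpoint
def cleanup():
    labels = {
        # Headers App Engine sets on task queue requests; for cron
        # jobs the queue is reported as "__cron".
        "taskName": request.headers.get("X-AppEngine-TaskName", ""),
        "taskQueueName": request.headers.get("X-AppEngine-QueueName", ""),
    }
    logger.log_text("cleanup started", labels=labels)
    return "ok"

The entries could then be filtered with labels.taskQueueName="__cron" in the Logs Viewer.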

My Domain shows under construction in google search

After I uploaded the site to the domain, Google search still shows the domain as under construction.
Can anyone tell me why this happens and how to solve it?
My actual hosting is on Heroku, using free dynos.
Hey, it looks like your website needs time to be crawled and indexed by Google. It may take some time, so just wait, I guess.

Stackdriver vs ELK for app engine

I'm a little confused about this, because the docs say I can use Stackdriver for "request logs and application logs for App Engine applications". Does that mean web requests? Even millions of web requests?
Stackdriver's pricing is per resource, so does that mean I can log all of my web servers' request logs (which would be HUGE) at no extra cost, i.e. without being charged by the volume of storage the logs use?
Does Stackdriver use GCP Cloud Storage as a backend, and do I have to pay for that storage? It looks like I can get hundreds of gigabytes of log aggregation for virtually no money; I just want to make sure I'm understanding this.
I bring up ELK because Elastic just partnered with Google, so Stackdriver must not do everything Elasticsearch does (for almost no money), otherwise they would be competitors?
Things definitely seem to be moving quickly at Google's cloud division and documentation does seem to suffer a bit.
Having said that, the document you linked to also details the limitations:
The request and application logs for your app are collected by a Cloud Logging agent and are kept for a maximum of 90 days, up to a maximum size of 1GB. If you want to store your logs for a longer period or store a larger size than 1GB, you can export your logs to Cloud Storage. You can also export your logs to BigQuery and Pub/Sub for further processing.
It should work out of the box for small to medium sized projects. The built-in log viewer is also pretty basic.
From your description, it sounds like you may have specific needs, so you should not assume this will be free. Factor in Cloud Storage costs for the logs you want to retain, and BigQuery costs if you need to crunch the logs.
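As a sketch of what such an export can look like with the google-cloud-logging Python client (the sink name, bucket, and filter are placeholders, and the sink's writer identity must be granted write access to the bucket):

# Sketch: create a sink that exports App Engine request logs to a
# Cloud Storage bucket for retention beyond the built-in limits.
from google.cloud import logging as gcloud_logging

client = gcloud_logging.Client()
sink = client.sink(
    "app-request-logs-archive",  # placeholder sink name
    filter_='resource.type="gae_app" AND logName:"request_log"',
    destination="storage.googleapis.com/my-log-archive-bucket",
)
if not sink.exists():
    sink.create()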

Monitoring: Quota exceeded: Your table exceeded quota for imports or query appends per table

We're getting this error periodically, despite usage that I believe is well under the limit of 1,000 batch updates per table per day.
My problem is that there's no way to inspect what Google believes the current usage is for a particular table. This makes it impossible to isolate the underlying process that is responsible for the majority of the quota usage.
How can we view/inspect the current usage levels for this quota?
I would start by enabling audit logs and inspecting the logs.
Audit logs are available via Google Cloud Logging, where they can be immediately filtered to provide insights on specific jobs or queries, or exported to Google Cloud Pub/Sub, Google Cloud Storage, or BigQuery.
To analyze your aggregated BigQuery usage using SQL, set up export of audit logs back to BigQuery. For more information about setting up exports from Cloud Logging, see Overview of Logs Export in the Cloud Logging documentation.
Analyzing Audit Logs Using BigQuery: https://cloud.google.com/bigquery/audit-logs
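Once the audit logs are flowing back into BigQuery, a query along these lines can attribute load jobs to destination tables. The dataset, wildcard table name, and flattened field paths below are assumptions based on the audit-log export schema, so check them against your actual export:

# Sketch: count completed load jobs per destination table per day
# from audit logs exported to BigQuery.
from google.cloud import bigquery

client = bigquery.Client()
sql = """
SELECT
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent
    .job.jobConfiguration.load.destinationTable.tableId AS table_id,
  DATE(timestamp) AS day,
  COUNT(*) AS load_jobs
FROM `my_project.my_audit_dataset.cloudaudit_googleapis_com_data_access_*`
WHERE protopayload_auditlog.methodName = 'jobservice.jobcompleted'
GROUP BY table_id, day
ORDER BY load_jobs DESC
"""
for row in client.query(sql).result():
    print(row.table_id, row.day, row.load_jobs)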

How can you check Search API Quota usage in GAE?

For a few days in a row I have a Search API error:
OverQuotaError: The API call search.IndexDocument() required more quota than is available.
I'm almost sure that I have not reached the quota limits, but I cannot find a way to make sure.
How can I check the current quota usage? If not in the admin console, then maybe from code?
You can't check Search quota usage right now. It'll be viewable in the admin console soon.
You can now check your Search API quota in the Admin Console. It is located below the Storage section on the Quota Details page. This feature was added yesterday (5-23-2012) and was blogged about on the official Google App Engine Blog.
You can check quotas for all Google services that provide an API here, although I didn't find the Custom Search API there.
I found the quotas for the App Engine Search API for Java, which say that there is rate limiting at about 100 to 120 API calls per minute.
If you need additional quota for the App Engine search API, you can request it here.
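While waiting for additional quota, a common mitigation is to back off and retry when the error surfaces. A sketch for the first-generation Python runtime (the index name and retry tuning are illustrative):

# Sketch: retry Search API indexing with exponential backoff when
# OverQuotaError is raised.
import time

from google.appengine.api import search
from google.appengine.runtime import apiproxy_errors

def put_with_backoff(document, retries=5):
    index = search.Index(name='products')  # illustrative index name
    delay = 1.0
    for attempt in range(retries):
        try:
            return index.put(document)
        except apiproxy_errors.OverQuotaError:
            if attempt == retries - 1:
                raise  # out of retries; surface the error
            time.sleep(delay)
            delay *= 2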
