I have Google Cloud and BigQuery set up. I need push notifications (such as email) on a specific event or on any database change. I also have a Google Compute Engine instance (with no applications loaded yet).
Can someone give me directions on how this can be done? What application can be installed? Or is it something BigQuery already provides?
BigQuery doesn't currently have a notification system. You could set up an App Engine task that monitors it instead.
(feature request? https://code.google.com/p/google-bigquery/issues/list)
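For the monitoring approach, here is a minimal sketch of what such a task could do, assuming the google-cloud-bigquery Python client; the project/table names are placeholders and the email step is a stub:

    # Sketch: poll a table's last-modified time and alert when it changes.
    # Assumes the google-cloud-bigquery client; names are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # placeholder project

    def check_for_changes(last_seen):
        """Return the current modified time; alert if it moved past last_seen."""
        table = client.get_table("my-project.my_dataset.my_table")  # placeholder table
        if last_seen is not None and table.modified != last_seen:
            send_alert("Table changed at %s" % table.modified)
        return table.modified

    def send_alert(message):
        # Stub: wire this to the App Engine Mail API, SendGrid, etc.
        print(message)

You would persist the last-seen timestamp between cron runs (in Datastore, say) and call check_for_changes from a cron handler every few minutes.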
I am a big fan of particle.io and was very excited when they added a Google Cloud Platform (GCP) integration, so I can save my IoT data into GCP's Datastore.
I've followed their tutorial and got it working but I need some advice on implementing this so it can scale on GCP.
My current implementation is like so:
https://docs.particle.io/tutorials/integrations/google-cloud-platform/#example-use-cases
Basically I have a GCP Compute Engine instance which runs a Node.js script that listens for the Pub/Sub events (sent by my IoT devices) and saves them to Datastore.
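For reference, the core of that listener is roughly the following (sketched here in Python with the google-cloud-pubsub and google-cloud-datastore clients rather than my actual Node.js; subscription and entity-kind names are placeholders):

    # Sketch: pull Pub/Sub messages and persist them as Datastore entities.
    # Subscription path and entity kind are placeholders.
    import json
    from google.cloud import pubsub_v1, datastore

    ds = datastore.Client()
    subscriber = pubsub_v1.SubscriberClient()
    subscription = subscriber.subscription_path("my-project", "particle-events")

    def callback(message):
        payload = json.loads(message.data.decode("utf-8"))
        entity = datastore.Entity(key=ds.key("ParticleEvent"))
        entity.update(payload)  # store the raw event fields
        ds.put(entity)
        message.ack()

    future = subscriber.subscribe(subscription, callback=callback)
    future.result()  # block the main thread and keep pulling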
Now, because I want this to scale, the Node.js script should ideally run on a managed service that responds to load spikes automatically. But GCP does not seem to have anything like this.
In AWS I could do this:
IoT Data -> Particle.io AWS WebHook -> AWS API Gateway Endpoint -> AWS Lambda -> AWS DynamoDB
Every step in that AWS pipeline is fully managed.
What's the best way to run my Node.js script on GCP in a fully managed, always-available way, so that it keeps listening for Pub/Sub events, keeps saving to Datastore, and scales automatically as load increases?
Any help/advice will be appreciated.
Thanks very much,
Mark
You have a number of options:
1- As someone else mentioned, there is Cloud Functions. It's basically a Node.js function you deploy, and Google Cloud takes care of scaling it up and down for you (see the sketch after this list).
2- You can deploy your Node.js app to App Engine Flex which has autoscaling enabled by default.
3- If you want to stay on Compute Engine, you can set up autoscaling yourself using managed instance groups.
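To illustrate option 1: Cloud Functions launched as Node.js-only, but a Pub/Sub-triggered function is only a few lines in any supported runtime. A Python sketch, with the topic and entity kind as placeholders:

    # Sketch: a background Cloud Function triggered by Pub/Sub that writes
    # to Datastore. Topic, kind, and runtime name are placeholders, e.g.:
    #   gcloud functions deploy save_event --trigger-topic particle-events --runtime python39
    import base64
    import json
    from google.cloud import datastore

    ds = datastore.Client()

    def save_event(event, context):
        """Background function; event['data'] is the base64-encoded message body."""
        payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
        entity = datastore.Entity(key=ds.key("ParticleEvent"))
        entity.update(payload)
        ds.put(entity)

With this, there is no instance to manage at all: Google spins handlers up and down with the message volume.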
We have a couple of million records in our current GAE project, stored in Google Cloud Datastore. They are mostly GPS point data. We want to be able to use all these GPS points in a demo hosted in another GAE instance. Is there any way we can do that?
Using Golang + Google App Engine
There is a Google Cloud Datastore API that you can use to access your Datastore data from any other deployment, including a different App Engine app. It's not available in Go, so you will have to mix in some Python or Java.
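As an illustration (the answer above predates Go support; today's client libraries also cover Go), here is a Python sketch that reads entities from the other project, assuming your credentials have been granted access to it; project, kind, and property names are placeholders:

    # Sketch: query Datastore in a different project, assuming credentials
    # with access to it. Names are placeholders.
    from google.cloud import datastore

    client = datastore.Client(project="source-project")  # the project that owns the data
    query = client.query(kind="GpsPoint")  # placeholder kind
    for entity in query.fetch(limit=100):
        print(entity["lat"], entity["lng"])  # placeholder property names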
I am new to Google App Engine (GAE) and its Python API. Over the past weekend, I have been trying to set up a simple app that sends an SMS message to my phone whenever it sees an upcoming event in my calendar. For that purpose, I am using GAE with Twilio and the Google Calendar API (v3), all in Python.
I have a GAE account and was able to run the simple tutorial web app by following the examples here https://developers.google.com/appengine/docs/python/gettingstartedpython27/introduction and here https://developers.google.com/appengine/articles/twilio. I was also able to run a cron job on GAE that sends SMS messages from a Python script. But now I am having trouble figuring out how to write a cron-job script that retrieves calendar events (from my GAE account) using the Google Calendar API. I have read a bunch of tutorials and instructions on Google's developer site, but just found myself getting more and more confused.
Is it possible to create a Python script, run as a cron job on GAE, that retrieves calendar events? I came across "OAuth2 authentication in GAE accessing Calendar API V3 (domain hosted)", which is close to what I am trying to achieve, but not quite (my plan is not to create a web app for users). Is the rumor true that the Google Calendar API doesn't work with Service Accounts?
Using OAuth 2.0 requires the user to allow access. My understanding is that it returns an access token to the requesting application for a certain period (one hour?), after which the user must refresh it or go through the access-granting procedure again. If that's the case, I do not want to use it, since my goal is a script that runs periodically (say, every fifteen minutes) to check whether there is any calendar event coming up in the next 15 minutes.
If someone could direct me to an example Python script that retrieves Google Calendar events from the owner's account on GAE, that would be greatly helpful.
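For concreteness, something like this sketch is what I am hoping can work (assuming a service account whose email address the calendar has been shared with; the key file and calendar ID are placeholders):

    # Sketch: list events starting in the next 15 minutes via a service account.
    # Assumes the calendar is shared with the service account's email address.
    import datetime
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        "key.json",  # placeholder key file
        scopes=["https://www.googleapis.com/auth/calendar.readonly"],
    )
    service = build("calendar", "v3", credentials=creds)

    now = datetime.datetime.utcnow()
    events = service.events().list(
        calendarId="me@example.com",  # placeholder calendar ID
        timeMin=now.isoformat() + "Z",
        timeMax=(now + datetime.timedelta(minutes=15)).isoformat() + "Z",
        singleEvents=True,
        orderBy="startTime",
    ).execute()

    for event in events.get("items", []):
        print(event["summary"])  # this is where the Twilio SMS call would go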
If someone can answer any of the above questions, I would greatly appreciate your help.
I have been developing an event registration form with Google Apps Script. The form needs to add data entries to a Google spreadsheet and process orders with Google Wallet. I have tried using HtmlService, but it did not work. Is there any way to integrate Google Wallet dynamically with the Google Apps Script services? If not, will I need to use App Engine? And what language would be best?
You'll need a server component to handle the callbacks Google sends after Wallet transactions. The server handler must be able to process XML or JSON, depending on the API used.
If you're using the Google Checkout API, have a look at:
https://developers.google.com/checkout/developer/Google_Checkout_XML_API_Notification_API
https://developers.google.com/checkout/samplecode
If you're using the Wallet for digital goods API, have a look at:
https://developers.google.com/commerce/wallet/digital/docs/postback
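A minimal sketch of such a callback handler in Python (Flask), assuming a JSON postback; the field names below are placeholders, not the actual Wallet notification schema, so consult the docs above for the real payload:

    # Sketch: a bare-bones callback endpoint. Route and field names are
    # placeholders, not the real Wallet notification schema.
    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/wallet/callback", methods=["POST"])
    def wallet_callback():
        notification = request.get_json(force=True)
        order_id = notification.get("orderId")  # placeholder field name
        # ... verify the notification and mark the order paid in your datastore ...
        return order_id or "", 200  # acknowledge receipt

    if __name__ == "__main__":
        app.run()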
I want to ingest logs from an App Engine app into BigQuery without using App Engine MapReduce. Is that possible?
We've open-sourced a Java implementation that migrates App Engine logs to BigQuery: http://blog.streak.com/2012/07/export-your-google-app-engine-logs-to.html
See the BigQuery docs on loading data. You can post a multipart HTTP request that contains the data you want to add to the table. If you are doing an append, you won't need to provide the schema.
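With the current Python client the multipart request is built for you; a sketch of an append-mode load, with the table ID and row contents as placeholders:

    # Sketch: append JSON rows to an existing table; the client assembles
    # the multipart upload. Table ID and rows are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()
    rows = [{"timestamp": "2015-01-01T00:00:00Z", "message": "hello"}]  # placeholder rows

    job = client.load_table_from_json(
        rows,
        "my-project.logs.appengine_logs",  # placeholder table
        job_config=bigquery.LoadJobConfig(write_disposition="WRITE_APPEND"),
    )
    job.result()  # wait for the load to finish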
There's also a Python runtime implementation called "log2bq" that demonstrates how to ingest App Engine logs into BigQuery: http://code.google.com/p/log2bq/
Google has recently released a (BETA) feature called "Google Cloud Logging: Logs Export"
https://cloud.google.com/logging/docs/install/logs_export
They summarize it as:
Export your Google Compute Engine logs and your Google App Engine logs to a Google Cloud Storage bucket, a Google BigQuery dataset, a Google Cloud Pub/Sub topic, or any combination of the three.
I haven't tried out all of the functionality provided by this new service but...
We have recently started using the "Stream App Engine Logs to BigQuery" feature in our Python GAE project. It sends our app's logs directly to BigQuery as they occur, giving us near-real-time log records in a BigQuery dataset.
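Once the logs are streaming in, querying them is ordinary BigQuery SQL. For example (the dataset/table and column names are placeholders and depend on how the export is configured):

    # Sketch: pull recent error-level log lines out of the exported dataset.
    # Dataset/table and column names are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()
    query = """
        SELECT timestamp, severity, log_message
        FROM `my-project.app_logs.appengine_logs`
        WHERE severity = 'ERROR'
        ORDER BY timestamp DESC
        LIMIT 100
    """
    for row in client.query(query).result():
        print(row.timestamp, row.log_message)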