Access Google Calendar within Google App Engine project - google-app-engine

I am new to Google App Engine (GAE) and its Python API. Over the past weekend, I have been trying to set up a simple app that will send an SMS message to my phone whenever it sees an upcoming event in the calendar. For that purpose, I am using GAE with Twilio and the Google Calendar API (v3)--all in Python.
I have a GAE account and was able to run a simple tutorial web app by following the examples here https://developers.google.com/appengine/docs/python/gettingstartedpython27/introduction and here https://developers.google.com/appengine/articles/twilio. I was also able to run a cron job on GAE that sends SMS messages from Python scripts. But I am having trouble figuring out how to write a script (cron job) that will retrieve calendar events (of my GAE account) using the Google Calendar API. I have read a bunch of tutorials and instructions on the Google Developers site, but just found myself getting more and more confused.
Is it possible to create a Python script, run as a cron job on GAE, that retrieves calendar events? I came across OAuth2 authentication in GAE accessing Calendar API V3 (domain hosted), which is close to what I am trying to achieve, but not quite (my plan is not to create a web app for users). Is the rumor true that the Google Calendar API doesn't work with Service Accounts?
Using OAuth 2.0 requires the user to allow access. My understanding is that it returns an access token to the requesting application for a certain period (one hour?) and then requires the user to refresh it or go through the access-granting procedure again. If that's the case, I do not want to use it, since my goal is to have a script that runs periodically (say every fifteen minutes) to see if there is any upcoming calendar event in the next 15 minutes.
If someone could direct me to an example Python script that will work on retrieving Google Calendar events from an owner's account on GAE, that would be greatly helpful.
If someone can answer any of the above questions, I would greatly appreciate your help.
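For what it's worth, here is a rough sketch of the check itself using the current google-auth and google-api-python-client libraries, assuming a service account key file (the file name and calendar ID below are placeholders) and assuming the target calendar has been shared with the service account's email address:

```python
# Sketch of the event check a cron-triggered handler could call every 15 minutes.
# Assumptions: "service-account.json" is a placeholder key file, and the calendar
# you care about has been shared with the service account's email address.
import datetime

from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/calendar.readonly"]


def upcoming_events(calendar_id, window_minutes=15):
    creds = service_account.Credentials.from_service_account_file(
        "service-account.json", scopes=SCOPES)
    service = build("calendar", "v3", credentials=creds)

    now = datetime.datetime.utcnow()
    time_min = now.isoformat() + "Z"
    time_max = (now + datetime.timedelta(minutes=window_minutes)).isoformat() + "Z"

    # Note: with a service account, pass the shared calendar's ID explicitly;
    # "primary" would refer to the service account's own (empty) calendar.
    result = service.events().list(
        calendarId=calendar_id,
        timeMin=time_min,
        timeMax=time_max,
        singleEvents=True,
        orderBy="startTime",
    ).execute()
    return result.get("items", [])
```

A cron.yaml entry pointing at the handler that calls this function every 15 minutes, plus your existing Twilio code to send the SMS, would complete the picture.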

Related

which GCP component to use to fetch data from an API

I'm a little bit confused about the GCP components; here is my use case:
Daily, I need to fetch data from an external API (the API returns JSON data), store it in GCS, then load it into BigQuery. I have already created the Python script that fetches the data and stores it in GCS, and I'm confused about which component to use for deployment:
Cloud Run: from the docs it is used for deploying services, so I think it's a bad choice
Cloud Functions: I think it would work, but it is meant for event-based processing (through single-purpose functions...)
Composer: (I'll use Composer to orchestrate tasks, such as preprocessing files in GCS, loading them into BQ, and transferring them to an archive bucket) through KubernetesPodOperator, create a task that triggers the script to get the data
Compute Engine: I don't think it's the best choice since there are better ones
App Engine: I also don't think it's a good idea since it is used to deploy and scale web apps...
(Correct me if I'm wrong in anything I said.) So my question is: which GCP component should be used for this kind of task?
Cloud Run: from the docs it is used for deploying services
App Engine: I also don't think it's a good idea since it is used to deploy and scale web apps...
I think you've misunderstood. Both Cloud Run and Google App Engine (GAE) are serverless offerings from Google Cloud. You deploy your code to either of them, and you can invoke their URLs, which in turn will cause your code to execute and do things like fetch data from somewhere and save it somewhere.
Google App Engine has a shorter timeout than Cloud Run (I can't remember whether Cloud Run has a timeout). So, if your code will take a long time to run, you don't want to use Google App Engine (unless you make it a background task), and if you don't need a UI, then you don't need GAE.
For your specific scenario, you can deploy your code to Cloud Run and use Cloud Scheduler to schedule it to be invoked at specific times. We have that architecture running in a similar scenario (we have a task that runs once daily; it's deployed to Cloud Run; Cloud Scheduler invokes the endpoint, it runs and saves data to Datastore linked to an App Engine app). We wrote a blog article on deploying to Cloud Run and another on securing your Cloud Run service (based on our experience in the scenario described above).
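For illustration, a minimal sketch of such a Cloud Run service in Python with Flask; the SOURCE_API_URL and GCS_BUCKET environment variables and the object path are placeholders rather than anything from the original setup:

```python
# Minimal Cloud Run service: fetch JSON from an external API and write it to GCS.
# SOURCE_API_URL and GCS_BUCKET are placeholder environment variables.
import json
import os

import requests
from flask import Flask
from google.cloud import storage

app = Flask(__name__)


@app.route("/fetch", methods=["POST"])
def fetch():
    data = requests.get(os.environ["SOURCE_API_URL"], timeout=60).json()

    client = storage.Client()
    bucket = client.bucket(os.environ["GCS_BUCKET"])
    blob = bucket.blob("raw/data.json")  # placeholder object path
    blob.upload_from_string(json.dumps(data), content_type="application/json")

    return "ok", 200


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```

Cloud Scheduler is then pointed at the service's /fetch URL with your daily cron schedule, so nothing runs (or costs anything) between invocations.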
GAE Timeout:
Every request to a Google App Engine (Standard) app must complete within 1-10 minutes for automatic scaling and up to 24 hours for basic scaling (see the documentation). For Google App Engine Flexible, the timeout is 60 minutes (documentation).

Long running script in Google Cloud

I have a piece of code, based on Node.js, that does not serve any HTTP requests, but monitors some online systems and sends report emails.
This code is run by a shell script and keeps running 24x7.
Which Google Cloud offering is best suited to host this?
I tried App Engine, but after one hour of console inactivity the console exits and the script stops running.
I am not sure whether Compute Engine would be best for this. I could host this on AWS EC2 and it would work there... but I'm wondering about Google.
Any tips appreciated.
Thanks
This can be done with a simple Python app running in App Engine Standard Platform. See this post for details.
If you're able to modify it so it can run periodically, you could run it on AWS Lambda with a schedule as the trigger and use SES to send out emails.
Alternatively, if you have control over the "online systems", you could use CloudWatch custom metrics and create alerts based on the thresholds of your metrics.
If you must use Google Cloud, you could use Google Cloud Functions instead of AWS Lambda, and Google Cloud Monitoring / Logging.
The second generation of Google Cloud Functions can run for up to 60 minutes (thanks to Google Cloud Run).
To sum up on GCP:
Google Cloud Functions 1st gen: 9 minutes
Google Cloud Functions 2nd gen: 60 minutes
Google Cloud Run: 60 minutes
Google App Engine: 10 minutes (automatic scaling) and 24 hours (basic scaling)
Google Compute Engine: unlimited (you manage the VM)
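If you can restructure the monitor to run periodically as suggested above, a 2nd gen Cloud Function triggered by Cloud Scheduler could look roughly like the sketch below (written in Python rather than Node.js; the monitored URLs and SMTP settings are placeholders):

```python
# Sketch of a periodic monitor as an HTTP-triggered Cloud Function (2nd gen),
# invoked by Cloud Scheduler. The monitored URLs and SMTP settings are placeholders.
import os
import smtplib
from email.message import EmailMessage

import functions_framework
import requests

SYSTEMS = ["https://example.com/health"]  # placeholder endpoints to monitor


@functions_framework.http
def check_systems(request):
    failures = []
    for url in SYSTEMS:
        try:
            resp = requests.get(url, timeout=10)
            if resp.status_code != 200:
                failures.append("%s returned %s" % (url, resp.status_code))
        except requests.RequestException as exc:
            failures.append("%s unreachable: %s" % (url, exc))

    if failures:
        msg = EmailMessage()
        msg["Subject"] = "Monitoring report"
        msg["From"] = os.environ["REPORT_FROM"]  # placeholder address
        msg["To"] = os.environ["REPORT_TO"]      # placeholder address
        msg.set_content("\n".join(failures))
        with smtplib.SMTP(os.environ["SMTP_HOST"]) as smtp:  # placeholder host
            smtp.send_message(msg)

    return "ok", 200
```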

Does it cost money to build and deploy an app with Google Apps Script?

I have built an app using Google Apps Script (GAS). It displays a form. When the user submits the form, the submitted data is written to a Google Docs spreadsheet. I have deployed the app using the built-in Deploy as web app option on the GAS script editor page.
What I can't seem to find out is whether it is free to build and deploy web apps using Google Apps Script, or is it the case that one needs to pay?
I did come across a paid service called Google App Engine, but I am not sure if this is relevant to Google Apps Script.
Thanks.
Google Apps Script is a JavaScript-based cloud scripting language, and it is free to use as long as you do not need higher quotas than those defined here: https://developers.google.com/apps-script/guides/services/quotas
If you need higher quotas than listed there, I would suggest you consider building your own Google App Engine application for your service.
However, if you have not heard of App Engine until now, you should first work through some of the examples listed here https://cloud.google.com/appengine/docs and get familiar with the Platform as a Service Google offers. It is also free of charge as long as your application stays within the free quotas.
Important fact: every Google Apps Script has its own Developers Console project assigned to it; however, it is not necessary to configure anything in the Console for Apps Script to work properly. You can review your assigned project by
using the menu: Resources - Developers Console Project... and clicking on the link that looks similar to this -
https://console.developers.google.com/project/project-id-YOUR_PROJECT_ID

Integrate Google Wallet with Google Apps Script?

I have been developing an event registration form with Google Apps Script. The form needs to add the data entries to a Google spreadsheet and process orders with Google Wallet. I have tried using HtmlService, but it did not work. Is there any way to integrate Google Wallet dynamically within Google Apps Script? If not, will I need to use App Engine? And what language would be best?
You'll need a server component to handle callbacks from Google after the Wallet transactions. The server handler must be able to process XML or JSON depending on the API used.
If you're using the Google Checkout API, have a look at:
https://developers.google.com/checkout/developer/Google_Checkout_XML_API_Notification_API
https://developers.google.com/checkout/samplecode
If you're using the Wallet for digital goods API, have a look at:
https://developers.google.com/commerce/wallet/digital/docs/postback
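As a rough illustration of the server component described above, here is a minimal Flask callback endpoint (deployable to App Engine); the payload field is hypothetical, and the real postback format (XML for Checkout, a signed JWT for Wallet for digital goods) depends on which API you use:

```python
# Minimal sketch of a payment-notification callback handler.
# Assumption: the notification arrives as JSON; adapt the parsing for XML
# (Google Checkout) or a signed JWT (Wallet for digital goods) as appropriate.
from flask import Flask, request

app = Flask(__name__)


@app.route("/wallet/postback", methods=["POST"])
def wallet_postback():
    payload = request.get_json(silent=True)
    if payload is None:
        return "unsupported payload", 400

    order_id = payload.get("orderId")  # hypothetical field name
    # ... verify the notification, record the order, then acknowledge it
    return "ok", 200
```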

Recipe Needed to Upload Data to Google App Engine Datastore

While I've been busy finishing my Google App Engine solution over the last several months, I now find Google has painted me into a corner due to changes and differences between the local dev_server and appspot.
The scenario: My app is deployed on appspot with Federated OpenID authentication.
The problem: Google does not support data uploads while apps are configured to use OpenID. (They are aware of this problem and do not consider it a bug.)
Several years ago Nick Johnson posted a remote_api and OpenID workaround on his blog, but several people report it no longer works.
In addition, the latest (2nd) edition of Dan Sanderson's book "Programming Google App Engine" no longer contains a chapter on data uploads. There is a chapter on Backup and Restore, but I cannot restore data until I back it up, and I cannot back it up until the data exists!
I cannot believe I'm the only one in this predicament -- it seems it should be a very common need -- I simply need to upload data while my app uses OpenID.
Keep in mind that everything about my local dev_server Python app works great (appcfg.py, upload_data, remote_api, CSV/YAML configs, etc.), but this problem on appspot prevents me from releasing my app!
Does anyone have a simple, up-to-date, and documented recipe to upload thousands of records to App Engine? A custom upload handler endpoint? Bundle the data file(s) with new app versions, then read them somehow? Post CSV files to Google Drive and read them from a task queue?
Any ideas?
You can try this as a work around:
Create a new application with "normal" Google account authentication.
Upload the data into that application.
Backup the data into Google Cloud Storage.
Restore the data from Google Cloud Storage into the original application.
As of SDK 1.7.3, Google says you can change the authentication method: http://code.google.com/p/googleappengine/wiki/SdkReleaseNotes. You could always revert to the Google Accounts API while doing your initial data load via remote_api, then set auth back to Federated Login once you're done.
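For reference, a rough sketch of what the remote_api load looked like with the legacy Python SDK once the app is temporarily back on Google Accounts auth; the model, field names, and file name below are made up for the example:

```python
# Legacy GAE Python SDK sketch: connect to the deployed app over remote_api
# and write entities directly to the production datastore.
# The Record model, its fields, and data.csv are placeholders.
import csv
import getpass

from google.appengine.ext import db
from google.appengine.ext.remote_api import remote_api_stub


class Record(db.Model):
    name = db.StringProperty()
    value = db.StringProperty()


def auth_func():
    return raw_input("Email: "), getpass.getpass("Password: ")


remote_api_stub.ConfigureRemoteApi(
    None, "/_ah/remote_api", auth_func, "your-app-id.appspot.com")

with open("data.csv") as fh:
    rows = [Record(name=r[0], value=r[1]) for r in csv.reader(fh)]

db.put(rows)
```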
