Google Pub/Sub access rights - google-app-engine

I created a topic in my project Project 1, and I have an app on Google App Engine which posts a message to this topic every minute.
I have a Google Compute Engine machine in a second project (Project 2) which is subscribed to this topic and receives the messages.
I did not give any access rights to the machine in Project 2, but even without access rights, it managed to receive the messages. More precisely, I did not set any specific permissions on the topic I created.
My questions are:
1- Is this normal? Shouldn't the machine in Project 2 get a "forbidden access" error?
2- How can I restrict access to a certain topic?
Here is the code of my subscription part:
import httplib2
import base64
import pandas
import json
from apiclient import discovery
from oauth2client import client as oauth2client
from oauth2client.client import SignedJwtAssertionCredentials
from oauth2client.client import GoogleCredentials


def create_pubsub_client(http=None):
    credentials = GoogleCredentials.get_application_default()
    if not http:
        http = httplib2.Http()
    credentials.authorize(http)
    return discovery.build('pubsub', 'v1', http=http)


client = create_pubsub_client()

# You can fetch multiple messages with a single API call.
batch_size = 1
subscription_str = 'projects/<myproject1>/subscriptions/testo'

# Create a POST body for the Pub/Sub request
body = {
    # Setting ReturnImmediately to false instructs the API to wait
    # to collect the message up to the size of MaxEvents, or until
    # the timeout.
    'returnImmediately': False,
    'maxMessages': batch_size,
}

while True:
    resp = client.projects().subscriptions().pull(
        subscription=subscription_str, body=body).execute()
    received_messages = resp.get('receivedMessages')
    if received_messages is not None:
        ack_ids = []
        for received_message in received_messages:
            pubsub_message = received_message.get('message')
            if pubsub_message:
                # Process messages
                msg = base64.b64decode(str(pubsub_message.get('data')))
                treatment(msg)
                # Get the message's ack ID
                ack_ids.append(received_message.get('ackId'))
        # Create a POST body for the acknowledge request
        ack_body = {'ackIds': ack_ids}
        # Acknowledge the message.
        client.projects().subscriptions().acknowledge(
            subscription=subscription_str, body=ack_body).execute()

The ability of the machine in Project 2 to access the topic/subscription in Project 1 depends entirely on how the machine is authenticated. If it is authenticated with something that has permissions on both projects, e.g., your developer account, then you will be able to access the subscription on the topic in Project 1. That is normal.
If you want to restrict access, create a service account in Project 1 and set the permissions on your topic and/or subscription to allow only that service account. You can do so in the Pub/Sub section of the Google Developers Console. Then only machines authenticated via that service account will be able to access them.
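For illustration only (this sketch is not from the original answer, and the service account email is a hypothetical placeholder), the same restriction can be applied through the API by replacing the subscription's IAM policy with a single binding, using the discovery-based client from the question:

# Sketch: allow only one service account to pull from the subscription.
# The service account email below is a hypothetical placeholder.
policy_body = {
    'policy': {
        'bindings': [{
            'role': 'roles/pubsub.subscriber',
            'members': ['serviceAccount:reader@myproject1.iam.gserviceaccount.com'],
        }],
    }
}
client.projects().subscriptions().setIamPolicy(
    resource='projects/<myproject1>/subscriptions/testo',
    body=policy_body).execute()

The same setIamPolicy method also exists on topics, if you want to restrict publishing as well.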

Related

Cloud function doesn't have permission to other projects under the same billing account to implement a spending limit

I am following the documentation at https://cloud.google.com/appengine/docs/managing-costs to disable App Engine using Cloud Functions when a spending limit is reached.
I want to disable App Engine for all projects associated with the same billing account, which the docs say should work, per the following note:
Note: The source code assumes that the function you are creating and the app you want to disable are in the same Google Cloud project. If the function and the app are in separate projects, change the source code so APP_NAME identifies the project that contains the app you want to disable.
But I get this error
Error: function terminated. Recommended action: inspect logs for
termination reason. Details: <HttpError 403 when requesting
https://appengine.googleapis.com/v1/apps/other-project?alt=json
returned "The caller does not have permission">
I also verified that I am using the App Engine default service account with Admin role per the docs
Select a service account that has the App Engine Admin role. The App Engine default service account has this role by default.
I modified the example for testing; it is basically this:
import base64
import json
import os

from googleapiclient import discovery

APP_NAME = os.getenv('GCP_PROJECT')


def limit_use_appengine(data, context):
    pubsub_data = base64.b64decode(data['data']).decode('utf-8')
    pubsub_json = json.loads(pubsub_data)
    cost_amount = pubsub_json['costAmount']
    budget_amount = pubsub_json['budgetAmount']
    if cost_amount <= budget_amount:
        print(f'No action necessary. (Current cost: {cost_amount})')

        appengine = discovery.build(
            'appengine',
            'v1',
            cache_discovery=False
        )
        apps = appengine.apps()

        # for testing
        display_status(apps, APP_NAME)                        # works fine
        display_status(apps, "billing-account-project-name")  # also works fine
        display_status(apps, "other-project")                 # permission error
        return

    appengine = discovery.build(
        'appengine',
        'v1',
        cache_discovery=False
    )
    apps = appengine.apps()
    check_app(apps, APP_NAME)
    check_app(apps, "other-project")


def check_app(apps, appName):
    # Get the target app's serving status
    target_app = apps.get(appsId=appName).execute()
    current_status = target_app['servingStatus']

    # Disable target app, if necessary
    if current_status == 'SERVING':
        print(f'Attempting to disable app {appName}...')
        body = {'servingStatus': 'USER_DISABLED'}
        apps.patch(appsId=appName, updateMask='serving_status', body=body).execute()


def display_status(apps, appName):
    target_app = apps.get(appsId=appName).execute()
    current_status = target_app['servingStatus']
    print(f'Serving status for {appName} is {current_status}')
What am I missing?
Service accounts either inherit permissions or are granted permissions on a project. A billing account does not affect a service account's permissions.
Solution: grant the service account the required role in the project where you are getting the permission error. If you are using an Organization, assign the role higher up so that the service account inherits permissions on the projects below it.
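As a minimal sketch (not part of the original answer; the project ID, service account email, and role below are hypothetical placeholders), the grant could be made through the Cloud Resource Manager API, which is equivalent to adding the binding on the Console's IAM page:

# Sketch: give the function's runtime service account the App Engine Admin
# role on the other project. All identifiers below are placeholders.
from googleapiclient import discovery

crm = discovery.build('cloudresourcemanager', 'v1')
policy = crm.projects().getIamPolicy(
    resource='other-project', body={}).execute()
policy.setdefault('bindings', []).append({
    'role': 'roles/appengine.appAdmin',
    'members': ['serviceAccount:billing-project@appspot.gserviceaccount.com'],
})
crm.projects().setIamPolicy(
    resource='other-project', body={'policy': policy}).execute()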

Google Cloud Tasks cannot authenticate to Cloud Run

I am trying to invoke a Cloud Run service using Cloud Tasks as described in the docs here.
I have a running Cloud Run service. If I make the service publicly accessible, it behaves as expected.
I have created a Cloud Tasks queue, and I schedule the task with a local script that runs under my own account. The script looks like this:
from google.cloud import tasks_v2

client = tasks_v2.CloudTasksClient()

project = 'my-project'
queue = 'my-queue'
location = 'europe-west1'
url = 'https://url_to_my_service'

parent = client.queue_path(project, location, queue)

task = {
    'http_request': {
        'http_method': 'GET',
        'url': url,
        'oidc_token': {
            'service_account_email': 'my-service-account@my-project.iam.gserviceaccount.com'
        }
    }
}

response = client.create_task(parent, task)
print('Created task {}'.format(response.name))
I see the task appear in the queue, but it fails and retries immediately. The reason for this (by checking the logs) is that the Cloud Run service returns a 401 response.
My own user has the roles "Service Account Token Creator" and "Service Account User". It doesn't have the "Cloud Tasks Enqueuer" explicitly, but since I am able to create the task in the queue, I guess I have inherited the required permissions.
The service account "my-service-account@my-project.iam.gserviceaccount.com" (which I use in the task to get the OIDC token) has - amongst others - the following roles:
Cloud Tasks Enqueuer (Although I don't think it needs this one as I'm creating the task with my own account)
Cloud Tasks Task Runner
Cloud Tasks Viewer
Service Account Token Creator (I'm not sure whether this should be added to my own account - the one who schedules the task - or to the service account that should perform the call to Cloud Run)
Service Account User (same here)
Cloud Run Invoker
So I did a dirty trick: I created a key file for the service account, downloaded it locally, and impersonated it locally by adding an account to my gcloud config with the key file. Next, I ran:
curl -H "Authorization: Bearer $(gcloud auth print-identity-token)" https://url_to_my_service
That works! (By the way, it also works when I switch back to my own account)
Final tests: if I remove the oidc_token from the task when creating the task, I get a 403 response from Cloud Run! Not a 401...
If I remove the "Cloud Run Invoker" role from the service account and try again locally with curl, I also get a 403 instead of a 401.
If I finally make the Cloud Run service publicly accessible, everything works.
So it seems that Cloud Tasks fails to generate a token with which the service account can properly authenticate to the Cloud Run service.
What am I missing?
I had the same issue; here was my fix:
Diagnosis: Generating OIDC tokens currently does not support custom domains in the audience parameter. I was using a custom domain for my Cloud Run service (https://my-service.my-domain.com) instead of the Cloud Run generated URL (found in the Cloud Run service dashboard) that looks like this: https://XXXXXX.run.app
Masking behavior: In the task being enqueued to Cloud Tasks, if the audience field for the oidc_token is not explicitly set, then the target URL from the task is used as the audience in the request for the OIDC token.
In my case this meant that when enqueueing a task to be sent to the target https://my-service.my-domain.com/resource, the audience for generating the OIDC token was set to my custom domain https://my-service.my-domain.com/resource. Since custom domains are not supported when generating OIDC tokens, I was receiving 401 Not Authorized responses from the target service.
My fix: Explicitly populate the audience with the Cloud Run generated URL, so that a valid token is issued. In my client I was able to globally set the audience for all tasks targeting a given service with the base URL: 'audience' : 'https://XXXXXX.run.app'. This generated a valid token. I did not need to change the URL of the target resource itself; the resource stayed the same: 'url' : 'https://my-service.my-domain.com/resource'. A minimal sketch of the resulting task body follows below.
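For illustration only (not code from the original answer; the URLs and service account email are placeholders reused from the question), the task body with an explicit audience would look like this:

# Sketch: deliver the task to the custom-domain URL, but pin the OIDC
# audience to the Cloud Run generated *.run.app URL (placeholder values).
task = {
    'http_request': {
        'http_method': 'GET',
        'url': 'https://my-service.my-domain.com/resource',  # custom domain target
        'oidc_token': {
            'service_account_email': 'my-service-account@my-project.iam.gserviceaccount.com',
            'audience': 'https://XXXXXX.run.app',  # Cloud Run generated URL
        },
    }
}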
More Reading:
I've run into this problem before when setting up service-to-service authentication: Google Cloud Run Authentication Service-to-Service
1. I created a private Cloud Run service using this code:
import os

from flask import Flask
from flask import request

app = Flask(__name__)


@app.route('/index', methods=['GET', 'POST'])
def hello_world():
    target = os.environ.get('TARGET', 'World')
    print(target)
    return str(request.data)


if __name__ == "__main__":
    app.run(debug=True, host='0.0.0.0', port=int(os.environ.get('PORT', 8080)))
2. I created a service account with --role=roles/run.invoker that I will associate with the Cloud Task:
gcloud iam service-accounts create SERVICE-ACCOUNT_NAME \
    --display-name "DISPLAYED-SERVICE-ACCOUNT_NAME"
gcloud iam service-accounts list
gcloud run services add-iam-policy-binding SERVICE \
    --member=serviceAccount:SERVICE-ACCOUNT_NAME@PROJECT-ID.iam.gserviceaccount.com \
    --role=roles/run.invoker
3. I created a queue:
gcloud tasks queues create my-queue
4. I created a test.py:
from google.cloud import tasks_v2
from google.protobuf import timestamp_pb2
import datetime

# Create a client.
client = tasks_v2.CloudTasksClient()

# TODO(developer): Uncomment these lines and replace with your values.
project = 'your-project'
queue = 'your-queue'
location = 'europe-west2'  # app engine location
url = 'https://helloworld/index'
payload = 'Hello from the Cloud Task'

# Construct the fully qualified queue name.
parent = client.queue_path(project, location, queue)

# Construct the request body.
task = {
    'http_request': {  # Specify the type of request.
        'http_method': 'POST',
        'url': url,  # The full url path that the task will be sent to.
        'oidc_token': {
            'service_account_email': "your-service-account"
        },
        'headers': {
            'Content-Type': 'application/json',
        }
    }
}

# Convert "seconds from now" into an rfc3339 datetime string.
d = datetime.datetime.utcnow() + datetime.timedelta(seconds=60)

# Create Timestamp protobuf.
timestamp = timestamp_pb2.Timestamp()
timestamp.FromDatetime(d)

# Add the timestamp to the task.
task['schedule_time'] = timestamp
task['name'] = 'projects/your-project/locations/app-engine-location/queues/your-queue/tasks/your-task'

converted_payload = payload.encode()

# Add the payload to the request.
task['http_request']['body'] = converted_payload

# Use the client to build and send the task.
response = client.create_task(parent, task)

print('Created task {}'.format(response.name))
# return response
5. I ran the code in Google Cloud Shell with my user account, which has the Owner role.
6. The response received has the form:
Created task projects/your-project/locations/app-engine-location/queues/your-queue/tasks/your-task
7. I checked the logs: success.
The next day, I was no longer able to reproduce this issue. I could still reproduce the 403 responses by removing the Cloud Run Invoker role, but I no longer got 401 responses with exactly the same code as the day before.
I guess this was a temporary issue on Google's side?
Also, I noticed that it takes some time before updated policies are actually in place (1 to 2 minutes).
For those like me who are struggling through documentation and Stack Overflow while getting continuous UNAUTHORIZED responses on Cloud Tasks HTTP requests:
As was written earlier in this thread, you should provide an audience for the oidcToken you send to Cloud Tasks, and ensure that the audience exactly matches your resource.
For instance, if you have a Cloud Function named my-awesome-cloud-function and your task request URL is https://REGION-PROJECT-ID.cloudfunctions.net/my-awesome-cloud-function/api/v1/hello, you need to set the audience to the function URL itself:
{
    "serviceAccountEmail": "SERVICE-ACCOUNT_NAME@PROJECT-ID.iam.gserviceaccount.com",
    "audience": "https://REGION-PROJECT-ID.cloudfunctions.net/my-awesome-cloud-function"
}
Otherwise, the full URL seems to be used as the audience, which leads to the error.

Allow cloud functions ip from app engine firewall rules

We have created an App Engine instance that works as a backend, and a Cloud Function alongside it.
The Cloud Function needs to access the API from App Engine within the same Google project. This works fine if the App Engine firewall allows everyone to access it, but in our case we need to restrict access to the Cloud Function only.
I'm new to GCP, so I would highly appreciate your suggestions. Thanks in advance.
The best solution would be to activate IAP (Identity-Aware Proxy) for App Engine. Here you can find a guide on how to activate IAP on App Engine.
IAP will block access to your App Engine instance for any user or application except the ones you allow explicitly. In your situation you will need to allow the Cloud Functions service account to access your application. You can check this guide on how to achieve that programmatically from Cloud Functions; it has examples for C#, Python, Java and PHP.
e.g., Python:
import google.auth
import google.auth.app_engine
import google.auth.compute_engine.credentials
import google.auth.iam
from google.auth.transport.requests import Request
import google.oauth2.credentials
import google.oauth2.service_account
import requests
import requests_toolbelt.adapters.appengine

IAM_SCOPE = 'https://www.googleapis.com/auth/iam'
OAUTH_TOKEN_URI = 'https://www.googleapis.com/oauth2/v4/token'


def make_iap_request(url, client_id, method='GET', **kwargs):
    """Makes a request to an application protected by Identity-Aware Proxy.

    Args:
      url: The Identity-Aware Proxy-protected URL to fetch.
      client_id: The client ID used by Identity-Aware Proxy.
      method: The request method to use
              ('GET', 'OPTIONS', 'HEAD', 'POST', 'PUT', 'PATCH', 'DELETE')
      **kwargs: Any of the parameters defined for the request function:
                https://github.com/requests/requests/blob/master/requests/api.py
                If no timeout is provided, it is set to 90 by default.

    Returns:
      The page body, or raises an exception if the page couldn't be retrieved.
    """
    # Set the default timeout, if missing
    if 'timeout' not in kwargs:
        kwargs['timeout'] = 90

    # Figure out what environment we're running in and get some preliminary
    # information about the service account.
    bootstrap_credentials, _ = google.auth.default(
        scopes=[IAM_SCOPE])
    if isinstance(bootstrap_credentials,
                  google.oauth2.credentials.Credentials):
        raise Exception('make_iap_request is only supported for service '
                        'accounts.')
    elif isinstance(bootstrap_credentials,
                    google.auth.app_engine.Credentials):
        requests_toolbelt.adapters.appengine.monkeypatch()

    # For service account's using the Compute Engine metadata service,
    # service_account_email isn't available until refresh is called.
    bootstrap_credentials.refresh(Request())

    signer_email = bootstrap_credentials.service_account_email
    if isinstance(bootstrap_credentials,
                  google.auth.compute_engine.credentials.Credentials):
        # Since the Compute Engine metadata service doesn't expose the service
        # account key, we use the IAM signBlob API to sign instead.
        # In order for this to work:
        #
        # 1. Your VM needs the https://www.googleapis.com/auth/iam scope.
        #    You can specify this specific scope when creating a VM
        #    through the API or gcloud. When using Cloud Console,
        #    you'll need to specify the "full access to all Cloud APIs"
        #    scope. A VM's scopes can only be specified at creation time.
        #
        # 2. The VM's default service account needs the "Service Account Actor"
        #    role. This can be found under the "Project" category in Cloud
        #    Console, or roles/iam.serviceAccountActor in gcloud.
        signer = google.auth.iam.Signer(
            Request(), bootstrap_credentials, signer_email)
    else:
        # A Signer object can sign a JWT using the service account's key.
        signer = bootstrap_credentials.signer

    # Construct OAuth 2.0 service account credentials using the signer
    # and email acquired from the bootstrap credentials.
    service_account_credentials = google.oauth2.service_account.Credentials(
        signer, signer_email, token_uri=OAUTH_TOKEN_URI, additional_claims={
            'target_audience': client_id
        })

    # service_account_credentials gives us a JWT signed by the service
    # account. Next, we use that to obtain an OpenID Connect token,
    # which is a JWT signed by Google.
    google_open_id_connect_token = get_google_open_id_connect_token(
        service_account_credentials)

    # Fetch the Identity-Aware Proxy-protected URL, including an
    # Authorization header containing "Bearer " followed by a
    # Google-issued OpenID Connect token for the service account.
    resp = requests.request(
        method, url,
        headers={'Authorization': 'Bearer {}'.format(
            google_open_id_connect_token)}, **kwargs)
    if resp.status_code == 403:
        raise Exception('Service account {} does not have permission to '
                        'access the IAP-protected application.'.format(
                            signer_email))
    elif resp.status_code != 200:
        raise Exception(
            'Bad response from application: {!r} / {!r} / {!r}'.format(
                resp.status_code, resp.headers, resp.text))
    else:
        return resp.text


def get_google_open_id_connect_token(service_account_credentials):
    """Get an OpenID Connect token issued by Google for the service account.

    This function:
      1. Generates a JWT signed with the service account's private key
         containing a special "target_audience" claim.
      2. Sends it to the OAUTH_TOKEN_URI endpoint. Because the JWT in #1
         has a target_audience claim, that endpoint will respond with
         an OpenID Connect token for the service account -- in other words,
         a JWT signed by *Google*. The aud claim in this JWT will be
         set to the value from the target_audience claim in #1.

    For more information, see
    https://developers.google.com/identity/protocols/OAuth2ServiceAccount .
    The HTTP/REST example on that page describes the JWT structure and
    demonstrates how to call the token endpoint. (The example on that page
    shows how to get an OAuth2 access token; this code is using a
    modified version of it to get an OpenID Connect token.)
    """
    service_account_jwt = (
        service_account_credentials._make_authorization_grant_assertion())
    request = google.auth.transport.requests.Request()
    body = {
        'assertion': service_account_jwt,
        'grant_type': google.oauth2._client._JWT_GRANT_TYPE,
    }
    token_response = google.oauth2._client._token_endpoint_request(
        request, OAUTH_TOKEN_URI, body)
    return token_response['id_token']
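As a hypothetical usage sketch (the URL and IAP OAuth client ID below are placeholders, not values from the original answer), the Cloud Function would call the helper with the App Engine app's IAP client ID as the audience:

# Sketch: call the IAP-protected App Engine API from the Cloud Function.
# Both arguments are placeholders for your own app URL and IAP client ID.
response_body = make_iap_request(
    'https://my-backend-project.appspot.com/api/v1/items',
    'PROJECT_NUMBER-xxxxxxxx.apps.googleusercontent.com')
print(response_body)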
In case you use a Cloud Function in Node.js, a Stack Overflow user created an example of how to achieve the same in Node.js in this post.

Authorize requests to app engine app with a service account

I am using the app.yaml's login: admin in handlers to restrict access to my app only to selected Google accounts (which I can edit in IAM). I'm using the python27 standard environment on GAE.
I would like to use the JSON API my app exposes from another server app (not hosted on GAE). Using a service account looks like a straightforward solution, but I am unable to get the scopes or the request itself right so that the endpoint sees an authenticated Google user.
The service user currently has the Project Viewer role in IAM. I tried a few more, like App Engine Viewer and App Engine Admin. I also tried some more scopes.
My test code:
"""Try do do an API request to a deployed app
with the current service account.
https://google-auth.readthedocs.io/en/latest/user-guide.html
"""
import sys
from google.auth.transport.requests import AuthorizedSession
from google.oauth2 import service_account
def main():
if len(sys.argv) < 2:
sys.exit("use: %s url" % sys.argv[0])
credentials = service_account.Credentials.from_service_account_file(
'service-user.json')
scoped_credentials = credentials.with_scopes(
['https://www.googleapis.com/auth/cloud-platform.read-only'])
authed_http = AuthorizedSession(scoped_credentials)
response = authed_http.request('GET', sys.argv[1])
print response.status_code, response.reason
print response.text.encode('utf-8')
if __name__ == '__main__':
main()
There is no error; the request just behaves as unauthenticated. I checked the headers on the server: while a request from the browser carries several session cookies, the AuthorizedSession request contains a single Authorization: Bearer .. header.
Normally the role you would need is App Engine Admin; it's designed for this purpose. It should also work with the Viewer/Editor/Owner primitive roles. That being said, to make sure it's not a "role" issue, simply give the service account the Project Owner role and also the explicit App Engine Admin role and try again. This will eliminate any role-based issue.
Let me know if that works for you.

Accessing a Google Drive spreadsheet from Appengine

I have an appengine app that needs to access a single, hard-coded spreadsheet on Google Drive.
Up until now I have been achieving this as follows:
SpreadsheetService service = new SpreadsheetService("myapp");
service.setUserCredentials("myusername@gmail.com", "myhardcodedpassword");
When I tried this today with a new user, I got InvalidCredentialsException even though the username and password were definitely correct. I got an email in my inbox saying suspicious sign-ins had been prevented, and there seems to be no way to enable them again.
I am also aware that hardcoding passwords in source is bad practice.
However, I have read very widely online about how to enable OAuth/OAuth2 for this, and have ended up wasting hours and hours piecing together fragments of information from blogs, Stack Overflow answers, etc., to no avail.
Ideally the solution would involve an initial process to generate a long-lived access token, which could then be hard-coded into the app.
I want a definitive list of steps for how to achieve this.
EDIT: As Google have redesigned the API Console, the details of the steps below have changed - see comments
OK here goes, step by step
Go to Google Cloud Console and register your project (aka application)
You need to note the Client ID, and Client Secret
Go to the OAuth Playground, click the gear icon, choose to use your own OAuth credentials, and enter them
You will be reminded that you need to go back to the Cloud Console and add the OAuth Playground as a valid callback URL. So do that.
Do Step 1, choosing the spreadsheet scope, and click Authorize
Choose your Google account if prompted and grant authorization when prompted
Do Step 2, click 'Exchange authorization code for tokens'
You will see an input box containing a refresh token
The refresh token is the equivalent of your long lived username/password, so this is what you'll hard code (or store someplace secure your app can retrieve it).
When you need to access Google Spreadsheets, you will call
POST https://accounts.google.com/o/oauth2/token
content-type: application/x-www-form-urlencoded
client_secret=************&grant_type=refresh_token&refresh_token=1%2xxxxxxxxxx&client_id=999999999999.apps.googleusercontent.com
which will return you an access token
{
    "access_token": "ya29.yyyyyyyyyyyyyyyyyy",
    "token_type": "Bearer",
    "expires_in": 3600
}
Put the access token into an http header for whenever you access the spreadsheet API
Authorization: Bearer ya29.yyyyyyyyyyyyyyyyy
And you're done
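For reference, a minimal Python sketch of that refresh-token exchange (using the requests library; the client ID, client secret, and refresh token are the placeholders from the POST example above) might look like this:

# Sketch: exchange the long-lived refresh token for a short-lived access token.
import requests

resp = requests.post(
    'https://accounts.google.com/o/oauth2/token',
    data={
        'client_id': '999999999999.apps.googleusercontent.com',
        'client_secret': '************',
        'refresh_token': '1/xxxxxxxxxx',
        'grant_type': 'refresh_token',
    })
access_token = resp.json()['access_token']

# Use the token in the Authorization header for spreadsheet API calls.
headers = {'Authorization': 'Bearer {}'.format(access_token)}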
Pinoyyid has indeed provided wonderful help. I wanted to follow up with Python code that will allow access to a Personal (web-accessible) Google Drive.
This runs fine within the Google App Engine (as part of a web app) and also standalone on the desktop (assuming the Google App Engine SDK is installed on your machine [available from: https://developers.google.com/appengine/downloads]).
In my case I added the https://www.googleapis.com/auth/drive scope during Pinoyyid's process because I wanted access to all my Google Drive files.
After following Pinoyyid's instructions to get your refresh token etc., this little Python script will get a listing of all the files on your Google Drive:
import httplib2
import datetime

from oauth2client.client import OAuth2Credentials
from apiclient.discovery import build

API_KEY = 'AIz...'  # from "Key for Server Applications" from "Public API Access" section of Google Developers Console

access_token = "ya29..."  # from Pinoyyid's instructions
refresh_token = "1/V..."  # from Pinoyyid's instructions

client_id = '654....apps.googleusercontent.com'  # from "Client ID for web application" from "OAuth" section of Google Developers Console
client_secret = '6Cl...'  # from "Client ID for web application" from "OAuth" section of Google Developers Console

token_expiry = datetime.datetime.utcnow() - datetime.timedelta(days=1)
token_uri = 'https://accounts.google.com/o/oauth2/token'
user_agent = 'python urllib (I reckon)'


def main():
    service = createDrive()
    dirlist = service.files().list(maxResults=30)
    print 'dirlist', dirlist
    result = dirlist.execute()
    print 'result', result


def createDrive():
    credentials = OAuth2Credentials(access_token, client_id, client_secret, refresh_token,
                                    token_expiry, token_uri, user_agent)
    http = httplib2.Http()
    http = credentials.authorize(http)
    return build('drive', 'v2', http=http, developerKey=API_KEY)


main()
I'm grateful to all who have provided the steps along the way to solving this.
