I'm trying to lease an app engine task from a pull queue in a compute engine instance but it keeps giving this error:
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "forbidden",
        "message": "you are not allowed to make this api call"
      }
    ],
    "code": 403,
    "message": "you are not allowed to make this api call"
  }
}
This is the code I'm using:
import httplib2, json, urllib
from oauth2client.client import AccessTokenCredentials
from apiclient.discovery import build

def FetchToken():
    METADATA_SERVER = ('http://metadata/computeMetadata/v1/instance/service-accounts')
    SERVICE_ACCOUNT = 'default'
    http = httplib2.Http()
    token_uri = '%s/%s/token' % (METADATA_SERVER, SERVICE_ACCOUNT)
    resp, content = http.request(token_uri, method='GET',
                                 body=None,
                                 headers={'Metadata-Flavor': 'Google'})
    print token_uri
    print content
    if resp.status == 200:
        d = json.loads(content)
        access_token = d['access_token']  # Save the access token
        credentials = AccessTokenCredentials(d['access_token'],
                                             'my-user-agent/1.0')
        autho = credentials.authorize(http)
        print autho
        return autho
    else:
        print resp.status

task_api = build('taskqueue', 'v1beta2')
lease_req = task_api.tasks().lease(project='project-name',
                                   taskqueue='pull-queue',
                                   leaseSecs=30,
                                   numTasks=1)
result = lease_req.execute(http=FetchToken())  #### ERRORS HERE
item = result['items'][0]
print item['payload']
It seems like an authentication issue, but I get the exact same error if I make the same lease request with a nonsense, made-up project name, so I can't be sure.
I also launched the instance with taskqueue enabled.
Any help would be greatly appreciated
In case anyone else is stuck on a problem like this, I'll explain how it's working now.
Firstly I'm using a different (shorter) method of authentication:
from oauth2client import gce
credentials = gce.AppAssertionCredentials('')
http = httplib2.Http()
http=credentials.authorize(http)
credentials.refresh(http)
service = build('taskqueue', 'v1beta2', http=http)
Secondly, the reason my lease request was being denied is that in queue.yaml my service account email was set as a writer email. The documentation mentions that an email ending in @gmail.com will not have the rights of a user email when set as a writer email. It's not mentioned that this also extends to emails ending in @developer.gserviceaccount.com.
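For anyone adjusting their own queue.yaml, a rough sketch of an acl that gives a service account lease rights looks like this (both email addresses are placeholders):
queue:
- name: pull-queue
  mode: pull
  acl:
  # Listing the service account as user_email grants full task access,
  # including lease calls; writer_email only allows inserting tasks.
  - user_email: 123456789-compute@developer.gserviceaccount.com
  - writer_email: someone@example.com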
I'm using the example from https://github.com/AzureAD/microsoft-authentication-library-for-python/blob/dev/sample/confidential_client_secret_sample.py. My aim is to grab the URL to report on the number of emails read, sent, and received per user.
I've been playing around with the endpoint setting and decided to hardcode it whilst testing. The Graph API resources is at GET https://graph.microsoft.com/v1.0/reports/getEmailActivityUserCounts(period='D7').
The code I'm using is as follows.
if "access_token" in result:
# Calling graph using the access token
graph_data = requests.get( # Use token to call downstream service
"https://graph.microsoft.com/v1.0/reports/getEmailActivityUserCounts(period=\'D7\')",
#config["endpoint"],
headers={'Authorization': 'Bearer ' + result['access_token']},).json()
print("Graph API call result: %s" % json.dumps(graph_data, indent=2))
I believe I am correctly escaping D7, but when I run the code I get the following error.
Exception has occurred: JSONDecodeError
Expecting value: line 1 column 1 (char 0)
During handling of the above exception, another exception occurred:
To add to this: when I removed the hardcoded string and uncommented config["endpoint"], the JSON config is in the following format:
{
  "authority": "https://login.microsoftonline.com/XXX/",
  "client_id": "XXX",
  "scope": ["https://graph.microsoft.com/.default"],
  "secret": "XXX",
  "endpoint": "https://graph.microsoft.com/v1.0/reports/getEmailActivityUserCounts(period='D7')"
}
Is this because the JSONDecoder can't decode the escaped characters for D7?
I tried to reproduce the same in my environment and got the results successfully as below:
I created an Azure AD application and granted the required API permission.
To retrieve the report on the number of emails read, sent, and received by user, I used the below Python code:
import requests
import urllib
import json
import csv
import os
client_id = urllib.parse.quote_plus('ClientID')
client_secret = urllib.parse.quote_plus('ClientSecret')
tenant = urllib.parse.quote_plus('TenantID')
auth_uri = 'https://login.microsoftonline.com/' + tenant \
    + '/oauth2/v2.0/token'
auth_body = 'grant_type=client_credentials&client_id=' + client_id \
    + '&client_secret=' + client_secret \
    + '&scope=https%3A%2F%2Fgraph.microsoft.com%2F.default'
authorization = requests.post(auth_uri, data=auth_body,
                              headers={'Content-Type': 'application/x-www-form-urlencoded'})
token = json.loads(authorization.content)['access_token']
graph_uri = \
    'https://graph.microsoft.com/v1.0/reports/getEmailActivityUserCounts(period=%27D7%27)'
response = requests.get(graph_uri, data=auth_body,
                        headers={'Content-Type': 'application/json',
                                 'Authorization': 'Bearer ' + token})
print("response:",response.text)
I am able to get the report successfully.
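As a side note on the original JSONDecodeError: a minimal sketch that checks the response before decoding (assuming result already holds the token dictionary returned by MSAL, as in the question's sample) could look like this:
import requests

resp = requests.get(
    "https://graph.microsoft.com/v1.0/reports/getEmailActivityUserCounts(period='D7')",
    headers={'Authorization': 'Bearer ' + result['access_token']},
)
# The reports endpoints typically redirect to a CSV download rather than
# returning JSON, so only call .json() when the content type says JSON.
if 'application/json' in resp.headers.get('Content-Type', ''):
    print(resp.json())
else:
    print(resp.text)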
Because I am using the snippet, I only get a short part of the message text.
I want to change that to get the full body of the message.
How can I do it?
def get_message_detail(service, message_id, format='raw', metadata_headers=[]):
    try:
        message_detail = service.users().messages().get(
            userId='me',
            id=message_id,
            format=format,
            metadataHeaders=metadata_headers
        ).execute()
        return message_detail
    except Exception as e:
        print(e)
        return None

if email_messages is not None:
    for email_message in email_messages:
        messageId = email_message['threadId']
        messageSubject = '(No subject) ({0})'.format(messageId)
        messageDetail = get_message_detail(
            gmail_service, email_message['id'], format='full',
            metadata_headers=['parts'])
        messageDetailPayload = messageDetail.get('payload')
        # print(messageDetailPayload)
        for item in messageDetailPayload['headers']:
            if item['name'] == 'Subject':
                if item['value']:
                    messageSubject = '{0} ({1})'.format(item['value'], messageId)
        email_data = messageDetail['payload']['headers']
        # print(email_data)
        # print(messageSubject)
        for values in email_data:
            name = values['name']
            if name == "From":
                from_name = values['value']
        get_detail_msg = messageDetail['snippet']
        print(get_detail_msg)
This will return the full MIME message, if that's what you're looking for.
# To install the Google client library for Python, run the following command:
# pip install --upgrade google-api-python-client google-auth-httplib2 google-auth-oauthlib
from __future__ import print_function

import base64
import email
import json
import os.path

import google.auth.exceptions
from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials
from google_auth_oauthlib.flow import InstalledAppFlow
from googleapiclient.discovery import build
from googleapiclient.errors import HttpError

# If modifying these scopes, delete the file token.json.
SCOPES = ['https://mail.google.com/']


def Authorize(credentials_file_path, token_file_path):
    """Shows basic usage of authorization"""
    try:
        credentials = None
        # The file token.json stores the user's access and refresh tokens, and is
        # created automatically when the authorization flow completes for the first
        # time.
        if os.path.exists(token_file_path):
            try:
                credentials = Credentials.from_authorized_user_file(token_file_path, SCOPES)
                credentials.refresh(Request())
            except google.auth.exceptions.RefreshError as error:
                # If the refresh fails, reset the credentials to None.
                credentials = None
                print(f'A refresh authorization error occurred: {error}')
        # If there are no (valid) credentials available, let the user log in.
        if not credentials or not credentials.valid:
            if credentials and credentials.expired and credentials.refresh_token:
                credentials.refresh(Request())
            else:
                flow = InstalledAppFlow.from_client_secrets_file(
                    credentials_file_path, SCOPES)
                credentials = flow.run_local_server(port=0)
            # Save the credentials for the next run
            with open(token_file_path, 'w') as token:
                token.write(credentials.to_json())
    except HttpError as error:
        # TODO: handle error
        print(f'An authorization error occurred: {error}')
    return credentials


def ListMessages(credentials):
    try:
        # Create a Gmail service object.
        service = build('gmail', 'v1', credentials=credentials)
        # Call the Gmail v1 API.
        results = service.users().messages().list(userId='me').execute()
        messages = results.get('messages', [])
        if not messages:
            print('No messages were found.')
            return
        print('Messages:')
        for message in messages:
            getMessage(credentials, message['id'])
    except HttpError as error:
        # TODO(developer) - Handle errors from the Gmail API.
        print(f'An error occurred: {error}')


def getMessage(credentials, message_id):
    # Get a single message.
    try:
        service = build('gmail', 'v1', credentials=credentials)
        # Call the Gmail v1 API, retrieve message data.
        message = service.users().messages().get(userId='me', id=message_id, format='raw').execute()
        # Parse the raw message.
        mime_msg = email.message_from_bytes(base64.urlsafe_b64decode(message['raw']))
        print(mime_msg['from'])
        print(mime_msg['to'])
        print(mime_msg['subject'])
        print("----------------------------------------------------")
        # Find the full message body.
        message_main_type = mime_msg.get_content_maintype()
        if message_main_type == 'multipart':
            for part in mime_msg.get_payload():
                if part.get_content_maintype() == 'text':
                    print(part.get_payload())
        elif message_main_type == 'text':
            print(mime_msg.get_payload())
        print("----------------------------------------------------")
        # Message snippet only.
        # print('Message snippet: %s' % message['snippet'])
    except HttpError as error:
        # TODO(developer) - Handle errors from the Gmail API.
        print(f'A message get error occurred: {error}')


if __name__ == '__main__':
    creds = Authorize('C:\\YouTube\\dev\\credentials.json', "token.json")
    ListMessages(creds)
Full tutorial: How to read gmail message body with python?
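If you would rather stick with format='full' as in the question, a minimal sketch of pulling the text out of the returned payload could look like this (the field names follow the Gmail API message resource; the helper name and the UTF-8 assumption are mine):
import base64

def get_body_from_payload(payload):
    """Recursively return the first decodable text part of a 'full' format payload."""
    data = payload.get('body', {}).get('data')
    if data:
        # Body data is base64url encoded; assume UTF-8 text for this sketch.
        return base64.urlsafe_b64decode(data).decode('utf-8', errors='replace')
    for part in payload.get('parts', []):
        body = get_body_from_payload(part)
        if body:
            return body
    return ''

# Usage with the question's variables would be along the lines of:
# body_text = get_body_from_payload(messageDetail['payload'])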
I am using Cloud Endpoints Frameworks with Python in a Google Cloud App Engine Standard environment to provide an API.
As far as I can tell, I should be able to use Python decorators from the Endpoints Frameworks, in combination with the endpointscfg.py command-line tool, to automatically set up token-based authentication with Auth0; the endpointscfg.py tool automatically creates the openapi.json file that is used to configure the Google Endpoints proxy.
Here's an example of my decorator for an API that echos stuff back:
# [START echo_api]
@endpoints.api(
    name='echo',
    version=_VERSION,
    api_key_required=True,
    audiences={'auth0': ['https://echo.<my-project>.appspot.com/_ah/api/echo/v1/echo']},
    issuers={'auth0': endpoints.Issuer(
        'https://<my-project>.auth0.com',
        'https://<my-project>.auth0.com/.well-known/jwks.json')}
)
class EchoApi(remote.Service):
    ...
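For context, a typical endpointscfg.py invocation for generating the spec (the module path and hostname below are placeholders) looks roughly like:
python lib/endpoints/endpointscfg.py get_openapi_spec main.EchoApi --hostname echo.<my-project>.appspot.com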
When I run the endpointscfg.py command-line tool, I get something in my openapi.json file that looks about right:
"paths": {
"/echo/v1/echo": {
"post": {
"operationId": "EchoApi_echo",
"parameters": [
{
"in": "body",
"name": "body",
"schema": {
"$ref": "#/definitions/MainEchoRequest"
}
}
],
"responses": {
"200": {
"description": "A successful response",
"schema": {
"$ref": "#/definitions/MainEchoResponse"
}
}
},
"security": [
{
"api_key": [],
"auth0_jwt": []
}
]
}
}
"securityDefinitions": {
"api_key": {
"in": "query",
"name": "key",
"type": "apiKey"
},
"auth0_jwt": {
"authorizationUrl": "https://<my-project>.auth0.com/authorize",
"flow": "implicit",
"type": "oauth2",
"x-google-issuer": "https://<my-project>.auth0.com",
"x-google-jwks_uri": "https://<my-project>.auth0.com/.well-known/jwks.json",
"x-google-audiences": "https://echo.<my-project>.appspot.com/_ah/api/echo/v1/echo"
}
}
So, the problem is that this set-up appears to do nothing and does not check incoming tokens to prevent access if no token is present or if the token is invalid.
I have been able to set up manual processing of the bearer token within the API echo function using the python-jose library (sorry if it's not well done, but I'm just testing and comments are welcome):
authorization_header = self.request_state.headers.get('authorization')
if authorization_header is not None:
    if authorization_header.startswith('Bearer '):
        access_token = authorization_header[7:]
        logging.info(access_token)
    else:
        logging.error("Authorization header did not start with 'Bearer '!")
        raise endpoints.UnauthorizedException(
            "Authentication failed (improperly formatted authorization header).")
else:
    logging.error("Authorization header was not found!")
    raise endpoints.UnauthorizedException("Authentication failed (bearer token not found).")

r = urlfetch.fetch(_JWKS_URL)
jwks_content = json.loads(r.content)
keys = jwks_content['keys']
public_key = jwk.construct(keys[0])
logging.info(public_key)

message, encoded_signature = str(access_token).rsplit('.', 1)

# decode the signature
decoded_signature = base64url_decode(encoded_signature.encode('utf-8'))

# verify the signature
if not public_key.verify(message.encode("utf8"), decoded_signature):
    logging.warning('Signature verification failed')
    raise endpoints.UnauthorizedException("Authentication failed (invalid signature).")
else:
    logging.info('Signature successfully verified')

claims = jwt.get_unverified_claims(access_token)

# additionally we can verify the token expiration
if time.time() > claims['exp']:
    logging.warning('Token is expired')
    raise endpoints.UnauthorizedException("Authentication failed (token expired).")

# and the audience (use claims['client_id'] if verifying an access token)
if claims['aud'] != _APP_CLIENT_ID:
    logging.warning('Token was not issued for this audience')
    raise endpoints.UnauthorizedException("Authentication failed (incorrect audience).")

# now we can use the claims
logging.info(claims)
This code works, but I assumed that the whole point of setting up the decorator and configuring the openapi.json file was to off-load these checks to the proxy so that only valid tokens hit my code.
What am I doing wrong?
UPDATE:
It may be that I need to check endpoints.get_current_user() in my code to control access. However, I have just noticed the following in my logs:
Cannot decode and verify the auth token. The backend will not be able to retrieve user info (/base/data/home/apps/e~<my-project>/echo:alpha23.414400469228485401/lib/endpoints_management/control/wsgi.py:643)
Traceback (most recent call last):
File "/base/data/home/apps/e~<my-project>/echo:alpha23.414400469228485401/lib/endpoints_management/control/wsgi.py", line 640, in __call__
service_name)
File "/base/data/home/apps/e~<my-project>/echo:alpha23.414400469228485401/lib/endpoints_management/auth/tokens.py", line 75, in authenticate
error)
UnauthenticatedException: (u'Cannot decode the auth token', UnauthenticatedException(u'Cannot find the `jwks_uri` for issuer https://<my-project>.auth0.com/: either the issuer is unknown or the OpenID discovery failed',))
However, I think everything is configured OK. Any idea why the 'jwks_uri' cannot be found, given that the path in the openapi.json file is correct?
I'm the current maintainer of these Frameworks. You do need to check endpoints.get_current_user() to control access, yes. I'm working on a feature to make this much simpler.
As for that UnauthenticatedException, you can ignore it. That's coming from the 'management framework', which attempts to check auth tokens even though it's not involved in the Frameworks' oauth security (only the api key security).
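For what it's worth, a minimal sketch of that per-method check might look like this (EchoRequest and EchoResponse are placeholder message classes standing in for the question's echo API):
@endpoints.method(EchoRequest, EchoResponse,
                  path='echo', http_method='POST', name='echo')
def echo(self, request):
    user = endpoints.get_current_user()
    if user is None:
        # No valid token was presented, or verification failed.
        raise endpoints.UnauthorizedException('Invalid or missing token.')
    return EchoResponse(content=request.content)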
I'm trying to connect to Cloud SQL inside a Google Cloud Endpoints API, using the lightweight JDBC wrapper sql2o as the data access method.
@Api(name = "questionapi", version = "v1", description = "question api")
public class QuestionService {

    private static Sql2o sql2o = new Sql2o(
            "jdbc:mysql://xxx.xxx.xxx.xxx:3306/xxxxx", "root",
            "xxxxxxx");

    @ApiMethod(name = "get", httpMethod = HttpMethod.GET)
    public List<Question> get() {
        String q = "select * from questions";
        try (Connection conn = sql2o.open()) {
            return conn.createQuery(q).executeAndFetch(Question.class);
        }
    }
After the app is running, I can visit localhost:8888/_ah/api/explorer to try the API. However, there is an error saying:
org.sql2o.Sql2oException: Could not acquire a connection from DataSource - No suitable driver found for jdbc:mysql://xxx.xxx.xxx.xxx:3306/xxxxx
How can I solve this issue?
EDIT:
After change to maven project and I got this new error message:
503 Service Unavailable
{
  "error": {
    "message": "java.lang.NoClassDefFoundError: Could not initialize class com.mysql.jdbc.ConnectionImpl",
    "code": 503,
    "errors": [
      {
        "domain": "global",
        "reason": "backendError",
        "message": "java.lang.NoClassDefFoundError: Could not initialize class com.mysql.jdbc.ConnectionImpl"
      }
    ]
  }
}
EDIT
It's a new day and I'm still stuck here.
What I did was use Maven to download the endpoints-skeleton-archetype project; it's a new, empty Cloud Endpoints backend API project, ready for use, with the required files and directories.
I immediately deployed it to App Engine and tried to return a meaningful value. It worked; a simple 'hello world' string was returned.
Next, I tried to connect to Cloud SQL using JDBC. In order to do that, I followed the tutorial here
to add <use-google-connector-j>true</use-google-connector-j> into appengine-web.xml,
and I tried different combinations of connection string:
Class.forName("com.mysql.jdbc.GoogleDriver");
String url = "jdbc:google:mysql://xxxxxxxxxxxx:xxxxx?user=root";
conn = DriverManager.getConnection(url);
ResultSet rs = conn.createStatement().executeQuery("SELECT 1 + 1");
After all this, I still get the following error message.
503 Service Unavailable
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "backendError",
        "message": "java.lang.NullPointerException"
      }
    ],
    "code": 503,
    "message": "java.lang.NullPointerException"
  }
}
This error occurs when the JDBC driver is not found in the classpath. How are you managing dependencies? Do you use Maven? The error should be fixed if you add the MySQL JDBC driver to your list of dependencies.
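If you are on Maven, that usually means a dependency along these lines in pom.xml (the version shown is only an example; use whichever release matches your setup):
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>5.1.49</version>
</dependency>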
I have another comment on your code, which has nothing to do with your question, but here it comes anyway.
The code line below has a connection leak, as you never close the connection. This will eventually deplete the connection pool and your application will hang.
return sql2o.open().createQuery(q).executeAndFetch(Question.class);
This is the correct way of doing it when using sql2o:
try (Connection con = sql2o.open()) {
return con.createQuery(q).executeAndFetch(Question.class);
}
I'm creating an API using Google Cloud Endpoints where I would like to return a "no content" HTTP 204 response if there's nothing to return. I tried returning null, which throws an error on the development server and returns a non-empty result with status code 200 in production.
Is it possible to send a true 204 empty response, or other kinds of custom responses?
To return a 204 No Content for a production Python Cloud Endpoints API, you can use VoidMessage.
from google.appengine.ext import endpoints
from protorpc import messages
from protorpc import message_types
from protorpc import remote


class MyMessage(messages.Message):
    ...


@endpoints.api('someapi', 'v1', 'Description')
class MyApi(remote.Service):

    @endpoints.method(MyMessage, message_types.VoidMessage,
                      ...)
    def my_method(self, request):
        ...
        return message_types.VoidMessage()
This currently returns a 200 on the development server, thanks for finding this bug!
This probably doesn't help, but the only way I know to manipulate the status code is by raising an exception. There is a default set of exceptions provided which map to 400, 401, 403, 404 and 500. The docs say you can subclass endpoints.ServiceException to generate other status codes; however, I haven't been able to get this to work. If you set http_status to anything other than one of those listed above, it always results in a 400.
class TestException(endpoints.ServiceException):
    http_status = httplib.NO_CONTENT
I run a test in my handler like this:
raise TestException('The status should be 204')
And I see this output when testing it using the API explorer:
400 Bad Request
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "badRequest",
        "message": "The status should be 204"
      }
    ],
    "code": 400,
    "message": "The status should be 204"
  }
}