JWT Authentication in Google Cloud Functions - google-app-engine

I'm having trouble tracking down the cause of a 403 response from the Google Dataflow API when it is called through the "googleapis" module inside a Google Cloud Function.
The same code works when run on my PC.
The JWT .json file is retrieved from an object stored in a Google Cloud Storage bucket.
The code looks like this:
...
return getToken() // Retrieves the JWT client from Google Cloud Storage
    .then(function (jwtToken) {
        console.log("Token: ", JSON.stringify(jwtToken));
        return dataFlowList({
            projectId: adc.projectId,
            auth: jwtToken,
            filter: "TERMINATED"
        }).then(list => filterDataflowJobList(list))
...
Here is the getToken function:
...
let storage: CloudStorage.Storage = CloudStorage({
    projectId: adc.projectId
});
var bucket: CloudStorage.Bucket = storage.bucket(bucketName);
var bucketGetFiles = PromiseLab.denodeify(bucket.getFiles);
var stream = bucket.file(jwtJsonFileName).createReadStream();
return toString(stream)
    .then(function (msg) {
        var jsonJwt = JSON.parse(msg);
        var jwtClient = new google.auth.JWT(
            jsonJwt.client_email,
            null,
            jsonJwt.private_key,
            ['https://www.googleapis.com/auth/cloud-platform'], // an array of auth scopes
            null
        );
        return jwtClient;
    }).catch(function (error) {
        console.log("Error while trying to retrieve JWT json");
        throw error;
    })
}
...
I'm based in the EU and Cloud Functions are US-bound; could that be the cause?
The Dataflow jobs also run in the US.

While running on Cloud Functions, the authentication retrieval method I'm using does not retrieve the projectId, hence the unauthorized response.
async function getADC() {
    // Acquire a client and the projectId based on the environment. This method looks
    // for the GCLOUD_PROJECT and GOOGLE_APPLICATION_CREDENTIALS environment variables.
    const res = await auth.getApplicationDefault();
    let client = res.credential;
    // The createScopedRequired method returns true when running on GAE or a local developer
    // machine. In that case, the desired scopes must be passed in manually. When the code is
    // running in GCE or a Managed VM, the scopes are pulled from the GCE metadata server.
    // See https://cloud.google.com/compute/docs/authentication for more information.
    if (client.createScopedRequired && client.createScopedRequired()) {
        // Scopes can be specified either as an array or as a single, space-delimited string.
        const scopes = ['https://www.googleapis.com/auth/cloud-platform'];
        client = client.createScoped(scopes);
    }
    return {
        client: client,
        projectId: res.projectId
    }
}
I discovered this by looking at the request in the error log; the URL was of the form url: 'https://dataflow.googleapis.com/v1b3/projects//jobs' (notice the double "//" between "projects" and "jobs").
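A minimal sketch of the workaround, assuming the service-account JSON loaded from the bucket contains the standard project_id field; the helper name and the GCLOUD_PROJECT fallback are illustrative, not part of the original code:

// Hypothetical helper: resolve the project ID explicitly instead of relying on ADC.
// jsonJwt is the parsed service-account key file loaded from the bucket (see getToken).
function resolveProjectId(jsonJwt) {
    if (jsonJwt && jsonJwt.project_id) {
        return jsonJwt.project_id; // standard field in service-account key files
    }
    // Fallback: the project ID exposed to the Cloud Functions runtime (assumption).
    return process.env.GCLOUD_PROJECT;
}

// Usage: pass an explicit projectId so the Dataflow URL is never built with an empty segment, e.g.
// dataFlowList({ projectId: resolveProjectId(jsonJwt), auth: jwtClient, filter: "TERMINATED" })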

Related

Frontend authenticated requests to Google Cloud Storage

I am using a Google Cloud Storage bucket to upload some of my users' files. I do not want them to be public, so I created a service account representing my frontend app.
I want to know how to make authenticated requests to Google Cloud Storage from my frontend app without using the @google-cloud/storage npm package.
I know I need to include Authorization: Bearer <token> in my request headers, but how do I get this token?
I'm using React for my frontend app.
Google has a number of libraries that you can use. Here is one example:
var { google } = require('googleapis')
const request = require('request')

// The service account JSON key file to use to create the access token
let privatekey = require('/path/service-account.json')
let scopes = 'https://www.googleapis.com/auth/devstorage.read_only'

let jwtClient = new google.auth.JWT(
    privatekey.client_email,
    null,
    privatekey.private_key,
    scopes
)

jwtClient.authorize(function (err, _token) {
    if (err) {
        console.log(err)
        return err
    } else {
        console.log('token obj:', _token)
        console.log('access token:', _token.access_token)
        // Include the token in the headers of your requests:
        let headers = {
            "Authorization": "Bearer " + _token.access_token
        }
        // Make requests to Cloud Storage here
    }
})
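On the React side, a minimal sketch of using the resulting access token against the Cloud Storage JSON API with fetch; BUCKET and OBJECT are placeholders, and the token is assumed to be handed to the frontend by your backend (shipping the service-account key to the browser is not safe):

// Hypothetical helper for the frontend. accessToken corresponds to _token.access_token above.
function fetchProtectedObject(accessToken) {
    const url = 'https://storage.googleapis.com/storage/v1/b/BUCKET/o/' +
        encodeURIComponent('OBJECT') + '?alt=media'; // alt=media returns the object contents
    return fetch(url, {
        headers: { 'Authorization': 'Bearer ' + accessToken }
    }).then(resp => {
        if (!resp.ok) throw new Error('GCS request failed: ' + resp.status);
        return resp.blob();
    });
}

// Usage in a React component: turn the blob into an object URL for an <img> tag, e.g.
// fetchProtectedObject(token).then(blob => setImgSrc(URL.createObjectURL(blob)));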

How to retrieve Azure Key Vault in React JS

I have created some settings in Azure and I need to fetch some secret keys from there in React JS.
const KeyVault = require('azure-keyvault');
const msRestAzure = require('ms-rest-azure');

var KEY_VAULT_URI = "https://mydomain.com.vault.azure.net/";

msRestAzure.loginWithAppServiceMSI({
    resource: 'https://vault.azure.net',
    msiEndpoint: 'https://vault.azure.net',
    msiSecret: '69418689F1E342DD946CB82994CDA3CB',
    msiApiVersion: ''
}).then((credentials) => {
    const keyVaultClient = new KeyVault.KeyVaultClient(credentials);
    var data = keyVaultClient.getSecret(KEY_VAULT_URI, 'My_Secret_Key');
    console.log(data);
});
I'm getting the error net::ERR_NAME_NOT_RESOLVED; I think I'm missing something. Could anyone please suggest how to retrieve those secret keys from Azure in React JS?
Using the loginWithAppServiceMSI() method from ms-rest-azure will auto-detect that you're running on a Web App and get the token from the MSI endpoint, so you must host your code on an Azure Web App. Refer to this article for more details.
function getKeyVaultCredentials() {
    return msRestAzure.loginWithAppServiceMSI({ resource: 'https://vault.azure.net' });
}

function getKeyVaultSecret(credentials) {
    let keyVaultClient = new KeyVault.KeyVaultClient(credentials);
    return keyVaultClient.getSecret(KEY_VAULT_URI, 'secret', "");
}

getKeyVaultCredentials().then(
    getKeyVaultSecret
).then(function (secret) {
    console.log(`Your secret value is: ${secret.value}.`);
}).catch(function (err) {
    throw (err);
});
If you don't have to use Managed Service Identity (MSI), you can use msRestAzure.loginWithServicePrincipalSecret(clientId, secret, domain) to get the credentials.
function getKeyVaultCredentials() {
    return msRestAzure.loginWithServicePrincipalSecret(clientId, secret, domain);
}

Error when using Google Application Default Credentials on App Engine

I am trying to make a Node.js app (running Express on App Engine) authenticate with Google API (Server-to-Server) using the Google Application Default Credentials. The app is supposed to use the credentials to talk with Google Analytics, which I have set up by turning on the Analytics API in the Google Developers Console. This is the code I have implemented:
var google = require('googleapis')
var analytics = google.analytics('v3')

app.post('/getAnalyticsData', (req, res) => {
    google.auth.getApplicationDefault(function (err, authClient) {
        if (err) {
            /* Handle error */
        }
        if (authClient) {
            if (authClient.createScopedRequired && authClient.createScopedRequired()) {
                authClient = authClient.createScoped(['https://www.googleapis.com/auth/analytics.readonly'])
            }
            analytics.data.ga.get({
                'auth': authClient,
                'ids': 'ga:VIEW_ID',
                'metrics': 'ga:pageviews,ga:sessions',
                'start-date': '2017-01-01',
                'end-date': '2017-03-09'
            }, function (err, response) {
                if (err) {
                    console.log("Analytics error: ", err)
                }
                if (response) {
                    console.log("YAY! Analytics response: ", response)
                    /* Do something with the response */
                }
            })
        }
    })
})
But I am getting this error: A Forbidden error was returned while attempting to retrieve an access token for the Compute Engine built-in service account. This may be because the Compute Engine instance does not have the correct permission scopes specified. Insufficient Permission.
Any idea how to solve this and succeed with the authentication?
I had the same error when trying to use google-auth-library to connect to datastore and was unable to set the correct permissions for the default service account. I found an example in their samples folder that created an auth client using a key file. You can create your own service account with the right permissions and generate a key file on the service account admin page in the cloud console. Hope this helps.
const {auth} = require('google-auth-library');

async function getDnsInfo() {
    const client = await auth.getClient({
        keyFile: 'path/to/keyFile.json',
        scopes: 'https://www.googleapis.com/auth/cloud-platform',
    });
    const projectId = await auth.getProjectId();
    const url = `https://www.googleapis.com/dns/v1/projects/${projectId}`;
    const res = await client.request({url});
    console.log('DNS Info:');
    console.log(res.data);
}
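Usage is then a one-liner; getDnsInfo is the helper defined above:

// Run the helper and surface any auth or API errors.
getDnsInfo().catch(console.error);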

How to authenticate Google Cloud Functions for access to secured App Engine endpoints

Google Cloud Platform has introduced Identity-Aware Proxy (IAP) for protecting App Engine flexible environment instances from public access.
However, it is not entirely clear whether this can or should be used from Google Cloud Functions that access GAE-hosted API endpoints.
The documentation (with Python and Java examples) describes an IAP authentication workflow consisting of 1) generating a JWT, 2) creating an OpenID token, and 3) submitting requests to Google App Engine with an Authorization: Bearer TOKEN header.
This seems quite convoluted for Cloud Functions if authorisation has to happen each time a function is called.
Is there another way for Google Cloud Functions to access secured GAE endpoints?
If you want to make calls from GCF to an IAP-protected app, you should indeed be using ID tokens. There are no examples in Node.js, so I made one using this as a reference (the style may be wrong, since this is the first time I've touched Node.js). Unlike a regular JWT claims set, it should not contain scope and must have target_audience.
/**
 * Make IAP request
 *
 */
exports.CfToIAP = function CfToIAP (req, res) {
    var crypto = require('crypto'),
        request = require('request');

    var token_URL = "https://www.googleapis.com/oauth2/v4/token";

    // service account private key (copied from service_account.json)
    var key = "-----BEGIN PRIVATE KEY-----\nMIIEvQexsQ1DBNe12345GRwAZM=\n-----END PRIVATE KEY-----\n";

    // craft JWT
    var JWT_header = new Buffer(JSON.stringify({ alg: "RS256", typ: "JWT" })).toString('base64');

    // prepare claims set
    var iss = "12345@12345.iam.gserviceaccount.com"; // service account email address (copied from service_account.json)
    var aud = "https://www.googleapis.com/oauth2/v4/token";
    var iat = Math.floor(new Date().getTime() / 1000);
    var exp = iat + 120; // no need for a long lived token since it's not cached
    var target_audience = "12345.apps.googleusercontent.com"; // this is the IAP client ID, obtained by clicking the 3 dots -> Edit OAuth Client on the IAP configuration page
    var claims = {
        iss: iss,
        aud: aud,
        iat: iat,
        exp: exp,
        target_audience: target_audience
    };
    var JWT_claimset = new Buffer(JSON.stringify(claims)).toString('base64');

    // concatenate header and claims set
    var unsignedJWT = [JWT_header, JWT_claimset].join('.');

    // sign JWT
    var JWT_signature = crypto.createSign('RSA-SHA256').update(unsignedJWT).sign(key, 'base64');
    var signedJWT = [unsignedJWT, JWT_signature].join('.');

    // get id_token and make IAP request
    request.post({url: token_URL, form: {grant_type: 'urn:ietf:params:oauth:grant-type:jwt-bearer', assertion: signedJWT}}, function (err, res, body) {
        var data = JSON.parse(body);
        var bearer = ['Bearer', data.id_token].join(' ');
        var options = {
            url: 'https://1234.appspot.com/', // IAP protected GAE app
            headers: {
                'User-Agent': 'cf2IAP',
                'Authorization': bearer
            }
        };
        request(options, function (err, res, body) {
            console.log('error:', err);
        });
    });
    res.send('done');
};

/**
 * package.json
 *
 */
{
    "name": "IAP-test",
    "version": "0.0.1",
    "dependencies": {
        "request": ">=2.83"
    }
}
Update: bundling a service account key is not recommended, so a better option is to use the metadata server. For the sample below to work, the Google Identity and Access Management (IAM) API must be enabled and the App Engine default service account must have the Service Account Actor role (the default Editor role is not enough):
/**
 * Make request from CF to a GAE app behind IAP:
 *  1) get an access token from the metadata server.
 *  2) prepare a JWT and use the IAM API's projects.serviceAccounts.signBlob method to avoid bundling a service account key.
 *  3) 'exchange' the JWT for an ID token.
 *  4) make the request with the ID token.
 *
 */
exports.CfToIAP = function CfToIAP (req, res) {
    // imports and constants
    const request = require('request');
    const user_agent = '<user_agent_to_identify_your_CF_call>';
    const token_URL = "https://www.googleapis.com/oauth2/v4/token";
    const project_id = '<project_ID_where_CF_is_deployed>';
    const service_account = [project_id,
        '@appspot.gserviceaccount.com'].join(''); // app default service account for CF project
    const target_audience = '<IAP_client_ID>';
    const IAP_GAE_app = '<IAP_protected_GAE_app_URL>';

    // prepare request options and make metadata server access token request
    var meta_req_opts = {
        url: ['http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/',
            service_account,
            '/token'].join(''),
        headers: {
            'User-Agent': user_agent,
            'Metadata-Flavor': 'Google'
        }
    };
    request(meta_req_opts, function (err, res, body) {
        // get access token from response
        var meta_resp_data = JSON.parse(body);
        var access_token = meta_resp_data.access_token;

        // prepare JWT that is {Base64url encoded header}.{Base64url encoded claim set}.{Base64url encoded signature}
        // https://developers.google.com/identity/protocols/OAuth2ServiceAccount for more info
        var JWT_header = new Buffer(JSON.stringify({ alg: "RS256", typ: "JWT" })).toString('base64');
        var iat = Math.floor(new Date().getTime() / 1000);

        // prepare claims set and base64 encode it
        var claims = {
            iss: service_account,
            aud: token_URL,
            iat: iat,
            exp: iat + 60, // no need for a long lived token since it's not cached
            target_audience: target_audience
        };
        var JWT_claimset = new Buffer(JSON.stringify(claims)).toString('base64');

        // concatenate JWT header and claims set
        var to_sign = [JWT_header, JWT_claimset].join('.');

        // sign JWT using the IAM API's projects.serviceAccounts.signBlob method
        var signature_req_opts = {
            url: ['https://iam.googleapis.com/v1/projects/',
                project_id,
                '/serviceAccounts/',
                service_account,
                ':signBlob'].join(''),
            method: "POST",
            json: {
                "bytesToSign": new Buffer(to_sign).toString('base64')
            },
            headers: {
                'User-Agent': user_agent,
                'Authorization': ['Bearer', access_token].join(' ')
            }
        };
        request(signature_req_opts, function (err, res, body) {
            // get signature from response and form JWT
            var JWT_signature = body.signature;
            var JWT = [JWT_header, JWT_claimset, JWT_signature].join('.');

            // obtain ID token
            request.post({url: token_URL, form: {grant_type: 'urn:ietf:params:oauth:grant-type:jwt-bearer', assertion: JWT}}, function (err, res, body) {
                // use ID token to make a request to the IAP protected GAE app
                var ID_token_resp_data = JSON.parse(body);
                var ID_token = ID_token_resp_data.id_token;
                var IAP_req_opts = {
                    url: IAP_GAE_app,
                    headers: {
                        'User-Agent': user_agent,
                        'Authorization': ['Bearer', ID_token].join(' ')
                    }
                };
                request(IAP_req_opts, function (err, res, body) {
                    console.log('error:', err);
                });
            });
        });
    });
    res.send('done');
};
For anyone still looking at this in 2020 and beyond: Google has made this very easy.
Their docs have an example of how to authenticate to IAP that works great in Cloud Functions:
// const url = 'https://some.iap.url';
// const targetAudience = 'IAP_CLIENT_ID.apps.googleusercontent.com';

const {GoogleAuth} = require('google-auth-library');
const auth = new GoogleAuth();

async function request() {
    console.info(`request IAP ${url} with target audience ${targetAudience}`);
    const client = await auth.getIdTokenClient(targetAudience);
    const res = await client.request({url});
    console.info(res.data);
}
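The docs sample defines request() but never calls it; a minimal usage sketch, assuming the url and targetAudience constants at the top are filled in:

// Invoke the helper and log failures (for example, a 401 from IAP if the audience is wrong).
request().catch(err => {
    console.error(err.message);
    process.exitCode = 1;
});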
Python example:
from google.auth.transport.requests import Request as google_request
from google.oauth2 import id_token
open_id_connect_token = id_token.fetch_id_token(google_request(), client_id)
where client_id is a string. Navigate to the APIs & Services section of GCP, then select Credentials on the left; the client_id is listed in the OAuth 2.0 section for IAP.
When you want to make a request to an IAP-secured service, just add this to the headers:
{'Authorization': 'Bearer your_open_id_connect_token'}
source: https://cloud.google.com/iap/docs/authentication-howto
As discussed in this doc, you can authenticate to a Google Cloud Platform (GCP) API using:
1- Service Accounts (preferred method) - use of a Google account that is associated with your GCP project, as opposed to a specific user.
2- User Accounts - used when app needs to access resources on behalf of an end user.
3- API keys - generally used when calling APIs that don’t need to access private data.
Some people use the Google Cloud Key Management Service (KMS) to avoid hardcoding credentials in the cloud function:
https://cloud.google.com/kms/
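A minimal sketch of that approach, assuming the @google-cloud/kms client library, an existing key ring and key, and a credential that was encrypted ahead of time and stored as base64 ciphertext; all names are placeholders:

// Hypothetical example: decrypt a stored credential with Cloud KMS inside a Cloud Function.
const kms = require('@google-cloud/kms');
const client = new kms.KeyManagementServiceClient();

async function getDecryptedSecret(ciphertextBase64) {
    // Placeholders: replace with your project, location, key ring and key names.
    const name = client.cryptoKeyPath('my-project', 'global', 'my-keyring', 'my-key');
    const [result] = await client.decrypt({
        name: name,
        ciphertext: ciphertextBase64
    });
    // result.plaintext holds the decrypted bytes.
    return result.plaintext.toString('utf8');
}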

AWS. Storing and displaying the profile picture of users

I have a requirement to store and display my users' profile pictures, so I used the S3.upload function in my Node.js backend to store the image.
After the image is stored, I save the link in the database and fetch it using ng-source in my view. It worked, but the link expired after a few hours and stopped working. Below is the code for my upload. Is there a solution for this, or a better way to do it?
var body = fs.createReadStream(file.file.path);
// Upload the photo to AWS
s3.upload({Body: body}, function (err, data) {
    if (err) {
        res.sendStatus(500);
    }
    if (data) {
        // getSignedUrl and store it in the database
        var AWS = require('aws-sdk');
        var url = req.params.doctorId + "/Images/" + fileName;
        var s3 = new AWS.S3()
            , params = {Bucket: S3Bucket, Key: url};
        s3.getSignedUrl('getObject', params, function (err, url) {
            if (err || url == null) res.status(500).send({msg: "amazon s3 error"});
            else if (url) {
                if (req.body.picture == 1) {
                    User.findByIdAndUpdate(req.params.doctorId, {$set: {'FileName.profilePicture': url}},
                        function (err, doc) {
                            if (err)
                                res.sendStatus(500);
                            else
                                res.send({url: url});
                        });
                }
            }
        });
    }
});
This is because you're getting the URL from getSignedUrl, and signed URLs expire by design.
From Share an Object with Others on AWS docs:
All objects by default are private. Only the object owner has permission to access these objects. However, the object owner can optionally share objects with others by creating a pre-signed URL, using their own security credentials, to grant time-limited permission to download the objects.
Since it doesn't seem like you're storing "secret" resources here that access has to be explicitly granted to, the best approach is to store the image publicly. This is trivial to do: simply set the ACL to public-read when you call PutObject or upload. That way you'll know the URL for the object without actually having to retrieve it:
https://s3-[region].amazonaws.com/[bucket]/[file]
This is what your upload statement would look like then:
s3.upload({ Body: body, ACL: 'public-read' }, function (err, data) {
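With the object public, a small sketch of storing the permanent URL instead of a signed one; S3Bucket, body, fileName, User, req and res reuse the names from the question, and the rest is an assumption about the surrounding code:

// Hypothetical follow-up: save data.Location, the object's stable public URL.
var key = req.params.doctorId + "/Images/" + fileName;
s3.upload({ Bucket: S3Bucket, Key: key, Body: body, ACL: 'public-read' }, function (err, data) {
    if (err) return res.sendStatus(500);
    // data.Location has the form https://s3-[region].amazonaws.com/[bucket]/[key],
    // so no signed URL (and no expiry) is involved.
    User.findByIdAndUpdate(req.params.doctorId,
        { $set: { 'FileName.profilePicture': data.Location } },
        function (err) {
            if (err) return res.sendStatus(500);
            res.send({ url: data.Location });
        });
});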
