Google Cloud Storage - com.google.appengine.api.appidentity.AppIdentityServiceFailureException - google-app-engine

I am trying to store a file to Google Cloud Storage using a JAX-RS service running in Google App Engine. While trying to store the file I get the error below.
com.google.appengine.tools.cloudstorage.NonRetriableException: com.google.appengine.api.appidentity.AppIdentityServiceFailureException: The AppIdentity service threw an unexpected error. Details:
I am saving the file to a new bucket and granted the IDs below (the Compute Engine, App Engine, and service accounts) permissions on the bucket. I also created a separate service account and gave it the Writer permission on the bucket (and the Editor role on the project):
myaccount@myproject.iam.gserviceaccount.com
xxxx-compute@developer.gserviceaccount.com
xxxx@cloudservices.gserviceaccount.com
I understand the separate service account is not required, because from App Engine the default service account should be able to store the file. But just to try, I also created the service account above, stored its key file at WEB-INF/resources/service_account_credentials.json, and set the property below in appengine-web.xml:
<property name="GOOGLE_APPLICATION_CREDENTIALS" value="WEB-INF/resources/service_account_credentials.json"/>
I tried the two ways below to store the file, but both give the error.
First way
Getting the service:
GcsService gcsService = GcsServiceFactory.createGcsService(new RetryParams.Builder()
        .initialRetryDelayMillis(10)
        .retryMaxAttempts(10)
        .totalRetryPeriodMillis(15000)
        .build());
Storing the file:
GcsFileOptions gcsFileOptions = null;
GcsFileOptions.Builder builder = new GcsFileOptions.Builder();
if (aclEntityName != null) {
    builder = builder.acl(aclEntityName);
}
if (fileMetaData != null && !fileMetaData.isEmpty()) {
    for (Map.Entry<String, String> entry : fileMetaData.entrySet()) {
        builder = builder.addUserMetadata(entry.getKey(), entry.getValue());
    }
}
gcsFileOptions = builder.build();
GcsFilename fileName = new GcsFilename(bucketName, name);
GcsService gcsService = StorageFactory.getGcsService();
GcsOutputChannel outputChannel = gcsService.createOrReplace(fileName, gcsFileOptions);
copy(contentStream, Channels.newOutputStream(outputChannel));
The second way, given below, gives the same error.
Getting the service:
HttpTransport transport = GoogleNetHttpTransport.newTrustedTransport();
JsonFactory jsonFactory = new JacksonFactory();
GoogleCredential credential = GoogleCredential.getApplicationDefault(transport, jsonFactory);
if (credential.createScopedRequired()) {
    Collection<String> scopes = StorageScopes.all();
    credential = credential.createScoped(scopes);
}
return new Storage.Builder(transport, jsonFactory, credential)
        .setApplicationName("GCS Samples")
        .build();
Storing the file (second way):
StorageObject objectMetaData = new StorageObject();
objectMetaData.setName(name);
if (fileMetaData != null && !fileMetaData.isEmpty()) {
    objectMetaData.setMetadata(fileMetaData);
}
// Set the access control list to publicly read-only
if (aclEntityName != null && !aclEntityName.trim().equals("")
        && aclRole != null && !aclRole.trim().equals("")) {
    objectMetaData.setAcl(Arrays.asList(
            new ObjectAccessControl().setEntity(aclEntityName).setRole(aclRole)));
}
InputStreamContent mediaContent = new InputStreamContent("application/octet-stream", contentStream);
// Do the insert
Storage client = StorageFactory.getService();
Storage.Objects.Insert insertRequest = client.objects().insert(bucketName, objectMetaData, mediaContent);
if (mediaContent.getLength() > 0 && mediaContent.getLength() <= 2 * 1000 * 1000 /* 2MB */) {
    insertRequest.getMediaHttpUploader().setDirectUploadEnabled(true);
}
insertRequest.execute();
What am I doing wrong? Is there a setting I need to change to fix this error?

From my understanding of Google Application Default Credentials, the GOOGLE_APPLICATION_CREDENTIALS environment variable should point to a local file, and it is generally used with the gcloud command-line tool to authenticate when testing code locally.
That said, according to https://stackoverflow.com/a/36408645/374638, perhaps this should go in <env-variables> instead.
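For example, a minimal appengine-web.xml sketch of that suggestion (the file path is simply copied from the question and is an assumption, not something I have verified):

<appengine-web-app xmlns="http://appengine.google.com/ns/1.0">
  <env-variables>
    <env-var name="GOOGLE_APPLICATION_CREDENTIALS"
             value="WEB-INF/resources/service_account_credentials.json" />
  </env-variables>
</appengine-web-app>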

Related

Is there a way to validate Azure app credentials?

Given I have the following info from Azure app registration:
Application (client) ID,
Client secret,
Directory (tenant) ID,
Object ID
Is there a way to check that these are valid credentials programmatically (e.g., using curl, but not PowerShell)?
If you meant to check the client secret's validity, or even the properties of that app, then see whether the C# code below works for you. We can query the application and read the expiry date of the secret. Please grant the app the Directory.Read.All and Application.Read.All permissions for this API so it can use the client credentials flow.
// ActiveDirectoryClient below talks to the Azure AD Graph API
var graphResourceId = "https://graph.windows.net";
var applicationId = "";
var appObjectId = "";
var clientSecret = "";
var tenantId = "xxx.onmicrosoft.com";
var clientCredential = new ClientCredential(applicationId, clientSecret);
AuthenticationContext authContext = new AuthenticationContext($"https://login.microsoftonline.com/{tenantId}");
// Get an access token using the client credentials flow
var accessToken = authContext.AcquireTokenAsync(graphResourceId, clientCredential).Result.AccessToken;
Uri servicePointUri = new Uri(graphResourceId);
Uri serviceRoot = new Uri(servicePointUri, tenantId);
ActiveDirectoryClient activeDirectoryClient = new ActiveDirectoryClient(serviceRoot, async () => await Task.FromResult(accessToken));
// Look up the app registration by its object id and list its password credentials (client secrets)
var app = activeDirectoryClient.Applications.GetByObjectId(appObjectId).ExecuteAsync().Result;
foreach (var passwordCredential in app.PasswordCredentials)
{
    Console.WriteLine($"KeyID:{passwordCredential.KeyId}\r\nEndDate:{passwordCredential.EndDate}\r\n");
}
If you want, you can even request a token using curl as shown below and validate it using Postman or by checking the token at https://jwt.io.
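A minimal curl sketch against the v2.0 client credentials token endpoint (the tenant ID, client ID, and client secret are placeholders to replace with your own values):

curl -X POST "https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token" \
  -d "grant_type=client_credentials" \
  -d "client_id=<application-client-id>" \
  -d "client_secret=<client-secret>" \
  -d "scope=https://graph.microsoft.com/.default"

If the secret is wrong or expired, the endpoint returns an invalid_client error instead of an access token, which already tells you whether the credential is valid.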
Reference: check client secret expiry using C#

Blazor and Active Directory - is getting it working here different than in .NET Core 5 MVC? And is that different from getting it working with Azure AD?

I have a .NET Core 5 MVC application that I am rewriting in Blazor Server.
The code I have used in MVC to get it working includes the following:
public void GetADinfo(out string givenName, out string surname, out string homePhone, out string email)
{
    //===========================================================
    // Go and get AD info for the current user or equivalent
    var components = User.Identity.Name.Split('\\');
    var username = components.Last();

    // create LDAP connection object
    DirectoryEntry myLdapConnection = createDirectoryEntry();
    DirectorySearcher search = new DirectorySearcher(myLdapConnection);
    search.Filter = "(cn=" + username + ")";
    SearchResult result = search.FindOne();
    DirectoryEntry dsresult = result.GetDirectoryEntry();

    givenName = dsresult.Properties["givenName"][0].ToString();
    surname = dsresult.Properties["sn"][0].ToString();
    email = dsresult.Properties["mail"][0].ToString();
    homePhone = dsresult.Properties["homePhone"][0].ToString();
    //=============================================================================
}

public DirectoryEntry createDirectoryEntry()
{
    // create and return new LDAP connection with desired settings
    string ADconn = _context.ApplicConfs.Select(s => s.Ldapconn).FirstOrDefault();
    string LDAPConn = _context.ApplicConfs.Select(s => s.Ldappath).FirstOrDefault();
    //string ADconn;
    //ADconn = "SERVER.A3HR.local";
    //string LDAPConn;
    //LDAPConn = "LDAP://SERVER.A3HR.local";
    //DirectoryEntry ldapConnection = new DirectoryEntry("SERVER.A3HR.local");
    //ldapConnection.Path = "LDAP://OU=staffusers,DC=leeds-art,DC=ac,DC=uk";
    //ldapConnection.Path = "LDAP://SERVER.A3HR.local";
    DirectoryEntry ldapConnection = new DirectoryEntry(ADconn);
    ldapConnection.Path = LDAPConn;
    ldapConnection.AuthenticationType = AuthenticationTypes.Secure;
    return ldapConnection;
}
Does the code for using Active Directory in Blazor Server require anything different? In other words, is authentication different in Blazor compared to .NET Core MVC?
My app uses Blazor Server with Windows authentication, so I get the current user in the normal Blazor way. From that, I want to use the current user ID to look up the email and telephone number in AD and pre-populate them on a page if they exist, so that within the application the user doesn't have to re-enter this information every time.
Does anyone have an example of this using Blazor? Is the approach dramatically different between local AD and Azure AD in the logic/coding used? I see a few examples of Azure AD use in Blazor out there.
Thanks for any information provided.

Creating BigQuery table from Google Sheet using Java API - access denied

A Google Drive sheet has been created (from XLS) using the Drive API by an App Engine application with the default service account. The newly created document has been shared with individuals, and access to the file has been confirmed.
File file = driveService.files().create(fileMetadata, inputStreamContent)
        .setFields("id")
        .execute();
Logger.info("Created file: %s", file.getId());

BatchRequest batch = driveService.batch();
Permission userPermission = new Permission()
        .setType("user")
        .setRole("writer")
        .setEmailAddress("personal.email@gmail.com");
driveService.permissions().create(file.getId(), userPermission)
        .setFields("id")
        .execute();
Now I would like to create a BigQuery table from this Google Sheet. The Drive API is obviously already enabled for the previous step, and I have adjusted the BigQuery service to create credentials with the necessary scopes:
private static final List<String> SCOPES = asList(DriveScopes.DRIVE,
        DriveScopes.DRIVE_READONLY, SheetsScopes.SPREADSHEETS, AUTH, BIGQUERY);

GoogleCredentials googleCredentials = AppEngineCredentials.getApplicationDefault().createScoped(SCOPES);
BigQueryOptions options = BigQueryOptions.newBuilder().setCredentials(googleCredentials).build();
BigQuery bigQuery = options.getService();
But still no luck when I call the controller to ingest the sheet with this code:
ExternalTableDefinition tableDefinition = ExternalTableDefinition
        .of(String.format(GOOGLE_DRIVE_LOCATION_FORMAT, fileId), categoryMappingSchema(),
            GoogleSheetsOptions.newBuilder().setSkipLeadingRows(FIRST_ROW).build());
TableInfo tableInfo = TableInfo.newBuilder(tableId, tableDefinition).build();
Table table = bigQuery.create(tableInfo);
The error I'm getting suggests that the scope has not been provided to the credentials.
Access Denied: BigQuery BigQuery: No OAuth token with Google Drive scope was found.
Am I missing something?
I suspect there's a problem with ADC: when I initialize the credentials from the JSON key instead, it works as expected:
InputStream inputStream = new ChannelInputStream(inputChannel);
bqCredentials = GoogleCredentials
        .fromStream(inputStream)
        .createScoped(BQ_SCOPES);
This approach did not work:
GoogleCredentials googleCredentials = AppEngineCredentials.getApplicationDefault().createScoped(SCOPES);

Unable to make Flask-SQLAlchemy connect to my Google App Engine DB

I am trying to set up a Flask-based web application on Google App Engine (I'm new to both).
The web application receives data from the client, which should be processed and saved in a database.
I've tried to use Flask-SQLAlchemy, but I'm unable to set it up with Google Cloud SQL. I've used this guide to create a MySQL DB in the same project:
and then I'm trying to use it in my main Python code:
app.config('SQLALCHEMY_DATABASE_URI') = 'mysql+mysqldb://root@/Results?unix_socket=/cloudsql/crafty-circlet-164415:psy01'
app.config['SECRET_KEY'] = 'NglfxE8FOP9pgV8fxpyj'
db = SQLAlchemy(app)

class Result(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.Text)
    profession = db.Column(db.Text)
    year = db.Column(db.Text)
    pressure_level = db.Column(db.Integer)
    reported_suc_count = db.Column(db.Integer)
    marked_suc_count = db.Column(db.Integer)
    real_suc_count = db.Column(db.Integer)
    insertion_time = db.Column(db.DateTime)

    def __init__(self, name, profession, year, pressure_level, reported_suc_count, marked_suc_count, real_suc_count):
        self.name = name
        self.profession = profession
        self.year = year
        self.pressure_level = pressure_level
        self.reported_suc_count = reported_suc_count
        self.marked_suc_count = marked_suc_count
        self.real_suc_count = real_suc_count
        self.insertion_time = datetime.utcnow()

@app.route('/resultform', methods=['POST', 'GET'])
def resultform():
    if request.method == 'POST':
        if not request.form['successmatrices']:
            flash('please fill all the fields', 'error')
        else:
            if 'name' in request.form:
                name = request.form['name']
            else:
                name = None
            if 'profession' in request.form:
                profession = request.form['profession']
            else:
                profession = None
            if 'year' in request.form:
                year = request.form['year']
            else:
                year = None
            if 'pressure_level' in request.form:
                pressure_level = int(request.form['pressure_level'])
            else:
                pressure_level = None
            if 'successmatrices' in request.form:
                successmatrices = int(request.form['successmatrices'])
            else:
                successmatrices = 0
            new_result = Result(name=name, profession=profession, year=year, pressure_level=pressure_level, reported_suc_count=successmatrices, marked_suc_count=len(session['marked']), real_suc_count=len(session['correct']))
            db.session.add(new_result)
            db.session.commit()
            return redirect(url_for('showresults'))
    return render_template("resultform.html")

@app.route('/showresults')
def showresults():
    return render_template("showresults.html", results=Results.query.all())

if __name__ == '__main__':
    db.create_all()
    app.run(debug=True)
When I try to run it in my local development environment (I'm using PyCharm), I receive the following error in the background:
ERROR 2017-04-16 09:20:19,802 wsgi.py:263]
Traceback (most recent call last):
  File "C:\Users\<>\AppData\Local\Google\Cloud SDK\google-cloud-sdk\platform\google_appengine\google\appengine\runtime\wsgi.py", line 240, in Handle
    handler = _config_handle.add_wsgi_middleware(self._LoadHandler())
  File "C:\Users\<>\AppData\Local\Google\Cloud SDK\google-cloud-sdk\platform\google_appengine\google\appengine\runtime\wsgi.py", line 299, in _LoadHandler
    handler, path, err = LoadObject(self._handler)
  File "C:\Users\<>\AppData\Local\Google\Cloud SDK\google-cloud-sdk\platform\google_appengine\google\appengine\runtime\wsgi.py", line 85, in LoadObject
    obj = __import__(path[0])
  File "C:\Users\<>\PycharmProjects\crafty-circlet-164415\main.py", line 7
    app.config('SQLALCHEMY_DATABASE_URI') = 'mysql+mysqldb://root@/Results?unix_socket=/cloudsql/crafty-circlet-164415:psy01'
SyntaxError: can't assign to function call
And after deployment to GAE the following error appears:
Error: Server Error
The server encountered an error and could not complete your request.
Please try again in 30 seconds.
Any idea how to solve this?
app.config is a dictionary, so to add a config value you use [ ] instead of ( ), just as you did with app.config['SECRET_KEY'].
So it should be:
app.config['SQLALCHEMY_DATABASE_URI'] = SQLALCHEMY_DATABASE_URI
Some other pointers for a successful connection: you'll need to format your connection details properly.
USER = 'root'
PASSWORD = 'your-cloudsql-password'
DATABASE = 'your-cloudsql-database-name'
# connection_name is of the format `project:region:your-cloudsql-instance`
CONNECTION_NAME = 'your-cloudsql-connection-name'

SQLALCHEMY_DATABASE_URI = (
    'mysql+pymysql://{user}:{password}@localhost/{database}'
    '?unix_socket=/cloudsql/{connection_name}').format(
        user=USER, password=PASSWORD,
        database=DATABASE, connection_name=CONNECTION_NAME)

app.config['SQLALCHEMY_DATABASE_URI'] = SQLALCHEMY_DATABASE_URI
I'd probably separate my secrets and all other sensitive info into a config file that is not checked into source control, or use environment variables, etc.
If you want to test your application locally against your Cloud SQL instance, you'll need to install the Cloud SQL Proxy, add the connection name as an environment variable, and add the MySQLdb library to your app.yaml (see the sketch after the command below):
> cloud_sql_proxy -instances=your-connection-name=tcp:3306
Otherwise you can use a local MySQL instance for testing and switch to Cloud SQL when on App Engine.
More information on setting up Cloud SQL with App Engine can be found here.
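For reference, a minimal app.yaml sketch of those two additions, assuming the Python 2.7 standard environment (the connection name is a placeholder and the rest of your app.yaml is omitted):

env_variables:
  CLOUDSQL_CONNECTION_NAME: your-cloudsql-connection-name

libraries:
- name: MySQLdb
  version: "latest"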

How to authenticate programmatically to Google App Engine (with Java)?

I'm trying to authenticate to Google App Engine programmatically.
I've tried the code sample from the "gae-app-manager" project, but it fails:
tmp>java -jar net.sf.gae-app-manager-0.0.1-jar-with-dependencies.jar myaccount@gmail.com mypassword appname
Exception in thread "main" java.lang.Exception: Did not find ACSID cookie
    at net.sf.gaeappmanager.google.LogonHelper.loginToGoogleAppEngine(LogonHelper.java:85)
    at net.sf.gaeappmanager.google.appengine.Manager.retrieveAppQuotaDetails(Manager.java:34)
    at net.sf.gaeappmanager.google.appengine.Main.main(Main.java:55)
Any idea? I'm able to get the token, but there are no cookies.
The code (taken from the gae-app-manager project - http://gae-app-manager.git.sourceforge.net/git/gitweb.cgi?p=gae-app-manager/gae-app-manager;a=blob;f=src/main/java/net/sf/gaeappmanager/google/LogonHelper.java;h=8e09a6d7f864c29b10847ac7fd2eeab2d3e561e6;hb=HEAD):
List<NameValuePair> nvps = new ArrayList<NameValuePair>();
nvps.add(new BasicNameValuePair("accountType", "HOSTED_OR_GOOGLE"));
nvps.add(new BasicNameValuePair("Email", userid));
nvps.add(new BasicNameValuePair("Passwd", password));
nvps.add(new BasicNameValuePair("service", "ah"));
nvps.add(new BasicNameValuePair("source", source));

HttpPost post = new HttpPost("https://www.google.com/accounts/ClientLogin");
post.setEntity(new UrlEncodedFormEntity(nvps, HTTP.UTF_8));

HttpResponse response = client.execute(post);
if (response.getStatusLine().getStatusCode() != 200) {
    throw new Exception("Error obtaining ACSID");
}

String authToken = getAuthToken(response.getEntity().getContent());
post.abort();

HttpGet get = new HttpGet("https://appengine.google.com/_ah/login?auth=" + authToken);
response = client.execute(get);

for (Cookie cookie : client.getCookieStore().getCookies()) {
    if (cookie.getName().startsWith("ACSID")) {
        return cookie.getValue();
    }
}

get.abort();
throw new Exception("Did not find ACSID cookie");
Thanks,
Li
Have you considered using the OAuth support instead of trying to log in as a web client would? Every App Engine app can act as an OAuth provider with very little work required on the server side to set it up.
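On the App Engine side, a minimal sketch of what that check can look like with the built-in OAuth API (the scope string is the usual email scope; the surrounding servlet or JAX-RS handler is assumed):

// imports: com.google.appengine.api.oauth.*, com.google.appengine.api.users.User
OAuthService oauth = OAuthServiceFactory.getOAuthService();
try {
    // Returns the Google account whose OAuth token accompanied the request
    User user = oauth.getCurrentUser("https://www.googleapis.com/auth/userinfo.email");
    // ... check user.getEmail() against your own access rules ...
} catch (OAuthRequestException e) {
    // No valid OAuth credentials were supplied with the request
}

The client then sends an OAuth token with its requests instead of scraping the ClientLogin/ACSID cookie flow shown above.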
To solve the problem, use "SACSID" instead of "ACSID".
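Applied to the cookie loop in the question, the check becomes (the rest of LogonHelper stays the same; this is only the change the answer above describes):

for (Cookie cookie : client.getCookieStore().getCookies()) {
    // For sessions established over HTTPS, App Engine sets an SACSID cookie rather than ACSID
    if (cookie.getName().startsWith("SACSID")) {
        return cookie.getValue();
    }
}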
