Creating BigQuery table from Google Sheet using Java API - access denied - google-app-engine

A Google Drive sheet has been created (from XLS) using the Drive API, by an App Engine application running as the default service account. The newly created document has been shared with individuals, and access to the file has been confirmed.
File file = driveService.files().create(fileMetadata, inputStreamContent)
        .setFields("id")
        .execute();
Logger.info("Created file: %s", file.getId());

BatchRequest batch = driveService.batch();
Permission userPermission = new Permission()
        .setType("user")
        .setRole("writer")
        .setEmailAddress("personal.email@gmail.com");
driveService.permissions().create(file.getId(), userPermission)
        .setFields("id")
        .execute();
Now I would like to create a BigQuery table from this Google Sheet. The Drive API is already enabled from the previous step, and I have adjusted the BigQuery service to create credentials with the necessary scopes:
private static final List<String> SCOPES = asList(DriveScopes.DRIVE,
        DriveScopes.DRIVE_READONLY, SheetsScopes.SPREADSHEETS, AUTH, BIGQUERY);

GoogleCredentials googleCredentials = AppEngineCredentials.getApplicationDefault().createScoped(SCOPES);
BigQueryOptions options = BigQueryOptions.newBuilder().setCredentials(googleCredentials).build();
BigQuery bigQuery = options.getService();
But I still have no luck when I call the controller that ingests the sheet with this code:
ExternalTableDefinition tableDefinition = ExternalTableDefinition
        .of(String.format(GOOGLE_DRIVE_LOCATION_FORMAT, fileId), categoryMappingSchema(),
                GoogleSheetsOptions.newBuilder().setSkipLeadingRows(FIRST_ROW).build());
TableInfo tableInfo = TableInfo.newBuilder(tableId, tableDefinition).build();
Table table = bigQuery.create(tableInfo);
The error I'm getting suggests that the scope has not been provided to the credentials.
Access Denied: BigQuery BigQuery: No OAuth token with Google Drive scope was found.
Am I missing something?

I suspect there's a problem with ADC: when I initialize credentials from the JSON key, it works as expected:
InputStream inputStream = new ChannelInputStream(inputChannel);
bqCredentials = GoogleCredentials
        .fromStream(inputStream)
        .createScoped(BQ_SCOPES);
This approach did not work:
GoogleCredentials googleCredentials = AppEngineCredentials.getApplicationDefault().createScoped(SCOPES);
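For comparison, here is a minimal, self-contained sketch of the JSON-key approach that did work. The class and method names and the key path are illustrative, not from the original code; the important part is that both the BigQuery and Drive scopes are attached to the credentials:

import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import java.io.FileInputStream;
import java.util.Arrays;
import java.util.List;

public class BigQueryDriveAuth {

    // Both scopes are needed: BigQuery for the API call itself,
    // Drive so BigQuery can read the federated Sheet.
    private static final List<String> BQ_SCOPES = Arrays.asList(
            "https://www.googleapis.com/auth/bigquery",
            "https://www.googleapis.com/auth/drive");

    public static BigQuery fromJsonKey(String keyPath) throws Exception {
        GoogleCredentials credentials = GoogleCredentials
                .fromStream(new FileInputStream(keyPath)) // service account JSON key
                .createScoped(BQ_SCOPES);
        return BigQueryOptions.newBuilder()
                .setCredentials(credentials)
                .build()
                .getService();
    }
}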

Related

How to update NetSuite through Salesforce?

How would you go about updating NetSuite through Salesforce?
I know that you would use NetSuite's RESTlets and Salesforce Apex code to connect the two, but how would you actually go about doing this, step by step?
To send data from Salesforce to NetSuite (specifically customer/account data), you will need to do some preliminary setup in both systems.
In NetSuite:
Create a RESTlet script that has, at a bare minimum, a get and a post function.
For instance, I would create a JavaScript file on my desktop that contains:
/**
 * @NApiVersion 2.x
 * @NScriptType restlet
 */
//Use: Update NS customer with data (context) that is passed from SF
define(['N/record'], function(record) //use the record module
{
    function postData(context)
    {
        //load the customer I'm going to update
        var cust = record.load({type: context.recordtype, id: context.id});
        log.debug("postData", "loaded the customer with NSID: " + context.id);
        //set some body fields
        cust.setValue("companyname", context.name);
        cust.setValue("entityid", context.name + " (US LLC)");
        cust.setValue("custentity12", context.formerName);
        cust.setValue("phone", context.phone);
        cust.setValue("fax", context.fax);
        //remove all addresses
        while (cust.getLineCount('addressbook') != 0)
            cust.removeLine('addressbook', 0);
        //add default billing address
        cust.insertLine('addressbook', 0);
        cust.setSublistValue('addressbook', 'defaultbilling', 0, true);
        cust.setSublistValue('addressbook', 'label', 0, 'BILL_TO');
        var billingAddress = cust.getSublistSubrecord('addressbook', 'addressbookaddress', 0);
        billingAddress.setValue('country', context.billingCountry);
        billingAddress.setValue('addr1', context.billingStreet);
        billingAddress.setValue('city', context.billingCity);
        billingAddress.setValue('state', context.billingState);
        billingAddress.setValue('zip', context.billingZip);
        //add default shipping address
        cust.insertLine('addressbook', 0);
        cust.setSublistValue('addressbook', 'defaultshipping', 0, true);
        cust.setSublistValue('addressbook', 'label', 0, 'SHIP_TO');
        var shippingAddress = cust.getSublistSubrecord('addressbook', 'addressbookaddress', 0);
        shippingAddress.setValue('country', context.shippingCountry);
        shippingAddress.setValue('addr1', context.shippingStreet);
        shippingAddress.setValue('city', context.shippingCity);
        shippingAddress.setValue('state', context.shippingState);
        shippingAddress.setValue('zip', context.shippingZip);
        //save the record
        var NSID = cust.save();
        log.debug("postData", "saved the record with NSID: " + NSID);
        return NSID; //success, return the ID to SF
    }
    //get and post are both required, otherwise it doesn't work
    return {
        get: function() { return "get works"; },
        post: postData //this is where the sauce happens
    };
});
After you've saved this file, go to NetSuite > Customization > Scripting > Scripts > New.
Select the file you saved and create the script record. Your script record in NetSuite should have GET and POST checked under Scripts.
Next, click Deploy Script and choose who will call this script, specifically the user whose NetSuite login will be used on the Salesforce end.
On the deployment page you will need the external URL, which looks something like:
https://1234567.restlets.api.netsuite.com/app/site/hosting/restlet.nl?script=123&deploy=1
Note: if any data you are updating is critical for a process, I would highly recommend building this in the sandbox first for testing before moving to production.
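Before wiring up Salesforce, you can smoke-test the deployment from any HTTP client. Here is a hedged sketch in plain Java; the account number, script/deploy IDs, credentials, and body fields are placeholders carried over from the example above:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class RestletSmokeTest {
    public static void main(String[] args) throws Exception {
        // The external URL from the script deployment page (placeholder IDs)
        URL url = new URL("https://1234567.restlets.api.netsuite.com/app/site/hosting/restlet.nl?script=123&deploy=1");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        // NLAuth credentials must belong to a user on the script deployment
        conn.setRequestProperty("Authorization",
                "NLAuth nlauth_account=1234567, nlauth_email=login@login.com, nlauth_signature=password");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        String body = "{\"recordtype\":\"customer\",\"id\":\"42\",\"name\":\"Test Co\"}";
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("HTTP " + conn.getResponseCode()); // expect 200 on success
    }
}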
In Salesforce sandbox:
Click YourName > Developer Console.
In the Developer Console, click File > New and create an Apex class:
global class NetSuiteWebServiceCallout
{
    @future (callout=true) //allow RESTlet callouts to run asynchronously
    public static void UpdateNSCustomer(String body)
    {
        Http http = new Http();
        HttpRequest request = new HttpRequest();
        request.setEndpoint('https://1234567.restlets.api.netsuite.com/app/site/hosting/restlet.nl?script=123&deploy=1'); //external URL
        request.setMethod('POST');
        request.setHeader('Authorization', 'NLAuth nlauth_account=1234567, nlauth_email=login@login.com, nlauth_signature=password'); //NetSuite login; this person must be included in the NS RESTlet deployment
        request.setHeader('Content-Type', 'application/json');
        request.setBody(body);
        HttpResponse response = http.send(request);
        System.debug(response);
        System.debug(response.getBody());
    }
}
You will have to set the endpoint here to the external URL, set nlauth_account to your NetSuite account number, and set the email and password in the authorization header to those of a user who is on the deployment of the NS script. The body will be set in the trigger that calls this class.
Next, create the trigger that will call this class. I made this script run every time an account is updated in Salesforce.
trigger UpdateNSCustomer on Account (after update)
{
    for(Account a : Trigger.new)
    {
        String data = ''; //what to send to NS
        data = data + '{"recordtype":"customer","id":"'+a.Netsuite_Internal_ID__c+'","name":"'+a.Name+'","accountCode":"'+a.AccountCode__c+'",';
        data = data + '"formerName":"'+a.Former_Company_Names__c+'","phone":"'+a.Phone+'","fax":"'+a.Fax+'","billingStreet":"'+a.Billing_Street__c+'",';
        data = data + '"billingCity":"'+a.Billing_City__c+'","billingState":"'+a.Billing_State_Province__c+'","billingZip":"'+a.Billing_Zip_Postal_Code__c+'",';
        data = data + '"billingCountry":"'+a.Billing_Country__c+'","shippingStreet":"'+a.Shipping_Street__c+'","shippingCity":"'+a.Shipping_City__c+'",';
        data = data + '"shippingState":"'+a.Shipping_State_Province__c+'","shippingZip":"'+a.Shipping_Zip_Postal_Code__c+'","shippingCountry":"'+a.Shipping_Country__c+'"}';
        data = data.replaceAll('null','').replaceAll('\n',',').replace('\r','');
        System.debug(data);
        NetSuiteWebServiceCallout.UpdateNSCustomer(data); //call the RESTlet
    }
}
In this script, data is the body that you are sending to NetSuite.
Additionally, you will have to create an authorized endpoint for NetSuite in your remote site settings in Salesforce (sandbox and production). Go to Setup and quick-find Remote Site Settings, which is under Security Controls.
Create a new remote site whose remote site URL is set to the first half of your external URL: https://1234567.restlets.api.netsuite.com.
From here, do some testing in the sandbox.
If all looks well, deploy the class and trigger to Salesforce production.

Google Cloud Storage - com.google.appengine.api.appidentity.AppIdentityServiceFailureException

I am trying to store a file in Google Cloud Storage using a JAX-RS service running in Google App Engine. While trying to store the file, I get the error below.
com.google.appengine.tools.cloudstorage.NonRetriableException: com.google.appengine.api.appidentity.AppIdentityServiceFailureException: The AppIdentity service threw an unexpected error. Details:
I am saving the file to a new bucket, and I gave the IDs below (compute, App Engine, and service account) permissions on the bucket. I also created a separate service account and gave it the Writer permission as well (plus the Editor role on the project):
myaccount@myproject.iam.gserviceaccount.com
xxxx-compute@developer.gserviceaccount.com
xxxx@cloudservices.gserviceaccount.com
I understand the separate service account should not be required, because from App Engine with the default service account we should be able to store the file. But just to try, I also created the service account above, stored its key file at WEB-INF/resources/service_account_credentials.json, and set the property below in appengine-web.xml:
<property name="GOOGLE_APPLICATION_CREDENTIALS" value="WEB-INF/resources/service_account_credentials.json"/>
I tried the two ways below to store the file, but both give the error.
First way
Getting the service:
GcsService gcsService = GcsServiceFactory.createGcsService(new RetryParams.Builder()
        .initialRetryDelayMillis(10)
        .retryMaxAttempts(10)
        .totalRetryPeriodMillis(15000)
        .build());
Storing the file:
GcsFileOptions gcsFileOptions = null;
GcsFileOptions.Builder builder = new GcsFileOptions.Builder();
if (aclEntityName != null)
    builder = builder.acl(aclEntityName);
if (fileMetaData != null && !fileMetaData.isEmpty())
    for (Map.Entry<String, String> entry : fileMetaData.entrySet()) {
        builder = builder.addUserMetadata(entry.getKey(), entry.getValue());
    }
gcsFileOptions = builder.build();
GcsFilename fileName = new GcsFilename(bucketName, name);
GcsService gcsService = StorageFactory.getGcsService();
GcsOutputChannel outputChannel = gcsService.createOrReplace(fileName, gcsFileOptions);
copy(contentStream, Channels.newOutputStream(outputChannel));
The second way, given below, produces the same error.
Getting the service:
HttpTransport transport = GoogleNetHttpTransport.newTrustedTransport();
JsonFactory jsonFactory = new JacksonFactory();
GoogleCredential credential =
        GoogleCredential.getApplicationDefault(transport, jsonFactory);
if (credential.createScopedRequired()) {
    Collection<String> scopes = StorageScopes.all();
    credential = credential.createScoped(scopes);
}
return new Storage.Builder(transport, jsonFactory, credential)
        .setApplicationName("GCS Samples")
        .build();
Storing the file the second way:
StorageObject objectMetaData = new StorageObject();
objectMetaData.setName(name);
if (fileMetaData != null && !fileMetaData.isEmpty())
    objectMetaData.setMetadata(fileMetaData);
// Set the access control list to publicly read-only
if (aclEntityName != null && !aclEntityName.trim().equals("")
        && aclRole != null && !aclRole.trim().equals("")) {
    objectMetaData.setAcl(Arrays.asList(
            new ObjectAccessControl().setEntity(aclEntityName).setRole(aclRole)));
}
InputStreamContent mediaContent = new InputStreamContent("application/octet-stream", contentStream);
// Do the insert
Storage client = StorageFactory.getService();
Storage.Objects.Insert insertRequest = client.objects().insert(
        bucketName, objectMetaData, mediaContent);
if (mediaContent.getLength() > 0 && mediaContent.getLength() <= 2 * 1000 * 1000 /* 2MB */) {
    insertRequest.getMediaHttpUploader().setDirectUploadEnabled(true);
}
insertRequest.execute();
What am I doing wrong? Is there any setting I need to change to fix this error?
From my understanding of Google Application Default Credentials, the GOOGLE_APPLICATION_CREDENTIALS environment variable should point to a local file, and it is generally used with the gcloud command-line tool to authenticate when testing code locally.
That said, according to https://stackoverflow.com/a/36408645/374638, perhaps this should be in <env-variables> instead.
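For reference, a sketch of what that could look like in appengine-web.xml, assuming the advice in the linked answer applies; the key path is the one from the question:

<appengine-web-app xmlns="http://appspot.com/ns/1.0">
    ...
    <env-variables>
        <env-var name="GOOGLE_APPLICATION_CREDENTIALS"
                 value="WEB-INF/resources/service_account_credentials.json" />
    </env-variables>
</appengine-web-app>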

How to fix error: A project ID is required for this service but could not be determined

I'm trying to insert data into Google Datastore from App Engine and I'm getting an error:
java.lang.IllegalArgumentException: A project ID is required for this service but could not be determined from the builder or the environment. Please set a project ID using the builder.
at com.google.common.base.Preconditions.checkArgument(Preconditions.java:92)
at com.google.cloud.ServiceOptions.<init>(ServiceOptions.java:324)
at com.google.cloud.datastore.DatastoreOptions.<init>(DatastoreOptions.java:85)
at com.google.cloud.datastore.DatastoreOptions.<init>(DatastoreOptions.java:32)
at com.google.cloud.datastore.DatastoreOptions$Builder.build(DatastoreOptions.java:75)
at com.google.cloud.datastore.DatastoreOptions.defaultInstance(DatastoreOptions.java:123)
Here's my code:
Datastore datastore = DatastoreOptions.defaultInstance().service();
KeyFactory keyFactory = datastore.newKeyFactory().kind("keyKind");
Key key = keyFactory.newKey("keyName");
Entity entity = Entity.builder(key)
        .set("name", "John Doe")
        .set("age", 30)
        .set("access_time", DateTime.now())
        .build();
datastore.put(entity);
How do I fix this error?
In my case, the following code works:
BigQuery bigquery = BigQueryOptions.newBuilder().setProjectId("XXXXX")
        .setCredentials(
                ServiceAccountCredentials.fromStream(new FileInputStream("key.json"))
        ).build().getService();
I set the projectId in the builder.
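The same fix should apply to the Datastore client from the question. A hedged sketch against the newer google-cloud-datastore builder API (the project ID is a placeholder):

import com.google.cloud.datastore.Datastore;
import com.google.cloud.datastore.DatastoreOptions;

// Set the project ID explicitly on the builder instead of relying
// on the environment to supply it.
Datastore datastore = DatastoreOptions.newBuilder()
        .setProjectId("my-project-id") // placeholder
        .build()
        .getService();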
Are you running AE standard or AE Flexible?
Make sure you have the App Engine API jar on your classpath (the WEB-INF/lib directory).
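If the project is built with Maven, the dependency would look something like this (the version shown is illustrative; use a current 1.9.x release):

<dependency>
    <groupId>com.google.appengine</groupId>
    <artifactId>appengine-api-1.0-sdk</artifactId>
    <version>1.9.60</version>
</dependency>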
I could solve this by setting the GCLOUD_PROJECT environment variable to the project ID. If that does not work, try GCP_PROJECT. More info here:
https://cloud.google.com/functions/docs/configuring/env-var

GAE taskqueue access application storage

My GAE application is written in Python with webapp2. The application analyzes a user's online social network. Users log in and authorize my application, and the access token is stored for further crawling. I then use the task queue to launch a backend task, as the crawling process is time-consuming. However, when I access the datastore from the task to fetch the access token, I cannot get it. I wonder whether there is a way to access the frontend's data, rather than the temporary storage available to the task queue.
The handler that processes the HTTP request from the user:
class Callback(webapp2.RequestHandler):
    def get(self):
        global client
        global r
        code = self.request.get('code')
        try:
            client = APIClient(app_key=APP_KEY, app_secret=APP_SECRET, redirect_uri=CALLBACK_URL)
            r = client.request_access_token(code)
            access_token = r.access_token
            record = model.getAccessTokenByUid(r.uid)
            if record is None or r.access_token != record.accessToken:
                # logging.debug("access token stored")
                model.insertAccessToken(long(r.uid), access_token, r.expires_in, "uncrawled", datetime.datetime.now())  # data stored here
            session = self.request.environ['beaker.session']
            session['uid'] = long(r.uid)
            self.redirect(CLUSTER_PAGE % ("true"))
        except Exception, e:
            logging.error("callback:%s" % (str(e)))
            self.redirect(CLUSTER_PAGE % ("false"))
The handler that processes the task submitted to the task queue:
class CrawlWorker(webapp2.RequestHandler):
    def post(self):  # should run at most 1/s
        uid = self.request.get('uid')
        logging.debug("start crawling uid:%s in the backend" % (str(uid)))
        global client
        global client1
        global r
        tokenTuple = model.getAccessTokenByUid(uid)
        if tokenTuple is None:  # here I always get None
            logging.error("CounterWorker:oops, authorization token is missed.")
            return
The question is not clear (is it "can" or "can't"?), but if you want to access frontend data from the task queue, pass it as parameters when you enqueue the task.
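For example, a minimal sketch with the standard taskqueue API (the worker URL and parameter names are illustrative):

from google.appengine.api import taskqueue

# Enqueue the crawl task and hand the worker everything it needs as
# form parameters, instead of re-reading frontend state from a global.
taskqueue.add(url='/crawl',
              params={'uid': str(r.uid), 'access_token': access_token})

Note also that the datastore itself is shared between the frontend and task queue tasks. If getAccessTokenByUid works in the frontend but returns None in the worker, check the type of uid: self.request.get() returns a string, while the token in the code above was stored under long(r.uid).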

gaeutilities' session works on SDK, but not on Google app engine?

I encountered a strange problem: the gaeutilities session works on the GAE SDK, but not on the actual Google App Engine platform. The following are session creation and session existence checking in Python, respectively.
Session creation:
self.session = sessions.Session()
self.session.delete_item('account')
self.session.delete_item('accountKey')
...
query = db.Query(model.Member)
query = query.filter('account =', account)  # 'account' is the user account
results = query.fetch(limit=1)
if results:  # Account exists
    member = results[0]
    self.session['account'] = account
    self.session['accountKey'] = member.key()
...
Session existence checking:
self.session = sessions.Session()
if 'accountKey' in self.session:  # Session exists
    account = self.session['account']  # Could this be the problem?
    ...
The above program runs fine on the GAE SDK, but when I uploaded it to Google App Engine, it didn't work. What might be the problem?
I am not familiar with gaeutilities, but with self.session = sessions.Session() you create a new session, which will be empty, so your check if 'accountKey' in self.session will not work. There must be another way to get the existing session.
I have found another approach to solve the problem using gae-sessions. Check here.
