Nested looping inside a Groovy XML builder

I am trying to create an XML file using a Groovy script. There is a requirement to loop over two things, so that the resulting XML includes all the objects passed by the user.
Here is the code so far, with the first loop:
import groovy.xml.*
//map to loop
def workflows = [[ name: "A", file: "fileA" , objectName: "wf_A" , objectType: "workflow", sourceRepository: "DEV2"],
[ name: 'B' , file: 'fileB' , objectName: 'wf_B' , objectType: 'workflow', sourceRepository: 'DEV2']]
// def folderNameMap = [[ srcFolder: ["srcFolder1", "srcFolder2"], TgtFolder: ["TgtFolder1", "TgtFolder2"]],
// [srcFolder: ["srcFolder3"], TgtFolder: ["TgtFolder3"]]
// ]
def builder = new StreamingMarkupBuilder()
builder.encoding = 'UTF-8'
def xml = builder.bind {
    mkp.xmlDeclaration()
    'udm.DeploymentPackage'(version:'$BUILD_NUMBER', application: "informaticaApp") {
        deployables {
            workflows.each { item ->
                'powercenter.PowercenterXml'(name: item.name, file: item.file) {
                    scanPlaceholders { mkp.yield(true) }
                    sourceRepository { mkp.yield(item.sourceRepository) }
                    'folderNameMap' {
                        entry(key: "multifolder", "{{multifolderTST}}") // <- this is hard-coded and I want to remove it
                    }
                    'objectNames' {
                        value { mkp.yield(item.objectName) }
                    }
                    'objectTypes' {
                        value { mkp.yield(item.objectType) }
                    }
                }
            }
        }
        dependencyResolution { mkp.yield('LATEST') }
        undeployDependencies { mkp.yield(false) }
    }
}
println XmlUtil.serialize(xml)
The resultant XML is:
<?xml version="1.0" encoding="UTF-8"?><udm.DeploymentPackage version="$BUILD_NUMBER" application="informaticaApp">
  <deployables>
    <powercenter.PowercenterXml name="A" file="fileA">
      <scanPlaceholders>true</scanPlaceholders>
      <sourceRepository>DEV2</sourceRepository>
      <folderNameMap>
        <entry key="multifolder">{{multifolderTST}}</entry>
      </folderNameMap>
      <objectNames>
        <value>wf_A</value>
      </objectNames>
      <objectTypes>
        <value>workflow</value>
      </objectTypes>
    </powercenter.PowercenterXml>
    <powercenter.PowercenterXml name="B" file="fileB">
      <scanPlaceholders>true</scanPlaceholders>
      <sourceRepository>DEV2</sourceRepository>
      <folderNameMap>
        <entry key="multifolder">{{multifolderTST}}</entry>
      </folderNameMap>
      <objectNames>
        <value>wf_B</value>
      </objectNames>
      <objectTypes>
        <value>workflow</value>
      </objectTypes>
    </powercenter.PowercenterXml>
  </deployables>
  <dependencyResolution>LATEST</dependencyResolution>
  <undeployDependencies>false</undeployDependencies>
</udm.DeploymentPackage>
This achieves the looping for the map declared as 'workflows'. There is another entry in the XML that needs to be iterated; the section in the script is:
'folderNameMap' {
    entry(key: "multifolder", "{{multifolderTST}}") // <- this is hard-coded and I want to remove it
}
I need this section iterated as well, creating a new entry line in the resulting XML for each value supplied to the script. Like:
<folderNameMap>
  <entry key="multifolder">{{multifolderTST}}</entry>
  <entry key="multifolder2">{{multifolderTST2}}</entry>
  <entry key="multifolder3">{{multifolderTST3}}</entry>
</folderNameMap>
How can I define this second map so that the resultant XML looks like the following? (The folder map is a map, so there will be cases where only one srcFolder and TgtFolder are given, and cases where multiple srcFolders and TgtFolders are given.)
<?xml version="1.0" encoding="UTF-8"?><udm.DeploymentPackage version="$BUILD_NUMBER" application="informaticaApp">
  <deployables>
    <powercenter.PowercenterXml name="A" file="fileA">
      <scanPlaceholders>true</scanPlaceholders>
      <sourceRepository>DEV2</sourceRepository>
      <folderNameMap>
        <entry key="multifolder">{{multifolderTST}}</entry>
      </folderNameMap>
      <objectNames>
        <value>wf_A</value>
      </objectNames>
      <objectTypes>
        <value>workflow</value>
      </objectTypes>
    </powercenter.PowercenterXml>
    <powercenter.PowercenterXml name="B" file="fileB">
      <scanPlaceholders>true</scanPlaceholders>
      <sourceRepository>DEV2</sourceRepository>
      <folderNameMap>
        <entry key="multifolder1">{{multifolderTST1}}</entry>
        <entry key="multifolder2">{{multifolderTST2}}</entry>
        <entry key="multifolder3">{{multifolderTST3}}</entry>
      </folderNameMap>
      <objectNames>
        <value>wf_B</value>
      </objectNames>
      <objectTypes>
        <value>workflow</value>
      </objectTypes>
    </powercenter.PowercenterXml>
  </deployables>
  <dependencyResolution>LATEST</dependencyResolution>
  <undeployDependencies>false</undeployDependencies>
</udm.DeploymentPackage>

So, I'm taking a stab in the dark here (as I'm not 100% sure I know what your question is), but assuming your input list can be changed to:
def workflows = [
    [ name: 'A',
      file: 'fileA',
      objectName: 'wf_A',
      objectType: 'workflow',
      sourceRepository: 'DEV2',
      folderNames: [ multifolder: '{{multifolderTST}}',
                     multifolder2: '{{multifolderTST2}}' ]],
    [ name: 'B',
      file: 'fileB',
      objectName: 'wf_B',
      objectType: 'workflow',
      sourceRepository: 'DEV2',
      folderNames: [ multifolder3: '{{multifolderTST3}}',
                     multifolder4: '{{multifolderTST4}}' ]]
]
Then, you can just do:
def builder = new StreamingMarkupBuilder()
builder.encoding = 'UTF-8'
def xml = builder.bind {
    mkp.xmlDeclaration()
    'udm.DeploymentPackage'(version:'$BUILD_NUMBER', application: "informaticaApp") {
        deployables {
            workflows.each { item ->
                'powercenter.PowercenterXml'(name: item.name, file: item.file) {
                    scanPlaceholders(true)
                    sourceRepository(item.sourceRepository)
                    folderNameMap {
                        item.folderNames.each { name, value ->
                            entry(key: name, value)
                        }
                    }
                    objectNames {
                        value(item.objectName)
                    }
                    objectTypes {
                        value(item.objectType)
                    }
                }
            }
        }
        dependencyResolution('LATEST')
        undeployDependencies(false)
    }
}
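For readers outside Groovy, the same shape — an outer loop over the workflows plus an inner loop over each workflow's folder map — can be sketched with Python's standard xml.etree module. This is only an illustration of the nesting idea; the element and key names simply mirror the Groovy example above:

```python
import xml.etree.ElementTree as ET

# Same data shape as the Groovy 'workflows' list: each workflow carries
# its own folderNames map (illustrative values, mirroring the answer).
workflows = [
    {"name": "A", "file": "fileA",
     "folderNames": {"multifolder": "{{multifolderTST}}",
                     "multifolder2": "{{multifolderTST2}}"}},
    {"name": "B", "file": "fileB",
     "folderNames": {"multifolder3": "{{multifolderTST3}}"}},
]

root = ET.Element("deployables")
for item in workflows:                       # outer loop: one element per workflow
    wf = ET.SubElement(root, "powercenter.PowercenterXml",
                       name=item["name"], file=item["file"])
    fmap = ET.SubElement(wf, "folderNameMap")
    for key, value in item["folderNames"].items():   # inner loop: one <entry> per map pair
        entry = ET.SubElement(fmap, "entry", key=key)
        entry.text = value

xml = ET.tostring(root, encoding="unicode")
```

The inner `items()` loop is the direct analogue of `item.folderNames.each { name, value -> entry(key:name, value) }` in the accepted answer.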


How to nest parameters in Postman

I am trying to configure an API call with an array in Postman, but I'm unable to figure out how to properly configure my parameters so that the Items and Properties appear as they do in the sample below. I can't quite figure out how to nest Items > Item > Properties > Property.
Here's a screenshot of what the request I need to send should look like
I feel like I might be getting close, but it's still wrong.
Here's a screenshot of my last attempt
It is possible using Postman variables and hierarchical JSON access.
Steps Overview
I will show a simple demo first, then apply the same idea to your case.
Simple Demo
Demo - the Add operation of a public calculator SOAP service
SOAP service URL
http://www.dneonline.com/calculator.asmx
Using the variables input_a and input_b for the addition
1. Assign Variable
JSON input
var jsonData = {
    "key_a" : {
        "data" : "2"
    },
    "key_b" : {
        "data" : "3"
    }
};
var input_a = jsonData["key_a"]["data"]; // "2"
var input_b = jsonData["key_b"]["data"]; // "3"
postman.setEnvironmentVariable("input_a", input_a);
postman.setEnvironmentVariable("input_b", input_b);
In Pre-request Script section in Postman
2. Apply XML
<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <Add xmlns="http://tempuri.org/">
      <intA>{{input_a}}</intA>
      <intB>{{input_b}}</intB>
    </Add>
  </soap:Body>
</soap:Envelope>
In the request Body section in Postman
3. SOAP/POST Call
The response appears in the Body section of the Postman output
4. XML Parsing
In Tests section in Postman
var xmlTree = xml2Json(responseBody);
var text = xmlTree["soap:Envelope"]["soap:Body"]["AddResponse"]["AddResult"];
postman.setEnvironmentVariable("output_result", Number(text));
console.log(postman.getEnvironmentVariable("output_result"));
5. Result
output_result = input_a + input_b
5 = 2 + 3
Complex Demo - your question
Apply the same idea (as in the simple demo) to your case.
1. Assign Variable at Pre-request Script
var jsonData = {
    "username" : "tom",
    "password" : "1234",
    "subscriptionId" : "sub-125abdv1",
    "items" : [
        {
            "item" : {
                "id" : "CN #00168927",
                "properties" : [
                    {
                        "property" : {
                            "name" : "ssn",
                            "value" : "555-01-0001"
                        }
                    },
                    {
                        "property" : {
                            "name" : "lastName",
                            "value" : "James"
                        }
                    }
                ]
            }
        },
        {
            "item" : {
                "id" : "CN #00927119",
                "properties" : [
                    {
                        "property" : {
                            "name" : "firstName",
                            "value" : "Jack"
                        }
                    },
                    {
                        "property" : {
                            "name" : "lastName",
                            "value" : "Doyle"
                        }
                    },
                    {
                        "property" : {
                            "name" : "ssn",
                            "value" : "555-01-0002"
                        }
                    }
                ]
            }
        }
    ]
};
console.log("username = " + jsonData["username"]);
console.log("password = " + jsonData["password"]);
console.log("subscriptionId = " + jsonData["subscriptionId"]);
postman.setEnvironmentVariable("username", jsonData["username"]);
postman.setEnvironmentVariable("password", jsonData["password"]);
postman.setEnvironmentVariable("subscriptionId", jsonData["subscriptionId"]);
console.log("items[0] id = " + jsonData["items"][0]["item"]["id"]);
postman.setEnvironmentVariable("item_zero_id", jsonData["items"][0]["item"]["id"]);
console.log("items[0] properties[0] name = " + jsonData["items"][0]["item"]["properties"][0]["property"]["name"]);
postman.setEnvironmentVariable("item_zero_property_zero_name", jsonData["items"][0]["item"]["properties"][0]["property"]["name"]);
console.log("items[0] properties[0] value = " + jsonData["items"][0]["item"]["properties"][0]["property"]["value"]);
postman.setEnvironmentVariable("item_zero_property_zero_value", jsonData["items"][0]["item"]["properties"][0]["property"]["value"]);
console.log("items[0] properties[1] name = " + jsonData["items"][0]["item"]["properties"][1]["property"]["name"]);
postman.setEnvironmentVariable("item_zero_property_one_name", jsonData["items"][0]["item"]["properties"][1]["property"]["name"]);
console.log("items[0] properties[1] value = " + jsonData["items"][0]["item"]["properties"][1]["property"]["value"]);
postman.setEnvironmentVariable("item_zero_property_one_value", jsonData["items"][0]["item"]["properties"][1]["property"]["value"]);
console.log("items[1] id = " + jsonData["items"][1]["item"]["id"]);
postman.setEnvironmentVariable("item_one_id", jsonData["items"][1]["item"]["id"]);
console.log("items[1] properties[0] name = " + jsonData["items"][1]["item"]["properties"][0]["property"]["name"]);
postman.setEnvironmentVariable("item_one_property_zero_name", jsonData["items"][1]["item"]["properties"][0]["property"]["name"]);
console.log("items[1] properties[0] value = " + jsonData["items"][1]["item"]["properties"][0]["property"]["value"]);
postman.setEnvironmentVariable("item_one_property_zero_value", jsonData["items"][1]["item"]["properties"][0]["property"]["value"]);
console.log("items[1] properties[1] name = " + jsonData["items"][1]["item"]["properties"][1]["property"]["name"]);
postman.setEnvironmentVariable("item_one_property_one_name", jsonData["items"][1]["item"]["properties"][1]["property"]["name"]);
console.log("items[1] properties[1] value = " + jsonData["items"][1]["item"]["properties"][1]["property"]["value"]);
postman.setEnvironmentVariable("item_one_property_one_value", jsonData["items"][1]["item"]["properties"][1]["property"]["value"]);
console.log("items[1] properties[2] name = " + jsonData["items"][1]["item"]["properties"][2]["property"]["name"]);
postman.setEnvironmentVariable("item_one_property_two_name", jsonData["items"][1]["item"]["properties"][2]["property"]["name"]);
console.log("items[1] properties[2] value = " + jsonData["items"][1]["item"]["properties"][2]["property"]["value"]);
postman.setEnvironmentVariable("item_one_property_two_value", jsonData["items"][1]["item"]["properties"][2]["property"]["value"]);
2. Apply XML
<?xml version="1.0"?>
<AddPortfolioItemsRequest xmlns="http://www.bkwservice.com/api/2022-08-01">
  <username>{{username}}</username>
  <password>{{password}}</password>
  <subscriptionId>{{subscriptionId}}</subscriptionId>
  <items>
    <item>
      <id>{{item_zero_id}}</id>
      <properties>
        <property>
          <name>{{item_zero_property_zero_name}}</name>
          <value>{{item_zero_property_zero_value}}</value>
        </property>
        <property>
          <name>{{item_zero_property_one_name}}</name>
          <value>{{item_zero_property_one_value}}</value>
        </property>
      </properties>
    </item>
    <item>
      <id>{{item_one_id}}</id>
      <properties>
        <property>
          <name>{{item_one_property_zero_name}}</name>
          <value>{{item_one_property_zero_value}}</value>
        </property>
        <property>
          <name>{{item_one_property_one_name}}</name>
          <value>{{item_one_property_one_value}}</value>
        </property>
        <property>
          <name>{{item_one_property_two_name}}</name>
          <value>{{item_one_property_two_value}}</value>
        </property>
      </properties>
    </item>
  </items>
</AddPortfolioItemsRequest>

Logstash not reading multiple files

My Logstash configuration (ELK 5.6.8) seems to read only one log file.
My logstash.conf in /home/elasticsearch/confLogs is:
input {
  file {
    type => "static"
    path => "/home/elasticsearch/static_logs/**/*Web.log*"
    exclude => "*.zip"
    start_position => beginning
    sincedb_path => "/dev/null"
  }
}
filter {
  if [type] == "static" {
    if [message] !~ /(.+)/ {
      drop { }
    }
    grok {
      patterns_dir => "./patterns"
      overwrite => [ "message" ]
      # 2017-08-07 11:47:35,466 INFO [http-bio-10.60.2.19-10267-exec-60] jsch.DeployManagerFileUSImpl (DeployManagerFileUSImpl.java:155) - Deconnexion de l'hote qvizzza3
      # 2017-08-07 11:47:51,775 ERROR [http-bio-10.60.2.19-10267-exec-54] service.BindingsRSImpl (BindingsRSImpl.java:143) - Can't find bindings file deployed on server
      # 2017-08-03 16:01:11,352 WARN [Thread-552] pcf2.AbstractObjetMQDAO (AbstractObjetMQDAO.java:137) - Descripteur de
      match => [ "message", "%{TIMESTAMP_ISO8601:logdate},%{INT} %{LOGLEVEL:logLevel} \[(?<threadname>[^\]]+)\] %{JAVACLASS:package} \(%{JAVAFILE:className}:%{INT:line}\) - %{GREEDYDATA:message}" ]
    }
    # 2017-08-03 16:01:11,352
    date {
      match => [ "logdate", "YYYY-MM-dd hh:mm:ss" ]
      target => "logdate"
    }
  }
}
output {
  elasticsearch { hosts => ["192.168.99.100:9200"] }
}
My logs directory, with load-balanced logrotate files:
static_logs
--prd1
----mlog Web.log
----mlog Web.log.1
----mlog Web.log.2
--prd2
----mlog Web.log
----mlog Web.log.2
Where is my mistake?
My patterns are in /home/elasticsearch/confLogs/patterns/grok-patterns, which includes TIMESTAMP_ISO8601.
Regards
Also: if my log files are larger than 140 MB, the logdate field is not treated as a date field but as a string field!
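As a side note for readers unfamiliar with grok, the match pattern in the config above corresponds roughly to the following Python regular expression, shown here against one of the commented sample lines (a hand-approximated sketch — TIMESTAMP_ISO8601, JAVACLASS etc. are simplified, and this is not Logstash itself):

```python
import re

# Rough Python equivalent of the grok pattern in the question.
LOG_RE = re.compile(
    r"(?P<logdate>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}),(?P<ms>\d+) "
    r"(?P<logLevel>[A-Z]+) \[(?P<threadname>[^\]]+)\] "
    r"(?P<package>[\w.]+) \((?P<className>[\w.]+):(?P<line>\d+)\) "
    r"- (?P<message>.*)"
)

# One of the sample lines from the question's comments.
sample = ("2017-08-07 11:47:35,466 INFO [http-bio-10.60.2.19-10267-exec-60] "
          "jsch.DeployManagerFileUSImpl (DeployManagerFileUSImpl.java:155) "
          "- Deconnexion de l'hote qvizzza3")
match = LOG_RE.match(sample)
```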

Is there any REST service available in Salesforce to Convert Leads into Accounts?

We have to convert Leads to Accounts via REST/OAuth calls. We are able to create, update (edit) and retrieve Lead fields, but not able to convert them.
We found that the same is possible via the SOAP API, but we are using REST with OAuth only.
Yes, and we resolved this by creating an Apex class for the REST call. Sample code is this:
@RestResource(urlMapping='/Lead/*')
global with sharing class RestLeadConvert {
    @HttpGet
    global static String doGet() {
        String ret = 'fail';
        RestRequest req = RestContext.request;
        RestResponse res = RestContext.response;
        String leadId = req.requestURI.substring(req.requestURI.lastIndexOf('/') + 1);
        Database.LeadConvert lc = new Database.LeadConvert();
        lc.setLeadId(leadId);
        LeadStatus convertStatus = [SELECT Id, MasterLabel FROM LeadStatus WHERE IsConverted = true LIMIT 1];
        lc.setConvertedStatus(convertStatus.MasterLabel);
        Database.LeadConvertResult lcr;
        try {
            lcr = Database.convertLead(lc);
            System.debug('*****lcr.isSuccess()' + lcr.isSuccess());
            ret = 'ok';
        }
        catch (Exception ex) {
            System.debug('***NOT CONVERTED**');
        }
        return ret;
    }
}
And you can use this call by
<Your Instance URL>/services/apexrest/Lead/<LeadId>
This test will give you around 93% of coverage.
@isTest
public class RestLeadConvertTest {
    static testMethod void testHttpGet() {
        Lead l = new Lead();
        l.FirstName = 'First';
        l.LastName = 'Last';
        l.Company = 'Unit Test';
        insert l;
        Test.startTest();
        RestRequest req = new RestRequest();
        RestResponse res = new RestResponse();
        req.requestURI = '/Lead/' + l.Id;
        req.httpMethod = 'GET';
        RestContext.request = req;
        RestContext.response = res;
        RestLeadConvert.doGet();
        Test.stopTest();
    }
}
You can construct a one-off SOAP request to convert a lead and use the same OAuth token that you already have for the REST API.
The request body should look like:
<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns:ens="urn:sobject.partner.soap.sforce.com" xmlns:fns="urn:fault.partner.soap.sforce.com" xmlns:tns="urn:partner.soap.sforce.com" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <soap:Header>
    <tns:SessionHeader>
      <sessionId>YOUR_OAUTH_TOKEN</sessionId>
    </tns:SessionHeader>
  </soap:Header>
  <soap:Body>
    <tns:convertLead>
      <tns:leadConverts>
        <tns:leadId>YOUR_LEAD_ID</tns:leadId>
        <tns:convertedStatus>Closed - Converted</tns:convertedStatus>
      </tns:leadConverts>
    </tns:convertLead>
  </soap:Body>
</soap:Envelope>
curl -H 'SOAPAction: null' -H 'Content-Type: text/xml' --data BODY_FROM_ABOVE https://your-instance-url/services/Soap/u/52.0
Note that the SOAPAction header is required, even though Salesforce does not use it.
The result will be returned as XML similar to:
<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns="urn:partner.soap.sforce.com" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <soapenv:Header>
    <LimitInfoHeader>
      <limitInfo>
        <current>91</current>
        <limit>15000</limit>
        <type>API REQUESTS</type>
      </limitInfo>
    </LimitInfoHeader>
  </soapenv:Header>
  <soapenv:Body>
    <convertLeadResponse>
      <result>
        <accountId>0015x00002C95kMAAR</accountId>
        <contactId>0035x00003NjdeyAAB</contactId>
        <leadId>00Q5x00001tHg1tEAC</leadId>
        <opportunityId>0065x000025fDsWAAU</opportunityId>
        <success>true</success>
      </result>
    </convertLeadResponse>
  </soapenv:Body>
</soapenv:Envelope>
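For completeness, the one-off request can be scripted; here is a small Python sketch that builds the same envelope from the token and lead id. The URL, token and lead id are placeholders from the examples above, and the actual network call is left commented out since it needs real credentials:

```python
# Build the convertLead SOAP envelope from an OAuth token and a lead id.
# Sending it is then a plain HTTP POST with the two headers noted above.
ENVELOPE = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:tns="urn:partner.soap.sforce.com">
  <soap:Header>
    <tns:SessionHeader><sessionId>{token}</sessionId></tns:SessionHeader>
  </soap:Header>
  <soap:Body>
    <tns:convertLead>
      <tns:leadConverts>
        <tns:leadId>{lead_id}</tns:leadId>
        <tns:convertedStatus>{status}</tns:convertedStatus>
      </tns:leadConverts>
    </tns:convertLead>
  </soap:Body>
</soap:Envelope>"""

def build_convert_lead(token, lead_id, status="Closed - Converted"):
    return ENVELOPE.format(token=token, lead_id=lead_id, status=status)

body = build_convert_lead("YOUR_OAUTH_TOKEN", "00Q5x00001tHg1tEAC")
# To send it (placeholder instance URL):
# import urllib.request
# req = urllib.request.Request(
#     "https://your-instance-url/services/Soap/u/52.0", data=body.encode(),
#     headers={"Content-Type": "text/xml", "SOAPAction": "null"})
# urllib.request.urlopen(req)
```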
If you are more comfortable with JSON than XML, OneGraph provides a GraphQL API that wraps the convertLead functionality.
It's best to create your own OneGraph app to get a custom app_id, but one is provided here for demonstration purposes.
The GraphQL query will be:
mutation ConvertLead {
  salesforce(
    auths: {
      salesforceOAuth: {
        instanceUrl: "YOUR_INSTANCE_URL"
        token: "YOUR_OAUTH_TOKEN"
      }
    }
  ) {
    convertLead(
      input: { leadConverts: [{ leadId: "YOUR_LEAD_ID" }] }
    ) {
      leadConverts {
        lead {
          id
          name
        }
        account {
          name
          id
        }
        contact {
          name
          id
        }
        opportunity {
          name
          id
        }
        success
        errors {
          message
          statusCode
        }
      }
    }
  }
}
Then the request will look like:
curl -H 'Content-Type: application/json' 'https://serve.onegraph.com/graphql?app_id=4687c59d-8f9c-494a-ab67-896fd706cee9' --data '{"query": "QUERY_FROM_ABOVE"}'
The result will be returned as JSON that looks like:
{
  "data": {
    "salesforce": {
      "convertLead": {
        "leadConverts": [
          {
            "lead": {
              "id": "00Q5x00001tHg1tEAC",
              "name": "Daniel Boone"
            },
            "account": {
              "name": "Company",
              "id": "0015x00002C95kMAAR"
            },
            "contact": {
              "name": "Daniel Boone",
              "id": "0035x00003NjdeyAAB"
            },
            "opportunity": {
              "name": "New Opportunity",
              "id": "0065x000025fDsWAAU"
            },
            "relatedPersonAccountId": null,
            "success": true,
            "errors": []
          }
        ]
      }
    }
  }
}

Arrays of hashes sent to SOAP::Data

I'm sending a fairly simple Perl hash to SOAP::Data, but I'm not getting the XML that I want with an array of hashes. Here's what I'm sending it:
'hash' => {
    'Location' => [
        {
            'key1' => 'value1',
            'key2' => 'value2',
            'key3' => 'value3',
        },
        {
            'key1' => 'value1',
            'key2' => 'value2',
            'key3' => 'value3',
        },
    ],
}
Here's what I get:
<hash>
  <Location>
    <c-gensym9>
      <key1>value1</key1>
      <key2>value2</key2>
      <key3>value3</key3>
    </c-gensym9>
    <c-gensym10>
      <key1>value1</key1>
      <key2>value2</key2>
      <key3>value3</key3>
    </c-gensym10>
  </Location>
</hash>
But what I want is this:
<hash>
  <Location>
    <key1>value1</key1>
    <key2>value2</key2>
    <key3>value3</key3>
  </Location>
  <Location>
    <key1>value1</key1>
    <key2>value2</key2>
    <key3>value3</key3>
  </Location>
</hash>
What am I missing? I suppose it'd help if I gave some code:
my $hash = {};
my @Locations;
my @loc_codes = qw(0_4_10 0_51_117);
foreach my $l ( @loc_codes ) {
    my @arr = split('_', $l);
    my $loc = {};
    $loc->{key1} = $arr[0]; # country
    $loc->{key2} = $arr[1]; # state
    $loc->{key3} = $arr[2]; # city
    push( @Locations, $loc );
}
$hash->{Location} = \@Locations;
my $soap_elements = SOAP::Data->value(
    SOAP::Data->name( 'some_method' => $hash )->prefix('p1')
)->prefix('p2');
You need strict, first of all
use strict;
use SOAP::Lite +trace => 'all';

my @loc_codes = qw(0_4_10 0_51_117);
my @Locations;
foreach my $l ( @loc_codes ) {
    my @arr = split('_', $l);
    my $loc =
        SOAP::Data
            ->name("Location" => \SOAP::Data->value(
                SOAP::Data->name('key1', $arr[0]),
                SOAP::Data->name('key2', $arr[1]),
                SOAP::Data->name('key3', $arr[2])));
    push( @Locations, $loc );
}
my $soap_elements;
$soap_elements = SOAP::Data->value(
    SOAP::Data->name( hash => \@Locations ));
my $serializer = SOAP::Serializer->new();
$serializer->readable('true');
my $xml = $serializer->serialize($soap_elements);
print $xml;
generates
<hash
    soapenc:arrayType="xsd:anyType[2]"
    xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:soapenc="http://schemas.xmlsoap.org/soap/encoding/"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:type="soapenc:Array">
  <Location>
    <key1 xsi:type="xsd:int">0</key1>
    <key2 xsi:type="xsd:int">4</key2>
    <key3 xsi:type="xsd:int">10</key3>
  </Location>
  <Location>
    <key1 xsi:type="xsd:int">0</key1>
    <key2 xsi:type="xsd:int">51</key2>
    <key3 xsi:type="xsd:int">117</key3>
  </Location>
</hash>
So I think you need to build your array elements first.
Well, currently SOAP::Data is not as powerful as you imagined. It cannot dynamically or recursively analyse a compound (complex) data structure such as a nest of arrays, hashes and scalar values. You can only pass simple data (scalar values) to SOAP::Data, or a SOAP::Data instance that you have already assembled manually yourself, thus creating a nested SOAP::Data structure. Please refer to the SOAP::Data tutorial.
The tutorial has a simple but clear explanation of how to use SOAP::Data with complex data types. Personally, though, I don't recommend using Perl as a SOAP client. From my experience, Perl doesn't have full support for the SOAP protocol, especially when dealing with complex data types or newer versions such as SOAP 1.2. Java, by contrast, has very strong support for the latest SOAP versions and complex data types.
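The underlying point — that repeated same-named elements must be built as a list of pre-assembled elements, because a plain hash can hold each key only once — holds in other languages too. For comparison, a Python sketch with the standard library that produces the desired repeated <Location> shape (an illustration only, not SOAP::Lite):

```python
import xml.etree.ElementTree as ET

# A dict could hold the key "Location" only once, so the repeated
# elements are built from a list first -- the same fix as in the
# Perl answer above.
locations = [
    {"key1": "0", "key2": "4", "key3": "10"},
    {"key1": "0", "key2": "51", "key3": "117"},
]

hash_el = ET.Element("hash")
for loc in locations:
    loc_el = ET.SubElement(hash_el, "Location")  # one <Location> per list entry
    for k, v in loc.items():
        ET.SubElement(loc_el, k).text = v

xml = ET.tostring(hash_el, encoding="unicode")
```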

Specifying bigquery table schema that resides on a file in a multipart http request

I have a text file schema.txt in which the schema for the table that I want to create is defined.
I want to include this file in the multipart HTTP request that I'm using to create my table.
How do I specify the schema.txt file in the multipart HTTP request?
Below is what I'm currently doing (not working though):
def loadTable(service, projectId, datasetId, targetTableId, sourceCsv, filenm):
    try:
        jobCollection = service.jobs()
        jobData = {
            'projectId': projectId,
            'configuration': {
                'load': {
                    'sourceUris': [sourceCsv],
                    'schema': filenm,
                    'destinationTable': {
                        'projectId': projectId,
                        'datasetId': datasetId,
                        'tableId': targetTableId
                    },
                    'createDisposition': 'CREATE_IF_NEEDED',
                    'writeDisposition': 'WRITE_TRUNCATE',
                    'encoding': 'UTF-8'
                }
            }
        }
Where filenm will be 'schema.txt'.
I know I can specify the schema directly as:
'schema': {
    'fields': [
        {
            'name': 'level',
            'type': 'STRING',
        },
        {
            'name': 'message',
            'type': 'STRING',
        }
    ]
},
But instead I want to specify the file containing the schema.
Hmm, not sure why you need a "multipart HTTP request" unless you are ingesting directly from a file. Here you are specifying a CSV input via sourceUris, which indicates a Cloud Storage object.
See here for more info:
https://developers.google.com/bigquery/docs/developers_guide#storageimport
In any case, this is not really a BigQuery question, more a Python question. Do you mean this?
import json

def loadTable(project_id, dataset_id, target_table, source_csv, filename):
    with open(filename, 'r') as f:
        schema = f.read()
    # the file is assumed to hold the object body, hence the added braces
    schema_json = json.loads('{%s}' % schema)
    job_data = {
        "projectId": project_id,
        "configuration": {
            "load": {
                "sourceUris": [source_csv],
                "schema": schema_json,
                "destinationTable": {
                    "projectId": project_id,
                    "datasetId": dataset_id,
                    "tableId": target_table
                },
                "createDisposition": "CREATE_IF_NEEDED",
                "writeDisposition": "WRITE_TRUNCATE",
                "encoding": "UTF-8"
            }
        }
    }
    print(json.dumps(job_data, indent=2))

loadTable('project_id', 'dataset_id', 'target_table', 'source_csv', '/tmp/schema.txt')
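Note that the `json.loads('{%s}' % schema)` wrapping implies an assumption about what schema.txt contains: the body of the schema object without its outer braces. A self-contained sketch of just that parsing step, with the assumed file contents inlined as a string:

```python
import json

# Assumed contents of schema.txt: the inner body of the load config's
# "schema" object, without surrounding braces (matching '{%s}' above).
schema_txt = '''
"fields": [
    {"name": "level", "type": "STRING"},
    {"name": "message", "type": "STRING"}
]
'''

schema_json = json.loads('{%s}' % schema_txt)
```

If the file instead stored a complete JSON object (braces included), a plain `json.loads(schema)` would be the right call.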