Does the MuleSoft Salesforce Upsert connector support allOrNone? - salesforce

How do I pass allOrNone=true in the MuleSoft Salesforce Upsert connector?
I tried the configuration below with no luck. I can see that allOrNone works for the Composite connector, but does anyone know how to make it work for upsert as well?
<salesforce:upsert doc:name="Upsert" doc:id="f97aa678-222f-4c4d-819d-95217c656ff2" config-ref="Salesforce_Config" objectType="Product2" externalIdFieldName="SAP_Material_Number__c">
    <salesforce:headers>
        <salesforce:header key='allOrNone' value="true" />
    </salesforce:headers>
</salesforce:upsert>

I just found a way and it's working. This might help others:
<ee:transform doc:name="Transform Message" doc:id="ff880d6c-9b98-4853-a303-986a63cca156">
    <ee:message>
        <ee:set-payload><![CDATA[%dw 2.0
output application/java
---
payload]]></ee:set-payload>
    </ee:message>
    <ee:variables>
        <ee:set-variable variableName="allOrNone"><![CDATA[%dw 2.0
output application/java
---
{
    allOrNone: true
}]]></ee:set-variable>
    </ee:variables>
</ee:transform>
<salesforce:upsert doc:name="Upsert" doc:id="f97aa678-222f-4c4d-819d-95217c656ff2" config-ref="Salesforce_Config" objectType="Product2" externalIdFieldName="SAP_Material_Number__c">
    <salesforce:headers>
        <salesforce:header key='AllOrNoneHeader' value='#[vars.allOrNone]'/>
    </salesforce:headers>
</salesforce:upsert>
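A note for anyone adapting this: the intermediate variable is not strictly required. Assuming the connector accepts any DataWeave expression for the header value (only the variable-based version above has been verified), the map could likely be inlined directly:
<salesforce:upsert doc:name="Upsert" config-ref="Salesforce_Config" objectType="Product2" externalIdFieldName="SAP_Material_Number__c">
    <salesforce:headers>
        <!-- inline DataWeave object instead of vars.allOrNone; verify against your connector version -->
        <salesforce:header key='AllOrNoneHeader' value="#[{allOrNone: true}]"/>
    </salesforce:headers>
</salesforce:upsert>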

Related

Mule 4 dynamic queries in the Database Connector

In my flow in Mule 4 I am trying to query a database for specific data.
For example I want to run a query like this:
SELECT * FROM mulesoft WHERE plant = CCNNB;
The thing is, both plant and CCNNB need to be dynamic; they come in through an API request. I can handle making the value dynamic, but I get empty results whenever I try to make the field dynamic as well.
I first create a variable which stores the JSON from the request:
<set-variable value="#[payload]" doc:name="Set Variable" doc:id="8ed26865-d722-4fdb-9407-1f629b45d318" variableName="SORT_KEY"/>
Request looks like this:
{
"FILTER_KEY": "plant",
"FILTER_VALS": "CCNNB"
}
Afterwards in the db connector I configure the following:
<db:select doc:name="Select" doc:id="13a66f51-2a4e-4949-b383-86c43056f7a3" config-ref="Database_Config">
    <db:sql><![CDATA[SELECT * FROM mulesoft WHERE :filter_key = :filter_val;]]></db:sql>
    <db:input-parameters><![CDATA[#[{
        "filter_val": vars.SORT_KEY.FILTER_VALS,
        "filter_key": vars.SORT_KEY.FILTER_KEY
    }]]]></db:input-parameters>
</db:select>
Replacing :filter_key with the literal column name plant works, but as soon as I try to make it dynamic I get nothing in the response. It does not fail though: the response code is 200, but the body is empty.
How can I make this work?
You can use the stored variables directly in the query itself; the query should be a DataWeave expression. Input parameters are bound as values, not as identifiers, so :filter_key gets compared as the literal string 'plant' instead of being used as a column name, which is why you get an empty but successful response.
#["SELECT * FROM $(vars.table) WHERE $(vars.SORT_KEY.FILTER_KEY) = :filter_val"]
<db:select config-ref="Database_Config">
<db:sql><![CDATA[#["SELECT * FROM $(vars.table) WHERE $(vars.SORT_KEY.FILTER_KEY) = :filter_val"]]]></db:sql>
<db:input-parameters ><![CDATA[#[{
"filter_val": vars.SORT_KEY.FILTER_VALS
}]]]>
</db:input-parameters>
</db:select>
There is another way as well: read the values from the variable and build a fully dynamic query, as below.
#["SELECT * FROM mulesoft
WHERE " ++ vars.SORT_KEY.FILTER_KEY ++ " = '" ++ vars.SORT_KEY.FILTER_VALS ++ "'"]
Below is the XML created for this as a proof of concept (POC):
<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns:os="http://www.mulesoft.org/schema/mule/os"
xmlns:salesforce="http://www.mulesoft.org/schema/mule/salesforce"
xmlns:db="http://www.mulesoft.org/schema/mule/db"
xmlns:xml-module="http://www.mulesoft.org/schema/mule/xml-module"
xmlns:http="http://www.mulesoft.org/schema/mule/http"
xmlns:ee="http://www.mulesoft.org/schema/mule/ee/core"
xmlns="http://www.mulesoft.org/schema/mule/core"
xmlns:doc="http://www.mulesoft.org/schema/mule/documentation"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
http://www.mulesoft.org/schema/mule/ee/core http://www.mulesoft.org/schema/mule/ee/core/current/mule-ee.xsd
http://www.mulesoft.org/schema/mule/http http://www.mulesoft.org/schema/mule/http/current/mule-http.xsd
http://www.mulesoft.org/schema/mule/db http://www.mulesoft.org/schema/mule/db/current/mule-db.xsd
http://www.mulesoft.org/schema/mule/os http://www.mulesoft.org/schema/mule/os/current/mule-os.xsd">
<http:listener-config name="HTTP_Listener_config1"
doc:name="HTTP Listener config"
doc:id="6d5de64b-1355-4967-9352-4b324f02c7ad">
<http:listener-connection host="0.0.0.0"
port="8081" />
</http:listener-config>
<db:config name="Database_Config" doc:name="Database Config"
doc:id="d5c4d49c-aef3-4d4a-a7b5-470da3354127">
<db:my-sql-connection host="localhost"
port="3306" user="root" password="admin123" database="Mysql" />
</db:config>
<flow name="testFlow"
doc:id="8cfea1b0-d244-40d9-989c-e136af0d9f80" initialState="started">
<http:listener doc:name="Listener"
doc:id="265e671b-7d2f-4f3a-908c-8065a5f36a07"
config-ref="HTTP_Listener_config1" path="test" />
<set-variable value="#[payload]" doc:name="Set Variable"
doc:id="265a16c5-68d4-4217-8626-c4ab0a3e38e5" variableName="SORT_KEY" />
<db:select doc:name="Select"
doc:id="bdf4a59c-0bcc-46ac-8258-f1f1762c4e7f"
config-ref="Database_Config">
<db:sql><![CDATA[#["SELECT * FROM mulesoft.mulesoft WHERE " ++ vars.SORT_KEY.FILTER_KEY ++ " = '" ++ vars.SORT_KEY.FILTER_VALS ++ "'"]]]></db:sql>
</db:select>
<ee:transform doc:name="Transform Message"
doc:id="72cbe69f-c52e-4df9-ba5b-dd751990bc08">
<ee:message>
<ee:set-payload><![CDATA[%dw 2.0
output application/json
---
payload]]></ee:set-payload>
</ee:message>
</ee:transform>
</flow>
</mule>
Explanation of the flow:
I am using the payload from the question.
A variable named "SORT_KEY" is set; its value is the complete payload we receive.
Then a dynamic query is built inside the DB connector.
Finally, a Transform Message sends the data received from the database back as the response.
So, there are several issues here. One is the creation of the SQL statement.
You can use DataWeave inside the db:select component if you want, as shown by the previous answers. Either
#["SELECT * FROM myTable" ++ vars.myWhere]
OR
#["SELECT * FROM myTable $(vars.myWhere)"]
will work.
The problem you will run into is that DataSense doesn't like this. Without the literal text for the column names, DataSense can't figure out which columns will be retrieved, so it raises the error "Unable to resolve value for the parameter: sql". This leaves an error marker in your code, which always gives me angst. I wish MuleSoft would resolve this problem.
BTW, if you do build dynamic SQL, you should STILL use input parameters for each value to avoid SQL injection.
I have an "Idea" posted here to fix the bogus error: https://help.mulesoft.com/s/ideas#0872T000000XbjkQAC

Unable to push vespa metrics to cloudwatch

Basically I need to monitor Vespa metrics, and to do that I am trying to push the metrics to CloudWatch.
This is the document I am referring to: https://docs.vespa.ai/documentation/monitoring.html
I have added the credentials file and the putMetricData permission to the attached IAM role. The services.xml file that I am using looks like this:
<admin version="2.0">
    <adminserver hostalias="admin0"/>
    <configservers>
        <configserver hostalias="admin0"/>
    </configservers>
    <monitoring>
    </monitoring>
    <metrics>
        <consumer id="my-cloudwatch">
            <metric-set id="vespa" />
            <cloudwatch region="ap-south-1" namespace="vespa">
                <shared-credentials file="~/.aws/credentials" profile="default" />
            </cloudwatch>
        </consumer>
    </metrics>
</admin>
I have deployed the application using vespa-deploy prepare application.zip && vespa-deploy activate, but I am still not seeing any metrics in CloudWatch.
Also, I have tried to add:
<monitoring>
    <interval>1</interval>
    <systemname>vespa</systemname>
</monitoring>
But I get this error when deploying:
Request failed. HTTP status code: 400
Invalid application package: default.default: Error loading model: XML error in services.xml: element "interval" not allowed here; expected the element end-tag [9:16], input:
How can I fix this issue, or at least debug what I am facing?
I suggest using an absolute path to the credentials file, as ~ may not resolve to the directory you intend at runtime.
A couple more things:
I recommend using the default metric set, as the vespa set contains a lot of metrics, which will drive your CloudWatch costs higher. If you need additional metrics, you can add them with the metric tag inside consumer.
The monitoring element doesn't do anything useful in this context, so you should just drop it.
If you still don't see any metrics, please check for warnings or errors in the vespa log file (use vespa-logfmt) and the Telegraf log file: /opt/vespa/logs/telegraf/telegraf.log. (Vespa uses Telegraf internally to emit metrics to CloudWatch.)
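Putting those suggestions together, the consumer block would look roughly like this (region and namespace are taken from the question; the absolute credentials path is a placeholder to adjust for your host):
<metrics>
    <consumer id="my-cloudwatch">
        <!-- the default set keeps the metric volume, and the CloudWatch cost, low -->
        <metric-set id="default" />
        <cloudwatch region="ap-south-1" namespace="vespa">
            <!-- absolute path instead of ~ (placeholder, adjust to your setup) -->
            <shared-credentials file="/home/vespa/.aws/credentials" profile="default" />
        </cloudwatch>
    </consumer>
</metrics>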

How to delete the data from a Salesforce object?

I have a scenario where I need to delete all data from a Salesforce object.
To achieve this, all the Ids from that object are first fetched and saved to a file in .csv format. Once the data is in the file, the records need to be deleted record by record using a batch job.
I'm able to query the object and save the data to .csv, but while deleting the data I sometimes get the error below.
Message : null (java.nio.BufferUnderflowException).
Element : /batch-delete-genericFlow/processors/3 # apl-sfa-batch-interface-v44:batch-delete-all.xml:48 (Transform Message)
--------------------------------------------------------------------------------
Exception stack is:
null (java.nio.BufferUnderflowException). (org.mule.api.MessagingException)
java.nio.Buffer.nextGetIndex(Buffer.java:500)
java.nio.HeapCharBuffer.get(HeapCharBuffer.java:135)
com.mulesoft.weave.reader.UTF8StreamSourceReader.decode$1(SeekableStreamSourceReader.scala:147)
com.mulesoft.weave.reader.UTF8StreamSourceReader.read(SeekableStreamSourceReader.scala:167)
com.mulesoft.weave.reader.csv.parser.StreamingCSVParser.read(StreamingCSVParser.scala:61)
(66 more...)
(set debug level logging or '-Dmule.verbose.exceptions=true' for everything)
********************************************************************************
Please find below the delete batch code:
<batch:job name="batch-delete-genericBatch" max-failed-records="-1">
<batch:input>
<enricher target="#[flowVars['jobInfo_delete']]" doc:name="Enricher jobId">
<sfdc:create-job config-ref="SFA_MSBI" type="#[flowVars.sObjectName]" concurrencyMode="Serial" contentType="CSV" operation="delete" doc:name="Create Job"/>
</enricher>
<expression-component doc:name="Save Job ID"><![CDATA[sessionVars.jobInfo_delete = flowVars.jobInfo_delete.id
]]></expression-component>
</batch:input>
<batch:process-records>
<batch:step name="Batch_Step" >
<batch:commit doc:name="Batch Commit" size="5000">
<processor-chain doc:name="Processor Chain">
<dw:transform-message metadata:id="df884737f2bc" doc:name="Transform Message">
<dw:set-payload><![CDATA[%dw 1.0
%output application/java
---
payload map {
Id: $.Id
}]]></dw:set-payload>
</dw:transform-message>
<sfdc:create-batch config-ref="SFA_MSBI" doc:name="Salesforce">
<sfdc:job-info ref="#[flowVars.jobInfo_delete]"/>
<sfdc:objects ref="#[payload]"/>
</sfdc:create-batch>
</processor-chain>
</batch:commit>
</batch:step>
</batch:process-records>
<batch:on-complete>
<async doc:name="Async">
<sfdc:close-job config-ref="SFA_MSBI" jobId="#[sessionVars.jobInfo_delete]" doc:name="Salesforce"/>
</async>
</batch:on-complete>
</batch:job>
Please advise.
It seems the error is caused by some value in the data.
Please post more code to give some context.
In the meantime, the following is a proper way of deleting Salesforce objects:
<sfdc:delete config-ref="mySalesforceConfig">
<sfdc:ids>
<sfdc:id>001...</sfdc:id>
</sfdc:ids>
</sfdc:delete>
http://mulesoft.github.io/salesforce-connector/8.3.1/apidocs/apidoc.html#_delete
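If the Ids come from the earlier query rather than being hard-coded, the ids element can typically take a reference instead of nested literals. A minimal sketch, assuming the payload at this point is a list of Id strings:
<sfdc:delete config-ref="mySalesforceConfig">
    <!-- payload assumed to be a List of record Id strings produced by the query step -->
    <sfdc:ids ref="#[payload]"/>
</sfdc:delete>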

Error in Using YCSB with Gemfire

Hi, I am using YCSB for benchmarking Pivotal GemFire. My GemFire server is running properly, and I am running the benchmark test with the following command:
bin/ycsb load gemfire -P workloads/workloada -p gemfire.serverhost=x.x.x.x -P gemfire-binding/conf/cache.xml -p gemfire.serverport=40404 -s > load.txt
Loading workload...
Starting test.
0 sec: 0 operations;
Exception in thread "Thread-1" java.lang.IllegalStateException: You must use client-cache in the cache.xml when ClientCacheFactory is used.
at com.gemstone.gemfire.internal.cache.xmlcache.CacheCreation.create(CacheCreation.java:316)
at com.gemstone.gemfire.internal.cache.xmlcache.CacheXmlParser.create(CacheXmlParser.java:274)
at com.gemstone.gemfire.internal.cache.GemFireCacheImpl.loadCacheXml(GemFireCacheImpl.java:3495)
at com.gemstone.gemfire.internal.cache.GemFireCacheImpl.initializeDeclarativeCache(GemFireCacheImpl.java:926)
at com.gemstone.gemfire.internal.cache.GemFireCacheImpl.init(GemFireCacheImpl.java:708)
at com.gemstone.gemfire.internal.cache.GemFireCacheImpl.create(GemFireCacheImpl.java:533)
at com.gemstone.gemfire.cache.client.ClientCacheFactory.basicCreate(ClientCacheFactory.java:207)
at com.gemstone.gemfire.cache.client.ClientCacheFactory.create(ClientCacheFactory.java:161)
at com.yahoo.ycsb.db.GemFireClient.init(GemFireClient.java:125)
at com.yahoo.ycsb.DBWrapper.init(DBWrapper.java:63)
at com.yahoo.ycsb.ClientThread.run(Client.java:189)
0 sec: 0 operations;
Can anybody tell me where I am going wrong?
Thanks in advance.
In your GemFireClient.java you are setting up a client cache with the ClientCacheFactory class, but the cache.xml you are giving it for configuration specifies a cache element instead of a client-cache element. Try changing your cache.xml to use a client-cache, similar to the example below. Note that when configuring a client-cache in your cache.xml you will need to use a different DTD than the one used above.
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE client-cache PUBLIC
"-//GemStone Systems, Inc.//GemFire Declarative Caching 7.0//EN"
"http://www.gemstone.com/dtd/cache7_0.dtd">
<client-cache copy-on-read="false" >
<region name="usertable" refid="PARTITION"/>
</client-cache>
You do not need to specify a cache.xml when running the YCSB client. The cache.xml in the gemfire-binding/conf folder is meant to be supplied to the GemFire server.
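In that case the load command from the question would simply drop the second -P flag, along these lines:
bin/ycsb load gemfire -P workloads/workloada -p gemfire.serverhost=x.x.x.x -p gemfire.serverport=40404 -s > load.txt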

Mule CXF Marshall Response

I am using cxf:jaxws-client in Mule 3 and the response I get from my web service call is of type ReleasingInputStream. I have tried adding the http-response-to-message-transformer, but that generates an error - does anyone know how I can retrieve the response as an object as opposed to a ReleasingInputStream?
Many thanks.
To solve the issue, put the <cxf:jaxws-client> inside the <outbound-endpoint> element (NOT BEFORE IT), by changing the following code
<cxf:jaxws-client
clientClass="com.xyz.services.WSServices"
port="WSServicesSoap"
wsdlLocation="classpath:wsdl-file.wsdl"
operation="GimmeDataOperation" />
<outbound-endpoint exchange-pattern="request-response" address="http://localhost:8083/OutboundService" />
which produces a ReleasingInputStream output to
<outbound-endpoint exchange-pattern="request-response" address="http://localhost:8083/OutboundService" >
<cxf:jaxws-client
clientClass="com.xyz.services.WSServices"
port="WSServicesSoap"
wsdlLocation="classpath:wsdl-file.wsdl"
operation="GimmeDataOperation" />
</outbound-endpoint>
that returns the expected object.
I was just having this same problem. I solved it by adding an ObjectToString transformer to the response side of the outbound endpoint like this:
<mule>
<object-to-string-transformer name="ObjectToString"/>
<flow>
...
...
...
<cxf:jaxws-client clientClass="com.my.ClientClass"
port="MyPort"
wsdlLocation="classpath:MyWsdl.wsdl"
operation="MyOperation" />
<outbound-endpoint address="http://some.address/path/to/service"
exchange-pattern="request-response"
responseTransformer-refs="ObjectToString" />
...
...
...
</flow>
</mule>
The whole point of jaxws-client is to receive the unmarshalled Java object, so getting the WS response as a String or ReleasingInputStream should not even be an option.
To make <cxf:jaxws-client> "work" as one would expect a WS client to work, put it INSIDE the <outbound-endpoint>; you will then get the correct Java object as the payload.
