Mule Oracle Database Connector SQL with IN operator

I'm having problems with Mule's Database connector, which I'm using for a select query. I have an ArrayList of strings to pass to the IN parameter.
Mule Sql Query - passing parameters to the IN operator
The solution mentioned above doesn't work with Mule ESB 3.7.3. I've tried many approaches and searched around; apart from that document, I haven't found a definitive way so far.
I am using the query below:
select * from db_table where id in (2,3,4)
In my example, 2,3,4 comes from a flow variable that contains an ArrayList.
Any suggestions?

You can do something like this. I'm using MySQL in my example; just change it to Oracle.
<flow name="dbqueryFlow">
<http:listener config-ref="HTTP_Listener_Configuration" path="/db" doc:name="HTTP"/>
<set-payload value="#[[1,3,5]]" doc:name="Mock Set of Ids"/>
<expression-component doc:name="Set IDs to String"><![CDATA[ids = payload.toString().replace("[", "").replace("]", "").replace(", ", ",");
flowVars['ids'] = ids;]]></expression-component>
<logger message="SID === #[flowVars.ids]" level="INFO" doc:name="Logger"/>
<db:select config-ref="MySQL_Configuration" doc:name="Database" >
<db:dynamic-query><![CDATA[SELECT * FROM bridge WHERE id IN (#[flowVars.ids])]]></db:dynamic-query>
</db:select>
<logger level="INFO" doc:name="Logger"/>
</flow>
Hope this helps
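For reference, the expression-component above just strips the brackets from the list's toString() to get a comma-separated string. The same conversion can be sketched in plain Java (class and method names are mine):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class InClauseBuilder {

    // Same idea as the expression-component: render a list of ids as the
    // comma-separated string that the dynamic query splices into IN (...).
    static String toInClause(List<Integer> ids) {
        return ids.stream()
                  .map(String::valueOf)
                  .collect(Collectors.joining(","));
    }

    public static void main(String[] args) {
        System.out.println(toInClause(Arrays.asList(1, 3, 5))); // prints 1,3,5
    }
}
```

Keep in mind that splicing values into the SQL text this way is string interpolation, not parameter binding, so it is only appropriate for trusted ids.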

<set-variable variableName="CHUNKVIBLIST" value="#[new ArrayList()]" doc:name="Set CHUNKVIBLIST Variable"/>
<expression-component doc:name="Expression"><![CDATA[int chunkSize = 1000;
for (int i = 0; i < flowVars.VIBLIST.size(); i += chunkSize) {
    flowVars.CHUNKVIBLIST.add("'" + org.apache.commons.lang.StringUtils.join(
        flowVars.VIBLIST.subList(i, i + chunkSize >= flowVars.VIBLIST.size() ? flowVars.VIBLIST.size() : i + chunkSize), "','") + "'");
}]]></expression-component>
<foreach collection="#[flowVars.CHUNKVIBLIST]" doc:name="For Each matnr batch:">
<db:select config-ref="Mule_DB_Configuration" doc:name="Select statement">
<db:dynamic-query><![CDATA[SELECT DISTINCT(MATNR) FROM cated_prodrelease where MATNR IN (#[payload])]]></db:dynamic-query>
</db:select>
</foreach>
Update on 01.03.2021: the old answer dated from my beginner days, so I decided to replace it with the code above. To use the earlier approach, which puts the IN list in a dynamic query, you should build a payload containing a comma-separated string.
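The chunking logic in the expression-component is hard to read inline; here is the same idea as a standalone Java sketch (names are mine). The chunking matters because Oracle rejects an IN list with more than 1000 literals (ORA-01795).

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class ChunkedInClause {

    // Split a large id list into quoted, comma-separated chunks of at most
    // chunkSize entries each -- one chunk per db:select, mirroring the
    // expression-component above. chunkSize = 1000 in the original flow
    // stays under Oracle's IN-list limit.
    static List<String> chunk(List<String> ids, int chunkSize) {
        List<String> chunks = new ArrayList<>();
        for (int i = 0; i < ids.size(); i += chunkSize) {
            int end = Math.min(i + chunkSize, ids.size());
            chunks.add("'" + String.join("','", ids.subList(i, end)) + "'");
        }
        return chunks;
    }

    public static void main(String[] args) {
        // Three ids with chunkSize 2 -> two chunks: 'A','B' and 'C'
        System.out.println(chunk(Arrays.asList("A", "B", "C"), 2));
    }
}
```

Each resulting chunk is ready to drop into `IN (#[payload])` inside the foreach, as the flow above does.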

How to add a dynamic parameter in Mule4 Select Query

I have a SELECT query for which the entire WHERE condition is coming from a Java class. How can I use the entire WHERE condition in the SELECT query? This is what I tried:
SELECT * FROM EMPLOYEE_TABLE WHERE 1=1 #[payload]
Here payload is a WHERE clause coming from a Java class: NAME = VKP AND STATE = PA AND CITY = KOP
I am getting an error like "parameter null was not bound for query select…", yet I can see the payload value coming from the Java class in the logs.
You should not use an expression directly in the query. Instead, build the query in an expression, assign it to a variable, and then use the variable as the query.
For example something like:
<set-variable value="#['SELECT * FROM EMPLOYEE_TABLE WHERE 1=1 ' ++ payload]" name="query" />
<db:select ...>
<db:sql>#[vars.query]</db:sql>
</db:select>
Note that you risk SQL injection vulnerabilities by doing this.
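To see why, here is a minimal sketch of what the concatenation does (class and method names are mine): whatever the payload contains becomes part of the SQL text verbatim.

```java
public class InjectionDemo {

    // Equivalent of the set-variable expression above: the WHERE clause from
    // the payload is concatenated straight into the statement text.
    static String buildQuery(String whereClause) {
        return "SELECT * FROM EMPLOYEE_TABLE WHERE 1=1 " + whereClause;
    }

    public static void main(String[] args) {
        // A benign clause and a hostile one are treated exactly the same:
        System.out.println(buildQuery("AND NAME = 'VKP'"));
        System.out.println(buildQuery("; DROP TABLE EMPLOYEE_TABLE"));
    }
}
```

If the clause can ever come from user input, validating it against a whitelist of allowed columns and operators before concatenating is a safer design.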

How to perform Delete operation on Salesforce in Mule?

I am trying to perform a delete operation on Salesforce but I'm getting the error "java.lang.ArrayStoreException". What does it mean?
Can anybody explain how to perform a delete operation? My code is:
<flow name="z_testFlow2" processingStrategy="synchronous">
<poll doc:name="Poll">
<fixed-frequency-scheduler frequency="10" startDelay="5" timeUnit="SECONDS"/>
<echo-component doc:name="Echo"/>
</poll>
<dw:transform-message doc:name="Transform Message">
<dw:set-payload><![CDATA[%dw 1.0 %output application/java
---
[{
Name:"Thir9"
}]]]></dw:set-payload>
</dw:transform-message>
<sfdc:delete config-ref="Salesforce__Basic_Authentication" doc:name="Salesforce"/>
<logger message="hi.......... #[payload]" level="INFO" doc:name="Logger"/>
</flow>
The delete operation expects an array of strings containing the ids (e.g. ["1","2","3","4"])
Some considerations:
1) The default input is taken from the payload (#[payload]); in this case you have to set the list of ids into the payload beforehand:
<dw:transform-message doc:name="Transform Message">
<dw:set-payload>
<![CDATA[
%dw 1.0
%output application/java
---
["1","2","3","4","5","6"]
]]>
</dw:set-payload>
</dw:transform-message>
<sfdc:delete config-ref="Salesforce__Basic_Authentication" doc:name="Salesforce" />
2) You can change the default expression to take the ids from a different place (e.g. from a flowVar):
<sfdc:delete config-ref="Salesforce__Basic_Authentication" doc:name="Salesforce" >
<sfdc:ids ref="#[flowVars.myListOfIds]"/>
</sfdc:delete>
3) You can specify the ids to be deleted manually:
<sfdc:delete config-ref="Salesforce__Basic_Authentication" doc:name="Salesforce" >
<sfdc:ids>
<sfdc:id>123</sfdc:id>
<sfdc:id>666</sfdc:id>
</sfdc:ids>
</sfdc:delete>
Step 1: Use a SELECT query to find the records, e.g.: SELECT Id FROM employee
Step 2: Set the payload and pick out the Ids like this: #[[payload.Id]]
Step 3: Then use the following expression in the delete operation: #[payload]
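In Java terms, step 2 amounts to mapping each query-result record to its Id field. A sketch, assuming the records arrive as one map per record (class and method names are mine):

```java
import java.util.*;
import java.util.stream.Collectors;

public class IdExtraction {

    // The delete operation wants a flat list of id strings. Given query
    // results shaped as one map per record, pull out the Id values -- the
    // Java equivalent of the #[[payload.Id]] expression in step 2.
    static List<String> extractIds(List<Map<String, Object>> records) {
        return records.stream()
                      .map(r -> String.valueOf(r.get("Id")))
                      .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Map<String, Object> rec = new HashMap<>();
        rec.put("Id", "003A");
        System.out.println(extractIds(Collections.singletonList(rec))); // prints [003A]
    }
}
```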
Alternatively, select the "operation" option in the Salesforce connector and write the query for the delete operation.

Writing large number of records to a CSV file - Not Working

Background: We are building an application in MuleSoft, and as part of the requirements we have to write a large number of records (approx. 30K) to a CSV file. Before that we need to extract the data, in the form of XML, from DB2. We then apply some transformation/mapping rules and finally write the data to a CSV file and FTP it. I am attaching the XML.
Issue: The process hangs after processing only about 2500-2600 records. It doesn't throw any error; it just sits there and does nothing. We have tried the following, with no difference observed: 1. putting the flow inside a Mule batch job; 2. setting max error count = -1, as suggested in a blog post we found.
If somebody can provide any suggestion, that would be really helpful. Is there any limit on the number of records when writing to a file?
<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns:batch="http://www.mulesoft.org/schema/mule/batch" xmlns:db="http://www.mulesoft.org/schema/mule/db"
xmlns:file="http://www.mulesoft.org/schema/mule/file"
xmlns:dw="http://www.mulesoft.org/schema/mule/ee/dw" xmlns:metadata="http://www.mulesoft.org/schema/mule/metadata"
xmlns="http://www.mulesoft.org/schema/mule/core" xmlns:doc="http://www.mulesoft.org/schema/mule/documentation"
xmlns:spring="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.mulesoft.org/schema/mule/db http://www.mulesoft.org/schema/mule/db/current/mule-db.xsd
http://www.mulesoft.org/schema/mule/file http://www.mulesoft.org/schema/mule/file/current/mule-file.xsd
http://www.mulesoft.org/schema/mule/ee/dw http://www.mulesoft.org/schema/mule/ee/dw/current/dw.xsd
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-current.xsd
http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
http://www.mulesoft.org/schema/mule/batch http://www.mulesoft.org/schema/mule/batch/current/mule-batch.xsd">
<db:generic-config name="Generic_Database_Configuration1" url="jdbc:db2://faadbcdd0017:60004/MATIUT:user=mat_adm;password=q1w2e3r4;" driverClassName="com.ibm.db2.jcc.DB2Driver" doc:name="Generic Database Configuration"/>
<file:connector name="File" outputPattern="Carfax.csv" writeToDirectory="C:\opt\CCM\Output\IUT" autoDelete="false" outputAppend="true" streaming="true" validateConnections="true" doc:name="File"/>
<file:connector name="File1" outputPattern="sample.txt" readFromDirectory="C:\opt\CCM" autoDelete="true" streaming="true" validateConnections="true" doc:name="File"/>
<batch:job name="batch2Batch">
<batch:input>
<logger message="Startr>>>>>>>>>>>>>>>>>>>>>>>>>>>>>" level="INFO" doc:name="Logger"/>
<foreach doc:name="For Each">
<db:select config-ref="Generic_Database_Configuration1" doc:name="Database">
<db:parameterized-query><![CDATA[select MSG_ID,TEMPL_ID,MSG_DATA,EMAIL_CHNL_IND,PUSH_CHNL_IND, INSERT_TMSP,UID FROM IUT.message_master WHERE INSERT_TMSP between
(CURRENT TIMESTAMP- HOUR (CURRENT TIMESTAMP) HOURS- MINUTE(CURRENT TIMESTAMP) MINUTES- SECOND(CURRENT TIMESTAMP) SECONDS
- MICROSECOND(CURRENT TIMESTAMP) MICROSECONDS) and ((CURRENT TIMESTAMP- HOUR (CURRENT TIMESTAMP) HOURS
- MINUTE(CURRENT TIMESTAMP) MINUTES- SECOND(CURRENT TIMESTAMP) SECONDS- MICROSECOND(CURRENT TIMESTAMP) MICROSECONDS) + 1 DAY)
and SOURCE_SYS='CSS' and ONLINE_BATCH_IND IN('Y','E') AND APPL_PROCESS_IND = 'N' with UR]]></db:parameterized-query>
</db:select>
</foreach>
<logger message="#[payload]" level="INFO" doc:name="Logger"/>
</batch:input>
<batch:process-records>
<batch:step name="Batch_Step">
<component class="com.mule.object.transformer.Mapper" doc:name="Java"/>
<dw:transform-message metadata:id="9bd2e755-065a-4208-95cf-1277f5643ee9" doc:name="Transform Message">
<dw:input-payload mimeType="application/java"/>
<dw:set-payload><![CDATA[%dw 1.0
%output application/csv separator = "|" , header = false , ignoreEmptyLine = true
---
[{
Timestamp: payload.timeStamp,
NotificationType: payload.notificationType,
UID: payload.UID,
Name: payload.messageData.firstName,
MiddleName: payload.messageData.middleName,
LastName: payload.messageData.lastName,
Email: payload.messageData.email,
HHNumber: payload.messageData.cssDataRequest.householdNumber,
PolicyNumber: payload.messageData.cssDataRequest.policyContractNumber,
SentDate: payload.messageData.cssDataRequest.sendDate,
PinNumber: payload.messageData.cssDataRequest.pin,
AOR: payload.messageData.cssDataRequest.agentOfRecord
}]]]></dw:set-payload>
</dw:transform-message>
<file:outbound-endpoint path="C:\opt\CCM\Output\IUT" connector-ref="File" responseTimeout="10000" doc:name="File"/>
</batch:step>
</batch:process-records>
<batch:on-complete>
<logger message="Batch2 Completed" level="INFO" doc:name="Logger"/>
</batch:on-complete>
</batch:job>
</mule>
Try using batch processing. Inside the Batch Step, add a Batch Commit, which accumulates the records within the batch; set the attribute streaming="true" on the Batch Commit block, and place your File connector inside the Batch Commit. Let me know if this helped.
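The point of streaming="true" plus Batch Commit is that the file is written in bounded chunks instead of all 30K records being held in memory at once. A rough Java analogy of chunked, append-mode writing (the file format and all names here are mine, not taken from the flow above):

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.List;

public class ChunkedCsvWriter {

    // Append one chunk of records to the file, pipe-separated like the
    // DataWeave output above. Called once per chunk, so memory use stays
    // proportional to the chunk size, not the total record count.
    static void appendRecords(Path file, List<String[]> records) {
        try (BufferedWriter w = Files.newBufferedWriter(file,
                StandardOpenOption.CREATE, StandardOpenOption.APPEND)) {
            for (String[] rec : records) {
                w.write(String.join("|", rec));
                w.newLine();
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Convenience reader used to inspect the result.
    static List<String> readLines(Path file) {
        try {
            return Files.readAllLines(file);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```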

INSERT statement not working when using it through a variable in Mule

My database component has the following configuration
<db:insert config-ref="Oracle_Configuration" bulkMode="true" doc:name="Database">
<db:dynamic-query><![CDATA[#[flowVars.dbquery]]]></db:dynamic-query>
</db:insert>
I have declared the "dbquery" variable as follows
<set-variable variableName="dbquery" value="INSERT INTO WBUSER.EMP VALUES('#[payload.FullName]','#[payload.SerialNumber]')" doc:name="Variable"/>
On running the application, the values inserted into the DB are the literal strings "#[payload.FullName]" and "#[payload.SerialNumber]".
But when my database component has the following configuration actual values of FullName and SerialNumber are getting inserted into the database.
<db:insert config-ref="Oracle_Configuration" bulkMode="true" doc:name="Database">
<db:dynamic-query><![CDATA[INSERT INTO WBUSER.EMP VALUES('#[payload.FullName]','#[payload.SerialNumber]')]]></db:dynamic-query>
</db:insert>
Here FullName and SerialNumber are not variables; they are keys of the maps in the payload list, e.g. [{FullName=yo, SerialNumber=129329}, {FullName=he, SerialNumber=129329}].
Can someone explain the difference? And is there a way I can achieve the database insertion using just the variable, as in the earlier case?
This is caused by the different ways the data is inserted. The second configuration works because the query is declared inside db:insert, the payload is a List, and the Bulk Mode option is selected.
To make the first configuration (the SQL query declared in a variable) work, do the following:
Iterate over each payload value using a collection-splitter.
Deselect Bulk Mode on the database connector.
The configuration should be:
<collection-splitter doc:name="Collection Splitter"/>
<set-variable variableName="dbquery" value="INSERT INTO WBUSER.EMP VALUES('#[payload.FullName]','#[payload.SerialNumber]')" doc:name="Variable"/>
<db:insert config-ref="MySQL_Configuration" doc:name="Database">
<db:dynamic-query><![CDATA[#[flowVars.dbquery]]]></db:dynamic-query>
</db:insert>
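What changes after the collection-splitter is that each message carries a single record, so the expressions in dbquery resolve against one map at a time. A sketch of that per-record templating (class and method names are mine):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PerRecordQuery {

    // After the collection-splitter the payload is one record (a map), and
    // the dbquery expressions resolve against its keys -- roughly this:
    static String buildInsert(Map<String, Object> record) {
        return "INSERT INTO WBUSER.EMP VALUES('"
                + record.get("FullName") + "','"
                + record.get("SerialNumber") + "')";
    }

    public static void main(String[] args) {
        Map<String, Object> rec = new LinkedHashMap<>();
        rec.put("FullName", "yo");
        rec.put("SerialNumber", 129329);
        System.out.println(buildInsert(rec));
        // prints INSERT INTO WBUSER.EMP VALUES('yo','129329')
    }
}
```

As this is string interpolation, a parameterized query would be safer for untrusted values.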

Mule Salesforce query

I am trying to populate a joiner table in Salesforce from data in a database. The joiner table has two lookups (of course to two different objects).
My flow starts by querying a database. The query fetches two fields, NT_ACCOUNT and CXM_ID. These exist as separate objects in Salesforce, so I have to perform lookups against Salesforce to get their corresponding Salesforce Ids in order to create a record in my joiner table. I am not sure of the ideal way to do this.
Below is the flow I have: it takes NT_ACCOUNT, queries SF for its Id, and creates the joiner record. My question is what the ideal way is to populate the other lookup in my joiner (a lookup on SFDC for the CXM_ID from the database). Would it be a good option to query SFDC for the Id corresponding to CXM_ID and combine both payloads before passing them to create? If yes, any pointers on how to do that would help.
<batch:job name="Batch2">
<batch:threading-profile poolExhaustedAction="WAIT"/>
<batch:input>
<poll doc:name="Poll">
<fixed-frequency-scheduler frequency="1000"/>
<db:select config-ref="Generic_Database_Configuration" doc:name="Database">
<db:parameterized-query><![CDATA[SELECT NT_ACCOUNT,CXM_ID from Table where lastmodifieddate = 'within last 24 hours';]]></db:parameterized-query>
</db:select>
</poll>
</batch:input>
<batch:process-records>
<batch:step name="Batch_Step_1">
<sfdc:query-single config-ref="Salesforce__Basic_authentication1" query="dsql:#["SELECT id FROM FM_Account__c WHERE account_number__c = '"+payload.NT_ACCOUNT+"' LIMIT 1"]" doc:name="Salesforce"/>
<batch:commit size="200" doc:name="Batch Commit">
<data-mapper:transform config-ref="Map_To_Map_1" doc:name="Map To Map"/>
<sfdc:create config-ref="Salesforce__Basic_authentication1" type="Joiner_Table__c" doc:name="Salesforce">
<sfdc:objects ref="#[payload]"/>
</sfdc:create>
<logger message="#[message.payload]" level="INFO" doc:name="Logger"/>
</batch:commit>
</batch:step>
</batch:process-records>
</batch:job>
