Apache Camel Java DSL tool

Is there any tool available that can convert a Java DSL route to an XML route, or vice versa?
I want to convert the following XML route into the Java DSL:
<route id="test">
<from uri="file://{{VAR_DATA_PATH}}/test/xml"/>
<multicast>
<choice>
<when>
<xpath>/bookinfo</xpath>
<doTry>
<to uri="downloadBook"/>
<marshal ref="xstream-utf8"/>
<to uri="another URI"/>
<doCatch>
<exception>java.lang.Exception</exception>
<handled>
<constant>false</constant>
</handled>
<to uri="3rd URI"/>
</doCatch>
</doTry>
</when>
<when>
<xpath>somePath</xpath>
<to uri="4th URI" />
<to uri="5th URI"/>
</when>
</choice>
<to uri="6th URI" />
</multicast>
</route>

From Java DSL to XML it is fairly easy: you can use Hawtio or Karaf's "route-info" command. Even though the routes are written in the Java DSL, they are rendered as XML when you view them there.
I'm not aware of a tool for the other direction (from XML to Java), but it is not difficult at all to do it yourself.
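For example, a hand translation of the XML above into the Java DSL could look roughly like the following sketch. The placeholder URIs are kept exactly as they appear in the XML, the "xstream-utf8" data format is assumed to be registered in the registry, and the exact end()/endChoice() nesting may need adjusting for your Camel version:

import org.apache.camel.builder.RouteBuilder;

public class TestRouteBuilder extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        from("file://{{VAR_DATA_PATH}}/test/xml").routeId("test")
            .multicast()
                .choice()
                    .when(xpath("/bookinfo"))
                        .doTry()
                            .to("downloadBook")
                            // refers to a data format registered as "xstream-utf8"
                            .marshal().custom("xstream-utf8")
                            .to("another URI")
                        .doCatch(Exception.class)
                            .handled(false)
                            .to("3rd URI")
                        .end()            // end doTry
                    .endChoice()
                    .when(xpath("somePath"))
                        .to("4th URI")
                        .to("5th URI")
                    .endChoice()
                .end()                    // end choice
                .to("6th URI")
            .end();                       // end multicast
    }
}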

Related

Multi File Download Not Working in Apache Camel

I am using Quartz and pollEnrich to download multiple files at once; one of the configuration options used is localWorkDirectory=/tmp/sftp_tmp/. I see a log entry that says
org.apache.camel.component.file.remote.SftpConsumer.processExchange - About to process file: RemoteFile[filename], but sometimes the file randomly ends up in /tmp/sftp_tmp/ instead of at the final destination c:/cameldata. I am using Camel version 2.24.1.
<route id="sftp-read">
<from uri="quartz2://sftp?stateful=true&trigger.repeatInterval=3s"/>
<pollEnrich><spel>file:c:/cameldata?include=SFTPRECFILE&noop=true&idempotent=false&readLock=markerFile</spel></pollEnrich>
<to uri="seda:startSftpExecutor?waitForTaskToComplete=Always&timeout=-1"/>
</route>
<route id="sftp-executor">
<from uri="seda:startSftpExecutor" />
<pollEnrich><spel>{{sftpUrl}}&sendEmptyMessageWhenIdle=true&binary=false&include=BR-.*&maximumReconnectAttempts=3&noop=true&maxMessagesPerPoll=1&localWorkDirectory=/tmp/sftp_tmp/&stepwise=false</spel></pollEnrich>
<choice>
<when>
<simple>${in.body} != null</simple>
<to uri="file:c:/cameldata?fileName=${file:onlyname}&tempPrefix=.tmp"/>
</when>
<otherwise>
<setBody><constant></constant></setBody>
<log loggingLevel="INFO" message="file(s) downloaded successfully" />
<to uri="file:c:/cameldata?fileName=LOADFILE" />
</otherwise>
</choice>
</route>

Batch insertion issue while shutting down camel

I am doing batch insertion in MySQL using MyBatis. I keep adding messages to an ArrayList and fire the query when the ArrayList has reached the batch size.
<route id="billingRoute">
<from uri="activemq:queue:{{billing.queue}}" />
<log message="Message received from billing queue is ${body}"></log>
<unmarshal ref="gsonBilling"></unmarshal>
<bean beanType="com.bng.upload.processors.GetCdr" method="process(com.bng.upload.beans.BillingEvent,${exchange})" />
log message="Multicasting data ${body} to file system and database" />
<multicast>
<pipeline>
<transform>
<method ref="insertionBean" method="billingBatchInsertion"></method>
</transform>
<choice>
<when>
<simple> ${body.size()} == ${properties:batch.size}</simple>
<to uri="mybatis:batchInsertBilling?statementType=InsertList"></to>
<log message="Inserted in billing table : ${in.header.CamelMyBatisResult}"></log>
</when>
</choice>
</pipeline>
<pipeline>
<choice>
<when>
<simple>${properties:billing.write.file} == true</simple>
<setHeader headerName="path">
<simple>${properties:billing.cdr.folder}</simple>
</setHeader>
<log message="Going to write to file : ${body}"></log>
<bean beanType="com.bng.upload.processors.ToFile"
method="process(${exchange},com.bng.upload.beans.BillingCdr)" />
<to uri="file://?fileExist=Append"></to>
</when>
</choice>
</pipeline>
</multicast>
</route>
</routeContext>
We are not using transactions, as there are multiple components involved in the route: files, ActiveMQ and the database. The issue is that when Tomcat is restarted, if the ArrayList has not reached the batch size, those messages are lost. Is there any solution to this?

Can we use multiple multicasts in Apache Camel?

I have a requirement where I want to use multicast in Apache Camel more than once in a single route, i.e. a multicast within a multicast.
<routeContext id="myRoute" xmlns="http://camel.apache.org/schema/spring">
<route id="myRouteId">
<from uri="activemq:queue:{{XXXX.queue}}" />
....
<multicast parallelProcessing="true">
<pipeline>
##everything working fine here
</pipeline>
<pipeline>
<multicast>
<pipeline>
<log message="Inserting in database now"></log>
<transform>
<method ref="insertBean" method="myBatchInsertion"></method>
</transform>
<choice>
<when>
<simple>${in.header.myCount} == ${properties:batch.size} </simple>
<to uri="sql:{{sql.core.insertMyQuery}}?batch=true"></to>
<log message="Inserted rows ${body}"></log>
</when>
</choice>
</pipeline>
</multicast>
</pipeline>
</multicast>
</route>
</routeContext>
Is it possible to do that?
When I try to do that, my program does not execute successfully.
Is the failure a result of the multiple multicasts?
Can anybody help?
I got the reference from the following link:
http://camel.apache.org/multicast.html
Why do you use pipeline? It "is" a pipeline by default.
Also, all the log, transform and choice statements can be put outside of the multicast. And since you are generating your endpoints dynamically, put the values in a header and use recipientList for dynamic endpoints; multicast is for hard-coded endpoints. Here is an example:
<route>
  <from uri="direct:a"/>
  <!-- use comma as a delimiter for String based values -->
  <recipientList delimiter=",">
    <header>myHeader</header>
  </recipientList>
</route>
http://camel.apache.org/recipient-list.html
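In the Java DSL the same idea is a header set before the recipient list. A minimal sketch, assuming placeholder endpoint URIs in the header value:

import org.apache.camel.builder.RouteBuilder;

public class RecipientListRouteBuilder extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        from("direct:a")
            // build the comma-separated list of target endpoints dynamically
            .setHeader("myHeader", constant("activemq:queue:one,activemq:queue:two"))
            // send the message to every endpoint listed in the header
            .recipientList(header("myHeader"), ",");
    }
}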

Can I pipeline within a content-based router?

Can I pipeline within a content-based router?
I have to pipeline beans within a content-based router. For that, I adopted the following configuration; I hope the configuration itself explains my requirement. Is it correct?
Do I also have to add the end() tag?
<route>
  <from uri="activemq:queue:injob"/>
  <choice>
    <when>
      <simple>${header.type} == 'heartbeat'</simple>
      <to uri="bean:heartBeatHandler"/>
      <to uri="activemq:queue:outjob"/>
    </when>
    <when>
      <simple>${header.type} == 'dnsrequest'</simple>
      <to uri="bean:dnsRequestHandler"/>
      <to uri="bean:parser"/>
      <to uri="activemq:queue:outjob"/>
    </when>
    <when>
      <simple>${header.type} == 'whoisrequest'</simple>
      <to uri="bean:whoisRequestHandler"/>
      <to uri="bean:parser"/>
      <to uri="activemq:queue:outjob"/>
    </when>
    <otherwise>
      <to uri="bean:errorHandler"/>
    </otherwise>
  </choice>
</route>
Yes, what you do is correct.
Camel runs in pipeline mode by default (the Pipes and Filters EIP, http://camel.apache.org/pipes-and-filters.html), but if you want you can make that explicit using <pipeline>, e.g.
<when>
  <simple>${header.type} == 'heartbeat'</simple>
  <pipeline>
    <to uri="bean:heartBeatHandler"/>
    <to uri="activemq:queue:outjob"/>
  </pipeline>
</when>
But very often you would omit the <pipeline> and do as in your example.
As opposed to pipeline there is multicast (http://camel.apache.org/multicast.html), and when you combine the two you may need to use <pipeline> explicitly, e.g.
<multicast>
  <pipeline>
    <to uri="bean:heartBeatHandler"/>
    <to uri="activemq:queue:outjob"/>
  </pipeline>
  <pipeline>
    <to uri="bean:somethingElse"/>
    <to uri="activemq:queue:somethingElse"/>
  </pipeline>
</multicast>
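The same combination in the Java DSL could look roughly like the sketch below. The bean and queue names are taken from the example above, and the end() placement may need adjusting for your Camel version:

import org.apache.camel.builder.RouteBuilder;

public class MulticastPipelinesRouteBuilder extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        from("activemq:queue:injob")
            .multicast()
                // first copy of the message goes through this pipeline
                .pipeline()
                    .to("bean:heartBeatHandler")
                    .to("activemq:queue:outjob")
                .end()
                // second copy goes through this one
                .pipeline()
                    .to("bean:somethingElse")
                    .to("activemq:queue:somethingElse")
                .end()
            .end();
    }
}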

sftp using camel

I am trying to use SFTP with Camel, and I am getting a JSch exception.
The route that I created for the SFTP:
<camelContext xmlns="http://activemq.apache.org/camel/schema/spring">
  <package>myGroupId</package>
  <route>
    <from uri="file:src/srcData?noop=true"/>
    <choice>
      <when>
        <xpath>/person/city = 'London'</xpath>
        <to uri="file:src/targetData/UK"/>
      </when>
      <when>
        <xpath>/person/city = 'Chicago'</xpath>
        <to uri="file:src/targetData/US"/>
      </when>
      <when>
        <xpath>/person/city = 'Tokyo'</xpath>
        <to uri="sftp://XXXserverXXX:22/dir1/subdir?username=testUser?password=testPwd&amp;binary=true"/>
      </when>
      <otherwise>
        <to uri="file:src/targetData/OT"/>
      </otherwise>
    </choice>
  </route>
</camelContext>
But with this configuration I am facing the following exception -
com.jcraft.jsch.JSchException: reject HostKey:
You should probably define a known hosts file:
The "knownHostsFile" option should point to an SSH known_hosts file containing the public key of the host you are connecting to.
It's documented over here: http://camel.apache.org/ftp2.html
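A minimal sketch of what the endpoint could look like with that option; the host and credentials are the placeholders from the question, and the known_hosts path is only an example:

import org.apache.camel.builder.RouteBuilder;

public class SftpUploadRouteBuilder extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        // knownHostsFile must point at a file that already contains the
        // server's public host key; the path below is just an example.
        from("direct:upload")
            .to("sftp://XXXserverXXX:22/dir1/subdir"
                + "?username=testUser&password=testPwd&binary=true"
                + "&knownHostsFile=/home/user/.ssh/known_hosts");
    }
}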
