I have a Camel route configured as below:
<route id="text-file-route">
<from
uri="file:files/merchant?antInclude={{include-file-type}}&initialDelay=1000&delay=1000&move=${file:name.noext}.processed" />
<split streaming="true">
<tokenize token="\n" />
<process ref="splitBatchAdapterProcessor" />
<process ref="merchantStreamProcessor" />
<process ref="merchantTableProcessor" />
<to uri="mock:dummy" />
</split>
</route>
With the current configuration the file gets renamed even if an exception is thrown while processing it.
What I want is: the file should only be renamed if no exception is thrown while processing the file.
I am using Camel 2.12.
You need to turn on shareUnitOfWork so the splitter propagates the exception back and the file consumer can roll back. You can read more about this at: http://camel.apache.org/splitter
<split streaming="true" shareUnitOfWork="true">
I am using Quartz and pollEnrich to download multiple files at once. One of the options used is localWorkDirectory=/tmp/sftp_tmp/. I see a log entry that says
org.apache.camel.component.file.remote.SftpConsumer.processExchange - About to process file: RemoteFile[filename]. But sometimes it leaves the file in /tmp/sftp_tmp/ instead of placing it at the final destination c:/cameldata. I am using Camel version 2.24.1.
<route id="sftp-read">
<from uri="quartz2://sftp?stateful=true&trigger.repeatInterval=3s"/>
<pollEnrich><spel>file:c:/cameldata?include=SFTPRECFILE&noop=true&idempotent=false&readLock=markerFile</spel></pollEnrich>
<to uri="seda:startSftpExecutor?waitForTaskToComplete=Always&timeout=-1"/>
</route>
<route id="sftp-executor">
<from uri="seda:startSftpExecutor" />
<pollEnrich><spel>{{sftpUrl}}&amp;sendEmptyMessageWhenIdle=true&amp;binary=false&amp;include=BR-.*&amp;maximumReconnectAttempts=3&amp;noop=true&amp;maxMessagesPerPoll=1&amp;localWorkDirectory=/tmp/sftp_tmp/&amp;stepwise=false</spel></pollEnrich>
<choice>
<when>
<simple>${in.body} != null</simple>
<to uri="file:c:/cameldata?fileName=${file:onlyname}&tempPrefix=.tmp"/>
</when>
<otherwise>
<setBody><constant></constant></setBody>
<log loggingLevel="INFO" message="file(s) downloaded successfully" />
<to uri="file:c:/cameldata?fileName=LOADFILE" />
</otherwise>
</choice>
</route>
So I have the following route (Camel 2.20.0).
I was working on a global <onException> block for a new route. For some reason it wasn't firing, so I moved the items into a doTry/doCatch within one specific route just to experiment with the error handling.
<camelContext xmlns="http://camel.apache.org/schema/spring" id="jobfeedCamelContext">
<route id="testError">
<from uri="timer://runOnce?repeatCount=1&delay=5000" />
<doTry>
<throwException exceptionType="java.lang.Throwable"/>
<to uri="errorBean"/> <!-- bean does nothing but explicitly throws java.lang.Throwable -->
<doCatch>
<exception>java.lang.Throwable</exception>
<log message="### exception" />
</doCatch>
</doTry>
<log message="### out of try" />
</route>
</camelContext>
For output I get the stack trace from the bean's java.lang.Throwable (but no stack trace is generated for the <throwException exceptionType="java.lang.Throwable"/>). I do not get my "### exception" log entry in any scenario, but I do get the "### out of try" log entry.
I have used this functionality in other routes on older versions of Camel, so I can't really see where I am going wrong. Anyone have any ideas? I've turned on route tracing and there is nothing helpful.
Give the throwException definition a message:
<throwException exceptionType="java.lang.Throwable" message="some text"/>
In the route below I am using both doTry/doCatch and onException.
If an exception occurs in my bean or in the lines outside the doTry block, the file is moved to ".error" because I used the moveFailed option. But when an exception thrown by the lines inside the doTry block is caught by the doCatch block, the file is lost, for example:
1. when the server is down
2. when the connection times out
Please suggest ways to preserve the file during such failures/exceptions.
<camelContext streamCache="false" useMDCLogging="true" id="XXX" xmlns="http://camel.apache.org/schema/spring">
<streamCaching spoolDirectory="/tmp/cachedir/#camelId#/#uuid#" spoolUsedHeapMemoryThreshold="70" bufferSize="65536" anySpoolRules="true" id="myCacheConfig"/>
<onException >
<description>An exception was encountered.</description>
<exception>java.lang.Exception</exception>
<log message="somemessage" loggingLevel="INFO"/>
</onException>
<route >
<from uri="file:D:/Users/Desktop/src?moveFailed=.error" />
<transform>
<method ref="somebean" method="somemethod"/>
</transform>
<doTry>
<to uri="file:D:/Users/Desktop/src" />
<log message="transfered successfully" />
<doCatch>
<exception>java.lang.Exception</exception>
<log message="Exception occurred and Stopping the Route"/>
<to uri="controlbus:route?routeId=XXXX&action=stop"/>
<log message="Stopped the Route:XXX"/>
</doCatch>
</doTry>
</route>
Rethrow the exception in your doCatch block so it reaches the onException block.
<doCatch>
<exception>java.lang.Exception</exception>
<log message="Exception occurred and Stopping the Route"/>
<to uri="controlbus:route?routeId=XXXX&action=stop&async=true"/> // async=true to resume/finish this route
<log message="Stopped the Route:XXX"/>
<throwException exceptionType="java.lang.Exception" message="rethrown from doCatch"/> <!-- throws a new exception; a sketch for rethrowing the original caught exception follows -->
</doCatch>
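If you want to rethrow the original caught exception rather than creating a new one, one option (a minimal sketch, not part of the answer above; the bean name rethrowProcessor is made up) is a small processor that pulls the caught exception off the exchange and throws it again, invoked from the doCatch with <process ref="rethrowProcessor"/>:
import org.apache.camel.Exchange;
import org.apache.camel.Processor;

// Sketch: rethrow the exception already caught by doCatch so that the
// onException block and the file consumer's moveFailed handling can still apply.
public class RethrowProcessor implements Processor {
    @Override
    public void process(Exchange exchange) throws Exception {
        Exception caught = exchange.getProperty(Exchange.EXCEPTION_CAUGHT, Exception.class);
        if (caught != null) {
            throw caught;
        }
    }
}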
Before explaining my problem, I should state that I am completely new to Camel file processing. I have a requirement to read files from a directory, do some processing, and delete them. This was a very high-level requirement and I am able to achieve it using Camel. But now I have some new requirements, stated below, and need help with them.
Create this application as a job and trigger it by reading another directory where some specific files will be dropped; otherwise it should kick off on its own every 15-20 minutes.
Before triggering the actual application, make sure that the directory has a specific number of files present (say 25 files).
If all files are present, execute a method that creates a unique tracking ID for all these 25 files. If I have a unique ID, how can I make it available across multiple routes?
So far I have tried implementing a RoutePolicy, but since I have never used one before I need some guidance on how to proceed.
1. Separate your route logic from route triggering
<route id="TriggerFromFile">
<from uri="file:triggerFolder" />
<log message="Triggered from file" />
<to uri="direct:startLogic" />
</route>
<route id="TriggerFromTimer">
<from uri="timer:triggerTimer?period=15m" />
<log message="Triggered from timer" />
<to uri="direct:startLogic" />
</route>
<route id="Logic>
<from uri="direct:startLogic" />
<to uri="..." />
</route>
2. Count the number of files and use that as a filter
Define a bean that counts the number of files in the directory, set that number as the message body, and validate it with a filter (a sketch of such a bean follows the routes below).
<route id="TriggerFromFile">
<from uri="file:triggerFolder" />
<log message="Triggered from file" />
<to uri="direct:countFile" />
</route>
<route id="TriggerFromTimer">
<from uri="timer:triggerTimer?period=15m" />
<log message="Triggered from timer" />
<to uri="direct:countFile" />
</route>
<route id="FileCount">
<from uri="direct:countFile" />
<to uri="bean:countFilesInDir" />
<log message="There are ${body} files the directory" />
<filter>
<simple>${body} >= 25</simple>
<to uri="direct:startLogic" />
</filter>
</route>
<route id="Logic">
<from uri="direct:startLogic" />
<to uri="..." />
</route>
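For completeness, here is a minimal sketch of the counting bean referenced above as countFilesInDir (the directory path is a placeholder, purely for illustration):
import java.io.File;

// Hypothetical bean behind the "bean:countFilesInDir" endpoint above.
// Camel puts the returned number into the message body, which the
// ${body} >= 25 filter then checks.
public class CountFilesInDir {

    private static final File DIR = new File("/data/input"); // illustrative path

    public int countFiles() {
        File[] files = DIR.listFiles(File::isFile);
        return files == null ? 0 : files.length;
    }
}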
3. Set a header on your Exchange's message before sending it to other routes
When Camel sends an Exchange between routes, its headers and properties are copied along with it.
Calculate a unique ID in some way (concatenate the file names, an MD5 of the content, a file modification timestamp, ...) and set it in a header.
A header can hold any Java object.
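For example (a minimal sketch; the bean name trackingIdGenerator and the header name trackingId are made up for illustration), a bean like this can supply the ID:
import java.util.UUID;

// Hypothetical bean that produces the tracking id; swap the UUID for a
// concatenation of file names or an MD5 of the content if you prefer.
public class TrackingIdGenerator {
    public String generate() {
        return UUID.randomUUID().toString();
    }
}
In the Logic route, setting it once with <setHeader headerName="trackingId"><method ref="trackingIdGenerator" method="generate"/></setHeader> makes the ID travel with the exchange, so any later route can read it via ${header.trackingId}.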
I wrote the following route and expected the bean 'teaserService' to be called only once, at the end of processing all the files, but it is called after processing each file:
<route id="teaserInterface">
<from
uri="file://{{teaser.dropInDir}}?readLock=changed&delete=true&delay=60000" />
<choice>
<when>
<simple>${file:ext} == 'properties'</simple>
<to uri="file://{{teaser.config.directory}}" />
</when>
<when>
<simple>${file:ext} == 'jpg' || ${file:ext} == 'JPG'</simple>
<to uri="sftp://{{apache.ftp.user}}#{{apache.ftp.host}}/{{apache.teaser.ftp.targetDir}}?password={{apache.ftp.password}}&binary=true&separator=UNIX" />
</when>
<otherwise>
<transform>
<simple>Dear user,\n\n the Teaser interface only accept *.jpg and *.properties files, but we found the file ${file.name}.\n\n Have a nice day,\nYour lovely Teaser interface</simple>
</transform>
<to
uri="smtp://smtp.blabla.com?contentType=text/plain&from=blabla#blabla.com&to=chica#chicas.com&subject=A problem occured while setting up new teaser!" />
</otherwise>
</choice>
<bean ref="teaserService" method="updateTeaser" />
</route>
How can I achieve such behavior?
Thanks
The Camel file component is a batch consumer and adds properties to the exchange about the batch it is processing. You can test for the property CamelBatchComplete and, if it is set to true, call your bean.
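A minimal sketch of that idea (shown in Java DSL just for brevity; the endpoint and bean names come from the question, the rest is illustrative) wraps the bean call in a filter on the batch-complete property:
import org.apache.camel.builder.RouteBuilder;

// Sketch: only call teaserService on the last exchange of the current poll,
// i.e. when the batch consumer sets CamelBatchComplete to true.
public class TeaserRouteSketch extends RouteBuilder {
    @Override
    public void configure() {
        from("file://{{teaser.dropInDir}}?readLock=changed&delete=true&delay=60000")
            // ... the existing choice/when handling of *.properties and *.jpg files ...
            .filter(exchangeProperty("CamelBatchComplete").isEqualTo(true))
                .to("bean:teaserService?method=updateTeaser")
            .end();
    }
}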
If you want to proceed only after all files have been read, you must collect them somehow. This can be achieved using the aggregator pattern:
<route>
<from uri="file://src/data/aggregate-and-process?readLock=changed&delete=true&delay=60000" />
<aggregate strategyRef="aggregationStrategy" completionFromBatchConsumer="true">
<correlationExpression>
<constant>true</constant>
</correlationExpression>
<to uri="direct:sub" />
</aggregate>
</route>
<route>
<from uri="direct:sub" />
<!-- processing aggregated body -->
</route>
Please note that I set completionFromBatchConsumer="true". From the Camel documentation:
This option is if the exchanges are coming from a Batch Consumer. Then when enabled the Aggregator2 will use the batch size determined by the Batch Consumer in the message header CamelBatchSize. [...] This can be used to aggregate all files consumed from a File endpoint in that given poll.
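The route above references strategyRef="aggregationStrategy" without showing the bean. A minimal sketch of such a strategy (purely illustrative; it collects every file body into a list, which direct:sub then receives as its message body; note that in Camel 3 the interface moved to org.apache.camel.AggregationStrategy):
import java.util.ArrayList;
import java.util.List;

import org.apache.camel.Exchange;
import org.apache.camel.processor.aggregate.AggregationStrategy;

// Sketch: collect the body of each consumed file into a single list.
public class FileListAggregationStrategy implements AggregationStrategy {

    @Override
    public Exchange aggregate(Exchange oldExchange, Exchange newExchange) {
        if (oldExchange == null) {
            // first file of the poll: start the list
            List<Object> bodies = new ArrayList<>();
            bodies.add(newExchange.getIn().getBody());
            newExchange.getIn().setBody(bodies);
            return newExchange;
        }
        @SuppressWarnings("unchecked")
        List<Object> bodies = oldExchange.getIn().getBody(List.class);
        bodies.add(newExchange.getIn().getBody());
        return oldExchange;
    }
}
Camel also ships GroupedExchangeAggregationStrategy out of the box if you would rather aggregate the whole exchanges instead of just the bodies.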