KahaDB message store to persist files in Camel - apache-camel

This is my Camel Route:
<route>
  <from uri="file:///c:/"/>
  <to uri="file:///D:/"/>
</route>
In case of any failure in this route, I want to persistently store the files in KahaDB so that they won't be lost. But I am not aware of the blueprint.xml configuration needed for KahaDB persistence of files. My activemq.xml file is as follows:
<broker brokerName="kahaDB_Persistence" persistent="true" useShutdownHook="false">
  <persistenceAdapter>
    <kahaDB directory="${data}/kahadb/"
            journalMaxFileLength="100mb"
            concurrentStoreAndDispatchQueues="false"
            concurrentStoreAndDispatchTopics="false"/>
  </persistenceAdapter>
</broker>
Please advise how to connect to this KahaDB from blueprint.xml, considering the above-mentioned route.

Camel's file component has a built-in archive feature that saves files after they have been processed. By default it moves them into a folder called .camel, but this can be changed with the move configuration option.
I would not recommend using KahaDB, as it doesn't fit the "right tool for the job" mantra.
Camel File component docs
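As a sketch of that approach (the directory names below are illustrative, not from the question): successfully processed files are archived via move, and files whose processing failed are kept via moveFailed, so nothing is lost on error. Note the &amp;amp; escaping required inside XML attribute values:

```xml
<!-- Hypothetical directories; move/moveFailed are standard file-component options -->
<route>
  <from uri="file:///c:/inbox?move=.done&amp;moveFailed=.error"/>
  <to uri="file:///D:/outbox"/>
</route>
```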

Related

File append issue for an SFTP producer - Apache Camel

I am trying to append data to a file on an SFTP server using the Apache Camel SFTP component, but the data is not appended; a new file is created instead. I am using this file append inside a split. Below is the snippet used to transfer the data. Any suggestions?
<to uri="sftp://{{rue.sftp.username}}@{{rue.sftp.host}}:{{rue.sftp.port}}/{{rdx.FileLocation}}?password={{rue.sftp.password}}&amp;fileName=${exchangeProperty.failedFileName}&amp;fileExist=Append" />

Error message "unable to create new native thread" using SMB endpoint

I'm getting a java.lang.OutOfMemoryError: unable to create new native thread error during the transfer of around 20 files using an SMB endpoint.
Camel version 2.13.
The route itself is quite simple:
<route id="Filetransfer">
  <from uri="ftp://user@server//source/map?password=pwd&amp;include=fileA.*.csv|fileB.*.csv|fileC.*.csv|fileD.*.csv|fileE.*.csv|fileF.*.csv&amp;move=save&amp;consumer.delay=30000" />
  <log message="${routeId}: ${header.CamelFileName}" />
  <to uri="smb://domain;user@server/target/map?password=pwd"/>
</route>
When I check the number of threads in the Hawtio dashboard, the thread count peaks at 1000. The route executes correctly when only small files are transferred. When bigger files (>5 MB, >100,000 lines) are transferred, the route gives the error.
When I replace the SMB endpoint with a FILE endpoint like <to uri="file:///tmp/camel"/> the route also executes correctly, and all the files are transferred.
Splitting the file per line first and then using the Append option in the SMB endpoint causes the same error.
What can I do to make the SMB endpoint work, regardless of the size of the files?

Apache Camel Batch FTP Upload then disconnect

My use case is to periodically poll a local directory for new files and then upload them to an FTP server in one connection. The Camel route is defined in Spring XML as follows:
<route>
  <from uri="file:data/inbox?noop=true&amp;delay=1000&amp;maxMessagesPerPoll=3" />
  <to uri="ftp:uid:xxxxx@host:21/data?disconnect=false"/>
</route>
The route is functioning well, except that the FTP connection remains connected until the FTP server times out my connection. I want to reuse the same connection to upload a batch of files and then close the connection immediately after the last file in the batch has finished uploading. How can I achieve this in Camel?
This is not possible currently. You will need to write some code to do the disconnect yourself.
You are welcome to log a JIRA to enhance this in camel-ftp: https://issues.apache.org/activemq/browse/CAMEL. For example, a new option such as disconnectOnBatchComplete.
There might be a way, but it is not pretty.
You could wrap your route in a cron-based scheduled route policy. Say you kick-start the route once every hour, poll the directory, and send the files; then you simply add a stop(). I am not sure whether stop is exactly the same in the XML DSL. Alternatively, you could register an onExchangeComplete processor that stops the route via exchange.getContext().stopRoute(routeId). Again, this depends on whether your requirements allow it.
<route>
  <from uri="file:data/inbox?noop=true&amp;delay=1000&amp;maxMessagesPerPoll=3" />
  <to uri="ftp:uid:xxxxx@host:21/data?disconnect=false"/>
  <stop/>
</route>
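The processor suggestion above can be sketched in Java against the Camel 2.x API (the route id "batchRoute" is hypothetical; this is not a drop-in implementation). One caveat: a route should be stopped from a separate thread, because stopping it from inside one of its own in-flight exchanges can block on that exchange:

```java
// Sketch only, assuming Camel 2.x: stop the route once the exchange has completed
exchange.addOnCompletion(new SynchronizationAdapter() {
    @Override
    public void onDone(Exchange exchange) {
        final CamelContext context = exchange.getContext();
        // Stop from a separate thread to avoid blocking on the current exchange
        new Thread(new Runnable() {
            public void run() {
                try {
                    context.stopRoute("batchRoute"); // hypothetical route id
                } catch (Exception e) {
                    // log the failure to stop the route
                }
            }
        }).start();
    }
});
```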

How can I create a Camel global object?

I need a global object for all routes, processors, and components, in which I would save configuration parameters. But I don't know how and where to set such a global object, or how to read it in my own processors and components.
I create the Camel context in Spring and have a RouteBuilder to build my routes.
Thank you
If you only want to parameterize your routes, you may use PropertyPlaceholderConfigurer, see here:
<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer"/>
<camelContext xmlns="http://activemq.apache.org/camel/schema/spring">
  <route>
    <from uri="activemq:${someQueueName}"/>
    <to uri="mock:results"/>
  </route>
</camelContext>
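For completeness, the configurer needs to be pointed at a properties file that supplies the placeholder values; the file name and property below are illustrative, not from the answer:

```xml
<!-- Hypothetical location; app.properties would contain e.g. someQueueName=orders.incoming -->
<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
  <property name="location" value="classpath:app.properties"/>
</bean>
```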
Alternatively, you may use the ApplicationContextRegistry, which allows you to look up beans in the Spring ApplicationContext. This implementation is used automatically when you run Camel in a Spring environment; see here. E.g., access the registry as follows:
String myValue = exchange.getContext().getRegistry().lookupByNameAndType("myKey", String.class);
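With the registry approach, the "global object" can simply be a Spring bean; the bean name and class below are assumptions for illustration:

```xml
<!-- Hypothetical configuration holder, visible to every route and processor via the registry -->
<bean id="globalConfig" class="com.example.GlobalConfig">
  <property name="batchSize" value="100"/>
</bean>
```

Any processor can then fetch it with exchange.getContext().getRegistry().lookupByNameAndType("globalConfig", GlobalConfig.class).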

How to configure an Apache Camel Quartz endpoint to use JDBCJobStore

I have configured a Quartz endpoint for the scheduling requirement. However, the trigger information is currently hard-coded in my XML route configuration. As per the requirement, the trigger information needs to come from the DB.
<camel:route>
  <camel:from uri="quartz://commandActions/MSFI?cron=15+17+13+?+*+MON-SUN+*" />
  <camel:bean ref="userGateway" method="generateCommand" />
  <camel:to uri="wmq:SU.SCHEDULER" />
</camel:route>
The Quartz documentation says jobs and triggers can be stored in a database and accessed using JDBCJobStore. Is it possible to configure the Camel Quartz endpoint to use JDBCJobStore? I tried to find an example but couldn't. If someone has implemented this before, kindly share an example.
Thanks,
Vaibhav
Yes, see the Quartz documentation for how to configure it to use a JDBC job store. You can do this using a quartz.properties file, which you can tell Camel to use.
See the Camel side part here: http://camel.apache.org/quartz at the section Configuring quartz.properties file
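A minimal quartz.properties sketch for a JDBC job store; the datasource name, driver, URL, and credentials below are placeholders to adapt, and the table prefix must match the Quartz DDL scripts run against your database:

```properties
# Persist jobs and triggers in a database instead of RAM
org.quartz.jobStore.class = org.quartz.impl.jdbcjobstore.JobStoreTX
org.quartz.jobStore.driverDelegateClass = org.quartz.impl.jdbcjobstore.StdJDBCDelegate
org.quartz.jobStore.dataSource = myDS
org.quartz.jobStore.tablePrefix = QRTZ_

# Hypothetical datasource settings; replace with your DB details
org.quartz.dataSource.myDS.driver = com.mysql.jdbc.Driver
org.quartz.dataSource.myDS.URL = jdbc:mysql://localhost:3306/quartz
org.quartz.dataSource.myDS.user = quartz
org.quartz.dataSource.myDS.password = secret
```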
