<route id="readCSV">
<from uri="file:inbox?noop=true&amp;delay=10&amp;fileName=b.csv&amp;delete=true"/>
<log message="${body}"/>
<unmarshal ref="csvDataFormat"/>
<process ref="listRead"/>
<to uri="direct:ins"/>
</route>
I have this route. The file is read correctly, but when I try to transform it into a Java object, the resulting list is full of nulls.
What is my problem?
An example: we have a CSV file with names of persons, their IQ and their current activity.
Jack Dalton, 115, mad at Averell
Joe Dalton, 105, calming Joe
William Dalton, 105, keeping Joe from killing Averell
Averell Dalton, 80, playing with Rantanplan
Lucky Luke, 120, capturing the Daltons
We can now use the CSV component to unmarshal this file:
from("file:src/test/resources/?fileName=daltons.csv&noop=true")
.unmarshal().csv()
.to("mock:daltons");
The resulting message will contain a List&lt;List&lt;String&gt;&gt; like:
List<List<String>> data = (List<List<String>>) exchange.getIn().getBody();
for (List<String> line : data) {
    LOG.debug(String.format("%s has an IQ of %s and is currently %s",
            line.get(0), line.get(1), line.get(2)));
}
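If the goal is a list of Java objects rather than a List&lt;List&lt;String&gt;&gt;, each row still has to be mapped by hand (or via an annotated Bindy/CsvRecord model). A minimal sketch of the row-to-POJO step, with hypothetical names; note the trim() calls, since untrimmed fields are a common source of nulls and parse failures when mapping rows:

```java
import java.util.List;

public class CsvToPerson {
    // Illustrative holder for one CSV row.
    public record Person(String name, int iq, String activity) {}

    // Map one unmarshalled CSV row (a List<String>) to a Person.
    // Values must be trimmed: " 115" does not parse as an int.
    public static Person toPerson(List<String> row) {
        return new Person(row.get(0).trim(),
                          Integer.parseInt(row.get(1).trim()),
                          row.get(2).trim());
    }

    public static void main(String[] args) {
        List<List<String>> data = List.of(
                List.of("Jack Dalton", " 115", " mad at Averell"),
                List.of("Lucky Luke", " 120", " capturing the Daltons"));
        for (List<String> row : data) {
            Person p = toPerson(row);
            System.out.println(p.name() + " has an IQ of " + p.iq());
        }
    }
}
```
A processor like listRead would do the same mapping over the unmarshalled body and set the resulting List&lt;Person&gt; back on the exchange.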
I'm trying to implement a simple streaming pipeline:
Get the list of users from a remote REST endpoint, splitting the list into individual messages
For each user I have to enrich it with information from a SQL parametric table (departments), e.g.:
Initial message (User 1)
id:1
departmentId: 1
parentDepartmentId: 23
Departments
Department 1
id: 1
name: Dep1
Department 2
id: 23
name: Dep23
Enriched User 1
id:1
departmentId: 1
departmentName: Dep1
parentDepartmentId: 23
parentDepartmentName: Dep23
Route context:
<routeContext id="route1" xmlns="http://camel.apache.org/schema/spring">
<route id="route1">
<from id="_from1" uri="timer:mytimer"/>
<setHeader headerName="CamelHttpMethod">
<constant>POST</constant>
</setHeader>
<setHeader headerName="Content-Type">
<constant>application/json</constant>
</setHeader>
<setBody>
<simple>{ "new": "true" }</simple>
</setBody>
<to id="_apiCall1" uri="https4://myapi/v1/users/search"/>
<split streaming="true">
<simple>${body}</simple>
<to uri="direct:processNewUser"/>
</split>
</route>
<route id="processNewUser">
<from uri="direct:processNewUser"/>
<enrich strategyRef="myAggregationStrategy">
<simple>sql:select * from departments</simple>
</enrich>
</route>
</routeContext>
I need to get the whole departments table to be able to enrich the user information but I want to avoid doing so for every message.
Is there a way to store the content of the sql query and reuse it during the enrichment phase?
Can't you get the departments first? It seems you could do that. Then, you'd set the result to an exchange property for instance and use it to enrich each user.
There are actually many ways to achieve that.
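For instance, the lookup that the aggregation strategy would perform can be reduced to plain map work (all names here are illustrative): fetch the departments once, index them by id, and copy the names onto each user:

```java
import java.util.HashMap;
import java.util.Map;

public class EnrichUser {
    // Sketch of the merge an aggregation strategy could perform:
    // look up the user's department and its parent department in a
    // map fetched once, and copy the names onto the user.
    public static Map<String, Object> enrich(Map<String, Object> user,
                                             Map<Integer, String> departmentNames) {
        Map<String, Object> out = new HashMap<>(user);
        out.put("departmentName", departmentNames.get(user.get("departmentId")));
        out.put("parentDepartmentName", departmentNames.get(user.get("parentDepartmentId")));
        return out;
    }

    public static void main(String[] args) {
        // Departments table, loaded once (e.g. stored on an exchange property
        // or a cached bean) instead of queried per user.
        Map<Integer, String> departments = Map.of(1, "Dep1", 23, "Dep23");
        Map<String, Object> user = Map.<String, Object>of(
                "id", 1, "departmentId", 1, "parentDepartmentId", 23);
        System.out.println(enrich(user, departments));
    }
}
```
In the route, a bean holding that map (refreshed on whatever schedule suits the data) replaces the per-message sql: enrich call.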
I have created multiple routes (say Department and Employee) which take input from file system folders (say department and employee) and process those files.
Now I want to make them dependent: if I upload both emp.csv and dept.csv to those folders, the department file should be processed first, and only once it completes should the employee file be processed.
Is there any way in Camel to achieve this?
I looked at the route startupOrder and autoStartup features, but they only apply when the routes are first started. I need the same behavior for the entire route lifetime.
Thanks.
<route id="b" xmlns="http://camel.apache.org/schema/spring">
<from uri="file:/home/dev/code/Integration/RunCamleExample/src/main/resources/csv/Department?repeatCount=1&amp;noop=true&amp;delay=10000"/>
<log message="Department data is : ${body}"/>
</route>
<route id="employee" xmlns="http://camel.apache.org/schema/spring">
<from uri="file:/home/dev/code/Integration/RunCamleExample/src/main/resources/csv/Employee?noop=true&amp;delay=10000"/>
<log message="Employee data is : ${body}"/>
</route>
I suggest using other logic to handle the task. Two simple ways to go:
Use pollEnrich
Use pollEnrich to collect an extra resource (e.g. a file with a known name in the file system) once, in the middle of a route.
Flow: Collect department files (from endpoint) --(for each department file from the file system)-> collect the single employee file (trigger pollEnrich once with the known name) ----> do anything else (if any)
Use ControlBus
Use the ControlBus component to control the status of the routes (only one route in 'started' status at a time).
Flow: Start route A --(when route A completes its goal)-> suspend route A ---> start route B --(when route B completes its goal)-> suspend route B ---> start route A [loop back to head]
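As a rough sketch of the pollEnrich approach, using the folders from the question (the file name emp.csv and the 5s timeout are assumptions), the department route pulls the employee file only after a department file has arrived:

```xml
<route id="departmentThenEmployee" xmlns="http://camel.apache.org/schema/spring">
  <from uri="file:/home/dev/code/Integration/RunCamleExample/src/main/resources/csv/Department?noop=true&amp;delay=10000"/>
  <log message="Department data is : ${body}"/>
  <!-- Pull the employee file by its known name only after the department
       file was consumed; give up after 5s if it is not there yet. -->
  <pollEnrich uri="file:/home/dev/code/Integration/RunCamleExample/src/main/resources/csv/Employee?fileName=emp.csv&amp;noop=true" timeout="5000"/>
  <log message="Employee data is : ${body}"/>
</route>
```
Note that pollEnrich replaces the current body with the polled file by default; add an aggregation strategy if both payloads are needed downstream.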
Dependent route execution can also be arranged in Camel using a routeContext.
Example: if route 'A' must execute before route 'B', then route 'A' is defined in a routeContext and route 'B' is defined inside the camelContext, like below:
<routeContext id="A" xmlns="http://camel.apache.org/schema/spring">
<route id="A">
<from uri="file:/home/dev/code/Integration/RunCamleExample/src/main/resources/csv/Department?repeatCount=1&amp;noop=true&amp;delay=10000"/>
<log message="Department data is : ${body}"/>
</route>
</routeContext>
Then regular "camelContext" should be defined with reference to this routeContext first.
<camelContext id="test" xmlns="http://camel.apache.org/schema/spring">
<routeContextRef ref="A"/>
<route id="B">
<from uri="file:/home/dev/code/Integration/RunCamleExample/src/main/resources/csv/Employee?noop=true&amp;delay=10000"/>
<log message="Employee data is : ${body}"/>
</route>
</camelContext>
I have a proxy which takes a RequestBean as an argument, which contains a list property, and I need to split this list property using the split EIP.
I tried
<split streaming="true">
<simple>${body}</simple>
<process ref="requestHeaderProcessor" />
</split>
My Complete route is
<route id="httpBridge">
<from uri="cxf:bean:splitterOperation?dataFormat=POJO" />
<split streaming="true">
<simple>${body}</simple>
<bean ref="requestHeaderProcessor" method="process" />
</split>
<to uri="cxf:bean:realService" />
</route>
My Proxy Service method signature is
public List<ResponseBean> splitList(List<RequestContent> requestBean);
ResponseBean.java
public class ResponseBean {
    private String name;
}
RequestBean.java
public class RequestBean {
    private List<RequestContent> list;
}
The processor is not receiving individual RequestContent objects; I want the processor to receive each RequestContent individually.
I tried printing the following line:
System.out.println(exchange.getIn().getBody().getClass().getName());
and got java.util.ArrayList, so the body is definitely iterable.
But when I print
System.out.println(exchange.getIn().getBody());
I get
[webservice.RequestContent#10128f3, webservice.RequestContent#1277137]
which is the list of all the RequestContent objects.
Why am I seeing the whole list in the bean? According to the split definition, the Exchange should contain only one RequestContent (since it processes sequentially).
Where am I making a mistake, or is this the way it works? How can I make sure it splits the content?
Whatever this expression returns is what is used for splitting:
<simple>${body.requestBean.requestContent}</simple>
So make sure it is a List or an array, or something that can be iterated.
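Conceptually the splitter just iterates whatever that expression resolves to and hands each element to the inner processors one at a time. A plain-Java sketch of that, with a hypothetical bean whose getter the simple expression would call:

```java
import java.util.List;

public class SplitOnProperty {
    // Hypothetical request bean with a typed list property; a splitter
    // expression like ${body.requestContent} resolves to getRequestContent().
    public static class RequestBean {
        private final List<String> requestContent;
        public RequestBean(List<String> requestContent) {
            this.requestContent = requestContent;
        }
        public List<String> getRequestContent() {
            return requestContent;
        }
    }

    public static void main(String[] args) {
        RequestBean body = new RequestBean(List.of("content-1", "content-2"));
        // What the split EIP does conceptually: evaluate the expression,
        // then process each element as its own exchange, one at a time.
        for (String item : body.getRequestContent()) {
            System.out.println("processing " + item);
        }
    }
}
```
If the expression instead resolves to the whole bean (as with a bare ${body}), the "split" produces a single exchange whose body is still the list, which matches the behavior described in the question.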
I have the following route in Camel:
<route>
<from uri="servlet:///user?matchOnUriPrefix=true"/>
<setHeader headerName="cachename">
<simple>${header.CamelHttpPath.split("/")[1]}</simple>
</setHeader>
<setHeader headerName="key1">
<simple>${header.CamelHttpPath.split("/")[2]}</simple>
</setHeader>
<to uri="direct:put"/>
</route>
<route>
<from uri="direct:put" />
<!-- If using version 2.8 and above set headerName to "CamelHazelcastOperationType" -->
<setHeader headerName="CamelHazelcastOperationType">
<constant>put</constant>
</setHeader>
<setHeader headerName="CamelHazelcastObjectId">
<simple>${header.key1}</simple>
</setHeader>
<process ref="requestTimeProc"/>
<to uri="hazelcast:map:default" />
</route>
I am trying to put a String as the value into the cache.
My HazelCast Configuration is :
<map name="default">
<in-memory-format>OBJECT</in-memory-format>
<backup-count>1</backup-count>
<async-backup-count>0</async-backup-count>
<time-to-live-seconds>0</time-to-live-seconds>
<max-idle-seconds>0</max-idle-seconds>
<eviction-policy>NONE</eviction-policy>
<max-size policy="PER_NODE">0</max-size>
<eviction-percentage>25</eviction-percentage>
<merge-policy>com.hazelcast.map.merge.PassThroughMergePolicy</merge-policy>
</map>
<serialization>
<portable-version>0</portable-version>
</serialization>
I am getting the following exception:
2013-12-10 22:47:08,288 [tp-1763826860-0] ERROR DefaultErrorHandler - Failed delivery for (MessageId: ID-DJD7W4R1-54721-1386740808850-0-1 on ExchangeId: ID-DJD7W4R1-54721-1386740808850-0-2). Exhausted after delivery attempt: 1 caught: com.hazelcast.nio.serialization.HazelcastSerializationException: There is no suitable serializer for class org.apache.camel.converter.stream.InputStreamCache
com.hazelcast.nio.serialization.HazelcastSerializationException: There is no suitable serializer for class org.apache.camel.converter.stream.InputStreamCache
at com.hazelcast.nio.serialization.SerializationServiceImpl.toData(SerializationServiceImpl.java:172)[hazelcast-3.1.2.jar:3.1.2]
at com.hazelcast.nio.serialization.SerializationServiceImpl.toData(SerializationServiceImpl.java:157)[hazelcast-3.1.2.jar:3.1.2]
at com.hazelcast.map.MapService.toData(MapService.java:666)[hazelcast-3.1.2.jar:3.1.2]
at com.hazelcast.map.proxy.MapProxyImpl.put(MapProxyImpl.java:72)[hazelcast-3.1.2.jar:3.1.2]
at com.hazelcast.map.proxy.MapProxyImpl.put(MapProxyImpl.java:60)[hazelcast-3.1.2.jar:3.1.2]
at org.apache.camel.component.hazelcast.map.HazelcastMapProducer.put(HazelcastMapProducer.java:136)[camel-hazelcast-2.11.2.jar:2.11.2]
at org.apache.camel.component.hazelcast.map.HazelcastMapProducer.process(HazelcastMapProducer.java:71)[camel-hazelcast-2.11.2.jar:2.11.2]
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:61)[camel-core-2.11.2.jar:2.11.2]
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)[camel-core-2.11.2.jar:2.11.2]
at org.apache.camel.processor.SendProcessor$2.doInAsyncProducer(SendProcessor.java:122)[camel-core-2.11.2.jar:2.11.2]
Please let me know if I am missing anything in the configuration. I am using the default Hazelcast configuration provided in the Hazelcast jar. Any help will be highly appreciated.
A few comments:
I would not change the 'default' map. This means that any map in the system which doesn't have an explicit configuration will now use this configuration. So figure out which map is used and configure that map explicitly:
<map name="explicitName">
<in-memory-format>OBJECT</in-memory-format>
....
</map>
About the exception:
com.hazelcast.nio.serialization.HazelcastSerializationException: There is no suitable serializer for class org.apache.camel.converter.stream.InputStreamCache
It means that an InputStreamCache is being put in the map, but Hazelcast doesn't know how to convert it to a stream of bytes. You can plug in a serializer for this class; see the following blog post for how to do so:
http://blog.hazelcast.com/blog/2013/10/16/kryo-serializer/
The big question for me is: why are you trying to put an InputStreamCache in a map? My gut feeling says that this class is not something you want to put in a distributed Hazelcast map at all.
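If the goal is simply to cache the request payload as text, one Camel-side fix (a sketch, assuming a String value is acceptable) is to materialize the streamed body before the Hazelcast endpoint, so a serializable String is stored rather than the stream cache:

```xml
<route>
  <from uri="direct:put"/>
  <setHeader headerName="CamelHazelcastOperationType">
    <constant>put</constant>
  </setHeader>
  <setHeader headerName="CamelHazelcastObjectId">
    <simple>${header.key1}</simple>
  </setHeader>
  <!-- Read the stream cache into a plain String so Hazelcast can serialize it -->
  <convertBodyTo type="java.lang.String"/>
  <to uri="hazelcast:map:default"/>
</route>
```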
I have a camel route:
from("file:///u01/www/images/nonprofits-test?move=.done&preMove=.processing&filter=#nonpFileFilter&minDepth=2&recursive=true")
Later on in the route I need to access the original file name. How do I get that information? All of the headers I see contain information like ${file:name}, but not the actual file name.
Thanks in advance!
The base problem is that the simple language is not being evaluated correctly while running Camel with Grails. This is being discussed further on the Camel user list.
There is a header called "CamelFileName" that stores this; see the camel-file2 headers section for more details.
If your simple language is not working, it may be because you are not using the <simple> tag; try something like below.
<route id="movedFailedFileForRetry">
<from uri="file:///opt/failed?delete=true" />
<log loggingLevel="INFO" message="Moving failed file ${header.CamelFileName} for retry" />
<choice>
<when>
<simple>${headers.CamelFileName} == 'file1.txt'</simple>
<to uri="file:///opt/input1" />
</when>
<otherwise>
<to uri="file:///opt/input2" />
</otherwise>
</choice>
</route>
Hope it helps!!
${headers.CamelFileName} will give you the name of the file currently being processed. There are many other header properties you can find in the Camel documentation.