Multiple routes in Apache Camel - best practice

I have implemented an Apache Camel scheduler which performs a task at a fixed interval. The number of tasks to be performed has grown, and I am unsure whether to continue with the same approach or to create multiple route builders.
The current approach: call the DB, get all the configured REST details, and iterate over them in the configure method of the RouteBuilder to build the routes.
Code sample:
public void configure() {
    for (int i = 0; i < list.length; i++) {
        from("quartz://myTimer?trigger.repeatInterval=2000&trigger.repeatCount=-1")
            .setBody().simple("Current time is ${header.firedTime}")
            .to("stream:out");
    }
}
Here I have only one RouteBuilder class; its configure method contains the for loop which creates the multiple routes.

This seems to be an approach that won't scale well. If you have 10 or 100 'list' items - fine. If you have 1,000, 10,000 or 1,000,000 routes defined in the RouteBuilder - will that work? I don't know.
A Camel route is meant to model the data flow, not to represent the data itself; the data should flow through the routes.
I would remodel your solution: create an in-memory queue via the seda: endpoint and put the x items of data (each can easily be a java.lang.Integer) onto seda:myqueue.
Then have a second route which consumes asynchronously from seda:myqueue and processes the data, with whichever timings you choose.
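For illustration, a minimal sketch of that shape; the endpoint names, the concurrentConsumers setting and the two helper methods are assumptions, not something from the question:
// One route runs on the schedule, loads the configured work items and feeds
// them into the in-memory SEDA queue, one exchange per item.
from("quartz://myTimer?trigger.repeatInterval=2000&trigger.repeatCount=-1")
    .process(exchange -> exchange.getIn().setBody(loadConfiguredRestDetails())) // hypothetical helper
    .split(body())
    .to("seda:myqueue");

// A second route consumes asynchronously from the queue and does the actual work.
from("seda:myqueue?concurrentConsumers=5")
    .process(exchange -> callConfiguredRestService(exchange.getIn().getBody())); // hypothetical helper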

Apache Camel - What is most Camel way to send errors and good object simultaneously from Component Endpoint?

Edit: To be specific, I think I am asking for an elegant way to specify 2 endpoints to which I want to send 2 different exchanges produced from a single input exchange.
I know Camel 'can' do this - but I only have inelegant methods involving sending an object which contains both types to a multicast(), with processors on each branch removing the part that branch does not need.
I expect there to be potentially multiple error messages with source objects attached. I could just throw them each as exceptions, but this feels incorrect. I'm wondering what the 'correct' approach might be. I almost just want to be able to specify an error endpoint as a target for my component.
Currently I have
camel.addComponent("my", new MyComponent());

from(someSource)
    ... // processing: Lists of input objects as the body of the in message
    ... onException()
        .to("my:endpoint")

MyComponent <-- MyEndpoint <-- MyProducer
I want to process the items in each List object that arrives at MyProducer. I process the elements and send failing items out to one endpoint and good items out to another.
I do not see a good / elegant way of achieving this. If it were single elements (i.e. not a collection) I could just throw an exception and catch it in an onException route.
But I really want to be able to take the items, send the good ones one way and the bad ones another.
In other words, I want to simultaneously send 2 different messages to 2 different endpoints from the same input to an Endpoint. (The endpoint isn't actually so important here; it is just that I am writing one - it could be any Processor.)
I know I could make a decomposable object with good and bad items on it, then multicast and process each good and bad section out on different branches, but I would really like a succinct, reusable mechanism (e.g. built into a Processor or endpoint).
In Camel, the stuff between the from() and the to() is an Exchange, which is treated as a single message. The component should just be a factory for Endpoint instances, which either create or send exchanges. It shouldn't be processing the message (which is what it sounds like here), so there isn't really a concept of errors or good objects at that level; that is for the route and its Processors/Beans to decide.
If you want to do it all within a single exchange, you can simply have your processor add 2 lists as exchange properties and route based on their contents:
from("direct:somesource")
.process(new Processor() {
public void process(Exchange exchange) {
// get contents,
// turn it into a list
//
List<Object> worked = new ArrayList<>();
List<Object> failed = new ArrayList<>();
for (Object o : listFromBody) {
try {
// do work
worked.add(o);
} catch (Exception e) {
failed.add(o)
}
}
exchange.getIn().setProperty("worked", worked);
exchange.getIn().setProperty("failed", failed);
}
};)
.choice()
.when(header("failed").isNotEqualTo(null)) // might be a better way to do this, I've not got the IDE open
.process(turnFailedHeaderIntoMessageBody)
.to("direct:errorQueue")
.otherwise()
.process(turnWorkedHeaderIntoMessageBody)
.to("direct:workedQueue");
However this is not a good Camel pattern. It's messy and treats the properties as different messages, which is contrary to how Camel works. From the route's perspective the exchange is an atomic unit, so if you need to break the message up, it is usual to route the contents of the Exchange to be processed as separate Exchanges by a different route.
I personally would split the list into separate exchanges and process them individually, like this:
from("direct:somesource")
.split(body())
.to("direct:processIndividualMessage");
from("direct:direct:processIndividualMessage")
.doTry()
.process(myProcessor)
.to("direct:goodQueue")
.doCatch(Exception.class)
.to("direct:errorQueue")
.end()
It all depends on your data model, but Camel has no limitations in this regard - you can certainly achieve this. I can clarify if you have any specific question.

Is it possible to have a pool of splitters on a camel route

I'm using a Splitter to break up a large file, do some processing on the splits and then, with a custom AggregationStrategy, save the updated splits to a new file. The splitter is configured for streaming but not parallel processing. This works fine.
The problem is that the Splitter calls doAggregate (inherited from MulticastProcessor) which is synchronized. When there are concurrent requests on that route, the performance is significantly impacted by the synchronization. Obviously, the more concurrent requests, the worse it is.
Is there a way to create a pool of splitter instances that can be used for the route? Each incoming exchange could use a different splitter instance and thus avoid the synchronized doAggregate call. Is it possible to leverage a custom ProcessorFactory to do this?
Update:
I created a simple test to demonstrate what I'm talking about.
I have a route like this
from("direct:splitterTest").split(method(new MySplitter(), "rowIterator"), new MyAggregationStrategy()).to("log:someSplitProcessing?groupSize=500")
The MySplitter simply returns an iterator of 10000 String[] which emulates reading a file.
The MyAggregationStrategy pretends to perform some work and saves the records to a new file.
In my test, I added a loop to emulate some processing, like:
Random random = new Random(System.currentTimeMillis());
for (int i = 0; i < 10000; i++) {
    random.nextGaussian();
}
I submit requests to the route like this (note that I'm not passing in a file in this case because the splitter just returns dummy data):
ProducerTemplate producerTemplate = camelContext.createProducerTemplate();
Future<File> future1 = producerTemplate.asyncRequestBody("direct:splitterTest", null, File.class);
Future<File> future2 = producerTemplate.asyncRequestBody("direct:splitterTest", null, File.class);
System.out.println(future1.get());
System.out.println(future2.get());
I wanted to post VisualVM screenshots showing how 2 and 4 concurrent in-flight exchanges are impacted by the synchronization, but this account is too new to be allowed to post images.
The point is this. When the route is created, there is a single Splitter instance for that route. Multiple in-flight exchanges on that route will synchronize in the doAggregate call which seriously impacts the processing time for each request. When there are 4 requests, you can see that 3 threads are blocked while one is in the doAggregate call.
Due to the nature of the processing that I'm doing, I'm not able to configure the splitter to process in parallel, so what I'm looking for is a way to create multiple splitter instances. I could create, say, 4 copies of the route and then use a routingSlip or dynamicRouter to round-robin requests to each, but that seems kind of ugly and I'm hoping there's a better way.
You can use your own thread pool to perform the save in parallel from within doAggregate, if you want.
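For illustration, a minimal sketch of that idea, assuming an AggregationStrategy like the one in the question; the pool size and the saveSplit helper are made up for the example:
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import org.apache.camel.Exchange;
import org.apache.camel.processor.aggregate.AggregationStrategy;

public class MyAggregationStrategy implements AggregationStrategy {

    // the slow save work is handed off to this pool so that the synchronized
    // doAggregate call in the Splitter returns quickly
    private final ExecutorService executor = Executors.newFixedThreadPool(4);

    @Override
    public Exchange aggregate(Exchange oldExchange, Exchange newExchange) {
        final String[] record = newExchange.getIn().getBody(String[].class);
        executor.submit(() -> saveSplit(record));
        return oldExchange != null ? oldExchange : newExchange;
    }

    private void saveSplit(String[] record) {
        // append the record to the output file (illustrative; concurrent writes
        // to a shared file still need their own coordination)
    }
}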

Camel use pollEnrich from same URI multiple times returns null body

I have 2 routes. The first route uses poll enrich to check if a file is present. The second route uses a poll enrich on the same uri to read and process the file. The first route invokes the second via a SEDA queue, like so:
public void configure() throws Exception {
    String myFile = "file://myDir?fileName=MyFile.zip&delete=false&readLock=none";

    from("direct:test")
        .pollEnrich(myFile, 10000)
        .to("seda:myQueue");

    from("seda:myQueue")
        .pollEnrich(myFile, 10000)
        .log("Do something with the body");
}
As it stands, if I execute the first route, the poll enrich finds a file, but when the poll enrich in the second route executes, it returns a body of null. If I just execute the second route on its own, it retrieves the file correctly.
Why does the second pollEnrich return null - is the file locked? (I was hoping that a combination of noop, readLock, and delete=false would prevent any locking.)
Does Camel consider the second pollEnrich a duplicate and therefore filter it out? (I have tried implementing my own IdempotentRepository that returns false from contains(), but the second pollEnrich still returns null.)
You may wonder why I'm trying to enrich from 2 routes: the first route has to check whether a number of files exist; only when all the files are present (i.e., pollEnrich doesn't return null) can the second route start processing them.
Is there an alternative to pollEnrich that I can use? I'm thinking that perhaps I'll need to create a bean that retrieves a file by URI and returns it as the body.
I'm using Camel 2.11.0.
I realize this is now an old topic, but I just had a similar problem.
I suggest you try the options:
noop=true
which you already have, and
idempotent=false
to tell Camel it is OK to process the same file twice.
Update after testing:
I actually tested this with both settings as suggested above; it works sometimes, but under moderate load it fails, i.e. returns a null body for some exchanges, although not all.
The documentation indicates that setting noop=true automatically sets idempotent=true, so I am not sure the idempotent setting is being honoured in this case.
Is there any specific reason why you are not using just one route?
I don't understand why you are using two routes for this. The file component can check whether the file is there and, if it is, pull it. If you are worried about remembering the files so that you don't get duplicates, you can use an idempotent repository. At least, based on your question, I don't think you need to complicate the logic with two routes and the Content Enricher EIP.
The second route returns null because the file was already consumed in the first route. If you are just looking for a signal message when all the files are present, then use a file consumer along with an aggregator, and possibly a claim check to avoid carrying large payloads around in memory, etc.
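A minimal sketch of that suggestion, assuming three expected files; the endpoint names, completion size and signal body are illustrative:
// consume the files as they appear and aggregate them into one group
from("file://myDir?noop=true")
    .aggregate(constant(true), new GroupedExchangeAggregationStrategy())
        .completionSize(3) // the number of files we are waiting for
    .setBody(constant("all files are present"))
    .to("seda:startProcessing");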
As you've probably learned, this does not work as one might expect:
noop=true&idempotent=false
My guess is that Camel ignores idempotent=false and, as documented, uses an instance of MemoryMessageIdRepository. To work around this, you can configure the file endpoint to use a custom idempotent repository:
noop=true&idempotentRepository=#myRepo
and register the custom repository in the registry or Spring context:
@Component("myRepo")
public class MyRepo implements IdempotentRepository {
    @Override
    public boolean contains(Object o) {
        return false;
    }
    ...
}
Try pollEnrich with strategyMethodAllowNull="true". By default this value is false; when it is false, the aggregation strategy looks for an existing Exchange body to aggregate with the content returned from the file.
When you set strategyMethodAllowNull="true", a null existing body is accepted, so the content of the file is set into the current exchange body every time.

Camel: Tracing history of exchanges when a splitter is used

I'm using Apache Camel, and trying to create a log of the history of the processing of each message in a workflow.
For simple straight-through workflows, where a message comes in, is processed by a few steps, and then leaves, this could be as simple as just keeping a sequential log of the exchanges. I can do this by writing a custom TraceEventHandler, which is called at each exchange and allows me to do logging.
However, if a splitter is involved, I don't know how to calculate the provenance of any given exchange. I could maintain my own log of exchanges, but in the case of a splitter, not all previous activity would be an ancestor of the current exchange. That is, if an incoming message is split into part1 and part2, which are then each processed separately, I don't want to consider the processing of part1 when calculating the history of part2.
A TraceEventHandler has this method:
@Override
public void traceExchange(ProcessorDefinition<?> node, Processor target,
        TraceInterceptor traceInterceptor, Exchange exchange) throws Exception {
}
and I expected that there would be an Exchange method like Exchange getPreviousExchange() that I could call inside traceExchange, but I can find no such thing.
Any advice? I'm not married to using a custom TraceEventHandler if there's a better way to do this.
Thanks.
You can find the previous Exchange id by looking up the exchange property with the key "CamelCorrelationId".
If you want to track the post-split processing as separate branches, then you also need to consider the Camel property "CamelSplitIndex". This property indicates which iteration of the split you're processing and, combined with the CamelCorrelationId as William suggested, provides the full picture.
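For illustration, a minimal sketch of what the traceExchange method from the question could do with those two properties; the output format is made up, and Exchange.CORRELATION_ID / Exchange.SPLIT_INDEX are the constants for the property names mentioned above:
@Override
public void traceExchange(ProcessorDefinition<?> node, Processor target,
        TraceInterceptor traceInterceptor, Exchange exchange) throws Exception {
    // id of the exchange this one was created from (e.g. by the splitter)
    String parentId = exchange.getProperty(Exchange.CORRELATION_ID, String.class);
    // which split branch this exchange belongs to, if any
    Integer splitIndex = exchange.getProperty(Exchange.SPLIT_INDEX, Integer.class);
    System.out.println(exchange.getExchangeId() + " <- parent " + parentId
            + (splitIndex != null ? " (split branch " + splitIndex + ")" : ""));
}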

Design question on dynamic Apache camel routes/context

We have ActiveMQ onto which the events that happen in the system are published. The project involves users adding entities to their watch-list and whenever there are events on those entities I would like an email to be sent out to the interested participants.
The use-case roughly translates to someone expressing an interest in a product information page on the catalog and an email being sent whenever any activity happens on that product (price goes down, there is a positive review, etc.). I had modelled this interaction as a Camel route.
So, for example, if the user says email me whenever this product's rating equals 5, then the following route would be added to the camel context:
from("activemq:topic:events.product.save").filter().xpath("/object[<object id>]/rating").isEqualTo("5").to("email:<user's email>")
Similarly, if the user wants to be notified whenever there is a new comment on a product, another route would be created, and so on. This could potentially end up creating thousands of routes as each user adds their watches of interest.
Some questions that I have are:
Is this an acceptable way of creating dynamic routes? One option I am considering is to use recipient lists, but I haven't been able to come up with a solution that makes it elegant to route messages to the bean that returns the recipient list. For example, in the case explained above, would the bean have a bunch of if-else blocks to decide which recipient list to return?
The CamelContext has a method to load routes from an XML file but no method to persist the existing routes. What would be the simplest (and most efficient) way to persist these dynamically created routes? This thread on the camel-users list sums up my request.
Given the dynamic nature of your subscription requirements, you should use a database to store the information rather than trying to create dynamic routes. This is a much more scalable/appropriate use of the technology.
Then you only need a single static route or a POJO consumer (see below) that processes the product update messages using a simple POJO bean (bean binding can help, etc.). The POJO bean is then responsible for querying the database to find all "interested" users and sending an email using camel-mail:
public class NotificationBean {

    @Consume(uri = "activemq:topic:events.product.save")
    public void onUpdate(@XPath("/object/id") String id,
                         @XPath("/object/rating") String rating) {
        // query the database for subscriptions matching this product id/rating, etc.
        // for each interested subscriber:
        //     send an email (camel-mail, etc.)
    }

    public void addSubscription(String productID, Integer rating, String email) {
        // create/update the subscription entry in the database, etc.
    }
}
