I have two routes which execute some command every 2 seconds on different servers and print the output to the same file:
camelCtx.addRoutes(new RouteBuilder() {
    @Override
    public void configure() {
        from("ssh://username:password@host1:port?delay=2&pollCommand=whoami")
            .to("file:///tmp/?fileName=test.txt");
    }
});
camelCtx.addRoutes(new RouteBuilder() {
    @Override
    public void configure() {
        from("ssh://username:password@host2:port?delay=2&pollCommand=whoami")
            .to("file:///tmp/?fileName=test.txt");
    }
});
I would like to be sure that the two routes are started at the same time, and also to prefix the output of each command with the timestamp of when the route was started. For the second problem I tried a custom processor:
.process(exchange -> {
    String body = exchange.getIn().getBody(String.class);
    exchange.getIn().setBody(System.currentTimeMillis() + " " + body);
})
but it obviously gives the time when the output was received.
I can also execute date +%s%N before the command, so that the pollCommand parameter would look like this:
...&pollCommand=date +%s%N;whoami...
but in this case it's the time when the connection to the server has already been established, which is a bit too late...
So how do I get the 'start time' of the route?
And how do I synchronise several routes so that they execute simultaneously?
If you mean the time that the exchange (Camel message) on the route was created, then you can access that information from an exchange property.
For example from a Camel Processor you can do:
Date created = exchange.getProperty(Exchange.CREATED_TIMESTAMP, Date.class);
You can use that information to build a file name, which you can set with the header Exchange.FILE_NAME; this will then override the file name configured in the endpoint URI, so you can include the timestamp.
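For example, a minimal sketch (placed inside configure(), reusing the route from the question; the file name pattern is just illustrative) that prefixes the body with the exchange creation time and builds a timestamped file name:

from("ssh://username:password@host1:port?delay=2&pollCommand=whoami")
    .process(exchange -> {
        // time the exchange was created, i.e. when this poll started
        Date created = exchange.getProperty(Exchange.CREATED_TIMESTAMP, Date.class);
        String body = exchange.getIn().getBody(String.class);
        exchange.getIn().setBody(created.getTime() + " " + body);
        // overrides the fileName configured on the file endpoint
        exchange.getIn().setHeader(Exchange.FILE_NAME, "test-" + created.getTime() + ".txt");
    })
    .to("file:///tmp/");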
Related
I have many quartz timers to schedule tasks (ping, email, etc.). Is it possible to manage many quartz timers in one route in Apache Camel, or do I have to create a route for every quartz timer?
Error:
Caused by: org.apache.camel.FailedToStartRouteException: Failed to start route route9 because of Multiple consumers for the same endpoint is not allowed: quartz://myTimer?cron=0+0/1+*+*+*+?+*
Code:
@Component
public class TimingRoute extends RouteBuilder {

    static final Logger LOGGER = LoggerFactory.getLogger(TimingRoute.class);

    @Override
    public void configure() throws Exception {
        // Every 3 minutes: 0+0/3+*+*+*+?+*
        // Every 10 seconds: 0/10+*+*+*+*+?+*
        from("quartz://myTimer?cron=0+0/1+*+*+*+?+*") //
            .setBody().simple("Current time is " + LocalDateTime.now()) //
            .log("${body}").to("direct:processPollingEmail");

        from("quartz://myTimer?cron=0+0/1+*+*+*+?+*") //
            .setBody().simple("Current time is " + LocalDateTime.now()) //
            .log("${body}").to("direct:processPing");
    }
}
What you did there already is "create route for every quartz".
I believe the only reason why you're getting this error is that you gave the same ID to both your quartz endpoints. Try naming them "myTimer" and "myTimer2" (or anything more meaningful like "emailTimer" and "pingTimer") and you should be fine.
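For instance, a sketch of the corrected routes with distinct timer names (the names are just examples). As a side note, building the body with LocalDateTime.now() in the Java string evaluates the time only once, when the route is built; the simple expression ${date:now:...} evaluates it per message:

from("quartz://emailTimer?cron=0+0/1+*+*+*+?+*")
    .setBody().simple("Current time is ${date:now:yyyyMMdd-HHmmss}")
    .log("${body}")
    .to("direct:processPollingEmail");

from("quartz://pingTimer?cron=0+0/1+*+*+*+?+*")
    .setBody().simple("Current time is ${date:now:yyyyMMdd-HHmmss}")
    .log("${body}")
    .to("direct:processPing");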
I want to test the Camel route below. All the examples I find online have routes starting with a file endpoint, whereas in my case I have a spring bean method which is called every few minutes, and finally the message is transformed and moved to JMS as well as to an audit directory.
I am clueless on how to write a test for this route.
All I currently have in my test case is
Mockito.when(tradeService.searchTransaction()).thenReturn(dataWithSingleTransaction);
from("quartz2://tsTimer?cron=0/20+*+8-18+?+*+MON,TUE,WED,THU,FRI+*")
.bean(TradeService.class)
.marshal()
.jacksonxml(true)
.to("jms:queue:out-test")
.to("file:data/test/audit")
.end();
Testing with Apache Camel and Spring Boot is really easy.
Just do the following (the example below is abstract, just to give you a hint how you can do it):
Write a test class.
Use the Spring Boot annotations to configure the test class.
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.NONE)
@RunWith(SpringRunner.class)
public class MyRouteTest {

    @EndpointInject(uri = "{{sourceEndpoint}}")
    private ProducerTemplate sourceEndpoint;

    ....

    @Test
    public void test() {
        // send your body to the endpoint. See other provided methods too.
        sourceEndpoint.sendBody([your input]);
    }
}
In src/test/resources/application.properties:
Configure your Camel-Endpoints like the source and the target:
sourceEndpoint=direct:myTestSource
Hints:
It's good practice not to hard-wire your start endpoint in the route when using Spring Boot, but to use application.properties instead. That way it is easier to mock your endpoints for unit tests, because you can switch to the direct component without changing your source code.
This means instead of:
from("quartz2://tsTimer?cron=0/20+*+8-18+?+*+MON,TUE,WED,THU,FRI+*")
you should write:
from("{{sourceEndpoint}}")
and configure the sourceEndpoint in your application.properties:
sourceEndpoint=quartz2://tsTimer?cron=0/20+*+8-18+?+*+MON,TUE,WED,THU,FRI+*
That way you are also able to use your route for different situations.
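Applied to the route from the question, a sketch might look like this (the property names targetQueue and auditDir are assumptions, not part of the original route):

from("{{sourceEndpoint}}")
    .bean(TradeService.class)
    .marshal()
    .jacksonxml(true)
    .to("{{targetQueue}}")
    .to("{{auditDir}}")
    .end();

with the production application.properties containing:
sourceEndpoint=quartz2://tsTimer?cron=0/20+*+8-18+?+*+MON,TUE,WED,THU,FRI+*
targetQueue=jms:queue:out-test
auditDir=file:data/test/audit

and the test properties swapping the source for a direct endpoint and the targets for mock endpoints.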
Documentation
A good documentation about how to test with spring-boot can be found here: https://docs.spring.io/spring-boot/docs/current/reference/html/boot-features-testing.html
For Apache Camel: http://camel.apache.org/testing.html
@the hand of NOD Thanks for your hints, I was going in a completely wrong direction. After reading your answer I was able to write the basic test, and from there I think I can take it forward.
I appreciate your time; however, I see that based on my route it should drop an XML file into the audit directory, which is not happening.
It looks like intermediate steps are also getting mocked, without my specifying anything.
InterceptSendToMockEndpointStrategy - Adviced endpoint [xslt://trans.xslt] with mock endpoint [mock:xslt:trans.xslt]
INFO o.a.c.i.InterceptSendToMockEndpointStrategy - Adviced endpoint [file://test/data/audit/?fileName=%24%7Bheader.outFileName%7D] with mock endpoint [mock:file:test/data/audit/]
INFO o.a.camel.spring.SpringCamelContext - StreamCaching is not in use. If using streams then its recommended to enable stream caching. See more details at http://camel.apache.org/stream-caching.html
TradePublisherRoute.java
@Override
public void configure() throws Exception {
    logger.info("TradePublisherRoute.configure() : trade-publisher started configuring camel route.");
    from("{{trade-publisher.sourceEndpoint}}")
        .doTry()
            .bean(tradeService)
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    String dateStr = Constant.dateFormatForFileName.format(new Date());
                    logger.info("this is getting executed : " + dateStr);
                    exchange.setProperty(Constant.KEY_INCOMING_XML_FILE_NAME, "REQ-" + dateStr + Constant.AUDIT_FILE_EXTENSION);
                    exchange.setProperty(Constant.KEY_OUTGOING_XML_FILE_NAME, "RESP-" + dateStr + Constant.AUDIT_FILE_EXTENSION);
                }
            })
            .marshal()
            .jacksonxml(true)
            .wireTap("{{trade-publisher.requestAuditDir}}" + "${header.inFileName}")
            .to("{{trade-publisher.xsltFile}}")
            .to("{{trade-publisher.outboundQueue}}")
            .to("{{trade-publisher.responseAuditDir}}" + "${header.outFileName}")
            .bean(txnService, "markSuccess")
        .endDoTry()
        .doCatch(Exception.class)
            .bean(txnService, "markFailure")
            .log(LoggingLevel.ERROR, "EXCEPTION: ${exception.stacktrace}")
        .end();
TradePublisherRouteTest.java
#ActiveProfiles("test")
#RunWith(CamelSpringBootRunner.class)
#SpringBootTest(classes = TradePublisherApplication.class)
#MockEndpoints
public class TradePublisherRouteTest {
#EndpointInject(uri = "{{trade-publisher.outboundQueue}}")
private MockEndpoint mockQueue;
#EndpointInject(uri = "{{trade-publisher.sourceEndpoint}}")
private ProducerTemplate producerTemplate;
#MockBean
TradeService tradeService;
private List<Transaction> transactions = new ArrayList<>();
#BeforeClass
public static void beforeClass() {
}
#Before
public void before() throws Exception {
Transaction txn = new Transaction("TEST001", "C001", "100", "JPM", new BigDecimal(100.50), new Date(), new Date(), 1000, "P");
transactions.add(txn);
}
#Test
public void testRouteConfiguration() throws Exception {
Mockito.when(tradeService.searchTransaction()).thenReturn(new Data(transactions));
producerTemplate.sendBody(transactions);
mockQueue.expectedMessageCount(1);
mockQueue.assertIsSatisfied(2000);
}
Please correct me if I am doing something wrong!
I created a CronScheduledRoutePolicy to start and fire my route daily at 15:30 for fetching XML from a website and storing it in a DB, like below:
CronScheduledRoutePolicy startPolicy = new CronScheduledRoutePolicy();
startPolicy.setRouteStartTime("0 30 15 * * ?");

from("direct:quatzRoute")
    .routePolicy(startPolicy)
    .log("Route started")
    .to("http4://mywebsite/today.xml")
    .log("Response ${body}")
    .convertBodyTo(String.class).process(new Processor() {
        public void process(Exchange e) throws Exception {
            log.info("Before Logging the xml");
            ExchangeRateBean.writeToDB(e);
            log.info("After Logging the xml");
        }
    })
    .log("Xml Stored in DB")
    .to("mock:result");
In the console it shows the route started on deployment of the bundle, but at the specified time (for testing I used the current time) my job is not executed, and there are no log messages either.
Is there anything else I need to do?
That is a policy for starting the route, so incoming messages to direct:quatzRoute are only consumed from that time on. A direct: consumer does not poll or trigger anything by itself, so nothing happens unless something sends to it.
Use one of the following instead to fetch data at a specific time:
http://camel.apache.org/quartz.html
http://camel.apache.org/quartz2.html
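For example, a minimal sketch of the quartz2 approach, reusing the endpoints from the question (the timer name is just an example; adjust the cron expression to your needs):

from("quartz2://fetchTimer?cron=0+30+15+*+*+?")
    .log("Route triggered")
    .to("http4://mywebsite/today.xml")
    .log("Response ${body}")
    .convertBodyTo(String.class)
    // same DB write as in the question's processor
    .process(e -> ExchangeRateBean.writeToDB(e))
    .log("Xml Stored in DB")
    .to("mock:result");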
In my application I have a generic Camel Route such as the following
from("direct:something").to("direct:outgoing")
and then dynamically in my code I deploy another route:
from("direct:outgoing").process(processor)
When flowing from route 1 to route 2 a new Exchange will be created. Is there an idiomatic way to correlate both? Should I set the Exchange.CORRELATION_ID header on the first route before sending it out?
This should definitely all be processed on the one exchange. Run this test and you'll see the same camel Exchange, with the same properties, etc.
public class CamelExchangeTest {

    public static void main(String[] args) throws Exception {
        final Processor showExchangeIdProcessor = new Processor() {
            @Override
            public void process(Exchange exchange) throws Exception {
                System.out.println(exchange.getExchangeId());
            }
        };

        Main camelMain = new Main();
        camelMain.addRouteBuilder(new RouteBuilder() {
            @Override
            public void configure() throws Exception {
                from("timer:foo?period=1s&repeatCount=1")
                    .log("exchange created!")
                    .process(showExchangeIdProcessor)
                    .to("direct:outgoing")
                ;
                from("direct:outgoing")
                    .log("outgoing!")
                    .process(showExchangeIdProcessor)
                ;
            }
        });
        camelMain.run();
    }
}
Output:
ID-MYPC-55760-1411129552791-0-2
ID-MYPC-55760-1411129552791-0-2
So something else is going on. When you say "direct:outgoing", do you mean exactly that or is it something different - a different component perhaps?
When you say the route is created dynamically, how exactly is that done, and when (and why)?
From the Camel doc:
Some EIP patterns will spin off a sub message, and in those cases Camel will add a correlation id on the Exchange as a property with the key Exchange.CORRELATION_ID, which links back to the source Exchange. For example the Splitter, Multicast, Recipient List, and Wire Tap EIPs do this.
Thus, Exchange.CORRELATION_ID is set by Camel and should not be set by your application. But feel free to set a custom header or property if you need to, such as:
exchange.setProperty("myProperty", myIdentifier);
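For example, a rough sketch of that idea (the property name is made up for illustration); with the direct component both routes process the same exchange, so the property set in the first route is visible in the second:

from("direct:something")
    // tag the exchange before handing it over
    .process(exchange -> exchange.setProperty("myCorrelationId", exchange.getExchangeId()))
    .to("direct:outgoing");

from("direct:outgoing")
    // same exchange, so the property is still there
    .process(exchange -> System.out.println(exchange.getProperty("myCorrelationId", String.class)));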
I'm working on a camel prototype which uses two start points in the same camel context.
The first route consumes messages which are used to "configure" the application. Messages are loaded in a configuration repository through a configService bean:
// read configuration files
from("file:data/config?noop=true&include=.*.xml")
    .startupOrder(1)
    .to("bean:configService?method=loadConfiguration")
    .log("Configuration loaded");
The second route implements a recipient list EIP pattern, delivering a different kind of input message to a number of recipients, which are read dynamically from the same configuration repository:
// process some source files (using configuration)
from("file:data/source?noop=true")
    .startupOrder(2)
    .unmarshal()
    .to("setupProcessor") // set "recipients" header
    .recipientList(header("recipients"))
    // ...
The question that arises now is how to synchronize them, so that the second route "waits" while the first is processing new data.
I'm new to Apache Camel and pretty lost on how to approach such a problem; any suggestion would be appreciated.
Use aggregate in combination with the possibility to start and stop routes dynamically:
from("file:data/config?noop=true&include=.*.xml")
.id("route-config")
.aggregate(constant(true), new MyAggregationStrategy()).completionSize(2).completionTimeout(2000)
.process(new Processor() {
#Override
public void process(final Exchange exchange) throws Exception {
exchange.getContext().startRoute("route-source");
}
});
from("file:data/source?noop=true&idempotent=false")
.id("route-source") // the id is needed so that the route is found by the start and stop processors
.autoStartup(false) // this route is only started at runtime
.aggregate(constant(true), new MyAggregationStrategy()).completionSize(2).completionTimeout(2000)
.setHeader("recipients", constant("direct:end")) // this would be done in a separate processor
.recipientList(header("recipients"))
.to("seda:shutdown"); // shutdown asynchronously or the route would be waiting for pending exchanges
from("seda:shutdown")
.process(new Processor() {
#Override
public void process(final Exchange exchange) throws Exception {
exchange.getContext().stopRoute("route-source");
}
});
from("direct:end")
.log("End");
That way, route-source is only started when route-config is completed. route-config and consequently route-source are restarted if new files are found in the config directory.
You can also place an onCompletion (http://camel.apache.org/oncompletion.html) in the first route that activates the second one.
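A rough sketch of that alternative, assuming the second route keeps the id "route-source" and autoStartup(false) from the example above:

from("file:data/config?noop=true&include=.*.xml")
    .onCompletion()
        // runs after the config file has been processed; start the second route now
        .process(exchange -> exchange.getContext().startRoute("route-source"))
    .end()
    .to("bean:configService?method=loadConfiguration")
    .log("Configuration loaded");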
The Apache Camel file component creates a lock for the file being processed. Another file consumer will not poll that file while the lock is held (unless you set consumer.exclusiveReadLock=false).
Source:
http://camel.apache.org/file.html => URI Options => consumer.exclusiveReadLock
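As a side note, on Camel 2.x the equivalent option is readLock; a sketch under that assumption (directory and value are illustrative, and the available values vary by version):

// disable the read-lock check so another consumer may pick the file up
from("file:data/source?noop=true&readLock=none")
    .log("processing ${header.CamelFileName}");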