Camel route does not log when deployed in ServiceMix - apache-camel

I referred to the article below and created the dependencies and log4j properties:
http://camel.apache.org/how-do-i-use-log4j.html
Here is my simple route:
from("direct:start")
.routeId("LogEipInfoLevelRoute")
.log(LoggingLevel.INFO, "Displaying Something - ${body}")
.to("mock:result");
Once I deploy my route in ServiceMix, I check the ServiceMix log, but I can't find any such message logged, except for the route-creation message with the specified id.
Am I checking in the wrong place?

You need to send a message to the direct:start endpoint for anything to happen.
If you just want to see something going on, you can use a timer instead of direct, e.g.
from("timer:foo?period=5000")
to fire a new message every 5 seconds. Notice that the message body is null for exchanges created by a timer.
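For completeness, here is a minimal, self-contained sketch of the timer variant (the class name is just a placeholder; the route id and log message are taken from your question):

import org.apache.camel.LoggingLevel;
import org.apache.camel.builder.RouteBuilder;

public class TimerLogRoute extends RouteBuilder {
    @Override
    public void configure() {
        // the timer fires every 5 seconds, so the log EIP runs without any external sender
        from("timer:foo?period=5000")
            .routeId("LogEipInfoLevelRoute")
            .log(LoggingLevel.INFO, "Displaying Something - ${body}") // body is null on timer exchanges
            .to("mock:result");
    }
}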

I assume that you are using an OSGi-based ServiceMix version.
Please execute the following command in the Karaf console:
karaf@root> log:display | grep Displaying
More about the ServiceMix logging system can be found at http://servicemix.apache.org/docs/4.4.x/users-guide/logging-system.html

You must pass something to direct:start for anything to happen. You can read more about using direct here: http://camel.apache.org/direct.html
I would suggest using a timer to kick off your route. Once your route is kicked off by the timer, the log line will be written to your log file.
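If you do want to keep the direct: endpoint, something along these lines can trigger it (an illustrative helper, assuming you can get hold of the running CamelContext, e.g. in a test or a bean):

import org.apache.camel.CamelContext;
import org.apache.camel.ProducerTemplate;

public class StartRouteTrigger {
    // send one message to direct:start so the route (and its log EIP) actually executes
    public static void trigger(CamelContext camelContext) throws Exception {
        ProducerTemplate template = camelContext.createProducerTemplate();
        template.sendBody("direct:start", "hello from a test message");
        template.stop(); // release the template's resources
    }
}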

Related

Apache Camel SFTP Consumer not deleting file after reading

I am seeing weird behavior with the Apache Camel SFTP component. Even after setting the delete=true option, it doesn't delete the file after receiving it. I am using camel-ftp version 3.0.0-M3.
The following is my SFTP configuration:
"sftp://<<HOST_NAME>>:<<PORT>>/<<PATH>>?username=<<USERNAME>>" +
    "&password=<<PASSWORD>>" +
    "&preferredAuthentications=password" +
    "&readLock=changed" +
    "&readLockMinAge=30000" +
    "&delay=20000" +
    "&delete=true";
Now Camel is able to read the file, but it doesn't delete it after reading. Going through the docs, it says:
delete (consumer) -
If true, the file will be deleted after it is processed successfully.
How does Camel determine whether the file was processed successfully? Do we need to set any exchange property for Camel to mark it as processed successfully?
After receiving the file, all I am doing is passing it to another route, like the following:
from(endpointUri).to("direct:procesSftpFile");
Should I change it from direct to vm or seda?
It looks like nobody else faced this issue, and I somehow figured out where it started happening.
The issue was not with the Camel SFTP component, but with the piece of code I was calling.
The second part of my flow looks like this:
from("direct:procesSftpFile")
.log("...")
// logging and other regular processing
....
// sending to vm InOnly
.to("vm:queue1?exchangePattern=InOnly")
.. some more processing..
.to("vm:queue2?exchangePattern=InOnly")
So the issue was with calling queue1 and queue2 in the above snippet.
Commenting them out fixed it, and SFTP started deleting the files. As a workaround, instead of to(), I used producerTemplate.asyncSend to call the VM endpoints (a rough sketch follows below).
One thing I am still confused about: if we are using the InOnly exchange pattern, why does it affect the SFTP behavior? I should probably ask that in a separate question.
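For reference, a rough sketch of that workaround, reusing the endpoint names from the snippet above (this is only an illustration, not the exact code):

import org.apache.camel.CamelContext;
import org.apache.camel.ProducerTemplate;
import org.apache.camel.builder.RouteBuilder;

public class SftpProcessRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("direct:procesSftpFile")
            .log("processing ${file:name}")
            // hand the body to the VM endpoints asynchronously so the SFTP
            // consumer's exchange completes (and the file gets deleted)
            // without waiting on the downstream routes
            .process(exchange -> {
                CamelContext ctx = exchange.getContext();
                ProducerTemplate template = ctx.createProducerTemplate();
                template.asyncSendBody("vm:queue1", exchange.getIn().getBody());
                template.asyncSendBody("vm:queue2", exchange.getIn().getBody());
            });
    }
}

In real code you would reuse a single ProducerTemplate rather than creating one per message.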

Apache Camel Endpoint URI validation

When I have an error inside an endpoint URI, the Camel context won't start.
It seems like Camel validates every endpoint URI in a first step before starting the context.
Can someone please tell me how this works?
Am I right?
I can't figure it out.
Yes, Camel validates that it has been configured correctly when it starts up. This happens as part of starting the Camel routes.
It's like misconfiguring any other software, which will report an error for you to fix.
There is also tooling you can use to validate your source code and find endpoints that have been misconfigured. I wrote a blog entry about this: http://www.davsclaus.com/2016/01/cheers-fabric8-camel-maven-plugin-to.html
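To illustrate the startup behavior, here is a small sketch (the endpoint and the typo in it are made up): a misspelt endpoint option typically makes the context fail fast on start, before any message is processed.

import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;

public class StartupValidationDemo {
    public static void main(String[] args) throws Exception {
        DefaultCamelContext context = new DefaultCamelContext();
        context.addRoutes(new RouteBuilder() {
            @Override
            public void configure() {
                // "perod" is a typo for "period" -- Camel rejects the unknown option
                from("timer:tick?perod=1000").log("never reached");
            }
        });
        // throws here (a failed-to-create-route error wrapping the endpoint
        // resolution failure) because of the invalid endpoint URI
        context.start();
    }
}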

For Apache Camel, is it possible to have half of a route synchronous and the second half async?

I currently have a Camel route that exposes a CXF endpoint. When a message comes through the endpoint, I first enrich it with some information from another web service and then do more processing afterwards. However, I want to make the first half of this route synchronous so I can send back a response to whoever called my exposed CXF endpoint.
The route looks something like this:
from(cxf:CxfEndpoint)
.process(someProcessing)
.to(cxf:ExternalCxfEndpoint)
.to(activemq:queue:somequeue)
//return a response back to caller here
from(activemq:queue:somequeue)
... //additional processing here
...
The reason for this is that when a message comes in via my exposed CXF endpoint, I don't know if it's a valid message; I need to validate it first with the message enrichment. Once the message is enriched, I want to let whoever sent the message know that it is accepted, but I don't want them to wait for the message to make it through the whole route, as that could take hours.
Does anyone know how this would work?
Thanks in advance!
I believe all you need to do is set the exchangePattern to InOnly, i.e. make it an Event Message. This makes your route not wait for a reply from ActiveMQ. A Camel exchange defaults to InOut when it originates from a web service, as in your case.
A related question with an answer from a Camel dev here.
Also see this one for some details on the behavior when your broker is down.
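A minimal sketch of that idea, using the endpoint URIs from the question as placeholders (the actual CXF configuration is omitted):

import org.apache.camel.builder.RouteBuilder;

public class HalfSyncRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("cxf:CxfEndpoint")
            .process(exchange -> { /* someProcessing: validate / enrich */ })
            .to("cxf:ExternalCxfEndpoint")
            // fire-and-forget: do not wait for a JMS reply from the queue
            .to("activemq:queue:somequeue?exchangePattern=InOnly");
            // whatever is in the message body at this point is returned to the CXF caller

        from("activemq:queue:somequeue")
            .log("additional (possibly long-running) processing here");
    }
}
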
Yes, definitely, 100% possible. A simple example would be this:
1. From the CXF endpoint.
2. Store your request in a Camel property or header.
3. To xslt - generate the response for the CXF endpoint - this is the synchronous flow.
4. Reset your original payload using setBody.
5. Wire Tap endpoint - to any endpoint downstream, or even a route; this becomes asynchronous and does not take part in the synchronous response above.
Note: steps 2 and 4 may not be required; it depends on your use case.
There are lots of things you can do; I just gave a very simple example. It doesn't need to be a wiretap either, but a wiretap saves us from writing additional custom exception handling.
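A very rough sketch of those steps, with made-up endpoint and stylesheet names (here the wire-tapped route restores the original payload itself, which is one way to cover steps 2 and 4):

import org.apache.camel.builder.RouteBuilder;

public class WireTapSketch extends RouteBuilder {
    @Override
    public void configure() {
        from("cxf:CxfEndpoint")
            .setProperty("originalBody", body())       // step 2: keep the original request
            .to("xslt:buildAck.xsl")                   // step 3: synchronous part, e.g. build an acknowledgement
            .wireTap("direct:downstream")              // step 5: async copy, not part of the reply
            .log("returning the synchronous response to the caller");

        from("direct:downstream")
            .setBody(exchangeProperty("originalBody")) // step 4: work on the original payload asynchronously
            .log("long-running processing happens here");
    }
}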

Camel Route for Javamail Not Stopping on Shutdown

I have a simple route:
from("myQuartz://EMAIL_Route?cron=0+0/5+*+*+*+?")
    .routeId("EMAIL_Route")
    .shutdownRunningTask(ShutdownRunningTask.CompleteCurrentTaskOnly)
    .beanRef("errorReportProcessor")
    .filter(body().isNotNull())
    .to("smtp://smtpHost?From=someone&to=someoneElse&Subject=something")
    .end();
Even if I shut down the application in WebSphere Application Server, I still continue to get emails; the scheduler thread is not stopping. In my Quartz properties file I also tried
org.quartz.scheduler.makeSchedulerThreadDaemon=true
but it was fruitless. The Camel, Quartz and Mail component version is 2.12.4, with Spring 3.2.5.RELEASE on WebSphere 8.
SystemOut.log clearly shows that the application stopped without errors; however, I can still see a java.exe instance running in Task Manager.
OK, I found the issue: it was a missing "root-app-context". Once I configured the "root-app-context", the cron scheduler stops properly and there are no more stranded threads. :)
Even the extra org.quartz.scheduler.makeSchedulerThreadDaemon=true configuration was not required.

Camel with rabbitmq - Misspelt queue name

Today I tried to simulate a scenario where, in the Camel "to" tag, I supplied a misspelt queue name (one that doesn't exist). Camel and RabbitMQ, instead of throwing an exception back, continued to finish the route flow.
Intrigued, I wrote a sample program that sends a message using channel.basicPublish with a wrong queue name. I never got any exception thrown back from the RabbitMQ client.
However, if the exchange name was wrong, I did get an exception back. Is this expected behaviour?
I tried adding a return listener, confirm listener, exception handler, etc., but none of them got invoked.
Any clues?
Messages are published to exchanges, so the exchange must exist when you publish. At publish time RabbitMQ doesn't care about queues, unless the mandatory flag is provided or the channel is in confirm mode.
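To illustrate the mandatory flag (host, exchange, and routing key below are made up; this assumes the amqp-client 5.x Java library): with mandatory=true an unroutable message is handed back to a return listener instead of being silently dropped.

import com.rabbitmq.client.Channel;
import com.rabbitmq.client.Connection;
import com.rabbitmq.client.ConnectionFactory;
import java.nio.charset.StandardCharsets;

public class MandatoryPublishDemo {
    public static void main(String[] args) throws Exception {
        ConnectionFactory factory = new ConnectionFactory();
        factory.setHost("localhost");
        try (Connection connection = factory.newConnection();
             Channel channel = connection.createChannel()) {

            // invoked by the broker when a mandatory message cannot be routed to any queue
            channel.addReturnListener((replyCode, replyText, exchange, routingKey, props, body) ->
                    System.out.println("Unroutable message returned: " + replyText));

            // the exchange exists, but no queue is bound for this (misspelt) routing key
            channel.basicPublish("amq.direct", "misspelt-queue-name", true, null,
                    "hello".getBytes(StandardCharsets.UTF_8));

            Thread.sleep(1000); // give the return listener a moment before the channel closes
        }
    }
}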
