I am deploying a jar in Karaf.
My jar consists of a Camel route.
Copying only the route part:
from("file:/app/billing/billingip/HOBSRating/data/mediation/voice/input?include=USAGE_VOICE.*.txt")
.doTry()
.log("#########The Camel Header before loading into kafka topic ######## :${headers}")
.log("#########The Camel Body before loading into kafka topic ######## :${body}")
.to("kafka:172.20.211.201:9092?topic=VoiceStream&zookeeperHost=172.20.211.201&zookeeperPort=9092&serializerClass=kafka.serializer.StringEncoder")
/* .to("kafka:${kafkaserver}?topic=DataStream&zookeeperHost=${zookeeperHost}&zookeeperPort=${zookeeperport}&serializerClass=kafka.serializer.StringEncoder")*/
.to("file:/app/billing/billingip/HOBSRating/data/mediation/voice/success")
.doCatch(Exception.class)
.log("########The exception message is ####### :${exception.message}")
.log("########The stack trace of the exception is ####### :${exception.stacktrace}")
.to("file:/app/billing/billingip/HOBSRating/data/mediation/voice/error")
.log("############### End of Voice Cdr to Kafka Topic Route ################")
Currently I am hard-coding the Kafka server details, but I want to make this property-file driven.
Read about Camel's property placeholder, which lets you externalize configuration and refer to it from your Camel routes: http://camel.apache.org/using-propertyplaceholder.html
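A minimal sketch in the Java DSL, assuming a kafka.properties file with keys such as kafka.server, kafka.topic, zookeeper.host and zookeeper.port (the file location and key names are illustrative; in Karaf you would typically wire the same values through Blueprint/ConfigAdmin instead of registering the component by hand):

import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.properties.PropertiesComponent;
import org.apache.camel.impl.DefaultCamelContext;

public class VoiceCdrRouteSample {
    public static void main(String[] args) throws Exception {
        DefaultCamelContext context = new DefaultCamelContext();

        // Register the properties component and point it at an external file.
        PropertiesComponent pc = new PropertiesComponent();
        pc.setLocation("file:/app/billing/billingip/HOBSRating/conf/kafka.properties");
        context.addComponent("properties", pc);

        context.addRoutes(new RouteBuilder() {
            @Override
            public void configure() {
                // {{...}} placeholders are resolved from kafka.properties,
                // e.g. kafka.server=172.20.211.201:9092 and kafka.topic=VoiceStream.
                from("file:/app/billing/billingip/HOBSRating/data/mediation/voice/input?include=USAGE_VOICE.*.txt")
                    .to("kafka:{{kafka.server}}?topic={{kafka.topic}}"
                        + "&zookeeperHost={{zookeeper.host}}&zookeeperPort={{zookeeper.port}}"
                        + "&serializerClass=kafka.serializer.StringEncoder");
            }
        });
        context.start();
    }
}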
Related
I have the following Apache Camel FTP file download Route:
from(downloadUri)
    .routeId(routeId)
    .aggregate(new CustomListAggregationStrategy())
    .constant(true)
    .completionFromBatchConsumer()
    .to("direct:" + routeDestinationId);
I add this Route to a context and then request data with a ConsumerTemplate:
List<ResultType> result = consumerTemplate.receiveBody(CAMEL_DIRECT_OBJECT_PREFIX
+ routeId, List.class);
When a connection error occurs (e.g. unknown host, host not reachable), I want to shutdown the Route and throw an exception after the "receiveBody" line where I try to read the downloaded files.
How can I do this?
I tried an onException handler for the route, added a processor to it, and called exchange.getContext().stop(); in that processor. But the application just keeps running.
Stopping a route while it is routing an existing message is a bit tricky. The reason is that Camel performs a Graceful Shutdown of the route you are stopping, and if you do that while a message is being routed, the Graceful Shutdown will try to wait until that message has been processed.
You can find more info here: https://camel.apache.org/manual/faq/how-can-i-stop-a-route-from-a-route.html
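The approach that FAQ recommends is to stop the route from a separate thread, so the graceful shutdown does not wait on the message that is currently being routed. A minimal sketch, assuming Camel 2.x (on Camel 3.x the call moves to context.getRouteController().stopRoute(id)); the route id and endpoint URIs are illustrative:

import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;

public class StopRouteOnErrorRoute extends RouteBuilder {
    @Override
    public void configure() {
        onException(Exception.class)
            .handled(true)
            .process(exchange -> {
                CamelContext context = exchange.getContext();
                String routeId = exchange.getFromRouteId();
                // Stop the route from a new thread: stopping it inline would make
                // the graceful shutdown wait for this very message to finish.
                new Thread(() -> {
                    try {
                        context.stopRoute(routeId);   // Camel 2.x API
                    } catch (Exception e) {
                        e.printStackTrace();          // best effort
                    }
                }).start();
            });

        from("ftp://user@host/download")   // placeholder for downloadUri
            .routeId("ftpDownloadRoute")
            .to("direct:result");
    }
}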
The documentation of the Camel transport for CXF with Blueprint
https://camel.apache.org/components/latest/cxf-transport.html
says the configuration looks like:
client:
<camel:conduit id="*.camel-conduit" camelContextId="camel1" />
server:
<camel:destination id="*.camel-destination" camelContextId="camel1" />
But Blueprint complains: '*.camel-destination' is not a valid value for 'NCName'. The same happens for '*.camel-conduit'.
If I leave out the id attribute, the CXF client or CXF server starts up.
But when called, it doesn't find the Camel context.
CXF client:
org.apache.camel.component.cxf.transport.CamelConduit says
IllegalArgumentException "CamelContext must be specified on: conduit:"
CXF server:
org.apache.camel.component.cxf.transport.CamelDestination says
IllegalArgumentException "CamelContext must be specified on:"
Running on Fuse 6.3.
Does anybody know how I must configure CXF transport for Camel in Blueprint?
Try using the "name" attribute instead of the "id" one.
It seems the documentation page (https://camel.apache.org/components/latest/cxf-transport.html) indicates that both attributes can be used, but I think "name" is the correct one.
I have an Apache Camel route publishing an AVRO message onto an Apache Kafka topic. I only got this to work when setting the producer property 'serializerClass=kafka.serializer.StringEncoder'. Otherwise I get
java.lang.ClassCastException: java.lang.String cannot be cast to [B
    at kafka.serializer.DefaultEncoder.toBytes(Encoder.scala:34)
    at kafka.producer.async.DefaultEventHandler$$anonfun$serialize$1.apply(DefaultEventHandler.scala:130)
    at kafka.producer.async.DefaultEventHandler$$anonfun$serialize$1.apply(DefaultEventHandler.scala:125)
    at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
    at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
    at kafka.producer.async.DefaultEventHandler.serialize(DefaultEventHandler.scala:125)
    at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:52)
    at kafka.producer.Producer.send(Producer.scala:77)
    at kafka.javaapi.producer.Producer.send(Producer.scala:33)
    at org.apache.camel.component.kafka.KafkaProducer.process(KafkaProducer.java:84)
On the other end I have a second Apache Camel route that is supposed to consume from the above topic, which fails with
java.io.IOException: Invalid long encoding
    at org.apache.avro.io.BinaryDecoder.innerLongDecode(BinaryDecoder.java:217)
    at org.apache.avro.io.BinaryDecoder.readLong(BinaryDecoder.java:176)
    at org.apache.avro.io.ResolvingDecoder.readLong(ResolvingDecoder.java:162)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:160)
    at org.apache.avro.generic.GenericDatumReader.readField(GenericDatumReader.java:193)
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:183)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:151)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:142)
    at org.apache.camel.dataformat.avro.AvroDataFormat.unmarshal(AvroDataFormat.java:133)
    at org.apache.camel.processor.UnmarshalProcessor.process(UnmarshalProcessor.java:67)
Here is the Apache Camel consumer code I use:
<route id="cassandra.publisher">
  <from uri="{{kafka.base.uri}}&amp;topic=sensordata&amp;groupId=Cassandra_ConsumerGroup&amp;consumerId=CassandraConsumer_Instance_1&amp;clientId=adapter2" />
  <unmarshal>
    <custom ref="avroSensorData" />
  </unmarshal>
</route>
In order to solve this problem you have to provide the keyDeserializer and valueDeserializer options for the Camel Kafka consumer as follows:
&keyDeserializer=org.apache.kafka.common.serialization.StringDeserializer&valueDeserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer
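For reference, the same options on a Java DSL consumer endpoint might look roughly like this (broker address, topic, group id, and target endpoint are placeholders, and this assumes a camel-kafka version new enough to expose these options):

import org.apache.camel.builder.RouteBuilder;

public class SensorDataConsumerRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("kafka:sensordata"
                + "?brokers=localhost:9092"
                + "&groupId=Cassandra_ConsumerGroup"
                + "&keyDeserializer=org.apache.kafka.common.serialization.StringDeserializer"
                + "&valueDeserializer=org.apache.kafka.common.serialization.ByteArrayDeserializer")
            .unmarshal().custom("avroSensorData")   // the Avro data format registered elsewhere
            .to("direct:store");                    // placeholder target
    }
}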
http://camel.465427.n5.nabble.com/Camel-Kafka-Component-td5749525.html#a5769561
describes that Apache Camel 2.16.0/2.15.3 will support various data types and not only String messages.
As promised, this has been fixed in Apache Camel 2.15.3 via CAMEL-8790 (https://issues.apache.org/jira/browse/CAMEL-8790).
I am new to IBM MQ with Apache Camel. Below is my configuration:
<from IBM MQ>
<parallel>
<to ACTIVE MQ>
<to IBM MQ>
</parallel>
My application is running in a Spring container. Sometimes I get the warning below and my route stops working (IBM MQ is not reading messages).
Exception
org.springframework.jms.listener.DefaultMessageListenerContainer handleListenerSetupFailure
WARNING: Setup of JMS message listener invoker failed for destination 'temporary' - trying to recover. Cause: User XXXXXX is not authorized to create: temp-queue://ID:IP Address-1:26:1
Feb 20, 2015 2:59:07 AM org.springframework.jms.listener.DefaultMessageListenerContainer refreshConnectionUntilSuccessful
INFO: Successfully refreshed JMS Connection
Feb 20, 2015 2:59:08 AM org.springframework.jms.listener.DefaultMessageListenerContainer handleListenerSetupFailure
WARNING: Setup of JMS message listener invoker failed for destination 'temporary' - trying to recover. Cause: JMSWMQ2008: Failed to open MQ queue 'SYSTEM.DEFAULT.MODEL.QUEUE'.; nested exception is com.ibm.mq.MQException: JMSCMQ0001: WebSphere MQ call failed with compcode '2' ('MQCC_FAILED') reason '2035' ('MQRC_NOT_AUTHORIZED').
I am using both ActiveMQ and IBM MQ. Why is the Spring JMS listener throwing this warning?
You should make sure you don't have request-reply specified as the exchange pattern.
I would explicitly state that the messages should be "InOnly":
<inOnly uri="ACTIVEMQ..."/>
<inOnly uri="IBM MQ..."/>
If you intend to do a request/reply, then you need to make sure your WebSphere MQ user has rights to access SYSTEM.DEFAULT.MODEL.QUEUE.
Something like this should allow authority:
setmqaut -m QMGR -t q -n SYSTEM.DEFAULT.MODEL.QUEUE -g mygroup +dsp +inq +get
I am learning web service security. I am using the CXF framework for that. I have developed a test service that just doubles the value we send. Based on this tutorial
I have added the WS-Policy for XML encryption and signature.
Then I developed the web service client for this service as an Eclipse project using CXF.
The following is my client configuration file:
<jaxws:client id="doubleItClient" serviceClass="com.DoubleIt"
              address="http://localhost:8080/myencws/services/DoubleItPort?wsdl">
  <jaxws:features>
    <bean class="org.apache.cxf.feature.LoggingFeature" />
  </jaxws:features>
  <jaxws:properties>
    <entry key="ws-security.callback-handler" value="com.ClientKeystorePasswordCallback"/>
    <entry key="ws-security.encryption.properties" value="com/clientKeystore.properties"/>
    <entry key="ws-security.signature.properties" value="com/clientKeystore.properties"/>
    <entry key="ws-security.encryption.username" value="myservicekey"/>
  </jaxws:properties>
</jaxws:client>
I have generated all the keystore files, created the clientKeystore.properties file, and placed it in the src directory of my project.
But whenever I run this client, the SOAP request message is not encrypted, so on the server side I get an exception like:
These policy alternatives can not be satisfied:
{http://docs.oasis-open.org/ws-sx/ws-securitypolicy/200702}EncryptedParts
{http://docs.oasis-open.org/ws-sx/ws-securitypolicy/200702}SignedParts
The following is my SOAP request
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <ns2:doubleValue xmlns:ns2="http://com/">
      <arg0>5</arg0>
    </ns2:doubleValue>
  </soap:Body>
</soap:Envelope>
I am using CXF 2.7.3. I don't know what's wrong. Please help me.
I had a similar issue with my code before; what was missing were the jar dependencies that do the actual encryption when the security policies are read by your client from the WSDL.
My fix was to add certain Maven dependencies to the POM to enable encryption. Check this URL: http://cxf.apache.org/docs/using-cxf-with-maven.html
Also read the "Enabling WS-SecurityPolicy" section at http://cxf.apache.org/docs/ws-securitypolicy.html
I hope this helps.
Make sure you are using the correct libraries. Try to include the CXF bundle only and remove the other CXF dependencies.
If you are using Maven, something like this:
<dependency>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-bundle</artifactId>
<version>2.7.18</version>
</dependency>
I ran into the same issue, and after much experimentation the following guidelines have helped every single time.
1. Structure your CXF client config XML so that it imports META-INF/cxf.xml.
2. Define the CXF bus features (for logging).
3. Define the HTTP conduits (if needed, e.g. for the TLS handshake).
4. Give the jaxws:client bean a name attribute of the form {targetNamespaceOfTheWSDL}PortName, with createdFromAPI=true and abstract=true.
5. Make the client tag contain the jaxws features. Remember to use the newer "security.*" property prefix and not "ws-security.*".
6. In your Java client class, use the SpringBus to load the CXF client config XML (SVN link for SpringBus client config; see the sketch after the note below).
7. Make sure all the required dependencies for WS-Policy processing are present on the classpath, such as cxf-rt-ws-policy and cxf-rt-ws-security, plus the Bouncy Castle providers if needed.
Note:
security.signature.properties and security.encryption.properties can be externalized as well and referred to directly with an absolute path in the XML value.
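As an illustration of point 6, a hedged sketch of a client class that loads the configuration through a SpringBus (the config file name and the commented-out service classes are placeholders):

import org.apache.cxf.Bus;
import org.apache.cxf.BusFactory;
import org.apache.cxf.bus.spring.SpringBusFactory;

public class DoubleItSecureClient {
    public static void main(String[] args) throws Exception {
        // Build a bus from the client configuration so the jaxws:client bean,
        // the logging feature, and the security.* properties are all applied.
        SpringBusFactory busFactory = new SpringBusFactory();
        Bus bus = busFactory.createBus("client-beans.xml");   // placeholder file name
        BusFactory.setDefaultBus(bus);

        // Any JAX-WS proxy created after this point picks up the bus configuration,
        // e.g. DoubleIt port = new DoubleItService().getDoubleItPort();
    }
}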