Apache Camel: CORBA Endpoint Message Error - apache-camel

I am calling a CORBA server using a CXF client.
The CXF client successfully receives the response when it is a primitive type, but when the server sends objects, it fails with the error below.
JDK version: jdk1.7.0_55
CXF version: 2.6.0
ESB server: JBoss Fuse 6.0.0
The error log says:
10:51:30,208 | WARN | eadpool; w: Idle | PhaseInterceptorChain | 150 - org.apache.cxf.cxf-api - 2.6.0.redhat-60024 | Interceptor for {http://cxf.apache.org/bindings/corba/idl/subsProfileProv}balancemanagement.SubscriberProfileProvisionCORBAService#{http://cxf.apache.org/bindings/corba/idl/subsProfileProv}createSubscriberProfile has thrown exception, unwinding now
org.apache.cxf.binding.corba.CorbaBindingException: org.apache.cxf.binding.corba.CorbaBindingException: Error reading streamable value
at org.apache.cxf.binding.corba.CorbaConduit.close(CorbaConduit.java:145)[171:org.apache.cxf.cxf-rt-bindings-corba:2.6.0.redhat-60024]
at org.apache.cxf.interceptor.MessageSenderInterceptor$MessageSenderEndingInterceptor.handleMessage(MessageSenderInterceptor.java:62)[150:org.apache.cxf.cxf-api:2.6.0.redhat-60024]
at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:262)[150:org.apache.cxf.cxf-api:2.6.0.redhat-60024]
at org.apache.cxf.endpoint.ClientImpl.doInvoke(ClientImpl.java:530)[150:org.apache.cxf.cxf-api:2.6.0.redhat-60024]
at org.apache.cxf.endpoint.ClientImpl.invoke(ClientImpl.java:456)[150:org.apache.cxf.cxf-api:2.6.0.redhat-60024]
at org.apache.camel.component.cxf.CxfProducer.process(CxfProducer.java:112)[197:org.apache.camel.camel-cxf:2.10.0.redhat-60024]
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.SendProcessor$2.doInAsyncProducer(SendProcessor.java:122)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.impl.ProducerCache.doInAsyncProducer(ProducerCache.java:298)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:117)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.DelegateAsyncProcessor.processNext(DelegateAsyncProcessor.java:99)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:73)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.DelegateAsyncProcessor.processNext(DelegateAsyncProcessor.java:99)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.interceptor.TraceInterceptor.process(TraceInterceptor.java:91)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.DelegateAsyncProcessor.processNext(DelegateAsyncProcessor.java:99)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.fabric.FabricTraceProcessor.process(FabricTraceProcessor.java:81)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.DelegateAsyncProcessor.processNext(DelegateAsyncProcessor.java:99)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.RedeliveryErrorHandler.processErrorHandler(RedeliveryErrorHandler.java:334)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:220)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.interceptor.StreamCachingInterceptor.process(StreamCachingInterceptor.java:52)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.RouteContextProcessor.processNext(RouteContextProcessor.java:46)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.interceptor.DefaultChannel.process(DefaultChannel.java:308)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.Pipeline.process(Pipeline.java:117)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.Pipeline.process(Pipeline.java:80)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.RouteContextProcessor.processNext(RouteContextProcessor.java:46)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.UnitOfWorkProcessor.processAsync(UnitOfWorkProcessor.java:150)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.UnitOfWorkProcessor.process(UnitOfWorkProcessor.java:117)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.RouteInflightRepositoryProcessor.processNext(RouteInflightRepositoryProcessor.java:48)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.DelegateAsyncProcessor.processNext(DelegateAsyncProcessor.java:99)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:73)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:99)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:86)[133:org.apache.camel.camel-core:2.10.0.redhat-60024]
at org.apache.camel.component.cxf.CxfConsumer$1.syncInvoke(CxfConsumer.java:133)[197:org.apache.camel.camel-cxf:2.10.0.redhat-60024]
at org.apache.camel.component.cxf.CxfConsumer$1.invoke(CxfConsumer.java:75)[197:org.apache.camel.camel-cxf:2.10.0.redhat-60024]
at org.apache.cxf.interceptor.ServiceInvokerInterceptor$1.run(ServiceInvokerInterceptor.java:58)[150:org.apache.cxf.cxf-api:2.6.0.redhat-60024]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)[:1.7.0_55]
at java.util.concurrent.FutureTask.run(FutureTask.java:262)[:1.7.0_55]
at org.apache.cxf.workqueue.SynchronousExecutor.execute(SynchronousExecutor.java:37)[150:org.apache.cxf.cxf-api:2.6.0.redhat-60024]
at org.apache.cxf.interceptor.ServiceInvokerInterceptor.handleMessage(ServiceInvokerInterceptor.java:107)[150:org.apache.cxf.cxf-api:2.6.0.redhat-60024]
at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:262)[150:org.apache.cxf.cxf-api:2.6.0.redhat-60024]
at org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:121)[150:org.apache.cxf.cxf-api:2.6.0.redhat-60024]
at org.apache.cxf.binding.corba.runtime.CorbaDSIServant.invoke(CorbaDSIServant.java:175)[171:org.apache.cxf.cxf-rt-bindings-corba:2.6.0.redhat-60024]
at com.sun.corba.se.impl.protocol.CorbaServerRequestDispatcherImpl.dispatchToServant(CorbaServerRequestDispatcherImpl.java:642)[:1.7.0_55]
at com.sun.corba.se.impl.protocol.CorbaServerRequestDispatcherImpl.dispatch(CorbaServerRequestDispatcherImpl.java:205)[:1.7.0_55]
at com.sun.corba.se.impl.protocol.CorbaMessageMediatorImpl.handleRequestRequest(CorbaMessageMediatorImpl.java:1700)[:1.7.0_55]
at com.sun.corba.se.impl.protocol.CorbaMessageMediatorImpl.handleRequest(CorbaMessageMediatorImpl.java:1558)[:1.7.0_55]
at com.sun.corba.se.impl.protocol.CorbaMessageMediatorImpl.handleInput(CorbaMessageMediatorImpl.java:940)[:1.7.0_55]
at com.sun.corba.se.impl.protocol.giopmsgheaders.RequestMessage_1_2.callback(RequestMessage_1_2.java:198)[:1.7.0_55]
at com.sun.corba.se.impl.protocol.CorbaMessageMediatorImpl.handleRequest(CorbaMessageMediatorImpl.java:712)[:1.7.0_55]
at com.sun.corba.se.impl.transport.SocketOrChannelConnectionImpl.dispatch(SocketOrChannelConnectionImpl.java:469)[:1.7.0_55]
at com.sun.corba.se.impl.transport.SocketOrChannelConnectionImpl.doWork(SocketOrChannelConnectionImpl.java:1230)[:1.7.0_55]
at com.sun.corba.se.impl.orbutil.threadpool.ThreadPoolImpl$WorkerThread.performWork(ThreadPoolImpl.java:490)[:1.7.0_55]
at com.sun.corba.se.impl.orbutil.threadpool.ThreadPoolImpl$WorkerThread.run(ThreadPoolImpl.java:519)[:1.7.0_55]
Caused by: org.apache.cxf.binding.corba.CorbaBindingException: Error reading streamable value
at org.apache.cxf.binding.corba.runtime.CorbaStreamableImpl._read(CorbaStreamableImpl.java:51)[171:org.apache.cxf.cxf-rt-bindings-corba:2.6.0.redhat-60024]
at org.apache.cxf.binding.corba.utils.FixedAnyImpl.read_value(FixedAnyImpl.java:55)[:2.6.0.redhat-60024]
at com.sun.corba.se.impl.corba.RequestImpl.unmarshalReply(RequestImpl.java:352)[:1.7.0_55]
at com.sun.corba.se.impl.protocol.CorbaMessageMediatorImpl.handleDIIReply(CorbaMessageMediatorImpl.java:476)[:1.7.0_55]
at com.sun.corba.se.impl.protocol.CorbaClientRequestDispatcherImpl.processResponse(CorbaClientRequestDispatcherImpl.java:668)[:1.7.0_55]
at com.sun.corba.se.impl.protocol.CorbaClientRequestDispatcherImpl.marshalingComplete(CorbaClientRequestDispatcherImpl.java:373)[:1.7.0_55]
at com.sun.corba.se.impl.protocol.CorbaClientDelegateImpl.invoke(CorbaClientDelegateImpl.java:147)[:1.7.0_55]
at com.sun.corba.se.impl.corba.RequestImpl.doInvocation(RequestImpl.java:325)[:1.7.0_55]
at com.sun.corba.se.impl.corba.RequestImpl.invoke(RequestImpl.java:246)[:1.7.0_55]
at org.apache.cxf.binding.corba.CorbaConduit.buildRequest(CorbaConduit.java:194)[171:org.apache.cxf.cxf-rt-bindings-corba:2.6.0.redhat-60024]
at org.apache.cxf.binding.corba.CorbaConduit.close(CorbaConduit.java:141)[171:org.apache.cxf.cxf-rt-bindings-corba:2.6.0.redhat-60024]
... 67 more
Caused by: java.lang.IllegalArgumentException
at java.nio.Buffer.position(Buffer.java:236)[:1.7.0_55]
at com.sun.corba.se.impl.encoding.ByteBufferWithInfo.position(ByteBufferWithInfo.java:176)[:1.7.0_55]
at com.sun.corba.se.impl.encoding.CDRInputStream_1_2.alignAndCheck(CDRInputStream_1_2.java:95)[:1.7.0_55]
at com.sun.corba.se.impl.encoding.CDRInputStream_1_0.read_long(CDRInputStream_1_0.java:494)[:1.7.0_55]
at com.sun.corba.se.impl.encoding.CDRInputStream_1_0.readStringOrIndirection(CDRInputStream_1_0.java:553)[:1.7.0_55]
at com.sun.corba.se.impl.encoding.CDRInputStream_1_0.read_string(CDRInputStream_1_0.java:589)[:1.7.0_55]
at com.sun.corba.se.impl.encoding.CDRInputStream.read_string(CDRInputStream.java:175)[:1.7.0_55]
at org.apache.cxf.binding.corba.runtime.CorbaObjectReader.readString(CorbaObjectReader.java:279)[171:org.apache.cxf.cxf-rt-bindings-corba:2.6.0.redhat-60024]
at org.apache.cxf.binding.corba.runtime.CorbaObjectReader.read(CorbaObjectReader.java:111)[171:org.apache.cxf.cxf-rt-bindings-corba:2.6.0.redhat-60024]
at org.apache.cxf.binding.corba.runtime.CorbaObjectReader.readStruct(CorbaObjectReader.java:318)[171:org.apache.cxf.cxf-rt-bindings-corba:2.6.0.redhat-60024]
at org.apache.cxf.binding.corba.runtime.CorbaObjectReader.read(CorbaObjectReader.java:128)[171:org.apache.cxf.cxf-rt-bindings-corba:2.6.0.redhat-60024]
at org.apache.cxf.binding.corba.runtime.CorbaObjectReader.readStruct(CorbaObjectReader.java:318)[171:org.apache.cxf.cxf-rt-bindings-corba:2.6.0.redhat-60024]
at org.apache.cxf.binding.corba.runtime.CorbaObjectReader.read(CorbaObjectReader.java:128)[171:org.apache.cxf.cxf-rt-bindings-corba:2.6.0.redhat-60024]
at org.apache.cxf.binding.corba.runtime.CorbaStreamableImpl._read(CorbaStreamableImpl.java:49)[171:org.apache.cxf.cxf-rt-bindings-corba:2.6.0.redhat-60024]
... 77 more
10:51:30,208 | ERROR | eadpool; w: Idle | DefaultErrorHandler | 133 - org.apache.camel.camel-core - 2.10.0.redhat-60024 | Failed delivery for (MessageId: ID-M-D4S6T72-55807-1454903203683-5-6 on ExchangeId: ID-M-D4S6T72-55807-1454903203683-5-5). Exhausted after delivery attempt: 1 caught: org.apache.cxf.binding.corba.CorbaBindingException: org.apache.cxf.binding.corba.CorbaBindingException: Error reading streamable value

This may be related to https://issues.apache.org/jira/browse/CXF-5254 - a problem with a struct containing a sequence of strings.

Related

Cassandra is not starting due to java.nio.file.FileAlreadyExistsException when loading a trigger jar

I'm trying to use a Trigger in Cassandra 4.1, but initialization fails when loading the jar file.
Trace:
cassandra_1 | INFO [OptionalTasks:1] 2022-11-17 22:35:17,483 CustomClassLoader.java:83 - Loading new jar /etc/cassandra/triggers/cassandra-logger-0.2.jar
cassandra_1 | ERROR [OptionalTasks:1] 2022-11-17 22:35:17,488 JVMStabilityInspector.java:68 - Exception in thread Thread[OptionalTasks:1,5,OptionalTasks]
cassandra_1 | org.apache.cassandra.io.FSWriteError: java.nio.file.FileAlreadyExistsException: /tmp/lib/cassandra-0.jar
(...)
cassandra_1 | Caused by: java.nio.file.FileAlreadyExistsException: /tmp/lib/cassandra-0.jar
cassandra_1 | at java.base/sun.nio.fs.UnixCopyFile.copy(Unknown Source)
cassandra_1 | at java.base/sun.nio.fs.UnixFileSystemProvider.copy(Unknown Source)
cassandra_1 | at java.base/java.nio.file.Files.copy(Unknown Source)
cassandra_1 | at org.apache.cassandra.triggers.CustomClassLoader.addClassPath(CustomClassLoader.java:86)
cassandra_1 | ... 22 common frames omitted
cassandra_1 | ERROR [OptionalTasks:1] 2022-11-17 22:35:17,493 DefaultFSErrorHandler.java:64 - Stopping transports as disk_failure_policy is stop
cassandra_1 | ERROR [OptionalTasks:1] 2022-11-17 22:35:17,494 StorageService.java:501 - Stopping native transport
cassandra_1 | INFO [OptionalTasks:1] 2022-11-17 22:35:17,505 Server.java:176 - Stop listening for CQL clients
cassandra_1 | ERROR [OptionalTasks:1] 2022-11-17 22:35:17,506 StorageService.java:506 - Stopping gossiper
cassandra_1 | WARN [OptionalTasks:1] 2022-11-17 22:35:17,507 StorageService.java:405 - Stopping gossip by operator request
cassandra_1 | INFO [OptionalTasks:1] 2022-11-17 22:35:17,507 Gossiper.java:2087 - Announcing shutdown
cassandra_1 | INFO [OptionalTasks:1] 2022-11-17 22:35:17,509 StorageService.java:2950 - Node /192.168.96.2:7000 state jump to shutdown
cassandra_1 | INFO [OptionalTasks:1] 2022-11-17 22:35:17,513 StorageService.java:2950 - Node /192.168.96.2:7000 state jump to shutdown
Jar file:
https://github.com/felipead/cassandra-logger/releases/download/v0.2/cassandra-logger-0.2.jar
docker-compose.yaml, and cassandra.yaml:
https://github.com/hofstede-matheus/MATB09-postgres-vs-cassandra
This error also happens with hms-cassandra-triggers-1.0.1.jar, so I don't think the problem is with cassandra-logger-0.2.jar itself.
I have tried other .jar files without success.
Thanks in advance.
Using the cassandra:3.0 image in docker-compose.yaml solved this startup issue.
I think the triggers are not compatible with Cassandra 4.1.
The 4.1.0 release of Apache Cassandra changed the way trigger jars are loaded: it now uses java.nio (java.nio.file.Files), whereas earlier releases (4.0 or 3.x) used Guava (com.google.common.io.Files).
Guava's copy method overwrites an existing file by default (given sufficient permissions), but java.nio's does not.
To solve this problem, it's as easy as modifying the class org.apache.cassandra.triggers.CustomClassLoader on line 86 from:
copy(inputJar.toPath(), out.toPath());
to:
copy(inputJar.toPath(), out.toPath(), StandardCopyOption.REPLACE_EXISTING);
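The behavioral difference is easy to demonstrate in isolation. The sketch below (plain Java, no Cassandra involved) shows that `java.nio.file.Files.copy` throws on an existing target by default, and overwrites once `StandardCopyOption.REPLACE_EXISTING` is passed:

```java
import java.io.IOException;
import java.nio.file.FileAlreadyExistsException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class CopyDemo {
    public static void main(String[] args) throws IOException {
        Path src = Files.createTempFile("src", ".jar");
        Path dst = Files.createTempFile("dst", ".jar"); // target already exists

        try {
            // Default behavior: copying onto an existing file throws,
            // which is what CustomClassLoader hits in /tmp/lib.
            Files.copy(src, dst);
        } catch (FileAlreadyExistsException e) {
            System.out.println("FileAlreadyExistsException");
        }

        // With REPLACE_EXISTING the copy overwrites the target,
        // matching Guava's old default behavior.
        Files.copy(src, dst, StandardCopyOption.REPLACE_EXISTING);
        System.out.println("replaced");

        Files.deleteIfExists(src);
        Files.deleteIfExists(dst);
    }
}
```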

Issue with Apache Camel HTTPS REST API with username and password

I have the following piece of code, which I built to connect to an HTTPS REST endpoint using Apache Camel. The problem is that I get a 401 error when it runs.
from("timer:learnTimer?period=100s")
.to("log:?level=INFO&showBody=true")
.setHeader("currentTime", simple(currentTime))
.setHeader(Exchange.CONTENT_TYPE,constant("application/json"))
.setHeader(Exchange.HTTP_METHOD, constant("GET"))
.setHeader(Exchange.HTTP_URI, simple("https://xxxxxx/api/siem/offenses?filter=status%20%3D%20%22OPEN%22%20and%20start_time%20%3E%201543647979000?&authMethod=Basic&authUsername=xxxxx&authPassword=xxxxx"))
.to("https://xxxxxxx/api/siem/offenses?filter=status%20%3D%20%22OPEN%22%20and%20start_time%20%3E%201543647979000?&authMethod=Basic&authUsername=xxxx&authPassword=xxxx").convertBodyTo(String.class)
.to("log:?level=INFO&showBody=true");
The error I am receiving is:
Stacktrace
org.apache.camel.http.common.HttpOperationFailedException: HTTP operation failed invoking https://xx.xx.xx.xx/api/siem/offenses?filter=status+%3D+%22OPEN%22+and+start_time+%3E+1543647979000%3F with statusCode: 401
at org.apache.camel.component.http.HttpProducer.populateHttpOperationFailedException(HttpProducer.java:243)
at org.apache.camel.component.http.HttpProducer.process(HttpProducer.java:165)
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:61)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:148)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:548)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:138)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:101)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:201)
at org.apache.camel.component.timer.TimerConsumer.sendTimerExchange(TimerConsumer.java:197)
at org.apache.camel.component.timer.TimerConsumer$1.run(TimerConsumer.java:79)
at java.util.TimerThread.mainLoop(Timer.java:555)
at java.util.TimerThread.run(Timer.java:505)
15:16| WARN | CamelLogger.java 213 | Error processing exchange. Exchange[ID-zabbixproxy-node2-1544019394005-0-1]. Caused by: [org.apache.camel.http.common.HttpOperationFailedException - HTTP operation failed invoking https://xx.xx.xx.xx/api/siem/offenses?filter=status+%3D+%22OPEN%22+and+start_time+%3E+1543647979000%3F with statusCode: 401]
org.apache.camel.http.common.HttpOperationFailedException: HTTP operation failed invoking https://10.96.40.66/api/siem/offenses?filter=status+%3D+%22OPEN%22+and+start_time+%3E+1543647979000%3F with statusCode: 401
at org.apache.camel.component.http.HttpProducer.populateHttpOperationFailedException(HttpProducer.java:243)
at org.apache.camel.component.http.HttpProducer.process(HttpProducer.java:165)
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:61)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:148)
Are you sure you should set these headers before making a REST call?
Unnecessary request headers in the IN message may cause issues.
Exchange exchange = ExchangeBuilder.anExchange(camelContext)
.withHeader("").withProperty("")
.withPattern(ExchangePattern...)
.withHeader(Exchange.HTTP_METHOD, HttpMethod.GET)
.build();
producer.send("the endpoint to rest", exchange);
// producer is a ProducerTemplate
In the code above you can set the ExchangePattern and the required headers and properties (only if needed).
Hope this helps.
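If the endpoint still returns 401, it is also worth checking whether the credentials actually reach the server at all. Basic authentication (RFC 7617) is just an `Authorization` request header, so as a fallback you can build the header value yourself and set it on the exchange. A minimal sketch of the header construction; the `user`/`pass` values here are placeholders, not anything from the question:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BasicAuthHeader {

    // Build the value of the HTTP "Authorization" header for Basic auth
    // (RFC 7617): "Basic " + base64(user + ":" + pass).
    static String basicAuth(String user, String pass) {
        String token = Base64.getEncoder()
                .encodeToString((user + ":" + pass).getBytes(StandardCharsets.UTF_8));
        return "Basic " + token;
    }

    public static void main(String[] args) {
        // Placeholder credentials for illustration only.
        System.out.println(basicAuth("user", "pass")); // prints: Basic dXNlcjpwYXNz
    }
}
```

In a route this could then be set with something like `.setHeader("Authorization", constant(basicAuth("user", "pass")))` before the `.to(...)` call, instead of embedding `authUsername`/`authPassword` in both the `HTTP_URI` header and the endpoint URI.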

Camel HL7 - ClosedChannelException while sending ACK back to the client

I'm building an HL7 listener using netty4 and processing HL7 messages. Once a message is successfully processed, an ACK is sent back.
from("hl7NettyListener")
.routeId("route_hl7listener")
.startupOrder(997)
.unmarshal()
.hl7(false)
.to("direct:a");
from("direct:a")
.doTry()
.to("bean:processHL7?method=process")
.doCatch(HL7Exception.class)
.to("direct:ErrorACK")
//.transform(ack())
.stop()
.end()
.transform(ack())
.wireTap("direct:b");
This works fine locally in Eclipse: I fire an HL7 message and get an ACK back.
But when I package this application into a jar, put it on my server, and then try
cat example.hl7 | netcat localhost 4444 (to fire an HL7 message at port 4444 in a Linux environment),
I don't get an ACK back; I get a ClosedChannelException.
DEBUG NettyConsumer - Channel: [id: 0xdf13b06b, L:0.0.0.0/0.0.0.0:4444] writing body: MSH|^~\&|Karisma||Kestral|Kestral|20180309144109.827+1300||ACK^R01|701||2.3.1
2018-03-09 14:41:09,838 [ad #3 - WireTap] DEBUG WireTapProcessor - >>>> (wiretap) direct:b Exchange[]
2018-03-09 14:41:09,839 [ServerTCPWorker] DEBUG NettyConsumer - Caused by: [org.apache.camel.CamelExchangeException - Cannot write response to null. Exchange[ID-annan06-56620-1520559639101-0-2]. Caused by: [java.nio.channels.ClosedChannelException - null]]
org.apache.camel.CamelExchangeException: Cannot write response to null. Exchange[ID-annan06-56620-1520559639101-0-2]. Caused by: [java.nio.channels.ClosedChannelException - null]
at org.apache.camel.component.netty4.handlers.ServerResponseFutureListener.operationComplete(ServerResponseFutureListener.java:54)
at org.apache.camel.component.netty4.handlers.ServerResponseFutureListener.operationComplete(ServerResponseFutureListener.java:36)
at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:514)
at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:488)
at io.netty.util.concurrent.DefaultPromise.access$000(DefaultPromise.java:34)
at io.netty.util.concurrent.DefaultPromise$1.run(DefaultPromise.java:438)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:418)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:440)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:873)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.nio.channels.ClosedChannelException
at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
That worked. It was failing because netcat was closing the connection immediately. Using "netcat -i 5 localhost" made netcat wait for 5 seconds, and the ACK was received successfully.

Camel-Beanio not finding Bean Class

I'm using apache-servicemix-7.0.0 with camel-beanio (2.16.4).
I have defined a route that calls BeanIO to parse a file.
However, the class that the records map to is in an external jar.
I tried wrapping this external jar using:
https://access.redhat.com/documentation/en-US/Red_Hat_JBoss_Fuse/6.0/html/Deploying_into_the_Container/files/UrlHandlers-Wrap.html
and that makes it available.
But when I deploy my Camel blueprint.xml it still throws:
"2017-05-24 15:57:51,566 | ERROR | mix-7.0.0/deploy | BlueprintCamelContext | 40 - org.apache.camel.camel-blueprint - 2.16.4 | Error occurred during starting Camel: CamelContext(_context1) due Invalid record 'record', in stream 'REALITY_FILE': Invalid bean class 'za.co.sci.core.shared.RealityFileRecordModel'
org.beanio.BeanIOConfigurationException: Invalid record 'record', in stream 'REALITY_FILE': Invalid bean class 'za.co.sci.core.shared.RealityFileRecordModel'
at org.beanio.internal.compiler.ProcessorSupport.process(ProcessorSupport.java:93)[232:org.beanio:2.1.0]
...
Caused by: java.lang.ClassNotFoundException: za.co.sci.core.shared.RealityFileRecordModel not found by ESB-POC [257]
at org.apache.felix.framework.BundleWiringImpl.findClassOrResourceByDelegation(BundleWiringImpl.java:1574)[org.apache.felix.framework-5.6.1.jar:]
"
My blueprint and Camel mapping XML are attached.
Any ideas how to make the Camel route find this class?
Thanks,
Jose

Nutch 2.3.1 hangs during inject, parse, fetch, generate

I've read various SO threads about why generating/injecting/parsing/fetching takes so long (or hangs), but with no luck. I've tried implementing the solutions in the following SO threads, without success:
1) Nutch 2.1 urls injection takes forever
2) Nutch 2.2.1 doesnt continue after Injector job
and various other threads.
I'm using Nutch 2.3.1 and HBase 0.94.27. I've been following this and this tutorial and was able to build successfully. But when I fire any Nutch command, it hangs.
Following are the logs I get when firing these commands:
Inject Command
root@ubuntu:~/apache-nutch-2.3.1/runtime/local# ./bin/nutch inject seed/urls.txt
InjectorJob: starting at 2016-05-04 09:59:12
InjectorJob: Injecting urlDir: seed/urls.txt
Generate Command
root@ubuntu:~/apache-nutch-2.3.1/runtime/local# bin/nutch generate -topN 40
GeneratorJob: starting at 2016-05-04 09:54:08
GeneratorJob: Selecting best-scoring urls due for fetch.
GeneratorJob: starting
GeneratorJob: filtering: true
GeneratorJob: normalizing: true
GeneratorJob: topN: 40
Fetch command
root@ubuntu:~/apache-nutch-2.3.1/runtime/local# bin/nutch fetch -all
FetcherJob: starting at 2016-05-04 10:00:14
FetcherJob: fetching all
FetcherJob: threads: 10
FetcherJob: parsing: false
FetcherJob: resuming: false
FetcherJob : timelimit set for : -1
Parse Command
root@ubuntu:~/apache-nutch-2.3.1/runtime/local# bin/nutch parse -all
ParserJob: starting at 2016-05-04 10:00:43
ParserJob: resuming: false
ParserJob: forced reparse: false
ParserJob: parsing all
Update Command
root@ubuntu:~/apache-nutch-2.3.1/runtime/local# bin/nutch updatedb -all
DbUpdaterJob: starting at 2016-05-04 10:02:24
DbUpdaterJob: updatinging all
Following are the HBase logs:
client /0:0:0:0:0:0:0:1:45216
2016-05-04 10:00:47,214 WARN org.apache.zookeeper.server.NIOServerCnxn: caught end of stream exception
EndOfStreamException: Unable to read additional data from client sessionid 0x1547b2be4bc000e, likely client has closed socket
at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:220)
at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:208)
at java.lang.Thread.run(Thread.java:745)
2016-05-04 10:00:47,215 INFO org.apache.zookeeper.server.NIOServerCnxn: Closed socket connection for client /0:0:0:0:0:0:0:1:45216 which had sessionid 0x1547b2be4bc000e
2016-05-04 10:00:47,215 WARN org.apache.zookeeper.server.NIOServerCnxn: caught end of stream exception
EndOfStreamException: Unable to read additional data from client sessionid 0x1547b2be4bc000d, likely client has closed socket
at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:220)
at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:208)
at java.lang.Thread.run(Thread.java:745)
2016-05-04 10:00:47,216 INFO org.apache.zookeeper.server.NIOServerCnxn: Closed socket connection for client /127.0.0.1:59934 which had sessionid 0x1547b2be4bc000d
2016-05-04 10:01:10,000 INFO org.apache.zookeeper.server.ZooKeeperServer: Expiring session 0x1547b2be4bc000c, timeout of 40000ms exceeded
2016-05-04 10:01:10,001 INFO org.apache.zookeeper.server.PrepRequestProcessor: Processed session termination for sessionid: 0x1547b2be4bc000c
2016-05-04 10:01:22,002 INFO org.apache.zookeeper.server.ZooKeeperServer: Expiring session 0x1547b2be4bc000b, timeout of 40000ms exceeded
2016-05-04 10:01:22,003 INFO org.apache.zookeeper.server.PrepRequestProcessor: Processed session termination for sessionid: 0x1547b2be4bc000b
2016-05-04 10:01:28,001 INFO org.apache.zookeeper.server.ZooKeeperServer: Expiring session 0x1547b2be4bc000e, timeout of 40000ms exceeded
2016-05-04 10:01:28,001 INFO org.apache.zookeeper.server.ZooKeeperServer: Expiring session 0x1547b2be4bc000d, timeout of 40000ms exceeded
2016-05-04 10:01:28,001 INFO org.apache.zookeeper.server.PrepRequestProcessor: Processed session termination for sessionid: 0x1547b2be4bc000e
2016-05-04 10:01:28,001 INFO org.apache.zookeeper.server.PrepRequestProcessor: Processed session termination for sessionid: 0x1547b2be4bc000d
2016-05-04 10:02:25,195 INFO org.apache.zookeeper.server.NIOServerCnxnFactory: Accepted socket connection from /127.0.0.1:59938
2016-05-04 10:02:25,202 INFO org.apache.zookeeper.server.ZooKeeperServer: Client attempting to establish new session at /127.0.0.1:59938
2016-05-04 10:02:25,204 INFO org.apache.zookeeper.server.ZooKeeperServer: Established session 0x1547b2be4bc000f with negotiated timeout 40000 for client /127.0.0.1:59938
2016-05-04 10:02:25,822 INFO org.apache.zookeeper.server.NIOServerCnxnFactory: Accepted socket connection from /127.0.0.1:59940
2016-05-04 10:02:25,822 INFO org.apache.zookeeper.server.ZooKeeperServer: Client attempting to establish new session at /127.0.0.1:59940
2016-05-04 10:02:25,825 INFO org.apache.zookeeper.server.ZooKeeperServer: Established session 0x1547b2be4bc0010 with negotiated timeout 40000 for client /127.0.0.1:59940
2016-05-04 10:04:15,530 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Stats: total=2.02 MB, free=243.82 MB, max=245.84 MB, blocks=3, accesses=27, hits=24, hitRatio=88.88%, , cachingAccesses=27, cachingHits=24, cachingHitsRatio=88.88%, , evictions=0, evicted=0, evictedPerRun=NaN
2016-05-04 10:04:28,372 DEBUG org.apache.hadoop.hbase.client.MetaScanner: Scanning .META. starting at row= for max=2147483647 rows using org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation#25e5c862
2016-05-04 10:04:28,379 DEBUG org.apache.hadoop.hbase.master.CatalogJanitor: Scanned 0 catalog row(s) and gc'd 0 unreferenced parent region(s)
2016-05-04 10:09:15,530 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: Stats: total=2.02 MB, free=243.82 MB, max=245.84 MB, blocks=3, accesses=27, hits=24, hitRatio=88.88%, , cachingAccesses=27, cachingHits=24, cachingHitsRatio=88.88%, , evictions=0, evicted=0, evictedPerRun=NaN
Hadoop.log
2016-05-04 10:42:18,132 INFO crawl.InjectorJob - InjectorJob: starting at 2016-05-04 10:42:18
2016-05-04 10:42:18,134 INFO crawl.InjectorJob - InjectorJob: Injecting urlDir: seed/urls.txt
2016-05-04 10:42:18,527 WARN util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
What exactly is the problem? I have configured everything correctly and it still hangs.
