Apache CXF WebClient doesn't work as expected in TomEE 8

I am trying to get the JWK key set from Google for use with the Apache CXF OIDC and JOSE libraries. The code works fine when I run it in a standalone main method:
import java.util.Arrays;
import javax.ws.rs.core.MediaType;
import org.apache.cxf.jaxrs.client.WebClient;
import org.apache.cxf.rs.security.jose.jaxrs.JsonWebKeysProvider;
import org.apache.cxf.rs.security.jose.jwk.JsonWebKeys;

public class Main {

    /**
     * @param args the command line arguments
     */
    public static void main(String[] args) {
        final WebClient client = WebClient.create("https://www.googleapis.com/oauth2/v3/certs",
                Arrays.asList(new JsonWebKeysProvider()), true).accept(MediaType.APPLICATION_JSON);
        JsonWebKeys keys = client.get(JsonWebKeys.class);
        keys.getKeys().forEach(key -> {
            System.out.println("****************************************************************************");
            System.out.println("ID........." + key.getKeyId());
            System.out.println("Alg........" + key.getAlgorithm());
            System.out.println("Key Type..." + key.getKeyType());
            System.out.println("Use........" + key.getPublicKeyUse());
        });
    }
}
The ID, algorithm, key type and use are printed properly, meaning that the keys are properly populated.
Sample output:
****************************************************************************
ID.........79c809dd1186cc228c4baf9358599530ce92b4c8
Alg........RS256
Key Type...RSA
Use........sig
****************************************************************************
ID.........17d55ff4e10991d6b0efd392b91a33e54c0e218b
Alg........RS256
Key Type...RSA
Use........sig
pom.xml extract for the Main class:
<dependencies>
    <dependency>
        <groupId>org.apache.cxf</groupId>
        <artifactId>cxf-rt-rs-client</artifactId>
        <version>3.3.5</version>
    </dependency>
    <dependency>
        <groupId>org.apache.cxf</groupId>
        <artifactId>cxf-rt-rs-security-sso-oidc</artifactId>
        <version>3.3.5</version>
    </dependency>
</dependencies>
The same code, however, doesn't work when deployed in TomEE 8.
import java.io.IOException;
import java.io.PrintWriter;
import java.util.Arrays;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.ws.rs.core.MediaType;
import org.apache.cxf.jaxrs.client.WebClient;
import org.apache.cxf.rs.security.jose.jaxrs.JsonWebKeysProvider;
import org.apache.cxf.rs.security.jose.jwk.JsonWebKeys;

@WebServlet(name = "NewServlet", urlPatterns = {"/x"})
public class NewServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        PrintWriter writer = response.getWriter();
        final WebClient client = WebClient.create("https://www.googleapis.com/oauth2/v3/certs",
                Arrays.asList(new JsonWebKeysProvider()), true).accept(MediaType.APPLICATION_JSON);
        JsonWebKeys keys = client.get(JsonWebKeys.class);
        keys.getKeys().forEach(key -> {
            writer.println("****************************************************************************");
            writer.println("ID........." + key.getKeyId());
            writer.println("Alg........" + key.getAlgorithm());
            writer.println("Key Type..." + key.getKeyType());
            writer.println("Use........" + key.getPublicKeyUse());
        });
    }
}
The ID, algorithm, key type and use are null when this code runs in TomEE 8. The CXF OIDC and JOSE jars are installed in the tomee/lib folder.
Sample output:
****************************************************************************
ID.........null
Alg........null
Key Type...null
Use........null
****************************************************************************
ID.........null
Alg........null
Key Type...null
Use........null
pom.xml extract for the servlet:
<dependencies>
    <dependency>
        <groupId>org.apache.tomee</groupId>
        <artifactId>javaee-api</artifactId>
        <version>8.0-3</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.cxf</groupId>
        <artifactId>cxf-rt-frontend-jaxrs</artifactId>
        <version>${cxf.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.cxf</groupId>
        <artifactId>cxf-rt-rs-security-sso-oidc</artifactId>
        <version>${cxf.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.cxf</groupId>
        <artifactId>cxf-rt-rs-client</artifactId>
        <version>${cxf.version}</version>
        <scope>provided</scope>
    </dependency>
</dependencies>
What is causing this issue?

I realized that when the WebClient is created inside TomEE, it picks up bus properties provided by TomEE, which was causing JsonWebKeysProvider not to be invoked.
In my case, the following is the correct way to create the client inside TomEE:
JAXRSClientFactoryBean sf = new JAXRSClientFactoryBean();
sf.setAddress("https://www.googleapis.com/oauth2/v3/certs");
sf.setProvider(new JsonWebKeysProvider());
sf.setBus(new ExtensionManagerBus());
Calling sf.setBus(new ExtensionManagerBus()) ensures the TomEE-provided values/properties aren't picked up.
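For completeness, a minimal sketch of the full setup (assuming the same imports as the question's code, plus org.apache.cxf.jaxrs.client.JAXRSClientFactoryBean and org.apache.cxf.bus.extension.ExtensionManagerBus):
JAXRSClientFactoryBean sf = new JAXRSClientFactoryBean();
sf.setAddress("https://www.googleapis.com/oauth2/v3/certs");
sf.setProvider(new JsonWebKeysProvider());
// a fresh bus, so the TomEE-provided bus properties are not inherited
sf.setBus(new ExtensionManagerBus());
WebClient client = sf.createWebClient().accept(MediaType.APPLICATION_JSON);
JsonWebKeys keys = client.get(JsonWebKeys.class);
With this, JsonWebKeysProvider is invoked as expected and the key fields are populated just like in the standalone case.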

Related

Flink TableEnvironment.create throws NoSuchMethodError

I am testing the Flink Hive connector, following the instructions here: https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/connectors/table/hive/overview/.
The final code is as follows. I tried to run it in the IntelliJ IDE. Unfortunately, it doesn't work: TableEnvironment.create throws a NoSuchMethodError.
public static void main(String[] args) throws Exception {
    EnvironmentSettings settings = EnvironmentSettings.inStreamingMode();
    TableEnvironment tableEnv = TableEnvironment.create(settings); // throws NoSuchMethodError
    String name = "myhive";
    String defaultDatabase = "default";
    String hiveConfDir = "/Users/gaoxiahong/apache-hive-3.1.2-bin/conf";
    HiveCatalog hive = new HiveCatalog(name, defaultDatabase, hiveConfDir);
    tableEnv.registerCatalog(name, hive);
    tableEnv.useCatalog(name);
    System.out.println(tableEnv.executeSql("show tables"));
}
The exception message is as follows:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.calcite.sql.parser.SqlParser.config()Lorg/apache/calcite/sql/parser/SqlParser$Config;
    at org.apache.flink.table.planner.delegation.PlannerContext.lambda$getSqlParserConfig$1(PlannerContext.java:263)
    at java.util.Optional.orElseGet(Optional.java:267)
    at org.apache.flink.table.planner.delegation.PlannerContext.getSqlParserConfig(PlannerContext.java:257)
    at org.apache.flink.table.planner.delegation.PlannerContext.createFrameworkConfig(PlannerContext.java:148)
    at org.apache.flink.table.planner.delegation.PlannerContext.<init>(PlannerContext.java:130)
    at org.apache.flink.table.planner.delegation.PlannerBase.<init>(PlannerBase.scala:116)
    at org.apache.flink.table.planner.delegation.StreamPlanner.<init>(StreamPlanner.scala:62)
    at org.apache.flink.table.planner.delegation.DefaultPlannerFactory.create(DefaultPlannerFactory.java:64)
    at org.apache.flink.table.factories.PlannerFactoryUtil.createPlanner(PlannerFactoryUtil.java:52)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.create(TableEnvironmentImpl.java:302)
    at org.apache.flink.table.api.TableEnvironment.create(TableEnvironment.java:93)
    at com.yqg.flinkhive.Test.main(Test.java:18)
My Flink version is 1.15.2 and my Hive version is 3.1.2. The pom.xml file looks like:
<properties>
    <flink.version>1.15.2</flink.version>
    <hive.version>3.1.2</hive.version>
    <scala.version>2.12</scala.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-hive_${scala.version}</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-api-java-bridge</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.hive</groupId>
        <artifactId>hive-exec</artifactId>
        <version>${hive.version}</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-planner_${scala.version}</artifactId>
        <version>${flink.version}</version>
        <scope>provided</scope>
    </dependency>
</dependencies>
Can anyone help me figure out the issue here? Thanks in advance!
Per https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/dev/configuration/overview/ you most likely need to swap flink-table-api-java-bridge for flink-table-api-scala-bridge_2.12.
See also https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/connectors/table/hive/overview/#program-maven
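For example, the bridge dependency in the question's pom.xml would then become (a sketch reusing the question's version properties):
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-api-scala-bridge_${scala.version}</artifactId>
    <version>${flink.version}</version>
    <scope>provided</scope>
</dependency>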

Does org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer implement SinkFunction<T>?

I am trying to implement a simple Flink job that uses org.apache.flink.streaming.connectors, takes a Kafka topic as the input source, and outputs to a Kafka sink. I am following this guide https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/connectors/datastream/kafka/ and wrote code as such:
FlinkKafkaConsumer<String> kafkaConsumer = new FlinkKafkaConsumer<>(TOPIC_IN, new SimpleStringSchema(), props);
//FlinkKafkaConsumer<String> testKafkaConsumer = new FlinkKafkaConsumer<>(TOPIC_TEST, new SimpleStringSchema(), props);
kafkaConsumer.setStartFromEarliest();
DataStream<String> dataStream = env.addSource(kafkaConsumer);

StringSchema stringSchema = new StringSchema(TOPIC_OUT);
FlinkKafkaProducer<String> kafkaProducer = new FlinkKafkaProducer<>(TOPIC_OUT, stringSchema, props, FlinkKafkaProducer.Semantic.EXACTLY_ONCE);
//addSink((SinkFunction<String>) kafkaProducer);
dataStream.addSink(kafkaProducer);
However, addSink needs a SinkFunction, while I provide a FlinkKafkaProducer, which extends TwoPhaseCommitSinkFunction. I am confused about why it complains and does not work.
My pom.xml file is as follows:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.11</artifactId>
    <version>1.13.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-base</artifactId>
    <version>1.13.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-streaming-java -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java_2.12</artifactId>
    <version>1.13.2</version>
    <scope>provided</scope>
</dependency>
It seems this class has been deprecated: https://ci.apache.org/projects/flink/flink-docs-master/api/java/org/apache/flink/streaming/connectors/kafka/package-summary.html.
There is no FlinkKafkaProducer constructor with the signature you're using. You could use this one:
public FlinkKafkaProducer(
        String topicId,
        SerializationSchema<IN> serializationSchema,
        Properties producerConfig,
        @Nullable FlinkKafkaPartitioner<IN> customPartitioner,
        FlinkKafkaProducer.Semantic semantic,
        int kafkaProducersPoolSize)
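For example, a sketch of that call using the question's TOPIC_OUT, props, and dataStream (passing null keeps the default partitioning; DEFAULT_KAFKA_PRODUCERS_POOL_SIZE is the pool-size constant the library uses by default):
FlinkKafkaProducer<String> kafkaProducer = new FlinkKafkaProducer<>(
        TOPIC_OUT,                                        // target topic
        new SimpleStringSchema(),                         // serialize String records
        props,                                            // Kafka producer properties
        null,                                             // no custom partitioner
        FlinkKafkaProducer.Semantic.EXACTLY_ONCE,         // transactional writes
        FlinkKafkaProducer.DEFAULT_KAFKA_PRODUCERS_POOL_SIZE);
dataStream.addSink(kafkaProducer);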

Undocumented constraint? Publishing to a topic *from* a Pub/Sub trigger

I don't know if I'm going crazy, or if this is a limitation that just isn't documented (I've scoured the GCP API docs):
Is it possible to have a Cloud Function with a Pub/Sub trigger on 'topic A' and, inside that Cloud Function, publish a message to 'topic B'?
I've tried all the other triggers with identical code (HTTP triggers, Cloud Storage triggers, Firebase triggers), and they all successfully publish to topics.
But the moment I (almost literally) copy-paste my code into a Pub/Sub trigger, after consuming the message, it just hangs when it attempts to publish its own message to the next topic. The function simply times out while attempting to publish.
So to recap, is the following possible in GCP?
PubSub Topic A --> Cloud Function --> PubSub Topic B
Thanks in advance for any clarifications! This is all in Java 11. Here's the code:
...<bunch of imports>
public class SignedURLGenerator implements BackgroundFunction<PubSubMessage> {

    private static final String PROJECT_ID = System.getenv("GOOGLE_CLOUD_PROJECT");
    private static final Logger logger = Logger.getLogger(SignedURLGenerator.class.getName());

    /**
     * Handle the incoming PubsubMessage
     **/
    @Override
    public void accept(PubSubMessage message, Context context) throws IOException, InterruptedException {
        String data = new String(Base64.getDecoder().decode(message.data));
        System.out.println("The input message is: " + data);
        //Do a bunch of other stuff not relevant to the issue at hand...
        publishSignedURL(url.toString());
    }

    //Here's the interesting part
    public static void publishSignedURL(String message) throws IOException, InterruptedException {
        String topicName = "url-ready-notifier";
        String responseMessage;
        Publisher publisher = null;
        try {
            // Create the PubsubMessage object
            ByteString byteStr = ByteString.copyFrom(message, StandardCharsets.UTF_8);
            PubsubMessage pubsubApiMessage = PubsubMessage.newBuilder().setData(byteStr).build();
            System.out.println("Message Constructed:" + message);
            //This part works fine, the message gets constructed
            publisher = Publisher.newBuilder(ProjectTopicName.of(PROJECT_ID, topicName)).build();
            System.out.println("Publisher Created.");
            //This part also works fine, the publisher gets created
            publisher.publish(pubsubApiMessage).get();
            responseMessage = "Message published.";
            //The code NEVER GETS HERE. The message is never published. And eventually the cloud function times out :(
        } catch (InterruptedException | ExecutionException e) {
            System.out.println("Something went wrong with publishing: " + e.getMessage());
        }
        System.out.println("Everything wrapped up.");
    }
}
Edit
As requested, this is my current POM:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>cloudfunctions</groupId>
    <artifactId>pubsub-function</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.target>11</maven.compiler.target>
        <maven.compiler.source>11</maven.compiler.source>
    </properties>

    <dependencies>
        <dependency>
            <groupId>com.google.cloud</groupId>
            <artifactId>libraries-bom</artifactId>
            <version>20.6.0</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
        <dependency>
            <groupId>com.google.cloud.functions</groupId>
            <artifactId>functions-framework-api</artifactId>
            <version>1.0.1</version>
            <type>jar</type>
        </dependency>
        <dependency>
            <groupId>com.google.cloud</groupId>
            <artifactId>google-cloud-storage</artifactId>
            <version>1.117.1</version>
        </dependency>
        <dependency>
            <groupId>com.google.cloud</groupId>
            <artifactId>google-cloud-pubsub</artifactId>
            <version>1.113.4</version>
        </dependency>
        <dependency>
            <groupId>com.google.api</groupId>
            <artifactId>gax</artifactId>
            <version>1.66.0</version>
        </dependency>
        <dependency>
            <groupId>com.google.api</groupId>
            <artifactId>gax-grpc</artifactId>
            <version>1.66.0</version>
        </dependency>
        <dependency>
            <groupId>org.threeten</groupId>
            <artifactId>threetenbp</artifactId>
            <version>0.7.2</version>
        </dependency>
    </dependencies>
</project>
Can you try to explicitly set the flow control parameters in your publisher client? Like this:
publisher = Publisher.newBuilder(ProjectTopicName.of(PROJECT_ID, topicName))
        .setBatchingSettings(BatchingSettings.newBuilder()
                .setDelayThreshold(Duration.of(10, ChronoUnit.SECONDS))
                .setElementCountThreshold(1L)
                .setIsEnabled(true)
                .build())
        .build();
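If I read the batching behavior correctly, setElementCountThreshold(1L) should make the publisher flush every message as soon as it is published rather than waiting for a batch to fill, and setDelayThreshold caps how long a batch may sit before being sent.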
I don't know what's happening; maybe it's a default, global configuration of Pub/Sub. If it's not that, I will delete this answer.
EDIT 1
Here is a screen capture of the builder class in the Publisher parent class:
It shows all the default values of the library. However, the behavior that you observe isn't normal; the defaults should stay the defaults even in a Pub/Sub trigger. I will open an issue and forward it to the team directly.

Spring Boot starter for Apache Camel (AMQP) doesn't find the ConnectionFactory bean

I created an application to read messages from Apache Qpid and send them to Apache Kafka. I am using Camel with the Spring Boot starter. My POM looks like this:
<dependencyManagement>
    <dependencies>
        <!-- Camel BOM -->
        <dependency>
            <groupId>org.apache.camel.springboot</groupId>
            <artifactId>camel-spring-boot-dependencies</artifactId>
            <version>${camel.version}</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
        <!-- ... other BOMs or dependencies ... -->
    </dependencies>
</dependencyManagement>

<dependencies>
    <!-- starters -->
    <dependency>
        <groupId>org.apache.camel.springboot</groupId>
        <artifactId>camel-spring-boot-starter</artifactId>
    </dependency>
    <dependency>
        <groupId>org.apache.camel.springboot</groupId>
        <artifactId>camel-amqp-starter</artifactId>
    </dependency>
    <dependency>
        <groupId>org.apache.camel.springboot</groupId>
        <artifactId>camel-kafka-starter</artifactId>
    </dependency>
    <!-- other camel dependencies -->
    <dependency>
        <groupId>org.apache.camel</groupId>
        <artifactId>camel-spring</artifactId>
    </dependency>
    <dependency>
        <groupId>org.apache.camel</groupId>
        <artifactId>camel-amqp</artifactId>
    </dependency>
</dependencies>

<build>
    <plugins>
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
            <version>2.3.0.RELEASE</version>
        </plugin>
    </plugins>
</build>
and the Spring application class is:
@SpringBootApplication
public class CamelSpringJmsKafkaApplication {

    public static void main(String[] args) {
        SpringApplication.run(CamelSpringJmsKafkaApplication.class, args);
    }

    @Bean
    public JmsConnectionFactory jmsConnectionFactory(@Value("${qpidUser}") String qpidUser, @Value("${qpidPassword}") String qpidPassword, @Value("${qpidBrokerUrl}") String qpidBrokerUrl) {
        // note: the first argument should be the user name, not the password
        return new JmsConnectionFactory(qpidUser, qpidPassword, qpidBrokerUrl);
    }

    @Bean
    @Primary
    public CachingConnectionFactory jmsCachingConnectionFactory(JmsConnectionFactory jmsConnectionFactory) {
        return new CachingConnectionFactory(jmsConnectionFactory);
    }
}
application.properties is:
camel.springboot.main-run-controller = true
camel.component.amqp.enabled = true
camel.component.amqp.connection-factory = jmsCachingConnectionFactory
camel.component.amqp.async-consumer = true
camel.component.amqp.concurrent-consumers = 1
camel.component.amqp.map-jms-message = true
camel.component.amqp.test-connection-on-startup = true
camel.component.kafka.brokers = localhost:9092
qpidBrokerUrl = amqp://localhost:5672?jms.username=guest&jms.password=guest&jms.clientID=clientid2&amqp.vhost=default
qpidUser = guest
qpidPassword = guest
The RouteBuilder is:
@Component
public class QpidToKafkaRoute extends RouteBuilder {
    public void configure() throws Exception {
        from("amqp:queue:test")
            .log("Received key : ${header.JMSMessageID}, message : ${body}")
            .setHeader(KafkaConstants.KEY, header("JMSMessageID"))
            .to("kafka:camel")
            .log("Sent key : ${headers[kafka.KEY]}, message : ${body}");
    }
}
When I start this application, it throws the following exception:
org.apache.camel.FailedToStartRouteException: Failed to start route route1 because of null
    at org.apache.camel.impl.engine.RouteService.warmUp(RouteService.java:125) ~[camel-base-3.4.0.jar:3.4.0]
Caused by: java.lang.IllegalArgumentException: connectionFactory must be specified
    at org.apache.camel.util.ObjectHelper.notNull(ObjectHelper.java:152) ~[camel-util-3.4.0.jar:3.4.0]
    at org.apache.camel.component.jms.JmsConfiguration.createConnectionFactory(JmsConfiguration.java:1629) ~[camel-jms-3.4.0.jar:3.4.0]
    at org.apache.camel.component.jms.JmsConfiguration.getOrCreateConnectionFactory(JmsConfiguration.java:773) ~[camel-jms-3.4.0.jar:3.4.0]
    at org.apache.camel.component.jms.JmsConfiguration.createListenerConnectionFactory(JmsConfiguration.java:1638) ~[camel-jms-3.4.0.jar:3.4.0]
    at org.apache.camel.component.jms.JmsConfiguration.getOrCreateListenerConnectionFactory(JmsConfiguration.java:816) ~[camel-jms-3.4.0.jar:3.4.0]
    at org.apache.camel.component.jms.JmsConfiguration.configureMessageListenerContainer(JmsConfiguration.java:1468) ~[camel-jms-3.4.0.jar:3.4.0]
    at org.apache.camel.component.jms.JmsConfiguration.createMessageListenerContainer(JmsConfiguration.java:725) ~[camel-jms-3.4.0.jar:3.4.0]
    at org.apache.camel.component.jms.JmsEndpoint.createMessageListenerContainer(JmsEndpoint.java:189) ~[camel-jms-3.4.0.jar:3.4.0]
    at org.apache.camel.component.jms.JmsEndpoint.createConsumer(JmsEndpoint.java:184) ~[camel-jms-3.4.0.jar:3.4.0]
    at org.apache.camel.component.jms.JmsEndpoint.createConsumer(JmsEndpoint.java:73) ~[camel-jms-3.4.0.jar:3.4.0]
    at org.apache.camel.impl.engine.DefaultRoute.addServices(DefaultRoute.java:560) ~[camel-base-3.4.0.jar:3.4.0]
    at org.apache.camel.impl.engine.DefaultRoute.onStartingServices(DefaultRoute.java:166) ~[camel-base-3.4.0.jar:3.4.0]
    at org.apache.camel.impl.engine.RouteService.doWarmUp(RouteService.java:153) ~[camel-base-3.4.0.jar:3.4.0]
    at org.apache.camel.impl.engine.RouteService.warmUp(RouteService.java:123) ~[camel-base-3.4.0.jar:3.4.0]
Could you please suggest why, during autoconfiguration, Spring Boot is not finding the connectionFactory? When I debug this code, I can see that the connectionFactory bean is getting created. I can even see one more log line:
CamelContext has only been running for less than a second. If you intend to run Camel for a longer time then you can set the property camel.springboot.main-run-controller=true in application.properties or add spring-boot-starter-web JAR to the classpath.
However, as you can see in my application.properties file, the required property is present at the very beginning.
One more log line I can see at the beginning of application startup:
[main] trationDelegate$BeanPostProcessorChecker : Bean 'org.apache.camel.spring.boot.CamelAutoConfiguration' of type [org.apache.camel.spring.boot.CamelAutoConfiguration] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
Note: interestingly, exactly the same code was running fine last night. I just restarted my desktop, and without a single word changing, it now throws this exception. The code can also be seen here: https://github.com/prashantbhardwaj/qpid-to-kafka-using-camel

JMS queue in Apache Camel Java DSL

package com.att.ajsc.deviceeventrouter;

import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.springframework.stereotype.Component;

@Component("messageProcessor")
public class MessageProcessor implements Processor {
    public void process(Exchange exchange) throws Exception {
        System.out.println("-----" + exchange.getIn().getBody());
        exchange.getOut().setBody(exchange.getIn().getBody());
    }
}
import javax.jms.ConnectionFactory;

import org.apache.activemq.ActiveMQConnectionFactory;
import org.apache.camel.CamelContext;
import org.apache.camel.ProducerTemplate;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.jms.JmsComponent;
import org.apache.camel.impl.DefaultCamelContext;
import org.apache.camel.util.jndi.JndiContext;

/**
 * An example class for demonstrating some of the basics behind Camel. This
 * example sends some text messages on to a JMS Queue, consumes them and
 * persists them to disk
 */
public final class CamelJmsToFileExample {

    private CamelJmsToFileExample() {
    }

    public static void main(String args[]) throws Exception {
        /*JndiContext jndiContext = new JndiContext();
        jndiContext.bind("myBean", new MyBean());
        jndiContext.bind("mongoClient", new CommonDbConnection());*/
        CamelContext context = new DefaultCamelContext();
        ConnectionFactory connectionFactory = new ActiveMQConnectionFactory("tcp://localhost:61616");
        context.addComponent("test-jms", JmsComponent.jmsComponentAutoAcknowledge(connectionFactory));
        CustomMessage customMessage = new CustomMessage();
        customMessage.setId("123");
        context.addRoutes(new RouteBuilder() {
            public void configure() {
                System.out.println("Inside the configure");
                from("test-jms:queue:test.queue").to("messageProcessor");
            }
        });
        ProducerTemplate template = context.createProducerTemplate();
        context.start();
        template.sendBody("test-jms:queue:test.queue", customMessage);
    }
}
pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>EmailService</groupId>
    <artifactId>com.emailService</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>emailService</name>

    <dependencies>
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-core</artifactId>
            <version>2.15.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-stream</artifactId>
            <version>2.15.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-jms</artifactId>
            <version>2.15.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.activemq</groupId>
            <artifactId>activemq-camel</artifactId>
            <version>5.6.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.activemq</groupId>
            <artifactId>activemq-pool</artifactId>
            <version>5.11.1</version>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.mongodb/mongo-java-driver -->
        <dependency>
            <groupId>org.mongodb</groupId>
            <artifactId>mongo-java-driver</artifactId>
            <version>3.4.2</version>
        </dependency>
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-mongodb</artifactId>
            <version>2.13.1</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.integration</groupId>
            <artifactId>spring-integration-mongodb</artifactId>
            <version>4.3.8.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-autoconfigure</artifactId>
            <version>1.5.2.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>org.apache.camel</groupId>
            <artifactId>camel-spring-boot</artifactId>
            <version>2.15.1</version>
        </dependency>
    </dependencies>
</project>
The exception I am facing:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Inside the configure
Exception in thread "main" org.apache.camel.FailedToCreateRouteException: Failed to create route route1 at: >>> To[messageProcessor] <<< in route: Route(route1)[[From[test-jms:queue:test.queue]] -> [To[messa... because of No endpoint could be found for: messageProcessor, please check your classpath contains the needed Camel component jar.
    at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:1028)
    at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:185)
    at org.apache.camel.impl.DefaultCamelContext.startRoute(DefaultCamelContext.java:841)
    at org.apache.camel.impl.DefaultCamelContext.startRouteDefinitions(DefaultCamelContext.java:2895)
    at org.apache.camel.impl.DefaultCamelContext.doStartCamel(DefaultCamelContext.java:2618)
    at org.apache.camel.impl.DefaultCamelContext.access$000(DefaultCamelContext.java:167)
    at org.apache.camel.impl.DefaultCamelContext$2.call(DefaultCamelContext.java:2467)
    at org.apache.camel.impl.DefaultCamelContext$2.call(DefaultCamelContext.java:2463)
    at org.apache.camel.impl.DefaultCamelContext.doWithDefinedClassLoader(DefaultCamelContext.java:2486)
    at org.apache.camel.impl.DefaultCamelContext.doStart(DefaultCamelContext.java:2463)
    at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:61)
    at org.apache.camel.impl.DefaultCamelContext.start(DefaultCamelContext.java:2432)
    at CamelJmsToFileExample.main(CamelJmsToFileExample.java:37)
Caused by: org.apache.camel.NoSuchEndpointException: No endpoint could be found for: messageProcessor, please check your classpath contains the needed Camel component jar.
    at org.apache.camel.util.CamelContextHelper.getMandatoryEndpoint(CamelContextHelper.java:81)
    at org.apache.camel.model.RouteDefinition.resolveEndpoint(RouteDefinition.java:200)
    at org.apache.camel.impl.DefaultRouteContext.resolveEndpoint(DefaultRouteContext.java:107)
    at org.apache.camel.impl.DefaultRouteContext.resolveEndpoint(DefaultRouteContext.java:113)
    at org.apache.camel.model.SendDefinition.resolveEndpoint(SendDefinition.java:62)
    at org.apache.camel.model.SendDefinition.createProcessor(SendDefinition.java:56)
    at org.apache.camel.model.ProcessorDefinition.makeProcessor(ProcessorDefinition.java:505)
    at org.apache.camel.model.ProcessorDefinition.addRoutes(ProcessorDefinition.java:217)
    at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:1025)
    ... 12 more
I am facing the above exception when sending data from a JMS queue to a processor. Can anyone help resolve this issue?
Can anyone suggest how to implement a JMS queue in the Apache Camel Java DSL?
Any pointer would be really helpful to me.
Thanks.
The message processor is referenced as a plain string, e.g.
from("test-jms:queue:test.queue").to("messageProcessor");
which makes Camel assume it's a component with that name, and it's not. So change it to:
from("test-jms:queue:test.queue").process(new Processor() ... );
and inline a processor where you can do something with the message. Or, if you just want a quick test, use a log endpoint:
from("test-jms:queue:test.queue").to("log:hello");
Also beware that the start method on CamelContext is not blocking. See this FAQ: http://camel.apache.org/running-camel-standalone-and-have-it-keep-running.html
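For illustration, a minimal sketch of the inline-processor variant, reusing the question's route and the println from its MessageProcessor:
context.addRoutes(new RouteBuilder() {
    public void configure() {
        from("test-jms:queue:test.queue").process(new Processor() {
            public void process(Exchange exchange) throws Exception {
                // do something with the received message
                System.out.println("-----" + exchange.getIn().getBody());
            }
        });
    }
});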
