Apache Camel compare string with quotes using spring DSL, how to escape? - apache-camel

How can I compare the body content (equals or startsWith) against the value
{"status":"OK"}
How do I do that?
<camel:choice>
<camel:when>
<simple>${in.body} == '{"status":"OK"}'</simple>
...
</camel:when>
</camel:choice>
I tried:
'{"status":"OK"}'
'{\x22status\x22:\x22OK\x22}'
'\{"status":"OK"\}'
'{"status":"OK"\}'
...
I can do this:
<simple>${in.body} contains '{"status":"OK"'</simple>
But I need equals or startsWith as the rule operator.
:(
I'm using apache-camel version 2.20.1.

Camel supports JsonPath, so you are better off using that component to compare JSON.
Add the dependency to your pom:
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-jsonpath</artifactId>
<version>x.x.x</version>
</dependency>
and then
<camel:choice>
<camel:when>
<jsonpath>$[?(@.status=='OK')]</jsonpath>
...
</camel:when>
</camel:choice>
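For comparison, here is a minimal Java DSL sketch of the same check; the route endpoints are illustrative, and it assumes camel-jsonpath is on the classpath:
import org.apache.camel.builder.RouteBuilder;

public class StatusRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("direct:start")
            .choice()
                // a non-empty JsonPath result makes the predicate true
                .when().jsonpath("$[?(@.status=='OK')]")
                    .log("status is OK")
                .otherwise()
                    .log("status is not OK")
            .end();
    }
}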

Related

Flink TableEnvironment.create throws NoSuchMethodError

I am testing the Flink Hive connector, following the instructions here: https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/connectors/table/hive/overview/.
The final code is as follows. I tried to run it in the IntelliJ IDE, but unfortunately it doesn't work: TableEnvironment.create throws a NoSuchMethodError.
public static void main(String[] args) throws Exception {
EnvironmentSettings settings = EnvironmentSettings.inStreamingMode();
TableEnvironment tableEnv = TableEnvironment.create(settings); // throws NoSuchMethodError
String name = "myhive";
String defaultDatabase = "default";
String hiveConfDir = "/Users/gaoxiahong/apache-hive-3.1.2-bin/conf";
HiveCatalog hive = new HiveCatalog(name, defaultDatabase, hiveConfDir);
tableEnv.registerCatalog(name, hive);
tableEnv.useCatalog(name);
System.out.println(tableEnv.executeSql("show tables"));
}
Exception message is as follows:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.calcite.sql.parser.SqlParser.config()Lorg/apache/calcite/sql/parser/SqlParser$Config;
at org.apache.flink.table.planner.delegation.PlannerContext.lambda$getSqlParserConfig$1(PlannerContext.java:263)
at java.util.Optional.orElseGet(Optional.java:267)
at org.apache.flink.table.planner.delegation.PlannerContext.getSqlParserConfig(PlannerContext.java:257)
at org.apache.flink.table.planner.delegation.PlannerContext.createFrameworkConfig(PlannerContext.java:148)
at org.apache.flink.table.planner.delegation.PlannerContext.<init>(PlannerContext.java:130)
at org.apache.flink.table.planner.delegation.PlannerBase.<init>(PlannerBase.scala:116)
at org.apache.flink.table.planner.delegation.StreamPlanner.<init>(StreamPlanner.scala:62)
at org.apache.flink.table.planner.delegation.DefaultPlannerFactory.create(DefaultPlannerFactory.java:64)
at org.apache.flink.table.factories.PlannerFactoryUtil.createPlanner(PlannerFactoryUtil.java:52)
at org.apache.flink.table.api.internal.TableEnvironmentImpl.create(TableEnvironmentImpl.java:302)
at org.apache.flink.table.api.TableEnvironment.create(TableEnvironment.java:93)
at com.yqg.flinkhive.Test.main(Test.java:18)
My Flink version is 1.15.2 and my Hive version is 3.1.2. The pom.xml file looks like this:
<properties>
<flink.version>1.15.2</flink.version>
<hive.version>3.1.2</hive.version>
<scala.version>2.12</scala.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-hive_${scala.version}</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-api-java-bridge</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-exec</artifactId>
<version>${hive.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-table-planner_${scala.version}</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
</dependencies>
Can anyone help me figure out the issue here? Thanks in advance.
Per https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/dev/configuration/overview/, you most likely need to swap flink-table-api-java-bridge for flink-table-api-scala-bridge_2.12.
See also https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/connectors/table/hive/overview/#program-maven
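Not part of the fix itself, but when chasing a NoSuchMethodError like this it can help to print which jar the conflicting class is actually loaded from; a small, hypothetical diagnostic:
// Prints the location of the Calcite SqlParser class on the runtime classpath,
// which usually reveals the conflicting dependency.
public class WhereIsSqlParser {
    public static void main(String[] args) {
        System.out.println(
                org.apache.calcite.sql.parser.SqlParser.class
                        .getProtectionDomain()
                        .getCodeSource()
                        .getLocation());
    }
}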

Does org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer implement SinkFunction<T>?

I am trying to implement a simple Flink job that uses org.apache.flink.streaming.connectors, takes a Kafka topic as the input source, and outputs to a Kafka sink. I am following this guide https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/connectors/datastream/kafka/ and wrote code like this:
FlinkKafkaConsumer<String> kafkaConsumer = new FlinkKafkaConsumer<>(TOPIC_IN, new SimpleStringSchema(), props); //FlinkKafkaConsumer<String> testKafkaConsumer = new FlinkKafkaConsumer<>(TOPIC_TEST, new SimpleStringSchema(), props);
kafkaConsumer.setStartFromEarliest();
DataStream<String> dataStream = env.addSource(kafkaConsumer);
StringSchema stringSchema = new StringSchema(TOPIC_OUT);
FlinkKafkaProducer<String> kafkaProducer = new FlinkKafkaProducer<>(TOPIC_OUT, stringSchema, props, FlinkKafkaProducer.Semantic.EXACTLY_ONCE);
//addSink((SinkFunction<String>) kafkaProducer);
dataStream.addSink(kafkaProducer);
However, addSink needs a SinkFunction, while I provide a FlinkKafkaProducer, which extends TwoPhaseCommitSinkFunction. I am confused about why it complains and does not work.
My pom.xml file is as follows
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka_2.11</artifactId>
<version>1.13.0</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-base</artifactId>
<version>1.13.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-streaming-java -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.12</artifactId>
<version>1.13.2</version>
<scope>provided</scope>
</dependency>
It seems this class has been deprecated: https://ci.apache.org/projects/flink/flink-docs-master/api/java/org/apache/flink/streaming/connectors/kafka/package-summary.html
There is no FlinkKafkaProducer constructor with the method signature you're using. You could use this one:
public FlinkKafkaProducer(
String topicId,
SerializationSchema<IN> serializationSchema,
Properties producerConfig,
@Nullable FlinkKafkaPartitioner<IN> customPartitioner,
FlinkKafkaProducer.Semantic semantic,
int kafkaProducersPoolSize)
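For illustration, a hedged sketch of calling that constructor with the question's TOPIC_OUT, props and dataStream; the serialization schema, the null partitioner and the pool-size constant are assumptions, not the only valid choices:
// Sketch only: no custom partitioner, default producer pool size, plain String schema.
FlinkKafkaProducer<String> kafkaProducer = new FlinkKafkaProducer<>(
        TOPIC_OUT,                                // target topic
        new SimpleStringSchema(),                 // SerializationSchema<String>
        props,                                    // Kafka producer Properties
        (FlinkKafkaPartitioner<String>) null,     // no custom partitioner
        FlinkKafkaProducer.Semantic.EXACTLY_ONCE,
        FlinkKafkaProducer.DEFAULT_KAFKA_PRODUCERS_POOL_SIZE);
dataStream.addSink(kafkaProducer);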

Spring Boot starter for Apache Camel (AMQP) doesn't find ConnectionFactory bean

I created an application to read messages from Apache Qpid and send them to Apache Kafka. I am using Camel with the Spring Boot starter. My pom looks like this:
<dependencyManagement>
<dependencies>
<!-- Camel BOM -->
<dependency>
<groupId>org.apache.camel.springboot</groupId>
<artifactId>camel-spring-boot-dependencies</artifactId>
<version>${camel.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
<!-- ... other BOMs or dependencies ... -->
</dependencies>
</dependencyManagement>
<dependencies>
<!-- starters -->
<dependency>
<groupId>org.apache.camel.springboot</groupId>
<artifactId>camel-spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>org.apache.camel.springboot</groupId>
<artifactId>camel-amqp-starter</artifactId>
</dependency>
<dependency>
<groupId>org.apache.camel.springboot</groupId>
<artifactId>camel-kafka-starter</artifactId>
</dependency>
<!-- other camel dependencies -->
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-spring</artifactId>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-amqp</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>2.3.0.RELEASE</version>
</plugin>
</plugins>
</build>
and the Spring application class is:
@SpringBootApplication
public class CamelSpringJmsKafkaApplication {
public static void main(String[] args) {
SpringApplication.run(CamelSpringJmsKafkaApplication.class, args);
}
@Bean
public JmsConnectionFactory jmsConnectionFactory(@Value("${qpidUser}") String qpidUser, @Value("${qpidPassword}") String qpidPassword, @Value("${qpidBrokerUrl}") String qpidBrokerUrl) {
JmsConnectionFactory jmsConnectionFactory = new JmsConnectionFactory(qpidUser, qpidPassword, qpidBrokerUrl);
return jmsConnectionFactory;
}
@Bean
@Primary
public CachingConnectionFactory jmsCachingConnectionFactory(JmsConnectionFactory jmsConnectionFactory) {
CachingConnectionFactory cachingConnectionFactory = new CachingConnectionFactory(jmsConnectionFactory);
return cachingConnectionFactory;
}
}
application.properties is:
camel.springboot.main-run-controller = true
camel.component.amqp.enabled = true
camel.component.amqp.connection-factory = jmsCachingConnectionFactory
camel.component.amqp.async-consumer = true
camel.component.amqp.concurrent-consumers = 1
camel.component.amqp.map-jms-message = true
camel.component.amqp.test-connection-on-startup = true
camel.component.kafka.brokers = localhost:9092
qpidBrokerUrl = amqp://localhost:5672?jms.username=guest&jms.password=guest&jms.clientID=clientid2&amqp.vhost=default
qpidUser = guest
qpidPassword = guest
The RouteBuilder is:
@Component
public class QpidToKafkaRoute extends RouteBuilder {
public void configure() throws Exception {
from("amqp:queue:test")
.log("Received key : ${header.JMSMessageID}, message : ${body}")
.setHeader(KafkaConstants.KEY, header("JMSMessageID"))
.to("kafka:camel")
.log("Sent key : ${headers[kafka.KEY]}, message : ${body}");
}
}
When I start this application, it throws the following exception:
org.apache.camel.FailedToStartRouteException: Failed to start route route1 because of null
at org.apache.camel.impl.engine.RouteService.warmUp(RouteService.java:125) ~[camel-base-3.4.0.jar:3.4.0]
Caused by: java.lang.IllegalArgumentException: connectionFactory must be specified
at org.apache.camel.util.ObjectHelper.notNull(ObjectHelper.java:152) ~[camel-util-3.4.0.jar:3.4.0]
at org.apache.camel.component.jms.JmsConfiguration.createConnectionFactory(JmsConfiguration.java:1629) ~[camel-jms-3.4.0.jar:3.4.0]
at org.apache.camel.component.jms.JmsConfiguration.getOrCreateConnectionFactory(JmsConfiguration.java:773) ~[camel-jms-3.4.0.jar:3.4.0]
at org.apache.camel.component.jms.JmsConfiguration.createListenerConnectionFactory(JmsConfiguration.java:1638) ~[camel-jms-3.4.0.jar:3.4.0]
at org.apache.camel.component.jms.JmsConfiguration.getOrCreateListenerConnectionFactory(JmsConfiguration.java:816) ~[camel-jms-3.4.0.jar:3.4.0]
at org.apache.camel.component.jms.JmsConfiguration.configureMessageListenerContainer(JmsConfiguration.java:1468) ~[camel-jms-3.4.0.jar:3.4.0]
at org.apache.camel.component.jms.JmsConfiguration.createMessageListenerContainer(JmsConfiguration.java:725) ~[camel-jms-3.4.0.jar:3.4.0]
at org.apache.camel.component.jms.JmsEndpoint.createMessageListenerContainer(JmsEndpoint.java:189) ~[camel-jms-3.4.0.jar:3.4.0]
at org.apache.camel.component.jms.JmsEndpoint.createConsumer(JmsEndpoint.java:184) ~[camel-jms-3.4.0.jar:3.4.0]
at org.apache.camel.component.jms.JmsEndpoint.createConsumer(JmsEndpoint.java:73) ~[camel-jms-3.4.0.jar:3.4.0]
at org.apache.camel.impl.engine.DefaultRoute.addServices(DefaultRoute.java:560) ~[camel-base-3.4.0.jar:3.4.0]
at org.apache.camel.impl.engine.DefaultRoute.onStartingServices(DefaultRoute.java:166) ~[camel-base-3.4.0.jar:3.4.0]
at org.apache.camel.impl.engine.RouteService.doWarmUp(RouteService.java:153) ~[camel-base-3.4.0.jar:3.4.0]
at org.apache.camel.impl.engine.RouteService.warmUp(RouteService.java:123) ~[camel-base-3.4.0.jar:3.4.0]
Could you please suggest why Spring Boot autoconfiguration is not finding the connectionFactory? When I debug this code, I can see that the connectionFactory bean is getting created. I can even see one more log line:
CamelContext has only been running for less than a second. If you intend to run Camel for a longer time then you can set the property camel.springboot.main-run-controller=true in application.properties or add spring-boot-starter-web JAR to the classpath.
However, as you can see in my application.properties file, the required property is present at the very beginning.
One more log line I can see at the beginning of application startup:
[main] trationDelegate$BeanPostProcessorChecker : Bean 'org.apache.camel.spring.boot.CamelAutoConfiguration' of type [org.apache.camel.spring.boot.CamelAutoConfiguration] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
Note: one interesting fact is that exactly the same code was running fine last night; I just restarted my desktop, not a single word has changed, and now it is throwing this exception. The code can also be seen here: https://github.com/prashantbhardwaj/qpid-to-kafka-using-camel

Apache CXF WebClient Doesn't Work As Expected in TomEE 8

I am trying to get the JWK key set from Google for use with the Apache CXF OIDC and JOSE libs. The code works fine when I run it in a standalone main method.
public class Main {
/**
* @param args the command line arguments
*/
public static void main(String[] args) {
final WebClient client = WebClient.create("https://www.googleapis.com/oauth2/v3/certs", Arrays.asList(new JsonWebKeysProvider()), true).accept(MediaType.APPLICATION_JSON);
JsonWebKeys keys = client.get(JsonWebKeys.class);
keys.getKeys().forEach(key -> {
System.out.println("****************************************************************************");
System.out.println("ID........." + key.getKeyId());
System.out.println("Alg........" + key.getAlgorithm());
System.out.println("Key Type..." + key.getKeyType());
System.out.println("Use........" + key.getPublicKeyUse());
});
}
}
The ID, algorithm, key type and use are printed properly, meaning that the keys are properly populated.
Sample output:
****************************************************************************
ID.........79c809dd1186cc228c4baf9358599530ce92b4c8
Alg........RS256
Key Type...RSA
Use........sig
****************************************************************************
ID.........17d55ff4e10991d6b0efd392b91a33e54c0e218b
Alg........RS256
Key Type...RSA
Use........sig
pom.xml extract for the Main class:
<dependencies>
<dependency>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-rt-rs-client</artifactId>
<version>3.3.5</version>
</dependency>
<dependency>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-rt-rs-security-sso-oidc</artifactId>
<version>3.3.5</version>
</dependency>
</dependencies>
The same code, however, doesn't work when deployed in TomEE 8.
@WebServlet(name = "NewServlet", urlPatterns = {"/x"})
public class NewServlet extends HttpServlet {
@Override
protected void doGet(HttpServletRequest request, HttpServletResponse response)
throws ServletException, IOException {
PrintWriter writer = response.getWriter();
final WebClient client = WebClient.create("https://www.googleapis.com/oauth2/v3/certs", Arrays.asList(new JsonWebKeysProvider()), true).accept(MediaType.APPLICATION_JSON);
JsonWebKeys keys = client.get(JsonWebKeys.class);
keys.getKeys().forEach(key -> {
writer.println("****************************************************************************");
writer.println("ID........." + key.getKeyId());
writer.println("Alg........" + key.getAlgorithm());
writer.println("Key Type..." + key.getKeyType());
writer.println("Use........" + key.getPublicKeyUse());
});
}
}
The ID, algorithm, key type and use are null when this code runs in TomEE 8. I have added the CXF OIDC and JOSE jars to the tomee/lib folder.
Sample output:
****************************************************************************
ID.........null
Alg........null
Key Type...null
Use........null
****************************************************************************
ID.........null
Alg........null
Key Type...null
Use........null
pom.xml extract for the servlet:
<dependencies>
<dependency>
<groupId>org.apache.tomee</groupId>
<artifactId>javaee-api</artifactId>
<version>8.0-3</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-rt-frontend-jaxrs</artifactId>
<version>${cxf.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-rt-rs-security-sso-oidc</artifactId>
<version>${cxf.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-rt-rs-client</artifactId>
<version>${cxf.version}</version>
<scope>provided</scope>
</dependency>
</dependencies>
What is causing this issue?
I realized that when the WebClient is created inside TomEE, it picks up bus properties provided by TomEE, which was causing JsonWebKeysProvider not to be invoked.
In my case, the following is the correct way to create the client inside TomEE:
JAXRSClientFactoryBean sf = new JAXRSClientFactoryBean();
sf.setAddress("https://www.googleapis.com/oauth2/v3/certs");
sf.setProvider(new JsonWebKeysProvider());
sf.setBus(new ExtensionManagerBus());
Calling sf.setBus(new ExtensionManagerBus()) ensures that the TomEE-provided values/properties aren't picked up.
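To complete the picture, a minimal sketch of turning that factory bean into a working client, assuming JAXRSClientFactoryBean#createWebClient() and the same JsonWebKeysProvider/JsonWebKeys classes used above:
// Build the WebClient from the factory bean so the default ExtensionManagerBus
// is used instead of the bus provided by TomEE.
JAXRSClientFactoryBean sf = new JAXRSClientFactoryBean();
sf.setAddress("https://www.googleapis.com/oauth2/v3/certs");
sf.setProvider(new JsonWebKeysProvider());
sf.setBus(new ExtensionManagerBus());
WebClient client = sf.createWebClient().accept(MediaType.APPLICATION_JSON);
JsonWebKeys keys = client.get(JsonWebKeys.class);
keys.getKeys().forEach(key -> System.out.println(key.getKeyId()));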

Camel CXF Wsdl2java issue

I'm trying to call a third-party SOAP web service using Camel and CXF. Here is an excerpt from the WSDL:
<message name="setDeviceDetailsv4">
<part name="parameters" element="tns:setDeviceDetailsv4"></part>
<part name="gdspHeader" element="tns:gdspHeader"></part>
</message>
<message name="setDeviceDetailsv4Response">
<part name="result" element="tns:setDeviceDetailsv4Response"></part>
</message>
<portType name="SetDeviceDetailsv4">
<operation name="setDeviceDetailsv4" parameterOrder="parameters gdspHeader">
<input message="tns:setDeviceDetailsv4"></input>
<output message="tns:setDeviceDetailsv4Response"></output>
</operation>
</portType>
<binding name="SetDeviceDetailsv4PortBinding" type="tns:SetDeviceDetailsv4">
<soap:binding transport="http://schemas.xmlsoap.org/soap/http" style="document"></soap:binding>
<operation name="setDeviceDetailsv4">
<soap:operation soapAction=""></soap:operation>
<input>
<soap:body use="literal" parts="parameters"></soap:body>
<soap:header message="tns:setDeviceDetailsv4" part="gdspHeader" use="literal"></soap:header>
</input>
<output>
<soap:body use="literal"></soap:body>
</output>
</operation>
</binding>
<service name="SetDeviceDetailsv4Service">
<port name="SetDeviceDetailsv4Port" binding="tns:SetDeviceDetailsv4PortBinding">
<soap:address location="http://localhost:${HttpDefaultPort}/GDSPWebServices/SetDeviceDetailsv4Service"></soap:address>
</port>
</service>
As you can see, the SOAP body uses the "parameters" part mentioned in the WSDL above, which relates to tns:setDeviceDetailsv4.
The example client code looks like this:
System.out.println("Invoking setDeviceDetailsv4...");
SetDeviceDetailsv4_Type _setDeviceDetailsv4_parameters = null;
GdspHeader _setDeviceDetailsv4_gdspHeader = null;
SetDeviceDetailsv4Response _setDeviceDetailsv4__return = port.setDeviceDetailsv4(_setDeviceDetailsv4_parameters, _setDeviceDetailsv4_gdspHeader);
System.out.println("setDeviceDetailsv4.result=" + _setDeviceDetailsv4__return);
When I make a call through my Camel route that matches the client code above, I was expecting CXF/Camel to put the "gdspHeader" into the SOAP header, but it doesn't; it sends it as a parameter to the web method. A separate developer hand-coded the SOAP call; here is what he had, and it works perfectly:
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:ws="http://ws.gdsp.xxxxxxx.com/">
<soapenv:Header>
<ws:gdspHeader>
<gdspCredentials>
<userId>xxxx</userId>
<password>xxxx</password>
</gdspCredentials>
</ws:gdspHeader>
</soapenv:Header>
<soapenv:Body>
<ws:setDeviceDetailsv4>
<deviceId>xxxxxx</deviceId>
<state>x</state>
</ws:setDeviceDetailsv4>
</soapenv:Body>
</soapenv:Envelope>
Yet when I make a call through Camel, here is what I get as the SOAP message:
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
<soap:Header>
<ns2:gdspHeader xmlns:ns2="http://ws.gdsp.xxxx.com/">
<gdspCredentials>
<password>xxxx</password>
<userId>xxxx</userId>
</gdspCredentials>
</ns2:gdspHeader>
</soap:Header>
<soap:Body>
<ns1:setDeviceDetailsv4 xmlns:ns1="http://ws.gdsp.Xxxxx.com/">
<ns2:arg0 xmlns:ns2="http://ws.gdsp.xxx.com/">
<deviceId>xxxx</deviceId>
<state>x</state>
</ns2:arg0>
<ns2:arg1 xmlns:ns2="http://ws.gdsp.xxxx.com/">
<gdspCredentials>
<password>xxxx</password>
<userId>xxxx</userId>
</gdspCredentials>
</ns2:arg1>
</ns1:setDeviceDetailsv4>
</soap:Body>
</soap:Envelope>
and it FAILS. I've tried making the gdspCredentials null, which doesn't work, and if I only pass in one parameter, CXF throws a SOAP fault stating that the method requires two parameters.
Here is a portion of my pom.xml file
<build>
<plugins>
<plugin>
<groupId>org.apache.cxf</groupId>
<artifactId>cxf-codegen-plugin</artifactId>
<version>2.7.7</version>
<executions>
<execution>
<id>generate-sources</id>
<phase>generate-sources</phase>
<configuration>
<wsdlOptions>
<wsdlOption>
<frontEnd>jaxws21</frontEnd>
<faultSerialVersionUID>1</faultSerialVersionUID>
<wsdl>src/main/resources/wsdl/extWebServices.wsdl</wsdl>
<extraargs>
<extraarg>-client</extraarg>
</extraargs>
</wsdlOption>
</wsdlOptions>
</configuration>
<goals>
<goal>wsdl2java</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
How can I get my Camel / CXF call to match what the other developer had done?
The WSDL didn't work out of the box for my needs. I was able to modify the WSDL to remove the "header" part, use an interceptor to handle the header portion, and use a processor to handle the request and response marshalling/unmarshalling.
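For readers who want the interceptor half of that approach, here is a hedged sketch of a CXF out-interceptor that adds the gdspHeader as a SOAP header; GdspHeader is the wsdl2java-generated type from the question, the namespace is taken from the hand-coded envelope above, and how the credentials get populated is left as an assumption:
import javax.xml.bind.JAXBException;
import javax.xml.namespace.QName;
import org.apache.cxf.binding.soap.SoapMessage;
import org.apache.cxf.binding.soap.interceptor.AbstractSoapInterceptor;
import org.apache.cxf.headers.Header;
import org.apache.cxf.jaxb.JAXBDataBinding;
import org.apache.cxf.phase.Phase;

public class GdspHeaderOutInterceptor extends AbstractSoapInterceptor {

    private final GdspHeader gdspHeader;

    public GdspHeaderOutInterceptor(GdspHeader gdspHeader) {
        super(Phase.PRE_PROTOCOL); // run before the SOAP envelope is serialized
        this.gdspHeader = gdspHeader;
    }

    @Override
    public void handleMessage(SoapMessage message) {
        try {
            // Marshal the header object with JAXB and register it as a SOAP header
            // under the service namespace used in the working request.
            Header header = new Header(
                    new QName("http://ws.gdsp.xxxxxxx.com/", "gdspHeader"),
                    gdspHeader,
                    new JAXBDataBinding(GdspHeader.class));
            message.getHeaders().add(header);
        } catch (JAXBException e) {
            throw new RuntimeException("Could not marshal gdspHeader", e);
        }
    }
}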
