I created a simple bundle:
import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.core.osgi.OsgiDefaultCamelContext;
import org.apache.camel.dataformat.csv.CsvDataFormat;
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

public class Activator implements BundleActivator {

    private CamelContext camelContext;
    private CsvDataFormat csv = new CsvDataFormat();

    public void start(BundleContext bundleContext) throws Exception {
        csv.setDelimiter('|');
        csv.setQuoteDisabled(true);

        camelContext = new OsgiDefaultCamelContext(bundleContext);
        camelContext.addRoutes(new RouteBuilder() {
            @Override
            public void configure() throws Exception {
                from("file://./in/csv?charset=windows-1251")
                    .unmarshal(csv)
                    .process(exchange -> {
                        // do smth
                    });
            }
        });
        camelContext.start();
    }

    public void stop(BundleContext bundleContext) throws Exception {
        camelContext.stop();
    }
}
I am using Apache ServiceMix. I installed these bundles:
karaf@root>feature:install camel-csv
karaf@root>bundle:install -s mvn:org.apache.camel/camel-core-osgi/2.16.5
But when I start my bundle, I get this error:
Caused by: java.lang.NoClassDefFoundError: org/apache/camel/core/osgi/OsgiDefaultCamelContext
at ru.camel.csv.Activator.start(Activator.java:19)
But why? In the Karaf console I see:
222 | Active | 50 | 2.16.5 | camel-csv
223 | Active | 50 | 1.1.0 | Apache Commons CSV
224 | Resolved | 80 | 1.0.0.SNAPSHOT | csv
228 | Active | 80 | 2.16.5 | camel-core-osgi
The bundle camel-core-osgi contains the class OsgiDefaultCamelContext. Why do I get this error?
My pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<parent>
<artifactId>ru.camel</artifactId>
<groupId>ru.camel.test</groupId>
<version>1.0-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>
<artifactId>csv</artifactId>
<dependencies>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-core-osgi</artifactId>
<version>2.16.5</version>
</dependency>
<dependency>
<groupId>org.osgi</groupId>
<artifactId>org.osgi.core</artifactId>
<version>6.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-csv</artifactId>
<version>2.16.5</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<inherited>true</inherited>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<executions>
<execution>
<id>osgi-bundle</id>
<goals>
<goal>bundle</goal>
</goals>
<phase>package</phase>
<configuration>
<instructions>
<Bundle-SymbolicName>${project.artifactId}</Bundle-SymbolicName>
<Bundle-Version>${project.version}</Bundle-Version>
<Import-Package>org.apache.camel.core.osgi.OsgiDefaultCamelContext</Import-Package>
<Bundle-Activator>ru.camel.csv.Activator</Bundle-Activator>
</instructions>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>8</source>
<target>8</target>
</configuration>
</plugin>
</plugins>
</build>
</project>
You should check the imports of your bundle.
Every bundle has to declare which packages (classes) it wants to import. Your bundle probably did not declare an import for the package that OsgiDefaultCamelContext lives in. Note that Import-Package takes package names, but your configuration lists the fully qualified class name org.apache.camel.core.osgi.OsgiDefaultCamelContext, which matches no package, so the wiring to camel-core-osgi never happens.
Imports are defined in the META-INF/MANIFEST.MF of your bundle. Depending on your build tool, this file might be created automatically during the build; otherwise you have to write it yourself, although I would highly recommend using a tool that generates it automatically during the build.
In your pom.xml, change the configuration of the Import-Package instruction of the maven-bundle-plugin to:
<Import-Package>org.apache.camel.core.osgi.*;*</Import-Package>
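With that wildcard, bnd computes the actual imports from your compiled classes instead of the literal class name you had configured. For reference, the generated MANIFEST.MF should then contain package-level imports roughly like this (a sketch only; the exact packages and version ranges depend on your code and plugin version):
Import-Package: org.apache.camel;version="[2.16,3)",
 org.apache.camel.core.osgi;version="[2.16,3)",
 org.apache.camel.dataformat.csv;version="[2.16,3)",
 org.osgi.framework;version="[1.8,2)"
After rebuilding and updating the bundle, the resolver can wire these imports to camel-core-osgi and the NoClassDefFoundError should go away.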
Related
I'm trying to write an ETL pipeline from Kafka to Kudu using Flink.
I'm using the Bahir KuduSink and a PojoOperationMapper.
It throws an exception before starting. I've included my code, pom, and exception stack trace.
Is there something obvious I'm missing?
package pipeline.poc.model;

import lombok.Data;

@Data
public class MyModel {
    private String msgKey;
    private String msgData;
}
Pipeline mapping
package pipeline.poc;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationFeature;
import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonNode;
import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;

import pipeline.poc.model.MyModel;

public class MessageMapFunction implements MapFunction<ObjectNode, MyModel> {

    private static final long serialVersionUID = 1L;

    private final ObjectMapper mapper;

    public MessageMapFunction() {
        super();
        mapper = new ObjectMapper();
        mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
    }

    @Override
    public MyModel map(ObjectNode value) throws Exception {
        // JSONKeyValueDeserializationSchema wraps each record as {"key":..., "value":...};
        // convert only the value part into the POJO.
        JsonNode msgValue = value.get("value");
        return mapper.convertValue(msgValue, MyModel.class);
    }
}
The pipeline program
package pipeline.poc;

import java.util.Properties;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.connectors.kudu.connector.KuduTableInfo;
import org.apache.flink.connectors.kudu.connector.writer.AbstractSingleOperationMapper.KuduOperation;
import org.apache.flink.connectors.kudu.connector.writer.KuduWriterConfig;
import org.apache.flink.connectors.kudu.connector.writer.PojoOperationMapper;
import org.apache.flink.connectors.kudu.streaming.KuduSink;
import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.flink.streaming.connectors.kafka.KafkaDeserializationSchema;
import org.apache.flink.streaming.util.serialization.JSONKeyValueDeserializationSchema;

import pipeline.poc.model.MyModel;

public class Pipeline {

    private final StreamExecutionEnvironment env;
    private final KuduWriterConfig kuduConfig;
    private final PojoOperationMapper<MyModel> operationMapper;
    private final KuduSink<MyModel> kuduSink;
    private final KafkaDeserializationSchema<ObjectNode> schema;
    private final FlinkKafkaConsumer<ObjectNode> consumer;
    private final String[] columns = {"key", "value"};
    private final MapFunction<ObjectNode, MyModel> messageMapFunction;

    public static void main(String[] args) {
        new Pipeline().run();
    }

    Pipeline() {
        env = StreamExecutionEnvironment.getExecutionEnvironment();
        schema = new JSONKeyValueDeserializationSchema(false);
        kuduConfig = KuduWriterConfig.Builder
                .setMasters("localhost:7051,localhost:7151,localhost:7251")
                .build();
        operationMapper = new PojoOperationMapper<>(
                MyModel.class,
                columns,
                KuduOperation.INSERT);
        kuduSink = new KuduSink<>(
                kuduConfig,
                KuduTableInfo.forTable("TOYTABLE"),
                operationMapper);

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");
        props.setProperty("group.id", "pipeline.demo");
        consumer = new FlinkKafkaConsumer<>(
                "pipeline.demo",
                schema,
                props);

        messageMapFunction = new MessageMapFunction();
    }

    public void run() {
        DataStream<ObjectNode> dataStream = env.addSource(consumer);
        DataStream<MyModel> messageStream = dataStream.map(messageMapFunction);

        // Just printing the mapped stream works:
        // messageStream.print();

        // Adding the kuduSink throws an exception.
        messageStream.addSink(kuduSink);

        try {
            env.execute("Pipeline Demo");
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
It throws this exception:
ERROR StatusLogger No Log4j 2 configuration file found. Using default configuration (logging only errors to the console), or user programmatically provided configurations. Set system property 'log4j2.debug' to show Log4j 2 internal initialization logging. See https://logging.apache.org/log4j/2.x/manual/configuration.html for instructions on how to configure Log4j 2
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.flink.api.java.ClosureCleaner (file:/Users/smitopher/.m2/repository/org/apache/flink/flink-core/1.13.1/flink-core-1.13.1.jar) to field java.util.Properties.serialVersionUID
WARNING: Please consider reporting this to the maintainers of org.apache.flink.api.java.ClosureCleaner
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Exception in thread "main" org.apache.flink.api.common.InvalidProgramException: [Ljava.lang.reflect.Field;@1095f122 is not serializable. The object probably contains or references non serializable fields.
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:164)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:69)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.clean(StreamExecutionEnvironment.java:2053)
at org.apache.flink.streaming.api.datastream.DataStream.clean(DataStream.java:203)
at org.apache.flink.streaming.api.datastream.DataStream.addSink(DataStream.java:1243)
at pipeline.poc.Pipeline.run(Pipeline.java:75)
at pipeline.poc.Pipeline.main(Pipeline.java:34)
Caused by: java.io.NotSerializableException: java.lang.reflect.Field
at java.base/java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1185)
at java.base/java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1379)
at java.base/java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1175)
at java.base/java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:349)
at org.apache.flink.util.InstantiationUtil.serializeObject(InstantiationUtil.java:624)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:143)
... 8 more
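From the trace, ClosureCleaner tries to Java-serialize the sink before shipping it to the cluster and fails on a java.lang.reflect.Field array that the sink references (presumably the reflected POJO fields held by the PojoOperationMapper), since java.lang.reflect.Field is not Serializable. A sketch of a quick check that reproduces the failure in isolation, using the same utility that appears in the trace (placed at the top of run()):
    // Sketch: reproduce Flink's serializability check directly on the sink.
    // InstantiationUtil.serializeObject is the same call that fails in the trace above.
    try {
        org.apache.flink.util.InstantiationUtil.serializeObject(kuduSink);
        System.out.println("kuduSink serialized fine");
    } catch (java.io.IOException e) {
        // java.io.NotSerializableException (an IOException subclass) names the
        // offending class, e.g. java.lang.reflect.Field.
        e.printStackTrace();
    }
Anything handed to addSource, map or addSink must be fully Serializable, including every object it transitively references.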
pom.xml
<!-- Licensed to the Apache Software Foundation (ASF) under one or more contributor
license agreements. See the NOTICE file distributed with this work for additional
information regarding copyright ownership. The ASF licenses this file to
you under the Apache License, Version 2.0 (the "License"); you may not use
this file except in compliance with the License. You may obtain a copy of
the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required
by applicable law or agreed to in writing, software distributed under the
License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS
OF ANY KIND, either express or implied. See the License for the specific
language governing permissions and limitations under the License. -->
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>pipeline.poc</groupId>
<artifactId>kafka-flink-pipeline</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>Flink Quickstart Job</name>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<flink.version>1.13.1</flink.version>
<target.java.version>11</target.java.version>
<scala.binary.version>2.11</scala.binary.version>
<maven.compiler.source>${target.java.version}</maven.compiler.source>
<maven.compiler.target>${target.java.version}</maven.compiler.target>
<log4j.version>2.12.1</log4j.version>
</properties>
<repositories>
<repository>
<id>apache.snapshots</id>
<name>Apache Development Snapshot Repository</name>
<url>https://repository.apache.org/content/repositories/snapshots/</url>
<releases>
<enabled>false</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
</repositories>
<dependencies>
<!-- Apache Flink dependencies -->
<!-- These dependencies are provided, because they should not be packaged
into the JAR file. -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<!-- Add connector dependencies here. They must be in the default scope
(compile). -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<!-- Add logging framework, to produce console output when running in the
IDE. -->
<!-- These dependencies are excluded from the application JAR by default. -->
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-slf4j-impl</artifactId>
<version>${log4j.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-api</artifactId>
<version>${log4j.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<version>${log4j.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.bahir</groupId>
<artifactId>flink-connector-kudu_2.11</artifactId>
<version>1.1-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.18.20</version>
</dependency>
</dependencies>
<build>
<plugins>
<!-- Java Compiler -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>${target.java.version}</source>
<target>${target.java.version}</target>
</configuration>
</plugin>
<!-- We use the maven-shade plugin to create a fat jar that contains all
necessary dependencies. -->
<!-- Change the value of <mainClass>...</mainClass> if your program entry
point changes. -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.1.1</version>
<executions>
<!-- Run shade goal on package phase -->
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes>
<exclude>org.apache.flink:force-shading</exclude>
<exclude>com.google.code.findbugs:jsr305</exclude>
<exclude>org.slf4j:*</exclude>
<exclude>org.apache.logging.log4j:*</exclude>
</excludes>
</artifactSet>
<filters>
<filter>
<!-- Do not copy the signatures in the META-INF folder. Otherwise,
this might cause SecurityExceptions when using the JAR. -->
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>com.apple.pipeline.poc.StreamingJob</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
<pluginManagement>
<plugins>
<!-- This improves the out-of-the-box experience in Eclipse by resolving
some warnings. -->
<plugin>
<groupId>org.eclipse.m2e</groupId>
<artifactId>lifecycle-mapping</artifactId>
<version>1.0.0</version>
<configuration>
<lifecycleMappingMetadata>
<pluginExecutions>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<versionRange>[3.1.1,)</versionRange>
<goals>
<goal>shade</goal>
</goals>
</pluginExecutionFilter>
<action>
<ignore />
</action>
</pluginExecution>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<versionRange>[3.1,)</versionRange>
<goals>
<goal>testCompile</goal>
<goal>compile</goal>
</goals>
</pluginExecutionFilter>
<action>
<ignore />
</action>
</pluginExecution>
</pluginExecutions>
</lifecycleMappingMetadata>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
</project>
I have added io.quarkus:quarkus-camel-core to my application, but the direct component does not work within a native image. If I run Quarkus on the JVM, then it works.
There are projects on GitHub (https://github.com/apache/camel-quarkus/tree/master/extensions/direct) that indicate an extension is planned for the future, but it is not officially supported yet.
How can I make it run with minimal effort, e.g. by creating my own extension project only for direct? If I add the existing projects to my Maven pom, I get problems with the different Maven coordinates, and in the end the native build tells me that there are duplicates.
What would be a good way to make the "direct" component from Camel run in Quarkus?
By the way, the native build works, i.e. I get an executable, but resolving the direct endpoint does not work:
"org.apache.camel.ResolveEndpointFailedException: Failed to resolve
endpoint: direct://init due to: No component found with scheme:
direct"
Sources:
REST endpoint:
#Path("/hello")
public class GreetingResource {
#GET
#Produces(MediaType.TEXT_PLAIN)
public String hello() {
ExchangeBuilder exchangeBuilder = new ExchangeBuilder(context);
Exchange out = template.send("direct:init", exchangeBuilder.build());
return out.getOut().toString();
}
CamelRouteBuilder:
public class CamelSyncRouteBuilder extends RouteBuilder {

    static final String HTTP_ROUTE_ID = "http:camel";
    static long[] times = new long[1];

    @Override
    public void configure() throws Exception {
        from("direct:init").routeId(HTTP_ROUTE_ID)
                .setHeader(MyOrderService.class.getName(), MyOrderService::new)
                .setHeader(Filler.class.getName(), Filler::new)
                .process(fill(Filler.class.getName(), "fill"))
                .split(body().tokenize("#"), CamelSyncRouteBuilder.this::aggregate)
                .process(stateless(MyOrderService.class.getName(), "handleOrder"))
                .end()
                .to("log:foo?level=OFF");
    }
}
pom.xml:
<?xml version="1.0"?>
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd" xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<modelVersion>4.0.0</modelVersion>
<groupId>com.sap.it.graal</groupId>
<artifactId>getting-started</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<surefire-plugin.version>2.22.0</surefire-plugin.version>
<maven.compiler.target>1.8</maven.compiler.target>
<maven.compiler.source>1.8</maven.compiler.source>
<quarkus.version>0.19.1</quarkus.version>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-bom</artifactId>
<version>${quarkus.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-resteasy</artifactId>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-junit5</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.rest-assured</groupId>
<artifactId>rest-assured</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-camel-core</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-maven-plugin</artifactId>
<version>${quarkus.version}</version>
<executions>
<execution>
<goals>
<goal>build</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<version>${surefire-plugin.version}</version>
<configuration>
<systemProperties>
<java.util.logging.manager>org.jboss.logmanager.LogManager</java.util.logging.manager>
</systemProperties>
</configuration>
</plugin>
</plugins>
</build>
<profiles>
<profile>
<id>native</id>
<activation>
<property>
<name>native</name>
</property>
</activation>
<build>
<plugins>
<plugin>
<groupId>io.quarkus</groupId>
<artifactId>quarkus-maven-plugin</artifactId>
<version>${quarkus.version}</version>
<executions>
<execution>
<goals>
<goal>native-image</goal>
</goals>
<configuration>
<enableHttpUrlHandler>true</enableHttpUrlHandler>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-failsafe-plugin</artifactId>
<version>${surefire-plugin.version}</version>
<executions>
<execution>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
<configuration>
<systemProperties>
<native.image.path>${project.build.directory}/${project.build.finalName}-runner</native.image.path>
</systemProperties>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
</project>
The direct component should work out of the box even without a dedicated extension; as an example, it is used in the integration test for the jdbc component (https://github.com/apache/camel-quarkus/tree/master/integration-tests/jdbc).
Can you share more information about your project and set-up?
I was facing the same problem because I hadn't learned how exactly Camel works. As the log says, you have no component that provides the "direct" URI, so you have to add that component. There is one in the Camel Quarkus extensions:
<dependency>
<groupId>org.apache.camel.quarkus</groupId>
<artifactId>camel-quarkus-direct</artifactId>
</dependency>
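With that extension on the classpath, the scheme resolves in native mode too. As a quick sanity check you could look the component up before sending; a sketch against the resource above (assuming context is the injected CamelContext):
    @GET
    @Path("/check")
    @Produces(MediaType.TEXT_PLAIN)
    public String check() {
        // Returns "true" once a component for the "direct" scheme is registered;
        // a missing component is exactly what caused the ResolveEndpointFailedException.
        return String.valueOf(context.getComponent("direct") != null);
    }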
Hello, I am new to ServiceMix and cannot start a simple self-written bundle.
My pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<!-- Generated by Apache ServiceMix Archetype -->
<modelVersion>4.0.0</modelVersion>
<groupId>de.rupp</groupId>
<artifactId>test</artifactId>
<packaging>bundle</packaging>
<version>1.0-SNAPSHOT</version>
<name>test</name>
<properties>
<camel.version>3.0.0-M3</camel.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-core</artifactId>
<version>${camel.version}</version>
<scope>provided</scope>
</dependency>
</dependencies>
<build>
<defaultGoal>install</defaultGoal>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.0.2</version>
<configuration>
<source>1.5</source>
<target>1.5</target>
<encoding>UTF-8</encoding>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<version>2.4.3</version>
<configuration>
<encoding>UTF-8</encoding>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<version>2.3.6</version>
<extensions>true</extensions>
<configuration>
<instructions>
<Bundle-SymbolicName>${project.artifactId}</Bundle-SymbolicName>
<Import-Package>*</Import-Package>
<Private-Package>de.rupp</Private-Package>
</instructions>
</configuration>
</plugin>
</plugins>
</build>
My camel-context.xml
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:camel="http://camel.apache.org/schema/spring"
xsi:schemaLocation="
http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-2.5.xsd
http://camel.apache.org/schema/spring
http://camel.apache.org/schema/spring/camel-spring-2.10.3.xsd">
<camelContext id="Merda" xmlns="http://camel.apache.org/schema/spring">
<packageScan>
<package>de.rupp</package>
</packageScan>
</camelContext>
</beans>
I only have one simple class
public class TestRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        from("file:in")
            .id("file-in")
            .log("Nachricht: ${body}")
            .to("file:out");
    }
}
The resulting MANIFEST.MF is
Manifest-Version: 1.0
Bnd-LastModified: 1560848195732
Build-Jdk: 1.8.0_181
Built-By: bla
Bundle-ManifestVersion: 2
Bundle-Name: test
Bundle-SymbolicName: test
Bundle-Version: 1.0.0.SNAPSHOT
Created-By: Apache Maven Bundle Plugin
Export-Package: de.rupp;uses:="org.apache.camel.builder,org.apache.camel
.model";version="1.0.0.SNAPSHOT"
Import-Package: org.apache.camel.builder;version="[3.0,4)",org.apache.ca
mel.model;version="[3.0,4)"
Tool: Bnd-1.50.0
When I copy the jar to the deploy folder and run bundle:list, I see it installed:
225 | Installed | 80 | 1.0.0.SNAPSHOT | test
However I cannot start it.
karaf@root>start 225
Error executing command: Error executing command on bundles:
Error starting bundle 225: Unable to resolve test [225](R 225.14): missing requirement [test [225](R 225.14)] osgi.wiring.package; (&(osgi.wiring.package=org.apache.camel.builder)(version>=3.0.0)(!(version>=4.0.0))) Unresolved requirements: [[test [225](R 225.14)] osgi.wiring.package; (&(osgi.wiring.package=org.apache.camel.builder)(version>=3.0.0)(!(version>=4.0.0)))]
Any help would be greatly appreciated.
Does anyone know a good tutorial for writing camel bundles?
Thanks,
Hans
You should use the Camel version that ServiceMix is using (ships with out of the box). ServiceMix does NOT support Camel 3.
Also, I would suggest looking at just using Apache Karaf, or alternative runtimes for Camel (Spring Boot, Quarkus, Tomcat, standalone Camel via Camel Main), as ServiceMix is not so active anymore.
Replace your property with this:
<properties>
<camel.version>2.16.5</camel.version>
</properties>
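If you are not sure which Camel version your ServiceMix distribution ships with, you can check it from the console before pinning the property (a sketch; the exact output depends on your distribution):
karaf@root>feature:list | grep camel-core
karaf@root>bundle:list | grep camel
The version shown there is the one your bundle's Import-Package range has to match.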
I am working with Spring Boot version 1.5.2.RELEASE,
Angular 2 with SystemJS config for the front end,
and a Maven build with the following high-level configuration.
pom.xml
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
<exclusions>
<exclusion>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-tomcat</artifactId>
</exclusion>
<exclusion>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-logging</artifactId>
</exclusion>
<exclusion>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-tomcat</artifactId>
<scope>provided</scope>
</dependency>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<configuration>
<fork>true</fork>
<skip>false</skip>
</configuration>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.19.1</version>
<configuration>
<testFailureIgnore>true</testFailureIgnore>
</configuration>
</plugin>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.version>1.8</java.version>
<start-class>mmm.his.empi.WebComponentApplication</start-class>
</properties>
Application.java
@SpringBootApplication
@EnableJpaRepositories
@EnableAutoConfiguration(exclude = {HibernateJpaAutoConfiguration.class})
@Import({ AppConfig.class, HibernateConfiguration.class, SwaggerConfig.class })
public class Application extends SpringBootServletInitializer {

    @Override
    protected SpringApplicationBuilder configure(SpringApplicationBuilder application) {
        return application.sources(applicationClass);
    }

    public static void main(String[] args) {
        System.out.println("WebComponentApplication started... 7.10");
        SpringApplication.run(applicationClass, args);
    }

    private static Class<Application> applicationClass = Application.class;
}
This configuration works fine when deployed to Tomcat on Windows. But
when I deploy the same war on Tomcat on Linux, I just get a 404 - No
Resource found error.
War Structure
ROOT.war
META-INF - Maven - MANIFEST
WEB-INF - classes - lib - lib-provided
org - springframework - boot - loader - (loader classes)
The classes folder contains the Spring Boot application class and the controllers, as well as the dist folder of the Angular application.
MANIFEST
Manifest-Version: 1.0
Implementation-Title: test-component
Implementation-Version: 0.0.1-SNAPSHOT
Built-By: XXX
Implementation-Vendor-Id: com.test.proj
Spring-Boot-Version: 1.5.2.RELEASE
Implementation-Vendor: Pivotal Software, Inc.
Main-Class: org.springframework.boot.loader.WarLauncher
Start-Class: com.test.proj.Application
Spring-Boot-Classes: WEB-INF/classes/
Spring-Boot-Lib: WEB-INF/lib/
Created-By: Apache Maven 3.3.9
Build-Jdk: 1.8.0_91
Implementation-URL: http://maven.apache.org
I am deploying my war at the root context of Tomcat as ROOT.war
I am not sure what is going wrong here. Please help.
I resolved this issue.
It was actually a JDK conflict on the Linux machine. I have JDK 1.8 installed, but had set the environment variable only for my own user, while the app was deployed as the root user. I had to set it for the root user as well for it to work properly.
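For example, one way to avoid depending on per-user profiles entirely is to set the JDK in Tomcat's own setenv.sh, which catalina.sh sources on startup regardless of which user launches it (a sketch; the JDK path below is a placeholder for your actual install):
# $CATALINA_HOME/bin/setenv.sh
# Placeholder path: point JAVA_HOME at your real JDK 1.8 installation.
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0
export PATH="$JAVA_HOME/bin:$PATH"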
I'm trying to run a single-page web application written in AngularJS with Spark Java on Jetty; the project is written in Eclipse Kepler as a Maven project.
The error I'm receiving when I go to the URL http://localhost:8085/ is as follows, and it doesn't matter if I try to access any other URL or point directly to index.html/jsp, I still get the same error.
HTTP ERROR 404
Problem accessing /. Reason:
Not Found
Powered by Jetty://
Eclipse Console is showing that the server is up and running
== Spark has ignited ...
Listening on localhost:8085 [Thread-2] INFO org.eclipse.jetty.server.Server - jetty-9.0.2.v20130417 [Thread-2]
INFO org.eclipse.jetty.server.ServerConnector - Started
ServerConnector@12ee37d9{HTTP/1.1}{localhost:8085}
I've added the Package Explorer in case that might help.
I'll add some of the files' code; if more is needed, just let me know and I'll edit to add them.
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>todoapp</groupId>
<artifactId>todoapp1</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
<dependencies>
<dependency>
<groupId>com.sparkjava</groupId>
<artifactId>spark-core</artifactId>
<version>2.0.0</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>1.7.5</version>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>2.11.3</version>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.2.4</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-maven-plugin</artifactId>
<version>9.0.2.v20130417</version>
<configuration>
<webApp>
<contextPath>/public</contextPath>
</webApp>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<configuration>
<createDependencyReducedPom>true</createDependencyReducedPom>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>com.todoapp.Bootstrap</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
Bootstrap.java
package com.todoapp;

import com.mongodb.*;

import static spark.Spark.setIpAddress;
import static spark.Spark.setPort;
import static spark.SparkBase.staticFileLocation;

public class Bootstrap {

    // Bind address and port come from the OpenShift DIY cartridge env vars
    // (OPENSHIFT_DIY_IP / OPENSHIFT_DIY_PORT) when present, otherwise localhost:8085.
    private static final String IP_ADDRESS = System.getenv("OPENSHIFT_DIY_IP") != null
            ? System.getenv("OPENSHIFT_DIY_IP") : "localhost";
    private static final int PORT = System.getenv("OPENSHIFT_DIY_PORT") != null
            ? Integer.parseInt(System.getenv("OPENSHIFT_DIY_PORT")) : 8085;

    public static void main(String[] args) throws Exception {
        setIpAddress(IP_ADDRESS);
        setPort(PORT);
        staticFileLocation("/public");
        new TodoResource(new TodoService(mongo()));
    }

    private static DB mongo() throws Exception {
        String host = System.getenv("OPENSHIFT_MONGODB_DB_HOST");
        if (host == null) {
            MongoClient mongoClient = new MongoClient("localhost");
            return mongoClient.getDB("todoapp");
        }
        int port = Integer.parseInt(System.getenv("OPENSHIFT_MONGODB_DB_PORT"));
        String dbname = System.getenv("OPENSHIFT_APP_NAME");
        String username = System.getenv("OPENSHIFT_MONGODB_DB_USERNAME");
        String password = System.getenv("OPENSHIFT_MONGODB_DB_PASSWORD");
        MongoClientOptions mongoClientOptions = MongoClientOptions.builder().connectionsPerHost(20).build();
        MongoClient mongoClient = new MongoClient(new ServerAddress(host, port), mongoClientOptions);
        mongoClient.setWriteConcern(WriteConcern.SAFE);
        DB db = mongoClient.getDB(dbname);
        if (db.authenticate(username, password.toCharArray())) {
            return db;
        } else {
            throw new RuntimeException("Not able to authenticate with MongoDB");
        }
    }
}
These two files are the ones responsible for the Jetty server. I have been researching the internet for over a day now, trying to find the reason for this error, but nothing has helped me.
Make sure your /public directory, where your HTML files land, is accessible through your project's classpath. In order to test this, try using staticFiles.externalLocation (that is for version 2.5; externalStaticFileLocation in your case) with the full path to /public as the param. If it works, you can move your /public directory to the same location as your Java packages and it will work with staticFileLocation("/public") as you expect.
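A minimal sketch of that test, against the Spark 2.0.0 API from your pom (the absolute path is a placeholder for your checkout):
import static spark.Spark.get;
import static spark.Spark.setPort;
import static spark.SparkBase.externalStaticFileLocation;

public class StaticFilesCheck {
    public static void main(String[] args) {
        setPort(8085);
        // Serve /public straight from the filesystem to rule out classpath issues.
        externalStaticFileLocation("/absolute/path/to/project/src/main/resources/public");
        // Declaring any route starts the embedded Jetty; then try
        // http://localhost:8085/index.html in the browser.
        get("/ping", (request, response) -> "pong");
    }
}
With Maven, anything under src/main/resources/public lands on the classpath as /public, which is what staticFileLocation("/public") expects.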