Error in creating rule engine drools - owl-api

I intended to use the Openllet reasoner, as opposed to the other available reasoners. However, this reasoner is compatible only with the OWL API 5.x.x distribution. I have an xxx.owl file which contains SWRL rules. Since the existing SWRL API is not compatible with OWL API 5, Ignazio Palmisano kindly put up a forked repository with the required changes so that it is compatible with the OWL API 5.x.x distribution. Consequently, I removed the dependencies related to the SWRL API and the Drools engine, and instead built them locally from the downloaded zip files.
Now, with the .jar files of the SWRL API and Drools loaded into the project in IntelliJ, I am presented with the following error:
Exception in thread "main" org.swrlapi.exceptions.SWRLRuleEngineException: Error creating rule engine Drools. Exception: java.lang.NoClassDefFoundError. Message: org/drools/runtime/rule/AgendaFilter
at org.swrlapi.factory.DefaultSWRLRuleAndQueryEngineFactory.createSWRLRuleEngine(DefaultSWRLRuleAndQueryEngineFactory.java:71)
at org.swrlapi.factory.DefaultSWRLRuleAndQueryEngineFactory.createSWRLRuleEngine(DefaultSWRLRuleAndQueryEngineFactory.java:41)
at org.swrlapi.factory.SWRLAPIFactory.createSWRLRuleEngine(SWRLAPIFactory.java:38)
at SWRLrules.main(SWRLrules.java:61)
Caused by: java.lang.NoClassDefFoundError: org/drools/runtime/rule/AgendaFilter
at org.swrlapi.drools.core.DroolsSWRLRuleEngineCreator.create(DroolsSWRLRuleEngineCreator.java:27)
at org.swrlapi.factory.DefaultSWRLRuleAndQueryEngineFactory.createSWRLRuleEngine(DefaultSWRLRuleAndQueryEngineFactory.java:59)
... 3 more
Caused by: java.lang.ClassNotFoundException: org.drools.runtime.rule.AgendaFilter
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 5 more
Here are the dependencies from my pom.xml file:
<dependencies>
<dependency>
<groupId>net.sourceforge.owlapi</groupId>
<artifactId>owlapi-osgidistribution</artifactId>
<version>5.1.4</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
</dependency>
<dependency>
<groupId>com.github.galigator.openllet</groupId>
<artifactId>openllet-owlapi</artifactId>
<version>2.6.3</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<configuration>
<archive>
<manifest>
<mainClass>org.swrlapi.example.SWRLAPIExample</mainClass>
</manifest>
</archive>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>java</goal>
</goals>
</execution>
</executions>
<configuration>
<mainClass>org.swrlapi.example.SWRLAPIExample</mainClass>
</configuration>
</plugin>
</plugins>
</build>
P.S.: I built the SWRL API and Drools engine locally and imported the jar files into the project.

You do not need to remove the dependencies from the pom file (the error you're seeing is caused by some jar having been missed in the manual process).
If you use my swrlapi fork and change the version to, say, 2.0.6-SNAPSHOT, then running locally
mvn clean install
will put a 2.0.6-SNAPSHOT jar in your local maven repository. At that point, change your pom to require swrlapi 2.0.6-SNAPSHOT and you'll get the updated version in your application.
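For reference, after the local build the pom entries would look something like this (a sketch, assuming the fork keeps the upstream edu.stanford.swrl coordinates; verify the groupId/artifactId against the fork's own pom):
<dependency>
  <groupId>edu.stanford.swrl</groupId>
  <artifactId>swrlapi</artifactId>
  <version>2.0.6-SNAPSHOT</version>
</dependency>
<dependency>
  <!-- the Drools-backed rule engine lives in a separate artifact -->
  <groupId>edu.stanford.swrl</groupId>
  <artifactId>swrlapi-drools-engine</artifactId>
  <version>2.0.6-SNAPSHOT</version>
</dependency>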

Related

Gatling : Scala - Could not find or load main class Engine

I am getting the error "Could not find or load main class Engine" while running the simulation through the command line. I used the command below to create the jar file:
mvn clean scala:compile assembly:single package
Folder Structure
src
  test
    resources
    scala
      testrunner
        testsimuation1.scala
      Engine
      IDEPathHelper
      Recorder
Maven - 3.6.3
Intellij - 2021.1
Scala - 2.13.10
Gatling - 3.9.0
JDK - 1.8
Below is the POM.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>io.gatling.demo</groupId>
<artifactId>gatling-maven-plugin-demo-scala</artifactId>
<version>3.9.0</version>
<properties>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<gatling.version>${project.version}</gatling.version>
<gatling-maven-plugin.version>4.2.9</gatling-maven-plugin.version>
<maven-jar-plugin.version>3.2.0</maven-jar-plugin.version>
<scala-maven-plugin.version>4.8.0</scala-maven-plugin.version>
</properties>
<dependencies>
<dependency>
<groupId>io.gatling.highcharts</groupId>
<artifactId>gatling-charts-highcharts</artifactId>
<version>${gatling.version}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<resources>
<resource>
<directory>src/test/resources</directory>
</resource>
</resources>
<testSourceDirectory>src/test/scala</testSourceDirectory>
<pluginManagement>
<plugins>
<plugin>
<groupId>io.gatling</groupId>
<artifactId>gatling-maven-plugin</artifactId>
<version>${gatling-maven-plugin.version}</version>
<configuration>
<configFolder>src/test/resources</configFolder>
<simulationsFolder>src/test/scala</simulationsFolder>
<resultsFolder>src/results</resultsFolder>
<simulationClass>testrunner.testsimuation1</simulationClass>
<jvmArgs>
<jvmArg>-Dsimulation=testsimuation1</jvmArg>
<jvmArg>-Xmx6g</jvmArg>
<jvmArg>-Xms2g</jvmArg>
</jvmArgs>
</configuration>
</plugin>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>4.4.1</version>
<executions>
<execution>
<id>scala-compile-first</id>
<phase>process-resources</phase>
<goals>
<goal>add-source</goal>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>3.2.0</version>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
<archive>
<manifest>
<addClasspath>true</addClasspath>
<classpathPrefix>lib/</classpathPrefix>
<mainClass>io.gatling.app.Gatling</mainClass>
</manifest>
</archive>
</configuration>
<executions>
<execution>
<id>make-assembly</id> <!-- this is used for inheritance merges -->
<phase>package</phase> <!-- bind to the packaging phase -->
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</pluginManagement>
</build>
</project>
I tried moving src/test -> src/main but it threw a compilation error. I am using this tool for the first time and struggling to resolve the issues.
I tried a few solutions from other threads, but they didn't help.
Thanks
The Engine class is only supposed to be used as a helper to launch simulations with a right click from your IDE.
"Using this tool for the first time and struggling to resolve the issues."
You're using Maven. The way you're supposed to run Gatling with Maven is mvn gatling:test; see the gatling-maven-plugin documentation.
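For example, to run the simulation from your configuration directly (gatling.simulationClass is the plugin's property for selecting a simulation; since simulationClass is already set in your pom, a plain mvn gatling:test should also work):
mvn clean gatling:test -Dgatling.simulationClass=testrunner.testsimuation1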
Whatever else you're doing with the maven-assembly-plugin is custom development, not really related to Gatling.
Note: if you're looking for handling Gatling deployments, Gatling Enterprise is an option.

statefun is giving org.apache.flink.client.program.ProgramInvocationException: classloader.parent-first-patterns.additional

I am running the Stateful Functions 2.0 basic hello job with the following command:
./bin/flink run -c org.apache.flink.statefun.flink.core.StatefulFunctionsJob ./stateful-sun-hello-java-1.0-SNAPSHOT-jar-with-dependencies.jar
and my POM.xml is
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.example</groupId>
<artifactId>stateful-sun-hello-java</artifactId>
<version>1.0-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>com.google.protobuf</groupId>
<artifactId>protobuf-java</artifactId>
<version>3.6.1</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>statefun-sdk</artifactId>
<version>2.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>statefun-flink-distribution</artifactId>
<version>2.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>statefun-kafka-io</artifactId>
<version>2.0.0</version>
</dependency>
</dependencies>
<build>
<defaultGoal>clean generate-sources compile install</defaultGoal>
<plugins>
<!-- compile proto file into java files. -->
<plugin>
<groupId>com.github.os72</groupId>
<artifactId>protoc-jar-maven-plugin</artifactId>
<version>3.6.0.1</version>
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<includeMavenTypes>direct</includeMavenTypes>
<inputDirectories>
<include>src/main/protobuf</include>
</inputDirectories>
<outputTargets>
<outputTarget>
<type>java</type>
<outputDirectory>src/main/java</outputDirectory>
</outputTarget>
<outputTarget>
<type>grpc-java</type>
<pluginArtifact>io.grpc:protoc-gen-grpc-java:1.15.0</pluginArtifact>
<outputDirectory>src/main/java</outputDirectory>
</outputTarget>
</outputTargets>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.4.1</version>
<configuration>
<!-- get all project dependencies -->
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
<!-- MainClass in mainfest make a executable jar -->
<archive>
<manifest>
<mainClass>org.apache.flink.statefun.flink.core.StatefulFunctionsJob</mainClass>
</manifest>
</archive>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<!-- bind to the packaging phase -->
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.0</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
</project>
It is giving the following exception:
The program finished with the following exception:
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: Invalid configuration: classloader.parent-first-patterns.additional; Must contain all of org.apache.flink.statefun, org.apache.kafka, com.google.protobuf
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:335)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:205)
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:138)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:662)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:893)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:966)
at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:966)
Please suggest how to fix this.
You have to add the configuration parameters below to your flink-conf.yaml file:
classloader.parent-first-patterns.additional: org.apache.flink.statefun;org.apache.kafka;com.google.protobuf
jobmanager.scheduler: legacy
The official documentation (https://ci.apache.org/projects/flink/flink-statefun-docs-release-2.0/deployment-and-operations/packaging.html#flink-jar) says:
The following configurations are strictly required for running StateFun application.
classloader.parent-first-patterns.additional: org.apache.flink.statefun;org.apache.kafka;com.google.protobuf
We need to add this jar:
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>statefun-flink-distribution</artifactId>
<version>2.1.0</version>
</dependency>
Packaging a StateFun job to submit to an existing Flink cluster is no longer mentioned as a deployment style in the StateFun 3.1 documents. Is this deployment style still possible or supported? If so, where should the module.yaml file be packaged?

Proper setup for pitest-maven report-aggregate goal

Hi folks!
I've tried to use the pitest-maven plugin in my Maven/Java project, and it is apparently failing to generate an aggregated report (taking into consideration that I have a multi-module project).
I gathered some information from the official website and from several other sources; however, none of it was really helpful for defining the proper configuration for this scenario.
In a nutshell, my structure looks like:
Parent-Project
  Child A
  Child B
  Child ...
  Child N
In some of the submodules it makes sense to run pitest; in others it does not. My configuration, in general, is:
Parent-module pom:
<profile>
<id>run-pitest</id>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.pitest</groupId>
<artifactId>pitest-maven</artifactId>
<version>1.3.2</version>
<configuration>
<outputFormats>
<param>HTML</param>
<param>XML</param>
</outputFormats>
<!--<historyInputFile>${project.basedir}/pitHistory.txt</historyInputFile>-->
<!--<historyOutputFile>${project.basedir}/pitHistory.txt</historyOutputFile>-->
<mutators>
<mutator>CONDITIONALS_BOUNDARY</mutator>
<mutator>MATH</mutator>
<mutator>INCREMENTS</mutator>
<mutator>NEGATE_CONDITIONALS</mutator>
</mutators>
<verbose>true</verbose>
<exportLineCoverage>true</exportLineCoverage>
<testPlugin>testng</testPlugin>
<!--<reportsDirectory>${project.build.directory}/pit-reports</reportsDirectory>-->
</configuration>
<executions>
<execution>
<phase>test</phase>
<goals>
<goal>mutationCoverage</goal>
</goals>
</execution>
<execution>
<id>report</id>
<phase>site</phase>
<goals>
<goal>report-aggregate</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</pluginManagement>
<plugins>
<plugin>
<groupId>org.pitest</groupId>
<artifactId>pitest-maven</artifactId>
</plugin>
</plugins>
</build>
</profile>
Child project that has mutations:
<plugins>
<plugin>
<groupId>org.pitest</groupId>
<artifactId>pitest-maven</artifactId>
<configuration>
<mutationThreshold>80</mutationThreshold>
<exportLineCoverage>true</exportLineCoverage>
</configuration>
</plugin>
</plugins>
And finally, when I try to execute the site phase (as defined in the parent), even after running a clean install that created files such as linecoverage.xml and mutations.xml, I get this error:
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 11.820 s
[INFO] Finished at: 2018-04-06T13:20:47+02:00
[INFO] Final Memory: 35M/514M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.pitest:pitest-maven:1.3.2:report-aggregate (report) on project my-parent: An error has occurred in PIT Test Report report generation. Failed to build: no lineCoverageFiles have been set -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
...
Do any of you have a clue whether I made a bad configuration, or whether there is a better way to do any part of this setup?
It seems that you are running into several problems at once:
When running report-aggregate, the plugin analyzes the dependencies of the module it runs in and expects every one of them to have a linecoverage.xml and a mutations.xml file. You have to have a submodule dedicated to report aggregation, and you should run report-aggregate only in this submodule.
report-aggregate can't deal with timestamped reports, so timestamping must be disabled.
I couldn't make it work with the site phase. Not sure if it's a bug in the plugin or if I missed something (keeping the default phase works, but you then need to fetch the report somehow; it won't be in the site).
Putting it all together:
In the parent-module pom:
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.pitest</groupId>
<artifactId>pitest-maven</artifactId>
<version>1.3.2</version>
<configuration>
<outputFormats>
<param>HTML</param>
<param>XML</param>
</outputFormats>
<!-- omitting mutators, testPlugin and verbose for brevity -->
<exportLineCoverage>true</exportLineCoverage>
<!--
it's currently not possible to aggregate timestamped
reports, so it must be disabled.
-->
<timestampedReports>false</timestampedReports>
</configuration>
<executions>
<execution>
<!--
Use an id to disable it in some submodules
-->
<id>pitest-mutation-coverage</id>
<phase>test</phase>
<goals>
<goal>mutationCoverage</goal>
</goals>
</execution>
<!--
NO report-aggregate here. Most of the time you don't
want it to run.
-->
</executions>
</plugin>
</plugins>
</pluginManagement>
<!-- NO pitest here since its use will vary per submodule -->
</build>
In the submodule with mutations:
<plugins>
<plugin>
<groupId>org.pitest</groupId>
<artifactId>pitest-maven</artifactId>
<!--
You can put configuration here, but IMHO it's better to have it
in the parent pom (so I leave it out here).
-->
</plugin>
</plugins>
In the submodule generating the report:
<!--
Only include submodules where `mutationCoverage` is run.
-->
<dependencies>
<dependency>
<groupId>your.groupId</groupId>
<artifactId>submodule-A</artifactId>
</dependency>
<dependency>
<groupId>your.groupId</groupId>
<artifactId>submodule-B</artifactId>
</dependency>
</dependencies>
and also:
<plugins>
<plugin>
<groupId>org.pitest</groupId>
<artifactId>pitest-maven</artifactId>
<executions>
<!--
Using the execution id to change its phase to none disables
mutationCoverage in this module.
(not sure it's the best way to do it, but as long as it doesn't
run you should be fine)
-->
<execution>
<id>pitest-mutation-coverage</id>
<phase>none</phase>
</execution>
<execution>
<id>report</id>
<!--
Keep default phase here.
-->
<goals>
<goal>report-aggregate</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
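Assuming this layout, a single full build from the parent should be enough (a sketch: mutationCoverage runs in each submodule during test, and report-aggregate runs in the aggregator module in its default phase; the aggregated report typically ends up under the aggregator's target/pit-reports):
mvn clean install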

Classloader issues with Kafka Connector and Remote Executor (Apache Flink)

When trying to execute a Flink job on a remote cluster, I keep getting the same ClassLoader exception.
org.apache.flink.streaming.runtime.tasks.StreamTaskException: Cannot load user class: org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09
ClassLoader info: URL ClassLoader:
Class not resolvable through given classloader.
at org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperator(StreamConfig.java:207)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:223)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:584)
at java.lang.Thread.run(Thread.java:745)
10/07/2016 06:34:57 Job execution switched to status FAILING.
I am packaging my jar using the maven-shade-plugin, and this class is definitely packaged in the jar, as can be seen when I look inside it:
$ jar -tf MyJar.jar | grep "org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer"
org/apache/flink/streaming/connectors/kafka/FlinkKafkaConsumer09.class
org/apache/flink/streaming/connectors/kafka/FlinkKafkaConsumerBase.class
Here's my pom.xml
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>1.1.2</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.10</artifactId>
<version>1.1.2</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.10</artifactId>
<version>1.1.2</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka-0.9_2.10</artifactId>
<version>1.1.2</version>
</dependency>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>1.5</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<finalName>FloorData</finalName>
<shadeTestJar>false</shadeTestJar>
<shadedArtifactAttached>false</shadedArtifactAttached>
<createDependencyReducedPom>false</createDependencyReducedPom>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
<resource>reference.conf</resource>
</transformer>
<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>floordata.cli.launcher.FlinkLauncher</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
I am configuring my remote environment as follows in my code. I am not sure if this is correct, but do I need to specify the location of the Kafka connector as an argument to createRemoteEnvironment?
InetSocketAddress address = parseHostPortAddress((String) configuration.get(FLINK_JOB_MANAGER_ADDRESS));
StreamExecutionEnvironment.createRemoteEnvironment(address.getHostName(), address.getPort());

Conflicting versions of datanucleus enhancer in a maven google app engine project

I'm having a problem setting up the DataNucleus enhancer for use with a Google App Engine project. If I use the DataNucleus Eclipse plugin everything goes well, but in my Maven project I get a strange conflicting-version error.
My POM has these datanucleus references:
<dependency>
<groupId>org.datanucleus</groupId>
<artifactId>datanucleus-core</artifactId>
<version>1.1.0</version>
</dependency>
...
<plugin>
<groupId>org.datanucleus</groupId>
<artifactId>maven-datanucleus-plugin</artifactId>
<version>1.1.0</version>
<configuration>
<mappingIncludes>**/*.class</mappingIncludes>
<verbose>true</verbose>
<enhancerName>ASM</enhancerName>
<api>JDO</api>
</configuration>
<executions>
<execution>
<phase>compile</phase>
<goals>
<goal>enhance</goal>
</goals>
</execution>
</executions>
</plugin>
When I try to build the project I get the following error:
Exception in thread "main" Plugin (Bundle) "org.datanucleus" is already registered.
Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/Users/drome/.m2/repository/org/datanucleus/datanucleus-core/1.1.0/datanucleus-core-1.1.0.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/Users/drome/.m2/repository/org/datanucleus/datanucleus-core/1.1.3/datanucleus-core-1.1.3.jar."
org.datanucleus.exceptions.NucleusException: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/Users/drome/.m2/repository/org/datanucleus/datanucleus-core/1.1.0/datanucleus-core-1.1.0.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/Users/drome/.m2/repository/org/datanucleus/datanucleus-core/1.1.3/datanucleus-core-1.1.3.jar."
at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:437)
at org.datanucleus.plugin.NonManagedPluginRegistry.registerBundle(NonManagedPluginRegistry.java:343)
at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensions(NonManagedPluginRegistry.java:227)
at org.datanucleus.plugin.NonManagedPluginRegistry.registerExtensionPoints(NonManagedPluginRegistry.java:159)
at org.datanucleus.plugin.PluginManager.registerExtensionPoints(PluginManager.java:82)
at org.datanucleus.OMFContext.<init>(OMFContext.java:164)
at org.datanucleus.enhancer.DataNucleusEnhancer.<init>(DataNucleusEnhancer.java:171)
at org.datanucleus.enhancer.DataNucleusEnhancer.<init>(DataNucleusEnhancer.java:149)
at org.datanucleus.enhancer.DataNucleusEnhancer.main(DataNucleusEnhancer.java:1157)
I don't understand why DataNucleus required Maven to download datanucleus-core-1.1.3.jar, since it is not referenced in the pom.xml.
I also do not understand why datanucleus-core-1.1.3.jar is being registered...
Any ideas?
Thanks in advance...
The DN M2 plugin pulls in the latest versions of the available DN jars that it needs to do its job (there is no sensible way to do it other than using the latest). You want to restrict "core" to a different version, either by specifying the plugin's dependency on core, or by declaring it in your application as follows:
<dependency>
<groupId>org.datanucleus</groupId>
<artifactId>datanucleus-core</artifactId>
<version>1.1.0</version>
<scope>runtime</scope>
</dependency>
Unfortunately the answer is "hidden" in the comments:
<dependency>
<groupId>org.datanucleus</groupId>
<artifactId>datanucleus-core</artifactId>
<version>1.1.0</version>
<scope>runtime</scope>
</dependency>
That worked for me!
I ran into the same issue while testing a Maven GAE plugin archetype.
I fixed it by adding exclusions to my GAE runtime transitive dependencies:
<!-- Google App Engine meta-package -->
<dependency>
<groupId>net.kindleit</groupId>
<artifactId>gae-runtime</artifactId>
<version>${gae.version}</version>
<type>pom</type>
<exclusions>
<exclusion>
<groupId>com.google.appengine.orm</groupId>
<artifactId>datanucleus-core</artifactId>
</exclusion>
</exclusions>
</dependency>
and then adding the DataNucleus core as a runtime dependency:
<dependency>
<groupId>org.datanucleus</groupId>
<artifactId>datanucleus-core</artifactId>
<version>${datanucleus-core.version}</version>
<scope>runtime</scope>
<exclusions>
<exclusion>
<groupId>javax.transaction</groupId>
<artifactId>transaction-api</artifactId>
</exclusion>
</exclusions>
</dependency>
while keeping the GAE plugin section simple:
<plugin>
<groupId>org.datanucleus</groupId>
<artifactId>maven-datanucleus-plugin</artifactId>
<version>${maven-datanucleus-plugin.version}</version>
<configuration>
<!--
Make sure this path contains your persistent classes!
-->
<mappingIncludes>**/model/*.class</mappingIncludes>
<verbose>true</verbose>
<enhancerName>ASM</enhancerName>
<api>JDO</api>
</configuration>
<executions>
<execution>
<phase>compile</phase>
<goals>
<goal>enhance</goal>
</goals>
</execution>
</executions>
</plugin>
After reading "How to override a plugin's dependency in Maven", I found another way to fix this. Here is my POM:
<plugin>
<groupId>org.datanucleus</groupId>
<artifactId>maven-datanucleus-plugin</artifactId>
<version>3.1.0-m3</version>
<configuration>
<verbose>true</verbose>
</configuration>
<executions>
<execution>
<phase>process-classes</phase>
<goals>
<goal>enhance</goal>
</goals>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.datanucleus</groupId>
<artifactId>datanucleus-core</artifactId>
<version>3.0.4</version>
</dependency>
</dependencies>
</plugin>
Clearing the old version of DataNucleus from your local Maven repository also solves the problem.
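On a Unix-like system that would be something like this (assuming the default local repository location under ~/.m2):
rm -rf ~/.m2/repository/org/datanucleus/datanucleus-core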
maven-datanucleus-plugin stopped pulling in the latest available versions of datanucleus-core as of version 3.1.1.
Check the differences between the POM files for maven-datanucleus-plugin 3.1.1 (http://repo1.maven.org/maven2/org/datanucleus/maven-datanucleus-plugin/3.1.1/maven-datanucleus-plugin-3.1.1.pom) and 3.1.0-release (http://mvnrepository.com/artifact/org.datanucleus/maven-datanucleus-plugin/3.1.0-release).
For maven-datanucleus-plugin 3.1.1 the version range of the datanucleus-core dependency is (3.0.99, 3.1.99), while for 3.1.0-release it is (3.0.99, ), i.e. unbounded above. No wonder the older versions of maven-datanucleus-plugin automatically pull in the latest datanucleus-core.
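For reference, Maven's range notation uses round brackets for exclusive bounds, and omitting the upper bound means any newer version is acceptable. A hypothetical pin to the 3.1.x line would be declared like this:
<dependency>
  <groupId>org.datanucleus</groupId>
  <artifactId>datanucleus-core</artifactId>
  <!-- exclusive bounds: any version strictly between 3.0.99 and 3.1.99 -->
  <version>(3.0.99,3.1.99)</version>
</dependency>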
