Flink classloader fails when deploying the job in a cluster - apache-flink

I am observing very strange behavior when deploying a Flink job.
When I run the job in Eclipse, it runs without any issues, but when I run the same job on a Flink cluster it fails. I am using the following command to run the job:
./flink run --class FlinkKafkaExample flink-streaming-confluent-1.0-SNAPSHOT-all.jar --inputTopic1 sample-data-topic1 --inputTopic2 sample-data-topic2 --env example.properties
I am getting the following error:
java.lang.NoClassDefFoundError: org/apache/avro/generic/GenericData$Array
at org.apache.flink.formats.avro.utils.AvroKryoSerializerUtils.addAvroGenericDataArrayRegistration(AvroKryoSerializerUtils.java:64)
at org.apache.flink.api.java.typeutils.runtime.kryo.KryoSerializer.buildKryoRegistrations(KryoSerializer.java:564)
at org.apache.flink.api.java.typeutils.runtime.kryo.KryoSerializer.<init>(KryoSerializer.java:132)
at org.apache.flink.api.java.typeutils.GenericTypeInfo.createSerializer(GenericTypeInfo.java:90)
at org.apache.flink.streaming.api.graph.StreamGraph.addOperator(StreamGraph.java:202)
at org.apache.flink.streaming.api.graph.StreamGraph.addSource(StreamGraph.java:170)
at org.apache.flink.streaming.api.graph.StreamGraphGenerator.transformSource(StreamGraphGenerator.java:470)
at org.apache.flink.streaming.api.graph.StreamGraphGenerator.transform(StreamGraphGenerator.java:170)
at org.apache.flink.streaming.api.graph.StreamGraphGenerator.transformOneInputTransform(StreamGraphGenerator.java:527)
at org.apache.flink.streaming.api.graph.StreamGraphGenerator.transform(StreamGraphGenerator.java:166)
at org.apache.flink.streaming.api.graph.StreamGraphGenerator.generateInternal(StreamGraphGenerator.java:132)
at org.apache.flink.streaming.api.graph.StreamGraphGenerator.generate(StreamGraphGenerator.java:124)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.getStreamGraph(StreamExecutionEnvironment.java:1538)
at org.apache.flink.streaming.api.environment.StreamContextEnvironment.execute(StreamContextEnvironment.java:53)
at FlinkKafkaExample.main(FlinkKafkaExample.java:127)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:420)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:404)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:781)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:275)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
Caused by: java.lang.ClassNotFoundException: org.apache.avro.generic.GenericData$Array
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 29 more
When I check the dependencies required for the jar, they are available on the classpath. I have even tried adding the same jar to the lib folder of Flink 1.5.0.
The dependency it is looking for is shown below:
<!-- https://mvnrepository.com/artifact/org.apache.flink/flink-avro -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-avro</artifactId>
<version>1.5.0</version>
</dependency>
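One thing worth checking: flink-avro does not shade Avro itself, so the org.apache.avro classes have to end up either inside the job's fat jar or in Flink's lib directory. If the shadow/fat-jar build is dropping transitive dependencies, here is a minimal sketch of forcing Avro in explicitly (the 1.8.2 version is an assumption; align it with whatever flink-avro 1.5.0 resolves):
<!-- hypothetical explicit Avro dependency; match the version flink-avro 1.5.0 pulls in -->
<dependency>
<groupId>org.apache.avro</groupId>
<artifactId>avro</artifactId>
<version>1.8.2</version>
</dependency>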

Related

org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: No ExecutorFactory found to execute the application

I am trying to submit a Flink job with version 1.11.3. The actual application was developed in Flink 1.8.1.
Since the Hadoop cluster is Apache Hadoop 3.2.0, I downloaded Flink 1.11.3 (flink-1.11.3-bin-scala_2.11.tgz) and tried to submit the job.
While submitting I am facing the exception below. I have set the Hadoop environment variables:
export HADOOP_CONF_DIR=/etc/hadoop/conf
export HADOOP_CLASSPATH=`hadoop classpath`
Are there any changes I need to make in the pom file to overcome this?
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: No ExecutorFactory found to execute the application.
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:302)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:198)
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:149)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:699)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:232)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:916)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:992)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.flink.runtime.security.contexts.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:992)
Caused by: java.lang.IllegalStateException: No ExecutorFactory found to execute the application.
at org.apache.flink.core.execution.DefaultExecutorServiceLoader.getExecutorFactory(DefaultExecutorServiceLoader.java:84)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.executeAsync(StreamExecutionEnvironment.java:1809)
at org.apache.flink.client.program.StreamContextEnvironment.executeAsync(StreamContextEnvironment.java:128)
at org.apache.flink.client.program.StreamContextEnvironment.execute(StreamContextEnvironment.java:76)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1700)
at org.sapphire.appspayload.StreamingJob.main(StreamingJob.java:214)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:288)
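A frequent cause of this error when upgrading to Flink 1.11.x is that flink-clients is no longer a transitive dependency of the DataStream API, so no ExecutorFactory ends up on the classpath. A minimal pom addition that often resolves it (a sketch; adjust the Scala suffix and version to your setup):
<!-- flink-clients must be declared explicitly from Flink 1.11 onward -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.11</artifactId>
<version>1.11.3</version>
</dependency>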

Flink: submitting a job fails with "Failed to deserialize JobGraph"

I was trying to submit a Flink job to a cluster:
./bin/flink run -m <ip>:8081 examples/batch/WordCount.jar --input /opt/flink/README.txt
but got the error Failed to deserialize JobGraph:
org.apache.flink.client.program.ProgramInvocationException: Could not retrieve the execution result. (JobID: 6095949ee689e308039dbc62da2bdf03)
at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:255)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
at org.apache.flink.client.program.ContextEnvironment.execute(ContextEnvironment.java:62)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:820)
at org.apache.flink.api.java.DataSet.collect(DataSet.java:413)
at org.apache.flink.api.java.DataSet.print(DataSet.java:1652)
at org.apache.flink.examples.java.wordcount.WordCount.main(WordCount.java:88)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:576)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:438)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:274)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:746)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:273)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:205)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1010)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1083)
at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1083)
Caused by: org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph.
at org.apache.flink.client.program.rest.RestClusterClient.lambda$submitJob$8(RestClusterClient.java:382)
at java.util.concurrent.CompletableFuture.uniExceptionally(CompletableFuture.java:870)
at java.util.concurrent.CompletableFuture$UniExceptionally.tryFire(CompletableFuture.java:852)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1977)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$8(FutureUtils.java:263)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:760)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:736)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:474)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:561)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:929)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:442)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.rest.util.RestClientException: [Failed to deserialize JobGraph.]
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:389)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:373)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:952)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:926)
It turns out to be a compatibility issue: the cluster is Flink 1.8.1 while the CLI and job jar are from Flink 1.9.0. After switching to the same version, it worked.
In my case, the issue occurred because the Flink cluster had not been started:
./bin/start-cluster.sh
In my case (when I was playing with Flink + Beam) it was a version compatibility issue.
I had an example Beam job that ran on GCP Dataflow but got the mentioned error on Flink 1.15.x + Beam 2.38.x.
When I switched to Flink 1.14.x the job ran successfully, so be careful with dependencies.
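Since all three answers above come down to mismatched versions, it can help to pin one Flink version as a property in the job's pom and reference it from every Flink module, so the job jar can only ever be built against a single version (a sketch; flink-streaming-java_2.11 stands in for whichever Flink modules the job actually uses, and the version should match the cluster):
<properties>
<flink.version>1.8.1</flink.version>
</properties>
<!-- every Flink artifact then references the same property -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.11</artifactId>
<version>${flink.version}</version>
</dependency>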

Unable to build Flink from sources due to MapR artifacts problems

Summary: the MapR dependency could not be found, and thus the Flink build on the master branch fails.
Failed to execute goal on project flink-mapr-fs: Could not resolve
dependencies for project org.apache.flink:flink-mapr-fs:jar:1.10-SNAPSHOT:
Failed to collect dependencies at com.mapr.hadoop:maprfs:jar:5.2.1-mapr
MapR has been in shutdown talks as a company, and I'm unclear as to whether they're even still operating. What is the answer here? I'd like to simply skip/ignore the MapR connector but am not sure how to do so.
Details:
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal on project flink-mapr-fs: Could not resolve dependencies for project org.apache.flink:flink-mapr-fs:jar:1.10-SNAPSHOT: Failed to collect dependencies at com.mapr.hadoop:maprfs:jar:5.2.1-mapr
at org.apache.maven.lifecycle.internal.LifecycleDependencyResolver.getDependencies(LifecycleDependencyResolver.java:221)
at org.apache.maven.lifecycle.internal.LifecycleDependencyResolver.resolveProjectDependencies(LifecycleDependencyResolver.java:127)
at org.apache.maven.lifecycle.internal.MojoExecutor.ensureDependenciesAreResolved(MojoExecutor.java:245)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:199)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:80)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:307)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:193)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:106)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:863)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:199)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.project.DependencyResolutionException: Could not resolve dependencies for project org.apache.flink:flink-mapr-fs:jar:1.10-SNAPSHOT: Failed to collect dependencies at com.mapr.hadoop:maprfs:jar:5.2.1-mapr
at org.apache.maven.project.DefaultProjectDependenciesResolver.resolve(DefaultProjectDependenciesResolver.java:180)
at org.apache.maven.lifecycle.internal.LifecycleDependencyResolver.getDependencies(LifecycleDependencyResolver.java:195)
... 23 more
Caused by: org.eclipse.aether.collection.DependencyCollectionException: Failed to collect dependencies at com.mapr.hadoop:maprfs:jar:5.2.1-mapr
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.collectDependencies(DefaultDependencyCollector.java:291)
at org.eclipse.aether.internal.impl.DefaultRepositorySystem.collectDependencies(DefaultRepositorySystem.java:316)
at org.apache.maven.project.DefaultProjectDependenciesResolver.resolve(DefaultProjectDependenciesResolver.java:172)
... 24 more
Caused by: org.eclipse.aether.resolution.ArtifactDescriptorException: Failed to read artifact descriptor for com.mapr.hadoop:maprfs:jar:5.2.1-mapr
at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.loadPom(DefaultArtifactDescriptorReader.java:282)
at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.readArtifactDescriptor(DefaultArtifactDescriptorReader.java:198)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.resolveCachedArtifactDescriptor(DefaultDependencyCollector.java:535)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.getArtifactDescriptorResult(DefaultDependencyCollector.java:519)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.processDependency(DefaultDependencyCollector.java:409)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.processDependency(DefaultDependencyCollector.java:363)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.process(DefaultDependencyCollector.java:351)
at org.eclipse.aether.internal.impl.DefaultDependencyCollector.collectDependencies(DefaultDependencyCollector.java:254)
... 26 more
Caused by: org.eclipse.aether.resolution.ArtifactResolutionException: Could not transfer artifact com.mapr.hadoop:maprfs:pom:5.2.1-mapr from/to mapr-releases (https://repository.mapr.com/maven/): sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:444)
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifacts(DefaultArtifactResolver.java:246)
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolveArtifact(DefaultArtifactResolver.java:223)
at org.apache.maven.repository.internal.DefaultArtifactDescriptorReader.loadPom(DefaultArtifactDescriptorReader.java:267)
... 33 more
Caused by: org.eclipse.aether.transfer.ArtifactTransferException: Could not transfer artifact com.mapr.hadoop:maprfs:pom:5.2.1-mapr from/to mapr-releases (https://repository.mapr.com/maven/): sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at org.eclipse.aether.connector.basic.ArtifactTransportListener.transferFailed(ArtifactTransportListener.java:43)
at org.eclipse.aether.connector.basic.BasicRepositoryConnector$TaskRunner.run(BasicRepositoryConnector.java:355)
at org.eclipse.aether.util.concurrency.RunnableErrorForwarder$1.run(RunnableErrorForwarder.java:67)
at org.eclipse.aether.connector.basic.BasicRepositoryConnector$DirectExecutor.execute(BasicRepositoryConnector.java:581)
at org.eclipse.aether.connector.basic.BasicRepositoryConnector.get(BasicRepositoryConnector.java:249)
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.performDownloads(DefaultArtifactResolver.java:520)
at org.eclipse.aether.internal.impl.DefaultArtifactResolver.resolve(DefaultArtifactResolver.java:421)
... 36 more
Caused by: org.apache.maven.wagon.TransferFailedException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
Just verify that you have the entry below in flink/flink-filesystems/flink-mapr-fs/pom.xml:
<profile>
<id>unsafe-mapr-repo</id>
<activation>
<property>
<name>unsafe-mapr-repo</name>
</property>
</activation>
<repositories>
<!-- MapR -->
<repository>
<id>mapr-releases</id>
<url>http://repository.mapr.com/maven/</url>
<snapshots><enabled>false</enabled></snapshots>
<releases><enabled>true</enabled></releases>
</repository>
</repositories>
</profile>
Once that is verified, run the command below from flink/flink-filesystems/flink-mapr-fs to confirm that this module builds successfully:
mvn clean package -DskipTests -Punsafe-mapr-repo
Once this succeeds, run the same Maven command from flink/flink-filesystems, and after that you should be good to build the full Flink project.
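If the goal is instead to skip the MapR connector entirely, Maven 3.2.1+ can exclude a module from the reactor when building from the Flink root directory (a sketch; it assumes no other module in your build still requires flink-mapr-fs):
mvn clean install -DskipTests -pl '!flink-filesystems/flink-mapr-fs'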

Component camel-jsonpath gives error after adding to pom file

In a clean Camel project I add the following dependency:
<dependency>
<groupId>org.apache.camel</groupId>
<artifactId>camel-jsonpath</artifactId>
<version>2.16.1</version>
</dependency>
When I run the project I get the error below. If I remove that dependency it works. I have tried this on two different projects and the same thing happens. Is there anything else that has to be added to the pom for jsonpath to work? It seems to complain about ASM.
[INFO] Using org.apache.camel.test.blueprint.Main to initiate a CamelContext
[INFO] Starting Camel ...
[mel.test.blueprint.Main.main()] Activator INFO Camel activator starting
[mel.test.blueprint.Main.main()] Activator INFO Camel activator started
[ Blueprint Extender: 1] BlueprintContainerImpl INFO Bundle INT001_GetPostcodeDataFromXXX/0.0.1.SNAPSHOT is waiting for namespace handlers [http://camel.apache.org/schema/blueprint]
EventDispatcher: Error during dispatch.
EventDispatcher: Error during dispatch.
EventDispatcher: Error during dispatch.
EventDispatcher: Error during dispatch.
EventDispatcher: Error during dispatch.
org.osgi.framework.ServiceException: Service factory exception: org/objectweb/asm/commons/AdviceAdapter
at org.apache.felix.connect.felix.framework.ServiceRegistrationImpl.getFactoryUnchecked(ServiceRegistrationImpl.java:246)
at org.apache.felix.connect.felix.framework.ServiceRegistrationImpl.getService(ServiceRegistrationImpl.java:178)
at org.apache.felix.connect.felix.framework.ServiceRegistry.getService(ServiceRegistry.java:323)
at org.apache.felix.connect.PojoSRBundleContext.getService(PojoSRBundleContext.java:162)
at org.apache.aries.blueprint.namespace.NamespaceHandlerRegistryImpl.addingService(NamespaceHandlerRegistryImpl.java:113)
at org.osgi.util.tracker.ServiceTracker$Tracked.customizerAdding(ServiceTracker.java:932)
at org.osgi.util.tracker.ServiceTracker$Tracked.customizerAdding(ServiceTracker.java:864)
at org.osgi.util.tracker.AbstractTracked.trackAdding(AbstractTracked.java:256)
at org.osgi.util.tracker.AbstractTracked.track(AbstractTracked.java:229)
at org.osgi.util.tracker.ServiceTracker$Tracked.serviceChanged(ServiceTracker.java:894)
at org.apache.felix.connect.felix.framework.util.EventDispatcher.invokeServiceListenerCallback(EventDispatcher.java:852)
at org.apache.felix.connect.felix.framework.util.EventDispatcher.fireEventImmediately(EventDispatcher.java:775)
at org.apache.felix.connect.felix.framework.util.EventDispatcher.fireServiceEvent(EventDispatcher.java:594)
at org.apache.felix.connect.PojoSR$1.serviceChanged(PojoSR.java:78)
at org.apache.felix.connect.felix.framework.ServiceRegistry.registerService(ServiceRegistry.java:130)
at org.apache.felix.connect.PojoSRBundleContext.registerService(PojoSRBundleContext.java:101)
at org.apache.aries.blueprint.container.BlueprintContainerImpl.registerService(BlueprintContainerImpl.java:453)
at org.apache.aries.blueprint.container.ServiceRecipe.register(ServiceRecipe.java:193)
at org.apache.aries.blueprint.container.BlueprintContainerImpl.registerServices(BlueprintContainerImpl.java:704)
at org.apache.aries.blueprint.container.BlueprintContainerImpl.doRun(BlueprintContainerImpl.java:379)
at org.apache.aries.blueprint.container.BlueprintContainerImpl.run(BlueprintContainerImpl.java:269)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at org.apache.aries.blueprint.container.ExecutorServiceWrapper.run(ExecutorServiceWrapper.java:106)
at org.apache.aries.blueprint.utils.threading.impl.DiscardableRunnable.run(DiscardableRunnable.java:48)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoClassDefFoundError: org/objectweb/asm/commons/AdviceAdapter
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.aries.proxy.impl.interfaces.ProxyClassLoader.createProxyClass(ProxyClassLoader.java:146)
at org.apache.aries.proxy.impl.interfaces.InterfaceProxyGenerator.getProxyInstance(InterfaceProxyGenerator.java:95)
at org.apache.aries.proxy.impl.AsmProxyManager.createNewProxy(AsmProxyManager.java:80)
at org.apache.aries.proxy.impl.AbstractProxyManager.createDelegatingInterceptingProxy(AbstractProxyManager.java:75)
at org.apache.aries.proxy.impl.AbstractProxyManager.createInterceptingProxy(AbstractProxyManager.java:53)
at org.apache.aries.blueprint.container.ServiceRecipe$TriggerServiceFactory.getService(ServiceRecipe.java:569)
at org.apache.felix.connect.felix.framework.ServiceRegistrationImpl.getFactoryUnchecked(ServiceRegistrationImpl.java:242)
... 31 more
Caused by: java.lang.ClassNotFoundException: org.objectweb.asm.commons.AdviceAdapter
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 49 more
org.osgi.framework.ServiceException: Service factory exception: org/objectweb/asm/commons/AdviceAdapter
at org.apache.felix.connect.felix.framework.ServiceRegistrationImpl.getFactoryUnchecked(ServiceRegistrationImpl.java:246)
at org.apache.felix.connect.felix.framework.ServiceRegistrationImpl.getService(ServiceRegistrationImpl.java:178)
at org.apache.felix.connect.felix.framework.ServiceRegistry.getService(ServiceRegistry.java:323)
at org.apache.felix.connect.PojoSRBundleContext.getService(PojoSRBundleContext.java:162)
at org.apache.aries.blueprint.namespace.NamespaceHandlerRegistryImpl.addingService(NamespaceHandlerRegistryImpl.java:113)
at org.osgi.util.tracker.ServiceTracker$Tracked.customizerAdding(ServiceTracker.java:932)
at org.osgi.util.tracker.ServiceTracker$Tracked.customizerAdding(ServiceTracker.java:864)
at org.osgi.util.tracker.AbstractTracked.trackAdding(AbstractTracked.java:256)
at org.osgi.util.tracker.AbstractTracked.track(AbstractTracked.java:229)
at org.osgi.util.tracker.ServiceTracker$Tracked.serviceChanged(ServiceTracker.java:894)
at org.apache.felix.connect.felix.framework.util.EventDispatcher.invokeServiceListenerCallback(EventDispatcher.java:852)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoClassDefFoundError: org/objectweb/asm/commons/AdviceAdapter
Edit:
After adding:
<dependency>
<groupId>org.ow2.asm</groupId>
<artifactId>asm-commons</artifactId>
<version>5.0.3</version>
<scope>runtime</scope>
</dependency>
It worked, but I doubt this is the correct way. The Camel component should pull in all of the jars it depends on.
I posted this on the Camel Nabble forum, and indeed it seems you need to add the ASM dependency to get it to work.
http://camel.465427.n5.nabble.com/Error-with-Camel-component-camel-jsonpath-td5777201.html
<dependency>
<groupId>org.ow2.asm</groupId>
<artifactId>asm-commons</artifactId>
<version>5.0.3</version>
<scope>runtime</scope>
</dependency>
The pom.xml is about compile-time dependencies, but you have a runtime error about a missing dependency (org.objectweb.asm). You should add it to your container.
In your stack traces, a new service is registered in a blueprint context and then injected into another blueprint context. For this service, the blueprint container wants to wrap it in a new proxy, and to create this kind of proxy Aries Blueprint needs org.objectweb.asm. I don't think this error is directly related to camel-jsonpath.
In my case, back when asm-commons version 5.0.3 was used, I was still getting errors. As an alternative to the previous answer, the asm-all artifact can be used instead, since it includes the other ASM dependencies that were otherwise breaking in my scenario.
<dependency>
<groupId>org.ow2.asm</groupId>
<artifactId>asm-all</artifactId>
<scope>runtime</scope>
</dependency>
However, regarding deployment, it seems nearly impossible to integrate a project using this dependency with Fabric8 (there are no errors in the console log, and the containers do not start), so I am not sure whether there is a better alternative such as a dedicated camel-jsonpath feature.
Hope it helps someone.

Can't run Java code with OWLAPI

I'm new to OWLAPI and I'm trying to write a sample Java program on Debian to load an ontology that I already built using Protégé. I'm using "owlapi-osgidistribution-4.0.2.jar", but I always get this error:
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/cache/CacheLoader
at org.semanticweb.owlapi.vocab.OWLFacet.<init>(OWLFacet.java:87)
at org.semanticweb.owlapi.vocab.OWLFacet.<clinit>(OWLFacet.java:60)
at org.semanticweb.owlapi.vocab.OWL2Datatype$Category.<clinit>(OWL2Datatype.java:328)
at org.semanticweb.owlapi.vocab.OWL2Datatype.<clinit>(OWL2Datatype.java:74)
at uk.ac.manchester.cs.owl.owlapi.InternalsNoCache.<clinit>(InternalsNoCache.java:59)
at uk.ac.manchester.cs.owl.owlapi.OWLDataFactoryImpl.<init>(OWLDataFactoryImpl.java:128)
at uk.ac.manchester.cs.owl.owlapi.OWLDataFactoryImpl.<clinit>(OWLDataFactoryImpl.java:74)
at org.semanticweb.owlapi.apibinding.OWLManager.getOWLDataFactory(OWLManager.java:152)
at org.semanticweb.owlapi.apibinding.OWLManager.createOWLOntologyManager(OWLManager.java:113)
at LoadingOntologies.main(LoadingOntologies.java:22)
Caused by: java.lang.ClassNotFoundException: com.google.common.cache.CacheLoader
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 10 more
I tried to use Maven for the dependencies, but in vain.
Can anyone tell me how to solve this problem? Thanks.
You're missing the Guava jars. For OWLAPI 4.0.2, you also need all the other jars included in the Maven dependencies. If you cannot use Maven to build your code, you'll need to make sure all the dependencies are added manually.
Can you describe what you have tried with Maven and what errors you got?
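For reference, a minimal sketch of declaring Guava explicitly in the pom (the 18.0 version is an assumption; match whatever version OWLAPI 4.0.2 declares):
<!-- hypothetical explicit Guava dependency; align the version with OWLAPI 4.0.2 -->
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>18.0</version>
</dependency>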
In my opinion the best way to get all the dependencies is via Maven. Locating JARs manually takes too much time.
<dependencies>
<dependency>
<groupId>net.sourceforge.owlapi</groupId>
<artifactId>owlapi-distribution</artifactId>
<version>5.1.0</version>
</dependency>
</dependencies>
