I am new to Karaf and trying to learn how to use it.
Along the way I tried to add the Camunda features, as described at https://github.com/camunda/camunda-bpm-platform-osgi/tree/master/camunda-bpm-karaf-feature
First, I added the repo:
feature:repo-add mvn:org.camunda.bpm.extension.osgi/camunda-bpm-karaf-feature/4.1.0/xml/features
Then I tried to install the feature:
feature:install camunda-bpm-karaf-feature-full
Unfortunately, I got this exception:
org.osgi.framework.BundleException: Unable to build resource for mvn:xmlpull/xmlpull/1.1.3.1: Unsupported 'Bundle-ManifestVersion' value: 1
at org.apache.felix.utils.resource.ResourceBuilder.build(ResourceBuilder.java:82)
at org.apache.felix.utils.resource.ResourceBuilder.build(ResourceBuilder.java:67)
at org.apache.karaf.features.internal.region.SubsystemResolver.prepare(SubsystemResolver.java:180)
at org.apache.karaf.features.internal.service.Deployer.deploy(Deployer.java:379)
at org.apache.karaf.features.internal.service.FeaturesServiceImpl.doProvision(FeaturesServiceImpl.java:1025)
at org.apache.karaf.features.internal.service.FeaturesServiceImpl.lambda$doProvisionInThread$13(FeaturesServiceImpl.java:964)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.osgi.framework.BundleException: Unsupported 'Bundle-ManifestVersion' value: 1
at org.apache.felix.utils.resource.ResourceBuilder.doBuild(ResourceBuilder.java:90)
at org.apache.felix.utils.resource.ResourceBuilder.build(ResourceBuilder.java:80)
... 9 more
Error executing command: Unable to build resource for mvn:xmlpull/xmlpull/1.1.3.1: Unsupported 'Bundle-ManifestVersion' value: 1
I am using Karaf version 4.2.1
Does somebody know what I am doing wrong?
One of the features depends on xmlpull 1.1.3.1, whose MANIFEST.MF does not declare Bundle-ManifestVersion: 2, which makes it an OSGi R3 bundle.
Apache Felix only supports bundles conforming to OSGi Release 4 or newer (Bundle-ManifestVersion: 2), which is why it rejects xmlpull. See the Felix source for reference.
If you control the source, consider wrapping xmlpull and installing the wrapped bundle from the features. You can also experiment in the Karaf console; for example, install -s wrap:mvn:xmlpull/xmlpull/1.1.3.1.
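For instance, a small custom feature repository could wrap the plain jar via the pax-url-wrap handler (a sketch; the repository and feature names here are made up):

<features name="wrapped-libs" xmlns="http://karaf.apache.org/xmlns/features/v1.4.0">
    <feature name="xmlpull-wrapped" version="1.1.3.1">
        <!-- wrap: lets pax-url-wrap generate an OSGi R4 manifest on the fly -->
        <bundle>wrap:mvn:xmlpull/xmlpull/1.1.3.1$Bundle-SymbolicName=xmlpull&amp;Bundle-Version=1.1.3.1</bundle>
    </feature>
</features>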
I am trying to update my toolchain from Java 8 to 11.
Doing so, I run into issues compiling WSDLs to Java using the Apache CXF Maven plugin.
I tried with the latest two available versions:
org.apache.cxf:cxf-codegen-plugin:3.3.12 and
org.apache.cxf:cxf-codegen-plugin:3.4.5
When I run the wsdl2java goal (phase generate-sources), it first seems to read all the WSDLs and XSDs fine, but it invariably ends in compile errors (see below). So it seems as if it generates code that can't be compiled under Java 11.
What dependencies do I need to add? Or what other steps are necessary to get this working again?
Error:
...
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] /D:/Projects/KStA_ZH_ZHQuest/code/application/zhquest-sam-web-service/target/generated-sources/cxf/ch/adnovum/nevisidm/ws/services/v1_37/AdminServicePortImpl.java:[10,17] package javax.jws does not exist
...
[ERROR] /D:/Projects/KStA_ZH_ZHQuest/code/application/zhquest-sam-web-service/target/generated-sources/cxf/ch/adnovum/nevisidm/ws/services/v1_37/AdminServicePortImpl.java:[14,22] package javax.jws.soap does not exist
[ERROR] /D:/Projects/KStA_ZH_ZHQuest/code/application/zhquest-sam-web-service/target/generated-sources/cxf/ch/adnovum/nevisidm/ws/services/v1_37/AdminServicePortImpl.java:[15,33] package javax.xml.bind.annotation does not exist
Note: I am aware of the threads "CXF codegen maven plugin doesn't work OpenJDK 11" and "What dependencies do I need to run Apache CXF with Spring Boot on Java 11?", but the dependencies suggested in either did not fix my issue.
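For context, what those threads suggest boils down to adding the Java EE APIs removed in Java 11 back as dependencies, roughly along these lines (artifact choices and versions are only examples, and as noted above this kind of addition did not solve my problem):

<dependency>
    <groupId>jakarta.xml.ws</groupId>
    <artifactId>jakarta.xml.ws-api</artifactId>
    <version>2.3.3</version>
</dependency>
<dependency>
    <groupId>jakarta.jws</groupId>
    <artifactId>jakarta.jws-api</artifactId>
    <version>2.1.0</version>
</dependency>
<dependency>
    <groupId>jakarta.xml.bind</groupId>
    <artifactId>jakarta.xml.bind-api</artifactId>
    <version>2.3.3</version>
</dependency>
<dependency>
    <groupId>org.glassfish.jaxb</groupId>
    <artifactId>jaxb-runtime</artifactId>
    <version>2.3.3</version>
</dependency>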
https://issues.jboss.org/browse/ENTESB-8039?focusedCommentId=13618981&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-13618981
The above ticket is marked as resolved, but when I tried it with the latest JBoss Fuse 6.3 version, it does not update the profile. As a result, it throws the error below.
Stack: java.lang.IllegalArgumentException: No operation deployProjectJsonMergeOption found on MBean io.fabric8:type=ProjectDeployer
It looks like you are using a more recent version of the Maven plugin than your Fuse installation, hence the exception. The deployProjectJsonMergeOption operation was introduced in 6.3 Roll Up 1. I'd suggest using matching plugin and Fuse versions, or upgrading your Fuse installation.
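If you go the version-alignment route, that usually means pinning the fabric8 deploy plugin in your POM to whatever version ships with your Fuse 6.3 installation; a rough sketch (the property is a placeholder, take the actual value from your Fuse distribution/BOM):

<plugin>
    <groupId>io.fabric8</groupId>
    <artifactId>fabric8-maven-plugin</artifactId>
    <!-- placeholder: use the fabric8 version bundled with your Fuse installation -->
    <version>${fuse.fabric8.version}</version>
</plugin>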
HTH,
John.
I'm just trying to migrate from Flink 1.3 to 1.4 and I'm getting this exception on a Linux machine
(it does not reproduce on Windows).
I've also added this dependency:
// https://mvnrepository.com/artifact/org.apache.flink/flink-shaded-hadoop2
compile group: 'org.apache.flink', name: 'flink-shaded-hadoop2', version: '1.4.0'
Any help?
In the Flink console:
TriggerWindow(TumblingProcessingTimeWindows(10000), ReducingStateDescriptor{serializer=org.apache.flink.api.java.typeutils.runtime.TupleSerializer#cb6c5dba, reduceFunction=com.clicktale.reducers.MetricsReducer#4e406694}, ProcessingTimeTrigger(), WindowedStream.reduce(WindowedStream.java:241)) -> Sink: Unnamed (1/1)
java.util.ServiceConfigurationError: org.apache.hadoop.fs.FileSystem: Provider org.apache.hadoop.fs.LocalFileSystem not a subtype
at java.util.ServiceLoader.fail(ServiceLoader.java:239)
at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2364)
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2375)
at org.apache.flink.runtime.fs.hdfs.HadoopFsFactory.create(HadoopFsFactory.java:99)
at org.apache.flink.core.fs.FileSystem.getUnguardedFileSystem(FileSystem.java:401)
at org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink.createHadoopFileSystem(BucketingSink.java:1154)
at org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink.initFileSystem(BucketingSink.java:411)
at org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink.initializeState(BucketingSink.java:355)
at org.apache.flink.streaming.util.functions.StreamingFunctionUtils.tryRestoreFunction(StreamingFunctionUtils.java:178)
at org.apache.flink.streaming.util.functions.StreamingFunctionUtils.restoreFunctionState(StreamingFunctionUtils.java:160)
at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.initializeState(AbstractUdfStreamOperator.java:96)
at org.apache.flink.streaming.api.operators.AbstractStreamOperator.initializeState(AbstractStreamOperator.java:259)
at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeOperators(StreamTask.java:694)
at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeState(StreamTask.java:682)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:253)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:718)
at java.lang.Thread.run(Thread.java:748)
I faced similar (not specifically this one, but dependency-related) issues migrating from 1.3 to 1.4.
In my case, I had to re-generate a fresh POM file using the Maven archetype and then add the needed dependencies one by one.
See Java Quickstart or Scala Quickstart.
The reason is that there has been a major rework of the dependency structure. See the release notes for more information.
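For reference, regenerating the skeleton is a one-liner with the Flink quickstart archetype (shown here for 1.4.0, adjust the version as needed):

mvn archetype:generate \
    -DarchetypeGroupId=org.apache.flink \
    -DarchetypeArtifactId=flink-quickstart-java \
    -DarchetypeVersion=1.4.0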
Note that Flink 1.4 will load any Hadoop jars found via the "hadoop classpath" shell command, and these will be first on the classpath. So if you have an incompatible version of Hadoop installed that the "hadoop" command points at, you can run into this kind of problem.
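A quick way to check whether that is what is happening (assuming the hadoop binary is on your PATH):

# show which Hadoop installation the shell picks up and which jars it would contribute
which hadoop
hadoop classpath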
I tried to install camel-osgi using the command below. It gave an error.
karaf#root()>feature:install camel-osgi
Error:
Error executing command: No matching features for camel-osgi/0.0.0
You first have to add the Camel feature repository before installing any Camel feature.
feature:repo-add camel 2.16.2
feature:install camel-core
Karaf can also show you the available features using feature:list.
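For example, to see which Camel features the repository provides, you can pipe the list through grep in the console:

karaf@root()> feature:list | grep camel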
I'm trying to get clojure-solr-0.2.0 working with solr4 (trunk nightly build). I'm getting the following exception:
Caused by: java.lang.RuntimeException: Invalid version or the data in not in 'javabin' format
at org.apache.solr.common.util.JavaBinCodec.unmarshal(JavaBinCodec.java:99)
at org.apache.solr.client.solrj.impl.BinaryResponseParser.processResponse(BinaryResponseParser.java:39)
at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:466)
at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:243)
at org.apache.solr.client.solrj.request.QueryRequest.process(QueryRequest.java:89)
I'm assuming this is because clojure-solr is using some version of the solrj library from the solr3 codebase, and the javabin format changed between versions.
Is there some easy way to get this to work?
EDIT:
I deleted solrj from my project's lib folder and replaced it with lib/apache-solr-solrj-4.0-2012-05-15_08-20-37.jar. I was hoping that would just work, since the same classes should be found on the classpath at runtime (the transitive dependency should only be required when lein deps is run to fetch dependencies, or so I reasoned). The result was a class not found at runtime:
Caused by: java.lang.ClassNotFoundException: org.apache.solr.client.solrj.impl.CommonsHttpSolrServer
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at clojure.lang.DynamicClassLoader.findClass(DynamicClassLoader.java:61)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:169)
at clojure_solr$eval1423$loading__4414__auto____1424.invoke(clojure_solr.clj:1)
at clojure_solr$eval1423.invoke(clojure_solr.clj:1)
at clojure.lang.Compiler.eval(Compiler.java:5424)
... 95 more
And the problem is that CommonsHttpSolrServer is no longer present in the jar:
$ jar tf lib/apache-solr-solrj-4.0-2012-05-15_08-20-37.jar |grep Http
org/apache/solr/client/solrj/impl/HttpSolrServer$1.class
org/apache/solr/client/solrj/impl/HttpSolrServer$2.class
org/apache/solr/client/solrj/impl/HttpSolrServer$3.class
org/apache/solr/client/solrj/impl/HttpSolrServer$DeflateDecompressingEntity.class
org/apache/solr/client/solrj/impl/HttpSolrServer$GzipDecompressingEntity.class
org/apache/solr/client/solrj/impl/HttpSolrServer$UseCompressionRequestInterceptor.class
org/apache/solr/client/solrj/impl/HttpSolrServer$UseCompressionResponseInterceptor.class
org/apache/solr/client/solrj/impl/HttpSolrServer.class
org/apache/solr/client/solrj/impl/LBHttpSolrServer$1.class
org/apache/solr/client/solrj/impl/LBHttpSolrServer$Req.class
org/apache/solr/client/solrj/impl/LBHttpSolrServer$Rsp.class
org/apache/solr/client/solrj/impl/LBHttpSolrServer$ServerWrapper.class
org/apache/solr/client/solrj/impl/LBHttpSolrServer.class
$ jar tf lib/apache-solr-solrj-4.0-2012-05-15_08-20-37.jar |grep CommonsHttp
$
You will very likely need to make your own build of clojure-solr and fix it to work with the latest solr4. This is actually not that hard:
clone it from the github project
lein install
add it as a dependency in your main project
hack and repeat.
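Roughly, that loop looks like this (just a sketch; the clone URL is a placeholder for the clojure-solr GitHub project, and the dependency change inside project.clj is up to you):

git clone https://github.com/<clojure-solr-project>/clojure-solr.git
cd clojure-solr
# edit project.clj so the solrj dependency points at your solr4 build,
# then install the patched library into your local repository
lein install

Judging by the jar listing in the question, the main code change will be replacing CommonsHttpSolrServer with the new HttpSolrServer.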