Problem with 'getInstanceExplanations' when using OWL-API and Pellet reasoner - owl-api

I am trying to get reasoning explanations using the getInstanceExplanations method:
Set<Set<OWLAxiom>> expIns = expGen.getInstanceExplanations(individual, Class);
But I get this error:
Exception in thread "main" java.lang.NoSuchMethodError: 'org.semanticweb.owlapi.model.OWLLiteral org.semanticweb.owlapi.model.OWLDataFactory.getOWLTypedLiteral(java.lang.String, org.semanticweb.owlapi.model.OWLDatatype)'
at com.clarkparsia.pellet.owlapiv3.ConceptConverter.visitLiteral(ConceptConverter.java:348)
at com.clarkparsia.pellet.owlapiv3.ConceptConverter.visitRestrictedDatatype(ConceptConverter.java:386)
at org.mindswap.pellet.output.ATermBaseVisitor.visit(ATermBaseVisitor.java:109)
at com.clarkparsia.pellet.owlapiv3.ConceptConverter.visitSome(ConceptConverter.java:174)
at org.mindswap.pellet.output.ATermBaseVisitor.visit(ATermBaseVisitor.java:85)
at com.clarkparsia.pellet.owlapiv3.ConceptConverter.visitList(ConceptConverter.java:357)
at com.clarkparsia.pellet.owlapiv3.ConceptConverter.visitAnd(ConceptConverter.java:138)
at org.mindswap.pellet.output.ATermBaseVisitor.visit(ATermBaseVisitor.java:67)
at com.clarkparsia.pellet.owlapiv3.ConceptConverter.visitSome(ConceptConverter.java:174)
at org.mindswap.pellet.output.ATermBaseVisitor.visit(ATermBaseVisitor.java:85)
at com.clarkparsia.pellet.owlapiv3.ConceptConverter.visitList(ConceptConverter.java:357)
at com.clarkparsia.pellet.owlapiv3.ConceptConverter.visitOr(ConceptConverter.java:148)
at org.mindswap.pellet.output.ATermBaseVisitor.visit(ATermBaseVisitor.java:73)
at com.clarkparsia.pellet.owlapiv3.ConceptConverter.visitList(ConceptConverter.java:357)
at com.clarkparsia.pellet.owlapiv3.ConceptConverter.visitAnd(ConceptConverter.java:138)
at org.mindswap.pellet.output.ATermBaseVisitor.visit(ATermBaseVisitor.java:67)
at com.clarkparsia.pellet.owlapiv3.ConceptConverter.convert(ConceptConverter.java:84)
at com.clarkparsia.pellet.owlapiv3.AxiomConverter.convert(AxiomConverter.java:87)
at com.clarkparsia.owlapi.explanation.GlassBoxExplanation.convertExplanation(GlassBoxExplanation.java:263)
at com.clarkparsia.owlapi.explanation.GlassBoxExplanation.getPelletExplanation(GlassBoxExplanation.java:228)
at com.clarkparsia.owlapi.explanation.GlassBoxExplanation.getExplanation(GlassBoxExplanation.java:191)
at com.clarkparsia.owlapi.explanation.HSTExplanationGenerator.getExplanation(HSTExplanationGenerator.java:110)
at com.clarkparsia.owlapi.explanation.HSTExplanationGenerator.getExplanations(HSTExplanationGenerator.java:131)
at com.clarkparsia.owlapi.explanation.HSTExplanationGenerator.getExplanations(HSTExplanationGenerator.java:115)
at com.clarkparsia.owlapi.explanation.PelletExplanation.getUnsatisfiableExplanations(PelletExplanation.java:154)
at com.clarkparsia.owlapi.explanation.PelletExplanation.getEntailmentExplanations(PelletExplanation.java:82)
at com.clarkparsia.owlapi.explanation.PelletExplanation.getInstanceExplanations(PelletExplanation.java:111)
at com.google.mapApi.map.Ontology.Ontology.main(Ontology.java:492)
I am using OWL-API 4.5.0 and Pellet 2.3.2.

Pellet 2.3.2 relies on OWLAPI 3.x.x and is not compatible with OWLAPI 4.x.x.
There is a build compatible with OWLAPI 4 available on Maven Central:
https://search.maven.org/search?q=v:2.4.0-ignazio1977
There is also a much more recent build for OWLAPI 5, under the Openllet fork:
https://search.maven.org/artifact/com.github.galigator.openllet/openllet-parent/2.6.4/pom
(2.6.4 is compatible with Java 8, same as the versions you're currently using.)
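As a sketch of what the call might look like against the Openllet fork: the package and class names below (openllet.owlapi.OpenlletReasonerFactory, openllet.owlapi.explanation.PelletExplanation) are my assumption that Openllet keeps Pellet's explanation API under the openllet.* packages, and the ontology file and IRIs are placeholders, so treat this as a starting point rather than a verified example.

import java.io.File;
import java.util.Set;

import org.semanticweb.owlapi.apibinding.OWLManager;
import org.semanticweb.owlapi.model.*;

import openllet.owlapi.OpenlletReasonerFactory;
import openllet.owlapi.explanation.PelletExplanation;

public class InstanceExplanationSketch {
    public static void main(String[] args) throws Exception {
        // Registers the glass-box explanation machinery (same step Pellet 2.x required).
        PelletExplanation.setup();

        OWLOntologyManager manager = OWLManager.createOWLOntologyManager();
        OWLOntology ontology = manager.loadOntologyFromOntologyDocument(new File("ontology.owl")); // placeholder file

        // Explanation generator built on top of an Openllet reasoner.
        PelletExplanation expGen = new PelletExplanation(
                OpenlletReasonerFactory.getInstance().createReasoner(ontology));

        OWLDataFactory df = manager.getOWLDataFactory();
        OWLNamedIndividual individual = df.getOWLNamedIndividual(IRI.create("http://example.org#ind")); // placeholder IRI
        OWLClass cls = df.getOWLClass(IRI.create("http://example.org#Cls"));                            // placeholder IRI

        // Same call as in the question, now with the reasoner and OWLAPI built against each other.
        Set<Set<OWLAxiom>> expIns = expGen.getInstanceExplanations(individual, cls);
        expIns.forEach(System.out::println);
    }
}

If you stay on Pellet 2.3.2 instead, the code from the question should work unchanged, but you need to drop back to an OWLAPI 3.x release so the reasoner and the OWLDataFactory methods agree.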

Related

SOLR JDBC failing with Could not initialize class org.apache.solr.handler.sql.SolrRules at org.apache.solr.handler.sql.SolrTableScan.register

When trying to run a SOLR query,
SELECT count(*) from products;
we are seeing the below exception from the Solr server. Ours is a SolrCloud setup.
Solr version: solr-8.8.2-PATCH2
SolrJ version: solr-solrj-8.8.2
The complete stack trace is below:
2023-02-09 14:21:34.824 ERROR (qtp1209411469-15) [c:products s:shard3 r:core_node12 x:otmm_shard3_replica_n10] o.a.s.s.HttpSolrCall java.lang.RuntimeException: java.lang.NoClassDefFoundError: Could not initialize class org.apache.solr.handler.sql.SolrRules
at org.apache.solr.servlet.HttpSolrCall.sendError(HttpSolrCall.java:746)
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:592)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:427)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:357)
at org.eclipse.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:201)
at org.eclipse.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1601)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:548)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:602)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:235)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1624)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1434)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:501)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1594)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1349)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:191)
at org.eclipse.jetty.server.handler.InetAccessHandler.handle(InetAccessHandler.java:177)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:146)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
at org.eclipse.jetty.server.Server.handle(Server.java:516)
at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:388)
at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:633)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:380)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105)
at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:386)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
at java.base/java.lang.Thread.run(Thread.java:829)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.solr.handler.sql.SolrRules
at org.apache.solr.handler.sql.SolrTableScan.register(SolrTableScan.java:72)
at org.apache.calcite.plan.AbstractRelOptPlanner.onNewClass(AbstractRelOptPlanner.java:239)
at org.apache.calcite.plan.volcano.VolcanoPlanner.onNewClass(VolcanoPlanner.java:464)
at org.apache.calcite.plan.AbstractRelOptPlanner.registerClass(AbstractRelOptPlanner.java:230)
at org.apache.calcite.plan.volcano.VolcanoPlanner.registerImpl(VolcanoPlanner.java:1224)
at org.apache.calcite.plan.volcano.VolcanoPlanner.register(VolcanoPlanner.java:589)
at org.apache.calcite.plan.volcano.VolcanoPlanner.ensureRegistered(VolcanoPlanner.java:604)
at org.apache.calcite.plan.volcano.VolcanoPlanner.ensureRegistered(VolcanoPlanner.java:84)
at org.apache.calcite.rel.AbstractRelNode.onRegister(AbstractRelNode.java:268)
at org.apache.calcite.plan.volcano.VolcanoPlanner.registerImpl(VolcanoPlanner.java:1132)
at org.apache.calcite.plan.volcano.VolcanoPlanner.register(VolcanoPlanner.java:589)
at org.apache.calcite.plan.volcano.VolcanoPlanner.ensureRegistered(VolcanoPlanner.java:604)
at org.apache.calcite.plan.volcano.VolcanoPlanner.ensureRegistered(VolcanoPlanner.java:84)
at org.apache.calcite.rel.AbstractRelNode.onRegister(AbstractRelNode.java:268)
at org.apache.calcite.plan.volcano.VolcanoPlanner.registerImpl(VolcanoPlanner.java:1132)
at org.apache.calcite.plan.volcano.VolcanoPlanner.setRoot(VolcanoPlanner.java:265)
at org.apache.calcite.tools.Programs.lambda$standard$3(Programs.java:262)
at org.apache.calcite.tools.Programs$SequenceProgram.run(Programs.java:331)
at org.apache.calcite.prepare.Prepare.optimize(Prepare.java:166)
at org.apache.calcite.prepare.Prepare.prepareSql(Prepare.java:297)
at org.apache.calcite.prepare.Prepare.prepareSql(Prepare.java:208)
at org.apache.calcite.prepare.CalcitePrepareImpl.prepare2_(CalcitePrepareImpl.java:642)
at org.apache.calcite.prepare.CalcitePrepareImpl.prepare_(CalcitePrepareImpl.java:508)
at org.apache.calcite.prepare.CalcitePrepareImpl.prepareSql(CalcitePrepareImpl.java:478)
at org.apache.calcite.jdbc.CalciteConnectionImpl.parseQuery(CalciteConnectionImpl.java:231)
at org.apache.calcite.jdbc.CalciteMetaImpl.prepareAndExecute(CalciteMetaImpl.java:556)
at org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:675)
at org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:156)
at org.apache.calcite.avatica.AvaticaStatement.executeQuery(AvaticaStatement.java:227)
at org.apache.solr.client.solrj.io.stream.JDBCStream.open(JDBCStream.java:278)
at org.apache.solr.client.solrj.io.stream.ExceptionStream.open(ExceptionStream.java:52)
at org.apache.solr.handler.StreamHandler$TimerStream.open(StreamHandler.java:465)
at org.apache.solr.client.solrj.io.stream.TupleStream.writeMap(TupleStream.java:82)
at org.apache.solr.common.util.JsonTextWriter.writeMap(JsonTextWriter.java:164)
at org.apache.solr.common.util.TextWriter.writeMap(TextWriter.java:216)
at org.apache.solr.common.util.TextWriter.writeVal(TextWriter.java:69)
at org.apache.solr.response.TextResponseWriter.writeVal(TextResponseWriter.java:153)
at org.apache.solr.common.util.JsonTextWriter.writeNamedListAsMapWithDups(JsonTextWriter.java:387)
at org.apache.solr.common.util.JsonTextWriter.writeNamedList(JsonTextWriter.java:293)
at org.apache.solr.response.JSONWriter.writeResponse(JSONWriter.java:73)
at org.apache.solr.response.JSONResponseWriter.write(JSONResponseWriter.java:66)
at org.apache.solr.response.QueryResponseWriterUtil.writeQueryResponse(QueryResponseWriterUtil.java:65)
at org.apache.solr.servlet.HttpSolrCall.writeResponse(HttpSolrCall.java:890)
at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:583)
... 40 more
Kindly let me know what is going wrong with this.
I am using SolrJ, and I have an application that supports only JDBC, so I am in a situation where I can only query Solr over JDBC via SolrJ.
It is basically a BIRT reporting tool, where we define SQL and the output is automatically mapped to the report.
This is what we are trying to do.
I added the following in solr.in.sh:
SOLR_OPTS="$SOLR_OPTS -Dsolr.modules=sql"
In the Solr admin UI, I can see the following under Args on the System dashboard:
-Dsolr.modules=sql
But it is still not working:
{
  "responseHeader": {
    "status": 0,
    "QTime": 2
  },
  "config": {
    "requestHandler": {
      "/export": {
        "class": "solr.ExportHandler",
        "useParams": "_EXPORT",
        "components": ["query"],
        "invariants": {
          "rq": "{!xport}",
          "distrib": false
        },
        "name": "/export",
        "_useParamsExpanded_": {"_EXPORT": "[NOT AVAILABLE]"},
        "_effectiveParams_": {
          "distrib": "false",
          "rq": "{!xport}"
        }
      }
    }
  }
}
Edit: I just realized you had linked to the wrong docs for your version. The way you are trying to add modules was not available until 9.0 (https://issues.apache.org/jira/browse/SOLR-15914). The 8.8 docs are at https://solr.apache.org/guide/8_8/. Note that until recently this feature was called Parallel SQL: https://solr.apache.org/guide/8_8/parallel-sql-interface.html
For 8.8 you shouldn't need to configure modules since at that time /sql was an implicitly loaded request handler.
https://solr.apache.org/guide/8_8/implicit-requesthandlers.html
You may need to verify if the implicit handlers have been (mis)configured via the request parameters API (https://solr.apache.org/guide/8_8/implicit-requesthandlers.html#how-to-edit-implicit-handler-paramsets)
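If the implicit /sql handler is available, plain JDBC through SolrJ is usually all a BIRT-style client needs. Here is a rough sketch, assuming solr-solrj (and its dependencies) is on the client classpath; the ZooKeeper connection string and collection name are placeholders, and the URL follows the jdbc:solr://<zk connection string>?collection=<name> form used by Solr's SQL/JDBC support:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SolrJdbcSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder ZooKeeper ensemble and collection; adjust to your SolrCloud setup.
        String url = "jdbc:solr://zkhost1:2181,zkhost2:2181?collection=products";

        // The driver normally self-registers from solr-solrj; tools that need an explicit
        // driver class can use org.apache.solr.client.solrj.io.sql.DriverImpl.
        try (Connection con = DriverManager.getConnection(url);
             Statement stmt = con.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT count(*) FROM products")) {
            while (rs.next()) {
                System.out.println(rs.getLong(1));
            }
        }
    }
}

In BIRT you would configure the same URL and driver class in the JDBC data source rather than writing this code yourself.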
The following refers to the 9.x versions of Solr.
java.lang.NoClassDefFoundError: Could not initialize class org.apache.solr.handler.sql.SolrRules
Probably indicates that you have not successfully loaded the sql module. Normally one enables modules at the very bottom of solr.in.sh not solr.sh as you said in your comments. solr.in.sh is the place where Solr intends for you to set up environment variables. One really shouldn't ever need to modify solr.sh directly, and doing so may make upgrades difficult in the future.
Check that there isn't another set of enablements in solr.in.sh that is overwriting what you've tried to do in solr.sh. Also check that you aren't enabling any other modules in other ways (several methods are shown here: https://solr.apache.org/guide/solr/latest/configuration-guide/solr-modules.html). You should pick one way of enabling modules (sysprops, solr.in.sh, solr.xml or solrconfig.xml) and then enable all modules that way to avoid having to understand any complicated precedence logic if possible. I don't know the precedence order, but I can probably figure it out if you have other modules and absolutely can't avoid using more than one method.
Also, you still haven't told us where you got the version ending in -PATCH2. This version spec sounds like you are working with some folks who are compiling their own custom Solr, so you should be sure to understand what they've changed in case they've customized something about how or where Solr loads jar files (not very likely, but one never knows).

Flink 1.12.3 upgrade triggers `NoSuchMethodError: 'scala.collection.mutable.ArrayOps scala.Predef$.refArrayOps`

When I upgrade my Flink Java app from 1.12.2 to 1.12.3, I get a new runtime error. I can strip my Flink app down to this two-liner:
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class TableEnvOnly {
    public static void main(String[] args) throws Exception {
        final StreamExecutionEnvironment streamEnv = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(streamEnv);
    }
}
This works and doesn't trigger any errors with Flink version 1.12.2. When I upgrade the Maven Flink dependencies to 1.12.3, the same simple app throws the error:
Exception in thread "main" java.lang.NoSuchMethodError: 'scala.collection.mutable.ArrayOps scala.Predef$.refArrayOps(java.lang.Object[])'
at org.apache.flink.table.planner.delegation.PlannerBase.<init>(PlannerBase.scala:118)
at org.apache.flink.table.planner.delegation.StreamPlanner.<init>(StreamPlanner.scala:47)
at org.apache.flink.table.planner.delegation.BlinkPlannerFactory.create(BlinkPlannerFactory.java:48)
at org.apache.flink.table.api.bridge.java.internal.StreamTableEnvironmentImpl.create(StreamTableEnvironmentImpl.java:143)
at org.apache.flink.table.api.bridge.java.StreamTableEnvironment.create(StreamTableEnvironment.java:113)
at org.apache.flink.table.api.bridge.java.StreamTableEnvironment.create(StreamTableEnvironment.java:85)
at simple.TableEnvOnly.main(TableEnvOnly.java:12)
FYI, I'm not using Scala directly. My Gradle dependencies are:
implementation("org.apache.flink:flink-table-planner-blink_2.12:1.12.3")
implementation("org.apache.flink:flink-clients_2.12:1.12.3")
implementation("org.apache.flink:flink-connector-kafka_2.12:1.12.3")
implementation("org.apache.flink:flink-connector-jdbc_2.12:1.12.3")
TL;DR: After upgrade to Flink 1.12.4 the problem magically disappears.
Details
After the upgrade from Flink 1.12.2 to Flink 1.12.3, the following code stopped compiling:
import scala.collection.JavaConverters._
val input = new DataStream[String](env.fromCollection(Seq("a", "b", "c").asJava))
val res = input.map(_.toUpperCase)
The Scala compiler reports the error:
could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[String]
The version of scala-compiler and scala-library is 2.12.7 - exactly as used by Flink.
To overcome the compilation problem, we provide an implicit instance of TypeInformation:
implicit val typeInfo = TypeInformation.of(classOf[String])
Then, the code compiles. Nevertheless we face the runtime failure described above:
java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
at org.apache.flink.api.scala.ClosureCleaner$.getSerializedLambda(ClosureCleaner.scala:184)
at org.apache.flink.api.scala.ClosureCleaner$.org$apache$flink$api$scala$ClosureCleaner$$clean(ClosureCleaner.scala:257)
at org.apache.flink.api.scala.ClosureCleaner$.clean(ClosureCleaner.scala:168)
at org.apache.flink.streaming.api.scala.StreamExecutionEnvironment.scalaClean(StreamExecutionEnvironment.scala:859)
at org.apache.flink.streaming.api.scala.DataStream.clean(DataStream.scala:1189)
at org.apache.flink.streaming.api.scala.DataStream.map(DataStream.scala:623)
As mentioned, the upgrade to Flink 1.12.4 helps - both the compilation and the runtime failures disappear.
My guess is that some Flink 1.12.3 jars have been accidentally compiled with a wrong Scala version. The subsequent release 1.12.4 has been compiled with the correct Scala version.
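If you want to verify the diagnosis, one quick check is to print the Scala runtime that actually ends up on the application classpath and compare it with the _2.12 suffix of the Flink artifacts. This is only a diagnostic sketch using standard Scala library entry points; if the static forwarder is not available in your setup, scala.util.Properties$.MODULE$.versionNumberString() gives the same value.

public class ScalaVersionCheck {
    public static void main(String[] args) {
        // Prints e.g. "2.12.7"; a mismatch with what the Flink _2.12 jars were built against
        // shows up at runtime as NoSuchMethodError on methods like Predef.refArrayOps.
        System.out.println(scala.util.Properties.versionNumberString());

        // Also useful: which jar scala.Predef was actually loaded from.
        System.out.println(scala.Predef.class.getProtectionDomain().getCodeSource().getLocation());
    }
}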

Ontapi Runtime Error including openllet 2.6.3

Adding openllet 2.6.3 to a Maven project that includes Ontapi throws the following error:
Exception in thread "main" java.lang.NoSuchFieldError: TSV
at ru.avicomp.ontapi.OntFormat.<clinit>(OntFormat.java:61)
at ru.avicomp.ontapi.OntologyFactoryImpl$ONTLoaderImpl.guessFormat(OntologyFactoryImpl.java:752)
at ru.avicomp.ontapi.OntologyFactoryImpl$ONTLoaderImpl.getSupportedFormats(OntologyFactoryImpl.java:774)
at ru.avicomp.ontapi.OntologyFactoryImpl$ONTLoaderImpl.read(OntologyFactoryImpl.java:795)
at ru.avicomp.ontapi.OntologyFactoryImpl$ONTLoaderImpl.readGraph(OntologyFactoryImpl.java:725)
at ru.avicomp.ontapi.OntologyFactoryImpl$ONTLoaderImpl.loadGraph(OntologyFactoryImpl.java:580)
at ru.avicomp.ontapi.OntologyFactoryImpl$ONTLoaderImpl.load(OntologyFactoryImpl.java:286)
at ru.avicomp.ontapi.OntologyFactoryImpl.loadOWLOntology(OntologyFactoryImpl.java:109)
at ru.avicomp.ontapi.OntologyFactoryImpl.loadOWLOntology(OntologyFactoryImpl.java:58)
at ru.avicomp.ontapi.OntologyManagerImpl.load(OntologyManagerImpl.java:1678)
at ru.avicomp.ontapi.OntologyManagerImpl.load(OntologyManagerImpl.java:1644)
at ru.avicomp.ontapi.OntologyManagerImpl.loadOntologyFromOntologyDocument(OntologyManagerImpl.java:1587)
at ru.avicomp.ontapi.OntologyManager.loadOntologyFromOntologyDocument(OntologyManager.java:243)
at ru.avicomp.ontapi.OntologyManager.loadOntologyFromOntologyDocument(OntologyManager.java:259)
at ru.avicomp.ontapi.OntologyManager.loadOntologyFromOntologyDocument(OntologyManager.java:58)
The code tested is the following:
OWLOntologyManager manager = OntManagers.createONT();
OWLDataFactory factory = manager.getOWLDataFactory();
OWLOntology ontology = manager.loadOntologyFromOntologyDocument(
        new File("ontologies/E1G1.owl"));
My pom file contains the following dependencies: ontapi 1.1.0, jena-arq 3.6.0, openllet-pellint 2.6.3.
Ensure you only have one OWLAPI version on the classpath. The stack trace implies there are at least two versions.
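To confirm which OWLAPI wins at runtime, you can print the version OWLAPI reports about itself and the jar its core interface was loaded from; org.semanticweb.owlapi.util.VersionInfo ships with OWLAPI, the rest is standard Java. A small sketch:

import org.semanticweb.owlapi.model.OWLOntologyManager;
import org.semanticweb.owlapi.util.VersionInfo;

public class OwlapiVersionCheck {
    public static void main(String[] args) {
        // The OWLAPI version that Ontapi and openllet actually see at runtime.
        System.out.println(VersionInfo.getVersionInfo().getVersion());

        // The jar the core interface was loaded from; if this is not the version you expect,
        // run mvn dependency:tree to find which dependency drags in the second owlapi.
        System.out.println(OWLOntologyManager.class
                .getProtectionDomain().getCodeSource().getLocation());
    }
}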

Apache CXF 2.7.11 on WebSphere 8.5

I have an application that exposes web services for clients via CXF. This side of things works perfectly.
The application also needs to act as a client itself and contact other servers; this is where I am running into problems.
With "Parent First" classloading I get this:
Caused by: javax.xml.ws.WebServiceException: Error: Maintain Session is enabled but none of the session properties (Cookies, Over-written URL) are returned.
at org.apache.axis2.jaxws.ExceptionFactory.createWebServiceException(ExceptionFactory.java:173) ~[org.apache.axis2.jar:na]
at org.apache.axis2.jaxws.ExceptionFactory.makeWebServiceException(ExceptionFactory.java:70) ~[org.apache.axis2.jar:na]
at org.apache.axis2.jaxws.ExceptionFactory.makeWebServiceException(ExceptionFactory.java:118) ~[org.apache.axis2.jar:na]
at org.apache.axis2.jaxws.BindingProvider.setupSessionContext(BindingProvider.java:355) ~[org.apache.axis2.jar:na]
at org.apache.axis2.jaxws.BindingProvider.checkMaintainSessionState(BindingProvider.java:322) ~[org.apache.axis2.jar:na]
at org.apache.axis2.jaxws.client.proxy.JAXWSProxyHandler.invokeSEIMethod(JAXWSProxyHandler.java:393) ~[org.apache.axis2.jar:na]
at ...
With "Parent last" classloading the application can't even expose its own services:
[23/06/15 15:33:12:985 BST] 000002d3 servlet E com.ibm.ws.webcontainer.servlet.ServletWrapper service Uncaught service() exception thrown by servlet cxf: java.lang.VerifyError: JVMVRFY013 class loading constraint violated; class=org/apache/cxf/jaxb/attachment/JAXBAttachmentUnmarshaller, method=getAttachmentAsDataHandler(Ljava/lang/String;)Ljavax/activation/DataHandler;, pc=0
at java.lang.J9VMInternals.verifyImpl(Native Method)
at java.lang.J9VMInternals.verify(J9VMInternals.java:85)
at java.lang.J9VMInternals.initialize(J9VMInternals.java:162)
I have tried disabling WebSphere's own JAX-WS engine via the WAR's manifest.mf, and no matter what I try with "Parent last" classloading I always get some error like the above: a different class depending on which JAR I have moved or replaced, but always a verify error.
I have also gone through the official Apache documentation, various IBM guides, and countless blog and forum posts, to no avail. I am at my wit's end with this.
The same WAR runs perfectly on Tomcat, JBoss and WebLogic.
This is a complete list of all third-party JAR files:
activation-1.1.jar
antisamy-1.4.3.jar
aopalliance-1.0.jar
asm-3.3.1.jar
batik-css-1.7.jar
batik-ext-1.7.jar
batik-util-1.7.jar
bcprov-jdk15-1.46.jar
bsh-core-2.0b4.jar
commons-beanutils-1.7.0.jar
commons-beanutils-core-1.7.0.jar
commons-codec-1.3.jar
commons-collections-3.2.jar
commons-configuration-1.5.jar
commons-dbutils-1.6.jar
commons-digester-1.8.jar
commons-fileupload-1.3.1.jar
commons-httpclient-3.1.jar
commons-io-2.4.jar
commons-jexl-2.1.1.jar
commons-lang-2.4.jar
commons-logging-1.1.1.jar
cxf-api-2.7.11.jar
cxf-rt-bindings-soap-2.7.11.jar
cxf-rt-bindings-xml-2.7.11.jar
cxf-rt-core-2.7.11.jar
cxf-rt-databinding-jaxb-2.7.11.jar
cxf-rt-frontend-jaxws-2.7.11.jar
cxf-rt-frontend-simple-2.7.11.jar
cxf-rt-transports-http-2.7.11.jar
cxf-rt-ws-addr-2.7.11.jar
cxf-rt-ws-policy-2.7.11.jar
dom4j-1.6.1.jar
esapi-2.0.1.jar
FastInfoset-1.0.2.jar
geronimo-javamail_1.4_spec-1.7.1.jar
hamcrest-all-1.3.jar
hsqldb-1.8.0.10.jar
httpclient-4.3.6.jar
httpcore-4.3.3.jar
jaxen-1.1-beta-8.jar
jaxrpc-api-1.1.jar
jaxrpc-impl-1.1.3_01.jar
jaxrpc-spi-1.1.3_01.jar
joda-time-2.2.jar
js-1.7R2.jar
log4j-1.2.16.jar
logback-classic-0.9.21.jar
logback-core-0.9.21.jar
mail-1.4.7.jar
mailapi-1.4.3.jar
nekohtml-1.9.12.jar
not-yet-commons-ssl-0.3.9.jar
opensaml-2.6.1.jar
openws-1.5.1.jar
quartz-1.8.6.jar
saaj-api-1.3.5.jar
saaj-impl-1.3.jar
serializer-2.7.1.jar
slf4j-api-1.6.0.jar
slf4j-log4j12-1.6.0.jar
spring-aop-3.2.6.RELEASE.jar
spring-beans-3.2.6.RELEASE.jar
spring-context-3.2.6.RELEASE.jar
spring-core-3.2.6.RELEASE.jar
spring-expression-3.2.6.RELEASE.jar
spring-web-3.2.6.RELEASE.jar
stax2-api-3.1.4.jar
velocity-1.7.jar
vuelinkcore-20.2.3.jar
vueservlet-20.2.3.jar
woodstox-core-asl-4.2.1.jar
wsdl4j-1.6.3.jar
xml-apis-ext-1.3.04.jar
xml-resolver-1.2.jar
xmlsec-1.5.6.jar
xmltooling-1.4.1.jar
xom-1.1.jar
Does anyone know how to get Apache CXF 2.7.11 on WebSphere 8.5 to be able to act as a server and as a client?
We had the same problem using WAS 8.5 (JDK 1.7, 64-bit), CXF, JAXB & XMLBeans:
JAXB is the default XML/Java binding used by CXF. WAS 8.5 uses an endorsed JAXB API definition, version 2.2.2 (in <WebSphere-dir>\AppServer\endorsed_apis\jaxb-api.jar), and the standard implementation (in the JRE rt.jar).
XMLBeans 2.4.x bundles org.w3c.* classes that are already present in WAS (<WebSphere-dir>\AppServer\java_1.7_64\jre\lib\xml.jar).
In the end we solved it like this:
first following the instructions here:
http://www.ibm.com/developerworks/websphere/library/techarticles/1001_thaker/1001_thaker.html
then deleting the following JARs from our deployment:
activation-*,
stax-api-* (but not stax2-api!),
jaxb-api-*,
jaxb-impl-*,
xercesImpl-*,
xml-apis-*
and finally deleting all org.w3c.* classes inside xmlbeans-2.x.jar.
This is a complete list of all third-party JAR files we are successfully using:
cxf-*-2.7.11.jar
dom4j-1.6.1.jar
ehcache-2.8.2.jar
ehcache-core-2.5.1.jar
jettison-1.1.jar
neethi-3.0.3.jar
ognl-3.0.6.jar
opensaml-2.6.1.jar
openws-1.5.1.jar
spring-*-3.2.13.RELEASE.jar
stax2-api-3.1.1.jar
woodstox-core-asl-4.2.1.jar
wsdl4j-1.6.3.jar
wss4j-1.6.10.jar
xml-resolver-1.2.jar
xmlbeans-2.3.0-now3c.jar
xmlpull-1.1.3.1.jar
xmlschema-core-2.1.0.jar
xmlsec-1.5.4.jar
xmltooling-1.4.1.jar
xpp3_min-1.1.4c.jar
xstream-1.4.7.jar
We hope this is helpful.
PARENT_LAST:
Maybe you have a third-party library in your deployment with the javax.activation.DataHandler class. Try removing activation-1.1.jar from your deployment.
This post can be useful for you: LinkageError whilst trying to invoke CXF/SOAP webservice
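Before deciding which jar to drop, it can help to list every copy of javax.activation.DataHandler the application classloader can see. A small diagnostic along these lines (the class name is just for illustration; run the lookup from inside the web app, e.g. in a startup servlet, so WebSphere's PARENT_LAST application classloader is the one being inspected):

import java.net.URL;
import java.util.Enumeration;

public class DataHandlerCopies {
    public static void main(String[] args) throws Exception {
        // Lists every location on the effective classpath that provides DataHandler;
        // more than one entry means the WAR and the server are both supplying it.
        Enumeration<URL> copies = Thread.currentThread().getContextClassLoader()
                .getResources("javax/activation/DataHandler.class");
        while (copies.hasMoreElements()) {
            System.out.println(copies.nextElement());
        }
    }
}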

ParseException Unknown function termfreq in FunctionQuery

What is the right syntax if my query (linked below) is formulated incorrectly for Apache Solr 3.5, and do I have to enable anything specific in solrconfig.xml and schema.xml?
Using Apache Solr 3.5 and receiving a ParseException: Unknown function termfreq in FunctionQuery('tf(text,amplifiers)')
http://localhost:8983/solr/select/?fl=score,documentPageId&defType=func&q=tf%28text,amplifiers%29
I am following the syntax from other websites because I can't work it out from the documentation on the wiki: http://wiki.apache.org/solr/FunctionQuery
It won't work: the function query tf(field, term) that you are attempting to use is not available in 3.5; browse through ValueSourceParser if you want to double-check. You need to get a Solr 4.x nightly build from trunk and use it, but beware that Solr 4.x is not stable and released yet, and there will be a significant level of API changes compared to 3.5.
If you are interested in poking into the code, you could, for instance (if you are using Maven), modify pom.xml to get the artifacts from trunk and browse the source code starting from ValueSourceParser; that should let you know whether those relevance functions exist and how they are implemented.
For example, you will see parsers related to the term vector function queries:
// From the Solr 4 `ValueSourceParser` trunk source code
addParser("tf", new ValueSourceParser() {
    @Override
    public ValueSource parse(FunctionQParser fp) throws ParseException {
        TInfo tinfo = parseTerm(fp);
        return new TFValueSource(tinfo.field, tinfo.val, tinfo.indexedField, tinfo.indexedBytes);
    }
});
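Once you are on a Solr 4.x build that ships the tf() parser, the request from the question can be issued as-is over HTTP or through SolrJ. A hedged SolrJ sketch of the same query; HttpSolrServer is the SolrJ 4.x client class, and the URL, field names and term are taken from the question:

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;

public class TfQuerySketch {
    public static void main(String[] args) throws Exception {
        HttpSolrServer solr = new HttpSolrServer("http://localhost:8983/solr");

        SolrQuery q = new SolrQuery();
        q.set("defType", "func");          // parse q as a function query, as in the original URL
        q.setQuery("tf(text,amplifiers)"); // only works once the tf() value source exists (4.x)
        q.setFields("score", "documentPageId");

        QueryResponse rsp = solr.query(q);
        System.out.println(rsp.getResults());
    }
}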
