PyFlink - Issue using Scala UDF in JAR - apache-flink

I'm trying to register a Scala UDF in PyFlink using an external JAR as follows, but I get the error below.
Scala UDF:
package com.dummy
import org.apache.flink.table.functions.ScalarFunction
class dummyTransform(factor: Int) extends ScalarFunction {
  def eval(s: String): Int = {
    s.hashCode()
  }
}
build.sbt:
name := "hello_scala_for_flink"
version := "0.1"
scalaVersion := "2.12.11"
libraryDependencies += "org.apache.flink" % "flink-table-common" % "1.11.2" % "provided"
assembly.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")
Scala project structure:
hello_scala_for_flink/
  project/
  src/
    main/
      resources/
      scala/
        com.dummy/
          dummyTransform
          hello_scala
    test/
  target/
  build.sbt
shell:
cd hello_scala_for_flink
sbt assembly
cp ./target/scala-2.12/hello_scala_for_flink-assembly-0.1.jar /Users/py-r/opt/anaconda3/envs/venv_pyflink_37/lib/python3.7/site-packages/pyflink/lib/
Python:
from pyflink.dataset import ExecutionEnvironment
from pyflink.table import TableConfig, BatchTableEnvironment

exec_env = ExecutionEnvironment.get_execution_environment()
#exec_env = StreamExecutionEnvironment.get_execution_environment()
exec_env.set_parallelism(1)
t_config = TableConfig()
table_env = BatchTableEnvironment.create(exec_env, t_config)
table_env.register_java_function("hash_code", "com.dummy.dummyTransform")
Error (incl. after restarting Anaconda):
Py4JJavaError: An error occurred while calling o12.newInstance.
: java.lang.InstantiationException: com.dummy.dummyTransform
at java.base/java.lang.Class.newInstance(Class.java:598)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at org.apache.flink.api.python.shaded.py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at org.apache.flink.api.python.shaded.py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at org.apache.flink.api.python.shaded.py4j.Gateway.invoke(Gateway.java:282)
at org.apache.flink.api.python.shaded.py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at org.apache.flink.api.python.shaded.py4j.commands.CallCommand.execute(CallCommand.java:79)
at org.apache.flink.api.python.shaded.py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.base/java.lang.Thread.run(Thread.java:832)
Caused by: java.lang.NoSuchMethodException: com.dummy.dummyTransform.<init>()
at java.base/java.lang.Class.getConstructor0(Class.java:3427)
at java.base/java.lang.Class.newInstance(Class.java:585)
... 11 more
Versions:
jdk = 1.8.0_151.jdk
scala = 2.12.11
python = 3.7
apache-beam = 2.19.0
apache-flink = 1.11.2
Any idea what the issue might be?
Thanks for your support.

Looks like I found the issue myself: apparently a no-argument constructor was required in the above code:
class dummyTransform(factor: Int) extends ScalarFunction {
  def eval(s: String): Int = {
    s.hashCode() * factor
  }
  def this() = this(1)
}
Also, I changed to Scala 2.11.12 due to another error. Now everything seems to work. Awesome!
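In hindsight the stack trace makes sense: the Py4J gateway instantiates the UDF class reflectively through its no-argument constructor, so a class with only dummyTransform(factor: Int) cannot be constructed. A minimal sketch of the kind of call that fails (illustrative only, not Flink's actual code):

// Reflective instantiation needs a nullary constructor; without `def this()`
// this throws NoSuchMethodException, which surfaces as the
// InstantiationException seen in the Py4J error above.
val udf = Class.forName("com.dummy.dummyTransform")
  .getDeclaredConstructor()
  .newInstance()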

Related

SparkSQL + MS SQL Server: "No suitable driver" message after compiling

build.sbt
name := "BigData"
version := "0.1"
scalaVersion := "2.12.7"
libraryDependencies += "com.github.tototoshi" %% "scala-csv" % "1.3.5"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0"
// https://mvnrepository.com/artifact/org.apache.spark/spark-sql
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.0"
// https://mvnrepository.com/artifact/com.microsoft.sqlserver/mssql-jdbc
libraryDependencies += "com.microsoft.sqlserver" % "mssql-jdbc" % "6.1.0.jre8" % Test
SparkMSSQL.scala
import org.apache.spark.sql.SparkSession
object SparkMSSQL {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder.master("local[*]")
      .appName("Simple Application")
      .getOrCreate()
    val url = "jdbc:sqlserver://localhost;databaseName=scalatest;integratedSecurity=true"
    // Define database table to load into DataFrame
    val jdbcDbTable = "dbo.user_profiles"
    val df = spark
      .read
      .format("jdbc")
      .option("url", url)
      .option("dbtable", jdbcDbTable)
      .load()
    df.printSchema()
  }
}
Error when running, after compiling:
Exception in thread "main" java.sql.SQLException: No suitable driver
at java.sql.DriverManager.getDriver(DriverManager.java:315)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.$anonfun$driverClass$2(JDBCOptions.scala:105)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:105)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:35)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:318)
at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
Please advise, what is wrong with my code?
First, you have your JDBC driver in the test scope, so the jar is probably not loaded at runtime. Also, Spark needs the driver class name to create the JDBC connection, so try adding the following option to the DataFrame initializer:
.option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")

Slick can't connect to mssql database

I would like to use Slick (3.2.3) to connect to an MSSQL database.
Currently, my project is as follows.
In application.conf, I have
somedbname = {
  driver = "slick.jdbc.SQLServerProfile$"
  db {
    host = "somehost"
    port = "someport"
    databaseName = "Recupel.Datawarehouse"
    url = "jdbc:sqlserver://"${somedbname.db.host}":"${somedbname.db.port}";databaseName="${somedbname.db.databaseName}";"
    user = "someuser"
    password = "somepassword"
  }
}
The "somehost" looks like XX.X.XX.XX where X's are numbers.
My build.sbt contains
name := "test-slick"
version := "0.1"
scalaVersion in ThisBuild := "2.12.7"
libraryDependencies ++= Seq(
  "com.typesafe.slick" %% "slick" % "3.2.3",
  "com.typesafe.slick" %% "slick-hikaricp" % "3.2.3",
  "org.slf4j" % "slf4j-nop" % "1.6.4",
  "com.microsoft.sqlserver" % "mssql-jdbc" % "7.0.0.jre10"
)
The file with the "main" object contains
import slick.basic.DatabaseConfig
import slick.jdbc.JdbcProfile
import slick.jdbc.SQLServerProfile.api._
import scala.concurrent.Await
import scala.concurrent.duration._
val dbConfig: DatabaseConfig[JdbcProfile] = DatabaseConfig.forConfig("somedbname")
val db: JdbcProfile#Backend#Database = dbConfig.db
def main(args: Array[String]): Unit = {
try {
val future = db.run(sql"SELECT * FROM somettable".as[(Int, String, String, String, String,
String, String, String, String, String, String, String)])
println(Await.result(future, 10.seconds))
} finally {
db.close()
}
}
}
This, according to all the documentation that I could find, should be enough to connect to the database. However, when I run this, I get
[error] (run-main-0) java.sql.SQLTransientConnectionException: somedbname.db - Connection is not available, request timed out after 1004ms.
[error] java.sql.SQLTransientConnectionException: somedbname.db - Connection is not available, request timed out after 1004ms.
[error] at com.zaxxer.hikari.pool.HikariPool.createTimeoutException(HikariPool.java:548)
[error] at com.zaxxer.hikari.pool.HikariPool.getConnection(HikariPool.java:186)
[error] at com.zaxxer.hikari.pool.HikariPool.getConnection(HikariPool.java:145)
[error] at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:83)
[error] at slick.jdbc.hikaricp.HikariCPJdbcDataSource.createConnection(HikariCPJdbcDataSource.scala:14)
[error] at slick.jdbc.JdbcBackend$BaseSession.<init>(JdbcBackend.scala:453)
[error] at slick.jdbc.JdbcBackend$DatabaseDef.createSession(JdbcBackend.scala:46)
[error] at slick.jdbc.JdbcBackend$DatabaseDef.createSession(JdbcBackend.scala:37)
[error] at slick.basic.BasicBackend$DatabaseDef.acquireSession(BasicBackend.scala:249)
[error] at slick.basic.BasicBackend$DatabaseDef.acquireSession$(BasicBackend.scala:248)
[error] at slick.jdbc.JdbcBackend$DatabaseDef.acquireSession(JdbcBackend.scala:37)
[error] at slick.basic.BasicBackend$DatabaseDef$$anon$2.run(BasicBackend.scala:274)
[error] at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1135)
[error] at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
[error] at java.base/java.lang.Thread.run(Thread.java:844)
[error] Nonzero exit code: 1
Perhaps related and also annoying: when I run this code a second (and subsequent) time, I get the following error instead:
Failed to get driver instance for jdbcUrl=jdbc:sqlserver://[...]
which forces me to kill and reload sbt each time.
What am I doing wrong? Worth noting: I can connect to the database with the same credentials from software like Valentina.
As suggested by @MarkRotteveel, and following this link, I found a solution.
First, I explicitly set the driver, adding the line
driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
in the db dictionary, after password = "somepassword".
Secondly, the default timeout (one second) appears to be too short for my purposes, so I added the line
connectionTimeout = "30 seconds"
after the previous driver line, still in the db dictionary.
Now it works.
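For reference, the resulting application.conf block would look roughly like this (a sketch assembled from the changes described above):

somedbname = {
  driver = "slick.jdbc.SQLServerProfile$"
  db {
    host = "somehost"
    port = "someport"
    databaseName = "Recupel.Datawarehouse"
    url = "jdbc:sqlserver://"${somedbname.db.host}":"${somedbname.db.port}";databaseName="${somedbname.db.databaseName}";"
    user = "someuser"
    password = "somepassword"
    driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    connectionTimeout = "30 seconds"
  }
}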

Apache Flink: Class not resolvable through given classloader

I'm trying to consume some Kafka topics through Flink streams.
Below is the code for the stream.
object VehicleFuelEventStream {
  def sink[IN](hosts: String, table: String, ds: DataStream[IN]): CassandraSink[IN] = {
    CassandraSink.addSink(ds)
      .setQuery(s"INSERT INTO global.$table values (id, vehicle_id, consumption, from, to, version) values (?, ?, ?, ?, ?, ?);")
      .setClusterBuilder(new ClusterBuilder() {
        override def buildCluster(builder: Cluster.Builder): Cluster = {
          builder.addContactPoints(hosts.split(","): _*).build()
        }
      }).build()
  }

  def dataStream[T: TypeInformation](env: StreamExecutionEnvironment,
                                     source: SourceFunction[String],
                                     flinkConfigs: FlinkStreamProcessorConfigs,
                                     windowFunction: WindowFunction[VehicleFuelEvent, (String, Double, SortedMap[Date, Double], Long, Long), String, TimeWindow]): DataStream[(String, Double, SortedMap[Date, Double], Long, Long)] = {
    val fuelEventStream = env.addSource(source)
      .map(e => EventSerialiserRepo.fromJsonStr[VehicleFuelEvent](e))
      .filter(_.isSuccess)
      .map(_.get)
      .assignTimestampsAndWatermarks(VehicleFuelEventTimestampExtractor.withWatermarkDelay(flinkConfigs.windowWatermarkDelay))
      .keyBy(_.vehicleId) // key by VehicleId
      .timeWindow(Time.minutes(flinkConfigs.windowTimeWidth.toMinutes))
      .apply(windowFunction)
    fuelEventStream
  }
}
The stream is triggered when the Play framework creates its dependencies on startup via Google Guice, as below.
@Singleton
class VehicleEventKafkaConsumer @Inject()(conf: Configuration,
                                          lifecycle: ApplicationLifecycle,
                                          repoFactory: StorageFactory,
                                          cache: CacheApi,
                                          cassandra: Cassandra,
                                          fleetConfigs: FleetConfigManager) {
  private val kafkaConfigs = KafkaConsumerConfigs(conf)
  private val flinkConfigs = FlinkStreamProcessorConfigs(conf)
  private val topicsWithClassTags = getClassTagsForTopics
  private val cassandraConfigs = CassandraConfigs(conf)
  private val repoCache = mutable.HashMap.empty[String, CachedSubjectStatePersistor]
  private val props = new Properties()
  // add props

  private val env = StreamExecutionEnvironment.getExecutionEnvironment
  env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)
  env.enableCheckpointing(flinkConfigs.checkpointingInterval.toMillis)

  if (flinkConfigs.enabled) {
    topicsWithClassTags.toList
      .map {
        case (topic, tag) if tag.runtimeClass.isAssignableFrom(classOf[VehicleFuelEvent]) =>
          Logger.info(s"starting for - $topic and $tag")
          val source = new FlinkKafkaConsumer09[String](topic, new SimpleStringSchema(), props)
          val fuelEventStream = VehicleFuelEventStream.dataStream[String](env, source, flinkConfigs, new VehicleFuelEventWindowFunction)
          VehicleFuelEventStream.sink(cassandraConfigs.hosts, flinkConfigs.cassandraTable, fuelEventStream)
        case (topic, _) =>
          Logger.info(s"no stream processor found for topic $topic")
      }
    Logger.info("starting flink stream processors")
    env.execute("flink vehicle event processors")
  } else
    Logger.info("Flink stream processor is disabled!")
}
I get the errors below on application startup.
03/13/2018 05:47:23 TriggerWindow(TumblingEventTimeWindows(1800000), ListStateDescriptor{serializer=org.apache.flink.api.common.typeutils.base.ListSerializer@e899e41f}, EventTimeTrigger(), WindowedStream.apply(WindowedStream.scala:582)) -> Sink: Cassandra Sink(4/4) switched to RUNNING
2018-03-13 05:47:23,262 - [info] o.a.f.r.t.Task - TriggerWindow(TumblingEventTimeWindows(1800000), ListStateDescriptor{serializer=org.apache.flink.api.common.typeutils.base.ListSerializer@e899e41f}, EventTimeTrigger(), WindowedStream.apply(WindowedStream.scala:582)) -> Sink: Cassandra Sink (3/4) (d5124be9bcef94bd0e305c4b4546b055) switched from RUNNING to FAILED.
org.apache.flink.streaming.runtime.tasks.StreamTaskException: Cannot load user class: streams.fuelevent.VehicleFuelEventWindowFunction
ClassLoader info: URL ClassLoader:
Class not resolvable through given classloader.
at org.apache.flink.streaming.api.graph.StreamConfig.getStreamOperator(StreamConfig.java:232)
at org.apache.flink.streaming.runtime.tasks.OperatorChain.<init>(OperatorChain.java:95)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:231)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:718)
at java.lang.Thread.run(Thread.java:745)
2018-03-13 05:47:23,262 - [info] o.a.f.r.t.Task - TriggerWindow(TumblingEventTimeWindows(1800000), ListStateDescriptor{serializer=org.apache.flink.api.common.typeutils.base.ListSerializer@e899e41f}, EventTimeTrigger(), WindowedStream.apply(WindowedStream.scala:582)) -> Sink: Cassandra Sink (2/4) (6b3de15a4f6
.....
org.apache.flink.streaming.runtime.tasks.StreamTaskException: Could not instantiate outputs in order.
at org.apache.flink.streaming.api.graph.StreamConfig.getOutEdgesInOrder(StreamConfig.java:394)
at org.apache.flink.streaming.runtime.tasks.OperatorChain.<init>(OperatorChain.java:103)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:231)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:718)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: streams.fuelevent.VehicleFuelEventStream$$anonfun$4
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader.loadClass(FlinkUserCodeClassLoaders.java:128)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
2018-03-13 05:47:23,336 - [info] o.a.f.r.t.Task - Source: Custom Source -> Map -> Filter -> Map -> Timestamps/Watermarks (4/4) (97b2313c985592fdec0ac4f7fba8062f) switched from RUNNING to FAILED.
dependencies
// kafka
libraryDependencies += "org.apache.kafka" %% "kafka" % "1.0.0"
//flink
libraryDependencies += "org.apache.flink" %% "flink-streaming-scala" % "1.4.2"
libraryDependencies += "org.apache.flink" %% "flink-connector-kafka-0.9" % "1.4.0"
libraryDependencies += "org.apache.flink" %% "flink-connector-cassandra" % "1.4.2"
Any help solving this issue is appreciated.
Looks like Flink can't find the class streams.fuelevent.VehicleFuelEventStream.
Is that class on Flink's classpath or inside your JAR file?
The Flink docs could shed some light on this: https://ci.apache.org/projects/flink/flink-docs-release-1.4/monitoring/debugging_classloading.html
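If the missing classes live in your application JAR rather than in Flink's lib directory, a common remedy is to submit a fat JAR that bundles the user code. A sketch, assuming an sbt build like the others in this thread (the plugin version and the provided scoping are assumptions, not from the original post):

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.5")

// build.sbt: keep the Flink runtime out of the fat jar, bundle everything else,
// then run `sbt assembly` and submit the resulting *-assembly-*.jar to the cluster
libraryDependencies += "org.apache.flink" %% "flink-streaming-scala" % "1.4.2" % "provided"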

Functor misses implicit value for parameter instance (only after sbt clean)

I did some experiments with Kittens (https://github.com/milessabin/kittens) and have issues compiling my code. I receive the following error.
[error] ...danirey\scala\kittens\Kittens.scala:23: could not find implicit value for parameter instance: cats.Functor[danirey.scala.kittens.AdtDefns.Tree]
[error] val funct = Functor[Tree]
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
The complete file is as follows:
package danirey.scala.kittens

/**
  * @author Dani
  */
import cats.Functor
import cats.syntax.AllSyntax
import cats.derived.functor._
import legacy._
import cats.derived.iterable.legacy._
import org.typelevel.discipline.scalatest.Discipline
import shapeless.cachedImplicit

object Kittens extends App {
  val ft = new FunctorExperiment()
  ft.print()
}

class FunctorExperiment extends AllSyntax {
  import AdtDefns._

  def print(): Unit = {
    val funct = Functor[Tree]
    val tree: Tree[String] = Node(
      Leaf("Reto"),
      Node(
        Leaf("Sandra"),
        Leaf("Mike")
      )
    )
    println(funct.map(tree)(_.length))
  }
}
I have used almost identical code in a ScalaTest, which compiles without any issues.
package danirey.scala.kittens

import cats.Functor
import cats.syntax.AllSyntax
import cats.derived.functor._
import legacy._
import cats.derived.iterable.legacy._
import org.scalatest.FunSuite
import org.typelevel.discipline.scalatest.Discipline
import shapeless.cachedImplicit

/**
  * @author Dani
  */
class FunctorExperimentTest extends FunSuite with Discipline with AllSyntax {
  import AdtDefns._

  test("functors experiment") {
    val funct = Functor[Tree]
    val tree: Tree[String] = Node(
      Leaf("Reto"),
      Node(
        Leaf("Sandra"),
        Leaf("Mike")
      )
    )
    println(funct.map(tree)(_.length))
  }
}
My build.sbt looks as follows
name := "shapeless-experiments"
version := "1.0-SNAPSHOT"
scalaVersion := "2.11.8"
exportJars := true
libraryDependencies ++= Seq(
  "com.chuusai" % "shapeless_2.11" % "2.3.0",
  "org.typelevel" % "kittens_2.11" % "1.0.0-M2",
  "org.scalatest" %% "scalatest" % "3.0.0-M7" % "test"
)
scalacOptions ++= Seq(
  "-feature",
  "-language:higherKinds",
  "-language:implicitConversions",
  "-unchecked"
)
The most interesting thing is that it compiles as part of an incremental compile.
If I comment out lines 16, 23 and 32, execute "sbt compile",
then remove the comments again and execute "sbt compile" or "sbt package", it compiles and I can even execute the program. But as soon as I run "sbt clean", it will not compile anymore.
The AdtDefns object is basically a copy of https://github.com/milessabin/kittens/blob/master/core/src/test/scala/cats/derived/adtdefns.scala
The relevant part is
object AdtDefns {
  sealed trait Tree[T]
  final case class Leaf[T](t: T) extends Tree[T]
  final case class Node[T](l: Tree[T], r: Tree[T]) extends Tree[T]
}
PS: Would be nice if someone could create a tag for scala-kittens
@DaniRey, we use kittens in our projects, but only the sequence part. I am not aware of any project that uses kittens derivation. What's your use case?
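One workaround sometimes suggested when derivation only resolves in incremental builds is to materialize the instance once with shapeless.cachedImplicit (already imported in the file above) rather than summoning Functor[Tree] inline at each call site. A sketch, assuming kittens can derive the instance at all; FunctorWorkaround and treeFunctor are hypothetical names:

object FunctorWorkaround {
  import cats.Functor
  import cats.derived.functor._, legacy._ // same derivation imports as the original file
  import shapeless.cachedImplicit
  import AdtDefns._

  // Derive the Functor once and cache it; call sites then pick up this
  // stable implicit instead of re-deriving at every summon site.
  implicit val treeFunctor: Functor[Tree] = cachedImplicit
}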

CXF2.7.2 + Weblogic 12c + Java 1.7

There is an issue with running a CXF application on WebLogic 12c. The exception is as follows:
org.apache.jcp.xml.dsig.internal.dom.DOMXMLSignatureFactory cannot be cast to javax.xml.crypto.dsig.XMLSignatureFactory
The interesting thing here is that DOMXMLSignatureFactory extends XMLSignatureFactory. I've tried to debug and haven't found the cause. The XMLSec 1.5.3 code fails on the following line:
XMLSignatureFactory fac = (XMLSignatureFactory)ps.newInstance(null);
private static XMLSignatureFactory findInstance(String mechanismType,
                                                Provider provider) {
    if (provider == null) {
        provider = getProvider("XMLSignatureFactory", mechanismType);
    }
    Provider.Service ps = provider.getService("XMLSignatureFactory",
                                              mechanismType);
    if (ps == null) {
        throw new NoSuchMechanismException("Cannot find " + mechanismType +
            " mechanism type");
    }
    try {
        XMLSignatureFactory fac = (XMLSignatureFactory)ps.newInstance(null);
        fac.mechanismType = mechanismType;
        fac.provider = provider;
        return fac;
    } catch (NoSuchAlgorithmException nsae) {
        throw new NoSuchMechanismException("Cannot find " + mechanismType +
            " mechanism type", nsae);
    }
}
The instantiated class org.apache.jcp.xml.dsig.internal.dom.DOMXMLSignatureFactory is declared as:
public final class DOMXMLSignatureFactory extends XMLSignatureFactory {
    ...
}
Any ideas?
Full stacktrace:
org.apache.cxf.interceptor.Fault: org.apache.jcp.xml.dsig.internal.dom.DOMXMLSignatureFactory cannot be cast to javax.xml.crypto.dsig.XMLSignatureFactory
at org.apache.cxf.ws.security.wss4j.policyhandlers.AsymmetricBindingHandler.doSignBeforeEncrypt(AsymmetricBindingHandler.java:195)
at org.apache.cxf.ws.security.wss4j.policyhandlers.AsymmetricBindingHandler.handleBinding(AsymmetricBindingHandler.java:98)
at org.apache.cxf.ws.security.wss4j.PolicyBasedWSS4JOutInterceptor$PolicyBasedWSS4JOutInterceptorInternal.handleMessage(PolicyBasedWSS4JOutInterceptor.java:165)
at org.apache.cxf.ws.security.wss4j.PolicyBasedWSS4JOutInterceptor$PolicyBasedWSS4JOutInterceptorInternal.handleMessage(PolicyBasedWSS4JOutInterceptor.java:89)
at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:271)
at org.apache.cxf.endpoint.ClientImpl.doInvoke(ClientImpl.java:530)
at org.apache.cxf.endpoint.ClientImpl.invoke(ClientImpl.java:463)
at org.apache.cxf.endpoint.ClientImpl.invoke(ClientImpl.java:366)
at org.apache.cxf.endpoint.ClientImpl.invoke(ClientImpl.java:319)
at org.apache.cxf.frontend.ClientProxy.invokeSync(ClientProxy.java:96)
at org.apache.cxf.jaxws.JaxWsClientProxy.invoke(JaxWsClientProxy.java:133)
at $Proxy197.getProcessingEventDetails(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at com.paradase.top.green.hill.client.MiraclesPosterServlet.doPost(MiraclesPosterServlet.java:666)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:751)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:844)
at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:242)
at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:216)
at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:132)
at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:338)
at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:221)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3284)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3254)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
at weblogic.servlet.provider.WlsSubjectHandle.run(WlsSubjectHandle.java:57)
at weblogic.servlet.internal.WebAppServletContext.doSecuredExecute(WebAppServletContext.java:2163)
at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2089)
at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2074)
at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1513)
at weblogic.servlet.provider.ContainerSupportProviderImpl$WlsRequestExecutor.run(ContainerSupportProviderImpl.java:254)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:256)
at weblogic.work.ExecuteThread.run(ExecuteThread.java:221)
Caused by: java.lang.ClassCastException: org.apache.jcp.xml.dsig.internal.dom.DOMXMLSignatureFactory cannot be cast to javax.xml.crypto.dsig.XMLSignatureFactory
at javax.xml.crypto.dsig.XMLSignatureFactory.findInstance(XMLSignatureFactory.java:202)
at javax.xml.crypto.dsig.XMLSignatureFactory.getInstance(XMLSignatureFactory.java:292)
at org.apache.ws.security.message.WSSecSignature.init(WSSecSignature.java:126)
at org.apache.ws.security.message.WSSecSignature.<init>(WSSecSignature.java:119)
at org.apache.cxf.ws.security.wss4j.policyhandlers.AbstractBindingBuilder.getSignatureBuilder(AbstractBindingBuilder.java:1723)
at org.apache.cxf.ws.security.wss4j.policyhandlers.AsymmetricBindingHandler.doSignature(AsymmetricBindingHandler.java:546)
at org.apache.cxf.ws.security.wss4j.policyhandlers.AsymmetricBindingHandler.doSignBeforeEncrypt(AsymmetricBindingHandler.java:147)
... 35 more
CXF uses WSS4J, and WSS4J version 1.6.7 has this bug. Can you try upgrading to WSS4J 1.6.9? CXF 2.7.3 might pick this up for you. This update fixed the issue for me running on ServiceMix 4.5.0.
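For what it's worth, if the build happens to use sbt like the other projects in this thread (an assumption; with Maven you would pin the version in the pom instead), forcing the newer WSS4J could look roughly like this:

// build.sbt (sketch): make wss4j 1.6.9 win over the 1.6.7 that CXF 2.7.2 pulls in
dependencyOverrides += "org.apache.ws.security" % "wss4j" % "1.6.9"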
