Loading user classes in flink RichFilterFunction using User code classloader - apache-flink

I am trying to use Flink's UserCodeClassLoader, but since I am new to Flink I could not understand exactly how to use it.
Scenario:
In the open() method of RichFilterFunction(), I want to load an external jar.
To do so, I do the following in open():
@Override
public void open(Configuration parameters) throws Exception {
    ClassLoader userClassLoader = getRuntimeContext().getUserCodeClassLoader();
    URL url = userClassLoader.getResource("/tmp/rohit/FilterTest/FilterTest.jar");
    klazz = userClassLoader.loadClass("FilterTest");
    Constructor<?> ctor = klazz.getConstructor();
    Object obj = ctor.newInstance(new Object[] {});
    control = (MyRichFilterInterface) obj;
    ... etc
However, I am getting ClassNotFoundException:
Caused by: java.lang.ClassNotFoundException: FilterTest
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader.loadClass(FlinkUserCodeClassLoaders.java:128)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at MyRichFilterFunction.open(MyRichFilterFunction.java:24)
at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:102)
at org.apache.flink.streaming.runtime.tasks.StreamTask.openAllOperators(StreamTask.java:393)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:254)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:718)
at java.lang.Thread.run(Thread.java:748)
My Flink version is 1.4.0 and it is installed in /root/flink-1.4.0/.
I have not changed any parameters in the Flink config specific to this issue.
If anyone knows what I am missing, that would be great!
I have used URLClassLoader earlier, but I am not sure how to use Flink's provided UserCodeClassLoader.
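For reference: the user code classloader only sees classes shipped with the job jar, and ClassLoader.getResource() takes a classpath-relative name rather than a filesystem path, so the lookup above returns null. A minimal sketch of loading a jar from an arbitrary filesystem path (plain JDK API, not a Flink-specific one, using Flink's classloader as the parent; the path and class name are the ones from the question):

// Requires java.io.File, java.net.URL, java.net.URLClassLoader.
URL jarUrl = new File("/tmp/rohit/FilterTest/FilterTest.jar").toURI().toURL();
URLClassLoader jarLoader = new URLClassLoader(
        new URL[] { jarUrl },
        getRuntimeContext().getUserCodeClassLoader()); // parent: Flink's user code classloader
Class<?> klazz = jarLoader.loadClass("FilterTest");

Note that the jar would then have to exist at that path on every taskmanager that runs the function.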

Related

The main method caused an error: No operators defined in streaming topology. Cannot execute

I have written Flink code to read from Pub/Sub. While executing the code with the command flink run Flink.jar I am getting the below-mentioned error. I am using Flink version 1.9.3.
Starting execution of program
------------------------------------------------------------
The program finished with the following exception:
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: No operators defined in streaming topology. Cannot execute.
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:621)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:466)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:274)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:746)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:273)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:205)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1008)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1081)
at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1081)
Caused by: java.lang.IllegalStateException: No operators defined in streaming topology. Cannot execute.
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.getStreamGraphGenerator(StreamExecutionEnvironment.java:1545)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.getStreamGraph(StreamExecutionEnvironment.java:1540)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1507)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1489)
at org.flink.ReadFromPubsub.main(ReadFromPubsub.java:30)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:604)
... 9 more
Please find the code which I am using:
package org.flink;

import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.SourceFunction;
import org.apache.flink.streaming.connectors.gcp.pubsub.PubSubSource;

public class ReadFromPubsub {
    public static void main(String args[]) throws Exception {
        System.out.println("Flink Pubsub Code Read 1");
        StreamExecutionEnvironment streamExecEnv = StreamExecutionEnvironment.getExecutionEnvironment();
        DeserializationSchema<String> deserializer = new SimpleStringSchema();
        SourceFunction<String> pubsubSource = PubSubSource.newBuilder()
                .withDeserializationSchema(deserializer)
                .withProjectName("vz-it-np-gudv-dev-vzntdo-0")
                .withSubscriptionName("subscription1")
                .build();
        streamExecEnv.addSource(pubsubSource);
        streamExecEnv.execute();
    }
}
I am trying to read the data from Pub/Sub with Flink code but am not able to do so.
Flink uses lazy evaluation, and since you haven't specified any sinks there is no reason to execute this job.
From the Flink docs:
All Flink programs are executed lazily: When the program’s main method is executed, the data loading and transformations do not happen directly. Rather, each operation is created and added to a dataflow graph. The operations are actually executed when the execution is explicitly triggered by an execute() call on the execution environment.
However, your dataflow graph has no output in this case, which makes processing unnecessary.
For debugging purposes, you can add a print sink to your source to make your example work:
streamExecEnv.addSource(pubsubSource).print();
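Any real sink would do as well; for example (the output path is illustrative), writing the stream as text also gives the topology an operator to execute:

streamExecEnv.addSource(pubsubSource).writeAsText("/tmp/pubsub-output");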

Flink BlockElement Exception when updating to version 1.14.2

Before, everything worked well with Flink 1.13.1. Lately we updated it to Flink 1.14.2; when the following code is run, it throws this exception:
<T> DataStream<Tuple3<String, String, T>> returnsInternal(SiddhiOperatorContext siddhiContext, String[] executionPlanIds) {
    if (createdDataStream == null) {
        DataStream<Tuple2<StreamRoute, Object>> mapped = this.dataStream.map(new MapFunction<Tuple2<StreamRoute, Object>, Tuple2<StreamRoute, Object>>() {
            @Override
            public Tuple2<StreamRoute, Object> map(Tuple2<StreamRoute, Object> value) throws Exception {
                if (executionPlanIds != null && executionPlanIds.length != 0) {
                    for (String executionPlanId : executionPlanIds) {
                        if (!executionPlanId.isEmpty()
                                && siddhiContext.getExecutionPlan(executionPlanId).IsUsedStream(value.f0.getInputStreamId())) {
                            value.f0.addExecutionPlanId(executionPlanId);
                        }
                    }
                }
                return value;
            }
        });
        createdDataStream = SiddhiStreamFactory.createDataStream(siddhiContext, mapped);
    }
    return createdDataStream;
}
The exception and callstack are as follows:
org.apache.flink.api.common.InvalidProgramException: The implementation of the BlockElement is not serializable. The object probably contains or references non serializable fields.
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:164)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:69)
    at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.clean(StreamExecutionEnvironment.java:2139)
    at org.apache.flink.streaming.api.datastream.DataStream.clean(DataStream.java:203)
    at org.apache.flink.streaming.api.datastream.DataStream.map(DataStream.java:577)
    at org.apache.flink.streaming.siddhi.ExecutionSiddhiStream.ExecutionSiddhiStreamBase.returnsInternal(ExecutionSiddhiStreamBase.java:135)
    at org.apache.flink.streaming.siddhi.ExecutionSiddhiStream.ExecutionSiddhiStreamBase.returnsInternal(ExecutionSiddhiStreamBase.java:123)
    at org.apache.flink.streaming.siddhi.ExecutionSiddhiStream.ExecutionSiddhiStream.returnAsRow(ExecutionSiddhiStream.java:180)
    at org.apache.flink.streaming.siddhi.ExecutionSiddhiStream.ExecutionSiddhiStream.returnAsRowWithQueryId(ExecutionSiddhiStream.java:165)
    at org.apache.flink.streaming.siddhi.SiddhiCEPITCase.testSimplePojoStreamAndReturnRowWithQueryId(SiddhiCEPITCase.java:245)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
    at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
    at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
    at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
    at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
    at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:54)
    at org.apache.flink.util.TestNameProvider$1.evaluate(TestNameProvider.java:45)
    at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:61)
    at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
    at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
    at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
    at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
    at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
    at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
    at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
    at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
    at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:54)
    at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:54)
    at org.junit.rules.RunRules.evaluate(RunRules.java:20)
    at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
    at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
    at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
    at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:69)
    at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
    at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:221)
    at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:54)
Caused by: java.io.NotSerializableException: org.apache.flink.configuration.description.TextElement
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
    at org.apache.flink.util.InstantiationUtil.serializeObject(InstantiationUtil.java:632)
    at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:143)
    ... 45 more
So why is there a problem, what is the difference between 1.13.1 and 1.14.0, and how can we fix it?
Thank you, David Anderson. This should be a bug introduced by the latest Flink commit to the file flink-core/src/main/java/org/apache/flink/api/common/ExecutionConfig.java.
Diffing the file, we can see that TextElement is used where ClosureCleanerLevel is used as a member of the Serializable ExecutionConfig.
TextElement in ClosureCleanerLevel
In Flink Siddhi, the ExecutionConfig is serialized (it is used to serialize Flink data to Siddhi types on every taskmanager), so that should be the cause.
The simplest way to verify the problem is to run the following code on Flink 1.13.5 and 1.14.0: the exception is reproduced on 1.14.0, and the diff between 1.13.5 and 1.14.0 is only the latest commit.
@Test
public void testExecutionConfigSerializable() throws Exception {
    ExecutionConfig config = new ExecutionConfig();
    ClosureCleaner.clean(config, ExecutionConfig.ClosureCleanerLevel.RECURSIVE, true);
}
Note that plain Java serialization still works for the ExecutionConfig; it is just the ClosureCleaner that rejects it, because it performs very strict checks w.r.t. serializability.
As such, the underlying problem could be that the closure of your map function is unnecessarily large.
The SiddhiOperatorContext that you pass into the method becomes part of the map function's closure, so you could check whether you can minimize the size of that context such that it no longer relies on an ExecutionConfig. A sketch of this idea follows.
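To illustrate (a sketch only; extractUsedStreamIds is a hypothetical helper, and what can be precomputed depends on your SiddhiOperatorContext), you could copy just the serializable data the function needs into a local variable before constructing it, so the context itself is never captured:

// Precompute, per execution plan id, the input stream ids it uses.
// This runs on the client, so siddhiContext never enters the closure.
// Requires java.util.{Map, HashMap, Set} imports.
final Map<String, Set<String>> usedStreams = new HashMap<>();
if (executionPlanIds != null) {
    for (String planId : executionPlanIds) {
        if (!planId.isEmpty()) {
            usedStreams.put(planId, extractUsedStreamIds(siddhiContext, planId)); // hypothetical helper
        }
    }
}
DataStream<Tuple2<StreamRoute, Object>> mapped =
        this.dataStream.map(new MapFunction<Tuple2<StreamRoute, Object>, Tuple2<StreamRoute, Object>>() {
            @Override
            public Tuple2<StreamRoute, Object> map(Tuple2<StreamRoute, Object> value) {
                // Only the (serializable) HashMap is captured, not the context.
                for (Map.Entry<String, Set<String>> e : usedStreams.entrySet()) {
                    if (e.getValue().contains(value.f0.getInputStreamId())) {
                        value.f0.addExecutionPlanId(e.getKey());
                    }
                }
                return value;
            }
        });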

Flink: Could not find a suitable table factory for 'org.apache.flink.table.factories.DeserializationSchemaFactory' in the classpath

I am using Flink's Table API. I receive data from Kafka, register it as a table, process it with a SQL statement, and finally convert the result back to a stream and write it to a directory. The code looks like this:
def main(args: Array[String]): Unit = {
  val sEnv = StreamExecutionEnvironment.getExecutionEnvironment
  sEnv.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)
  val tEnv = TableEnvironment.getTableEnvironment(sEnv)
  tEnv.connect(
      new Kafka()
        .version("0.11")
        .topic("user")
        .startFromEarliest()
        .property("zookeeper.connect", "")
        .property("bootstrap.servers", ""))
    .withFormat(
      new Json()
        .failOnMissingField(false)
        .deriveSchema() // use the table's schema
    )
    .withSchema(
      new Schema()
        .field("username_skey", Types.STRING))
    .inAppendMode()
    .registerTableSource("user")
  val userTest: Table = tEnv.sqlQuery(
    """
      select ** form ** join **""".stripMargin)
  val endStream = tEnv.toRetractStream[Row](userTest)
  endStream.writeAsText("/tmp/sqlres", WriteMode.OVERWRITE)
  sEnv.execute("Test_New_Sign_Student")
}
I was successful in the local test, but when I submit the job on the cluster, I get the following error:
=======================================================
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error.
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:546)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
    at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:426)
    at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:804)
    at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:280)
    at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:215)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1044)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1120)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
    at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1120)
Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.DeserializationSchemaFactory' in the classpath.
Reason: No factory implements 'org.apache.flink.table.factories.DeserializationSchemaFactory'.
The following properties are requested:
connector.properties.0.key=zookeeper.connect
....
schema.9.name=roles
schema.9.type=VARCHAR
update-mode=append
The following factories have been considered:
org.apache.flink.table.sources.CsvBatchTableSourceFactory
org.apache.flink.table.sources.CsvAppendTableSourceFactory
org.apache.flink.table.sinks.CsvBatchTableSinkFactory
org.apache.flink.table.sinks.CsvAppendTableSinkFactory
org.apache.flink.streaming.connectors.kafka.Kafka011TableSourceSinkFactory
    at org.apache.flink.table.factories.TableFactoryService$.filterByFactoryClass(TableFactoryService.scala:176)
    at org.apache.flink.table.factories.TableFactoryService$.findInternal(TableFactoryService.scala:125)
    at org.apache.flink.table.factories.TableFactoryService$.find(TableFactoryService.scala:100)
    at org.apache.flink.table.factories.TableFactoryService.find(TableFactoryService.scala)
    at org.apache.flink.streaming.connectors.kafka.KafkaTableSourceSinkFactoryBase.getDeserializationSchema(KafkaTableSourceSinkFactoryBase.java:259)
    at org.apache.flink.streaming.connectors.kafka.KafkaTableSourceSinkFactoryBase.createStreamTableSource(KafkaTableSourceSinkFactoryBase.java:144)
    at org.apache.flink.table.factories.TableFactoryUtil$.findAndCreateTableSource(TableFactoryUtil.scala:50)
    at org.apache.flink.table.descriptors.ConnectTableDescriptor.registerTableSource(ConnectTableDescriptor.scala:44)
    at org.clay.test.Test_New_Sign_Student$.main(Test_New_Sign_Student.scala:64)
    at org.clay.test.Test_New_Sign_Student.main(Test_New_Sign_Student.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
===================================
Can someone tell me what caused this? I am very confused about this.
If you are using the maven-shade-plugin, make sure the SPI transformer is in place.
Flink uses the Java Service Provider Interface (SPI) to discover source/sink connectors.
Without this transformer, you will 100% encounter "org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory", which is what happened to me.
https://ci.apache.org/projects/flink/flink-docs-release-1.9/dev/table/connect.html#update-mode
The Flink docs point this out officially; search for "SPI" on that page.
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
</transformers>
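For context, a minimal sketch of where that transformer sits in the maven-shade-plugin configuration (plugin version omitted; adjust to your build):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- merges META-INF/services files so the TableFactory SPI entries survive shading -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>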
You have to add the JAR dependencies of the connectors (Kafka) and formats (JSON) that you are using to the classpath of your program, i.e., either build a fat JAR that includes them, or provide them to the classpath of the Flink cluster by copying them into the ./lib folder.
Check the Flink documentation for links to download the respective dependencies.
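As a hedged sketch for this particular setup (Kafka 0.11 connector plus JSON format), the dependencies would look roughly like the following; the artifact names are the standard Flink 1.x ones, but the Scala suffix and ${flink.version} must match your cluster:
<!-- JSON format (provides the DeserializationSchemaFactory implementation) -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-json</artifactId>
  <version>${flink.version}</version>
</dependency>
<!-- Kafka 0.11 table connector -->
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-connector-kafka-0.11_2.11</artifactId>
  <version>${flink.version}</version>
</dependency>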
I have met the same problem; just adding the parameter --connector.type kafka when you run your application solved it for me.

deserializeObject exception on WindowOperator

It seems that the data in the WindowOperator is serialized successfully; however, deserialization fails when the taskmanager is restarted by the jobmanager.
Environment: state backend: HDFS; jobmanager: high availability.
Root exception:
java.lang.Exception: Could not restore checkpointed state to operators and functions
at org.apache.flink.streaming.runtime.tasks.StreamTask.restoreStateLazy(StreamTask.java:414)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:208)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:584)
at java.lang.Thread.run(Thread.java:744)
Caused by: java.io.StreamCorruptedException: invalid type code: 00
at java.io.ObjectInputStream$BlockDataInputStream.readBlockHeader(ObjectInputStream.java:2508)
at java.io.ObjectInputStream$BlockDataInputStream.refill(ObjectInputStream.java:2543)
at java.io.ObjectInputStream$BlockDataInputStream.read(ObjectInputStream.java:2615)
at java.io.DataInputStream.readInt(DataInputStream.java:387)
at java.io.ObjectInputStream$BlockDataInputStream.readInt(ObjectInputStream.java:2820)
at java.io.ObjectInputStream.readInt(ObjectInputStream.java:971)
at java.util.HashMap.readObject(HashMap.java:1158)
at sun.reflect.GeneratedMethodAccessor29.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:294)
at org.apache.flink.streaming.runtime.operators.windowing.WindowOperator$Context.<init>(WindowOperator.java:446)
at org.apache.flink.streaming.runtime.operators.windowing.WindowOperator.restoreState(WindowOperator.java:621)
at org.apache.flink.streaming.runtime.tasks.StreamTask.restoreStateLazy(StreamTask.java:406)
I have experienced the same issue when running multiple Flink applications that reuse the same custom window/trigger class.
(I don't know if that is your case.)
But I added a generated serialVersionUID to each of the reused classes, and it works fine now.
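For illustration, a minimal sketch of the fix (the class and field names here are hypothetical, not from the question):

import java.io.Serializable;

// Any custom window/trigger helper class whose instances end up in
// checkpointed state should pin an explicit serialVersionUID, so that
// recompiled or differently-packaged copies stay binary compatible.
public class WindowAggState implements Serializable {
    private static final long serialVersionUID = 1L; // fixed, not compiler-generated

    private final long count;

    public WindowAggState(long count) {
        this.count = count;
    }

    public long getCount() {
        return count;
    }
}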

Trinidad API usage caused an exception on weblogic managed server in my case

I faced an exception when I tried to deploy to a newly created managed server (with JRF applied).
When the application wants to use the org/apache/myfaces/trinidad/model/ChildPropertyTreeModel class, the exception below occurs. It used to run properly on WC_Spaces.
Even if I copy the trinidad-api and trinidad-api-impl.jar libraries into the domainhome/lib folder, nothing changes.
What might be the reason?
oracle.adf.controller.activity.ActivityLogicException: ADFC-06015: An exception occured when invoking a task flow initializer.
at oracle.adfinternal.controller.util.Utils.createAndLogActivityLogicException(Utils.java:230)
at oracle.adfinternal.controller.activity.TaskFlowCallActivityLogic.invokeInitializer(TaskFlowCallActivityLogic.java:709)
at oracle.adfinternal.controller.activity.TaskFlowCallActivityLogic.enterTaskFlow(TaskFlowCallActivityLogic.java:625)
at oracle.adfinternal.controller.activity.TaskFlowCallActivityLogic.invokeLocalTaskFlow(TaskFlowCallActivityLogic.java:337)
at oracle.adfinternal.controller.activity.TaskFlowCallActivityLogic.invokeTaskFlow(TaskFlowCallActivityLogic.java:229)
at oracle.adfinternal.controller.engine.ControlFlowEngine.invokeTaskFlow(ControlFlowEngine.java:217)
at oracle.adfinternal.controller.state.ChildViewPortContextImpl.invokeTaskFlow(ChildViewPortContextImpl.java:104)
at oracle.adfinternal.controller.state.ControllerState.createChildViewPort(ControllerState.java:1387)
at oracle.adfinternal.controller.ControllerContextImpl.createChildViewPort(ControllerContextImpl.java:78)
at oracle.adf.controller.internal.binding.DCTaskFlowBinding.createRegionViewPortContext(DCTaskFlowBinding.java:474)
at oracle.adf.controller.internal.binding.DCTaskFlowBinding.getViewPort(DCTaskFlowBinding.java:392)
at oracle.adf.controller.internal.binding.TaskFlowRegionModel.doProcessBeginRegion(TaskFlowRegionModel.java:164)
at oracle.adf.controller.internal.binding.TaskFlowRegionModel.processBeginRegion(TaskFlowRegionModel.java:112)
at oracle.adf.controller.internal.binding.TaskFlowRegionController.doRegionRefresh(TaskFlowRegionController.java:241)
at oracle.adf.controller.internal.binding.TaskFlowRegionController.refreshRegion(TaskFlowRegionController.java:119)
at oracle.adf.model.binding.DCBindingContainer.internalRefreshControl(DCBindingContainer.java:3204)
at oracle.adf.model.binding.DCBindingContainer.refresh(DCBindingContainer.java:2876)
at oracle.adf.controller.internal.binding.TaskFlowRegionController.doRegionRefresh(TaskFlowRegionController.java:270)
at oracle.adf.controller.internal.binding.TaskFlowRegionController.refreshRegion(TaskFlowRegionController.java:119)
at oracle.adf.model.binding.DCBindingContainer.internalRefreshControl(DCBindingContainer.java:3204)
at oracle.adf.model.binding.DCBindingContainer.refresh(DCBindingContainer.java:2876)
at oracle.adf.controller.v2.lifecycle.PageLifecycleImpl.prepareRender(PageLifecycleImpl.java:561)
at oracle.adf.controller.faces.lifecycle.FacesPageLifecycle.prepareRender(FacesPageLifecycle.java:82)
at oracle.adf.controller.v2.lifecycle.Lifecycle$9.execute(Lifecycle.java:224)
at oracle.adfinternal.controller.lifecycle.LifecycleImpl.executePhase(LifecycleImpl.java:197)
at oracle.adfinternal.controller.faces.lifecycle.ADFPhaseListener.access$1000(ADFPhaseListener.java:23)
at oracle.adfinternal.controller.faces.lifecycle.ADFPhaseListener$5.before(ADFPhaseListener.java:402)
at oracle.adfinternal.controller.faces.lifecycle.ADFPhaseListener.beforePhase(ADFPhaseListener.java:64)
at oracle.adfinternal.controller.faces.lifecycle.ADFLifecyclePhaseListener.beforePhase(ADFLifecyclePhaseListener.java:44)
at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._executePhase(LifecycleImpl.java:352)
at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:222)
at javax.faces.webapp.FacesServlet.service(FacesServlet.java:266)
at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:301)
at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.adf.model.servlet.ADFBindingFilter.doFilter(ADFBindingFilter.java:205)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.adfinternal.view.faces.webapp.rich.RegistrationFilter.doFilter(RegistrationFilter.java:106)
at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:446)
at oracle.adfinternal.view.faces.activedata.AdsFilter.doFilter(AdsFilter.java:60)
at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:446)
at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl._doFilterImpl(TrinidadFilterImpl.java:271)
at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl.doFilter(TrinidadFilterImpl.java:177)
at org.apache.myfaces.trinidad.webapp.TrinidadFilter.doFilter(TrinidadFilter.java:92)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.adf.library.webapp.LibraryFilter.doFilter(LibraryFilter.java:179)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:119)
at java.security.AccessController.doPrivileged(Native Method)
at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:315)
at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:442)
at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:103)
at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:171)
at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:139)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3730)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3696)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2273)
at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2179)
at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1490)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:256)
at weblogic.work.ExecuteThread.run(ExecuteThread.java:221)
Caused By: javax.el.ELException: java.lang.NoClassDefFoundError: org/apache/myfaces/trinidad/model/ChildPropertyTreeModel
at com.sun.el.parser.AstValue.invoke(AstValue.java:191)
at com.sun.el.MethodExpressionImpl.invoke(MethodExpressionImpl.java:297)
...
I had a custom class, SerializedTreeModel, that extended ChildPropertyTreeModel and implemented Serializable.
This class lived in another project; in other words, another WAR deployment package's WEB-INF/lib contained it. After I moved the SerializedTreeModel class into the same package as the referring project, the problem was eventually solved. Honestly, I could not understand the magic.
public class SerializedTreeModel extends ChildPropertyTreeModel implements Serializable {
    public SerializedTreeModel(Object dataList, String dataChild) {
        super(dataList, dataChild);
    }
}
