Flink BlockElement Exception when updating to version 1.14.2 - apache-flink

Everything worked well with Flink 1.13.1. We recently updated to Flink 1.14.2, and now running the following code throws an exception:
<T> DataStream<Tuple3<String, String, T>> returnsInternal(SiddhiOperatorContext siddhiContext, String[] executionPlanIds) {
    if (createdDataStream == null) {
        DataStream<Tuple2<StreamRoute, Object>> mapped = this.dataStream.map(new MapFunction<Tuple2<StreamRoute, Object>, Tuple2<StreamRoute, Object>>() {
            @Override
            public Tuple2<StreamRoute, Object> map(Tuple2<StreamRoute, Object> value) throws Exception {
                if (executionPlanIds != null && executionPlanIds.length != 0) {
                    for (String executionPlanId : executionPlanIds) {
                        if (!executionPlanId.isEmpty()
                                && siddhiContext.getExecutionPlan(executionPlanId).IsUsedStream(value.f0.getInputStreamId())) {
                            value.f0.addExecutionPlanId(executionPlanId);
                        }
                    }
                }
                return value;
            }
        });
        createdDataStream = SiddhiStreamFactory.createDataStream(siddhiContext, mapped);
    }
    return createdDataStream;
}
The exception and callstack are as follows:
org.apache.flink.api.common.InvalidProgramException: The implementation of the BlockElement is not serializable. The object probably contains or references non serializable fields.
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:164)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:69)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.clean(StreamExecutionEnvironment.java:2139)
at org.apache.flink.streaming.api.datastream.DataStream.clean(DataStream.java:203)
at org.apache.flink.streaming.api.datastream.DataStream.map(DataStream.java:577)
at org.apache.flink.streaming.siddhi.ExecutionSiddhiStream.ExecutionSiddhiStreamBase.returnsInternal(ExecutionSiddhiStreamBase.java:135)
at org.apache.flink.streaming.siddhi.ExecutionSiddhiStream.ExecutionSiddhiStreamBase.returnsInternal(ExecutionSiddhiStreamBase.java:123)
at org.apache.flink.streaming.siddhi.ExecutionSiddhiStream.ExecutionSiddhiStream.returnAsRow(ExecutionSiddhiStream.java:180)
at org.apache.flink.streaming.siddhi.ExecutionSiddhiStream.ExecutionSiddhiStream.returnAsRowWithQueryId(ExecutionSiddhiStream.java:165)
at org.apache.flink.streaming.siddhi.SiddhiCEPITCase.testSimplePojoStreamAndReturnRowWithQueryId(SiddhiCEPITCase.java:245)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:54)
at org.apache.flink.util.TestNameProvider$1.evaluate(TestNameProvider.java:45)
at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:61)
at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:54)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:54)
at org.junit.rules.RunRules.evaluate(RunRules.java:20)
at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:69)
at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:221)
at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:54)
Caused by: java.io.NotSerializableException: org.apache.flink.configuration.description.TextElement
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
at org.apache.flink.util.InstantiationUtil.serializeObject(InstantiationUtil.java:632)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:143)
... 45 more
So why is there a problem, what changed between 1.13.1 and 1.14.0, and how can we fix it?

Thank you, David Anderson. This appears to be a bug introduced by the latest Flink commit to flink-core/src/main/java/org/apache/flink/api/common/ExecutionConfig.java.
Diffing the file, we can see that TextElement is now used in ClosureCleanerLevel, which is a member of the Serializable ExecutionConfig.
TextElement in ClosureCleanerLevel
In Flink Siddhi, the ExecutionConfig is serialized and shipped to every TaskManager, where it is used to serialize Flink data into Siddhi types, so that should be the cause.
The simplest way to verify the problem is to run the following code on both Flink 1.13.5 and 1.14.0: the exception is reproduced in 1.14.0, and the only diff in that file between 1.13.5 and 1.14.0 is that latest commit.
@Test
public void testExecutionConfigSerializable() throws Exception {
    ExecutionConfig config = new ExecutionConfig();
    ClosureCleaner.clean(config, ExecutionConfig.ClosureCleanerLevel.RECURSIVE, true);
}

Note that plain Java serialization still works for the ExecutionConfig; it is just the ClosureCleaner that rejects it, because it performs very strict checks w.r.t. serializability.
As such, the underlying problem could be that the closure of your map function is unnecessarily large.
The SiddhiOperatorContext that you pass into the method becomes part of the map function's closure, so you could check whether you can minimize the size of that context so that it no longer relies on an ExecutionConfig.
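One way to act on that advice, as a minimal sketch: resolve the plan-to-stream mapping on the client side, before the operator is created, so the anonymous MapFunction only captures a small serializable HashMap instead of the whole SiddhiOperatorContext. This assumes the input stream IDs can be enumerated up front; the getInputStreamIds() accessor below is hypothetical, everything else reuses names from the question.
// Sketch only: precompute which execution plans use which input stream.
// getInputStreamIds() is a hypothetical accessor on SiddhiOperatorContext.
final Map<String, List<String>> plansByStream = new HashMap<>();
if (executionPlanIds != null) {
    for (String streamId : siddhiContext.getInputStreamIds()) {
        List<String> plans = new ArrayList<>();
        for (String executionPlanId : executionPlanIds) {
            if (!executionPlanId.isEmpty()
                    && siddhiContext.getExecutionPlan(executionPlanId).IsUsedStream(streamId)) {
                plans.add(executionPlanId);
            }
        }
        plansByStream.put(streamId, plans);
    }
}

DataStream<Tuple2<StreamRoute, Object>> mapped = this.dataStream.map(
        new MapFunction<Tuple2<StreamRoute, Object>, Tuple2<StreamRoute, Object>>() {
            @Override
            public Tuple2<StreamRoute, Object> map(Tuple2<StreamRoute, Object> value) throws Exception {
                // Only plansByStream (a serializable HashMap) is captured here,
                // so the ClosureCleaner never has to serialize siddhiContext.
                List<String> plans = plansByStream.getOrDefault(
                        value.f0.getInputStreamId(), Collections.emptyList());
                for (String executionPlanId : plans) {
                    value.f0.addExecutionPlanId(executionPlanId);
                }
                return value;
            }
        });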

Related

The main method caused an error: No operators defined in streaming topology. Cannot execute

I have written Flink code to read from Pub/Sub. While executing the job with the command flink run Flink.jar, I get the error mentioned below. I am using Flink version 1.9.3.
Starting execution of program
------------------------------------------------------------
The program finished with the following exception:
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error: No operators defined in streaming topology. Cannot execute.
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:621)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:466)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:274)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:746)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:273)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:205)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1008)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1081)
at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1081)
Caused by: java.lang.IllegalStateException: No operators defined in streaming topology. Cannot execute.
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.getStreamGraphGenerator(StreamExecutionEnvironment.java:1545)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.getStreamGraph(StreamExecutionEnvironment.java:1540)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1507)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1489)
at org.flink.ReadFromPubsub.main(ReadFromPubsub.java:30)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:604)
... 9 more
Please find below the code I am using:
package org.flink;
import org.apache.flink.api.common.serialization.DeserializationSchema;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.SourceFunction;
import org.apache.flink.streaming.connectors.gcp.pubsub.PubSubSource;
public class ReadFromPubsub {
    public static void main(String args[]) throws Exception {
        System.out.println("Flink Pubsub Code Read 1");
        StreamExecutionEnvironment streamExecEnv = StreamExecutionEnvironment.getExecutionEnvironment();
        DeserializationSchema<String> deserializer = new SimpleStringSchema();
        SourceFunction<String> pubsubSource = PubSubSource.newBuilder()
                .withDeserializationSchema(deserializer)
                .withProjectName("vz-it-np-gudv-dev-vzntdo-0")
                .withSubscriptionName("subscription1")
                .build();
        streamExecEnv.addSource(pubsubSource);
        streamExecEnv.execute();
    }
}
I am trying to read data from Pub/Sub with Flink but am not able to do so.
Flink uses lazy evaluation and since you haven't specified any sinks there would be no reason to execute this job.
From the Flink docs:
All Flink programs are executed lazily: When the program’s main method is executed, the data loading and transformations do not happen directly. Rather, each operation is created and added to a dataflow graph. The operations are actually executed when the execution is explicitly triggered by an execute() call on the execution environment.
However, your dataflow graph has no output in this case which makes processing unnecessary.
For debugging purposes, you can add a print sink to your source to make your example work:
streamExecEnv.addSource(pubsubSource).print();
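For reference, a minimal corrected main method, reusing the builder settings from the question (print() is only a stand-in for a real sink):
public static void main(String[] args) throws Exception {
    StreamExecutionEnvironment streamExecEnv = StreamExecutionEnvironment.getExecutionEnvironment();
    DeserializationSchema<String> deserializer = new SimpleStringSchema();
    SourceFunction<String> pubsubSource = PubSubSource.newBuilder()
            .withDeserializationSchema(deserializer)
            .withProjectName("vz-it-np-gudv-dev-vzntdo-0")
            .withSubscriptionName("subscription1")
            .build();
    // print() terminates the pipeline, so the topology now has an operator
    // to execute and getStreamGraph() no longer throws.
    streamExecEnv.addSource(pubsubSource).print();
    streamExecEnv.execute();
}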

Loading user classes in flink RichFilterFunction using User code classloader

I am trying to use Flink's UserCodeClassLoader, but since I am new to Flink I could not understand exactly how to use it.
Scenario:
In the open() method of RichFilterFunction(), I want to load an external jar.
To do so, I do the following in open():
@Override
public void open(Configuration parameters) throws Exception {
    ClassLoader userClassLoader = getRuntimeContext().getUserCodeClassLoader();
    URL url = userClassLoader.getResource("/tmp/rohit/FilterTest/FilterTest.jar");
    klazz = userClassLoader.loadClass("FilterTest");
    Constructor<?> ctor = klazz.getConstructor();
    Object obj = ctor.newInstance(new Object[] {});
    control = (MyRichFilterInterface) obj;
    // ... etc
}
However, I am getting ClassNotFoundException:
Caused by: java.lang.ClassNotFoundException: FilterTest
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at org.apache.flink.runtime.execution.librarycache.FlinkUserCodeClassLoaders$ChildFirstClassLoader.loadClass(FlinkUserCodeClassLoaders.java:128)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at MyRichFilterFunction.open(MyRichFilterFunction.java:24)
at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:102)
at org.apache.flink.streaming.runtime.tasks.StreamTask.openAllOperators(StreamTask.java:393)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:254)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:718)
at java.lang.Thread.run(Thread.java:748)
My Flink version is 1.4.0 and it is installed in /root/flink-1.4.0/.
I have not changed any parameters in flink config specific to this issue.
If anyone knows what I am missing, that would be great!!!
I have used URLClassLoader earlier, but I am not sure how to use Flink's provided UserCodeClassLoader.
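For what it's worth, a sketch of the plain URLClassLoader route mentioned above, using Flink's user code classloader as the parent so that classes from the job jar stay visible. It assumes FilterTest.jar exists at that path on every TaskManager host and that FilterTest implements MyRichFilterInterface (names taken from the question):
@Override
public void open(Configuration parameters) throws Exception {
    // Delegate to the user code classloader as parent so that
    // MyRichFilterInterface (from the job jar) stays resolvable.
    ClassLoader parent = getRuntimeContext().getUserCodeClassLoader();
    URL jarUrl = new File("/tmp/rohit/FilterTest/FilterTest.jar").toURI().toURL();
    URLClassLoader jarLoader = new URLClassLoader(new URL[] { jarUrl }, parent);
    Class<?> klazz = jarLoader.loadClass("FilterTest");
    control = (MyRichFilterInterface) klazz.getConstructor().newInstance();
}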

Trinidad API usage caused an exception on weblogic managed server in my case

I faced an exception when deploying to a newly created managed server (which has JRF applied).
When the application tries to use the org/apache/myfaces/trinidad/model/ChildPropertyTreeModel class, the exception below occurs. The same application used to run properly on WC_Spaces.
Even if I copy the trinidad-api and trinidad-api-impl.jar libraries into the domainhome/lib folder, nothing changes.
What might be the reason?
oracle.adf.controller.activity.ActivityLogicException: ADFC-06015: An exception occured when invoking a task flow initializer.
at oracle.adfinternal.controller.util.Utils.createAndLogActivityLogicException(Utils.java:230)
at oracle.adfinternal.controller.activity.TaskFlowCallActivityLogic.invokeInitializer(TaskFlowCallActivityLogic.java:709)
at oracle.adfinternal.controller.activity.TaskFlowCallActivityLogic.enterTaskFlow(TaskFlowCallActivityLogic.java:625)
at oracle.adfinternal.controller.activity.TaskFlowCallActivityLogic.invokeLocalTaskFlow(TaskFlowCallActivityLogic.java:337)
at oracle.adfinternal.controller.activity.TaskFlowCallActivityLogic.invokeTaskFlow(TaskFlowCallActivityLogic.java:229)
at oracle.adfinternal.controller.engine.ControlFlowEngine.invokeTaskFlow(ControlFlowEngine.java:217)
at oracle.adfinternal.controller.state.ChildViewPortContextImpl.invokeTaskFlow(ChildViewPortContextImpl.java:104)
at oracle.adfinternal.controller.state.ControllerState.createChildViewPort(ControllerState.java:1387)
at oracle.adfinternal.controller.ControllerContextImpl.createChildViewPort(ControllerContextImpl.java:78)
at oracle.adf.controller.internal.binding.DCTaskFlowBinding.createRegionViewPortContext(DCTaskFlowBinding.java:474)
at oracle.adf.controller.internal.binding.DCTaskFlowBinding.getViewPort(DCTaskFlowBinding.java:392)
at oracle.adf.controller.internal.binding.TaskFlowRegionModel.doProcessBeginRegion(TaskFlowRegionModel.java:164)
at oracle.adf.controller.internal.binding.TaskFlowRegionModel.processBeginRegion(TaskFlowRegionModel.java:112)
at oracle.adf.controller.internal.binding.TaskFlowRegionController.doRegionRefresh(TaskFlowRegionController.java:241)
at oracle.adf.controller.internal.binding.TaskFlowRegionController.refreshRegion(TaskFlowRegionController.java:119)
at oracle.adf.model.binding.DCBindingContainer.internalRefreshControl(DCBindingContainer.java:3204)
at oracle.adf.model.binding.DCBindingContainer.refresh(DCBindingContainer.java:2876)
at oracle.adf.controller.internal.binding.TaskFlowRegionController.doRegionRefresh(TaskFlowRegionController.java:270)
at oracle.adf.controller.internal.binding.TaskFlowRegionController.refreshRegion(TaskFlowRegionController.java:119)
at oracle.adf.model.binding.DCBindingContainer.internalRefreshControl(DCBindingContainer.java:3204)
at oracle.adf.model.binding.DCBindingContainer.refresh(DCBindingContainer.java:2876)
at oracle.adf.controller.v2.lifecycle.PageLifecycleImpl.prepareRender(PageLifecycleImpl.java:561)
at oracle.adf.controller.faces.lifecycle.FacesPageLifecycle.prepareRender(FacesPageLifecycle.java:82)
at oracle.adf.controller.v2.lifecycle.Lifecycle$9.execute(Lifecycle.java:224)
at oracle.adfinternal.controller.lifecycle.LifecycleImpl.executePhase(LifecycleImpl.java:197)
at oracle.adfinternal.controller.faces.lifecycle.ADFPhaseListener.access$1000(ADFPhaseListener.java:23)
at oracle.adfinternal.controller.faces.lifecycle.ADFPhaseListener$5.before(ADFPhaseListener.java:402)
at oracle.adfinternal.controller.faces.lifecycle.ADFPhaseListener.beforePhase(ADFPhaseListener.java:64)
at oracle.adfinternal.controller.faces.lifecycle.ADFLifecyclePhaseListener.beforePhase(ADFLifecyclePhaseListener.java:44)
at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._executePhase(LifecycleImpl.java:352)
at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:222)
at javax.faces.webapp.FacesServlet.service(FacesServlet.java:266)
at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:301)
at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.adf.model.servlet.ADFBindingFilter.doFilter(ADFBindingFilter.java:205)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.adfinternal.view.faces.webapp.rich.RegistrationFilter.doFilter(RegistrationFilter.java:106)
at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:446)
at oracle.adfinternal.view.faces.activedata.AdsFilter.doFilter(AdsFilter.java:60)
at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:446)
at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl._doFilterImpl(TrinidadFilterImpl.java:271)
at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl.doFilter(TrinidadFilterImpl.java:177)
at org.apache.myfaces.trinidad.webapp.TrinidadFilter.doFilter(TrinidadFilter.java:92)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.adf.library.webapp.LibraryFilter.doFilter(LibraryFilter.java:179)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:119)
at java.security.AccessController.doPrivileged(Native Method)
at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:315)
at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:442)
at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:103)
at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:171)
at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:139)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3730)
at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3696)
at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2273)
at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2179)
at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1490)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:256)
at weblogic.work.ExecuteThread.run(ExecuteThread.java:221)
Caused By: javax.el.ELException: java.lang.NoClassDefFoundError: org/apache/myfaces/trinidad/model/ChildPropertyTreeModel
at com.sun.el.parser.AstValue.invoke(AstValue.java:191)
at com.sun.el.MethodExpressionImpl.invoke(MethodExpressionImpl.java:297)
...
I had a custom class that extended ChildPropertyTreeModel and implemented Serializable: the SerializedTreeModel shown below. This class lived in another project, in other words another WAR deployment package's WEB-INF/lib contained it. After I moved the SerializedTreeModel class into the same package as the referring project, the problem was solved. Honestly, I could not understand the magic.
public class SerializedTreeModel extends ChildPropertyTreeModel implements Serializable {
    public SerializedTreeModel(Object dataList, String dataChild) {
        super(dataList, dataChild);
    }
}

Google App Engine Entity Relationship Annotations end up HTTP 500

I am creating a simple REST API via Google App Engine. I have Task and Project objects. A Project can have one or more Tasks. Here are some details about these data objects:
@Entity
public class Project implements Serializable {
    private static final long serialVersionUID = 1L;

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Key id;

    private String title;
    private String description;
    private Date createdAt;

    // Section 1
    // @OneToMany(mappedBy = "project")
    // private List<Task> tasks;

    // ...
}

@Entity
public class Task implements Serializable {
    private static final long serialVersionUID = 1L;

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private Key id;

    private String shortDescription;
    private String longDescription;
    private Date createdAt;
    private Date dueDate;
    private boolean completed;

    // Section 2
    // @ManyToOne
    // @JoinColumn(name = "id_project")
    // private Project project;

    // ...
}
The classes above work fine as implemented (with Section 1 and Section 2 commented out). However, what I want to do is relate Task objects to a Project. Whenever I remove the comments and activate Section 1 and Section 2, the errors below occur.
The error appears for Project operations
HTTP ERROR 500
Problem accessing /api/project. Reason:
Could not initialize class com.aspect.todo.dao.EMFService
Caused by:
java.lang.NoClassDefFoundError: Could not initialize class com.aspect.todo.dao.EMFService
at com.aspect.todo.dao.Dao.getProjects(Dao.java:144)
at com.aspect.todo.server.ProjectService.get(ProjectService.java:23)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
...
The error appears for Task operations
HTTP ERROR 500
Problem accessing /api/task. Reason:
INTERNAL_SERVER_ERROR
Caused by:
java.lang.ExceptionInInitializerError
at com.aspect.todo.dao.Dao.getTasks(Dao.java:98)
at com.aspect.todo.server.TaskService.get(TaskService.java:24)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
...
Caused by:
javax.persistence.PersistenceException: Provider error. Provider: org.datanucleus.store.appengine.jpa.DatastorePersistenceProvider
at javax.persistence.Persistence.createFactory(Persistence.java:176)
at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:112)
at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:66
...
Caused by:
Errors were encountered when initialising the specified MetaData. See the nested exceptions for details
org.datanucleus.exceptions.NucleusUserException: Errors were encountered when initialising the specified MetaData. See the nested exceptions for details
at org.datanucleus.metadata.MetaDataManager.initialiseFileMetaDataForUse(MetaDataManager.java:892)
at org.datanucleus.metadata.MetaDataManager.loadPersistenceUnit(MetaDataManager.java:794)
at org.datanucleus.jpa.EntityManagerFactoryImpl.initialisePMF(EntityManagerFactoryImpl.java:488)
...
Caused by:
Found Meta-Data for class com.aspect.todo.model.Task but this class is not enhanced!! Please enhance the class before running DataNucleus.
org.datanucleus.exceptions.NucleusUserException: Found Meta-Data for class com.aspect.todo.model.Task but this class is not enhanced!! Please enhance the class before running DataNucleus.
at org.datanucleus.metadata.MetaDataManager.initialiseClassMetaData(MetaDataManager.java:2225)
at org.datanucleus.metadata.MetaDataManager.initialiseFileMetaData(MetaDataManager.java:2176)
at org.datanucleus.metadata.MetaDataManager.initialiseFileMetaDataForUse(MetaDataManager.java:881)
...
The weird thing is: if I start with these sections commented out, compile and run, and then activate only Section 2 and rerun, it works fine. If I close and reopen Eclipse and try again, the error occurs again.
NOTE: Datanucleus JDO/JPA version: v1
Found Meta-Data for class com.aspect.todo.model.Task but this class is not enhanced!!
Please enhance the class before running DataNucleus.
The fact is that the classes are not enhanced at runtime, and you have to find a way to enhance them prior to runtime that works for your environment: Maven, Ant, the command line, the DataNucleus Eclipse plugin, or the GAE Eclipse plugin.
If you are using Ant, add the lines below to your build.xml before the </project> closing tag:
<target name="datanucleusenhance" depends="compile"
        description="Performs JDO enhancement on compiled data classes.">
    <enhance_war war="war" />
</target>
Then build with the command: ant datanucleusenhance runserver
Hope this is useful; it took me a while to find the solution.
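If you build with Maven instead of Ant, DataNucleus also ships a Maven plugin whose enhance goal can be bound to the build. A rough sketch (the version shown is an assumption; align it with the DataNucleus release your project uses):
<plugin>
  <groupId>org.datanucleus</groupId>
  <artifactId>maven-datanucleus-plugin</artifactId>
  <!-- version is an assumption; match your datanucleus-core release -->
  <version>1.1.4</version>
  <configuration>
    <api>JPA</api>
    <verbose>true</verbose>
  </configuration>
  <executions>
    <execution>
      <!-- run enhancement right after compilation, before packaging -->
      <phase>process-classes</phase>
      <goals>
        <goal>enhance</goal>
      </goals>
    </execution>
  </executions>
</plugin>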
Just upgrade your GAE SDK, it solved the problem for me.
Found Meta-Data for class com.aspect.todo.model.Task but this class is not enhanced!!
Please enhance the class before running DataNucleus.
I'll show here a way to enhance classes in a Gradle environment. Use the following configuration.
Dependencies:
dependencies {
    appengineSdk 'com.google.appengine:appengine-java-sdk:<version>'
    compile 'org.datanucleus:datanucleus-enhancer:<version>'
    compile 'org.datanucleus:datanucleus-api-jpa:<version>' // or datanucleus-api-jdo
    /* other dependencies */
}
Plugin:
(for gradle-appengine-plugin version 1.9.5 or higher)
appengine {
    enhancer {
        version = "v2"
        api = "jpa" // or "jdo"
        enhanceOnBuild = true
    }
    /*...*/
}
(for older versions of gradle-appengine-plugin)
appengine {
    enhancerVersion = "v2"
}
To make enhance run automatically before creating the war in your build.gradle file:
war.dependsOn appengineEnhance
If you have problems executing task :appengineEnhance, try to run it with --stacktrace (to view the stacktrace) or --info (to find the location of the error log file) options.

duplicate jdoconfig.xml on classpath during unit tests in Intellij IDEA (9.x pre-release)

As described here, IDEA is adding the jdoconfig.xml to the class path twice during unit test runs. Can anyone suggest a work-around for this?
/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/bin/java -Dgwt.args=-gen /Users/jgc/Library/Caches/IntelliJIDEA9M1/gwt/recipes.recipesea699bb7/Recipes.4dd34334/test/gen -out /Users/jgc/Library/Caches/IntelliJIDEA9M1/gwt/recipes.recipesea699bb7/Recipes.4dd34334/test/www -Dfile.encoding=MacRoman -classpath /Users/jgc/gwt-versions/gwt-mac-1.7.0/gwt-dev-mac.jar:/Users/jgc/Documents/beanstalk/recipes/test:/Users/jgc/Documents/beanstalk/recipes/src:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/lib/deploy.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/lib/dt.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/lib/javaws.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/lib/jce.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/lib/management-agent.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/lib/plugin.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/lib/sa-jdi.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/../Classes/alt-rt.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/../Classes/charsets.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/../Classes/classes.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/../Classes/dt.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/../Classes/jce.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/../Classes/jconsole.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/../Classes/jsse.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/../Classes/laf.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/../Classes/management-agent.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/../Classes/ui.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/lib/ext/apple_provider.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/lib/ext/dnsns.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/lib/ext/localedata.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/lib/ext/sunjce_provider.jar:/System/Library/Frameworks/JavaVM.framework/Versions/CurrentJDK/Home/lib/ext/sunpkcs11.jar:/Users/jgc/Documents/beanstalk/recipes/out/test/Recipes:/Users/jgc/Documents/beanstalk/recipes/out/production/Recipes:/Users/jgc/gwt-versions/gwt-mac-1.7.0/gwt-user.jar:/Applications/IntelliJ IDEA 
9.0M1.app/lib/javaee_6.jar:/Users/jgc/Documents/beanstalk/recipes/war/WEB-INF/lib/appengine-api-1.0-sdk-1.2.5.jar:/Users/jgc/Documents/beanstalk/recipes/war/WEB-INF/lib/geronimo-jpa_3.0_spec-1.1.1.jar:/Users/jgc/Documents/beanstalk/recipes/war/WEB-INF/lib/datanucleus-jpa-1.1.5.jar:/Users/jgc/Documents/beanstalk/recipes/war/WEB-INF/lib/datanucleus-core-1.1.5.jar:/Users/jgc/Documents/beanstalk/recipes/war/WEB-INF/lib/appengine-api-labs-1.2.5.jar:/Users/jgc/Documents/beanstalk/recipes/war/WEB-INF/lib/gwttheme.jar:/Users/jgc/Documents/beanstalk/recipes/war/WEB-INF/lib/datanucleus-appengine-1.0.3.jar:/Users/jgc/Documents/beanstalk/recipes/war/WEB-INF/lib/geronimo-jta_1.1_spec-1.1.1.jar:/Users/jgc/Documents/beanstalk/recipes/war/WEB-INF/lib/jdo2-api-2.3-eb.jar:/Users/jgc/Documents/beanstalk/recipes/war/WEB-INF/lib/gwt-servlet.jar:/Users/jgc/appengine-java-sdk-1.2.5/lib:/Users/jgc/appengine-java-sdk-1.2.5/lib/appengine-tools-api.jar:/Applications/IntelliJ IDEA 9.0M1.app/lib/junit-4.6.jar:/Users/jgc/appengine-java-sdk-1.2.5/lib/impl/appengine-api-labs.jar:/Users/jgc/appengine-java-sdk-1.2.5/lib/impl/appengine-api.jar:/Users/jgc/appengine-java-sdk-1.2.5/lib/impl/appengine-api-stubs.jar:/Users/jgc/appengine-java-sdk-1.2.5/lib/impl:/Users/jgc/appengine-java-sdk-1.2.5/lib/impl/appengine-local-runtime.jar:/Applications/IntelliJ IDEA 9.0M1.app/lib/idea_rt.jar com.intellij.rt.execution.junit.JUnitStarter -ideVersion5 -junit4 com.jgc.recipes.server.RecipeDAOTest
java.lang.ExceptionInInitializerError
at com.jgc.recipes.server.RecipeDAO.persistRecipe(RecipeDAO.java:27)
at com.jgc.recipes.server.RecipeDAOTest.testFoo(RecipeDAOTest.java:47)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:91)
at org.junit.runner.JUnitCore.run(JUnitCore.java:159)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:43)
Caused by: javax.jdo.JDOFatalUserException: Duplicate PMF name "transactions-optional" found in file:/Users/jgc/Documents/beanstalk/recipes/src/META-INF/jdoconfig.xml and file:/Users/jgc/Documents/beanstalk/recipes/out/production/Recipes/META-INF/jdoconfig.xml.
at javax.jdo.JDOHelper.getNamedPMFProperties(JDOHelper.java:1300)
at javax.jdo.JDOHelper.getPropertiesFromJdoconfig(JDOHelper.java:1232)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:1079)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:914)
at com.jgc.recipes.server.PMF.<clinit>(PMF.java:7)
... 20 more
Process finished with exit code 255
For the time being you could pass the PersistenceManagerFactory's properties programmatically, i.e. without using a jdoconfig.xml file, e.g.:
private static PersistenceManagerFactory pmfInstance;

static {
    Map<String, String> props = new HashMap<String, String>();
    props.put("javax.jdo.PersistenceManagerFactoryClass",
            "org.datanucleus.store.appengine.jdo.DatastoreJDOPersistenceManagerFactory");
    props.put("javax.jdo.option.ConnectionURL", "appengine");
    props.put("javax.jdo.option.NontransactionalRead", "true");
    props.put("javax.jdo.option.NontransactionalWrite", "true");
    props.put("javax.jdo.option.RetainValues", "true");
    props.put("datanucleus.appengine.autoCreateDatastoreTxns", "true");
    pmfInstance = JDOHelper.getPersistenceManagerFactory(props);
}

private PMF() {
}

public static PersistenceManagerFactory get() {
    return pmfInstance;
}
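Usage is then unchanged from a jdoconfig.xml-backed PMF helper; for example (Recipe is a stand-in for whatever persistent class the DAO stores):
public void storeRecipe(Recipe recipe) {
    PersistenceManager pm = PMF.get().getPersistenceManager();
    try {
        pm.makePersistent(recipe);
    } finally {
        pm.close(); // always release the PersistenceManager
    }
}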
