I fetch a list of objects, use multithreading to fetch each object's children and do some processing, then collect all results and send them to the UI. The threads produce inconsistent results: some of them throw an error while others work fine.
Sample code:
Entities: Parent, Child
Parent {
    Ref<Child> childKey;

    public Child getChild() {
        return childKey.get();
    }
}
List<Parent> parents; // fetched from the datastore
ExecutorService executorService =
        Executors.newFixedThreadPool(20, ThreadManager.currentRequestThreadFactory());
List<Future<ParentTO>> futures = new ArrayList<>();
for (Parent parent : parents) {
    Future<ParentTO> future = executorService.submit(() -> {
        return ObjectifyService.run(new Work<ParentTO>() {
            public ParentTO run() {
                parent.getChild(); // follows the Ref<Child>; this is the call that fails
                ParentTO to = new ParentTO();
                return to;
            }
        });
    });
    futures.add(future); // collect the futures so the results can be gathered later
}
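The results are then collected on the request thread, which is where the ExecutionException below surfaces. A minimal sketch of that step (the question only describes it in prose; the shutdown handling is an assumption):
List<ParentTO> results = new ArrayList<>();
for (Future<ParentTO> future : futures) {
    // future.get() rethrows any failure from the worker wrapped in an ExecutionException
    results.add(future.get());
}
executorService.shutdown();
// results are then mapped and sent to the UI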
The error only shows on App Engine, and only for some threads; it works fine on the local server. If I reload the page I see different results every time. Without threads it works fine.
java.util.concurrent.ExecutionException: com.googlecode.objectify.LoadException: Error loading Parent("FND1clkiTUa_lWsJow-Dxwdmadma")/Child(1): null
at java.util.concurrent.FutureTask.report(FutureTask.java:122)
at java.util.concurrent.FutureTask.get(FutureTask.java:206)
at com.netkiller.erp.service.ParentService.ParentsToApprovalTOs(ParentService.java:1720)
at com.netkiller.erp.service.ParentService.getSharedParentsUI(ParentService.java:2000)
at com.netkiller.common.controller.ParentController.fetchSharedParents(ParentController.java:1266)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.web.method.support.InvocableHandlerMethod.invoke(InvocableHandlerMethod.java:214)
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:132)
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:104)
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandleMethod(RequestMappingHandlerAdapter.java:749)
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:690)
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:83)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:945)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:876)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:961)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:852)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:837)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:848)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1772)
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:88)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:108)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
at com.netkiller.common.filter.OpenIdFilter.doFilter(OpenIdFilter.java:859)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
at com.netkiller.common.filter.ServiceFilter.doFilter(ServiceFilter.java:100)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
at com.netkiller.common.filter.SecureProtocolFilter.doFilter(SecureProtocolFilter.java:35)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
at com.googlecode.objectify.ObjectifyFilter.doFilter(ObjectifyFilter.java:48)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
at com.google.apphosting.utils.servlet.JdbcMySqlConnectionCleanupFilter.doFilter(JdbcMySqlConnectionCleanupFilter.java:60)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1759)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:524)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:226)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
at com.google.apphosting.runtime.jetty9.ParseBlobUploadHandler.handle(ParseBlobUploadHandler.java:119)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1182)
at com.google.apphosting.runtime.jetty9.AppEngineWebAppContext.doHandle(AppEngineWebAppContext.java:183)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:512)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at com.google.apphosting.runtime.jetty9.AppVersionHandlerMap.handle(AppVersionHandlerMap.java:293)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
at org.eclipse.jetty.server.Server.handle(Server.java:539)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:333)
at com.google.apphosting.runtime.jetty9.RpcConnection.handle(RpcConnection.java:213)
at com.google.apphosting.runtime.jetty9.RpcConnector.serviceRequest(RpcConnector.java:81)
at com.google.apphosting.runtime.jetty9.JettyServletEngineAdapter.serviceRequest(JettyServletEngineAdapter.java:123)
at com.google.apphosting.runtime.JavaRuntime$RequestRunnable.dispatchServletRequest(JavaRuntime.java:692)
at com.google.apphosting.runtime.JavaRuntime$RequestRunnable.dispatchRequest(JavaRuntime.java:655)
at com.google.apphosting.runtime.JavaRuntime$RequestRunnable.run(JavaRuntime.java:625)
at com.google.apphosting.runtime.JavaRuntime$NullSandboxRequestRunnable.run(JavaRuntime.java:817)
at com.google.apphosting.runtime.ThreadGroupPool$PoolEntry.run(ThreadGroupPool.java:269)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.googlecode.objectify.LoadException: Error loading Parent("FND1clkiTUa_lWsJow-Dxwdmadma")/Child(1): null
at com.googlecode.objectify.impl.EntityMetadata.load(EntityMetadata.java:78)
at com.googlecode.objectify.impl.LoadEngine.load(LoadEngine.java:185)
at com.googlecode.objectify.impl.LoadEngine$1.nowUncached(LoadEngine.java:141)
at com.googlecode.objectify.impl.LoadEngine$1.nowUncached(LoadEngine.java:127)
at com.googlecode.objectify.util.ResultCache.now(ResultCache.java:30)
at com.googlecode.objectify.impl.Round$1.nowUncached(Round.java:71)
at com.googlecode.objectify.util.ResultCache.now(ResultCache.java:30)
at com.googlecode.objectify.impl.LoaderImpl.now(LoaderImpl.java:251)
at com.googlecode.objectify.impl.ref.LiveRef.get(LiveRef.java:47)
at com.netkiller.erp.domain.Parent.getChild(Parent.java:187)
at com.netkiller.common.dto.ApprovalTO.<init>(ApprovalTO.java:503)
at com.netkiller.erp.service.ParentService$2.run(ParentService.java:1708)
at com.netkiller.erp.service.ParentService$2.run(ParentService.java:1)
at com.googlecode.objectify.ObjectifyService.run(ObjectifyService.java:81)
at com.netkiller.erp.service.ParentService.lambda$8(ParentService.java:1706)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at com.google.apphosting.runtime.ApiProxyImpl$CurrentRequestThreadFactory.lambda$newThread$0(ApiProxyImpl.java:1213)
at java.security.AccessController.doPrivileged(Native Method)
at com.google.apphosting.runtime.ApiProxyImpl$CurrentRequestThreadFactory.lambda$newThread$1(ApiProxyImpl.java:1209)
at java.lang.Thread.run(Thread.java:748)
at com.google.apphosting.runtime.ApiProxyImpl$CurrentRequestThread.run(ApiProxyImpl.java:1185)
Caused by: java.util.NoSuchElementException
at java.util.ArrayDeque.removeLast(ArrayDeque.java:295)
at com.googlecode.objectify.impl.translate.LoadContext.exitContainerContext(LoadContext.java:141)
at com.googlecode.objectify.impl.translate.ClassPopulator.load(ClassPopulator.java:119)
at com.googlecode.objectify.impl.translate.ClassTranslator.loadSafe(ClassTranslator.java:122)
at com.googlecode.objectify.impl.translate.ClassTranslator.loadSafe(ClassTranslator.java:21)
at com.googlecode.objectify.impl.translate.NullSafeTranslator.load(NullSafeTranslator.java:17)
at com.googlecode.objectify.impl.EntityMetadata.load(EntityMetadata.java:74)
... 22 more
Thanks for any insight on the issue.
It's hard to tell exactly what's going on from the provided code, but it's easy to accidentally contaminate data when operating across multiple threads.
To reduce the chance that you're accidentally pulling data from the wrong thread's session, use ofy().load() to load everything instead of following refs. You can pass a Ref to ofy().load().ref(theRef).
If you always call ofy(), you will always get an Objectify instance that is correct for your thread of execution. Try that out, and if you still have an issue, we can continue this discussion with whatever error message you see.
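For illustration, inside the worker that would look roughly like this (a minimal sketch; theRef stands for the parent's Ref&lt;Child&gt;, which the question keeps private, so the accessor is hypothetical):
// inside ObjectifyService.run(...) on the worker thread
// (static import: com.googlecode.objectify.ObjectifyService.ofy)
public ParentTO run() {
    // load through this thread's own ofy() session instead of calling theRef.get()
    Child child = ofy().load().ref(theRef).now();
    return new ParentTO();
}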
Related
While submitting a Flink job on the Dataproc cluster I am getting the error below. Please find the code and the error. I am using Flink version 1.9.3.
The program finished with the following exception:
org.apache.flink.client.program.ProgramInvocationException: Could not retrieve the execution result. (JobID: f064ceaa5b318fdad9a77b2723b9ee64)
at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:255)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
at org.apache.flink.streaming.api.environment.StreamContextEnvironment.execute(StreamContextEnvironment.java:60)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1507)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1489)
at org.flink.ReadFromPubsub.main(ReadFromPubsub.java:28)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:604)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:466)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:274)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:746)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:273)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:205)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1008)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1081)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1926)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1081)
Caused by: org.apache.flink.runtime.client.JobSubmissionException: Failed to submit JobGraph.
at org.apache.flink.client.program.rest.RestClusterClient.lambda$submitJob$8(RestClusterClient.java:391)
at java.util.concurrent.CompletableFuture.uniExceptionally(CompletableFuture.java:884)
at java.util.concurrent.CompletableFuture$UniExceptionally.tryFire(CompletableFuture.java:866)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.completeExceptionally(CompletableFuture.java:1990)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$8(FutureUtils.java:263)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.rest.util.RestClientException: [Internal server error., <Exception on server side:
org.apache.flink.runtime.client.JobSubmissionException: Failed to submit job.
at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$internalSubmitJob$2(Dispatcher.java:333)
at java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:836)
at java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:811)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:40)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:44)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.RuntimeException: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:36)
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1604)
... 6 more
Caused by: org.apache.flink.runtime.client.JobExecutionException: Could not set up JobManager
at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:152)
at org.apache.flink.runtime.dispatcher.DefaultJobManagerRunnerFactory.createJobManagerRunner(DefaultJobManagerRunnerFactory.java:83)
at org.apache.flink.runtime.dispatcher.Dispatcher.lambda$createJobManagerRunner$5(Dispatcher.java:376)
at org.apache.flink.util.function.CheckedSupplier.lambda$unchecked$0(CheckedSupplier.java:34)
... 7 more
Caused by: org.apache.flink.runtime.client.JobExecutionException: Could not instantiate configured state backend
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:303)
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:106)
at org.apache.flink.runtime.scheduler.LegacyScheduler.createExecutionGraph(LegacyScheduler.java:207)
at org.apache.flink.runtime.scheduler.LegacyScheduler.createAndRestoreExecutionGraph(LegacyScheduler.java:184)
at org.apache.flink.runtime.scheduler.LegacyScheduler.<init>(LegacyScheduler.java:176)
at org.apache.flink.runtime.scheduler.LegacySchedulerFactory.createInstance(LegacySchedulerFactory.java:70)
at org.apache.flink.runtime.jobmaster.JobMaster.createScheduler(JobMaster.java:278)
at org.apache.flink.runtime.jobmaster.JobMaster.<init>(JobMaster.java:266)
at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:98)
at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.createJobMasterService(DefaultJobMasterServiceFactory.java:40)
at org.apache.flink.runtime.jobmaster.JobManagerRunner.<init>(JobManagerRunner.java:146)
... 10 more
Caused by: org.apache.flink.configuration.IllegalConfigurationException: Cannot create the RocksDB state backend: The configuration does not specify the checkpoint directory 'state.checkpoints.dir'
at org.apache.flink.contrib.streaming.state.RocksDBStateBackendFactory.createFromConfig(RocksDBStateBackendFactory.java:44)
at org.apache.flink.contrib.streaming.state.RocksDBStateBackendFactory.createFromConfig(RocksDBStateBackendFactory.java:32)
at org.apache.flink.runtime.state.StateBackendLoader.loadStateBackendFromConfig(StateBackendLoader.java:154)
at org.apache.flink.runtime.state.StateBackendLoader.fromApplicationOrConfigOrDefault(StateBackendLoader.java:219)
at org.apache.flink.runtime.executiongraph.ExecutionGraphBuilder.buildGraph(ExecutionGraphBuilder.java:299)
... 20 more
End of exception on server side>]
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:389)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:373)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Code snippet I executed:
public class ReadFromPubsub
{
    public static void main(String args[]) throws Exception
    {
        System.out.println("Flink Pubsub Code Read 1");
        StreamExecutionEnvironment streamExecEnv = StreamExecutionEnvironment.getExecutionEnvironment();
        System.out.println("Flink Pubsub Code Read 2");
        DeserializationSchema<String> deserializer = new SimpleStringSchema();
        System.out.println("Flink Pubsub Code Read 3");
        SourceFunction<String> pubsubSource = PubSubSource.newBuilder()
                .withDeserializationSchema(deserializer)
                .withProjectName("vz-it-np-gudv-dev-vzntdo-0")
                .withSubscriptionName("subscription1")
                .build();
        System.out.println("Flink Pubsub Code Read 4");
        streamExecEnv.addSource(pubsubSource).print();
        streamExecEnv.enableCheckpointing(10);
        System.out.println("Flink Pubsub Code Read 5");
        streamExecEnv.execute();
    }
}
I can see all the print statements during the execution of the code; the error appears after the last print statement.
All of the exceptions above need to be resolved.
Generally, you would supply the appropriate savepoint/checkpoint directory in your Flink configuration (flink-conf.yaml), as detailed in the docs. If you aren't currently setting it, you can do so in the following ways:
Via flink-conf.yaml (Preferred)
state.checkpoints.dir: "file://example/checkpoints"
Programmatically (within the job):
Configuration configuration = new Configuration();
configuration.setString("state.checkpoints.dir", "file://example/checkpoints");
StreamExecutionEnvironment streamExecEnv =
        StreamExecutionEnvironment.getExecutionEnvironment(configuration);
Via the CLI:
flink run -Dstate.checkpoints.dir="/example/checkpoints" your-job.jar
Additionally, if you didn't actually want to perform checkpointing, you could likely remove the following configuration within your job:
streamExecEnv.enableCheckpointing(10);
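If you would rather keep everything inside the job (the failure comes specifically from loading the RocksDB backend), another option is to set the state backend and its checkpoint URI programmatically. This is only a sketch for the Flink 1.9-era API; the checkpoint path and the flink-statebackend-rocksdb dependency are assumptions, not something from the question:
import org.apache.flink.contrib.streaming.state.RocksDBStateBackend;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

// inside main(), which already declares "throws Exception"
StreamExecutionEnvironment streamExecEnv = StreamExecutionEnvironment.getExecutionEnvironment();
// giving the backend an explicit checkpoint directory means
// 'state.checkpoints.dir' no longer has to come from flink-conf.yaml
streamExecEnv.setStateBackend(new RocksDBStateBackend("hdfs:///flink/checkpoints"));
streamExecEnv.enableCheckpointing(10_000); // checkpoint interval in ms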
I started "playing" with Apache Flink recently and put together a small application to start testing the framework. I'm currently running into a problem when trying to serialize a regular POJO class:
@Getter
@ToString
@EqualsAndHashCode
@NoArgsConstructor
@AllArgsConstructor
public final class Species {
    private String name;
    private List<String> abilities;
}
From the stack trace I can tell that the List type cannot be serialized, but according to Flink's documentation that shouldn't be the case. This is the stack trace:
2021-11-20 11:52:09,195 |- WARN in org.apache.flink.runtime.taskmanager.Task:1097 [Source: Collection Source (1/1)#0] - Source: Collection Source (1/1)#0 (4073b9fb97691d505e5a5557bf7e081b) switched from RUNNING to FAILED with failure cause: java.io.IOException: Failed to deserialize an element from the source. If you are using user-defined serialization (Value and Writable types), check the serialization functions.
Serializer is org.apache.flink.api.java.typeutils.runtime.kryo.KryoSerializer#5c0e17e6
at org.apache.flink.streaming.api.functions.source.FromElementsFunction.run(FromElementsFunction.java:222)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:116)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:73)
at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:323)
Caused by: com.esotericsoftware.kryo.KryoException: java.lang.UnsupportedOperationException
Serialization trace:
abilities (org.acme.domain.Species)
at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:528)
at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:761)
at org.apache.flink.api.java.typeutils.runtime.kryo.KryoSerializer.deserialize(KryoSerializer.java:354)
at org.apache.flink.streaming.api.functions.source.FromElementsFunction.run(FromElementsFunction.java:220)
... 3 more
Caused by: java.lang.UnsupportedOperationException
at java.base/java.util.ImmutableCollections.uoe(ImmutableCollections.java:72)
at java.base/java.util.ImmutableCollections$AbstractImmutableCollection.add(ImmutableCollections.java:76)
at com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:109)
at com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:22)
at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:679)
at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
... 7 more
Exception in thread "main" org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.runtime.minicluster.MiniClusterJobClient.lambda$getJobExecutionResult$3(MiniClusterJobClient.java:137)
at java.base/java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:642)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073)
at org.apache.flink.runtime.rpc.akka.AkkaInvocationHandler.lambda$invokeRpc$0(AkkaInvocationHandler.java:250)
at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073)
at org.apache.flink.util.concurrent.FutureUtils.doForward(FutureUtils.java:1389)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.lambda$null$1(ClassLoadingUtils.java:93)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.lambda$guardCompletionWithContextClassLoader$2(ClassLoadingUtils.java:92)
at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073)
at org.apache.flink.runtime.concurrent.akka.AkkaFutureUtils$1.onComplete(AkkaFutureUtils.java:47)
at akka.dispatch.OnComplete.internal(Future.scala:300)
at akka.dispatch.OnComplete.internal(Future.scala:297)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:224)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:221)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
at org.apache.flink.runtime.concurrent.akka.AkkaFutureUtils$DirectExecutionContext.execute(AkkaFutureUtils.java:65)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:68)
at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:284)
at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:284)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:284)
at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:621)
at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:24)
at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:23)
at scala.concurrent.Future.$anonfun$andThen$1(Future.scala:532)
at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:29)
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:29)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:63)
at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:100)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:81)
at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:100)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:49)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:48)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:228)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:218)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:209)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:679)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:79)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:444)
at jdk.internal.reflect.GeneratedMethodAccessor11.invoke(Unknown Source)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRpcInvocation$1(AkkaRpcActor.java:316)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:83)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:314)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:217)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:123)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:122)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
... 5 more
Caused by: java.io.IOException: Failed to deserialize an element from the source. If you are using user-defined serialization (Value and Writable types), check the serialization functions.
Serializer is org.apache.flink.api.java.typeutils.runtime.kryo.KryoSerializer#5c0e17e6
at org.apache.flink.streaming.api.functions.source.FromElementsFunction.run(FromElementsFunction.java:222)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:116)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:73)
at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:323)
Caused by: com.esotericsoftware.kryo.KryoException: java.lang.UnsupportedOperationException
Serialization trace:
abilities (org.acme.domain.Species)
at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:528)
at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:761)
at org.apache.flink.api.java.typeutils.runtime.kryo.KryoSerializer.deserialize(KryoSerializer.java:354)
at org.apache.flink.streaming.api.functions.source.FromElementsFunction.run(FromElementsFunction.java:220)
... 3 more
Caused by: java.lang.UnsupportedOperationException
at java.base/java.util.ImmutableCollections.uoe(ImmutableCollections.java:72)
at java.base/java.util.ImmutableCollections$AbstractImmutableCollection.add(ImmutableCollections.java:76)
at com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:109)
at com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:22)
at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:679)
at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
... 7 more
Process finished with exit code 1
I've tried calling enableForceKryo() on StreamExecutionEnvironment.getExecutionEnvironment().getConfig(), but it didn't change anything.
What am I missing here? I'm using Apache Flink 1.14.0 with Java (Eclipse Temurin) 11.x.
Since the issue is with Kryo serialization, you could register your own custom Kryo serializers. But in my experience that hasn't worked all that well, for reasons I don't completely understand (the custom serializers aren't always used). Kryo serialization is also going to be much slower than creating a POJO that Flink can serialize with its built-in support. So add setters for every field, verify that nothing gets logged about class Species missing something that qualifies it for fast serialization, and you should be all set.
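A minimal sketch of a Flink-POJO-friendly version of the class, assuming Lombok stays in place; the @Setter annotation is my addition rather than something shown in the question:
import java.util.List;

import lombok.AllArgsConstructor;
import lombok.EqualsAndHashCode;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
import lombok.ToString;

@Getter
@Setter             // setters for every field so Flink can treat this as a POJO
@ToString
@EqualsAndHashCode
@NoArgsConstructor  // Flink also requires the public no-arg constructor
@AllArgsConstructor
public final class Species {
    private String name;
    private List<String> abilities;
}
Note that the trace bottoms out in ImmutableCollections...add, so if the test data is built with an immutable list such as List.of(...), passing a mutable copy (e.g. new ArrayList<>(...)) when constructing Species may also be needed.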
I'm currently testing a migration to Java 11 and I'm having issues with the logging component. In Java 8, everything was logged correctly, with the appropriate log level and with logs aggregated per request. However, after moving to Java 11, that is no longer the case.
I went through the following guides:
https://cloud.google.com/appengine/docs/standard/java11/writing-application-logs
https://cloud.google.com/logging/docs/setup/java#the_javautillogging_handler
I made the required changes to our code and now the server crashes upon initialization. I get the following error:
java.lang.AbstractMethodError: Receiver class com.google.api.gax.grpc.InstantiatingGrpcChannelProvider does not define or inherit an implementation of the resolved method 'abstract com.google.api.gax.rpc.TransportChannelProvider withExecutor(java.util.concurrent.Executor)' of interface com.google.api.gax.rpc.TransportChannelProvider.
at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:140)
at com.google.api.gax.rpc.ClientContext.create(ClientContext.java:123)
at com.google.cloud.logging.spi.v2.GrpcLoggingRpc.(GrpcLoggingRpc.java:132)
at com.google.cloud.logging.LoggingOptions$DefaultLoggingRpcFactory.create(LoggingOptions.java:61)
at com.google.cloud.logging.LoggingOptions$DefaultLoggingRpcFactory.create(LoggingOptions.java:55)
at com.google.cloud.ServiceOptions.getRpc(ServiceOptions.java:561)
at com.google.cloud.logging.LoggingOptions.getLoggingRpcV2(LoggingOptions.java:129)
at com.google.cloud.logging.LoggingImpl.(LoggingImpl.java:109)
at com.google.cloud.logging.LoggingOptions$DefaultLoggingFactory.create(LoggingOptions.java:46)
at com.google.cloud.logging.LoggingOptions$DefaultLoggingFactory.create(LoggingOptions.java:41)
at com.google.cloud.ServiceOptions.getService(ServiceOptions.java:541)
at com.google.cloud.logging.LoggingHandler.getLogging(LoggingHandler.java:362)
at com.google.cloud.logging.LoggingHandler.(LoggingHandler.java:195)
at com.google.cloud.logging.LoggingHandler.(LoggingHandler.java:151)
at com.google.cloud.logging.LoggingHandler.(LoggingHandler.java:120)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at java.base/java.lang.Class.newInstance(Class.java:584)
at java.logging/java.util.logging.LogManager.createLoggerHandlers(LogManager.java:1000)
at java.logging/java.util.logging.LogManager$4.run(LogManager.java:970)
at java.logging/java.util.logging.LogManager$4.run(LogManager.java:966)
at java.base/java.security.AccessController.doPrivileged(Native Method)
at java.logging/java.util.logging.LogManager.loadLoggerHandlers(LogManager.java:966)
at java.logging/java.util.logging.LogManager.addLogger(LogManager.java:1199)
at java.logging/java.util.logging.LogManager.demandLogger(LogManager.java:525)
at java.logging/java.util.logging.LogManager.demandLogger(LogManager.java:515)
at java.logging/java.util.logging.Logger.demandLogger(Logger.java:654)
at java.logging/java.util.logging.Logger.getLogger(Logger.java:717)
at java.logging/java.util.logging.Logger.getLogger(Logger.java:701)
at com.altairix.comm.adf.root.log.DefaultLogger.(DefaultLogger.java:26)
at com.altairix.comm.adf.root.log.LoggerFactory.(LoggerFactory.java:7)
at com.altairix.adf.Adf_Server.(Adf_Server.java:234)
at com.altairix.adf.root.servlet.AdfServlet.(AdfServlet.java:98)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at java.base/java.lang.Class.newInstance(Class.java:584)
at org.eclipse.jetty.server.handler.ContextHandler$Context.createInstance(ContextHandler.java:2372)
at org.eclipse.jetty.servlet.ServletContextHandler$Context.createServlet(ServletContextHandler.java:1166)
at org.eclipse.jetty.servlet.ServletHolder.newInstance(ServletHolder.java:1207)
at org.eclipse.jetty.servlet.ServletHolder.initServlet(ServletHolder.java:588)
at org.eclipse.jetty.servlet.ServletHolder.getServlet(ServletHolder.java:472)
at org.eclipse.jetty.servlet.ServletHolder.ensureInstance(ServletHolder.java:767)
at org.eclipse.jetty.servlet.ServletHolder.prepare(ServletHolder.java:752)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:582)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:577)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:223)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1127)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:515)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1061)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.eclipse.jetty.server.Server.handle(Server.java:499)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:311)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:257)
at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:544)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:635)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:555)
at java.base/java.lang.Thread.run(Thread.java:834)
The last call our code makes is: com.altairix.comm.adf.root.log.DefaultLogger.(DefaultLogger.java:26), which is:
logger = java.util.logging.Logger.getLogger(DefaultLogger.class.getName());
I'm not sure whether my logging.properties file is correct, but it was copied from one of the guides above:
.level = INFO
io.grpc.netty.level=INFO
sun.net.level=INFO
com.altairix.adf.root.log.ServerLogger.handlers=com.google.cloud.logging.LoggingHandler
com.altairix.comm.adf.root.log.DefaultLogger.handlers=com.google.cloud.logging.LoggingHandler
com.google.cloud.logging.LoggingHandler.log=custom_log
com.google.cloud.logging.LoggingHandler.level=FINEST
default : ERROR
com.google.cloud.logging.LoggingHandler.flushLevel=ERROR
com.google.cloud.logging.LoggingHandler.resourceType=container
com.google.cloud.logging.LoggingHandler.formatter=java.util.logging.SimpleFormatter
java.util.logging.SimpleFormatter.format=%3$s: %5$s%6$s
I was able to fix this by adding a handler to the logger:
import com.google.cloud.logging.LoggingHandler;
...
logger.addHandler(new LoggingHandler());
and also by removing the reference to the logging.properties file. I don't believe this is the intended use, though, and it seems there is a bug when trying to use the logging.properties file.
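Put together, the programmatic workaround looks roughly like this (a sketch; the logger name comes from the stack trace above, the levels are assumptions):
import java.util.logging.Level;
import java.util.logging.Logger;

import com.google.cloud.logging.LoggingHandler;

// attach the Cloud Logging handler in code instead of via logging.properties
Logger logger = Logger.getLogger(DefaultLogger.class.getName());
LoggingHandler handler = new LoggingHandler();
handler.setLevel(Level.FINEST); // mirror the level logging.properties tried to set
logger.addHandler(handler);
logger.setLevel(Level.INFO);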
I tried to apply a JPA transaction manager with a Spring transaction policy to one Camel route. When the application starts, it throws the issue below. Could you please let me know the cause of the issue and a solution? Also, could you share any quick-starts for Camel JPA transactions in standalone mode? I can only find quick-starts for Fuse and server mode.
[pache.camel.spring.Main.main()] DefaultTypeConverter INFO Loaded 240 type converters
[pache.camel.spring.Main.main()] DefaultRuntimeEndpointRegistry INFO Runtime endpoint registry is in extended mode gathering usage statistics of all incoming and outgoing endpoints (cache limit: 1000)
[pache.camel.spring.Main.main()] JpaComponent INFO Using EntityManagerFactory configured: org.springframework.orm.jpa.LocalEntityManagerFactoryBean#147d849
[pache.camel.spring.Main.main()] JpaComponent INFO Using TransactionManager configured on this component: org.springframework.orm.jpa.JpaTransactionManager#a9b98d
[ERROR] *************************************
[ERROR] Error occurred while running main from: org.apache.camel.spring.Main
[ERROR]
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.camel.maven.RunMojo$1.run(RunMojo.java:458)
at java.lang.Thread.run(Thread.java:744)
Caused by: java.lang.NoSuchMethodError: org.apache.camel.processor.RedeliveryErrorHandler.<init>(Lorg/apache/camel/CamelContext;Lorg/apache/camel/Processor;Lorg/apache/camel/util/CamelLogger;Lorg/apache/camel/Processor;Lorg/apache/camel/processor/RedeliveryPolicy;Lorg/apache/camel/Processor;Ljava/lang/String;ZZLorg/apache/camel/Predicate;Ljava/util/concurrent/ScheduledExecutorService;Lorg/apache/camel/Processor;)V
at org.apache.camel.spring.spi.TransactionErrorHandler.<init>(TransactionErrorHandler.java:70)
at org.apache.camel.spring.spi.TransactionErrorHandlerBuilder.createErrorHandler(TransactionErrorHandlerBuilder.java:110)
at org.apache.camel.spring.spi.SpringTransactionPolicy.createTransactionErrorHandler(SpringTransactionPolicy.java:124)
at org.apache.camel.spring.spi.SpringTransactionPolicy.wrap(SpringTransactionPolicy.java:108)
at org.apache.camel.model.TransactedDefinition.createProcessor(TransactedDefinition.java:162)
at org.apache.camel.model.ProcessorDefinition.makeProcessorImpl(ProcessorDefinition.java:534)
at org.apache.camel.model.ProcessorDefinition.makeProcessor(ProcessorDefinition.java:495)
at org.apache.camel.model.ProcessorDefinition.addRoutes(ProcessorDefinition.java:219)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:1069)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:196)
at org.apache.camel.impl.DefaultCamelContext.startRoute(DefaultCamelContext.java:974)
at org.apache.camel.impl.DefaultCamelContext.startRouteDefinitions(DefaultCamelContext.java:3301)
at org.apache.camel.impl.DefaultCamelContext.doStartCamel(DefaultCamelContext.java:3024)
at org.apache.camel.impl.DefaultCamelContext.access$000(DefaultCamelContext.java:175)
at org.apache.camel.impl.DefaultCamelContext$2.call(DefaultCamelContext.java:2854)
at org.apache.camel.impl.DefaultCamelContext$2.call(DefaultCamelContext.java:2850)
at org.apache.camel.impl.DefaultCamelContext.doWithDefinedClassLoader(DefaultCamelContext.java:2873)
at org.apache.camel.impl.DefaultCamelContext.doStart(DefaultCamelContext.java:2850)
at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:61)
at org.apache.camel.impl.DefaultCamelContext.start(DefaultCamelContext.java:2819)
at org.apache.camel.spring.SpringCamelContext.maybeStart(SpringCamelContext.java:270)
at org.apache.camel.spring.SpringCamelContext.onApplicationEvent(SpringCamelContext.java:136)
at org.apache.camel.spring.CamelContextFactoryBean.onApplicationEvent(CamelContextFactoryBean.java:340)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:96)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:334)
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:954)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:482)
at org.springframework.context.support.FileSystemXmlApplicationContext.<init>(FileSystemXmlApplicationContext.java:140)
at org.springframework.context.support.FileSystemXmlApplicationContext.<init>(FileSystemXmlApplicationContext.java:94)
at org.apache.camel.spring.Main.createDefaultApplicationContext(Main.java:205)
at org.apache.camel.spring.Main.doStart(Main.java:154)
at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:61)
at org.apache.camel.main.MainSupport.run(MainSupport.java:138)
at org.apache.camel.main.MainSupport.run(MainSupport.java:390)
at org.apache.camel.spring.Main.main(Main.java:87)
... 6 more
[ERROR] *************************************
[WARNING] thread Thread[Timer-0,5,org.apache.camel.spring.Main] was interrupted but is still alive after waiting at least 15000msecs
[WARNING] thread Thread[Timer-0,5,org.apache.camel.spring.Main] will linger despite being asked to die via interruption
[WARNING] thread Thread[derby.rawStoreDaemon,5,derby.daemons] will linger despite being asked to die via interruption
[WARNING] NOTE: 2 thread(s) did not finish despite being asked to via interruption. This is not a problem with exec:java, it is a problem with the running code. Although not serious, it should be remedied.
[WARNING] Couldn't destroy threadgroup org.apache.camel.maven.RunMojo$IsolatedThreadGroup[name=org.apache.camel.spring.Main,maxpri=10]
java.lang.IllegalThreadStateException
at java.lang.ThreadGroup.destroy(ThreadGroup.java:778)
at org.apache.camel.maven.RunMojo.execute(RunMojo.java:491)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:137)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:154)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:146)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:956)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:290)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:194)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
[INFO] ------------------------------------------------------------------------
I have been trying to connect Eclipse Kura to the AWS IoT platform but am not able to connect. I have referred to this documentation: https://eclipse.github.io/kura/cloud/kura-aws-cloud.html. Everything works until I try to configure the SSL parameters through the Kura web UI. After entering the correct values for the storage alias, private key, and certificate, it gives me the following exception:
ErrorLogger INFO: ILLEGAL_ARGUMENT
org.eclipse.kura.web.shared.GwtKuraException: ILLEGAL_ARGUMENT
at Unknown.tg(<anonymous>)
at Unknown.Bg(<anonymous>)
at Unknown.Sfc(<anonymous>)
at Unknown.Xfc(<anonymous>)
at Unknown.hdb(<anonymous>)
at Unknown.ddb(<anonymous>)
at Unknown.Bbb(<anonymous>)
at Unknown.jcb(<anonymous>)
at Unknown.sr(<anonymous>)
at Unknown.Gr(<anonymous>)
at Unknown.<anonymous>(<anonymous>)
at Unknown.Ch(<anonymous>)
at Unknown.Fh(<anonymous>)
at Unknown.<anonymous>(<anonymous>)
Following the other steps in the documentation and trying to connect, I get the following exception in the Kura console:
org.eclipse.kura.KuraConnectException: "Connection failed. Cannot connect"
at org.eclipse.kura.core.data.transport.mqtt.MqttDataTransport.connect(MqttDataTransport.java:333)
at org.eclipse.kura.core.data.DataServiceImpl.connect(DataServiceImpl.java:403)
at org.eclipse.kura.web.server.GwtStatusServiceImpl.connectDataService(GwtStatusServiceImpl.java:93)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.google.gwt.user.server.rpc.RPC.invokeAndEncodeResponse(RPC.java:587)
at com.google.gwt.user.server.rpc.RemoteServiceServlet.processCall(RemoteServiceServlet.java:333)
at com.google.gwt.user.server.rpc.RemoteServiceServlet.processCall(RemoteServiceServlet.java:303)
at com.google.gwt.user.server.rpc.RemoteServiceServlet.processPost(RemoteServiceServlet.java:373)
at com.google.gwt.user.server.rpc.AbstractRemoteServiceServlet.doPost(AbstractRemoteServiceServlet.java:62)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:707)
at org.eclipse.kura.web.server.OsgiRemoteServiceServlet.service(OsgiRemoteServiceServlet.java:43)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at org.eclipse.equinox.http.servlet.internal.HttpServiceRuntimeImpl$LegacyServlet.service(HttpServiceRuntimeImpl.java:1221)
at org.eclipse.equinox.http.servlet.internal.registration.EndpointRegistration.service(EndpointRegistration.java:153)
at org.eclipse.equinox.http.servlet.internal.servlet.ResponseStateHandler.processRequest(ResponseStateHandler.java:62)
at org.eclipse.equinox.http.servlet.internal.context.DispatchTargets.doDispatch(DispatchTargets.java:132)
at org.eclipse.equinox.http.servlet.internal.servlet.ProxyServlet.service(ProxyServlet.java:100)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at org.eclipse.equinox.http.jetty.internal.HttpServerManager$InternalHttpServiceServlet.service(HttpServerManager.java:310)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:845)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:583)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:224)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1174)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:511)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:185)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1106)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
at org.eclipse.jetty.server.Server.handle(Server.java:524)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:319)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:253)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:95)
at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
at java.lang.Thread.run(Thread.java:748)
Caused by: MqttException (0) - javax.net.ssl.SSLHandshakeException: Received fatal alert: bad_certificate
at org.eclipse.paho.client.mqttv3.internal.ExceptionHelper.createMqttException(ExceptionHelper.java:38)
at org.eclipse.paho.client.mqttv3.internal.ClientComms$ConnectBG.run(ClientComms.java:664)
... 1 more
Caused by: javax.net.ssl.SSLHandshakeException: Received fatal alert: bad_certificate
at sun.security.ssl.Alerts.getSSLException(Alerts.java:192)
at sun.security.ssl.Alerts.getSSLException(Alerts.java:154)
at sun.security.ssl.SSLSocketImpl.recvAlert(SSLSocketImpl.java:2023)
at sun.security.ssl.SSLSocketImpl.readRecord(SSLSocketImpl.java:1125)
at sun.security.ssl.SSLSocketImpl.performInitialHandshake(SSLSocketImpl.java:1375)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1403)
at sun.security.ssl.SSLSocketImpl.startHandshake(SSLSocketImpl.java:1387)
at org.eclipse.paho.client.mqttv3.internal.SSLNetworkModule.start(SSLNetworkModule.java:93)
at org.eclipse.paho.client.mqttv3.internal.ClientComms$ConnectBG.run(ClientComms.java:650)
... 1 more
Solutions I have tried:
The broker URL uses port 8883 and the mqtts protocol.
The value Lwt.retain is set to false.
The values for storage-alias and topic.context.account-name are the same.
But nothing works.
Kura version: 3.0.0
Device: Linux machine
Any help will be appreciated.