I'm using the Flink Table API, with Kafka as the input source and JSON as the table format. I got this error when submitting my program:
```
The program finished with the following exception:
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error.
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:546)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:426)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:804)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:280)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:215)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1044)
at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1120)
at org.apache.flink.runtime.security.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:30)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1120)
Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException:
Could not find a suitable table factory for 'org.apache.flink.table.factories.StreamTableSourceFactory' in
the classpath.
Reason: No context matches.
The following properties are requested:
connector.properties.0.key=zookeeper.connect
connector.properties.0.value=localhost:2181
connector.properties.1.key=bootstrap.servers
connector.properties.1.value=localhost:9092
connector.property-version=1
connector.startup-mode=latest-offset
connector.topic=flink_table_test
connector.type=kafka
connector.version=0.10
format.fail-on-missing-field=true
format.json-schema={\t'type': 'object',\t'properties': {\t\t'uid': {\t\t\t'type': 'number'\t\t},\t\t'name': {\t\t\t'type': 'string'\t\t},\t\t'age': {\t\t\t'type': 'number'\t\t},\t\t'timestamp': {\t\t\t'type': 'long'\t\t}\t}}
format.property-version=1
format.type=json
schema.0.name=uid
schema.0.type=BIGINT
schema.1.name=name
schema.1.type=VARCHAR
schema.2.name=age
schema.2.type=INT
schema.3.name=ts
schema.3.type=TIMESTAMP
update-mode=append
The following factories have been considered:
org.apache.flink.formats.json.JsonRowFormatFactory
org.apache.flink.table.sources.CsvBatchTableSourceFactory
org.apache.flink.table.sources.CsvAppendTableSourceFactory
org.apache.flink.table.sinks.CsvBatchTableSinkFactory
org.apache.flink.table.sinks.CsvAppendTableSinkFactory
at org.apache.flink.table.factories.TableFactoryService$.filterByContext(TableFactoryService.scala:214)
at org.apache.flink.table.factories.TableFactoryService$.findInternal(TableFactoryService.scala:130)
at org.apache.flink.table.factories.TableFactoryService$.find(TableFactoryService.scala:81)
at org.apache.flink.table.factories.TableFactoryUtil$.findAndCreateTableSource(TableFactoryUtil.scala:49)
at org.apache.flink.table.descriptors.ConnectTableDescriptor.registerTableSource(ConnectTableDescriptor.scala:44)
at com.akulaku.data.flink.QueryTable$.main(QueryTable.scala:53)
at com.akulaku.data.flink.QueryTable.main(QueryTable.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
... 9 more
```
I have already added flink-json-1.6.1.jar to the Flink lib path.
Could anyone help?
Try adding the Kafka connector SQL JAR as well (e.g. flink-connector-kafka-0.11_2.11-1.6.1-sql-jar.jar). You can find the whole list of connector dependencies here.
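If you build a fat JAR instead of dropping JARs into Flink's lib folder, the same dependencies can be declared in your build. Here is a minimal build.sbt sketch, assuming Flink 1.6.1, the Kafka 0.10 connector, and Scala 2.11 to match the properties shown in the error above (adjust to your actual versions):
```
// build.sbt sketch: bundle the Kafka connector and JSON format into the application fat JAR.
// The alternative, as suggested above, is to copy the matching *-sql-jar into Flink's lib/ folder.
scalaVersion := "2.11.12"

val flinkVersion = "1.6.1"

libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-connector-kafka-0.10" % flinkVersion, // connector.version=0.10
  "org.apache.flink" %  "flink-json"                 % flinkVersion  // format.type=json
)
```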
Related
I tried to apply a JPA transaction manager with a Spring transaction policy to one route. When the route starts, it throws the issue below. Could you please let me know the cause of the issue and a solution? Also, could you please share any quick-starts for Camel JPA transactions in standalone mode? I can only find quick-starts for Fuse and server mode.
[pache.camel.spring.Main.main()] DefaultTypeConverter INFO Loaded 240 type converters
[pache.camel.spring.Main.main()] DefaultRuntimeEndpointRegistry INFO Runtime endpoint registry is in extended mode gathering usage statistics of all incoming and outgoing endpoints (cache limit: 1000)
[pache.camel.spring.Main.main()] JpaComponent INFO Using EntityManagerFactory configured: org.springframework.orm.jpa.LocalEntityManagerFactoryBean#147d849
[pache.camel.spring.Main.main()] JpaComponent INFO Using TransactionManager configured on this component: org.springframework.orm.jpa.JpaTransactionManager#a9b98d
[ERROR] *************************************
[ERROR] Error occurred while running main from: org.apache.camel.spring.Main
[ERROR]
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.apache.camel.maven.RunMojo$1.run(RunMojo.java:458)
at java.lang.Thread.run(Thread.java:744)
Caused by: java.lang.NoSuchMethodError: org.apache.camel.processor.RedeliveryErrorHandler.<init>(Lorg/apache/camel/CamelContext;Lorg/apache/camel/Processor;Lorg/apache/camel/util/CamelLogger;Lorg/apache/camel/Processor;Lorg/apache/camel/processor/RedeliveryPolicy;Lorg/apache/camel/Processor;Ljava/lang/String;ZZLorg/apache/camel/Predicate;Ljava/util/concurrent/ScheduledExecutorService;Lorg/apache/camel/Processor;)V
at org.apache.camel.spring.spi.TransactionErrorHandler.<init>(TransactionErrorHandler.java:70)
at org.apache.camel.spring.spi.TransactionErrorHandlerBuilder.createErrorHandler(TransactionErrorHandlerBuilder.java:110)
at org.apache.camel.spring.spi.SpringTransactionPolicy.createTransactionErrorHandler(SpringTransactionPolicy.java:124)
at org.apache.camel.spring.spi.SpringTransactionPolicy.wrap(SpringTransactionPolicy.java:108)
at org.apache.camel.model.TransactedDefinition.createProcessor(TransactedDefinition.java:162)
at org.apache.camel.model.ProcessorDefinition.makeProcessorImpl(ProcessorDefinition.java:534)
at org.apache.camel.model.ProcessorDefinition.makeProcessor(ProcessorDefinition.java:495)
at org.apache.camel.model.ProcessorDefinition.addRoutes(ProcessorDefinition.java:219)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:1069)
at org.apache.camel.model.RouteDefinition.addRoutes(RouteDefinition.java:196)
at org.apache.camel.impl.DefaultCamelContext.startRoute(DefaultCamelContext.java:974)
at org.apache.camel.impl.DefaultCamelContext.startRouteDefinitions(DefaultCamelContext.java:3301)
at org.apache.camel.impl.DefaultCamelContext.doStartCamel(DefaultCamelContext.java:3024)
at org.apache.camel.impl.DefaultCamelContext.access$000(DefaultCamelContext.java:175)
at org.apache.camel.impl.DefaultCamelContext$2.call(DefaultCamelContext.java:2854)
at org.apache.camel.impl.DefaultCamelContext$2.call(DefaultCamelContext.java:2850)
at org.apache.camel.impl.DefaultCamelContext.doWithDefinedClassLoader(DefaultCamelContext.java:2873)
at org.apache.camel.impl.DefaultCamelContext.doStart(DefaultCamelContext.java:2850)
at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:61)
at org.apache.camel.impl.DefaultCamelContext.start(DefaultCamelContext.java:2819)
at org.apache.camel.spring.SpringCamelContext.maybeStart(SpringCamelContext.java:270)
at org.apache.camel.spring.SpringCamelContext.onApplicationEvent(SpringCamelContext.java:136)
at org.apache.camel.spring.CamelContextFactoryBean.onApplicationEvent(CamelContextFactoryBean.java:340)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:96)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:334)
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:954)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:482)
at org.springframework.context.support.FileSystemXmlApplicationContext.<init>(FileSystemXmlApplicationContext.java:140)
at org.springframework.context.support.FileSystemXmlApplicationContext.<init>(FileSystemXmlApplicationContext.java:94)
at org.apache.camel.spring.Main.createDefaultApplicationContext(Main.java:205)
at org.apache.camel.spring.Main.doStart(Main.java:154)
at org.apache.camel.support.ServiceSupport.start(ServiceSupport.java:61)
at org.apache.camel.main.MainSupport.run(MainSupport.java:138)
at org.apache.camel.main.MainSupport.run(MainSupport.java:390)
at org.apache.camel.spring.Main.main(Main.java:87)
... 6 more
[ERROR] *************************************
[WARNING] thread Thread[Timer-0,5,org.apache.camel.spring.Main] was interrupted but is still alive after waiting at least 15000msecs
[WARNING] thread Thread[Timer-0,5,org.apache.camel.spring.Main] will linger despite being asked to die via interruption
[WARNING] thread Thread[derby.rawStoreDaemon,5,derby.daemons] will linger despite being asked to die via interruption
[WARNING] NOTE: 2 thread(s) did not finish despite being asked to via interruption. This is not a problem with exec:java, it is a problem with the running code. Although not serious, it should be remedied.
[WARNING] Couldn't destroy threadgroup org.apache.camel.maven.RunMojo$IsolatedThreadGroup[name=org.apache.camel.spring.Main,maxpri=10]
java.lang.IllegalThreadStateException
at java.lang.ThreadGroup.destroy(ThreadGroup.java:778)
at org.apache.camel.maven.RunMojo.execute(RunMojo.java:491)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:137)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:154)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:146)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:956)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:290)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:194)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
[INFO] ------------------------------------------------------------------------
In solr.xml, I have added the following line:
<str name="adminHandler">com.ontotext.solr.handler.admin.GraphDBConnectorAdminHandler</str>
When I try starting the solr server, I get the following exception:
2019-03-16 16:49:15.624 ERROR (main) [ ] o.a.s.c.SolrCore null:java.lang.NoClassDefFoundError: com/google/common/collect/ImmutableMap
at org.apache.solr.handler.admin.CoreAdminHandler.<clinit>(CoreAdminHandler.java:223)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:541)
at org.apache.solr.core.SolrResourceLoader.newInstance(SolrResourceLoader.java:626)
at org.apache.solr.core.CoreContainer.createHandler(CoreContainer.java:1696)
at org.apache.solr.core.CoreContainer.load(CoreContainer.java:575)
at org.apache.solr.servlet.SolrDispatchFilter.createCoreContainer(SolrDispatchFilter.java:253)
at org.apache.solr.servlet.SolrDispatchFilter.init(SolrDispatchFilter.java:173)
at org.eclipse.jetty.servlet.FilterHolder.initialize(FilterHolder.java:136)
at org.eclipse.jetty.servlet.ServletHandler.lambda$initialize$0(ServletHandler.java:750)
at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
at java.util.stream.Streams$ConcatSpliterator.forEachRemaining(Streams.java:742)
at java.util.stream.Streams$ConcatSpliterator.forEachRemaining(Streams.java:742)
at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:580)
at org.eclipse.jetty.servlet.ServletHandler.initialize(ServletHandler.java:744)
at org.eclipse.jetty.servlet.ServletContextHandler.startContext(ServletContextHandler.java:368)
at org.eclipse.jetty.webapp.WebAppContext.startWebapp(WebAppContext.java:1497)
at org.eclipse.jetty.webapp.WebAppContext.startContext(WebAppContext.java:1459)
at org.eclipse.jetty.server.handler.ContextHandler.doStart(ContextHandler.java:852)
at org.eclipse.jetty.servlet.ServletContextHandler.doStart(ServletContextHandler.java:278)
at org.eclipse.jetty.webapp.WebAppContext.doStart(WebAppContext.java:545)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.deploy.bindings.StandardStarter.processBinding(StandardStarter.java:46)
at org.eclipse.jetty.deploy.AppLifeCycle.runBindings(AppLifeCycle.java:192)
at org.eclipse.jetty.deploy.DeploymentManager.requestAppGoal(DeploymentManager.java:505)
at org.eclipse.jetty.deploy.DeploymentManager.addApp(DeploymentManager.java:151)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider.fileAdded(ScanningAppProvider.java:180)
at org.eclipse.jetty.deploy.providers.WebAppProvider.fileAdded(WebAppProvider.java:453)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider$1.fileAdded(ScanningAppProvider.java:64)
at org.eclipse.jetty.util.Scanner.reportAddition(Scanner.java:610)
at org.eclipse.jetty.util.Scanner.reportDifferences(Scanner.java:529)
at org.eclipse.jetty.util.Scanner.scan(Scanner.java:392)
at org.eclipse.jetty.util.Scanner.doStart(Scanner.java:313)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.deploy.providers.ScanningAppProvider.doStart(ScanningAppProvider.java:150)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.deploy.DeploymentManager.startAppProvider(DeploymentManager.java:579)
at org.eclipse.jetty.deploy.DeploymentManager.doStart(DeploymentManager.java:240)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.util.component.ContainerLifeCycle.start(ContainerLifeCycle.java:138)
at org.eclipse.jetty.server.Server.start(Server.java:415)
at org.eclipse.jetty.util.component.ContainerLifeCycle.doStart(ContainerLifeCycle.java:117)
at org.eclipse.jetty.server.handler.AbstractHandler.doStart(AbstractHandler.java:113)
at org.eclipse.jetty.server.Server.doStart(Server.java:382)
at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
at org.eclipse.jetty.xml.XmlConfiguration$1.run(XmlConfiguration.java:1572)
at org.eclipse.jetty.xml.XmlConfiguration$1.run(XmlConfiguration.java:1512)
at java.security.AccessController.doPrivileged(Native Method)
at org.eclipse.jetty.xml.XmlConfiguration.main(XmlConfiguration.java:1511)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.eclipse.jetty.start.Main.invokeMain(Main.java:220)
at org.eclipse.jetty.start.Main.start(Main.java:490)
at org.eclipse.jetty.start.Main.main(Main.java:77)
Caused by: java.lang.ClassNotFoundException: com.google.common.collect.ImmutableMap
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 57 more
What should I do now?
I have all the required jars, and they get loaded by the classloader; Solr starts fine without the custom admin handler.
I referred to the following document to add the GraphDB Solr connector, which automatically syncs data to Solr when it is added to GraphDB:
http://graphdb.ontotext.com/documentation/enterprise/solr-graphdb-connector.html
The NoClassDefFoundError tells you that the class com/google/common/collect/ImmutableMap is missing. This class is part of Google's Guava library.
If you're using Maven or a similar build tool, you can add the dependency there; if not, you can download the JAR (the regular JRE flavor) from mvnrepository.com and put it on the classpath.
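For example, a minimal sketch of the dependency declaration (shown as an sbt line; a Maven <dependency> on the same coordinates is equivalent, and the version below is only an illustration, match it to whatever the GraphDB connector was built against):
```
// Guava provides com.google.common.collect.ImmutableMap.
libraryDependencies += "com.google.guava" % "guava" % "23.0"
```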
I am using Flink's Table API: I receive data from Kafka, register it as a table, process it with a SQL statement, and finally convert the result back to a stream and write it to a directory. The code looks like this:
def main(args: Array[String]): Unit = {
  val sEnv = StreamExecutionEnvironment.getExecutionEnvironment
  sEnv.setStreamTimeCharacteristic(TimeCharacteristic.EventTime)
  val tEnv = TableEnvironment.getTableEnvironment(sEnv)

  tEnv.connect(
      new Kafka()
        .version("0.11")
        .topic("user")
        .startFromEarliest()
        .property("zookeeper.connect", "")
        .property("bootstrap.servers", ""))
    .withFormat(
      new Json()
        .failOnMissingField(false)
        .deriveSchema()) // use the table's schema
    .withSchema(
      new Schema()
        .field("username_skey", Types.STRING))
    .inAppendMode()
    .registerTableSource("user")

  val userTest: Table = tEnv.sqlQuery(
    """
      select ** from ** join **""".stripMargin)
  val endStream = tEnv.toRetractStream[Row](userTest)
  endStream.writeAsText("/tmp/sqlres", WriteMode.OVERWRITE)
  sEnv.execute("Test_New_Sign_Student")
}
It works in my local test, but when I submit the job to the cluster, I get the following error:
=======================================================
org.apache.flink.client.program.ProgramInvocationException: The main method caused an error.
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:546)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:421)
at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:426)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:804)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:280)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:215)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1044)
at org.apache.flink.client.cli.CliFrontend.lambda$main$11(CliFrontend.java:1120)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1120)
Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.DeserializationSchemaFactory' in the classpath.
Reason: No factory implements 'org.apache.flink.table.factories.DeserializationSchemaFactory'.
The following properties are requested:
connector.properties.0.key=zookeeper.connect
....
schema.9.name=roles
schema.9.type=VARCHAR
update-mode=append
The following factories have been considered:
org.apache.flink.table.sources.CsvBatchTableSourceFactory
org.apache.flink.table.sources.CsvAppendTableSourceFactory
org.apache.flink.table.sinks.CsvBatchTableSinkFactory
org.apache.flink.table.sinks.CsvAppendTableSinkFactory
org.apache.flink.streaming.connectors.kafka.Kafka011TableSourceSinkFactory
at org.apache.flink.table.factories.TableFactoryService$.filterByFactoryClass(TableFactoryService.scala:176)
at org.apache.flink.table.factories.TableFactoryService$.findInternal(TableFactoryService.scala:125)
at org.apache.flink.table.factories.TableFactoryService$.find(TableFactoryService.scala:100)
at org.apache.flink.table.factories.TableFactoryService.find(TableFactoryService.scala)
at org.apache.flink.streaming.connectors.kafka.KafkaTableSourceSinkFactoryBase.getDeserializationSchema(KafkaTableSourceSinkFactoryBase.java:259)
at org.apache.flink.streaming.connectors.kafka.KafkaTableSourceSinkFactoryBase.createStreamTableSource(KafkaTableSourceSinkFactoryBase.java:144)
at org.apache.flink.table.factories.TableFactoryUtil$.findAndCreateTableSource(TableFactoryUtil.scala:50)
at org.apache.flink.table.descriptors.ConnectTableDescriptor.registerTableSource(ConnectTableDescriptor.scala:44)
at org.clay.test.Test_New_Sign_Student$.main(Test_New_Sign_Student.scala:64)
at org.clay.test.Test_New_Sign_Student.main(Test_New_Sign_Student.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:529)
===================================
Can someone tell me what caused this? I am very confused about it.
If you are using the maven-shade-plugin, make sure the SPI transformer is in place.
Flink uses the Java Service Provider Interface (SPI) to discover source/sink connectors.
Without this transformer you will reliably hit "org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory", which is what happened to me.
The Flink documentation points this out officially; search for "SPI" on this page:
https://ci.apache.org/projects/flink/flink-docs-release-1.9/dev/table/connect.html#update-mode
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
</transformers>
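As a quick sanity check (a minimal diagnostic sketch, not part of the original answer), you can print which TableFactory implementations the JVM's ServiceLoader actually finds on your classpath; if the Kafka and JSON factories are missing from the output, the shaded JAR lost its META-INF/services entries:
```
import java.util.ServiceLoader
import org.apache.flink.table.factories.TableFactory
import scala.collection.JavaConverters._

object ListTableFactories {
  def main(args: Array[String]): Unit = {
    // Lists every factory registered via META-INF/services/org.apache.flink.table.factories.TableFactory.
    ServiceLoader.load(classOf[TableFactory]).iterator().asScala
      .foreach(f => println(f.getClass.getName))
  }
}
```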
You have to add the JAR dependencies of the connectors (Kafka) and formats (JSON) that you are using to the classpath of your program, i.e., either build a fat JAR that includes them or provide them to the classpath of the Flink cluster by copying them into the ./lib folder.
Check the Flink documentation for links to download the respective dependencies.
I have met the same problem; just adding the parameter --connector.type kafka when you run your application will solve this.
It seems that the data in the WindowOperator is serialized successfully; however, deserialization fails when the taskmanager is restarted by the jobmanager.
Environment: state backend: HDFS; jobmanager: high-availability.
Root exception:
java.lang.Exception: Could not restore checkpointed state to operators and functions
at org.apache.flink.streaming.runtime.tasks.StreamTask.restoreStateLazy(StreamTask.java:414)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:208)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:584)
at java.lang.Thread.run(Thread.java:744)
Caused by: java.io.StreamCorruptedException: invalid type code: 00
at java.io.ObjectInputStream$BlockDataInputStream.readBlockHeader(ObjectInputStream.java:2508)
at java.io.ObjectInputStream$BlockDataInputStream.refill(ObjectInputStream.java:2543)
at java.io.ObjectInputStream$BlockDataInputStream.read(ObjectInputStream.java:2615)
at java.io.DataInputStream.readInt(DataInputStream.java:387)
at java.io.ObjectInputStream$BlockDataInputStream.readInt(ObjectInputStream.java:2820)
at java.io.ObjectInputStream.readInt(ObjectInputStream.java:971)
at java.util.HashMap.readObject(HashMap.java:1158)
at sun.reflect.GeneratedMethodAccessor29.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:294)
at org.apache.flink.streaming.runtime.operators.windowing.WindowOperator$Context.<init>(WindowOperator.java:446)
at org.apache.flink.streaming.runtime.operators.windowing.WindowOperator.restoreState(WindowOperator.java:621)
at org.apache.flink.streaming.runtime.tasks.StreamTask.restoreStateLazy(StreamTask.java:406)
I have experienced the same issue when running multiple Flink applications that reuse the same custom window/trigger class (I don't know if that is your case).
I added a generated serialVersionUID to each of the reused classes, and it works fine now.
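For illustration, a minimal sketch of what that looks like; the class name and fields below are hypothetical, not from the original question:
```
import java.io.Serializable

// Hypothetical helper kept in window state and reused across several jobs.
// Pinning the serialVersionUID stops the JVM from deriving a new ID on each
// recompile, so instances checkpointed before a restart can still be deserialized.
@SerialVersionUID(1L)
class SessionAccumulator(var count: Long, var lastEventTime: Long) extends Serializable
```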
I've been trying to resolve a ClassNotFoundException for 3 days and can't find a solution, so I'm asking for help. The problem happens when I use any method except find or findAll on EJB entity beans.
For example, when I try to use the remove() method:
ctx = new InitialContext();
remote = (CategoriesRemote) ctx.lookup("CategoriesFacade/remote");
remote.remove(category);
I get this not-very-nice-looking exception:
Exception in thread "Thread-7" java.lang.RuntimeException: java.lang.ClassNotFoundException: org.eclipse.persistence.indirection.IndirectList
at org.jboss.aop.joinpoint.MethodInvocation.getArguments(MethodInvocation.java:318)
at org.jboss.ejb3.stateless.StatelessContainer.dynamicInvoke(StatelessContainer.java:386)
at org.jboss.ejb3.session.InvokableContextClassProxyHack._dynamicInvoke(InvokableContextClassProxyHack.java:53)
at org.jboss.aop.Dispatcher.invoke(Dispatcher.java:91)
at org.jboss.aspects.remoting.AOPRemotingInvocationHandler.invoke(AOPRemotingInvocationHandler.java:82)
at org.jboss.remoting.ServerInvoker.invoke(ServerInvoker.java:898)
at org.jboss.remoting.transport.socket.ServerThread.completeInvocation(ServerThread.java:791)
at org.jboss.remoting.transport.socket.ServerThread.processInvocation(ServerThread.java:744)
at org.jboss.remoting.transport.socket.ServerThread.dorun(ServerThread.java:548)
at org.jboss.remoting.transport.socket.ServerThread.run(ServerThread.java:234)
Caused by: java.lang.ClassNotFoundException: org.eclipse.persistence.indirection.IndirectList
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at java.lang.ClassLoader.loadClass(ClassLoader.java:248)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:247)
at sun.rmi.server.LoaderHandler.loadClass(LoaderHandler.java:434)
at sun.rmi.server.LoaderHandler.loadClass(LoaderHandler.java:165)
at java.rmi.server.RMIClassLoader$2.loadClass(RMIClassLoader.java:620)
at org.jboss.system.JBossRMIClassLoader.loadClass(JBossRMIClassLoader.java:91)
at java.rmi.server.RMIClassLoader.loadClass(RMIClassLoader.java:247)
at sun.rmi.server.MarshalInputStream.resolveClass(MarshalInputStream.java:197)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1574)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1495)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1731)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1946)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1870)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1752)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1328)
at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1666)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1322)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:350)
at java.rmi.MarshalledObject.get(MarshalledObject.java:142)
at org.jboss.aop.joinpoint.MethodInvocation.getArguments(MethodInvocation.java:309)
at org.jboss.ejb3.stateless.StatelessContainer.dynamicInvoke(StatelessContainer.java:386)
at org.jboss.ejb3.session.InvokableContextClassProxyHack._dynamicInvoke(InvokableContextClassProxyHack.java:53)
at org.jboss.aop.Dispatcher.invoke(Dispatcher.java:91)
at org.jboss.aspects.remoting.AOPRemotingInvocationHandler.invoke(AOPRemotingInvocationHandler.java:82)
at org.jboss.remoting.ServerInvoker.invoke(ServerInvoker.java:898)
at org.jboss.remoting.transport.socket.ServerThread.completeInvocation(ServerThread.java:791)
at org.jboss.remoting.transport.socket.ServerThread.processInvocation(ServerThread.java:744)
at org.jboss.remoting.transport.socket.ServerThread.dorun(ServerThread.java:548)
at org.jboss.remoting.transport.socket.ServerThread.run(ServerThread.java:234)
at org.jboss.remoting.MicroRemoteClientInvoker.invoke(MicroRemoteClientInvoker.java:216)
at org.jboss.remoting.Client.invoke(Client.java:1961)
at org.jboss.remoting.Client.invoke(Client.java:804)
at org.jboss.aspects.remoting.InvokeRemoteInterceptor.invoke(InvokeRemoteInterceptor.java:60)
at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:102)
at org.jboss.aspects.tx.ClientTxPropagationInterceptor.invoke(ClientTxPropagationInterceptor.java:61)
at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:102)
at org.jboss.ejb3.security.client.SecurityClientInterceptor.invoke(SecurityClientInterceptor.java:65)
at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:102)
at org.jboss.ejb3.remoting.IsLocalInterceptor.invoke(IsLocalInterceptor.java:77)
at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:102)
at org.jboss.ejb3.async.impl.interceptor.AsynchronousClientInterceptor.invoke(AsynchronousClientInterceptor.java:143)
at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:102)
at org.jboss.aspects.remoting.PojiProxy.invoke(PojiProxy.java:62)
at $Proxy9.invoke(Unknown Source)
at org.jboss.ejb3.proxy.impl.handler.session.SessionProxyInvocationHandlerBase.invoke(SessionProxyInvocationHandlerBase.java:185)
at $Proxy18.remove(Unknown Source)
at nowyskos.modules.cms.CategorySender.run(CategorySender.java:88)
at java.lang.Thread.run(Thread.java:662)
at org.jboss.aspects.remoting.InvokeRemoteInterceptor.invoke(InvokeRemoteInterceptor.java:72)
at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:102)
at org.jboss.aspects.tx.ClientTxPropagationInterceptor.invoke(ClientTxPropagationInterceptor.java:61)
at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:102)
at org.jboss.ejb3.security.client.SecurityClientInterceptor.invoke(SecurityClientInterceptor.java:65)
at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:102)
at org.jboss.ejb3.remoting.IsLocalInterceptor.invoke(IsLocalInterceptor.java:77)
at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:102)
at org.jboss.ejb3.async.impl.interceptor.AsynchronousClientInterceptor.invoke(AsynchronousClientInterceptor.java:143)
at org.jboss.aop.joinpoint.MethodInvocation.invokeNext(MethodInvocation.java:102)
at org.jboss.aspects.remoting.PojiProxy.invoke(PojiProxy.java:62)
at $Proxy9.invoke(Unknown Source)
at org.jboss.ejb3.proxy.impl.handler.session.SessionProxyInvocationHandlerBase.invoke(SessionProxyInvocationHandlerBase.java:185)
at $Proxy18.remove(Unknown Source)
at nowyskos.modules.cms.CategorySender.run(CategorySender.java:88)
at java.lang.Thread.run(Thread.java:662)
Previously I got other exceptions about a missing security manager; I resolved those by adding
System.setSecurityManager(new RMISecurityManager());
and adding the client.all file to the VM options.
But I can't find a solution for this problem.
I think that in this case the problem is not the EJB or the security manager but a wrong classpath. However, in my NetBeans project I added the EclipseLink library, and I also put the folder containing the EclipseLink JAR, which does contain the IndirectList class, on the global classpath.
I'm getting this error in my client application; the EJB module runs on a JBoss server.
The situation looks weird to me: I don't understand why the find and findAll methods work perfectly while remove and create crash.
I'm really exhausted by this problem and don't know how to move on. Please help.
Well, from what I can see, the missing class "org.eclipse.persistence.indirection.IndirectList" is present in GlassFish distributions. So I suspect you might be using the wrong EclipseLink distribution for a project that works with JBoss AS. A quick fix (though perhaps not the right solution) would be to add whichever of the JARs (from here) best matches your environment to your classpath. My advice would be to double-check the version and distribution of EclipseLink you are using.