Hello Flink Community,
Following the documentation on troubleshooting the unloading of dynamically loaded classes in Flink, I added the database driver library to the /opt/flink/lib folder on both the Flink JobManager container and the TaskManager containers running on K8s (Flink session cluster, version 1.11).
I marked the library as provided in my build.sbt file.
The rest of the user code is part of the fat JAR built by sbt assembly.
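For reference, the relevant line in my build.sbt looks roughly like this (the Vertica driver coordinates and version below are placeholders for illustration, not the exact ones I use):

// build.sbt (sketch): the JDBC driver marked as provided, so sbt-assembly leaves it out of the fat JAR
libraryDependencies += "com.vertica.jdbc" % "vertica-jdbc" % "x.y.z" % "provided"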
Now, when I submit a job to the Flink cluster using the Flink REST API (upload and run endpoints), it won't accept the job due to the following error:
java.lang.ClassNotFoundException: com.vertica.jdbc.Driver
Why is the jar not picked up by the Flink classloader?
I even added the class pattern to the config option without any difference:
classloader.parent-first-patterns-additional: com.vertica.jdbc.;
Link: https://ci.apache.org/projects/flink/flink-docs-release-1.12/ops/debugging/debugging_classloading.html#unloading-of-dynamically-loaded-classes-in-user-code
Any recommendation would be highly appreciated.
Cheers
Please confirm your JDBC dependency is not marked as provided in your build. When a dependency is marked provided, it is only available at compile and test time; it is not packaged into the fat JAR and is expected to be supplied by the runtime environment.
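For example, in build.sbt the driver would then be declared without the provided qualifier, so it ends up in the fat JAR (the coordinates and version are placeholders):

// build.sbt (sketch): default (compile) scope, so sbt-assembly packages the driver classes into the user jar
libraryDependencies += "com.vertica.jdbc" % "vertica-jdbc" % "x.y.z"

With the default scope the cluster no longer needs to find the driver in /opt/flink/lib, because the classes ship with the job.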
Related
I would like to read and write some data with Apache Flink 1.11.2 from S3. The documentation recommends using the Presto plugin for checkpoints and the Hadoop plugin for pipeline data.
According to this section, you have to copy the plugins from /opt to the plugins directory. I can find flink-s3-fs-presto-1.11.2.jar under /opt, but there is no flink-s3-fs-hadoop-1.11.2.jar. Where can I find the s3-hadoop plugin for setting up my production environment?
And how can I use these plugins in the IDE? Simply by adding them to pom.xml as provided dependencies? And then how can I pass the credentials in the IDE?
That is weird; I can see that both are present under opt in the official 1.11.1 binaries. However, if you can't find them, you can simply get the jars from Maven here and copy them to the required place. Another thing that may work is adding the dependency to the project with compile scope.
Running the job locally is described here. There are various ways of configuring the credentials when running the job in the IDE; one option is adding a core-site.xml with the proper configuration to the resources folder.
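For example, for running in the IDE something along these lines should work (shown as an sbt sketch; the Maven equivalents use the same groupId/artifactId, and the version should match your Flink version):

// build.sbt (sketch): pull the S3 filesystem implementations in with compile scope for local/IDE runs only
libraryDependencies ++= Seq(
  "org.apache.flink" % "flink-s3-fs-presto" % "1.11.2",
  "org.apache.flink" % "flink-s3-fs-hadoop" % "1.11.2"
)

On the cluster itself you would still rely on the plugins mechanism rather than bundling these into the fat JAR.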
EDIT:
As for the local execution it was explained here a little bit.
Using Flink 1.8.3, I have recently started getting a runtime NoSuchMethodError, as shown in the following stack trace:
Caused by: java.lang.NoSuchMethodError: org.apache.flink.api.java.ClosureCleaner.clean(Ljava/lang/Object;Z)V
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducerBase.<init>(FlinkKafkaProducerBase.java:146)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer09.<init>(FlinkKafkaProducer09.java:190)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer010.<init>(FlinkKafkaProducer010.java:197)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer010.<init>(FlinkKafkaProducer010.java:75)
I don't see any obvious old class imports when looking through the dependency graph produced by the Gradle project-report plugin. In my Gradle dependencies I am careful to depend on Flink 1.8.3 jars throughout.
Additionally, I don't see a version of ClosureCleaner that includes a two-argument clean(Object, boolean) method. I am pretty confused by this NoSuchMethodError, and I am unsure which Kafka producer jar is in the environment when running under the Flink JobManager.
It looks to me as though an old flink-java is being pulled in, on the assumption that the two-argument ClosureCleaner.clean(Object, boolean) invocation is the old one. I could use some help on this one.
thx in advance,
james
I am trying to run a Flink streaming program that uses the Kafka connector (latest universal connector).
The job runs without any problem in IntelliJ, but when I submit the jar built with sbt package, I get the error below.
java.lang.ClassNotFoundException: org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase
I also tried the jar built using the traditional IntelliJ option, but I still get the above error.
Most probably the issue is that you are not including the dependencies in your JAR file. Connector dependencies are not included in the Flink binary distribution.
Generally, the preferred way of tackling this is to use the proper plugin for your build tool, such as the Shade plugin for Maven or sbt-assembly for sbt, to create a so-called fat JAR, i.e. a JAR with the dependencies included.
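A minimal sketch for sbt (the plugin and connector versions below are illustrative and should match your setup):

// project/plugins.sbt (sketch): add sbt-assembly so a fat JAR can be built
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.15.0")

// build.sbt (sketch): the universal Kafka connector must not be provided, so it ends up in the fat JAR;
// core Flink dependencies such as flink-streaming-scala can stay provided, since the cluster already ships them
libraryDependencies += "org.apache.flink" %% "flink-connector-kafka" % "1.11.2"

Then build with sbt assembly instead of sbt package and submit the resulting fat JAR.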
I am using Windows 7, Java 8, Flink 1.5.2. I extracted the tar file and started bin\start-cluster.bat. I can open the URL localhost:8081 in the browser, and it shows all the default options.
I developed a small streaming app with Kafka integration; I can receive Kafka messages within Flink, and Flink forwards them to another Kafka topic successfully.
I could test this from Eclipse IDE and from CLI.
Problem: when I run this jar via bin\flink run, the running job's details are not shown on the Dashboard, and I also cannot see completed/failed states.
Thanks
When deploying a Camel route to FuseESB, FuseESB gives the following exception in the log as it tries to start the jar file:
Found initial references null for OSGi service (&(language=js)
(objectClass=org.apache.camel.spi.LanguageResolver))
This causes the bundle to enter a grace period for a few minutes, after which it times out and its status moves to failed. Note that I'm not using JavaScript in the application, but I assume it is loaded as part of loading Camel core.
Details of my setup:
Code in question is written using an OSGi blueprint xml file to define the beans.
Code is packaged as a jar, as opposed to an OSGi bundle.
Code is deployed by being dropped into the deploy directory so it is deployed by the FAB deployer.
I believe I have the relevant Camel features installed.
Output from features:list:
[installed ] [2.10.0.fuse-71-047] camel-script-javascript camel-2.10.0.fuse-71-047
[installed ] [2.10.0.fuse-71-047] camel-script camel-2.10.0.fuse-71-047
I have worked around this by:
reverting to a spring xml file to define the beans
packaging the code as a bundle, not a jar
I still don't understand why the Blueprint version didn't work, but the question is now less urgent than it was.
With FAB you should declare the dependencies to your Camel components in your pom.xml file, and use scope=provided.
See more details at: http://fuse.fusesource.org/bundle/overview.html
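As a rough sketch of what that looks like (the component and version below are illustrative; declare whichever Camel components your route actually uses):

<!-- pom.xml (sketch): FAB resolves the Camel component from the POM; provided scope keeps it out of the jar -->
<dependency>
  <groupId>org.apache.camel</groupId>
  <artifactId>camel-core</artifactId>
  <version>2.10.0.fuse-71-047</version>
  <scope>provided</scope>
</dependency>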