Flink Error: java.lang.ClassNotFoundException: org.apache.flink.shaded.calcite.com.google.common.base.Throwables - apache-flink

I am using Flink to stream data from a CSV file and want to put it into a table format with a certain schema. For this purpose I am using flink-table_2.10-1.1.3.jar (the Table API), but I get the following errors:
log4j:WARN No appenders could be found for logger (org.apache.flink.api.java.typeutils.TypeExtractor).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/flink/shaded/calcite/com/google/common/base/Throwables
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.create(JaninoRelMetadataProvider.java:450)
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.revise(JaninoRelMetadataProvider.java:460)
at org.apache.calcite.rel.metadata.RelMetadataQuery.revise(RelMetadataQuery.java:186)
at org.apache.calcite.rel.metadata.RelMetadataQuery.collations(RelMetadataQuery.java:484)
at org.apache.calcite.rel.metadata.RelMdCollation.project(RelMdCollation.java:207)
at org.apache.calcite.rel.logical.LogicalProject$1.get(LogicalProject.java:122)
at org.apache.calcite.rel.logical.LogicalProject$1.get(LogicalProject.java:120)
at org.apache.calcite.plan.RelTraitSet.replaceIfs(RelTraitSet.java:238)
at org.apache.calcite.rel.logical.LogicalProject.create(LogicalProject.java:116)
at org.apache.calcite.rel.logical.LogicalProject.create(LogicalProject.java:108)
at org.apache.flink.api.table.plan.logical.Project.construct(operators.scala:90)
at org.apache.flink.api.table.plan.logical.Project.construct(operators.scala:85)
at org.apache.flink.api.table.plan.logical.LogicalNode.toRelNode(LogicalNode.scala:78)
at org.apache.flink.api.table.Table.getRelNode(table.scala:66)
at org.apache.flink.api.table.StreamTableEnvironment.translate(StreamTableEnvironment.scala:243)
at org.apache.flink.api.java.table.StreamTableEnvironment.toDataStream(StreamTableEnvironment.scala:147)
at table_streaming_test.main(table_streaming_test.java:90)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.shaded.calcite.com.google.common.base.Throwables
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 17 more
When I explore the corresponding JAR, the class in question is present there. Can you please tell me why this is happening?
Also, can I get the Maven source so that I can build the flink-table JAR myself?

I had the same problem with the CEP library. I added it to my pom file, but I kept getting ClassNotFoundException. I even packaged it with my JAR file via IntelliJ, but that didn't work.
If you're using their flink-quickstart archetype, I think there are some other things to change in the pom file to make it work. When I created a clean project and added the Flink dependencies myself, I didn't get that exception anymore. You can try and see if this approach works.
You can also add the flink-table JAR file to the lib folder of your Flink installation; this also fixed my problem with the CEP library. The JAR file is available on the Maven repository website; download the version you want.
According to the Table and SQL documentation on the Flink website:
Note: The Table API is currently not part of the binary distribution.
See linking with it for cluster execution here.
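If you pull it in via Maven instead, the dependency looks roughly like this (artifact and version taken from the question; adjust the Scala suffix and version to your setup):
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table_2.10</artifactId>
    <version>1.1.3</version>
</dependency>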

I was also facing the same problem with the Table API in Flink v1.4.2.
I added the flink-table_2.11-1.4.2.jar file, present in the opt folder, to the lib folder and restarted Flink.
This worked for me. Hopefully it works for you too :)
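For reference, a sketch of those steps as shell commands, assuming a standard standalone distribution layout ($FLINK_HOME here stands for your Flink installation directory):
cd $FLINK_HOME
cp opt/flink-table_2.11-1.4.2.jar lib/
bin/stop-cluster.sh && bin/start-cluster.sh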

Related

FlinkKafkaConsumer fails to read from a LZ4 compressed topic

We've got several flink applications reading from Kafka topics, and they work fine. But recently we've added a new topic to the existing flink job and it started failing immediately on startup with the following root error:
Caused by: org.apache.kafka.common.KafkaException: java.lang.NoClassDefFoundError: net/jpountz/lz4/LZ4Exception
at org.apache.kafka.common.record.CompressionType$4.wrapForInput(CompressionType.java:113)
at org.apache.kafka.common.record.DefaultRecordBatch.compressedIterator(DefaultRecordBatch.java:256)
at org.apache.kafka.common.record.DefaultRecordBatch.streamingIterator(DefaultRecordBatch.java:334)
at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.nextFetchedRecord(Fetcher.java:1208)
at org.apache.kafka.clients.consumer.internals.Fetcher$PartitionRecords.fetchRecords(Fetcher.java:1245)
... 7 more
I found out that this topic uses lz4 compression and guess that Flink is for some reason unable to work with it. Adding the lz4 dependencies directly to the app didn't work, and what's weird is that it runs fine locally but fails on the remote cluster.
The Flink runtime version is 1.9.1, and we have the same version of all other Flink dependencies in our application:
flink-streaming-java_2.11, flink-connector-kafka_2.11, flink-java and flink-clients_2.11
Could this be happening because Flink does not have a dependency on the lz4 lib internally?
Found the solution. No version upgrade was needed, nor any additional dependencies in the application itself. What worked for us was adding the lz4 library JAR directly to the Flink lib folder in the Docker image. After that, the error with lz4 compression disappeared.
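A minimal Dockerfile sketch of that fix; the base image tag and the lz4-java version are assumptions, so match them to your cluster:
FROM flink:1.9.1-scala_2.11
# The lz4-java version is illustrative; use the one your Kafka client expects.
ADD https://repo1.maven.org/maven2/org/lz4/lz4-java/1.6.0/lz4-java-1.6.0.jar /opt/flink/lib/
# ADD from a URL writes the file as root-owned with mode 600; make it readable.
RUN chmod 644 /opt/flink/lib/lz4-java-1.6.0.jar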

Apache Flink Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/Traversable

I have just started learning Apache Flink and found the guide link to start development in the Eclipse IDE.
I followed this to start off, but I am getting the below error:
00:20:26,993 INFO org.apache.flink.api.java.ExecutionEnvironment - The job has 0 registered types and 0 default Kryo serializers
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/Traversable
at java.lang.ClassLoader.defineClass1(Native Method)
Here I have placed the error log file ... Please let me know if you require more details. Thanks, Nyamath
java.util.zip.ZipException: invalid LOC header (bad signature)
Your Scala JAR file provided by Maven seems to be corrupted. Please update your Maven dependencies by executing this from your project folder on the command line:
mvn -U clean install
In Eclipse, right-click on your project and choose Maven -> Update Dependencies.
If that does not work, you'll need to delete the corrupted JAR file in the ~/.m2/repository/ folder.
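For example, if the corrupted artifact is the Scala library itself (the path under ~/.m2/repository is illustrative; delete whichever artifact is actually broken):
rm -rf ~/.m2/repository/org/scala-lang/scala-library
mvn -U clean install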

Java Appengine SDK 1.9.6 Method not found

I have been having a terrible time figuring out a "method not found" problem. I've found similar questions on the App Engine Google group, but none of the answers helped solve it. Running my WAR locally with the dev server works fine, but when I deploy my app I get the error below. I've included the top of the exception, the top of the last "caused by", and a list of the JARs in my WAR's lib folder.
Here is the top part of the exception. I put each argument on its own line to make it easier to read:
java.lang.NoSuchMethodError: com.google.appengine.api.datastore.Key.<init>(
Ljava/lang/String;
Lcom/google/appengine/api/datastore/Key;
Ljava/lang/String;
Lcom/google/appengine/api/datastore/AppIdNamespace;)V
Here is the top of the last caused by.
Caused by: java.lang.NoSuchMethodError: com.google.appengine.api.datastore.Key.<init>(
Ljava/lang/String;
Lcom/google/appengine/api/datastore/Key;
Ljava/lang/String;
Lcom/google/appengine/api/datastore/AppIdNamespace;)V
at com.google.appengine.api.datastore.KeyFactory.createKey(KeyFactory.java:84)
at com.google.appengine.api.datastore.KeyFactory.createKey(KeyFactory.java:77)
at com.googlecode.objectify.Key.<init>(Key.java:97)
Here is a listing of the jars in my war's WEB-INF/lib folder.
aopalliance-1.0.jar
appengine-api-1.0-sdk-1.9.6.jar
appengine-api-labs-1.9.6.jar
appengine-jsr107cache-1.9.6.jar
asm-3.1.jar
cglib-2.2.1-v20090111.jar
client-only-0.1.jar
datanucleus-appengine-1.0.10.final.jar
datanucleus-core-1.1.5.jar
datanucleus-jpa-1.1.5.jar
geronimo-jpa_3.0_spec-1.1.1.jar
geronimo-jta_1.1_spec-1.1.1.jar
gin-1.5.0.jar
guava-15.0.jar
guava-gwt-15.0.jar
guice-3.0.jar
guice-assistedinject-3.0.jar
guice-multibindings-3.0.jar
guice-servlet-3.0.jar
gwt-servlet.jar
hibernate-validator-4.1.0.Final-sources.jar
hibernate-validator-4.1.0.Final.jar
hibernate-validator-annotation-processor-4.1.0.Final.jar
javax.inject-1.jar
jdo2-api-2.3-eb.jar
jsr107cache-1.1.jar
jsr173-1.0.jar
jsr305-1.3.9.jar
jta-1.1.jar
libservice.jar
log4j-over-slf4j-1.6.1.jar
mgwt-1.2.0-rc-opera-removed.jar
objectify-5.0.2.jar
persistence-api-1.0.jar
server-and-client-0.1.jar
server-only.jar
slf4j-api-1.7.2.jar
uadetector-core-0.9.2.jar
uadetector-resources-2013.02.jar
validation-api-1.0.0.GA-sources.jar
validation-api-1.0.0.GA.jar
This was happening because of a quirk of Gradle and javac, with a little help from GWT and Objectify.
Since ~version 4.1, Objectify has shipped a separate JAR with some App Engine Java sources in it so that Objectify classes can be used in the browser via GWT.
By default, sources included in compile dependencies in Gradle will be included in the compile output of the compile task. This is because Gradle does not pass the sourcepath parameter to the Java compiler, and according to Oracle's javac documentation:
If the -sourcepath option is not specified, the user class path is
also searched for source files.
I fixed the problem by adding this to my build.gradle file:
// Pass an empty -sourcepath so javac stops compiling stray .java files
// it finds in JARs on the class path (such as Objectify's GWT sources JAR).
compileJava.options.compilerArgs += "-sourcepath"
compileJava.options.compilerArgs += ""

java.lang.NoClassDefFoundError: com/google/common/collect/Maps - Selenium

Dear Selenium experts,
I have come across the following run-time error from a JPA 2.0 program, which appears for some reason to be related to the Firefox profile:
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/collect/Maps
at org.openqa.selenium.firefox.FirefoxProfile.<init>(FirefoxProfile.java:56)
at org.openqa.selenium.firefox.FirefoxProfile.<init>(FirefoxProfile.java:79)
at model.DownloadCarDetail.getMercedezDetail(model.DownloadCarDetail:72)
at model.DownloadCarDetail.getMercedezDetail.main (model.DownloadCarDetail.getMercedezDetail.java:47)
Caused by: java.lang.ClassNotFoundException: com.google.common.collect.Maps
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 4 more
Java Result: 1
I have kept Firefox at version 15 so that it is supported by Selenium WebDriver, but I suspect the issue has to do with not being able to read the profile directory.
Your assistance would be very much appreciated.
Many thanks,
George
The problem you are seeing has nothing to do with your Firefox profile.
Actually, it is the JVM classloader that complains that it can't find the com.google.common.collect.Maps class.
This usually means that you don't have Guava (which is a dependency of Selenium) on your classpath. Clean and rebuild your project, check your classpath, and check the versions of the libraries on it. If you're using some kind of dependency management system (Maven, Ivy, etc.), check that it's configured correctly.
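If you manage dependencies with Maven, the Guava dependency looks roughly like this (the version is illustrative; use whatever version your Selenium release expects):
<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>15.0</version>
</dependency>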
Import the .jar file, downloaded from here (depending on the current version), into Eclipse.
Open this link https://www.seleniumhq.org/download/ and download the Java client, 3.11.0 at the time of writing. Unzip it; in NetBeans or Eclipse click Add JARs/Files and select all the files in selenium-java-3.11.0\libs, and also select client-combined-3.11.0.jar in selenium-java-3.11.0, and you will be OK. Don't forget to add the line System.setProperty("webdriver.chrome.driver", "C:\\chromedriver.exe"); to your code. You can download chromedriver from this link: https://chromedriver.storage.googleapis.com/index.html?path=2.38/
In my case the Guava dependency was corrupt. It worked fine after I deleted the corrupt JARs and rebuilt the whole project.
Thank you for offering your suggestion for a solution to this issue. I have found the exact answer in Selenium 2 WebDriver NoClassDefFoundErrorS, which has resolved the underlying issue.
George
Add the Maven dependency below, then clean and compile your code:
<!-- https://mvnrepository.com/artifact/com.google.common/google-collect -->
<dependency>
    <groupId>com.google.common</groupId>
    <artifactId>google-collect</artifactId>
    <version>0.5</version>
</dependency>

NoClassDefFoundError MimeTypeException with PDF extraction

I am getting an exception trying to use update/extract with PDF files.
My setup is:
Ubuntu Server 11.10
Tomcat 6
Solr 3.5.0.2011.11.22.15.54.38
I can browse to solr/admin OK
I have put all the contrib/extraction libraries and apache-solr-cell-3.5.0.jar into the Tomcat folder webapps/solr/WEB-INF/lib.
I am calling extract using:
curl "http://localhost:8080/solr/update/extract?uprefix=attr_&fmap.content=attr_content&commit=true" -F "file=@/path/to/my.pdf"
The error is:
java.lang.NoClassDefFoundError: org/apache/tika/mime/MimeTypeException
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:383)
at org.apache.solr.core.SolrCore.createInstance(SolrCore.java:425)
at org.apache.solr.core.SolrCore.createRequestHandler(SolrCore.java:461)
at org.apache.solr.core.RequestHandlers$LazyRequestHandlerWrapper.getWrappedHandler(RequestHandlers.java:248)
at org.apache.solr.core.RequestHandlers$LazyRequestHandlerWrapper.handleRequest(RequestHandlers.java:239)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:1372)
I would appreciate any pointers; the only time this error seems to come up elsewhere is with Nutch and cached results.
I have tried sending the MIME type in the query string and also a *.doc file, but got the same error.
According to the error message, it is not a MimeTypeException you are getting: the problem is a NoClassDefFoundError, because Solr cannot load the class MimeTypeException.
Normally this class is present in tika-core.jar.
Make sure you actually have that file, and also check whether you have a lib statement in your solrconfig.xml pointing to the right directory.
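For example, the stock solrconfig.xml ships with lib directives along these lines (the relative paths are illustrative; point them at wherever the Tika and Solr Cell JARs actually live):
<lib dir="../../contrib/extraction/lib" regex=".*\.jar" />
<lib dir="../../dist/" regex="apache-solr-cell-\d.*\.jar" />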
This was due to the basic error of copying the necessary Tika libraries (to tomcat6/webapps/solr/WEB-INF/lib) but leaving ownership of the JAR files as root instead of chown-ing them to the tomcat6 user. After setting the right permissions and restarting Tomcat, it started working OK.
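Something along these lines, with the webapp path abbreviated as in the answer:
chown tomcat6:tomcat6 tomcat6/webapps/solr/WEB-INF/lib/*.jar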
Found the solution to this problem. I was using SolrJ to update my PDF indexing. After deploying Solr to Tomcat, I didn't include the extraction libraries in the Tomcat webapp, and I got all the lazy-loading problems, etc. I even tried to get Apache Tika... until I did this:
shut down Tomcat
copy the libraries from \apache-solr-3.5.0\contrib\extraction to \apache-tomcat-7.0.26\webapps\solr\WEB-INF\lib (a sketch of the copy follows below)
start up Tomcat
cheers
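A sketch of that copy on a Unix-style shell (the answer uses Windows paths; adjust to your layout):
cp apache-solr-3.5.0/contrib/extraction/lib/*.jar apache-tomcat-7.0.26/webapps/solr/WEB-INF/lib/
cp apache-solr-3.5.0/dist/apache-solr-cell-3.5.0.jar apache-tomcat-7.0.26/webapps/solr/WEB-INF/lib/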
