Zeppelin: spark-interpreter-0.10.0.jar file not found while running Spark code - apache-zeppelin

Getting the following error while executing Spark code through Zeppelin:
ERROR deploy.ClientEndpoint: Exception from cluster was: java.nio.file.NoSuchFileException: /opt/zeppelin/zeppelin/interpreter/spark/spark-interpreter-0.10.0.jar
java.nio.file.NoSuchFileException: /opt/zeppelin/zeppelin/interpreter/spark/spark-interpreter-0.10.0.jar
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
at sun.nio.fs.UnixCopyFile.copy(UnixCopyFile.java:526)
at sun.nio.fs.UnixFileSystemProvider.copy(UnixFileSystemProvider.java:253)
at java.nio.file.Files.copy(Files.java:1274)
at org.apache.spark.util.Utils$.copyRecursive(Utils.scala:726)
at org.apache.spark.util.Utils$.copyFile(Utils.scala:697)
at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:771)
at org.apache.spark.util.Utils$.fetchFile(Utils.scala:541)
at org.apache.spark.deploy.worker.DriverRunner.downloadUserJar(DriverRunner.scala:162)
at org.apache.spark.deploy.worker.DriverRunner.prepareAndRunDriver(DriverRunner.scala:180)
at org.apache.spark.deploy.worker.DriverRunner$$anon$2.run(DriverRunner.scala:99)

You should set the property deployMode to client. In cluster deploy mode the driver is launched on a worker node, which then tries to copy the interpreter jar from a local path that only exists on the Zeppelin host, hence the NoSuchFileException above.
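The property can be set as spark.submit.deployMode in the Spark interpreter settings; if you drive everything from conf/zeppelin-env.sh instead, a minimal sketch of the equivalent would be:

# conf/zeppelin-env.sh
export SPARK_SUBMIT_OPTIONS="--deploy-mode client"

With client mode the driver runs on the Zeppelin host itself, where the interpreter jar actually exists.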

I was working on a setup of Zeppelin 0.10, Spark 3.x, and YARN. This issue got resolved; please refer to the image.

Related

java.lang.ClassNotFoundException: org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer when trying to run spark job in zeppelin

1. Summarizing the problem
I have built Zeppelin from the source code by running the command below.
mvn clean package -DskipTests -Pspark-2.3 -Pscala-2.11
The build was successful.
I launched Apache Zeppelin on a Kubernetes cluster and could see that zeppelin-server starts perfectly fine,
but when trying to run a Spark notebook, the Spark interpreter pod goes into a Completed/Succeeded state with the errors below in spark-interpreter.log:
WARN [2020-03-06 00:42:37,683] ({main} Logging.scala[logWarning]:87) - Failed to load org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer.
java.lang.ClassNotFoundException: org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer
2. Describe what you've tried
I have not found any resolution, so I could not try any solution to this problem yet.
Any suggestions or ideas would be highly appreciated.
I figured out the issue and was able to resolve it by adding --jars with the interpreter and Spark jars in the zeppelin-env.sh script (sketched below), but later got stuck on a different issue.
The interpreter now starts, but it is unable to launch executors.
Below is the error message; if anybody can provide any inputs, I would appreciate it.
java.lang.NoClassDefFoundError: org/sonatype/aether/resolution/DependencyResolutionException
Thank you.
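
For reference, a sketch of the --jars workaround mentioned above, in conf/zeppelin-env.sh; the exact jar paths are not given in the thread, so this is illustrative:

# conf/zeppelin-env.sh -- pass the interpreter jar explicitly to spark-submit;
# ZEPPELIN_HOME must point at your installation, and additional Spark jars
# would be appended to the comma-separated list as needed
export SPARK_SUBMIT_OPTIONS="--jars $(ls ${ZEPPELIN_HOME}/interpreter/spark/spark-interpreter-*.jar)"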

Flink: Error while starting scala-shell - Could not create the DispatcherResourceManagerComponent

I am using Flink version 1.10.0, and when starting the Scala shell with start-scala-shell.sh it throws the following exception:
Exception in thread "main" org.apache.flink.util.FlinkException: Could not create the DispatcherResourceManagerComponent.
at org.apache.flink.runtime.entrypoint.component.DefaultDispatcherResourceManagerComponentFactory.create(DefaultDispatcherResourceManagerComponentFactory.java:261)
I've changed the rest port from 8081 to 8089, but I'm still facing the same issue.
I even tried this with Flink 1.9.2, but hit the same problem.
Kindly help!
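
For context, the rest port change mentioned above is the rest.port entry in conf/flink-conf.yaml:

# conf/flink-conf.yaml
rest.port: 8089

One common cause of "Could not create the DispatcherResourceManagerComponent" is that the REST endpoint cannot bind because the port is already in use, so it is worth verifying the chosen port is actually free (e.g. with netstat or lsof) rather than just picking another number.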

Building flink from source---error of resolving dependencies

I am trying to build Apache Flink from source, strictly following the instructions on the Flink GitHub page: https://github.com/apache/flink
I am facing an error resolving the dependencies for the project flink-mapr-fs.
I first tried the latest Maven, and afterwards, as per the instructions on the Flink GitHub page, I also tried Maven 3.1.1, but hit the same error.
In addition, I tried Flink 1.8.0 and 1.7.2 as well, but the same problem exists. The details of the error are below:
[ERROR] Failed to execute goal on project flink-mapr-fs: Could not resolve dependencies for project org.apache.flink:flink-mapr-fs:jar:1.7.2: Could not transfer artifact com.mapr.hadoop:maprfs:jar:5.2.1-mapr from/to mapr-releases (http://repository.mapr.com/maven/): GET request of: com/mapr/hadoop/maprfs/5.2.1-mapr/maprfs-5.2.1-mapr.jar from mapr-releases failed: Premature end of Content-Length delimited message body (expected: 49411916; received: 39845888 -> [Help 1]
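
As a general note (not a confirmed fix for this particular thread), "Premature end of Content-Length delimited message body" means the download was cut off mid-transfer, leaving a partial artifact in the local repository. Deleting it and forcing Maven to re-resolve is a common way to retry:

# remove the partially downloaded MapR artifact, then force a fresh download
rm -rf ~/.m2/repository/com/mapr/hadoop/maprfs
mvn clean install -DskipTests -U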

Apache Flink Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/Traversable

I have just started learning Apache Flink and found the guide to start development in the Eclipse IDE.
I followed it to start off, but am getting the error below:
00:20:26,993 INFO org.apache.flink.api.java.ExecutionEnvironment - The job has 0 registered types and 0 default Kryo serializers
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/Traversable
at java.lang.ClassLoader.defineClass1(Native Method)
I have placed the error log file here ... Please let me know if you require more details. Thanks, Nyamath
java.util.zip.ZipException: invalid LOC header (bad signature)
Your Scala jar file provided by Maven seems to be corrupted. Please update your Maven dependencies by executing this from your project folder on the command line:
mvn -U clean install
In Eclipse, right-click on your project and click on Update --> Maven Dependencies.
If that does not work, you'll need to delete the corrupted jar file in the ~/.m2/repository/ folder.
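
A sketch of that last step, assuming the corrupted artifact is the Scala library (the actual directory depends on which jar the ZipException points at):

# delete the suspect artifact from the local Maven repository, then re-download
rm -rf ~/.m2/repository/org/scala-lang/scala-library
mvn -U clean install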

Solr: 404 error with getting admin page

I've installed Solr on my Ubuntu machine at this path:
/opt/solr/solr-4.10.2
After installing, I started Solr with sudo bin/solr start from the /opt/solr/solr-4.10.2 directory.
As far as I can tell, it started successfully:
Waiting to see Solr listening on port 8983 [/]
Started Solr server on port 8983 (pid=8385). Happy searching!
But when I try to reach the admin page at
http://localhost:8983/solr
I get a 404 error:
HTTP ERROR: 404
Problem accessing /solr. Reason:
Not Found
Powered by Jetty://
Do you have any suggestions about what is going wrong and where to look in order to fix this problem?
Since this error can be caused by a lot of things, you need to check the log file and debug the execution.
First of all, open your node log file, located in /opt/solr/solr-4.10.2/node1/log, and look for anything suspicious (search for ERROR entries).
Generally, this error occurs when there is a mismatch between the JDK version Solr requires and your current JDK.
When I had this problem, I found the following error message in the log file: java.lang.UnsupportedClassVersionError: org/apache/solr/servlet/SolrDispatchFilter : Unsupported major.minor version 51.0, and realized the problem was the Java version.
To solve this, try switching the current JDK with the command sudo update-alternatives --config java.
If the error still occurs, try uninstalling all unused JDKs, because Solr is picking up the wrong path.
The final solution to this issue is to open the file /opt/solr/solr-4.10.2/solr.in.sh and edit SOLR_JAVA_HOME, setting the right JDK path (e.g. /usr/lib/jvm/java-1.7.0).
In short, the secret is to look in the log file and figure out what is causing the issue.
Cheers.
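
A sketch of that final edit; the JDK path below is the example from above, so substitute wherever your target JDK actually lives:

# solr.in.sh -- point Solr at an explicit JDK
SOLR_JAVA_HOME="/usr/lib/jvm/java-1.7.0"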
try:
http://localhost:8983/solr/index.html
[solr's web.xml]
<servlet>
  <servlet-name>LoadAdminUI</servlet-name>
  <servlet-class>org.apache.solr.servlet.LoadAdminUiServlet</servlet-class>
</servlet>
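
For context, the same web.xml maps that servlet to index.html, which is why the direct URL above works (snippet as it appears in Solr 4.x; exact contents may vary by version):

<servlet-mapping>
  <servlet-name>LoadAdminUI</servlet-name>
  <url-pattern>/index.html</url-pattern>
</servlet-mapping>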
