Spark works normally in Zeppelin, but when I add a jar dependency and run something like:
val df = spark.read.json("path/file1")
I get the following error:
com.fasterxml.jackson.databind.JsonMappingException: Jackson version is too old 2.5.4
If I run it a second time I get:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.rdd.RDDOperationScope$
(Not sure why the error changes for the second run)
If I do not attach the jar, the code works normally. Could this be caused by a dependency conflict?
I tried setting something like the following in the Spark interpreter options, but I still get the same error:
artifact: path/utils.jar
exclude: com.fasterxml.jackson.databind (Is this the correct way to write the Jackson exclusion?)
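If I understand the docs correctly, excludes are written as a comma-separated list of groupId:artifactId coordinates, so perhaps it should be:
artifact: path/utils.jar
exclude: com.fasterxml.jackson.core:jackson-databind
Though since utils.jar is a local file, I'm not sure an exclude applies at all; it cannot strip Jackson classes that are already bundled inside the jar itself.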
Any insights?
I have created a package, Xnumber, and I have not registered it.
Now I am creating another package (SHbundle) which uses the above package. Locally, in the Julia REPL, when I execute the following code:
pkg> add https://gitlab.com/vyush/Xnumber.jl.git
julia> using Xnumber
It works fine locally and I can use the functions, but after adding Xnumber as a dependency of SHbundle and pushing, the pipeline script fails.
The command being executed is:
- |
julia --project=@. -e '
using Pkg
Pkg.build()
Pkg.test(coverage=true)'
The error that I get is: ERROR: expected package Xnumber [fdc6275c] to be registered. The package works fine locally but gives this error when the pipeline script executes.
Is there any workaround for this without registering the packages?
The links for these packages are: Xnumber, SHbundle
Remove the /Manifest.toml entry from your .gitignore file.
Commit the Manifest.toml file after running pkg> add https://gitlab.com/vyush/Xnumber.jl.git.
Add an extra Pkg.instantiate() step to your pipeline, as sketched below.
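A minimal sketch of the adjusted CI step, assuming the script block from the question (Pkg.instantiate() installs the dependencies pinned in the committed Manifest.toml, including unregistered packages tracked by URL):
- |
  julia --project=@. -e '
  using Pkg
  Pkg.instantiate()  # installs deps pinned in the committed Manifest.toml
  Pkg.build()
  Pkg.test(coverage=true)'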
I'm trying z.load in Apache Zeppelin as follows:
%dep
z.load("/zeppelin-0.5.6-incubating-bin-all/lplibs/hive/csv-serde-1.0.5-jar-with-dependencies.jar")
I get an ERROR that says (not sure this is the actual error):
Must be used before SparkInterpreter (%spark) initialized
Hint: put this paragraph before any Spark code and restart Zeppelin/Interpreter
This Zeppelin paragraph is the first one in my notebook, so I'm not sure what it's complaining about.
Right now I can't check your problem, but you should restart the interpreter (by pushing the restart button) before loading the dependency jar file.
There is a chance that the SparkContext has already been started by another notebook.
So, as Kangrok mentioned, just restart the Spark interpreter.
Apart from that, why not use the latest Zeppelin, in which you don't need %dep to load your dependencies? Instead, they can be loaded from the Interpreter screen.
More details can be found here https://zeppelin.incubator.apache.org/docs/0.6.0-incubating-SNAPSHOT/manual/dependencymanagement.html
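If you do stay on %dep, a minimal sketch of a paragraph that should run before any Spark code (z.reset() clears previously loaded dependencies; the jar path is the one from the question):
%dep
z.reset()
z.load("/zeppelin-0.5.6-incubating-bin-all/lplibs/hive/csv-serde-1.0.5-jar-with-dependencies.jar")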
I want to log defects automatically in JIRA whenever any test case fails. For that I have used the JiraTestResultReporter plugin. I have successfully been able to create my report.xml file, but I get the errors below:
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
[JiraTestResultReporter] [INFO] Examining test results...
ERROR: Publisher JiraTestResultReporter.JiraReporter aborted due to exception
java.lang.NoSuchMethodError: hudson.model.AbstractBuild.getTestResultAction()Lhudson/tasks/test/AbstractTestResultAction;
at JiraTestResultReporter.JiraReporter.perform(JiraReporter.java:105)
at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:761)
at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:721)
at hudson.model.Build$BuildExecution.post2(Build.java:183)
at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:670)
at hudson.model.Run.execute(Run.java:1766)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:98)
at hudson.model.Executor.run(Executor.java:374)
What version of Jenkins are you working on?
JiraTestResultReporter does not work on Jenkins 1.577+.
It's a known bug.
A workaround is to build and install a snapshot of the plugin.
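Roughly, building the snapshot could look like this (the repository URL is my assumption of where the plugin sources live; the exact .hpi name may differ):
git clone https://github.com/jenkinsci/JiraTestResultReporter-plugin.git
cd JiraTestResultReporter-plugin
mvn clean package -DskipTests
# then upload target/*.hpi via Manage Jenkins > Manage Plugins > Advanced > Upload Plugin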
I am deploying Solr 4.3.0 in Tomcat 7.
Everything works fine except the DataImportHandler. I can go to the
http://localhost:8080/solr/#/collection1/dataimport//dataimport
screen and see the dataimport options load in the UI.
Still, I cannot see any of my entities loaded in the "entity" combo box. Inside the configuration box, on the right side, I can see the error below.
Apache Tomcat/7.0.41 - Error report
HTTP Status 500 - Filter execution threw an exception
type: Exception report
message: Filter execution threw an exception
description: The server encountered an internal error that prevented it from fulfilling this request.
exception: javax.servlet.ServletException: Filter execution threw an exception
root cause:
java.lang.NoClassDefFoundError: org/apache/log4j/spi/LoggingEvent
org.apache.solr.logging.log4j.EventAppender.append(EventAppender.java:35)
org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
org.apache.log4j.Category.callAppenders(Category.java:206)
org.apache.log4j.Category.forcedLog(Category.java:391)
org.apache.log4j.Category.log(Category.java:856)
org.slf4j.impl.Log4jLoggerAdapter.error(Log4jLoggerAdapter.java:498)
org.apache.solr.common.SolrException.log(SolrException.java:119)
org.apache.solr.servlet.ResponseUtils.getErrorInfo(ResponseUtils.java:58)
org.apache.solr.servlet.SolrDispatchFilter.sendError(SolrDispatchFilter.java:691)
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:380)
org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:155)
note: The full stack trace of the root cause is available in the Apache Tomcat/7.0.41 logs.
The problem is that I have log4j-1.2.16.jar on the classpath (it's in Tomcat's lib dir).
Has anyone run into this problem?
Try following the steps outlined in Using the example logging setup in containers other than Jetty. I encountered this same error when running Solr 4.3 until I followed those steps to configure logging.
After changing the directory, did you update the directory path in the solrconfig.xml file?
I just want to make sure: after making the changes in the configuration file, did you restart the Tomcat and Solr servers?
You need to copy slf4j-log4j12-1.6.6.jar from the ext folder of Solr into Tomcat's lib folder.
You also need to put the logging properties file there.
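A sketch of those copy steps, assuming a stock Solr 4.3.0 download and Tomcat at $CATALINA_HOME (in the 4.3 distribution the logging jars live under example/lib/ext and the logging config shipped there is example/resources/log4j.properties; your paths and file names may differ):
cp solr-4.3.0/example/lib/ext/slf4j-log4j12-1.6.6.jar $CATALINA_HOME/lib/
cp solr-4.3.0/example/resources/log4j.properties $CATALINA_HOME/lib/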
So I followed the tutorial on the H2 documentation page and used the "Connecting to a Database using JDBC" method of connecting to the database. I first added the h2-*.jar file to the lib folder (through NetBeans) and used the following to make the connection to the database I had previously created.
// Load the H2 JDBC driver, then open the embedded database in the user's home directory
Class.forName("org.h2.Driver");
connection = DriverManager.getConnection("jdbc:h2:~/" + DatabaseName);
This turned out to work in the IDE environment; however, when I attempt to run the application directly from the executable jar, I get the following error:
java.lang.ClassNotFoundException: org.h2.Driver ...
This error occurs at the Class.forName() call. So I did a little looking around and found that this is a well-known problem. One solution people use is to get the class loader from the current thread, like so:
Thread t = Thread.currentThread();
ClassLoader cl = t.getContextClassLoader();
Class<?> toRun = cl.loadClass("org.h2.Driver");
Unfortunately this still results in the same error, so I'm wondering what I'm doing wrong. Should I be doing something to make sure the driver is on the classpath? I have no idea how, if that's the case.
Thanks!
You need to add the h2-*.jar file to the classpath when running the application. Note that java ignores -cp when -jar is used, so name the main class explicitly, for example (use ; instead of : as the separator on Windows):
java -cp "h2.jar:yourApp.jar" your.main.MainClass
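Alternatively, if you want the double-clickable jar to keep working, a sketch using the jar manifest (names are placeholders; the Class-Path entry is resolved relative to yourApp.jar's location): add to yourApp.jar's META-INF/MANIFEST.MF
Class-Path: h2.jar
and place h2.jar next to yourApp.jar; then java -jar yourApp.jar can load org.h2.Driver.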