I have a Flink application. After generating a fat jar with all its dependencies, I can submit it successfully to the remote cluster using the flink run command. However, when I try to submit the application from the IntelliJ IDE, I get the following error:
Caused by: org.apache.flink.streaming.runtime.tasks.StreamTaskException: Cannot load user class: MyFlink$1
ClassLoader info: URL ClassLoader:
file: '/tmp/blobStore-0fe289f8-b35b-4666-8402-67f9f6a22f55/cache/blob_3fd776b533f2268b6cf7ef1cc62b187bc4513c99' (valid JAR)
Class not resolvable through given classloader.
I packaged the dependencies into a single jar file and pass it to the createRemoteEnvironment method:
StreamExecutionEnvironment env = StreamExecutionEnvironment.createRemoteEnvironment(
"jobmanager", 6123, 2,
"FlinkProcessing_dependencies.jar"
);
How can I make this error disappear?
Note: if I also pass the jars containing my user-defined classes in addition to the dependencies, it runs successfully, but I don't want to export a new jar file every time I change my classes!
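For reference, the working variant looks something like the sketch below; createRemoteEnvironment accepts a varargs list of jar files, and FlinkProcessing_classes.jar is a made-up name standing in for the extra jar that holds the user-defined classes (MyFlink and its anonymous inner classes such as MyFlink$1):

// Sketch: ship the user classes alongside the dependencies.
// "FlinkProcessing_classes.jar" is a hypothetical jar name.
StreamExecutionEnvironment env = StreamExecutionEnvironment.createRemoteEnvironment(
        "jobmanager", 6123, 2,              // host, port, parallelism
        "FlinkProcessing_dependencies.jar", // third-party dependencies
        "FlinkProcessing_classes.jar"       // user-defined classes
);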
I am using Apache Flink version 1.13.1
I wrote a custom metrics reporter, but the JobManager does not seem to recognise it. On startup, the JobManager shows the following warning log:
2021-08-25 14:54:06,243 WARN org.apache.flink.runtime.metrics.ReporterSetup [] - The reporter factory (org.apache.flink.metrics.kafka.KafkaReporterFactory) could not be found for reporter kafka. Available factories: [org.apache.flink.metrics.slf4j.Slf4jReporterFactory, org.apache.flink.metrics.datadog.DatadogHttpReporterFactory, org.apache.flink.metrics.prometheus.PrometheusPushGatewayReporterFactory, org.apache.flink.metrics.graphite.GraphiteReporterFactory, org.apache.flink.metrics.statsd.StatsDReporterFactory, org.apache.flink.metrics.prometheus.PrometheusReporterFactory, org.apache.flink.metrics.jmx.JMXReporterFactory, org.apache.flink.metrics.influxdb.InfluxdbReporterFactory].
2021-08-25 14:54:06,245 INFO org.apache.flink.runtime.metrics.MetricRegistryImpl [] - No metrics reporter configured, no metrics will be exposed/reported.
I have a folder within the Flink plugins folder called metrics-kafka, which contains the packaged jar for the metrics reporter. I have also copied this jar to the lib folder; neither approach worked. See the configuration and code used below.
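For reference, the plugin layout described above looks like this (assuming the jar is named metrics-kafka.jar; Flink expects each plugin in its own subfolder of plugins/):

plugins/
    metrics-kafka/
        metrics-kafka.jar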
Flink configuration file:
metrics.reporter.kafka.factory.class: org.apache.flink.metrics.kafka.KafkaReporterFactory
metrics.reporter.kafka.class: org.apache.flink.metrics.kafka.KafkaReporter
metrics.reporter.kafka.interval: 15 SECONDS
Metrics reporter factory class:
package org.apache.flink.metrics.kafka

import org.apache.flink.metrics.reporter.{InterceptInstantiationViaReflection, MetricReporter, MetricReporterFactory}

import java.util.Properties

@InterceptInstantiationViaReflection(reporterClassName = "org.apache.flink.metrics.kafka.KafkaReporter")
class KafkaReporterFactory extends MetricReporterFactory {
  override def createMetricReporter(properties: Properties): MetricReporter = {
    new KafkaReporter()
  }
}
Metrics reporter class:
package org.apache.flink.metrics.kafka

import org.apache.flink.metrics.MetricConfig
import org.apache.flink.metrics.reporter.{InstantiateViaFactory, Scheduled}

@InstantiateViaFactory(factoryClassName = "org.apache.flink.metrics.kafka.KafkaReporterFactory")
class KafkaReporter extends MyAbstractReporter with Scheduled {
  ...
}
I found that I needed to add a file called org.apache.flink.metrics.reporter.MetricReporterFactory, containing the line org.apache.flink.metrics.kafka.KafkaReporterFactory, in /resources/META-INF/services/. This is the standard Java ServiceLoader mechanism that Flink uses to discover reporter factories.
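Concretely, the service registration follows the standard ServiceLoader layout:

src/main/resources/META-INF/services/org.apache.flink.metrics.reporter.MetricReporterFactory

with this single line as its contents:

org.apache.flink.metrics.kafka.KafkaReporterFactory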
I am very new to Zeppelin/Spark and couldn't find an accurate description of the steps for configuring new dependencies such as NLP libraries.
I found a similar issue here.
I was trying to use the John Snow Labs NLP library in a Zeppelin notebook (Spark version 2.2.1).
My setup included:
In Zeppelin's interpreter settings for Spark, include the following artifact:
com.johnsnowlabs.nlp:spark-nlp_2.11:2.5.4
Then, in conf/zeppelin-env.sh, set SPARK_SUBMIT_OPTIONS:
export SPARK_SUBMIT_OPTIONS="--packages JohnSnowLabs:spark-nlp:2.2.2"
Then I restarted Zeppelin.
But the program below gives this error:
%spark
import com.johnsnowlabs.nlp.base._
import com.johnsnowlabs.nlp.annotator._
<console>:26: error: object johnsnowlabs is not a member of package com
import com.johnsnowlabs.nlp.base._
^
<console>:27: error: object johnsnowlabs is not a member of package com
import com.johnsnowlabs.nlp.annotator._
Can someone please share how this can be done? I referred to this link.
TIA
You don't need to edit conf/zeppelin-env.sh (and in any case you're using it incorrectly, since you're specifying a completely different version there); you can make all the changes via the Zeppelin UI. Go to the Spark interpreter configuration and put com.johnsnowlabs.nlp:spark-nlp_2.11:2.5.4 into the spark.jars.packages configuration property (adding it if it doesn't exist), and also into the Dependencies section at the end of the configuration (for some reason it isn't automatically pulled into the driver classpath).
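After the change, the relevant part of the Spark interpreter configuration should look roughly like this sketch (edited on the Interpreter page of the Zeppelin UI):

spark.jars.packages    com.johnsnowlabs.nlp:spark-nlp_2.11:2.5.4

plus the same coordinate added as an artifact under the Dependencies section at the bottom of that page.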
I'm currently following the Vespa tutorials and ran into an issue with the HTTP API use case. Everything works fine, from mvn install package through vespa-deploy prepare target/application.zip.
The call to vespa-deploy activate returns normally, but the application then never becomes available on localhost:8080. Looking at /opt/vespa/logs/vespa/vespa.log (in the VM), one finds the following stack trace:
Container.com.yahoo.jdisc.core.StandaloneMain error Unexpected:
exception=
java.lang.IllegalArgumentException: Could not create a component with id 'com.mydomain.demo.DemoComponent'.
Tried to load class directly, since no bundle was found for spec: sample-app-http-api-searcher.
If a bundle with the same name is installed, there is a either a version mismatch or the installed bundle's version contains a qualifier string.
at com.yahoo.osgi.OsgiImpl.resolveFromClassPath(OsgiImpl.java:48)
...
This occurred on a fresh Docker image with a clean clone of the sample-apps git repository. Preparing and activating the basic sample as well as the other HTTP example worked seamlessly.
I checked the sources and the xml files for obvious problems, but I don't have any clue about what is failing or where.
target/application.zip contains
application/components/http-api-using-searcher-1.0.1-deploy.jar
application/hosts.xml
application/searchdefinitions/basic.sd
application/services.xml
And the jar itself does contain a com/mydomain/demo/DemoComponent.class file (among other things).
Potentially related issue on the GitHub tracker: https://github.com/vespa-engine/vespa/issues/3479. I'll be posting a link to this question there as well, but I still think it's worth an SO question, at least to get some action behind the vespa tag :)
The bundle id in the application's services.xml file was wrong. Please pull the application from git and try again now. See also PR: https://github.com/vespa-engine/sample-apps/pull/18
Brief explanation: The bundle id given in the bundle="<id>" declaration in services.xml must match the 'Bundle-SymbolicName' in the bundle's manifest. When the bundle has been built with the Vespa bundle-plugin, the symbolic name is by default the same as the project's artifactId. Hence, in most cases you just have to verify that the bundle id matches the artifactId.
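As an illustration, here are hypothetical snippets of the two places that must agree; the artifactId below matches the deploy jar name listed in the question:

<!-- pom.xml: with the Vespa bundle-plugin, Bundle-SymbolicName defaults to this -->
<artifactId>http-api-using-searcher</artifactId>

<!-- services.xml: the bundle attribute must match that symbolic name -->
<component id="com.mydomain.demo.DemoComponent" bundle="http-api-using-searcher" />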
I need to know how to properly add existing Java class files to an Android Studio project. My goal is to use these classes in an Android project.
The class files were originally written in Eclipse for another Java project.
I've already tried File -> New -> New Module -> selecting Java Library -> Finish, but that doesn't work properly.
As you probably know, it creates the MyClass class by default.
For testing, I imported com.example.* in my MainActivity and tried to construct an object of that class inside the onCreate() method.
The problem is that the project won't compile. I get the following errors:
Error:(7, 1) error: package com.example does not exist
Error:(16, 9) error: cannot find symbol class MyClass
Note: C:\Users\...\MainActivity.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Error:Execution failed for task ':app:compileDebugJava'.
> Compilation failed; see the compiler error output for details.
Can anybody explain how to import my Java class files correctly so that I can use them in my project?
You can add them as a local library module to your project in Android Studio.
In the Android Project window, right-click on app and select New -> Module.
In the Create New Module window, select Java Library and click Next. Enter the module name, for example HttpClient, in the Library Name field. Set the Java package name to be the same as your existing package, com.example.xxx. Then enter the name of one of your existing class files, AnyFileName, in the Java Class Name field.
A new module is now created with the name HttpClient and the package name com.example.xxx, containing an empty class file AnyFileName.java.
Now copy all your .java files into the HttpClient folder created inside your Android project; this also overwrites the empty AnyFileName.java.
After copying, all the .java files are automatically part of the library module.
You will also get a third build.gradle file for your module; you probably already have two build.gradle files in your Android project.
In your app's build.gradle file, include the local library dependency compile project(":HttpClient"). Now you can import the Java classes from the HttpClient module in your Android app's code, as sketched below.
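A minimal sketch of the resulting Gradle wiring, assuming the module is named HttpClient as above (the compile configuration matches this Android Studio version; newer Gradle plugin versions use implementation instead):

// settings.gradle -- the module wizard normally adds this include for you
include ':app', ':HttpClient'

// app/build.gradle
dependencies {
    compile project(':HttpClient')
}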
Note: the above information is based on Android Studio 2.3.3.
I have a custom Camel component packaged in a separate jar. As advised here:
http://camel.apache.org/how-do-i-add-a-component.html
I created a file META-INF/services/org/apache/camel/component/aq (aq is the component scheme name) containing:
class=<full class name>
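For example, with a component class like com.mydomain.camel.AQComponent (the package name is made up here; the class name matches the AQComponent used below), the file would read:

class=com.mydomain.camel.AQComponent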
Everything works when I run a test program standalone. However, when I try deploying it into a container (ServiceMix, Karaf), it cannot resolve the component scheme name:
org.apache.camel.RuntimeCamelException: org.apache.camel.FailedToCreateRouteException: Failed to create route route7: Route(route7)[[From[aq:oprDequeuer]] -> [WireTap[properties:... because of Failed to resolve endpoint: aq://queue1 due to: No component found with scheme: aq
Also, when I register the component explicitly:
CamelContext context = getContext();
context.addComponent("aq", new AQComponent(context));
it works fine, including in ServiceMix.
Make sure that the file in META-INF is included in the JAR.
If that file is missing, the component cannot be auto-discovered, and that is your problem. Since you are building the component for OSGi, maybe the Felix bundle plugin somehow does not include that file.
I suggest double-checking this by looking inside the built JAR to see whether the file is included.
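One quick way to check, using a hypothetical jar name:

jar tf aq-component-1.0.jar | grep 'org/apache/camel/component'
# should list: META-INF/services/org/apache/camel/component/aq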