I am using Apache Flink version 1.13.1
I wrote a custom metrics reporter, but the JobManager does not seem to recognise it. On startup, the JobManager shows the following warning log:
2021-08-25 14:54:06,243 WARN org.apache.flink.runtime.metrics.ReporterSetup [] - The reporter factory (org.apache.flink.metrics.kafka.KafkaReporterFactory) could not be found for reporter kafka. Available factories: [org.apache.flink.metrics.slf4j.Slf4jReporterFactory, org.apache.flink.metrics.datadog.DatadogHttpReporterFactory, org.apache.flink.metrics.prometheus.PrometheusPushGatewayReporterFactory, org.apache.flink.metrics.graphite.GraphiteReporterFactory, org.apache.flink.metrics.statsd.StatsDReporterFactory, org.apache.flink.metrics.prometheus.PrometheusReporterFactory, org.apache.flink.metrics.jmx.JMXReporterFactory, org.apache.flink.metrics.influxdb.InfluxdbReporterFactory].
2021-08-25 14:54:06,245 INFO org.apache.flink.runtime.metrics.MetricRegistryImpl [] - No metrics reporter configured, no metrics will be exposed/reported.
I have a folder within the Flink plugins folder called metrics-kafka which contains the packaged jar for the metrics reporter. I have also copied this jar to the lib folder; neither approach worked. See the configuration and code used below.
Flink configuration file:
metrics.reporter.kafka.factory.class: org.apache.flink.metrics.kafka.KafkaReporterFactory
metrics.reporter.kafka.class: org.apache.flink.metrics.kafka.KafkaReporter
metrics.reporter.kafka.interval: 15 SECONDS
Metrics reporter factory class:
package org.apache.flink.metrics.kafka

import org.apache.flink.metrics.reporter.{InterceptInstantiationViaReflection, MetricReporter, MetricReporterFactory}

import java.util.Properties

@InterceptInstantiationViaReflection(reporterClassName = "org.apache.flink.metrics.kafka.KafkaReporter")
class KafkaReporterFactory extends MetricReporterFactory {
  override def createMetricReporter(properties: Properties): MetricReporter = {
    new KafkaReporter()
  }
}
Metrics reporter class:
package org.apache.flink.metrics.kafka

import org.apache.flink.metrics.MetricConfig
import org.apache.flink.metrics.reporter.{InstantiateViaFactory, Scheduled}

@InstantiateViaFactory(factoryClassName = "org.apache.flink.metrics.kafka.KafkaReporterFactory")
class KafkaReporter extends MyAbstractReporter with Scheduled {
  ...
}
I found that I needed to add a file called org.apache.flink.metrics.reporter.MetricReporterFactory, containing the line org.apache.flink.metrics.kafka.KafkaReporterFactory, under /resources/META-INF/services/.
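For reference (assuming a standard Maven/SBT resources layout), the packaged reporter jar should then contain the service entry alongside the classes, roughly:

```
metrics-kafka.jar
├── META-INF/services/org.apache.flink.metrics.reporter.MetricReporterFactory
│       (contains the line: org.apache.flink.metrics.kafka.KafkaReporterFactory)
├── org/apache/flink/metrics/kafka/KafkaReporterFactory.class
└── org/apache/flink/metrics/kafka/KafkaReporter.class
```

Flink discovers reporter factories through Java's ServiceLoader mechanism, so without this provider-configuration file the factory never shows up in the "Available factories" list from the warning above.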
I was trying to execute a basic test from a Cucumber feature file but got the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: io/cucumber/plugin/SummaryPrinter
Feature File:
Feature: Login Action

  Scenario: Successful Login with Valid Credentials
    Given User is on Home Page
    When User Navigate to LogIn Page
    And User enters UserName and Password
    Then Message displayed Login Successfully

  Scenario: Successful LogOut
    When User LogOut from the Application
    Then Message displayed LogOut Successfully
Test Runner File:
package test;

import org.junit.runner.RunWith;
import io.cucumber.junit.Cucumber;
import io.cucumber.junit.CucumberOptions;

@RunWith(Cucumber.class)
@CucumberOptions(
        features = "Feature",
        glue = {"stepdefinition"}
)
public class TestRunner {
}
I am using Eclipse and have installed the Cucumber plugin through the Eclipse Marketplace.
Can anyone help me fix this issue?
package cucumberOptions;

import cucumber.api.CucumberOptions;
import cucumber.api.testng.AbstractTestNGCucumberTests;

//@RunWith(Cucumber.class)
@CucumberOptions(
        features = "src/test/java/features",
        glue = "stepDefinations")
public class TestRunner extends AbstractTestNGCucumberTests {
}
Just keep in mind that features and stepDefinations are folders at the same level. My TestRunner class is inside a folder called cucumberOptions, which is at the same level as the other two folders; in my case they are all inside the test folder, since this is a Maven-style project.
It looks like a dependency for one of the jars is missing. It has really been hectic and cumbersome with new versions of Cucumber, as they keep coming with breaking changes. I saw the same error while upgrading from Cucumber 5.1.3 to 5.7.0. After spending some time on it, I resolved it by adding the additional dependency below to the pom:
<dependency>
    <groupId>io.cucumber</groupId>
    <artifactId>cucumber-plugin</artifactId>
    <version>5.7.0</version>
</dependency>
I have a Flink application, and after generating a fat jar with all its dependencies I can submit it successfully to the remote cluster using the flink run command. Trying to submit the application from the IntelliJ IDE, I got the following error:
Caused by: org.apache.flink.streaming.runtime.tasks.StreamTaskException: Cannot load user class: MyFlink$1
ClassLoader info: URL ClassLoader:
file: '/tmp/blobStore-0fe289f8-b35b-4666-8402-67f9f6a22f55/cache/blob_3fd776b533f2268b6cf7ef1cc62b187bc4513c99' (valid JAR)
Class not resolvable through given classloader.
I packaged the dependencies in a single jar file and pass it to the createRemoteEnvironment method.
StreamExecutionEnvironment env = StreamExecutionEnvironment.createRemoteEnvironment(
        "jobmanager", 6123, 2,
        "FlinkProcessing_dependencies.jar"
);
How can I make this error disappear?
Note: when I pass all my user-defined classes in addition to the dependencies, it runs successfully, but I don't want to package my classes into jars and export a new jar file every time I change them!
I am going to integrate the Salesforce library into a Liferay 7 MVC portlet. The following steps are what I did:
Add the libraries to the class path. In Eclipse, go to Project > Properties > Java Build Path > Libraries > Add External JARs, then add the sfdc-wsc JAR to this list.
Add the line below to build.gradle:
compile group: 'com.force.api', name: 'force-wsc', version: '40.1.1'
The Java source code is fine until I use gradle build to build the project, at which point the following errors occur:
error: package com.sforce.soap.enterprise does not exist
import com.sforce.soap.enterprise.EnterpriseConnection;
error: package com.sforce.soap.enterprise does not exist
import com.sforce.soap.enterprise.QueryResult;
error: package com.sforce.soap.enterprise does not exist
import com.sforce.soap.enterprise.SaveResult;
I also set the bnd file as follows, according to a blog post by DAVID H NEBINGER (https://web.liferay.com/web/user.26526/blog/-/blogs/osgi-module-dependencies), but nothing improved:
Bundle-ClassPath: .,\
    lib/externalLib.jar
-includeresource:\
    lib/externalLib.jar=externalLib.jar,\
    lib/commons-lang.jar=commons-lang-[0-9]*.jar
Please give any suggestions to correct this.
Thanks in advance
I recently developed such a solution, but I used a different approach: I implemented an OSGi bundle that exports Salesforce's SOAP APIs. This way you can use the Salesforce APIs in any other Liferay bundle.
In the Salesforce SOAP API Client OSGi Bundle repository you can find the sources. The OSGi bundle is also available on Maven Central.
Once you install the Salesforce SOAP API Client OSGi bundle, you can use it in any other Liferay bundle, such as your MVC Portlet. This sample project Salesforce Liferay Gogo Shell Command Client implements a set of Gogo Shell commands that allow us to interact with the Salesforce CRM system.
In your particular case, if you want to include external libraries via Gradle, you can declare your dependency with the compileInclude configuration.
dependencies {
    compileOnly group: "org.osgi", name: "org.osgi.core", version: "6.0.0"
    compileOnly group: "org.osgi", name: "org.osgi.service.component.annotations", version: "1.3.0"
    compileOnly group: "com.liferay.portal", name: "com.liferay.portal.kernel", version: "2.6.0"
    compileOnly group: "org.apache.felix", name: "org.apache.felix.gogo.runtime", version: '1.0.6'
    compileInclude group: 'org.fusesource.jansi', name: 'jansi', version: '1.16'
    compileInclude 'de.vandermeer:asciitable:0.3.2'
}
This way you do not have to do anything in the bnd file. The external jar will, as if by magic, be placed inside your bundle, and the MANIFEST will be correct.
If you want to generate your own stubs, go to the Force.com Web Service Connector (WSC).
I've been reading the Allure wiki, but I do not seem to be able to get started with Allure.
I have an IntelliJ project which I use to run JUnit tests with Selenium. I want to add Allure for better feedback after test runs, but I have not been able to understand how Allure would integrate with the rest of my project.
On the wiki page for JUnit it looks like Allure with JUnit only supports Maven projects. How can I set up Allure to work with an IntelliJ project?
I want to add Allure for better feedback after test runs
It is strange that you don't have a build tool.
But for a single test (as you mention) the following will work.
Dependencies: you need either allure-junit-adaptor or allure-testng-adaptor.
Allure implements a test listener, which should be added to the test runner:
For TestNG this happens automatically (once you add the adaptor dependency).
For JUnit you should add the listener manually. I don't know how to add it to the IntelliJ IDEA JUnit runner, but you can always run the tests programmatically:
// AllureRunListener comes from the allure-junit-adaptor dependency
public static void main(String[] args) {
    JUnitCore runner = new JUnitCore();
    runner.addListener(new AllureRunListener());
    runner.run(CalculatorTest.class);
}
That will generate an XML report in the target/allure-results folder.
If you need advanced Allure features like file attachments and test steps, you need another dependency (aspectjweaver) and the corresponding JVM args, e.g.
-javaagent:lib/aspectjweaver-1.8.7.jar
To generate an HTML report from the existing XML report you can:
either use the Allure CLI (requires tool installation: http://wiki.qatools.ru/display/AL/Allure+Commandline)
or use 'mvn site' on an existing project (e.g. https://github.com/allure-examples/allure-junit-example)
Open your HTML report in Firefox (or look here for how to open a locally generated report in Chrome).
To get allure-results after launching JUnit4 tests from the IDE (using the context menu or the gutter controls), I use the @RunWith annotation with a custom runner that registers the AllureJunit4 listener.
CustomRunner class
package ru.atconsulting.qa.system.junit4runners;

import io.qameta.allure.junit4.AllureJunit4;
import org.junit.runner.notification.RunNotifier;
import org.junit.runners.BlockJUnit4ClassRunner;
import org.junit.runners.model.InitializationError;

public class CustomRunner extends BlockJUnit4ClassRunner {

    /**
     * Creates a BlockJUnit4ClassRunner to run {@code klass}
     *
     * @param klass
     * @throws InitializationError if the test class is malformed.
     */
    public CustomRunner(Class<?> klass) throws InitializationError {
        super(klass);
    }

    @Override
    public void run(RunNotifier notifier) {
        notifier.addListener(new AllureJunit4()); // just add the listener
        super.run(notifier);
    }
}
Using the @RunWith annotation in test classes:

@RunWith(CustomRunner.class)
public class ExampleSuite {

    @Test
    public void testExample() {
        ....
    }
}
I am building a GAE webapp using Python. I am also using the Datastore and am trying to bulk upload data to the DB from the terminal with a CSV file, as per:
https://developers.google.com/appengine/docs/python/tools/uploadingdata
I have created a Loader class in a separate .py file in my app root directory. I am not really sure whether this loader class should be in my main.py webapp file or in another file in the root directory.
Loader class:
import datetime
from google.appengine.ext import db
from google.appengine.tools import bulkloader
import models

class FTSELoader(bulkloader.Loader):
    def __init__(self):
        bulkloader.Loader.__init__(self, 'FTSE',
                                   [('date', lambda x: datetime.datetime.strptime(x, '%Y/%m/%d')),
                                    ('close', float)])

loaders = [FTSELoader]
My kind class (i.e. my Datastore table) that I am trying to create/upload is called "FTSE". I then run this command in Terminal:
appcfg.py upload_data --config_file=FTSEdataloader.py --filename=FTSEdata.csv --kind=FTSE --url=http://<myapp.appspot.com>/_ah/remote_api
I get the following error:
File "FTSEdataloader.py", line 4, in <module>
    import models
ImportError: No module named models
I do not have a "models.py" like in the GAE demonstration. What should take its place?
Thanks
I had the same problem. I'm not sure why appcfg.py can't find the models module when running the upload script, but I worked around the problem like this:
import datetime
from google.appengine.ext import db
from google.appengine.tools import bulkloader

class FTSE(db.Model):
    date = db.DateTimeProperty()
    close = db.FloatProperty()

class FTSELoader(bulkloader.Loader):
    def __init__(self):
        bulkloader.Loader.__init__(self, 'FTSE',
                                   [('date', lambda x: datetime.datetime.strptime(x, '%Y/%m/%d')),
                                    ('close', float)])

loaders = [FTSELoader]
Basically it just puts your model definition into the bulkloader config file. It certainly isn't the best way to do this, but it works around the PYTHONPATH problem that appcfg.py seems to have when it is running the bulk upload.
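The converter functions in the loader config are ordinary callables applied column by column. As a minimal stdlib sketch of what the bulkloader does with each CSV row (convert_row is an illustrative helper, not part of the bulkloader API):

```python
import datetime

# Column converters as used in FTSELoader above: one (name, callable)
# pair per CSV column, applied to the raw string value of that column.
converters = [
    ('date', lambda x: datetime.datetime.strptime(x, '%Y/%m/%d')),
    ('close', float),
]

def convert_row(row):
    """Apply each column's converter to the matching CSV field."""
    return {name: conv(value) for (name, conv), value in zip(converters, row)}

record = convert_row(['2012/01/31', '5681.61'])
# record['date'] is now a datetime, record['close'] a float
```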
You do this using a file of Python code. The file imports or defines the Model classes for the entities being created, defines a loader class for each kind you wish to import, and declares the available loader classes in a global variable.
For example, say you have a Model class named "FTSE" defined in a file named models.py (which is on your PYTHONPATH, such as the directory where you'll run the tool, e.g. C:\Python27) that resembles the following:
models.py
from google.appengine.ext import db

class FTSE(db.Model):
    date = db.DateProperty()
    close = db.FloatProperty()