I created a program in Flink (Java) to calculate the average of 9 fake sensors in 3 different rooms. The program runs fine when I start the jar file directly. So I decided to start the Flink standalone cluster to check the TaskManagers running my job and its respective tasks, as described here (https://ci.apache.org/projects/flink/flink-docs-stable/tutorials/local_setup.html). I am running everything on my machine.
Why can I not see the job running on the dashboard (http://localhost:8081/#/overview), even though, if I watch the log files (tail -f log/flink--client--*-T430.log), I can see something being processed?
Moreover, the print() method is spilling the output to the console.
I start my application with this command: ./bin/flink run examples/explore-flink.jar -c
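(Note: as far as I understand the CLI, -c expects the fully qualified entry class and must come before the jar, e.g. ./bin/flink run -c your.package.MainClass examples/explore-flink.jar, where the class name is only a placeholder.)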
But maybe there is some parameter in a config file that I have to set. Here is my code:
import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.api.common.state.MapState;
import org.apache.flink.api.common.state.MapStateDescriptor;
import org.apache.flink.api.java.functions.KeySelector;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.CheckpointingMode;
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.sense.flink.mqtt.MqttTemperature;
import org.sense.flink.mqtt.TemperatureMqttConsumer;
public class SensorsMultipleReadingMqttEdgentQEP {
private boolean checkpointEnable = true;
private long checkpointInterval = 1000;
private CheckpointingMode checkpointMode = CheckpointingMode.EXACTLY_ONCE;
public SensorsMultipleReadingMqttEdgentQEP() throws Exception {
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
env.setStreamTimeCharacteristic(TimeCharacteristic.IngestionTime);
if (checkpointEnable)
env.enableCheckpointing(checkpointInterval, checkpointMode);
DataStream<MqttTemperature> temperatureStream01 = env.addSource(new TemperatureMqttConsumer("topic-edgent-01"));
DataStream<MqttTemperature> temperatureStream02 = env.addSource(new TemperatureMqttConsumer("topic-edgent-02"));
DataStream<MqttTemperature> temperatureStream03 = env.addSource(new TemperatureMqttConsumer("topic-edgent-03"));
DataStream<MqttTemperature> temperatureStreams = temperatureStream01.union(temperatureStream02)
.union(temperatureStream03);
DataStream<Tuple2<String, Double>> average = temperatureStreams.keyBy(new TemperatureKeySelector())
.map(new AverageTempMapper());
average.print();
String executionPlan = env.getExecutionPlan();
System.out.println("ExecutionPlan ........................ ");
System.out.println(executionPlan);
System.out.println("........................ ");
// env.execute("SensorsMultipleReadingMqttEdgentQEP");
env.execute();
}
public static class TemperatureKeySelector implements KeySelector<MqttTemperature, Integer> {
private static final long serialVersionUID = 5905504239899133953L;
@Override
public Integer getKey(MqttTemperature value) throws Exception {
return value.getId();
}
}
public static class AverageTempMapper extends RichMapFunction<MqttTemperature, Tuple2<String, Double>> {
private static final long serialVersionUID = -5489672634096634902L;
private MapState<String, Double> averageTemp;
@Override
public void open(Configuration parameters) throws Exception {
averageTemp = getRuntimeContext()
.getMapState(new MapStateDescriptor<>("average-temperature", String.class, Double.class));
}
@Override
public Tuple2<String, Double> map(MqttTemperature value) throws Exception {
String key = "no-room";
Double temp = value.getTemp();
if (value.getId().equals(1) || value.getId().equals(2) || value.getId().equals(3)) {
key = "room-A";
} else if (value.getId().equals(4) || value.getId().equals(5) || value.getId().equals(6)) {
key = "room-B";
} else if (value.getId().equals(7) || value.getId().equals(8) || value.getId().equals(9)) {
key = "room-C";
} else {
System.err.println("Sensor not defined in any room.");
}
if (averageTemp.contains(key)) {
temp = (averageTemp.get(key) + value.getTemp()) / 2;
}
// always store the latest value, otherwise the stored average is never updated
averageTemp.put(key, temp);
return new Tuple2<String, Double>(key, temp);
}
}
}
Thanks,
Felipe
After I selected the option "Extract required libraries into generated JAR" it worked. Strange, because I was generating the JAR with the option "Package required libraries into generated JAR" and it was not working. (Presumably the "Package" option nests the dependency jars inside the JAR and relies on Eclipse's jar-in-jar class loader, which the Flink client does not use, whereas "Extract" flattens all classes into the JAR.)
Related
I have below avro schema User.avsc
{
"type": "record",
"namespace": "com.myorg",
"name": "User",
"fields": [
{
"name": "id",
"type": "long"
},
{
"name": "name",
"type": "string"
}
]
}
The User.java class below was generated from the above User.avsc using the avro-maven-plugin.
package com.myorg;
import java.io.IOException;
import java.io.ObjectInput;
import java.io.ObjectOutput;
import java.nio.ByteBuffer;
import org.apache.avro.AvroRuntimeException;
import org.apache.avro.Schema;
import org.apache.avro.Schema.Parser;
import org.apache.avro.data.RecordBuilder;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.message.BinaryMessageDecoder;
import org.apache.avro.message.BinaryMessageEncoder;
import org.apache.avro.message.SchemaStore;
import org.apache.avro.specific.AvroGenerated;
import org.apache.avro.specific.SpecificData;
import org.apache.avro.specific.SpecificRecord;
import org.apache.avro.specific.SpecificRecordBase;
import org.apache.avro.specific.SpecificRecordBuilderBase;
@AvroGenerated
public class User extends SpecificRecordBase implements SpecificRecord {
private static final long serialVersionUID = 8699049231783654635L;
public static final Schema SCHEMA$ = (new Parser()).parse("{\"type\":\"record\",\"name\":\"User\",\"namespace\":\"com.myorg\",\"fields\":[{\"name\":\"id\",\"type\":\"long\"},{\"name\":\"name\",\"type\":{\"type\":\"string\",\"avro.java.string\":\"String\"}}]}");
private static SpecificData MODEL$ = new SpecificData();
private static final BinaryMessageEncoder<User> ENCODER;
private static final BinaryMessageDecoder<User> DECODER;
/** @deprecated */
@Deprecated
public long id;
/** @deprecated */
@Deprecated
public String name;
private static final DatumWriter<User> WRITER$;
private static final DatumReader<User> READER$;
public static Schema getClassSchema() {
return SCHEMA$;
}
public static BinaryMessageDecoder<User> getDecoder() {
return DECODER;
}
public static BinaryMessageDecoder<User> createDecoder(SchemaStore resolver) {
return new BinaryMessageDecoder(MODEL$, SCHEMA$, resolver);
}
public ByteBuffer toByteBuffer() throws IOException {
return ENCODER.encode(this);
}
public static User fromByteBuffer(ByteBuffer b) throws IOException {
return (User)DECODER.decode(b);
}
public User() {
}
public User(Long id, String name) {
this.id = id;
this.name = name;
}
public Schema getSchema() {
return SCHEMA$;
}
public Object get(int field$) {
switch(field$) {
case 0:
return this.id;
case 1:
return this.name;
default:
throw new AvroRuntimeException("Bad index");
}
}
public void put(int field$, Object value$) {
switch(field$) {
case 0:
this.id = (Long)value$;
break;
case 1:
this.name = (String)value$;
break;
default:
throw new AvroRuntimeException("Bad index");
}
}
public Long getId() {
return this.id;
}
public void setId(Long value) {
this.id = value;
}
public String getName() {
return this.name;
}
public void setName(String value) {
this.name = value;
}
public void writeExternal(ObjectOutput out) throws IOException {
WRITER$.write(this, SpecificData.getEncoder(out));
}
public void readExternal(ObjectInput in) throws IOException {
READER$.read(this, SpecificData.getDecoder(in));
}
static {
ENCODER = new BinaryMessageEncoder(MODEL$, SCHEMA$);
DECODER = new BinaryMessageDecoder(MODEL$, SCHEMA$);
WRITER$ = MODEL$.createDatumWriter(SCHEMA$);
READER$ = MODEL$.createDatumReader(SCHEMA$);
}
}
I want to write an instance of the User SpecificRecord into a file using Apache Flink's FileSink.
Below is the program that I wrote:
import org.apache.flink.connector.file.sink.FileSink;
import org.apache.flink.core.fs.Path;
import org.apache.flink.formats.avro.AvroWriters;
import org.apache.flink.streaming.api.CheckpointingMode;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import com.myorg.User;
import org.apache.flink.streaming.api.functions.sink.filesystem.OutputFileConfig;
import org.apache.flink.streaming.api.functions.sink.filesystem.bucketassigners.DateTimeBucketAssigner;
import org.apache.flink.streaming.api.functions.sink.filesystem.rollingpolicies.OnCheckpointRollingPolicy;
import java.util.Arrays;
public class AvroFileSinkApp {
private static final String OUTPUT_PATH = "./il/";
public static void main(String[] args) throws Exception {
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment().enableCheckpointing(5000);
env.getCheckpointConfig().setCheckpointingMode(CheckpointingMode.EXACTLY_ONCE);
env.setParallelism(4);
OutputFileConfig config = OutputFileConfig
.builder()
.withPartPrefix("il")
.withPartSuffix(".avro")
.build();
DataStream<User> source = env.fromCollection(Arrays.asList(getUser(), getUser(), getUser(), getUser(), getUser(), getUser()));
source.sinkTo(FileSink.forBulkFormat(new Path(OUTPUT_PATH), AvroWriters.forSpecificRecord(User.class)).withBucketCheckInterval(5000).withRollingPolicy(OnCheckpointRollingPolicy.build())
.withOutputFileConfig(config).withBucketAssigner(new DateTimeBucketAssigner<>("yyyy/MM/dd/HH")).build());
env.execute("FileSinkProgram");
Thread.sleep(300000);
}
public static User getUser() {
User u = new User();
u.setId(1L);
u.setName("raj");
return u;
}
}
I wrote this program using this and this as references. The project is on GitHub here.
When I run the program, the in-progress files are getting created, but they are not being checkpointed and the temp files are not being committed. I've added Thread.sleep(300000); but the in-progress files never turn into finished .avro files.
I've also kept the main thread alive for an hour, but no luck.
Any idea what is stopping the in-progress files from moving to the finished state?
This problem is mainly because the source is a BOUNDED source: the execution of the entire Flink job is over before a checkpoint has been executed.
You can refer to the following example to generate records instead of using fromCollection:
/** Data-generating source function. */
public static final class Generator
implements SourceFunction<Tuple2<Integer, Integer>>, CheckpointedFunction {
private static final long serialVersionUID = -2819385275681175792L;
private final int numKeys;
private final int idlenessMs;
private final int recordsToEmit;
private volatile int numRecordsEmitted = 0;
private volatile boolean canceled = false;
private ListState<Integer> state = null;
Generator(final int numKeys, final int idlenessMs, final int durationSeconds) {
this.numKeys = numKeys;
this.idlenessMs = idlenessMs;
this.recordsToEmit = ((durationSeconds * 1000) / idlenessMs) * numKeys;
}
@Override
public void run(final SourceContext<Tuple2<Integer, Integer>> ctx) throws Exception {
while (numRecordsEmitted < recordsToEmit) {
synchronized (ctx.getCheckpointLock()) {
for (int i = 0; i < numKeys; i++) {
ctx.collect(Tuple2.of(i, numRecordsEmitted));
numRecordsEmitted++;
}
}
Thread.sleep(idlenessMs);
}
while (!canceled) {
Thread.sleep(50);
}
}
@Override
public void cancel() {
canceled = true;
}
@Override
public void initializeState(FunctionInitializationContext context) throws Exception {
state =
context.getOperatorStateStore()
.getListState(
new ListStateDescriptor<Integer>(
"state", IntSerializer.INSTANCE));
for (Integer i : state.get()) {
numRecordsEmitted += i;
}
}
@Override
public void snapshotState(FunctionSnapshotContext context) throws Exception {
state.clear();
state.add(numRecordsEmitted);
}
}
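For instance, the bounded fromCollection source could be swapped for the unbounded Generator above, roughly like this (a sketch only: the constructor arguments and the tuple-to-User mapping are my own placeholders, and TypeInformation comes from org.apache.flink.api.common.typeinfo):
// Sketch: replace the bounded collection with the unbounded Generator,
// then map its tuples to User records (mapping is hypothetical).
DataStream<User> source = env
        .addSource(new Generator(2, 10, 60))
        .map(t -> new User((long) t.f0, "user-" + t.f1))
        .returns(TypeInformation.of(User.class));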
The Flink flow has multiple data streams, which I then merge with the org.apache.flink.streaming.api.datastream.DataStream#union method.
Then I hit the problem that the merged data stream is out of order, and I cannot set a window to sort the data in the stream.
Sorting union of streams to identify user sessions in Apache Flink
I found the answer there, but com.liam.learn.flink.example.union.UnionStreamDemo.SortFunction#onTimer is never invoked.
Environment info: Flink version 1.7.0.
In general, I hope to sort the union datastream without watermarks.
You need watermarks so that the sorting function knows when it can safely emit sorted elements. Without watermarks, you could get a record from stream B that has an earlier date than any of the first N records of stream A, right?
But adding watermarks is easy, especially if you know that "event time" is strictly increasing for any one stream. Below is some code I wrote that extends what David Anderson posted in his answer to the other SO issue you referenced above - hopefully this will get you started.
-- Ken
package com.scaleunlimited.flinksnippets;
import java.util.PriorityQueue;
import java.util.Random;
import org.apache.flink.api.common.state.ValueState;
import org.apache.flink.api.common.state.ValueStateDescriptor;
import org.apache.flink.api.common.typeinfo.TypeHint;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.streaming.api.TimeCharacteristic;
import org.apache.flink.streaming.api.TimerService;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.KeyedProcessFunction;
import org.apache.flink.streaming.api.functions.source.RichParallelSourceFunction;
import org.apache.flink.streaming.api.functions.timestamps.AscendingTimestampExtractor;
import org.apache.flink.util.Collector;
import org.junit.Test;
public class MergeAndSortStreamsTest {
@Test
public void testMergeAndSort() throws Exception {
StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironment(2);
env.setStreamTimeCharacteristic(TimeCharacteristic.EventTime);
DataStream<Event> streamA = env.addSource(new EventSource("A"))
.assignTimestampsAndWatermarks(new EventTSWAssigner());
DataStream<Event> streamB = env.addSource(new EventSource("B"))
.assignTimestampsAndWatermarks(new EventTSWAssigner());
streamA.union(streamB)
.keyBy(r -> r.getKey())
.process(new SortByTimestampFunction())
.print();
env.execute();
}
private static class Event implements Comparable<Event> {
private String _label;
private long _timestamp;
public Event(String label, long timestamp) {
_label = label;
_timestamp = timestamp;
}
public String getLabel() {
return _label;
}
public void setLabel(String label) {
_label = label;
}
public String getKey() {
return "1";
}
public long getTimestamp() {
return _timestamp;
}
public void setTimestamp(long timestamp) {
_timestamp = timestamp;
}
@Override
public String toString() {
return String.format("%s # %d", _label, _timestamp);
}
@Override
public int compareTo(Event o) {
return Long.compare(_timestamp, o._timestamp);
}
}
@SuppressWarnings("serial")
private static class EventTSWAssigner extends AscendingTimestampExtractor<Event> {
@Override
public long extractAscendingTimestamp(Event element) {
return element.getTimestamp();
}
}
@SuppressWarnings("serial")
private static class SortByTimestampFunction extends KeyedProcessFunction<String, Event, Event> {
private ValueState<PriorityQueue<Event>> queueState = null;
@Override
public void open(Configuration config) {
ValueStateDescriptor<PriorityQueue<Event>> descriptor = new ValueStateDescriptor<>(
// state name
"sorted-events",
// type information of state
TypeInformation.of(new TypeHint<PriorityQueue<Event>>() {
}));
queueState = getRuntimeContext().getState(descriptor);
}
@Override
public void processElement(Event event, Context context, Collector<Event> out) throws Exception {
TimerService timerService = context.timerService();
long currentWatermark = timerService.currentWatermark();
System.out.format("processElement called with watermark %d\n", currentWatermark);
if (context.timestamp() > currentWatermark) {
PriorityQueue<Event> queue = queueState.value();
if (queue == null) {
queue = new PriorityQueue<>(10);
}
queue.add(event);
queueState.update(queue);
timerService.registerEventTimeTimer(event.getTimestamp());
}
}
@Override
public void onTimer(long timestamp, OnTimerContext context, Collector<Event> out) throws Exception {
PriorityQueue<Event> queue = queueState.value();
long watermark = context.timerService().currentWatermark();
System.out.format("onTimer called with watermark %d\n", watermark);
Event head = queue.peek();
while (head != null && head.getTimestamp() <= watermark) {
out.collect(head);
queue.remove(head);
head = queue.peek();
}
}
}
@SuppressWarnings("serial")
private static class EventSource extends RichParallelSourceFunction<Event> {
private String _prefix;
private transient Random _rand;
private transient boolean _running;
private transient int _numEvents;
public EventSource(String prefix) {
_prefix = prefix;
}
@Override
public void open(Configuration parameters) throws Exception {
super.open(parameters);
_rand = new Random(_prefix.hashCode() + getRuntimeContext().getIndexOfThisSubtask());
}
@Override
public void cancel() {
_running = false;
}
@Override
public void run(SourceContext<Event> context) throws Exception {
_running = true;
_numEvents = 0;
long timestamp = System.currentTimeMillis() + _rand.nextInt(10);
while (_running && (_numEvents < 100)) {
long deltaTime = timestamp - System.currentTimeMillis();
if (deltaTime > 0) {
Thread.sleep(deltaTime);
}
context.collect(new Event(_prefix, timestamp));
_numEvents++;
// Generate a timestamp every 5...15 ms, average is 10.
timestamp += (5 + _rand.nextInt(10));
}
}
}
}
ExtentReports report;
ExtentTest logger;
The code runs correctly for one class but throws a NullPointerException for the second class when I use ExtentReports.
1. I initialized the report in @BeforeSuite.
2. Then I initialized the test in @BeforeMethod.
3. In testng.xml there are 2 classes, Class1 & Class2.
4. On execution of testng.xml, all @Test methods of Class1 run completely, but Class2 throws a NullPointerException when it reaches @BeforeMethod (initialized as mentioned in step 2).
ExtentReports has been initialized in the TestBase class, with a getter created for it so that other classes can read it.
Note: When I change @BeforeSuite to @BeforeClass, i.e. the initialization is done in @BeforeClass, it runs fine, but it produces an extent report of only Class1.
Also, I am using @AfterMethod to flush the report and quit the driver. Any solution to get rid of this NullPointerException?
Below is the complete code
1. TestBase Class
package sampletestproject;
import java.io.File;
import java.io.IOException;
import java.text.DateFormat;
import java.util.Date;
import java.util.concurrent.TimeUnit;
import org.apache.commons.io.FileUtils;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.support.PageFactory;
import org.testng.ITestContext;
import org.testng.ITestNGMethod;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeSuite;
public class TestBasee {
private ExtentReports report;
public WebDriver driverObj;
public Homepagee homeObj;
@BeforeSuite (alwaysRun = true)
public void beforeTest(){
System.out.println("In #BeforeSuite");
report = new ExtentReports("G:\\ExtentReport"+fn_GetTimeStamp()+".html");
System.out.println("Out #BeforeSuite");
}
@AfterClass(alwaysRun = true)
public void tearDown(ITestContext context) throws IOException, InterruptedException{
System.out.println("#AfterClass In tear down");
ITestNGMethod[] tngMethods = context.getAllTestMethods();
int i=1;
for(ITestNGMethod tng: tngMethods){
String methodName = "Method"+i+": "+ tng.getMethodName();
i++;
System.out.println(methodName);
}
}
public static String fn_GetTimeStamp() {
DateFormat DF = DateFormat.getDateTimeInstance();
String DateVal = DF.format(new Date());
DateVal = DateVal.replaceAll("/", "_");
DateVal = DateVal.replaceAll(",", "_");
DateVal = DateVal.replaceAll(":", "_");
DateVal = DateVal.replaceAll(" ", "_");
return DateVal;
}
/******************** Open Site **************************/
public Homepagee gm_OpenApp(String BrowserName, String URL) throws Exception {
System.out.println("In gm_OpenAp Method");
System.out.println(BrowserName+" -- "+URL);
gm_LaunchBrowser(BrowserName);
Thread.sleep(2000);
gm_OpenURL(URL);
Thread.sleep(2000);
homeObj = PageFactory.initElements(driverObj, Homepagee.class);
return homeObj;
}
public void gm_OpenURL(String URL) {
driverObj.get(URL);
}
public void gm_LaunchBrowser(String browserName) throws Exception{
// Launch Chrome browser
if (browserName.equalsIgnoreCase("CH") == true) {
System.setProperty("webdriver.chrome.driver", "MasterFiles\\Drivers\\ChromeDriver\\Chromedriver_win32_v2.38\\chromedriver.exe");
driverObj = new ChromeDriver();
}
// Launch Firefox browser
else if (browserName.equalsIgnoreCase("FF") == true) {
System.setProperty("webdriver.gecko.driver", "MasterFiles\\Drivers\\GeckoDriver\\64Bit\\v20\\geckodriver.exe");
driverObj = new FirefoxDriver();
}
driverObj.manage().timeouts().implicitlyWait(60, TimeUnit.SECONDS);
driverObj.manage().timeouts().pageLoadTimeout(250, TimeUnit.SECONDS);
driverObj.manage().window().maximize();
}
/****************TAKE SCREENSHOT*****************/
public String gm_setScreenshotPath_forExtentReporting(String elementName, String resultStatus) {
System.out.println("In gm_setScreenshotPath_forExtentReporting");
System.out.println(elementName + "--" + resultStatus);
String screenshotPath = "G:\\QA\\AutomationTools\\Selenium\\WorkspaceMars1\\1.2hp.com.automationprac\\TestReports\\ExtentReport\\Screenshots\\"+ resultStatus + "\\" + elementName + "_" + fn_GetTimeStamp() + ".png";
return screenshotPath;
}
public void gm_TakeSnapshot(String destFilePath) throws IOException, InterruptedException {
TakesScreenshot tss = (TakesScreenshot) driverObj;
File srcfileobj = tss.getScreenshotAs(OutputType.FILE);
File destFileObj = new File(destFilePath);
FileUtils.copyFile(srcfileobj, destFileObj);
}
public ExtentReports getReport() {
return report;
}
public void setReport(ExtentReports report) {
this.report = report;
}
}
2. SignIn - Page Object
package sampletestproject;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.FindBy;
public class SignInPagee extends TestBasee{
@FindBy(xpath = "//div[@id = 'center_column']/h1")
public static WebElement PageHeading_SignIn_txt;
@FindBy(xpath = "//h3[contains(text(), 'Already registered?')]")
public static WebElement SectionHeading_SignIn_txt;
public SignInPagee(WebDriver driverObj){
this.driverObj = driverObj;
}
}
3. Homepage - PageObject
package sampletestproject;
import java.io.IOException;
import java.util.List;
import java.util.NoSuchElementException;
import java.util.concurrent.TimeUnit;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.FindBy;
import org.openqa.selenium.support.PageFactory;
import org.openqa.selenium.support.ui.FluentWait;
import org.openqa.selenium.support.ui.Wait;
import com.google.common.base.Function;
public class Homepagee extends TestBasee {
public Homepagee(WebDriver driverObj){
this.driverObj = driverObj;
}
public SignInPagee navigateToSignInPage(){
System.out.println("In navigateToSignInPage");
driverObj.navigate().to("http://automationpractice.com/index.php?controller=authentication&back=my-account");
SignInPagee signInPageObj = PageFactory.initElements(driverObj, SignInPagee.class);
return signInPageObj;
}
}
4. HomepageTest - Test page
package sampletestproject;
import java.lang.reflect.Method;
import org.testng.Assert;
import org.testng.ITestResult;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Parameters;
import org.testng.annotations.Test;
import com.relevantcodes.extentreports.ExtentReports;
import com.relevantcodes.extentreports.ExtentTest;
import com.relevantcodes.extentreports.LogStatus;
public class HomepageeTest extends TestBasee {
//ExtentReports report;
ExtentTest logger;
String elementName;
String Comment;
String actualResult;
String expectedResult;
@BeforeMethod(alwaysRun = true)
@Parameters({ "Browser", "URL"})
public void getBrowser(Method method, String Browser, String URL) throws Exception{
logger = getReport().startTest((this.getClass().getSimpleName()+"::"+method.getName()), method.getName());
logger.assignAuthor("VK");
logger.assignCategory("HomePage - Smoketesting and TextTesting--Live");
homeObj = gm_OpenApp(Browser, URL);
}
@AfterMethod (alwaysRun = true)
public void publishReport_SIP(ITestResult result) throws Exception{
System.out.println("publishReport_SIP");
String resultStatus = null;
if(result.getStatus() == ITestResult.FAILURE){
resultStatus = "FAILED";
String screenshot_Path = gm_setScreenshotPath_forExtentReporting(elementName, resultStatus);
gm_TakeSnapshot(screenshot_Path);
String image = logger.addScreenCapture(screenshot_Path);
logger.log(LogStatus.FAIL, Comment, image);
}else{
resultStatus = "PASSED";
String screenshot_Path = gm_setScreenshotPath_forExtentReporting(elementName, resultStatus);
gm_TakeSnapshot(screenshot_Path);
System.out.println("screenshot_Path: "+screenshot_Path);
String image = logger.addScreenCapture(screenshot_Path);
logger.log(LogStatus.PASS, Comment, image);
}
getReport().endTest(logger);
getReport().flush();
System.out.println("closing now_SIP.");
driverObj.quit();
}
//"********Validation of SignIn Link********");
@Test(priority=0, groups = {"Smoke"})
public void validateHeaderSignInLink_HP() throws Exception{
System.out.println("In validateHeaderSignInLink Method_HP");
elementName = "SignInLink";
Comment = "validateHeaderSignInLink";
actualResult = "http://automationpractice.com/index.php?controller=authentication&back=my-account";
expectedResult = "http://automationpractice.com/index.php?controller=authentication&back=my-account";
Assert.assertEquals(actualResult, expectedResult);
System.out.println("Out of validateHeaderSignInLink method_HP");
}
//"********Validation of GetSavingNow Button********");
@Test (priority = 1, groups = {"Smoke"})
public void validateGetSavingNowButton_HP() throws Exception{
System.out.println("In validateGetSavingNowButton Method_HP");
elementName = "GETSAVINGSNOWButton";
Comment = "validateGetSavingNowButton";
expectedResult = "http://automationpractice.com/index.php";
actualResult = "http://automationpractice.com/index.php";
Assert.assertEquals(actualResult, expectedResult);
System.out.println("Out of validateGetSavingNowButton method_HP");
}
@Test (priority = 2, groups = {"UITest"})
//"********Validation of SearchBox********");
public void validateSearchField_HP() throws Exception{
System.out.println("In validateSearchField Method_HP");
elementName = "FadedShortSleeveTshirts_lnktxt";
Comment = "validateSearchField";
actualResult = "Faded Short Sleeve T-shirtss"; //Just to produce a failed result
expectedResult = "Faded Short Sleeve T-shirts";
Assert.assertEquals(actualResult, expectedResult);
System.out.println("Out of validateSearchField method_HP");
}
@Test (priority = 3, enabled = true, groups = {"Smoke", "UITest"})
//"********Validation of Slider1********");
public void validateHomepageSlider1_HP() throws Exception{
System.out.println("In validateHomepageSlider1 Method_HP");
elementName = "Homepage Slider1";
Comment = "validateHomepageSlider1";
actualResult = "https://www.prestashop.com/en?utm_source=v16_homeslider";
expectedResult = "https://www.prestashop.com/en?utm_source=v16_homeslider";
Assert.assertEquals(actualResult, expectedResult);
System.out.println("Out of validateHomepageSlider1 method_HP");
}
}
5. SignInTest Class - Test page
package sampletestproject;
import java.lang.reflect.Method;
import org.testng.Assert;
import org.testng.ITestResult;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Parameters;
import org.testng.annotations.Test;
import com.relevantcodes.extentreports.ExtentTest;
import com.relevantcodes.extentreports.LogStatus;
public class SignInnTest extends TestBasee{
SignInPagee lognObj;
ExtentTest logger;
String elementName;
String Comment;
String expectedResult;
String actualResult;
@BeforeMethod(alwaysRun = true)
@Parameters({ "Browser", "URL"})
public void getBrowser(Method method, String Browser, String URL) throws Exception{
logger = getReport().startTest((this.getClass().getSimpleName()+"::"+method.getName()), method.getName());
logger.assignAuthor("VK");
logger.assignCategory("SignInpage - Smoketesting and TextTesting--Live");
homeObj = gm_OpenApp(Browser, URL);
lognObj = homeObj.navigateToSignInPage();
}
@AfterMethod (alwaysRun = true)
public void publishReport_SIP(ITestResult result) throws Exception{
System.out.println("publishReport_SIP");
String resultStatus = null;
if(result.getStatus() == ITestResult.FAILURE){
resultStatus = "FAILED";
String screenshot_Path = gm_setScreenshotPath_forExtentReporting(elementName, resultStatus);
gm_TakeSnapshot(screenshot_Path);
String image = logger.addScreenCapture(screenshot_Path);
logger.log(LogStatus.FAIL, Comment, image);
}else{
resultStatus = "PASSED";
String screenshot_Path = gm_setScreenshotPath_forExtentReporting(elementName, resultStatus);
gm_TakeSnapshot(screenshot_Path);
System.out.println("screenshot_Path: "+screenshot_Path);
String image = logger.addScreenCapture(screenshot_Path);
logger.log(LogStatus.PASS, Comment, image);
}
getReport().endTest(logger);
getReport().flush();
System.out.println("closing now_SIP.");
driverObj.quit();
}
@Test (priority = 0, groups = {"Smoke"})
public void validateSignInPage_PageHeading_SIP() throws Exception{
System.out.println("In validateSignInPage_PageHeading Method_SIP");
elementName = "SignIn_PageHeading_txt";
Comment = "validatePageHeading_SignInpage";
actualResult = "AUTHENTICATION";
expectedResult = "AUTHENTICATION";
Assert.assertEquals(actualResult, expectedResult); //Here test will pass
System.out.println("Out of validateSignInPageHeading method_SIP");
}
@Test (groups = {"UITest"}, dependsOnMethods = { "validateSignInPage_PageHeading_SIP" })
public void validateSignInPage_SignInSectionHeading_SIP() throws Exception{
System.out.println("In validateSignInPage_SignInSectionHeading Method_SIP");
elementName = "SignInPage_SignInSectionHeading_txt";
Comment = "validateSectionHeading_SignInpage";
actualResult = "ALREADY REGISTERED1?";
expectedResult = "ALREADY REGISTERED?";
Assert.assertEquals(actualResult, expectedResult); //Here Test will fail as actual not equal to expected
System.out.println("Out of validateSignInPage_SignInSectionHeading method_SIP");
}
}
6. testng.xml
suite name="Test" parallel = "tests" thread-count = "1">
<test name="CHTest" >
<parameter name="Browser" value="CH" ></parameter>
<parameter name="URL" value="http://automationpractice.com/index.php"></parameter>
<groups><run>
<include name="Smoke"/>
<include name="UITest"/>
</run></groups>
<classes>
<class name= "sampletestproject.SignInnTest" />
<class name= "sampletestproject.HomepageeTest"/>
</classes></test></suite>
I would say it's better to use a ThreadLocal here, or an implementation similar to this one for managing tests, and a similar one for ExtentReports:
https://github.com/anshooarora/extentreports-java/blob/master/src/test/java/com/aventstack/extentreports/common/ExtentTestManager.java
Also, if you look at the examples section of the docs, there is already a barebones ExtentTestNGReportBuilder example which you can use without creating any custom code, like the one here.
The issue you are facing is due to principles of Java and managing instances rather than ExtentReports itself. If you want a single report for all your classes, make sure there is only ever one instance for the entire run. Prevent any behavior which recreates the instance; in your case this happens because each class resets the report variable, thus resetting the ExtentReports instance.
Moreover, I would recommend using an ITestListener in such cases, so as to separate reporting from your test code; an example is shown below:
public class ExtentITestListener
implements ITestListener {
private static final ExtentReports EXTENT = Extent.getInstance();
private static ThreadLocal<ExtentTest> methodTest = new ThreadLocal<ExtentTest>();
private static ThreadLocal<ExtentTest> dataProviderTest = new ThreadLocal<>();
@Override
public synchronized void onStart(ITestContext context) { }
@Override
public synchronized void onFinish(ITestContext context) {
EXTENT.flush();
}
@Override
public synchronized void onTestStart(ITestResult result) {
String methodName = result.getMethod().getMethodName();
if (result.getParameters().length>0) {
if (methodTest.get() != null && methodTest.get().getModel().getName().equals(methodName)) { }
else {
createTest(result);
}
String paramName = Arrays.asList(result.getParameters()).toString();
ExtentTest paramTest = methodTest.get().createNode(paramName);
dataProviderTest.set(paramTest);
} else {
createTest(result);
}
}
private void createTest(ITestResult result) {
String methodName = result.getMethod().getMethodName();
ExtentTest test = EXTENT.createTest(methodName);
methodTest.set(test);
String[] groups = result.getMethod().getGroups();
if (groups.length > 0) {
Arrays.asList(groups)
.forEach(x -> methodTest.get().assignCategory(x));
}
}
@Override
public synchronized void onTestSuccess(ITestResult result) {
getTest(result).pass("Test passed");
}
private ExtentTest getTest(ITestResult result) {
ExtentTest t = result.getParameters() != null && result.getParameters().length>0
? dataProviderTest.get()
: methodTest.get();
return t;
}
@Override
public synchronized void onTestFailure(ITestResult result) {
getTest(result).fail(result.getThrowable());
}
@Override
public synchronized void onTestSkipped(ITestResult result) {
getTest(result).skip(result.getThrowable());
}
@Override
public synchronized void onTestFailedButWithinSuccessPercentage(ITestResult result) { }
}
I have also faced the same issue while executing two classes from the testng.xml file. I found that the ExtentReports variable becomes null after the @AfterMethod annotation runs, which is why it throws a NullPointerException while running test cases from another class.
The solution that worked for me is making the ExtentReports variable static, so that a single copy of the extent report variable is created, and while running multiple classes you won't get that error.
For example: static ExtentReports reports;
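A minimal sketch of what that could look like with the same com.relevantcodes API (the class name and report path are placeholders):
import com.relevantcodes.extentreports.ExtentReports;

// Hypothetical holder: one shared ExtentReports instance for the whole run.
public class ExtentManager {
    private static ExtentReports report;

    public static synchronized ExtentReports getInstance() {
        if (report == null) {
            report = new ExtentReports("G:\\ExtentReport.html"); // placeholder path
        }
        return report;
    }
}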
Let me know if the issue still persists.
While ingesting JSON data from Kafka and saving it as Parquet files to be loaded into Hive, I ran into the same issue mentioned in Flink BucketingSink with Custom AvroParquetWriter create empty file. Does anyone know how to resolve it? Thank you. I used Apache Flink 1.4.0 + HDFS 2.7.3.
You can directly implement the Writer interface. It could look like the following:
import org.apache.flink.streaming.connectors.fs.Writer; // the BucketingSink writer interface
import org.apache.flink.util.Preconditions;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.parquet.avro.AvroParquetWriter;
import org.apache.parquet.hadoop.ParquetWriter;
import org.apache.parquet.hadoop.metadata.CompressionCodecName;
import java.io.IOException;
/**
* Parquet writer.
*
* @param <T>
*/
public class ParquetSinkWriter<T extends GenericRecord> implements Writer<T> {
private static final long serialVersionUID = -975302556515811398L;
private final CompressionCodecName compressionCodecName = CompressionCodecName.SNAPPY;
private final int pageSize = 64 * 1024;
private final String schemaRepresentation;
private transient Schema schema;
private transient ParquetWriter<GenericRecord> writer;
private transient Path path;
private int position;
public ParquetSinkWriter(String schemaRepresentation) {
this.schemaRepresentation = Preconditions.checkNotNull(schemaRepresentation);
}
@Override
public void open(FileSystem fs, Path path) throws IOException {
this.position = 0;
this.path = path;
if (writer != null) {
writer.close();
}
writer = createWriter();
}
@Override
public long flush() throws IOException {
Preconditions.checkNotNull(writer);
position += writer.getDataSize();
writer.close();
writer = createWriter();
return position;
}
@Override
public long getPos() throws IOException {
Preconditions.checkNotNull(writer);
return position + writer.getDataSize();
}
@Override
public void close() throws IOException {
if (writer != null) {
writer.close();
writer = null;
}
}
@Override
public void write(T element) throws IOException {
Preconditions.checkNotNull(writer);
writer.write(element);
}
@Override
public Writer<T> duplicate() {
return new ParquetSinkWriter<>(schemaRepresentation);
}
private ParquetWriter<GenericRecord> createWriter() throws IOException {
if (schema == null) {
schema = new Schema.Parser().parse(schemaRepresentation);
}
return AvroParquetWriter.<GenericRecord>builder(path)
.withSchema(schema)
.withDataModel(new GenericData())
.withCompressionCodec(compressionCodecName)
.withPageSize(pageSize)
.build();
}
}
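To wire this writer up, you would hand it to a BucketingSink. A sketch, assuming stream is a DataStream<GenericRecord> and that the path and schema string are placeholders:
// Hypothetical wiring with Flink 1.4's BucketingSink
// (org.apache.flink.streaming.connectors.fs.bucketing.BucketingSink):
BucketingSink<GenericRecord> sink = new BucketingSink<>("hdfs:///data/parquet");
sink.setWriter(new ParquetSinkWriter<GenericRecord>(schemaString));
stream.addSink(sink);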
I have a Flink CEP job that reads from a socket and detects a pattern. Let's say the pattern (word) is 'alert'. If the word alert occurs five times or more, an alert should be created. But I am getting an input mismatch error. Flink version is 1.3.0. Thanks in advance!
package pattern;
import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternStream;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.IterativeCondition;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.util.Collector;
import java.util.List;
import java.util.Map;
public class cep {
public static void main(String[] args) throws Exception {
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStreamSource<String> dss = env.socketTextStream("localhost", 3005);
dss.print();
Pattern<String,String> pattern = Pattern.<String> begin("first")
.where(new IterativeCondition<String>() {
@Override
public boolean filter(String word, Context<String> context) throws Exception {
return word.equals("alert");
}
})
.times(5);
PatternStream<String> patternstream = CEP.pattern(dss, pattern);
DataStream<String> alerts = patternstream
.flatSelect((Map<String,List<String>> in, Collector<String> out) -> {
String first = in.get("first").get(0);
for (int i = 0; i < 6; i++ ) {
out.collect(first);
}
});
alerts.print();
env.execute();
}
}
Just some clarification on the original problem. In 1.3.0 there was a bug that made using lambdas as arguments to select/flatSelect impossible.
It was fixed in 1.3.1, so your first version of the code will work with 1.3.1.
Besides, I think you misinterpret the times quantifier. It matches an exact number of times. So in your case the pattern will match only when the event is matched exactly five times, not five or more.
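If you do want "five or more", newer Flink versions can express that directly with timesOrMore; a sketch (my understanding is that this is not available in 1.3.x):
// Hypothetical, for Flink 1.4+:
Pattern<String, String> pattern = Pattern.<String>begin("first")
        .where(new IterativeCondition<String>() {
            @Override
            public boolean filter(String word, Context<String> context) throws Exception {
                return word.equals("alert");
            }
        })
        .timesOrMore(5);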
So I have got the code to work. Here is the working solution:
package pattern;
import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternSelectFunction;
import org.apache.flink.cep.PatternStream;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.IterativeCondition;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.DataStreamSource;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.util.Collector;
import java.util.List;
import java.util.Map;
public class cep {
public static void main(String[] args) throws Exception {
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStreamSource<String> dss = env.socketTextStream("localhost", 3005);
dss.print();
Pattern<String,String> pattern = Pattern.<String> begin("first")
.where(new IterativeCondition<String>() {
@Override
public boolean filter(String word, Context<String> context) throws Exception {
return word.equals("alert");
}
})
.times(5);
PatternStream<String> patternstream = CEP.pattern(dss, pattern);
DataStream<String> alerts = patternstream
.select(new PatternSelectFunction<String, String>() {
@Override
public String select(Map<String, List<String>> in) throws Exception {
String first = in.get("first").get(0);
if(first.equals("alert")){
return ("5 or more alerts");
}
else{
return (" ");
}
}
});
alerts.print();
env.execute();
}
}