Trying to add a new Tab in Allure report

I'm trying to create a plugin for Allure that adds a new tab called Browsers, displaying test case statuses across all browsers, but I'm stuck at the very first step: adding the Browsers tab to the report itself. I was using allure-report-plugin-api, the instructions from this question - Allure: How do I customize the test report to write "Browsers" instead of "Xunit"? - and the examples from the git repo of allure-report-plugin-api.
But I have no luck: the tab isn't added, despite the code being so simple. Could you please point out where I made mistakes and show me the correct way to do it? Big thanks in advance!
Here's an example of how I'm trying to add the new tab.
Here's the project structure
src
--->main
--->--->java
--->--->--->allure
--->--->--->--->(Class) BrowserInfo
--->--->resources
--->--->--->(directory)allure
--->--->--->--->(directory)BrowserInfo
--->--->--->--->--->en.json
--->--->--->--->--->script.js
--->test
--->--->allure
--->--->--->(Class) GoogleSearchTest
--->--->testcases
--->--->--->SearchTest.xml
pom.xml
Here's the BrowserInfo class
package allure;
import ru.yandex.qatools.allure.Allure;
import ru.yandex.qatools.allure.data.AllureAttachment;
import ru.yandex.qatools.allure.data.AllureStep;
import ru.yandex.qatools.allure.data.AllureTestCase;
import ru.yandex.qatools.allure.data.plugins.DefaultTabPlugin;
import ru.yandex.qatools.allure.data.plugins.Plugin;
import ru.yandex.qatools.allure.model.Label;
import java.util.ArrayList;
import java.util.List;
@Plugin.Name("browserList")
public class BrowserInfo extends DefaultTabPlugin {
@Override
public void process(AllureTestCase data) {
}
}
Here's en.json
{
"browserList": {
"TITLE": "Browsers",
"TITLE_FULL": "List of browsers"
}
}
Here's script.js
/*global angular*/
(function() {
"use strict";
var module = angular.module('allure.browserList', []);
module.config(function($stateProvider, allureTabsProvider) {
allureTabsProvider.addTab('browserList', {title: 'browserList.TITLE'});
});
})();
Here's just a little test
package allure;
import junit.framework.Assert;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.testng.annotations.Test;
public class GoogleSearchTest {
@Test
public void searchBananasTest() {
WebDriver driver = new FirefoxDriver();
driver.get("https://www.google.com/");
driver.findElement(By.id("lst-ib")).sendKeys("BANANAS");
driver.findElement(By.cssSelector("[type = 'submit']")).click();
Assert.assertTrue(driver.findElement(
By.cssSelector("[data-async-context='query:BANANAS'] h3")
).getText().toLowerCase().contains("banana"));
driver.quit();
}
}
Here's the TestNG suite XML for the test
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd" >
<suite name="searching bananas">
<test name="searching bananas" preserve-order="true">
<classes>
<class name="allure.GoogleSearchTest">
<methods>
<include name = "searchBananasTest"/>
</methods>
</class>
</classes>
</test>
</suite>
Here's the pom.xml file
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>AllurePluginTest</groupId>
<artifactId>AllurePluginTest</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<name>allure</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<aspectj.version>1.8.5</aspectj.version>
<allure.version>1.4.16</allure.version>
</properties>
<build>
<plugins>
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.16</version>
<configuration>
<suiteXmlFiles>
<suiteXmlFile>${suitexml}</suiteXmlFile>
</suiteXmlFiles>
<testFailureIgnore>false</testFailureIgnore>
<argLine>
-javaagent:"${settings.localRepository}/org/aspectj/aspectjweaver/${aspectj.version}/aspectjweaver-${aspectj.version}.jar"
</argLine>
</configuration>
<dependencies>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjweaver</artifactId>
<version>${aspectj.version}</version>
</dependency>
</dependencies>
</plugin>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>2.3.2</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>ru.yandex.qatools.allure</groupId>
<artifactId>allure-testng-adaptor</artifactId>
<version>${allure.version}</version>
<exclusions>
<exclusion>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-java</artifactId>
<version>2.46.0</version>
</dependency>
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>6.8.7</version>
</dependency>
<dependency>
<groupId>ru.yandex.qatools.allure</groupId>
<artifactId>allure-report-plugin-api</artifactId>
<version>1.4.16</version>
</dependency>
</dependencies>
<reporting>
<excludeDefaults>true</excludeDefaults>
<plugins>
<plugin>
<groupId>ru.yandex.qatools.allure</groupId>
<artifactId>allure-maven-plugin</artifactId>
<version>2.2</version>
</plugin>
</plugins>
</reporting>
</project>

The main problem is that Allure loads plugins via Java SPI. So you need to create a file named
ru.yandex.qatools.allure.data.plugins.Plugin in META-INF/services/ in your resources folder, with the following content:
allure.BrowserInfo
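With the project structure from the question, that gives the following layout (the services directory is the new part):
src
--->main
--->--->resources
--->--->--->META-INF
--->--->--->--->services
--->--->--->--->--->(file) ru.yandex.qatools.allure.data.plugins.Plugin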
Then you need to configure allure-maven-plugin to use your plugin:
<reporting>
<excludeDefaults>true</excludeDefaults>
<plugins>
<plugin>
<groupId>ru.yandex.qatools.allure</groupId>
<artifactId>allure-maven-plugin</artifactId>
<version>2.2</version>
<configuration>
<plugins>
<plugin>
<groupId>${project.groupId}</groupId>
<artifactId>${project.artifactId}</artifactId>
<version>${project.version}</version>
</plugin>
</plugins>
</configuration>
</plugin>
</plugins>
</reporting>
Note: the plugin should be installed to the local repository.
I recommend using a separate project for your plugin and generating a preview report using maven-invoker-plugin. In this case you don't need to run tests (you can just place existing test results into the folder you need) and you don't need to install/deploy the plugin.
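A rough sketch of such a setup (the directory names are just a convention, adjust to taste):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-invoker-plugin</artifactId>
    <version>1.10</version>
    <configuration>
        <!-- each subdirectory of src/it is a standalone Maven project that consumes the plugin -->
        <projectsDirectory>src/it</projectsDirectory>
        <cloneProjectsTo>${project.build.directory}/it</cloneProjectsTo>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>install</goal>
                <goal>run</goal>
            </goals>
        </execution>
    </executions>
</plugin>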
And a few more comments for you:
BrowserInfo
By default Allure expects that each plugin provides some data in the file ${pluginName}.json, so you need to add some dummy data. For example, you can simply add a field like this:
@Plugin.Data
private List<String> strings = new ArrayList<>();
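Putting it together, a minimal sketch of the plugin class could look like this, reusing the imports from the class above. Note that the "browser" label name and the getLabels() accessor are assumptions - check them against your version of allure-report-plugin-api:
@Plugin.Name("browserList")
public class BrowserInfo extends DefaultTabPlugin {

    // Dummy data so that browserList.json gets generated for the tab
    @Plugin.Data
    private List<String> browsers = new ArrayList<>();

    @Override
    public void process(AllureTestCase data) {
        // Hypothetical extraction of a "browser" label from each test case;
        // verify that AllureTestCase exposes labels this way in your API version
        for (Label label : data.getLabels()) {
            if ("browser".equals(label.getName())) {
                browsers.add(label.getValue());
            }
        }
    }
}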
The other way is to configure this behavior in script.js (an empty resolve section):
allureTabsProvider.addTab('browserList', {title: 'browserList.TITLE', resolve: {}});
Translation
To add a translation to the report, use the following call:
allureTabsProvider.addTranslation('cats');
Take a look: Allure JavaScript API
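Putting the two together, a rough sketch of script.js for this plugin (assuming the translation key matches the plugin name, so it picks up the en.json above):
/*global angular*/
(function() {
    "use strict";
    var module = angular.module('allure.browserList', []);
    module.config(function($stateProvider, allureTabsProvider) {
        // Empty resolve section, so the tab does not insist on browserList.json
        allureTabsProvider.addTab('browserList', {title: 'browserList.TITLE', resolve: {}});
        // Register the plugin's translation bundle (en.json)
        allureTabsProvider.addTranslation('browserList');
    });
})();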
Plugin template
Allure looks for tab.tpl.html for each tab plugin. So you need to add it to your plugin resources.
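For a first run, a minimal placeholder is enough, e.g. a src/main/resources/allure/BrowserInfo/tab.tpl.html along these lines (the markup here is a dummy, not the real widget set):
<!-- tab.tpl.html: placeholder markup so the Browsers tab has something to render -->
<div class="browser-list">
    <h3>List of browsers</h3>
</div>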
I hope it helps.

Add the Allure Report Builder dependency, then add the code below:
// It will generate the Allure report folder.
AllureReportBuilder builder = new AllureReportBuilder("1.5.4", new File("target/allure-report"));
builder.unpackFace();                                       // unpack the report web UI
builder.processResults(new File("target/allure-results"));  // process the raw test results
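The corresponding dependency would look roughly like this (the coordinates and version are an assumption - verify them on Maven Central before use):
<dependency>
    <groupId>ru.yandex.qatools.allure</groupId>
    <artifactId>allure-report-builder</artifactId>
    <version>2.2</version>
</dependency>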

Related

Apache Flink: java.lang.NoClassDefFoundError for FlinkKafkaConsumer

I am trying to test a simple Flink Kafka example.
mvn package works fine, and then I ran ../../flink-1.14.3/bin/flink run -c com.company.flinktest.App target/flinktest-1.0-SNAPSHOT.jar and got the following error:
java.lang.NoClassDefFoundError: org/apache/flink/streaming/connectors/kafka/FlinkKafkaConsumer
at com.optiver.flinktest.App.main(App.java:38)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:355)
at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:222)
at org.apache.flink.client.ClientUtils.executeProgram(ClientUtils.java:114)
at org.apache.flink.client.cli.CliFrontend.executeProgram(CliFrontend.java:812)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:246)
at org.apache.flink.client.cli.CliFrontend.parseAndRun(CliFrontend.java:1054)
at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1132)
at org.apache.flink.runtime.security.contexts.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:28)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1132)
My pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.company.flinktest</groupId>
<artifactId>flinktest</artifactId>
<version>1.0-SNAPSHOT</version>
<name>flinktest</name>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-core</artifactId>
<version>1.14.3</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>1.14.3</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_2.12</artifactId>
<version>1.14.3</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_2.12</artifactId>
<version>1.14.3</version>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka_2.12</artifactId>
<version>1.14.3</version>
<scope>Compile</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>maven-jar-plugin</artifactId>
<version>3.0.2</version>
<configuration>
<archive>
<manifest>
<mainClass>com.company.flinktest.App</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
</plugins>
</build>
</project>
Folder structure: src/main/java/com/company/flinktest/App.java
Editor: Vscode
App.java Code:
package com.company.flinktest;
import java.util.Properties;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import java.util.InvalidPropertiesFormatException;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;
public class App{
public static void main(String[] args) throws Exception{
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
Properties properties = new Properties();
properties.setProperty("bootstrap.servers", "kafkaserver:9092");
properties.setProperty("group.id", "testgroup");
FlinkKafkaConsumer<String> myConsumer = new FlinkKafkaConsumer<String>(
"mytopic",
new SimpleStringSchema(),
properties);
DataStream<String> stream = env.addSource(myConsumer);
stream.print();
env.execute("flink from kafka");
}
}
You can refer to my answer in one of the posts. Check the Scala language version though.
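For what it's worth, the usual cause of this NoClassDefFoundError is that the Kafka connector classes are not in the submitted jar: maven-jar-plugin packages only your own classes, and the Flink distribution does not ship the Kafka connector. A common fix, sketched below, is to build a fat jar with the maven-shade-plugin so the connector gets bundled (the plugin version is indicative):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <!-- keep the entry point in the shaded jar's manifest -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>com.company.flinktest.App</mainClass>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>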

KuduSink<MyModel> fails to start

I'm trying to write an ETL pipeline from Kafka to HDFS using Flink.
I'm using the bahir KuduSink and a PojoOperationMapper.
It throws an exception before starting. I've included my code, pom, and exception stack trace.
Is there something obvious I'm missing?
package pipeline.poc.model;
import lombok.Data;
@Data
public class MyModel{
private String msgKey;
private String msgData;
}
Pipeline mapping
package pipeline.poc;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationFeature;
import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonNode;
import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
public class MessageMapFunction implements MapFunction<ObjectNode, MyModel> {
/**
*
*/
private static final long serialVersionUID = 1L;
private final ObjectMapper mapper;
public MessageMapFunction() {
super();
mapper = new ObjectMapper();
mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
}
@Override
public MyModel map(ObjectNode value) throws Exception {
JsonNode msgValue = value.get("value");
return mapper.convertValue(msgValue, MyModel.class);
}
}
The pipeline program
package pipeline.poc;
import java.util.Properties;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.connectors.kudu.connector.KuduTableInfo;
import org.apache.flink.connectors.kudu.connector.writer.KuduWriterConfig;
import org.apache.flink.connectors.kudu.connector.writer.PojoOperationMapper;
import org.apache.flink.connectors.kudu.connector.writer.AbstractSingleOperationMapper.KuduOperation;
import org.apache.flink.connectors.kudu.streaming.KuduSink;
import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;
import org.apache.flink.streaming.connectors.kafka.KafkaDeserializationSchema;
import org.apache.flink.streaming.util.serialization.JSONKeyValueDeserializationSchema;
public class Pipeline {
private final StreamExecutionEnvironment env;
private final KuduWriterConfig kuduConfig;
private final PojoOperationMapper<MyModel> operationMapper;
private final KuduSink<MyModel> kuduSink;
private final KafkaDeserializationSchema<ObjectNode> schema;
private final FlinkKafkaConsumer<ObjectNode> consumer;
private final String[] columns = {"key", "value"};
private final MapFunction<ObjectNode, MyModel> messageMapFunction;
public static void main(String[] args) {
new Pipeline().run();
}
Pipeline() {
env = StreamExecutionEnvironment.getExecutionEnvironment();
schema = new JSONKeyValueDeserializationSchema(false);
kuduConfig = KuduWriterConfig.Builder
.setMasters("localhost:7051,localhost:7151,localhost:7251")
.build();
operationMapper = new PojoOperationMapper<> (
MyModel.class,
columns,
KuduOperation.INSERT);
kuduSink = new KuduSink<>(
kuduConfig,
KuduTableInfo.forTable("TOYTABLE"),
operationMapper);
Properties props = new Properties();
props.setProperty("bootstrap.servers", "localhost:9092");
props.setProperty("group.id", "pipeline.demo");
consumer = new FlinkKafkaConsumer<>(
"pipeline.demo",
schema,
props);
messageMapFunction = new MessageMapFunction();
}
public void run() {
DataStream<ObjectNode> dataStream = env.addSource(consumer);
DataStream<MyModel> messageStream = dataStream.map(messageMapFunction);
// just printing the mapped stream works
// messageStream.print();
// Adding the kuduSink throw an exception
messageStream.addSink(kuduSink);
try {
env.execute("Pipeline Demo");
} catch (Exception e) {
e.printStackTrace();
}
}
}
It throws this exception:
ERROR StatusLogger No Log4j 2 configuration file found. Using default configuration (logging only errors to the console), or user programmatically provided configurations. Set system property 'log4j2.debug' to show Log4j 2 internal initialization logging. See https://logging.apache.org/log4j/2.x/manual/configuration.html for instructions on how to configure Log4j 2
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.flink.api.java.ClosureCleaner (file:/Users/smitopher/.m2/repository/org/apache/flink/flink-core/1.13.1/flink-core-1.13.1.jar) to field java.util.Properties.serialVersionUID
WARNING: Please consider reporting this to the maintainers of org.apache.flink.api.java.ClosureCleaner
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Exception in thread "main" org.apache.flink.api.common.InvalidProgramException: [Ljava.lang.reflect.Field;@1095f122 is not serializable. The object probably contains or references non serializable fields.
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:164)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:132)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:69)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.clean(StreamExecutionEnvironment.java:2053)
at org.apache.flink.streaming.api.datastream.DataStream.clean(DataStream.java:203)
at org.apache.flink.streaming.api.datastream.DataStream.addSink(DataStream.java:1243)
at pipeline.poc.Pipeline.run(Pipeline.java:75)
at pipeline.poc.Pipeline.main(Pipeline.java:34)
Caused by: java.io.NotSerializableException: java.lang.reflect.Field
at java.base/java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1185)
at java.base/java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1379)
at java.base/java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1175)
at java.base/java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:349)
at org.apache.flink.util.InstantiationUtil.serializeObject(InstantiationUtil.java:624)
at org.apache.flink.api.java.ClosureCleaner.clean(ClosureCleaner.java:143)
... 8 more
pom.xml
<!-- Licensed to the Apache Software Foundation (ASF) under one or more contributor
license agreements. See the NOTICE file distributed with this work for additional
information regarding copyright ownership. The ASF licenses this file to
you under the Apache License, Version 2.0 (the "License"); you may not use
this file except in compliance with the License. You may obtain a copy of
the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required
by applicable law or agreed to in writing, software distributed under the
License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS
OF ANY KIND, either express or implied. See the License for the specific
language governing permissions and limitations under the License. -->
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>pipeline.poc</groupId>
<artifactId>kafka-flink-pipeline</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>Flink Quickstart Job</name>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<flink.version>1.13.1</flink.version>
<target.java.version>11</target.java.version>
<scala.binary.version>2.11</scala.binary.version>
<maven.compiler.source>${target.java.version}</maven.compiler.source>
<maven.compiler.target>${target.java.version}</maven.compiler.target>
<log4j.version>2.12.1</log4j.version>
</properties>
<repositories>
<repository>
<id>apache.snapshots</id>
<name>Apache Development Snapshot Repository</name>
<url>https://repository.apache.org/content/repositories/snapshots/</url>
<releases>
<enabled>false</enabled>
</releases>
<snapshots>
<enabled>true</enabled>
</snapshots>
</repository>
</repositories>
<dependencies>
<!-- Apache Flink dependencies -->
<!-- These dependencies are provided, because they should not be packaged
into the JAR file. -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-java</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-streaming-java_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-clients_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
<scope>provided</scope>
</dependency>
<!-- Add connector dependencies here. They must be in the default scope
(compile). -->
<dependency>
<groupId>org.apache.flink</groupId>
<artifactId>flink-connector-kafka_${scala.binary.version}</artifactId>
<version>${flink.version}</version>
</dependency>
<!-- Add logging framework, to produce console output when running in the
IDE. -->
<!-- These dependencies are excluded from the application JAR by default. -->
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-slf4j-impl</artifactId>
<version>${log4j.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-api</artifactId>
<version>${log4j.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<version>${log4j.version}</version>
<scope>runtime</scope>
</dependency>
<dependency>
<groupId>org.apache.bahir</groupId>
<artifactId>flink-connector-kudu_2.11</artifactId>
<version>1.1-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.projectlombok</groupId>
<artifactId>lombok</artifactId>
<version>1.18.20</version>
</dependency>
</dependencies>
<build>
<plugins>
<!-- Java Compiler -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>${target.java.version}</source>
<target>${target.java.version}</target>
</configuration>
</plugin>
<!-- We use the maven-shade plugin to create a fat jar that contains all
necessary dependencies. -->
<!-- Change the value of <mainClass>...</mainClass> if your program entry
point changes. -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.1.1</version>
<executions>
<!-- Run shade goal on package phase -->
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes>
<exclude>org.apache.flink:force-shading</exclude>
<exclude>com.google.code.findbugs:jsr305</exclude>
<exclude>org.slf4j:*</exclude>
<exclude>org.apache.logging.log4j:*</exclude>
</excludes>
</artifactSet>
<filters>
<filter>
<!-- Do not copy the signatures in the META-INF folder. Otherwise,
this might cause SecurityExceptions when using the JAR. -->
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>com.apple.pipeline.poc.StreamingJob</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
<pluginManagement>
<plugins>
<!-- This improves the out-of-the-box experience in Eclipse by resolving
some warnings. -->
<plugin>
<groupId>org.eclipse.m2e</groupId>
<artifactId>lifecycle-mapping</artifactId>
<version>1.0.0</version>
<configuration>
<lifecycleMappingMetadata>
<pluginExecutions>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<versionRange>[3.1.1,)</versionRange>
<goals>
<goal>shade</goal>
</goals>
</pluginExecutionFilter>
<action>
<ignore />
</action>
</pluginExecution>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<versionRange>[3.1,)</versionRange>
<goals>
<goal>testCompile</goal>
<goal>compile</goal>
</goals>
</pluginExecutionFilter>
<action>
<ignore />
</action>
</pluginExecution>
</pluginExecutions>
</lifecycleMappingMetadata>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
</project>

How to execute jar file of a TestNG project in Eclipse

I am able to run a TestNG project in Eclipse by running the testng.xml file. After building the Maven project, I tried to run the jar file in the terminal using the command java -jar jarname,
but it showed the error "no main manifest attribute". So I created a runner class and tried to run it via Run As -> Java Application, but it throws the error "cannot find class in classpath".
My code is given below.
pom.xml file
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>MockTest</groupId>
<artifactId>MockFrameTest</artifactId>
<version>0.0.1-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-java</artifactId>
<version>3.141.59</version>
</dependency>
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>6.14.3</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.8.1</version>
<type>maven-plugin</type>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.8.7</version>
</dependency>
<dependency>
<groupId>com.aventstack</groupId>
<artifactId>extentreports</artifactId>
<version>3.1.5</version>
</dependency>
<dependency>
<groupId>org.freemarker</groupId>
<artifactId>freemarker</artifactId>
<version>2.3.28</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>3.0.0-M5</version>
<configuration>
<suiteXmlFiles>
<suiteXmlFile>testng.xml</suiteXmlFile>
</suiteXmlFiles>
</configuration>
</plugin>
</plugins>
</build>
</project>
testng.xml file
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE suite SYSTEM "https://testng.org/testng-1.0.dtd">
<suite name="Suite">
<listeners>
<listener class-name="com.reportListners.testListner"/>
</listeners>
<test name="Test">
<classes>
<class name="com.tests.Login"/>
<class name="com.tests.Home"/>
</classes>
</test> <!-- Test -->
</suite> <!-- Suite -->
runner.java
package runner;
import java.util.ArrayList;
import java.util.List;
import org.testng.TestNG;
public class TestNGrunner {
public static void main(String[] args) {
TestNG runner=new TestNG();
// Create a list of String
List<String> suitefiles=new ArrayList<String>();
// Add xml file which you have to execute
suitefiles.add(".//testng.xml");
// now set xml file for execution
runner.setTestSuites(suitefiles);
// finally execute the runner using run method
runner.run();
}
}
So how can I run this TestNG project using the jar file? Can anyone help me?
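For reference, the "no main manifest attribute" error just means the jar was built without a Main-Class entry in its manifest. A sketch of a maven-jar-plugin configuration pointing the manifest at the runner class above (note that a plain jar still doesn't bundle dependencies, so for a self-contained java -jar you would shade them in as well):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <version>3.2.0</version>
    <configuration>
        <archive>
            <manifest>
                <!-- class taken from runner.java above -->
                <mainClass>runner.TestNGrunner</mainClass>
            </manifest>
        </archive>
    </configuration>
</plugin>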

Sparkjava with a Jetty returns 404

I'm trying to run a single-page web application written in AngularJS with Spark Java on a Jetty; the project was created in Eclipse Kepler as a Maven project.
The error I'm receiving when I go to the URL http://localhost:8085/ is as follows, and it doesn't matter if I try to access any other URLs or point directly to the index.html/jsp - I still get the same error.
HTTP ERROR 404
Problem accessing /. Reason:
Not Found
Powered by Jetty://
Eclipse Console is showing that the server is up and running
== Spark has ignited ...
Listening on localhost:8085 [Thread-2] INFO org.eclipse.jetty.server.Server - jetty-9.0.2.v20130417 [Thread-2]
INFO org.eclipse.jetty.server.ServerConnector - Started
ServerConnector@12ee37d9{HTTP/1.1}{localhost:8085}
Added the Package Explorer in case that might help.
I'll add some of the files' code below; if more is needed, just let me know and I'll edit to add them.
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>todoapp</groupId>
<artifactId>todoapp1</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
<dependencies>
<dependency>
<groupId>com.sparkjava</groupId>
<artifactId>spark-core</artifactId>
<version>2.0.0</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<version>1.7.5</version>
</dependency>
<dependency>
<groupId>org.mongodb</groupId>
<artifactId>mongo-java-driver</artifactId>
<version>2.11.3</version>
</dependency>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.2.4</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.eclipse.jetty</groupId>
<artifactId>jetty-maven-plugin</artifactId>
<version>9.0.2.v20130417</version>
<configuration>
<webApp>
<contextPath>/public</contextPath>
</webApp>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<configuration>
<createDependencyReducedPom>true</createDependencyReducedPom>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<mainClass>com.todoapp.Bootstrap</mainClass>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
Bootstrap.java
package com.todoapp;
import com.mongodb.*;
import static spark.Spark.setIpAddress;
import static spark.Spark.setPort;
import static spark.SparkBase.staticFileLocation;
public class Bootstrap {
private static final String IP_ADDRESS = System.getenv("OPENSHIFT_DIY_IP") != null ? System.getenv("OPENSHIFT_DIY_IP") : "localhost";
private static final int PORT = System.getenv("OPENSHIFT_DIY_IP") != null ? Integer.parseInt(System.getenv("OPENSHIFT_DIY_IP")) : 8085;
public static void main(String[] args) throws Exception {
setIpAddress(IP_ADDRESS);
setPort(PORT);
staticFileLocation("/public");
new TodoResource(new TodoService(mongo()));
}
private static DB mongo() throws Exception {
String host = System.getenv("OPENSHIFT_MONGODB_DB_HOST");
if (host == null) {
MongoClient mongoClient = new MongoClient("localhost");
return mongoClient.getDB("todoapp");
}
int port = Integer.parseInt(System.getenv("OPENSHIFT_MONGODB_DB_PORT"));
String dbname = System.getenv("OPENSHIFT_APP_NAME");
String username = System.getenv("OPENSHIFT_MONGODB_DB_USERNAME");
String password = System.getenv("OPENSHIFT_MONGODB_DB_PASSWORD");
MongoClientOptions mongoClientOptions = MongoClientOptions.builder().connectionsPerHost(20).build();
MongoClient mongoClient = new MongoClient(new ServerAddress(host, port), mongoClientOptions);
mongoClient.setWriteConcern(WriteConcern.SAFE);
DB db = mongoClient.getDB(dbname);
if (db.authenticate(username, password.toCharArray())) {
return db;
} else {
throw new RuntimeException("Not able to authenticate with MongoDB");
}
}
}
These two files are the ones responsible for the Jetty server. I have been researching the internet for over a day now, trying to find a solution and the reason for this error, but nothing has helped me.
Make sure your /public directory, where your html files land, is accessible through your project's classpath. To test this, try using staticFiles.externalLocation (that is for version 2.5; externalStaticFileLocation in your case) with the full path to /public as the param. If it works, you should move your /public directory to the same location as your Java packages, and it will work with staticFileLocation("/public") as you expect.
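As a concrete sketch of that test for Spark 2.0.0 (where these helpers are static methods on SparkBase; the absolute path is a placeholder to replace with your real one):
import static spark.Spark.get;
import static spark.SparkBase.externalStaticFileLocation;

public class StaticFilesProbe {
    public static void main(String[] args) {
        // Serve static files from an absolute path on disk instead of the classpath
        externalStaticFileLocation("/absolute/path/to/project/src/main/resources/public");
        // A dummy route so the embedded Jetty actually starts
        get("/ping", (request, response) -> "pong");
    }
}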

Bad request when updating Appengine with mvn appengine:update

I'm getting the following error when I try to update an App Engine application with the appengine-maven-plugin:
400 Bad Request
Error when loading application configuration:
Unable to assign value '1.8.3' to attribute 'version':
Value '1.8.3' for version does not match expression '^(?:^(?!-)[a-z\d\-]{0,62}[a-z\d]$)$'
This is confusing to me because my appengine-web.xml looks as follows:
<appengine-web-app xmlns="http://appengine.google.com/ns/1.0">
<application>helloworld</application>
<version>0-0-1</version>
<threadsafe>true</threadsafe>
<precompilation-enabled>false</precompilation-enabled>
<system-properties>
<property name="java.util.logging.config.file" value="WEB-INF/logging.properties"/>
</system-properties>
</appengine-web-app>
I'm wondering why the appengine-maven-plugin wants to use 1.8.3 as the application version. 1.8.3 is the version of the App Engine SDK I want to use.
In my POM it's configured as follows:
<dependency>
<groupId>com.google.appengine</groupId>
<artifactId>appengine-api-1.0-sdk</artifactId>
<version>${appengine.version}</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>2.5</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>jstl</groupId>
<artifactId>jstl</artifactId>
<version>1.2</version>
</dependency>
and later on
<plugin>
<groupId>com.google.appengine</groupId>
<artifactId>appengine-maven-plugin</artifactId>
<version>${appengine.version}</version>
<configuration>
<appVersion>${appengine.app.version}</appVersion>
</configuration>
</plugin>
${appengine.app.version} points to 1.8.3
I'm using Maven version 3.1 and Java 1.7.0_25.
What am I doing wrong? Can anyone help me?
Thanks a lot
I had the same issue as you described. When I added the "version" element inside the configuration element, with its value pointing to the version of the app in my appengine-web.xml file, mvn appengine:update completed successfully. (Maven v3.1.0, App Engine plugin v1.8.3)
in pom.xml:
....
<plugin>
<groupId>com.google.appengine</groupId>
<artifactId>appengine-maven-plugin</artifactId>
<version>${appengine.version}</version>
<configuration>
<version>MY-VERSION</version>
</configuration>
</plugin>
...
in appengine-web.xml:
...
<version>MY-VERSION</version>
...
If you generated the project with the archetype skeleton, like I did, and you have a block similar to
<properties>
<app.id>MY-GAE-PROJECT-ID</app.id>
<app.version>1</app.version>
<appengine.version>1.9.20</appengine.version>
<gcloud.plugin.version>0.9.58.v20150505</gcloud.plugin.version>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
in your pom.xml and your appengine-web.xml looked like
<?xml version="1.0" encoding="utf-8"?>
<appengine-web-app xmlns="http://appengine.google.com/ns/1.0">
<application>${app.id}</application>
<version>1</version>
etc....
when it was generated, then modify the <version> element in appengine-web.xml to be ${app.version}, because they helpfully already added that property in the archetype but never used it anywhere. Then update your pom.xml's app.version to your appropriate version (if you don't use "1"). Then scroll down in the pom.xml to where you see
<groupId>com.google.appengine</groupId>
<artifactId>appengine-maven-plugin</artifactId>
<version>${appengine.version}</version>
<configuration>
<enableJarClasses>false</enableJarClasses>
and inside the configuration block there add
<version>${app.version}</version>
Try to change appengine-web.xml entry from <version>0-0-1</version> to <version>1</version>. Regards, Adam.
I changed only the pom.xml file by adding the plugin > configuration > version tag (per below)...
<plugin>
<groupId>com.google.appengine</groupId>
<artifactId>appengine-maven-plugin</artifactId>
<version>${project.appengine.version}</version>
<configuration>
<port>8888</port>
<version>${app.version}</version>
</configuration>
</plugin>
The way I solved this issue was trivial: in my console I executed mvn clean install and then the appcfg.cmd -A [your app] update target\appengine-try-java-1.0 command.
