Maven surefire report's filename has redundant spaces when the tests are written using cucumber - maven-surefire-plugin

I am using maven-surefire-plugin version 2.16 along with tests written using Cucumber. The .xml and .txt reports do come up fine in the surefire-reports folder. However, the text within the .xml and .txt files, along with the names, has a lot of extra spaces, proportional to the number of step definitions executed cumulatively. The filename also has a number of spaces proportional to the number of steps executed. If I run a lot of tests, the file simply does not save, with the following exception:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project **:
Execution default-test of goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test failed: org.apache.maven.surefire.util.NestedRuntimeException: null;
nested exception is org.apache.maven.surefire.report.ReporterException:
Unable to create file for report: /Users/kgupta2/git/$$$$$$$$/target/surefire-reports/Scenario: Using $$$$$ .txt (File name too long);
nested exception is java.io.FileNotFoundException: /Users/kgupta2/git/$$$$$$$$/target/surefire-reports/Scenario: Using $$$$$$$$ .txt (File name too long)
-> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
Clearly the filename becomes long and I have verified that it is proportional to the number of steps executed. I am running cucumber using JUnit.
Here is my configuration for maven-surefire-plugin
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.16</version>
    <configuration>
        <testSourceDirectory>${project.basedir}/src/test/java</testSourceDirectory>
        <includes>
            <include>**/CucumberJunitRun.java</include>
        </includes>
    </configuration>
    <dependencies>
        <dependency>
            <groupId>org.apache.maven.surefire</groupId>
            <artifactId>surefire-junit47</artifactId>
            <version>2.16</version>
        </dependency>
    </dependencies>
</plugin>
I am unable to understand why these additional spaces pop up.

I got this to work. It was an issue with incorrect plugin configuration. Since I was using JUnit to run my Cucumber tests, I went ahead and removed all references to TestNG from my pom, and that got it to work magically. Some references to TestNG were coming in from my parent pom; I checked and my effective pom did have a TestNG dependency. I really do not have a great clue as to why this was happening, but the correct configuration fixed a bunch of other errors that were creeping in.
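For anyone hitting the same conflict: one way to keep an inherited TestNG dependency off the test classpath is an exclusion on whichever artifact drags it in. The coordinates below are purely illustrative — run mvn dependency:tree against your own project to find the real source:

```xml
<!-- Hypothetical sketch: exclude a transitive TestNG dependency.
     Replace some.group:some-artifact with whatever mvn dependency:tree
     shows is pulling in org.testng:testng. -->
<dependency>
    <groupId>some.group</groupId>
    <artifactId>some-artifact</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.testng</groupId>
            <artifactId>testng</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```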

Related

Apache Camel - Camel Report Plugin is not generating the JaCoCo xml

I am using Apache Camel 3.14.0 with Spring Boot and am trying to generate a code coverage report as JaCoCo xml so that SonarQube can use it to show code coverage. I followed all the steps mentioned in https://camel.apache.org/manual/camel-report-maven-plugin.html#_camelroute_coverage but the JaCoCo xml file is still not getting generated; instead, the coverage is shown on the console.
I have the below dependency in my pom.xml:
<dependency>
    <groupId>org.apache.camel</groupId>
    <artifactId>camel-management</artifactId>
    <version>${camel.springboot.version}</version>
    <scope>test</scope>
</dependency>
My test file has the below annotations:
@CamelSpringBootTest
@EnableAutoConfiguration
@SpringBootTest(classes = MyApplication.class)
@MockEndpoints(value = "direct:first-route")
@EnableRouteCoverage
I executed the mvn test command; all the tests pass and individual test xml files are generated inside the target/camel-route-coverage directory. After this, I execute the mvn camel-report:route-coverage -DgenerateJacocoXmlReport=true command, which shows code/route coverage on the console but doesn't generate the target/site/jacoco/xmlJacoco.xml file.
Because of this, I am unable to view code coverage in SonarQube.
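One variant worth trying: set the flag in the plugin configuration rather than on the command line, so it is picked up consistently on every invocation. A minimal sketch, assuming the camel-report-maven-plugin coordinates from the linked manual and the generateJacocoXmlReport parameter used in the question:

```xml
<plugin>
    <groupId>org.apache.camel</groupId>
    <artifactId>camel-report-maven-plugin</artifactId>
    <version>3.14.0</version>
    <configuration>
        <!-- ask the route-coverage goal to write the JaCoCo xml file
             in addition to the console report -->
        <generateJacocoXmlReport>true</generateJacocoXmlReport>
    </configuration>
</plugin>
```

With this in place, plain mvn test camel-report:route-coverage should behave the same as the -D form.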

Does anyone have a GluonFX codebase with successful native builds after including application logging?

I started using the Logback logging library in my GluonFX app.
<dependency>
    <groupId>ch.qos.logback</groupId>
    <artifactId>logback-classic</artifactId>
    <version>1.2.6</version>
</dependency>
It's fine during mvn gluonfx:run, but my build fails with mvn gluonfx:build.
Here's an excerpt of my stack trace. I'll change logging libraries if someone can tell me what's working for them.
Exception raised in scope ForkJoinPool-2-worker-3.ClosedWorldAnalysis.AnalysisGraphBuilderPhase: org.graalvm.compiler.java.BytecodeParser$BytecodeParserError: com.oracle.graal.pointsto.constraints.UnresolvedElementException: Discovered unresolved type during parsing: sun.reflect.Reflection. To diagnose the issue you can use the --allow-incomplete-classpath option. The missing type is then reported at run time when it is accessed the first time.
Caused by: com.oracle.graal.pointsto.constraints.UnresolvedElementException: Discovered unresolved type during parsing: sun.reflect.Reflection. To diagnose the issue you can use the --allow-incomplete-classpath option. The missing type is then reported at run time when it is accessed the first time.
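One avenue worth trying before switching libraries: GraalVM native images need reflection metadata declared up front, and the gluonfx-maven-plugin exposes a reflectionList configuration option for exactly that. A sketch under the assumption that the unresolved lookups come from Logback's internals — the class names listed are illustrative, not verified against this stack trace:

```xml
<plugin>
    <groupId>com.gluonhq</groupId>
    <artifactId>gluonfx-maven-plugin</artifactId>
    <version>1.0.10</version>
    <configuration>
        <reflectionList>
            <!-- illustrative entries; add whichever classes the
                 closed-world analysis reports as unresolved -->
            <list>ch.qos.logback.classic.Logger</list>
            <list>ch.qos.logback.core.ConsoleAppender</list>
        </reflectionList>
    </configuration>
</plugin>
```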

Upgraded Flink from 1.10 to 1.11, met error 'No ExecutorFactory found to execute the application'

java.lang.IllegalStateException: No ExecutorFactory found to execute the application.
at org.apache.flink.core.execution.DefaultExecutorServiceLoader.getExecutorFactory(DefaultExecutorServiceLoader.java:84)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.executeAsync(StreamExecutionEnvironment.java:1803)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1713)
at org.apache.flink.streaming.api.environment.LocalStreamEnvironment.execute(LocalStreamEnvironment.java:74)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1699)
at org.apache.flink.streaming.api.environment.StreamExecutionEnvironment.execute(StreamExecutionEnvironment.java:1681)
at com.cep.StaticAlarmGenerationEntryTest.main(StaticAlarmGenerationEntryTest.java:149)
This is the error I met after I upgraded Flink from 1.10 to 1.11; my IDE is Eclipse.
I tried adding the artifact flink-clients_${scala.binary.version}, but it still failed. If anybody has already met and solved this issue, please tell me. Thanks a lot.
See the 1.11 release note, where you now have to add an explicit dependency on flink-clients.
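The dependency the release notes call out looks like the following; the version and Scala suffix here are placeholders that should match your own build:

```xml
<!-- Flink 1.11 no longer pulls this in transitively for applications -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_${scala.binary.version}</artifactId>
    <version>1.11.0</version>
</dependency>
```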
I have solved my problem this way:
1. Use Java 8; apparently Flink has some sort of problem with Java 11 or 15.
2. Change all the scopes to "compile".
You can change scopes in this path: Project Structure → Modules → Dependencies → there is a table in which one of the columns is named Scope.
I found the reason why the error happened even after I added the flink-clients dependency. When I upgraded Flink from 1.10 to 1.11, I only edited the Flink version but did not change the Scala version; it should also be changed to 2.12. The project was generated from the 1.10 archetype with Scala version 2.11, so every time I built the project it used the 2.11 environment.
So the fast way to solve this issue is:
1. Generate a new project with the command mvn archetype:generate -DarchetypeGroupId=org.apache.flink -DarchetypeArtifactId=flink-quickstart-java -DarchetypeVersion=1.11.0.
2. Copy all your old code into this new project. You will find that flink-clients is already added in the pom.xml.
I had this problem when I was packaging up the flink job in a shaded jar. When shading, if there are files with the same name in multiple jars it will overwrite the file as it unzips each jar into the new shaded jar.
Flink uses the file META-INF/services/org.apache.flink.core.execution.PipelineExecutorFactory to discover different executor factories and this file is present in multiple jars, each with different contents.
To fix this, I had to tell the maven-shade plugin to combine these files together as it came across them, and this solved the problem for me.
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.1.0</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <shadedArtifactAttached>true</shadedArtifactAttached>
                <shadedClassifierName>job</shadedClassifierName>
                <transformers>
                    <!-- add this to combine the PipelineExecutorFactory files into one -->
                    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                        <resource>META-INF/services/org.apache.flink.core.execution.PipelineExecutorFactory</resource>
                    </transformer>
                </transformers>
                ...
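A related option: rather than naming each service file, maven-shade-plugin also ships a ServicesResourceTransformer that merges every META-INF/services file it encounters, which covers Flink's PipelineExecutorFactory along with any other service loaders in the shaded jars. A sketch of the transformers section using it:

```xml
<transformers>
    <!-- merges all META-INF/services entries across the shaded jars,
         including org.apache.flink.core.execution.PipelineExecutorFactory -->
    <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
</transformers>
```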

Error resolving version for plugin 'org.apache.maven.plugins:maven-jar-plugin' from the repositories, Plugin not found in any plugin repository

I am trying to Maven-build my Java project, but it is failing with the below error:
Error resolving version for plugin 'org.apache.maven.plugins:maven-jar-plugin' from the repositories [local (C:\Users\Vinita.Gupta\.m2\repository), central (https://repo.maven.apache.org/maven2)]: Plugin not found in any plugin repository -> [Help 1]
I am trying to build the project in Eclipse Neon and have installed and set up the Maven path and the m2 connector for Eclipse.
Note: I am not connected to the network through any proxy.
Most of the solutions I have found online were for issues caused by proxy settings, but I am directly connected to the network and able to browse the below directory via a browser, yet I couldn't connect through Maven:
https://repo.maven.apache.org/maven2
Please help!
I had the same error, but for the maven-failsafe-plugin.
With the help of a colleague, we got rid of the problem by adding the version line to the plugin declaration in the project pom.xml. We added the <version>2.20.1</version> line, resulting in the following plugin declaration:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-failsafe-plugin</artifactId>
    <version>2.20.1</version>
    <configuration>
        <skipTests>false</skipTests>
        <argLine>-Djava.library.path=${project.parent.basedir}/lib/${arquitecturaMaquina}</argLine>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>integration-test</goal>
                <goal>verify</goal>
            </goals>
        </execution>
    </executions>
</plugin>
It is a long-running project (over a year) and it never had this error before. I do not know what triggered it, but all the other answers about resetting the local repository, checking the proxy, etc. did not work.
Then we added that version line to the pom.xml and the project built correctly again. I have since removed the line and it is still working. My guess is that it indeed had something to do with a messed-up cache of some kind, and that requiring the specific version got rid of the mess; now it works as it did before the problem appeared.
Add this to your pom.xml and it should work
<pluginRepositories>
    <pluginRepository>
        <id>maven2</id>
        <url>https://repo.maven.apache.org/maven2/</url>
    </pluginRepository>
</pluginRepositories>
I had a weird solution. I got the same error when I was connected to the office LAN. Once I switched to the common wifi, the error was no longer thrown. So I believe this has more to do with network restrictions.
I spent a day working on it; finally, the solution was to change the URL in the settings.xml of the .m2 directory.
The old line was:
<url>http://blah:8081/bla/blabla/</url>
However, for a reason unknown to me, and from my machine only, the URL was getting rejected. As a result, I was getting the error mentioned in the title. Therefore, the solution was to put the full URL, such as:
<url>http://blah.org.blah:8081/bla/blabla/</url>
Then simply re-open the IDE (if using one) and update the project (build, clean, or whatever else is necessary to refresh).
You may also need to specify <port> or to add proxy settings, but none of that was required in my case. The solution, then, is about adjusting the settings.xml. I tried adding the version numbers as suggested and it was only a partial solution. Cleaning and updating the project (without the settings.xml change) did nothing. Removing *.lastUpdated files from .m2 was not a solution either.
What worked for me was to add a Proxy Bypass for repo.maven.apache.org under ‘Window’ > ‘Preferences’ > ‘General’ > ‘Network connections’.
After that I ran into compilation errors related to certain classes not being resolved. I just did a clean and build, and voilà, Build Successful!
Check that the settings.xml file is available in C:\Users\username\.m2\settings.xml and set the local repository path to:
C:/Users/username/.m2/repository
It worked for me.
I added this to pom.xml and updated the project.
Worked for me.
<build>
    <finalName>spring-security-demo</finalName>
    <pluginManagement>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-war-plugin</artifactId>
                <version>3.2.0</version>
            </plugin>
        </plugins>
    </pluginManagement>
</build>

AppEngine Maven Plugin jarSplittingExcludes

I'm trying to include the stanford-corenlp library into my AppEngine application, but am running into "JAR too big" errors when uploading:
com.google.appengine.tools.admin.LocalIOException: Jar.../stanford-corenlp-3.3.0-models.jar is too large. Consider using --enable_jar_splitting.
Unable to update app: Jar /tmp/appcfg4516706702870427847.tmp/WEB-INF/lib/stanford-corenlp-3.3.0-models.jar is too large. Consider using --enable_jar_splitting.
As suggested, I tried the enable_jar_splitting option, but some individual files were too large and those can't be split. Looking through the plugin docs I found:
jarSplittingExcludes
User property: appengine.jarSplittingExcludes
When --enable-jar-splitting is set, files that match the list of comma
separated SUFFIXES will be excluded from all jars.
So I tried to exclude some of the large files that I'm not using with this pom.xml:
<plugin>
    <groupId>com.google.appengine</groupId>
    <artifactId>appengine-maven-plugin</artifactId>
    <version>${appengine.target.version}</version>
    <configuration>
        <enableJarSplitting>true</enableJarSplitting>
        <jarSplittingExcludes>englishRNN.ser.gz</jarSplittingExcludes>
    </configuration>
</plugin>
However, now when I run
mvn appengine:update
I get:
[INFO] Updating Google App Engine Application
Bad argument: Unknown option: --jar_splitting_excludes
usage: AppCfg [options] <action> [<app-dir>] [<argument>]
...
options:
....
--enable_jar_splitting
Split large jar files (> 10M) into smaller fragments.
--jar_splitting_excludes=SUFFIXES
When --enable-jar-splitting is set, files that match
the list of comma separated SUFFIXES will be excluded
from all jars.
Any ideas on what I'm doing wrong?
Turns out it was a bug. It should be fixed in 1.9.7. You can clone the development branch for earlier access:
$ git clone https://code.google.com/p/appengine-maven-plugin/
$ cd appengine-maven-plugin
$ mvn install
Then change the plugin version in your pom to 1.9.7-SNAPSHOT (keep the other artifacts the same; change only the plugin).