Maven/Jacoco - after successful tests and a Jacoco datafile merge, how to get a merged report? - maven-plugin

For a Maven multi-module project, running unit and/or integration tests with Jacoco code coverage works fine. Merging the Jacoco code coverage results into one datafile also works, so this is not a duplicate question.
Based on that correctly merged Jacoco datafile, how can I get an overall report?
UPDATE: after scanning all the proposed solutions and a lot of trial and error, I created a simple project that satisfies all requirements; see below.

Finally I managed to create a simple but complete Jacoco demo project. It shows a working solution that uses the combined Jacoco datafile and produces an aggregate report! It demonstrates:
Multi module project
Unit test (via mvn clean install)
Integration test (via mvn clean install -P integration-test)
Jacoco - test coverage ( both aggregate datafile and aggregate reporting)
FindBugs - code quality
Enjoy the simple demo project.
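As a rough sketch of the merge-and-report part (the paths, the plugin version, and the idea of a dedicated reporting module are assumptions, not taken from the demo project): bind jacoco:merge to collect the per-module datafiles into one file, then point jacoco:report at that merged datafile.
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.8.8</version>
  <executions>
    <execution>
      <id>merge-coverage</id>
      <phase>verify</phase>
      <goals>
        <goal>merge</goal>
      </goals>
      <configuration>
        <!-- Collect every per-module datafile produced by the unit and integration test runs -->
        <fileSets>
          <fileSet>
            <directory>${project.basedir}/..</directory>
            <includes>
              <include>**/target/*.exec</include>
            </includes>
          </fileSet>
        </fileSets>
        <destFile>${project.build.directory}/jacoco-merged.exec</destFile>
      </configuration>
    </execution>
    <execution>
      <id>report-on-merged-data</id>
      <phase>verify</phase>
      <goals>
        <goal>report</goal>
      </goals>
      <configuration>
        <!-- Report against the merged datafile instead of the default jacoco.exec -->
        <dataFile>${project.build.directory}/jacoco-merged.exec</dataFile>
        <outputDirectory>${project.reporting.outputDirectory}/jacoco-merged</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
Note that jacoco:report only renders the classes of the module it runs in; for a true cross-module report, the report-aggregate goal (or a dedicated reporting module that depends on all the others) is the usual choice.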

Related

Results for deleted / renamed tests still appear in Allure reports using Jenkins Allure plugin

I am using the Jenkins Allure plugin to generate reports for PyTest runs.
I've noticed that if I delete a failing test from my repository, or rename a failing test, the Allure reports generated by Jenkins continue to show failures for the old tests, even though they no longer exist and did not run in the most recent job.
How do I ensure that Allure reports only contain results for tests that actually ran in the latest job?
You should generate the results in the allure-results directory in your project root.
Every time you run your job, new Allure results files are generated in the Jenkins workspace. You should clean your workspace before the build starts to ensure that the report only reflects the last execution.
Frank Escobar's answer is correct.
I want to add that if you're using a Pipeline, the option in his screenshot is not available.
In that case, use the Jenkins Workspace Cleanup plugin (https://jenkins.io/doc/pipeline/steps/ws-cleanup/) and create a pipeline step that clears your workspace before starting the test run.
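For example, a minimal declarative pipeline sketch (the test command and result paths are illustrative, assuming PyTest with the Allure and Workspace Cleanup plugins installed):
pipeline {
    agent any
    stages {
        stage('Clean workspace') {
            steps {
                cleanWs() // wipe the workspace so stale allure-results from earlier runs are gone
            }
        }
        stage('Test') {
            steps {
                sh 'pytest --alluredir=allure-results' // illustrative PyTest invocation
            }
        }
    }
    post {
        always {
            allure results: [[path: 'allure-results']] // publish only the results of this run
        }
    }
}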

How to include multiple code coverages in the pipeline

I have an Entity Framework .NET Core application as the backend and a .NET Core React app for the front-end.
I am trying to set up Azure Pipelines for this project.
While setting up the pipeline for the .NET Core React app, I also run the library tests (which generate code coverage) because the library is a project referenced by my UI project.
The issue is that when I run the Jest tests for the .NET Core React app, they also generate test coverage, but the code coverage tab in the build pipeline summary does not show that coverage when I enable code coverage for the library.
I am able to see both coverages in the artifacts published.
How can I see both coverages in my build summary?
To combine multiple coverage reports into one, you can use the reportgenerator task. It takes multiple coverage files and combines them into one single coverage file. This new coverage file can then be published using the Publish Code Coverage Results task.
Example:
steps:
- task: Palmmedia.reportgenerator.reportgenerator-build-release-task.reportgenerator@4
  displayName: ReportGenerator
  inputs:
    reports: '*\TestResults\*\coverage.cobertura.xml'
- task: PublishCodeCoverageResults@1
  displayName: 'Publish code coverage'
  inputs:
    codeCoverageTool: Cobertura
    summaryFileLocation: 'coveragereport\Cobertura.xml'
    failIfCoverageEmpty: true
Here's an article that describes this in greater detail: https://www.jamescroft.co.uk/combining-multiple-code-coverage-results-in-azure-devops/
How to include multiple code coverages in the pipeline
It seems you are using the Publish Code Coverage Results task; unlike the Publish Test Results task, it cannot publish multiple coverage results in a single task.
When you have two coverage.xml files, the Azure DevOps build definition will only use one of them.
To resolve this issue, try adding a separate Publish Code Coverage Results task to the build pipeline for each package.
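A sketch of that suggestion in YAML (the file paths and display names are illustrative):
steps:
- task: PublishCodeCoverageResults@1
  displayName: 'Publish library coverage'
  inputs:
    codeCoverageTool: Cobertura
    summaryFileLocation: '$(Build.SourcesDirectory)/Library.Tests/coverage.cobertura.xml'
- task: PublishCodeCoverageResults@1
  displayName: 'Publish UI coverage'
  inputs:
    codeCoverageTool: Cobertura
    summaryFileLocation: '$(Build.SourcesDirectory)/ClientApp/coverage/cobertura-coverage.xml'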
If that does not help, please share your build definition in your question.
Hope this helps.

Proper method to implement Jest tests in Jenkins build

We're using Jest to perform our React.js unit tests (on the frontend) of our Node.js app which runs in a docker container.
We have set up a Pipeline in Jenkins but I'm unsure of the best way (or best practice) to include the tests as part of the pipeline.
The steps we have are the following:
Check out the code from source control
NPM install and npm run build (front-end)
Docker build + publish
Deploy app
Bump version
Git push
Docker cleanup
I have 3 main queries:
A. I'm assuming it's best to run npm run test between Step 1 and Step 2, and only move further if all tests pass?
B. But how are the snapshots handled? For example, if some change generates a difference in a snapshot, that snapshot will not be "checked" back into source control.
C. I read that people use Cobertura, jest-junit, etc. to get unit test results and coverage into Jenkins - which is best?
Thanks in advance.
Good questions!
A. You can run the tests right after npm install, and only move further if they all pass. Another common thing to do at that point is to run linting or a code style check.
B. A bad snapshot will fail the tests, which is why it's important to update snapshots before committing. If your Jenkins is hooked up to a code review system, you can block merges that fail the build, to make sure bad snapshots don't land on your master branch.
C. I have seen people use jest-junit, but only because there was a requirement to combine the coverage report with a JUnit-style report. If you don't have any particular requirements around the structure of the report, the default report Jest produces should be fine, and you don't need anything extra.
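To illustrate point A, a minimal declarative pipeline sketch (stage names and npm commands are assumptions about your project, assuming the package.json test script runs Jest):
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps { checkout scm }
        }
        stage('Install') {
            steps { sh 'npm ci' } // or 'npm install'
        }
        stage('Unit tests') {
            steps { sh 'npm test -- --ci --coverage' } // any failing Jest test fails the build here
        }
        stage('Build') {
            steps { sh 'npm run build' }
        }
        // Docker build/publish, deploy, version bump, git push and cleanup would follow as further stages
    }
}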

Jenkins Jacoco Plugin not linking Groovy source files

Is there a way to configure the Jenkins Jacoco Plugin to link Groovy source files to the coverage report? The coverage statistics are calculated correctly, however, in a mixed Java/Groovy Project, only the Java files are linked. The configuration looks as follows:
Switching to the latest release (3.0.3), I was able to fix that issue. However, you still need to manually tell the plugin to look for *.groovy source files, e.g.:
jacoco classPattern: 'build/classes',
execPattern: 'build/jacoco/test.exec',
sourceInclusionPattern: '**/*.groovy', // new option required to tell the plugin to search for *.groovy source files
sourcePattern: 'src,test'
Based upon this bug report, it looks like the 2.2+ releases have changed how source code is linked in the report such that it only works for *.java files. One possible work-around is to downgrade the JaCoCo plugin to 2.1.0. This is what we did and it works; although I am not sure what features and bug fixes we give up in 2.2+ so it might not be worth it in your situation.
It looks like there is a Pull Request that needs to be reviewed and merged so that it can be released in an upcoming version.

How to include chutzpah test cases in tests project to output folder

I have a web project, say 'WebProj', in which I have defined all my JavaScript source files containing Angular code. I define my Chutzpah unit test cases for those JavaScript source files in another project, 'WebProj.Tests', along with my other C# test cases. Both the web project and the test project are in the same solution.
My problem is that when I try to integrate the web project into the TFS build process, I cannot run the Chutzpah test cases because the project output folder does not have the Chutzpah test case files copied into it. At the same time, the test cases are executed if the Chutzpah test cases are defined in the web project itself.
How can I execute the JavaScript Chutzpah test cases during the build process when they are defined in a separate test project, and how do I get them copied to the project output folder after the build?
To have the test case files copied to the build output folder, right-click the .js file, select Properties, and set the Copy to Output Directory property to Copy always.
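Equivalently, the same setting can be made directly in the WebProj.Tests project file (the file path below is hypothetical):
<ItemGroup>
  <Content Include="Specs\home.controller.tests.js">
    <!-- Copy this test file to the build output folder on every build -->
    <CopyToOutputDirectory>Always</CopyToOutputDirectory>
  </Content>
</ItemGroup>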
Additionally, you need to follow the steps below to run chutzpah tests in TFS build process.
Install jasmine.js to the test project.
Install Chutzpah Test Adapter
Install Chutzpah test runner to the solution (on solution level, not project level).
Set the test assembly filter to match your JavaScript test naming convention, e.g. ***.tests.js.
Configure the build to use the custom test adapter during the TFS build process: 1) if you are working with a vNext build, go to the Visual Studio Test step and set the Path to Custom Test Adapters property to something like $(Build.SourcesDirectory)\packages (the path comes from the NuGet restore); 2) if you are working with a XAML build, go to Build -> Manage Build Controllers and set Version Control Path to Custom Assemblies to the package path.
The complete steps can be found in the "But what if you want run Jasmine.JS test?" part of this blog: http://blogs.blackmarble.co.uk/blogs/rfennell/post/2015/09/23/Running-nUnit-and-JasmineJS-unit-tests-in-TFSVSO-vNext-build.aspx
And also this blog (do not follow Step 2, which checks these files into TFS version control; instead, use NuGet to download these packages): http://blogs.msdn.com/b/visualstudioalm/archive/2012/07/09/javascript-unit-tests-on-team-foundation-service-with-chutzpah.aspx
Add another Visual Studio Test step to your build definition that just runs the JavaScript tests; the value to fill in on that Test step (redacted in the green box of the original screenshot) is the path to your test files. When the build runs, the stats from the two test runs are combined in the build output.
