Publishing xUnit functional test results to Azure DevOps - selenium-webdriver

I just set up a new project (.NET Core: netcoreapp2.1) using xUnit tests for UI test automation with Selenium. I can run the tests via build and release pipelines on DevOps, but I cannot get the test results onto the Tests tab.
Now I'm wondering: how do I get XML reports of my test runs? On the release pipeline, I have a Publish Test Results task, but results are not getting published, with the error below.
No Result Found to Publish 'D:\a\r1\a\Global Platform-QA\drop\TestResults\TEST.XML'.
2019-06-27T02:48:16.2148676Z No build level attachments to publish.
I have tried changing the test result format to JUnit, but I am still missing something. I have also added an empty TEST.XML to the TestResults folder, but still haven't been able to figure out the missing link.
Below is the YAML for the build and release pipelines on DevOps.
steps:
- task: DotNetCoreCLI@2
  displayName: Restore
  inputs:
    command: restore
    projects: '**/*.csproj'
- task: DotNetCoreCLI@2
  displayName: Build
  inputs:
    projects: '**/*.csproj'
    arguments: '--configuration $(BuildConfiguration)'
- task: DotNetCoreCLI@2
  displayName: Publish
  inputs:
    command: publish
    publishWebProjects: false
    projects: '$(Parameters.RestoreBuildProjects)'
    arguments: '--configuration $(BuildConfiguration) --output $(build.artifactstagingdirectory)'
    zipAfterPublish: false
    modifyOutputPath: false
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'
  condition: succeededOrFailed()
Release pipeline:
steps:
- task: DotNetCoreCLI@2
  displayName: CVProSmokeTest
  inputs:
    command: custom
    projects: '**/CVProSmokeTest.dll'
    custom: vstest
    arguments: '--logger:trx;logfilename=TEST.xml'
    workingDirectory: '$(System.DefaultWorkingDirectory)'
  continueOnError: true
  condition: succeededOrFailed()
  timeoutInMinutes: 20

steps:
- task: PublishTestResults@2
  displayName: 'Publish Test Results'
  inputs:
    testResultsFormat: XUnit
    testResultsFiles: '**/TEST.xml'
    mergeTestResults: true
    testRunTitle: Selenium
  condition: succeededOrFailed()
  timeoutInMinutes: 20
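One detail worth double-checking in the YAML above (an observation, not something from the original post): the vstest step uses --logger:trx, which writes a TRX-format file, while the publish step declares testResultsFormat: XUnit, so the parser may find nothing in the file it locates. A minimal sketch of the publish step with the format aligned to the logger, assuming the logger really writes TEST.xml under the working directory:

- task: PublishTestResults@2
  displayName: 'Publish Test Results'
  inputs:
    testResultsFormat: VSTest        # a file produced by the trx logger is VSTest-format
    testResultsFiles: '**/TEST.xml'  # the name given via logfilename=TEST.xml
    mergeTestResults: true
    testRunTitle: Selenium
  condition: succeededOrFailed()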

One of my colleagues helped me fix the pipeline. Here are the modified steps: before and after the dotnet CLI task, a bash script was added to list files.
Pre-build list files:
steps:
- bash: |
    pwd
    ls -alR
  workingDirectory: '$(System.DefaultWorkingDirectory)'
  displayName: 'Pre-Build List Files'
After the build task and before the publish, post-build list files:
steps:
- bash: |
    pwd
    ls -alR
    cat */test-results.xml
  workingDirectory: '$(System.DefaultWorkingDirectory)'
  displayName: 'Post-Build List Files'
The results are published and can be viewed under the Tests tab.
https://developercommunity.visualstudio.com/content/problem/624719/publishing-xunit-test-results-to-azure-devops.html

Related

How to get output for a JUnit test in GitHub Actions steps

I'm new to GitHub Actions and was wondering if the following is possible. I'm running a JUnit test that creates a Salesforce record and returns the record Id. I would like to capture the Id from the test log output and send it to Slack. Is it possible to set the record Id as a variable and create another job to send it to Slack?
This is my workflow for creating the record:
test:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v1
    - name: Set up JDK 14
      uses: actions/setup-java@v1
      with:
        java-version: 14
        cache: maven
    - name: Build project with Maven
      run: mvn -B package --file pom.xml -Dtest="StageSFDCTestData#insertNewAccountTwo"
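GitHub Actions can pass a value from one job to another via step and job outputs, which maps directly onto the "set the record Id as a variable and send it from another job" idea. A minimal sketch (the grep pattern, the test.log file name, and the SLACK_WEBHOOK_URL secret are placeholders to adapt to how the test actually prints the Id):

jobs:
  test:
    runs-on: ubuntu-latest
    outputs:
      record_id: ${{ steps.extract.outputs.record_id }}
    steps:
      - uses: actions/checkout@v1
      - name: Set up JDK 14
        uses: actions/setup-java@v1
        with:
          java-version: 14
      - name: Build project with Maven
        # tee keeps the normal console output while also capturing it to a file
        run: mvn -B package --file pom.xml -Dtest="StageSFDCTestData#insertNewAccountTwo" | tee test.log
      - name: Extract the record Id from the test log
        id: extract
        # Placeholder pattern - change it to match the line your test prints
        run: echo "record_id=$(grep -oP 'Record Id: \K\S+' test.log)" >> "$GITHUB_OUTPUT"

  notify:
    needs: test
    runs-on: ubuntu-latest
    steps:
      - name: Send the Id to Slack
        # SLACK_WEBHOOK_URL is a repository secret you would create yourself
        run: |
          curl -X POST -H 'Content-type: application/json' \
            --data "{\"text\": \"New Salesforce record: ${{ needs.test.outputs.record_id }}\"}" \
            "${{ secrets.SLACK_WEBHOOK_URL }}"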

WPF application failing on Azure DevOps build - The runtime pack for Microsoft.WindowsDesktop.App.Runtime.win-x64 was not downloaded

I am trying to set up a build on an Azure DevOps server, but my build is failing with the following message:
C:\TFS_bldagent\_work\_tool\dotnet\sdk\3.1.301\Sdks\Microsoft.NET.Sdk\targets\Microsoft.NET.Sdk.FrameworkReferenceResolution.targets(317,5): Error NETSDK1112: The runtime pack for Microsoft.WindowsDesktop.App.Runtime.win-x64 was not downloaded. Try running a NuGet restore with the RuntimeIdentifier 'win-x64'.
In the YAML file I have added a script to run NuGet restore with the identifier 'win-x64', and the YAML file looks like this:
variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- task: NuGetToolInstaller@0
- task: UseDotNet@2
  inputs:
    version: '5.0.x'
    includePreviewVersions: true # Required for preview versions
- task: UseDotNet@2
  inputs:
    version: '3.0.x'
    packageType: runtime
- script: :dotnet restore -r 'win-x64'
- task: NuGetCommand@2
  inputs:
    command: 'restore'
    restoreSolution: '**/*.sln'
    feedsToUse: 'select'
    nugetConfigPath: 'nuget.config'
- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
- task: VSTest@2
  inputs:
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
Has anyone had a similar issue before?
I am using one package in my solution that is not compatible with .NET Core and is restored using .NET Framework at the beginning, but I don't think this is causing the issue. The target framework of the app is netcoreapp3.1.
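Two things that could explain NETSDK1112 here, offered as hedged observations rather than a confirmed fix: first, in the step `- script: :dotnet restore -r 'win-x64'`, the leading colon makes cmd treat the line as a label, so that RID-aware restore may never actually run; second, the later NuGetCommand restore runs without a RuntimeIdentifier, so the assets it produces will not include the win-x64 runtime pack unless the project declares it (for example `<RuntimeIdentifiers>win-x64</RuntimeIdentifiers>` in the .csproj). A sketch of the restore step with the typo removed (MyApp.sln is a placeholder for the actual solution file):

- script: dotnet restore MyApp.sln -r win-x64  # sketch only; point it at the real solution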

Generate SQL Server schema change script on Azure DevOps pipeline

I am trying to allow a pipeline to publish a schema change to an on-premises SQL Server 2017 instance, but I want to do that in two steps:
Generate the schema change script
After approval, publish it
I know that this can be achieved when publishing to SQL Azure by setting deploymentAction: 'Script' and then deploymentAction: 'Publish'.
Is there a way to publish to an on-premises SQL Server in a similar way? I have tried the SqlDacpacDeploymentOnMachineGroup task, but it does not seem possible to do it in two steps with that task.
I have finally managed to implement SQL schema script generation for database changes, followed by publishing those changes (after approval). Some remarks:
This won't work if the changes would cause data loss.
The path to sqlpackage.exe is only correct when Visual Studio 2019 is installed, as in the windows-2019 images.
A package.dacpac was previously generated by building the .sqlproj project(s).
I passed the following variables through a variable group (more information on how to create group variables here):
targetDBConnectionString
servername
databasename
adminlogin
adminPassword
I have added an approval to the ApplyChanges stage (within the Pipelines menu, choose Environments, then the ApplyChanges environment, and then Approvals and checks from the three-dots button at the top right corner). This way the changes are not applied to the database before manual approval takes place.
- stage: VerifyScript
  displayName: 'Script database schema changes'
  dependsOn:
  - Build
  jobs:
  - deployment: VerifyScript
    pool:
      vmImage: 'windows-2019'
    variables:
    - group: 'Timeline CV - Release'
    environment: 'scriptverification'
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: dropDacpac
            patterns: '**/*'
          - task: CmdLine@2
            displayName: 'Generate schema changes script'
            inputs:
              script: |
                "c:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\140\sqlpackage.exe" ^
                /action:script ^
                /diagnostics:true ^
                /sourcefile:$(Pipeline.Workspace)\dropDacpac\path\to\the\dacpacFile\package.dacpac ^
                /targetConnectionString:$(targetDBConnectionString) ^
                /outputpath:$(Build.StagingDirectory)\changesScript.sql
          - task: PublishPipelineArtifact@1
            inputs:
              targetPath: '$(Build.StagingDirectory)'
              artifactName: dropSqlSchemaChangesScript
            condition: succeededOrFailed()
          - task: PowerShell@2
            displayName: Show Auto Generated SQL Script
            inputs:
              targetType: 'inline'
              script: |
                Write-Host "Auto Generated SQL Update Script:"
                Get-Content $(Build.StagingDirectory)\changesScript.sql | foreach {Write-Output $_}

- stage: ApplyChanges
  displayName: 'Apply database schema changes'
  dependsOn: VerifyScript
  jobs:
  - deployment: ApplyChanges
    pool:
      vmImage: 'windows-2019'
    variables:
    - group: 'Timeline CV - Release'
    environment: 'applyChanges'
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: dropSqlSchemaChangesScript
          - task: SqlDacpacDeploymentOnMachineGroup@0
            displayName: 'Deploy SQL schema changes script'
            inputs:
              taskType: 'sqlQuery'
              sqlFile: '$(Pipeline.Workspace)\dropSqlSchemaChangesScript\changesScript.sql'
              targetMethod: 'server'
              authScheme: 'sqlServerAuthentication'
              serverName: '$(servername)'
              databaseName: '$(databasename)'
              sqlUsername: '$(adminlogin)'
              sqlPassword: '$(adminPassword)'
I don't think there is a built-in task that can do this for you. However, SqlPackage.exe has the option. Take a look at this post: http://diegogiacomelli.com.br/azure-pipelines-generating-db-script/. It describes using a command-line task to generate the SQL script.
Once you have the script, you can either use SqlDacpacDeploymentOnMachineGroup to publish it (although it won't reuse the already generated script), or write a PowerShell script that runs the SQL script against the database. An example of such a script can be found here: https://careers.centric.eu/nl/blog/custom-azure-devops-pipeline-task-execute-sql-script/
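For completeness, a minimal sketch of that last option as a pipeline step (assumptions: the SqlServer PowerShell module is available on the agent, and the same placeholder variables used earlier in this thread supply the connection details):

- task: PowerShell@2
  displayName: 'Run generated schema changes script'
  inputs:
    targetType: 'inline'
    script: |
      # Requires the SqlServer module on the agent (Install-Module SqlServer)
      Invoke-Sqlcmd -ServerInstance "$(servername)" -Database "$(databasename)" `
        -Username "$(adminlogin)" -Password "$(adminPassword)" `
        -InputFile "$(Pipeline.Workspace)\dropSqlSchemaChangesScript\changesScript.sql"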

Azure DevOps pipeline get DACPAC from Solution with Asp.Net MVC + Database Projects

I have a Visual Studio solution (.sln) with 2 projects:
An ASP.NET MVC project (the main one), which produces a zip with all the files to publish to IIS
A Database Project, which produces a .dacpac file to publish to SQL Server
I generated a pipeline to produce the artifacts of the Web and DB projects, but it doesn't include the .dacpac file.
This is my YAML:
trigger:
- master

pool:
  vmImage: 'windows-latest'

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- task: NuGetToolInstaller@1
- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'
- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation="$(build.artifactStagingDirectory)"'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
- task: VSTest@2
  inputs:
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
- task: CopyFiles@1
  inputs:
    Contents: '*.dacpac'
    TargetFolder: '$(build.artifactStagingDirectory)'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
    publishLocation: 'Container'
I tried to add the Database project as a dependency of the Web project, but I get the same result.
I'm not sure whether the .dacpac is even being generated.
I tried to copy it manually (as you can see in the YAML), but it is not found.
Any idea?
Solved by changing the source folder and contents on the CopyFiles task:
- task: CopyFiles@1
  inputs:
    SourceFolder: '$(System.DefaultWorkingDirectory)'
    Contents: 'it-EMDB/bin/$(BuildConfiguration)/*'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
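An alternative, if you would rather not hard-code the database project's output folder: Contents in CopyFiles uses minimatch patterns, and a plain '*.dacpac' only matches the root of the source folder, which is why the original step found nothing. A recursive pattern picks up the .dacpac wherever the build drops it (a sketch, assuming there are no unwanted .dacpac files elsewhere in the sources):

- task: CopyFiles@1
  inputs:
    SourceFolder: '$(System.DefaultWorkingDirectory)'
    Contents: '**/*.dacpac'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'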

Is it possible to create an apk in Ionic 4 - (React) via CLI only?

I want to generate an apk for an Ionic React app using only the command line.
I am able to use Ionic Capacitor to make an apk via Android Studio, but I want to use purely the CLI to generate the same apk. Is this possible to do? Thanks.
I have since solved this issue and am now able to build an Ionic React app apk on Azure DevOps without having to interact with Android Studio at all.
Here is the YAML.
First, building the Ionic project:
trigger:
- master

variables:
  scheme: ''
  sdk: 'iphoneos'
  configuration: 'Release'

pool:
  vmImage: 'macos-latest'

steps:
- task: UseNode@1
  inputs:
    checkLatest: true
- task: Npm@1
  inputs:
    command: 'install'
- script: npm run citest
- task: Npm@1
  inputs:
    command: 'custom'
    customCommand: 'install -g ionic'
- task: CmdLine@2
  inputs:
    script: 'ionic build --prod'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: 'build'
    ArtifactName: 'drop'
    publishLocation: 'Container'
Second, building the .apk:
- task: CmdLine@2
  inputs:
    script: 'ionic capacitor add android'
- task: PythonScript@0
  inputs:
    scriptSource: 'filePath'
    scriptPath: 'changeGradle.py'
    arguments: '--version=$(Build.BuildId)'
- task: CopyFiles@2
  inputs:
    SourceFolder: 'resources/androidResources/res'
    Contents: '**'
    TargetFolder: 'android/app/src/main/res'
    OverWrite: true
- task: Gradle@2
  inputs:
    workingDirectory: 'android'
    gradleWrapperFile: 'android/gradlew'
    gradleOptions: '-Xmx3072m'
    javaHomeOption: 'JDKVersion'
    jdkVersionOption: '1.8'
    jdkArchitectureOption: 'x64'
    publishJUnitResults: true
    testResultsFiles: '**/TEST-*.xml'
    tasks: 'build'
- task: AndroidSigning@3
  inputs:
    apkFiles: 'android/app/build/outputs/apk/release/*.apk'
    apksignerKeystoreFile: XXX.jks
    apksignerKeystorePassword: 'XXX'
    apksignerKeystoreAlias: 'XXX'
    apksignerKeyPassword: 'XXX'
    zipalign: true
- task: CopyFiles@2
  inputs:
    Contents: '**/*'
    TargetFolder: '$(build.artifactStagingDirectory)'
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Build.artifactStagingDirectory)'
    publishLocation: 'pipeline'
Publishing the .apk to the Google Play internal track is handled in the Azure release pipeline (not the build).
The PythonScript step runs a custom script I wrote to change the Gradle build number, as no other solution seemed to work for me.
I hope this helps someone, as this was a nightmare to figure out on my own!
When you install and configure Ionic for Android, you need to follow this guide:
https://ionicframework.com/docs/installation/android
Then you can build your hybrid apps:
https://ionicframework.com/docs/publishing/play-store
For apps that leverage Cordova you need to have Android Studio and the SDK installed, but you do not have to use them directly; the Ionic CLI will do it for you when you use commands like:
$ ionic cordova build android --prod --release
For apps using Capacitor there is currently no way to avoid using Xcode or Android Studio:
https://capacitor.ionicframework.com/docs/basics/building-your-app
