I am trying to allow a pipeline to publish a schema change to an on-premises SQL Server 2017 instance, but I want to do that in two steps:
Generate schema change script action
After approval, publish
I know this can be achieved when publishing to Azure SQL by setting deploymentAction: 'Script' and then deploymentAction: 'Publish'.
Is there a way to publish to an on-premises SQL Server in a similar way? I have tried the SqlDacpacDeploymentOnMachineGroup task, but it does not seem possible to do it in two steps with that task.
I have finally managed to implement SQL schema generation with database changes, followed by publishing those changes (after approval). Some remarks:
This won't work if the changes cause data loss.
The path to sqlpackage.exe is only correct when Visual Studio 2019 is installed, as in the windows-2019 images.
The package.dacpac file was generated beforehand by building the .sqlproj project(s).
I passed the following variables through a group variable (more information on how to create group variables here):
targetDBConnectionString
servername
databasename
adminlogin
adminPassword
I have added an approval to the ApplyChanges stage (within the Pipelines menu, choose Environments, then the ApplyChanges environment, and then Approvals and checks from the three-dots button in the top right corner). This way the changes are not applied to the database before manual approval takes place.
stages:
- stage: VerifyScript
  displayName: 'Script database schema changes'
  dependsOn:
  - Build
  jobs:
  - deployment: VerifyScript
    pool:
      vmImage: 'windows-2019'
    variables:
    - group: 'Timeline CV - Release'
    environment: 'scriptverification'
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: dropDacpac
            patterns: '**/*'
          - task: CmdLine@2
            displayName: 'Generate schema changes script'
            inputs:
              script: |
                "c:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\140\sqlpackage.exe" ^
                /action:script ^
                /diagnostics:true ^
                /sourcefile:$(Pipeline.Workspace)\dropDacpac\path\to\the\dacpacFile\package.dacpac ^
                /targetConnectionString:$(targetDBConnectionString) ^
                /outputpath:$(Build.StagingDirectory)\changesScript.sql
          - task: PublishPipelineArtifact@1
            inputs:
              targetPath: '$(Build.StagingDirectory)'
              artifactName: dropSqlSchemaChangesScript
            condition: succeededOrFailed()
          - task: PowerShell@2
            displayName: Show Auto Generated SQL Script
            inputs:
              targetType: 'inline'
              script: |
                Write-Host "Auto Generated SQL Update Script:"
                Get-Content $(Build.StagingDirectory)\changesScript.sql | foreach {Write-Output $_}

- stage: ApplyChanges
  displayName: 'Apply database schema changes'
  dependsOn: VerifyScript
  jobs:
  - deployment: ApplyChanges
    pool:
      vmImage: 'windows-2019'
    variables:
    - group: 'Timeline CV - Release'
    environment: 'applyChanges'
    strategy:
      runOnce:
        deploy:
          steps:
          - download: current
            artifact: dropSqlSchemaChangesScript
          - task: SqlDacpacDeploymentOnMachineGroup@0
            displayName: 'Deploy SQL schema changes script'
            inputs:
              taskType: 'sqlQuery'
              sqlFile: '$(Pipeline.Workspace)\dropSqlSchemaChangesScript\changesScript.sql'
              targetMethod: 'server'
              authScheme: 'sqlServerAuthentication'
              serverName: '$(servername)'
              databaseName: '$(databasename)'
              sqlUsername: '$(adminlogin)'
              sqlPassword: '$(adminPassword)'
I don't think there is a built-in task that can do this for you, but SqlPackage.exe has the option. Take a look at this post: http://diegogiacomelli.com.br/azure-pipelines-generating-db-script/. It describes using a command line task to generate the SQL script.
Once you have the script, you can either use SqlDacpacDeploymentOnMachineGroup to publish it (although it won't reuse the already generated script), or write a PowerShell script that publishes the SQL script to the database. An example of such a script can be found here: https://careers.centric.eu/nl/blog/custom-azure-devops-pipeline-task-execute-sql-script/
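As a rough sketch of that second option's idea, the generated script can also be run from a plain command-line step with sqlcmd, assuming sqlcmd is installed on the agent or target machine. All server, database, and credential values below are placeholders, not values from this post:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: build the sqlcmd invocation that would run a
# pre-generated SQL script. All argument values are placeholders.
run_sql_script() {
  server="$1"; database="$2"; user="$3"; password="$4"; file="$5"
  # -b makes sqlcmd return a non-zero exit code when the script raises
  # a SQL error, so the pipeline step fails instead of silently passing.
  echo sqlcmd -S "$server" -d "$database" -U "$user" -P "$password" -i "$file" -b
}

# Dry run: print the command that would be executed.
run_sql_script "myserver" "MyDb" "sa" "s3cret" "changesScript.sql"
```

Dropping the echo would actually execute the command; in a pipeline the placeholder values would come from the variable group.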
Description
I'm trying out the service containers for integrated database tests in Azure DevOps pipelines.
As per this open-sourced dummy CI/CD pipeline project https://dev.azure.com/funktechno/_git/dotnet%20ci%20pipelines, I was experimenting with Azure DevOps service containers for integrated pipeline testing. I got Postgres and MySQL to work, but I'm having issues with Microsoft SQL Server.
yml file
resources:
  containers:
  - container: mssql
    image: mcr.microsoft.com/mssql/server:2017-latest
    ports:
      # - 1433
      - 1433:1433
    options: -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=yourStrong(!)Password' -e 'MSSQL_PID=Express'

- job: unit_test_db_mssql
  # condition: eq('${{ variables.runDbTests }}', 'true')
  # continueOnError: true
  pool:
    vmImage: 'ubuntu-latest'
  services:
    localhostsqlserver: mssql
  steps:
  - task: UseDotNet@2
    displayName: 'Use .NET Core sdk 2.2'
    inputs:
      packageType: sdk
      version: 2.2.207
      installationPath: $(Agent.ToolsDirectory)/dotnet
  - task: NuGetToolInstaller@1
  - task: NuGetCommand@2
    inputs:
      restoreSolution: '$(solution)'
  - task: Bash@3
    inputs:
      targetType: 'inline'
      script: 'env | sort'
      # echo Write your commands here...
      # echo ${{agent.services.localhostsqlserver.ports.1433}}
      # echo Write your commands here end...
  - task: CmdLine@2
    displayName: 'enabledb'
    inputs:
      script: |
        cp ./MyProject.Repository.Test/Data/appSettings.devops.mssql.json ./MyProject.Repository.Test/Data/AppSettings.json
  - task: DotNetCoreCLI@2
    inputs:
      command: 'test'
      workingDirectory: MyProject.Repository.Test
      arguments: '--collect:"XPlat Code Coverage"'
  - task: PublishCodeCoverageResults@1
    inputs:
      codeCoverageTool: 'Cobertura'
      summaryFileLocation: '$(Agent.TempDirectory)\**\coverage.cobertura.xml'
db connection string
{
  "sqlserver": {
    "ConnectionStrings": {
      "Provider": "sqlserver",
      "DefaultConnection": "User ID=sa;Password=yourStrong(!)Password;Server=localhost;Database=mockDb;Pooling=true;"
    }
  }
}
Debugging
The most I can tell is that when I run the Bash@3 task to check the environment variables, Postgres and MySQL print something similar to:
/bin/bash --noprofile --norc /home/vsts/work/_temp/b9ec7d77-4bc2-47ab-b767-6a5e95ec3ea6.sh
"id": "b294d39b9cc1f0d337bdbf92fb2a95f0197e6ef78ce28e9d5ad6521496713708"
"pg11": {
}
while mssql fails to print an id:
========================== Starting Command Output ===========================
/bin/bash --noprofile --norc /home/vsts/work/_temp/70ae8517-5199-487f-9067-aee67f8437bb.sh
}
}
Update: this doesn't happen when using ubuntu-latest, but I still have the mssql connection issue.
Database Error logging.
I'm currently getting this error in the pipeline:
Error Message:
Failed for sqlserver
providername:Microsoft.EntityFrameworkCore.SqlServer m:A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 40 - Could not open a connection to SQL Server)
In TestDbContext.cs my error handling looks like this. If people have pointers for getting more details, it would be appreciated.
catch (System.Exception e)
{
    var assertMsg = "Failed for " + connectionStrings.Provider + "\n" + " providername:" + dbContext.Database.ProviderName + " m:";
    if (e.InnerException != null)
        assertMsg += e.InnerException.Message;
    else
        assertMsg += e.Message;
    _exceptionMessage = assertMsg;
}
Example pipeline: https://dev.azure.com/funktechno/dotnet%20ci%20pipelines/_build/results?buildId=73&view=logs&j=ce03965c-7621-5e79-6882-94ddf3daf982&t=a73693a5-1de9-5e3d-a243-942c60ab4775
Notes
I already know that the Azure DevOps pipeline SQL Server service container doesn't work on the Windows agents, because they run Windows Server 2019 and the Windows container version of SQL Server is not well supported (it only works on Windows Server 2016). It fails on the Initialize containers step when I try that.
I tried several things for the unit_test_db_mssql job: changing the Ubuntu version, changing parameters, changing the SQL Server version. All give about the same error.
If people know of command-line methods that work on Linux to test whether the mssql docker instance is ready, that may help as well.
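One common approach (a sketch, not something from this post) is to poll the container with a probe until it answers, rather than sleeping a fixed time. On Linux the probe would typically be something like `/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P "$SA_PASSWORD" -Q "SELECT 1"` (the tools path is an assumption; adjust to your image). A generic retry loop:

```shell
#!/usr/bin/env bash
# Sketch: retry a probe command until it succeeds or attempts run out.
# Usage: wait_for <max_attempts> <probe command...>
wait_for() {
  attempts="$1"; shift
  i=1
  while [ "$i" -le "$attempts" ]; do
    # Run the probe, discarding its output; success ends the wait.
    if "$@" > /dev/null 2>&1; then
      echo "ready after $i attempt(s)"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "not ready after $attempts attempts" >&2
  return 1
}
```

In a pipeline step this would be called as, for example, `wait_for 30 /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P "$SA_PASSWORD" -Q "SELECT 1"`, which is more robust than guessing a fixed delay.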
Progress Update
Got mssql working in GitHub Actions and GitLab pipelines so far.
Postgres and MySQL work just fine on Azure DevOps and Bitbucket pipelines.
Still no luck getting mssql working on Azure DevOps. Bitbucket wouldn't work either, although it did give a lot of docker details about the running database; but then again nobody really cares about Bitbucket pipelines with their measly 50 minutes. You can see all of these pipelines in the referenced repository; they are all public, and pull requests are also possible since it's open-sourced.
Per the help from Starain Chen [MSFT] in https://developercommunity.visualstudio.com/content/problem/1159426/working-examples-using-service-container-of-sql-se.html, it looks like a 10-second delay is needed to wait for the container to be ready.
Adding

- task: PowerShell@2
  displayName: 'delay 10'
  inputs:
    targetType: 'inline'
    script: |
      # Write your PowerShell commands here.
      start-sleep -s 10

gets the db connection to work. I'm assuming the mssql docker container is ready by then.
I ran into this issue over the past several days and came upon your post. I was getting the same behavior and then something clicked.
IMPORTANT NOTE: If you are using PowerShell on Windows to run these commands use double quotes instead of single quotes.
That note is from https://hub.docker.com/_/microsoft-mssql-server.
I believe that changing your line from
options: -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=yourStrong(!)Password' -e 'MSSQL_PID=Express'
to
options: -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=yourStrong(!)Password" -e "MSSQL_PID=Express"
should get it to work. I think it's also interesting that in the link you posted above the password is passed in as its own line, which probably resolves the issue, not necessarily the 10-second delay (example below):
- container: mssql
  image: mcr.microsoft.com/mssql/server:2017-latest
  env:
    ACCEPT_EULA: Y
    SA_PASSWORD: Password123
    MSSQL_PID: Express
I am trying to set up a build on an Azure DevOps server, but my build is failing with the following message:
C:\TFS_bldagent\_work\_tool\dotnet\sdk\3.1.301\Sdks\Microsoft.NET.Sdk\targets\Microsoft.NET.Sdk.FrameworkReferenceResolution.targets(317,5): Error NETSDK1112: The runtime pack for Microsoft.WindowsDesktop.App.Runtime.win-x64 was not downloaded. Try running a NuGet restore with the RuntimeIdentifier 'win-x64'.
In the yaml file I have added a script to run nuget restore with the identifier 'win-x64', and the yaml file looks like this:
variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- task: NuGetToolInstaller@0
- task: UseDotNet@2
  inputs:
    version: '5.0.x'
    includePreviewVersions: true # Required for preview versions
- task: UseDotNet@2
  inputs:
    version: '3.0.x'
    packageType: runtime
- script: dotnet restore -r win-x64
- task: NuGetCommand@2
  inputs:
    command: 'restore'
    restoreSolution: '**/*.sln'
    feedsToUse: 'select'
    nugetConfigPath: 'nuget.config'
- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
- task: VSTest@2
  inputs:
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
Has anyone had a similar issue before?
I am using one package in my solution that is not compatible with .NET Core and is restored using .NET Framework at the beginning, but I don't think this is causing the issue. The target framework of the app is netcoreapp3.1.
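For what it's worth, the NETSDK1112 error can often be addressed by declaring the runtime identifier in the project file itself, so restore downloads the matching Microsoft.WindowsDesktop.App runtime pack. A hedged sketch (placement and surrounding properties are assumptions; verify against your own .csproj):

```xml
<!-- In the .csproj of the netcoreapp3.1 desktop app (sketch; adjust to your project) -->
<PropertyGroup>
  <TargetFramework>netcoreapp3.1</TargetFramework>
  <RuntimeIdentifiers>win-x64</RuntimeIdentifiers>
</PropertyGroup>
```

With the identifier declared, `dotnet restore` (or the NuGet restore step) knows to pull the win-x64 runtime pack without needing the extra `-r` flag.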
I'm currently migrating our CI/CD pipeline from a Bitbucket/Jenkins environment to hosted GitLab with additional custom gitlab-ci runners. So far anything seems fine, except when it comes down to services, especially regarding MSSQL server.
I've setup a gitlab-ci.yml file, which contains a service and a build stage job which basically just executes some msbuild targets.
I call the AttachDatabase target which then internally connects to the database and prepares anything for unittesting. Unfortunately I'm not able to connect to the database, whether I alias the service or not.
According to the documentation, I should just be able to use the alias name defined in services in my connection string (found in Library.Build.Database.targets) to connect to the database.
I've set up a small reference project which illustrates the problem: mssql-test.
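For reference, a connection string using that alias would point at the alias host name rather than localhost. A sketch (the database name is a placeholder; the sa password is the sa_password value from the yml below):

```
Server=mssql;Database=UnitTestDb;User ID=sa;Password=Unit-T3st3r;
```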
If the pipeline is run the following error message is shown in the log:
error : A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections
image: "mcr.microsoft.com/dotnet/framework/sdk:4.8-windowsservercore-ltsc2019"

variables:
  PROJ_NAME: MSSQL.Test.proj
  MSBUILD_BIN: 'C:\Program Files (x86)\Microsoft Visual Studio\2019\BuildTools\MSBuild\Current\Bin\msbuild.exe'
  NUGET_BIN: 'C:\Program Files\NuGet\nuget.exe'
  ACCEPT_EULA: 'Y'
  sa_password: Unit-T3st3r

services:
  - name: advitec/mssql-server-windows-developer
    alias: mssql

attachdatabase:
  stage: build
  tags:
    - windows-1809
    - 3volutions
    - docker-windows
  cache:
    paths:
      - packages
  before_script:
    - cmd /C "$NUGET_BIN" restore .\packages.config -OutputDirectory .\packages
  allow_failure: false
  script:
    - cmd /C "$MSBUILD_BIN" "$PROJ_NAME" -t:AttachDatabase -v:Minimal "-p:Configuration=ReleaseForTesting;UniqueBuildNumber=$CI_PIPELINE_IID"
I'm running a custom Windows GitLab runner (just for performance reasons); below is the corresponding config.toml.
Runner:
concurrent = 1
check_interval = 0

[session_server]
  session_timeout = 1800

[[runners]]
  name = "gitlab-runner-02-windows-server-datacenter-1809"
  url = "https://gitlab.com/"
  token = "****"
  executor = "docker-windows"
  [runners.custom_build_dir]
  [runners.cache]
    [runners.cache.s3]
    [runners.cache.gcs]
  [runners.docker]
    tls_verify = false
    image = "mcr.microsoft.com/dotnet/framework/sdk:4.8-windowsservercore-ltsc2019"
    privileged = false
    disable_entrypoint_overwrite = false
    oom_kill_disable = false
    disable_cache = false
    volumes = ["c:\\cache"]
    shm_size = 0
Any ideas what I'm missing?
Cheers
I have a Visual Studio solution (.sln) with 2 projects:
ASP.NET MVC (the main project), which produces a zip with all the files to publish to IIS
Database project, which produces a .dacpac file to publish to SQL Server
I generated a pipeline to build the artifacts of the web and DB projects, but it doesn't include the .dacpac file.
this is my yml:
trigger:
- master

pool:
  vmImage: 'windows-latest'

variables:
  solution: '**/*.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'

steps:
- task: NuGetToolInstaller@1
- task: NuGetCommand@2
  inputs:
    restoreSolution: '$(solution)'
- task: VSBuild@1
  inputs:
    solution: '$(solution)'
    msbuildArgs: '/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation="$(build.artifactStagingDirectory)"'
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
- task: VSTest@2
  inputs:
    platform: '$(buildPlatform)'
    configuration: '$(buildConfiguration)'
- task: CopyFiles@1
  inputs:
    Contents: '*.dacpac'
    TargetFolder: '$(build.artifactStagingDirectory)'
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
    publishLocation: 'Container'
I tried to add the database project as a dependency of the web project, but I get the same result.
I'm not sure if the .dacpac is even being generated.
I tried to copy it manually (as you can see in the yml) but it was not found.
Any idea?
Solved by changing the source folder and contents of the CopyFiles task:
- task: CopyFiles@1
  inputs:
    SourceFolder: '$(System.DefaultWorkingDirectory)'
    Contents: 'it-EMDB/bin/$(BuildConfiguration)/*'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
I just set up a new project (.NET Core, netcoreapp2.1) using xUnit tests for UI test automation with Selenium. I can run the tests via build and release pipelines on DevOps, but I cannot get the test results on the Tests tab.
Now I'm wondering: how do I get XML reports of my test runs? On the release pipeline I have a Publish Test Results task, but results are not getting published, with the error below.
No Result Found to Publish 'D:\a\r1\a\Global Platform-QA\drop\TestResults\TEST.XML'.
2019-06-27T02:48:16.2148676Z No build level attachments to publish.
I have tried changing the test result format to JUnit, but I am still missing something. I have added an empty TEST.XML to the test results folder as well, but still haven't been able to figure out the missing link.
Below is the yaml for the build and release pipelines on devops.
steps:
- task: DotNetCoreCLI@2
  displayName: Restore
  inputs:
    command: restore
    projects: '**/*.csproj'
- task: DotNetCoreCLI@2
  displayName: Build
  inputs:
    projects: '**/*.csproj'
    arguments: '--configuration $(BuildConfiguration)'
- task: DotNetCoreCLI@2
  displayName: Publish
  inputs:
    command: publish
    publishWebProjects: false
    projects: '$(Parameters.RestoreBuildProjects)'
    arguments: '--configuration $(BuildConfiguration) --output $(build.artifactstagingdirectory)'
    zipAfterPublish: false
    modifyOutputPath: false
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'
  condition: succeededOrFailed()
Release pipeline:
steps:
- task: DotNetCoreCLI@2
  displayName: CVProSmokeTest
  inputs:
    command: custom
    projects: '**/CVProSmokeTest.dll'
    custom: vstest
    arguments: '--logger:trx;logfilename=TEST.xml'
    workingDirectory: '$(System.DefaultWorkingDirectory)'
  continueOnError: true
  condition: succeededOrFailed()
  timeoutInMinutes: 20

steps:
- task: PublishTestResults@2
  displayName: 'Publish Test Results'
  inputs:
    testResultsFormat: XUnit
    testResultsFiles: '**/TEST.xml'
    mergeTestResults: true
    testRunTitle: Selenium
  condition: succeededOrFailed()
  timeoutInMinutes: 20
One of my colleagues helped me fix the pipeline. Here are the modified steps: before and after the dotnet CLI task, a bash script was added to list files.
Pre-build, list files:
steps:
- bash: |
    pwd
    ls -alR
  workingDirectory: '$(System.DefaultWorkingDirectory)'
  displayName: 'Pre-Build List Files'
After the build task and before publish, post-build list files:
steps:
- bash: |
    pwd
    ls -alR
    cat */test-results.xml
  workingDirectory: '$(System.DefaultWorkingDirectory)'
  displayName: 'Post-Build List Files'
The results are published and can be viewed under the Tests tab.
https://developercommunity.visualstudio.com/content/problem/624719/publishing-xunit-test-results-to-azure-devops.html