I am trying to automate my testing using Nuke Build on GitHub. I am using Docker to run a SQL Server container for testing. When the build runs on GitHub Actions, the following error occurs. The image is created successfully; the error occurs when the tests try to access the database.
Error message:
Microsoft.Data.SqlClient.SqlException : Cannot open database "MyDB" requested by the login. The login failed.
Login failed for user 'sa'
It runs successfully locally on my Windows 10 computer. I also tried using an Azure DevOps pipeline, and the same error occurs. I tried looking around at similar issues such as this, but they are not specific to running on GitHub.
My .yaml script:
name: ci
on:
push:
branches:
- main
- feature/*
pull_request:
branches:
- feature/*
jobs:
ubuntu-latest:
name: ubuntu-latest
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v1
- name: Run './build.cmd Test'
run: ./build.cmd Test
Snippets from my Build.cs, modeled on this:
Target RunDbContainer => _ => _
.Before(Compile)
.Executes(() =>
{
DockerTasks.DockerRun(x => x
.SetImage("mcr.microsoft.com/mssql/server:2019-latest")
.SetEnv(new string[] { "ACCEPT_EULA=Y", "SA_PASSWORD=Password_01" })
.SetPublish("1455:1433")
.SetDetach(true)
.SetName("sql1")
.SetHostname("sql1"));
System.Threading.Thread.Sleep(10000); // give the container time to start
});
Target Test => _ => _
.Triggers(UnitTest, RunDbContainer, IntegrationTest, SqlServerContainerStop);
The database is created and deleted during testing using EF Core.
A connection string is used to create a DbContext. Then context.Database.EnsureCreated() is used to create the database.
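As a minimal sketch of that pattern (the MyDbContext name here is hypothetical):
var options = new DbContextOptionsBuilder<MyDbContext>()
    .UseSqlServer(connectionString)
    .Options;
using (var context = new MyDbContext(options))
{
    context.Database.EnsureCreated(); // creates MyDb if it does not already exist
}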
For the connection string, I tried using both localhost and the container's IP address:
"Server=127.17.0.2,1455;Database=MyDb;User Id=sa;Password=Password_01;MultipleActiveResultSets=True;Trusted_Connection=False;"
"Server=localhost,1455;Database=MyDb;User Id=sa;Password=Password_01;MultipleActiveResultSets=True;Trusted_Connection=False;"
I got the IP address following these instructions:
docker inspect -f '{{range.NetworkSettings.Networks}}{{.IPAddress}}{{end}}' sql1
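For what it's worth, a readiness poll in the Nuke target is usually more reliable than a fixed Thread.Sleep, since SQL Server in a fresh container can take well over ten seconds before it accepts logins on a CI agent. A minimal sketch, assuming Microsoft.Data.SqlClient is referenced and polling master (MyDb does not exist until EnsureCreated runs):
using Microsoft.Data.SqlClient;

static void WaitForSqlServer(string connectionString, int maxAttempts = 30)
{
    for (var attempt = 1; attempt <= maxAttempts; attempt++)
    {
        try
        {
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open(); // fails until SQL Server is ready to accept logins
                return;
            }
        }
        catch (SqlException)
        {
            System.Threading.Thread.Sleep(2000); // wait and retry
        }
    }
    throw new Exception("SQL Server container did not become ready in time.");
}

// Usage, e.g. in place of the Thread.Sleep call:
// WaitForSqlServer("Server=localhost,1455;Database=master;User Id=sa;Password=Password_01;");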
Related
I'm trying to deploy my Sanity React app; here's the repo:
https://github.com/sebastian-meckovski/sanity-admin-area
here's my yml file:
trigger:
- main
- feature/*
pool:
vmImage: 'ubuntu-latest'
variables:
SANITY_AUTH_TOKEN: $(SanityDeployAPIToken)
SANITY_STUDIO_PROJECT_BASEPATH: /studio
steps:
- task: NodeTool@0
inputs:
versionSpec: '14.x'
- task: CmdLine@2
inputs:
script: |
cd src/Studio
npm install -g @sanity/cli
npm install
sanity deploy
I've been using this article for help: https://blog.novacare.no/deploy-sanity-studio-with-azure-devops/
I'm getting this error from the CmdLine task in Azure:
Your project has not been assigned a studio hostname.
To deploy your Sanity Studio to our hosted Sanity.Studio service,
you will need one. Please enter the part you want to use.
any ideas?
Sanity Deployment in Azure Pipeline
That is because you are asked to choose a unique hostname for your Studio when you first deploy Sanity.
Choosing the hostname is an interactive prompt that requires input from the developer.
But the hosted agent runs in service mode instead of interactive mode. Please check the document Azure Pipelines agents for more details.
In that case, the prompt never appears on the hosted agent, which is why you get the error that the project has not been assigned a studio hostname.
To resolve this issue, you need to create your own private agent and set it to interactive mode.
Self-hosted Linux agents
Or you could simply deploy from your local machine with the command line, so that you can assign a studio hostname.
Of course, a more complete solution would be for Sanity to let you pre-specify the hostname somewhere (a JSON-like config file) and read it at deploy time, so it never has to be entered interactively. But I'm not a Sanity expert, and it's not clear whether such an option exists.
I cannot get a Release pipeline in Azure DevOps to successfully deploy build files from a React app to an Azure App Service.
This is the YAML file for the app:
trigger:
- main
variables:
buildConfiguration: 'Release'
stages:
- stage: Build
displayName: 'Build my web application'
jobs:
- job: 'Build'
displayName: 'Build job'
pool:
vmImage: ubuntu-latest
demands:
- npm
steps:
- task: NodeTool@0
inputs:
versionSpec: '16.x'
displayName: 'Install Node.js'
- script: |
npm install
npm run build
displayName: 'npm install and build'
- task: PublishBuildArtifacts@1
inputs:
PathtoPublish: 'build'
ArtifactName: 'drop'
publishLocation: 'Container'
displayName: 'Build artifact'
As you'd expect, this puts the resultant build files in 'drop'. I can confirm this by inspecting the contents of 'drop' as it is a Published Artifact I can click on in the Summary tab for the Build process.
It's the Release that fails. This is the log for the release:
2022-03-28T11:29:39.9940600Z ##[section]Starting: Azure Web App Deploy: my-app-serv
2022-03-28T11:29:39.9952321Z ==============================================================================
2022-03-28T11:29:39.9952723Z Task : Azure Web App
2022-03-28T11:29:39.9953008Z Description : Deploy an Azure Web App for Linux or Windows
2022-03-28T11:29:39.9953295Z Version : 1.200.0
2022-03-28T11:29:39.9953540Z Author : Microsoft Corporation
2022-03-28T11:29:39.9953833Z Help : https://aka.ms/azurewebapptroubleshooting
2022-03-28T11:29:39.9954210Z ==============================================================================
2022-03-28T11:29:40.3697650Z Got service connection details for Azure App Service:'my-app-serv'
2022-03-28T11:29:42.3999385Z Package deployment using ZIP Deploy initiated.
2022-03-28T11:30:18.0663125Z Updating submodules.
2022-03-28T11:30:18.0670674Z Preparing deployment for commit id 'dc023bbe-d'.
2022-03-28T11:30:18.0672154Z Repository path is /tmp/zipdeploy/extracted
2022-03-28T11:30:18.0673178Z Running oryx build...
2022-03-28T11:30:19.1423345Z Command: oryx build /tmp/zipdeploy/extracted -o /home/site/wwwroot --platform nodejs --platform-version 16 -i /tmp/8da10ae4b1f9200 -p compress_node_modules=tar-gz --log-file /tmp/build-debug.log
2022-03-28T11:30:19.1431972Z Operation performed by Microsoft Oryx, https://github.com/Microsoft/Oryx
2022-03-28T11:30:19.1453191Z You can report issues at https://github.com/Microsoft/Oryx/issues
2022-03-28T11:30:19.1453685Z
2022-03-28T11:30:19.1454256Z Oryx Version: 0.2.20211207.1, Commit: 46633df49cc8fbe9718772a3c894df221273b2af, ReleaseTagName: 20211207.1
2022-03-28T11:30:19.1457307Z
2022-03-28T11:30:19.1463475Z Build Operation ID: |DTbD+7CrQyM=.49dfa157_
2022-03-28T11:30:19.1465355Z Repository Commit : dc023bbe-d46e-46f2-9d49-6e8157706c19
2022-03-28T11:30:19.1465695Z
2022-03-28T11:30:19.1466122Z Detecting platforms...
2022-03-28T11:30:19.1466558Z Could not detect any platform in the source directory.
2022-03-28T11:30:19.1467416Z Error: Couldn't detect a version for the platform 'nodejs' in the repo.
2022-03-28T11:30:19.1469069Z Error: Couldn't detect a version for the platform 'nodejs' in the repo.\n/opt/Kudu/Scripts/starter.sh oryx build /tmp/zipdeploy/extracted -o /home/site/wwwroot --platform nodejs --platform-version 16 -i /tmp/8da10ae4b1f9200 -p compress_node_modules=tar-gz --log-file /tmp/build-debug.log
2022-03-28T11:30:19.1469950Z Deployment Failed.
2022-03-28T11:30:19.1510175Z ##[error]Failed to deploy web package to App Service.
2022-03-28T11:30:19.1525344Z ##[error]To debug further please check Kudu stack trace URL : https://$my-app-serv:***@my-app-serv.scm.azurewebsites.net/api/vfs/LogFiles/kudu/trace
2022-03-28T11:30:19.1527823Z ##[error]Error: Package deployment using ZIP Deploy failed. Refer logs for more details.
2022-03-28T11:30:30.1233247Z Successfully added release annotation to the Application Insight : my-app-serv
2022-03-28T11:30:32.2997996Z Successfully updated deployment History at (CUT)
2022-03-28T11:30:34.0322983Z App Service Application URL: http://my-app-serv.azurewebsites.net
2022-03-28T11:30:34.0390276Z ##[section]Finishing: Azure Web App Deploy: my-app-serv
The Release uses Azure Web App Deploy. App Type is 'Web App on Linux'. 'Package or Folder' is the 'drop' folder. Runtime stack is '16 LTS (NODE|16-lts)' (but it also doesn't work if that's empty).
The drop folder does not contain zipped output. I don't understand why the Release operation is referred to as a Zip Deploy. Am I missing something to avoid the error 'Error: Couldn't detect a version for the platform 'nodejs' in the repo.'?
I'm just expecting the contents of the 'drop' folder to be copied to the App Service and the web app to run, so I can test it (and, in the long term, set up automated tests).
I've tried a number of different things with the Build, including zipping the build artifacts, with no luck. I don't think the build is the problem though, as the files in the 'drop' folder are the files I want copied.
So I think it's the Release that's the problem. But that looks so simple.
I start with an Agent and add an Azure Web App deployment task. It seems to successfully pick up the drop folder, as I've tried other values that show an obvious error when that is wrong. The target App Service is Linux, so the Web App Deploy App type is set to 'Web App on Linux'.
I've seen a few different approaches on Stack Overflow, but no answers to this approach. Maybe I'm going about this the wrong way, but on the surface it looks right; if I get this right, I can easily manage manual deployments, authorisations, etc. as supported by Releases.
Thanks in advance
One possible workaround you can try is to set SCM_DO_BUILD_DURING_DEPLOYMENT to false.
After setting this to false, you should be able to deploy the app.
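For example, the setting can be applied with the Azure CLI (the resource group name below is a placeholder; it can also be added under the App Service's Configuration > Application settings in the portal):
az webapp config appsettings set --resource-group <my-resource-group> --name my-app-serv --settings SCM_DO_BUILD_DURING_DEPLOYMENT=false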
Also, please refer to these links about similar issues for more information.
Reference 1,
Reference 2
Description
I'm trying out the service containers for integrated database tests in Azure DevOps pipelines.
As per this open-sourced dummy CI/CD pipeline project, https://dev.azure.com/funktechno/_git/dotnet%20ci%20pipelines, I was experimenting with Azure DevOps service containers for integrated pipeline testing. I got postgres and mysql to work. I'm having issues with Microsoft SQL Server.
yml file
resources:
containers:
- container: mssql
image: mcr.microsoft.com/mssql/server:2017-latest
ports:
# - 1433
- 1433:1433
options: -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=yourStrong(!)Password' -e 'MSSQL_PID=Express'
- job: unit_test_db_mssql
# condition: eq('${{ variables.runDbTests }}', 'true')
# continueOnError: true
pool:
vmImage: 'ubuntu-latest'
services:
localhostsqlserver: mssql
steps:
- task: UseDotNet@2
displayName: 'Use .NET Core sdk 2.2'
inputs:
packageType: sdk
version: 2.2.207
installationPath: $(Agent.ToolsDirectory)/dotnet
- task: NuGetToolInstaller@1
- task: NuGetCommand@2
inputs:
restoreSolution: '$(solution)'
- task: Bash@3
inputs:
targetType: 'inline'
script: 'env | sort'
# echo Write your commands here...
# echo ${{agent.services.localhostsqlserver.ports.1433}}
# echo Write your commands here end...
- task: CmdLine@2
displayName: 'enabledb'
inputs:
script: |
cp ./MyProject.Repository.Test/Data/appSettings.devops.mssql.json ./MyProject.Repository.Test/Data/AppSettings.json
- task: DotNetCoreCLI@2
inputs:
command: 'test'
workingDirectory: MyProject.Repository.Test
arguments: '--collect:"XPlat Code Coverage"'
- task: PublishCodeCoverageResults@1
inputs:
codeCoverageTool: 'Cobertura'
summaryFileLocation: '$(Agent.TempDirectory)\**\coverage.cobertura.xml'
db connection string
{
"sqlserver": {
"ConnectionStrings": {
"Provider": "sqlserver",
"DefaultConnection": "User ID=sa;Password=yourStrong(!)Password;Server=localhost;Database=mockDb;Pooling=true;"
}
}
}
Debugging
The most I can tell is that when I run Bash@3 to check the environment variables, postgres and mysql print something similar to
/bin/bash --noprofile --norc /home/vsts/work/_temp/b9ec7d77-4bc2-47ab-b767-6a5e95ec3ea6.sh
"id": "b294d39b9cc1f0d337bdbf92fb2a95f0197e6ef78ce28e9d5ad6521496713708"
"pg11": {
}
while the mssql fails to print an id
========================== Starting Command Output ===========================
/bin/bash --noprofile --norc /home/vsts/work/_temp/70ae8517-5199-487f-9067-aee67f8437bb.sh
}
}
Update: this doesn't happen when using ubuntu-latest, but the mssql connection issue remains.
Database Error logging.
I'm currently getting this error in the pipeline
Error Message:
Failed for sqlserver
providername:Microsoft.EntityFrameworkCore.SqlServer m:A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 40 - Could not open a connection to SQL Server)
In TestDbContext.cs my error handling looks like this. If people have pointers for getting more details, it would be appreciated.
catch (System.Exception e)
{
var assertMsg = "Failed for " + connectionStrings.Provider + "\n" + " providername:" + dbContext.Database.ProviderName + " m:";
if (e.InnerException != null)
assertMsg += e.InnerException.Message;
else
assertMsg += e.Message;
_exceptionMessage = assertMsg;
}
Example pipeline: https://dev.azure.com/funktechno/dotnet%20ci%20pipelines/_build/results?buildId=73&view=logs&j=ce03965c-7621-5e79-6882-94ddf3daf982&t=a73693a5-1de9-5e3d-a243-942c60ab4775
Notes
I already know that the Azure DevOps pipeline mssql server doesn't work on the windows agents, because they are Windows Server 2019 and the windows container version of mssql server is not well supported; it only works on Windows Server 2016. It fails on the initialize container step when I try that.
I tried several things for unit_test_db_mssql: changing the ubuntu version, changing parameters, changing the mssql server version. All give about the same error.
If people know of command-line methods that work on Linux to test whether the mssql docker instance is ready, that would help as well.
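(One common check, for reference: run sqlcmd inside the container. This sketch assumes the default mssql-tools path in the 2017 image and a hypothetical container name.)
docker exec <container-name> /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'yourStrong(!)Password' -Q 'SELECT 1'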
Progress Update
I got mssql working in GitHub Actions and GitLab pipelines so far.
postgres and mysql work just fine on Azure DevOps and Bitbucket pipelines.
Still no luck getting mssql working on Azure DevOps. Bitbucket wouldn't work either, although it did give a lot of docker details about the running database; then again, nobody really cares about Bitbucket pipelines with their measly 50 minutes. You can see all of these pipelines in the referenced repository; they are all public, and pull requests are welcome since it's open-sourced.
Per the help from Starain Chen [MSFT] in https://developercommunity.visualstudio.com/content/problem/1159426/working-examples-using-service-container-of-sql-se.html, it looks like a 10-second delay is needed to wait for the container to be ready.
adding
- task: PowerShell@2
displayName: 'delay 10'
inputs:
targetType: 'inline'
script: |
# Write your PowerShell commands here.
start-sleep -s 10
This gets the db connection to work. I'm assuming the mssql docker container is ready by then.
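A more robust variant would poll until a connection actually succeeds instead of sleeping a fixed time. A sketch, assuming System.Data.SqlClient is loadable from PowerShell on the agent and reusing the sa password from above:
- task: PowerShell@2
  displayName: 'wait for sql server'
  inputs:
    targetType: 'inline'
    script: |
      # Poll until SQL Server accepts connections, up to roughly 60 seconds.
      for ($i = 0; $i -lt 30; $i++) {
        try {
          $connection = New-Object System.Data.SqlClient.SqlConnection("Server=localhost;User ID=sa;Password=yourStrong(!)Password;")
          $connection.Open()
          $connection.Close()
          Write-Host 'SQL Server is ready.'
          break
        } catch {
          Start-Sleep -Seconds 2
        }
      }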
I ran into this issue over the past several days and came upon your post. I was getting the same behavior and then something clicked.
IMPORTANT NOTE: If you are using PowerShell on Windows to run these commands use double quotes instead of single quotes.
That note is from https://hub.docker.com/_/microsoft-mssql-server.
I believe that changing your line from
options: -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=yourStrong(!)Password' -e 'MSSQL_PID=Express'
to
options: -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=yourStrong(!)Password" -e "MSSQL_PID=Express"
should get it to work. I think it's also interesting that in the link you posted above, the password is passed in on its own line, which probably resolves the issue rather than the 10-second delay. (Example below.)
- container: mssql
image: mcr.microsoft.com/mssql/server:2017-latest
env:
ACCEPT_EULA: Y
SA_PASSWORD: Password123
MSSQL_PID: Express
I'm currently migrating our CI/CD pipeline from a Bitbucket/Jenkins environment to hosted GitLab with additional custom gitlab-ci runners. So far everything seems fine, except when it comes down to services, especially MSSQL server.
I've set up a gitlab-ci.yml file which contains a service and a build-stage job that basically just executes some msbuild targets.
I call the AttachDatabase target, which then internally connects to the database and prepares everything for unit testing. Unfortunately I'm not able to connect to the database, whether I alias the service or not.
According to the documentation, I should be able to use the alias name defined under services in my connection string (found in Library.Build.Database.targets) to connect to the database.
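In other words, per the documentation, a connection string along these lines should work (the database name here is hypothetical):
Server=mssql;Database=UnitTest;User ID=sa;Password=Unit-T3st3r;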
I've set up a small reference project which illustrates the problem: mssql-test.
If the pipeline is run, the following error message is shown in the log:
error : A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections
image: "mcr.microsoft.com/dotnet/framework/sdk:4.8-windowsservercore-ltsc2019"
variables:
PROJ_NAME: MSSQL.Test.proj
MSBUILD_BIN: 'C:\Program Files (x86)\Microsoft Visual Studio\2019\BuildTools\MSBuild\Current\Bin\msbuild.exe'
NUGET_BIN: 'C:\Program Files\NuGet\nuget.exe'
ACCEPT_EULA: 'Y'
sa_password: Unit-T3st3r
services:
- name: advitec/mssql-server-windows-developer
alias: mssql
attachdatabase:
stage: build
tags:
- windows-1809
- 3volutions
- docker-windows
cache:
paths:
- packages
before_script:
- cmd /C "$NUGET_BIN" restore .\packages.config -OutputDirectory .\packages
allow_failure: false
script:
- cmd /C "$MSBUILD_BIN" "$PROJ_NAME" -t:AttachDatabase -v:Minimal "-p:Configuration=ReleaseForTesting;UniqueBuildNumber=$CI_PIPELINE_IID"
I'm running a custom Windows GitLab runner (just for performance reasons); below is the corresponding config.toml:
Runner:
concurrent = 1
check_interval = 0
[session_server]
session_timeout = 1800
[[runners]]
name = "gitlab-runner-02-windows-server-datacenter-1809"
url = "https://gitlab.com/"
token = "****"
executor = "docker-windows"
[runners.custom_build_dir]
[runners.cache]
[runners.cache.s3]
[runners.cache.gcs]
[runners.docker]
tls_verify = false
image = "mcr.microsoft.com/dotnet/framework/sdk:4.8-windowsservercore-ltsc2019"
privileged = false
disable_entrypoint_overwrite = false
oom_kill_disable = false
disable_cache = false
volumes = ["c:\\cache"]
shm_size = 0
Any ideas what I'm missing?
Cheers
I am trying to allow a pipeline to publish a schema change to an on-premises SQL Server 2017 instance, but I want to do that in two steps:
Generate schema change script action
After approval, publish
I know this can be achieved when publishing to SQL Azure by setting deploymentAction: 'Script' and then deploymentAction: 'Publish'.
Is there a way to publish to an on-premises SQL Server in a similar way? I have tried the SqlDacpacDeploymentOnMachineGroup task, but it does not seem possible to do it in two steps with that task.
I have finally managed to implement SQL schema generation with database changes, followed by publishing those changes (after approval). Some remarks:
This won't work if the changes will cause data loss.
The path for the sqlpackage is only correct when Visual Studio 2019 is installed, like in windows-2019 images.
A package.dacpac was previously generated by building the .sqlproj project(s).
I passed the following variables through a group variable (more information on how to create group variables here):
targetDBConnectionString
servername
databasename
adminlogin
adminPassword
I have added an approval to the ApplyChanges stage (within the Pipelines menu, choose Environments, then the ApplyChanges environment, and then Approvals and checks from the three-dots button in the top right corner). This way the changes are not applied to the database before manual approval takes place.
- stage: VerifyScript
displayName: 'Script database schema changes'
dependsOn:
- Build
jobs:
- deployment: VerifyScript
pool:
vmImage: 'windows-2019'
variables:
- group: 'Timeline CV - Release'
environment: 'scriptverification'
strategy:
runOnce:
deploy:
steps:
- download: current
artifact: dropDacpac
patterns: '**/*'
- task: CmdLine@2
displayName: 'Generate schema changes script'
inputs:
script: |
"c:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\140\sqlpackage.exe" ^
/action:script ^
/diagnostics:true ^
/sourcefile:$(Pipeline.Workspace)\dropDacpac\path\to\the\dacpacFile\package.dacpac ^
/targetConnectionString:$(targetDBConnectionString) ^
/outputpath:$(Build.StagingDirectory)\changesScript.sql
- task: PublishPipelineArtifact@1
inputs:
targetPath: '$(Build.StagingDirectory)'
artifactName: dropSqlSchemaChangesScript
condition: succeededOrFailed()
- task: PowerShell@2
displayName: Show Auto Generated SQL Script
inputs:
targetType: 'inline'
script: |
Write-Host "Auto Generated SQL Update Script:"
Get-Content $(Build.StagingDirectory)\changesScript.sql | foreach {Write-Output $_}
- stage: ApplyChanges
displayName: 'Apply database schema changes'
dependsOn: VerifyScript
jobs:
- deployment: ApplyChanges
pool:
vmImage: 'windows-2019'
variables:
- group: 'Timeline CV - Release'
environment: 'applyChanges'
strategy:
runOnce:
deploy:
steps:
- download: current
artifact: dropSqlSchemaChangesScript
- task: SqlDacpacDeploymentOnMachineGroup@0
displayName: 'Deploy SQL schema changes script'
inputs:
taskType: 'sqlQuery'
sqlFile: '$(Pipeline.Workspace)\dropSqlSchemaChangesScript\changesScript.sql'
targetMethod: 'server'
authScheme: 'sqlServerAuthentication'
serverName: '$(servername)'
databaseName: '$(databasename)'
sqlUsername: '$(adminlogin)'
sqlPassword: '$(adminPassword)'
I don't think there is a built-in task that can do this for you. However, SqlPackage.exe has the option. Take a look at this post: http://diegogiacomelli.com.br/azure-pipelines-generating-db-script/. It describes using a command line task to generate the SQL script.
Once you have the script, you could either use SqlDacpacDeploymentOnMachineGroup to publish it (although that won't reuse the already generated script), or you can write a PowerShell script that publishes the SQL script to the database. An example of such a script can be found here: https://careers.centric.eu/nl/blog/custom-azure-devops-pipeline-task-execute-sql-script/
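For reference, the publish counterpart of the script step shown above would look something like this (same hypothetical paths and variables as in the VerifyScript stage):
"c:\Program Files (x86)\Microsoft Visual Studio\2019\Enterprise\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\140\sqlpackage.exe" ^
/action:publish ^
/sourcefile:$(Pipeline.Workspace)\dropDacpac\path\to\the\dacpacFile\package.dacpac ^
/targetConnectionString:$(targetDBConnectionString)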
Once you have the script, you could either use the SqlDacpacDeploymentOnMachineGroup to publish the script (although it won't re-use the already generated script), or you can write a powershell script that publishes the sql script to the database. An example of such a script can be found here: https://careers.centric.eu/nl/blog/custom-azure-devops-pipeline-task-execute-sql-script/