I'm currently migrating our CI/CD pipeline from a Bitbucket/Jenkins environment to hosted GitLab with additional custom gitlab-ci runners. So far everything seems fine, except when it comes to services, especially the MSSQL Server service.
I've set up a gitlab-ci.yml file that contains a service and a build-stage job which basically just executes some MSBuild targets.
I call the AttachDatabase target, which internally connects to the database and prepares everything for unit testing. Unfortunately I'm not able to connect to the database, whether I alias the service or not.
According to the documentation, I should be able to use the alias name defined under services in my connection string (found in Library.Build.Database.targets) to connect to the database.
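For reference, an alias-based connection string would look roughly like this (a sketch: the server name matches the `mssql` alias from the services block below, the database name is a placeholder, and the password is the `sa_password` from the variables block):

```text
Server=mssql,1433;Database=UnitTestDb;User Id=sa;Password=Unit-T3st3r;
```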
I've set up a small reference project that illustrates the problem: mssql-test.
If the pipeline is run the following error message is shown in the log:
error : A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections
image: "mcr.microsoft.com/dotnet/framework/sdk:4.8-windowsservercore-ltsc2019"

variables:
  PROJ_NAME: MSSQL.Test.proj
  MSBUILD_BIN: 'C:\Program Files (x86)\Microsoft Visual Studio\2019\BuildTools\MSBuild\Current\Bin\msbuild.exe'
  NUGET_BIN: 'C:\Program Files\NuGet\nuget.exe'
  ACCEPT_EULA: 'Y'
  sa_password: Unit-T3st3r

services:
  - name: advitec/mssql-server-windows-developer
    alias: mssql

attachdatabase:
  stage: build
  tags:
    - windows-1809
    - 3volutions
    - docker-windows
  cache:
    paths:
      - packages
  before_script:
    - cmd /C "$NUGET_BIN" restore .\packages.config -OutputDirectory .\packages
  allow_failure: false
  script:
    - cmd /C "$MSBUILD_BIN" "$PROJ_NAME" -t:AttachDatabase -v:Minimal "-p:Configuration=ReleaseForTesting;UniqueBuildNumber=$CI_PIPELINE_IID"
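One thing that may be worth checking (an assumption on my part based on general GitLab Runner behaviour, not something verified for this setup): service aliases are only resolvable by DNS when the build gets its own Docker network, which is controlled by the FF_NETWORK_PER_BUILD feature flag and can be switched on from the job itself:

```yaml
variables:
  # Assumption: a per-build network is required for the `mssql` alias to resolve
  FF_NETWORK_PER_BUILD: "true"
```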
I'm running a custom Windows GitLab runner (just for performance reasons); below is the corresponding config.toml.
Runner:
concurrent = 1
check_interval = 0

[session_server]
  session_timeout = 1800

[[runners]]
  name = "gitlab-runner-02-windows-server-datacenter-1809"
  url = "https://gitlab.com/"
  token = "****"
  executor = "docker-windows"
  [runners.custom_build_dir]
  [runners.cache]
    [runners.cache.s3]
    [runners.cache.gcs]
  [runners.docker]
    tls_verify = false
    image = "mcr.microsoft.com/dotnet/framework/sdk:4.8-windowsservercore-ltsc2019"
    privileged = false
    disable_entrypoint_overwrite = false
    oom_kill_disable = false
    disable_cache = false
    volumes = ["c:\\cache"]
    shm_size = 0
Any ideas what I'm missing?
Cheers
I'm trying to deploy my Sanity React app; here's the repo:
https://github.com/sebastian-meckovski/sanity-admin-area
Here's my yml file:
trigger:
  - main
  - feature/*

pool:
  vmImage: 'ubuntu-latest'

variables:
  SANITY_AUTH_TOKEN: $(SanityDeployAPIToken)
  SANITY_STUDIO_PROJECT_BASEPATH: /studio

steps:
  - task: NodeTool@0
    inputs:
      versionSpec: '14.x'
  - task: CmdLine@2
    inputs:
      script: |
        cd src/Studio
        npm install -g @sanity/cli
        npm install
        sanity deploy
I've been using this article for help: https://blog.novacare.no/deploy-sanity-studio-with-azure-devops/
I'm getting this error in CmdLine on Azure:
Your project has not been assigned a studio hostname.
To deploy your Sanity Studio to our hosted Sanity.Studio service,
you will need one. Please enter the part you want to use.
any ideas?
Sanity Deployment in Azure Pipeline
That is because you are asked to choose a unique hostname for your Studio the first time you deploy Sanity.
This is an interactive prompt that needs input from the developer.
But the hosted agent runs in service mode instead of interactive mode. Please check the document Azure Pipelines agents for more details.
In this case, the prompt will not appear for the hosted agent. That is why you get the error that the project has not been assigned a studio hostname.
To resolve this issue, you need to create your own private agent and set it to interactive mode.
Self-hosted Linux agents
Or you could just deploy from your local machine with the command line, so that you can assign a studio hostname.
Of course, a more complete solution would be for Sanity to let you pre-specify the hostname somewhere (a JSON-like config file) and read it at deploy time instead of prompting. But I'm not a Sanity expert, and it's not clear whether such an option exists.
I'm trying to have a SQL Server Docker container run and have a database backup file restored using Terraform. So far I can get the SQL Server Docker image to download and run as a container, but I can't get sqlcmd to run inside the container. I have it set up so that sqlcmd runs automatically when the container starts. This is the log message I get in the container:
Sqlcmd: Error: Microsoft ODBC Driver 17 for SQL Server : Login timeout expired.
Sqlcmd: Error: Microsoft ODBC Driver 17 for SQL Server : TCP Provider: Error code 0x2749.
Sqlcmd: Error: Microsoft ODBC Driver 17 for SQL Server : A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online..
SQL Server 2019 will run as non-root by default.
This container is running as user mssql.
To learn more visit https://go.microsoft.com/fwlink/?linkid=2099216.
This is my terraform file:
terraform {
  required_providers {
    docker = {
      source  = "kreuzwerker/docker"
      version = "2.16.0"
    }
  }
}

provider "docker" {
  host = "npipe:////.//pipe//docker_engine"
}

resource "docker_image" "mssql" {
  name         = "mcr.microsoft.com/mssql/server:2019-latest"
  keep_locally = true
}

resource "docker_container" "mssql" {
  image   = docker_image.mssql.latest
  name    = "sqldatabase"
  attach  = false
  rm      = false
  restart = "no"
  command = ["/opt/mssql-tools/bin/sqlcmd", "-S localhost", "-U mssql", "-P <StrongPassword>"]
  env = [
    "ACCEPT_EULA=Y",
    "MSSQL_SA_PASSWORD=<StrongPassword>"
  ]
  ports {
    internal = 1433
    external = 1433
  }
}
I'm following this Microsoft tutorial about restoring a SQL Server database in a Docker container: https://learn.microsoft.com/en-us/sql/linux/tutorial-restore-backup-in-sql-server-container?view=sql-server-ver15
Right now I'm just trying to run sqlcmd in the container to see if it works; then I'll try restoring an actual database backup. How do I get sqlcmd to work? I'm brand new to Terraform.
UPDATE:
As @Nick.McDermaid suggested, I ran sqlcmd manually in the container and it worked. Also, the username I had was wrong: it should be SA, not mssql. I got thrown off because the container log says This container is running as user mssql., which must be different from a SQL Server username.
As @AlwaysLearning suggested, it is a timing issue: Terraform starts the Docker container and runs sqlcmd before SQL Server is ready.
I guess I need some kind of entrypoint.sh file that Terraform can run.
Please replace the command with
command = ["/opt/mssql/bin/sqlservr", "--accept-eula"]
so the container starts SQL Server itself rather than overriding the entrypoint with sqlcmd.
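In context, the corrected resource might look like this (a sketch based on the snippet above; the restore or attach commands would then run separately, for example via docker exec, once the server is up):

```hcl
resource "docker_container" "mssql" {
  image   = docker_image.mssql.latest
  name    = "sqldatabase"
  # Let SQL Server itself be the container's main process instead of sqlcmd
  command = ["/opt/mssql/bin/sqlservr", "--accept-eula"]
  env = [
    "ACCEPT_EULA=Y",
    "MSSQL_SA_PASSWORD=<StrongPassword>"
  ]
  ports {
    internal = 1433
    external = 1433
  }
}
```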
I am trying to automate my testing using Nuke Build on GitHub. I am using Docker to run a SQL Server container for testing. When the build is run by GitHub Actions, the image is created successfully but this error occurs when the tests try to access the database.
Error message:
Microsoft.Data.SqlClient.SqlException : Cannot open database "MyDB" requested by the login. The login failed.
Login failed for user 'sa'
It runs successfully locally on my Windows 10 computer. I also tried using an Azure DevOps pipeline and the same error occurs. I tried looking around at similar issues such as this, but they are not specific to running on GitHub.
My .yaml script:
name: ci

on:
  push:
    branches:
      - main
      - feature/*
  pull_request:
    branches:
      - feature/*

jobs:
  ubuntu-latest:
    name: ubuntu-latest
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: Run './build.cmd Test'
        run: ./build.cmd Test
Snippets from my Build.cs, modeled on this:
Target RunDbContainer => _ => _
    .Before(Compile)
    .Executes(() =>
    {
        DockerTasks.DockerRun(x => x
            .SetImage("mcr.microsoft.com/mssql/server:2019-latest")
            .SetEnv(new string[] { "ACCEPT_EULA=Y", "SA_PASSWORD=Password_01" })
            .SetPublish("1455:1433")
            .SetDetach(true)
            .SetName("sql1")
            .SetHostname("sql1"));
        System.Threading.Thread.Sleep(10000);
    });

Target Test => _ => _
    .Triggers(UnitTest, RunDbContainer, IntegrationTest, SqlServerContainerStop);
The database is created and deleted during testing using EF Core.
A connection string is used to create a DbContext. Then context.Database.EnsureCreated() is used to create the database.
For the connection string I tried using both localhost and the container IP address:
"Server=127.17.0.2,1455;Database=MyDb;User Id=sa;Password=Password_01;MultipleActiveResultSets=True;Trusted_Connection=False;"
"Server=localhost,1455;Database=MyDb;User Id=sa;Password=Password_01;MultipleActiveResultSets=True;Trusted_Connection=False;"
I got the IP address following these instructions:
docker inspect -f '{{range.NetworkSettings.Networks}}{{.IPAddress}}{{end}}' sql1
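A side note on the two connection strings above (my reading, worth double-checking): the published port and the container IP are alternatives, not to be combined. Docker's default bridge network hands out 172.17.x.x addresses (127.17.0.2 looks like a transposition), and a container IP goes with the internal port, while localhost goes with the published one:

```text
# via the port published by SetPublish("1455:1433"):
Server=localhost,1455;...
# via the container's bridge IP from docker inspect (illustrative address) and the internal port:
Server=172.17.0.2,1433;...
```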
Description
I'm trying out service containers for integrated database tests in Azure DevOps pipelines.
As per this open-sourced dummy CI/CD pipeline project https://dev.azure.com/funktechno/_git/dotnet%20ci%20pipelines, I was experimenting with Azure DevOps service containers for integrated pipeline testing. I got Postgres and MySQL to work; I'm having issues with Microsoft SQL Server.
yml file
resources:
  containers:
    - container: mssql
      image: mcr.microsoft.com/mssql/server:2017-latest
      ports:
        # - 1433
        - 1433:1433
      options: -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=yourStrong(!)Password' -e 'MSSQL_PID=Express'

- job: unit_test_db_mssql
  # condition: eq('${{ variables.runDbTests }}', 'true')
  # continueOnError: true
  pool:
    vmImage: 'ubuntu-latest'
  services:
    localhostsqlserver: mssql
  steps:
    - task: UseDotNet@2
      displayName: 'Use .NET Core sdk 2.2'
      inputs:
        packageType: sdk
        version: 2.2.207
        installationPath: $(Agent.ToolsDirectory)/dotnet
    - task: NuGetToolInstaller@1
    - task: NuGetCommand@2
      inputs:
        restoreSolution: '$(solution)'
    - task: Bash@3
      inputs:
        targetType: 'inline'
        script: 'env | sort'
        # echo Write your commands here...
        # echo ${{agent.services.localhostsqlserver.ports.1433}}
        # echo Write your commands here end...
    - task: CmdLine@2
      displayName: 'enabledb'
      inputs:
        script: |
          cp ./MyProject.Repository.Test/Data/appSettings.devops.mssql.json ./MyProject.Repository.Test/Data/AppSettings.json
    - task: DotNetCoreCLI@2
      inputs:
        command: 'test'
        workingDirectory: MyProject.Repository.Test
        arguments: '--collect:"XPlat Code Coverage"'
    - task: PublishCodeCoverageResults@1
      inputs:
        codeCoverageTool: 'Cobertura'
        summaryFileLocation: '$(Agent.TempDirectory)\**\coverage.cobertura.xml'
db connection string
{
  "sqlserver": {
    "ConnectionStrings": {
      "Provider": "sqlserver",
      "DefaultConnection": "User ID=sa;Password=yourStrong(!)Password;Server=localhost;Database=mockDb;Pooling=true;"
    }
  }
}
Debugging
The most I can tell is that when I run Bash@3 to check the environment variables, postgres and mysql print something similar to
/bin/bash --noprofile --norc /home/vsts/work/_temp/b9ec7d77-4bc2-47ab-b767-6a5e95ec3ea6.sh
"id": "b294d39b9cc1f0d337bdbf92fb2a95f0197e6ef78ce28e9d5ad6521496713708"
"pg11": {
}
while the mssql one fails to print an id:
========================== Starting Command Output ===========================
/bin/bash --noprofile --norc /home/vsts/work/_temp/70ae8517-5199-487f-9067-aee67f8437bb.sh
}
}
Update: this doesn't happen when using ubuntu-latest, but I still have the mssql connection issue.
Database Error logging.
I'm currently getting this error in the pipeline
Error Message:
Failed for sqlserver
providername:Microsoft.EntityFrameworkCore.SqlServer m:A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 40 - Could not open a connection to SQL Server)
In TestDbContext.cs my error handling is this. If people have pointers for getting more details, it would be appreciated.
catch (System.Exception e)
{
    var assertMsg = "Failed for " + connectionStrings.Provider + "\n" + " providername:" + dbContext.Database.ProviderName + " m:";
    if (e.InnerException != null)
        assertMsg += e.InnerException.Message;
    else
        assertMsg += e.Message;
    _exceptionMessage = assertMsg;
}
Example pipeline: https://dev.azure.com/funktechno/dotnet%20ci%20pipelines/_build/results?buildId=73&view=logs&j=ce03965c-7621-5e79-6882-94ddf3daf982&t=a73693a5-1de9-5e3d-a243-942c60ab4775
Notes
I already know that the Azure DevOps MSSQL service container doesn't work on the Windows agents because they run Windows Server 2019, and the Windows container version of MSSQL Server is not well supported and only works on Windows Server 2016. It fails on the Initialize containers step when I try that.
I tried several things for unit_test_db_mssql (changing the Ubuntu version, changing parameters, changing the MSSQL Server version); all give about the same error.
If people know of command-line methods that work on Linux to test whether the mssql Docker instance is ready, that may help as well.
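On the Linux readiness question: a common pattern is to poll the server with a trivial sqlcmd query until it succeeds, instead of a fixed delay. A minimal sketch; the retry helper is generic, while the commented probe line is an assumption (it presumes sqlcmd is available on the agent and the service is reachable as localhost):

```shell
#!/bin/sh
# Retry a readiness probe until it succeeds or the attempt budget runs out.
# Usage: wait_until <attempts> <delay_seconds> <command...>
wait_until() {
  attempts=$1; delay=$2; shift 2
  i=1
  while [ "$i" -le "$attempts" ]; do
    if "$@" >/dev/null 2>&1; then
      echo "ready after attempt $i"
      return 0
    fi
    i=$((i + 1))
    sleep "$delay"
  done
  echo "gave up after $attempts attempts" >&2
  return 1
}

# Hypothetical probe against the service container (adjust server, user and password):
# wait_until 30 2 sqlcmd -S localhost -U sa -P 'yourStrong(!)Password' -Q 'SELECT 1'
```

The same helper works for docker exec style probes when the client tools only exist inside the container.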
Progress Update
Got mssql working in GitHub Actions and GitLab pipelines so far.
Postgres and MySQL work just fine on Azure DevOps and Bitbucket pipelines.
Still no luck getting mssql working on Azure DevOps. Bitbucket wouldn't work either, although it did give a lot of Docker details about the running database; then again, nobody really cares about Bitbucket pipelines with their measly 50 minutes. You can see all of these pipelines in the referenced repository; they are all public, and pull requests are possible since it's open-sourced.
Per the help from Starain Chen [MSFT] in https://developercommunity.visualstudio.com/content/problem/1159426/working-examples-using-service-container-of-sql-se.html, it looks like a 10-second delay is needed to wait for the container to be ready.
adding
- task: PowerShell@2
  displayName: 'delay 10'
  inputs:
    targetType: 'inline'
    script: |
      # Write your PowerShell commands here.
      start-sleep -s 10
Gets the db connection to work. I'm assuming that maybe the mssql docker container is ready by then.
I ran into this issue over the past several days and came upon your post. I was getting the same behavior and then something clicked.
IMPORTANT NOTE: If you are using PowerShell on Windows to run these commands use double quotes instead of single quotes.
that note is from https://hub.docker.com/_/microsoft-mssql-server
I believe that changing your line from
options: -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=yourStrong(!)Password' -e 'MSSQL_PID=Express'
to
options: -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=yourStrong(!)Password" -e "MSSQL_PID=Express"
should get it to work. I think it's also interesting that in the link you posted above the password is passed in on its own line, which probably resolves the issue, not necessarily the 10-second delay (example below):
- container: mssql
  image: mcr.microsoft.com/mssql/server:2017-latest
  env:
    ACCEPT_EULA: Y
    SA_PASSWORD: Password123
    MSSQL_PID: Express
I wanted to do a real-time deployment of my model on Azure, so I plan to create an image which first queries an ID in an Azure SQL DB to get the required features, then predicts using my model and returns the predictions. The error I get from the PyODBC library is that drivers are not installed.
I tried it in the Azure ML Jupyter notebook to establish the connection and found that no drivers are installed in the environment itself. After some research I found that I should create a Docker image and deploy it there, but I still met with the same results.
driver= '{ODBC Driver 13 for SQL Server}'
cnxn = pyodbc.connect('DRIVER='+driver+';SERVER='+server+';PORT=1433;DATABASE='+database+';UID='+username+';PWD='+ password+';Encrypt=yes'+';TrustServerCertificate=no'+';Connection Timeout=30;')
('01000', "[01000] [unixODBC][Driver Manager]Can't open lib 'ODBC
Driver 13 for SQL Server' : file not found (0) (SQLDriverConnect)")
I want a result from the query; instead I get this message.
And/or you could use pymssql==2.1.1, if you add the following Docker steps in the deployment configuration (using either Environments or ContainerImages; preferred is Environments):
from azureml.core import Environment
from azureml.core.environment import CondaDependencies
conda_dep = CondaDependencies()
conda_dep.add_pip_package('pymssql==2.1.1')
myenv = Environment(name="mssqlenv")
myenv.python.conda_dependencies=conda_dep
myenv.docker.enabled = True
myenv.docker.base_dockerfile = 'FROM mcr.microsoft.com/azureml/base:latest\nRUN apt-get update && apt-get -y install freetds-dev freetds-bin vim gcc'
myenv.docker.base_image = None
Or, if you're using the ContainerImage class, you could add these Docker Steps
from azureml.core.image import Image, ContainerImage
image_config = ContainerImage.image_configuration(runtime= "python", execution_script="score.py", conda_file="myenv.yml", docker_file="Dockerfile.steps")
# Assuming this :
# RUN apt-get update && apt-get -y install freetds-dev freetds-bin vim gcc
# is in a file called Dockerfile.steps, it should produce the same result.
See this answer for more details on how I've done it using an Estimator Step and a custom docker container. You could use this Dockerfile to locally create a Docker container for that Estimator step (no need to do that if you're just using an Estimator run outside of a pipeline) :
FROM continuumio/miniconda3:4.4.10
RUN apt-get update && apt-get -y install freetds-dev freetds-bin gcc
RUN pip install Cython
For more details see this posting: using estimator in pipeline with custom docker images. Hope that helps!
Per my experience, I think the comment from @DavidBrowne-Microsoft is right.
There is a similar SO thread, I am getting an error while connecting to an sql DB in Jupyter Notebook, answered by me, which I think will help you install the latest msodbcsql driver for Linux on Microsoft Azure Notebook or Docker.
Meanwhile, there is a detail about the connection string for Azure SQL Database which you need to note carefully: you should use {ODBC Driver 17 for SQL Server} instead of {ODBC Driver 13 for SQL Server} if your Azure SQL Database was created recently (ignore the connection string shown in the Azure portal).
You can use the AzureML built-in Dataset solution to connect to your SQL server.
To do so, first create an azure_sql_database datastore (reference here).
Then create a dataset by passing the datastore you created and the query you want to run (reference here).
sample code
from azureml.core import Dataset, Datastore, Workspace

workspace = Workspace.from_config()
sql_datastore = Datastore.register_azure_sql_database(
    workspace = workspace,
    datastore_name = 'sql_dstore',
    server_name = 'your SQL server name',
    database_name = 'your SQL database name',
    tenant_id = 'your directory ID/tenant ID of the service principal',
    client_id = 'the Client ID/Application ID of the service principal',
    client_secret = 'the secret of the service principal')
sql_dataset = Dataset.Tabular.from_sql_query((sql_datastore, 'SELECT * FROM my_table'))
You can also do it via the UI at ml.azure.com, where you can register an Azure SQL datastore using your user name and password.