Everyone, I'm looking for a way to deploy updates to our on-prem databases using Azure DevOps, and I'm running into a roadblock on the release definition. I have my DACPAC ready to go, but I'm not sure how to get it over to my on-prem server.
I see the WinRM - SQL Server DB Deployment task, but I'm not sure how to set it up. I have seen a couple of videos that use the SQL Server Database Deploy task instead, but that task appears to have been deprecated, so it looks like I will need to use the WinRM - SQL task. Could anyone point me in the right direction on how to set this task up against my local SQL Server, or to a tutorial that would help get me started?
You will also have to install a release agent on the target server where you will be deploying the database, assign it to a Deployment Group, create your release pipeline template and then run a release. I wrote a blog post about how to deploy a database to an on-prem SQL Server by leveraging Azure DevOps: https://jpvelasco.com/deploying-a-sql-server-database-onto-an-on-prem-server-using-azure-devops/
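For reference, registering the agent on the target server boils down to a short PowerShell snippet; this is only a sketch (the portal generates the exact script for you from the Deployment Group page), and the agent version in the URL, organization, project, and group names below are all placeholders:

    # Download and extract the Azure Pipelines agent (version in the URL is illustrative)
    mkdir C:\azagent; cd C:\azagent
    Invoke-WebRequest -Uri "https://vstsagentpackage.azureedge.net/agent/2.170.1/vsts-agent-win-x64-2.170.1.zip" -OutFile agent.zip
    Expand-Archive agent.zip -DestinationPath .

    # Register the agent against a Deployment Group using PAT auth (all names are placeholders)
    .\config.cmd --deploymentgroup --deploymentgroupname "OnPremSql" `
        --agent $env:COMPUTERNAME --runasservice --work "_work" `
        --url "https://dev.azure.com/YourOrg/" --projectname "YourProject" `
        --auth PAT --token "<your-pat>"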
Hope this helps.
If you already created a Deployment Group, within your Pipeline:
1. Click to add a new stage (screenshot: https://i.stack.imgur.com/vc5TI.png).
2. On the right side (the "Select a template" screen), type SQL in the search box.
3. Select "IIS website and SQL database deployment"; this will add a stage with two tasks: IIS deployment and SQL DB Deploy.
4. Delete the IIS deployment task.
5. Configure the SQL DB Deploy task - it does not say it is deprecated. (See the sketch below for what this task effectively runs.)
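If the task still gives you trouble, note that a DACPAC deployment ultimately comes down to a SqlPackage.exe publish, so you can fall back to an inline PowerShell step along these lines; a minimal sketch, where the SqlPackage install path, artifact path, server, and database names are placeholders:

    # Publish the DACPAC to the on-prem SQL Server using Windows authentication
    # (SqlPackage.exe ships with SSDT/DacFx; the install path may differ on your server)
    & "C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe" `
        /Action:Publish `
        /SourceFile:"$(System.DefaultWorkingDirectory)\drop\MyDatabase.dacpac" `
        /TargetServerName:"MYSQLSERVER01" `
        /TargetDatabaseName:"MyDatabase"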
I've been trying to deploy a database via a pipeline from Azure DevOps to an Azure resource group.
I have an ARM template for my database server in my Repo along with my DACPAC file.
In my release pipeline I first deploy the database server with an admin user defined in the ARM template to my resource group.
Then I use the "Azure SQL DacpacTask" to deploy the database schema. Here I give the admin credentials and it works flawlessly.
The issue is that the customer doesn't want it deployed as a DACPAC but rather as an SQL script. They've given me a "CREATE To..." script generated from their database inside SQL Server Management Studio, which is also in my repo.
The "Azure SQL DacpacTask" inside my release pipeline has an option for using an SQL file instead of a DACPAC, but it doesn't work for me.
No matter how I do it, my pipeline fails when running the "Azure SQL DacpacTask", and all I get, even in debug mode, is "Login failed for user '***'".
I can connect to the SQL Server through MSSQL Studio on my local machine using the admin credentials defined in my ARM template.
I've tried adding the agent's IP to the SQL Server firewall before running the SQL script, but with no success.
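For what it's worth, this is roughly how I add the agent's IP in an Azure PowerShell step; a minimal sketch using the AzureRM cmdlets, where the resource group and server names are placeholders and the ipify call is just one way to discover the agent's public IP:

    # Open the Azure SQL Server firewall for the build agent's public IP
    $agentIp = Invoke-RestMethod -Uri "https://api.ipify.org"
    New-AzureRmSqlServerFirewallRule -ResourceGroupName "MyResourceGroup" `
        -ServerName "my-sql-server" `
        -FirewallRuleName "BuildAgent" `
        -StartIpAddress $agentIp -EndIpAddress $agentIp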
Can anyone point me in the right direction or tell me what I'm doing wrong? Why does it keep failing to log in?
They want it deployed with little to no human interaction. Is that doable using only the ADO pipeline?
EDIT
Additional info:
I have tried with a hardcoded password and user, with no luck.
If I manually create a database and then try to deploy one via the script and pipeline, it fails because the database already exists, not because of a failed login.
I've set up auditing in Azure, which generates two files with little to no info; I'm not sure what I'm looking at.
We have a web app which, for budgetary reasons, is running on a Windows VM (IIS), with its database also running in SQL Server on the same server.
We have a build pipeline set up in Azure DevOps which builds the web app and then creates an idempotent SQL migration file (we use Entity Framework); both the compiled app and the SQL file are copied into the build artefact.
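For context, the migration file is generated during the build with a step along these lines (a sketch; the project path and output location are placeholders):

    # Build step: emit an idempotent migration script from the EF model
    # (requires the dotnet-ef tool to be available on the build agent)
    dotnet ef migrations script --idempotent `
        --project src\MyApp.Data `
        --output "$(Build.ArtifactStagingDirectory)\migrations.sql"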
We then have a release pipeline which deploys the web app into IIS on the server.
What I can't figure out is how to get the SQL file run against the database.
I have tried the "SQL Database Deploy" task, but that seems to only want a .dacpac file, or the path of an SQL file which is already on the server - I don't seem to be able to give it a file that exists on the build machine to execute remotely.
I know that because we are using EF I could just make the application run its migrations on startup, but that means the app needs to run as a user with schema privileges, which we don't really want; currently the app is only a data reader/writer.
Is there some mechanism by which I can take the SQL script from the build artefact and run it on the remote VM? If not, what are my options for getting the file onto the VM so that I can run it using the "SQL Database Deploy" task? I don't want to deploy it with the web app, because although it's a small risk, we don't want it lingering in a public folder.
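Conceptually, what I'm after is something like this running on the VM once the file is there; a minimal sketch using Invoke-Sqlcmd from the SqlServer module, with placeholder server, database, and path:

    # Run the idempotent EF migration script against the local SQL Server
    Import-Module SqlServer
    Invoke-Sqlcmd -ServerInstance "localhost" `
        -Database "MyAppDb" `
        -InputFile "C:\Deploy\migrations.sql" `
        -QueryTimeout 300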
Any help appreciated.
We have the following setup: two SQL Servers (01 & 02) configured as AlwaysOn, with a listener node/alias (LT) pointing to whichever server is currently primary (01/02).
What would be the best way to configure an Azure DevOps pipeline to deploy changes to the SQL Servers? Along with deploying the changes, the release pipeline also needs to include steps for restoring a new DB backup from another server.
My thought is to configure everything against the listener (LT); this includes running the PowerShell script to register the Azure DevOps agent as well. But what will happen once the servers (primary & secondary) are switched?
I've been trying to find an article or solution page in Azure, but I haven't been successful yet.
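For example, I assume a pre-deployment step could at least confirm which replica the listener currently resolves to, something like this sketch (the listener name is a placeholder):

    # Connect through the listener and report the replica we actually landed on
    Invoke-Sqlcmd -ServerInstance "LT" -Query "SELECT @@SERVERNAME AS current_server, role_desc FROM sys.dm_hadr_availability_replica_states WHERE is_local = 1;"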
The title is pretty much self-explanatory. I am looking for a known best practice or solution, with steps to follow, to run Docker with SQL Server in Azure.
I have Docker with SQL Server Express (Docker for Windows) running locally, and my expectation is simply to deploy this to Azure.
Based on my short experience with Azure, I probably need to set up some Azure service where I can deploy my Docker image and run it; I'm just not sure what that Azure product should be (probably more of an Azure Container offering than Azure SQL).
Well, given your requirement of Windows containers (why?), you can use either Azure Container Instances (but be mindful of the base images they support) or AKS Engine. I'd discard Web Apps.
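To make ACI concrete, spinning up the Windows SQL Server Express image there could look roughly like this; a sketch using the AzureRM-era cmdlets, where the image name, resource names, and SA password are assumptions you'd need to adapt (and double-check the cmdlet parameters against your module version):

    # Create an ACI container group running SQL Server Express in a Windows container
    # (image name and env vars are assumptions based on the old Docker Hub image)
    New-AzureRmContainerGroup -ResourceGroupName "MyResourceGroup" `
        -Name "sqlexpress" `
        -Image "microsoft/mssql-server-windows-express" `
        -OsType Windows -Cpu 2 -MemoryInGB 4 `
        -IpAddressType Public -Port 1433 `
        -EnvironmentVariable @{ ACCEPT_EULA = "Y"; sa_password = "<strong-password>" }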
I'm having some trouble using Azure's elastic pool feature. Right now, I have a SQL server with a bunch of databases that I want to move into an elastic pool to cut costs. My problem is that everything is deployed in a very specific, non-negotiable way: resource groups, storage accounts, and SQL servers are spun up with ARM templates, and databases are added to the server with Microsoft's Azure SQL Database Deployment task in VSTS. Everything I do should be automated for disaster recovery.
Is there a way to configure the server on creation so everything goes into the elastic pool automatically? Is there a way to change the Azure SQL Database Deployment step so it adds the DB to the pool and not just the server?
Thanks in advance.
Edit: I ended up adding a PowerShell task to VSTS and running a simple JSON + script that took in the resource group, server name, pool name, and database names, then ran Set-AzureRmSqlDatabase.
You can manage SQL Database elastic pools using PowerShell (e.g. Set-AzureRmSqlDatabase sets properties for a database, or moves an existing database into, out of, or between elastic pools), so you can try using an Azure PowerShell step/task to move the database into an elastic pool.
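For example, moving an existing database into a pool is a single call; a minimal sketch with placeholder resource group, server, database, and pool names:

    # Move an existing Azure SQL database into an elastic pool
    Set-AzureRmSqlDatabase -ResourceGroupName "MyResourceGroup" `
        -ServerName "my-sql-server" `
        -DatabaseName "MyDatabase" `
        -ElasticPoolName "MyElasticPool"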
Regarding the ARM template, you can get one when creating a SQL Database in the Azure portal.
On the other hand, there is an article about a template that deploys a new SQL elastic pool together with a new associated SQL Server and new SQL databases assigned to it: Deploy a new SQL Elastic Pool.
I solved it with a custom build step. I included the required parameters in a JSON file and ran a PowerShell script that connected with:
    Initialize-AzurePowershellSupport -ConnectedServiceName $ConnectedServiceName -ErrorAction SilentlyContinue
    $subscription = (Get-AzureRmContext).Subscription.SubscriptionName
and then used Set-AzureRmSqlDatabase to move the database into an elastic pool.