How to capture changes made by Post Deploy script? - sql-server

We need to capture all the changes made to the database by the DACPAC deployment. We are using the sqlpackage /Action:DeployReport command to capture the changes, but this does not capture the changes made by the post-deploy script. Our post-deploy script consists of GRANT permission statements, and we want to capture those changes as well.
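For reference, a DeployReport invocation of the kind being described might look roughly like the following (the dacpac name and the values in braces are placeholders, not values from the question):

sqlpackage /Action:DeployReport /SourceFile:MyDatabase.dacpac /TargetServerName:{server} /TargetDatabaseName:{database} /OutputPath:deploy-report.xml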
Is there any way we can include the changes made by the post deploy script?

Related

Azure DevOps - how to execute Pre and Post Deployment SQL scripts

In our Azure DevOps release pipeline, how do I get our Script.PreDeployment.sql and Script.PostDeployment.sql files to execute during our SQL Server database deploy task?
In our release pipeline we have a Database Deployment phase, which has a SQL Server database deploy task. The task publishes our DACPAC file just fine. However, I cannot figure out how to get the pre and post deployment scripts to execute.
Both scripts are included in the project, and both have the appropriate Build Action set to PreDeploy and PostDeploy. Yet in the logs of the DACPAC deployment there is no indication that the files were run - I have a bunch of PRINT statements in there.
I am also working on a post-deployment script using the SSDT approach, and for deployment I am using the Azure SQL DacpacTask in my Azure pipeline. You just need to create the post-deployment script and save it; after you run the Azure build it is added to the pipeline artifact and executed automatically by the release pipeline task above. First the database deployment is executed, and after that the post-deployment script runs. It works for me.
You can make use of a Command Line task to run those pre and post deployment SQL scripts using the SQLCMD tool.
The arguments for its execution are:
-S {database-server-name}.database.windows.net
-U {username}@{database-server-name}
-P {password}
-d {database-name}
-i {SQL file}
If you store the pre/post-deployment scripts in the artifact, you can specify -i as, for example, $(System.DefaultWorkingDirectory)/drop/Script.PreDeployment.sql.
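Put together, a rough sketch of the full command might look like this (all values in braces are placeholders, and the artifact path is just the example from above):

sqlcmd -S {database-server-name}.database.windows.net -U {username}@{database-server-name} -P {password} -d {database-name} -i "$(System.DefaultWorkingDirectory)/drop/Script.PostDeployment.sql"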
Once the post-deployment script is added to the project, it is integrated into the DACPAC.
The .sqlproj should have a PostDeploy ItemGroup like:

<ItemGroup>
  <PostDeploy Include="PostDeploymentScript path" />
</ItemGroup>

Sometimes the .sqlproj is not updated and the post-deployment script does not run. The entry should be added by default when the post-deployment script is added to the project; just verify it is there before you publish.

Bitbucket and Database Development

I have a Windows server with MS SQL Server running on it.
On the SQL Server developers have created stored procedures, views, tables, triggers.
On the Windows server developers created shell scripts.
I would like to start versioning the code described above in a BitBucket repository. I have a repository created in BitBucket.
1. How should the branches be organized in this repository? e.g. "SQL Server\Database\...", "Windows Server\shell_script\..."
2. Can I connect Bitbucket to the SQL Server and the Windows server and specify which code needs to be versioned?
Are both options 1 and 2 above possible?
I just need to version control the changes to the code and have the ability to mark under which project the code change was made.
I am new to BitBucket. I am using the web front end of it. I do not know how to configure command line access, so please try not to reference Bitbucket commands. Sorry if I sound confusing.
Please help.
I know this is an old question but anyway, in principle I'd recommend:
Put all the server shell scripts into one place and make that a git repo linked to your bitbucket repo
Add a server shell script to export what you want version controlled from the SQL db
The export from the SQL db should be to text files so they are easily 'diffable' (see the rough sqlcmd sketch at the end of this answer)
You might as well make the export to a sub-directory within the shell scripts repo so that everything is in one place and can't get out of sync
So you only have one branch, not a separate one for server shell scripts and db
Make sure people run the export script and then commit everything when they make a change
Ideally you have a test server, which means you'd want a way to push changes from the repo into the SQL db. I presume you can do this with a script by deleting the server setup and re-creating it from the text files.
So basically, you can't connect an SQL db to bitbucket directly. You need scripts to read and write to the db from a repo.
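As a rough illustration of the export step (the procedure name and output folder here are made up; sqlcmd with Windows authentication is just one way to script definitions out to diffable text files):

sqlcmd -S {server} -d {database} -E -y 0 -Q "SELECT definition FROM sys.sql_modules WHERE object_id = OBJECT_ID('dbo.usp_MyProc')" -o sql-export/dbo.usp_MyProc.sql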

SQL Server Database Project

I want to use a database project for script deployment in Azure SQL Server; I don't want to import the full database. I just want to use the database project for delta scripts. I added a project and included one script file, with None as the build action, that contains a CREATE TABLE statement. I am publishing the project and it completes successfully, but the CREATE statement is not executed. What is wrong here? Is there any other way to do this?
TLDR: Set your build action to "Post Deployment Script".
Longer:
What happens in SSDT is that all the files with a build action of "Build" are built into a model of what the database should look like. When the deploy happens, that model is compared to the target database and, if there are any changes, a change script is generated and then optionally deployed.
If you have any files marked as pre or post deployment scripts, they are either prepended or appended to the change script and will be run as part of the deployment.
If you have any files with a build action of "None" then SSDT ignores them; you could put anything in there, even an ASCII picture of a donkey, and the project will still build and deploy (obviously your ASCII donkey won't get deployed anywhere).
If you just want to use SSDT to do your deployments, you can just set the build action to pre or post deploy and the file will be included. This is pretty odd though: either don't use SSDT, or use SSDT and put the model of your entire database in there.
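For illustration, assuming a hypothetical script file called CreateDeltaTable.sql, the difference in the .sqlproj is roughly:

<!-- build action "None": ignored by the build and the deployment -->
<ItemGroup>
  <None Include="CreateDeltaTable.sql" />
</ItemGroup>

<!-- build action "Post Deployment Script": appended to the generated change script -->
<ItemGroup>
  <PostDeploy Include="CreateDeltaTable.sql" />
</ItemGroup>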
Personally, I would use SSDT properly and live the dream.
Ed

2 Separate DACPAC files for test and production

I have a VS2015 database project (.sqlproj) and I created a lot of test data. I added a parameter to the PostDeploymentScript.sql file; when I need an empty database I set it to false, and when I publish it doesn't include the test data. When I need a demo database I set it to true, and when I publish it also adds the test data after deployment.
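For context, the kind of switch being described could look roughly like this in the post-deployment script (the variable and table names below are made up):

:setvar IncludeTestData "false"
IF '$(IncludeTestData)' = 'true'
BEGIN
    -- hypothetical demo data
    INSERT INTO dbo.DemoCustomers (Name) VALUES (N'Demo customer');
END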
On the other hand, I want to create two different DACPAC files to avoid this manual step and build both of them automatically at once. I searched a little and found several articles like this:
http://www.techrepublic.com/blog/data-center/auto-deploy-and-version-your-sql-server-database-with-ssdt/
but I couldn't apply what he said. What am I missing?
I created an (almost) empty database project (let's say Base.sqlproj) which adds lookup table data after deployment. I created another DB project (Base_Plus_TestData.sqlproj) and added a database reference to the first project.
What I need is, if client needs to deploy empty database I'd like to give them the Base.DACPAC. If client needs to deploy a demo database with test data, I want to give them Base_Plus_TestData.DACPAC.
What should I do for this purpose and what am I doing wrong?
There are a couple of extra options over what you already do with a switch to include data; I would choose the first :)
1 - Just give customers who want demo data a script to run after deploying the database (you could do something like use a PowerShell script/.NET app to deploy your dacpac and optionally the data)
2 - The post deploy script can be edited in a dacpac; you could build your project, copy the dacpac and then edit the post deploy script to include your data in one of the dacpacs.
3 - Create a separate SSDT project that references your main database project with a "same database" reference and contains the extra post deploy script - when you build you will get two dacpacs, and you can deploy either one: with data if you want it, or just the database (see the publish sketch below).
If you also have data in your original dacpac to deploy you will need to copy it into the "with data" dacpac.
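A rough sketch of how option 3 might then be published (server and database names are placeholders; as far as I know, IncludeCompositeObjects is the publish option that pulls the referenced Base objects in when deploying the composite dacpac):

sqlpackage /Action:Publish /SourceFile:Base.dacpac /TargetServerName:{server} /TargetDatabaseName:{database}

sqlpackage /Action:Publish /SourceFile:Base_Plus_TestData.dacpac /TargetServerName:{server} /TargetDatabaseName:{database} /p:IncludeCompositeObjects=True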
Ed

Can I manage database using database project without knowing connection string?

I have created a database project. I am able to apply my changes to my SQL Server. Now I have to deploy the same changes to another environment, and I don't want to lose my existing data. I don't have access to that SQL Server, so I don't know the connection string.
I have some options, like deploying the .dacpac file or a .sql script, but it first deletes the database and then creates a new one, so I am losing my data.
Please help me. Is there any option?
The options I see for this are:
Ask for a backup (or extract the schema using Tasks --> Generate Scripts in SSMS) - restore this somewhere and use sqlpackage to generate a deployment script you can ask them to run
Ask them to run sqlpackage.exe and either generate a script or run it directly (a rough sketch is below)
Ask them for permissions so you can do it
If the database is being deleted, then you have the option "CreateNewDatabase" set to true, which would be bad in a production environment, so remove it or set it to false!
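As a rough sketch (the dacpac name and the values in braces are placeholders), generating a script rather than deploying directly, with CreateNewDatabase explicitly off, might look like:

sqlpackage /Action:Script /SourceFile:MyDatabase.dacpac /TargetServerName:{server} /TargetDatabaseName:{database} /OutputPath:deploy.sql /p:CreateNewDatabase=False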
If they run it or you ask for permissions, these are the minimum permissions you need to generate a script (to run the script you will probably need dbo):
https://the.agilesql.club/Blogs/Ed-Elliott/What-Permissions-Do-I-Need-To-Generate-A-Deploy-Script-With-SSDT (my blog)
