Copy migration scripts created inside Docker to an outside database

I am currently working on a project where I use Postgres inside a Docker container and manage it via Flask-Migrate, but only during development.
My goal is to use a database directly on a VPS in production.
Can I create migration scripts against the Docker database and then copy them to run on the VPS database?

Your migration scripts will be stored with your source code, not inside the Docker container where the database runs, so it really does not matter. As long as you set the database connection URL properly, you can generate and apply the migrations on any supported database, whether it is hosted in a container or on a VPS.
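For illustration, a minimal sketch of that workflow, assuming the Flask app reads its connection string from a DATABASE_URL environment variable (an assumption, not something stated in the question):

# During development: generate a migration against the Dockerized Postgres.
export DATABASE_URL=postgresql://dev:dev@localhost:5432/devdb
flask db migrate -m "add users table"

# Commit the generated migrations/ directory together with your source code.

# On the VPS: point at the production database and apply the same scripts.
export DATABASE_URL=postgresql://produser:secret@localhost:5432/proddb
flask db upgrade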

Related

How can I programmatically import a .bacpac file into an on-premises SQL Server instance?

I am working on a project that hosts multiple services in Microsoft Azure, including their own databases, which are Azure SQL Databases. Development happens on developer laptops using Docker Compose, including the database server, which uses the mcr.microsoft.com/mssql/server:2022-latest Docker image.
The task at hand is to create a scripted solution in PowerShell that can manually import backups into both Azure SQL Database and the SQL Server running in Docker. This way, we want to export and version different sets of test data, which we can import into Azure environments after a deployment in some cases, or use as a data baseline for local development.
Even though both databases are Microsoft SQL Server, they are different products built from the same codebase for different environments and with different supported feature sets.
Azure SQL Database does not work with *.bak files, which leaves the *.bacpac format, the recommended solution for manual exports from Azure SQL Database. The import into another Azure SQL Database is quite simple, using the New-AzSqlDatabaseImport command.
What I have been unable to find so far is a way to programmatically import the *.bacpac into my SQL Server in Docker as part of the PowerShell script.
Not all developers use Windows machines, so a solution that is cross-platform or runs inside the Docker container is required.
I have tried:
using the T-SQL command RESTORE DATABASE, which didn't work, because it expects a *.bak file
using the sqlpackage utility as described in step 6 of this article, but that utility was not found in the container (starting container process caused: exec: "/opt/mssql/bin/sqlpackage": stat /opt/mssql/bin/sqlpackage: no such file or directory: unknown)
The only way I have found so far to import the *.bacpac file is manually, using the SSMS Import Data-tier Application wizard, but that is not something I can use in a PowerShell script.
I really want to stay with a backup-file solution and avoid exporting my databases to T-SQL scripts. Any help is appreciated.
The comment above is correct - you can accomplish this by adding sqlpackage to your Docker container.
I have a GitHub project to convert .bak to .bacpac, and I built a Docker container with SQL Server and sqlpackage.
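An alternative that avoids customizing the image (a sketch under assumptions, not the poster's exact setup): sqlpackage also ships as a cross-platform .NET global tool, so it can run on the host against the container's published port. Server name, database name, and credentials below are placeholders:

dotnet tool install --global microsoft.sqlpackage

# Import the .bacpac into the containerized SQL Server over localhost,1433.
sqlpackage /Action:Import /SourceFile:./testdata.bacpac /TargetServerName:localhost,1433 /TargetDatabaseName:TestData /TargetUser:sa /TargetPassword:'YourStrong!Passw0rd' /TargetTrustServerCertificate:True

This keeps the script cross-platform, since both the .NET SDK and sqlpackage run on Windows, macOS, and Linux.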

Deploying Azure SQL Server with external data sources

I have two environments, development and production. Each of them has two databases: one main database and another that is used as an external data source.
Currently I'm trying to create a GitHub deployment pipeline. I use SSDT for source control, and I have the actions for building the .dacpac and sending it to Azure working. How can I configure the deployment so that in production the database points to the production external data source instead of the one used in dev?
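One common approach (offered here only as a sketch, since this thread has no answer): declare the external data source's location as SQLCMD variables in the database project, then pass environment-specific values at publish time. The variable names and targets below are hypothetical:

# Publish to production, substituting prod-specific values for the SQLCMD
# variables referenced by the external data source definition.
sqlpackage /Action:Publish /SourceFile:./bin/Release/MainDb.dacpac /TargetConnectionString:"$PROD_CONNECTION_STRING" /v:ExternalSourceServer=prod-external.database.windows.net /v:ExternalSourceDatabase=ExternalDataProd

The dev pipeline would pass the dev values for the same variables.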

SQL Database Project build and publish thru jenkins

Database - SQL Server
Version control - Git/Bitbucket
Automation - Jenkins pipeline
Question/task - I need to build a SQL database project and deploy it through a Jenkins pipeline.
Currently we build and publish the database manually. I have scripts I can use to build the .dacpac and publish it to the database, but the problem is those scripts only work in the Visual Studio command prompt.
Build a SQL database project through Jenkins - is it possible, and if so, how?
Publish a SQL database project through Jenkins - is it possible, and if so, how?
Please help me understand the process involved.
I used the DacFx API provided by Microsoft and created an API service that I integrated with Jenkins. In the middle of the pipeline, I pass the SqlProject location, my target database information, and the dacpac destination. The service uses the Publish method to generate the delta and deploy it to the target database.
See the link below:
https://www.nuget.org/packages/Microsoft.SqlServer.DACFx
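If you would rather stay on the command line than wrap DacFx in a service, here is a sketch of an equivalent flow a Jenkins stage could run directly. Project names and the connection string are placeholders, and it assumes msbuild is on the agent's PATH with the SSDT build targets installed:

# Build the .sqlproj into a .dacpac outside the VS command prompt.
msbuild MyDatabase.sqlproj /p:Configuration=Release

# Publish the resulting .dacpac with sqlpackage.
sqlpackage /Action:Publish /SourceFile:bin/Release/MyDatabase.dacpac /TargetConnectionString:"Server=myserver;Database=MyDatabase;Integrated Security=True;"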

Visual Studio Database Project + Deploy + TFS Build + Multi Tenancy

We have recently migrated to Visual Studio database projects. What we want is for the database to be deployed when the TFS build server builds.
This is relatively simple, and we have it working for a single database. However, we need it to deploy to multiple databases, as we have a SaaS product with multiple databases. So, for example, when we do a QA build, all the different databases with various configurations on the QA DB server should be updated.
Is there a 'proper' way to do this?
Our current plan is to take the deployment .sql script that will be generated from the database configured for deployment, then create a custom build task which runs this script against the rest of the databases.
I don't think there is a standard way of doing this, so we created a custom build task that iterates over the databases we want to deploy to, executing the deployment script generated by the standard database project deploy against each DB.
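A minimal sketch of that iteration (server and database names are placeholders; it also assumes the generated deploy.sql can target whichever database sqlcmd connects to, which may require adjusting the :setvar DatabaseName header SSDT puts at the top of the script):

# Run the generated deployment script against each tenant database in turn;
# -b makes sqlcmd return a non-zero exit code on error, failing the build step.
sqlcmd -S qa-dbserver -d TenantA -i deploy.sql -b
sqlcmd -S qa-dbserver -d TenantB -i deploy.sql -b
sqlcmd -S qa-dbserver -d TenantC -i deploy.sql -b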

Liquibase Grails database migrations

I followed the excellent tutorial at
http://www.jakusys.de/blog/2008/09/grails-and-liquibase-how-to-use/
for my dev database on the local machine where Grails is installed. All went well.
Now I want to deploy the Grails WAR to a remote site, where I have set up MySQL on the remote server.
But I am at a loss now. How do I run the command:
grails migrate
so that the remote database gets the DATABASECHANGELOG table?
There is some production data that I will manually copy from my local MySQL to the fresh install of the remote MySQL database, while most of the other tables are fresh and have no data. I am waiting for a reply to this question to make sure I don't mess something up before actually launching my Grails application on the remote production server.
You can migrate a remote DB from your computer using the grails.env variable, like:
grails migrate -Dgrails.env=production
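For this to work, the production dataSource in grails-app/conf/DataSource.groovy must point at the remote MySQL instance; the command then creates the DATABASECHANGELOG table there on first run. As for the manual data copy mentioned in the question, a sketch with placeholder hosts, users, and table names:

# Dump only the tables that carry production data from the local database,
# then load the dump into the remote one (each -p prompts for a password).
mysqldump -u localuser -p localdb users roles > data.sql
mysql -h remote.example.com -u remoteuser -p remotedb < data.sql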
