How to run migrations on a private RDS instance using Prisma?

I have a Node project that uses Terraform for provisioning and resource creation, while Serverless is used to deploy the functions. I'm using Prisma as the ORM with GraphQL. My problem is that I have a private RDS instance which is created and provisioned through Terraform. Now I want to run the
npx prisma migrate
command, but I'm not sure where to run it, since the RDS instance is private and I can't run the migration command before Terraform has created the resource.
I have tried CodeBuild as a build solution, but unfortunately that requires a NAT gateway, which I can't use as I need to work with public subnets.
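Since CodeBuild behind a NAT gateway is off the table, one pattern that fits a Terraform + Serverless setup is a small, single-purpose "migration runner" Lambda deployed into the same VPC and private subnets as the RDS instance: it reaches the database over the private network and simply shells out to the Prisma CLI. The sketch below only illustrates that idea under assumptions the question doesn't state: the Prisma CLI and the prisma/migrations folder are packaged with the function, DATABASE_URL is injected (for example from a Terraform output), and the non-interactive migrate deploy is used rather than migrate dev.

// Hypothetical one-off migration runner, deployed by Serverless into the VPC/private
// subnets of the RDS instance. Assumes the Prisma CLI and prisma/migrations are bundled
// with the function and DATABASE_URL points at the private RDS endpoint.
import { execFileSync } from "node:child_process";

export const handler = async (): Promise<{ status: string }> => {
  // "migrate deploy" applies the already-generated migration files non-interactively.
  execFileSync("node_modules/.bin/prisma", ["migrate", "deploy"], {
    stdio: "inherit", // surfaces Prisma's output in CloudWatch Logs
    env: process.env,
  });
  return { status: "migrations applied" };
};

After terraform apply and serverless deploy have run, the function can be invoked once per release, e.g. with aws lambda invoke --function-name migration-runner out.json, so the migration step never needs a network path from outside the VPC to the private instance.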

Related

Azure Devops deploying web app to a server - how to also update database

We have a web app which, for budgetary reasons, is running on a Windows VM (IIS) with its database also running in SQL Server on the same server.
We have a build pipeline set up in Azure DevOps which builds the web app and then creates an idempotent SQL migration file (we use Entity Framework); both the compiled app and the SQL file are copied into the build artefact.
We then have a release pipeline which deploys the web app into IIS on the server.
What I can't figure out is how to get the SQL file to run against the database.
I have tried the "SQL Database Deploy" task, but that seems to only want a .dacpac file, or the path of a SQL file which is already on the server - I don't seem to be able to give it a file that exists on the build machine to execute remotely.
I know that because we are using EF I could just make the application do its migrations on startup, but that means the app needs to run as a user with schema privileges, which we don't really want; currently the app is only a data reader/writer.
Is there some mechanism by which I can take the SQL script from the build artefact and run it against the database on the remote VM? If not, what are my options for getting the file onto the VM so that I can run it using the "SQL Database Deploy" task? I don't want to deploy it with the web app, because although it's a small risk, we don't want it lingering in a public folder.
Any help appreciated.
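One mechanism (and only one of several) is to skip copying the file to the VM altogether: have a release step open its own connection to the SQL Server instance with a dedicated schema-privileged login and execute the idempotent script straight from the build artefact, provided the agent has SQL connectivity to that VM. The following is a rough Node/TypeScript sketch of that idea using the mssql package; the server name, database name, login and script path are made-up placeholders, not details from the question.

// Rough sketch: execute the idempotent EF migration script from the build artefact
// against the SQL Server instance, using a dedicated schema-privileged login
// (separate from the app's data reader/writer login). All names are placeholders.
import { readFileSync } from "node:fs";
import sql from "mssql";

async function runMigrationScript(scriptPath: string): Promise<void> {
  const script = readFileSync(scriptPath, "utf8");
  // EF's idempotent scripts contain GO separators, which are a client-side batch
  // delimiter (sqlcmd/SSMS), not T-SQL, so split and execute batch by batch.
  const batches = script.split(/^\s*GO\s*$/gim).filter((b) => b.trim().length > 0);
  const pool = await sql.connect({
    server: "my-iis-vm.example.internal", // assumption: VM reachable from the agent
    database: "MyAppDb",                  // assumption
    user: "migration_user",               // schema-privileged login, not the app user
    password: process.env.MIGRATION_DB_PASSWORD ?? "",
    options: { trustServerCertificate: true },
  });
  try {
    for (const batch of batches) {
      await pool.request().batch(batch);
    }
  } finally {
    await pool.close();
  }
}

runMigrationScript(process.argv[2]).catch((err) => {
  console.error(err);
  process.exit(1);
});

This keeps the application's own login as a plain data reader/writer, since only the pipeline's login needs schema privileges.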

How to dockerize a Visual Studio SQL Database Project?

I have a solution consisting of several small web services, each of which I can currently build and run in its own container via docker-compose.
Now another service will be added, which will need some data access. Since running an MSSQL server in a Docker container is rather easy and basically amounts to one entry in the docker-compose file, I would like to do the following:
Create a SQL Database project and add it to my solution
Configure the database project, so that it fits the need of the web service
Build a containerized SQL Server and automatically publish the database schema from my database project
Build the web service and connect it to the containerized SQL Server
The goal is that, after running docker-compose, everything should be up and running.
Is this possible as described?
One additional note: the whole thing is just a hobby project; I want to get familiar with Docker, APIs and database access.

Connecting Google Container Registry to Cloud SQL

We're using Google App Engine and Cloud SQL for a Django web app. We want to run migrations during the build; however, GAE uses Container Registry to build the app, and Container Registry is not authenticated to access Cloud SQL. So, as expected, the migrations fail due to a rejected connection.
How does someone authorize Container Registry to access Cloud SQL?
When you say:
GAE uses Container Registry to build the app, and Container Registry is not authenticated to access Cloud SQL.
I assume that you mean:
GAE uses Container Builder to build the app, and the Container Builder Service Account is not authenticated to access Cloud SQL.
Assuming that's what you need, this document explains how to use IAM to grant additional permissions to the Service Account: https://cloud.google.com/container-builder/docs/how-to/service-account-permissions
If you are in fact asking a different question, please clarify, including an example that demonstrates the problem you are having.

How to deploy a database on Linux

I have a small ASP.NET Core website that I push to my server via Jenkins. Jenkins does a git checkout and then dotnet restore and dotnet run. It works for the website, but I added Entity Framework and I'm a little confused. How exactly do I move my local database to the server? Or should I create one on the server and then reference it?
I have one MSSQL database on (localdb)\MSSQLLocalDB, but when I run the server and try to go to a page which gets data from the database, I get a 500 Internal Server Error.
I would like to have one local db for testing and one on the server, but I just can't wrap my head around all of this.
Well, in development you should write an init script for your database. This will create all the required stuff your application needs.
So in Linux...
Install MySQL, get the users set up, and init the database.
In your application...
Provide the connection string for the DB installed on Linux.
I am not running my app in C#, but this is similar to my Node app, and that is what I do. I develop on Windows with Postgres. My prod is on a GoDaddy Linux cloud server, and I have Postgres installed on that. When I do my git pull for the latest code, I don't have to change much, because of the .env file for my environment variables.
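For the Node/Postgres setup described above, the .env-driven part might look roughly like this; a minimal sketch assuming the pg and dotenv packages and a DATABASE_URL variable, none of which are spelled out in the answer:

// The same code runs in development (local Postgres on Windows) and in production
// (Postgres on the Linux server); only the .env file differs per environment.
import "dotenv/config"; // loads variables from .env into process.env
import { Pool } from "pg";

// e.g. DATABASE_URL=postgres://appuser:secret@localhost:5432/appdb
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

export async function healthCheck(): Promise<boolean> {
  const result = await pool.query("SELECT 1 AS ok");
  return result.rows[0].ok === 1;
}

The same idea carries over to the ASP.NET Core case in the question: keep the connection string out of the code and supply it per environment (appsettings or environment variables), with an init script creating the database and users on the server.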

DB script automation from Git to an AWS SQL Server using Jenkins?

Hi, is there anyone who can help me out with this?
I am trying to automate some DB script migrations using Jenkins. All I am doing is using a Jenkins job to take DB scripts from a Git repository, move them to an AWS server (SQL Server 2008 R2) and execute them.
What server and access credentials do I need from the DB server end (like access key, secret key, DNS name, etc.) in order to configure the Jenkins job?
SQL access to the DB
For SQL access to the DB you need the following:
Hostname or RDS endpoint
TCP port number
DB name
DB user
DB password
The Jenkins machine needs network access to the DB host, which means you need to allow this in the Security Groups and have VPC connectivity.
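As an illustration of how those five items end up being used, a Jenkins job that has them available (typically injected as environment variables from Jenkins credentials) can connect to the RDS instance and execute the checked-out scripts along these lines. The variable names, the db-scripts folder and the Node mssql client are assumptions for the sketch, since the question doesn't say how the scripts are run:

// Illustrative sketch: connect to the RDS SQL Server endpoint with the parameters
// listed above and execute each .sql file from the Git checkout in name order.
import { readdirSync, readFileSync } from "node:fs";
import { join } from "node:path";
import sql from "mssql";

async function main(): Promise<void> {
  const pool = await sql.connect({
    server: process.env.DB_HOST!,              // hostname / RDS endpoint
    port: Number(process.env.DB_PORT ?? 1433), // TCP port number
    database: process.env.DB_NAME!,
    user: process.env.DB_USER!,
    password: process.env.DB_PASSWORD!,
    options: { trustServerCertificate: true },
  });
  try {
    const scriptDir = join(process.cwd(), "db-scripts"); // folder in the repo (assumed)
    const files = readdirSync(scriptDir).filter((f) => f.endsWith(".sql")).sort();
    for (const file of files) {
      // Assumes each file is a single T-SQL batch; files containing GO separators
      // would need to be split into batches first.
      await pool.request().batch(readFileSync(join(scriptDir, file), "utf8"));
      console.log(`applied ${file}`);
    }
  } finally {
    await pool.close();
  }
}

main().catch((err) => { console.error(err); process.exit(1); });

Note that nothing in this part needs AWS access keys or secret keys; those only matter for the AWS API access described next.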
AWS API access
For API access to AWS, the best practice is to have an IAM role assigned to the Jenkins machine. The role should have necessary permissions in the IAM policies attached to it.
In addition, you need to have internet access from the Jenkins machine or configure VPC endpoints so that Jenkins can reach the AWS API.
NOTE: this used to require creating the instance with the role attached, but an IAM role can now also be attached to an existing EC2 instance, for example from the console or with aws ec2 associate-iam-instance-profile.
