I have a solution consisting of several small web services, each of which I currently build and run in its own container via docker-compose.
Now another service will be added, which will need some data access. Since running an MSSQL server in a Docker container is rather easy and basically amounts to one entry in the docker-compose file, I would like to do the following:
Create a SQL Database project and add it to my solution
Configure the database project, so that it fits the need of the web service
Build a containerized SQL Server and automatically publish the database schema from my database project
Build the web service and connect it to the containerized SQL Server
The goal is that after running docker-compose, everything is up and running.
Is this possible as described?
One additional note: the whole thing is just a hobby project; I want to get familiar with Docker, APIs, and database access.
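For reference, this is roughly the docker-compose.yml shape I have in mind; the image tag, password, and connection string below are placeholders, not a working setup:

    version: "3.8"
    services:
      db:
        image: mcr.microsoft.com/mssql/server:2022-latest
        environment:
          ACCEPT_EULA: "Y"
          MSSQL_SA_PASSWORD: "Dev_Password123"   # dev-only placeholder
        ports:
          - "1433:1433"

      web:
        build: ./MyWebService                    # hypothetical service folder
        depends_on:
          - db
        environment:
          # ASP.NET Core maps this to ConnectionStrings:Default
          ConnectionStrings__Default: "Server=db;Database=AppDb;User Id=sa;Password=Dev_Password123;TrustServerCertificate=True"

The open question is the schema-publish step in the middle: presumably a one-shot container (or a startup step in the web service) that publishes the schema produced by the database project before the web service starts using it.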
We have a web app which, for budgetary reasons, runs on a Windows VM (IIS) with its database also running in SQL Server on the same server.
We have a build pipeline set up in Azure DevOps which builds the web app and then creates an idempotent SQL migration file (we use Entity Framework); both the compiled app and the SQL file are copied into the build artefact.
We then have a release pipeline which deploys the web app into IIS on the server.
What I can't figure out is how to get the SQL file to run against the database.
I have tried the "SQL Database Deploy" task, but that seems to only want a .dacpac file, or the path of a SQL file which is already on the server - I don't seem to be able to give it a file that exists on the build machine to execute remotely.
I know that, because we are using EF, I could just make the application run its migrations on startup, but that means the app needs to run as a user with schema privileges, which we don't really want; currently the app is only a data reader/writer.
Is there some mechanism by which I can take the SQL script from the build artefact and run it on the remote VM? If not, what are my options for getting the file onto the VM so that I can run it using the "SQL Database Deploy" task? I don't want to deploy it with the web app, because although it's a small risk, we don't want it lingering in a public folder.
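To illustrate, the step sequence I'm imagining would look roughly like this, assuming an agent on the VM (a deployment group); the task names and inputs are from memory, and the paths and names are placeholders:

    steps:
    - task: WindowsMachineFileCopy@2              # copy the script from the artefact onto the VM
      inputs:
        SourcePath: '$(System.DefaultWorkingDirectory)/drop/migration.sql'
        MachineNames: '$(TargetVm)'
        AdminUserName: '$(VmAdminUser)'
        AdminPassword: '$(VmAdminPassword)'
        TargetPath: 'C:\deploy\sql'
    - task: SqlDacpacDeploymentOnMachineGroup@0   # the "SQL Database Deploy" task
      inputs:
        TaskType: 'sqlQuery'                      # run a .sql file rather than a dacpac
        SqlFile: 'C:\deploy\sql\migration.sql'
        ServerName: 'localhost'
        DatabaseName: 'AppDb'
        AuthScheme: 'windowsAuthentication'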
Any help appreciated.
I've been trying to find an article or solution page for this in Azure, but haven't been successful yet.
The title is pretty much self-explanatory: I am looking for a known best practice, or a solution with steps to follow, for running Docker with SQL Server in Azure.
I have Docker with SQL Server Express (Docker for Windows) running locally, and my expectation is to simply deploy this to Azure.
Based on my short experience with Azure, I probably need to set up some Azure service where I can deploy and run my Docker image; I'm just not sure which Azure product that should be (probably more an Azure container offering than Azure SQL).
Well, given your requirement of Windows containers (why?), you can use either Azure Container Instances (but be mindful of the base images they support) or the AKS engine. I'd rule out Web Apps.
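For example, pushing the image to a registry and running it on ACI might look roughly like this (resource names, sizes, and tags are placeholders):

    # push the locally built image to Azure Container Registry
    az acr login --name myregistry
    docker tag sqlexpress:local myregistry.azurecr.io/sqlexpress:v1
    docker push myregistry.azurecr.io/sqlexpress:v1

    # run it as a Windows container instance with port 1433 exposed
    az container create \
      --resource-group my-rg \
      --name sqlexpress \
      --image myregistry.azurecr.io/sqlexpress:v1 \
      --os-type Windows \
      --cpu 2 --memory 4 \
      --ports 1433 \
      --ip-address Public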
I have a small ASP.NET Core website that I push to my server via Jenkins. Jenkins does a git checkout and then dotnet restore and dotnet run. That works for the website, but I added Entity Framework and I'm a little confused: how exactly do I move my local database to the server? Or should I create one on the server and then reference it?
I have one MSSQL database on (localdb)\MSSQLLocalDB, but when I run the server and try to go to a page which gets data from the database, I get a 500 Internal Server Error.
I would like to have one local db for testing and one on the server, but I just can't wrap my head around all of this.
Well, in development you should write an init script for your database. This will create everything your application needs.
So, in Linux:
Install MySQL, set up the users, and init the database.
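Something along these lines; the database, user, and table names here are made up:

    -- init.sql: create the database, a least-privilege user, and the schema
    CREATE DATABASE app_db;
    CREATE USER 'app_user'@'%' IDENTIFIED BY 'change_me';
    GRANT SELECT, INSERT, UPDATE, DELETE ON app_db.* TO 'app_user'@'%';
    USE app_db;
    CREATE TABLE students (
        id INT AUTO_INCREMENT PRIMARY KEY,
        name VARCHAR(100) NOT NULL
    );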
In your application:
Provide the connection string for the DB installed on Linux.
I'm not running my app in C#, but this is similar to my Node app, and that is what I do. I develop on Windows with Postgres. My prod is a GoDaddy Linux cloud server, and I have Postgres installed on that. When I do my git pull for the latest, I don't have to change much, because the .env file holds my environment variables.
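In ASP.NET Core / EF Core terms, the equivalent is roughly this sketch, where AppDbContext and the "Default" key are hypothetical names:

    using Microsoft.EntityFrameworkCore;

    var builder = WebApplication.CreateBuilder(args);

    // Locally this comes from appsettings.Development.json; on the server
    // it can come from an environment variable (ConnectionStrings__Default).
    var connectionString = builder.Configuration.GetConnectionString("Default");

    // AppDbContext is your EF Core context class
    builder.Services.AddDbContext<AppDbContext>(options =>
        options.UseSqlServer(connectionString));

    var app = builder.Build();
    app.Run();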
Here's the scenario:
MVC web project
Three MS SQL Server database projects
One of the databases must be populated with lookup tables
Other tables are user data and don't need data uploading
GoDaddy hosting
Visual Studio 2013
I'd like to deploy everything (web project, SQL schemas, reference data) to GoDaddy in one fell swoop, but they appear to offer only FTP uploading. When using FTP in the Web Publishing Wizard, it says "Database preview not supported for this method", where I take "method" to mean FTP. I can publish the web project fine via FTP, but of course without the databases the web application generates errors.
So here are my questions
There is a "Web Deploy" publish method listed in the wizard, but GoDaddy has no information on how to set this up. Can this be used with GoDaddy and will it publish DBs also?
How does one configure the project to use the local SQL Server when running on localhost, but the GoDaddy SQL Servers when deployed?
Can the data in the local DB be uploaded as part of the publishing wizard process, or is SQL Server Management Studio the tool of choice?
Thanks!
I don't believe GoDaddy supports WebDeploy. They didn't when I left their service a few years ago. You can talk to them to confirm whether this has changed.
This is the role of Web.config transforms. For an intro to the topic, see here; the article is a little out of date and doesn't mention one of the most useful points: you can add transforms for each publish profile, so they're applied according to your publish settings.
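For example, a Web.Release.config transform that swaps in the production connection string might look like this (server, database, and credentials are placeholders):

    <?xml version="1.0"?>
    <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <connectionStrings>
        <!-- replaces the localhost connection string when this profile is used -->
        <add name="DefaultConnection"
             connectionString="Data Source=my-godaddy-sql-host;Initial Catalog=MyDb;User Id=appUser;Password=..."
             xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
      </connectionStrings>
    </configuration>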
You probably can't upload the local DB file. In almost every hosting situation, the SQL server and the web server are two separate machines that don't share any files (corollary: the web server doesn't have the SQL service installed). One workaround you can try is to publish the DB directly from your own machine: if you can connect to the DB from your machine, you can do a Web Deploy publish from your own machine, and it will send the SQL changes to your GoDaddy DB server.
A more advanced workaround for #3:
Set up your FTP publishing settings for your files
Figure out how to publish your DB through WebDeploy only, from the command line (you can refer here for a sample of using WebDeploy from the command line; note it goes from GoDaddy -> local, but it's trivial to turn it around)
Customize the web publish pipeline to insert an MSBuild target to execute your WebDeploy command line (see here for an example of modifying the pipeline; you can add the target directly in your .pubxml file if you're not intending to use it for multiple projects).
This will give you a single publish profile which will separately publish your files (via FTP) and your DB (via WebDeploy).
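The command line for step 2 might look roughly like this, using Web Deploy's dbFullSql provider (install path, server names, and credentials are placeholders):

    "C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe" ^
      -verb:sync ^
      -source:dbFullSql="Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=MyDb;Integrated Security=True" ^
      -dest:dbFullSql="Data Source=my-godaddy-sql-host;Initial Catalog=MyDb;User Id=appUser;Password=..."

This scripts the source database (schema and data) and applies the script to the destination.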
I have created a dynamic web project in Eclipse. The project creates a database within it. Testing it on a local machine running Tomcat works fine. I want to deploy this project, along with the database, to another system running Tomcat. How do I accomplish this? The internet has been of little help to me.
For example: a web project Students is created, with a database containing a table named StudentsOfStdV that holds some data. For testing purposes, I want to export the .war file, and the database along with it, to a different machine running a server. How do I do that?
You need to export a WAR from Eclipse and deploy it on Tomcat.
Read tutorials on:
Web application Deployment Guide: http://tomcat.apache.org/tomcat-4.0-doc/appdev/deployment.html
How to setup Tomcat: http://www.coreservlets.com/Apache-Tomcat-Tutorial/
How to export a WAR from Eclipse: http://www.java-tips.org/other-api-tips/eclipse/how-to-make-war-file-in-eclipse.html
EDIT:
The database does not go along with the .war. The database server has to be installed first; then you can create the database on first run or manually.
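For example, if the project uses MySQL, moving the data could look like this (database name and credentials are placeholders):

    # on the development machine: dump the schema and data
    mysqldump -u root -p students_db > students_db.sql

    # on the target machine: create the database and load the dump
    mysql -u root -p -e "CREATE DATABASE students_db"
    mysql -u root -p students_db < students_db.sql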