SQL Project in VSTS Continuous Integration via WinRM

Here is my setup: my Visual Studio solution has a DB project in it, with multiple publish profiles. When I run a publish from Visual Studio manually by loading any profile, it works perfectly.
We have our IIS site on one Azure VM and SQL Server on another Azure VM.
We have a continuous build set up on Visual Studio Team Services, which deploys the site to the Azure VM.
To the same build definition, I added a "WinRM - SQL Server DB Deployment" task.
Here is my build definition for the dacpac step.
When I run my build I get the following error:
Microsoft.PowerShell.Commands.WriteErrorException: Deployment on one or more machines failed. System.Exception: Invalid Publish Profile [ xxxxxxx ] provided
I can't find any references on how to provide the publish profile information, or on why it is failing. Any help would be greatly appreciated.
Update
To resolve this, per @Jason Ye - MSFT's suggestion, I copied my publish profile to the Azure SQL VM, copied its full path, and updated the profile path in the build definition.
Then I got stuck on the following issue:
Microsoft.PowerShell.Commands.WriteErrorException: Deployment on one or more machines failed. System.Exception: Publishing to database 'DbName' on server 'xxxxxx.eastus.cloudapp.azure.com'. * Could not load package from '/*.dacpac'. Illegal characters in path.
Since the dacpac will keep changing with every build, I cannot copy it to the server beforehand. How do I get the path?
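For reference, the deployment task essentially drives SqlPackage.exe on the target machine, so whatever path you supply must resolve on that machine, not on the build agent. A minimal sketch of the equivalent manual publish, assuming SqlPackage.exe is installed on the SQL VM and that the dacpac and profile have already been copied there (all paths below are hypothetical):

    # Publish a dacpac using a publish profile; paths are illustrative only
    & "C:\Program Files\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe" `
        /Action:Publish `
        /SourceFile:"C:\Deploy\MyDb.dacpac" `
        /Profile:"C:\Deploy\MyDb.publish.xml"

One common pattern is to add a "Windows Machine File Copy" step before the DB deployment step that copies the build output to a fixed folder on the SQL VM, and then reference that fixed path (e.g. C:\Deploy\MyDb.dacpac) instead of a wildcard.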

Related

Azure DevOps Pipeline Azure SQL Deploy won't work with SQL script (Failed to login)

I've been trying to deploy a database via a pipeline from Azure DevOps to an Azure resource group.
I have an ARM template for my database server in my Repo along with my DACPAC file.
In my release pipeline I first deploy the database server with an admin user defined in the ARM template to my resource group.
Then I use the "Azure SQL DacpacTask" to deploy the database schema. Here I give the admin credentials and it works flawlessly.
The issue is that the customer doesn't want it deployed as a DACPAC but rather as an SQL script. They've given me a "CREATE To..." script from their database, generated inside SQL Server Management Studio, which is also in my repo.
The "Azure SQL DacpacTask" inside my release pipeline has an option for using an SQL file instead of a DACPAC, but it doesn't work for me.
No matter how I do it, my pipeline fails when running the "Azure SQL DacpacTask", and all I get, even in debug mode, is "Login failed for user '***'".
I can connect to the SQL Server through MSSQL Studio on my local machine using the admin credentials defined in my ARM template.
I've tried adding the agent's IP before running the SQL script, but with no success.
Can anyone point me in the right direction, or tell me what I'm doing wrong? Why does it keep failing to log in?
They want it deployed with little to no human interaction. Is that doable through the ADO pipeline alone?
EDIT
Additional info:
I have tried with a hardcoded password and user, with no luck.
If I manually create a database and then try to deploy one via script and pipeline, it fails because the database already exists, not because of a failed login.
I've set up auditing in Azure, which generates two files with little to no info; I'm not sure what I'm looking at.
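One way to isolate the failure is to take the task out of the picture and run the script directly from the agent with the same credentials. A rough sketch, assuming the SqlServer PowerShell module and the Azure CLI are available on the agent (resource group, server, database, user and password below are all placeholders):

    # Open the Azure SQL firewall for the agent's current public IP, then test the login
    $agentIp = Invoke-RestMethod -Uri "https://api.ipify.org"
    az sql server firewall-rule create --resource-group "my-rg" --server "my-sqlserver" `
        --name "build-agent" --start-ip-address $agentIp --end-ip-address $agentIp

    # If this also fails, the problem is the credentials or firewall, not the task
    Invoke-Sqlcmd -ServerInstance "my-sqlserver.database.windows.net" -Database "MyDb" `
        -Username "sqladmin" -Password "ReplaceMe123!" -InputFile ".\create.sql"

If the direct run succeeds, compare the server name and user the task actually resolves at runtime; note that hosted agents get a fresh IP on every run, so a firewall rule added in an earlier job may no longer match.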

Azure DevOps deploying web app to a server - how to also update database

We have a web app which, due to budgetary reasons, is running on a Windows VM (IIS) with its database also running in SQL Server on the same server.
We have a build pipeline set up in Azure DevOps which builds the web app and then creates an idempotent SQL migration file (we use Entity Framework); both the compiled app and the SQL file are copied into the build artefact.
We then have a release pipeline which deploys the web app into IIS on the server.
What I can't figure out is how to get the SQL file run against the database.
I have tried the "SQL Database Deploy" task, but that seems to only want a .dacpac file, or the path of a SQL file which is already on the server; I don't seem to be able to give it a file that exists on the build machine to execute remotely.
I know that because we are using EF I could just make the application run its migrations on startup, but that means the app needs to run as a user with schema privileges, which we don't really want; currently the app's login is only a data reader/writer.
Is there some mechanism by which I can take the SQL script from the build artefact and run it on the remote VM? If not, what are my options for getting the file onto the VM so that I can run it using the "SQL Database Deploy" task? I don't want to deploy it with the web app, as although it's a small risk, we don't want it lingering in a public folder.
Any help appreciated.
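One possible approach, sketched below under the assumption that WinRM is enabled on the VM and the SqlServer PowerShell module is installed there: copy the script out of the build artefact into a private folder on the VM over a remoting session, then execute it locally on the server with a deployment login that has schema rights, leaving the app's own login as a plain reader/writer. Machine, path and database names are placeholders:

    # Copy the migration script from the build artefact to the VM, then run it there
    $cred = Get-Credential                       # deployment account with WinRM access
    $session = New-PSSession -ComputerName "webvm01" -Credential $cred

    # C:\Deploy is assumed to already exist on the VM and to sit outside the web root
    Copy-Item -Path ".\drop\migrations.sql" -Destination "C:\Deploy\migrations.sql" -ToSession $session

    Invoke-Command -Session $session -ScriptBlock {
        # Runs against the SQL Server instance local to the VM
        Invoke-Sqlcmd -ServerInstance "localhost" -Database "AppDb" -InputFile "C:\Deploy\migrations.sql"
    }
    Remove-PSSession $session

Because the script lands in a private folder rather than the web root, this also addresses the concern about it lingering somewhere public.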

vs 2017 entity core 2 not using the right data directory

I'm experiencing an odd issue I've yet to see on any of my other machines. This is a fresh laptop, so I have installed VS 2017 and SQL Server 2017 Express, then created a quick sample project using one of the stock .NET Core templates (with authentication stored "in-app"). This, of course, creates some basic entity migrations and a DB context.
When I run it, I get access denied errors. So, of course, I checked the SQL service's default user, which is an admin. I then ran basic migration commands and received this error.
So, of course, my next step was to double check the default data locations of SQL Server, since it appears to be trying to store the database in the C:\Users root?! I have never had to touch this during install, but it was worth a look. And of course, they are, as I expected, in their default locations under C:\Program Files\ etc.
I'm rapidly running out of things to try at this point, and considering this is a fresh Windows 10 install with bare-bones VS 2017 and SQL Express 2017, it feels a lot like a bug. Everything is default, as if you were to File -> New Project -> .NET Core Web Application with Individual accounts.
Anyone have any thoughts or things worth trying? Why is it trying to store my DB in C:\Users? Connection string -
"DefaultConnection": "Server=(localdb)\\mssqllocaldb;Database=aspnet-WebApplication3-53bc9b9d-9d6a-45d4-8429-2a2761773502;Trusted_Connection=True;MultipleActiveResultSets=true"
Thanks!
-Marc
You are not using SQL Server Express.
The connection string says (localdb), which is the SQL Server engine running in user space. This is a big difference: usually SQL Server or SQL Server Express runs as a service. (localdb) is not an alias for localhost (the loopback address); it's a special name for a minified version of SQL Server which runs in user space.
When SQL Server runs as a service, it needs read/write permissions to the folder it writes to. This is usually NOT THE CASE when the file is located within the user folder.
LocalDB, on the other hand, is started whenever you start debugging your application, and runs with the permissions of the user. So opening the file will fail if it was created by an admin user, or if it sits in a directory you don't have write permissions to.
Also, when you attach a database to SQL Server (Express), the file is protected from write access by other applications, so LocalDB can't open it either.
LocalDB is made for development, to offer most of the SQL Server features without the hard setup and a permanently running service in the background.
Essentially you have two options:
Use the SQL Server connection string as @TanvirArjel suggested
Detach the database from SQL Server Express, copy it to your user folder (C:\Users\<myusername>\), and then correct the path to it (see the connection string sketch below)
Then it should just work.
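For option 2, the corrected path typically ends up in an AttachDbFilename clause in the connection string; an illustrative sketch only (the .mdf file name is hypothetical):

    "DefaultConnection": "Server=(localdb)\\mssqllocaldb;AttachDbFilename=C:\\Users\\<myusername>\\aspnet-WebApplication3.mdf;Trusted_Connection=True;MultipleActiveResultSets=true"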
Note that LocalDB is not meant to run in production, so you will likely experience issues when trying to run it under IIS (IIS Express, console applications and WPF work fine).
The reason LocalDB doesn't work with IIS is that ASP.NET (Core) applications within IIS run as a special user, while LocalDB instances are always created in the user's profile folder. The accounts used by IIS don't have a profile, so they can't create the database, and they can't access any database outside their own profile either (since LocalDB files are stored in the owner's user folder, only the user who created them has access).
Here is some background on it and the reasons behind it:
Using LocalDb with IIS
Write the connection string as follows. Hope it will work:
"DefaultConnection": "Server=YourPcName\\SQLExpressInstanceName;Database=aspnet-WebApplication3-53bc9b9d-9d6a-45d4-8429-2a2761773502;Trusted_Connection=True;MultipleActiveResultSets=true"

Visual Studio web and SQL publish

Here's the scenario:
MVC web project
Three MS SQL Server database projects
One of the databases must be populated with lookup tables
Other tables are user data and don't need data uploading
GoDaddy hosting
Visual Studio 2013
I'd like to deploy everything (web project, SQL schemas, reference data) to GoDaddy in one fell swoop, but they appear to only offer FTP uploading. When using FTP in the Web Publishing Wizard, it says "Database preview not supported for this method", where I'm taking "method" to mean FTP. I can publish the web project fine via FTP, but of course without the databases the web application generates errors.
So here are my questions:
There is a "Web Deploy" publish method listed in the wizard, but GoDaddy has no information on how to set it up. Can this be used with GoDaddy, and will it publish the DBs also?
How does one configure the project to use the local SQL Server when running on localhost, but the GoDaddy SQL Servers when deployed?
Can the data in the local DB be uploaded as part of the publishing wizard process, or is SQL Server Management Studio the tool of choice?
Thanks!
I don't believe GoDaddy supports Web Deploy. They didn't when I left their service a few years ago; you can talk to them to confirm whether this has changed.
This is the role of Web.config transforms. For an intro to the topic, see here; the article is a little out of date and doesn't mention one of the most useful points: you can add transforms for each publish profile, so they're applied according to your publish settings.
You probably can't upload the local DB file. In almost every hosting situation, the SQL server and the web server are two separate machines and don't share any files (corollary: the web server doesn't have the SQL service installed). One workaround you can try is to publish the DB directly from your own machine: if you can connect to the DB from your machine, you can do a Web Deploy publish from your own machine that sends the SQL changes to your GoDaddy DB server.
A more advanced workaround for #3:
Set up your FTP publishing settings for your files
Figure out how to publish your DB through Web Deploy alone, from the command line (you can refer to here for a sample using Web Deploy from the command line; note it goes from GoDaddy -> local, but it's trivial to turn it around; see the sketch after this list)
Customize the web publish pipeline to insert an MSBuild target that executes your Web Deploy command line (see here for an example of modifying the pipeline; you can add the target directly in your .pubxml file if you're not intending to use it for multiple projects).
This will give you a single publish profile which will separately publish your files (via FTP) and your DB (via WebDeploy).
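For the command-line piece in step 2, here is a rough sketch of syncing a local database to the remote one with Web Deploy's dbDacFx provider (the install path, host names and credentials are all placeholders; check your Web Deploy version and the connection details GoDaddy gives you):

    # Push local schema changes to the hosted DB via Web Deploy's dbDacFx provider
    & "C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe" -verb:sync `
        '-source:dbDacFx="Data Source=(localdb)\mssqllocaldb;Initial Catalog=MyDb;Integrated Security=True"' `
        '-dest:dbDacFx="Data Source=my-godaddy-db-host;Initial Catalog=MyDb;User Id=dbuser;Password=ReplaceMe"'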

Migrating dotnetnuke from development to test server

I am a newbie with DotNetNuke and have been stumbling over how to deploy from the development server to the deployment server. For starters, my development and deployment servers are one and the same machine. Here are the steps that I did:
DNN Setup
Downloaded DNN using WebMatrix.
Launched DNN and proceeded with the installation wizard, which basically just tests the environment and then creates the DNN database.
After the wizard completes, launching DNN proceeds to the Getting Started page
Added "localhost/dnn" to the site alias list
Moving to IIS
In IIS I added an application (folder) DNN under the web root
I copied all the files from the original WebMatrix path to the dnn folder in C:\inetpub\wwwroot, making sure that the file/folder hierarchies are the same
Result:
When launching DNN using my browser I am directed to the installation wizard page instead of the Getting Started page. What am I missing?
Thanks!
Confirm that the permissions on the folder containing DNN are the same on your test server as they are on your development server. (I give Network Service read/write and IUSR read/execute.)
Confirm that the application pool running your application has the proper identity (Network Service is suggested) and is running the proper .NET Framework version; based upon your question, I think you are set on this.
Gain access to your web.config file. You will see a ConnectionStrings section. You probably need to update the connection strings.
If your test server runs off of a different database than your dev server, figure out the connection string of your test server and update your connection string accordingly
You may be able to restore a .BAK file of your DB to your test server
If you do this, you will probably need to edit (in SSMS) your PortalAlias table to include the host name that you are using in your test server environment. For example: maybe you access the site via localhost/ in your dev environment, but you access the test site via test.Ronald.com? Then test.Ronald.com would be your PortalAlias (see the sketch after this list).
If your test server runs off the same database server as your dev server, it sounds like you need to open up access in firewalls so that your test server has connectivity to your dev database
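As a rough sketch of the PortalAlias edit in the restore scenario above, assuming the SqlServer PowerShell module is installed (server, database and alias are placeholders, and the PortalAlias column list varies between DNN versions, so inspect the table first):

    # Register the test host name as an alias for portal 0
    Invoke-Sqlcmd -ServerInstance "localhost" -Database "DNN" `
        -Query "INSERT INTO PortalAlias (PortalID, HTTPAlias) VALUES (0, 'test.Ronald.com');"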
A word of advice:
Once you get it running, you will be making changes to each database separately (assuming your test site and dev site use different DB servers). This sync issue can be a royal pain with DNN, as your page structures, module assignments, HTML module contents, and installed modules/extensions will get out of sync. While restoring backups is nice, it is not a very good long-term solution; I recommend database-syncing tools.
The problem is most likely that DNN cannot find the database. If it can't find the database, it will run the wizard in order to create one.
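To check that quickly, it may help to probe the database using the same values as the SiteSqlServer connection string in web.config; a minimal sketch, assuming the SqlServer PowerShell module (instance and database names are placeholders):

    # If this fails, DNN's connection string is pointing at the wrong place
    Invoke-Sqlcmd -ServerInstance "localhost\SQLEXPRESS" -Database "DNN" -Query "SELECT COUNT(*) FROM PortalAlias;"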
