Azure DevOps Pipeline Azure SQL Deploy won't work with SQL script (Failed to login)

I've been trying to deploy a database via a pipeline from Azure DevOps to an Azure resource group.
I have an ARM template for my database server in my repo, along with my DACPAC file.
In my release pipeline I first deploy the database server, with an admin user defined in the ARM template, to my resource group.
Then I use the "Azure SQL DacpacTask" to deploy the database schema. Here I give the admin credentials and it works flawlessly.
The issue is that the customer doesn't want it deployed as a DACPAC, but rather as an SQL script. They've given me a "CREATE To..." script from their database, generated inside SQL Server Management Studio (SSMS), which is also in my repo.
The "Azure SQL DacpacTask" in my release pipeline has an option for using an SQL file instead of a DACPAC, but it doesn't work for me.
No matter how I do it, my pipeline fails when running the "Azure SQL DacpacTask", and all I get, even in debug mode, is "Login failed for user '***'".
I can connect to the SQL server through SSMS on my local machine using the admin credentials defined in my ARM template.
I've tried adding the agent's IP address to the server's firewall before running the SQL script, but with no success.
Can anyone point me in the right direction or tell me what I'm doing wrong? Why does it keep failing to log in?
They want it deployed with little to no human interaction. Is this doable purely through the Azure DevOps pipeline?
EDIT
Additional info:
I have tried with a hardcoded user and password, with no luck.
If I manually create a database and then try to deploy one via the script and pipeline, it fails because a database already exists, not because of a failed login.
I've set up auditing in Azure, which generates two files with little to no info; I'm not sure what I'm looking at.
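For reference, the script-driven equivalent of what I expect the task to be doing looks roughly like this (a minimal sketch, assuming the Az and SqlServer PowerShell modules are on the agent; $resourceGroup, $serverName, $adminUser, $adminPassword and $scriptPath stand in for my pipeline variables):

```powershell
# Minimal sketch (my assumption of what the task does under the hood):
# open the server firewall for the agent, run the script, close it again.
$agentIp = Invoke-RestMethod -Uri 'https://api.ipify.org'

New-AzSqlServerFirewallRule -ResourceGroupName $resourceGroup -ServerName $serverName `
    -FirewallRuleName 'ado-agent' -StartIpAddress $agentIp -EndIpAddress $agentIp

# Invoke-Sqlcmd comes from the SqlServer module.
Invoke-Sqlcmd -InputFile $scriptPath `
    -ServerInstance "$serverName.database.windows.net" -Database 'master' `
    -Username $adminUser -Password $adminPassword

Remove-AzSqlServerFirewallRule -ResourceGroupName $resourceGroup -ServerName $serverName `
    -FirewallRuleName 'ado-agent'
```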

Related

Azure Devops deploying web app to a server - how to also update database

We have a web app which, due to budgetary reasons, is running on a Windows VM (IIS) with its database also running in SQL Server on the same server.
We have a build pipeline set up in Azure DevOps which builds the web app and then creates an idempotent SQL migration file (we use Entity Framework); both the compiled app and the SQL file are copied into the build artefact.
We then have a release pipeline which deploys the web app into IIS on the server.
What I can't figure out is how to get the SQL file run against the database.
I have tried the "SQL Database Deploy" task, but that seems to only want a .dacpac file, or the path of a SQL file which is already on the server - I don't seem to be able to give it a file that exists on the build machine to execute remotely.
I know that because we are using EF I could just make the application run its migrations on startup, but that means the app needs to run as a user with schema privileges, which we don't really want; currently the app is only a data reader/writer.
Is there some mechanism by which I can take the SQL script from the build artefact and run it on the remote VM? If not, what are my options for getting the file onto the VM so that I can run it using the "SQL Database Deploy" task? I don't want to deploy it with the web app: although it's a small risk, we don't want it lingering in a public folder.
Any help appreciated.
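For illustration, the kind of step I'm imagining would look something like this (a minimal sketch, assuming the VM can be registered in a deployment group so the artefact is downloaded onto the VM itself; 'drop', 'migrations.sql' and 'MyAppDb' are placeholders for our layout):

```powershell
# Minimal sketch for a PowerShell step in a deployment group job running ON the VM.
$sqlFile = Join-Path $env:SYSTEM_DEFAULTWORKINGDIRECTORY 'drop\migrations.sql'

# Runs under the agent's service account, so only that account needs schema
# rights; the web app itself can stay a plain data reader/writer.
Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'MyAppDb' -InputFile $sqlFile
```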

PowerBI - refresh from SQL Job (SSIS, PowerShell, C#?)

I need to figure out how to schedule a refresh of a Power BI model just after the SSIS packages (which load data into the underlying database) have completed successfully.
I cannot put the credentials of an account that would refresh the Power BI model published in the workspace into the script.
The SSIS packages run in the context of a proxy account whose credentials are already created on the SQL Server instance.
I am wondering if there is a smart way to execute the Power BI refresh as the very last step of the SSIS packages?
The other option would be to execute a PowerShell script, but if I do that as a SQL job step, the context of that script would be the SQL Server Agent account (correct me if I am wrong). So I would have to either put the credentials in that PowerShell script (which I cannot do, of course) or somehow read them from the proxy account?
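For context, the service-principal route I'm considering would look roughly like this (a minimal sketch; $tenantId, $clientId, $clientSecret, $workspaceId and $datasetId are placeholders, and the secret would have to live in the proxy credential or a vault rather than in the script itself):

```powershell
# Minimal sketch: refresh a Power BI dataset with a service principal (app-only token).
$body = @{
    grant_type    = 'client_credentials'
    client_id     = $clientId
    client_secret = $clientSecret
    scope         = 'https://analysis.windows.net/powerbi/api/.default'
}
$token = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" -Body $body).access_token

# The refresh endpoint returns 202 Accepted and runs asynchronously.
Invoke-RestMethod -Method Post `
    -Uri "https://api.powerbi.com/v1.0/myorg/groups/$workspaceId/datasets/$datasetId/refreshes" `
    -Headers @{ Authorization = "Bearer $token" }
```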

Continuous deployments for on-prem databases with Azure DevOps

Everyone, I'm looking for a way to deploy updates to our on-prem databases using Azure DevOps, and I'm running into a roadblock on the release definition. I have my DACPAC ready to go, but I'm not sure how to get it over to my on-prem server.
I see the WinRM - SQL Server DB Deployment task, but I'm not sure how to set that up. I have seen a couple of videos that use SQL Server Database Deploy as an option, but it looks like that task has been deprecated, so it looks like I will need to use the WinRM - SQL task. Could anyone point me in the right direction on how to set this task up to use my local SQL Server, or to a tutorial that would help get me started?
You will also have to install a release agent on the target server where you will be deploying the database, assign it to a Deployment Group, create your release pipeline template and then run a release. I wrote a blog post about how to deploy a database to an on-prem SQL Server by leveraging Azure DevOps: https://jpvelasco.com/deploying-a-sql-server-database-onto-an-on-prem-server-using-azure-devops/
Hope this helps.
If you already created a deployment group, within your pipeline:
Click to add a new stage (screenshot: https://i.stack.imgur.com/vc5TI.png).
On the right side (the SELECT A TEMPLATE screen), type SQL in the search box.
Select "IIS website and SQL database deployment"; this will add a stage with two tasks: IIS deployment and SQL DB Deploy.
Delete the IIS deployment task.
Configure the SQL DB Deploy task - it does not say it is deprecated.
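If the built-in task gives you trouble, a scripted fallback is to call SqlPackage.exe directly from a PowerShell step on the deployment group agent (a minimal sketch; the SqlPackage path, 'drop\MyDatabase.dacpac' and the database name are assumptions about your layout):

```powershell
# Minimal sketch: publish the DACPAC to the local SQL Server from the agent.
$sqlPackage = 'C:\Program Files\Microsoft SQL Server\150\DAC\bin\SqlPackage.exe'
$dacpac = Join-Path $env:SYSTEM_DEFAULTWORKINGDIRECTORY 'drop\MyDatabase.dacpac'

& $sqlPackage /Action:Publish `
    "/SourceFile:$dacpac" `
    "/TargetServerName:localhost" `
    "/TargetDatabaseName:MyDatabase"
```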

SQL Project in VSTS Continuous Integration via WinRM

Here is my solution; I have a DB project in it, with multiple publish profiles. When I load any profile and run publish manually from Visual Studio, it works perfectly.
We have our IIS on one Azure VM and SQL on another Azure VM.
We have continuous builds set up on Visual Studio Team Services, which deploy the site to the Azure VM.
To the same build definition, I added a WinRM - SQL Server DB Deployment task.
Here is my build definition for the dacpac step.
When I run my build I get the following error:
Microsoft.PowerShell.Commands.WriteErrorException: Deployment on one or more machines failed. System.Exception: Invalid Publish Profile [ xxxxxxx ] provided
I can't really find any references on how to provide the profile information or why it is failing. Any help would be greatly appreciated.
Update
In order to resolve this, as per Jason Ye - MSFT's suggestion, I copied my publish profile to the Azure SQL VM, copied its full path, and updated the profile path in the build definition.
Then I got stuck with the following issue:
Microsoft.PowerShell.Commands.WriteErrorException: Deployment on one or more machines failed. System.Exception: Publishing to database 'DbName' on server 'xxxxxx.eastus.cloudapp.azure.com'. * Could not load package from '/*.dacpac'. Illegal characters in path.
Since the dacpac will keep changing, I cannot copy it to the server beforehand. How do I get the path?
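One workaround I'm considering is resolving the dacpac path at deploy time instead of hard-coding it, roughly like this (a minimal sketch; "DacpacPath" is a variable name I made up):

```powershell
# Minimal sketch: locate the newest .dacpac in the downloaded artifact
# and hand its full path to later tasks via a pipeline variable.
$dacpac = Get-ChildItem -Path $env:SYSTEM_DEFAULTWORKINGDIRECTORY -Recurse -Filter '*.dacpac' |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1

# Reference it as $(DacpacPath) in the WinRM task's package path.
Write-Host "##vso[task.setvariable variable=DacpacPath]$($dacpac.FullName)"
```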

SQL Server 2012 Integration Services fails when connecting through SSMS

I recently installed SQL Server 2012, mostly using the default settings. The database works fine and I can happily connect using SSMS (SQL Server Management Studio), but when I connect to the Integration Services server I get this message:
Connecting to the Integration Services service on the computer "localhost" failed with the following error: "Access is denied." By default, only administrators have access to the Integration Services service. On Windows Vista and later, the process must be running with administrative privileges in order to connect to the Integration Services service. See the help topic for information on how to configure access to the service.
I am not sure why, since I am the domain admin and have full rights over the server. Also, when I connect from my desktop it connects successfully; it is only when connecting from the server itself that I get this issue. How do I fix this so that SSMS on the server can connect to its Integration Services instance?
As I understand it, User Account Control (UAC) can basically intercept requests for your group membership, so in this case it appears it was preventing your membership from being passed to SQL Server.
Others have noted in their comments that you may still need to right-click and run SSMS as an administrator.
As noted by an astute observer, "This is a quick fix, not a real solution. People shouldn't just be running stuff as administrator. These security walls are in place for a reason" - and I agree. UAC is designed to get Windows users into a principle-of-least-privilege mindset: only escalate to a powerful account when required. The issue is that SSMS is known not to play well with UAC. As I see it, this leaves you with three options:
You can turn off UAC and get your work done
Leave UAC on and tell your boss you are unable to work
Write your own query tool that is not affected by UAC
Go to All Programs, click on the Microsoft SQL Server 2012 folder, right-click SQL Server Management Studio, and click Run as Administrator.
This should take care of the problem for now (though you will need to repeat the same process every time). To avoid this and for a more persistent solution, you need to grant the permission(s). Follow the process below and you should be good.
In previous versions of SQL Server, by default when you installed SQL Server all users in the Users group had access to the Integration Services service. When you install the current release of SQL Server, users do not have access to the Integration Services service. The service is secure by default. After SQL Server is installed, the administrator must grant access to the service.
To grant access to the Integration Services service:
1. Run Dcomcnfg.exe. Dcomcnfg.exe provides a user interface for modifying certain settings in the registry.
2. In the Component Services dialog, expand the Component Services > Computers > My Computer > DCOM Config node.
3. Right-click Microsoft SQL Server Integration Services 11.0, and then click Properties.
4. On the Security tab, click Edit in the Launch and Activation Permissions area.
5. Add users and assign appropriate permissions, and then click OK.
6. Repeat steps 4 - 5 for Access Permissions.
7. Restart SQL Server Management Studio.
8. Restart the Integration Services service.
(Source: MSDN)
I hope this will help
Right-click SQL Server Management Studio, select Run as Administrator, and try to connect, if it is installed on the local instance.
You should check which user the SSIS service is running under. Go to Start > Run, type "services.msc", and scroll down to the SQL Server Integration Services 11.0 entry. Right-click and check the properties to find out which user it runs under; the second tab should be the Log On tab. Since you're just running on a local instance, you can set your own user as the log-on account and SSIS will have the same permissions that you do.
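If you'd rather check from a console than services.msc, something like this should show the same information (a sketch; MsDtsServer110 is the SSIS 2012 service name, adjust for other versions):

```powershell
# Show which account the SSIS 2012 service runs under.
Get-CimInstance Win32_Service -Filter "Name = 'MsDtsServer110'" |
    Select-Object Name, StartName, State
```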
I lost a day of work on that problem. My package has a .NET script task to copy a file from a shared network folder to a local folder, and I was stuck with an "access denied" exception every time I tried to execute the package from the server (through SSMS). The package works fine when run locally.
I tried many things picked up here and there, and at the end of the day what worked was to create a job (owned by sa) which executes the package as SSISExecutor.
I have to mention that the file on the network has read access for everyone, and that I still don't understand what was wrong.
