Getting 'Invalid object name' in stored procedure using Azure release pipeline - sql-server

I'm trying to deploy database changes to a Windows server using an Azure release pipeline. I'm using a DACPAC file with table data included.
This is my code to generate the DACPAC which I run in a build pipeline:
sqlpackage `
/Action:Extract `
/SourceConnectionString:"Data Source=localhost;Initial Catalog=Shipping;Integrated Security=False;Persist Security Info=True;User ID=username;Password=password" `
/TargetFile:"$PublishDir/Shipping.dacpac" `
/p:IgnoreUserLoginMappings=True `
/p:VerifyExtraction=False `
/p:ExtractAllTableData=True
In the release pipeline I have a deployment group job that contains the following steps:
Disable triggers
Disable constraints
Deploy DACPAC
Enable triggers
Enable constraints
I use the SQL Server Deployment task for the Deploy DACPAC step. Every time it gets to that step this error is thrown:
*** Could not deploy package.
Error SQL72014: Framework Microsoft SqlClient Data Provider: Msg 208, Level 16, State 1, Procedure vGetTransportPackages, Line 31 Invalid object name 'FreightDataStore.TP.TransportPackage'.
Error SQL72045: Script execution error. The executed script:
CREATE VIEW [fds].[vGetTransportPackages] AS WITH PACKAGES AS (SELECT DISTINCT tp.transportPackageID AS TransportPackageID, tp.tpID AS CarTransportPackageNumber, tp.airWayBillNo AS AirwayBillNumber, tp.pod AS DestinationPort,
(SELECT TOP 1 tl.LocationName FROM fds.TransportLocation AS tl WHERE tl.UnLocCode = tp.pod) AS DestinationPortName, pod.countryCode AS DestinatonPortCountry, (SELECT TOP 1 c.CountryName FROM ref.Countries AS c WHERE c.CountryCode = pod.countryCode) AS DestinationPortCountryName, tp.poe AS OriginPort, (SELECT TOP 1 tl.LocationName FROM fds.TransportLocation AS tl WHERE tl.UnLocCode = tp.poe
I'm relatively new to database deployment. Do I need to change the way I generate the DACPAC?

I figured out the issue. I didn't install the other databases on the server that my Shipping database depends on. Once I installed FreightDataStore and a few other databases I was able to deploy the DACPAC successfully.
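If you want missing dependencies to surface before the DACPAC deploy runs, a small pre-check script could run as an extra step in the deployment group job. This is only a minimal sketch; FreightDataStore is the dependency named in the error above, and any other databases the Shipping database references would need to be listed too:

-- Fail fast if a database referenced by the Shipping DACPAC is not present on the target server.
-- FreightDataStore comes from the error above; add any other referenced databases as well.
IF DB_ID(N'FreightDataStore') IS NULL
    RAISERROR (N'Dependency database FreightDataStore is missing; deploy it before Shipping.', 16, 1);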

Related

Dacpac deployment on Azure Sql server: "Cannot perform delete operation because following scope(s) are locked"

I am trying to deploy a .dacpac to my newly created Azure SQL database.
Both the server and the database have a lock on them.
I am deploying my dacpac through an Azure DevOps pipeline using the SqlAzureDacpacDeployment task.
Everything seems in place, except that when it tries to deploy it gives me this error:
##[error]*** An error occurred during deployment plan generation. Deployment cannot continue.
##[error]A project which specifies SQL Server 2016 as the target platform cannot be published to Microsoft Azure SQL Database v12.
##[error]The scope '/subscriptions//resourceGroups//providers/Microsoft.Sql/servers//firewallRules/' cannot perform delete operation because following scope(s) are locked: '/subscriptions//resourceGroups//providers/Microsoft.Sql/servers/*****'. Please remove the lock and try again.
My question is: why is my Azure Pipelines deployment step trying to delete the database? My dacpac only creates tables so far, and even in the remote case that my dacpac were removing tables or anything else, the lock is at the server and database level, and it feels like the deployment step wants to delete the database itself. What am I missing?
The step in my yaml deployment is:
- task: SqlAzureDacpacDeployment@1
  inputs:
    azureSubscription: '****'
    AuthenticationType: 'server'
    ServerName: '****.database.windows.net'
    DatabaseName: '****'
    SqlUsername: '****'
    SqlPassword: '****'
    deployType: 'DacpacTask'
    DeploymentAction: 'Publish'
    DacpacFile: '****.Db.dacpac'
    IpDetectionMethod: 'AutoDetect'
The Azure Pipelines deployment step doesn't try to delete the database. The SqlAzureDacpacDeployment task has a parameter, Delete Rule After Task Ends: "If selected, then after the task ends, the IP addresses specified here are deleted from the 'Allowed IP Addresses' list of the Azure SQL Server's firewall." This parameter is enabled by default, so what the resource lock is blocking is that firewall-rule delete, not a deletion of the database.
This task is open source on GitHub; you can check it at the link below:
https://github.com/microsoft/azure-pipelines-tasks/tree/master/Tasks/SqlAzureDacpacDeploymentV1

Error in Azure DevOps SQL Server database deploy task not failing pipeline

I'm using the Azure DevOps SQL Server database deploy task to run a SQL Server Agent job via its inline SQL feature, which works.
However, if the job fails, I have some logic that raises an error from the SQL script, as shown below. When the script errors, the error is correctly displayed in the task logs, but it doesn't fail the task; it returns a succeeded status instead. This confuses developers, who think their changes are not breaking the job when in fact they are.
Is there any way to bubble up the error generated from the SQL script so that the SQL Server database deploy task returns a failed status instead of a succeeded one?
Please see the YAML of the release step below:
steps:
- task: SqlDacpacDeploymentOnMachineGroup@0
  displayName: 'Run SQL Agent Job'
  inputs:
    TaskType: sqlInline
    InlineSql: |
      /* SQL Agent Job Logic Here... */
      /* Raise error if job fails */
      RAISERROR('##vso[task.logissue type=error]]%s job has either failed or is not in a good state!', 16, 1, 'JobName') WITH LOG
    DatabaseName: master
    AdditionalArgumentsSql: '-QueryTimeout 1800 -IncludeSqlUserErrors -OutputSqlErrors 1 -Verbose'
You could try to use the Logging Commands to fail the build from your inline SQL script:
##vso[task.complete result=Failed;]DONE
The command format is ##vso[task.complete]current operation with result=Succeeded|SucceededWithIssues|Failed.
It finishes the timeline record for the current task and sets the task result and current operation. When result is not provided, the result is set to Succeeded. Example:
##vso[task.complete result=Succeeded;]DONE
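Applied to the inline SQL above, that means emitting the logging command from the error branch so it reaches the task's output stream. A minimal sketch, assuming the task's verbose output (the -Verbose and -OutputSqlErrors arguments) is where the agent picks up logging commands; @jobFailed is a placeholder for whatever check the script already performs:

-- Hypothetical error branch: @jobFailed stands in for the script's existing job-status check.
DECLARE @jobFailed bit = 1;
IF @jobFailed = 1
BEGIN
    -- PRINT pushes the logging commands into the task output so the agent can act on them.
    PRINT '##vso[task.logissue type=error]JobName has either failed or is not in a good state!';
    PRINT '##vso[task.complete result=Failed;]DONE';
END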

ELASTIC_POOL not recognised by Visual Studio Database Project

I am creating a Database project for a SQL Database in Azure.
The Target Platform is Microsoft Azure SQL Database V12 which I have set in the project settings.
I then have a post-deployment script containing the following code to add the database to an elastic pool:
ALTER DATABASE [$(DatabaseName)] MODIFY ( SERVICE_OBJECTIVE = ELASTIC_POOL ( name = mypoolname ) )
Ref: https://learn.microsoft.com/en-us/sql/t-sql/statements/alter-database-azure-sql-database?view=azuresqldb-current#b-moving-a-database-to-a-different-elastic-pool
Because my post-deployment script is set to build (as it has to be, being a post-deployment script), I get the following error:
Error: SQL72007: The syntax check failed 'Incorrect syntax near
ELASTIC_POOL.' in the batch near:
But I know the syntax is OK; it is obviously not recognising the Azure T-SQL.
Is it possible to do this as part of my Visual Studio Database Project?
I'm not sure how you're actually sending the query over to the database server, but in order to bypass a preliminary client-side syntax check you could simply use:
exec sp_executesql N'ALTER DATABASE [$(DatabaseName)] MODIFY ( SERVICE_OBJECTIVE = ELASTIC_POOL ( name = mypoolname ) )'
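If this lives in a post-deployment script, it may also be worth guarding it so the statement only runs against Azure SQL Database. A minimal sketch, assuming you key off SERVERPROPERTY('EngineEdition') (5 identifies Azure SQL Database) and keep the pool name as a placeholder:

-- Only attempt the elastic pool move on Azure SQL Database; the dynamic SQL keeps the
-- ELASTIC_POOL syntax away from the build-time parser. mypoolname is a placeholder.
IF CAST(SERVERPROPERTY('EngineEdition') AS int) = 5
BEGIN
    EXEC sp_executesql N'ALTER DATABASE [$(DatabaseName)] MODIFY ( SERVICE_OBJECTIVE = ELASTIC_POOL ( name = mypoolname ) )';
END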

Azure Import Error: The internal target platform type SqlAzureV12DatabaseSchemaProvider does not support schema file version '3.1'

For some reason I cannot import new BACPACs from Azure. I can still import old ones.
This is the error message I get:
Internal Error. The internal target platform type SqlAzureV12DatabaseSchemaProvider does not support schema file version '3.1'.
I've tried this solution, but it didn't help, because all my settings are already set to the defaults.
I also downloaded the latest SSMS Preview, but on import it gives me other errors:
Warning SQL0: A project which specifies Microsoft Azure SQL Database v12 as the target platform may experience compatibility issues with SQL Server 2014.
Error SQL72014: .Net SqlClient Data Provider: Msg 102, Level 15, State 1, Line 1 Incorrect syntax near 'Admin'.
Error SQL72045: Script execution error. The executed script:
CREATE DATABASE SCOPED CREDENTIAL [Admin]
WITH IDENTITY = N'Admin';
Error SQL72014: .Net SqlClient Data Provider: Msg 319, Level 15, State 1, Line 2 Incorrect syntax near the keyword 'with'. If this statement is a common table expression, an xmlnamespaces clause or a change tracking context clause, the previous statement must be terminated with a semicolon.
Error SQL72045: Script execution error. The executed script:
CREATE DATABASE SCOPED CREDENTIAL [Admin]
WITH IDENTITY = N'Admin';
I have SSMS 2014 CU6 installed.
Any help would be much appreciated! Thank you!
Finally figured out what happened. It's a specific case, but maybe it helps someone else.
We tried to use elastic query to write queries across databases. To do that you need to create database scoped credentials. When the package was imported, it tried to do the same locally and failed executing this:
CREATE DATABASE SCOPED CREDENTIAL [Admin]
WITH IDENTITY = N'Admin';
Since we decided to use a different approach, I dropped the scoped credentials and the external data source (the credentials can't be dropped without dropping the data source first):
DROP EXTERNAL DATA SOURCE Source
DROP DATABASE SCOPED CREDENTIAL Admin
Now everything is working again. Just be aware that you cannot import a database from Azure if it has scoped credentials defined.
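If you are not sure whether a database uses these features before exporting it, a quick look at the catalog views will tell you. A minimal sketch:

-- List database scoped credentials and external data sources that would block a local import.
SELECT name, credential_identity FROM sys.database_scoped_credentials;
SELECT name, type_desc, location FROM sys.external_data_sources;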
Make sure that you are using the new SQL Server Management Studio
https://msdn.microsoft.com/en-us/library/mt238290.aspx

How to check the SSIS package job results after it has completed its execution?

I have an SSIS package which imports data into a SQL Server 2008 database. I have set up a scheduled job in SQL Server Agent to run that package. When I check the history, I can only see whether the job ran successfully or not; I cannot see any other messages.
I would like to know how many records are imported whenever the job is executed. How can I monitor that? Should I use additional components in the SSIS package or change some settings in the SQL Server Agent job setup?
I found some logging facilities in the SQL Server Agent job setup, but I am not sure whether they can fulfill my requirements.
If you are just interested in knowing the rows being processed and do not need the information for further use, one possible option is to make use of the SSIS logging feature. Here is how it works for data flow tasks.
Click on the SSIS package.
On the menus, select SSIS --> Logging...
In the Configure SSIS Logs dialog, select the provider type and click Add. I have chosen SQL Server for this example. Check the Name checkbox and provide the data source under the Configuration column; here SQLServer is the name of the connection manager. SSIS will create a table named dbo.sysssislog and a stored procedure dbo.sp_ssis_addlogentry in the database that you selected.
If you need the rows processed, select the OnInformation checkbox. In this example the package executed successfully, so the log records were found under OnInformation. You may need to fine-tune this event selection according to your requirements.
After running the package, look at the log table dbo.sysssislog. I have only displayed the columns id and message; there are many other columns in the table. Filtering the output to the package named 'Package1' and the event 'OnInformation', the records containing the rows processed are easy to spot.
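For reference, the query looks roughly like this (a minimal sketch, assuming logging was configured against the SQL Server provider so entries land in dbo.sysssislog, and 'Package1' is the example package name):

-- Rows-processed messages logged by the data flow task for the example package.
SELECT id, message
FROM dbo.sysssislog
WHERE source = 'Package1'
  AND event = 'OnInformation'
ORDER BY id;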
Hope that helps.
Use the procedure below to get SSIS errors for a given execution ID:
CREATE PROCEDURE [dbo].[get_ssis_status] @EXECUTION_ID INT
AS
BEGIN
SELECT o.operation_id EXECUTION_ID
,convert(datetimeoffset,OM.message_time,109) TIME
,D.message_source_desc ERROR_SOURCE
,OM.message ERROR_MESSAGE
,CASE ex.STATUS
WHEN 4 THEN 'Package Failed'
WHEN 7 THEN CASE EM.message_type
WHEN 120 THEN 'package failed'
WHEN 130 THEN 'package failed' ELSE 'Package Succeeded' END
END AS STATUS
FROM SSISDB.CATALOG.operation_messages AS OM
INNER JOIN SSISDB.CATALOG.operations AS O ON O.operation_id = OM.operation_id
INNER JOIN SSISDB.CATALOG.executions AS EX ON o.operation_id = ex.execution_id
INNER JOIN (VALUES (- 1,'Unknown'),(120,'Error'),(110,'Warning'),(130,'TaskFailed')) EM(message_type, message_desc) ON EM.message_type = OM.message_type
INNER JOIN (VALUES
(10,'Entry APIs, such as T-SQL and CLR Stored procedures')
,(20,'External process used to run package (ISServerExec.exe)')
,(30,'Package-level objects')
,(40,'Control Flow tasks')
,(50,'Control Flow containers')
,(60,'Data Flow task')
) D(message_source_type, message_source_desc) ON D.message_source_type = OM.message_source_type
WHERE ex.execution_id = @EXECUTION_ID
AND OM.message_type IN (120,130,-1);
END
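A call would then look like this; the execution ID is a placeholder and comes from SSISDB.catalog.executions or the job history:

-- Example call; 12345 is a placeholder execution ID.
EXEC dbo.get_ssis_status @EXECUTION_ID = 12345;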
Here's another approach for when SQL Server job history is not showing output from SSIS packages: use DTEXEC command lines.
(Upside: this approach puts the job's output where anyone else supporting it would expect to find it: in job history.
Downside for big packages: if you have a long SSIS package, with lots of tasks or components, and lots of output, then the job history will split package output into many lines of job history, making the approach in the previous answer--logging to a table--easier to read.)
To show SSIS package output in the job's View History:
(1) Change the job steps from type "SQL Server Integration Services Package" to "Operating system (CmdExec)", and
(2) use DTEXEC command lines to execute the packages.
Example of command line:
DTExec /DTS "\MSDB\myPkgName" /DECRYPT pkgPass /MAXCONCURRENT " -1 " /CHECKPOINTING OFF
Note that if the SSIS package requires 32-BIT execution (true for exporting to Excel, for example), then use the DTEXEC utility in "Program Files (x86)" by fully qualifying it. Example, where the SQL Server application was installed on an "E:" drive, and where SQL Server 2014 is being used:
"E:\Program Files (x86)\Microsoft SQL Server\120\DTS\Binn\DTExec.exe" /DTS "\MSDB\myPkgName" /DECRYPT pkgPass /MAXCONCURRENT " -1 " /CHECKPOINTING OFF
If your SSIS packages are in the file system (as ".dtsx" files), then replace "/DTS" with "/FILE".
If your SSIS packages were placed in SSISDB (using the "project deployment model", which is available starting with SQL Server 2012, instead of the older "package deployment model"), then replace "/DTS" with "/ISSERVER".
Next, go into the job step's "Advanced" page, and make sure that the box is checked for "Include step output in history".
Lastly, consider the job step's "Run as" account: if your job steps were already set to run as a proxy for steps of type "SQL Server Integration Services Package", then that proxy is already active for the subsystem "SQL Server Integration Services Package". To run command lines like the above, check the proxy's properties and make sure it is also active for the subsystem "Operating system (CmdExec)".
MSDN reference: SSIS Output on Sql Agent history
If you have deployed the package to the Integration Services catalog (rather than loading it from the file system), you can easily get detailed reporting.
Open the catalog node in SQL Server Management Studio, right click the Package name, select Reports | Standard Reports | All Executions and see details about every step of the job and its subcomponents, including records imported.
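If you prefer a query over the built-in reports, the same catalog exposes row counts through its views. A minimal sketch, assuming the execution ran with a logging level that records data flow statistics (Performance or Verbose); the package name is a placeholder:

-- Rows sent per data flow path for recent executions of a package deployed to SSISDB.
SELECT e.execution_id,
       e.package_name,
       s.source_component_name,
       s.destination_component_name,
       SUM(s.rows_sent) AS rows_sent
FROM SSISDB.catalog.executions AS e
JOIN SSISDB.catalog.execution_data_statistics AS s
    ON s.execution_id = e.execution_id
WHERE e.package_name = 'Package1.dtsx'
GROUP BY e.execution_id, e.package_name, s.source_component_name, s.destination_component_name
ORDER BY e.execution_id DESC;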
