We use Web Deploy packages to create and deploy a SQL Server database. When running one, we encounter the following error on a specific stored procedure:
Error: An error occurred during execution of the database script. The
approximate location of the error was between lines '1481' and '1586'
of the script. The verbose log may have more information about the
error. The command started with:
CREATE PROCEDURE [dbo].[csp_MyCoolStoredProcedure
Error: Transaction (Process ID 53) was deadlocked on lock resources with another process
and has been chosen as the deadlock victim. Rerun the transaction.
Error count: 1.
When we run the database script from the package using SQL Server Management Studio, everything works fine, so there are no errors in the script itself. Does anyone have an idea what is going wrong? The verbose option doesn't give any better error message :(
Inside the project file, set transaction="false" for every SQL script element.
I'm having issues using Visual Studio 2019 to publish a database project to a target server where the database does not yet exist. During the publish process, the following error happens:
(46075,1): SQL72014: .Net SqlClient Data Provider: Msg 8623, Level 16, State 1, Line 14 The query processor ran out of internal resources and could not produce a query plan. This is a rare event and only expected for extremely complex queries or queries that reference a very large number of tables or partitions. Please simplify the query. If you believe you have received this message in error, contact Customer Support Services for more information.
(46062,0): SQL72045: Script execution error. The executed script:
The error does not appear to be related to specific SQL, despite what the error message suggests. If I comment out the script that generates the error, the error shows up in the next script in the sequence. Overall, the publish script produced by VS2019 is approximately 72k lines, and the error pops up after approximately 46k lines.
EDIT:
Server details:
SQL Server 2019 Developer Edition (15.0.2000.5), 4 processors, 16 GB memory
This was my system having the problem that Anthony posted about for me. We've now figured out the cause: the root cause was a post-deployment script that populates a table with initial values. The script uses a reasonably complex MERGE statement and tried to insert about 18,500 rows of data.
We were thrown off initially because the error output pointed to a different script in the set, not the one causing the problem. Evidently that was just whatever was in the error buffer when it burped.
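If you hit the same limit, splitting the post-deployment load into smaller statements keeps each query plan manageable. A hedged sketch (table and column names are placeholders, not from the original project):

```sql
-- Hypothetical replacement for one giant MERGE over ~18,500 VALUES rows:
-- load the rows in fixed-size batches so each statement stays small.
MERGE INTO dbo.RefData AS target
USING (VALUES
    (1, N'Alpha'),
    (2, N'Beta')
    -- ...at most a few hundred rows per statement...
) AS source (Id, Name)
    ON target.Id = source.Id
WHEN MATCHED THEN
    UPDATE SET Name = source.Name
WHEN NOT MATCHED THEN
    INSERT (Id, Name) VALUES (source.Id, source.Name);
GO
-- Repeat the MERGE with the next batch of rows...
```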
I'm using the Azure DevOps SQL Server database deploy task to run a SQL Server Agent job via its inline SQL feature, which works.
However, if the job fails, I have some logic that raises an error from the SQL script, as per below. When the script errors, the error is correctly displayed in the task logs, but it doesn't fail the whole task; instead the task returns a Succeeded status. This confuses developers, as they think their changes aren't breaking the job when in fact they are.
Is there any way to bubble up the error generated from the SQL script so that the SQL Server database deploy task returns a Failed status instead of Succeeded?
The YAML for the release is below:
steps:
- task: SqlDacpacDeploymentOnMachineGroup@0
  displayName: 'Run SQL Agent Job'
  inputs:
    TaskType: sqlInline
    InlineSql: |
      /* SQL Agent Job logic here... */
      /* Raise an error if the job fails */
      RAISERROR('##vso[task.logissue type=error]%s job has either failed or is not in a good state!', 16, 1, 'JobName') WITH LOG
    DatabaseName: master
    AdditionalArgumentsSql: '-QueryTimeout 1800 -IncludeSqlUserErrors -OutputSqlErrors 1 -Verbose'
You could try using the Logging Commands to fail the build from your inline SQL script:
##vso[task.complete result=Failed;]DONE
From the documentation:
##vso[task.complete]current operation
result = Succeeded | SucceededWithIssues | Failed
Finishes the timeline record for the current task and sets the task result and current operation. When no result is provided, the result is set to Succeeded. Example:
##vso[task.complete result=Succeeded;]DONE
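Putting the two together, the inline script can emit the logging command itself when the job is unhealthy. A sketch, assuming the agent parses `##vso[...]` commands from the task's output (the job-status check here is a placeholder, not real logic):

```sql
-- Placeholder: however you detect the failed/unhealthy Agent job,
-- set @JobIsHealthy accordingly.
DECLARE @JobIsHealthy bit = 0;

IF @JobIsHealthy = 0
BEGIN
    -- PRINT output goes to the task log, where the agent may pick up
    -- ##vso logging commands; RAISERROR's error stream may not be scanned.
    PRINT '##vso[task.logissue type=error]JobName job has either failed or is not in a good state!';
    PRINT '##vso[task.complete result=Failed;]DONE';
END
```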
When I try to deploy a SQL Server database using SqlPackage, I get a strange error.
The call is:
SqlPackage.exe /Action:Publish /SourceFile:myDb.dacpac /TargetServerName:myServer /TargetDatabaseName:myDb /OverwriteFiles:True /Diagnostics:True
The error is:
Microsoft.Data.Tools.Diagnostics.Tracer Error: 1 : Error detected when
reverse engineering the database. Severity:'Warning' Prefix:'' Error
Code:'0' Message:The permission 'EAES' was not recognized and was not
imported. If this problem persists, contact customer support.
Microsoft.Data.Tools.Diagnostics.Tracer Error: 1 : Error detected when
reverse engineering the database. Severity:'Warning' Prefix:'' Error
Code:'0' Message:The permission 'EAES' was not recognized and was not
imported. If this problem persists, contact customer support.
Microsoft.Data.Tools.Diagnostics.Tracer Error: 19 :
Microsoft.SqlServer.Dac.DacServicesException: An error occurred during
deployment plan generation. Deployment cannot continue. Error
SQL72018: Permission could not be imported but one or more of these
objects exist in your source.
---> Microsoft.Data.Tools.Schema.Sql.Deployment.DeploymentFailedException:
Errors occurred while modeling the target database. Deployment can
not continue.
at Microsoft.Data.Tools.Schema.Sql.Deployment.SqlDeployment.ThrowIfErrors(String
message, ErrorManager errors, Object category)
at Microsoft.Data.Tools.Schema.Sql.Deployment.SqlDeploymentEndpointServer.OnLoad(ErrorManager
errors, DeploymentEngineContext context)
at Microsoft.Data.Tools.Schema.Sql.Deployment.SqlDeployment.PrepareModels()
at Microsoft.Data.Tools.Schema.Sql.Deployment.SqlDeployment.InitializePlanGeneratator()
at Microsoft.Data.Tools.Schema.Sql.Deployment.SqlDeployment.CreateController(Action`1
msgHandler)
at Microsoft.SqlServer.Dac.DacServices.CreateController(SqlDeployment
deploymentEngine, ErrorManager errorManager)
--- End of inner exception stack trace ---
Initializing deployment (Failed)
Microsoft.Data.Tools.Diagnostics.Tracer Information: 0 : SqlPackage completed
When I do a schema compare in Visual Studio Data Tools, only one column is changed, but no permissions. When I use SSMS to look for that permission 'EAES', I cannot find it.
This setup worked for a while without problems.
What is that permission 'EAES', and how can I make the SqlPackage execution succeed again?
This is a bug in Visual Studio. See https://feedback.azure.com/forums/908035-sql-server/suggestions/32896864-grant-execute-any-external-script-not-recognised-b
It will work again if you revoke the permission to execute R scripts in SQL Server:
REVOKE EXECUTE ANY EXTERNAL SCRIPT TO [myUser]
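If you're unsure which principals hold that permission, a query along these lines should list them (a sketch; on my understanding, GRANT EXECUTE ANY EXTERNAL SCRIPT shows up under that permission name in the catalog views):

```sql
-- List database principals granted the external-script permission
SELECT pr.name AS principal_name,
       pe.permission_name,
       pe.state_desc
FROM sys.database_permissions AS pe
JOIN sys.database_principals AS pr
    ON pe.grantee_principal_id = pr.principal_id
WHERE pe.permission_name = N'EXECUTE ANY EXTERNAL SCRIPT';
```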
Try updating your SqlPackage.exe; my problem was solved when I did this.
I got this problem when trying to publish to a SQL Server 2019 database. I downloaded the latest version from https://learn.microsoft.com/pt-br/sql/tools/sqlpackage/sqlpackage-download?view=sql-server-ver15 and it worked like a charm.
I occasionally get this error message on a Tridion 2011 SP1 development machine:
There is insufficient system memory in resource pool 'internal' to run this query.
Unable to save Schema (tcm:0-0-0). A database error occurred while executing Stored Procedure "EDA_ORG_ITEMS_FINDUNIQUENESSCONFLICTS".
Searching for the error message here on StackOverflow suggests rewriting the stored procedure to not use temporary tables. Without doing that, does anyone know how to get rid of this error message?
Although it is hard to be certain that the upgrade fixed it, the problem hasn't occurred anymore since upgrading SQL Server 2008 R2 to SP1.
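If it recurs, a memory-clerk query can show what inside the server is consuming the memory behind the 'internal' pool. A sketch using the SQL Server 2008 R2 column names (later versions replaced these columns with `pages_kb`):

```sql
-- Top memory consumers by clerk type (2008 R2 column names)
SELECT TOP (10)
       type,
       SUM(single_pages_kb + multi_pages_kb) AS total_kb
FROM sys.dm_os_memory_clerks
GROUP BY type
ORDER BY total_kb DESC;
```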
I implemented an SSIS package which moves data from one SQL Server database to another. The package has a set of Data Flow Tasks which copy data into different tables simultaneously. Each Data Flow Task contains an OLE DB source and a SQL Server destination.
The package worked fine until I decided to add a transaction. I found that it is not possible to just set TransactionOption to Supported at the package level, because SSIS cannot handle transactions across multiple simultaneous processes. So I decided to use the approach described here:
http://consultingblogs.emc.com/jamiethomson/archive/2005/08/20/SSIS-Nugget_3A00_-RetainSameConnection-property-of-the-OLE-DB-Connection-Manager.aspx
But now I have another problem: I get "Unable to bulk copy data. You may need to run this package as an administrator" errors. These errors occur in random places. For example, the first time I run the package, the Data Flow Task named "Task A" may execute correctly, but the second time it can throw the error.
How can I implement a transaction in my case? (Changing the package to execute the Data Flow Tasks sequentially is not an option.)
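For reference, the pattern from the linked post amounts to bracketing the parallel Data Flow Tasks with Execute SQL Tasks on one shared connection (RetainSameConnection=True on the OLE DB connection manager); the statements themselves are just:

```sql
-- Execute SQL Task before the Data Flow Tasks (same connection manager,
-- RetainSameConnection=True so every task reuses this session):
BEGIN TRANSACTION;

-- ...the parallel Data Flow Tasks run here on the same connection...

-- Execute SQL Task on the success path:
COMMIT TRANSACTION;

-- Execute SQL Task on the failure path:
ROLLBACK TRANSACTION;
```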
I recently got an error with our MS SQL Server 2008 R2 and SSIS. The error was:
[SQL Server Destination [16]] Error: Unable to bulk copy data. You may need to run this package as an administrator.
[SSIS.Pipeline] Error: component "SQL Server Destination" (16) failed the pre-execute phase and returned error code 0xC0202071.
but I could not solve it by running as administrator. The error only occurred in one step, and I finally found that it went away when I increased the timeout of the SQL Server Destination. Funnily enough, reading from the external ADO NET source gave me a proper error that helped me see the timeout was the problem.