How could our database be causing SqlPackage to fail? (SQL72018) - sql-server

Hoping someone else has come across this, because Google returns only nine results for this error! Information about SqlPackage seems a little scant still.
We're currently going through the process of migrating to a continuous deployment environment. As part of this, we're using database projects to store database schema and the build server is using SqlPackage.exe to generate an upgrade script by comparing each project's .dacpac file with its associated schema template database that's hosted on the server.
We have six databases so far (with more to come) and they all work fine apart from one, which throws the following error when SqlPackage is modelling the 'target' database:
Error SQL72018: Trigger could not be imported but one or more of these objects exist in your source.
The only thing we can think of is that it's a problem with the size of the target database; perhaps SqlPackage is running out of memory? It's the largest database schema we have, so it's certainly feasible. If it's down to a memory limitation of SqlPackage, how do we go about increasing it?
We're going to start removing objects from the target database and the source project to see if we can establish whether it's down to size or a specific schema object, but any ideas and suggestions in the meantime would be greatly appreciated!
Update
I just tried removing all triggers from the target database, and now it spits out the upgrade script with no errors. Next I'll try removing only half of them, and see if I can narrow it down to one specific trigger. I suspect it may simply be the size of the schema, which goes back to the SqlPackage memory question.

Okay .. worked it out.
We went through a process of dropping triggers until we narrowed it down to a single trigger that was causing the error. It turns out there was something rather wrong with it: it seems it's meant to be attached to a table, but isn't (i.e. it doesn't appear in the table's triggers list). So I'm guessing this minor corruption caused SqlPackage to fail.
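In hindsight, you can hunt for such an orphan directly rather than bisecting; a quick sketch that lists triggers whose parent is neither a table nor a view (assuming the orphan still shows up in sys.triggers):
SELECT trg.name AS triggerName, trg.parent_id
FROM sys.triggers trg
WHERE trg.parent_class = 1  -- DML triggers, which should belong to a table or view
  AND NOT EXISTS (SELECT 1 FROM sys.tables t WHERE t.object_id = trg.parent_id)
  AND NOT EXISTS (SELECT 1 FROM sys.views v WHERE v.object_id = trg.parent_id);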
In case anyone finds it useful, this is the script I used to drop ranges of triggers, which helped me find the culprit:
http://www.codeproject.com/Tips/662699/Drop-all-Triggers-belonging-to-any-schema-in-MS-SQ
USE ClaimsSqlPackageTest

DECLARE @SQLCmd nvarchar(1000)
DECLARE @Trig varchar(500)
DECLARE @sch varchar(500)
DECLARE @count int = 0

DECLARE TGCursor CURSOR FOR
SELECT ISNULL(tbl.name, vue.name) AS [schemaName]
     , trg.name AS triggerName
FROM sys.triggers trg
LEFT OUTER JOIN (SELECT tparent.object_id, ts.name
                 FROM sys.tables tparent
                 INNER JOIN sys.schemas ts ON ts.schema_id = tparent.schema_id)
    AS tbl ON tbl.object_id = trg.parent_id
LEFT OUTER JOIN (SELECT vparent.object_id, vs.name
                 FROM sys.views vparent
                 INNER JOIN sys.schemas vs ON vs.schema_id = vparent.schema_id)
    AS vue ON vue.object_id = trg.parent_id

OPEN TGCursor
FETCH NEXT FROM TGCursor INTO @sch, @Trig
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @SQLCmd = N'DROP TRIGGER [' + @sch + '].[' + @Trig + ']'

    -- Only drop triggers 155-160 in cursor order; adjust the range to bisect your own set
    IF @count >= 155 AND @count <= 160
    BEGIN
        EXEC sp_executesql @SQLCmd
        PRINT @SQLCmd
    END

    SET @count = @count + 1
    FETCH NEXT FROM TGCursor INTO @sch, @Trig
END
CLOSE TGCursor
DEALLOCATE TGCursor

Putting a trigger on a table that has the extended property 'microsoft_database_tools_support' is a simple way to produce this error - for example, putting a trigger on the sysdiagrams table.
Apparently this extended property makes the object invisible to VS, and the existence of a trigger on an invisible object confuses it. You can either make both objects visible by removing the microsoft_database_tools_support properties, or make them both invisible by adding the property to the other object.
So in my example of putting a trigger named iut_sysdiagrams on sysdiagrams I'd use this:
EXEC sys.sp_addextendedproperty @level0type = N'SCHEMA', @level0name = N'dbo'
   , @level1type = N'TABLE', @level1name = N'sysdiagrams'
   , @level2type = N'TRIGGER', @level2name = N'iut_sysdiagrams'
   , @name = N'microsoft_database_tools_support', @value = 1;
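The reverse route - making both objects visible again - is just a drop of the same property, shown here for the table level (repeat for any other object that carries it):
EXEC sys.sp_dropextendedproperty @name = N'microsoft_database_tools_support'
   , @level0type = N'SCHEMA', @level0name = N'dbo'
   , @level1type = N'TABLE', @level1name = N'sysdiagrams';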

Related

Exec tSQLt.Faketable crashing our original table constraint and data

While debugging the tSQLt code, I ran the statements below directly, without wrapping them in a stored procedure, and my original table's constraints were deleted and some data went missing from the original table.
EXEC tSQLt.FakeTable @TableName = N'DBO.Employee', @Identity = 1;
EXEC tSQLt.FakeTable @TableName = N'DBO.Salary', @Identity = 1;
How do I prevent running a FakeTable statement in tSQLt from impacting the original table?
There is no way to prevent executing tSQLt.FakeTable outside of the framework. There are also good reasons to not prevent that, so I do not think that adding that functionality is the right approach.
However, if you’re using the newest version of tSQLt, you can use tSQLt.UndoTestDoubles to get the original object(s) back.
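Assuming you're on a build that ships that procedure, the call itself is trivial:
-- Restores tables faked in the current database (newer tSQLt builds only)
EXEC tSQLt.UndoTestDoubles;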
Ugh, been there... You can't prevent it, short of contributing to the project and putting a pull request in to add the functionality.
FakeTable creates a backup of your original table, so you should be able to get the original table back. These backup table names start with tSQLt.tempobject and end in a unique identifier. You can delete the new "fake" table (which now has the name of your original table) and rename the tempobject table if/when you find it.
Something I've done in the past is to query for a column that I know is in the table to find the name of the tSQLt table:
SELECT t.name
FROM sys.columns c
INNER JOIN sys.tables t ON t.object_id = c.object_id
WHERE c.name = 'SomeCol';
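Once you've found it, recovery is a drop-and-rename; a rough sketch with hypothetical names (the backup table's actual name and schema come from the query above):
-- Hypothetical object names; substitute what the query above returns.
DROP TABLE dbo.Employee;  -- the fake currently holding the original table's name
EXEC sp_rename 'dbo.[tSQLt.tempobject123456]', 'Employee';  -- restore the backup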

SQL - table schema change in trigger then update dependencies (views)

Here is the scenario (code is C#):
We allow for the creation of columns on our tables from within our app. When a record is put into a table (our metadata table, going forward called mdt), we create a column on a table that corresponds to the data we just created in mdt. We have a trigger on mdt that does the creation of this column. At the end of this trigger we are updating all the dependencies for the table that just got the newly created column.
In SSMS I can run all the SQL and it works; however, from code, all the SQL runs but the dependencies are not updated. The code runs all SQL in a transaction, but I have run tests in SSMS using a transaction and still, the SQL in SSMS works while the code in our app does not. By 'not work', I mean that the dependencies do not refresh. The column creation works, which means the trigger fires in both cases, which means the code that tries to regenerate the view dependencies is running.
So, in short:
Insert record > Fire trigger > Create Column > Refresh View Dependencies
SSMS - everything works
From code - everything works, except that the refreshing of the view dependencies is not actually refreshing the views
My first thought was that there is a difference in the way the transactions are being done (between code and SSMS). However, I added code in the app that ran the view-dependency refresh script outside of the transaction that creates the new column, and still the dependencies were not updated; if I run the same SQL in SSMS, the dependencies update fine.
Any ideas would be appreciated.
This is the code we are using to refresh the views:
IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[uspRefreshViewDependencies]') AND type IN ('P'))
    DROP PROCEDURE [dbo].[uspRefreshViewDependencies]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [dbo].[uspRefreshViewDependencies](
    @nvcViewName nvarchar(776),
    @bitIncludeCustomerObjects bit = 1
)
AS
/*
This procedure will correct View and UDF dependencies after columns have changed. Given the name of a View
for @nvcViewName, it will detect all of the Views and UDFs (both multi-statement and inline table-valued) that
reference it, and call sys.sp_refreshsqlmodule to update them. It runs recursively (so it can correct dependencies
on dependencies on dependencies, and so on).
If @bitIncludeCustomerObjects is false, the algorithm will skip custom database objects (anything in the dbo
schema that starts with the letter "c").
This was created for managing Entity Views but it should also work for tables and UDFs.
*/
DECLARE c CURSOR FORWARD_ONLY STATIC FOR
    WITH dependencies (referencing_id, referencing_object_name, type, level)
    AS
    (
        SELECT referencing_id, OBJECT_SCHEMA_NAME(referencing_id) + '.' + OBJECT_NAME(referencing_id), SO.type, 1 AS level
        FROM sys.dm_sql_referencing_entities(OBJECT_SCHEMA_NAME(OBJECT_ID(@nvcViewName)) + '.' + OBJECT_NAME(OBJECT_ID(@nvcViewName)), 'OBJECT') sre
        INNER JOIN sys.objects SO ON sre.referencing_id = SO.object_id
        WHERE SO.type IN ('V', 'TF', 'IF', 'P')
          AND (@bitIncludeCustomerObjects = 1
               OR OBJECT_NAME(referencing_id) NOT LIKE 'c%')
        UNION ALL
        SELECT SED.referencing_id, OBJECT_SCHEMA_NAME(SED.referencing_id) + '.' + OBJECT_NAME(SED.referencing_id), SO.type, level + 1
        FROM dependencies D
        INNER JOIN sys.sql_expression_dependencies SED ON D.referencing_id = SED.referenced_id
        INNER JOIN sys.objects SO ON SED.referencing_id = SO.object_id
        WHERE SO.type IN ('V', 'TF', 'IF', 'P')
          AND (@bitIncludeCustomerObjects = 1
               OR OBJECT_NAME(SED.referencing_id) NOT LIKE 'c%')
    )
    SELECT referencing_object_name, type, MAX(level) AS level
    FROM dependencies
    WHERE OBJECT_SCHEMA_NAME(referencing_id) <> 'Custom'
    GROUP BY referencing_object_name, type
    ORDER BY level

DECLARE @nvcDependency nvarchar(776)
DECLARE @nvcType nvarchar(30)
DECLARE @intLevel int

OPEN c
FETCH NEXT FROM c INTO @nvcDependency, @nvcType, @intLevel
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT 'Refreshing ' + @nvcDependency
    EXEC sys.sp_refreshsqlmodule @nvcDependency
    FETCH NEXT FROM c INTO @nvcDependency, @nvcType, @intLevel
END
CLOSE c
DEALLOCATE c
GO
I have found that when running this in SSMS everything works as expected; however, when it's run from C#, the CTE seems to return no results, so execution never enters the cursor loop.

Data loss warning adding column in the middle of a table

A new column has been added to a table, but it was not added at the end of the table definition (as the rightmost column); it was added in the middle.
When I try to commit this in Redgate SQL Source Control, I get the warning "These changes may result in data loss"
Will data loss really occur?
Is there a way to preview the change script to confirm that no data will be lost?
Can I copy the script and easily turn it into a Migrations V2 script?
Will I just have to
edit the table in SSMS and move the new column to the end,
or write a migration script?
If so, are there any handy tools to do the repetitive stuff?
Up front disclosure that I work for Red Gate on SQL Source Control.
That change requires the table to be re-created. By default SSMS won't let you save such a change, so that safeguard must have been disabled in your SSMS. It's under Tools > Options > Designers > Table and Database Designers > "Prevent saving changes that require table re-creation".
Since that safeguard is disabled, SQL Source Control has picked the change up as a potential data-loss situation and prompted to see if you want to add a migration script.
If other developers within your team pull this change in through a "get latest", then SQL Source Control will warn them about any potential data loss with more details, depending on the current state of their local database. If the only change is adding columns to an existing table, this will not drop the data in columns that are unchanged.
If you are deploying to another DB (e.g. staging/UAT/prod) and you have SQL Compare you can use that to see exactly what will be applied to a DB if you try and run this against another non-local database. Choose the create deployment script option and you can sanity check the SQL before running.
As you say adding the column to the end of the table will avoid the need for the rebuild, so is probably the simplest way to avoid this if you don't need to worry about where the column is.
Alternatively you can add a migration script (sketched after this list) to:
Create a new table with the new structure using a temp name
Copy the existing data to the temp table
Drop the existing table
Rename the new temp table to the original name
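A minimal sketch of that pattern, with hypothetical table and column names; note that SELECT...INTO does not carry over constraints, indexes, or permissions, so a real migration script would need to recreate those too:
BEGIN TRAN;
-- Steps 1-2: new structure (new MiddleCol in the middle) populated from the old table
SELECT Col1,
       CAST(NULL AS int) AS MiddleCol,
       Col2
INTO dbo.MyTable_Tmp
FROM dbo.MyTable;
-- Step 3: drop the existing table
DROP TABLE dbo.MyTable;
-- Step 4: rename the temp table to the original name
EXEC sp_rename 'dbo.MyTable_Tmp', 'MyTable';
COMMIT;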
You mention Migrations v2, the beta feature that changes how migrations work in order to better support branching and merging and DVCS systems. See http://www.red-gate.com/migrations
Version 1 migration scripts will need some modifications in order to be converted to a v2 migration script. It's a fairly trivial change. We're working on documenting this at the moment, and please reach out to us on the Google Group if you'd like more information on this change. https://groups.google.com/forum/#!forum/red-gate-migrations
I moved the column to the end of the table using SSMS to negate the need for a migration script.
In a similar scenario, where it was not convenient to move the column, this is what I did to convert an SSMS script to a Migrations V2 script.
Undo the change in SSMS (deleted the column)
Redo the change in SSMS, but instead of saving the change direct to the database, I saved the change script
Modified the change script
Trimmed the SSMS transaction & environment wrapper
Added a guard clause: IF COL_LENGTH('MyTable','MyColumn') IS NULL (see the sketch after this list)
Wrapped the script in BEGIN TRAN - ROLLBACK TRAN to test the script without dirtying the database
Replaced GO with END BEGIN
Tested within rolled-back transaction
Removed BEGIN TRAN - ROLLBACK TRAN development wrapper
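For reference, a minimal sketch of the guarded, re-runnable shape the final script takes, using the hypothetical table and column from the guard clause above:
IF COL_LENGTH('MyTable', 'MyColumn') IS NULL
BEGIN
    -- Hypothetical column definition; take the real one from the saved SSMS change script
    ALTER TABLE dbo.MyTable ADD MyColumn int NULL;
END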
Here is a simple SQL script that will insert a column into a table without data loss.
Let's say CCDetails is the table into which we want to insert the column GlobaleNote just before the column Sys_CreatedBy:
DECLARE @str1 nvarchar(1000)
DECLARE @tableName nvarchar(1000)
SET @tableName = 'CCDetails'
SET @str1 = ''

-- Build a comma-separated column list in ordinal order
SELECT @str1 = @str1 + ', ' + COLUMN_NAME
FROM Information_Schema.Columns
WHERE Table_Name = @tableName
ORDER BY Ordinal_Position

SET @str1 = right(@str1, len(@str1) - 2)
-- Copy into a temp table, drop the original, rename the copy back
SET @str1 = 'select ' + @str1 + ' into ' + @tableName + 'Temp from ' + @tableName + ' ; Drop Table ' + @tableName + ' ; EXEC sp_rename ' + @tableName + 'Temp, ' + @tableName
-- Splice the new column in just before Sys_CreatedBy
SET @str1 = REPLACE(@str1, 'Sys_CreatedBy', 'CAST('''' as nvarchar(max)) As GlobaleNote , Sys_CreatedBy')
EXEC sp_executesql @str1

SQL Server 2008 - sp_refreshview bombing out on some views

I've inherited a fairly substantial project, which makes extensive use of SQL Server (2005 and 2008) views.
One step in the build process is to call the sp_refreshview system stored procedure, to make sure no changes to any tables have broken our views. This works fine ... except for about three or four (out of 200+) views.
With those, it just bombs out, giving odd error messages like:
Msg 15165, Level 16, State 1, Procedure sp_refreshsqlmodule_internal, Line 55
Could not find object 'vYourViewNameHere' or you do not have permission.
which is dead wrong - that view does exist, and I can definitely select from it.
I cannot seem to find any good, concise information about why this happens or what triggers it... any ideas? Is there anything I could do to detect such problematic views? Can I change their definition so that they'd be refreshable again?
Update: I logged a bug report on Microsoft Connect for this - if you agree this seems odd and needs to be fixed, please vote for it!
https://connect.microsoft.com/SQLServer/feedback/details/676728/sp-refreshview-crashes-with-misleading-error-on-views-with-schemabinding
I noticed in the comments you mention it has SCHEMABINDING. I can almost guarantee that is the issue. Books Online specifically says sp_refreshview is for use on non-schema-bound views.
A schema-bound view wouldn't allow a breaking change to occur, so updating the metadata is unnecessary. You can safely skip it.
You can identify all the schemabound views like this:
SELECT * FROM sys.views WHERE OBJECTPROPERTY(object_id, 'IsSchemaBound')=1
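Building on that, a sketch of a refresh loop that simply skips the schema-bound views, so the build step can refresh everything it safely can:
DECLARE @view nvarchar(776);
DECLARE vcur CURSOR FOR
    SELECT QUOTENAME(OBJECT_SCHEMA_NAME(object_id)) + '.' + QUOTENAME(name)
    FROM sys.views
    WHERE OBJECTPROPERTY(object_id, 'IsSchemaBound') = 0;  -- skip schema-bound views
OPEN vcur;
FETCH NEXT FROM vcur INTO @view;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sp_refreshview @view;
    FETCH NEXT FROM vcur INTO @view;
END
CLOSE vcur;
DEALLOCATE vcur;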
I ran into the same error when using sp_helptext. In my case the cause was using sp_rename to rename the view. The following code reproduces this error.
create view demo as select dummy = 1
go
exec sp_rename 'demo', 'new_demo'
go
exec sp_refreshview 'new_demo'
go
The only solution is to manually alter the view. Applying this fix to the script above gives:
create view demo as select dummy = 1
go
exec sp_rename 'demo', 'new_demo'
go
-- This statement fixes the problem
alter view new_demo as select dummy = 1
go
exec sp_refreshview 'new_demo'
go
My incarnation of this error was:
Msg 8116, Level 16, State 1, Procedure sp_refreshsqlmodule_internal, Line 75
Argument data type int is invalid for argument 1 of substring function.
This error message was being reported at various places in the db script - I would say the wrong places: if I commented out the SQL where the error was reported, the same error would be reported elsewhere.
I commented out the following call in my script as a workaround, and the script would complete successfully.
-- EXECUTE sp_refreshview @viewName;
Note: My database didn't report having any schema-bound views when running the query suggested in RThomas's adjacent answer: https://stackoverflow.com/a/6460532/179972
UPDATE - SOLUTION:
After our database script ran successfully with the sp_refreshview command commented out (shown above), we then ran the view refresh code on its own, and it was successful too.
It doesn't make sense to me how this was able to work successfully, but I'm documenting it here in case it proves helpful to somebody else.
To find which view is your problem, add a PRINT to the normal sppRefreshViews. Nothing earth-shattering here, but I thought I would share.
CREATE PROCEDURE sppRefreshViews2
AS
DECLARE @t varchar(1024)
DECLARE tbl_cur CURSOR FOR
    SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES
    WHERE TABLE_TYPE = 'VIEW' AND TABLE_NAME LIKE 'sp%'
OPEN tbl_cur
FETCH NEXT FROM tbl_cur INTO @t
WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @t  -- the last name printed before the error is the problem view
    EXEC ('sp_refreshview ''' + @t + '''')
    FETCH NEXT FROM tbl_cur INTO @t
END
CLOSE tbl_cur
DEALLOCATE tbl_cur

MSSQL Database Cleanup - How do you find unused objects (Tables, Views, Procs, Functions)

Let's say you have inherited a MS SQL 2000 or 2005 database, and you know that some of the Tables, Views, Procs, and Functions are not actually used in the final product.
Is there some kind of internal logging or another mechanism that could tell me what objects are NOT being called? or have only been called a few times versus thousands of times.
This SO question, Identifying Unused Objects In Microsoft SQL Server 2005, might be relevant.
The answer will depend a little on how the database has been put together, but my approach to a similar problem was 3-fold:
Figure out which objects have no internal dependencies. You can work this out from queries against sysdepends such as:
SELECT DISTINCT
    so.id,
    so.name
FROM sys.sysdepends sd
INNER JOIN sys.sysobjects so
    ON so.id = sd.id
WHERE NOT EXISTS (
    SELECT 1
    FROM sys.sysdepends sd2
    WHERE sd2.depid = so.id
)
You should combine this with collecting the type of object (sysobjects.xtype) as you'll only want to isolate the tables, functions, stored procs and views. Also ignore any procedures starting "sp_", unless people have been creating procedures with those names for your application!
Many of the returned procedures may be your application's entry points. That is to say the procedures that are called from your application layer or from some other remote call and don't have any objects that depend on them within the database.
Assuming the process won't be too invasive (it will create some additional load, though not too much), you can now switch on some profiling of the SP:Starting, SQL:BatchStarting and/or SP:StmtStarting events. Run this for as long as you see fit, ideally logging to a SQL table for easy cross-referencing. You should be able to eliminate many of the procedures that are called directly from your application.
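If you'd rather set this up in T-SQL than in the Profiler UI, here is a minimal server-side trace sketch along those lines; event 42 is SP:Starting and 13 is SQL:BatchStarting, and the file path is a placeholder:
DECLARE @TraceID int;
EXEC sp_trace_create @TraceID OUTPUT, 0, N'C:\Temp\proc_usage';  -- .trc is appended automatically
-- Capture TextData (column 1) and StartTime (column 14) for both events
EXEC sp_trace_setevent @TraceID, 42, 1, 1;
EXEC sp_trace_setevent @TraceID, 42, 14, 1;
EXEC sp_trace_setevent @TraceID, 13, 1, 1;
EXEC sp_trace_setevent @TraceID, 13, 14, 1;
EXEC sp_trace_setstatus @TraceID, 1;  -- start the trace
-- ...run for as long as you see fit, then stop (0), close (2), and load the file:
-- EXEC sp_trace_setstatus @TraceID, 0;
-- EXEC sp_trace_setstatus @TraceID, 2;
-- SELECT TextData, StartTime INTO dbo.ProcUsageLog
-- FROM sys.fn_trace_gettable(N'C:\Temp\proc_usage.trc', DEFAULT);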
By cross referencing the text data from this log and your dependent object list you will hopefully have isolated most of the unused procedures.
Finally, you may want to take your candidate list resulting from this process and grep your sourcecode base against them. This is a cumbersome task and just because you find references in your code doesn't mean you need them! It may simply be that the code hasn't been removed though it's now logically inaccessible.
This is far from a perfect process. A relatively clean alternative is to set up far more detailed (and therefore invasive) profiling on the server to monitor all the activity. This can include every SQL statement called during the time the log is active. You can then work back through the dependent tables, or even cross-database dependencies, from this text data. I've found the reliability of the log detail (too many rows per second to parse) and the sheer quantity of data difficult to deal with. If your application is less likely to suffer from this then it may be a good approach.
Caveat:
Because, so far as I'm aware, there isn't a perfect answer to this, be particularly wary of removing tables. Procedures, functions and views are easily replaced if something goes wrong (though make sure you have them in source control before burning them, of course!). If you're feeling really nervous, why not rename the table and create a view with the old name - you've then got an easy out.
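A sketch of that escape hatch, with a hypothetical table name:
-- Park the suspect table under a new name...
EXEC sp_rename 'dbo.SuspectTable', 'SuspectTable_Parked';
GO
-- ...and leave a view in its place so nothing breaks if it is still used
CREATE VIEW dbo.SuspectTable AS
    SELECT * FROM dbo.SuspectTable_Parked;
GO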
We can also find unused columns and tables using the following query. I tried to write a cursor; the cursor will give you information about each column in each table.
DECLARE @name varchar(200), @id bigint, @columnname varchar(500)

DECLARE @temptable table
(
    table_name varchar(500),
    Status bit
)
DECLARE @temp_column_name table
(
    table_name varchar(500),
    column_name varchar(500),
    Status bit
)

DECLARE find_table_dependency CURSOR FOR
    SELECT name, id FROM sysobjects WHERE xtype = 'U'
OPEN find_table_dependency
FETCH find_table_dependency INTO @name, @id
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Flag tables whose names appear in any module's source text
    IF EXISTS (SELECT TOP 1 name FROM sysobjects WHERE id IN
                  (SELECT id FROM syscomments WHERE text LIKE '%' + @name + '%'))
        INSERT INTO @temptable SELECT @name, 1
    ELSE
        INSERT INTO @temptable SELECT @name, 0

    DECLARE find_column_dependency CURSOR FOR
        SELECT name FROM syscolumns WHERE id = @id
    OPEN find_column_dependency
    FETCH find_column_dependency INTO @columnname
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Flag columns whose names appear in any module's source text
        IF EXISTS (SELECT TOP 1 name FROM sysobjects WHERE id IN
                      (SELECT id FROM syscomments WHERE text LIKE '%' + @columnname + '%'))
            INSERT INTO @temp_column_name SELECT @name, @columnname, 1
        ELSE
            INSERT INTO @temp_column_name SELECT @name, @columnname, 0
        FETCH find_column_dependency INTO @columnname
    END
    CLOSE find_column_dependency
    DEALLOCATE find_column_dependency

    FETCH find_table_dependency INTO @name, @id
END
CLOSE find_table_dependency
DEALLOCATE find_table_dependency

SELECT * FROM @temptable
SELECT * FROM @temp_column_name
